[Live-devel] What is a frame in the live-lib ?
Heintje Müller
live-devel
Sat Jun 11 23:51:57 PDT 2005
>> > So, is there one PES source for audio, and another PES source for
>> > video?
>>
>> Yes, our DVB parser currently delivers two PES streams, one for audio
>> and one for video. It could deliver ES instead, but then we guess
>> that both would be out of sync ;-)
>
>
> No, not necessarily, because the timestamps in the video Elementary
> Stream will be used instead of the timestamps in the PES header.
>
> I suggest that you deliver your PES streams as an unstructured byte
> source - i.e., deliver as much data as the downstream reader requests
> (fMaxSize), but no more than the amount of data remaining in the
> last-read PES packet. *Do not* include the PES header in the data
> that you deliver.
>
> Then, feed the data from the video stream into a
> "MPEG1or2VideoStreamFramer", and feed the data from the audio stream
> into a "MPEG1or2AudioStreamFramer". (In the latter case, you may want
> to set the "syncWithInputSource" parameter to True.)
>
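To make sure we read the delivery part right, this is roughly how our source
hands the data out now. A sketch only - "PESPayloadSource" and
"readNextPESPayload()" are just placeholders for our own code; only the
FramedSource members (fTo, fMaxSize, fFrameSize, afterGetting()) come from
the library:
[code]
#include "FramedSource.hh"
#include <string.h>

// Sketch only: a source that delivers PES *payload* bytes (PES header
// already stripped). "PESPayloadSource" and "readNextPESPayload()" are
// placeholders for our own code; the rest is the FramedSource interface.
class PESPayloadSource: public FramedSource {
public:
  PESPayloadSource(UsageEnvironment& env)
    : FramedSource(env), fPayloadPtr(NULL), fBytesLeftInPacket(0) {}

protected:
  virtual void doGetNextFrame() {
    if (fBytesLeftInPacket == 0) {
      // Refill: our DVB parser reads the next PES packet and skips its
      // PES header, so only elementary-stream payload is left in the buffer.
      fBytesLeftInPacket = readNextPESPayload(fPacketBuf, sizeof fPacketBuf);
      fPayloadPtr = fPacketBuf;
    }

    // Deliver as much as the downstream reader requested (fMaxSize), but
    // never more than what remains of the last-read PES packet.
    fFrameSize = fBytesLeftInPacket < fMaxSize ? fBytesLeftInPacket : fMaxSize;
    memcpy(fTo, fPayloadPtr, fFrameSize);
    fPayloadPtr += fFrameSize;
    fBytesLeftInPacket -= fFrameSize;

    FramedSource::afterGetting(this); // hand the data to the reader
  }

private:
  // Implemented by our DVB parser; returns the payload size of the next packet.
  unsigned readNextPESPayload(unsigned char* buf, unsigned bufSize);

  unsigned char fPacketBuf[65536];
  unsigned char* fPayloadPtr;
  unsigned fBytesLeftInPacket;
};
[/code]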
We changed our DVBParser to deliver two elementary streams (ES) and set
syncWithInputSource to True, but audio and video are still out of sync.
[code]
// audioES is our audio ES source; the third argument is syncWithInputSource
audioSource = MPEG1or2AudioStreamFramer::createNew(*env, audioES, True);
[/code]
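The video side is set up the same way; "videoES" and "videoSource" are just
our corresponding names for the video path:
[code]
// video framer on top of our video ES source
videoSource = MPEG1or2VideoStreamFramer::createNew(*env, videoES);
[/code]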
Just to make sure that our DVBParser works correctly, we also tried to
stream the two ES streams from files, but the result is the same: not in
sync. (Two PES file streams, each going through a MPEG1or2Demuxer, give
the same result.)
Any ideas? I guess we're doing something fundamentally wrong ;-)
FYI, our current system:
server (not in sync!):
video: ByteStreamBufferSource (filled by our DVBParser) -> (as a
FramedSource) -> MPEG1or2VideoStreamFramer -> MPEG1or2VideoRTPSink
audio: ByteStreamBufferSource (filled by our DVBParser) -> (as a
FramedSource) -> MPEG1or2AudioStreamFramer -> MPEG1or2AudioRTPSink
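In code, the server chain looks roughly like this - the two groupsocks and
the "afterPlaying" callback are placeholders, not shown here:
[code]
// rough sketch of the server side; rtpGroupsockVideo/rtpGroupsockAudio and
// afterPlaying() are placeholders for our actual groupsocks and callback
videoSink = MPEG1or2VideoRTPSink::createNew(*env, rtpGroupsockVideo);
audioSink = MPEG1or2AudioRTPSink::createNew(*env, rtpGroupsockAudio);

videoSink->startPlaying(*videoSource, afterPlaying, videoSink);
audioSink->startPlaying(*audioSource, afterPlaying, audioSink);

env->taskScheduler().doEventLoop(); // does not return
[/code]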
proxy (planned, not yet implemented):
video: MPEG1or2VideoRTPSource (receiving from the server) -> to our buffer
audio: MPEG1or2AudioRTPSource (receiving from the server) -> to our buffer
(two threads)
The transcoder transcodes the video buffer; after that, the data takes the
same path as in the server (of course, in one thread).
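For the receiving side we would create the RTP sources roughly like this
(payload format 32 = MPEG video, 14 = MPEG audio, 90000 Hz RTP timestamps;
the groupsock names are again placeholders):
[code]
// planned proxy side (sketch only); the groupsocks are placeholders
videoSource = MPEG1or2VideoRTPSource::createNew(*env, videoRTPGroupsock,
                                                32 /*MPV*/, 90000);
audioSource = MPEG1or2AudioRTPSource::createNew(*env, audioRTPGroupsock,
                                                14 /*MPA*/, 90000);
[/code]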
Best regards,
Heintje ;o)