[Live-devel] Mpeg2 ES and audio PCM stream Synchronization with Live555
Ross Finlayson
finlayson at live555.com
Tue Feb 12 17:55:53 PST 2008
I haven't had time to wade through this in detail, but it seems that
you've run into the same problem that I had when I was developing the
"wis-streamer" application <http://www.live555.com/wis-streamer/>.
Namely, if your streaming server is fed audio and video data as two
streams of *unstructured data* - rather than two streams of discrete
frames - then it is difficult to ensure that the resulting audio and
video frames - when parsed from each stream - will get accurate
presentation times (and it will be difficult to ensure that they
remain accurate over time). (You get proper a/v synchronization at
the client end if and only if the server-generated presentation
times are accurate.)
As I found with "wis-streamer", the solution is to change your
application so that it is fed streams of *discrete* frames, each
having its own presentation timestamp that is generated directly,
when the frame is produced (e.g., by a hardware encoder). Then
you don't have to parse the streams to extract (possibly inaccurate)
presentation times. I.e., instead of using
"MPEG1or2VideoStreamFramer", use "MPEG1or2VideoStreamDiscreteFramer".
So, I suggest that you review the "wis-streamer" source code for hints
that may help your application.
(This will be my last posting on this subject.)
--
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/