[Live-devel] How to synchronize audio and video[already read FAQ]

Ross Finlayson finlayson at live.com
Sat Jul 30 22:18:29 PDT 2005


At 08:43 PM 7/30/2005, you wrote:
>Hello, all
>I've read the FAQ about synchronization. It says that the parameter
>presentationTime passed to afterGettingFrame() can be used to synchronize,

Yes, but note that this FAQ applies to *receiving* RTP streams (i.e.,
over a network) - that is, to subclasses of "RTPSource".
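
To illustrate, here is a rough sketch (not code from the library; the
function and variable names are mine) of how a receiving application's
'after getting' callback might check whether the presentation times
have become usable for audio/video synchronization:

#include "liveMedia.hh"

void afterGettingFrame(void* clientData, unsigned frameSize,
                       unsigned numTruncatedBytes,
                       struct timeval presentationTime,
                       unsigned durationInMicroseconds) {
  // Hypothetical: we passed our "RTPSource*" as the client data when
  // calling getNextFrame() on it:
  RTPSource* rtpSource = (RTPSource*)clientData;

  // Presentation times from different subsessions (e.g., audio and
  // video) become comparable across streams - and thus usable for
  // a/v synchronization - only once RTCP "SR" packets have been
  // received:
  if (rtpSource->hasBeenSynchronizedUsingRTCP()) {
    // "presentationTime" is now on a common, wallclock-aligned
    // timeline; schedule this frame for playout at that time (plus
    // a fixed playout delay), using one clock for both streams.
  }

  // ... then call getNextFrame() again to request the next frame ...
}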

>  but I can't figure out how to use the parameter. I can't even
> understand the exact meaning of the parameter. Is it exactly the same
> as MediaSource::fPresentationTime? In other words, if in my
> MultiFramedMediaSource::doGetNextFrame() implementation I set
> fPresentationTime={1000,1000} for one particular frame, the
> receiver should get presentationTime={1000,1000} for this frame
> when it calls afterGettingFrame(), shouldn't it?

It's not clear from your description exactly what you are using your 
"MultiFramedMediaSource" for.  But, if you are using it as a source 
for an outgoing RTP stream (i.e., feeding it into an "RTPSink"), then 
your presentation times should be aligned with real, 'wallclock' 
time.  I.e., the first time you generate "fPresentationTime" in your 
source object, you should do so using the value obtained by calling 
"gettimeofday()".  Plus, of course, you must have a "RTCPInstance" - 
at both the sender and receiver - for synchronization to work.
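
For illustration, a bare-bones sketch of such a source might look like
this ("MyFrameSource" is a made-up name, and I've assumed frames that
are 10 ms apart, to match the duration value from your example):

#include "FramedSource.hh"
#include <sys/time.h>

class MyFrameSource: public FramedSource { // hypothetical class name
public:
  MyFrameSource(UsageEnvironment& env)
    : FramedSource(env), fHaveStarted(False) {}

protected:
  virtual void doGetNextFrame() {
    // ... copy up to "fMaxSize" bytes of frame data into "fTo", and
    // set "fFrameSize" accordingly ...

    if (!fHaveStarted) {
      // First frame: stamp it with real 'wallclock' time:
      gettimeofday(&fPresentationTime, NULL);
      fHaveStarted = True;
    } else {
      // Later frames: advance by one frame duration (10 ms here -
      // an assumed rate of 100 frames per second):
      fPresentationTime.tv_usec += 10000;
      fPresentationTime.tv_sec += fPresentationTime.tv_usec/1000000;
      fPresentationTime.tv_usec %= 1000000;
    }
    fDurationInMicroseconds = 10000;

    // Tell the downstream object (e.g., a "MultiFramedRTPSink")
    // that a frame is ready:
    FramedSource::afterGetting(this);
  }

private:
  Boolean fHaveStarted;
};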

>Another attribute of MediaSource I don't understand is
>fDurationInMicroseconds. In the above example, I set it to 10000 in
>MultiFramedMediaSource::doGetNextFrame(), but I always get 0 for
>durationInMicroseconds in afterGettingFrame(). Why?

"duration in microseconds" - unlike "presentation time" - is not a 
parameter that gets passed within RTP (i.e., from sender to 
receiver).  Instead, it is a parameter that is used only internally 
within a chain of "Media" objects.  In particular, for data sources 
that feed into a "MultiFramedRTPSink" (subclass), "duration in 
microseconds" tells the "MultiFramedRTPSink" how long to delay 
between sending each outgoing RTP packet.
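
Concretely - continuing the sketch above, with the same assumed
10000-microsecond value - the relevant lines on the sending side are:

// Inside the sender's doGetNextFrame():
fDurationInMicroseconds = 10000; // next frame is due 10 ms later
// A "MultiFramedRTPSink" uses this value, after sending the last RTP
// packet of the current frame, to schedule when to send the first
// packet of the *next* frame.  The value itself is never put into
// the RTP packets, which is why your receiver always sees 0.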

To understand how the "LIVE.COM Streaming Media" code works, I 
suggest that you start by examining the code for the existing demo 
applications ("testProgs") before trying to modify it to develop 
your own code.



