[Live-devel] synchronisation in a multithreading application

Ross Finlayson finlayson at live.com
Fri Nov 26 10:08:26 PST 2004

>What I don't understand is the following. In my example I have one thread 
>for each media source and in each thread I have a UsageEnvironment and a 
>TaskScheduler and for each media source I created a RTCPInstance. Each 
>media stream knows nothing about the other media streams. So, how can they
>be synchronized via RTCP?

See <http://www.live.com/liveMedia/faq.html#separate-rtp-streams>.  If you 
have "RTCPInstance" objects for each of your "RTPSource"s, then the 
"presentationTime" parameter that's passed to the 'afterGettingFunc' of 
"getNextFrame()" will (after a small number of seconds) be an accurate, 
time-synchronized time.  Note that these presentation times are generated 
originally by the *server*; your receiving application gets them using RTCP.

The fact that (in your case) each receiving "RTPSource"/"RTCPInstance" is 
running in a different thread is irrelevant.

Example: Suppose that you receive a video frame with presentation time 
42.123 seconds, and then (within the same, or another, thread - it doesn't 
matter) an audio frame with presentation time 42.456 seconds.  You then know 
that the video frame's presentation time is 0.333 seconds (=42.456-42.123) 
earlier than the audio frame's, and can schedule their rendering accordingly.

	Ross Finlayson
