[Live-devel] synchronisation in a multithreaded application

Eric Peters epeters at graphics.cs.uni-sb.de
Fri Nov 26 14:43:16 PST 2004


Hello,

As Ross suggested, I read the FAQ and the source code of MPlayer and 
openRTSP, and I found the following difference between that code and 
mine: MPlayer and openRTSP use a MediaSession with MediaSubsessions. 
Do I need a MediaSession, with a subsession for each media source, to 
get lip-synchronous video on the receiver side?
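
If it helps to make this concrete, the receiver-side pattern I see in 
openRTSP looks roughly like this (simplified from my reading of the 
code, with error handling omitted; a sketch, not the exact openRTSP 
code):

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"

// "sdpDescription" would come from an RTSP DESCRIBE reply or an .sdp file.
void setupFromSDP(UsageEnvironment& env, char const* sdpDescription) {
  MediaSession* session = MediaSession::createNew(env, sdpDescription);
  if (session == NULL) return;

  // One MediaSubsession per "m=" line, all inside a single MediaSession:
  MediaSubsessionIterator iter(*session);
  MediaSubsession* subsession;
  while ((subsession = iter.next()) != NULL) {
    if (!subsession->initiate()) continue; // creates RTP/RTCP sockets
    // ... create a sink and start playing from subsession->readSource() ...
  }
}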

What I don't understand is the following. In my application I have one 
thread per media source; each thread has its own UsageEnvironment and 
TaskScheduler, and I create an RTCPInstance for each media source. Each 
media stream knows nothing about the other streams, so how can they be 
synchronized via RTCP? How can liveMedia compute synchronized 
presentation timestamps in different threads? Shouldn't there be a 
controller that manages this? Is that the MediaSession?
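
For clarity, here is a stripped-down sketch of what each of my threads 
does (the video thread is shown; the multicast address, TTL and 
bandwidth figure are placeholders):

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"
#include "GroupsockHelper.hh"

void videoThread() {
  // Each thread builds its own, fully separate environment:
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  struct in_addr destAddress;
  destAddress.s_addr = our_inet_addr("239.255.42.42"); // placeholder group
  Groupsock rtpGroupsock(*env, destAddress, Port(8888), 255 /*ttl*/);
  Groupsock rtcpGroupsock(*env, destAddress, Port(8889), 255 /*ttl*/);

  RTPSource* source = MPEG1or2VideoRTPSource::createNew(*env, &rtpGroupsock);
  unsigned char const* cname = (unsigned char const*)"receiver";
  RTCPInstance::createNew(*env, &rtcpGroupsock, 4500 /*kbps*/, cname,
                          NULL /*no RTPSink: we only receive*/, source);

  // ... attach my decoder sink to "source", then run this thread's loop:
  env->taskScheduler().doEventLoop();
}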

Is it possible at all to create such a multithreaded receiver 
application with synchronized audio/video?

Thanks in advance
Eric



Eric Peters wrote:
> Hello to all,
> 
> I have written an application that decodes a DVD with several 
> different audio tracks (different languages, for example). Now I want 
> to stream the video and audio tracks over RTP using liveMedia, so that 
> on the client side it is possible to watch the movie in different 
> languages.
> 
> For example I stream the movie and the audio multicast on the following 
> ports:
> audio1 (english): 6666
> audio2 (german):  6668
> audio3 (french):  6670
> video:            8888
> 
> So, when I use an .sdp file that includes audio on port 6668 and 
> video on port 8888, I should see the movie in German.
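> 
> For concreteness, the .sdp file for the German variant looks roughly 
> like this (the multicast address here is a placeholder, and the static 
> payload types 14/32 assume plain MPEG audio/video from the DVD):
> 
> v=0
> o=- 0 0 IN IP4 127.0.0.1
> s=DVD movie (German audio)
> c=IN IP4 239.255.42.42/127
> t=0 0
> m=audio 6668 RTP/AVP 14
> m=video 8888 RTP/AVP 32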
> 
> Now my question: this is a multithreaded application. I read the FAQ 
> entry about threads, and each streaming instance has its own 
> UsageEnvironment and TaskScheduler. But on the client side I have 
> problems synchronizing the audio and video tracks.
> 
> Is this synchronisation possible at all when I use independent 
> UsageEnvironments and TaskSchedulers? Or are the UsageEnvironments 
> and TaskSchedulers in fact not completely independent of each other?
> 
> How can I synchronize the audio and video tracks? Which 
> timestamps/values should I use for synchronisation?
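> 
> At the moment I consume frames with a handler like the following 
> (simplified; the clientData wiring is my own and just a sketch), and I 
> assume the presentationTime argument is the value meant for lining up 
> the tracks:
> 
> static void afterGettingFrame(void* clientData, unsigned frameSize,
>                               unsigned numTruncatedBytes,
>                               struct timeval presentationTime,
>                               unsigned /*durationInMicroseconds*/) {
>   RTPSource* source = (RTPSource*)clientData;
>   if (source->hasBeenSynchronizedUsingRTCP()) {
>     // From here on, presentationTime should be derived from the
>     // sender's RTCP sender reports, i.e. comparable across streams.
>   }
>   // ... hand the frame plus presentationTime to my decoder/renderer ...
> }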


