[Live-devel] How to synchronize audio and video[already read FAQ]
    Ross Finlayson 
    finlayson at live.com
       
    Sun Jul 31 00:45:53 PDT 2005
    
    
  
>What's wrong with my code?
I don't know.  (In general, I don't have time to debug people's 
custom code - except for our consulting clients.)  However, in your 
receiver, you can try calling 
"RTPSource::hasBeenSynchronizedUsingRTCP()" for each received packet, 
to check if/when RTCP "Sender Reports" (from the server) ever get 
used to compute a synchronized presentation time.  (Until the first 
RTCP "Sender Report" is received, the receiving code uses 'wall 
clock' time as the presentation time.)
>That is to say, the durationTime from RTPSource is useless?
For now at least, yes.
>To understand how the "LIVE.COM Streaming Media" code works, I
>suggest that you start by examining the code for the existing demo
>applications ("testProgs"), before trying to modify this to develop
>your own code.
>
>
>I have read testMP3Streamer and testMP3Receiver, but they do not 
>involve synchronization.
Look instead at "testMPEG1or2AudioVideoStreamer", which sends 
separate, synchronized RTP streams for audio and video - from an MPEG 
Program Stream file.
	Ross Finlayson
	LIVE.COM
	<http://www.live.com/>
    
    