[Live-devel] Synchronizing video and audio using OnDemandServerMediaSubsessions

Ross Finlayson finlayson at live555.com
Fri Sep 5 14:27:13 PDT 2008


To get proper audio/video synchronization, you must create a 
"RTCPInstance" for each "RTPSink".  However, the 
"OnDemandServerMediaSubsession" class does this automatically, so 
because you're subclassing this, you don't need to do anything 
special to implement RTCP - you already have it.
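For readers who are not subclassing "OnDemandServerMediaSubsession", 
the step it performs automatically looks roughly like this fragment, 
modeled on the live555 test programs.  (The variables env, 
rtcpGroupsock, and videoSink are assumed to already exist in your 
setup; this is a sketch, not complete code.)

```cpp
// Create a RTCPInstance alongside the RTPSink; RTCP Sender Reports
// are what carry the wall-clock <-> RTP-timestamp mapping that
// receivers need for audio/video synchronization.
const unsigned estimatedSessionBandwidth = 500; // in kbps
unsigned char CNAME[101];
gethostname((char*)CNAME, 100); CNAME[100] = '\0';

RTCPInstance* rtcp
  = RTCPInstance::createNew(*env, &rtcpGroupsock,
                            estimatedSessionBandwidth, CNAME,
                            videoSink, NULL /* we're a server */);
// Note: "rtcp" starts sending reports automatically; you don't need
// to call anything else on it.
```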

However, the second important thing that you need is that the 
presentation times that you give to each frame (that feeds into each 
"RTPSink") *must* be accurate.  It is those presentation times that 
get delivered to the receiver, and used (by the receiver) to do 
audio/video synchronization.

Delaying the transmission (or not) does not affect this at all; it 
doesn't matter if video packets get sent slightly ahead of audio 
packets (or vice versa).  What's important is the *presentation 
times* that you give each frame.  If those are correct, then you will 
get audio/video synchronization at the receiver.

This is assuming, of course, that your *receiver* implements standard 
RTCP-based synchronization correctly.  (If your receiver uses our 
library, then it will.)  But if your receiver is not standards 
compliant and doesn't implement this, then audio/video 
synchronization will never work.
-- 

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
