[Live-devel] Synchronizing video and audio using OnDemandServerMediaSubsessions

Diego Barberio diego.barberio at redmondsoftware.com
Fri Sep 5 13:00:47 PDT 2008


Hi all,

I'm new to the live555 library.

I have a ServerMediaSession with two SubSessions (one for H.263+ video and
the other for G.711 A-law audio). Both SubSessions derive from
OnDemandServerMediaSubsession; the one I use for the video is called
CH263plusVideoDXServerMediaSubsession and the other
CALawAudioDXServerMediaSubsession. I also have two FramedSources, one for
each SubSession: CH263plusVideoDXFrameSource and CAlawAudioDXFrameSource.
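
For reference, the video subsession is wired up roughly like this. This is
only a sketch: createNewStreamSource and createNewRTPSink are the
OnDemandServerMediaSubsession virtuals, H263plusVideoRTPSink is the
library's, but the reuseFirstSource flag, the bitrate estimate, and my
source's createNew factory are my own choices:

class CH263plusVideoDXServerMediaSubsession : public OnDemandServerMediaSubsession {
public:
  CH263plusVideoDXServerMediaSubsession(UsageEnvironment& env)
    : OnDemandServerMediaSubsession(env, True /*reuseFirstSource*/) {}

protected:
  // Called by the library once per client session, to create the source
  // that will feed the RTP sink:
  virtual FramedSource* createNewStreamSource(unsigned /*clientSessionId*/,
                                              unsigned& estBitrate) {
    estBitrate = 300; // kbps; an illustrative guess
    return CH263plusVideoDXFrameSource::createNew(envir()); // my own class
  }

  // Called by the library to create the matching RTP sink:
  virtual RTPSink* createNewRTPSink(Groupsock* rtpGroupsock,
                                    unsigned char rtpPayloadTypeIfDynamic,
                                    FramedSource* /*inputSource*/) {
    return H263plusVideoRTPSink::createNew(envir(), rtpGroupsock,
                                           rtpPayloadTypeIfDynamic);
  }
};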

The streaming works perfectly for both media, but the audio lags the video
by about 1.5 seconds. To compensate, I tried delaying the video by adding
1500 milliseconds to the fPresentationTime attribute in
CH263plusVideoDXFrameSource::doGetNextFrame; however, it made no
perceptible difference. So I started googling the problem until I reached
the question "Why do most RTP sessions use separate streams for audio and
video? How can a receiving client synchronize these streams?" in the
live555 FAQ.
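
Concretely, what I tried looks roughly like this (a sketch; the actual
frame-copying code is elided):

void CH263plusVideoDXFrameSource::doGetNextFrame() {
  // ... copy the next encoded frame into fTo and set fFrameSize ...

  // Stamp the frame with the wall-clock capture time, then push it
  // 1.5 seconds into the future:
  gettimeofday(&fPresentationTime, NULL);
  fPresentationTime.tv_usec += 500000;
  fPresentationTime.tv_sec  += 1 + fPresentationTime.tv_usec / 1000000;
  fPresentationTime.tv_usec %= 1000000;

  // Tell the downstream object that the frame is ready:
  FramedSource::afterGetting(this);
}

If I understand the FAQ correctly, without RTCP sender reports the client
has no common reference clock for the two streams, so a constant shift of
fPresentationTime on the server side would go unnoticed, which matches
what I observed.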

The problem is that I don't know where I should create the RTCPInstance,
and I can't find a variable or field to store it in. I looked in both the
OnDemandServerMediaSubsession and FramedSource classes.
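
From RTCP.hh the creation call appears to be the following; this is only
a sketch, and the groupsock, bandwidth, CNAME, and sink arguments are
placeholders that I don't know where to obtain inside
OnDemandServerMediaSubsession:

// Server side: we report on an RTPSink; there is no RTPSource to pass.
RTCPInstance* rtcp = RTCPInstance::createNew(
    envir(),        // the UsageEnvironment
    rtcpGroupsock,  // a Groupsock on the RTP port + 1 (placeholder)
    500,            // estimated total session bandwidth, in kbps
    cname,          // CNAME string identifying this sender
    videoSink,      // the RTPSink whose stream is being reported on
    NULL);          // RTPSource*: NULL, since the server only sends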

Is there any way to delay the video stream without using an RTCPInstance?
If not, where should I create it, and where should I store it?

If you need any further information, please let me know.

Diego
