[Live-devel] Audio drift with live source

Mukherjee, Debargha debargha.mukherjee at hp.com
Thu Dec 17 11:37:08 PST 2009


Hi,

I am receiving uncompressed audio and video from a live DirectShow source, encoding them with ffmpeg, and streaming them out using an RTSP server built on classes derived from live555's OnDemandServerMediaSubsession. I then play the feed on a remote machine with VLC. The problem is that audio and video start out playing fine and well synchronized, but the audio then slowly drifts ahead; specifically, it plays faster than the video. Within 5-10 minutes the offset is clearly noticeable, and within about 15 minutes the audio stops playing altogether, with VLC reporting errors that the PTS is out of range. I am not sure whether this is a VLC issue, but I suspect I am doing something wrong with the timestamps.
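
For context, my video subsession looks roughly like this (a simplified sketch, not my exact code: the MPEG-4 framer/sink classes stand in for whichever codec is actually configured, LiveVideoSource is a stand-in name for my encoder source class, and the bitrate estimate is arbitrary):

#include "liveMedia.hh"

class LiveVideoSubsession : public OnDemandServerMediaSubsession {
public:
  LiveVideoSubsession(UsageEnvironment& env)
    : OnDemandServerMediaSubsession(env, True /*reuseFirstSource*/) {}

protected:
  virtual FramedSource* createNewStreamSource(unsigned /*clientSessionId*/,
                                              unsigned& estBitrate) {
    estBitrate = 500; // kbps; a rough estimate
    // LiveVideoSource is my FramedSource subclass; its deliverFrame()
    // is sketched further below.
    FramedSource* encoder = LiveVideoSource::createNew(envir());
    return MPEG4VideoStreamDiscreteFramer::createNew(envir(), encoder);
  }

  virtual RTPSink* createNewRTPSink(Groupsock* rtpGroupsock,
                                    unsigned char rtpPayloadTypeIfDynamic,
                                    FramedSource* /*inputSource*/) {
    return MPEG4ESVideoRTPSink::createNew(envir(), rtpGroupsock,
                                          rtpPayloadTypeIfDynamic);
  }
};

The audio subsession is analogous, with the corresponding audio framer and RTP sink.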

In my implementation of the deliverFrame() function in the source classes, I read uncompressed audio and video from ring buffers (which are filled by another thread), compress them, fill the delivery buffers accordingly, and then call FramedSource::afterGetting(this). I set fPresentationTime using gettimeofday(&fPresentationTime, NULL), and I set fDurationInMicroseconds to 1000000/30 for video and to the audio frame duration for audio. Occasionally deliverFrame() finds no data available in the ring buffers; in that case I call envir().taskScheduler().scheduleDelayedTask(...) with a small delay interval and return.
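
Concretely, the video source's deliverFrame() looks roughly like this (again simplified: RawFrame, readRawFrame() and encodeFrame() are stand-ins for my ring-buffer and ffmpeg code; the audio source does the same thing with its own frame duration):

#include "liveMedia.hh"
#include "GroupsockHelper.hh" // for gettimeofday()

// Stand-ins for my ring-buffer and ffmpeg encoder code.
struct RawFrame { unsigned char* data; unsigned size; };
Boolean readRawFrame(RawFrame& out);                  // non-blocking ring-buffer read
unsigned encodeFrame(RawFrame const& in, unsigned char* to, unsigned maxSize);

class LiveVideoSource : public FramedSource {
public:
  static LiveVideoSource* createNew(UsageEnvironment& env) {
    return new LiveVideoSource(env);
  }

protected:
  LiveVideoSource(UsageEnvironment& env) : FramedSource(env) {}

private:
  virtual void doGetNextFrame() { deliverFrame(); }

  static void deliverFrame0(void* clientData) {
    ((LiveVideoSource*)clientData)->deliverFrame();
  }

  void deliverFrame() {
    if (!isCurrentlyAwaitingData()) return; // the sink hasn't asked for data yet

    RawFrame raw;
    if (!readRawFrame(raw)) {
      // No uncompressed data in the ring buffer yet: retry shortly.
      envir().taskScheduler().scheduleDelayedTask(2000 /*usec*/,
                                                  deliverFrame0, this);
      return;
    }

    // Encode with ffmpeg and copy the result into live555's output buffer.
    fFrameSize = encodeFrame(raw, fTo, fMaxSize);

    // Timestamp with the wall-clock time at delivery, plus a fixed duration.
    gettimeofday(&fPresentationTime, NULL);
    fDurationInMicroseconds = 1000000 / 30; // 30 fps video

    FramedSource::afterGetting(this);
  }
};

Note that in this scheme fPresentationTime is the wall-clock time at the moment of delivery rather than at capture, and the audio source sets it the same way.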

Any help or clues would be appreciated.
Debargha.

