[Live-devel] Audio drift with live source

Ross Finlayson finlayson at live555.com
Thu Dec 17 17:18:33 PST 2009


>In my implementation of deliverFrame() function in the classes for 
>the sources, I read uncompressed audio and video from ring buffers 
>(which are filled by another thread), compress them, and then fill 
>the buffers accordingly before calling 
>FramedSource::afterGetting(this). I also set fPresentationTime using 
>gettimeofday(&fPresentationTime, NULL); and set 
>fDurationInMicroseconds to 1000000/30 for video and the audio frame 
>duration for audio.

If "fPresentationTime" is set properly (to accurate 
wall-clock-synchronized presentation times) for both the audio and 
video frames, then VLC should be synchronizing them properly at the 
receiving end.  (This is assuming, of course, that RTCP "SR" packets 
from the server are also reaching the client - which they should.)

So, I suggest taking a closer look at how "fPresentationTime" is set 
- for both audio and video frames - and making sure that the values 
are accurate.

Also, in principle, because you are reading from a live source 
(rather than from a prerecorded file), you need not set 
"fDurationInMicroseconds" (and so it will get set to its default 
value of 0).  However, this would mean that the situation that you 
describe below will become the norm:

>Occasionally, when the deliverFrame() function tries to read from 
>the ring buffers, it does not find data available. Then I call 
>envir().taskScheduler().scheduleDelayedTask(...) with a small delay 
>interval and return.

This is OK, provided that (once again) you are setting the 
presentation time properly.  Ideally, you should be recording the 
presentation time (obtained by calling "gettimeofday()") at the time 
that the data is encoded, not at the time that it gets read from your 
ring buffer by your "FramedSource" subclass.
-- 

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
