[Live-devel] Audio drift with live source
Mukherjee, Debargha
debargha.mukherjee at hp.com
Wed Jan 13 14:13:13 PST 2010
Hi Ross,
Having explored the issue further, I have narrowed down the problem, but not quite solved it yet. Any hints or help would be appreciated.
I am using MP2 audio encoding, for which the compressed frame size should be 576 bytes (32 kHz sampling rate, single channel). However, occasionally fMaxSize in deliverFrame() is less than 576 (around 240). When that happens, I write only 240 bytes to fTo and set fFrameSize to 240 instead of 576. These truncated frames do not seem to be handled properly by RTPSink: they are not transmitted at all, and they also cause a timestamp drift that builds up over time.
Is my handling of the case where fMaxSize is less than 576 correct? If not, what is the right way? Also, is there a way to prevent this case from happening in the first place?
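To make this concrete, here is roughly what my deliverFrame() does for audio (a sketch only; the class name, the buffer names, and the encoder call are placeholders, not my real code):

    void MyMP2AudioSource::deliverFrame() {
      if (!isCurrentlyAwaitingData()) return;

      // Placeholder for my encoder's output: one MP2 frame
      // (1152 samples at 32 kHz mono => 576 compressed bytes, 36 ms):
      unsigned char encodedFrame[576];
      unsigned encodedSize = encodeNextAudioFrame(encodedFrame); // placeholder

      if (encodedSize > fMaxSize) {
        // fMaxSize is occasionally only ~240 here; I currently just truncate.
        // Should I instead be reporting the overflow via fNumTruncatedBytes,
        // as the DeviceSource.cpp template in the library does?
        fFrameSize = fMaxSize;
      } else {
        fFrameSize = encodedSize;
      }
      memmove(fTo, encodedFrame, fFrameSize);

      gettimeofday(&fPresentationTime, NULL);
      fDurationInMicroseconds = 36000; // 1152 samples / 32000 Hz
      FramedSource::afterGetting(this);
    }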
Best Regards,
Debargha.
-----Original Message-----
From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson
Sent: Thursday, December 17, 2009 5:19 PM
To: LIVE555 Streaming Media - development & use
Subject: Re: [Live-devel] Audio drift with live source
>In my implementation of deliverFrame() function in the classes for
>the sources, I read uncompressed audio and video from ring buffers
>(which are filled by another thread), compress them, and then fill
>the buffers accordingly before calling
>FramedSource::afterGetting(this). I also set fPresentationTime using
>gettimeofday(&fPresentationTime, NULL); and set
>fDurationInMicroseconds to 1000000/30 for video and the audio frame
>duration for audio.
If "fPresentationTime" is set properly (to accurate
wall-clock-synchronized presentation times) for both the audio and
video frames, then VLC should be synchronizing them properly at the
receiving end. (This is assuming, of course, that RTCP "SR" packets
from the server are also reaching the client - which they should.)
So, I suggest taking a closer look at the setting of
"fPresentationTime" - for both audio and video frames - and making
sure that they are accurate.
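As a quick sanity check (purely illustrative, not library code), you could have each "FramedSource" subclass log its presentation time as frames are delivered, and verify that the audio and video times stay in step with each other and with wall-clock time:

    fprintf(stderr, "audio PT: %ld.%06ld\n",
            (long)fPresentationTime.tv_sec,
            (long)fPresentationTime.tv_usec);

with a corresponding line in the video source.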
Also, in principle, because you are reading from a live source
(rather than from a prerecorded file), you need not set
"fDurationInMicroseconds" (and so it will get set to its default
value of 0). However, this would mean that the situation that you
describe below will become the norm:
>Occasionally, when the deliverFrame() function tries to read from
>the ring buffers, it does not find data available. Then I call
>envir().taskScheduler().scheduleDelayedTask(...) with a small delay
>interval and return.
This is OK, provided that (once again) you are setting the
presentation time properly. Ideally, you should be recording the
presentation time (obtained by calling "gettimeofday()") at the time
that the data is encoded, not at the time that it gets read from your
ring buffer by your "FramedSource" subclass.
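For example, something like the following (a sketch only; the ring-buffer type, its "pop()" call, and the class names are hypothetical, not part of the library):

    struct EncodedFrame {
      unsigned char data[100000];  // hypothetical maximum frame size
      unsigned size;
      struct timeval captureTime;  // filled in, via gettimeofday(), by the
                                   // encoding thread at encode time
    };

    static void retryDeliverFrame(void* clientData) {
      ((MySource*)clientData)->deliverFrame();
    }

    void MySource::deliverFrame() {
      if (!isCurrentlyAwaitingData()) return;

      EncodedFrame frame;
      if (!myRingBuffer.pop(frame)) {  // hypothetical ring-buffer API
        // No encoded data available yet; retry shortly:
        envir().taskScheduler().scheduleDelayedTask(5000 /*microseconds*/,
            (TaskFunc*)retryDeliverFrame, this);
        return;
      }

      if (frame.size > fMaxSize) {
        fFrameSize = fMaxSize;
        fNumTruncatedBytes = frame.size - fMaxSize;
      } else {
        fFrameSize = frame.size;
      }
      memmove(fTo, frame.data, fFrameSize);

      // Use the time recorded when the frame was encoded, not 'now':
      fPresentationTime = frame.captureTime;
      FramedSource::afterGetting(this);
    }

The key point is that "captureTime" is recorded by the encoding thread, so any delay in the ring buffer, or in the delayed-task retry, does not perturb the presentation times.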
--
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
_______________________________________________
live-devel mailing list
live-devel at lists.live555.com
http://lists.live555.com/mailman/listinfo/live-devel