[Live-devel] Framed Source Issue

Ross Finlayson finlayson at live555.com
Wed Mar 18 14:58:26 PDT 2009


>I am having an issue on the streaming-out side. The audio and video 
>encoders read raw data from shared buffers using two derived 
>FramedSource classes modeled after DeviceSource.cpp. The 
>deliverFrame() function in these derived classes reads raw audio and 
>video from the respective shared buffers, encodes them using the 
>ffmpeg libraries, fills the buffers and sets the other parameters 
>appropriately before returning. Occasionally, when a shared buffer 
>is accessed for reading, there isn't enough data available to read, 
>possibly due to jitter in processing time on the write side of the 
>shared buffers. What is the right action in that case?

If your "doGetNextFrame()" implementation can't deliver data (to its 
downstream object) immediately, it should instead just return 
(*without* calling "FramedSource::afterGetting()", because you 
haven't delivered any data in this case).  In this case, the future 
availability of sufficient data must be handled via an event in the 
event loop (delayed 'polling' using 
"TaskScheduler::scheduleDelayedTask()" is one way to do this, if you 
can't make the arrival of new data an 'event' in any other way).

>  If my buffer reader waited a few milliseconds until there is enough 
>data available to read (by using events or otherwise), the 
>receiver-side VLC player freezes. If I return with fFrameSize = 0

No, don't do this.

>, the application crashes. The only thing that seems to work is if I 
>re-encode the previous frame (for video) and encode all-zero data (for 
>audio), and fill up the buffers and other parameters the normal 
>way. Even in this case, the receiving VLC player freezes every few 
>minutes or so.

This 'freezing' suggests one of two possibilities:
1/ You might not be setting presentation times correctly on the data, 
before it gets fed to a "MultiFramedRTPSink" (subclass).  Your 
"doGetNextFrame()" implementation should set "fPresentationTime", 
before calling "FramedSource::afterGetting()".
2/ Your "doGetNextFrame()" implementation might not be setting 
"fDurationInMicroseconds" correctly, thereby causing the downstream 
"MultiFramedRTPSink" to delay excessively (after sending an RTP 
packet) before requesting new data.  Because you are streaming from a 
live source - rather than from a prerecorded file - you might be able 
to get away with not setting "fDurationInMicroseconds" at all (which 
will cause it to keep its default value of zero).

I hope this helps.
-- 

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
