[Live-devel] Limiting frame rate for a source

Jan Ekholm jan.ekholm at d-pointer.com
Tue Apr 22 09:01:20 PDT 2014


On 22 Apr 2014, at 18:28, Ross Finlayson <finlayson at live555.com> wrote:

>> I have a USB camera that I can stream as MJPEG using a JPEGVideoSource subclass.
>> It all works nicely and the frames are streamed and received ok. I would however like
>> to be able to limit the frame rate of the stream, as it now seems to be 20+ fps. In this
>> case it's way too high as the use case is a surveillance camera that grabs a big overview
>> image, not video conferencing. I tried looking at the FAQ and the examples but didn't
>> see anything.
> 
> I'm not sure I understand your question.  You have written your own media source class (in this case, a subclass of "JPEGVideoSource") that delivers encoded (JPEG) frames.  You want to reduce its frame rate - i.e., how soon it calls "FramedSource::afterGetting()" in response to each call to "doGetNextFrame()".  So just do it.  This is your code :-)

That I can do, of course. But isn't Live555 single-threaded? If I limit the frame rate to, say,
1 fps by waiting inside my source, won't I basically block the entire application? I have several
sources that the same application needs to handle. Ideally I would like to multithread my
application, but I've already rewritten the core loop once to be based on Live555's main loop
mechanism.

That said, I was hoping that there was some built-in rate limiting that would cause
doGetNextFrame() to be called only when a new frame is actually needed.

Or could I perhaps manipulate the presentation time, so that Live555 would determine from it
that it has enough frames for now and throttle a bit?
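From skimming MultiFramedRTPSink, my impression (which may well be wrong) is that it is
fDurationInMicroseconds, not the presentation time, that the sink uses to decide how soon to
ask the source for the next frame. So perhaps setting it before completing delivery is the
built-in throttle I'm looking for:

// Inside the source's delivery code, before handing the frame over:
gettimeofday(&fPresentationTime, NULL); // wall-clock stamp of this frame
fDurationInMicroseconds = 1000000;      // "this frame lasts 1 s" => ~1 fps
FramedSource::afterGetting(this);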

>> Also, is there some way to know when all clients have disconnected from an RTSP source
>> so that I could stop grabbing and encoding frames?
> 
> This should happen automatically.  I.e., when the last concurrent RTSP client has disconnected (or timed out), then your media source class's destructor will get called.  Therefore, you should write your media source class's destructor so that it stops grabbing/encoding.
> 
> Don't forget to have your subclass of "OnDemandServerMediaSubsession" set the "reuseFirstSource" parameter to True when it calls the "OnDemandServerMediaSubsession" constructor.  This will ensure that no more than one instance of your media source class will ever be created concurrently, regardless of the number of concurrent RTSP clients.

Yes, that happens for the H264 source, but not for the MJPEG one. I have reuseFirstSource set to
True, as the camera can only be opened once. I will have to dig deeper to see why the MJPEG source
does not stop. Both OnDemandServerMediaSubsession subclasses are very similar, apart from the H264
one having an extra dummy sink to get the aux data.
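
For reference, the MJPEG subsession is essentially this shape (a sketch; MJPEGSubsession,
MyJPEGSource and stopCapture() are my placeholder names):

#include "liveMedia.hh"

class MJPEGSubsession : public OnDemandServerMediaSubsession {
public:
  MJPEGSubsession(UsageEnvironment& env)
    : OnDemandServerMediaSubsession(env, True /*reuseFirstSource*/) {}

protected:
  virtual FramedSource* createNewStreamSource(unsigned /*clientSessionId*/,
                                              unsigned& estBitrate) {
    estBitrate = 2000; // kbps; a rough guess for large JPEG overview frames
    return new MyJPEGSource(envir());
  }

  virtual RTPSink* createNewRTPSink(Groupsock* rtpGroupsock,
                                    unsigned char /*rtpPayloadTypeIfDynamic*/,
                                    FramedSource* /*inputSource*/) {
    // JPEG has a static RTP payload type, so the sink takes no payload type:
    return JPEGVideoRTPSink::createNew(envir(), rtpGroupsock);
  }
};

// With reuseFirstSource set, the source's destructor runs once the last
// client has disconnected, so that is where the camera should be released:
MyJPEGSource::~MyJPEGSource() {
  stopCapture(); // placeholder: close the USB camera, stop encoding
}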

-- 
Jan Ekholm
jan.ekholm at d-pointer.com