[Live-devel] Questions about live555:

saravanan saravanan.s at fossilshale.com
Mon Jan 28 05:11:58 PST 2013


Hi,

 

I am able to play MJPEG (640x480) and AAC streams fine in separate RTSP
server media sessions. However, if I try to play both streams through a single
RTSP server media session, I get the following results:

 

- Sometimes the video is very slow but the audio is OK
- Sometimes the video does not play at all but the audio is OK
- Sometimes neither the audio nor the video plays at all

 

I am using OnDemandServerMediaSubsession for my testing. Your help would be
appreciated.
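
For reference, here is roughly how I add the two subsessions to a single
"ServerMediaSession" (the subsession class names below are just placeholders
for my own "OnDemandServerMediaSubsession" subclasses, and "env"/"rtspServer"
are the usual setup objects):

  // Sketch of the combined-session setup; the subsession class names are
  // placeholders for my own OnDemandServerMediaSubsession subclasses.
  ServerMediaSession* sms = ServerMediaSession::createNew(*env, "avStream",
      "avStream", "combined MJPEG + AAC session");

  // Both subsessions are added to the SAME ServerMediaSession:
  sms->addSubsession(MJPEGLiveServerMediaSubsession::createNew(*env));
  sms->addSubsession(AACLiveServerMediaSubsession::createNew(*env));

  rtspServer->addServerMediaSession(sms);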

 

Thanks,

Saravanan S

 

 

From: saravanan [mailto:saravanan.s at fossilshale.com] 
Sent: Friday, January 25, 2013 9:13 PM
To: 'LIVE555 Streaming Media - development & use'
Subject: Re: [Live-devel] Questions about live555:

 

Dear Ross Finlayson,

 

Thanks. It now works fine after setting the flag to True, as you suggested.

 

Regards,

Saravanan S

 

From: Ross Finlayson [mailto:finlayson at live555.com] 
Sent: Friday, January 25, 2013 7:18 PM
To: LIVE555 Streaming Media - development & use
Subject: Re: [Live-devel] Questions about live555:

 

I am using the LIVE555 server to stream MJPEG data captured from a device.
We derived a class "JPEGDeviceSource" from "JPEGVideoSource", and we read the
data from a shared buffer (filled by the device) in doGetNextFrame() of
"JPEGDeviceSource".
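
A stripped-down sketch of the class (the type/quality values and the
sharedBufferGetFrame() helper are placeholders, not my exact code):

  #include "JPEGVideoSource.hh"
  #include <sys/time.h>

  // Placeholder for the real call that copies one complete JPEG frame from
  // the device's shared buffer and returns its size in bytes:
  extern unsigned sharedBufferGetFrame(unsigned char* to, unsigned maxSize);

  class JPEGDeviceSource: public JPEGVideoSource {
  public:
    static JPEGDeviceSource* createNew(UsageEnvironment& env) {
      return new JPEGDeviceSource(env);
    }

  protected:
    JPEGDeviceSource(UsageEnvironment& env): JPEGVideoSource(env) {}

    // Parameters that JPEGVideoSource needs for the RTP/JPEG header
    // (example values; width/height are in units of 8 pixels):
    virtual u_int8_t type()    { return 1; }
    virtual u_int8_t qFactor() { return 75; }
    virtual u_int8_t width()   { return 640/8; }
    virtual u_int8_t height()  { return 480/8; }

    virtual void doGetNextFrame() {
      // Copy one complete JPEG frame from the shared buffer into "fTo"
      // (truncation handling omitted for brevity):
      fFrameSize = sharedBufferGetFrame(fTo, fMaxSize);
      gettimeofday(&fPresentationTime, NULL);

      // Tell the downstream object that a frame is ready:
      FramedSource::afterGetting(this);
    }
  };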

 

Everything works fine for a single RTSP client session, but if we open a second
RTSP client session the frame rate is halved. The GetBuffer() call to the
device always returns a new frame, so a single session works fine at the full
frame rate (30 fps). If we open one more session, the frame rate drops to half
(15 fps).

 

Once I call GetBuffer(), I get the complete frame. Now I want to make sure
that this frame data is available to all the sessions opened with this server.
How do I achieve this?

 

You haven't said anything about your server implementation, but I presume
you have implemented your own subclass of "OnDemandServerMediaSubsession".

 

Your subclass's constructor - when it calls the
"OnDemandServerMediaSubsession" constructor - should make sure that the
"reuseFirstSource" parameter is "True".  You do this because you are
streaming from a live source, rather than from a file.  Setting
"reuseFirstSource" to "True" tells the server to use a single input stream
as a data source, regardless of how many clients are currently accessing the
server.
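
For example, the subclass might look something like this (a minimal sketch;
the class name, bitrate estimate, and use of the "JPEGDeviceSource" class
described above are illustrative, not a complete implementation):

  #include "liveMedia.hh"

  class JPEGDeviceMediaSubsession: public OnDemandServerMediaSubsession {
  public:
    static JPEGDeviceMediaSubsession* createNew(UsageEnvironment& env) {
      return new JPEGDeviceMediaSubsession(env);
    }

  protected:
    JPEGDeviceMediaSubsession(UsageEnvironment& env)
      // "True" here is the "reuseFirstSource" parameter: all clients will
      // share the single live input stream.
      : OnDemandServerMediaSubsession(env, True) {}

    virtual FramedSource* createNewStreamSource(unsigned /*clientSessionId*/,
                                                unsigned& estBitrate) {
      estBitrate = 2000; // kbps; a rough estimate for 640x480 MJPEG
      return JPEGDeviceSource::createNew(envir());
    }

    virtual RTPSink* createNewRTPSink(Groupsock* rtpGroupsock,
                                      unsigned char /*rtpPayloadTypeIfDynamic*/,
                                      FramedSource* /*inputSource*/) {
      // JPEG/RTP uses the static payload type 26:
      return JPEGVideoRTPSink::createNew(envir(), rtpGroupsock);
    }
  };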

 

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/ 

 
