[Live-devel] Subclassing OnDemandServerMediaSubsession for Live Video Sources to stream h264

Bhawesh Kumar Choudhary bhawesh at vizexperts.com
Wed Jan 8 07:23:52 PST 2014


Hi,

I am using the Live555 library to create an RTSP stream from a live device source. I
have read the FAQ and the mailing list and subclassed all the required classes:

1.  FramedSource (using the DeviceSource.cpp model)

2.  OnDemandServerMediaSubsession (using the H264VideoFileServerMediaSubsession
model); a declaration-only sketch of both subclasses follows below
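
For reference, here is that declaration-only sketch of the two subclasses (the names
LiveDeviceSource and LiveH264MediaSubsession are placeholders for my actual classes):

#include "FramedSource.hh"
#include "OnDemandServerMediaSubsession.hh"

// Device source, following the DeviceSource.cpp model:
class LiveDeviceSource: public FramedSource {
public:
  static LiveDeviceSource* createNew(UsageEnvironment& env);
  void signalNewFrameData(); // called whenever the device delivers a new frame

protected:
  LiveDeviceSource(UsageEnvironment& env);
  virtual ~LiveDeviceSource();

private:
  virtual void doGetNextFrame();
  static void deliverFrame0(void* clientData);
  void deliverFrame();

  EventTriggerId fEventTriggerId; // created with createEventTrigger(deliverFrame0)
};

// Subsession, following the H264VideoFileServerMediaSubsession model:
class LiveH264MediaSubsession: public OnDemandServerMediaSubsession {
public:
  static LiveH264MediaSubsession* createNew(UsageEnvironment& env, Boolean reuseFirstSource);

protected:
  LiveH264MediaSubsession(UsageEnvironment& env, Boolean reuseFirstSource);

  virtual FramedSource* createNewStreamSource(unsigned clientSessionId, unsigned& estBitrate);
  virtual RTPSink* createNewRTPSink(Groupsock* rtpGroupsock,
                                    unsigned char rtpPayloadTypeIfDynamic,
                                    FramedSource* inputSource);

private:
  FramedSource* fDeviceSource; // cached because reuseFirstSource is set
};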

But there are a few things I have not been able to figure out:

1.  In each call to doGetNextFrame() in my device source I assume that no frame
data is available yet, and I simply return. Instead, whenever I receive data I
trigger an event which schedules getNextFrame() on my device source (see the
sketch below). Is any status check (e.g. whether the event loop is waiting or
running) required for the Live555 event loop to schedule this new-data task from
my device source's signalNewFrameData() function?
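
For context, this is roughly the pattern I followed from the DeviceSource.cpp model
(fEventTriggerId is created in my constructor with
envir().taskScheduler().createEventTrigger(deliverFrame0)):

void LiveDeviceSource::doGetNextFrame() {
  // In my current code I assume that no frame data is available yet and simply
  // return; delivery happens later, from the event triggered below.  (The
  // DeviceSource.cpp model would also deliver immediately here if a frame
  // were already queued.)
}

void LiveDeviceSource::signalNewFrameData() {
  // Called when the device produces a new frame (possibly from another thread).
  // triggerEvent() hands control to the Live555 event loop, which then calls
  // deliverFrame0() from within the loop.
  envir().taskScheduler().triggerEvent(fEventTriggerId, this);
}

void LiveDeviceSource::deliverFrame0(void* clientData) {
  ((LiveDeviceSource*)clientData)->deliverFrame();
}

void LiveDeviceSource::deliverFrame() {
  if (!isCurrentlyAwaitingData()) return; // the downstream object has not asked for data yet

  // ... copy at most fMaxSize bytes of the new frame into fTo; set fFrameSize,
  //     fNumTruncatedBytes and fPresentationTime accordingly ...

  FramedSource::afterGetting(this); // tell the downstream object that data is ready
}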

2.  In my OnDemandServerMediaSubsession subclass I have implemented both required
functions, createNewStreamSource() and createNewRTPSink() (roughly as in the
sketch at the end of this message). Since I set reuseFirstSource to True in my
OnDemandServerMediaSubsession, I keep a reference to my device source in a data
member. But the following code

/// Function in OnDemandServerMediaSubsession that causes the application to crash

char const* OnDemandServerMediaSubsession::sdpLines() {
  if (fSDPLines == NULL) {
    // We need to construct a set of SDP lines that describe this
    // subsession (as a unicast stream).  To do so, we first create
    // dummy (unused) source and "RTPSink" objects,
    // whose parameters we use for the SDP lines:
    unsigned estBitrate;
    FramedSource* inputSource = createNewStreamSource(0, estBitrate); // will return only my device source
    if (inputSource == NULL) return NULL; // file not found

    struct in_addr dummyAddr;
    dummyAddr.s_addr = 0;
    Groupsock dummyGroupsock(envir(), dummyAddr, 0, 0);
    unsigned char rtpPayloadType = 96 + trackNumber()-1; // if dynamic

    RTPSink* dummyRTPSink
      = createNewRTPSink(&dummyGroupsock, rtpPayloadType, inputSource);

    setSDPLinesFromRTPSink(dummyRTPSink, inputSource, estBitrate);
    Medium::close(dummyRTPSink);
    closeStreamSource(inputSource); // deallocates my device source and crashes the
                                    // application; commenting out this line and the
                                    // one above makes the application run
  }

  return fSDPLines;
}

Is commenting out those lines the right solution, or am I doing something wrong
elsewhere? Also, is it required to store the RTPSink returned by
createNewRTPSink() in my OnDemandServerMediaSubsession subclass?
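
For reference, my subsession implementation looks roughly like this (placeholder
names again; the bitrate value and the choice of H264VideoStreamDiscreteFramer are
just illustrative assumptions about my pipeline):

#include "H264VideoStreamDiscreteFramer.hh"
#include "H264VideoRTPSink.hh"

FramedSource* LiveH264MediaSubsession::createNewStreamSource(
    unsigned /*clientSessionId*/, unsigned& estBitrate) {
  estBitrate = 500; // kbps; rough estimate used for the SDP "b=" line

  if (fDeviceSource == NULL) {
    fDeviceSource = LiveDeviceSource::createNew(envir());
  }
  // Wrap the cached device source in a discrete framer so that the RTP sink
  // receives individual H.264 NAL units:
  return H264VideoStreamDiscreteFramer::createNew(envir(), fDeviceSource);
}

RTPSink* LiveH264MediaSubsession::createNewRTPSink(
    Groupsock* rtpGroupsock, unsigned char rtpPayloadTypeIfDynamic,
    FramedSource* /*inputSource*/) {
  return H264VideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic);
}

Since createNewStreamSource() hands back the cached fDeviceSource (wrapped in the
framer), I believe the closeStreamSource() call in sdpLines() is what ends up
deallocating it.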

 

Thanks and Regards,

Bhawesh Kumar Choudhary

Graphics Engineer

VizExperts India Pvt. Ltd.

 

 
