[Live-devel] [Mirasys] Live555 RTSP server questions

Victor Vitkovskiy victor.vitkovskiy at mirasys.com
Wed Jan 12 02:44:30 PST 2022


Hello Ross, 

I have tried to subclass OnDemandServerMediaSubsession and create my own H264FramedSource, but I get an exception here:
void FramedSource::getNextFrame(unsigned char* to, unsigned maxSize, afterGettingFunc* afterGettingFunc, void* afterGettingClientData, onCloseFunc* onCloseFunc, void* onCloseClientData) {
  // Make sure we're not already being read:
  if (fIsCurrentlyAwaitingData) {
    envir() << "FramedSource[" << this << "]::getNextFrame(): attempting to read more than once at the same time!\n";
    envir().internalError();
  }

fIsCurrentlyAwaitingData is true, so envir().internalError() is called.

The call stack points to H264or5VideoRTPSink:
>	Live555test.exe!FramedSource::getNextFrame(unsigned char * to, unsigned int maxSize, void(*)(void *, unsigned int, unsigned int, timeval, unsigned int) afterGettingFunc, void * afterGettingClientData, void(*)(void *) onCloseFunc, void * onCloseClientData) Line 64	C++
 	Live555test.exe!H264or5VideoStreamDiscreteFramer::doGetNextFrame() Line 104	C++
 	Live555test.exe!FramedSource::getNextFrame(unsigned char * to, unsigned int maxSize, void(*)(void *, unsigned int, unsigned int, timeval, unsigned int) afterGettingFunc, void * afterGettingClientData, void(*)(void *) onCloseFunc, void * onCloseClientData) Line 79	C++
 	Live555test.exe!H264or5Fragmenter::doGetNextFrame() Line 182	C++
 	Live555test.exe!FramedSource::getNextFrame(unsigned char * to, unsigned int maxSize, void(*)(void *, unsigned int, unsigned int, timeval, unsigned int) afterGettingFunc, void * afterGettingClientData, void(*)(void *) onCloseFunc, void * onCloseClientData) Line 79	C++
 	Live555test.exe!MultiFramedRTPSink::packFrame() Line 226	C++
 	Live555test.exe!MultiFramedRTPSink::buildAndSendPacket(bool isFirstPacket) Line 200	C++
 	Live555test.exe!MultiFramedRTPSink::continuePlaying() Line 160	C++
 	Live555test.exe!H264or5VideoRTPSink::continuePlaying() Line 128	C++
 	Live555test.exe!MediaSink::startPlaying(MediaSource & source, void(*)(void *) afterFunc, void * afterClientData) Line 79	C++
 	Live555test.exe!StreamState::startPlaying(Destinations * dests, unsigned int clientSessionId, void(*)(void *) rtcpRRHandler, void * rtcpRRHandlerClientData, void(*)(void *, unsigned char) serverRequestAlternativeByteHandler, void * serverRequestAlternativeByteHandlerClientData) Line 560	C++
 	Live555test.exe!OnDemandServerMediaSubsession::startStream(unsigned int clientSessionId, void * streamToken, void(*)(void *) rtcpRRHandler, void * rtcpRRHandlerClientData, unsigned short & rtpSeqNum, unsigned int & rtpTimestamp, void(*)(void *, unsigned char) serverRequestAlternativeByteHandler, void * serverRequestAlternativeByteHandlerClientData) Line 219	C++
 	Live555test.exe!RTSPServer::RTSPClientSession::handleCmd_PLAY(RTSPServer::RTSPClientConnection * ourClientConnection, ServerMediaSubsession * subsession, const char * fullRequestStr) Line 1905	C++
 	Live555test.exe!RTSPServer::RTSPClientSession::handleCmd_withinSession(RTSPServer::RTSPClientConnection * ourClientConnection, const char * cmdName, const char * urlPreSuffix, const char * urlSuffix, const char * fullRequestStr) Line 1696	C++
 	Live555test.exe!RTSPServer::RTSPClientConnection::handleRequestBytes(int newBytesRead) Line 871	C++
 	Live555test.exe!GenericMediaServer::ClientConnection::incomingRequestHandler() Line 323	C++
 	Live555test.exe!GenericMediaServer::ClientConnection::incomingRequestHandler(void * instance, int __formal) Line 304	C++
 	[External Code]	
 	Live555test.exe!main(int argc, char * * argv) Line 125	C++

My H264FramedSource is never called at all, so this must be happening in a higher-level component.
My H264ServerMediaSubsession is simple and looks like this:

H264ServerMediaSubsession* H264ServerMediaSubsession::createNew(UsageEnvironment& env, Boolean reuseFirstSource) {
    return new H264ServerMediaSubsession(env, reuseFirstSource);
}

H264ServerMediaSubsession::H264ServerMediaSubsession(UsageEnvironment& env, Boolean reuseFirstSource) :
    OnDemandServerMediaSubsession(env, reuseFirstSource)
{
    mFramedSource = H264FramedSource::createNew(env);
}

H264ServerMediaSubsession::~H264ServerMediaSubsession() {
    Medium::close(mFramedSource);
}

FramedSource* H264ServerMediaSubsession::createNewStreamSource(unsigned /*clientSessionId*/, unsigned& estBitrate) {
    estBitrate = 500; // kbps, estimate

    // Create a framer for the Video Elementary Stream:
    return H264VideoStreamDiscreteFramer::createNew(envir(), mFramedSource);
}

RTPSink* H264ServerMediaSubsession::createNewRTPSink(Groupsock* rtpGroupsock, unsigned char rtpPayloadTypeIfDynamic, FramedSource* /*inputSource*/) {

    u_int8_t sps[] = { 0x00, 0x00, 0x01, 0x67, 0x42, 0x00, 0x29, 0xE2, 0x90, 0x0A, 0x00, 0xB7, 0x60, 0x2D, 0xC0, 0x40, 0x40, 0x78, 0x78, 0x91, 0x15 };
    u_int8_t pps[] = { 0x00, 0x00, 0x01, 0x68, 0xCE, 0x3C, 0x80 };

    return H264VideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic, sps, sizeof(sps), pps, sizeof(pps));
}

The server media session is created and added to the RTSPServer as in the example:
    ServerMediaSession* sms = ServerMediaSession::createNew(*env, streamName, streamName, descriptionString);
    sms->addSubsession(H264ServerMediaSubsession::createNew(*env, reuseFirstSource));
    rtspServer->addServerMediaSession(sms);
    announceURL(rtspServer, sms);

Could you please tell me what I am doing wrong? 

Best regards,
-----------------------------------------
Victor Vitkovskiy
Senior software developer
mailto: victor.vitkovskiy at mirasys.com
www.mirasys.com


-----Original Message-----
From: live-devel <live-devel-bounces at us.live555.com> On Behalf Of Ross Finlayson
Sent: Wednesday, 12 January 2022 12:07
To: LIVE555 Streaming Media - development & use <live-devel at us.live555.com>
Subject: Re: [Live-devel] [Mirasys] Live555 RTSP server questions

> On Jan 12, 2022, at 10:29 PM, Victor Vitkovskiy <victor.vitkovskiy at mirasys.com> wrote:
>
> Hello Ross,
>
> Thank you for your answers.
>
> Still I have some opened questions:
>>> You don’t need to be concerned at all with the internals of the LIVE555 code to do what you want here.
> This doesn't give me any information about how to do this :).
> If I don't need to subclass RTSPServer, then how can I detect when a new client connects / disconnects?

You don’t need to do this.  Our RTSP server code does this (detect/manage the connection/disconnection of clients) for you.  All you need to do is write a subclass of “FramedSource” that delivers a frame of data each time it’s asked (via “doGetNextFrame()”), and write your own subclass of “OnDemandServerMediaSubsession” (implementing the virtual functions "createNewStreamSource()” and “createNewRTPSink()”).  That’s all.  You don’t need to concern yourself with the RTSP protocol, or the connection/disconnection of RTSP clients, or RTP, or RTCP.  Our code does all of that for you.
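
For reference, a minimal sketch of such a “FramedSource” subclass might look like the following (modelled on “liveMedia/DeviceSource.cpp”; “fetchNALUnit()” here is a hypothetical stand-in for however your encoder hands you a NAL unit):

        #include "FramedSource.hh"
        #include "GroupsockHelper.hh" // for gettimeofday()
        #include <string.h>           // for memmove()

        // Hypothetical encoder hook: returns one H.264 NAL unit
        // (WITHOUT a 0x00000001 start code) and sets its size:
        extern u_int8_t const* fetchNALUnit(unsigned& nalSize);

        class MyH264FramedSource: public FramedSource {
        public:
            static MyH264FramedSource* createNew(UsageEnvironment& env) {
                return new MyH264FramedSource(env);
            }

        protected:
            MyH264FramedSource(UsageEnvironment& env) : FramedSource(env) {}

        private: // redefined virtual function:
            virtual void doGetNextFrame() {
                // (Assumes a NAL unit is available right now; if not, you
                // would instead arrange - e.g., via an event trigger - for
                // delivery to happen later, from the event loop.)
                unsigned nalSize;
                u_int8_t const* nal = fetchNALUnit(nalSize);

                // Deliver the data, truncating it if it's too big:
                if (nalSize > fMaxSize) {
                    fNumTruncatedBytes = nalSize - fMaxSize;
                    fFrameSize = fMaxSize;
                } else {
                    fNumTruncatedBytes = 0;
                    fFrameSize = nalSize;
                }
                memmove(fTo, nal, fFrameSize);
                gettimeofday(&fPresentationTime, NULL);

                // Tell the downstream object that data is now available:
                FramedSource::afterGetting(this);
            }
        };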


> It is clear to me how to create my H264FramedSource, but it is not clear how to use it at the higher levels.
> testOnDemandRTSPServer example use this code:
>        ServerMediaSession* sms = ServerMediaSession::createNew(*env, streamName, streamName, descriptionString);
>        sms->addSubsession(H264VideoFileServerMediaSubsession::createNew(*env, inputFileName, reuseFirstSource));
>        rtspServer->addServerMediaSession(sms);
> So I need to subclass H264VideoFileServerMediaSubsession and override those two virtual functions (createNewStreamSource and createNewRTPSink); is this correct?
> Or do I need to subclass OnDemandServerMediaSubsession

It’s probably best for you to define a subclass of “OnDemandServerMediaSubsession”, and implement the "createNewStreamSource()” and “createNewRTPSink()” virtual functions.  That’s all.
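
For example, your implementation of the “createNewStreamSource()” virtual function might look something like this (a sketch only, assuming a “MyH264FramedSource” class of the kind outlined above). Note that this function gets called separately for each new client session (unless “reuseFirstSource” is set), so it should create a new input source each time it is called:

        FramedSource* MyH264VideoServerMediaSubsession::createNewStreamSource(
                        unsigned /*clientSessionId*/, unsigned& estBitrate) {
                estBitrate = 500; // kbps, estimate

                // Create a fresh source for this call, and wrap it in a
                // discrete framer, because the source delivers one NAL unit
                // at a time (without start codes):
                return H264VideoStreamDiscreteFramer::createNew(envir(),
                                MyH264FramedSource::createNew(envir()));
        }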


> and do the same thing (like reading the SDP information from the H.264 stream)?

You shouldn't need to concern yourself with this.  Again, that’s our job.  However, for streaming H.264, it’s best if you tell your “H264VideoRTPSink” object (that you would create in your implementation of the “createNewRTPSink()” virtual function) about your H.264 stream’s SPS and PPS NAL units.   See “liveMedia/include/H264VideoRTPSink.hh”.  I.e., you should create your “H264VideoRTPSink” object using one of the forms of “createNew()” that take SPS and PPS NAL unit parameters.  E.g.

        RTPSink* MyH264VideoServerMediaSubsession::createNewRTPSink(Groupsock* rtpGroupsock,
                        unsigned char rtpPayloadTypeIfDynamic,
                        FramedSource* /*inputSource*/) {
                return H264VideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic,
                                SPS_NAL_unit, size_of_SPS_NAL_unit,
                                PPS_NAL_unit, size_of_PPS_NAL_unit);
        }

where “SPS_NAL_unit” and “PPS_NAL_unit” are binary data that you would get from your encoder.  If you don’t know the SPS and PPS NAL units, then you could instead subclass from “H264VideoFileServerMediaSubsession”, and rely upon that code to automatically read your input source to figure out the SPS and PPS NAL units (assuming that they’re present in the stream), but that’s messier.
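
Note that “SPS_NAL_unit” and “PPS_NAL_unit” should be the raw NAL unit bytes only; they should not include a 0x00 0x00 0x01 start-code prefix. E.g. (with hypothetical byte values):

        u_int8_t SPS_NAL_unit[] = { 0x67, 0x42, 0x00, 0x29, /*...*/ }; // begins with the SPS NAL header byte (0x67)
        u_int8_t PPS_NAL_unit[] = { 0x68, 0xCE, 0x3C, 0x80 };          // begins with the PPS NAL header byte (0x68)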


Ross Finlayson
Live Networks, Inc.
http://www.live555.com/


_______________________________________________
live-devel mailing list
live-devel at lists.live555.com
http://lists.live555.com/mailman/listinfo/live-devel


