[Live-devel] Proper use of StreamReplicator

Jan Ekholm jan.ekholm at d-pointer.com
Sat Oct 18 11:49:25 PDT 2014


On 17 Oct 2014, at 21:48, Ross Finlayson <finlayson at live555.com> wrote:

> The data that you feed to “JPEGVideoRTPSink” MUST BE a subclass of “JPEGVideoSource”.  It can’t just redefine “isJPEGVideoSource()” to return True (or just do some type casting hack).  The reason for this is that “JPEGVideoRTPSink” needs to know the “type”, “qFactor”, “width”, and “height” of the frames that it receives (so it can pack these values into the appropriate fields of the outgoing RTP packet).
> 
> So, you’ll need to define your own subclass of “JPEGVideoSource” - e.g., called “ReplicaJPEGVideoSource”.  This must take as input another “FramedSource” object (a “StreamReplica”, in your case), and must implement the following (pure) virtual functions: “doGetNextFrame()”, “type()”, “qFactor()”, “width()”, “height()”.
> 
> Implementing “doGetNextFrame()” is easy; just call “getNextFrame()” on the input (“StreamReplica”) object.
> 
> To implement the other virtual functions (“type()”, “qFactor()”, “width()”, “height()”), you’ll need to have these four parameters added to each frame of data somehow.  I.e., you’ll need to modify your original JPEG video source object - i.e., the one that you feed into the “StreamReplicator” - to add a header at the start (or at the end) that contains these four values.
> 
> These four values will presumably also be useful to the other replica - the one that you feed into a “FileSink”.
> 
> Your “ReplicaJPEGVideoSource” class should also make sure that its destructor calls “Medium::close()” on the input source (a “StreamReplica”), and should also reimplement the “doStopGettingFrames()” virtual function to call “stopGettingFrames()” on the input source.  (Note the implementation of “FramedFilter”, which does the same thing.  In fact, you *might* try having your “ReplicaJPEGVideoSource” class inherit from both “JPEGVideoSource” and “FramedFilter”, but I’m not sure whether or not that will work.  (I’m wary of multiple inheritance in C++, and haven’t used it at all in any of the LIVE555 code so far.))
> 
> Finally, you’ll need to modify your implementation of “createNewStreamSource()” to not just return a new “StreamReplica”, but instead to feed this “StreamReplica” into a new “ReplicaJPEGVideoSource” object, and then return a pointer to this new “ReplicaJPEGVideoSource” object.

That approach could work, but there are some obstacles along the way. First, let me paste the class I use; the discussion follows after the class:

class ReplicaJPEGVideoSource : public JPEGVideoSource {

public:

    // 'replica' is the StreamReplica that frames should come from, 'source' is the
    // original camera source, used to query the JPEG parameters
    ReplicaJPEGVideoSource (FramedSource * replica, MJpegFramedSource * source, UsageEnvironment& env) : JPEGVideoSource(env), m_replica(replica), m_source(source) {

    }

    virtual ~ReplicaJPEGVideoSource () {
        Medium::close( m_replica );
    }

    virtual u_int8_t type () {
        return m_source->type();
    }

    virtual u_int8_t qFactor () {
        return m_source->qFactor();
    }

    virtual u_int8_t width () {
        return m_source->width();
    }

    virtual u_int8_t height () {
        return m_source->height();
    }

    virtual void doGetNextFrame () {
        // neither of the commented-out attempts below works; see the discussion after the class
        //m_source->doGetNextFrame();

        //m_replica->getNextFrame( fTo, fMaxSize, fAfterGettingFunc, fAfterGettingClientData, fOnCloseFunc, fOnCloseClientData );
    }

    virtual void doStopGettingFrames() {
        // TODO: is this needed?
        JPEGVideoSource::doStopGettingFrames();

        // TODO: should this be the source or the replica?
        m_source->stopGettingFrames();
    }


protected:

    FramedSource * m_replica;
    MJpegFramedSource * m_source;
};
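
One thing I have not done yet is the per-frame header you describe; as you can see, my wrapper simply asks the original source
for type/qFactor/width/height, which only works because I keep a pointer to it. If I understand the header idea correctly it
would boil down to something like the helpers below. The 4-byte layout is purely my own invention for illustration, with width
and height in units of 8 pixels as JPEGVideoSource expects:

#include <string.h>
#include "NetCommon.h"

// Hypothetical 4-byte header written in front of every JPEG frame, so that the
// parameters survive the trip through the StreamReplicator:
//   byte 0: type, byte 1: qFactor, byte 2: width / 8, byte 3: height / 8
struct JpegFrameInfo {
    u_int8_t type;
    u_int8_t qFactor;
    u_int8_t width;     // in units of 8 pixels
    u_int8_t height;    // in units of 8 pixels
};

// Used by the original source when it copies a frame into its output buffer.
// Returns the total number of bytes written (header plus possibly truncated payload).
unsigned writeFrameWithHeader (u_int8_t* dst, unsigned dstMaxSize,
                               const JpegFrameInfo& info,
                               const u_int8_t* jpegData, unsigned jpegSize) {
    if ( dstMaxSize < 4 ) {
        return 0;
    }

    dst[0] = info.type;
    dst[1] = info.qFactor;
    dst[2] = info.width;
    dst[3] = info.height;

    unsigned payloadSize = jpegSize;
    if ( payloadSize > dstMaxSize - 4 ) {
        payloadSize = dstMaxSize - 4;   // truncate if the buffer is too small
    }

    memcpy( dst + 4, jpegData, payloadSize );
    return 4 + payloadSize;
}

// Used by each consumer (the RTP wrapper as well as the FileSink replica) once a
// frame has been delivered. Fills in 'info' and returns a pointer to the JPEG payload.
const u_int8_t* readFrameHeader (const u_int8_t* frame, unsigned frameSize,
                                 JpegFrameInfo& info, unsigned& payloadSize) {
    if ( frameSize < 4 ) {
        payloadSize = 0;
        return frame;
    }

    info.type    = frame[0];
    info.qFactor = frame[1];
    info.width   = frame[2];
    info.height  = frame[3];

    payloadSize = frameSize - 4;
    return frame + 4;
}

The wrapper would then strip the header in its after-getting callback, cache the four values for type()/qFactor()/width()/height(),
move the payload to the start of fTo and only then pass the frame on downstream; I have not implemented that part.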


The problems here arise in doGetNextFrame(). You said to just call getNextFrame() on the replica, i.e. on a FramedSource created by
this code:

FramedSource* LocalMJpegUnicastServerMediaSubsession::createNewStreamSource (unsigned clientSessionID, unsigned& estBitRate) {
    // create and initialize a source for the camera
    MJpegFramedSource *source = MJpegFramedSource::createNew( envir() );
    if ( ! source->initialize( m_cameraParameters ) ) {
        return 0;
    }

    // the replicator is created lazily, when the first client connects
    if ( m_replicator == 0 ) {
        m_replicator = StreamReplicator::createNew( envir(), source, False );
    }

    return new ReplicaJPEGVideoSource( m_replicator->createStreamReplica(), source, envir() );
}

The replica is wrapped in the above class, as per the instructions. However, doGetNextFrame() cannot simply call
getNextFrame(), as that requires a set of parameters that are not accessible:

        m_replica->getNextFrame( fTo, fMaxSize, fAfterGettingFunc, fAfterGettingClientData, fOnCloseFunc, fOnCloseClientData );

The fAfterGettingClientData and fOnCloseClientData members are private and not accessible to my wrapper. Nor can I override getNextFrame()
in my wrapper and save that data myself, as the method is not virtual. If I instead assume you made a typo and meant:

	m_replica->doGetNextFrame();

This will crash when my MJpegFramedSource delivers the frame by copying raw data into fTo, as fTo has not been set; it is NULL
or some random value. Apparently this extra proxying layer makes FramedSource::getNextFrame() get called on the wrong FramedSource,
so the delivery parameters end up saved in the wrong instance.
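
For what it is worth, the only way I can see of forwarding the request without touching the replica's private members is the pattern
the FramedFilter subclasses in LIVE555 seem to use: hand the replica this wrapper's own fTo/fMaxSize together with a static callback
on the wrapper itself, and call afterGetting() once the frame has been copied. An untested sketch, with the four JPEG parameters
hard-coded just to keep it short:

#include "JPEGVideoSource.hh"

// untested sketch: forwards frame requests to the replica using this object's
// own buffer and callbacks instead of the replica's private members
class ForwardingJPEGVideoSource : public JPEGVideoSource {

public:

    static ForwardingJPEGVideoSource* createNew (UsageEnvironment& env, FramedSource* replica) {
        return new ForwardingJPEGVideoSource( env, replica );
    }

    // hard-coded for illustration only; a real implementation would take these
    // from a per-frame header or from the original source
    virtual u_int8_t type ()    { return 1; }
    virtual u_int8_t qFactor () { return 75; }
    virtual u_int8_t width ()   { return 640 / 8; }
    virtual u_int8_t height ()  { return 480 / 8; }

protected:

    ForwardingJPEGVideoSource (UsageEnvironment& env, FramedSource* replica)
        : JPEGVideoSource(env), m_replica(replica) {
    }

    virtual ~ForwardingJPEGVideoSource () {
        Medium::close( m_replica );
    }

    virtual void doGetNextFrame () {
        // ask the replica to deliver the next frame into *this* object's buffer
        // (fTo and fMaxSize are protected members inherited from FramedSource)
        // and have it call us back once the frame is there
        m_replica->getNextFrame( fTo, fMaxSize,
                                 afterGettingFrame, this,
                                 FramedSource::handleClosure, this );
    }

    virtual void doStopGettingFrames () {
        m_replica->stopGettingFrames();
    }

private:

    static void afterGettingFrame (void* clientData, unsigned frameSize,
                                   unsigned numTruncatedBytes,
                                   struct timeval presentationTime,
                                   unsigned durationInMicroseconds) {
        ForwardingJPEGVideoSource* self = (ForwardingJPEGVideoSource*)clientData;

        // the frame data is already in self->fTo, so just copy the bookkeeping
        // values and tell the downstream sink that the frame is ready
        self->fFrameSize = frameSize;
        self->fNumTruncatedBytes = numTruncatedBytes;
        self->fPresentationTime = presentationTime;
        self->fDurationInMicroseconds = durationInMicroseconds;
        FramedSource::afterGetting( self );
    }

    FramedSource* m_replica;
};

I have not actually tried this against the StreamReplicator yet, so I may well be missing something, but at least it does not need
access to any private members.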

As a test I did get the LIVE555 code to work without any extra proxying layer, but the bad typecasts in the original code make it quite
ugly. This was doable by lifting out StreamReplica, having it implement the suitable isXYZ() methods, and going through all the places with
hard-coded C-style casts to replace them with dynamic_cast<> and a check for a StreamReplica. This does not, however, work for H.264 streams,
as isFramedSource() is a private member of FramedSource. Why has it been inherited as private?

Anyway, stream replication currently does not work for me at all, and it does not seem to be easy to get working. The StreamReplicator
class works in trivial examples but breaks in real-world code.

-- 
Jan Ekholm
jan.ekholm at d-pointer.com
