[Live-devel] how to use H264or5VideoRTPSink
Ross Finlayson
finlayson at live555.com
Thu Jan 12 09:50:40 PST 2017
I suspect that your problem is that you are not using the correct ‘framer’ class for your H.264 video ‘frames’ (in reality, H.264 NAL units).
Look at your implementation of the “createNewStreamSource()” virtual function (in your “OnDemandServerMediaSubsession” subclass). Because your data source - from your encoder - delivers discrete H.264 NAL units (i.e., one at a time), rather than a byte stream, you must feed your input source to a “H264VideoStreamDiscreteFramer”, *not* a “H264VideoStreamFramer”.
Note also that each ‘frame’ that comes from your input source must be a single H.264 NAL unit, and MUST NOT be prepended with a 0x00 0x00 0x00 0x01 ‘start code’.
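For illustration, here is a rough sketch of what such a “createNewStreamSource()” implementation might look like. (“MyH264Subsession” and “createEncoderSource()” are placeholder names standing in for your own code.)

FramedSource* MyH264Subsession::createNewStreamSource(unsigned /*clientSessionId*/,
                                                       unsigned& estBitrate) {
  estBitrate = 500; // kbps; a rough estimate of your encoder's bit rate

  // "createEncoderSource()" stands in for however you wrap your encoder's
  // output in a "FramedSource".  Each delivered 'frame' must be exactly one
  // H.264 NAL unit, with no 0x00 0x00 0x00 0x01 'start code' in front of it.
  FramedSource* encoderSource = createEncoderSource(envir());

  // Use the *discrete* framer, because the input is one NAL unit at a time:
  return H264VideoStreamDiscreteFramer::createNew(envir(), encoderSource);
}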
> Currently, I came across a class called H264or5VideoRTPSink, which from its name seems to be exactly what I’d have to use. I have no idea, however, how it is intended to be used. Also, you don’t use it in your sample - is there a reason for that? Maybe it’s not what I think it is?
This is an abstract base class; therefore it is not used directly. Instead, you use (in your implementation of the “createNewRTPSink()” virtual function) a “H264VideoRTPSink” or a “H265VideoRTPSink”; those are subclasses of the abstract base class “H264or5VideoRTPSink”.
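For example, a minimal “createNewRTPSink()” in such a subclass might look like this (again, “MyH264Subsession” is a placeholder name):

RTPSink* MyH264Subsession::createNewRTPSink(Groupsock* rtpGroupsock,
                                            unsigned char rtpPayloadTypeIfDynamic,
                                            FramedSource* /*inputSource*/) {
  // "H264VideoRTPSink" is the concrete subclass of "H264or5VideoRTPSink"
  // to use for H.264 video; for H.265 video, use "H265VideoRTPSink" instead.
  return H264VideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic);
}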
> By the way: I put some cout traces into constructors/destructors/methods of ByteStreamFileSource and realized that, when I connect to testH264VideoToTransportStream, the source is instantiated twice. First, it is created, then a few frames are read, then it gets destroyed again, then another one is created and this one is kept until the end. It doesn’t matter what client I use; it happens with VLC and testRTSPClient alike. Why is that?
This happens because - to create a proper SDP description for the H.264 stream (to be used in the response to the RTSP “DESCRIBE” command) - the server needs to inspect the input stream to get the proper ‘configuration’ information (specifically the H.264 SPS and PPS NAL units) for the stream. Therefore, the server first opens and reads part of the stream, then closes it, then opens it again when it’s doing the actual streaming to clients.
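(As an aside: if your encoder can hand you the stream’s SPS and PPS NAL units directly, “H264VideoRTPSink” also has “createNew()” overloads that accept them, which may let the SDP ‘configuration’ line be built without this preliminary read. A sketch, as an alternative to the earlier “createNewRTPSink()” example, assuming hypothetical “fSPS”/“fPPS” buffers that you have filled in from your encoder’s configuration data:)

RTPSink* MyH264Subsession::createNewRTPSink(Groupsock* rtpGroupsock,
                                            unsigned char rtpPayloadTypeIfDynamic,
                                            FramedSource* /*inputSource*/) {
  // "fSPS"/"fSPSSize" and "fPPS"/"fPPSSize" are assumed to hold the raw SPS
  // and PPS NAL units (without 'start codes') obtained from your encoder:
  return H264VideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic,
                                     fSPS, fSPSSize, fPPS, fPPSSize);
}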
(A reminder also that because you’re streaming from a live source, rather than from a file, you should set the “reuseFirstSource” variable to True; see <http://live555.com/liveMedia/faq.html#liveInput-unicast>)
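(For example, a subclass constructor along these lines - placeholder names again - would do it:)

MyH264Subsession::MyH264Subsession(UsageEnvironment& env)
  // The second constructor parameter is "reuseFirstSource"; setting it to True
  // makes a single instance of your live input source be shared by all
  // concurrent clients, rather than a new one being created for each client:
  : OnDemandServerMediaSubsession(env, True /*reuseFirstSource*/) {
}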
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/