[Live-devel] how to use H264or5VideoRTPSink

Roland Aigner Roland.Aigner at aec.at
Fri Jan 13 01:12:05 PST 2017


Hi,

Thanks a ton for your detailed reply. 

In the meantime I figured out that I had completely misunderstood how a FramedSource is supposed to work and what has to happen in its respective methods. Others asking similar questions on Stack Overflow seemed to misunderstand it as well, so I was completely on the wrong track; sorry for that. I thought that every call of deliverFrame requests exactly one frame, with a buffer sized appropriately for that frame, and that the frameSize > fMaxSize case was just a safety check for an exceptional situation that should never actually happen (since the buffer could then not hold the frame data), so I was looking for a way to prevent it. As I know now, the downstream object instead gathers data into a buffer of 150,000 bytes, always tells me how much space is left in it, and expects me to carry on where I left off, typically in the middle of a frame. That was of course causing some trouble.
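
For anyone finding this thread later, this is roughly the delivery pattern I ended up with - a minimal sketch modeled on live555's DeviceSource.cpp example; MyEncoderSource and the fCurrent* members are just placeholders for my own source class:

    void MyEncoderSource::deliverFrame() {
      if (!isCurrentlyAwaitingData()) return; // the sink hasn't asked for data yet

      // Deliver at most fMaxSize bytes of the current encoder output;
      // whatever doesn't fit stays buffered and is delivered on the next
      // doGetNextFrame() call, continuing mid-frame if necessary.
      unsigned remaining = fCurrentSize - fCurrentOffset;
      fFrameSize = remaining > fMaxSize ? fMaxSize : remaining;

      memmove(fTo, fCurrentData + fCurrentOffset, fFrameSize);
      fCurrentOffset += fFrameSize;
      gettimeofday(&fPresentationTime, NULL);

      // Tell the downstream object that data is now available:
      FramedSource::afterGetting(this);
    }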

@NAL units: I am using NvCodec for encoding, and neither its documentation nor its samples even mention NALs, so at least I *think* what I have is indeed a raw byte stream rather than a sequence of NAL units (an MPEG start code is also not present). Unless, of course, H.264 and H.265 imply that encoders deliver NAL units anyway, in which case it goes without saying and the documentation simply doesn't mention it. What I do have is data which, when written to a file, can be played back by any media player. Does this mean it consists of NAL units, or is it a byte stream?

I noticed that your testOnDemandRTSPServer instantiates an H264VideoFileServerMediaSubsession, which also does *not* use the discrete framers, so I didn't use them either. For now it works with VLC as a client - as does testOnDemandRTSPServer. As I said, and as you can tell, I'm by no means an expert on this topic; the last time I encoded video was several years ago, using older codecs, so I'm a bit ignorant when it comes to H.264 onwards. Could you help me understand why it works anyway? Do you think I'm dealing with byte streams or NAL units? Would I have to create NAL units from the data delivered by encoders such as NvEnc/x264 myself?
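
For reference, this is how I currently hook my source into the subsession - again only a sketch, assuming a hypothetical MyEncoderSource and MySubsession, and following what H264VideoFileServerMediaSubsession does with its file source:

    FramedSource* MySubsession::createNewStreamSource(unsigned /*clientSessionId*/,
                                                      unsigned& estBitrate) {
      estBitrate = 500; // kbps; a placeholder estimate
      MyEncoderSource* source = MyEncoderSource::createNew(envir());

      // If the encoder delivers an Annex B byte stream (NAL units separated
      // by 0x000001 / 0x00000001 start codes), the stream framer parses the
      // NAL unit boundaries itself:
      return H264VideoStreamFramer::createNew(envir(), source);

      // If the source instead delivered exactly one NAL unit per frame,
      // without start codes, the discrete framer would be used instead:
      // return H264VideoStreamDiscreteFramer::createNew(envir(), source);
    }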

Thanks,
Roland 



