[Live-devel] H264 RTP stream recording to AVI
Ross Finlayson
finlayson at live555.com
Fri Jan 6 07:29:39 PST 2017
> So rather than using our existing video sources, I've been looking at using the openRTSP client to record streams from testOnDemandRTSPServer. I have a test H264 file in a Matroska container and can (sort of) successfully record a "raw" H264 file, playable by VLC, using the following:
>
> openRTSP -b 400000 -d 25 rtsp://192.168.204.3:8854/matroskaFileTest
>
> I say "sort of" because I only get between 12-15s worth of video (when I specified 25s) and for the first half of the stream, the colours appear very much washed out, which may be related to I-Frames (or the lack of them possibly).
Our software doesn’t do any video decoding/encoding, so it shouldn’t affect the colours in your video. This suggests to me that you may have a problem with your original "test H264 file in a Matroska container". (Are you able to play this file directly, using a video player application such as VLC?)
Instead, I suggest starting with a ‘raw’ H.264 video file - e.g., one of the “.264” files here:
http://www.live555.com/liveMedia/public/264/
as noted in our FAQ entry here:
http://live555.com/liveMedia/faq.html#264-file
and use the “h264ESVideoTest” code in “testOnDemandRTSPServer”.
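For example, a rough sequence might look like this (a sketch only, assuming you rename the downloaded file to "test.264" - the filename that the "h264ESVideoTest" stream reads by default - run the server from the same directory, and that the server ends up on its default port 8554; substitute your own hostname):

    # on the server machine:
    wget http://www.live555.com/liveMedia/public/264/<some-file>.264
    mv <some-file>.264 test.264
    ./testOnDemandRTSPServer

    # on the client machine:
    openRTSP -d 25 rtsp://<server-host>:8554/h264ESVideoTest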
> Looking at the code for openRTSP, a key difference between that and what I've done in the testMPEG2TransportReceiver is the passing of sPropParameterSetsStr
I don’t know why you are looking at the code for “testMPEG2TransportReceiver”; that code is used for receiving a multicast RTP stream that contains *MPEG Transport Stream* data - i.e., not what you want.
You said that your goal is:
> the recording of H264 video received over RTP
Does this H.264/RTP video stream have an RTSP server (i.e., a “rtsp://” URL)? If so, then you should just use “openRTSP” (or our “testRTSPClient” demo code). If, however, there’s no RTSP server, then I’m not really sure how your H.264/RTP video stream is supposed to be playable, because a video player (for an H.264 stream) needs to know the SPS and PPS H.264 NAL units for the stream. It usually gets this information from the stream’s SDP description (i.e., using the RTSP protocol).
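(For reference: in the usual case, the SPS and PPS NAL units appear, Base64-encoded, in a "sprop-parameter-sets" attribute within the video part of the SDP description - something like the following, where the parameter values are purely illustrative:

    m=video 0 RTP/AVP 96
    a=rtpmap:96 H264/90000
    a=fmtp:96 packetization-mode=1;profile-level-id=42C01E;sprop-parameter-sets=Z0LAHtkDxWhAAAADAEAAAAwDxYuS,aMuMsg==
)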
But if you *really* don’t have an RTSP server, then you’ll need to find/figure out the SPS and PPS H.264 NAL units for the stream, and pass them to the “H264VideoFileSink::createNew()” function. Or prepend those NAL units to the H.264 video stream yourself, if you want to do something else with it.
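For illustration only, here is roughly what that might look like, in the style of the code in "openRTSP"'s "playCommon.cpp". (The variable names, the "afterPlaying" callback, and the hard-coded fallback string are assumptions for this sketch; "subsession" is the H.264 video "MediaSubsession", and "env" is your "UsageEnvironment".)

    #include "liveMedia.hh"
    #include "BasicUsageEnvironment.hh"

    // ... inside your RTSP client, after the H.264 video subsession has been set up:
    char const* sPropStr = subsession->fmtp_spropparametersets();
        // With no RTSP/SDP, substitute the "<SPS>,<PPS>" Base64 string that you
        // determined for your stream, e.g. "Z0LAHtkDxWhAAAADAEAAAAwDxYuS,aMuMsg=="

    // This sink writes the SPS and PPS NAL units at the start of the output file,
    // followed by the received H.264 NAL units:
    FileSink* fileSink = H264VideoFileSink::createNew(*env, "video.264",
                                                      sPropStr,
                                                      100000 /*buffer size*/);
    subsession->sink = fileSink;
    fileSink->startPlaying(*(subsession->readSource()),
                           afterPlaying /*your completion callback*/, subsession);

The resulting "video.264" file then begins with the SPS and PPS NAL units, so a player such as VLC should be able to decode it.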
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/