[Live-devel] Streaming from a live source (camera) using H264VideoStreamFramer
Ross Finlayson
finlayson at live555.com
Sun Feb 23 13:59:29 PST 2020
Thanks for the note. There are a lot of things going on here:
1/ First, note that developers (using the LIVE555 libraries) never need to concern themselves with RTP timestamps; these are used only internally, by the RTP/RTCP protocol code. Our library code automatically converts between presentation times and RTP timestamps. So, in the future, when describing possible problems, please reference only presentation times. (A rough sketch of that internal conversion appears at the end of this message.)
2/ “H264VideoStreamFramer” ignores the presentation times of all data that’s fed to it. That’s because the input to “H264VideoStreamFramer” (unlike “H264VideoStreamDiscreteFramer”) is treated as a continuous byte stream, not a sequence of discrete NAL units. Instead, “H264VideoStreamFramer” generates its own presentation times (aligned to ‘wall clock’ time). (The framer-selection sketch at the end of this message illustrates the difference.)
3/ HOWEVER, your issue revealed a bug in our server’s implementation of the RTSP “PAUSE” command - in particular, for “H264VideoStreamFramer”. It was not properly updating the presentation times when resuming after a pause.
I have released a new version (2020.02.23) of the “LIVE555 Streaming Media” code that should fix this problem. Thanks again for reporting this.
4/ Note that - for our RTSP server implementation - the presentation times of input data MUST be aligned with ‘wall clock’ time - i.e., the times that you’d get by calling “gettimeofday()”. Therefore, the presentation times generated by your “CameraDeviceSource” would be incorrect, even if they were used by “H264VideoStreamFramer” (which they’re not). (The device-source sketch at the end of this message shows presentation times being set this way.)
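
To illustrate point 1: the mapping from a presentation time to an RTP timestamp is purely mechanical, which is why application code never needs to perform it. The following is only a rough sketch of the kind of calculation the library performs internally (H.264 video uses a 90000 Hz RTP clock, and each stream gets a random timestamp offset); the function name is made up for illustration.

  #include <sys/time.h>
  #include <stdint.h>

  // Illustration only: application code never does this itself.  It sketches
  // the kind of conversion the library performs internally when building RTP
  // packets.  "timestampBase" stands in for the random per-stream offset
  // that RTP requires.  (The function name is hypothetical.)
  uint32_t presentationTimeToRtpTimestamp(struct timeval presentationTime,
                                          unsigned clockFrequency,   // 90000 for H.264 video
                                          uint32_t timestampBase) {  // random per-stream offset
    uint32_t ts = timestampBase;
    ts += (uint32_t)(clockFrequency * presentationTime.tv_sec);
    ts += (uint32_t)(clockFrequency * (presentationTime.tv_usec / 1000000.0) + 0.5);
    return ts;
  }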
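
To illustrate point 2: the choice of framer determines whether the source’s presentation times are used at all. Below is a framer-selection sketch; the wrapper function is hypothetical, while the framer classes and their createNew() calls are the real LIVE555 ones.

  #include "liveMedia.hh"

  // Sketch of wiring a camera source to the appropriate framer.
  FramedSource* wrapCameraSource(UsageEnvironment& env, FramedSource* cameraSource) {
    // If the source delivers one complete NAL unit at a time (without an
    // Annex-B start code), use the *discrete* framer, which passes the
    // source's own presentation times downstream unchanged:
    return H264VideoStreamDiscreteFramer::createNew(env, cameraSource);

    // By contrast, "H264VideoStreamFramer" treats its input as a continuous
    // Annex-B byte stream, parses it for NAL-unit boundaries, and generates
    // its own (wall-clock-aligned) presentation times - e.g. when reading
    // from a file:
    //
    //   ByteStreamFileSource* file = ByteStreamFileSource::createNew(env, "in.264");
    //   return H264VideoStreamFramer::createNew(env, file);
  }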
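
To illustrate point 4: a device source must stamp each frame with wall-clock time. The following is a minimal device-source sketch, modeled loosely on the "DeviceSource.cpp" template in the LIVE555 distribution. The class name matches the "CameraDeviceSource" from the original report, but everything about how it receives camera data (the deliverFrame() entry point, etc.) is assumed; the point of interest is the gettimeofday() call.

  #include "FramedSource.hh"
  #include <sys/time.h>
  #include <string.h>

  class CameraDeviceSource : public FramedSource {
  public:
    static CameraDeviceSource* createNew(UsageEnvironment& env) {
      return new CameraDeviceSource(env);
    }

    // Assumed entry point: called (e.g. via an event trigger) whenever the
    // camera/encoder has produced one complete NAL unit:
    void deliverFrame(unsigned char const* nalUnit, unsigned nalUnitSize) {
      if (!isCurrentlyAwaitingData()) return; // downstream object isn't ready yet

      // Deliver as much of the NAL unit as fits in the downstream buffer:
      if (nalUnitSize > fMaxSize) {
        fFrameSize = fMaxSize;
        fNumTruncatedBytes = nalUnitSize - fMaxSize;
      } else {
        fFrameSize = nalUnitSize;
        fNumTruncatedBytes = 0;
      }
      memmove(fTo, nalUnit, fFrameSize);

      // The presentation time MUST be 'wall clock' time - i.e., what
      // gettimeofday() returns - for the RTSP server to stream it correctly:
      gettimeofday(&fPresentationTime, NULL);

      // Tell the downstream object (framer/sink) that new data is available:
      FramedSource::afterGetting(this);
    }

  protected:
    CameraDeviceSource(UsageEnvironment& env) : FramedSource(env) {}

  private:
    virtual void doGetNextFrame() {
      // In a real source, arrange (asynchronously) for deliverFrame() to be
      // called once the next encoded NAL unit is available from the camera.
    }
  };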
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/