[Live-devel] Live H264 Streaming Timing Problem
Marc ARMSTRONG
marc.armstrong at st.com
Wed Apr 11 02:20:00 PDT 2012
Thanks very much for the info, and your quick response.
I am reading H.264 data from a named pipe (which is essentially a file) and feeding it into an H264VideoStreamFramer. I basically copied ByteStreamFileSource and changed it to read from a pipe rather than a file. Incidentally, I used pipes because I am using Visual Studio and couldn't find a way to pipe the encoder output to the stdin of the test streamer application.
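The only substantive change from ByteStreamFileSource is the read itself; in outline it's something like this ("PipeStreamSource" and "fPipe" are just illustrative names for my copied class and its pipe handle):

    void PipeStreamSource::doReadFromFile() {
      // Read whatever the encoder has written to the pipe so far:
      fFrameSize = fread(fTo, 1, fMaxSize, fPipe);
      if (fFrameSize == 0) {
        // The writer closed its end of the pipe:
        handleClosure(this);
        return;
      }
      // gettimeofday() is available on Windows via "GroupsockHelper.hh":
      gettimeofday(&fPresentationTime, NULL);
      // Tell the downstream framer that new data is available:
      FramedSource::afterGetting(this);
    }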
I am confident that the streamer application is receiving the full frame count, and that the video input to it is intact. I think the issue may lie with the speed of the capture from the hardware. Before I focus on this, can I ask a further question?
I am capturing lightly compressed H.264 data from a video encoder in 2Mb chunks and streaming it from the Live555 application for playback in VLC. Is this type of application (i.e., high-bitrate streaming in real time) well tested with the Live555 software? When I capture to a file and then stream from that file, the application works perfectly.
Thanks again.
Marc.
From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson
Sent: 10 April 2012 19:36
To: LIVE555 Streaming Media - development & use
Subject: Re: [Live-devel] Live H264 Streaming Timing Problem
> I'm having difficulty streaming live H.264 video captured from a PCI Express board. It appears that the data acquisition is too slow for the Live555 streaming application, and hence the displayed video (in VLC) stutters. For the time being I am assuming that the hardware is okay, and would like to investigate the possibility that the Live555 application is attempting to grab the data too quickly for the hardware.
> Could you tell me how Live555 calculates the time period between calls to ByteStreamFileSource::doReadFromFile(), and where this is implemented in the code?
You haven't said exactly how you're reading H.264 data from your video capture device, but because you mentioned "ByteStreamFileSource", I'm assuming that you're treating the device as a file (a byte stream), and feeding the output from a "ByteStreamFileSource" (i.e., your video capture device) into a "H264VideoStreamFramer" (which parses the byte stream into a sequence of H.264 NAL units).
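In outline, that chain looks like this (a sketch; the setup code and the input name are illustrative, and a named pipe opens just like a file):

    TaskScheduler* scheduler = BasicTaskScheduler::createNew();
    UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

    // "h264.pipe" stands in for your named pipe or capture device node:
    ByteStreamFileSource* fileSource
      = ByteStreamFileSource::createNew(*env, "h264.pipe");
    H264VideoStreamFramer* framer
      = H264VideoStreamFramer::createNew(*env, fileSource);
    // "framer" is then fed into a "H264VideoRTPSink" (or used as the
    // source for a media subsession) in the usual way.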
In this case, the code (in "H264VideoStreamFramer") figures out the frame rate (and thus, how long to delay between frames) from the "time_scale" parameter in "Sequence Parameter Set" (i.e., SPS) NAL units. If this parameter is not present, then the code can't figure out the frame rate, so it uses a default frame rate of 25 fps.
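(For reference: per the H.264 spec, the SPS's VUI timing fields yield the frame rate as

    frame_rate = time_scale / (2 * num_units_in_tick)

so, for example, time_scale = 50000 with num_units_in_tick = 1000 gives 25 fps. The numbers here are illustrative.)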
If, however, your video capture device delivers 'discrete' H.264 NAL units - i.e., one at a time - rather than an unstructured byte stream, then you should *not* attempt to read it as a file. Instead, you should write your own subclass of "FramedSource" (perhaps, based on the "DeviceSource" model) that delivers the NAL units, and feed these into a "H264VideoStreamDiscreteFramer" - *not* a "H264VideoStreamFramer". If you do this, then your 'input device' class - i.e., the subclass of "FramedSource" that you'll write - *must* set "fDurationInMicroseconds" for each NAL unit, before it delivers it to its downstream object (a "H264VideoStreamDiscreteFramer").
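A minimal sketch of such a class follows. ("readOneNALUnitFromEncoder()" is a placeholder for whatever your capture API actually provides, and 40000 microseconds corresponds to 25 fps; substitute your encoder's real frame period.)

    #include "FramedSource.hh"
    #include "GroupsockHelper.hh" // for gettimeofday(), on Windows too

    // Placeholder for your hardware's API; returns one NAL unit,
    // *without* a leading 0x00000001 start code:
    extern unsigned readOneNALUnitFromEncoder(unsigned char* to, unsigned maxSize);

    class LiveH264Source: public FramedSource {
    public:
      static LiveH264Source* createNew(UsageEnvironment& env) {
        return new LiveH264Source(env);
      }
    protected:
      LiveH264Source(UsageEnvironment& env): FramedSource(env) {}
    private:
      virtual void doGetNextFrame() {
        // Deliver exactly one NAL unit per call:
        fFrameSize = readOneNALUnitFromEncoder(fTo, fMaxSize);

        // Wall-clock presentation time is appropriate for a live source:
        gettimeofday(&fPresentationTime, NULL);

        // This field is what paces the stream:
        fDurationInMicroseconds = 40000; // i.e., 25 fps

        FramedSource::afterGetting(this);
      }
    };

This would then be fed into the discrete framer:

    H264VideoStreamDiscreteFramer* framer
      = H264VideoStreamDiscreteFramer::createNew(*env,
          LiveH264Source::createNew(*env));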
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/