[Live-devel] Capturing video from device
Ross Finlayson
finlayson at live555.com
Thu Sep 4 01:41:04 PDT 2008
>I am implementing a live RTSP video streaming server which captures
>video from a live camera, encodes it in hardware, and outputs
>NAL units.
>
>I noticed the DeviceSource template and have some questions about using it.
>
>1. Should my flow be
>
> DeviceSource::doGetNextFrame -> H264Framer -> H264RTPSink?
Yes, sort of. The data flow is:
Your source of NAL units -> Your subclass of
H264VideoStreamFramer -> H264VideoRTPSink
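For illustration, a minimal sketch of wiring that chain together.
"CameraNALSource" and "CameraH264Framer" are illustrative names for
your own classes, "rtpGroupsock" is a "Groupsock" you set up for the
RTP session, "afterPlaying" is a completion handler you define, and 96
is an arbitrary dynamic RTP payload type; the "H264VideoRTPSink"
argument list follows the library headers, but check your release, as
older versions may take extra H.264 parameters:

    TaskScheduler* scheduler = BasicTaskScheduler::createNew();
    UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

    // Your device source and your framer subclass (names illustrative):
    CameraNALSource* source = CameraNALSource::createNew(*env);
    CameraH264Framer* framer = CameraH264Framer::createNew(*env, source);

    // The RTP sink consumes NAL units from the framer:
    H264VideoRTPSink* videoSink
      = H264VideoRTPSink::createNew(*env, rtpGroupsock, 96);
    videoSink->startPlaying(*framer, afterPlaying, NULL);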
>
>2. Is the Framer a source or a filter?
"H264VideoStreamFramer", and thus the subclass of
"H264VideoStreamFramer" that you will write, is a filter.
>
>3. If my encoder's output is NAL units, what is the function of the framer?
Two things. First, you must implement the virtual function
"currentNALUnitEndsAccessUnit()", which returns True iff the current
NAL unit is the end of a complete video frame. Second, you must set
the "fPresentationTime" variable appropriately for each NAL unit.
>
>4. What functionality should be given to
>DeviceSource::deliverFrame()?
Delivering a single NAL unit in response to each call to "doGetNextFrame()".
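A sketch of what that might look like, following the pattern of the
"DeviceSource" template file in the source tree. The members "fTo",
"fMaxSize", "fFrameSize", "fNumTruncatedBytes" and "fPresentationTime"
come from "FramedSource"; "readNALUnitFromEncoder()" is an
illustrative stand-in for your driver API:

    void DeviceSource::deliverFrame() {
      if (!isCurrentlyAwaitingData()) return; // no pending doGetNextFrame()

      u_int8_t* nalData; unsigned nalSize;
      readNALUnitFromEncoder(nalData, nalSize); // hypothetical driver call

      // Truncate if the NAL unit won't fit in the downstream buffer:
      if (nalSize > fMaxSize) {
        fFrameSize = fMaxSize;
        fNumTruncatedBytes = nalSize - fMaxSize;
      } else {
        fFrameSize = nalSize;
        fNumTruncatedBytes = 0;
      }
      gettimeofday(&fPresentationTime, NULL); // capture-time wall clock
      memmove(fTo, nalData, fFrameSize);

      // Inform the downstream object (the framer) that data is ready:
      FramedSource::afterGetting(this);
    }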
>
>5. If my camera has a fixed frame rate (PAL, 25 fps), how do I implement a
>different frame rate using the scheduler?
If your capture device delivers NAL units in 'real time', then you
don't have to do anything special. The NAL units will get consumed
at the rate at which they arrive.
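Only if your source were *not* real-time would you pace delivery
yourself, e.g. with the scheduler's "scheduleDelayedTask()". A sketch,
where "encoderHasData()" is a hypothetical readiness test and the
40000-microsecond period (1/25 s) is an assumption; it also assumes
"deliverFrame()" is made accessible to the polling function:

    void checkForNewNALUnit(void* clientData) {
      DeviceSource* source = (DeviceSource*)clientData;
      if (source->encoderHasData()) { // hypothetical driver test
        source->deliverFrame();
      }
      // Re-schedule ourselves to run again 40 ms from now:
      source->envir().taskScheduler().scheduleDelayedTask(
          40000, (TaskFunc*)checkForNewNALUnit, source);
    }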
--
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/