[Live-devel] Am I accidentally H.264 encoding twice???

temp2010 at forren.org
Fri Feb 1 03:23:39 PST 2013


(((Jacob, please see Ross's quoted email response further below.  ABORT
the removal of double H.264...)))

Ross,

Thanks very much for this info.  Will do.

Regarding your statement [your 'framer' object should then be fed into a
"H264VideoRTPSink" object, for streaming], please help me understand this.
Currently, the 'framer' object is passed to videoSink->startPlaying(), where
videoSink is declared as the much more fundamental RTPSink.  This is exactly
as inherited from the original code of testH264VideoStreamer.  There are
several layers of inheritance between RTPSink and H264VideoRTPSink.  Might
this be part of my quality problem?  Or might you have misspoken?  Or does
the choice between H264VideoRTPSink and RTPSink not really matter in this
case?  After all, the testH264VideoStreamer program, which uses RTPSink and
not H264VideoRTPSink, ought to work (though I can't say I've ever run it).
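
For reference, here is roughly the relevant wiring, paraphrased from
testH264VideoStreamer (a sketch from memory, not the verbatim demo source;
'env', 'rtpGroupsock', 'framer', and 'afterPlaying' are the usual names
from the test programs):

    // Paraphrased from testH264VideoStreamer (sketch, not verbatim).
    // The sink variable is declared with the base-class type RTPSink*,
    // and the 'framer' object goes straight into startPlaying():
    RTPSink* videoSink =
        H264VideoRTPSink::createNew(*env, &rtpGroupsock, 96);
    videoSink->startPlaying(*framer, afterPlaying, videoSink);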

Thanks very much,
-Helmut

P.S. To further my understanding, please confirm that ByteStreamFileSource
simply reads files that already contain H.264-encoded data and passes those
bytes along unchanged, at least in the testH264VideoStreamer case.
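
In other words, I'm assuming it amounts to no more than this (a sketch;
'test.264' is the demo's input file name, and 'env' the usual
UsageEnvironment):

    // Assumption: ByteStreamFileSource just delivers the file's raw
    // bytes downstream, so the file must already contain an H.264 byte
    // stream (with start codes) for the framer to parse into NAL units.
    ByteStreamFileSource* fileSource =
        ByteStreamFileSource::createNew(*env, "test.264");
    FramedSource* videoSource =
        H264VideoStreamFramer::createNew(*env, fileSource);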




On Thu, Jan 31, 2013 at 7:52 PM, Ross Finlayson <finlayson at live555.com> wrote:

> WHAT?  Just a while ago I realized I'm passing H.264-encoded buffers to
> H264VideoStreamFramer, which is perhaps encoding them to H.264 a second
> time.
>
> Am I accidentally H.264 encoding twice?  In the original code, does the
> ByteStreamFileSource that is fed to H264VideoStreamFramer deliver raw
> (unencoded) buffers?
>
>
> I think you're confused about what our software does.  *None* of our
> software does *any* encoding.  In particular, the "H264VideoStreamFramer"
> and "H264VideoStreamDiscreteFramer" classes each take - as input -
> already-encoded H.264 video data.  They don't do any 'encoding' (because
> the input data is already encoded).  All they do is parse the input H.264
> video data, and output a sequence of H.264 'NAL units', with proper
> 'presentation time' and 'duration' values.
>
> The difference between these two classes is that "H264VideoStreamFramer"
> takes - as input - H.264 video data that appears in a byte stream (e.g. a
> file or pipe).  "H264VideoStreamDiscreteFramer", on the other hand, takes
> as input discrete NAL units (i.e., one NAL unit at a time), *without* any
> preceding 'start code'.
>
> So, the choice of which of these 'framer' classes to use depends on what
> kind of data comes out of your "MF_H264_DeviceSource" class.  If this class
> outputs an unstructured byte stream (that contains H.264 video data, with
> 'start codes' preceding each NAL unit), then use an
> "H264VideoStreamFramer".  If, however, your "MF_H264_DeviceSource" class
> outputs a sequence of NAL units (one at a time, without a preceding 'start
> code'), then use an "H264VideoStreamDiscreteFramer" instead.
>
> In either case, your 'framer' object should then be fed into a
> "H264VideoRTPSink" object, for streaming.
>
> Ross Finlayson
> Live Networks, Inc.
> http://www.live555.com/
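
(Recapping Ross's two cases in code, for my own notes.  This is an
untested sketch: 'deviceSource' stands in for an instance of my
MF_H264_DeviceSource class, and 'env' is the usual UsageEnvironment.)

    // Case 1: the source emits an unstructured byte stream in which
    // each NAL unit is preceded by a 'start code':
    FramedSource* framer =
        H264VideoStreamFramer::createNew(*env, deviceSource);

    // Case 2: the source emits one discrete NAL unit at a time, with
    // no preceding 'start code':
    //   FramedSource* framer =
    //       H264VideoStreamDiscreteFramer::createNew(*env, deviceSource);

    // Either way, the framer then feeds an H264VideoRTPSink for
    // streaming.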