[Live-devel] H.264

Ross Finlayson finlayson at live555.com
Wed Jan 3 11:11:39 PST 2007


>Earlier video compression standards were always centered around the 
>concept of a bit stream. Higher layer syntax elements were separated 
>by start codes to allow re-synchronization to the bit stream in case 
>of corruption, be it the result of an erasure or of bit errors.
>
>H.264, when employing its optional Annex B, also allows such a 
>framing scheme, primarily to support a few legacy protocol 
>environments such as H.320 or MPEG-2 transport. The RTP 
>packetization, however, employs the native NAL interface that is 
>based on NAL units (NALUs).
>
>Does this mean that if my encoder employs the Annex B file format, I 
>can use the livemedia MPEG2VideoFramer and MPEG2VideoSink to send the 
>H.264 data over?

No.  The text you quoted referred to the possibility of carrying 
H.264 video data within an MPEG-2 *Transport* Stream.  If you were to 
do that, then you could stream the Transport Stream data using our 
Transport Stream-related classes.  But to stream H.264/RTP data by 
itself, you must use "H264VideoRTPSink".
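
(For anyone looking for a concrete starting point: below is a minimal 
sketch of a sending application, along the lines of our "test*Streamer" 
demo programs.  The file name "test.264", the port number, and the use 
of "H264VideoStreamFramer" directly on a raw Annex B byte-stream file 
are illustrative assumptions; the exact classes available depend on 
your version of the library.)

  #include "liveMedia.hh"
  #include "BasicUsageEnvironment.hh"
  #include "GroupsockHelper.hh"

  void afterPlaying(void* /*clientData*/) {
    // Called when the input file has been streamed completely.
  }

  int main() {
    // Set up the usage environment:
    TaskScheduler* scheduler = BasicTaskScheduler::createNew();
    UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

    // Create a 'groupsock' for the RTP destination
    // (multicast address and port number are illustrative):
    struct in_addr destinationAddress;
    destinationAddress.s_addr = chooseRandomIPv4SSMAddress(*env);
    const Port rtpPort(18888);
    const unsigned char ttl = 255;
    Groupsock rtpGroupsock(*env, destinationAddress, rtpPort, ttl);

    // The H.264-specific RTP sink (dynamic payload type 96):
    RTPSink* videoSink = H264VideoRTPSink::createNew(*env, &rtpGroupsock, 96);

    // Read the raw Annex B byte stream from a file ("test.264" is an
    // assumed name), and parse it into NAL units with an H.264 framer:
    ByteStreamFileSource* fileSource
      = ByteStreamFileSource::createNew(*env, "test.264");
    FramedSource* videoSource
      = H264VideoStreamFramer::createNew(*env, fileSource);

    // Start streaming, then enter the event loop:
    videoSink->startPlaying(*videoSource, afterPlaying, videoSink);
    env->taskScheduler().doEventLoop();
    return 0;
  }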

I'm probably not going to be answering any more questions about 
streaming H.264 for a while; I'm getting sick of them...
-- 

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/