[Live-devel] Layered video with Live555

Ross Finlayson finlayson at live555.com
Sun Mar 4 13:30:21 PST 2007


>>  By the way, the *real* difficulty that you're going to face in
>>  writing your "H264VideoStreamFramer" subclass is computing proper
>>  presentation times and durations (the "fPresentationTime" and
>>  "fDurationInMicroseconds" member variables) for each NAL unit.  (This
>>  has tripped up other people who have tried streaming H.264 video from
>>  a file.)
>For my purposes, I plan on just computing these based on the average
>bitrate of the file.  Not ideal, but should get the job done.
>
>That gets me to thinking though ... When it comes time to re-combine the
>streams on the client side, I assume the best (only?) way is to use the
>presentation times to put things back together correctly. 
>Unfortunately, I don't see any "Mux" classes to base this off of.  In
>fact, the only reordering code I see is the ReorderingPacketBuffer
>class, but this looks like it keys off of RTP sequence numbers, not
>presentation times.  Have you ever heard of someone needing to do this? 
>What would you suggest?
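
[As an aside on the bitrate-based timing described above: one rough way to
estimate a per-NAL-unit duration from an average bitrate is the sketch
below. The helper name is hypothetical (it is not part of LIVE555); only
the "fDurationInMicroseconds" member it would feed is real.]

```cpp
#include <cassert>
#include <cstdint>

// Hypothetical helper, not part of LIVE555: estimate how long a NAL unit
// "lasts" by assuming the file is delivered at a constant average bitrate.
// This is the approximation described above -- not an exact timestamp
// derived from the H.264 timing information itself.
unsigned estimateDurationInMicroseconds(unsigned nalSizeBytes,
                                        unsigned avgBitrateBps) {
  // bits in this NAL unit, converted to microseconds at avgBitrateBps
  return (unsigned)(((uint64_t)nalSizeBytes * 8u * 1000000u) / avgBitrateBps);
}
```

[A framer subclass could then add this value to a running presentation
time each time it delivers a NAL unit.]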

What you need is a 'jitter buffer' implementation (not really a "mux" 
("multiplexor"), because that implies that two or more incoming 
streams are being merged into one).
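
[To illustrate the idea, here is a minimal sketch of such a jitter buffer,
ordered by presentation time rather than by RTP sequence number. All of the
names here are hypothetical; nothing below comes from the LIVE555 code.]

```cpp
#include <cassert>
#include <cstdint>
#include <map>
#include <string>

// A received NAL unit, stamped with its presentation time in microseconds.
struct NalUnit {
  uint64_t presentationTimeUs;
  std::string data;
};

// Minimal jitter buffer: holds units from one or more substreams and
// releases them in presentation-time order, once they are "due".
class JitterBuffer {
public:
  void insert(NalUnit const& nal) {
    buffer_.emplace(nal.presentationTimeUs, nal);
  }

  // Pop the earliest unit whose presentation time is <= nowUs;
  // returns false if nothing is due yet.
  bool popDue(uint64_t nowUs, NalUnit* out) {
    if (buffer_.empty() || buffer_.begin()->first > nowUs) return false;
    *out = buffer_.begin()->second;
    buffer_.erase(buffer_.begin());
    return true;
  }

private:
  // multimap keeps entries sorted by presentation time automatically
  std::multimap<uint64_t, NalUnit> buffer_;
};
```

[In a real player the "due" check would be driven by a playout clock that
lags the stream slightly, so late-arriving units still find their slot.]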

I suggest just basing your client implementation around VLC 
<www.videolan.org/vlc>, because it already receives/plays (regular, 
non-structured) H.264/RTP streams (using the "LIVE555 Streaming 
Media" code).  If you use VLC, you'll be saving yourself lots of work 
(and, perhaps, the additions that you make to support structured 
H.264 video will end up being of use to the VLC codebase).
-- 

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
