[Live-devel] MP4 streaming implementation in LIVE555 codebase

Ross Finlayson finlayson at live555.com
Wed Feb 13 07:00:42 PST 2008


>
>Hi Ross,
>
>We are planning to implement MP4 streaming on the server side in 
>the LIVE555 code base, with the help of the libmp4 library. We 
>have the following clarifications:
>
>1.       We are planning to create a separate class called 
>MP4VideoStreamParser, derived from MPEGVideoStreamParser. This 
>class (MP4VideoStreamParser) would contain methods to parse the 
>MP4 file, return individual frames, and set the fTo and fStart 
>variables.

A "MPEG4VideoStreamParser" class *already* exists.  However, see below...


>2.       After an individual frame has been read from the MP4 file 
>in the above step, the respective MPEG1or2VideoStreamFramer or 
>MPEG4VideoStreamFramer class is called to construct the RTP 
>packet from the buffer and send the packet to the client. We 
>assume that the MP4 file contains MPEG-2 or MPEG-4 video data 
>only. Here we have a clarification: do we need to create a 
>separate class called MP4VideoStreamFramer, derived from 
>FramedSource, which does the job of constructing the RTP packet 
>from the buffer and sending it to the client? Or can we use the 
>existing classes (MPEG1or2VideoStreamFramer or 
>MPEG4VideoStreamFramer) to construct and send the packets?

Because you are generating discrete frames - one at a time - from the 
"libmp4" library, instead of an unstructured byte stream, you should 
use the *existing* "MPEG4VideoStreamDiscreteFramer" (for MPEG-4 
video) or "MPEG1or2VideoStreamDiscreteFramer" (for MPEG-1 or MPEG-2 
video), *instead of* "MPEG4VideoStreamFramer" or 
"MPEG1or2VideoStreamFramer".

Also, as noted above, you don't need to write any new parsing code.
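
If you are using the "OnDemandServerMediaSubsession" framework on the 
server side, the resulting chain could be plugged in roughly as follows - 
again only a sketch, assuming MPEG-4 video streamed as an Elementary 
Stream, and reusing the hypothetical "MP4FileFrameSource" class above 
("MP4FileServerMediaSubsession" and the bitrate estimate are likewise 
placeholders):

#include "OnDemandServerMediaSubsession.hh"
#include "MPEG4VideoStreamDiscreteFramer.hh"
#include "MPEG4ESVideoRTPSink.hh"
#include "strDup.hh"

class MP4FileServerMediaSubsession: public OnDemandServerMediaSubsession {
public:
  static MP4FileServerMediaSubsession*
  createNew(UsageEnvironment& env, char const* fileName,
            Boolean reuseFirstSource) {
    return new MP4FileServerMediaSubsession(env, fileName, reuseFirstSource);
  }

protected:
  MP4FileServerMediaSubsession(UsageEnvironment& env, char const* fileName,
                               Boolean reuseFirstSource)
    : OnDemandServerMediaSubsession(env, reuseFirstSource),
      fFileName(strDup(fileName)) {}
  virtual ~MP4FileServerMediaSubsession() { delete[] fFileName; }

  virtual FramedSource* createNewStreamSource(unsigned /*clientSessionId*/,
                                              unsigned& estBitrate) {
    estBitrate = 500; // kbps; a placeholder estimate
    FramedSource* source = MP4FileFrameSource::createNew(envir(), fFileName);
    // Feed the discrete frames through the *existing* discrete framer:
    return MPEG4VideoStreamDiscreteFramer::createNew(envir(), source);
  }
  virtual RTPSink* createNewRTPSink(Groupsock* rtpGroupsock,
                                    unsigned char rtpPayloadTypeIfDynamic,
                                    FramedSource* /*inputSource*/) {
    return MPEG4ESVideoRTPSink::createNew(envir(), rtpGroupsock,
                                          rtpPayloadTypeIfDynamic);
  }

private:
  char* fFileName;
};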
-- 

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

