[Live-devel] Synchronization of MPEG4 Video and AMR Audio

Ross Finlayson finlayson at live.com
Tue May 18 03:27:08 PDT 2004


>I would like to know if and where liveMedia handles such timing,
>especially for MPEG4.

All data that is sent by the LIVE.COM code via RTP is first given an 
accurate 'presentation time', which is then converted - in the outgoing RTP 
packets (by the "RTPSink" class) - to RTP timestamps.  When these packets 
are later received over the network - e.g., by a liveMedia "RTPSource" 
object, or by some other RTP/RTCP client implementation - these 
timestamps are converted back into the original 'presentation times', 
which can then be used for audio/video synchronization.  (The conversion 
from RTP timestamp back to presentation time is done using information in 
RTCP packets; see 
<http://www.live.com/liveMedia/faq.html#separate-rtp-streams> for more 
information.)

For MPEG-4 video streams, the LIVE.COM code computes the presentation times 
of each outgoing frame from the timing information that's in the MPEG-4 
headers.  See the calls to "computePresentationTime()" in 
"MPEG4VideoStreamFramer" (and its implementation in its parent class 
"MPEGVideoStreamFramer") for the details of how this is done.

>I also would like to know to which RFC is liveMedia compliant, maybe the
>MPEG4Generic family is for:
>[RFC 3016] RTP Payload Format for MPEG-4 Audio/Visual Streams,
>and the MPEG4ESVideo is for:
>[RFC 3640] RTP Payload Format for Transport of MPEG-4 Elementary
>Streams?

Actually, you have these reversed:
- The "video/MP4V-ES" MIME type (implemented by the LIVE.COM 
"MPEG4ESVideoRTPSink" class) is defined by RFC 3016.
- The "audio/MPEG4-GENERIC" MIME type (implemented by the LIVE.COM 
"MPEG4GenericRTPSink" class) is defined by RFC 3640.
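
For reference, the two payload formats show up in an SDP description roughly 
like this (a hand-written illustration - the payload type numbers and clock 
rates are examples, and the RFC 3640 "fmtp" attribute would carry additional 
mode/config parameters not shown here):

```
m=video 0 RTP/AVP 96
a=rtpmap:96 MP4V-ES/90000
m=audio 0 RTP/AVP 97
a=rtpmap:97 MPEG4-GENERIC/8000
```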


	Ross Finlayson
	LIVE.COM
	<http://www.live.com/>


