[Live-devel] Synchronization of MPEG4 Video and AMR Audio
Cavalera Claudio
Claudio.Cavalera at icn.siemens.it
Tue May 18 11:40:26 PDT 2004
Hello,
I've read the liveMedia FAQ about RTP audio/video synchronization.
From the RTP FAQ at http://www.cs.columbia.edu/~hgs/rtp/:
"The mechanism how end systems synchronize different media is not
prescribed by RTP, however, a workable approach is to periodically
exchange messages between applications to indicate what delay each
application would impose on the stream (including any media decoding
delays) if it were not to synchronize and then have all applications
choose the maximum of these delays."
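Just to check that I read this right, here is a tiny C++ sketch of that idea as I understand it (the delay figures are made up, and this is not liveMedia code):

#include <algorithm>
#include <cstdio>

int main() {
  // Hypothetical delays (in ms) each application would impose on its own
  // stream if it did not have to synchronize with the other.
  double audioDelayMs = 80.0;   // e.g. AMR decode + jitter buffer
  double videoDelayMs = 200.0;  // e.g. MPEG-4 decode + frame reordering

  // Each application then plays out behind the common maximum delay,
  // so both streams end up on the same presentation timeline.
  double commonDelayMs = std::max(audioDelayMs, videoDelayMs);

  std::printf("audio buffers an extra %.0f ms\n", commonDelayMs - audioDelayMs);
  std::printf("video buffers an extra %.0f ms\n", commonDelayMs - videoDelayMs);
  return 0;
}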
I would like to know whether, and where, liveMedia handles such timing,
especially for MPEG-4.
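From skimming the test programs, my guess is that liveMedia does not exchange delay messages, but instead stamps every received frame with a presentation time that, once an RTCP Sender Report has arrived, is derived from the sender's wall clock, so audio and video frames can be lined up directly. Here is a minimal receiver sketch of what I mean; the multicast address, ports, payload type, bandwidth and buffer size are placeholders I made up:

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"
#include "GroupsockHelper.hh"

static unsigned char buf[100000];
static MPEG4ESVideoRTPSource* videoSource = NULL;

static void afterGettingFrame(void* /*clientData*/, unsigned frameSize,
                              unsigned /*numTruncatedBytes*/,
                              struct timeval presentationTime,
                              unsigned /*durationInMicroseconds*/) {
  // Once an RTCP Sender Report has been received, "presentationTime" is
  // on the sender's wall clock, so it can be compared directly with the
  // presentation times of the audio stream.
  if (videoSource->hasBeenSynchronizedUsingRTCP()) {
    // hand (buf, frameSize, presentationTime) to the decoder/renderer here
  }
  videoSource->getNextFrame(buf, sizeof buf, afterGettingFrame, NULL, NULL, NULL);
}

int main() {
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  struct in_addr group; group.s_addr = our_inet_addr("239.255.42.42"); // placeholder
  const Port rtpPort(8888), rtcpPort(8889);                            // placeholders
  Groupsock rtpSock(*env, group, rtpPort, 255);
  Groupsock rtcpSock(*env, group, rtcpPort, 255);

  videoSource = MPEG4ESVideoRTPSource::createNew(*env, &rtpSock, 96, 90000);
  RTCPInstance::createNew(*env, &rtcpSock, 160 /* est. session bw, kbps */,
                          (unsigned char const*)"receiver", NULL, videoSource);

  videoSource->getNextFrame(buf, sizeof buf, afterGettingFrame, NULL, NULL, NULL);
  env->taskScheduler().doEventLoop();
  return 0;
}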
AFAIK MPEG-4 has a somewhat complicated timing mechanism based on two kinds
of time stamps: one saying when a unit has to be decoded, and one saying
when it has to be ready for presentation. I think this is because of
bidirectional prediction between frames in some coding schemes (i.e. before
a frame can be presented, the next one may already have to be decoded).
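For example (numbers assumed, 25 fps, two B-VOPs between reference frames):

   decoding order:      I0   P3   B1   B2
   display order:       I0   B1   B2   P3
   display time (ms):    0   40   80  120

P3 has to be decoded right after I0, because B1 and B2 are predicted from
both of them, yet it is only displayed 120 ms in; that is why a unit's
decoding time stamp can be well ahead of its presentation time stamp.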
I would also like to know which RFCs liveMedia is compliant with. Perhaps
the MPEG4Generic family is for
[RFC 3016] RTP Payload Format for MPEG-4 Audio/Visual Streams,
and MPEG4ESVideo is for
[RFC 3640] RTP Payload Format for Transport of MPEG-4 Elementary
Streams?
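In case it helps the discussion, the two payload formats are easy to tell
apart in the SDP, since RFC 3016 uses the MP4V-ES media subtype and RFC 3640
uses mpeg4-generic. Illustrative lines only; the payload types are dynamic
and the config strings are placeholders (AAC shown for the audio case, since
RFC 3640 covers MPEG-4 elementary streams rather than AMR):

   m=video 5004 RTP/AVP 96
   a=rtpmap:96 MP4V-ES/90000                      ; RFC 3016 style
   a=fmtp:96 profile-level-id=1;config=...

   m=audio 5006 RTP/AVP 97
   a=rtpmap:97 mpeg4-generic/32000/1              ; RFC 3640 style
   a=fmtp:97 streamtype=5;profile-level-id=1;mode=AAC-hbr;sizeLength=13;indexLength=3;indexDeltaLength=3;config=...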
Thanks a lot,
Claudio