[Live-devel] Timestamp gap in RTCP Report for MPEG1or2VideoStreamFramer

Guy Bonneau gbonneau at matrox.com
Tue Feb 5 13:44:53 PST 2008


Ross,

It didn't take me days. I have solved my synchronization problem,
which was caused by wrong "presentation times" delivered by the library.
I had to modify the library as explained in our email exchange.

> At the RTP receiver end, you get - for each frame of media 
> data - a presentation time.  This presentation time is used 
> for decoding (and a/v sync).

I used the VLC media player application as the receiver.
 
> If you also implement RTCP, then (after the initial RTCP "SR" is
> received) the presentation times that you get at the RTP 
> receiver end will be *exactly the same* as the presentation 
> times that you provided at the RTP sender end.  You can 
> verify this for yourself.

Yes, indeed I have. And they are exactly the same until
VLC media player receives the hasBeenSynchronizedUsingRTCP()
status from the library.
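
For reference, here is a minimal sketch (my own illustration, not code
from either application) of how a receiving application can watch for
that transition, assuming it holds the RTPSource* of the subsession;
hasBeenSynchronizedUsingRTCP() is the library's accessor for this flag:

    #include "liveMedia.hh"
    #include <stdio.h>

    // Returns the new synchronization state and prints a message the
    // first time RTCP synchronization kicks in for this source.
    static Boolean checkRTCPSync(RTPSource* rtpSource, Boolean wasSynced) {
      Boolean isSynced = rtpSource->hasBeenSynchronizedUsingRTCP();
      if (isSynced && !wasSynced) {
        // From this point on, the presentation times handed to the sink
        // are derived from the sender's RTCP "SR" reports.
        fprintf(stderr, "RTCP synchronization established\n");
      }
      return isSynced;
    }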

> 
> In other words, you can - and should - think of the RTP/RTCP 
> implementation, at the sender and receiver ends, as being a 
> 'black box' that takes presentation times as input at one 
> end, and produces *the same* presentation times as output at 
> the other end.  This 'black box' happens to work by using RTP 
> timestamps and RTCP reports. 
> But you shouldn't concern yourself with this.  This code works. 
> Trust me.

Unfortunately there is a use case that doesn't work, and this is
what I am trying to explain. The use case only shows up when you
try to synchronize at least these two streams: an MPEG video
stream and a PCM audio stream. (There may also be other use cases
I haven't found.)
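
To make the failure concrete, here is a rough sketch (mine, not code
from VLC or the library) of the kind of check that exposes it: keep the
most recent video and audio presentation times as they arrive at the
receiver and watch their difference. The variable names are hypothetical.

    #include <sys/time.h>
    #include <stdio.h>

    static struct timeval lastVideoPTS; // updated from the video sink callback
    static struct timeval lastAudioPTS; // updated from the audio sink callback

    static double diffSeconds(struct timeval const& a, struct timeval const& b) {
      return (a.tv_sec - b.tv_sec) + (a.tv_usec - b.tv_usec) / 1e6;
    }

    static void checkAVOffset() {
      double offset = diffSeconds(lastVideoPTS, lastAudioPTS);
      // With correct presentation times this offset stays small and stable;
      // after RTCP synchronization kicks in, the MPEG video stream suddenly
      // shows a large, constant offset against the PCM audio stream.
      if (offset > 0.5 || offset < -0.5) {
        fprintf(stderr, "A/V presentation-time offset: %.3f s\n", offset);
      }
    }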

> 
> Thus, the thing that you need to worry about is presentation times. 
> Look at the *presentation times* that are coming out of 
> "MPEG1or2VideoStreamFramer".  Are they correct? 

No, they are not...!!! But only once the library has called the
function hasBeenSynchronizedUsingRTCP() at the receiver side.
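
On the sender side, the same check can be done with a small pass-through
filter. Below is a sketch of my own (it is not part of the library or of
my test application) that forwards frames from MPEG1or2VideoStreamFramer
unchanged and prints the presentation time of each one:

    #include "liveMedia.hh"
    #include <stdio.h>

    // Pass-through filter that logs the presentation time of every frame
    // it forwards from its input source (e.g. an MPEG1or2VideoStreamFramer).
    class PTSLogger: public FramedFilter {
    public:
      static PTSLogger* createNew(UsageEnvironment& env, FramedSource* input) {
        return new PTSLogger(env, input);
      }
    protected:
      PTSLogger(UsageEnvironment& env, FramedSource* input)
        : FramedFilter(env, input) {}
    private:
      virtual void doGetNextFrame() {
        fInputSource->getNextFrame(fTo, fMaxSize, afterGettingFrame, this,
                                   FramedSource::handleClosure, this);
      }
      static void afterGettingFrame(void* clientData, unsigned frameSize,
                                    unsigned numTruncatedBytes,
                                    struct timeval presentationTime,
                                    unsigned durationInMicroseconds) {
        PTSLogger* logger = (PTSLogger*)clientData;
        fprintf(stderr, "framer PTS: %ld.%06ld\n",
                (long)presentationTime.tv_sec, (long)presentationTime.tv_usec);
        // Forward the frame unchanged to whoever asked for it.
        logger->fFrameSize = frameSize;
        logger->fNumTruncatedBytes = numTruncatedBytes;
        logger->fPresentationTime = presentationTime;
        logger->fDurationInMicroseconds = durationInMicroseconds;
        FramedSource::afterGetting(logger);
      }
    };

In a sender like testMPEG1or2VideoStreamer it would simply be chained
between the framer and the RTP sink before calling startPlaying().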

> Because your 
> stream is I-frame only, then it's not inconceivable that 
> there is a problem with the way that 
> "MPEG1or2VideoStreamFramer" generates presentation times for 
> your streams.  

It does fine, and there is no bug there to the best of my knowledge.
The parser doesn't complain. The problem is in the presentation time
delay associated with the MPEG-2 streamer of the library.

> You should be able to easily check this for 
> yourself.  

I did !!!

I have been able to reproduce the issue easily with a small test
application derived from the testMPEG1or2VideoStreamer test application.
If you can provide me with an FTP upload location, I can send you a use
case and the steps to reproduce the problem. I also added some messages
to the StreamRead function of VLC media player, inside the live555.cpp
file, that show the presentation times VLC receives (it is Win32 based,
however); I can send these along with the use case. The problem is easy
to see in the message window of VLC media player: once VLC catches the
hasBeenSynchronizedUsingRTCP() status, the live555 library sends
desynchronized presentation times between audio and video.
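
For clarity, the trace amounts to something like the line below inside
StreamRead, assuming the callback receives the presentation time as a
struct timeval named pts, the way live555 sink callbacks do (the exact
variable names in the VLC code may differ):

    fprintf(stderr, "StreamRead: presentation time = %ld.%06ld\n",
            (long)pts.tv_sec, (long)pts.tv_usec);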

You will see it right away.

I can also send you the modification I made to the library to solve the
problem. It is a very minor change in the file MPEGVideoStreamFramer.cpp.

Let me know.

Regards and thanks
Guy


