[Live-devel] Calculating rtptime for RTSP/PLAY-Response "RTP-Info" header

mamille1 at rockwellcollins.com mamille1 at rockwellcollins.com
Sun Aug 10 15:41:32 PDT 2008


All

The mention of "properly-synchronized presentation time" sounds like it 
might be related to a problem we are having. 

We have an application using the live555 library, running on an embedded 
processor, that records a digital audio/video stream to a file. The stream 
is generated by a separate DSP that converts analog audio and video and 
sends the stream out on a private network. Recordings look quite good 
when played back on a PC (with VLC, for example). Playing them back to the 
DSP (for conversion from digital back to analog) looks much worse -- we 
see artifacts that we think are due to dropped frames and packets.

We recently confirmed that our embedded processor and associated DSP do 
not share a time reference, and there is no real-time clock in our 
embedded processor -- we just use the CPU's clock for our time reference.

We think that this is what's happening during a recording:

- our DSP generates an RTP stream using its own time reference to create 
timestamps and measure time intervals
- our embedded processor records this stream to a file but does not 
rewrite the timestamp info, so the timing information embedded in the file 
is our DSP's measurement of time

During playback on a PC, the timestamp info from the file is used. The 
file plays back correctly because our DSP's measurement of time is pretty 
close to the PC's measurement of time, due to the PC's real-time clock.

During playback from our embedded processor to our DSP, our embedded 
processor's measurement of time is used when reading timestamp info from 
the recorded file. After timing the playback of several files, it looks 
like our embedded processor's clock is running about 2% fast relative to 
our DSP's. This causes our processor to send packets and frames to the 
DSP slightly faster than the DSP can process them, leading to dropped 
packets and frames.
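
To put rough numbers on it (the bitrate below is just an illustrative 
assumption on our part, not a measurement of our actual stream):

    #include <cstdio>

    // Back-of-the-envelope look at what a 2% clock error does to pacing.
    // The 4 Mb/s transport-stream bitrate is assumed for illustration only.
    int main() {
      const double drift        = 0.02;   // processor clock ~2% fast vs. DSP
      const double streamBps    = 4.0e6;  // assumed stream bitrate, bits/sec
      const double excessBps    = streamBps * drift;    // extra bits per second
      const double excessTenMin = excessBps * 600 / 8;  // extra bytes in 10 min

      std::printf("excess: %.0f kb/s, ~%.1f MB over a 10-minute clip\n",
                  excessBps / 1000, excessTenMin / 1e6);
      return 0;
    }

Whatever the real bitrate is, the DSP has to buffer or drop that excess, 
which matches the artifacts we're seeing.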

We're digging into the liveMedia code to understand where we might make a 
subclass to compensate for this on playback. 
MPEG2TransportStreamFramer.cpp seems like a good place to start, since it 
already works with the PCR. As near as we can tell right now, however, the 
MPEG2TransportStreamFramer class is used during both recording and 
playback in our system, so we need to be very careful with changes there.
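
For what it's worth, here is the kind of subclass we have in mind -- a 
sketch of a FramedFilter that would sit downstream of the framer in the 
playback chain only and stretch each frame's duration by a fixed ratio we 
measure offline. The class name and the scale-factor plumbing are our own 
(hypothetical), not anything that exists in liveMedia; only the standard 
FramedFilter/FramedSource pass-through pattern is taken from the library:

    #include "FramedFilter.hh"

    // Hypothetical pass-through filter that scales fDurationInMicroseconds
    // by a fixed ratio (e.g. 1.02 to slow pacing when our CPU clock runs
    // ~2% fast relative to the DSP).
    class DurationScalingFilter: public FramedFilter {
    public:
      static DurationScalingFilter* createNew(UsageEnvironment& env,
                                              FramedSource* inputSource,
                                              double scale) {
        return new DurationScalingFilter(env, inputSource, scale);
      }

    protected:
      DurationScalingFilter(UsageEnvironment& env, FramedSource* inputSource,
                            double scale)
        : FramedFilter(env, inputSource), fScale(scale) {}

    private:
      virtual void doGetNextFrame() {
        // Pass the request straight through to the upstream source:
        fInputSource->getNextFrame(fTo, fMaxSize,
                                   afterGettingFrame, this,
                                   FramedSource::handleClosure, this);
      }

      static void afterGettingFrame(void* clientData, unsigned frameSize,
                                    unsigned numTruncatedBytes,
                                    struct timeval presentationTime,
                                    unsigned durationInMicroseconds) {
        ((DurationScalingFilter*)clientData)->afterGettingFrame1(
            frameSize, numTruncatedBytes, presentationTime,
            durationInMicroseconds);
      }

      void afterGettingFrame1(unsigned frameSize, unsigned numTruncatedBytes,
                              struct timeval presentationTime,
                              unsigned durationInMicroseconds) {
        fFrameSize = frameSize;
        fNumTruncatedBytes = numTruncatedBytes;
        fPresentationTime = presentationTime;
        // Stretch the inter-frame duration so packets go out slightly slower:
        fDurationInMicroseconds =
            (unsigned)(durationInMicroseconds * fScale + 0.5);
        afterGetting(this);   // deliver the (otherwise unmodified) frame
      }

      double fScale;
    };

The appeal of something like this over modifying MPEG2TransportStreamFramer 
directly is that we could insert it only in the playback chain and leave 
the recording chain untouched.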

We're also looking into implementing our own gettimeofday() function, 
where we could apply this compensation during playback. Since this could 
also affect recordings, we'd need to make sure that compensation is 
disabled during recording and enabled only during playback.
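
Roughly what we have in mind there (the ~0.98 factor comes from our timing 
tests; the function name and how it would actually get hooked in place of 
the system call are assumptions on our part and depend on the platform):

    #include <stddef.h>
    #include <sys/time.h>

    // Hypothetical wrapper around the system gettimeofday(). gClockScale
    // would be set by our application: 1.0 while recording, ~0.98 during
    // playback to counter a CPU clock measured running ~2% fast.
    static double         gClockScale = 1.0;
    static struct timeval gEpoch;
    static int            gEpochValid = 0;

    int scaled_gettimeofday(struct timeval* tv, struct timezone* tz) {
      (void)tz;
      struct timeval now;
      gettimeofday(&now, NULL);
      if (!gEpochValid) { gEpoch = now; gEpochValid = 1; }

      // Elapsed time since the first call, per the (fast) CPU clock:
      long long elapsedUs = (long long)(now.tv_sec - gEpoch.tv_sec) * 1000000
                          + (now.tv_usec - gEpoch.tv_usec);

      // Rescale it and rebuild an absolute timeval:
      long long scaledUs = (long long)(elapsedUs * gClockScale);
      tv->tv_sec  = gEpoch.tv_sec  + (time_t)(scaledUs / 1000000);
      tv->tv_usec = gEpoch.tv_usec + (long)(scaledUs % 1000000);
      if (tv->tv_usec >= 1000000) { tv->tv_sec++; tv->tv_usec -= 1000000; }
      return 0;
    }

The risk here is exactly the one above: the same clock feeds the recording 
path, so we'd have to gate the scaling on a playback-only flag.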

We would appreciate any advice on where we should focus to solve this 
problem. Thanks!

-=- Mike Miller
Rockwell Collins, Inc.
Cedar Rapids, IA 