<br><font size=2 face="sans-serif">All</font>
<br>
<br><font size=2 face="sans-serif">The mention of "</font><font size=2><tt>properly-synchronized
presentation time</tt></font><font size=2 face="sans-serif">" sounds
like it might be related to a problem we are having. </font>
<br>
<br><font size=2 face="sans-serif">We have an application using the live555
library, running on an embedded processor, that records a digital audio/video
stream to a file. The stream is generated by a separate DSP that converts
analog audio and video and sends the stream out on a private network. Recordings
look pretty good when played back on a PC (with VLC, for example). Playing
back to the DSP (for conversion from digital back to analog) looks much worse
-- artifacts that we think are due to dropped frames and packets.</font>
<br>
<br><font size=2 face="sans-serif">We recently confirmed that our embedded
processor and associated DSP do not share a time reference, and there is
no real-time clock in our embedded processor -- we just use the CPU's clock
for our time reference.</font>
<br>
<br><font size=2 face="sans-serif">We think that this is what's happening
during a recording:</font>
<br>
<br><font size=2 face="sans-serif">- our DSP generates an RTP stream using
its own time reference to create timestamps and measure time intervals</font>
<br><font size=2 face="sans-serif">- our embedded processor decodes this
stream but does not rewrite timestamp info, so that the time info embedded
in the file is our DSP's measurement of time</font>
<br>
<br><font size=2 face="sans-serif">During playback on a PC, the timestamp
info from the file is used. The file plays back correctly because our DSP's
measurement of time is pretty close to the PC's measurement of time, due
to the PC's real-time clock.</font>
<br>
<br><font size=2 face="sans-serif">During playback from our embedded processor
to our DSP, our embedded processor's measurement of time is used when reading
timestamp info from the recorded file. After timing the playback of several
files, it looks like our embedded processor is running about 2% faster
relative to our DSP. This causes our processor to send packets and frames
to our DSP slightly faster than the DSP can process them, leading to dropped
packets and frames. </font>
<br>
<br><font size=2 face="sans-serif">We're digging into the liveMedia code
to understand where we might write a subclass to compensate for this on
playback. MPEG2TransportStreamFramer.cpp seems like a good place to start,
since it's already working with the PCR. As near as we can tell right now,
however, the MPEG2TransportStreamFramer class is used during both recording
and playback in our system, so we need to be very careful with changes here. </font>
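A rough, untested sketch of what we have in mind -- a small filter in the playback chain that stretches each frame's reported duration (the class name and the fixed 1.02 factor are placeholders; the structure follows the usual liveMedia FramedFilter pattern, but this class is not part of live555):

```cpp
// Untested sketch: stretch each frame's reported duration by a fixed drift
// factor so the downstream sink paces output ~2% slower, matching the DSP.
class DurationScalingFilter: public FramedFilter {
public:
  DurationScalingFilter(UsageEnvironment& env, FramedSource* inputSource,
                        double driftFactor)
    : FramedFilter(env, inputSource), fDriftFactor(driftFactor) {}

private:
  virtual void doGetNextFrame() {
    fInputSource->getNextFrame(fTo, fMaxSize, afterGettingFrame, this,
                               FramedSource::handleClosure, this);
  }

  static void afterGettingFrame(void* clientData, unsigned frameSize,
                                unsigned numTruncatedBytes,
                                struct timeval presentationTime,
                                unsigned durationInMicroseconds) {
    DurationScalingFilter* f = (DurationScalingFilter*)clientData;
    f->fFrameSize = frameSize;
    f->fNumTruncatedBytes = numTruncatedBytes;
    f->fPresentationTime = presentationTime;
    // Stretch the duration (e.g. by 1.02) so playback runs slightly slower:
    f->fDurationInMicroseconds =
        (unsigned)(durationInMicroseconds * f->fDriftFactor + 0.5);
    FramedSource::afterGetting(f);
  }

  double fDriftFactor;
};
```

Since this sits downstream of the framer rather than modifying it, the recording path could be left untouched.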
<br>
<br><font size=2 face="sans-serif">We're also looking into implementing
our own gettimeofday() function, where we could apply this compensation
during playback. Since this could also affect recordings, we'd need to
be sure that compensation was turned off during recording and enabled
only during playback.</font>
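A minimal sketch of what that compensated gettimeofday() might look like (all names and the 1.02 drift factor here are our own illustrative choices, not anything in live555): report a clock that advances ~2% slower than the raw CPU clock, so that delays scheduled against it last ~2% longer in real time.

```cpp
#include <sys/time.h>
#include <cstddef>

// Compensation is enabled only for playback and left off while recording.
static bool gCompensate = false;
static double gDriftFactor = 1.02;       // CPU runs ~2% fast vs. the DSP
static struct timeval gEpoch = {0, 0};   // reference point for scaling

// Scale an elapsed interval: divide by the drift factor to slow the clock.
long long scaleElapsedUs(long long elapsedUs, double driftFactor) {
  return (long long)(elapsedUs / driftFactor + 0.5);
}

void setPlaybackCompensation(bool on) {
  gCompensate = on;
  gettimeofday(&gEpoch, NULL);           // scaling starts from "now"
}

int compensatedGettimeofday(struct timeval* tv) {
  int ret = gettimeofday(tv, NULL);
  if (ret != 0 || !gCompensate) return ret;

  // Microseconds elapsed since compensation was enabled, per the raw clock:
  long long elapsedUs =
      (long long)(tv->tv_sec - gEpoch.tv_sec) * 1000000LL +
      (tv->tv_usec - gEpoch.tv_usec);
  long long scaledUs = scaleElapsedUs(elapsedUs, gDriftFactor);

  // Rebase the result on the epoch plus the slowed-down elapsed time:
  tv->tv_sec  = gEpoch.tv_sec + (time_t)(scaledUs / 1000000LL);
  long long us = gEpoch.tv_usec + scaledUs % 1000000LL;
  if (us >= 1000000LL) { tv->tv_sec++; us -= 1000000LL; }
  tv->tv_usec = (suseconds_t)us;
  return ret;
}
```

The toggle makes the record/playback distinction explicit, which addresses our worry about accidentally compensating recordings.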
<br>
<br><font size=2 face="sans-serif">We would appreciate any advice on where
we should focus to solve this problem. Thanks!</font>
<br>
<br><font size=2 face="sans-serif">-=- Mike Miller<br>
Rockwell Collins, Inc.<br>
Cedar Rapids, IA </font>