[Live-devel] Slow timing deviation

stage.nexvision at laposte.net
Fri Jan 7 19:38:37 PST 2005


Hi all,

   I'm sorry to ask this, but I don't think I understand the timing mechanism in RTCP.
I have an RTSP server delivering a real-time MPEG-4 stream. When I connect a client to it 
and display the difference between the timestamp of each received RTP packet and the 
"processed" timestamp (from Win32 timers), I can see that the delay increases as time 
goes on. While this is acceptable for a 4-hour stream, after a week the delay is nearly 
10 seconds, which, of course, is not good for "real-time" video.

Over the measurement period, this amounts to a drift of 10 s / (7 * 86400 s), i.e. about 
16.5 microseconds lost per second (roughly 16.5 ppm).
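
Roughly, the measurement on the client side looks like this (a simplified sketch, with 
made-up names, assuming the standard 90 kHz RTP clock for MPEG-4 video and ignoring 
multiple timestamp wraparounds):

  #include <windows.h>
  #include <stdint.h>
  #include <stdio.h>

  // Local monotonic clock in seconds, from the Win32 high-resolution counter.
  static double localSeconds() {
      LARGE_INTEGER freq, now;
      QueryPerformanceFrequency(&freq);
      QueryPerformanceCounter(&now);
      return (double)now.QuadPart / (double)freq.QuadPart;
  }

  // Called for each received RTP packet; "rtpTs" is its 32-bit timestamp.
  void onRtpPacket(uint32_t rtpTs) {
      static int started = 0;
      static uint32_t firstTs;
      static double firstLocal;
      double now = localSeconds();
      if (!started) { firstTs = rtpTs; firstLocal = now; started = 1; }
      double streamTime = (uint32_t)(rtpTs - firstTs) / 90000.0; // sender's time
      double localTime  = now - firstLocal;                      // receiver's time
      double drift = localTime - streamTime;  // grows by ~16.5 us every second
      if (streamTime > 0)
          printf("drift = %.6f s (%.1f ppm)\n", drift, drift / streamTime * 1e6);
  }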

The client sends RTCP receiver reports to synchronise, and uses the RTCP sender 
reports to get the useful information (the NTP time).
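
As I understand it, each sender report carries an (NTP timestamp, RTP timestamp) pair 
sampled at the same instant, and the client uses that pair to map packet timestamps onto 
the sender's wallclock, roughly like this (a simplified sketch, not the actual library 
code):

  #include <stdint.h>

  // One (NTP, RTP) correspondence taken from the latest RTCP sender report.
  struct SenderReportInfo {
      double   ntpSeconds;   // the SR's 64-bit NTP timestamp, converted to seconds
      uint32_t rtpTimestamp; // the RTP timestamp sampled at the same instant
  };

  // Presentation time of a packet on the *sender's* clock.  The signed cast
  // keeps the subtraction correct across one 32-bit timestamp wraparound.
  double presentationTime(const SenderReportInfo& sr, uint32_t packetTs,
                          double rtpClockRate /* 90000 for MPEG-4 video */) {
      int32_t dTs = (int32_t)(packetTs - sr.rtpTimestamp);
      return sr.ntpSeconds + dTs / rtpClockRate;
  }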

The synchronisation does work: before I used it, the delay increased by 1 minute per 
day.

Am I missing something?

Sincerely,
Cyril RUSSO

PS: The client runs under WinXP and uses QueryPerformanceCounter for precise time 
measurement, while the server runs under Linux (using gettimeofday()).
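For completeness, these are the two time sources in question (trivial wrappers; note 
that, as far as I know, QueryPerformanceCounter counts a free-running crystal that is 
never disciplined by NTP, whereas gettimeofday() reads the system clock, which can be 
slewed):

  #ifdef _WIN32
  #include <windows.h>
  // Client side (WinXP): free-running high-resolution counter.
  double preciseSeconds() {
      LARGE_INTEGER freq, count;
      QueryPerformanceFrequency(&freq);  // ticks per second
      QueryPerformanceCounter(&count);   // raw tick count since boot
      return (double)count.QuadPart / (double)freq.QuadPart;
  }
  #else
  #include <sys/time.h>
  // Server side (Linux): system wallclock, microsecond resolution.
  double preciseSeconds() {
      struct timeval tv;
      gettimeofday(&tv, 0);
      return tv.tv_sec + tv.tv_usec / 1e6;
  }
  #endif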
PPS: If I stop the stream and restart it, the delay is reset, but this causes 
interruptions (which I would like to avoid).



