I'm trying to implement trick-mode support on the server and client side. My original approach to notifying the UI of the playback position was to use the pts provided by ffmpeg as the position within the media; on the client side I divided that pts by the total duration to update my media slider. After any trick-mode operation, however, my sent and received pts values stopped matching up. I'm assuming this is because of the jitter calculations or some kind of RTP correction.
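For reference, the slider update in that first approach was just a ratio, roughly like this (my own helper, not live555 code; the names are mine):

    // 'framePTS' is the pts I shipped from the ffmpeg demuxer,
    // 'mediaDuration' is the total duration the client already knows.
    double sliderFraction(double framePTS, double mediaDuration) {
      if (mediaDuration <= 0.0) return 0.0;
      double f = framePTS / mediaDuration;
      if (f < 0.0) f = 0.0;
      if (f > 1.0) f = 1.0;
      return f;
    }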
Then I found the getNormalPlayTime() method of the subsession, and I changed my server to simply use gettimeofday() for the pts. This seemed more stable than my original attempt, but occasionally a seek, or even a pause followed by a play, would throw the NPT well past my media duration, even though the video frames update as expected.
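For context, the client side now looks roughly like this (simplified for this mail; updateSliderUI() and mediaDuration are my own, and I believe mediaDuration can be taken from the session's play range after SETUP):

    #include "liveMedia.hh"

    extern double mediaDuration;             // set once, from the session's play range
    extern void updateSliderUI(double frac); // my own UI hook, not part of live555

    // The after-getting-frame callback I hand to my sink, trimmed down:
    void afterGettingFrame(void* clientData, unsigned /*frameSize*/,
                           unsigned /*numTruncatedBytes*/,
                           struct timeval presentationTime,
                           unsigned /*durationInMicroseconds*/) {
      MediaSubsession* subsession = (MediaSubsession*)clientData;
      double npt = subsession->getNormalPlayTime(presentationTime);
      if (mediaDuration > 0.0) updateSliderUI(npt / mediaDuration);
    }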
So my question: is rtpSource->curPacketRTPTimestamp() always supposed to be larger than rtpInfo.timestamp?

In MediaSubsession::getNormalPlayTime(...), when rtpInfo.infoIsNew, the offset is set as follows:
    u_int32_t timestampOffset = rtpSource()->curPacketRTPTimestamp() - rtpInfo.timestamp;

While debugging this method, I saw the offset roll over, indicating that rtpInfo.timestamp was larger than rtpSource()->curPacketRTPTimestamp(). I changed the declaration of timestampOffset to int, and my media slider now updates correctly even after a seek, rewind, or fast-forward. I'm a little uncomfortable modifying the library source because I don't know the RTSP protocol very well; that's why I'm using your wonderful library...
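To make the roll-over concrete, here is a toy example of the arithmetic I think is happening (plain integer math, not live555 code):

    #include <cstdint>
    #include <cstdio>

    int main() {
      uint32_t curPacketTS = 1000; // stand-in for rtpSource()->curPacketRTPTimestamp()
      uint32_t rtpInfoTS   = 4000; // stand-in for rtpInfo.timestamp, slightly "ahead"

      uint32_t asUnsigned = curPacketTS - rtpInfoTS;            // wraps to 4294964296
      int32_t  asSigned   = (int32_t)(curPacketTS - rtpInfoTS); // -3000, what I'd expect

      printf("u_int32_t offset: %u\nint offset: %d\n", asUnsigned, asSigned);
      return 0;
    }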
Thanks for any reply, and sorry for the overly long request.