Sorry if this is a stupid question, but I can't fully understand how Live555's presentation times are meant to be used, *aside* from synchronizing parallel media streams. Is there any way to use these presentation times to determine the receiver's time offset from the server?

In my case, I'm trying to achieve stable, consistent and, most importantly, low-latency streaming. I've pulled this off fairly well under controlled conditions, but I'm still a bit mystified about how (or whether) it's possible to relate the presentation times computed for afterGettingFrame() to absolute time as the server understands it. That would let me calculate round-trip latency, or ask the server to skip ahead when the client falls behind due to network or other delays. I know that SRs and RRs have something to do with it and that Live555 'takes care of it' for me, but I'd like to understand more and can't seem to find resources that explain it clearly. Can anyone point me to a resource (other than the IETF docs, which I've already read and somewhat understand)?
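To make the question concrete, here is my (possibly wrong) mental model of what a Sender Report lets the receiver compute. The struct and function names below are made up for illustration; they are not Live555's API, just the arithmetic I assume it performs internally:

#include <cstdint>
#include <sys/time.h>

struct SenderReportInfo {
  double   ntpTimeSecs;   // sender's wall-clock time reported in the SR
  uint32_t rtpTimestamp;  // RTP timestamp corresponding to that wall-clock time
  unsigned clockRate;     // RTP timestamp units per second (e.g. 90000 for video)
};

// Map an RTP timestamp from a data packet onto the sender's wall clock.
double rtpToSenderWallClock(const SenderReportInfo& sr, uint32_t rtpTimestamp) {
  int32_t deltaTicks = (int32_t)(rtpTimestamp - sr.rtpTimestamp); // wrap-safe delta
  return sr.ntpTimeSecs + (double)deltaTicks / sr.clockRate;
}

// Naive receiver/sender clock-offset estimate taken when the SR arrives.
// This folds in the one-way network delay, which RTCP alone can't separate out.
double naiveClockOffsetSecs(const SenderReportInfo& sr) {
  struct timeval now;
  gettimeofday(&now, NULL);
  double localSecs = now.tv_sec + now.tv_usec / 1e6;
  return localSecs - sr.ntpTimeSecs;
}

If that picture is roughly right, then the offset I can measure this way always has the network delay baked into it, which is the part I don't know how to deal with.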
Right now my method involves recording the presentation time at the first moment that synchronization occurs and using that as my 'absolute' server-client time offset, but this is obviously a flawed approach.
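For reference, here is a stripped-down sketch of that approach. rtpSource() and hasBeenSynchronizedUsingRTCP() are the actual Live555 calls I key off; noteFrame(), haveOffset and offsetSecs are just my own scaffolding, called from my sink's afterGettingFrame():

#include <sys/time.h>
#include "liveMedia.hh"

static bool   haveOffset = false;
static double offsetSecs = 0.0;   // latched offset between my clock and the stream's presentation clock

// Called from my MediaSink subclass's afterGettingFrame() for each received frame.
void noteFrame(MediaSubsession& subsession, struct timeval presentationTime) {
  RTPSource* src = subsession.rtpSource();
  bool synced = src != NULL && src->hasBeenSynchronizedUsingRTCP();

  struct timeval now;
  gettimeofday(&now, NULL);
  double localSecs = now.tv_sec + now.tv_usec / 1e6;
  double presSecs  = presentationTime.tv_sec + presentationTime.tv_usec / 1e6;

  if (synced && !haveOffset) {
    // First RTCP-synchronized frame: latch the apparent client/server offset.
    offsetSecs = localSecs - presSecs;
    haveOffset = true;
  }

  if (haveOffset) {
    // Apparent lag of this frame. The flaw: offsetSecs silently includes the
    // one-way network delay from the moment it was latched.
    double lagSecs = localSecs - (presSecs + offsetSecs);
    (void)lagSecs; // e.g. decide here whether to ask the server to skip ahead
  }
}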
Thanks,
Jesse