[Live-devel] Regarding the h264 video stream's rtp packet timestamp

Tony fantasyvideo at 126.com
Tue Oct 22 02:18:10 PDT 2013


Thanks for your answer.
I have another question, about scheduleDelayedTask(duration, x, x).
How should I set the duration so that the audio and video stay in sync?
Currently each of my audio frames lasts 20000 ms, and the video frame rate is 25 fps. Right now I set the audio's next getframe time to 20000 ms and the video's next getframe time to 8000 ms.
In that case, when I use VLC to access the stream, it reports "PTS is out of range" for the audio; it seems that the audio arrives too late. If I slow down the video send rate instead, VLC reports "PTS is out of range" for the video.
Is there any way to solve this?
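
(For reference, a minimal sketch of the kind of pacing described above, assuming a live555 FramedSource subclass; the class name MyPacedSource and the pacing approach are illustrative only, not code from this thread. Note that the first argument of scheduleDelayedTask() is in microseconds, so a 20 ms audio frame corresponds to 20000, and one video frame at 25 fps to 1000000/25 = 40000.)

    #include "FramedSource.hh"

    // Hypothetical paced source, purely illustrative.
    class MyPacedSource: public FramedSource {
    public:
      MyPacedSource(UsageEnvironment& env, unsigned uSecsPerFrame)
        : FramedSource(env), fUSecsPerFrame(uSecsPerFrame) {}

    private:
      virtual void doGetNextFrame() {
        // ... copy one encoded frame into fTo; set fFrameSize and fPresentationTime ...

        // Return to the event loop and complete delivery one frame period later,
        // so the downstream object's next getNextFrame() call is paced at the
        // source's frame rate (e.g. 20000 us for 20 ms audio, 40000 us for 25 fps video):
        envir().taskScheduler().scheduleDelayedTask(fUSecsPerFrame,
            (TaskFunc*)FramedSource::afterGetting, this);
      }

      unsigned fUSecsPerFrame;
    };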






On 2013-10-20 16:29:31, "Ross Finlayson" <finlayson at live555.com> wrote:

   In the video frame source's getNextFrame, if the buffer contains a single NAL unit rather than a complete frame, then fPresentationTime and fDurationInMicroseconds should only be set when the buffer holds the last NAL unit of the current frame.
   Is that right?


Not quite.  "fPresentationTime" should be set for every NAL unit that you deliver.  However, for NAL units that make up the same access unit, the "fPresentationTime" value will be the same.
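
As a purely illustrative fragment (the names MyH264Source, fAccessUnitPTS and deliverOneNALUnit() are hypothetical, not from live555 or this thread), the rule amounts to reusing one timestamp for every NAL unit split out of the same access unit:

    // Sketch: fAccessUnitPTS holds the capture time of the access unit that is
    // currently being split into NAL units.
    void MyH264Source::deliverOneNALUnit() {
      // ... copy this NAL unit's bytes into fTo and set fFrameSize ...
      fPresentationTime = fAccessUnitPTS; // identical for every NAL unit of this access unit
      FramedSource::afterGetting(this);   // hand the NAL unit to the downstream object
    }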


Also, if you are streaming from a live source (i.e., from an encoder), rather than from a file, then you don't need to set "fDurationInMicroseconds" at all.  If, however, you are streaming pre-recorded video (e.g., from a file), then you will need to set "fDurationInMicroseconds" for the last NAL unit of the access unit (and leave "fDurationInMicroseconds" for the other NAL units at the default value of 0).
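
Again only as an illustrative fragment, for a file-based source at 25 fps (the flag isLastNALUnitOfAccessUnit is a hypothetical name for whatever your parsing code computes):

    if (isLastNALUnitOfAccessUnit) {
      // One video frame at 25 fps lasts 1000000/25 microseconds:
      fDurationInMicroseconds = 40000;
    } else {
      fDurationInMicroseconds = 0; // leave the other NAL units at the default value
    }
    // For a live encoder source, fDurationInMicroseconds need not be set at all.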



Ross Finlayson
Live Networks, Inc.
http://www.live555.com/


