[Live-devel] Does live555 provide a way to sync video/audio streams?

Ross Finlayson finlayson at live555.com
Sun Aug 16 20:01:36 PDT 2020



> On Aug 17, 2020, at 12:27 PM, Eric Hsieh <Eric.Hsieh at liteon.com> wrote:
> 
> And one question about syncing video/audio streams.
> 
> Background:
> Our system prepares audio/video frames in memory, continuously capturing frames from the driver.
> For video, it is H.264 at 30 FPS, and the driver outputs one I-frame per second.
> For audio, it is AAC at 16 kHz.
> 
> Behavior:
> When a player requests RTSP streaming from our server:
> For the video stream, we go to memory and find the latest I-frame to send.
> For the audio stream, we go to memory and find the latest frame to send.
> In the worst case, the first timestamp of the video frame <= the first timestamp of the audio frame + 1 second.
> 
> Question:
> We want to sync the video/audio streams when a player asks for a stream.
> The video/audio streams belong to the same ServerMediaSession.
> Is it possible to pass the first timestamp of the video frame to the audio stream?
> Is there a way to sync data between FramedSource objects?

Yes.  As we note here
	http://live555.com/liveMedia/faq.html#separate-rtp-streams
the LIVE555 code does this automatically, *provided that*:
	1/ Accurate presentation times (i.e., the “fPresentationTime” variable) are set each time your “FramedSource” subclass delivers a frame (i.e., when implementing “doGetNextFrame()”), and
	2/ These presentation times are aligned to ‘wall clock’ time - i.e., the time that you’d get by calling “gettimeofday()” (see the sketch below).
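
For illustration only - this is a sketch under assumptions, not code from the LIVE555 distribution - a “FramedSource” subclass that reads encoded frames from memory might satisfy both requirements roughly as follows.  The class name “MemoryFrameSource” and the helper “fetchLatestFrameFromDriver()” are hypothetical placeholders for your own capture code; the “DeviceSource.cc” file in the distribution is the usual model to follow:

	#include "FramedSource.hh"
	#include <sys/time.h>

	class MemoryFrameSource: public FramedSource {
	public:
	  static MemoryFrameSource* createNew(UsageEnvironment& env) {
	    return new MemoryFrameSource(env);
	  }

	protected:
	  MemoryFrameSource(UsageEnvironment& env): FramedSource(env) {}

	private:
	  virtual void doGetNextFrame() {
	    // Copy the next encoded frame from your driver's memory into the
	    // buffer provided by the downstream LIVE555 object ("fTo"):
	    unsigned frameSize = fetchLatestFrameFromDriver(fTo, fMaxSize);
	    if (frameSize > fMaxSize) {
	      fFrameSize = fMaxSize;
	      fNumTruncatedBytes = frameSize - fMaxSize;
	    } else {
	      fFrameSize = frameSize;
	      fNumTruncatedBytes = 0;
	    }

	    // Requirements 1/ and 2/: set an accurate presentation time,
	    // aligned to 'wall clock' time, for every delivered frame:
	    gettimeofday(&fPresentationTime, NULL);
	    fDurationInMicroseconds = 0; // appropriate for a live source

	    // Tell the downstream object that a frame is available.  (In real
	    // code you would usually return to the event loop first - e.g., via
	    // "scheduleDelayedTask()" - as "DeviceSource.cc" explains, rather
	    // than calling this synchronously from within "doGetNextFrame()".)
	    FramedSource::afterGetting(this);
	  }

	  // Hypothetical placeholder for your own code that copies the most
	  // recent encoded frame (up to "maxSize" bytes) into "to" and returns
	  // its full size:
	  unsigned fetchLatestFrameFromDriver(unsigned char* to, unsigned maxSize) {
	    (void)to; (void)maxSize;
	    return 0; // replace with your real frame-retrieval code
	  }
	};

Because the same wall-clock-aligned presentation times are set in both the video and the audio “FramedSource” subclasses, the RTCP “SR” reports that the server generates allow a standards-compliant client to synchronize the two streams; you do not need to pass a timestamp from one stream to the other yourself.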

Note that you should use the term “presentation time” rather than “timestamp”.  (The term “timestamp” can be confusing, because RTP has ‘timestamps’, but they are used only internally by the RTP/RTCP protocol, and are not visible to server (or client) application code.)

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

