[Live-devel] Live555 Streaming from a live source
Ross Finlayson
finlayson at live555.com
Mon Jul 26 18:12:56 PDT 2010
>how is the presentation time of two streams synchronised?
Please read the FAQ!
>I have to synchronise an MPEG-4 ES and a WAVE file. I am able to
>send the two streams together by creating a single ServerMediaSession
>and adding two separate ServerMediaSubsessions, but they are not
>synchronised.
>In the case of MPEG-4 ES video, gettimeofday() is called when the
>constructor of MPEGVideoStreamFramer runs, and in the case of WAVE,
>in WAVAudioFileSource::doGetNextFrame(). I think this is why the
>video and audio are not synchronised. So in this case, how should I
>synchronise the audio and video?
You *must* set accurate "fPresentationTime" values for each frame of
each of your sources. These values - and only these values - are
used for synchronization. If the "fPresentationTime" values are not
accurate - and synchronized - at the server, then they cannot
possibly become synchronized at a client.
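
For a live source, that usually means stamping each frame with its
capture time, taken from the same wall clock in both the audio and
the video source. A minimal sketch follows; the class name
"MyLiveVideoSource" and the "fetchEncodedFrame()" helper are
hypothetical, not part of the library - the "fPresentationTime"
handling is the point:

#include <sys/time.h>
#include "FramedSource.hh"

class MyLiveVideoSource: public FramedSource {
public:
  MyLiveVideoSource(UsageEnvironment& env): FramedSource(env) {}

private:
  virtual void doGetNextFrame() {
    // Copy the next encoded frame into "fTo" (at most "fMaxSize"
    // bytes), setting "fFrameSize" accordingly, e.g.:
    //   fFrameSize = fetchEncodedFrame(fTo, fMaxSize);

    // Stamp the frame with the wall-clock capture time. Both the
    // audio and the video source must take their timestamps from
    // the same clock, or the streams cannot be synchronised.
    gettimeofday(&fPresentationTime, NULL);

    // Deliver the completed frame to the downstream object.
    FramedSource::afterGetting(this);
  }
};

Note that if your encoder buffers frames, you should record the
time at capture and carry it along with the frame data, rather than
sampling the clock at delivery time as the sketch above does.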
--
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/