[Live-devel] Synchronizing audio and video streams
Severin Schoepke
severin.schoepke at gmail.com
Fri May 25 09:35:51 PDT 2007
Hello list,
I'm working on an application that streams live-generated content (audio
and video) using the Darwin Streaming Server. I decided to use ffmpeg
for the encoding part and live555 for the streaming part. The basic
architecture is as follows: a first thread generates the content, and a
second one streams it to the DSS. Data is passed between the threads
using two buffers (one per stream).
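The hand-off between the two threads is basically a mutex-protected FIFO
per stream. Roughly (a simplified sketch; the class name FrameBuffer is
just for illustration and error handling is omitted):

#include <deque>
#include <vector>
#include <pthread.h>

// Simplified sketch of the per-stream hand-off buffer between the content
// generator thread and the streaming thread.
class FrameBuffer {
public:
  FrameBuffer() { pthread_mutex_init(&fMutex, NULL); }
  ~FrameBuffer() { pthread_mutex_destroy(&fMutex); }

  // Called by the generator thread to add one frame's worth of data.
  void push(std::vector<unsigned char> const& frame) {
    pthread_mutex_lock(&fMutex);
    fQueue.push_back(frame);
    pthread_mutex_unlock(&fMutex);
  }

  // Called by the streaming thread; returns false if no frame is available.
  bool pop(std::vector<unsigned char>& frame) {
    pthread_mutex_lock(&fMutex);
    bool haveFrame = !fQueue.empty();
    if (haveFrame) {
      frame = fQueue.front();
      fQueue.pop_front();
    }
    pthread_mutex_unlock(&fMutex);
    return haveFrame;
  }

private:
  std::deque<std::vector<unsigned char> > fQueue;
  pthread_mutex_t fMutex;
};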
For the streaming part, I implemented two FramedSource subclasses, one
for audio and one for video. Both work the same way: they read raw data
from their input buffer, encode it using ffmpeg (MPEG-4 for video, MP2
for audio), and write the encoded data to their output buffer (fTo). The
video source is then connected to an MPEG4VideoStreamFramer and the audio
source to an MPEG1or2AudioStreamFramer, and both are fed into a
DarwinInjector (based on the code of testMPEG4VideoToDarwin).
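In outline, each source's doGetNextFrame() looks roughly like the sketch
below (simplified: the ffmpeg encoding step is omitted, FrameBuffer is
the hand-off buffer sketched above, and the class name and the 25 fps
figure are just for illustration):

#include "FramedSource.hh"
#include <string.h>
#include <sys/time.h>
#include <vector>

class LiveVideoSource: public FramedSource {
public:
  static LiveVideoSource* createNew(UsageEnvironment& env, FrameBuffer& buf) {
    return new LiveVideoSource(env, buf);
  }

protected:
  LiveVideoSource(UsageEnvironment& env, FrameBuffer& buf)
      : FramedSource(env), fBuffer(buf) {
    gettimeofday(&fNextPT, NULL); // presentation time of the first frame
  }

  virtual void doGetNextFrame() {
    std::vector<unsigned char> frame;
    if (!fBuffer.pop(frame)) {
      // No frame ready yet: poll again in 10 ms rather than block the
      // live555 event loop.
      envir().taskScheduler().scheduleDelayedTask(
          10000, (TaskFunc*)tryAgain, this);
      return;
    }

    // The ffmpeg encoding step would go here; this sketch just forwards
    // the data as if it were already encoded.

    // Copy the frame into the downstream buffer (fTo), truncating if needed.
    fFrameSize = frame.size();
    if (fFrameSize > fMaxSize) {
      fNumTruncatedBytes = fFrameSize - fMaxSize;
      fFrameSize = fMaxSize;
    } else {
      fNumTruncatedBytes = 0;
    }
    memcpy(fTo, &frame[0], fFrameSize);

    // Presentation time and duration: advanced by one frame period per
    // delivered frame (25 fps here).
    fPresentationTime = fNextPT;
    fDurationInMicroseconds = 40000;
    fNextPT.tv_usec += 40000;
    if (fNextPT.tv_usec >= 1000000) { fNextPT.tv_sec++; fNextPT.tv_usec -= 1000000; }

    // Hand the frame to the downstream object (the framer).
    FramedSource::afterGetting(this);
  }

private:
  static void tryAgain(void* clientData) {
    ((LiveVideoSource*)clientData)->doGetNextFrame();
  }

  FrameBuffer& fBuffer;
  struct timeval fNextPT;
};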
The problem I have now is that the two streams are not synchronized
during playback (in QuickTime or VLC). Based on debug printf's I found
out that the audio source's doGetNextFrame() is called much more often
than the video source's. As a result, the audio stream plays ahead while
the video stream lags behind.
I set correct presentation times for both streams, so I assumed that
live555 would do 'the rest' of the synchronization, but that does not
seem to work. So I'd like to ask whether I should implement
synchronization of the streams myself, or whether I'm doing something wrong...
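Concretely, by 'correct presentation times' I mean that both streams
share a common start time and advance by the amount of media delivered
so far. For the audio side that looks roughly like this (a simplified
sketch; the 44.1 kHz rate and the names are just for illustration, and
the video source does the analogous thing with its frame period):

#include <sys/time.h>

// Presentation time of the next MP2 frame, derived from a start time
// shared with the video source. Each MP2 frame carries 1152 samples.
static struct timeval gStreamStart;        // set once with gettimeofday()
static unsigned gSamplesDelivered = 0;     // audio samples handed out so far
static const unsigned kSampleRate = 44100; // illustrative sample rate

void setAudioPresentationTime(struct timeval& pt) {
  double t = (double)gSamplesDelivered / kSampleRate; // seconds of audio so far
  pt.tv_sec  = gStreamStart.tv_sec  + (long)t;
  pt.tv_usec = gStreamStart.tv_usec + (long)((t - (long)t) * 1000000);
  if (pt.tv_usec >= 1000000) { pt.tv_sec += 1; pt.tv_usec -= 1000000; }
  gSamplesDelivered += 1152;               // one MP2 frame per call
}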
I hope someone can help me, and thank you in advance for any answers!
cheers, Severin