[Live-devel] .m4v / .mp3 Synchronization

Michael Russell mrussell at frontiernet.net
Tue Jul 7 19:16:19 PDT 2009


Ross Finlayson wrote:
> That's correct.  The timestamps (specifically, the "fPresentationTime" 
> variable) should be set by each Framer object. These are used to set 
> the SCR timestamps in the resulting Transport Stream.  So I'm not sure 
> why this isn't working for you; you're going to have to track this 
> down yourself.
>
> In particular, look at how the "fSCR" variable is set in the 
> "InputESSourceRecord::afterGettingFrame()" function, in 
> "MPEG2TransportStreamFromESSource".
Thanks for your help, Ross.

I investigated my synchronization problem further, and I have some additional information in case you (or anyone else, for that matter) can offer further suggestions or direction.

To recap, here is what my data chain looks like after I took your suggestion to write the transport stream to a file instead of streaming it over RTP:

I have two independent ByteStreamFileSource objects - 
  One feeds MPEG-4 video elementary stream (.m4v) data to an MPEG4VideoStreamFramer.
  One feeds MPEG-1, Layer 3 (.mp3) audio data to an MPEG1or2AudioStreamFramer.

Both framers then feed a single MPEG2TransportStreamFromESSource object.
The resultant transport stream feeds a FileSink object that writes my .ts file.
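
In code, the chain is set up roughly like this (the file names and the afterPlaying callback are placeholders, and error checking is omitted):

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"

void afterPlaying(void* /*clientData*/) { /* placeholder */ }

int main() {
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  // Two independent elementary-stream file sources:
  ByteStreamFileSource* videoES =
      ByteStreamFileSource::createNew(*env, "test.m4v");
  ByteStreamFileSource* audioES =
      ByteStreamFileSource::createNew(*env, "test.mp3");

  // One framer per elementary stream (these set "fPresentationTime"):
  MPEG4VideoStreamFramer* videoFramer =
      MPEG4VideoStreamFramer::createNew(*env, videoES);
  MPEG1or2AudioStreamFramer* audioFramer =
      MPEG1or2AudioStreamFramer::createNew(*env, audioES);

  // Both framers feed the same Transport Stream multiplexor:
  MPEG2TransportStreamFromESSource* tsSource =
      MPEG2TransportStreamFromESSource::createNew(*env);
  tsSource->addNewVideoSource(videoFramer, 4 /*MPEG-4 video*/);
  tsSource->addNewAudioSource(audioFramer, 1 /*MPEG-1 (Layer 3) audio*/);

  // The resulting Transport Stream is written to a .ts file:
  FileSink* fileSink = FileSink::createNew(*env, "test.ts");
  fileSink->startPlaying(*tsSource, afterPlaying, NULL);

  env->taskScheduler().doEventLoop();
  return 0;
}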

The synchronization problem that I originally described has turned into a "truncated video" problem.  The only difference now is that I'm not streaming over the network anymore.  Weird, but whatever.

Here is what I have observed/concluded so far:

1) If I remove either the audio or the video framer, the resultant transport stream file plays fine (just audio or just video, of course, but it plays fine).

2) With both the audio and video framers in place, the resultant transport stream contains a truncated video stream (about 7% of its original length).  I have verified this with various media analysis tools.  Of course, when I try to play the file, the video portion soon stops while the audio portion continues to play to completion.

3) I enabled the debug output in liveMedia and noted that the PTS and SCR calculations for both the audio and video streams look correct.  I saw nothing out of the ordinary in the debug output, except that the PTS/SCR calculations for the video stream end early (further evidence of its truncation).

My own additional debug output suggests that the MPEG2TransportStreamFromESSource object simply stops calling getNextFrame() on the video input source (the MPEG4VideoStreamFramer).  I have no idea why this would happen.  It seems that MPEG2TransportStreamFromESSource runs into trouble when my two framer objects feed it at the same time.
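
For what it's worth, here is roughly how I captured that output: a minimal pass-through filter (my own class and naming, not part of liveMedia), inserted between each framer and the multiplexor, that logs every frame request and delivery:

// A minimal pass-through filter (my own code) that logs each
// getNextFrame() request before forwarding it to the wrapped source:
class TraceFilter: public FramedFilter {
public:
  static TraceFilter* createNew(UsageEnvironment& env,
                                FramedSource* inputSource, char const* tag) {
    return new TraceFilter(env, inputSource, tag);
  }

protected:
  TraceFilter(UsageEnvironment& env, FramedSource* inputSource,
              char const* tag)
    : FramedFilter(env, inputSource), fTag(tag) {}

private:
  virtual void doGetNextFrame() {
    envir() << fTag << ": getNextFrame() requested\n";
    fInputSource->getNextFrame(fTo, fMaxSize, afterGettingFrame, this,
                               FramedSource::handleClosure, this);
  }

  static void afterGettingFrame(void* clientData, unsigned frameSize,
                                unsigned numTruncatedBytes,
                                struct timeval presentationTime,
                                unsigned durationInMicroseconds) {
    TraceFilter* filter = (TraceFilter*)clientData;
    // Pass the frame (and its presentation time) through unchanged:
    filter->fFrameSize = frameSize;
    filter->fNumTruncatedBytes = numTruncatedBytes;
    filter->fPresentationTime = presentationTime;
    filter->fDurationInMicroseconds = durationInMicroseconds;
    filter->envir() << filter->fTag << ": frame delivered, pts="
                    << (int)presentationTime.tv_sec << "s\n";
    FramedSource::afterGetting(filter);
  }

  char const* fTag;
};

Wrapping each framer in one of these shows the video-side requests simply stopping partway through the file, while the audio-side requests continue to the end of the stream.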

Here's a thought: Should I be using ByteStreamMultiFileSource instead of two separate ByteStreamFileSource objects?  Any other ideas for me to try?
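
For what it's worth, my (possibly mistaken) reading of the ByteStreamMultiFileSource header is that it takes a NULL-terminated array of file names and reads them back to back, as one concatenated byte stream:

// My understanding of the API (file names are placeholders):
char const* fileNames[] = { "part1.m4v", "part2.m4v", NULL };
ByteStreamMultiFileSource* multiSource =
    ByteStreamMultiFileSource::createNew(*env, fileNames);

If that's right, it probably doesn't fit here, since my audio and video files need to be read in parallel rather than sequentially; please correct me if I've misread it.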


Regards,
Mike.


