[Live-devel] Live video and audio streaming using one RTSPServer

Ross Finlayson finlayson at live555.com
Thu Nov 5 02:15:54 PST 2015


> Sorry, I meant fLastPlayTime

That is a member variable used internally by the implementation of the “ByteStreamFileSource” class; it is not something that you need to concern yourself with.

But presumably - if you’re using a “ByteStreamFileSource” - you’re feeding it into a “H264VideoStreamFramer” (and then into a “H264VideoRTPSink”).  The “H264VideoStreamFramer” parses the byte stream into a sequence of discrete H.264 NAL units, each with a proper presentation time.
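For reference, here is a minimal sketch of that chain, loosely modeled on the “testH264VideoStreamer” demo that ships with live555.  The file name, multicast address, and port are placeholders, and the exact “Groupsock” constructor signature varies across releases (this follows the 2015-era API):

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"
#include "GroupsockHelper.hh"

UsageEnvironment* env;

void afterPlaying(void* /*clientData*/) {
  *env << "...done streaming\n";
}

int main() {
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  env = BasicUsageEnvironment::createNew(*scheduler);

  // Placeholder destination; substitute your own multicast address/port:
  struct in_addr destinationAddress;
  destinationAddress.s_addr = our_inet_addr("239.255.42.42");
  const Port rtpPort(18888);
  const unsigned char ttl = 255;
  Groupsock rtpGroupsock(*env, destinationAddress, rtpPort, ttl);

  // 1) Read the file as an unstructured byte stream:
  ByteStreamFileSource* byteSource
    = ByteStreamFileSource::createNew(*env, "test.264"); // placeholder file name

  // 2) Parse the byte stream into discrete H.264 NAL units,
  //    each stamped with a presentation time:
  H264VideoStreamFramer* videoSource
    = H264VideoStreamFramer::createNew(*env, byteSource);

  // 3) Packetize the NAL units into RTP (96 is a dynamic payload type):
  RTPSink* videoSink
    = H264VideoRTPSink::createNew(*env, &rtpGroupsock, 96);

  videoSink->startPlaying(*videoSource, afterPlaying, NULL);
  env->taskScheduler().doEventLoop(); // does not return
  return 0;
}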

However, if you’re trying to synchronize this video with audio, then you may want to rethink your decision to have audio and video be read as byte streams.  If, instead, you read each as discrete frames - one at a time - then it’ll be much easier to generate presentation times that are properly synchronized.  (A sketch of such a discrete-frame source follows.)
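To illustrate, here is a hedged sketch of a discrete-frame source - a “FramedSource” subclass along the lines of the “DeviceSource” skeleton in the live555 source tree.  The class name and the “readOneEncodedFrame()” helper are hypothetical placeholders for your own capture/encoder code; the point is that each delivered frame gets one presentation time, taken from a wall clock that both your audio and video sources share:

#include "FramedSource.hh"
#include <sys/time.h>

class DiscreteFrameSource: public FramedSource {
public:
  static DiscreteFrameSource* createNew(UsageEnvironment& env) {
    return new DiscreteFrameSource(env);
  }

protected:
  DiscreteFrameSource(UsageEnvironment& env): FramedSource(env) {}

private:
  // Hypothetical placeholder: copy one complete encoded frame into "to"
  // (at most "maxSize" bytes) and return its size.
  unsigned readOneEncodedFrame(unsigned char* to, unsigned maxSize) {
    (void)to; (void)maxSize;
    return 0; // ... your capture/encoder code goes here ...
  }

  virtual void doGetNextFrame() {
    // Deliver exactly one frame per call:
    fFrameSize = readOneEncodedFrame(fTo, fMaxSize);

    // Stamp the frame with the wall-clock capture time.  If the audio
    // source stamps its frames from the same clock, RTCP lets receivers
    // synchronize the two streams:
    gettimeofday(&fPresentationTime, NULL);

    // Inform the downstream object that a frame is now available:
    FramedSource::afterGetting(this);
  }
};

For H.264, a discrete-frame source like this would typically feed a “H264VideoStreamDiscreteFramer” (rather than the byte-stream parser) before the “H264VideoRTPSink”.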


(Unfortunately, we’ve probably now reached the limit for how much assistance I can provide you ‘for free’ on this mailing list.  If your company is interested in having me consult further on your project, then please have your management let me know (via separate email).)

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/



