[Live-devel] How to sync H264 and AAC timestamp from live streaming

Ross Finlayson finlayson at live555.com
Mon Oct 5 18:35:58 PDT 2015


> I am new to live555 and am porting it to our platform.
> We are now coding an RTSP server based on the
> testOnDemandRTSPServer sample code.
> We created 4 new classes to read H264 and AAC frames from our ring buffer rather than from a file.
> On each call to doGetNextFrame(), the class delivers a “discrete” frame to fTo.
> The problem we now face is very similar to
> http://lists.live555.com/pipermail/live-devel/2014-September/018686.html
Eric,

Please read that message once again, because it explains your problem.  Specifically:

> 6. Now, we use H264VideoStreamFramer, not H264VideoStreamDiscreteFramer,
>    because we need a parser to create the SDP description for H264.

No, once again: if you’re streaming audio along with the video, then you should use a “H264VideoStreamDiscreteFramer”, *not* a “H264VideoStreamFramer”.  A “H264VideoStreamFramer” parses a byte stream and computes its own presentation times, so they won’t stay aligned with your audio’s presentation times; the ‘discrete’ framer passes through the presentation times that your own source sets.
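For illustration, here is a minimal sketch of the “createNewStreamSource()” implementation in your “OnDemandServerMediaSubsession” subclass.  (The names “MyH264Subsession” and “RingBufferH264Source” are stand-ins for your own classes; adapt as appropriate.)

#include "H264VideoStreamDiscreteFramer.hh"

FramedSource* MyH264Subsession::createNewStreamSource(
    unsigned /*clientSessionId*/, unsigned& estBitrate) {
  estBitrate = 500; // kbps; set this to your encoder's actual bit rate

  // Your own source class, delivering one complete NAL unit per
  // "doGetNextFrame()" call, *without* a leading 0x00000001 start code:
  FramedSource* nalSource = RingBufferH264Source::createNew(envir());

  // Use the *discrete* framer, because the input is already one NAL
  // unit per frame; no byte-stream parsing is needed (or wanted):
  return H264VideoStreamDiscreteFramer::createNew(envir(), nalSource);
}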

Getting the SPS and PPS NAL units for the stream should not be a problem.  This is usually something that you need to do just once (unless you are changing the parameters of your H.264 stream each time).  Once you have the SPS and PPS NAL units, you can just modify your implementation of the “createNewRTPSink()” virtual function to use one of the variants of “H264VideoRTPSink::createNew()” that take the SPS and PPS NAL units as parameters.  See “liveMedia/include/H264VideoRTPSink.hh”.
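Again as a sketch only - assuming “fSPS”/“fSPSSize”/“fPPS”/“fPPSSize” are member variables in which you stored the NAL units captured (once) from your encoder:

#include "H264VideoRTPSink.hh"

RTPSink* MyH264Subsession::createNewRTPSink(
    Groupsock* rtpGroupsock, unsigned char rtpPayloadTypeIfDynamic,
    FramedSource* /*inputSource*/) {
  // Because the SPS and PPS NAL units are passed in here, the SDP
  // description can be generated without parsing the stream:
  return H264VideoRTPSink::createNew(envir(), rtpGroupsock,
                                     rtpPayloadTypeIfDynamic,
                                     fSPS, fSPSSize, fPPS, fPPSSize);
}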

Another thing that you need to be aware of is that your “fPresentationTime” values - for both video and audio - need to be aligned with ‘wall clock’ time, i.e., with the times that you’d get by calling “gettimeofday()”.
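In your source classes, that means stamping each frame something like this (a sketch only; the ring-buffer access is hypothetical, and a real implementation would arrange to be re-called when no data is available):

#include <sys/time.h>
#include <string.h>

void RingBufferH264Source::doGetNextFrame() {
  Frame* f = fRingBuffer->nextFrame(); // hypothetical ring-buffer API
  if (f == NULL) return; // schedule a re-call when data arrives

  // Copy the frame data to "fTo", truncating if it's too big:
  fFrameSize = f->size;
  if (fFrameSize > fMaxSize) {
    fNumTruncatedBytes = fFrameSize - fMaxSize;
    fFrameSize = fMaxSize;
  }
  memcpy(fTo, f->data, fFrameSize);

  // The key point: both the video source and the audio source must set
  // "fPresentationTime" from the same 'wall clock', so that the two
  // streams can be synchronized (via RTCP) at the receiving end:
  gettimeofday(&fPresentationTime, NULL);

  FramedSource::afterGetting(this);
}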

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
