[Live-devel] How to sync H264 and AAC timestamp from live streaming
Eric_Hsieh at alphanetworks.com
Mon Oct 5 18:55:06 PDT 2015
Dear Ross,
Thanks for your kind help.
We now use H264VideoStreamDiscreteFramer for our live streaming, and it works better than before.
We have also confirmed that the timestamps are correct and synchronized with “wall clock” time.
However, we still have two questions and need your help.
1. H264+AAC streaming is now working well, but after playing for about 3 hours the audio is still good while the video no longer plays smoothly. Do you have any suggestions?
2. We also need to do JPEG+AAC live streaming. After playing for about 2 minutes, the video freezes, which is the same problem as with H264+AAC. There is no DiscreteFramer class for JPEG streaming; do you have any suggestions?
Thanks a lot.
regards, eric, 10/06
On Oct 6, 2015, at 09:35, Ross Finlayson <finlayson at live555.com> wrote:
I am new to live555 and am porting it to our platform.
We are writing an RTSP server based on the testOnDemandRTSPServer sample code.
We created 4 new classes to read H264 and AAC frames from our ring buffer rather than from a file.
Each time doGetNextFrame() is called, the class delivers one “discrete” frame to fTo.
The problem we are now facing is very similar to
http://lists.live555.com/pipermail/live-devel/2014-September/018686.html
Eric,
Please read that message once again, because it explains your problem. Specifically:
6. Now we use H264VideoStreamFramer, not H264VideoStreamDiscreteFramer,
because we need a parser to create the SDP description for H264.
No, once again, if you’re streaming audio along with the video, then you should use a “H264VideoStreamDiscreteFramer”, *not* a “H264VideoStreamFramer”.
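For example, a minimal sketch of the “createNewStreamSource()” virtual function in your “OnDemandServerMediaSubsession” subclass (the names “MyH264Subsession” and “RingBufferH264Source” here are just placeholders for your own classes) might look like this:

  FramedSource* MyH264Subsession::createNewStreamSource(unsigned /*clientSessionId*/,
                                                         unsigned& estBitrate) {
    estBitrate = 2000; // kbps; a rough estimate of the stream's bitrate

    // Your own source, delivering one complete NAL unit per "doGetNextFrame()":
    FramedSource* liveSource = RingBufferH264Source::createNew(envir());

    // Wrap it in a *discrete* framer; no byte-stream parsing is done:
    return H264VideoStreamDiscreteFramer::createNew(envir(), liveSource);
  }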
Getting the SPS and PPS NAL units for the stream should not be a problem. This is usually something that you need to do just once (unless you are changing the parameters of your H.264 stream each time). Once you have the SPS and PPS NAL units, you can just modify your implementation of the “createNewRTPSink()” virtual function to use one of the variants of “H264VideoRTPSink::createNew()” that take the SPS and PPS NAL units as parameters. See “liveMedia/include/H264VideoRTPSink.hh”.
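For example (a sketch only; “fSPS”, “fSPSSize”, “fPPS”, and “fPPSSize” are assumed to be member variables holding the SPS and PPS NAL units that you obtained from your encoder):

  RTPSink* MyH264Subsession::createNewRTPSink(Groupsock* rtpGroupsock,
                                              unsigned char rtpPayloadTypeIfDynamic,
                                              FramedSource* /*inputSource*/) {
    // Pass the SPS and PPS NAL units directly, so that no parsing of the
    // stream is needed to generate the SDP description:
    return H264VideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic,
                                       fSPS, fSPSSize, fPPS, fPPSSize);
  }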
Another thing that you need to be aware of is that your “fPresentationTime” values, for both video and audio, need to be aligned with ‘wall clock’ time, i.e., with the times that you’d get by calling “gettimeofday()”.
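For a live source, stamping each frame with the current time at the moment it is delivered is usually sufficient, and it keeps audio and video on the same clock. A sketch (again using the placeholder “RingBufferH264Source”; “deliverFrame()” is a hypothetical helper called from “doGetNextFrame()”):

  #include <sys/time.h>

  void RingBufferH264Source::deliverFrame() {
    // ... copy one frame from the ring buffer into "fTo" and set "fFrameSize" ...

    // Align the presentation time with 'wall clock' time:
    gettimeofday(&fPresentationTime, NULL);

    // (If your capture hardware provides its own timestamps, convert them to the
    // "gettimeofday()" epoch instead, so that audio and video stay in sync.)

    FramedSource::afterGetting(this); // deliver the frame downstream
  }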
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/