[Live-devel] Regarding the h264 video stream's rtp packet timestamp
Ross Finlayson
finlayson at live555.com
Fri Oct 25 21:05:12 PDT 2013
> nextTask() = envir().taskScheduler().scheduleDelayedTask(20000,(TaskFunc*)FramedSource::afterGetting, this);
[...]
> nextTask() = envir().taskScheduler().scheduleDelayedTask(8000,(TaskFunc*)FramedSource::afterGetting, this);
This is wrong. Once you've acquired a frame of data (using your "GetNextAudioFrame()"/"GetNextVideoFrame()" calls) for the downstream object, you shouldn't delay at all before completing the delivery. Instead, just call
FramedSource::afterGetting(this);
directly, in each case.
Note, however, that you call this ***only if*** you were able to successfully deliver a frame of data - i.e., if the acquired frame size was >0. If, instead, you were not able to deliver a frame of data (i.e., the acquired frame size was 0), then you must arrange for the delivery (and the call to "FramedSource::afterGetting(this)") to take place later, when a frame of data becomes available.
Once again, I suggest using the "DeviceSource" code as a model.
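To illustrate the pattern, here is a minimal sketch of a "doGetNextFrame()"/"deliverFrame()" pair, loosely modeled on the "DeviceSource" code. The "MyFrameSource" class and the "getNextFrameFromEncoder()" call are hypothetical placeholders for your own frame-acquisition code; the key point is that "FramedSource::afterGetting(this)" is called immediately once data has been copied into "fTo", and is deferred only when no frame is currently available:

    // A minimal sketch, assuming a hypothetical "MyFrameSource" subclass of
    // "FramedSource".  getNextFrameFromEncoder() is a placeholder for your
    // own (non-blocking) frame-acquisition routine.

    void MyFrameSource::doGetNextFrame() {
      deliverFrame();
    }

    void MyFrameSource::deliverFrame() {
      if (!isCurrentlyAwaitingData()) return; // the downstream object isn't ready for data yet

      u_int8_t* frameData; unsigned frameSize;
      getNextFrameFromEncoder(frameData, frameSize); // hypothetical; sets frameSize to 0 if no frame is ready

      if (frameSize == 0) {
        // No frame is available right now.  Do NOT call
        // "FramedSource::afterGetting()" here; instead, arrange for
        // "deliverFrame()" to be called again later - e.g., from the event
        // handler that runs when your encoder produces its next frame.
        return;
      }

      // A frame is available: copy it into the downstream object's buffer ...
      if (frameSize > fMaxSize) {
        fFrameSize = fMaxSize;
        fNumTruncatedBytes = frameSize - fMaxSize;
      } else {
        fFrameSize = frameSize;
        fNumTruncatedBytes = 0;
      }
      gettimeofday(&fPresentationTime, NULL);
      memmove(fTo, frameData, fFrameSize);

      // ... and complete delivery immediately - no "scheduleDelayedTask()" here:
      FramedSource::afterGetting(this);
    }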
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/