[Live-devel] Adding secondary audio track to existing H264 RTP stream
Ross Finlayson
finlayson at live555.com
Thu Mar 29 10:29:22 PDT 2018
> In DummySink::afterGettingFrame I just fwrite the buffer to a file, nothing more.
Alternatively, you could have just used the existing “FileSink” class.
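For reference, a minimal sketch of that approach (the names “env”, “audioSource” and “afterPlaying”, and the output file name, are placeholders for whatever your test program already has):

#include "liveMedia.hh"

// Write each frame delivered by the audio source straight to a file,
// instead of going through a custom DummySink:
FileSink* fileSink = FileSink::createNew(*env, "audio-out.aac");
fileSink->startPlaying(*audioSource, afterPlaying, NULL);
env->taskScheduler().doEventLoop(); // run the event loop as usual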
> Didn’t want to cross-post, but this is some debug info from afterGettingFrame - the delay, in microseconds, between successive calls of this function:
> fDurationInMicroseconds 23219
> Time diff 25142
OK, the problem here is that delaying by “fDurationInMicroseconds” was wrong, because it didn’t account for the overhead of (e.g.) writing to your file - your numbers above show roughly 2 ms of extra time per frame, and that error accumulates over the stream. You should be able to fix this by replacing the final call to “scheduleDelayedTask()” in your “ModifiedADTSFileSource” (again, don’t modify the original code ‘in place’) with something like:
struct timeval timeNow;
gettimeofday(&timeNow, NULL);
// Delay until the frame's presentation time, measured against the wall clock,
// so that per-frame overhead (e.g., the file write) is absorbed rather than accumulated:
int uSecondsToDelay = (fPresentationTime.tv_sec - timeNow.tv_sec)*1000000 + (fPresentationTime.tv_usec - timeNow.tv_usec);
unsigned timeToDelay = uSecondsToDelay < 0 ? 0 : (unsigned)uSecondsToDelay; // never delay by a negative amount
nextTask() = envir().taskScheduler().scheduleDelayedTask(timeToDelay, (TaskFunc*)FramedSource::afterGetting, this);
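For context, that code would go at the end of your “doGetNextFrame()”, replacing the existing “scheduleDelayedTask(fDurationInMicroseconds, ...)” call. A rough sketch, assuming your “ModifiedADTSFileSource” keeps the same overall structure as the stock “ADTSAudioFileSource”; “readNextADTSFrame()” is just a placeholder for whatever code already reads one frame and sets fFrameSize, fPresentationTime and fDurationInMicroseconds:

void ModifiedADTSFileSource::doGetNextFrame() {
  // Placeholder: read the next ADTS frame into fTo, and set
  // fFrameSize, fPresentationTime and fDurationInMicroseconds:
  readNextADTSFrame();

  // Schedule delivery for the frame's presentation time, measured against
  // the wall clock, instead of a fixed fDurationInMicroseconds from now:
  struct timeval timeNow;
  gettimeofday(&timeNow, NULL);
  int uSecondsToDelay = (fPresentationTime.tv_sec - timeNow.tv_sec)*1000000
      + (fPresentationTime.tv_usec - timeNow.tv_usec);
  unsigned timeToDelay = uSecondsToDelay < 0 ? 0 : (unsigned)uSecondsToDelay;
  nextTask() = envir().taskScheduler().scheduleDelayedTask(timeToDelay,
      (TaskFunc*)FramedSource::afterGetting, this);
}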
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/