[Live-devel] unicast onDemand from live source NAL Units
Pablo Gomez
Pablo.Gomez at scch.at
Wed Jan 23 02:27:37 PST 2013
>First, I assume that you are feeding your input source object (i.e., the object that delivers H.264 NAL units) into a
>"H264VideoStreamDiscreteFramer" object (and from there to a "H264VideoRTPSink").
I based my H264LiveServerMediaSubsession on H264FileServerMediaSubsession.
I'm using H264VideoRTPSink, H264VideoStreamDiscreteFramer, and an object that inherits from FramedSource, in which I read the NAL units.
This is how they are connected in the media subsession:
FramedSource* H264LiveServerMediaSubsession::createNewStreamSource(unsigned /*clientSessionId*/, unsigned& estBitrate) {
  estBitrate = 10000; // kbps, estimate

  // Create the video source that delivers discrete NAL units:
  H264LiveStreamFramedSource* liveFramer = H264LiveStreamFramedSource::createNew(envir(), liveBuffer);

  // Wrap it in a discrete framer, and return that directly.  (Because the
  // source already delivers one NAL unit at a time, it must NOT additionally
  // be wrapped in a byte-stream "H264VideoStreamFramer".)
  return H264VideoStreamDiscreteFramer::createNew(envir(), liveFramer);
}
RTPSink* H264LiveServerMediaSubsession::createNewRTPSink(Groupsock* rtpGroupsock,
                                                         unsigned char rtpPayloadTypeIfDynamic,
                                                         FramedSource* /*inputSource*/) {
  return H264VideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic);
}
This is the doGetNextFrame in the H264LiveStreamFramedSource I'm using:
void H264LiveStreamFramedSource::doGetNextFrame() {
  // Read (up to) one NAL unit, at most the "fMaxSize" bytes that the
  // downstream object can accept; "fNumTruncatedBytes" reports any excess:
  fFrameSize = fBuffer->read(fTo, fMaxSize, &fNumTruncatedBytes);

  // We don't know a specific play time duration for this data,
  // so just record the current time as being the 'presentation time':
  gettimeofday(&fPresentationTime, NULL);

  // Inform the downstream object that it has data.  To avoid possible
  // infinite recursion, return to the event loop before doing so,
  // rather than calling "FramedSource::afterGetting()" directly:
  nextTask() = envir().taskScheduler().scheduleDelayedTask(0,
      (TaskFunc*)FramedSource::afterGetting, this);
}
The call
fBuffer->read(fTo, fMaxSize, &fNumTruncatedBytes);
goes to the object that holds the NAL units. I have two implementations of this object. The first tries to copy one whole NAL unit per read, sets fNumTruncatedBytes to the number of bytes that were truncated in the read operation, and returns the number of bytes copied to fTo (a sketch follows below).
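Roughly, the first implementation does something like this (simplified sketch, not the exact code): it delivers at most one complete NAL unit per call and reports anything that did not fit via numTruncatedBytes:

#include <cstring>
#include <deque>
#include <vector>

// Simplified sketch of the first buffer variant: a queue of whole
// NAL units, delivering exactly one unit per read() call.
class NALUnitQueue {
public:
  void write(unsigned char const* nal, unsigned size) {
    fUnits.push_back(std::vector<unsigned char>(nal, nal + size));
  }

  unsigned read(unsigned char* to, unsigned maxSize,
                unsigned* numTruncatedBytes) {
    *numTruncatedBytes = 0;
    if (fUnits.empty()) return 0; // caller must handle 'no data yet'

    std::vector<unsigned char> const& nal = fUnits.front();
    unsigned copied = (unsigned)nal.size();
    if (copied > maxSize) {
      *numTruncatedBytes = copied - maxSize; // report, don't hide, the loss
      copied = maxSize;
    }
    std::memcpy(to, &nal[0], copied);
    fUnits.pop_front(); // always consume the whole unit
    return copied;
  }

private:
  std::deque<std::vector<unsigned char> > fUnits;
};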
The second implementation of this buffer is a ring buffer. When I write to it I write all the bytes, and when I read from it I read the minimum of the bytes available in the buffer and fMaxSize, starting from the last read position + 1. So nothing is truncated in this approach, but I suspect the NAL units get broken: if the last read position falls in the middle of a NAL unit, the next read starts mid-unit and will not contain any SPS/PPS. (A length-prefixed variant that avoids this is sketched below.)
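One way I could fix this would be to length-prefix each NAL unit in the ring, so that a read always begins at a unit boundary and can truncate, but never split, a unit (hypothetical sketch, not my current code; the class name and capacity are arbitrary):

// Hypothetical fix for the ring-buffer variant: a byte ring that stores
// each NAL unit behind a 4-byte length prefix, so a reader always starts
// at a unit boundary and can never return half a unit.
class LengthPrefixedRing {
public:
  LengthPrefixedRing() : fHead(0), fTail(0), fCount(0) {}

  bool writeNAL(unsigned char const* nal, unsigned size) {
    if (sizeof size + size > kCapacity - fCount) return false; // ring full
    put((unsigned char const*)&size, sizeof size); // length header first
    put(nal, size);                                // then the unit itself
    return true;
  }

  unsigned readNAL(unsigned char* to, unsigned maxSize,
                   unsigned* numTruncatedBytes) {
    if (fCount == 0) { *numTruncatedBytes = 0; return 0; } // nothing queued
    unsigned size;
    get((unsigned char*)&size, sizeof size);
    unsigned copied = size > maxSize ? maxSize : size;
    get(to, copied);                    // the part that fits in fTo
    drop(size - copied);                // skip the rest of the unit...
    *numTruncatedBytes = size - copied; // ...but report what was lost
    return copied;
  }

private:
  void put(unsigned char const* from, unsigned n) {
    for (unsigned i = 0; i < n; ++i) {
      fRing[fTail] = from[i];
      fTail = (fTail + 1) % kCapacity;
    }
    fCount += n;
  }
  void get(unsigned char* to, unsigned n) {
    for (unsigned i = 0; i < n; ++i) {
      to[i] = fRing[fHead];
      fHead = (fHead + 1) % kCapacity;
    }
    fCount -= n;
  }
  void drop(unsigned n) { fHead = (fHead + n) % kCapacity; fCount -= n; }

  static unsigned const kCapacity = 250000;
  unsigned char fRing[kCapacity];
  unsigned fHead, fTail, fCount;
};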
>Setting "OutPacketBuffer::maxSize" to some value larger than the largest expected NAL unit is correct - and should work. However, setting >this value to 10 million is insane. You can't possibly expect to be generating NAL units this large, can you??
Yes, 10 million is insane; there are no NAL units of that size. I only wrote it to test. I now set it to 250000, which is big enough, but it does not seem to matter: fMaxSize is always smaller than that, and I'm getting truncated frames quite often. (Where the value has to be set is sketched below.)
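For reference, I assign it at the very top of main(), before the server (and hence any RTPSink) is created, since the sink sizes its output buffer from this value at creation time. A generic setup sketch (port number etc. arbitrary):

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"

int main() {
  // Must be assigned *before* any RTPSink is created, because the sink
  // sizes its output packet buffer from this value at creation time:
  OutPacketBuffer::maxSize = 250000; // > largest expected NAL unit

  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  RTSPServer* rtspServer = RTSPServer::createNew(*env, 8554);
  if (rtspServer == NULL) return 1;
  // ... create the ServerMediaSession and add the
  // H264LiveServerMediaSubsession to it as usual ...

  env->taskScheduler().doEventLoop(); // does not return
  return 0;
}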
>If possible, you should configure your encoder to generate a sequence of NAL unit 'slices', rather than single large key-frame NAL units.
>Streaming very large NAL units is a bad idea, because - although our code will fragment them correctly when they get packed into RTP packets -
>the loss of just one of these fragments will cause the whole NAL unit to get discarded by receivers.
I have checked the NVIDIA encoder parameters, and there is a parameter for setting the number of slices (see the sketch below). I tried setting it to 4 and to 10, and I also tested the default mode, which lets the encoder decide the slice count. In any case, I'm testing on a LAN, which is essentially lossless, so I guess this parameter should not be the problem.
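The slice parameter I'm referring to is presumably set like this (sketch using the field names from NVIDIA's nvEncodeAPI.h; the rest of the encoder setup is omitted and the values are just the ones I tried):

#include "nvEncodeAPI.h"

// Sketch: slice control in the NVENC H.264 encode configuration.
// Normally encCfg is first filled in from a preset via the NVENC API.
NV_ENC_CONFIG encCfg = {0};
encCfg.encodeCodecConfig.h264Config.sliceMode = 3;     // 3 = number of slices per picture
encCfg.encodeCodecConfig.h264Config.sliceModeData = 4; // e.g. 4 slices, as tried above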
Best
Pablo
----------------------------------------------------------------------
Date: Tue, 22 Jan 2013 10:46:08 -0800
From: Ross Finlayson <finlayson at live555.com>
To: LIVE555 Streaming Media - development & use <live-devel at ns.live555.com>
Subject: Re: [Live-devel] unicast onDemand from live source NAL Units NVidia
First, I assume that you are feeding your input source object (i.e., the object that delivers H.264 NAL units) into a "H264VideoStreamDiscreteFramer" object (and from there to a "H264VideoRTPSink").
> I tried to set up in the Streamer code enough size in the "OutPacketBuffer", but this does not seem to work....
> {
> OutPacketBuffer::maxSize=10000000;
Setting "OutPacketBuffer::maxSize" to some value larger than the largest expected NAL unit is correct - and should work. However, setting this value to 10 million is insane. You can't possibly expect to be generating NAL units this large, can you??
If possible, you should configure your encoder to generate a sequence of NAL unit 'slices', rather than single large key-frame NAL units. Streaming very large NAL units is a bad idea, because - although our code will fragment them correctly when they get packed into RTP packets - the loss of just one of these fragments will cause the whole NAL unit to get discarded by receivers.
Nonetheless, if you set "OutPacketBuffer::maxSize" to a value larger than the largest expected NAL unit, then this should work (i.e., you should find that "fMaxSize" will always be large enough for you to copy a whole NAL unit).
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
----------------------------------------------------------------------