[Live-devel] Am I doing this right?
Ben Rush
ben at ben-rush.net
Thu Apr 6 13:37:37 PDT 2017
"No, you don’t ‘set’ “fTo” to point to the data; you *copy* the data to the
address pointed to by “fTo”"
Yes, you are correct. I was using "loosey goosey" terminology here.
Specifically, I do something like this:
memmove(fTo, nal.p_payload ......
A bit more details on the crash I'm encountering. First, it's not really a
"crash" in that the server dies. It appears to be a C++ exception that is
being fired and handled somewhere (I get a notification of it in my
development environment). But since my streaming server isn't streaming the
file (according to VLC, no bytes are being received), I assume there's
something wrong somewhere. This makes me suspicious of the fact I am NOT
prepending the NAL units with 0x0 0x0 0x0 0x1 as you described. However,
I'm a bit confused as to why my live streaming code is working, then, as
I'm not explicitly doing that from within it. Here is the code that's
popping a NAL unit off a queue I'm building and writing it onto fTo. This
is for my LIVE streaming server and it's working just fine and is in
production. I found the code for doing the "truncation" business online and
it's worked great for us; admittedly I didn't understand it too well.
......
x264_nal_t nal = this->_nalQueue.front();
this->_nalQueue.pop();
assert(nal.p_payload != NULL);
// You need to remove the start code which is there in front of every
nal unit.
// the start code might be 0x00000001 or 0x000001. so detect it and
remove it. pass remaining data to live555
int trancate = 0;
if (nal.i_payload >= 4 && nal.p_payload[0] == 0 && nal.p_payload[1] ==
0 && nal.p_payload[2] == 0 && nal.p_payload[3] == 1)
{
trancate = 4;
}
else
{
if (nal.i_payload >= 3 && nal.p_payload[0] == 0 && nal.p_payload[1]
== 0 && nal.p_payload[2] == 1)
{
trancate = 3;
}
}
if (nal.i_payload - trancate > fMaxSize)
{
fFrameSize = fMaxSize;
fNumTruncatedBytes = nal.i_payload - trancate - fMaxSize;
}
else
{
fFrameSize = nal.i_payload - trancate;
}
//http://comments.gmane.org/gmane.comp.multimedia.live555.devel/4930
//
http://stackoverflow.com/questions/13863673/how-to-write-a-live555-framedsource-to-allow-me-to-stream-h-264-live
timeval lastPresentationTime = fPresentationTime;
fPresentationTime = this->_time;
if (newData)
{
fDurationInMicroseconds = 1000000 / m_fps; // 66000;
}
else
{
fDurationInMicroseconds = 0;
}
memmove(fTo, nal.p_payload + trancate, fFrameSize);
.....
And this is the stack for the exception that does occur (again, this
exception DOES NOT happen in the live streaming code, only when trying to
stream a static file). As you can see, it's within the call to
getAuxSDPLine. I notice "test4Bytes" which makes me think it's related to
this prepending of 0x0 0x0 0x0 0x1?
ManagedStreamingServerLibrary.dll!StreamParser::ensureValidBytes1(unsigned
int) C++
ManagedStreamingServerLibrary.dll!StreamParser::ensureValidBytes(unsigned
int) C++
ManagedStreamingServerLibrary.dll!StreamParser::test4Bytes(void) C++
ManagedStreamingServerLibrary.dll!H264or5VideoStreamParser::parse(void)
C++
ManagedStreamingServerLibrary.dll!MPEGVideoStreamFramer::continueReadProcessing(void)
C++
ManagedStreamingServerLibrary.dll!MPEGVideoStreamFramer::doGetNextFrame(void)
C++
ManagedStreamingServerLibrary.dll!FramedSource::getNextFrame(unsigned
char *,unsigned int,void (*)(void *,unsigned int,unsigned int,struct
timeval,unsigned int),void *,void (*)(void *),void *) C++
ManagedStreamingServerLibrary.dll!H264or5Fragmenter::doGetNextFrame(void)
C++
ManagedStreamingServerLibrary.dll!FramedSource::getNextFrame(unsigned
char *,unsigned int,void (*)(void *,unsigned int,unsigned int,struct
timeval,unsigned int),void *,void (*)(void *),void *) C++
ManagedStreamingServerLibrary.dll!MultiFramedRTPSink::packFrame(void) C++
ManagedStreamingServerLibrary.dll!MultiFramedRTPSink::buildAndSendPacket(bool)
C++
ManagedStreamingServerLibrary.dll!MultiFramedRTPSink::continuePlaying(void)
C++
ManagedStreamingServerLibrary.dll!H264or5VideoRTPSink::continuePlaying(void)
C++
ManagedStreamingServerLibrary.dll!MediaSink::startPlaying(class
MediaSource &,void (*)(void *),void *) C++
> ManagedStreamingServerLibrary.dll!OCVFileServerMediaSubsession::getAuxSDPLine(RTPSink
* rtpSink, FramedSource * inputSource) Line 99 C++
At any rate, I tried modifying the code that moves the data to fTo to
include 0x0 0x0 0x0 0x1 and I'm still seeing these exceptions.
fFrameSize += 4;
fTo[0] = 0x0;
fTo[1] = 0x0;
fTo[2] = 0x0;
fTo[3] = 0x1;
memmove(fTo + 4, nal.p_payload + trancate, fFrameSize);
Any thoughts? Something is wonky, but I'm having a helluva time tracking it
down.
On Thu, Apr 6, 2017 at 3:02 PM Ross Finlayson <finlayson at live555.com> wrote:
> > FramedSource*
> OCVFileServerMediaSubsession::createNewStreamSource(unsigned
> /*clientSessionId*/, unsigned& estBitrate) {
> > estBitrate = 90000; // kbps, estimate
> >
> > // Create the video source:
> > {
> > std::ifstream in(fFileName, std::ifstream::ate |
> std::ifstream::binary);
> > fFileSize = in.tellg();
> > }
> >
> > OCVFileSource* fileSource = OCVFileSource::createNew(envir(),
> fFileName);
> > if (fileSource == NULL) return NULL;
> >
> > // Create a framer for the Video Elementary Stream:
> > return H264VideoStreamFramer::createNew(envir(), fileSource);
> > }
>
> This is correct, *provided that* your “OCVFileSource” delivers a stream of
> H.264 NAL units, each prepended with a 4-byte ‘start code’ (0x00 0x00 0x00
> 0x01). (That’s because the downstream ‘framer’ object -
> “H264VideoStreamFramer” - expects a stream of NAL units in this format.)
>
>
> > Then within OCVFileSource::doGetNextFrame() I use our own special file
> reader to get a frame from the OCV file and I use lib x264 to encode it
> into a group of NAL units (much like I do with our live camera solution).
>
> Again, each NAL unit in this ‘group of NAL units’ needs to be prepended
> with a (0x00 0x00 0x00 0x01) ‘start code’.
>
> > I then set "fTo" with the encoded data
>
> No, you don’t ‘set’ “fTo” to point to the data; you *copy* the data to the
> address pointed to by “fTo”. You should also check the size of the data
> against “fMaxSize” (and then set “fFrameSize” and “fNumTruncatedBytes” as
> appropriate).
>
> I.e., you do something like this:
> if (myDataSize > fMaxSize) {
> fFrameSize = fMaxSize;
> fNumTruncatedBytes = newFrameSize - fMaxSize;
> } else {
> fFrameSize = myDataSize;
> }
> memmove(fTo, myDataPointer, fFrameSize);
>
> Note that, in your case, you don’t need to set “fPresentationTime” or
> “fDurationInMicroseconds”; those will be computed by the downstream
> “H264VideoStreamFramer” object instead.
>
>
> > and call
> >
> > nextTask() = envir().taskScheduler().scheduleDelayedTask(0,
> > (TaskFunc*)FramedSource::afterGetting, this);
> >
> > to reschedule another call to doGetNextFrame().
>
> This will work. However, in your case you could replace this with:
> FramedSource::afterGetting(this);
> which is more efficient. (You can do this because you’re streaming to a
> network, rather than to a file. The downstream ‘RTPSink’ object will -
> after transmitting a RTP packet - return to the event loop, so you won’t
> get infinite recursion.)
>
>
> Ross Finlayson
> Live Networks, Inc.
> http://www.live555.com/
>
>
> _______________________________________________
> live-devel mailing list
> live-devel at lists.live555.com
> http://lists.live555.com/mailman/listinfo/live-devel
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://lists.live555.com/pipermail/live-devel/attachments/20170406/591ca012/attachment-0001.html>
More information about the live-devel
mailing list