[Live-devel] Where can I trap and debug response to signalNewFrameData() ?

temp2010 at forren.org
Sat Feb 16 07:39:45 PST 2013


Ross,

Thanks for the advice.

I found MF_H264_DeviceSource::deliverFrame(), along with an old trace statement I had
put in it back when my earlier cobble worked.  That is in fact the "deep inside" I was
looking for, albeit a function that gets called back from deep inside LIVE555 into my
own code.  The trace is coming out, which proves this path is connected.  I also made
the H264VideoStreamDiscreteFramer change and removed the 'start code', but it didn't
help.  My new cobble is still not working.
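
For reference, the delivery path in question follows the event-trigger pattern from the
stock DeviceSource.cpp template that MF_H264_DeviceSource is modeled on.  Roughly (a
simplified sketch, not my exact code; member names abbreviated):

// Simplified sketch of the DeviceSource.cpp event-trigger pattern that
// MF_H264_DeviceSource is assumed to follow (not my exact code):

// Capture thread: tell the LIVE555 event loop that a new encoded frame is ready.
void MF_H264_DeviceSource::signalNewFrameData() {
  envir().taskScheduler().triggerEvent(eventTriggerId, this);
}

// Event-loop thread: static trampoline that was registered via createEventTrigger().
void MF_H264_DeviceSource::deliverFrame0(void* clientData) {
  ((MF_H264_DeviceSource*)clientData)->deliverFrame();
}

// Event-loop thread: copy the queued NAL unit into the downstream buffer.
void MF_H264_DeviceSource::deliverFrame() {
  if (!isCurrentlyAwaitingData()) return; // downstream object hasn't asked for data yet

  // ... copy at most fMaxSize bytes of the NAL unit (without a start code) into fTo,
  //     and set fFrameSize, fNumTruncatedBytes and fPresentationTime ...

  // Tell the downstream object (the H264VideoStreamDiscreteFramer) that data is ready:
  FramedSource::afterGetting(this);
}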

Perhaps I should tell you the actual problem!  When I go back to using
ByteStreamFileSource as the FramedSource, VLC successfully opens the announced
rtsp://192.168.123.5:8554/h264ESVideoTest and plays it great.  This proves the VLC side
is working.  However, when I use MF_H264_DeviceSource as the FramedSource, VLC says it
can't even open it.  Nevertheless, when VLC tries to open it, I do see a trace showing
my MF_H264_DeviceSource being created and passed back as the FramedSource, and I see my
trace in MF_H264_DeviceSource::deliverFrame() start happening.  For some reason, though,
something is different such that VLC can no longer open the stream.

Here's my intercept.  As the simplest way to hook in and test, I trap the filename in
H264VideoFileServerMediaSubsession::createNewStreamSource().  This function normally
creates a ByteStreamFileSource, but if the filename is "<device>" I make it create an
MF_H264_DeviceSource instead.  You can also see the H264VideoStreamDiscreteFramer
change I just made.

FramedSource* H264VideoFileServerMediaSubsession::createNewStreamSource(
    unsigned /*clientSessionId*/, unsigned& estBitrate) {

// HgF: Add this DEBUG stuff
#ifdef _DEBUG
//#define TRACE(msg) OutputDebugString(TEXT(msg)) /* HgF */
#define TRACE TRACEX
void CDECL TRACEX(PCSTR pszFormat, ...);
#else
#define TRACE(...) /* OMIT IF NOT DEBUGGING */
#endif

  FramedSource* returnFramedSource;
  FramedSource* videoES; // HgF: Add this var as a common place for the source pointer.
                         // Note its type better matches what H264VideoStreamFramer::createNew() wants anyway.
  estBitrate = 500; // kbps, estimate

  // Create the video source:
  if (0 == strcmp(fFileName, "<device>")) {
    // HgF: Add this hook to use MF_H264_DeviceSource instead of ByteStreamFileSource
    TRACE("BEFORE HOOK - H264VideoFileServerMediaSubsession::createNewStreamSource() using MF_H264_DeviceSource to receive live SPIIR-Capture H.264 stream\n");
    MF_H264_DeviceSource* theSource = MF_H264_DeviceSource::createNew(
        *SPIIROnDemandRTSPServer::env /* UsageEnvironment& */,
        &SPIIROnDemandRTSPServer::H264VideoDeviceStreamer_params /* MF_H264_DeviceParameters */);
    videoES = theSource;
    TRACE("AFTER HOOK - H264VideoFileServerMediaSubsession::createNewStreamSource() using MF_H264_DeviceSource to receive live SPIIR-Capture H.264 stream\n");

    // Create a framer for the Video Elementary Stream:
    // HgF-20130215 per advice from Ross Finlayson: a source that delivers discrete NAL units
    // must feed an H264VideoStreamDiscreteFramer, not an H264VideoStreamFramer.
    returnFramedSource = H264VideoStreamDiscreteFramer::createNew(envir(), videoES);
  } else {
    // HgF: The ByteStreamFileSource code in this clause is unchanged, aside from
    // copying fileSource into videoES and adding the TRACE statements.
    TRACE("NOT HOOKED - H264VideoFileServerMediaSubsession::createNewStreamSource() using ByteStreamFileSource to play file %s\n", fFileName);
    ByteStreamFileSource* fileSource = ByteStreamFileSource::createNew(envir(), fFileName);
    if (fileSource == NULL) {
      TRACE("NOT HOOKED - H264VideoFileServerMediaSubsession::createNewStreamSource() ERROR OPENING FILE %s\n", fFileName);
      return NULL;
    }
    fFileSize = fileSource->fileSize();
    videoES = fileSource; // HgF: Add this copy to the common place for the source pointer

    TRACE("NOT HOOKED - H264VideoFileServerMediaSubsession::createNewStreamSource() ready to play %s\n", fFileName);

    // Create a framer for the Video Elementary Stream
    // (the byte-stream file case keeps the original H264VideoStreamFramer):
    returnFramedSource = H264VideoStreamFramer::createNew(envir(), videoES);
  }

  return returnFramedSource;
}
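
For completeness, the server side registers this subsession the same way
testOnDemandRTSPServer does, just handing it the magic "<device>" name instead of a real
filename.  Roughly (a sketch of my SPIIROnDemandRTSPServer setup, abbreviated; env,
rtspServer and reuseFirstSource are set up exactly as in testOnDemandRTSPServer):

// Sketch of the server-side registration (abbreviated; not my exact code):
char const* streamName = "h264ESVideoTest";
ServerMediaSession* sms = ServerMediaSession::createNew(*env, streamName, streamName,
    "Session streamed by \"SPIIROnDemandRTSPServer\"");
// Passing "<device>" as the "file name" makes createNewStreamSource() take the hook above:
sms->addSubsession(H264VideoFileServerMediaSubsession::createNew(*env, "<device>", reuseFirstSource));
rtspServer->addServerMediaSession(sms);
announceStream(rtspServer, sms, streamName, "<device>"); // prints the rtsp:// URL, as in the test program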


ADDITIONAL CONFUSION...  I thought tracking this down might help me find my
problem, but it appears not.

Please don't let the question below distract from the more important question above,
about VLC not being able to open the stream in the MF_H264_DeviceSource case when it
can in the ByteStreamFileSource case.  I know VLC isn't yours; the question is about
how LIVE555 is reacting to my code, not about VLC.

It appears that BasicTaskScheduler0::doEventLoop() is being called recursively.
SPIIROnDemandRTSPServer (derived from testOnDemandRTSPServer) calls it.  Then, when an
incoming request event occurs, doEventLoop() ends up being called again, and this
recursive instance is given a different watchVariable.  When using ByteStreamFileSource,
something evidently sets that new watchVariable, so the recursive instance stops and
control returns to the original instance.  But with my MF_H264_DeviceSource, where VLC
causes an incoming request event yet can't connect for some reason, nothing ever sets
that watchVariable, so the recursive instance never stops; and when my mainline sets the
watchVariable of the first doEventLoop instance, it has no effect because the recursive
instance is still running.
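
For reference, the watch-variable behaviour of the event loop is roughly the following
(paraphrased from BasicTaskScheduler0::doEventLoop(), not verbatim), which is why an
inner instance returns only when its own watchVariable becomes non-zero:

// Paraphrase of BasicTaskScheduler0::doEventLoop() (not verbatim):
void BasicTaskScheduler0::doEventLoop(char* watchVariable) {
  // Handle one pending event at a time, until the watch variable
  // (if one was supplied) becomes non-zero:
  while (1) {
    if (watchVariable != NULL && *watchVariable != 0) break;
    SingleStep();
  }
}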

Below is the call stack when the recursive doEventLoop() first begins.

> Capture.exe!BasicTaskScheduler0::doEventLoop(char * watchVariable) Line 85  C++
  Capture.exe!H264VideoFileServerMediaSubsession::getAuxSDPLine(RTPSink * rtpSink, FramedSource * inputSource) Line 99  C++
  Capture.exe!OnDemandServerMediaSubsession::setSDPLinesFromRTPSink(RTPSink * rtpSink, FramedSource * inputSource, unsigned int estBitrate) Line 325  C++
  Capture.exe!OnDemandServerMediaSubsession::sdpLines() Line 68  C++
  Capture.exe!ServerMediaSession::generateSDPDescription() Line 233  C++
  Capture.exe!RTSPServer::RTSPClientConnection::handleCmd_DESCRIBE(const char * urlPreSuffix, const char * urlSuffix, const char * fullRequestStr) Line 398  C++
  Capture.exe!RTSPServer::RTSPClientConnection::handleRequestBytes(int newBytesRead) Line 759  C++
  Capture.exe!RTSPServer::RTSPClientConnection::incomingRequestHandler1() Line 623  C++
  Capture.exe!RTSPServer::RTSPClientConnection::incomingRequestHandler(void * instance, int __formal) Line 616  C++
  Capture.exe!BasicTaskScheduler::SingleStep(unsigned int maxDelayTime) Line 164  C++
  Capture.exe!BasicTaskScheduler0::doEventLoop(char * watchVariable) Line 88  C++
  Capture.exe!SPIIROnDemandRTSPServer::doEventLoop(char * watchVariable) Line 125  C++
  Capture.exe!OutputThreadFunction(void * lpParam) Line 113  C++
  kernel32.dll!BaseThreadInitThunk() Unknown
  ntdll.dll!RtlUserThreadStart() Unknown

On further analysis, its watchVariable is the fDoneFlag member of
H264VideoFileServerMediaSubsession.  I see this variable initialized to zero, but I
haven't yet found where it gets set, so I don't see how it ever gets set in the
ByteStreamFileSource case.
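
For reference, here is roughly the code path the stack trace above is sitting in,
paraphrased from H264VideoFileServerMediaSubsession.cpp (not verbatim, so details may
differ from my copy of the library):

// Paraphrase of the nested event loop in H264VideoFileServerMediaSubsession (not verbatim):
char const* H264VideoFileServerMediaSubsession::getAuxSDPLine(RTPSink* rtpSink,
                                                              FramedSource* inputSource) {
  if (fAuxSDPLine != NULL) return fAuxSDPLine; // already computed

  if (fDummyRTPSink == NULL) {
    // Start 'playing' into a dummy sink so it gets to see the SPS/PPS NAL units
    // that it needs in order to build the "a=fmtp:" SDP line:
    fDummyRTPSink = rtpSink;
    fDummyRTPSink->startPlaying(*inputSource, afterPlayingDummy, this);
    checkForAuxSDPLine(this); // starts polling for the aux SDP line
  }

  // Nested event loop: runs until fDoneFlag becomes non-zero,
  // i.e. until setDoneFlag() is called from one of the callbacks below:
  envir().taskScheduler().doEventLoop(&fDoneFlag);

  return fAuxSDPLine;
}

void H264VideoFileServerMediaSubsession::checkForAuxSDPLine1() {
  if (fAuxSDPLine != NULL) {
    setDoneFlag(); // sets fDoneFlag, breaking the nested doEventLoop()
  } else if (fDummyRTPSink != NULL && fDummyRTPSink->auxSDPLine() != NULL) {
    fAuxSDPLine = strDup(fDummyRTPSink->auxSDPLine());
    fDummyRTPSink = NULL;
    setDoneFlag();
  } else {
    // No SPS/PPS seen yet: check again after a short delay.
    envir().taskScheduler().scheduleDelayedTask(100000 /* 100 ms */,
        (TaskFunc*)checkForAuxSDPLine, this);
  }
}

If I'm reading this right, the flag should get set once the dummy sink has seen SPS and
PPS NAL units; presumably that never happens with my device source, which would explain
why the nested loop never returns.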


On Fri, Feb 15, 2013 at 6:43 PM, Ross Finlayson <finlayson at live555.com> wrote:

> Now I'm trying to reproduce my cobble, but for unicast and derived
> from SPIIROnDemandRTSPServer.cpp instead.  Exactly as before, my frames get
> created and I call signalNewFrameData(), but then no frames are coming out
> IP.  With all this cobbling, I'm not sure I have all the loose ends
> reconnected.  Therefore, I'm interested in setting a breakpoint deep inside
> Live555 somewhere.
>
>
> No, you shouldn't be setting breakpoints 'deep inside LIVE555', because
> that code is working just fine, and it's code that you're unfamiliar with.
>  Instead, you should be setting breakpoints in *your own* code.
>  Specifically, you should be setting a breakpoint in your "deliverFrame0()"
> function - i.e., in the function that you passed as a parameter to the
> "createEventTrigger()" call.  That function is the function that should be
> getting called after each event is triggered.
>
> Don't forget, of course, that all of this happens within the LIVE555 event
> loop, so don't forget to call "doEventLoop()" in your application to start
> the event loop running.
>
>
> One other thing, both H264VideoDeviceStreamer.cpp and
> SPIIROnDemandRTSPServer.cpp use a ByteStreamFileSource.  In both cases I've
> replaced that with my MF_H264_DeviceSource.
>
>
> OK, but don't forget that - because your "MF_H264_DeviceSource" code
> (presumably) delivers discrete H.264 NAL units - you must feed this into a
> "H264VideoStreamDiscreteFramer", *not* a "H264VideoStreamFramer" (as was
> used when the input was a byte stream).  Also, the H.264 NAL units that
> come out of your "MF_H264_DeviceSource" code *must not* include a preceding
> 'start code' (i.e., must not start with 0x00 0x00 0x00 0x01).
>
>
> Ross Finlayson
> Live Networks, Inc.
> http://www.live555.com/
>
>
> _______________________________________________
> live-devel mailing list
> live-devel at lists.live555.com
> http://lists.live555.com/mailman/listinfo/live-devel
>
>