From 501861472 at qq.com Sun Sep 1 01:27:22 2013 From: 501861472 at qq.com (=?gb18030?B?u8a8zMCl?=) Date: Sun, 1 Sep 2013 16:27:22 +0800 Subject: [Live-devel] =?gb18030?b?u9i4tKO6IGZvciBoZWxw?= In-Reply-To: References: Message-ID: I have resolved this problem just use this function env->reclaim(); // ??????? if (scheduler) { delete scheduler; pScheduler=NULL; } it is ok now Steve ------------------ ???? ------------------ ???: "???"<501861472 at qq.com>; ????: 2013?9?1?(???) ??9:40 ???: "live-devel"; ??: [Live-devel] for help this is the firs time I use live555.I use testRTSPClient as a demo in my object , now there is running well, but how could I shut it down ? Now i just use the function of shutdownStream, but there is memory leaks as follow:Detected memory leaks! Dumping objects -> {1023} normal block at 0x00ACEC18, 32 bytes long. Data: 78 B2 07 10 1C F3 3C 00 1C F3 3C 00 00 00 00 00 {203} normal block at 0x00AC9118, 1024 bytes long. Data: < < > 0C B1 07 10 00 00 00 00 00 00 00 00 18 F3 3C 00 {201} normal block at 0x003CF7A0, 32 bytes long. Data: < > D4 B2 07 10 C8 B2 07 10 FF FF FF FF 00 00 00 00 {200} normal block at 0x003CF318, 1100 bytes long. Data: < > EC B1 07 10 AC B0 07 10 18 EC AC 00 18 EC AC 00 In my object I need to open RTSPClient and shut down it frequently, I can not bear any memory leaks , so I need some advice . Thank you Steve -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sun Sep 1 02:21:11 2013 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 1 Sep 2013 02:21:11 -0700 Subject: [Live-devel] =?utf-8?b?5Zue5aSN77yaIGZvciBoZWxw?= In-Reply-To: References: Message-ID: <27D82E99-7F3D-4BAC-925D-3A23FEAC79A7@live555.com> > I have resolved this problem > just use this function > env->reclaim(); > // ??????? > if (scheduler) > { > delete scheduler; > pScheduler=NULL; > } Alternatively, instead of deleting these two objects ("UsageEnvironment" and "TaskScheduler"), you could reuse them for the next "RTSPClient". (Or even better, create each "RTSPClient" object (along with a "UsageEnvironment" and "TaskScheduler") in a separate process.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From stanley at gofuseit.com Mon Sep 2 06:58:28 2013 From: stanley at gofuseit.com (Stanley Biggs) Date: Mon, 2 Sep 2013 15:58:28 +0200 Subject: [Live-devel] Server getting confused with RTCP message before PLAY command (when using RTP over TCP) In-Reply-To: <4130DF8D-4596-4F9A-9CEA-37B08CCA86E6@live555.com> References: <49A3571C.8090906@schuckmannacres.com> <4130DF8D-4596-4F9A-9CEA-37B08CCA86E6@live555.com> Message-ID: Hi Ross I have upgraded to the very latest Live555 server code and the new Asynchronous Client Handler. We are using RTP-over-TCP. My server and client work fine over a regular, low-latency connection, however, the moment that latency increases somewhat, I am unable to get streaming working. I get an error from the server "405 Method Not Allowed". Upon inspecting the network traffic that the client sends to the server, I found that the client is sending RTCP "RR" packets early, before the "PLAY" command has been issued. As a result, the server gets confused and responds with a "405 Method Not Allowed" error. This looks to be the exact issue that the changelog shows as having been addressed in build 2012.10.04. This problem still seems to remain, however. 
The trace of client messages that I include below clearly illustrates the problem: ##Client Sends:## DESCRIBE rtsp://127.0.0.1:8554/ RTSP/1.0 CSeq: 2 User-Agent: LIVE555 Streaming Media v2013.08.16 Accept: application/sdp ##Client Receives:## RTSP/1.0 200 OK CSeq: 2 Date: Thu, Aug 29 2013 06:18:42 GMT Content-Base: rtsp://127.0.0.1:8554/nurv/ Content-Type : application/sdp Content-Length: 449 v=0 o=- 1377757120235695 1 IN IP4 192.168.56.1 s=MyVideo Streaming Session i=nurv t=0 0 a=tool:LIVE555 Streaming Media v2013.08.16 a=type:broadcast a=control:* a=range:npt=0- a=x-qt-text-nam:MyVideo Streaming Session a=x-qt-text-inf:nurv m=video 0 RTP/AVP 96 c=IN IP4 0.0.0.0 b=AS:3750 a=rtpmap:96 H264/90000 a=control:track1 m=audio 0 RTP/AVP 96 c=IN IP4 0.0.0.0 b=AS:84602240 a=rtpmap:96 PCMU/48000/2 a=control:track2 ##Client Sends:## SETUP rtsp://127.0.0.1:8554/nurv/track1 RTSP/1.0 CSeq: 3 User-Agent: LIVE555 Streaming Media v2013.08.16 Transport: RTP/AVP/TCP;unicast;interleaved=0-1 ##Client Receives:## RTSP/1.0 200 OK CSeq: 3 Date: Thu, Aug 29 2013 06:18:43 GMT Transport: RTP/AVP/TCP;unicast;destination=127.0.0.1;source=127.0.0.1;interleaved=0-1 Session: 32A854D4 ##Client Sends:## SETUP rtsp://127.0.0.1:8554/nurv/track2 RTSP/1.0 CSeq: 4 User-Agent: LIVE555 Streaming Media v2013.08.16 Transport: RTP/AVP/TCP;unicast;interleaved=2-3 Session: 32A854D4 ##THIS IS THE "OUT-OF-SEQUENCE RR DATA THAT CAUSES THE PROBLEM. THIS SECTION IS ONLY SENT WHEN NETWORK HAS SOME LATENCY AND DOES NOT APPEAR WHEN STREAMING WORKS.## ##Client Sends:## 00000000 24 01 00 20 $.. 00000000 80 C9 00 01 23 7A EB 1D 81 CA 00 05 23 7A EB 1D ....#z......#z.. 00000010 6C 61 70 74 6F 70 6E 61 6D 65 0D 0A 00 00 00 00 ..LaptopName.... ##END OF SECTION THAT SHOWS THE "OUT-OF-SEQUENCE" DATA.## ##Client Receives:## RTSP/1.0 200 OK CSeq: 4 Date: Thu, Aug 29 2013 06:18:47 GMT Transport: RTP/AVP/TCP;unicast;destination=127.0.0.1;source=127.0.0.1;interleaved=2-3 Session: 32A854D4 ##Client Sends:## PLAY rtsp://127.0.0.1:8554/nurv/ RTSP/1.0 CSeq: 5 User-Agent: LIVE555 Streaming Media v2013.08.16 Session: 32A854D4 Range: npt=0.000- ##THIS IS THE ERROR RECEIVED FROM THE SERVER. IN CASES WHERE STREAMING WORKS, THIS ERROR IS NOT RECEIVED BUT WE RATHER START RECEIVING RTP AND RTCP PACKETS (DATA) FROM THE SERVER AND STREAMING STARTS.## ##Client Receives:## RTSP/1.0 405 Method Not Allowed CSeq: 5 Date: Thu, Aug 29 2013 06:18:48 GMT Allow: OPTIONS, DESCRIBE, SETUP, TEARDOWN, PLAY, PAUSE, GET_PARAMETER, SET_PARAMETER ##Client Sends:## TEARDOWN rtsp://127.0.0.1:8554/nurv/ RTSP/1.0 CSeq: 6 User-Agent: LIVE555 Streaming Media v2013.08.16 Session: 32A854D4 Thanks for looking! Stanley On Thu, May 30, 2013 at 5:20 PM, Ross Finlayson wrote: > Stanley, > > I haven't had time yet to review and respond to your emails. You should > note, however, that the old 'synchronous' "RTSPClient" interface has now > been removed completely - as of the latest version (2013.05.30) of the code > - and is no longer supported, at all. (This old interface has been > out-of-date for three years now.) > > So, you should upgrade to the latest version of the code, and fix your > code to use the current, 'asynchronous' "RTSPClient" interface. Your > server application should also be using the latest version of our code. > > Then, and only then, please let us know what problems - if any - you are > having with your new server and client. (Please stop referring to archived > mailing list messages; they are usually referring to old, different > problems that are no longer relevant.) 
> > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Sep 2 10:42:45 2013 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 2 Sep 2013 10:42:45 -0700 Subject: [Live-devel] Server getting confused with RTCP message before PLAY command (when using RTP over TCP) In-Reply-To: References: <49A3571C.8090906@schuckmannacres.com> <4130DF8D-4596-4F9A-9CEA-37B08CCA86E6@live555.com> Message-ID: <081E287E-B55F-46E9-A4E9-CA325E4E3C30@live555.com> > This looks to be the exact issue that the changelog shows as having been addressed in build 2012.10.04. Yes. That fix deferred the client's sending of RTCP "RR" packets until after it had received the response from the "PLAY" command. That's why I'm surprised by your report. Please see if you can reproduce this with our "testRTSPClient" demo application, modified only to change line 229 of "testRTSPClient.cpp" from #define REQUEST_STREAMING_OVER_TCP False to #define REQUEST_STREAMING_OVER_TCP True Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From michel.promonet at thalesgroup.com Tue Sep 3 01:27:02 2013 From: michel.promonet at thalesgroup.com (PROMONET Michel) Date: Tue, 3 Sep 2013 10:27:02 +0200 Subject: [Live-devel] RTSP server reclarmation time In-Reply-To: References: <31777_1377189010_52163C92_31777_186_3_1BE8971B6CFF3A4F97AF4011882AA2550156381C1E45@THSONEA01CMS01P.one.grp> <29990_1377246385_52171CB1_29990_6200_4_1BE8971B6CFF3A4F97AF4011882AA2550156381FB2D8@THSONEA01CMS01P.one.grp> <4650DCEB-8A38-41CB-8C37-012AD3C5CBA1@live555.com> <25519_1377767323_521F0F9A_25519_2337_1_1BE8971B6CFF3A4F97AF4011882AA255015638319466@THSONEA01CMS01P.one.grp> Message-ID: <3738_1378196916_52259DB4_3738_2383_1_1BE8971B6CFF3A4F97AF4011882AA2550156383EE955@THSONEA01CMS01P.one.grp> Ross, Thanks for your answer, it's clear 5s is too low. What I tried to ask is why the default value is 65 seconds and not 30 seconds or 120 seconds or something else... It's just to satisfy my curiosity, it does not matter ! Regards, Michel. [@@ THALES GROUP INTERNAL @@] De : live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] De la part de Ross Finlayson Envoy? : jeudi 29 ao?t 2013 11:21 ? : LIVE555 Streaming Media - development & use Objet : Re: [Live-devel] RTSP server reclarmation time But the server side was using a reclamation timeout of 5s and the RR are sent with a period computed with a quite complex logic. The experience show that the RR are send each about 5-10 seconds. In RTSPServer the default value is 65 seconds, clearly 5 seconds is too low. Is there a particular reason for 65 seconds ? I don't understand what you're complaining/asking about here. Yes, a "reclamationTestSeconds" value of 5 (seconds) is ridiculously low. So, don't set it to such a low value! And yes, the default value of "reclamationTestSeconds" is 65 seconds. But so what? You weren't using the default value. You said that you were using a value of 5s, for some strange reason... Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
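For reference, the timeout being discussed is the optional fourth argument to "RTSPServer::createNew()". A minimal sketch, assuming the 2013-era signature and an illustrative port of 8554:

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"

int main() {
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  // The fourth argument is "reclamationTestSeconds": a client session is
  // reclaimed if no RTCP "RR" (or RTSP command) is seen from it within this
  // period.  65 is the library default; values close to the RTCP "RR"
  // interval (roughly 5-10 seconds) risk reclaiming sessions that are still alive.
  RTSPServer* rtspServer = RTSPServer::createNew(*env, 8554, NULL, 65);
  if (rtspServer == NULL) {
    *env << "Failed to create RTSP server: " << env->getResultMsg() << "\n";
    return 1;
  }

  env->taskScheduler().doEventLoop(); // does not return
  return 0;
}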
URL: From srimugunth at csa.iisc.ernet.in Tue Sep 3 11:37:02 2013 From: srimugunth at csa.iisc.ernet.in (srimugunth at csa.iisc.ernet.in) Date: Wed, 4 Sep 2013 00:07:02 +0530 Subject: [Live-devel] modifying client end of ProxyServer Message-ID: <494bd8f466f5773bdc050f226aaccc50.squirrel@clmail.csa.iisc.ernet.in> Hi, In the proxyserver implementation i want to dump the frames after i got it from the client and before it is given to the server. (I do this as a first step to integrate our transcoder between RTSP client and RTSP server) With testRTSPClient.cpp, i was able to add the following code to DummySink::afterGettingFrame() and successfully dump .h264 frames " unsigned char const start_code[4] = {0x00, 0x00, 0x00, 0x01}; if(strcmp(fSubsession.mediumName(),"video") == 0) { if (!fHaveWrittenFirstFrame_dummy) { /* If we have PPS/SPS NAL units encoded in a "sprop parameter string", prepend these to the file: */ unsigned numSPropRecords; const char* fSPropParameterSetsStr = fSubsession.fmtp_spropparametersets(); SPropRecord* sPropRecords = parseSPropParameterSets(fSPropParameterSetsStr, numSPropRecords); for (unsigned i = 0; i < numSPropRecords; ++i) { fwrite(start_code, 1, 4, outFP); fwrite(sPropRecords[i].sPropBytes, 1, sPropRecords[i].sPropLength, outFP); } delete[] sPropRecords; fHaveWrittenFirstFrame_dummy = True; // for next time } fwrite(start_code, 1, 4, outFP); fwrite(this->fReceiveBuffer,sizeof(char),frameSize,outFP); } " So I tried to add the same code in ProxyServerMediaSession.cpp. There is a afterGettingFrame as part of PresentationTimeSubsessionNormalizer. Is this the correct place to modify to dump frames from Client. Also in PresentationTimeSubsessionNormalizer class i am not able to call fSubsession.fmtp_spropparametersets(); How will i get SPS+PPS info? -mugunthan -- This message has been scanned for viruses and dangerous content by MailScanner, and is believed to be clean. From finlayson at live555.com Wed Sep 4 17:44:37 2013 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 4 Sep 2013 17:44:37 -0700 Subject: [Live-devel] modifying client end of ProxyServer In-Reply-To: <494bd8f466f5773bdc050f226aaccc50.squirrel@clmail.csa.iisc.ernet.in> References: <494bd8f466f5773bdc050f226aaccc50.squirrel@clmail.csa.iisc.ernet.in> Message-ID: > With testRTSPClient.cpp, i was able to add the following code to > DummySink::afterGettingFrame() and successfully dump .h264 frames Much better: Use the existing "H264VideoFileSink" class (instead of "DummySink"), which (I think) already does exactly what you want. Even better: Use the existing "openRTSP" application, which will automatically output a H.264 video file: http://www.live555.com/openRTSP/ I don't see what this has to do with proxy RTSP servers (and I certainly don't see why you would want to modify the existing proxy server code). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From felipel at lavid.ufpb.br Wed Sep 4 19:03:12 2013 From: felipel at lavid.ufpb.br (Felipe Lemos) Date: Wed, 4 Sep 2013 23:03:12 -0300 Subject: [Live-devel] Help me please, "testRTSPClient" Message-ID: *I'm capturing a stream from a camera H264, output:* Stream "rtsp://192.168.0.93/live1.sdp/"; audio/G726-32: Received 1000 bytes. Presentation time: 1378214178.942223! Stream "rtsp://192.168.0.93/live1.sdp/"; video/H264: Received 9263 bytes. Presentation time: 1378214179.640553! Stream "rtsp://192.168.0.93/live1.sdp/"; video/H264: Received 9119 bytes. 
Presentation time: 1378214179.678553! Stream "rtsp://192.168.0.93/live1.sdp/"; video/H264: Received 7007 bytes. Presentation time: 1378214179.718553! Stream "rtsp://192.168.0.93/live1.sdp/"; video/H264: Received 9193 bytes. Presentation time: 1378214179.758553! *VLC does not play this generated file, what should I do?* FILE *pFile; pFile = fopen("../video.mp4", "ab"); fwrite(fReceiveBuffer, 1, frameSize, pFile); fclose(pFile); What should I do with *fReceiverBuffe*r: Any suggestions? Thanks -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Sep 4 19:36:19 2013 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 4 Sep 2013 19:36:19 -0700 Subject: [Live-devel] Help me please, "testRTSPClient" In-Reply-To: References: Message-ID: <108885AC-4A17-497C-8BCD-CF9CF3B59C9C@live555.com> The data in "fReceiveBuffer" (for the "video/H264" stream) is a 'raw' H.264 NAL unit. So you can't write it directly to a ".mp4"-format file. Instead, I suggest that you use the "openRTSP" application - see http://www.live555.com/openRTSP/ Note that it has an option "-4" for writing 'stdout' to a MPEG-4 format file. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From felipel at lavid.ufpb.br Thu Sep 5 05:59:25 2013 From: felipel at lavid.ufpb.br (Felipe Lemos) Date: Thu, 5 Sep 2013 09:59:25 -0300 Subject: [Live-devel] Help me please, "testRTSPClient" In-Reply-To: <108885AC-4A17-497C-8BCD-CF9CF3B59C9C@live555.com> References: <108885AC-4A17-497C-8BCD-CF9CF3B59C9C@live555.com> Message-ID: But I need to integrate this system into another project. openRTSP has more features than I need. I need to capture the video rtsp bytes, encapsulated in MP4, and send UDP. I believe testRTSPCliente approaches what I need. Everything is ready, but the video does not play in any player. Any suggestions: thank you very much *Felipe Lemos **Graduate in Computer Science* *MSc. in Computer Science Federal University of Para?ba* *E-mail: felipel at lavid dot ufpb dot br* 2013/9/4 Ross Finlayson > The data in "fReceiveBuffer" (for the "video/H264" stream) is a 'raw' > H.264 NAL unit. So you can't write it directly to a ".mp4"-format file. > > Instead, I suggest that you use the "openRTSP" application - see > http://www.live555.com/openRTSP/ Note that it has an option "-4" for > writing 'stdout' to a MPEG-4 format file. > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Sep 5 08:13:51 2013 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 5 Sep 2013 08:13:51 -0700 Subject: [Live-devel] Help me please, "testRTSPClient" In-Reply-To: References: <108885AC-4A17-497C-8BCD-CF9CF3B59C9C@live555.com> Message-ID: <5AE9B6E6-404A-416D-B36F-E4823AFF5020@live555.com> Use "QuickTimeFileSink" instead of "DummySink". See the "openRTSP" code for guidance. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
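In outline, that substitution looks roughly like the sketch below. The names "qtOut" and "sessionAfterPlaying", and the 1280x720/25fps movie parameters, are illustrative only; check "QuickTimeFileSink.hh" for the exact "createNew()" signature. The sink is created once for the whole "MediaSession" (not per subsession), and the per-subsession "DummySink"s are then not created at all:

QuickTimeFileSink* qtOut = QuickTimeFileSink::createNew(env, *scs.session,
    "video.mp4",
    100000,     // per-frame buffer size
    1280, 720,  // movie width and height hints
    25,         // movie FPS hint
    False,      // packetLossCompensate
    False,      // syncStreams
    False,      // generateHintTracks
    True);      // generateMP4Format: write an .mp4 rather than a .mov
if (qtOut == NULL) {
  env << "Failed to create QuickTimeFileSink: " << env.getResultMsg() << "\n";
} else {
  qtOut->startPlaying(sessionAfterPlaying, NULL); // "sessionAfterPlaying" is a hypothetical callback
}

// Later, when the stream ends (RTCP "BYE" handler, timer, or signal handler):
Medium::close(qtOut); // finishes the file by writing its trailer; without this the file will not play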
URL: From felipel at lavid.ufpb.br Thu Sep 5 11:24:23 2013 From: felipel at lavid.ufpb.br (Felipe Lemos) Date: Thu, 5 Sep 2013 15:24:23 -0300 Subject: [Live-devel] Help me please, "testRTSPClient" In-Reply-To: <5AE9B6E6-404A-416D-B36F-E4823AFF5020@live555.com> References: <108885AC-4A17-497C-8BCD-CF9CF3B59C9C@live555.com> <5AE9B6E6-404A-416D-B36F-E4823AFF5020@live555.com> Message-ID: Ok Ross, I looked at the code openRTSP. I changed the code of testRTSPClient adding QuickTimeFileSink :: createnew. But the video does not play generated. Code //scs.subsession->sink = DummySink::createNew(env, *scs.subsession, rtspClient->url()); qtOut = QuickTimeFileSink::createNew(env, *scs.session, "video.mp4", fileSinkBufferSize, 1280, 720, 25, false, true, true, true); qtOut->startPlaying(subsessionAfterPlaying, NULL); env << *rtspClient << "Created a data sink for the \"" << *scs.subsession << "\" subsession\n"; /*scs.subsession->miscPtr = rtspClient; scs.subsession->sink->startPlaying(*(scs.subsession->readSource()), subsessionAfterPlaying, scs.subsession);*/ if (scs.subsession->rtcpInstance() != NULL) { scs.subsession->rtcpInstance()->setByeHandler(subsessionByeHandler, scs.subsession); } What am I doing wrong? Sorry so many questions, but I'm having trouble. Thank you very much *Felipe Lemos **Graduate in Computer Science* *MSc. in Computer Science Federal University of Para?ba* *E-mail: felipel at lavid dot ufpb dot br* 2013/9/5 Ross Finlayson > Use "QuickTimeFileSink" instead of "DummySink". See the "openRTSP" code > for guidance. > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Sep 5 13:17:39 2013 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 5 Sep 2013 13:17:39 -0700 Subject: [Live-devel] Help me please, "testRTSPClient" In-Reply-To: References: <108885AC-4A17-497C-8BCD-CF9CF3B59C9C@live555.com> <5AE9B6E6-404A-416D-B36F-E4823AFF5020@live555.com> Message-ID: > I looked at the code openRTSP. > I changed the code of testRTSPClient adding QuickTimeFileSink :: createnew. > But the video does not play generated. > > Code > > //scs.subsession->sink = DummySink::createNew(env, *scs.subsession, rtspClient->url()); > qtOut = QuickTimeFileSink::createNew(env, *scs.session, "video.mp4", fileSinkBufferSize, 1280, 720, 25, false, true, true, true); > qtOut->startPlaying(subsessionAfterPlaying, NULL); > > env << *rtspClient << "Created a data sink for the \"" << *scs.subsession << "\" subsession\n"; > > /*scs.subsession->miscPtr = rtspClient; > scs.subsession->sink->startPlaying(*(scs.subsession->readSource()), > subsessionAfterPlaying, scs.subsession);*/ > > if (scs.subsession->rtcpInstance() != NULL) { > scs.subsession->rtcpInstance()->setByeHandler(subsessionByeHandler, scs.subsession); > } > > > What am I doing wrong? First, because you have already called "qtOut<-startPlaying(...);", you must not also call "scs.subsession->sink->startPlaying(...);". Also, one important thing to understand about the "QuickTimeFileSink" class is that - to properly write the output file (including data 'trailers') - your application *must* close it properly - by calling "Medium::close(qtOut);" - when you're done. You cannot just '-c' your application. 
See the "Important note" here: http://www.live555.com/openRTSP/#quicktime If the sender sends RTCP "BYE" packets (for each subsession) when the stream ends, then your application's 'bye handler' should do this automatically. Otherwise, you have to set up a signal handler in your application (as "openRTSP" does), or set up a timer, to close "qtOut" after a certain period of time has elapsed. I suggest that you first figure out how to get the "openRTSP" application to read your stream - so it will write a proper MP4-format file - and then (and only then) start writing your own application that does the same. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From zammargu at marvell.com Thu Sep 5 14:02:59 2013 From: zammargu at marvell.com (Zahira Ammarguellat) Date: Thu, 5 Sep 2013 14:02:59 -0700 Subject: [Live-devel] Packets and delay Message-ID: Ross, Can you please explain what happens when a packet is being delayed? I seem to understand that each packet has a life time and it gets discarded when its time is up. Does lives555 wait for the following packets to arrive, or does it continue delivering packets to client and then deliver the previous packets (out of order?). Thanks, -Zahira -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Sep 5 14:45:15 2013 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 6 Sep 2013 09:45:15 +1200 Subject: [Live-devel] Packets and delay In-Reply-To: References: Message-ID: > Can you please explain what happens when a packet is being delayed? I seem to understand that each packet has a life time and it gets discarded when its time is up. Does lives555 wait for the following packets to arrive, or does it continue delivering packets to client and then deliver the previous packets (out of order?). Our "RTPSource" class (and subclasses) *always* delivers (the payload of) incoming RTP packets in order. In other words, data is *never* delivered out-of-order, even if incoming RTP packets happen to be misordered. If there is no packet loss - or misordering - in the network, then our code always delivers (the payload of) incoming RTP packets without delay. I.e., if the last-received RTP packet had RTP sequence number N, and the next incoming packet has RTP sequence number N+1 (mod 65536), then this next incoming packet is always delivered immediately. If, however, the next incoming packet has RTP sequence number N+2 (or later), then its payload is not delivered immediately. Instead, it is delayed by the 'packet reordering threshold', to see what subsequent packets happen to arrive. If packet N+1 arrives before the 'packet reordering threshold' time, then it will then get delivered, followed by (the already-arrived) packet N+2. If, however, the 'packet reordering threshold' threshold elapses without packet N+1 arriving, then (the already-arrived) packet N+2 will get delivered instead. In this case, packet N+1 will never get delivered, even if it happens to arrive later. By default, the 'packet reordering threshold' is 100 ms. Therefore, incoming packets may be delayed by this time - but *only* if there is packet loss in the network. It's important to understand that this is the *only* place in our code where incoming packets may be delayed before being delivered. If you are seeing delays greater than 100ms, then it is *not* happening in our code, so don't waste your time looking for it there. 
Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From felipel at lavid.ufpb.br Thu Sep 5 15:03:30 2013 From: felipel at lavid.ufpb.br (Felipe Lemos) Date: Thu, 5 Sep 2013 19:03:30 -0300 Subject: [Live-devel] Help me please, "testRTSPClient" In-Reply-To: References: <108885AC-4A17-497C-8BCD-CF9CF3B59C9C@live555.com> <5AE9B6E6-404A-416D-B36F-E4823AFF5020@live555.com> Message-ID: Ok Ross, thank you for explanations. Last question! Using the class QuickTimeFileSink, how to access the buffer and the number of bytes captured RTSP. DummySink this information are fReceiveBuffer and frameSize. I need access to this data! Thanks *Felipe Lemos **Graduate in Computer Science* *MSc. in Computer Science Federal University of Para?ba* *E-mail: felipel at lavid dot ufpb dot br* 2013/9/5 Ross Finlayson > I looked at the code openRTSP. > I changed the code of testRTSPClient adding QuickTimeFileSink :: createnew > . > But the video does not play generated. > > Code > > //scs.subsession->sink = DummySink::createNew(env, > *scs.subsession, rtspClient->url()); > qtOut = QuickTimeFileSink::createNew(env, *scs.session, > "video.mp4", fileSinkBufferSize, 1280, 720, 25, false, true, true, true); > qtOut->startPlaying(subsessionAfterPlaying, NULL); > > env << *rtspClient << "Created a data sink for the \"" << > *scs.subsession << "\" subsession\n"; > > /*scs.subsession->miscPtr = rtspClient; > scs.subsession->sink->startPlaying(*(scs.subsession->readSource()), > subsessionAfterPlaying, scs.subsession);*/ > > if (scs.subsession->rtcpInstance() != NULL) { > > scs.subsession->rtcpInstance()->setByeHandler(subsessionByeHandler, > scs.subsession); > } > > > What am I doing wrong? > > > First, because you have already called "qtOut<-startPlaying(...);", you > must not also call "scs.subsession->sink->startPlaying(...);". > > Also, one important thing to understand about the "QuickTimeFileSink" > class is that - to properly write the output file (including data > 'trailers') - your application *must* close it properly - by calling > "Medium::close(qtOut);" - when you're done. You cannot just '-c' > your application. > See the "Important note" here: http://www.live555.com/openRTSP/#quicktime > > If the sender sends RTCP "BYE" packets (for each subsession) when the > stream ends, then your application's 'bye handler' should do this > automatically. Otherwise, you have to set up a signal handler in your > application (as "openRTSP" does), or set up a timer, to close "qtOut" after > a certain period of time has elapsed. > > I suggest that you first figure out how to get the "openRTSP" application > to read your stream - so it will write a proper MP4-format file - and then > (and only then) start writing your own application that does the same. > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Thu Sep 5 15:15:32 2013 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 5 Sep 2013 15:15:32 -0700 Subject: [Live-devel] Help me please, "testRTSPClient" In-Reply-To: References: <108885AC-4A17-497C-8BCD-CF9CF3B59C9C@live555.com> <5AE9B6E6-404A-416D-B36F-E4823AFF5020@live555.com> Message-ID: <83704DA7-8947-4B5F-A734-1F1FB349C190@live555.com> > Using the class QuickTimeFileSink, how to access the buffer and the number of bytes captured RTSP. > DummySink this information are fReceiveBuffer and frameSize. I need access to this data! Why? The "QuickTimeFileSink" class automatically writes its incoming data into a (MOV or MP4-format) file. This data will be in the output file (named "video.mp4", in your case). If, however, you want to do something else with the incoming data, in addition to writing it to a "QuickTimeFileSink", then you could do so, using the "StreamReplicator" class. However, I'm not going to give you any help with that, until you first demonstrate that you can properly write your stream to a MP4-format file using "openRTSP". Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From felipel at lavid.ufpb.br Thu Sep 5 16:18:27 2013 From: felipel at lavid.ufpb.br (Felipe Lemos) Date: Thu, 5 Sep 2013 20:18:27 -0300 Subject: [Live-devel] Help me please, "testRTSPClient" In-Reply-To: <83704DA7-8947-4B5F-A734-1F1FB349C190@live555.com> References: <108885AC-4A17-497C-8BCD-CF9CF3B59C9C@live555.com> <5AE9B6E6-404A-416D-B36F-E4823AFF5020@live555.com> <83704DA7-8947-4B5F-A734-1F1FB349C190@live555.com> Message-ID: Because I need to send a UDP bytes captured encapsulated in MP4. Write in file is for testing purposes only. Needed to know if the VLC played the bytes from the stream rtsp. I need to get the rtsp bytes, encapsulated in MP4 and send directly by UDP. Need bytes and the number of bytes received for it. Actually I do not need to write to file, was to test! 2013/9/5 Ross Finlayson > Using the class QuickTimeFileSink, how to access the buffer and the number > of bytes captured RTSP. > DummySink this information are fReceiveBuffer and frameSize. I need access > to this data! > > > Why? The "QuickTimeFileSink" class automatically writes its incoming data > into a (MOV or MP4-format) file. This data will be in the output file > (named "video.mp4", in your case). > > If, however, you want to do something else with the incoming data, in > addition to writing it to a "QuickTimeFileSink", then you could do so, > using the "StreamReplicator" class. > > However, I'm not going to give you any help with that, until you first > demonstrate that you can properly write your stream to a MP4-format file > using "openRTSP". > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -------------- next part -------------- An HTML attachment was scrubbed... 
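For reference, the "StreamReplicator" approach mentioned above looks roughly like the sketch below, in the style of the "testReplicator" demo. "MyUDPForwardingSink" and "afterPlaying" are hypothetical names, and note that "QuickTimeFileSink" itself takes a whole "MediaSession" rather than a single "FramedSource", so feeding it from a replica is a separate question:

// One incoming source, two consumers: replicate the subsession's read source,
// then start each replica on its own sink.
StreamReplicator* replicator =
    StreamReplicator::createNew(env, scs.subsession->readSource(), False);
    // "False": don't delete the replicator when the last replica is closed

FramedSource* replica1 = replicator->createStreamReplica();
FramedSource* replica2 = replicator->createStreamReplica();

FileSink* fileSink = FileSink::createNew(env, "subsession-dump.bin");
fileSink->startPlaying(*replica1, afterPlaying, NULL);

// "MyUDPForwardingSink" is hypothetical: a MediaSink subclass that takes each
// received frame and forwards it over UDP.
MyUDPForwardingSink* udpSink = MyUDPForwardingSink::createNew(env /*, ... */);
udpSink->startPlaying(*replica2, afterPlaying, NULL);

// Every replica that is created should be started playing; an idle replica
// will stall delivery to the others.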
URL: From felipel at lavid.ufpb.br Thu Sep 5 18:57:36 2013 From: felipel at lavid.ufpb.br (Felipe Lemos) Date: Thu, 5 Sep 2013 22:57:36 -0300 Subject: [Live-devel] Help me please, "testRTSPClient" In-Reply-To: References: <108885AC-4A17-497C-8BCD-CF9CF3B59C9C@live555.com> <5AE9B6E6-404A-416D-B36F-E4823AFF5020@live555.com> <83704DA7-8947-4B5F-A734-1F1FB349C190@live555.com> Message-ID: Dear Ross, I already have begotten Video mp4 with openRTSP before posting to the forum. I need to know how to access the buffer and size after using QuickTimeFileSink, just that! Thanks 2013/9/5 Felipe Lemos > Because I need to send a UDP bytes captured encapsulated in MP4. > Write in file is for testing purposes only. > Needed to know if the VLC played the bytes from the stream rtsp. > > I need to get the rtsp bytes, encapsulated in MP4 and send directly by UDP > . > Need bytes and the number of bytes received for it. > > Actually I do not need to write to file, was to test! > > > 2013/9/5 Ross Finlayson > >> Using the class QuickTimeFileSink, how to access the buffer and the >> number of bytes captured RTSP. >> DummySink this information are fReceiveBuffer and frameSize. I need >> access to this data! >> >> >> Why? The "QuickTimeFileSink" class automatically writes its incoming >> data into a (MOV or MP4-format) file. This data will be in the output file >> (named "video.mp4", in your case). >> >> If, however, you want to do something else with the incoming data, in >> addition to writing it to a "QuickTimeFileSink", then you could do so, >> using the "StreamReplicator" class. >> >> However, I'm not going to give you any help with that, until you first >> demonstrate that you can properly write your stream to a MP4-format file >> using "openRTSP". >> >> >> Ross Finlayson >> Live Networks, Inc. >> http://www.live555.com/ >> >> >> _______________________________________________ >> live-devel mailing list >> live-devel at lists.live555.com >> http://lists.live555.com/mailman/listinfo/live-devel >> >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From skaarj1 at gmail.com Sat Sep 7 05:09:59 2013 From: skaarj1 at gmail.com (Skaarj NaPali) Date: Sat, 7 Sep 2013 14:09:59 +0200 Subject: [Live-devel] scheduleNextQOSMeasurement() bug? Message-ID: Hi Ross, There is a problem with this line when "nextQOSMeasurementUSecs" < "timeNowUSecs", or in other words when "timeNowUSecs" has been acquired at a time after the time stored in "nextQOSMeasurementUSecs". This happened for me only while single stepping the code in the debugger and thus letting pass several seconds between the "gettimeofday" calls, but I guess that could also happen under high system load, etc. The effect is that the periodic QOS measurement practically "stops" working because a very high value for "microseconds" will get passed to "scheduleDelayedTask". When "usecsToDelay" gets negative due to above mentioned condition, the "uint usecsToDelay" gets populated to the "int64_t microseconds" parameter of "scheduleDelayedTask" without sign extension (which is correct) and thus a very large delay is accounted for that task - means in practice that it sort of stops working. I fixed this with changing "unsigned usecsToDelay" to "int usecsToDelay" Kind regards. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Sat Sep 7 17:48:49 2013 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 7 Sep 2013 17:48:49 -0700 Subject: [Live-devel] scheduleNextQOSMeasurement() bug? In-Reply-To: References: Message-ID: <2BA06977-82C5-4B3C-B1CE-A25B686A7159@live555.com> > I fixed this with changing > "unsigned usecsToDelay" > to > "int usecsToDelay" I presume you're talking about line 1049 of "testProgs/playCommon.cpp" (the "openRTSP" code). (You didn't give any context for your message.) I've changed it to int64_t usecsToDelay = nextQOSMeasurementUSecs - timeNowUSecs; in the latest version of the software (2013.09.08: just released). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From blake at arahant.com Sun Sep 8 07:12:30 2013 From: blake at arahant.com (Blake McBride) Date: Sun, 8 Sep 2013 09:12:30 -0500 Subject: [Live-devel] Error building VLC because of changes arguments in RTSPClient::RTSPClient Message-ID: Greetings, I am attempting to build VLC with live.2013.09.08. Live builds and installs fine, but VLC-2.0.8 gives what I show below. It seems like one of the constructors in live has changed signatures. Others on the net have had the same problem and they suggested using an older version of live but I couldn't find one. Not sure what to do. Any help would be greatly appreciated. make[5]: Entering directory `/root/blake/vlc-2.0.8/modules/demux' CXX liblive555_plugin_la-live555.lo live555.cpp: In constructor 'RTSPClientVlc::RTSPClientVlc(UsageEnvironment&, const char*, int, const char*, portNumBits, demux_sys_t*)': live555.cpp:235: error: no matching function for call to 'RTSPClient::RTSPClient(UsageEnvironment&, const char*&, int&, const char*&, portNumBits&)' /usr/local/include/liveMedia/RTSPClient.hh:216: note: candidates are: RTSPClient::RTSPClient(UsageEnvironment&, const char*, int, const char*, portNumBits, int) /usr/local/include/liveMedia/RTSPClient.hh:37: note: RTSPClient::RTSPClient(RTSPClient&) live555.cpp: In function 'int Open(vlc_object_t*)': live555.cpp:360: warning: 'bool vlc_object_alive(const vlc_object_t*)' is deprecated (declared at ../../include/vlc_objects.h:81) live555.cpp:360: warning: 'bool vlc_object_alive(const vlc_object_t*)' is deprecated (declared at ../../include/vlc_objects.h:81) live555.cpp: In function 'int Connect(demux_t*)': live555.cpp:585: warning: 'bool vlc_object_alive(const vlc_object_t*)' is deprecated (declared at ../../include/vlc_objects.h:81) live555.cpp:585: warning: 'bool vlc_object_alive(const vlc_object_t*)' is deprecated (declared at ../../include/vlc_objects.h:81) live555.cpp: In function 'int SessionsSetup(demux_t*)': live555.cpp:704: warning: 'bool vlc_object_alive(const vlc_object_t*)' is deprecated (declared at ../../include/vlc_objects.h:81) live555.cpp:704: warning: 'bool vlc_object_alive(const vlc_object_t*)' is deprecated (declared at ../../include/vlc_objects.h:81) make[5]: *** [liblive555_plugin_la-live555.lo] Error 1 make[5]: Leaving directory `/root/blake/vlc-2.0.8/modules/demux' make[4]: *** [all-recursive] Error 1 make[4]: Leaving directory `/root/blake/vlc-2.0.8/modules/demux' make[3]: *** [all] Error 2 make[3]: Leaving directory `/root/blake/vlc-2.0.8/modules/demux' make[2]: *** [all-recursive] Error 1 make[2]: Leaving directory `/root/blake/vlc-2.0.8/modules' make[1]: *** [all-recursive] Error 1 make[1]: Leaving directory `/root/blake/vlc-2.0.8' make: *** [all] Error 2 root at piab:~/blake/vlc-2.0.8# 
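The compiler's "candidates are:" line above already shows the cause: the protected "RTSPClient" constructor now takes an extra trailing "int" argument. A sketch of the corrected call in VLC's "live555.cpp" (parameter names and the constructor body are illustrative; only the added "-1" argument matters here):

RTSPClientVlc::RTSPClientVlc( UsageEnvironment& env, char const* rtspURL,
                              int verbosityLevel, char const* applicationName,
                              portNumBits tunnelOverHTTPPortNum,
                              demux_sys_t* p_sys ) :
    RTSPClient( env, rtspURL, verbosityLevel, applicationName,
                tunnelOverHTTPPortNum,
                -1 /* new trailing parameter ("socketNumToServer"); -1 = open our own socket */ )
{
    /* store p_sys exactly as the original constructor body did */
}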
-------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sun Sep 8 10:08:59 2013 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 8 Sep 2013 10:08:59 -0700 Subject: [Live-devel] Error building VLC because of changes arguments in RTSPClient::RTSPClient In-Reply-To: References: Message-ID: > I am attempting to build VLC with live.2013.09.08. Live builds and installs fine, but VLC-2.0.8 gives what I show below. It seems like one of the constructors in live has changed signatures. This is not a VLC mailing list. In general, problems building VLC should be reported to a VLC mailing list, not here. However, in this case the problem you're facing arises from the fact that the VLC developers have not updated their 'LIVE555 interface' code to allow for a change - reported back in June - to the "RTSPClient" constructor: http://lists.live555.com/pipermail/live-devel/2013-June/017085.html VLC's 'LIVE555 interface code' ("live555.cpp") can be fixed by adding an extra parameter -1 to the end of the "RTSPClient()" constructor call at line 235. Please report this to the VLC developers. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From skaarj1 at gmail.com Sun Sep 8 05:06:54 2013 From: skaarj1 at gmail.com (Skaarj NaPali) Date: Sun, 8 Sep 2013 14:06:54 +0200 Subject: [Live-devel] scheduleNextQOSMeasurement() bug? In-Reply-To: <2BA06977-82C5-4B3C-B1CE-A25B686A7159@live555.com> References: <2BA06977-82C5-4B3C-B1CE-A25B686A7159@live555.com> Message-ID: Yes, I was referring to function "scheduleNextQOSMeasurement()" at line 1049 of "testProgs/playCommon.cpp" (the "openRTSP" code). I wanted to refer my mail to an already existing thread in the mailing list with subject "scheduleNextQOSMeasurement() bug?" from "Thu Jan 15 05:41:40 PST 2009", sorry if that didn't work. The change which you applied in version 2013.09.08 does not change anything, because it does not cause the needed sign extension. In fact declaring "usecsToDelay" as "int64_t" and assigning it the result of an "unsigned" term is the same as passing an "unsigned" type to an "int64_t" parameter to a function. It's the way how C/C++ propagates types. To get a sign extension from "unsigned" (32-bit) to "signed" (64-bit), the "unsigned" (32-bit) has to get propagated first to a "signed" (32-bit) type. This can be done in several ways. unsigned nextQOSMeasurementUSecs; unsigned timeNowUSecs; ... int usecsToDelay = nextQOSMeasurementUSecs - timeNowUSecs; ----or---- int64_t nextQOSMeasurementUSecs; // or "uint64_t" int64_t timeNowUSecs; // or "uint64_t" ... int64_t usecsToDelay = nextQOSMeasurementUSecs - timeNowUSecs; ----or---- unsigned nextQOSMeasurementUSecs; unsigned timeNowUSecs; ... int64_t usecsToDelay = (int)(nextQOSMeasurementUSecs - timeNowUSecs); The 1st one avoids an explicit cast and if the types of "nextQOSMeasurementUSecs" or "timeNowUSecs" once get changed to "int64_t" it would cause the compiler to throw a warning about type truncation. The 2nd one solves the problem implicitly because all types are of the same size, but it would require more changes in the code. The 3rd one may become problematic once the types of "nextQOSMeasurementUSecs" or "timeNowUSecs" are changed in the future, because the explicit cast prevents compiler warnings. Kind regards. -------------- next part -------------- An HTML attachment was scrubbed... 
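A self-contained illustration of the point being made above (plain C++, nothing LIVE555-specific; the values are arbitrary, and the 32-bit-int conversion behaves as shown on the usual two's-complement platforms):

#include <cstdio>
#include <cstdint>

int main() {
  // Pretend the scheduled measurement time is now 5 us in the past:
  unsigned nextQOSMeasurementUSecs = 1000;
  unsigned timeNowUSecs            = 1005;

  // The subtraction happens in *unsigned* arithmetic, so it wraps to a huge
  // positive value; widening that to int64_t keeps it huge (no sign extension):
  int64_t bad = nextQOSMeasurementUSecs - timeNowUSecs;

  // Forcing the 32-bit result through a signed type first recovers -5:
  int     fixed32 = nextQOSMeasurementUSecs - timeNowUSecs;
  int64_t fixed64 = fixed32;

  std::printf("bad   = %lld\n", (long long)bad);     // 4294967291
  std::printf("fixed = %lld\n", (long long)fixed64); // -5
  return 0;
}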
URL: From finlayson at live555.com Sun Sep 8 21:47:06 2013 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 8 Sep 2013 21:47:06 -0700 Subject: [Live-devel] scheduleNextQOSMeasurement() bug? In-Reply-To: References: <2BA06977-82C5-4B3C-B1CE-A25B686A7159@live555.com> Message-ID: <37C5201B-0697-416A-86D8-194CB2522E86@live555.com> > int usecsToDelay = nextQOSMeasurementUSecs - timeNowUSecs; OK, I'll change it to this in the next release of the software. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From miguel.orenes at displaynote.com Mon Sep 9 03:56:14 2013 From: miguel.orenes at displaynote.com (Miguel Orenes) Date: Mon, 9 Sep 2013 10:56:14 +0000 Subject: [Live-devel] RTSP SETUP response Message-ID: <123ed2c0218d4ab4a19d68e1915785a9@AMXPR05MB072.eurprd05.prod.outlook.com> Hi there, I want to handle the SETUP params that my RTSPClient is receiving when a RTSPServer sends the reply, I thought inherit by RTSPClient and to rewrite parseTransportParams or handleSETUPResponse but I cannot because these methods are declared as private. Is there any way to handle the reply setup params? Regards. Miguel ?ngel Orenes Fern?ndez Software Engineer DisplayNote Technologies Ltd. P +34 868 079 259 W www.displaynote.com E miguel.orenes at displaynote.com A Traper?a 19 4?D | MURCIA 30001 Follow us on Twitter! http://twitter.com/displaynote -------------- next part -------------- An HTML attachment was scrubbed... URL: From zammargu at marvell.com Mon Sep 9 06:57:01 2013 From: zammargu at marvell.com (Zahira Ammarguellat) Date: Mon, 9 Sep 2013 06:57:01 -0700 Subject: [Live-devel] Packets and delay In-Reply-To: References: Message-ID: Ross, Thanks for the answer. That clarifies things. Another question. You say: If, however, the 'packet reordering threshold' threshold elapses without packet N+1 arriving, then (the already-arrived) packet N+2 will get delivered instead. In this case, packet N+1 will never get delivered, even if it happens to arrive later. But suppose that the next packet is not N+2 but N+3? What happens in this case? Is there another delay of 100ms? Or the most delay is 100ms and whatever packet arrives next is delivered? In other words, suppose only N+2 is delayed we would have: N, N+1, ---delay 100ms --- N+2 But suppose N+2 and N+3 are delayed would we have? (Assuming that the threshold is expired) N, N+1, ---delay 100ms --- N+4 Or N, N+1, ---delay 100ms---, ---delay 100ms---, N+4 Thanks, -Zahira -Zahira From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Thursday, September 05, 2013 5:45 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Packets and delay Can you please explain what happens when a packet is being delayed? I seem to understand that each packet has a life time and it gets discarded when its time is up. Does lives555 wait for the following packets to arrive, or does it continue delivering packets to client and then deliver the previous packets (out of order?). Our "RTPSource" class (and subclasses) *always* delivers (the payload of) incoming RTP packets in order. In other words, data is *never* delivered out-of-order, even if incoming RTP packets happen to be misordered. If there is no packet loss - or misordering - in the network, then our code always delivers (the payload of) incoming RTP packets without delay. 
I.e., if the last-received RTP packet had RTP sequence number N, and the next incoming packet has RTP sequence number N+1 (mod 65536), then this next incoming packet is always delivered immediately. If, however, the next incoming packet has RTP sequence number N+2 (or later), then its payload is not delivered immediately. Instead, it is delayed by the 'packet reordering threshold', to see what subsequent packets happen to arrive. If packet N+1 arrives before the 'packet reordering threshold' time, then it will then get delivered, followed by (the already-arrived) packet N+2. If, however, the 'packet reordering threshold' threshold elapses without packet N+1 arriving, then (the already-arrived) packet N+2 will get delivered instead. In this case, packet N+1 will never get delivered, even if it happens to arrive later. By default, the 'packet reordering threshold' is 100 ms. Therefore, incoming packets may be delayed by this time - but *only* if there is packet loss in the network. It's important to understand that this is the *only* place in our code where incoming packets may be delayed before being delivered. If you are seeing delays greater than 100ms, then it is *not* happening in our code, so don't waste your time looking for it there. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Sep 9 08:03:48 2013 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 9 Sep 2013 08:03:48 -0700 Subject: [Live-devel] RTSP SETUP response In-Reply-To: <123ed2c0218d4ab4a19d68e1915785a9@AMXPR05MB072.eurprd05.prod.outlook.com> References: <123ed2c0218d4ab4a19d68e1915785a9@AMXPR05MB072.eurprd05.prod.outlook.com> Message-ID: > I want to handle the SETUP params that my RTSPClient is receiving when a RTSPServer sends the reply, I thought inherit by RTSPClient and to rewrite parseTransportParams or handleSETUPResponse but I cannot because these methods are declared as private. Those member functions are not intended to be redefinable, because the set of 'transport' parameters - defined in RFC 2326 - is not intended to be extendable. Which specific parameter(s) - defined in RFC 2326 - do you want your RTSP client to be able to access? Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From miguel.orenes at displaynote.com Mon Sep 9 08:57:20 2013 From: miguel.orenes at displaynote.com (Miguel Orenes) Date: Mon, 9 Sep 2013 15:57:20 +0000 Subject: [Live-devel] RTSP SETUP response In-Reply-To: References: <123ed2c0218d4ab4a19d68e1915785a9@AMXPR05MB072.eurprd05.prod.outlook.com> Message-ID: <4d49faceb9ab495f970319061406dcb6@AMXPR05MB072.eurprd05.prod.outlook.com> Transport: RTP/AVP;unicast; I would like to know if RTSPClient is configured as multicast or unicast. I can see on parseTransportParams that this parameter is treated but is not stored in order to get later, then I am not able to know from client part if RTSPServer is working as multicast or unicast. I know that RTSPClient is configured automatically and is able to receive multicast traffic, but maybe I want to change to unicast because we are losing a lot of multicast frames or for another cause. Regards. Miguel ?ngel Orenes Fern?ndez Software Engineer DisplayNote Technologies Ltd. P +34 868 079 259 W www.displaynote.com E miguel.orenes at displaynote.com A Traper?a 19 4?D | MURCIA 30001 Follow us on Twitter! 
http://twitter.com/displaynote From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Monday, September 9, 2013 5:04 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] RTSP SETUP response I want to handle the SETUP params that my RTSPClient is receiving when a RTSPServer sends the reply, I thought inherit by RTSPClient and to rewrite parseTransportParams or handleSETUPResponse but I cannot because these methods are declared as private. Those member functions are not intended to be redefinable, because the set of 'transport' parameters - defined in RFC 2326 - is not intended to be extendable. Which specific parameter(s) - defined in RFC 2326 - do you want your RTSP client to be able to access? Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Sep 9 08:58:05 2013 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 9 Sep 2013 08:58:05 -0700 Subject: [Live-devel] Packets and delay In-Reply-To: References: Message-ID: > In other words, suppose only N+2 is delayed we would have: > N, N+1, ---delay 100ms --- N+2 No. In this case - because there's no gap in RTP sequence number - each of these three packets is delivered immediately, when they arrive. There's no extra delay. > But suppose N+2 and N+3 are delayed would we have? (Assuming that the threshold is expired) > N, N+1, ---delay 100ms --- N+4 > Or > N, N+1, ---delay 100ms---, ---delay 100ms---, N+4 In this case the delay starts only after the arrival of packet N+4. In other words: Packets N and N+1 are delivered immediately, when they arrive. Then, after packet N+4 arrives, because there's a gap in the RTP sequence numbers, the code delays the delivery of packet N+4, to see if packets N+2 and/or N+3 arrive next. Specifically: - If packet N+2 arrives before 100ms has elapsed, then it is delivered immediately, and the 100ms timer is reset. - If packet N+3 then arrives next, before 100ms (after the arrival of packet N+2) has elapsed, then packet N+3 is also delivered immediately. Then packet N+4 is delivered next. In this case, there's no extra delay. - If, however, 100ms (after the arrival of packet N+2) elapses without packet N+3 arriving, then packet N+4 is delivered instead - i.e., after a single delay of 100ms. (In this case, packet N+3 is never delivered, even if it happens to arrive afterwards.) - If packet N+3 (but not packet N+2) arrives next, before 100ms (after the arrival of packet N+4) has elapsed, then the 100ms timer is reset. - If packet N+2 then arrives, before 100ms (after the arrival of packet N+3) has elapsed, then packet N+2 is delivered immediately. Then packet N+3 is delivered next. Then packet N+4 is delivered next. In this case, there's no extra delay. - If, however, 100ms (after the arrival of packet N+3) elapses without packet N+2 arriving, then packet N+3 is delivered instead - i.e., after a single delay of 100ms. Then packet N+4 is delivered next. (In this case, packet N+2 is never delivered, even if it happens to arrive afterwards.) - If neither packet N+2 or N+3 arrives before 100 ms (after the arrival of packet N+4) has elapsed, then packet N+4 is delivered instead - i.e., after a single delay of 100ms. (In this case, packet N+2 and N+3 are never delivered, even if they happens to arrive afterwards.) 
In other words, if there's only packet loss - but no packet reordering - in the network, then no incoming packet will ever be delayed more than 100ms before being delivered. If there's no packet loss (and no packet reordering), then no incoming packet will ever be delayed. Once again, there is nothing here that you need concern yourself with. This will be my (and your) last posting on this topic. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Sep 9 09:19:18 2013 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 9 Sep 2013 09:19:18 -0700 Subject: [Live-devel] RTSP SETUP response In-Reply-To: <4d49faceb9ab495f970319061406dcb6@AMXPR05MB072.eurprd05.prod.outlook.com> References: <123ed2c0218d4ab4a19d68e1915785a9@AMXPR05MB072.eurprd05.prod.outlook.com> <4d49faceb9ab495f970319061406dcb6@AMXPR05MB072.eurprd05.prod.outlook.com> Message-ID: No, it's the server - not the client - that decides the characteristics of a RTSP stream, including whether it's unicast or multicast. The only way for a RTSP client to explicitly choose between a unicast and a multicast stream is if the server has been set up beforehand to serve both kinds of stream - each with a different name. (Then the client can use the stream's name - and thus "rtsp://" URL - to choose which stream it wants.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From miguel.orenes at displaynote.com Mon Sep 9 10:17:48 2013 From: miguel.orenes at displaynote.com (Miguel Orenes) Date: Mon, 9 Sep 2013 17:17:48 +0000 Subject: [Live-devel] RTSP SETUP response In-Reply-To: References: <123ed2c0218d4ab4a19d68e1915785a9@AMXPR05MB072.eurprd05.prod.outlook.com> <4d49faceb9ab495f970319061406dcb6@AMXPR05MB072.eurprd05.prod.outlook.com> Message-ID: <965e4e1b853b4316a8e0c131fac64562@AMXPR05MB072.eurprd05.prod.outlook.com> I know, I do not want to configure the client as multicast or unicast, I only want to know if RTSPClient received "Transport:RTP/AVP;unicast;" or "Trasport:RTP/AVP;multicast;" on SETUP reply. Regards. Miguel ?ngel Orenes Fern?ndez Software Engineer DisplayNote Technologies Ltd. P +34 868 079 259 W www.displaynote.com E miguel.orenes at displaynote.com A Traper?a 19 4?D | MURCIA 30001 Follow us on Twitter! http://twitter.com/displaynote From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Monday, September 9, 2013 6:19 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] RTSP SETUP response No, it's the server - not the client - that decides the characteristics of a RTSP stream, including whether it's unicast or multicast. The only way for a RTSP client to explicitly choose between a unicast and a multicast stream is if the server has been set up beforehand to serve both kinds of stream - each with a different name. (Then the client can use the stream's name - and thus "rtsp://" URL - to choose which stream it wants.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Mon Sep 9 14:05:12 2013 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 9 Sep 2013 14:05:12 -0700 Subject: [Live-devel] RTSP SETUP response In-Reply-To: <965e4e1b853b4316a8e0c131fac64562@AMXPR05MB072.eurprd05.prod.outlook.com> References: <123ed2c0218d4ab4a19d68e1915785a9@AMXPR05MB072.eurprd05.prod.outlook.com> <4d49faceb9ab495f970319061406dcb6@AMXPR05MB072.eurprd05.prod.outlook.com> <965e4e1b853b4316a8e0c131fac64562@AMXPR05MB072.eurprd05.prod.outlook.com> Message-ID: > I know, I do not want to configure the client as multicast or unicast, I only want to know if RTSPClient received ?Transport:RTP/AVP;unicast;? or ?Trasport:RTP/AVP;multicast;? on SETUP reply. OK - you can do this by testing the IP address for the "MediaSubsession" object. E.g., using the "testRTSPClient" code as an example, in the "continueAfterSETUP()" function, you could call IsMulticastAddress(scs.subsession->rtpSource()->RTPgs()->groupAddress().s_addr) I.e., you can do this without having to subclass "RTSPClient". Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From Vince.Li at quantatw.com Mon Sep 9 20:14:42 2013 From: Vince.Li at quantatw.com (=?big5?B?VmluY2UgTGkop/Wrwik=?=) Date: Tue, 10 Sep 2013 03:14:42 +0000 Subject: [Live-devel] Get crash with BasicUDPSink Message-ID: <1B8CE959541B0C40BEC5966000F63EC6AD98FDA6@MAILBX03.quanta.corp> Hello Rose, When I upgrade live555 from 2013.04.01 to 2013.09.08, I got crash after run my program for a while. The following is the crash message. FramedSource[0x16338d0]::getNextFrame(): attempting to read more than once at the same time! I am using BasicUDPSink to streaming H.264 and AAC with MPEG2 TS format like this. MPEG2TransportStreamFromESSource* tsFrames = MPEG2TransportStreamFromESSource::createNew(*env); tsFrames->addNewVideoSource(videoSource, 5 /* H.264 */); tsFrames->addNewAudioSource(audioSource, 4 /* AAC */ ); BasicUDPSink* udpSink = BasicUDPSink::createNew(*env, udpSocket, 1316); udpSink->startPlaying(*tsFrames, NULL, udpSink); Besides, I have noticed that the higher bit rate the faster to get crash. Not sure what to do to fix it. Any help would be greatly appreciated. Vince Li(??) Future Laboratory 2 Quanta Research Institute Email: Vince.Li at quantatw.com TEL: +886-3-327-2345 ext 16278 FAX: +886-3-397-2604 -------------- next part -------------- An HTML attachment was scrubbed... URL: From Vince.Li at quantatw.com Mon Sep 9 20:29:09 2013 From: Vince.Li at quantatw.com (=?big5?B?VmluY2UgTGkop/Wrwik=?=) Date: Tue, 10 Sep 2013 03:29:09 +0000 Subject: [Live-devel] Without SDP with H264VideoStreamDiscreteFramer Message-ID: <1B8CE959541B0C40BEC5966000F63EC6AD98FDC5@MAILBX03.quanta.corp> Hello Rose, I?ve a question about H264VideoStreamFramer and H264VideoStreamDiscreteFramer. Since my video frame source is discrete (frame by frame), I use H264VideoStreamDiscreteFrame. However, I discover the H264VideoRTPSink didn?t have SDP (dump auxSDPLine() periodically) with H264VideoStreamDiscreteFramer. But H264VideoStreamFramer have. Without SDP, VLC player still can play my RTSP, but almost player apps on android phone cannot play. Any help would be greatly appreciated. Vince Li(??) Future Laboratory 2 Quanta Research Institute Email: Vince.Li at quantatw.com TEL: +886-3-327-2345 ext 16278 FAX: +886-3-397-2604 -------------- next part -------------- An HTML attachment was scrubbed... 
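One way to get a complete "sprop-parameter-sets" entry in the SDP without relying on in-band SPS/PPS NAL units is to hand the parameter sets to the RTP sink when it is created, for example inside a subclassed "OnDemandServerMediaSubsession". A hedged sketch: "MyH264Subsession" is a hypothetical subclass, "fSPS"/"fSPSSize" and "fPPS"/"fPPSSize" are assumed members holding the raw SPS and PPS NAL units (without start codes), and the exact "H264VideoRTPSink::createNew()" overload should be checked against "H264VideoRTPSink.hh":

RTPSink* MyH264Subsession::createNewRTPSink(Groupsock* rtpGroupsock,
                                            unsigned char rtpPayloadTypeIfDynamic,
                                            FramedSource* /*inputSource*/) {
  // Supplying SPS/PPS here lets the sink build its "a=fmtp:... sprop-parameter-sets=..."
  // SDP line immediately, even when the discrete framer never sees those NAL units.
  return H264VideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic,
                                     fSPS, fSPSSize, fPPS, fPPSSize);
}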
URL: From finlayson at live555.com Mon Sep 9 23:32:39 2013 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 9 Sep 2013 23:32:39 -0700 Subject: [Live-devel] Get crash with BasicUDPSink In-Reply-To: <1B8CE959541B0C40BEC5966000F63EC6AD98FDA6@MAILBX03.quanta.corp> References: <1B8CE959541B0C40BEC5966000F63EC6AD98FDA6@MAILBX03.quanta.corp> Message-ID: It's very unlikely that the LIVE555 version upgrade caused your crash. More likely, it somehow exposed a bug in your code that already existed. > The following is the crash message. > FramedSource[0x16338d0]::getNextFrame(): attempting to read more than once at the same time! This error message means that somewhere - in your code - you are calling "getNextFrame()" on an object while it is already handling a previous call to "getNextFrame()" - i.e., before the 'afteGettingFunc' that was passed to the previous call to "getNextFrame()" has been called. I suspect that the problem is in your code for your own audio and video source classes - in particular, your implementation of the "doGetNextFrame()" virtual function in these classes. Because these classes should do asynchronous (i.e., non-blocking) I/O, you need to be especially careful to properly handle the case when "doGetNextFrame()" is called when no new data is immediately available. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Sep 9 23:36:53 2013 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 9 Sep 2013 23:36:53 -0700 Subject: [Live-devel] Without SDP with H264VideoStreamDiscreteFramer In-Reply-To: <1B8CE959541B0C40BEC5966000F63EC6AD98FDC5@MAILBX03.quanta.corp> References: <1B8CE959541B0C40BEC5966000F63EC6AD98FDC5@MAILBX03.quanta.corp> Message-ID: <910005BD-4DD3-4545-B3DB-2CD238F0A0D2@live555.com> > I?ve a question about H264VideoStreamFramer and H264VideoStreamDiscreteFramer. > Since my video frame source is discrete (frame by frame), I use H264VideoStreamDiscreteFrame. > However, I discover the H264VideoRTPSink didn?t have SDP (dump auxSDPLine() periodically) with H264VideoStreamDiscreteFramer. Does your input sequence of NAL units (that you pass to "H264VideoStreamDiscreteFramer") contain SPS and PPS NAL units (either at the start of the sequence, or periodically within it)? If so, then the "H264VideoStreamDiscreteFramer" will automatically record these when they appear, and this will cause the proper "auxSDPLine()" to get generated. Alternatively, if you know the SPS and PPS NAL unit data in advance, you can pass them as parameters to the "H264VideoRTPSink" constructor (see "liveMedia/include/H264VideoRTPSink.hh"). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From miguel.orenes at displaynote.com Tue Sep 10 01:01:43 2013 From: miguel.orenes at displaynote.com (Miguel Orenes) Date: Tue, 10 Sep 2013 08:01:43 +0000 Subject: [Live-devel] RTSP SETUP response In-Reply-To: References: <123ed2c0218d4ab4a19d68e1915785a9@AMXPR05MB072.eurprd05.prod.outlook.com> <4d49faceb9ab495f970319061406dcb6@AMXPR05MB072.eurprd05.prod.outlook.com> <965e4e1b853b4316a8e0c131fac64562@AMXPR05MB072.eurprd05.prod.outlook.com> Message-ID: <28f366b134fc4c7bb42bfd397e410233@AMXPR05MB072.eurprd05.prod.outlook.com> Many thanks Ross Miguel ?ngel Orenes Fern?ndez Software Engineer DisplayNote Technologies Ltd. 
P +34 868 079 259 W www.displaynote.com E miguel.orenes at displaynote.com A Traper?a 19 4?D | MURCIA 30001 Follow us on Twitter! http://twitter.com/displaynote From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Monday, September 9, 2013 11:05 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] RTSP SETUP response I know, I do not want to configure the client as multicast or unicast, I only want to know if RTSPClient received "Transport:RTP/AVP;unicast;" or "Trasport:RTP/AVP;multicast;" on SETUP reply. OK - you can do this by testing the IP address for the "MediaSubsession" object. E.g., using the "testRTSPClient" code as an example, in the "continueAfterSETUP()" function, you could call IsMulticastAddress(scs.subsession->rtpSource()->RTPgs()->groupAddress().s_addr) I.e., you can do this without having to subclass "RTSPClient". Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From francisco at eyelynx.com Tue Sep 10 12:11:29 2013 From: francisco at eyelynx.com (Francisco Feijoo) Date: Tue, 10 Sep 2013 21:11:29 +0200 Subject: [Live-devel] Absolute seekStream - problem with timestamps Message-ID: Hello Ross, I have subclassed OnDemandServerMediaSubsession and added support for 'trick play' using absolute times. All is working fine but the client (which is also using Live555) is receiving wrong timestamps from the server after pause/play is triggered. Basically I follow these steps: 1 - Client connects to the server for live video - OK 2 - Pause the client (sendPauseCommand) - OK 3 - Wait for some seconds and play the client again (sendPlayCommand("YYYYMM....")) - The frames are correct but the client reports wrong timestamps for 1 or 2 seconds, and then shows the correct timestamps. Looking at the implementation of OnDemandServerMediaSubsession I see that the other seekStream (with NPT) is changing the rtpSink presentation times with rtpSink->resetPresentationTimes(), while the other is not. Could this be the problem? Here is a sample of the received frames (in the client) after the play command (step 3). All the frames before 1378839810.868667 are correct but the timestamps are wrong: Received 205 new bytes of response data. > Received a complete PLAY response: > RTSP/1.0 200 OK > CSeq: 7 > Date: Tue, Sep 10 2013 19:03:37 GMT > Range: clock=20130910T210329.68Z- > Session: 6B73948F > RTP-Info: url=rtsp:// > 192.168.1.6:9000/2/playback/track1;seq=42948;rtptime=1877910131 > > Stream "rtsp://192.168.1.6:9000/2/playback/"; video/JPEG: Received 122886 > bytes. Presentation time: 1378839817.440850 > Stream "rtsp://192.168.1.6:9000/2/playback/"; video/JPEG: Received 122886 > bytes. Presentation time: 1378839817.506850 > Stream "rtsp://192.168.1.6:9000/2/playback/"; video/JPEG: Received 123596 > bytes. Presentation time: 1378839817.572850 > Stream "rtsp://192.168.1.6:9000/2/playback/"; video/JPEG: Received 123451 > bytes. Presentation time: 1378839817.638850 > Stream "rtsp://192.168.1.6:9000/2/playback/"; video/JPEG: Received 122796 > bytes. Presentation time: 1378839817.704850 > Stream "rtsp://192.168.1.6:9000/2/playback/"; video/JPEG: Received 123719 > bytes. Presentation time: 1378839817.770850 > Stream "rtsp://192.168.1.6:9000/2/playback/"; video/JPEG: Received 122876 > bytes. Presentation time: 1378839817.836850 > Stream "rtsp://192.168.1.6:9000/2/playback/"; video/JPEG: Received 122963 > bytes. 
Presentation time: 1378839817.902850 > Stream "rtsp://192.168.1.6:9000/2/playback/"; video/JPEG: Received 123406 > bytes. Presentation time: 1378839817.968850 > Stream "rtsp://192.168.1.6:9000/2/playback/"; video/JPEG: Received 123183 > bytes. Presentation time: 1378839818.034850 > Stream "rtsp://192.168.1.6:9000/2/playback/"; video/JPEG: Received 123008 > bytes. Presentation time: 1378839818.100850 > Stream "rtsp://192.168.1.6:9000/2/playback/"; video/JPEG: Received 123460 > bytes. Presentation time: 1378839818.166850 > Stream "rtsp://192.168.1.6:9000/2/playback/"; video/JPEG: Received 123108 > bytes. Presentation time: 1378839818.232850 > Stream "rtsp://192.168.1.6:9000/2/playback/"; video/JPEG: Received 123045 > bytes. Presentation time: 1378839818.298850 > Stream "rtsp://192.168.1.6:9000/2/playback/"; video/JPEG: Received 123708 > bytes. Presentation time: 1378839818.364850 > Stream "rtsp://192.168.1.6:9000/2/playback/"; video/JPEG: Received 123219 > bytes. Presentation time: 1378839818.430850 > Stream "rtsp://192.168.1.6:9000/2/playback/"; video/JPEG: Received 123228 > bytes. Presentation time: 1378839818.496850 > Stream "rtsp://192.168.1.6:9000/2/playback/"; video/JPEG: Received 123398 > bytes. Presentation time: 1378839810.868667 > Stream "rtsp://192.168.1.6:9000/2/playback/"; video/JPEG: Received 122975 > bytes. Presentation time: 1378839810.934667 > Stream "rtsp://192.168.1.6:9000/2/playback/"; video/JPEG: Received 123151 > bytes. Presentation time: 1378839811.000667 > Stream "rtsp://192.168.1.6:9000/2/playback/"; video/JPEG: Received 123915 > bytes. Presentation time: 1378839811.066667 > Stream "rtsp://192.168.1.6:9000/2/playback/"; video/JPEG: Received 122808 > bytes. Presentation time: 1378839811.132667 > Stream "rtsp://192.168.1.6:9000/2/playback/"; video/JPEG: Received 123383 > bytes. Presentation time: 1378839811.198667 > Stream "rtsp://192.168.1.6:9000/2/playback/"; video/JPEG: Received 123518 > bytes. Presentation time: 1378839811.264667 > Stream "rtsp://192.168.1.6:9000/2/playback/"; video/JPEG: Received 122613 > bytes. Presentation time: 1378839811.330667 > Stream "rtsp://192.168.1.6:9000/2/playback/"; video/JPEG: Received 123482 > bytes. Presentation time: 1378839811.396667 Thanks in advance. -- Francisco Feijoo Software Engineer EyeLynx Limited T: +44 020 8133 9388 E: francisco at eyelynx.com W: www.eyelynx.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Sep 10 13:33:04 2013 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 10 Sep 2013 13:33:04 -0700 Subject: [Live-devel] Absolute seekStream - problem with timestamps In-Reply-To: References: Message-ID: <0116F7A2-4844-4337-90D9-1AAAE02B1485@live555.com> > I have subclassed OnDemandServerMediaSubsession and added support for 'trick play' using absolute times. All is working fine but the client (which is also using Live555) is receiving wrong timestamps from the server after pause/play is triggered. Basically I follow these steps: > > 1 - Client connects to the server for live video - OK > 2 - Pause the client (sendPauseCommand) - OK > 3 - Wait for some seconds and play the client again (sendPlayCommand("YYYYMM....")) - The frames are correct but the client reports wrong timestamps for 1 or 2 seconds, and then shows the correct timestamps. 
Assuming that the presentation times during this time are all RTCP-synchronized (see ), I wonder if the problem might be that these 'extra' packets (the initial ones with the incorrect presentation time) are simply old packets that were buffered inside the client OS's socket receive buffer at the time of the pause? (If that's the case, then there's nothing much you can do about this, other than reduce your client OS socket buffering.) You could test this by sending (after your initial PAUSE) a non-absolute-time "PLAY" command with a "start" parameter of -1 (which means 'resume without seeking') > Looking at the implementation of OnDemandServerMediaSubsession I see that the other seekStream (with NPT) is changing the rtpSink presentation times with rtpSink->resetPresentationTimes(), while the other is not. Could this be the problem? No, because that doesn't actually change any presentation times; it just resets variables that are used to compute "getCurrentNPT()", which doesn't get called in your case, because you're seeking by absolute time. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From francisco at eyelynx.com Tue Sep 10 14:14:05 2013 From: francisco at eyelynx.com (Francisco Feijoo) Date: Tue, 10 Sep 2013 23:14:05 +0200 Subject: [Live-devel] Absolute seekStream - problem with timestamps In-Reply-To: <0116F7A2-4844-4337-90D9-1AAAE02B1485@live555.com> References: <0116F7A2-4844-4337-90D9-1AAAE02B1485@live555.com> Message-ID: Hello Ross, Thanks for your quick response. Yes, the client is synchronized. I also checked that all the frames sent from the server where received on the client at the moment of pause (I log all of them in both sides and are all sent/received). I'll check if I can reproduce the problem with the test programs. Thanks. 2013/9/10 Ross Finlayson > I have subclassed OnDemandServerMediaSubsession and added support for > 'trick play' using absolute times. All is working fine but the client > (which is also using Live555) is receiving wrong timestamps from the server > after pause/play is triggered. Basically I follow these steps: > > 1 - Client connects to the server for live video - OK > 2 - Pause the client (sendPauseCommand) - OK > 3 - Wait for some seconds and play the client again > (sendPlayCommand("YYYYMM....")) - The frames are correct but the client > reports wrong timestamps for 1 or 2 seconds, and then shows the correct > timestamps. > > > Assuming that the presentation times during this time are all > RTCP-synchronized (see < > http://www.live555.com/liveMedia/faq.html#rtcp-synchronization-issue>), I > wonder if the problem might be that these 'extra' packets (the initial ones > with the incorrect presentation time) are simply old packets that were > buffered inside the client OS's socket receive buffer at the time of the > pause? (If that's the case, then there's nothing much you can do about > this, other than reduce your client OS socket buffering.) > > You could test this by sending (after your initial PAUSE) a > non-absolute-time "PLAY" command with a "start" parameter of -1 (which > means 'resume without seeking') > > > Looking at the implementation of OnDemandServerMediaSubsession I see that > the other seekStream (with NPT) is changing the rtpSink presentation times > with rtpSink->resetPresentationTimes(), while the other is not. Could this > be the problem? 
> > > No, because that doesn't actually change any presentation times; it just > resets variables that are used to compute "getCurrentNPT()", which doesn't > get called in your case, because you're seeking by absolute time. > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -- Francisco Feijoo Software Engineer EyeLynx Limited T: +44 020 8133 9388 E: francisco at eyelynx.com W: www.eyelynx.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Sep 10 15:12:32 2013 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 10 Sep 2013 15:12:32 -0700 Subject: [Live-devel] Absolute seekStream - problem with timestamps In-Reply-To: References: <0116F7A2-4844-4337-90D9-1AAAE02B1485@live555.com> Message-ID: Another possibility is that you're not setting presentation times correctly at the server end. The presentation times should be aligned with 'wall clock' time - i.e., the time that you'd get by calling "gettimeofday()". This is true even when you resume after pausing, and/or seek within the underlying medium. Note that the presentation times should continue advancing - aligned with wall-clock time - even if you pause and/or seek. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From srimugunth at csa.iisc.ernet.in Tue Sep 10 23:58:27 2013 From: srimugunth at csa.iisc.ernet.in (srimugunth at csa.iisc.ernet.in) Date: Wed, 11 Sep 2013 12:28:27 +0530 Subject: [Live-devel] modifying client end of ProxyServer In-Reply-To: References: <494bd8f466f5773bdc050f226aaccc50.squirrel@clmail.csa.iisc.ernet.in> Message-ID: >> With testRTSPClient.cpp, i was able to add the following code to >> DummySink::afterGettingFrame() and successfully dump .h264 frames > > Much better: Use the existing "H264VideoFileSink" class (instead of > "DummySink"), which (I think) already does exactly what you want. > > Even better: Use the existing "openRTSP" application, which will > automatically output a H.264 video file: > http://www.live555.com/openRTSP/ > > I don't see what this has to do with proxy RTSP servers (and I certainly > don't see why you would want to modify the existing proxy server code). > > thanks for replying. ok. The reason I am trying to modify the ProxyServer is because, I want to run RTSPclient+server in a single process. I want to integrate our transcoder between the RTSP client and server. RTSP client will receive live stream from IP camera, the frames will be transcoded, the transcoded output will be received asynchronously and re-exported as a different stream. We were able to do this with testRTSPClient+transcoder as one process and testOnDemandRTSPserver as another process and both of them connected through a named pipe. We think having a single process with "RTSPclient+transcoder+RTSPserver", we can achieve more live and real time performance. I naively thought that i can plugin our transcoder into the proxyServer inbetween the RTSP client and server. But the proxyServer code is a bit difficult for me, and I could not figure out which is the client end and which is the server end. I want to receive the frames in the client end, and make a call to our transcoder and pass the received frame. 
The transcoded frame will be received asynchronously in a different thread. I receive the output transcoded frame and export it as a live-source for the server end. I will appreciate very much your inputs about how to achieve this. thanks, mugunthan > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > -- This message has been scanned for viruses and dangerous content by MailScanner, and is believed to be clean. From stanley at gofuseit.com Wed Sep 11 00:06:48 2013 From: stanley at gofuseit.com (Stanley Biggs) Date: Wed, 11 Sep 2013 09:06:48 +0200 Subject: [Live-devel] Bug-fix: RR packets are being sent early before PLAY command has been issued when using RTP-over-TCP Message-ID: Hi Ross I have found what I believe to a bug in the latest live555 code and I also have a fix for the problem. We have tested this fix in our implementation and I would like to submit this fix for inclusion in your project. *PROBLEM DESCRIPTION:* We are streaming using RTP-over-TCP. We are streaming via 3G cell phone connections which means that latency is higher than on a cabled network. We are setting up different Audio and Video streams which means that the client will issue a DESCRIBE request, followed by two SETUP requests and will then issue a PLAY request to start playing the stream. We have found that, after sending the second SETUP request (and before getting the result of this from the server), the client sends out an RR packet to the server. Since this happens before the client has issued a "PLAY" to the server, the server gets confused and responds with a "405 Method Not Allowed". This doesn't happen when we stream on a local LAN since the latency isn't a problem. The problem is only observed once we stream over a "real" network where there is some latency. *SOLUTION:* I have now debugged the Live555 code and have found the cause of the problem. Basically, in the case of a Sink Node being assigned, the RTCPInstance::addReport function checks the setting we have for enableRTCPReports (whether we can start sending reports or not). However, in the case of a Source Node being assigned, this function neglects to check the corresponding value of the Source Node. I have applied the following fix that I'd like to propose to include in the code: In *livemedia/include/RTSPSource.hh*: ... Boolean& enableRTCPReports() { return fEnableRTCPReports; } *Boolean constAccessibleEnableRTCPReports() const { return fEnableRTCPReports; }* //****Function added by me (Stanley). The same as enableRTCPReports except that it is "const" and "safe" for const object references to call, which means we can call this from RTCPInstance using the fSource object there. ... In *livemedia/RTSPClient.cpp*: ... Boolean RTCPInstance::addReport(Boolean alwaysAdd) { // Include a SR or a RR, depending on whether we have an associated sink or source: if (fSink != NULL) { if (!alwaysAdd) { if (!fSink->enableRTCPReports()) return False; // Hack: Don't send a SR during those (brief) times when the timestamp of the // next outgoing RTP packet has been preset, to ensure that that timestamp gets // used for that outgoing packet. (David Bertrand, 2006.07.18) if (fSink->nextTimestampHasBeenPreset()) return False; } addSR(); } else if (fSource != NULL) { //****The following IF-statement was added by me (Stanley). As in the case of the Sink Node above, we first check the value of EnableRTCPReports before we do addRR(). 
*if (!fSource->**constAccessibleEnableRTCPReports()) return false;* addRR(); } return True; } ... Kind Regards Stanley Biggs -------------- next part -------------- An HTML attachment was scrubbed... URL: From stanley at gofuseit.com Wed Sep 11 00:31:24 2013 From: stanley at gofuseit.com (Stanley Biggs) Date: Wed, 11 Sep 2013 09:31:24 +0200 Subject: [Live-devel] Bug-fix: RR packets are being sent early before PLAY command has been issued when using RTP-over-TCP In-Reply-To: References: Message-ID: Just a correction: I reference livemedia/RTSPClient.cpp as the source file where I made the fix. This is incorrect. The correct filename is livemedia/RTCP.cpp. It is here where the method occurs to which I applied the fix. I am also indicating it inline below... On Wed, Sep 11, 2013 at 9:06 AM, Stanley Biggs wrote: > Hi Ross > > I have found what I believe to a bug in the latest live555 code and I also > have a fix for the problem. We have tested this fix in our implementation > and I would like to submit this fix for inclusion in your project. > > *PROBLEM DESCRIPTION:* > We are streaming using RTP-over-TCP. We are streaming via 3G cell phone > connections which means that latency is higher than on a cabled network. We > are setting up different Audio and Video streams which means that the > client will issue a DESCRIBE request, followed by two SETUP requests and > will then issue a PLAY request to start playing the stream. We have found > that, after sending the second SETUP request (and before getting the result > of this from the server), the client sends out an RR packet to the server. > Since this happens before the client has issued a "PLAY" to the server, the > server gets confused and responds with a "405 Method Not Allowed". This > doesn't happen when we stream on a local LAN since the latency isn't a > problem. The problem is only observed once we stream over a "real" network > where there is some latency. > > *SOLUTION:* > I have now debugged the Live555 code and have found the cause of the > problem. Basically, in the case of a Sink Node being assigned, the > RTCPInstance::addReport function checks the setting we have for > enableRTCPReports (whether we can start sending reports or not). However, > in the case of a Source Node being assigned, this function neglects to > check the corresponding value of the Source Node. I have applied the > following fix that I'd like to propose to include in the code: > > In *livemedia/include/RTSPSource.hh*: > > ... > Boolean& enableRTCPReports() { return fEnableRTCPReports; } > *Boolean constAccessibleEnableRTCPReports() const { return > fEnableRTCPReports; }* //****Function added by me (Stanley). The same as > enableRTCPReports except that it is "const" and "safe" for const object > references to call, which means we can call this from RTCPInstance using > the fSource object there. > ... > > In *livemedia/RTCP.cpp*: > > ... > Boolean RTCPInstance::addReport(Boolean alwaysAdd) { > // Include a SR or a RR, depending on whether we have an associated sink > or source: > if (fSink != NULL) { > if (!alwaysAdd) { > if (!fSink->enableRTCPReports()) return False; > > // Hack: Don't send a SR during those (brief) times when the > timestamp of the > // next outgoing RTP packet has been preset, to ensure that that > timestamp gets > // used for that outgoing packet. (David Bertrand, 2006.07.18) > if (fSink->nextTimestampHasBeenPreset()) return False; > } > > addSR(); > } else if (fSource != NULL) { > //****The following IF-statement was added by me (Stanley). 
As in the > case of the Sink Node above, we first check the value of EnableRTCPReports > before we do addRR(). > *if (!fSource->**constAccessibleEnableRTCPReports()) return false;* > addRR(); > } > > return True; > } > ... > > Kind Regards > > Stanley Biggs > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From MJ at unitek.dk Wed Sep 11 01:08:22 2013 From: MJ at unitek.dk (Michael S. Juul) Date: Wed, 11 Sep 2013 10:08:22 +0200 Subject: [Live-devel] Presentation time when streaming video recording from surveillance cameras Message-ID: <97D9F1117F2FF54A964F634D1D844B65322608@server-4.Unitek.local> Hi Ross We use the RTSPClient from Live555 to access live video and video recordings from surveillance cameras, utilizing the cameras built-in RTSP server and recording storage (An SD card). We have a problem regarding the presentation time when streaming video recordings. Let's say we have a video recording from "last week" which we would like to stream. We start the stream; we can even start the stream from a specific time in the video recording, so that's swell. BUT, the presentation time we receive is "now", not "last week". In order to display the correct presentation time, we therefore have to do a lot of calculations, which isn't very accurate (Sometime errors of several seconds). In order to get a more precise presentation time calculation, I have tried to use the "durationInMicroseconds", but it seems to always be 0 (zero), and the getNormalPlayTime method always returns 0 (zero) because fNPT_PTS_Offset is 0 (zero). Is there any way we can get the actual recording time (I.e. "last week") via Live555? I am aware that this might in fact be a problem related to the cameras RTSP server, but I haven't been able to locate any setup properties in the cameras related to the presentation time when streaming video recordings, or for that matter live video. Also, one should imagine this is a normal usage scenario related to surveillance cameras, so when streaming a video recording, it should be possible to get the presentation time from when the video recording was actually made. Any suggestions? Venlig hilsen / Best regards Michael S. Juul B.SC.E.E mj at unitek.dk Unitek A/S V?vervej 5 8800 Viborg Tlf.: +45 86 61 44 22 mail at unitek.dk www.unitek.dk -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Sep 11 01:16:42 2013 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 11 Sep 2013 01:16:42 -0700 Subject: [Live-devel] Bug-fix: RR packets are being sent early before PLAY command has been issued when using RTP-over-TCP In-Reply-To: References: Message-ID: Yes, you're correct - this was an oversight in the code (it had prevented "SR" packets from being sent prematurely, but not "RR" packets). I've now installed a new version (2013.09.11) of the "LIVE555 Streaming Media" code that fixes this. Thanks for the report. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Wed Sep 11 01:54:16 2013 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 11 Sep 2013 01:54:16 -0700 Subject: [Live-devel] Presentation time when streaming video recording from surveillance cameras In-Reply-To: <97D9F1117F2FF54A964F634D1D844B65322608@server-4.Unitek.local> References: <97D9F1117F2FF54A964F634D1D844B65322608@server-4.Unitek.local> Message-ID: <49DD9BDE-489D-407D-B6D5-478F593EDBE9@live555.com> > We use the RTSPClient from Live555 to access live video and video recordings from surveillance cameras, utilizing the cameras built-in RTSP server and recording storage (An SD card). > We have a problem regarding the presentation time when streaming video recordings. > Let?s say we have a video recording from ?last week? which we would like to stream. We start the stream; we can even start the stream from a specific time in the video recording, so that?s swell. > BUT, the presentation time we receive is ?now?, not ?last week?. You're misunderstanding the purpose of "presentation time". The "presentation time" is a timestamp - generated by the server and conveyed to the client for each RTP packet - that tells the client the relative spacing of successive frames of the same media type, and also - if both audio and video substreams are present - allows the client to synchronize the audio and video (i.e., 'lip sync'). The absolute value of the presentation times is undefined and irrelevant. (It is often the 'current' time, but doesn't have to be. E.g., if the server's clock happens to be set to 1990, then the presentation times will be from then.) What you want is something like the "normal play time" (NPT), but (as you discovered) we don't compute that when the client is seeking by absolute time, which I think is what you're doing. But I think some sort of solution should be possible. To be sure, please give us an example of a RTSP protocol exchange between your client and the server, when you're asking to stream from some time in the past - i.e., the "DESCRIBE", "SETUP", and "PLAY" commands, and their responses. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From francisco at eyelynx.com Wed Sep 11 03:18:29 2013 From: francisco at eyelynx.com (Francisco Feijoo) Date: Wed, 11 Sep 2013 12:18:29 +0200 Subject: [Live-devel] Absolute seekStream - problem with timestamps In-Reply-To: References: <0116F7A2-4844-4337-90D9-1AAAE02B1485@live555.com> Message-ID: Hello Ross, You are right, I was setting fPresentationTime incorrectly in the server end for 'trick play'. I was trying to set it as the "actual frame time" (when the frame was recorded), which as you said is wrong. How could I know in the client end what is the actual frame time? Is there any standard way of doing this? This is quite close to the thread opened today by Michael S. Juul "Presentation time when streaming video recording from surveillance cameras" but in my case I control both sides (client and server) which use live555. Thanks. 2013/9/11 Ross Finlayson > Another possibility is that you're not setting presentation times > correctly at the server end. The presentation times should be aligned with > 'wall clock' time - i.e., the time that you'd get by calling > "gettimeofday()". This is true even when you resume after pausing, and/or > seek within the underlying medium. 
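One concrete reading of the presentation-time rule stated above (and of the earlier advice in the "Absolute seekStream" thread that presentation times must keep advancing in step with 'wall clock' time across pause and seek): a server playing back a recording can anchor the wall-clock time at which PLAY was handled, then offset it by how far each frame lies past the seek point. A sketch only, with hypothetical names (fSeekTime, fPlayStartWallClock) inside a hypothetical FramedSource subclass:

    // fSeekTime:           recorded time the client seeked to (seconds since the epoch)
    // fPlayStartWallClock: gettimeofday() result, as a double, captured when PLAY was handled
    void RecordedFrameSource::setPresentationTimeFor(double recordedFrameTime) {
      double pt = fPlayStartWallClock + (recordedFrameTime - fSeekTime);  // stays wall-clock aligned
      fPresentationTime.tv_sec  = (time_t)pt;
      fPresentationTime.tv_usec = (long)((pt - (double)fPresentationTime.tv_sec) * 1000000.0);
    }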
Note that the presentation times should > continue advancing - aligned with wall-clock time - even if you pause > and/or seek. > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -- Francisco Feijoo Software Engineer EyeLynx Limited T: +44 020 8133 9388 E: francisco at eyelynx.com W: www.eyelynx.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Sep 11 07:56:19 2013 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 11 Sep 2013 07:56:19 -0700 Subject: [Live-devel] Absolute seekStream - problem with timestamps In-Reply-To: References: <0116F7A2-4844-4337-90D9-1AAAE02B1485@live555.com> Message-ID: <71E9F686-39A0-4337-887E-85B54C246154@live555.com> > How could I know in the client end what is the actual frame time? Is there any standard way of doing this? Yes, what you're looking for here is "normal play time". As a RTSP client, you can get this by calling "MediaSubsession::getNormalPlayTime()" on your "MediaSubsession" object, passing the presentation time as parameter. (This should work in your case because your server (LIVE555-based) returns a "RTP-Info:" header in its "PLAY" response.) In your case - because you've seeked by absolute time - this "normal play time" value will start from 0.0. However, you can then add this to the absolute time (the time that you seeked to) to get the correct current absolute time. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From francisco at eyelynx.com Wed Sep 11 10:29:15 2013 From: francisco at eyelynx.com (Francisco Feijoo) Date: Wed, 11 Sep 2013 19:29:15 +0200 Subject: [Live-devel] Absolute seekStream - problem with timestamps In-Reply-To: <71E9F686-39A0-4337-887E-85B54C246154@live555.com> References: <0116F7A2-4844-4337-90D9-1AAAE02B1485@live555.com> <71E9F686-39A0-4337-887E-85B54C246154@live555.com> Message-ID: Great! Now is working as I wanted. I have some other questions regarding trick play that I'll ask in a different thread. Thanks a lot for your help. 2013/9/11 Ross Finlayson > How could I know in the client end what is the actual frame time? Is there > any standard way of doing this? > > > Yes, what you're looking for here is "normal play time". As a RTSP > client, you can get this by calling > "MediaSubsession::getNormalPlayTime()" > on your "MediaSubsession" object, passing the presentation time as > parameter. (This should work in your case because your server > (LIVE555-based) returns a "RTP-Info:" header in its "PLAY" response.) > > In your case - because you've seeked by absolute time - this "normal play > time" value will start from 0.0. However, you can then add this to the > absolute time (the time that you seeked to) to get the correct current > absolute time. > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -- Francisco Feijoo Software Engineer EyeLynx Limited T: +44 020 8133 9388 E: francisco at eyelynx.com W: www.eyelynx.com -------------- next part -------------- An HTML attachment was scrubbed... 
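To make the recipe above concrete: once the stream is RTCP-synchronized and the (LIVE555-based) server has returned an "RTP-Info:" header, a client that seeked by absolute time can recover each frame's recording time by adding the subsession's normal play time to the seek time. A sketch in the testRTSPClient style; "seekTimeInSecs" is a hypothetical variable holding the absolute seek time ("YYYYMMDDTHHMMSSZ"), already parsed into seconds since the epoch:

    // Inside the sink's per-frame callback (DummySink in testRTSPClient); sketch only.
    void DummySink::afterGettingFrame(unsigned frameSize, unsigned /*numTruncatedBytes*/,
                                      struct timeval presentationTime, unsigned /*durationInMicroseconds*/) {
      // Seconds into the stream, counted from the absolute-time seek point:
      double npt = fSubsession.getNormalPlayTime(presentationTime);  // starts at 0.0 after an absolute seek
      double recordedTime = seekTimeInSecs + npt;                    // the frame's actual recording time
      // ... use recordedTime (display, indexing, ...), then keep requesting frames:
      continuePlaying();
    }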
URL: From francisco at eyelynx.com Wed Sep 11 11:16:04 2013 From: francisco at eyelynx.com (Francisco Feijoo) Date: Wed, 11 Sep 2013 20:16:04 +0200 Subject: [Live-devel] Get one frame at an absolute time Message-ID: Hello Ross, I want to be able to request one single frame at an absolute time from the RTSP Client. Is that possible? In the client I'm sending a play command with the same absStart/absEnd. Received 227 new bytes of response data. > Received a complete PLAY response: > RTSP/1.0 200 OK > CSeq: 326 > Date: Wed, Sep 11 2013 17:34:09 GMT > Range: clock=20130911T192738.393Z-20130911T192738.393Z > Session: 51CCAB60 > RTP-Info: url=rtsp:// > 192.168.1.6:9000/2/playback/track1;seq=21357;rtptime=771222272 > In the server end, my own OnDemandServerMediaSubsession implements seekStreamSource and I pass absStart/absEnd to my own FramedSource. How should I implement my FramedSource doGetNextFrame so it only sends frames until the absEnd is reached (in this case only one frame). Is there any example FramedSource I can use as a template? Is this the correct way to do this? Thanks in advance. -- Francisco Feijoo Software Engineer EyeLynx Limited T: +44 020 8133 9388 E: francisco at eyelynx.com W: www.eyelynx.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Sep 11 12:55:06 2013 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 11 Sep 2013 12:55:06 -0700 Subject: [Live-devel] Get one frame at an absolute time In-Reply-To: References: Message-ID: <06C15677-AB04-4B81-ADF1-232B519BA7E2@live555.com> > I want to be able to request one single frame at an absolute time from the RTSP Client. Is that possible? The RTSP protocol specification has an optional mechanism that allows this: specifying a range that uses SMPTE-format times (that can address individual frames, by number). However, the "LIVE555 Streaming Media" code does not support this, and it is unlikely to be added in the future (at least, not for free). > In the server end, my own OnDemandServerMediaSubsession implements seekStreamSource and I pass absStart/absEnd to my own FramedSource. > > How should I implement my FramedSource doGetNextFrame so it only sends frames until the absEnd is reached (in this case only one frame). That's up to you to decide. However, if your underlying data source uses a "ByteStreamFileSource" object, then note that the "ByteStreamFileSource ::seekToByteAbsolute()" function has an optional parameter "numBytesToStream". If this parameter is non-zero, then our implementation of "ByteStreamFileSource" will automatically limit the stream to deliver that many bytes only, before treating it as EOF. So, if you can figure out how many bytes you want to deliver, and are using a "ByteStreamFileSource", you can do it that way. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From MJ at unitek.dk Wed Sep 11 23:49:42 2013 From: MJ at unitek.dk (Michael S. 
Juul) Date: Thu, 12 Sep 2013 08:49:42 +0200 Subject: [Live-devel] Presentation time when streaming video recordingfrom surveillance cameras In-Reply-To: <49DD9BDE-489D-407D-B6D5-478F593EDBE9@live555.com> References: <97D9F1117F2FF54A964F634D1D844B65322608@server-4.Unitek.local> <49DD9BDE-489D-407D-B6D5-478F593EDBE9@live555.com> Message-ID: <97D9F1117F2FF54A964F634D1D844B65322617@server-4.Unitek.local> Hi Ross I have attached a screen dump from WireShark showing you the RTSP protocol exchange between our client and the server (camera). The "DESCRIBE", "SETUP" and "PLAY" commands are packet no. 153, 173 and 186. Venlig hilsen / Best regards Michael S. Juul B.SC.E.E mj at unitek.dk Unitek A/S V?vervej 5 8800 Viborg Tlf.: +45 86 61 44 22 mail at unitek.dk www.unitek.dk Fra: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] P? vegne af Ross Finlayson Sendt: 11. september 2013 10:54 Til: LIVE555 Streaming Media - development & use Emne: Re: [Live-devel] Presentation time when streaming video recordingfrom surveillance cameras We use the RTSPClient from Live555 to access live video and video recordings from surveillance cameras, utilizing the cameras built-in RTSP server and recording storage (An SD card). We have a problem regarding the presentation time when streaming video recordings. Let's say we have a video recording from "last week" which we would like to stream. We start the stream; we can even start the stream from a specific time in the video recording, so that's swell. BUT, the presentation time we receive is "now", not "last week". You're misunderstanding the purpose of "presentation time". The "presentation time" is a timestamp - generated by the server and conveyed to the client for each RTP packet - that tells the client the relative spacing of successive frames of the same media type, and also - if both audio and video substreams are present - allows the client to synchronize the audio and video (i.e., 'lip sync'). The absolute value of the presentation times is undefined and irrelevant. (It is often the 'current' time, but doesn't have to be. E.g., if the server's clock happens to be set to 1990, then the presentation times will be from then.) What you want is something like the "normal play time" (NPT), but (as you discovered) we don't compute that when the client is seeking by absolute time, which I think is what you're doing. But I think some sort of solution should be possible. To be sure, please give us an example of a RTSP protocol exchange between your client and the server, when you're asking to stream from some time in the past - i.e., the "DESCRIBE", "SETUP", and "PLAY" commands, and their responses. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... 
Name: WireSharp_dump.png Type: image/png Size: 196249 bytes Desc: WireSharp_dump.png URL: From finlayson at live555.com Thu Sep 12 00:06:02 2013 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 12 Sep 2013 00:06:02 -0700 Subject: [Live-devel] Presentation time when streaming video recordingfrom surveillance cameras In-Reply-To: <97D9F1117F2FF54A964F634D1D844B65322617@server-4.Unitek.local> References: <97D9F1117F2FF54A964F634D1D844B65322608@server-4.Unitek.local> <49DD9BDE-489D-407D-B6D5-478F593EDBE9@live555.com> <97D9F1117F2FF54A964F634D1D844B65322617@server-4.Unitek.local> Message-ID: <0FBCC14B-19AE-4D4A-A4D0-ABB6EFBD6AD0@live555.com> > I have attached a screen dump from WireShark showing you the RTSP protocol exchange between our client and the server (camera). No, I meant the *complete* protocol exchange, that includes both requests and responses. Your screen dump shows only the first line of the requests. The easiest way to get this is simply to pass 1 as the "verbosityLevel" parameter to RTSPClient::createNew() and then send the (text) diagnostic output. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From Vince.Li at quantatw.com Thu Sep 12 00:13:11 2013 From: Vince.Li at quantatw.com (=?big5?B?VmluY2UgTGkop/Wrwik=?=) Date: Thu, 12 Sep 2013 07:13:11 +0000 Subject: [Live-devel] Without SDP with H264VideoStreamDiscreteFramer In-Reply-To: <910005BD-4DD3-4545-B3DB-2CD238F0A0D2@live555.com> References: <1B8CE959541B0C40BEC5966000F63EC6AD98FDC5@MAILBX03.quanta.corp> <910005BD-4DD3-4545-B3DB-2CD238F0A0D2@live555.com> Message-ID: <1B8CE959541B0C40BEC5966000F63EC6AD990F73@MAILBX03.quanta.corp> Actually, I?m not sure my input sequence of NAL units contain SPS and PPS NAL units. My video input is a live stream whose frames captured from webcam and encoded by x264. However, ?H264VideoStreamFramer? can parse SDP from my input, that means my input sequence of NAL units contain SPS and PPS NAL units? Vince From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Tuesday, September 10, 2013 2:37 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Without SDP with H264VideoStreamDiscreteFramer I?ve a question about H264VideoStreamFramer and H264VideoStreamDiscreteFramer. Since my video frame source is discrete (frame by frame), I use H264VideoStreamDiscreteFrame. However, I discover the H264VideoRTPSink didn?t have SDP (dump auxSDPLine() periodically) with H264VideoStreamDiscreteFramer. Does your input sequence of NAL units (that you pass to "H264VideoStreamDiscreteFramer") contain SPS and PPS NAL units (either at the start of the sequence, or periodically within it)? If so, then the "H264VideoStreamDiscreteFramer" will automatically record these when they appear, and this will cause the proper "auxSDPLine()" to get generated. Alternatively, if you know the SPS and PPS NAL unit data in advance, you can pass them as parameters to the "H264VideoRTPSink" constructor (see "liveMedia/include/H264VideoRTPSink.hh"). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
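Two ways to resolve the missing "auxSDPLine()" discussed in this thread, sketched for an x264-based pipeline. Treat the exact H264VideoRTPSink overload as something to confirm against liveMedia/include/H264VideoRTPSink.hh in your copy of the code, and note that env, rtpGroupsock, sps/spsSize and pps/ppsSize below are placeholders from the surrounding application:

    // (1) Make x264 emit SPS/PPS in-band before every IDR frame, so that
    //     H264VideoStreamDiscreteFramer sees them and the SDP line gets generated:
    x264_param_t param;
    x264_param_default_preset(&param, "veryfast", "zerolatency");
    param.b_repeat_headers = 1;        // prepend SPS/PPS to each keyframe

    // (2) Or, if the SPS/PPS NAL units are known up front, pass them to the sink
    //     when it is created (raw NAL units, without start codes):
    RTPSink* videoSink = H264VideoRTPSink::createNew(env, rtpGroupsock, 96 /*dynamic PT*/,
                                                     sps, spsSize, pps, ppsSize);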
URL: From francisco at eyelynx.com Thu Sep 12 00:43:11 2013 From: francisco at eyelynx.com (Francisco Feijoo) Date: Thu, 12 Sep 2013 09:43:11 +0200 Subject: [Live-devel] Get one frame at an absolute time In-Reply-To: <06C15677-AB04-4B81-ADF1-232B519BA7E2@live555.com> References: <06C15677-AB04-4B81-ADF1-232B519BA7E2@live555.com> Message-ID: Hello Ross, Thanks for pointing out the SMPTE Relative Timestamps concept. Looks like a good thing to have in the future. In any case, I don't know what should I do in any of the modes (SMTPE/NPT/Absolute) when the server has sent the complete range specified in the play command from the client. In RFC2326 I see this statement: After playing the desired range, the presentation is automatically paused, as if a PAUSE request had been issued. How should I implement this in live555? Thanks in advance. 2013/9/11 Ross Finlayson > I want to be able to request one single frame at an absolute time from the > RTSP Client. Is that possible? > > > The RTSP protocol specification has an optional mechanism that allows > this: specifying a range that uses SMPTE-format times (that can address > individual frames, by number). However, the "LIVE555 Streaming Media" code > does not support this, and it is unlikely to be added in the future (at > least, not for free). > > > In the server end, my own OnDemandServerMediaSubsession implements > seekStreamSource and I pass absStart/absEnd to my own FramedSource. > > How should I implement my FramedSource doGetNextFrame so it only sends > frames until the absEnd is reached (in this case only one frame). > > > That's up to you to decide. However, if your underlying data source uses > a "ByteStreamFileSource" object, then note that the > "ByteStreamFileSource ::seekToByteAbsolute()" function has an optional > parameter "numBytesToStream". If this parameter is non-zero, then our > implementation of "ByteStreamFileSource" will automatically limit the > stream to deliver that many bytes only, before treating it as EOF. So, if > you can figure out how many bytes you want to deliver, and are using a > "ByteStreamFileSource", you can do it that way. > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -- Francisco Feijoo Software Engineer EyeLynx Limited T: +44 020 8133 9388 E: francisco at eyelynx.com W: www.eyelynx.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From MJ at unitek.dk Thu Sep 12 00:49:43 2013 From: MJ at unitek.dk (Michael S. Juul) Date: Thu, 12 Sep 2013 09:49:43 +0200 Subject: [Live-devel] Presentation time when streaming videorecordingfrom surveillance cameras In-Reply-To: <0FBCC14B-19AE-4D4A-A4D0-ABB6EFBD6AD0@live555.com> References: <97D9F1117F2FF54A964F634D1D844B65322608@server-4.Unitek.local> <49DD9BDE-489D-407D-B6D5-478F593EDBE9@live555.com> <97D9F1117F2FF54A964F634D1D844B65322617@server-4.Unitek.local> <0FBCC14B-19AE-4D4A-A4D0-ABB6EFBD6AD0@live555.com> Message-ID: <97D9F1117F2FF54A964F634D1D844B65322619@server-4.Unitek.local> Hi Ross Is this more like what you want? Packet no. 
Time Source Destination Protocol Length Info 153 12.632819000 192.168.1.43 192.168.1.71 RTSP 224 DESCRIBE rtsp://192.168.1.71/rtsp_tunnel?rec=1&line=1&inst=1&enableaudio=1 RTSP/1.0 CSeq: 3 User-Agent: LIVE555 Streaming Media v2013.02.11 Accept: application/sdp 154 12.633197000 192.168.1.71 192.168.1.43 TCP 60 rtsp > sitaramgmt [ACK] Seq=92 Ack=315 Win=1494 Len=0 155 12.633650000 192.168.1.71 192.168.1.43 TCP 60 [TCP Window Update] rtsp > sitaramgmt [ACK] Seq=92 Ack=315 Win=1664 Len=0 156 12.909117000 192.168.1.71 192.168.1.43 HTTP/XML 589 HTTP/1.1 200 OK 173 13.135672000 192.168.1.43 192.168.1.71 RTSP 254 SETUP rtsp://192.168.1.71/rtsp_tunnel?rec=1&line=1&inst=1&enableaudio=1/video RTSP/1.0 CSeq: 4 User-Agent: LIVE555 Streaming Media v2013.02.11 Transport: RTP/AVP;unicast;client_port=56128-56129 174 13.136107000 192.168.1.71 192.168.1.43 TCP 60 rtsp > sitaramgmt [ACK] Seq=457 Ack=515 Win=1464 Len=0 175 13.136666000 192.168.1.71 192.168.1.43 TCP 60 [TCP Window Update] rtsp > sitaramgmt [ACK] Seq=457 Ack=515 Win=1664 Len=0 176 13.137901000 192.168.1.71 192.168.1.43 RTSP 213 Reply: RTSP/1.0 200 OK 186 15.394023000 192.168.1.43 192.168.1.71 RTSP 257 PLAY rtsp://192.168.1.71/rtsp_tunnel?rec=1&line=1&inst=1&enableaudio=1 RTSP/1.0 CSeq: 5 User-Agent: LIVE555 Streaming Media v2013.02.11 Session: 3dd525ac760b01b Range: clock=20130830T084008.000Z- 187 15.394388000 192.168.1.71 192.168.1.43 TCP 60 rtsp > sitaramgmt [ACK] Seq=616 Ack=718 Win=1461 Len=0 188 15.394881000 192.168.1.71 192.168.1.43 TCP 60 [TCP Window Update] rtsp > sitaramgmt [ACK] Seq=616 Ack=718 Win=1664 Len=0 189 15.396583000 192.168.1.71 192.168.1.43 RTSP 108 Reply: RTSP/1.0 200 OK Fra: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] P? vegne af Ross Finlayson Sendt: 12. september 2013 09:06 Til: LIVE555 Streaming Media - development & use Emne: Re: [Live-devel] Presentation time when streaming videorecordingfrom surveillance cameras I have attached a screen dump from WireShark showing you the RTSP protocol exchange between our client and the server (camera). No, I meant the *complete* protocol exchange, that includes both requests and responses. Your screen dump shows only the first line of the requests. The easiest way to get this is simply to pass 1 as the "verbosityLevel" parameter to RTSPClient::createNew() and then send the (text) diagnostic output. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Sep 12 01:03:19 2013 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 12 Sep 2013 01:03:19 -0700 Subject: [Live-devel] Get one frame at an absolute time In-Reply-To: References: <06C15677-AB04-4B81-ADF1-232B519BA7E2@live555.com> Message-ID: <88F2FDE5-66B9-4DAF-B041-4313E3F555AF@live555.com> > In any case, I don't know what should I do in any of the modes (SMTPE/NPT/Absolute) when the server has sent the complete range specified in the play command from the client. In RFC2326 I see this statement: > After playing the desired range, the presentation is automatically paused, as if a PAUSE request had been issued. > How should I implement this in live555? Just stop sending data :-) The best way to do this is - within your "doGetNextFrame()" implementation - call "FramedSource::handleClosure(this);" instead of the usual "FramedSource::afterGetting(this);". As I noted before, this is what "ByteStreamFileSource" does. Ross Finlayson Live Networks, Inc. 
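Putting the two replies in this thread together: a custom FramedSource can keep delivering frames while they fall inside the requested absolute range and signal closure as soon as it runs past absEnd, at which point the stream is paused, matching the RFC 2326 behaviour quoted above. A sketch with hypothetical helpers (RecordedFrame, readNextFrame(), fAbsEndTime, setPresentationTimeFor()):

    void RecordedFrameSource::doGetNextFrame() {       // hypothetical FramedSource subclass
      RecordedFrame f;
      if (!readNextFrame(f) || f.recordedTime > fAbsEndTime) {
        // Past the end of the requested range (or out of data): do what
        // ByteStreamFileSource does at EOF, so that playback pauses here.
        handleClosure(this);
        return;
      }
      unsigned n = f.size < fMaxSize ? f.size : fMaxSize;
      memcpy(fTo, f.data, n);                           // <string.h> for memcpy
      fFrameSize = n;
      fNumTruncatedBytes = f.size - n;
      setPresentationTimeFor(f.recordedTime);           // keep fPresentationTime wall-clock aligned
      FramedSource::afterGetting(this);                 // completes this read
    }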
http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Sep 12 01:07:51 2013 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 12 Sep 2013 01:07:51 -0700 Subject: [Live-devel] Presentation time when streaming videorecordingfrom surveillance cameras In-Reply-To: <97D9F1117F2FF54A964F634D1D844B65322619@server-4.Unitek.local> References: <97D9F1117F2FF54A964F634D1D844B65322608@server-4.Unitek.local> <49DD9BDE-489D-407D-B6D5-478F593EDBE9@live555.com> <97D9F1117F2FF54A964F634D1D844B65322617@server-4.Unitek.local> <0FBCC14B-19AE-4D4A-A4D0-ABB6EFBD6AD0@live555.com> <97D9F1117F2FF54A964F634D1D844B65322619@server-4.Unitek.local> Message-ID: <073280E8-01CF-4F17-8761-240FC9A26B74@live555.com> > Is this more like what you want? No, because you're showing only the first line of each RTSP response! Why don't you just do what I suggested before: Pass 1 as the "verbosityLevel" parameter to RTSPClient::createNew() and then send the (text) diagnostic output? Also, your client is using an old version of the "LIVE555 Streaming Media" software. You should upgrade to the latest version! Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From MJ at unitek.dk Thu Sep 12 02:31:02 2013 From: MJ at unitek.dk (Michael S. Juul) Date: Thu, 12 Sep 2013 11:31:02 +0200 Subject: [Live-devel] Presentation time when streamingvideorecordingfrom surveillance cameras In-Reply-To: <073280E8-01CF-4F17-8761-240FC9A26B74@live555.com> References: <97D9F1117F2FF54A964F634D1D844B65322608@server-4.Unitek.local> <49DD9BDE-489D-407D-B6D5-478F593EDBE9@live555.com> <97D9F1117F2FF54A964F634D1D844B65322617@server-4.Unitek.local> <0FBCC14B-19AE-4D4A-A4D0-ABB6EFBD6AD0@live555.com> <97D9F1117F2FF54A964F634D1D844B65322619@server-4.Unitek.local> <073280E8-01CF-4F17-8761-240FC9A26B74@live555.com> Message-ID: <97D9F1117F2FF54A964F634D1D844B6532261D@server-4.Unitek.local> Hi Ross The problem is that I'm not getting any diagnostic output, even thou "verbosityLevel" is (And has always been) '1', all I have is what WireShark has captured, and I'm no expert in WireShark. Anyway, I found a way to get a textual dump from WireShark, but I don't know if it contains the formation you seek. It contains packet no. 149 through 189, where "DESCRIBE", "SETUP" and "PLAY" is packet no. 153, 173 and 186. No. Time Source Destination Protocol Length Info 149 12.629205000 192.168.1.43 192.168.1.71 RTSP 198 OPTIONS rtsp://192.168.1.71/rtsp_tunnel?rec=1&line=1&inst=1&enableaudio=1 RTSP/1.0 Frame 149: 198 bytes on wire (1584 bits), 198 bytes captured (1584 bits) on interface 0 Ethernet II, Src: AsustekC_26:f5:8b (00:23:54:26:f5:8b), Dst: VcsVideo_7d:ea:69 (00:07:5f:7d:ea:69) Internet Protocol Version 4, Src: 192.168.1.43 (192.168.1.43), Dst: 192.168.1.71 (192.168.1.71) Transmission Control Protocol, Src Port: sitaramgmt (2630), Dst Port: rtsp (554), Seq: 1, Ack: 1, Len: 144 Real Time Streaming Protocol Request: OPTIONS rtsp://192.168.1.71/rtsp_tunnel?rec=1&line=1&inst=1&enableaudio=1 RTSP/1.0 Method: OPTIONS URL: rtsp://192.168.1.71/rtsp_tunnel?rec=1&line=1&inst=1&enableaudio=1 CSeq: 2 User-Agent: LIVE555 Streaming Media v2013.02.11 No. 
Time Source Destination Protocol Length Info 150 12.630320000 192.168.1.71 192.168.1.43 TCP 60 rtsp > sitaramgmt [ACK] Seq=1 Ack=145 Win=1520 Len=0 Frame 150: 60 bytes on wire (480 bits), 60 bytes captured (480 bits) on interface 0 Ethernet II, Src: VcsVideo_7d:ea:69 (00:07:5f:7d:ea:69), Dst: AsustekC_26:f5:8b (00:23:54:26:f5:8b) Internet Protocol Version 4, Src: 192.168.1.71 (192.168.1.71), Dst: 192.168.1.43 (192.168.1.43) Transmission Control Protocol, Src Port: rtsp (554), Dst Port: sitaramgmt (2630), Seq: 1, Ack: 145, Len: 0 No. Time Source Destination Protocol Length Info 151 12.630985000 192.168.1.71 192.168.1.43 TCP 60 [TCP Window Update] rtsp > sitaramgmt [ACK] Seq=1 Ack=145 Win=1664 Len=0 Frame 151: 60 bytes on wire (480 bits), 60 bytes captured (480 bits) on interface 0 Ethernet II, Src: VcsVideo_7d:ea:69 (00:07:5f:7d:ea:69), Dst: AsustekC_26:f5:8b (00:23:54:26:f5:8b) Internet Protocol Version 4, Src: 192.168.1.71 (192.168.1.71), Dst: 192.168.1.43 (192.168.1.43) Transmission Control Protocol, Src Port: rtsp (554), Dst Port: sitaramgmt (2630), Seq: 1, Ack: 145, Len: 0 No. Time Source Destination Protocol Length Info 152 12.632673000 192.168.1.71 192.168.1.43 RTSP 145 Reply: RTSP/1.0 200 OK Frame 152: 145 bytes on wire (1160 bits), 145 bytes captured (1160 bits) on interface 0 Ethernet II, Src: VcsVideo_7d:ea:69 (00:07:5f:7d:ea:69), Dst: AsustekC_26:f5:8b (00:23:54:26:f5:8b) Internet Protocol Version 4, Src: 192.168.1.71 (192.168.1.71), Dst: 192.168.1.43 (192.168.1.43) Transmission Control Protocol, Src Port: rtsp (554), Dst Port: sitaramgmt (2630), Seq: 1, Ack: 145, Len: 91 Real Time Streaming Protocol Response: RTSP/1.0 200 OK Status: 200 CSeq: 2 Public: DESCRIBE, SETUP, TEARDOWN, PLAY, SET_PARAMETER, PAUSE No. Time Source Destination Protocol Length Info 153 12.632819000 192.168.1.43 192.168.1.71 RTSP 224 DESCRIBE rtsp://192.168.1.71/rtsp_tunnel?rec=1&line=1&inst=1&enableaudio=1 RTSP/1.0 Frame 153: 224 bytes on wire (1792 bits), 224 bytes captured (1792 bits) on interface 0 Ethernet II, Src: AsustekC_26:f5:8b (00:23:54:26:f5:8b), Dst: VcsVideo_7d:ea:69 (00:07:5f:7d:ea:69) Internet Protocol Version 4, Src: 192.168.1.43 (192.168.1.43), Dst: 192.168.1.71 (192.168.1.71) Transmission Control Protocol, Src Port: sitaramgmt (2630), Dst Port: rtsp (554), Seq: 145, Ack: 92, Len: 170 Real Time Streaming Protocol Request: DESCRIBE rtsp://192.168.1.71/rtsp_tunnel?rec=1&line=1&inst=1&enableaudio=1 RTSP/1.0 Method: DESCRIBE URL: rtsp://192.168.1.71/rtsp_tunnel?rec=1&line=1&inst=1&enableaudio=1 CSeq: 3 User-Agent: LIVE555 Streaming Media v2013.02.11 Accept: application/sdp No. Time Source Destination Protocol Length Info 154 12.633197000 192.168.1.71 192.168.1.43 TCP 60 rtsp > sitaramgmt [ACK] Seq=92 Ack=315 Win=1494 Len=0 Frame 154: 60 bytes on wire (480 bits), 60 bytes captured (480 bits) on interface 0 Ethernet II, Src: VcsVideo_7d:ea:69 (00:07:5f:7d:ea:69), Dst: AsustekC_26:f5:8b (00:23:54:26:f5:8b) Internet Protocol Version 4, Src: 192.168.1.71 (192.168.1.71), Dst: 192.168.1.43 (192.168.1.43) Transmission Control Protocol, Src Port: rtsp (554), Dst Port: sitaramgmt (2630), Seq: 92, Ack: 315, Len: 0 No. 
Time Source Destination Protocol Length Info 155 12.633650000 192.168.1.71 192.168.1.43 TCP 60 [TCP Window Update] rtsp > sitaramgmt [ACK] Seq=92 Ack=315 Win=1664 Len=0 Frame 155: 60 bytes on wire (480 bits), 60 bytes captured (480 bits) on interface 0 Ethernet II, Src: VcsVideo_7d:ea:69 (00:07:5f:7d:ea:69), Dst: AsustekC_26:f5:8b (00:23:54:26:f5:8b) Internet Protocol Version 4, Src: 192.168.1.71 (192.168.1.71), Dst: 192.168.1.43 (192.168.1.43) Transmission Control Protocol, Src Port: rtsp (554), Dst Port: sitaramgmt (2630), Seq: 92, Ack: 315, Len: 0 No. Time Source Destination Protocol Length Info 156 12.909117000 192.168.1.71 192.168.1.43 HTTP/XML 589 HTTP/1.1 200 OK Frame 156: 589 bytes on wire (4712 bits), 589 bytes captured (4712 bits) on interface 0 Ethernet II, Src: VcsVideo_7d:ea:69 (00:07:5f:7d:ea:69), Dst: AsustekC_26:f5:8b (00:23:54:26:f5:8b) Internet Protocol Version 4, Src: 192.168.1.71 (192.168.1.71), Dst: 192.168.1.43 (192.168.1.43) Transmission Control Protocol, Src Port: http (80), Dst Port: gbjd816 (2626), Seq: 11672, Ack: 9110, Len: 535 Hypertext Transfer Protocol eXtensible Markup Language No. Time Source Destination Protocol Length Info 157 12.909890000 192.168.1.43 192.168.1.71 TCP 315 [TCP segment of a reassembled PDU] Frame 157: 315 bytes on wire (2520 bits), 315 bytes captured (2520 bits) on interface 0 Ethernet II, Src: AsustekC_26:f5:8b (00:23:54:26:f5:8b), Dst: VcsVideo_7d:ea:69 (00:07:5f:7d:ea:69) Internet Protocol Version 4, Src: 192.168.1.43 (192.168.1.43), Dst: 192.168.1.71 (192.168.1.71) Transmission Control Protocol, Src Port: gbjd816 (2626), Dst Port: http (80), Seq: 9110, Ack: 12207, Len: 261 No. Time Source Destination Protocol Length Info 158 12.911130000 192.168.1.71 192.168.1.43 TCP 60 http > gbjd816 [ACK] Seq=12207 Ack=9371 Win=1403 Len=0 Frame 158: 60 bytes on wire (480 bits), 60 bytes captured (480 bits) on interface 0 Ethernet II, Src: VcsVideo_7d:ea:69 (00:07:5f:7d:ea:69), Dst: AsustekC_26:f5:8b (00:23:54:26:f5:8b) Internet Protocol Version 4, Src: 192.168.1.71 (192.168.1.71), Dst: 192.168.1.43 (192.168.1.43) Transmission Control Protocol, Src Port: http (80), Dst Port: gbjd816 (2626), Seq: 12207, Ack: 9371, Len: 0 No. Time Source Destination Protocol Length Info 159 12.914105000 192.168.1.71 192.168.1.43 TCP 75 [TCP segment of a reassembled PDU] Frame 159: 75 bytes on wire (600 bits), 75 bytes captured (600 bits) on interface 0 Ethernet II, Src: VcsVideo_7d:ea:69 (00:07:5f:7d:ea:69), Dst: AsustekC_26:f5:8b (00:23:54:26:f5:8b) Internet Protocol Version 4, Src: 192.168.1.71 (192.168.1.71), Dst: 192.168.1.43 (192.168.1.43) Transmission Control Protocol, Src Port: http (80), Dst Port: gbjd816 (2626), Seq: 12207, Ack: 9371, Len: 21 No. Time Source Destination Protocol Length Info 160 12.914308000 192.168.1.71 192.168.1.43 TCP 60 [TCP segment of a reassembled PDU] Frame 160: 60 bytes on wire (480 bits), 60 bytes captured (480 bits) on interface 0 Ethernet II, Src: VcsVideo_7d:ea:69 (00:07:5f:7d:ea:69), Dst: AsustekC_26:f5:8b (00:23:54:26:f5:8b) Internet Protocol Version 4, Src: 192.168.1.71 (192.168.1.71), Dst: 192.168.1.43 (192.168.1.43) Transmission Control Protocol, Src Port: http (80), Dst Port: gbjd816 (2626), Seq: 12228, Ack: 9371, Len: 2 No. 
Time Source Destination Protocol Length Info 161 12.914309000 192.168.1.71 192.168.1.43 HTTP 60 HTTP/1.1 100 Continue Frame 161: 60 bytes on wire (480 bits), 60 bytes captured (480 bits) on interface 0 Ethernet II, Src: VcsVideo_7d:ea:69 (00:07:5f:7d:ea:69), Dst: AsustekC_26:f5:8b (00:23:54:26:f5:8b) Internet Protocol Version 4, Src: 192.168.1.71 (192.168.1.71), Dst: 192.168.1.43 (192.168.1.43) Transmission Control Protocol, Src Port: http (80), Dst Port: gbjd816 (2626), Seq: 12230, Ack: 9371, Len: 2 [3 Reassembled TCP Segments (25 bytes): #159(21), #160(2), #161(2)] Hypertext Transfer Protocol No. Time Source Destination Protocol Length Info 162 12.914329000 192.168.1.43 192.168.1.71 TCP 54 gbjd816 > http [ACK] Seq=9371 Ack=12232 Win=64311 Len=0 Frame 162: 54 bytes on wire (432 bits), 54 bytes captured (432 bits) on interface 0 Ethernet II, Src: AsustekC_26:f5:8b (00:23:54:26:f5:8b), Dst: VcsVideo_7d:ea:69 (00:07:5f:7d:ea:69) Internet Protocol Version 4, Src: 192.168.1.43 (192.168.1.43), Dst: 192.168.1.71 (192.168.1.71) Transmission Control Protocol, Src Port: gbjd816 (2626), Dst Port: http (80), Seq: 9371, Ack: 12232, Len: 0 No. Time Source Destination Protocol Length Info 163 12.914369000 192.168.1.43 192.168.1.71 HTTP/XML 1265 POST /onvif/device_service HTTP/1.1 Frame 163: 1265 bytes on wire (10120 bits), 1265 bytes captured (10120 bits) on interface 0 Ethernet II, Src: AsustekC_26:f5:8b (00:23:54:26:f5:8b), Dst: VcsVideo_7d:ea:69 (00:07:5f:7d:ea:69) Internet Protocol Version 4, Src: 192.168.1.43 (192.168.1.43), Dst: 192.168.1.71 (192.168.1.71) Transmission Control Protocol, Src Port: gbjd816 (2626), Dst Port: http (80), Seq: 9371, Ack: 12232, Len: 1211 [2 Reassembled TCP Segments (1472 bytes): #157(261), #163(1211)] Hypertext Transfer Protocol eXtensible Markup Language No. Time Source Destination Protocol Length Info 164 12.915089000 192.168.1.71 192.168.1.43 TCP 60 http > gbjd816 [ACK] Seq=12232 Ack=10582 Win=453 Len=0 Frame 164: 60 bytes on wire (480 bits), 60 bytes captured (480 bits) on interface 0 Ethernet II, Src: VcsVideo_7d:ea:69 (00:07:5f:7d:ea:69), Dst: AsustekC_26:f5:8b (00:23:54:26:f5:8b) Internet Protocol Version 4, Src: 192.168.1.71 (192.168.1.71), Dst: 192.168.1.43 (192.168.1.43) Transmission Control Protocol, Src Port: http (80), Dst Port: gbjd816 (2626), Seq: 12232, Ack: 10582, Len: 0 No. Time Source Destination Protocol Length Info 165 12.928085000 192.168.1.71 192.168.1.43 TCP 60 [TCP Window Update] http > gbjd816 [ACK] Seq=12232 Ack=10582 Win=1664 Len=0 Frame 165: 60 bytes on wire (480 bits), 60 bytes captured (480 bits) on interface 0 Ethernet II, Src: VcsVideo_7d:ea:69 (00:07:5f:7d:ea:69), Dst: AsustekC_26:f5:8b (00:23:54:26:f5:8b) Internet Protocol Version 4, Src: 192.168.1.71 (192.168.1.71), Dst: 192.168.1.43 (192.168.1.43) Transmission Control Protocol, Src Port: http (80), Dst Port: gbjd816 (2626), Seq: 12232, Ack: 10582, Len: 0 No. Time Source Destination Protocol Length Info 166 13.071832000 192.168.1.71 192.168.1.43 TCP 1514 [TCP segment of a reassembled PDU] Frame 166: 1514 bytes on wire (12112 bits), 1514 bytes captured (12112 bits) on interface 0 Ethernet II, Src: VcsVideo_7d:ea:69 (00:07:5f:7d:ea:69), Dst: AsustekC_26:f5:8b (00:23:54:26:f5:8b) Internet Protocol Version 4, Src: 192.168.1.71 (192.168.1.71), Dst: 192.168.1.43 (192.168.1.43) Transmission Control Protocol, Src Port: http (80), Dst Port: gbjd816 (2626), Seq: 12232, Ack: 10582, Len: 1460 No. 
Time Source Destination Protocol Length Info 167 13.071834000 192.168.1.71 192.168.1.43 TCP 255 [TCP segment of a reassembled PDU] Frame 167: 255 bytes on wire (2040 bits), 255 bytes captured (2040 bits) on interface 0 Ethernet II, Src: VcsVideo_7d:ea:69 (00:07:5f:7d:ea:69), Dst: AsustekC_26:f5:8b (00:23:54:26:f5:8b) Internet Protocol Version 4, Src: 192.168.1.71 (192.168.1.71), Dst: 192.168.1.43 (192.168.1.43) Transmission Control Protocol, Src Port: http (80), Dst Port: gbjd816 (2626), Seq: 13692, Ack: 10582, Len: 201 No. Time Source Destination Protocol Length Info 168 13.071881000 192.168.1.43 192.168.1.71 TCP 54 gbjd816 > http [ACK] Seq=10582 Ack=13893 Win=64896 Len=0 Frame 168: 54 bytes on wire (432 bits), 54 bytes captured (432 bits) on interface 0 Ethernet II, Src: AsustekC_26:f5:8b (00:23:54:26:f5:8b), Dst: VcsVideo_7d:ea:69 (00:07:5f:7d:ea:69) Internet Protocol Version 4, Src: 192.168.1.43 (192.168.1.43), Dst: 192.168.1.71 (192.168.1.71) Transmission Control Protocol, Src Port: gbjd816 (2626), Dst Port: http (80), Seq: 10582, Ack: 13893, Len: 0 No. Time Source Destination Protocol Length Info 169 13.130830000 192.168.1.71 192.168.1.43 RTSP/SDP 419 Reply: RTSP/1.0 200 OK Frame 169: 419 bytes on wire (3352 bits), 419 bytes captured (3352 bits) on interface 0 Ethernet II, Src: VcsVideo_7d:ea:69 (00:07:5f:7d:ea:69), Dst: AsustekC_26:f5:8b (00:23:54:26:f5:8b) Internet Protocol Version 4, Src: 192.168.1.71 (192.168.1.71), Dst: 192.168.1.43 (192.168.1.43) Transmission Control Protocol, Src Port: rtsp (554), Dst Port: sitaramgmt (2630), Seq: 92, Ack: 315, Len: 365 Real Time Streaming Protocol Response: RTSP/1.0 200 OK Status: 200 CSeq: 3 Cache-control: no-cache Content-type: application/sdp Content-length: 260 Session Description Protocol Session Description Protocol Version (v): 0 Owner/Creator, Session Id (o): - 0 0 IN IP4 192.168.1.71 Owner Username: - Session ID: 0 Session Version: 0 Owner Network Type: IN Owner Address Type: IP4 Owner Address: 192.168.1.71 Session Name (s): LIVE VIEW Connection Information (c): IN IP4 0.0.0.0 Connection Network Type: IN Connection Address Type: IP4 Connection Address: 0.0.0.0 Time Description, active time (t): 0 0 Session Attribute (a): control:* Media Description, name and address (m): video 0 RTP/AVP 35 Media Attribute (a): rtpmap:35 H264/90000 Media Attribute (a): control:video Media Attribute (a): recvonly Media Attribute (a): x-onvif-track:VIDEO001 Media Description, name and address (m): audio 0 RTP/AVP 0 Media Attribute (a): rtpmap:0 PCMU/8000/1 Media Attribute (a): control:audio Media Attribute (a): recvonly No. Time Source Destination Protocol Length Info 170 13.135339000 192.168.1.43 224.0.0.22 IGMPv3 54 Membership Report / Join group 228.67.43.91 for any sources Frame 170: 54 bytes on wire (432 bits), 54 bytes captured (432 bits) on interface 0 Ethernet II, Src: AsustekC_26:f5:8b (00:23:54:26:f5:8b), Dst: IPv4mcast_00:00:16 (01:00:5e:00:00:16) Internet Protocol Version 4, Src: 192.168.1.43 (192.168.1.43), Dst: 224.0.0.22 (224.0.0.22) Internet Group Management Protocol No. 
Time Source Destination Protocol Length Info 171 13.135361000 192.168.1.43 228.67.43.91 UDP 53 Source port: 15947 Destination port: 15947 Frame 171: 53 bytes on wire (424 bits), 53 bytes captured (424 bits) on interface 0 Ethernet II, Src: AsustekC_26:f5:8b (00:23:54:26:f5:8b), Dst: IPv4mcast_43:2b:5b (01:00:5e:43:2b:5b) Internet Protocol Version 4, Src: 192.168.1.43 (192.168.1.43), Dst: 228.67.43.91 (228.67.43.91) User Datagram Protocol, Src Port: 15947 (15947), Dst Port: 15947 (15947) Data (11 bytes) 0000 68 6f 73 74 49 64 54 65 73 74 00 hostIdTest. No. Time Source Destination Protocol Length Info 172 13.135425000 192.168.1.43 224.0.0.22 IGMPv3 54 Membership Report / Leave group 228.67.43.91 Frame 172: 54 bytes on wire (432 bits), 54 bytes captured (432 bits) on interface 0 Ethernet II, Src: AsustekC_26:f5:8b (00:23:54:26:f5:8b), Dst: IPv4mcast_00:00:16 (01:00:5e:00:00:16) Internet Protocol Version 4, Src: 192.168.1.43 (192.168.1.43), Dst: 224.0.0.22 (224.0.0.22) Internet Group Management Protocol No. Time Source Destination Protocol Length Info 173 13.135672000 192.168.1.43 192.168.1.71 RTSP 254 SETUP rtsp://192.168.1.71/rtsp_tunnel?rec=1&line=1&inst=1&enableaudio=1/video RTSP/1.0 Frame 173: 254 bytes on wire (2032 bits), 254 bytes captured (2032 bits) on interface 0 Ethernet II, Src: AsustekC_26:f5:8b (00:23:54:26:f5:8b), Dst: VcsVideo_7d:ea:69 (00:07:5f:7d:ea:69) Internet Protocol Version 4, Src: 192.168.1.43 (192.168.1.43), Dst: 192.168.1.71 (192.168.1.71) Transmission Control Protocol, Src Port: sitaramgmt (2630), Dst Port: rtsp (554), Seq: 315, Ack: 457, Len: 200 Real Time Streaming Protocol Request: SETUP rtsp://192.168.1.71/rtsp_tunnel?rec=1&line=1&inst=1&enableaudio=1/video RTSP/1.0 Method: SETUP URL: rtsp://192.168.1.71/rtsp_tunnel?rec=1&line=1&inst=1&enableaudio=1/video CSeq: 4 User-Agent: LIVE555 Streaming Media v2013.02.11 Transport: RTP/AVP;unicast;client_port=56128-56129 No. Time Source Destination Protocol Length Info 174 13.136107000 192.168.1.71 192.168.1.43 TCP 60 rtsp > sitaramgmt [ACK] Seq=457 Ack=515 Win=1464 Len=0 Frame 174: 60 bytes on wire (480 bits), 60 bytes captured (480 bits) on interface 0 Ethernet II, Src: VcsVideo_7d:ea:69 (00:07:5f:7d:ea:69), Dst: AsustekC_26:f5:8b (00:23:54:26:f5:8b) Internet Protocol Version 4, Src: 192.168.1.71 (192.168.1.71), Dst: 192.168.1.43 (192.168.1.43) Transmission Control Protocol, Src Port: rtsp (554), Dst Port: sitaramgmt (2630), Seq: 457, Ack: 515, Len: 0 No. Time Source Destination Protocol Length Info 175 13.136666000 192.168.1.71 192.168.1.43 TCP 60 [TCP Window Update] rtsp > sitaramgmt [ACK] Seq=457 Ack=515 Win=1664 Len=0 Frame 175: 60 bytes on wire (480 bits), 60 bytes captured (480 bits) on interface 0 Ethernet II, Src: VcsVideo_7d:ea:69 (00:07:5f:7d:ea:69), Dst: AsustekC_26:f5:8b (00:23:54:26:f5:8b) Internet Protocol Version 4, Src: 192.168.1.71 (192.168.1.71), Dst: 192.168.1.43 (192.168.1.43) Transmission Control Protocol, Src Port: rtsp (554), Dst Port: sitaramgmt (2630), Seq: 457, Ack: 515, Len: 0 No. 
Time Source Destination Protocol Length Info 176 13.137901000 192.168.1.71 192.168.1.43 RTSP 213 Reply: RTSP/1.0 200 OK Frame 176: 213 bytes on wire (1704 bits), 213 bytes captured (1704 bits) on interface 0 Ethernet II, Src: VcsVideo_7d:ea:69 (00:07:5f:7d:ea:69), Dst: AsustekC_26:f5:8b (00:23:54:26:f5:8b) Internet Protocol Version 4, Src: 192.168.1.71 (192.168.1.71), Dst: 192.168.1.43 (192.168.1.43) Transmission Control Protocol, Src Port: rtsp (554), Dst Port: sitaramgmt (2630), Seq: 457, Ack: 515, Len: 159 Real Time Streaming Protocol Response: RTSP/1.0 200 OK Status: 200 CSeq: 4 Session: 3dd525ac760b01b;timeout=60 Transport: RTP/AVP/UDP;unicast;client_port=56128-56129;server_port=15344-15345;ssrc=FFFFFFFF No. Time Source Destination Protocol Length Info 177 13.138061000 192.168.1.43 192.168.1.71 RTP 46 Unknown RTP version 3 Frame 177: 46 bytes on wire (368 bits), 46 bytes captured (368 bits) on interface 0 Ethernet II, Src: AsustekC_26:f5:8b (00:23:54:26:f5:8b), Dst: VcsVideo_7d:ea:69 (00:07:5f:7d:ea:69) Internet Protocol Version 4, Src: 192.168.1.43 (192.168.1.43), Dst: 192.168.1.71 (192.168.1.71) User Datagram Protocol, Src Port: 56128 (56128), Dst Port: 15344 (15344) Real-Time Transport Protocol No. Time Source Destination Protocol Length Info 178 13.138093000 192.168.1.43 192.168.1.71 RTP 46 Unknown RTP version 3 Frame 178: 46 bytes on wire (368 bits), 46 bytes captured (368 bits) on interface 0 Ethernet II, Src: AsustekC_26:f5:8b (00:23:54:26:f5:8b), Dst: VcsVideo_7d:ea:69 (00:07:5f:7d:ea:69) Internet Protocol Version 4, Src: 192.168.1.43 (192.168.1.43), Dst: 192.168.1.71 (192.168.1.71) User Datagram Protocol, Src Port: 56128 (56128), Dst Port: 15344 (15344) Real-Time Transport Protocol No. Time Source Destination Protocol Length Info 179 13.213057000 192.168.1.71 192.168.1.43 HTTP/XML 141 HTTP/1.1 200 OK Frame 179: 141 bytes on wire (1128 bits), 141 bytes captured (1128 bits) on interface 0 Ethernet II, Src: VcsVideo_7d:ea:69 (00:07:5f:7d:ea:69), Dst: AsustekC_26:f5:8b (00:23:54:26:f5:8b) Internet Protocol Version 4, Src: 192.168.1.71 (192.168.1.71), Dst: 192.168.1.43 (192.168.1.43) Transmission Control Protocol, Src Port: http (80), Dst Port: gbjd816 (2626), Seq: 13893, Ack: 10582, Len: 87 [3 Reassembled TCP Segments (1748 bytes): #166(1460), #167(201), #179(87)] Hypertext Transfer Protocol eXtensible Markup Language No. Time Source Destination Protocol Length Info 180 13.346998000 192.168.1.43 192.168.1.71 TCP 54 sitaramgmt > rtsp [ACK] Seq=515 Ack=616 Win=64281 Len=0 Frame 180: 54 bytes on wire (432 bits), 54 bytes captured (432 bits) on interface 0 Ethernet II, Src: AsustekC_26:f5:8b (00:23:54:26:f5:8b), Dst: VcsVideo_7d:ea:69 (00:07:5f:7d:ea:69) Internet Protocol Version 4, Src: 192.168.1.43 (192.168.1.43), Dst: 192.168.1.71 (192.168.1.71) Transmission Control Protocol, Src Port: sitaramgmt (2630), Dst Port: rtsp (554), Seq: 515, Ack: 616, Len: 0 No. Time Source Destination Protocol Length Info 181 13.409360000 192.168.1.43 192.168.1.71 TCP 54 gbjd816 > http [ACK] Seq=10582 Ack=13980 Win=64809 Len=0 Frame 181: 54 bytes on wire (432 bits), 54 bytes captured (432 bits) on interface 0 Ethernet II, Src: AsustekC_26:f5:8b (00:23:54:26:f5:8b), Dst: VcsVideo_7d:ea:69 (00:07:5f:7d:ea:69) Internet Protocol Version 4, Src: 192.168.1.43 (192.168.1.43), Dst: 192.168.1.71 (192.168.1.71) Transmission Control Protocol, Src Port: gbjd816 (2626), Dst Port: http (80), Seq: 10582, Ack: 13980, Len: 0 No. 
Time Source Destination Protocol Length Info 182 13.518567000 192.168.1.43 224.0.0.22 IGMPv3 54 Membership Report / Leave group 228.67.43.91 Frame 182: 54 bytes on wire (432 bits), 54 bytes captured (432 bits) on interface 0 Ethernet II, Src: AsustekC_26:f5:8b (00:23:54:26:f5:8b), Dst: IPv4mcast_00:00:16 (01:00:5e:00:00:16) Internet Protocol Version 4, Src: 192.168.1.43 (192.168.1.43), Dst: 224.0.0.22 (224.0.0.22) Internet Group Management Protocol No. Time Source Destination Protocol Length Info 183 14.052421000 SmcNetwo_4d:91:54 Spanning-tree-(for-bridges)_00 STP 60 RST. Root = 32768/1/00:22:2d:4d:91:50 Cost = 0 Port = 0x8005 Frame 183: 60 bytes on wire (480 bits), 60 bytes captured (480 bits) on interface 0 IEEE 802.3 Ethernet Logical-Link Control Spanning Tree Protocol No. Time Source Destination Protocol Length Info 184 14.479013000 192.168.1.1 192.168.1.255 UDP 82 Source port: nim Destination port: sentinelsrm Frame 184: 82 bytes on wire (656 bits), 82 bytes captured (656 bits) on interface 0 Ethernet II, Src: AsustekC_92:94:1c (00:11:2f:92:94:1c), Dst: Broadcast (ff:ff:ff:ff:ff:ff) Internet Protocol Version 4, Src: 192.168.1.1 (192.168.1.1), Dst: 192.168.1.255 (192.168.1.255) User Datagram Protocol, Src Port: nim (1058), Dst Port: sentinelsrm (1947) Data (40 bytes) 0000 75 50 50 6f 55 64 67 4d 41 41 42 54 52 56 4a 57 uPPoUdgMAABTRVJW 0010 52 56 49 74 4e 41 42 74 52 47 39 73 62 33 4a 54 RVItNABtRG9sb3JT 0020 61 58 52 42 62 57 55 41 aXRBbWUA No. Time Source Destination Protocol Length Info 185 15.055695000 192.168.1.73 239.255.255.250 SSDP 179 M-SEARCH * HTTP/1.1 Frame 185: 179 bytes on wire (1432 bits), 179 bytes captured (1432 bits) on interface 0 Ethernet II, Src: Panasoni_9c:ad:ce (08:00:23:9c:ad:ce), Dst: IPv4mcast_7f:ff:fa (01:00:5e:7f:ff:fa) Internet Protocol Version 4, Src: 192.168.1.73 (192.168.1.73), Dst: 239.255.255.250 (239.255.255.250) User Datagram Protocol, Src Port: 65516 (65516), Dst Port: ssdp (1900) Hypertext Transfer Protocol No. Time Source Destination Protocol Length Info 186 15.394023000 192.168.1.43 192.168.1.71 RTSP 257 PLAY rtsp://192.168.1.71/rtsp_tunnel?rec=1&line=1&inst=1&enableaudio=1 RTSP/1.0 Frame 186: 257 bytes on wire (2056 bits), 257 bytes captured (2056 bits) on interface 0 Ethernet II, Src: AsustekC_26:f5:8b (00:23:54:26:f5:8b), Dst: VcsVideo_7d:ea:69 (00:07:5f:7d:ea:69) Internet Protocol Version 4, Src: 192.168.1.43 (192.168.1.43), Dst: 192.168.1.71 (192.168.1.71) Transmission Control Protocol, Src Port: sitaramgmt (2630), Dst Port: rtsp (554), Seq: 515, Ack: 616, Len: 203 Real Time Streaming Protocol Request: PLAY rtsp://192.168.1.71/rtsp_tunnel?rec=1&line=1&inst=1&enableaudio=1 RTSP/1.0 Method: PLAY URL: rtsp://192.168.1.71/rtsp_tunnel?rec=1&line=1&inst=1&enableaudio=1 CSeq: 5 User-Agent: LIVE555 Streaming Media v2013.02.11 Session: 3dd525ac760b01b Range: clock=20130830T084008.000Z- No. Time Source Destination Protocol Length Info 187 15.394388000 192.168.1.71 192.168.1.43 TCP 60 rtsp > sitaramgmt [ACK] Seq=616 Ack=718 Win=1461 Len=0 Frame 187: 60 bytes on wire (480 bits), 60 bytes captured (480 bits) on interface 0 Ethernet II, Src: VcsVideo_7d:ea:69 (00:07:5f:7d:ea:69), Dst: AsustekC_26:f5:8b (00:23:54:26:f5:8b) Internet Protocol Version 4, Src: 192.168.1.71 (192.168.1.71), Dst: 192.168.1.43 (192.168.1.43) Transmission Control Protocol, Src Port: rtsp (554), Dst Port: sitaramgmt (2630), Seq: 616, Ack: 718, Len: 0 No. 
Time Source Destination Protocol Length Info 188 15.394881000 192.168.1.71 192.168.1.43 TCP 60 [TCP Window Update] rtsp > sitaramgmt [ACK] Seq=616 Ack=718 Win=1664 Len=0 Frame 188: 60 bytes on wire (480 bits), 60 bytes captured (480 bits) on interface 0 Ethernet II, Src: VcsVideo_7d:ea:69 (00:07:5f:7d:ea:69), Dst: AsustekC_26:f5:8b (00:23:54:26:f5:8b) Internet Protocol Version 4, Src: 192.168.1.71 (192.168.1.71), Dst: 192.168.1.43 (192.168.1.43) Transmission Control Protocol, Src Port: rtsp (554), Dst Port: sitaramgmt (2630), Seq: 616, Ack: 718, Len: 0 No. Time Source Destination Protocol Length Info 189 15.396583000 192.168.1.71 192.168.1.43 RTSP 108 Reply: RTSP/1.0 200 OK Frame 189: 108 bytes on wire (864 bits), 108 bytes captured (864 bits) on interface 0 Ethernet II, Src: VcsVideo_7d:ea:69 (00:07:5f:7d:ea:69), Dst: AsustekC_26:f5:8b (00:23:54:26:f5:8b) Internet Protocol Version 4, Src: 192.168.1.71 (192.168.1.71), Dst: 192.168.1.43 (192.168.1.43) Transmission Control Protocol, Src Port: rtsp (554), Dst Port: sitaramgmt (2630), Seq: 616, Ack: 718, Len: 54 Real Time Streaming Protocol Response: RTSP/1.0 200 OK Status: 200 CSeq: 5 Session: 3dd525ac760b01b Fra: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] P? vegne af Ross Finlayson Sendt: 12. september 2013 10:08 Til: LIVE555 Streaming Media - development & use Emne: Re: [Live-devel] Presentation time when streamingvideorecordingfrom surveillance cameras Is this more like what you want? No, because you're showing only the first line of each RTSP response! Why don't you just do what I suggested before: Pass 1 as the "verbosityLevel" parameter to RTSPClient::createNew() and then send the (text) diagnostic output? Also, your client is using an old version of the "LIVE555 Streaming Media" software. You should upgrade to the latest version! Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From Ahmed.Hafsi at rohde-schwarz.com Thu Sep 12 07:49:31 2013 From: Ahmed.Hafsi at rohde-schwarz.com (Ahmed.Hafsi at rohde-schwarz.com) Date: Thu, 12 Sep 2013 16:49:31 +0200 Subject: [Live-devel] Raw IP Frames of video/audio streams + DLL Message-ID: Dear everyone, I am trying to use the live555 to make a RTSP client that forwards the raw IP frames that it gets from the RTSP server (that is the IP frames that contain the RTP data) Is it possible to access the raw IP packets from the library ? Another question: This client is going to live in a dll. Any idea on how to change the UsageEnvironment to suit the DLL ? Should I also change the TaskScheduler ? Thank you for the help ! Ahmed -------------- next part -------------- An HTML attachment was scrubbed... 
URL: 

From finlayson at live555.com Thu Sep 12 07:59:54 2013
From: finlayson at live555.com (Ross Finlayson)
Date: Thu, 12 Sep 2013 07:59:54 -0700
Subject: [Live-devel] Presentation time when streamingvideorecordingfrom surveillance cameras
In-Reply-To: <97D9F1117F2FF54A964F634D1D844B6532261D@server-4.Unitek.local>
References: <97D9F1117F2FF54A964F634D1D844B65322608@server-4.Unitek.local> <49DD9BDE-489D-407D-B6D5-478F593EDBE9@live555.com> <97D9F1117F2FF54A964F634D1D844B65322617@server-4.Unitek.local> <0FBCC14B-19AE-4D4A-A4D0-ABB6EFBD6AD0@live555.com> <97D9F1117F2FF54A964F634D1D844B65322619@server-4.Unitek.local> <073280E8-01CF-4F17-8761-240FC9A26B74@live555.com> <97D9F1117F2FF54A964F634D1D844B6532261D@server-4.Unitek.local>
Message-ID: 

> The problem is that I'm not getting any diagnostic output, even though "verbosityLevel" is (and has always been) "1"

If you run the application from a 'console' (or 'terminal') window, then the diagnostic output (to 'stderr') should appear there.

But anyway, the output that you sent told me what I need to know: The server's response to the "PLAY" command doesn't include a "RTP-Info:" header. That means that you won't be able to call "getNormalPlayTime()" to find out where you are in the stream. But you should be able to simulate this by
1/ Recording the first presentation time that you see after you get the response to "PLAY". (Ignore any presentation time that's not 'RTCP-synchronized'.)
2/ Thereafter, subtract this first presentation time from the current presentation time, then add that to the (absolute) time that you seeked to.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From finlayson at live555.com Thu Sep 12 08:04:52 2013
From: finlayson at live555.com (Ross Finlayson)
Date: Thu, 12 Sep 2013 08:04:52 -0700
Subject: [Live-devel] Raw IP Frames of video/audio streams + DLL
In-Reply-To: 
References: 
Message-ID: <2D5A2AA3-4D0B-4E4B-9A88-6E4DED0F1F95@live555.com>

> I am trying to use the live555 to make a RTSP client that forwards the raw IP frames that it gets from the RTSP server (that is the IP frames that contain the RTP data)
> Is it possible to access the raw IP packets from the library ?

No - application-level code like this doesn't have access to IP headers. To see those, you probably need to run packet-sniffing software.

> Another question:
> This client is going to live in a dll. Any idea on how to change the UsageEnvironment to suit the DLL ? Should I also change the TaskScheduler ?

No, I think most Windoze programmers who build a 'DLL' from the LIVE555 code can do so with the supplied "BasicUsageEnvironment" and "BasicTaskScheduler".

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From kingaceck at 163.com Wed Sep 11 20:21:44 2013
From: kingaceck at 163.com (kingaceck)
Date: Thu, 12 Sep 2013 11:21:44 +0800
Subject: [Live-devel] testRTSPClient that behind a NAT
Message-ID: <201309121121418825390@163.com>

Hi,
I put the live555MediaServer on a CentOS computer whose IP is 129.1.7.151. I then put testRTSPClient on another CentOS computer that is behind a NAT (a TP-LINK wireless router); its IP is 192.168.1.105. I run the command ./testRTSPClient rtsp://129.1.7.151/test.mpg, and testRTSPClient can't receive any UDP packets. The testRTSPClient log is below. If I use the Tencent QQ software (a video communication application) on both computers, it can receive UDP packets.
Can you help me to resolve the problem?Thank you very much. root at esc-laptop:/diske/live/testProgs# ./testRTSPClient rtsp://129.1.7.151/test.mpg Opening connection to 129.1.7.151, port 554... ...remote connection opened Sending request: DESCRIBE rtsp://129.1.7.151/test.mpg RTSP/1.0 CSeq: 2 User-Agent: ./testRTSPClient (LIVE555 Streaming Media v2013.09.11) Accept: application/sdp Received 638 new bytes of response data. Received a complete DESCRIBE response: RTSP/1.0 200 OK CSeq: 2 Date: Thu, Sep 12 2013 03:15:48 GMT Content-Base: rtsp://129.1.7.151/test.mpg/ Content-Type: application/sdp Content-Length: 477 v=0 o=- 1378955748063821 1 IN IP4 129.1.7.151 s=MPEG-1 or 2 Program Stream, streamed by the LIVE555 Media Server i=test.mpg t=0 0 a=tool:LIVE555 Streaming Media v2013.03.23 a=type:broadcast a=control:* a=range:npt=0-225.832 a=x-qt-text-nam:MPEG-1 or 2 Program Stream, streamed by the LIVE555 Media Server a=x-qt-text-inf:test.mpg m=video 0 RTP/AVP 32 c=IN IP4 0.0.0.0 b=AS:500 a=control:track1 m=audio 0 RTP/AVP 14 c=IN IP4 0.0.0.0 b=AS:128 a=control:track2 [URL:"rtsp://129.1.7.151/test.mpg/"]: Got a SDP description: v=0 o=- 1378955748063821 1 IN IP4 129.1.7.151 s=MPEG-1 or 2 Program Stream, streamed by the LIVE555 Media Server i=test.mpg t=0 0 a=tool:LIVE555 Streaming Media v2013.03.23 a=type:broadcast a=control:* a=range:npt=0-225.832 a=x-qt-text-nam:MPEG-1 or 2 Program Stream, streamed by the LIVE555 Media Server a=x-qt-text-inf:test.mpg m=video 0 RTP/AVP 32 c=IN IP4 0.0.0.0 b=AS:500 a=control:track1 m=audio 0 RTP/AVP 14 c=IN IP4 0.0.0.0 b=AS:128 a=control:track2 [URL:"rtsp://129.1.7.151/test.mpg/"]: Initiated the "video/MPV" subsession (client ports 33962-33963) Sending request: SETUP rtsp://129.1.7.151/test.mpg/track1 RTSP/1.0 CSeq: 3 User-Agent: ./testRTSPClient (LIVE555 Streaming Media v2013.09.11) Transport: RTP/AVP;unicast;client_port=33962-33963 Received 201 new bytes of response data. Received a complete SETUP response: RTSP/1.0 200 OK CSeq: 3 Date: Thu, Sep 12 2013 03:15:48 GMT Transport: RTP/AVP;unicast;destination=129.1.7.100;source=129.1.7.151;client_port=33962-33963;server_port=6970-6971 Session: 045D9B13 [URL:"rtsp://129.1.7.151/test.mpg/"]: Set up the "video/MPV" subsession (client ports 33962-33963) [URL:"rtsp://129.1.7.151/test.mpg/"]: Created a data sink for the "video/MPV" subsession [URL:"rtsp://129.1.7.151/test.mpg/"]: Initiated the "audio/MPA" subsession (client ports 45140-45141) Sending request: SETUP rtsp://129.1.7.151/test.mpg/track2 RTSP/1.0 CSeq: 4 User-Agent: ./testRTSPClient (LIVE555 Streaming Media v2013.09.11) Transport: RTP/AVP;unicast;client_port=45140-45141 Session: 045D9B13 Received 201 new bytes of response data. Received a complete SETUP response: RTSP/1.0 200 OK CSeq: 4 Date: Thu, Sep 12 2013 03:15:48 GMT Transport: RTP/AVP;unicast;destination=129.1.7.100;source=129.1.7.151;client_port=45140-45141;server_port=6972-6973 Session: 045D9B13 [URL:"rtsp://129.1.7.151/test.mpg/"]: Set up the "audio/MPA" subsession (client ports 45140-45141) [URL:"rtsp://129.1.7.151/test.mpg/"]: Created a data sink for the "audio/MPA" subsession Sending request: PLAY rtsp://129.1.7.151/test.mpg/ RTSP/1.0 CSeq: 5 User-Agent: ./testRTSPClient (LIVE555 Streaming Media v2013.09.11) Session: 045D9B13 Range: npt=0.000- Received 250 new bytes of response data. 
Received a complete PLAY response:
RTSP/1.0 200 OK
CSeq: 5
Date: Thu, Sep 12 2013 03:15:48 GMT
Range: npt=0.000-
Session: 045D9B13
RTP-Info: url=rtsp://129.1.7.151/test.mpg/track1;seq=52825;rtptime=3805307075,url=rtsp://129.1.7.151/test.mpg/track2;seq=39912;rtptime=1308949270

[URL:"rtsp://129.1.7.151/test.mpg/"]: Started playing session (for up to 227.832000 seconds)...

There is no log output showing any data received from the server.

2013-09-12
kingaceck
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From finlayson at live555.com Fri Sep 13 14:32:50 2013
From: finlayson at live555.com (Ross Finlayson)
Date: Fri, 13 Sep 2013 14:32:50 -0700
Subject: [Live-devel] testRTSPClient that behind a NAT
In-Reply-To: <201309121121418825390@163.com>
References: <201309121121418825390@163.com>
Message-ID: 

First, your server is running an old version of the "LIVE555 Streaming Media" software. You should upgrade it to the latest version.

In any case, it appears that your NAT box is not forwarding RTP/UDP packets reliably. I suggest that you change "testRTSPClient" to request RTP-over-TCP delivery. You can do this by changing line 229 of "testRTSPClient.cpp" to
#define REQUEST_STREAMING_OVER_TCP True

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From pagan81 at gmail.com Fri Sep 13 11:40:10 2013
From: pagan81 at gmail.com (pagan81 at gmail.com)
Date: Fri, 13 Sep 2013 22:40:10 +0400
Subject: [Live-devel] How I can transform LATM packets with using Live555 ?
Message-ID: <1585339081.20130913224010@gmail.com>

Hi live-devel.
My decoder cannot decode the audio in an MP4A_LATM stream. I use UMC::AACDecode from Intel® Integrated Performance Primitives 7.0.4.054, and this decoder does not support LATM. How can I use Live555 to transform LATM packets into packets with ADTS or ADIF headers for feeding into UMC::AACDecode? If you know of another way to transform LATM packets so that the decoder works correctly, please let me know. I would be glad of any help, and I look forward to your answer.
Thanks.

-- 
Pagan81 mailto:Pagan81 at gmail.com

From finlayson at live555.com Sat Sep 14 20:54:36 2013
From: finlayson at live555.com (Ross Finlayson)
Date: Sat, 14 Sep 2013 20:54:36 -0700
Subject: [Live-devel] How I can transform LATM packets with using Live555 ?
In-Reply-To: <1585339081.20130913224010@gmail.com>
References: <1585339081.20130913224010@gmail.com>
Message-ID: <4D8C1481-9CD6-42D6-8ECD-FADBC54F8183@live555.com>

You didn't mention RTP at all in your message. Our software supports only receiving LATM RTP packets, and sending LATM RTP packets, so it might not be what you're looking for.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From francisco at eyelynx.com Sun Sep 15 01:30:56 2013
From: francisco at eyelynx.com (Francisco Feijoo)
Date: Sun, 15 Sep 2013 10:30:56 +0200
Subject: [Live-devel] Get one frame at an absolute time
In-Reply-To: <88F2FDE5-66B9-4DAF-B041-4313E3F555AF@live555.com>
References: <06C15677-AB04-4B81-ADF1-232B519BA7E2@live555.com> <88F2FDE5-66B9-4DAF-B041-4313E3F555AF@live555.com>
Message-ID: 

Hello Ross,

handleClosure sends 'BYE' to the client, and I only want to pause it and be able to request more frames (ranges). I finally implemented what I needed using the task scheduler (scheduleDelayedTask/unscheduleDelayedTask). All is working quite well now.

Thanks for your support.
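In outline, the approach looks something like the sketch below. This is only an illustration, not the actual implementation: the class name "RangeLimitedSource" and the wrapper structure are invented here, and only scheduleDelayedTask()/unscheduleDelayedTask() and the standard "FramedSource" plumbing are assumed from the LIVE555 API. Instead of calling handleClosure() (which triggers the RTCP 'BYE'), the source simply stops delivering frames once a delayed task marks the end of the requested range; a later range request can unschedule or re-arm that task.

#include "liveMedia.hh"

class RangeLimitedSource: public FramedSource {
public:
  static RangeLimitedSource* createNew(UsageEnvironment& env, FramedSource* inputSource,
                                       double rangeDurationSeconds) {
    return new RangeLimitedSource(env, inputSource, rangeDurationSeconds);
  }

protected:
  RangeLimitedSource(UsageEnvironment& env, FramedSource* inputSource, double rangeDurationSeconds)
    : FramedSource(env), fInputSource(inputSource), fRangeEnded(False) {
    // Arrange to 'pause' delivery once the requested range has been played out:
    fRangeEndTask = envir().taskScheduler().scheduleDelayedTask(
        (int64_t)(rangeDurationSeconds * 1000000), onRangeEnd, this);
  }
  virtual ~RangeLimitedSource() {
    envir().taskScheduler().unscheduleDelayedTask(fRangeEndTask); // cancel it if still pending
    Medium::close(fInputSource);
  }

private:
  static void onRangeEnd(void* clientData) {
    ((RangeLimitedSource*)clientData)->fRangeEnded = True;
  }

  virtual void doGetNextFrame() {
    // At the end of the range, simply stop delivering frames.  We deliberately do
    // NOT call handleClosure(), so no RTCP 'BYE' is sent and the client can later
    // request another range.
    if (fRangeEnded) return;
    fInputSource->getNextFrame(fTo, fMaxSize, afterGettingFrame, this,
                               FramedSource::handleClosure, this);
  }

  static void afterGettingFrame(void* clientData, unsigned frameSize, unsigned numTruncatedBytes,
                                struct timeval presentationTime, unsigned durationInMicroseconds) {
    RangeLimitedSource* source = (RangeLimitedSource*)clientData;
    source->fFrameSize = frameSize;
    source->fNumTruncatedBytes = numTruncatedBytes;
    source->fPresentationTime = presentationTime;
    source->fDurationInMicroseconds = durationInMicroseconds;
    FramedSource::afterGetting(source);
  }

  FramedSource* fInputSource;
  TaskToken fRangeEndTask;
  Boolean fRangeEnded;
};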
2013/9/12 Ross Finlayson 

> In any case, I don't know what I should do in any of the modes
> (SMPTE/NPT/Absolute) when the server has sent the complete range
> specified in the play command from the client. In RFC 2326 I see this
> statement:
>
> After playing the desired range, the presentation is automatically paused, as if a PAUSE request had been issued.
>
> How should I implement this in live555?
>
>
> Just stop sending data :-)
>
> The best way to do this is - within your "doGetNextFrame()" implementation
> - call "FramedSource::handleClosure(this);" instead of the usual
> "FramedSource::afterGetting(this);". As I noted before, this is what
> "ByteStreamFileSource" does.
>
>
> Ross Finlayson
> Live Networks, Inc.
> http://www.live555.com/
>
>
> _______________________________________________
> live-devel mailing list
> live-devel at lists.live555.com
> http://lists.live555.com/mailman/listinfo/live-devel
>
>

-- 
Francisco Feijoo
Software Engineer
EyeLynx Limited
T: +44 020 8133 9388
E: francisco at eyelynx.com
W: www.eyelynx.com

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From dave at arthistory.cc Mon Sep 16 07:55:04 2013
From: dave at arthistory.cc (=?ISO-8859-1?Q?David_N=E4svik?=)
Date: Mon, 16 Sep 2013 16:55:04 +0200
Subject: [Live-devel] method ANNOUNCE failed: 401 Unauthorized
Message-ID: <52371BC8.4080606@arthistory.cc>

Hi,

First of all, I do know this is a development list, but I "just" want to use the pre-built binaries and cannot figure out where the server stores the user database or other config. Now when I try to stream to the server I get:

method ANNOUNCE failed: 401 Unauthorized

so I guess that I'm missing the user information. Where is that stored for the pre-built binaries? And I'm sorry if I have totally misunderstood the pre-built code; maybe those binaries aren't supposed to be used this way, which would explain why the config is missing.

BR
David

From finlayson at live555.com Mon Sep 16 08:09:30 2013
From: finlayson at live555.com (Ross Finlayson)
Date: Mon, 16 Sep 2013 08:09:30 -0700
Subject: [Live-devel] method ANNOUNCE failed: 401 Unauthorized
In-Reply-To: <52371BC8.4080606@arthistory.cc>
References: <52371BC8.4080606@arthistory.cc>
Message-ID: <8B7C8EA4-9D38-4BE5-90B3-C7F5E35800D2@live555.com>

Sorry, but our RTSP server code does not support - and will never support - the optional (and poorly thought out) RTSP "ANNOUNCE" command.

We do, however, support a server taking another RTSP stream as input by acting as a 'proxy'. I.e., note our "LIVE555 Proxy Server": . (Note however, that - unlike the "LIVE555 Media Server" - we do not have pre-built binaries available for this; you must build it yourself, from source code.) Note especially the "-R" option, which may be of interest to you.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From Yogev.Cohen at nice.com Tue Sep 17 22:27:19 2013
From: Yogev.Cohen at nice.com (Yogev Cohen)
Date: Wed, 18 Sep 2013 07:27:19 +0200
Subject: [Live-devel] live555ProxyServer.exe crash after rtcp goodbye from backend is recieved.
Message-ID: 

Hi,

I use the live555ProxyServer.exe example. My front end is VLC and my back end is an Axis camera streaming H.264. Both the front-end and back-end streams are UDP. The front end consumes the proxy stream, and after a few seconds I stop the stream from the front end. I then wait for about one minute until the back end sends an RTCP Goodbye message.
The ProxyServerMediaSubsession::subsessionByeHandler is then called, which causes FramedSource::handleClosure to get called, which in turn causes the upstream source's FramedSource::handleClosure to get called. But that FramedSource instance was already deleted when I closed the last stream from the front end, so FramedSource::fOnCloseFunc is no longer valid and the application crashes.

I know that I shouldn't modify the library code, but setting all FramedSource members to NULL in the destructor stopped the crash, because of the NULL check in FramedSource::handleClosure. But now, once I try to play the stream from VLC again, I don't get any stream.

Testing this against a Sony camera as the back end did not trigger the problem, because that camera doesn't send an RTCP Goodbye message.

Please advise how to proceed; will there be a fix in a future release? This is obviously not the right fix, because I can't restart the stream after this Goodbye message.

I'm working with the latest source (11.09.2013) and have also tested it against 13.08.2013.

Thanks,
Yogev.
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From finlayson at live555.com Tue Sep 17 23:23:59 2013
From: finlayson at live555.com (Ross Finlayson)
Date: Tue, 17 Sep 2013 23:23:59 -0700
Subject: [Live-devel] live555ProxyServer.exe crash after rtcp goodbye from backend is recieved.
In-Reply-To: 
References: 
Message-ID: <31EF6961-6ED6-4501-BECD-499D158803D1@live555.com>

> Please advise how to proceed; will there be a fix in a future release?

Yes, because you appear to have found a bug in the code.

To help me debug this, could you please re-run the "LIVE555 Proxy Server" with the "-V" (upper-case V) option, to generate diagnostic output, and please send us the diagnostic output, up to the point of the crash.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From Yogev.Cohen at nice.com Tue Sep 17 23:33:09 2013
From: Yogev.Cohen at nice.com (Yogev Cohen)
Date: Wed, 18 Sep 2013 09:33:09 +0300
Subject: [Live-devel] live555 ProxyServerMediaSession "race condition".
Message-ID: 

Hi,

I know that the application is event-based and single-threaded, but there is a problem when creating a ProxyServerMediaSession and then trying to delete it before we get the DESCRIBE response.

This is my scenario:
Create a new ProxyServerMediaSession - it in turn calls DESCRIBE on the back end.
Call RTSPServer::removeMediaSession or RTSPServer::deleteMediaSession before that DESCRIBE call completes. (The ProxySMS instance is deleted, but the back-end DESCRIBE handler is not disabled.)
Once the after-DESCRIBE handler is called, it reaches an invalid object (the deleted ProxySMS) and we crash.

I know the ProxyServerMediaSession was designed to be a black box that handles all the logic internally in a fire-and-forget manner, but the cleanup code is not consistent. What is the best way to rewrite the code so that it cleans up any open back-end connections and handlers, or protects against this scenario?

Thanks,
Yogev.
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From finlayson at live555.com Wed Sep 18 00:03:09 2013
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 18 Sep 2013 00:03:09 -0700
Subject: [Live-devel] live555 ProxyServerMediaSession "race condition".
In-Reply-To: References: Message-ID: <8C90523D-582B-4D18-BDF1-379167FD9998@live555.com> > What is the best way to rewrite the code so it will cleanup any open backend connections and handlers or protect against this scenario? That's for me to worry about; not you :-) > This is my scenario: > Create a new ProxyServerMediaSession - it in turn calls DESCRIBE on the backend. > Call RTSPServer::removeMediaSession or RTSPServer::deleteMediaSession before that DESCRIBE call completes. (the ProxySMS instance is deleted but the Backend DESCRIBE handler is not disabled). That's not supposed to happen, because the "ProxyServerMediaSession" destructor closes the associated "ProxyRTSPClient" object, which is supposed to close its TCP socket, thereby preventing any further event handling (including response handling) on that socket. So, either there's a bug in the code somewhere (that I'll need to track down), or else the scenario is not quite as you're describing it... Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Sep 18 13:11:34 2013 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 18 Sep 2013 13:11:34 -0700 Subject: [Live-devel] live555ProxyServer.exe crash after rtcp goodbye from backend is recieved. In-Reply-To: <31EF6961-6ED6-4501-BECD-499D158803D1@live555.com> References: <31EF6961-6ED6-4501-BECD-499D158803D1@live555.com> Message-ID: <530392D1-AB6B-4C8B-89C1-A95BD205331A@live555.com> >> Please advise how to proceed, will there be a fix in future release? > > Yes, because you appear to have found a bug in the code. > > To help me debug this, could you please re-run the "LIVE555 Proxy Server" with the "-V" (upper-case V) option, to generate diagnostic output, and please send us the diagnostic output, up to the point of the crash. Actually, you no longer need to do this, because I have now found (and fixed) the bug. (It turns out that the problem occurred only when proxying a H.264 video stream, from a back-end server that (1) does not handle "PAUSE", and (2) sends a RTCP "BYE" at the end of the stream. I've now installed a new version (2013.09.18) that should fix the problem. (I'll also take a look at the other problem that you reported, although that is less serious.) Thanks again for reporting this. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From CERLANDS at arinc.com Wed Sep 18 14:39:16 2013 From: CERLANDS at arinc.com (Erlandsson, Claes P (CERLANDS)) Date: Wed, 18 Sep 2013 21:39:16 +0000 Subject: [Live-devel] Seeing small handle leak when repeatedly starting and stopping event loop Message-ID: To verify if the handle leak I've noticed is in my code or in liveMedia I wrote a really simple program that just start and stops the event loop. I took the latest testRTSPClient.cpp and just replaced main() with a loop that initiates a thread which starts the event loop - StartEventLoop(). The main thread sleeps for 2s and then sets eventLoopWatchVariable making the thread exit. No streams are started, it's just the event loop going up and down. Code attached (testLiveMediaUpDown.cpp). I compile it using VC++; threading probably needs to be adjusted if using other compiler/OS. If I run this and monitor handles used I can see that the handles are incremented by one each time the event loop runs. This can e.g. 
be done using SysInternal's "Process Explorer" (handle column not on by default). Looking at the handle type I can also see it's the "\Device\Afd" handles that are increasing. I believe "Afd" stands for Auxiliary Function Driver and that it's related to sockets, but "Afd" is new to me as I've have never come across it before. Any idea where that handle leak originates? Could it be Windows specific? /Claes From: Erlandsson, Claes P (CERLANDS) Sent: Friday, July 19, 2013 11:20 AM To: 'live-devel at lists.live555.com' Subject: Seeing small handle leak when repeatedly starting and stopping event loop When I start and stop the liveMedia event loop I see a small handle leak in my Windows test client. When looking in Process Explorer I can see the amount of file handles with the name "\Device\Afd" growing. I don't have to stream anything for it to occur, i.e. I just start the event loop and then bring it down by setting the flag to 1. It is very possible it's due to some needed cleanup I'm missing. After leaving doEventLoop I delete my trigger(s), call env->reclaim() and delete the TaskScheduler. Is there anything else that should be done? /Claes -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: testLiveMediaUpDown.cpp URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: smime.p7s Type: application/x-pkcs7-signature Size: 5739 bytes Desc: not available URL: From finlayson at live555.com Wed Sep 18 15:23:58 2013 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 18 Sep 2013 15:23:58 -0700 Subject: [Live-devel] Seeing small handle leak when repeatedly starting and stopping event loop In-Reply-To: References: Message-ID: > Any idea where that handle leak originates? No. > Could it be Windows specific? I suppose that's possible... Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Sep 18 15:53:13 2013 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 18 Sep 2013 15:53:13 -0700 Subject: [Live-devel] live555 ProxyServerMediaSession "race condition". In-Reply-To: <8C90523D-582B-4D18-BDF1-379167FD9998@live555.com> References: <8C90523D-582B-4D18-BDF1-379167FD9998@live555.com> Message-ID: >> This is my scenario: >> Create a new ProxyServerMediaSession - it in turn calls DESCRIBE on the backend. >> Call RTSPServer::removeMediaSession or RTSPServer::deleteMediaSession before that DESCRIBE call completes. (the ProxySMS instance is deleted but the Backend DESCRIBE handler is not disabled). > > That's not supposed to happen, because the "ProxyServerMediaSession" destructor closes the associated "ProxyRTSPClient" object, which is supposed to close its TCP socket, thereby preventing any further event handling (including response handling) on that socket. I checked this scenario just now, and I was correct: The "ProxyRTSPClient" object - and its TCP socket - gets closed, so the response to the RTSP "DESCRIBE" command will never get handled. So, I can't see how the situation that you describe can be really happening. (If you are actually seeing a crash, then it appears to be for some reason other than what you're describing.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
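(For reference, the create/tear-down sequence being discussed looks roughly like the sketch below. It is only an illustration: the back-end URL, stream name and port are placeholders, and the ProxyServerMediaSession::createNew() argument list is assumed to match the one used by "live555ProxyServer.cpp" in this era - check the current headers before relying on it.)

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"

int main() {
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  RTSPServer* rtspServer = RTSPServer::createNew(*env, 554); // may need 8554 without privileges
  if (rtspServer == NULL) return 1;

  // Create the proxy session; it immediately starts a back-end DESCRIBE:
  ServerMediaSession* sms = ProxyServerMediaSession::createNew(
      *env, rtspServer,
      "rtsp://192.168.1.71/rtsp_tunnel", // placeholder back-end URL
      "proxyStream",                     // stream name served to front ends
      NULL, NULL,                        // username, password
      0,                                 // tunnelOverHTTPPortNum
      1);                                // verbosityLevel
  rtspServer->addServerMediaSession(sms);

  // ... later, removing the session (even while the back-end DESCRIBE is still
  // pending) is expected to close the associated ProxyRTSPClient and its TCP
  // socket, so the stale response handler never runs:
  rtspServer->removeServerMediaSession(sms);

  env->taskScheduler().doEventLoop(); // never returns in this sketch
  return 0;
}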
URL: From warren at etr-usa.com Wed Sep 18 18:07:33 2013 From: warren at etr-usa.com (Warren Young) Date: Wed, 18 Sep 2013 19:07:33 -0600 Subject: [Live-devel] Get MPEG-TS timing from .tsx file? Message-ID: <523A4E55.8010804@etr-usa.com> Currently live555MediaServer requires CBR or near-CBR files when streaming from MPEG-TS. The explanation I've read on the list is that this is because it makes determining the pacing of frame delivery easier. As I understand it, the server has an internal model of the video bit rate, and it can get behind if its model differs greatly from the reality. The problem, of course, is that VBR is increasingly the rule, and CBR the exception. VBR better models the nature of video, and newer encoding technologies are only increasing the variability. H.264 tends to have longer GOPs and a higher ratio of B pictures than MPEG-2, for example. For example, I have here a 1.4 Mbit/s VBR H.264 + AAC TS file that must be padded out to 3.7 Mbit/s to get it to stream smoothly because it has peaks to 4.2 Mbit/s. That's 164% overhead. My question is, couldn't the server's model of the video file's bit rate be greatly improved by looking at the index file? That implicitly tells you the video's bit rate at every point. (Ross, I will be sending you a link to this 1.4 Mbit/s file off-list.) From warren at etr-usa.com Wed Sep 18 18:08:51 2013 From: warren at etr-usa.com (Warren Young) Date: Wed, 18 Sep 2013 19:08:51 -0600 Subject: [Live-devel] mediaServer MP4 support Message-ID: <523A4EA3.7050803@etr-usa.com> Posts to the mailing list tell me there is some unspecified problem with the MP4/QuickTime file format related to A/V synchronization, and this is given as the reason there is no MP4FileServerMediaSubsession class in the library yet. (Or equivalently, an MP4FileServerDemux class, so you could handle MP4 the same as VOB is currently handled.) It's become increasingly popular with the success of Apple and Android, and it's an ISO standard besides. (MPEG-4 part 14, ISO 14496-14.) This means there's a lot of software that can produce these files. It's popping up in some very surprising places, like PowerPoint 2013. When Microsoft starts releasing their grip on their own proprietary formats in favor of open ones, you know there's a big shift under way. Remuxing an MP4 file into MPEG-TS is only a partial solution, for reasons given in my previous post. What would it take to get MP4 support into live555MediaServer in the open source distribution? I expect there's enough information in the existing liveMedia/QuickTime*.cpp files that I could construct the missing classes, but we've been trying to use the media server without changes. We're not particularly interested in having a proprietary version just to get such features. From finlayson at live555.com Thu Sep 19 15:13:13 2013 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 19 Sep 2013 15:13:13 -0700 Subject: [Live-devel] Get MPEG-TS timing from .tsx file? In-Reply-To: <523A4E55.8010804@etr-usa.com> References: <523A4E55.8010804@etr-usa.com> Message-ID: I took a look at this idea back in January 2011, and decided that it probably wouldn't be worth the trouble: http://lists.live555.com/pipermail/live-devel/2011-January/013083.html Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Thu Sep 19 15:17:54 2013 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 19 Sep 2013 15:17:54 -0700 Subject: [Live-devel] mediaServer MP4 support In-Reply-To: <523A4EA3.7050803@etr-usa.com> References: <523A4EA3.7050803@etr-usa.com> Message-ID: I'm not planning on supporting streaming from 'MP4'-format files anytime soon, because: 1/ It'd be a lot of work. (Live Networks, Inc. is not a charity.) 2/ The MP4 file format is patent-encumbered, and unfortunately this company is based in a country (USA) in which software patents apply. 3/ We already support streaming from 'Matroska'-format files (a file format that is not patent encimbered). So, if you can, I suggest converting to Matroska files, and streaming those. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From warren at etr-usa.com Thu Sep 19 18:45:31 2013 From: warren at etr-usa.com (Warren Young) Date: Thu, 19 Sep 2013 19:45:31 -0600 Subject: [Live-devel] mediaServer MP4 support In-Reply-To: References: <523A4EA3.7050803@etr-usa.com> Message-ID: <523BA8BB.9040701@etr-usa.com> On 9/19/2013 16:17, Ross Finlayson wrote: > 1/ It'd be a lot of work. We thought it would be easy. Since it isn't, never mind. > (Live Networks, Inc. is not a charity.) You're always welcome to send us a work proposal. We may or may not accept it, but we will consider it. > 3/ We already support streaming from 'Matroska'-format files (a file > format that is not patent encimbered). Doesn't MKV have the same stream rate prediction problems as TS? MP4 and Matroska will also be a step backwards in terms of trick play support, wouldn't they? We're giving up on MP4 as a bad idea. From warren at etr-usa.com Thu Sep 19 19:16:36 2013 From: warren at etr-usa.com (Warren Young) Date: Thu, 19 Sep 2013 20:16:36 -0600 Subject: [Live-devel] Get MPEG-TS timing from .tsx file? In-Reply-To: References: <523A4E55.8010804@etr-usa.com> Message-ID: <523BB004.1060206@etr-usa.com> On 9/19/2013 16:13, Ross Finlayson wrote: > I took a look at this idea back in January 2011, and decided that it > probably wouldn't be worth the trouble: > http://lists.live555.com/pipermail/live-devel/2011-January/013083.html Okay, let's go back through it point by point: 1a. Index file change. Agreed, it's a problem. Too bad there isn't a header record with a version number, so you could cope with file format changes. You could add one, the same size as one of the existing records, but with magic bytes you could recognize. Then you calculate number of records - 1 wherever you currently divide record size into file size. 1b. Index file size increase. Can't you calculate "duration per transport packet" from the known locations of the frames and the frame rate? If a given TS packet is one of 100 representing a P frame and the frame rate is 29.97 fps, then the duration for this transport packet is 1 / 29.97 / 100 seconds. 2. File I/O overhead. Actually, this feature would *save* on disk I/O, because it saves you from having to null stuff VBR files out to CBR. Taking my 164% null stuffing overhead example, the net savings in I/O is around 2.5x. The only way it isn't a net savings is if you're trying to use this feature with a VBR file that's so close to CBR already that the index file is bigger than the required null stuffing, which probably isn't actually required to stream it smoothly in the first place. Simple fix: make it default, but optional. 
The only people needing to turn the feature off will be those who are within 10% of disk I/O saturation already. 3. Still requires fudge factors. Isn't this the core of the streaming problem? Isn't it why you have things like VBV in MPEG-2? No solution solves the entirety of this problem by itself, short of giving up on real-time streaming. (e.g. HLS, which isn't "streaming", to my way of thinking. It's just a bunch of chained downloads.) 4. Increases network burstiness, hence may affect congestion. Which contributes more to network congestion, a 3.7 Mbit/s CBR stream or a 1.4 Mbit/s VBR stream? In any case, there's an existence proof countering this argument: YouTube, Hulu, Netflix... They're all VBR. From finlayson at live555.com Thu Sep 19 22:42:15 2013 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 19 Sep 2013 22:42:15 -0700 Subject: [Live-devel] mediaServer MP4 support In-Reply-To: <523BA8BB.9040701@etr-usa.com> References: <523A4EA3.7050803@etr-usa.com> <523BA8BB.9040701@etr-usa.com> Message-ID: <46980F5D-554E-4BE7-94D1-FDC204C77F06@live555.com> > Doesn't MKV have the same stream rate prediction problems as TS? No, because it (unlike MP4, FYI) stores timestamps along with each frame. But in any case - as I'll explain in my next email - the "stream rate prediction" issue turns out to be a 'red herring'. > MP4 and Matroska will also be a step backwards in terms of trick play support, wouldn't they? I don't know about the MP4 format, but seeking within a Matroska file is trivial, and our server already supports it. Fast-forward and reverse play might also be possible to implement when streaming from Matroska files (perhaps only for some codecs), but we don't currently implement that. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Sep 19 23:42:27 2013 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 19 Sep 2013 23:42:27 -0700 Subject: [Live-devel] Get MPEG-TS timing from .tsx file? In-Reply-To: <523BB004.1060206@etr-usa.com> References: <523A4E55.8010804@etr-usa.com> <523BB004.1060206@etr-usa.com> Message-ID: <16552BB0-2788-42D1-BFC6-83AF87241989@live555.com> > In any case, there's an existence proof countering this argument: YouTube, Hulu, Netflix... They're all VBR. They're also all TCP. But we stream via UDP. Apples and oranges. (Things are a lot easier if you're using a request-response protocol (like TCP), and therefore don't mind waiting arbitrarily long for data to get delivered.) Anyway, what I didn't make sufficiently clear in my January 2011 email was that when I ran my experiment, altering our server code (along with the index file format) to stream TS files using precise (rather than predicted) inter-Transport-Packet durations, it HAD NO EFFECT on the way that VLC (as the RTSP media player client) displayed a VBR TS stream. I.e., VLC played the stream in the same 'jittery' fashion as before. It wasn't any better. It might have even been a bit worse. It therefore became clear to me that I had been wrong about the 'accurate inter-packet duration calculation' issue being the cause of the problem. That turned out to be a 'red herring'. The real problem was the bursty nature of the VBR stream itself. 
If you want to transmit a highly VBR stream using a datagram protocol (like RTP), then you need to make sure that your client has a lot of buffering, and with a high 'low water mark' (much higher than the 100ms (I think) default that VLC uses). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From warren at etr-usa.com Fri Sep 20 06:57:04 2013 From: warren at etr-usa.com (Warren Young) Date: Fri, 20 Sep 2013 07:57:04 -0600 Subject: [Live-devel] Get MPEG-TS timing from .tsx file? In-Reply-To: <16552BB0-2788-42D1-BFC6-83AF87241989@live555.com> References: <523A4E55.8010804@etr-usa.com> <523BB004.1060206@etr-usa.com> <16552BB0-2788-42D1-BFC6-83AF87241989@live555.com> Message-ID: <523C5430.4020706@etr-usa.com> On 9/20/2013 00:42, Ross Finlayson wrote: > > Anyway, what I didn't make sufficiently clear in my January 2011 email > was that when I ran my experiment, altering our server code (along with > the index file format) to stream TS files using precise (rather than > predicted) inter-Transport-Packet durations, it HAD NO EFFECT on the way > that VLC (as the RTSP media player client) displayed a VBR TS stream. Do you still have that code sitting around? Even if it's in patch form that no longer applies cleanly, I'd be willing to take a crack at re-applying it and playing with it. It may be that it's close to working, and could be fixed. For example, your January 2011 post talks about using a weighted [moving?] average. It may be that switching to a PID controller would give better results: https://en.wikipedia.org/wiki/PID_controller Control theory is a surprisingly subtle subject. From finlayson at live555.com Fri Sep 20 10:13:19 2013 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 20 Sep 2013 10:13:19 -0700 Subject: [Live-devel] Get MPEG-TS timing from .tsx file? In-Reply-To: <523C5430.4020706@etr-usa.com> References: <523A4E55.8010804@etr-usa.com> <523BB004.1060206@etr-usa.com> <16552BB0-2788-42D1-BFC6-83AF87241989@live555.com> <523C5430.4020706@etr-usa.com> Message-ID: <13C47E85-C499-45AD-B6A6-20FC2A8719D0@live555.com> > For example, your January 2011 post talks about using a weighted [moving?] average. It may be that switching to a PID controller would give better results: > > https://en.wikipedia.org/wiki/PID_controller > > Control theory is a surprisingly subtle subject. Yes, and feel free to play around with our existing code ("MPEG2TransportStreamFramer.cpp") that computes the weighted moving average. (Note in particular the 4 'fudge factor' constants near the beginning of the code; those can be modified, if necessary, from the compile line, without modifying the code) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From mike at criticalmatter.com Fri Sep 20 11:18:31 2013 From: mike at criticalmatter.com (Mike McNamara) Date: Fri, 20 Sep 2013 12:18:31 -0600 Subject: [Live-devel] NAL units sharing presentation time Message-ID: All, I'm working on a project where I need to consume H.264 data streamed from a webcam via RTSP. This is remarkably easy with Live555! Many thanks to Ross and everyone else. I'm currently confused by a particular sequence of NAL units that I see in the stream over and over and was hoping that someone more knowledgeable than myself might have some insight for me. Here's some output from a DummySink: video/H264: Received 10547 bytes. Presentation time: 1379699108.138589! 
audio/PCMU: Received 1024 bytes. Presentation time: 1379699108.189569! video/H264: Received 10962 bytes. Presentation time: 1379699108.178622! video/H264: Received 8651 bytes. Presentation time: 1379699108.218655! video/H264: Received 10621 bytes. Presentation time: 1379699108.258688! video/H264: Received 45 bytes. Presentation time: 1379699108.258688! video/H264: Received 4 bytes. Presentation time: 1379699108.258688! video/H264: Received 79400 bytes. Presentation time: 1379699108.298721! audio/PCMU: Received 1024 bytes. Presentation time: 1379699108.317569! ... video/H264: Received 10486 bytes. Presentation time: 1379699109.139414! audio/PCMU: Received 1024 bytes. Presentation time: 1379699109.213569! video/H264: Received 11426 bytes. Presentation time: 1379699109.179447! video/H264: Received 8681 bytes. Presentation time: 1379699109.219480! video/H264: Received 10878 bytes. Presentation time: 1379699109.259513! video/H264: Received 45 bytes. Presentation time: 1379699109.259513! video/H264: Received 4 bytes. Presentation time: 1379699109.259513! video/H264: Received 82291 bytes. Presentation time: 1379699109.299546! audio/PCMU: Received 1024 bytes. Presentation time: 1379699109.341569! etc.... What are the 45 and 4 byte NAL units that share the same presentation time as the preceding H.264 NAL unit? The payload of these 45 and 4 byte units is identical in every case: <674d0029 9a6280f0 044fcb35 01010140 0000fa40 003a983a 1800bb80 002ee06e f2e34300 17700005 dc0dde5c 28> and: <68ee3c80> respectively. Neither appears to be a start code or delimiter? I'm definitely outside of my sphere of knowledge, so any advice would be appreciated. Thanks! Mike -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Sep 20 12:26:51 2013 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 20 Sep 2013 12:26:51 -0700 Subject: [Live-devel] NAL units sharing presentation time In-Reply-To: References: Message-ID: <62F3D76B-88C4-46F8-971B-B0C0E1A7A300@live555.com> > What are the 45 and 4 byte NAL units that share the same presentation time as the preceding H.264 NAL unit? They're SPS and PPS NAL units. They aren't frame data, but instead are configuration parameters - describing properties of the stream - that are useful to the decoder. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From tayeb.dotnet at gmail.com Fri Sep 20 03:39:50 2013 From: tayeb.dotnet at gmail.com (Tayeb Meftah) Date: Fri, 20 Sep 2013 11:39:50 +0100 Subject: [Live-devel] HTTP to RTSP Unicast Proxy Message-ID: hello guys, it's pocible to use Live555 RTSP Proxy to proxy from HTTP Unicast to RTSP (UNICAST)? thank Tayeb Meftah Voice of the blind T Broadcast Freedom http://www.vobradio.org Phone:447559762242 -------------- next part -------------- An HTML attachment was scrubbed... URL: From joao_dealmeida at hotmail.com Fri Sep 20 06:49:43 2013 From: joao_dealmeida at hotmail.com (Joao Almeida) Date: Fri, 20 Sep 2013 13:49:43 +0000 Subject: [Live-devel] mediaServer MP4 support In-Reply-To: References: <523A4EA3.7050803@etr-usa.com>, Message-ID: >From Matroska developpers: "Live streaming is the equivalent of TV broadcasting on the internet. There are 2 families of servers for that. The RTP/RTSP ones and the HTTP servers. Matroska is not meant to be used over RTP. RTP already has timing and channel mechanisms that would wasted if doubled in Matroska." 
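A side note on the SPS/PPS question above: Ross's identification of the 45- and 4-byte units can be checked directly from the payload bytes quoted in that message. An H.264 NAL unit's type is the low five bits of its first byte, so 0x67 gives type 7 (a sequence parameter set) and 0x68 gives type 8 (a picture parameter set) - exactly the "configuration parameters" described in the reply. A minimal standalone check, plain C++ and independent of live555:

#include <cstdint>
#include <cstdio>

// nal_unit_type is the low 5 bits of the first NAL-unit byte; the other bits
// are forbidden_zero_bit (1 bit) and nal_ref_idc (2 bits).
static unsigned nalUnitType(uint8_t firstByte) {
  return firstByte & 0x1F;
}

int main() {
  // First bytes of the 45-byte and 4-byte units shown above:
  std::printf("0x67 -> type %u (7 = SPS)\n", nalUnitType(0x67));
  std::printf("0x68 -> type %u (8 = PPS)\n", nalUnitType(0x68));
  return 0;
}

A decoder generally needs to have seen these two parameter sets before it can decode the slice NAL units that follow, which is why the camera resends them periodically, stamped with the same presentation time as the neighbouring frame data.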
From: finlayson at live555.com Date: Thu, 19 Sep 2013 15:17:54 -0700 To: live-devel at ns.live555.com Subject: Re: [Live-devel] mediaServer MP4 support I'm not planning on supporting streaming from 'MP4'-format files anytime soon, because: 1/ It'd be a lot of work. (Live Networks, Inc. is not a charity.) 2/ The MP4 file format is patent-encumbered, and unfortunately this company is based in a country (USA) in which software patents apply. 3/ We already support streaming from 'Matroska'-format files (a file format that is not patent encimbered). So, if you can, I suggest converting to Matroska files, and streaming those. Ross Finlayson Live Networks, Inc. http://www.live555.com/ _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Sep 20 23:56:00 2013 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 20 Sep 2013 23:56:00 -0700 Subject: [Live-devel] HTTP to RTSP Unicast Proxy In-Reply-To: References: Message-ID: <32B94D9D-9D8A-4D10-8B97-BDDB4980DEF5@live555.com> > it's pocible to use Live555 RTSP Proxy to proxy from HTTP Unicast to RTSP (UNICAST)? No, because the "LIVE555 Proxy Server" code assumes that the 'back-end' stream is accessed via RTSP. However, it should be straightforward to write a RTSP server (a regular RTSP server; not a proxy server) that takes an open HTTP stream as input (provided, of course, that the data is in a format that our server knows how to stream). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sat Sep 21 00:08:34 2013 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 21 Sep 2013 00:08:34 -0700 Subject: [Live-devel] mediaServer MP4 support In-Reply-To: References: <523A4EA3.7050803@etr-usa.com>, Message-ID: <9223DDCF-25E4-49C4-BD89-B09743456705@live555.com> > From Matroska developpers: > > "Live streaming is the equivalent of TV broadcasting on the internet. There are 2 families of servers for that. The RTP/RTSP ones and the HTTP servers. Matroska is not meant to be used over RTP. RTP already has timing and channel mechanisms that would wasted if doubled in Matroska." They're talking about streaming a Matroska *file*, which obviously (as they note) doesn't make sense to do via RTP. (In particular, there's no RTP payload format defined that would make this possible.) This would also be the case for streaming MP4 files, BTW. What we implement, however, is streaming *from* a Matroska file (rather than streaming the file itself). I.e., we demultiplex each track of the file (audio, video, text (subtitles)), and stream each demultiplexed track using an appropriate RTP payload format, and with appropriate timestamps so that a receiving client can resynchronize them. (This also makes it possible for the server (or client) to stream only some tracks of the file, if desired.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
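To make the "streaming *from* a Matroska file" point concrete: the server-side setup follows the Matroska section of the testOnDemandRTSPServer demo quite closely. The sketch below is a trimmed-down version of that pattern; the static variable names, stream name, file name and port are placeholders, and the exact signatures should be checked against the headers of whatever LIVE555 release you are using:

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"

static MatroskaFileServerDemux* matroskaDemux = NULL;
static char demuxCreated = 0;

// Called (asynchronously) once the .mkv file's headers have been parsed:
static void onMatroskaDemuxCreation(MatroskaFileServerDemux* newDemux, void* /*clientData*/) {
  matroskaDemux = newDemux;
  demuxCreated = 1;
}

int main() {
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  RTSPServer* rtspServer = RTSPServer::createNew(*env, 8554, NULL);
  if (rtspServer == NULL) { *env << "Failed to create RTSP server\n"; return 1; }

  ServerMediaSession* sms
    = ServerMediaSession::createNew(*env, "mkvTest", "mkvTest",
                                    "Session streamed from a Matroska file");

  // Parse the file; run the event loop until the demux object has been created:
  MatroskaFileServerDemux::createNew(*env, "test.mkv", onMatroskaDemuxCreation, NULL);
  env->taskScheduler().doEventLoop(&demuxCreated);

  // One ServerMediaSubsession per demultiplexed track (video, audio, subtitles):
  ServerMediaSubsession* smss;
  while ((smss = matroskaDemux->newServerMediaSubsession()) != NULL) {
    sms->addSubsession(smss);
  }
  rtspServer->addServerMediaSession(sms);

  char* url = rtspServer->rtspURL(sms);
  *env << "Play this stream using the URL: " << url << "\n";
  delete[] url;

  env->taskScheduler().doEventLoop(); // does not return
  return 0;
}

If the source material is currently in MP4 form, a stream-copy remux into .mkv (for example with ffmpeg or mkvmerge) is usually enough to make it servable this way, provided the tracks use codecs that the server knows how to stream.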
URL: From joao_dealmeida at hotmail.com Sat Sep 21 06:04:39 2013 From: joao_dealmeida at hotmail.com (Joao Almeida) Date: Sat, 21 Sep 2013 13:04:39 +0000 Subject: [Live-devel] mediaServer MP4 support In-Reply-To: <9223DDCF-25E4-49C4-BD89-B09743456705@live555.com> References: <523A4EA3.7050803@etr-usa.com>, , , , <9223DDCF-25E4-49C4-BD89-B09743456705@live555.com> Message-ID: >From GPAC developpers: " In short, MP4Box can be used: - for attaching metadata to individual streams or to the whole ISO file to produce MPEG-21 compliant or hybrid MPEG-4/MPEG-21 files - and packaging and tagging the result for streaming, download and playback on different devices (e.g. phones, PDA) or for different software (e.g. iTunes). To prepare for RTP, the following instruction will create RTP hint tracks for the file. This enables classic streaming servers like DarwinStreamingServer or QuickTime Streaming Server to deliver the file through RTSP/RTP: MP4Box -hint file.mp4 " I use their tool in my projects (a modified version). They're talking about streaming the MP4 *file*, or im wrong? From: finlayson at live555.com Date: Sat, 21 Sep 2013 00:08:34 -0700 To: live-devel at ns.live555.com Subject: Re: [Live-devel] mediaServer MP4 support >From Matroska developpers: "Live streaming is the equivalent of TV broadcasting on the internet. There are 2 families of servers for that. The RTP/RTSP ones and the HTTP servers. Matroska is not meant to be used over RTP. RTP already has timing and channel mechanisms that would wasted if doubled in Matroska." They're talking about streaming a Matroska *file*, which obviously (as they note) doesn't make sense to do via RTP. (In particular, there's no RTP payload format defined that would make this possible.) This would also be the case for streaming MP4 files, BTW. What we implement, however, is streaming *from* a Matroska file (rather than streaming the file itself). I.e., we demultiplex each track of the file (audio, video, text (subtitles)), and stream each demultiplexed track using an appropriate RTP payload format, and with appropriate timestamps so that a receiving client can resynchronize them. (This also makes it possible for the server (or client) to stream only some tracks of the file, if desired.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sat Sep 21 08:27:19 2013 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 21 Sep 2013 08:27:19 -0700 Subject: [Live-devel] mediaServer MP4 support In-Reply-To: References: <523A4EA3.7050803@etr-usa.com>, , , , <9223DDCF-25E4-49C4-BD89-B09743456705@live555.com> Message-ID: <1645424B-3710-4E7D-A5F1-B280383F9296@live555.com> > I use their tool in my projects (a modified version). They're talking about streaming the MP4 *file*, or im wrong? Yes, you're wrong. They're not streaming 'the file'; they're streaming data that's extracted from the file. (The 'hint track' hack (which, BTW, is one of the parts of the MP4 format that's patented) is just a way to optimize this mechanism.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: 

From gordu at dvr2010.com Sat Sep 21 13:49:07 2013
From: gordu at dvr2010.com (Gord Umphrey)
Date: Sat, 21 Sep 2013 16:49:07 -0400
Subject: [Live-devel] HTTP Tunneling
Message-ID: 

Hi Ross;

We have tried both the openRTSP and testRTSPClient samples. We are tunnelling over HTTP. Both samples work - we receive video and can display it. However the problem is that in both samples, we have listening UDP sockets on the client side. The whole idea of tunnelling over HTTP is to avoid firewall and port forwarding issues. If we are tunnelling over HTTP the client should not have any listening sockets - it should only have a single outgoing socket. Is there a way to correct this?

Thank you, Gord.

-------------- next part -------------- An HTML attachment was scrubbed... URL: 

From finlayson at live555.com Sat Sep 21 15:31:19 2013
From: finlayson at live555.com (Ross Finlayson)
Date: Sat, 21 Sep 2013 15:31:19 -0700
Subject: [Live-devel] HTTP Tunneling
In-Reply-To: References: Message-ID: 

> However the problem is that in both samples, we have listening UDP sockets on the client side.

Don't worry. Those (2) sockets are created, but if you're receiving the RTP/RTCP packets over TCP (either over the RTSP/TCP connection, or over a RTSP/HTTP/TCP connection) then those sockets aren't actually listened to or used. So you can just ignore them.

Ross Finlayson Live Networks, Inc. http://www.live555.com/

-------------- next part -------------- An HTML attachment was scrubbed... URL: 

From david.cassany at i2cat.net Wed Sep 25 08:01:18 2013
From: david.cassany at i2cat.net (David Cassany Viladomat)
Date: Wed, 25 Sep 2013 17:01:18 +0200
Subject: [Live-devel] RTSP Server TCP Negotiation only
Message-ID: 

Hi all,

I am working on a project where I will have to implement an RTSP server on top of an already existing application which is already capable of transmitting RTP streams. I have been having a look at the live555 source code for some time now in order to find out how I could implement an RTSP server using live555 while using my own UDP/RTP sender functions at the same time.

The first approach I found is to use the testOnDemandRTSPServer example with the MPEG2TransportStreamUDPServerMediaSubsession class in order to resend an already existing RTP loopback stream. The problem here is that all data is parsed and resent; this is a workaround, but it is not what I am looking for. What I would like is an RTSP server capable of just performing the negotiation, getting the clients' IP and port, sending keep-alive responses and noticing client commands, without sending or managing any RTP packets or data. My application already creates an SDP file, so even that shouldn't be done by the RTSP server I need.

Does anyone know if there is a way to use the RTSPServer class without autogenerating SDP data and without having to invoke an RTPSink?

I hope my explanation is clear enough :P Thanks, any help or suggestion is going to be appreciated :)

David

-------------- next part -------------- An HTML attachment was scrubbed... URL: 

From finlayson at live555.com Wed Sep 25 10:48:45 2013
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 25 Sep 2013 10:48:45 -0700
Subject: [Live-devel] RTSP Server TCP Negotiation only
In-Reply-To: References: Message-ID: <4B575557-48AF-4EE0-8114-8EC2C288413E@live555.com>

> I have been having a look at the live555 source code for some time now in order to find out how I could implement an RTSP server using live555 while using my own UDP/RTP sender functions at the same time.
I believe you can do this, by defining your own subclass of "ServerMediaSubsession" (and then adding an instance of this class (inside a "ServerMediaSession) to your "RTSPServer" object, after you've created it). Your subclass of "ServerMediaSubsession" will need to implement the following pure virtual functions: sdpLines(), getStreamParameters(), startStream(). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From david.cassany at i2cat.net Wed Sep 25 23:52:01 2013 From: david.cassany at i2cat.net (David Cassany Viladomat) Date: Thu, 26 Sep 2013 08:52:01 +0200 Subject: [Live-devel] RTSP Server TCP Negotiation only In-Reply-To: <4B575557-48AF-4EE0-8114-8EC2C288413E@live555.com> References: <4B575557-48AF-4EE0-8114-8EC2C288413E@live555.com> Message-ID: Thanks Ross, I had something similar in mind, so I believe I will take this path :) Thanks once again, you've been really helpful. David Cassany 2013/9/25 Ross Finlayson > I have been having a look in live555 source code for some time right now > in order to find out how cloud I implement an RTSP server using live555 > and using at the same time my own UDP/RTP sender functions. > > > I believe you can do this, by defining your own subclass of > "ServerMediaSubsession" (and then adding an instance of this class (inside > a "ServerMediaSession) to your "RTSPServer" object, after you've created > it). > > Your subclass of "ServerMediaSubsession" will need to implement the > following pure virtual functions: sdpLines(), getStreamParameters(), > startStream(). > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From conchi.ap at vaelsys.com Thu Sep 26 10:40:50 2013 From: conchi.ap at vaelsys.com (Conchi Abasolo) Date: Thu, 26 Sep 2013 19:40:50 +0200 Subject: [Live-devel] Control connection issue: OPTIONS not received Message-ID: Hello, We are working using the following architecture: OpenRTSP client ----------------> OpenRTSP client ----------------> ProxyServer ------------------------->MediaServer OpenRTSP client ----------------> The videos served by the MediaServer are accessed through the ProxyServer. We are using your MediaServer and ProxyServer examples, without any change but the port number . When we install everything at the same LAN, the solution works fine, but we are experiencing some problems when the MediaServer and the ProxyServer are at different locations and connected through a 3G connection. In this case, the control connection doesn't work properly and the proxy closes the connection with a -11 error. After a logs analysis, we saw that the MediaServer receives the OPTIONS command and sends the proper response, but the ProxyServer never receives it. You can find attached the MediaServer and the ProxyServer logs. Regards Conchi Abasolo -- Conchi Abasolo P?rez C/Santiago Grisol?a n? 2, of. 203 Edif. PCM, Parque Tecnol?gico de Madrid 28760 Tres Cantos, Madrid Tlf. +34 91 804 62 48 // Fax. +34 91 803 10 31 Web: www.vaelsys.com Email: conchi.ap at vaelsys.com -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... 
Name: MediaServer.log Type: application/octet-stream Size: 14838 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: ProxyServer.log Type: application/octet-stream Size: 3730 bytes Desc: not available URL: From gordu at dvr2010.com Thu Sep 26 16:41:04 2013 From: gordu at dvr2010.com (Gord Umphrey) Date: Thu, 26 Sep 2013 19:41:04 -0400 Subject: [Live-devel] HTTP Tunneling (Ross Finlayson) Message-ID: <64359628B68C4C4C90846AD97B0FA257@SmokeyPC> Thanks Ross; Unfortunately, the problem is with the various Windows Firewall programs. When an application opens a listening port, then most firewalls will totally block that application. McAFee and the built in Windows firewall exhibit this behaviour. It is expected that we need to configure such things on the server side, but not on the client. (We operate in a lot on financial institutions, and the corporate firewall rules there are quite strict). Since these ports are not required when tunnelling over HTTP, what would be the best way to not create the listening sockets in the first place? Thanks! Gord. -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Sep 26 17:49:43 2013 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 26 Sep 2013 17:49:43 -0700 Subject: [Live-devel] HTTP Tunneling (Ross Finlayson) In-Reply-To: <64359628B68C4C4C90846AD97B0FA257@SmokeyPC> References: <64359628B68C4C4C90846AD97B0FA257@SmokeyPC> Message-ID: > Since these ports are not required when tunnelling over HTTP, what would be the best way to not create the listening sockets in the first place? I can't "not create the listening sockets in the first place". What I can do, however, is - when the RTSP client knows that it's going to request RTP/RTCP-over-TCP, close these two (unneeded) sockets, so that they're no longer open in the client application. I have now installed a new version (2013.09.27) that does this. I hope this works for you. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Sep 27 00:30:09 2013 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 27 Sep 2013 00:30:09 -0700 Subject: [Live-devel] Control connection issue: OPTIONS not received In-Reply-To: References: Message-ID: <268E3960-5769-465B-9D9B-7CF85C668BE3@live555.com> As far as I can tell, the problem is a failure of the (3G) network between your proxy server and (back-end) media server. The proxy server's send of the "OPTIONS" command has failed (that's the cause of the "lost connection to server ('errno': 11)" message; you'll need to check your own system to find out what errno 11 is). The proxy server then attempts to reopen the TCP connection to the media server, but apparently fails. If this is the case - i.e., your 3G network has failed - then there's nothing you can do with our software to overcome this. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From conchi.ap at vaelsys.com Fri Sep 27 06:08:59 2013 From: conchi.ap at vaelsys.com (Conchi Abasolo) Date: Fri, 27 Sep 2013 15:08:59 +0200 Subject: [Live-devel] Control connection issue: OPTIONS not received In-Reply-To: <268E3960-5769-465B-9D9B-7CF85C668BE3@live555.com> References: <268E3960-5769-465B-9D9B-7CF85C668BE3@live555.com> Message-ID: Thank you Ross for the quick response. 
Analyzing the problem further, we think we can rule out a lost-packets issue due to the 3G network, because we can reproduce the problem as many times as we want just by following the same steps. It's systematic. Besides, we don't see any anomalous behaviour except for the OPTIONS message; during the whole process, and until the ProxyServer closes the connection, the video data is received properly in the openRTSP client.

We also did some testing connecting through UDP instead of TCP without any problem, using the same 3G network. So we think the problem may be in the TCP connection handling. Is there a way we can provide you with more info to confirm this?

Thank you

2013/9/27 Ross Finlayson

> As far as I can tell, the problem is a failure of the (3G) network between
> your proxy server and (back-end) media server. The proxy server's send of
> the "OPTIONS" command has failed (that's the cause of the "lost connection
> to server ('errno': 11)" message; you'll need to check your own system to
> find out what errno 11 is). The proxy server then attempts to reopen the
> TCP connection to the media server, but apparently fails.
>
> If this is the case - i.e., your 3G network has failed - then there's
> nothing you can do with our software to overcome this.
>
>
> Ross Finlayson
> Live Networks, Inc.
> http://www.live555.com/
>
>
> _______________________________________________
> live-devel mailing list
> live-devel at lists.live555.com
> http://lists.live555.com/mailman/listinfo/live-devel
>
>

-- Conchi Abasolo Pérez
C/Santiago Grisolía nº 2, of. 203
Edif. PCM, Parque Tecnológico de Madrid
28760 Tres Cantos, Madrid
Tlf. +34 91 804 62 48 // Fax. +34 91 803 10 31
Web: www.vaelsys.com
Email: conchi.ap at vaelsys.com

-------------- next part -------------- An HTML attachment was scrubbed... URL: 

From finlayson at live555.com Fri Sep 27 14:35:27 2013
From: finlayson at live555.com (Ross Finlayson)
Date: Fri, 27 Sep 2013 14:35:27 -0700
Subject: [Live-devel] Control connection issue: OPTIONS not received
In-Reply-To: References: <268E3960-5769-465B-9D9B-7CF85C668BE3@live555.com>
Message-ID: <52045CD7-79BE-44DA-A6E6-E9E302BF46A6@live555.com>

> We also did some testing connecting through UDP instead of TCP without any problem, using the same 3G network.

If streaming over UDP works, then why are you streaming over TCP?? Streaming RTP/RTCP-over-TCP is suboptimal, and should be done only as a last resort - i.e., if you have a firewall that blocks UDP packets. If you stream over TCP, you will get increased latency (often much increased latency). More importantly, if the stream's bitrate exceeds the capacity of your network, then you *will* get socket I/O failure (due to OS network buffers filling up). This may be what is happening in your case.

So, because streaming over UDP works for you, you should continue to use it, and *don't* try to stream over TCP.

Ross Finlayson Live Networks, Inc. http://www.live555.com/

-------------- next part -------------- An HTML attachment was scrubbed... URL: 

From chinnasamymca at yahoo.com Thu Sep 26 20:36:57 2013
From: chinnasamymca at yahoo.com (Chinnasamy C)
Date: Thu, 26 Sep 2013 20:36:57 -0700 (PDT)
Subject: [Live-devel] Live555 library showing error in XCode
Message-ID: <1380253017.27036.YahooMailNeo@web125003.mail.ne1.yahoo.com>

Hi,

I have integrated the live555 library with my project, but the HashTable.hh and media.hh files are showing some errors. Kindly help me to run it on an iOS device. If you have any samples or tutorials for iOS, kindly send them to me.
-----
With Regards,
Chinna

-------------- next part -------------- An HTML attachment was scrubbed... URL: 

From arash.cordi at gmail.com Sat Sep 28 09:02:53 2013
From: arash.cordi at gmail.com (Arash Cordi)
Date: Sat, 28 Sep 2013 19:32:53 +0330
Subject: [Live-devel] Live555 library showing error in XCode
In-Reply-To: <1380253017.27036.YahooMailNeo@web125003.mail.ne1.yahoo.com>
References: <1380253017.27036.YahooMailNeo@web125003.mail.ne1.yahoo.com>
Message-ID: 

It may help if you post the errors you get.

On Fri, Sep 27, 2013 at 7:06 AM, Chinnasamy C wrote:

> Hi,
>
> I have integrated the live555 library with my project, but the HashTable.hh
> and media.hh files are showing some errors.
>
> Kindly help me to run it on an iOS device. If you have any samples or tutorials
> for iOS, kindly send them to me.
>
>
> -----
> With Regards,
> Chinna
>
>
> _______________________________________________
> live-devel mailing list
> live-devel at lists.live555.com
> http://lists.live555.com/mailman/listinfo/live-devel
>
>

-- ArasH

-------------- next part -------------- An HTML attachment was scrubbed... URL: 

From Yogev.Cohen at nice.com Sun Sep 29 04:16:12 2013
From: Yogev.Cohen at nice.com (Yogev Cohen)
Date: Sun, 29 Sep 2013 13:16:12 +0200
Subject: [Live-devel] live555ProxyServer.exe crash after rtcp goodbye
Message-ID: 

Hi Ross,

I updated the code to the latest version and now the crash problem is solved - thanks for the quick response. Now I'm left with a new problem:

1. Open a VLC connection, through the proxy, to an Axis camera H264 stream that does not implement RTSP "PAUSE" and sends RTSP "BYE" after a timeout.
2. Stop the stream from VLC.
3. Wait for the timeout.
4. Play the proxy stream again from VLC.
--- The stream won't start and I get a 454 Session Not Found from the backend.

After the stream is closed by the backend server, it tears down the session that we worked with. If I try to request the stream again, the proxy server sends only a PLAY command to the backend - because it assumes that the session was already set up by the prior call - but actually I get a 454 Session Not Found, which is understandable because the backend closed this session.

As I understand it, given what you stated - that the backend server does not implement PAUSE and sends RTSP "BYE" - the idea of keeping the ProxyRTSPClient <---> Backend connection will not work, and we need to go back to the point before we called the first DESCRIBE.

I modified the library code as follows in order to solve the problem, but it is a HACK and I would appreciate a better solution if you have one:

void ProxyServerMediaSubsession::subsessionByeHandler() {
  if (verbosityLevel() > 0) {
    envir() << *this << ": received RTCP \"BYE\"\n";
  }

  // Make the proxy client believe that there is a problem with the backend,
  // so that it resets all connections and sends DESCRIBE again:
  ProxyServerMediaSession* const sms = (ProxyServerMediaSession*)fParentSession;
  ProxyRTSPClient* const proxyRTSPClient = sms->fProxyRTSPClient;
  proxyRTSPClient->continueAfterLivenessCommand(-1, proxyRTSPClient->fServerSupportsGetParameter);

  // Comment out this code:
  //// This "BYE" signals that our input source has (effectively) closed, so handle this accordingly:
  //FramedSource::handleClosure(fClientMediaSubsession.readSource());
  //// Then, close our input source for real:
  //fClientMediaSubsession.deInitiate();
}

After this change the scenario works.

Thanks, Yogev.

>> Please advise how to proceed, will there be a fix in a future release?
>
> Yes, because you appear to have found a bug in the code.
>
> To help me debug this, could you please re-run the "LIVE555 Proxy Server" with the "-V" (upper-case V) option, to generate diagnostic output, and please send us the diagnostic output, up to the point of the crash.

Actually, you no longer need to do this, because I have now found (and fixed) the bug. (It turns out that the problem occurred only when proxying a H.264 video stream, from a back-end server that (1) does not handle "PAUSE", and (2) sends a RTCP "BYE" at the end of the stream.) I've now installed a new version (2013.09.18) that should fix the problem. (I'll also take a look at the other problem that you reported, although that is less serious.)

Thanks again for reporting this.

Ross Finlayson Live Networks, Inc. http://www.live555.com/

From finlayson at live555.com Sun Sep 29 19:04:36 2013
From: finlayson at live555.com (Ross Finlayson)
Date: Sun, 29 Sep 2013 19:04:36 -0700
Subject: [Live-devel] live555ProxyServer.exe crash after rtcp goodbye
In-Reply-To: References: Message-ID: 

> As I understand it, given what you stated - that the backend server does not implement PAUSE and sends RTSP "BYE" - the idea of keeping the
> ProxyRTSPClient <---> Backend connection will not work, and we need to go back to the point before we called the first DESCRIBE.
>
> I modified the library code as follows in order to solve the problem, but it is a HACK and I would appreciate a better solution if you have one

Your solution was actually quite good; the only real change I made was to keep the "handleClosure()" call, to ensure that front-end clients that attempted to play the stream without stopping will also receive a RTCP "BYE". I've now installed a new version - 2013.09.30 - of the software that includes this fix.

Ross Finlayson Live Networks, Inc. http://www.live555.com/

-------------- next part -------------- An HTML attachment was scrubbed... URL: 

From francisco at eyelynx.com Mon Sep 30 00:44:09 2013
From: francisco at eyelynx.com (Francisco Feijoo)
Date: Mon, 30 Sep 2013 08:44:09 +0100
Subject: [Live-devel] RTSP over TCP HD frame size
Message-ID: 

Hello Ross,

I want to stream over TCP different streams that have been recorded from different cameras with the live555 RTSP Server. Some are HD (H264 and MJPEG) and the frame size can be quite big (250 KB or more). If I stream over UDP everything works perfectly. I need to connect from the Internet, so in that case the RTSP client connects using TCP. If the frame size is small, it works, but if it's big I receive incorrect frames. When I display them, they show only half of the frame, and sometimes I receive the error:

RTCPInstance error: Hit limit when reading incoming packet over TCP. Increase "maxRTCPPacketSize"

I have tried to increase that value (for the client) but the problem persists. Also, after changing that value, sometimes I see the error:

MultiFramedRTPSource error: Hit limit when reading incoming packet over TCP. Increase MAX_PACKET_SIZE

Looking at openRTSP, I have also increased the receive buffer size with setReceiveBufferTo to 2 MB without any difference. The server is not reporting any error. I have tried to connect over TCP locally and I have the same problem, so even though the bandwidth over the Internet may not be enough, the problem is also there with a fast connection.

One thing I noticed is that when I stream a small-resolution stream, the fFrameSize in the server is exactly the same value as the size of the data received in the client sink (afterGettingFrame).
But when the resolution increases I see that fFrameSize is smaller than what the client is receiving. Is there something I can do to fix this without modifying live555? Should this be a problem with the client or the server? Is there any "verbose" mode in live555 so I can see where could be the problem. Thanks in advance. -- Francisco Feijoo Software Engineer EyeLynx Limited T: +44 020 8133 9388 E: francisco at eyelynx.com W: www.eyelynx.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Sep 30 00:53:07 2013 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 30 Sep 2013 00:53:07 -0700 Subject: [Live-devel] RTSP over TCP HD frame size In-Reply-To: References: Message-ID: <8A68F4E1-DF36-40DB-BA6F-FD99A88E6174@live555.com> > RTCPInstance error: Hit limit when reading incoming packet over TCP. Increase "maxRTCPPacketSize" > MultiFramedRTPSource error: Hit limit when reading incoming packet over TCP. Increase MAX_PACKET_SIZE These errors suggest that either your server (network camera?), your client, or both, are using out-of-date versions of the "LIVE555 Streaming Media" software. You should upgrade them to use the latest version. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From francisco at eyelynx.com Mon Sep 30 01:16:22 2013 From: francisco at eyelynx.com (Francisco Feijoo) Date: Mon, 30 Sep 2013 09:16:22 +0100 Subject: [Live-devel] RTSP over TCP HD frame size In-Reply-To: <8A68F4E1-DF36-40DB-BA6F-FD99A88E6174@live555.com> References: <8A68F4E1-DF36-40DB-BA6F-FD99A88E6174@live555.com> Message-ID: My mistake, that message appeared yesterday before I updated both server an client to the latest live.2013.09.30 I have re-checked now and that message does not appear any more. 2013/9/30 Ross Finlayson > RTCPInstance error: Hit limit when reading incoming packet over TCP. >> Increase "maxRTCPPacketSize" > > MultiFramedRTPSource error: Hit limit when reading incoming packet over >> TCP. Increase MAX_PACKET_SIZE > > > These errors suggest that either your server (network camera?), your > client, or both, are using out-of-date versions of the "LIVE555 Streaming > Media" software. You should upgrade them to use the latest version. > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -- Francisco Feijoo Software Engineer EyeLynx Limited T: +44 020 8133 9388 E: francisco at eyelynx.com W: www.eyelynx.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From francisco at eyelynx.com Mon Sep 30 02:10:18 2013 From: francisco at eyelynx.com (Francisco Feijoo) Date: Mon, 30 Sep 2013 10:10:18 +0100 Subject: [Live-devel] RTSP over TCP HD frame size In-Reply-To: References: <8A68F4E1-DF36-40DB-BA6F-FD99A88E6174@live555.com> Message-ID: Just to clarify, the message does not appear, but the error is still there. Thanks. 2013/9/30 Francisco Feijoo > My mistake, that message appeared yesterday before I updated both server > an client to the latest live.2013.09.30 > > I have re-checked now and that message does not appear any more. > > > 2013/9/30 Ross Finlayson > >> RTCPInstance error: Hit limit when reading incoming packet over TCP. 
>>> Increase "maxRTCPPacketSize"
>> MultiFramedRTPSource error: Hit limit when reading incoming packet over
>>> TCP. Increase MAX_PACKET_SIZE
>>
>> These errors suggest that either your server (network camera?), your
>> client, or both, are using out-of-date versions of the "LIVE555 Streaming
>> Media" software. You should upgrade them to use the latest version.
>>
>>
>> Ross Finlayson
>> Live Networks, Inc.
>> http://www.live555.com/
>>
>>
>> _______________________________________________
>> live-devel mailing list
>> live-devel at lists.live555.com
>> http://lists.live555.com/mailman/listinfo/live-devel
>>
>>
>
>
> --
> Francisco Feijoo
> Software Engineer
> EyeLynx Limited
>
> T: +44 020 8133 9388
> E: francisco at eyelynx.com
> W: www.eyelynx.com

-- Francisco Feijoo
Software Engineer
EyeLynx Limited

T: +44 020 8133 9388
E: francisco at eyelynx.com
W: www.eyelynx.com

-------------- next part -------------- An HTML attachment was scrubbed... URL: 

From conchi.ap at vaelsys.com Mon Sep 30 08:57:58 2013
From: conchi.ap at vaelsys.com (Conchi Abasolo)
Date: Mon, 30 Sep 2013 17:57:58 +0200
Subject: [Live-devel] Control connection issue: OPTIONS not received
In-Reply-To: <52045CD7-79BE-44DA-A6E6-E9E302BF46A6@live555.com> References: <268E3960-5769-465B-9D9B-7CF85C668BE3@live555.com> <52045CD7-79BE-44DA-A6E6-E9E302BF46A6@live555.com>
Message-ID: 

Hi Ross

Indeed, it works with UDP and we know it is a better option. Unfortunately we cannot use it. We did the UDP test to narrow down the problem and to provide you with more info, but it is not a valid solution for us. As you said, there is a blocking firewall. Furthermore, TCP is working perfectly on a LAN and even on a DSL line. We just would like to have the same behavior on the 3G network.

We don't think the problem is within the network, as everything seems to work properly until a certain sequence of commands is executed. The tests we are performing are with very light streams, so they don't exceed our bandwidth. The difference is the considerable latency we get on the 3G network.

Digging more into the problem we discovered the following: Having a proxy running, when the first client disconnects (and there are no more connected clients), the proxy will send a PAUSE command to the server. The answer to the PAUSE is apparently never received. The library detects it and throws a message indicating the server may be "buggy". We then get different behaviors:
- either the proxy can recover from this and continues sending the OPTIONS liveness commands
- or the proxy sends the OPTIONS commands but does not receive responses
- or after a while the connection is closed with a -11 error

We included some debugging (printfs) code in the RTSPClient::handleResponseBytes method and we noticed that after the PAUSE is sent there are some "unexpected" (non-ASCII) bytes coming into the response buffer (fResponseBuffer). Then, after a while (because of the latency), the PAUSE response is received. But as it is preceded by these "unexpected" bytes it is not handled adequately. Instead, when the "\r\n\r\n" is received, the library tries to parse it (headerDataCopy) as a request (as it does not have a valid response code). It obviously fails in doing so. After this "dirty" command is processed and the buffer is cleared, things come back to normal (OPTIONS commands are processed). If the buffer was filled up before receiving the "\r\n\r\n", the library will reset the connection in an attempt to recover.
Even if the library finally recovers, we always observe these "unexpected" bytes getting into the buffer and interfering with the proper execution of the control connection.

Where do you think these "unexpected" bytes can come from? We think they may come from a data connection that is closed when the PAUSE is sent; maybe the socket connection had some unprocessed bytes in its "queue" that ended up in the response buffer (because its normal background processing has been shut down). Do you think something like this is possible? We can provide you with the logs that lead us to think this is the problem.

Thank you in advance

Conchi Abasolo

2013/9/27 Ross Finlayson

> We also did some testing connecting through UDP instead of TCP without any
> problem, using the same 3G network.
>
>
> If streaming over UDP works, then why are you streaming over TCP??
> Streaming RTP/RTCP-over-TCP is suboptimal, and should be done only as a
> last resort - i.e., if you have a firewall that blocks UDP packets. If you
> stream over TCP, you will get increased latency (often much increased
> latency). More importantly, if the stream's bitrate exceeds the capacity
> of your network, then you *will* get socket I/O failure (due to OS network
> buffers filling up). This may be what is happening in your case.
>
> So, because streaming over UDP works for you, you should continue to use
> it, and *don't* try to stream over TCP.
>
>
> Ross Finlayson
> Live Networks, Inc.
> http://www.live555.com/
>
>
> _______________________________________________
> live-devel mailing list
> live-devel at lists.live555.com
> http://lists.live555.com/mailman/listinfo/live-devel
>
>

-- Conchi Abasolo Pérez
C/Santiago Grisolía nº 2, of. 203
Edif. PCM, Parque Tecnológico de Madrid
28760 Tres Cantos, Madrid
Tlf. +34 91 804 62 48 // Fax. +34 91 803 10 31
Web: www.vaelsys.com
Email: conchi.ap at vaelsys.com

-------------- next part -------------- An HTML attachment was scrubbed... URL: 

From finlayson at live555.com Mon Sep 30 14:33:02 2013
From: finlayson at live555.com (Ross Finlayson)
Date: Mon, 30 Sep 2013 14:33:02 -0700
Subject: [Live-devel] Control connection issue: OPTIONS not received
In-Reply-To: References: <268E3960-5769-465B-9D9B-7CF85C668BE3@live555.com> <52045CD7-79BE-44DA-A6E6-E9E302BF46A6@live555.com>
Message-ID: <3158012E-23DF-44A2-B21A-511D244A0012@live555.com>

Conchi,

Unfortunately it's hard to get a consistent view of what is going wrong with your proxy server, so I don't think that I'm going to be able to resolve your problem on this mailing list. Instead, it's probably going to require more focused one-on-one interaction via private email. If you're interested in setting up a consulting arrangement for this, please let me know - by separate email.

Ross Finlayson Live Networks, Inc. http://www.live555.com/

-------------- next part -------------- An HTML attachment was scrubbed... URL:
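Several of the threads above - the HTTP-tunneling question, the HD-frame-size-over-TCP question, and this one - involve a client that receives its RTP/RTCP over the RTSP TCP connection rather than over UDP. For reference, the client side asks for that at SETUP time. The fragment below follows the testRTSPClient demo (the rtspClient, scs and continueAfterSETUP names come from that demo, so treat this as an outline rather than drop-in code):

#include "liveMedia.hh"

#define REQUEST_STREAMING_OVER_TCP True

// Inside the demo's setupNextSubsession()-style code, pass True for the
// 'streamUsingTCP' argument so that RTP and RTCP are interleaved on the
// existing RTSP/TCP (or RTSP-over-HTTP) connection:
rtspClient->sendSetupCommand(*scs.subsession, continueAfterSETUP,
                             False /*streamOutgoing*/, REQUEST_STREAMING_OVER_TCP);

When this flag is set, the two client-side UDP sockets discussed in the HTTP-tunneling thread are never actually used (and, as noted there, versions from 2013.09.27 onwards close them), so no inbound UDP ports need to be opened on the client's firewall.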