From dbarboza at ic.uff.br Tue Oct 1 07:39:30 2013 From: dbarboza at ic.uff.br (Diego Barboza) Date: Tue, 1 Oct 2013 11:39:30 -0300 Subject: [Live-devel] Generating streaming at runtime Message-ID: Hi everyone, I'm doing research on live-streaming the video output from a game to a client. To do so, I need to generate the streaming data at runtime instead of reading it from a file (as most LIVE555 examples do). Could someone point me to a good reference to get started? Is there an example that does this? Or which files should I work with to make this work? Thanks in advance, Diego Barboza -------------- next part -------------- An HTML attachment was scrubbed... URL: From harsadi.mate at seacon.hu Tue Oct 1 06:00:07 2013 From: harsadi.mate at seacon.hu (Hársádi Máté) Date: Tue, 1 Oct 2013 13:00:07 +0000 Subject: [Live-devel] RTSP stream via liveMedia from opencv capture Message-ID: <7ea62b2fb56b49df892c6d4ed7233000@srvmail.seacon.hu> Hi! I'm currently developing a streaming server that is fed by a webcam. I chose liveMedia, but at the moment I need some serious help. I cannot find an interface or anything similar for creating an RTSP stream via liveMedia from the frames coming from an OpenCV webcam capture. This capture is shared by the other application modules. After a lot of searching I found this link: http://code.google.com/p/rudp/source/browse/#svn%2Ftrunk%2FRTSPServer But the code is written for Windows and I use Ubuntu 12.04... And as far as I can see, the author customized the code a little, so I can't adapt it to my own code... Could anyone help me? Hársádi Máté software developer Seacon Europe Kft. H-8000 Székesfehérvár, Móricz Zsigmond utca 14. Mobil: +36 20 2398 389 Tel.: +36 22 501 632 Fax: +36 22 501 633 E-mail: harsadi.mate at seacon.hu Web: www.seacon.hu -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Oct 1 07:58:54 2013 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 1 Oct 2013 07:58:54 -0700 Subject: [Live-devel] Generating streaming at runtime In-Reply-To: References: Message-ID: <935AB8BE-CBAD-4401-AB43-590FD7AF6CFC@live555.com> > Hi everyone, I'm doing research on live-streaming the video output from a game to a client. To do so, I need to generate the streaming data at runtime instead of reading it from a file (as most LIVE555 examples do). Could someone point me to a good reference to get started? See http://www.live555.com/liveMedia/faq.html#liveInput-unicast Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Oct 1 08:12:12 2013 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 1 Oct 2013 08:12:12 -0700 Subject: [Live-devel] RTSP stream via liveMedia from opencv capture In-Reply-To: <7ea62b2fb56b49df892c6d4ed7233000@srvmail.seacon.hu> References: <7ea62b2fb56b49df892c6d4ed7233000@srvmail.seacon.hu> Message-ID: <3A307192-C087-451F-A394-CEEB9489A7A7@live555.com> > I'm currently developing a streaming server that is fed by a webcam. I chose liveMedia, but at the moment I need some serious help. I cannot find an interface or anything similar for creating an RTSP stream via liveMedia from the frames coming from an OpenCV webcam capture.
See http://www.live555.com/liveMedia/faq.html#liveInput-unicast > After a lot of search I've founded this link: > http://code.google.com/p/rudp/source/browse/#svn%2Ftrunk%2FRTSPServer You should ignore that; it's based on a very old version of our code. (It also violates our copyright, so I'll be asking Google to remove it ASAP.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From conchi.ap at vaelsys.com Wed Oct 2 02:24:48 2013 From: conchi.ap at vaelsys.com (Conchi Abasolo) Date: Wed, 2 Oct 2013 11:24:48 +0200 Subject: [Live-devel] SIGABRT: SingleStep(): select() fails: Bad file descriptor Message-ID: Hello Ross, We are having critical problems with versions 30.09.2013 and 01.10.2013 over TCP. We use the live555ProxyServer example to connect to a camera stream using the following command live555ProxyServer -V -t rtsp:// root:camera86 at 192.168.0.86/axis-media/media.amp Once the backend connection is established, we try to connet to the stream using a rtsp client, but after the SETUP command the live555ProxyServer crashes. It happens always and we cannot even see a single frame. Here you have a little extract of the gdb log (and attached the complete log): ProxyRTSPClient["rtsp://192.168.0.86/axis-media/media.amp/"]::continueAfterSETUP(): head codec: H264; numSubsessions 2 queue: H264 PCMU Sending request: SETUP rtsp://192.168.0.86/axis-media/media.amp/trackID=2 RTSP/1.0 CSeq: 4 User-Agent: ProxyRTSPClient (LIVE555 Streaming Media v2013.10.01) Transport: RTP/AVP/TCP;unicast;interleaved=2-3 Session: 3309CA32 BasicTaskScheduler::SingleStep(): select() fails: Bad file descriptor socket numbers used in the select() call: 7(r) 8(re) 9(r) 10(re) 11(r) 12(r) 14(r) 16(r) 18(r) 1032(w) 1034(w) 1088(e) 1094(e) 1156(e) 1157(e) 1158(e) 1160(e) 1162(e) 1163(e) 1164(e) 1165(e) 1166(e) 1167(e) 1168(e) 1169(e)... Program received signal SIGABRT, Aborted. 0x00007ffff72e6475 in *__GI_raise (sig=) at ../nptl/sysdeps/unix/sysv/linux/raise.c:64 64 ../nptl/sysdeps/unix/sysv/linux/raise.c: No existe el fichero o el directorio. (gdb) (gdb) (gdb) (gdb) (gdb) (gdb) (gdb) (gdb) (gdb) (gdb) (gdb) backtrace #0 0x00007ffff72e6475 in *__GI_raise (sig=) at ../nptl/sysdeps/unix/sysv/linux/raise.c:64 #1 0x00007ffff72e96f0 in *__GI_abort () at abort.c:92 #2 0x00000000004331ff in TaskScheduler::internalError (this=) at UsageEnvironment.cpp:56 #3 0x00000000004316d4 in BasicTaskScheduler::SingleStep (this=0x658010, maxDelayTime=) at BasicTaskScheduler.cpp:117 #4 0x0000000000432b85 in BasicTaskScheduler0::doEventLoop (this=0x658010, watchVariable=0x0) at BasicTaskScheduler0.cpp:80 #5 0x000000000040236f in main (argc=, argv=) at live555ProxyServer.cpp:204 We think maybe the changes introduced in 2013.09.27 version that close some TCP sockets are causing some instabilities. Many thanks Conchi -- Conchi Abasolo P?rez C/Santiago Grisol?a n? 2, of. 203 Edif. PCM, Parque Tecnol?gico de Madrid 28760 Tres Cantos, Madrid Tlf. +34 91 804 62 48 // Fax. +34 91 803 10 31 Web: www.vaelsys.com Email: conchi.ap at vaelsys.com -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... 
Name: proxy.01.10.2013.log Type: application/octet-stream Size: 14709 bytes Desc: not available URL: From finlayson at live555.com Wed Oct 2 02:57:51 2013 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 2 Oct 2013 02:57:51 -0700 Subject: [Live-devel] SIGABRT: SingleStep(): select() fails: Bad file descriptor In-Reply-To: References: Message-ID: <2F813A06-557B-4483-8D93-F53C7A65E090@live555.com> > We think maybe the changes introduced in 2013.09.27 version that close some TCP sockets are causing some instabilities. The (unneeded) sockets that are closed starting in version 2013.09.27 are UDP sockets, not TCP sockets. I've released a new version (2013.10.02) of the code now that makes absolutely sure that no background reading is happening on these sockets when they are closed. This version should fix the problem that you reported. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From harsadi.mate at seacon.hu Wed Oct 2 05:57:59 2013 From: harsadi.mate at seacon.hu (Hársádi Máté) Date: Wed, 2 Oct 2013 12:57:59 +0000 Subject: [Live-devel] RTSP stream via liveMedia from opencv capture In-Reply-To: <3A307192-C087-451F-A394-CEEB9489A7A7@live555.com> References: <7ea62b2fb56b49df892c6d4ed7233000@srvmail.seacon.hu>, <3A307192-C087-451F-A394-CEEB9489A7A7@live555.com> Message-ID: Hi! I found this example -> http://www.live555.com/Elphel/. I've changed the code to read only one .jpg image and stream it continuously. But after the stream starts, the entire network breaks down until I stop the stream. I tried to watch the stream with VLC, but it does not show the correct image. I attach the sources. Hársádi Máté software developer Seacon Europe Kft. H-8000 Székesfehérvár, Móricz Zsigmond utca 14. Mobil: +36 20 2398 389 Tel.: +36 22 501 632 Fax: +36 22 501 633 E-mail: harsadi.mate at seacon.hu Web: www.seacon.hu ________________________________ From: live-devel-bounces at ns.live555.com, on behalf of: Ross Finlayson Sent: October 1, 2013 17:12 To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] RTSP stream via liveMedia from opencv capture I'm currently developing a streaming server that is fed by a webcam. I chose liveMedia, but at the moment I need some serious help. I cannot find an interface or anything similar for creating an RTSP stream via liveMedia from the frames coming from an OpenCV webcam capture. See http://www.live555.com/liveMedia/faq.html#liveInput-unicast After a lot of searching I found this link: http://code.google.com/p/rudp/source/browse/#svn%2Ftrunk%2FRTSPServer You should ignore that; it's based on a very old version of our code. (It also violates our copyright, so I'll be asking Google to remove it ASAP.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: main.cpp Type: text/x-c++src Size: 5454 bytes Desc: main.cpp URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: ElphelJPEGDeviceSource.cpp Type: text/x-c++src Size: 4878 bytes Desc: ElphelJPEGDeviceSource.cpp URL: -------------- next part -------------- A non-text attachment was scrubbed...
Name: ElphelJPEGDeviceSource.h Type: text/x-chdr Size: 2092 bytes Desc: ElphelJPEGDeviceSource.h URL: From marcin at speed666.info Wed Oct 2 06:03:16 2013 From: marcin at speed666.info (Marcin) Date: Wed, 02 Oct 2013 15:03:16 +0200 Subject: [Live-devel] Problems with openRTSP on ARM device Message-ID: <524C1994.50509@speed666.info> Hello, I wanted to ask You because i have no idea where the problem may sit. I have compiled openRTSP which i use to dump H264 streams from RTSP server over localhost This approach works fine in many IP Cameras that i use for my purpose. Suddenly i found a device that producest wierd files. Incomming buffer size by default is 100000 - and everything works great as soon as I frame is not larger that that. Normally if i put bitrate like 4-6 Mbits, IFrames are bigger and then i increase buffer size to 300000 in example and everything works fine. But this time - the exported file is broken. Not playable and its structure looks broken. In meantime live view can be perfectly playable via VLC over RTSP. Produced files: http://www.speed666.info/good.h264 - this is below 100000 limit http://www.speed666.info/bad.h264 - this is bitrate where buffer is too small I tried to increase buffer size but this only produces bad files - nothing more. Where i should search the problem. Same binary running on different camera with same CPU works well. Any clues? Marcin WebCamera.pl From marcin at speed666.info Wed Oct 2 07:05:09 2013 From: marcin at speed666.info (Marcin) Date: Wed, 02 Oct 2013 16:05:09 +0200 Subject: [Live-devel] Problems with openRTSP on ARM device In-Reply-To: <524C1994.50509@speed666.info> References: <524C1994.50509@speed666.info> Message-ID: <524C2815.8000304@speed666.info> Hi all, I will respond to myself to because i forgot to mention. The camera produces good and working H264 stream. Proof: http://www.speed666.info/testok.h264 I can record it via openRTSP command from different host without problems so it looks like compiler or linux limitation - but where to search? Marcin W dniu 2013-10-02 15:03, Marcin pisze: > Hello, > I wanted to ask You because i have no idea where the problem may sit. > I have compiled openRTSP which i use to dump H264 streams from RTSP > server over localhost > This approach works fine in many IP Cameras that i use for my purpose. > Suddenly i found a device that producest wierd files. > > Incomming buffer size by default is 100000 - and everything works > great as soon as I frame is not larger that that. > Normally if i put bitrate like 4-6 Mbits, IFrames are bigger and then > i increase buffer size to 300000 in example and everything works fine. > > But this time - the exported file is broken. Not playable and its > structure looks broken. In meantime live view can be perfectly > playable via VLC over RTSP. > > Produced files: > http://www.speed666.info/good.h264 - this is below 100000 limit > http://www.speed666.info/bad.h264 - this is bitrate where buffer > is too small > > I tried to increase buffer size but this only produces bad files - > nothing more. > > Where i should search the problem. Same binary running on different > camera with same CPU works well. Any clues? > > Marcin > WebCamera.pl > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From yossi_c at robo-team.com Wed Oct 2 08:08:38 2013 From: yossi_c at robo-team.com (yossi.cohn) Date: Wed, 2 Oct 2013 15:08:38 +0000 Subject: [Live-devel] RTP Multicast Video Client Message-ID: <26b23fd7249a40fc81f350945f0f632e@AMXPR03MB024.eurprd03.prod.outlook.com> Hi, I'm developing an application which supports RTP Multicast video streaming. The video stream would be H.264 encoded video stream. The application should join the appropriate family address of the streamed video and behave as a video client. I looked for samples and the closest thing I found was the "testMPEG1or2VideoReciever". I could not see in this example any sample code showing how to consume the packets of the video stream. Is there any place I can see this example, are there any other examples I can get regarding multicast client. Thanks, Yossi -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Oct 2 08:25:40 2013 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 2 Oct 2013 08:25:40 -0700 Subject: [Live-devel] RTSP stream via liveMedia from opencv capture In-Reply-To: References: <7ea62b2fb56b49df892c6d4ed7233000@srvmail.seacon.hu>, <3A307192-C087-451F-A394-CEEB9489A7A7@live555.com> Message-ID: > I've changed the code to read only one .jpg image and start to stream that continuously. But after the stream starts the entire network is brokes until I stop the stream. Of course - because you're streaming data continuously, without any gaps between the frames! You commented out the important line gettimeofday(&fLastCaptureTime, &Idunno); so, you're not setting each frame's presentation time properly. You also need to delay an appropriate length of time (depending on the frame rate that you want) before you transmit each frame. Please also note that JPEG/RTP streaming is strongly discouraged; see http://www.live555.com/liveMedia/faq.html#jpeg-streaming Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Oct 2 08:34:53 2013 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 2 Oct 2013 08:34:53 -0700 Subject: [Live-devel] RTP Multicast Video Client In-Reply-To: <26b23fd7249a40fc81f350945f0f632e@AMXPR03MB024.eurprd03.prod.outlook.com> References: <26b23fd7249a40fc81f350945f0f632e@AMXPR03MB024.eurprd03.prod.outlook.com> Message-ID: <411A2D3D-B454-4835-B61D-68820A94FAD2@live555.com> > I'm developing an application which supports RTP Multicast video streaming. > The video stream would be H.264 encoded video stream. > The application should join the appropriate family address of the streamed video and behave as a video client. Because you are transmitting H.264 video via RTP, your server (transmitter) application should include a RTSP server. Note, for example, our "testH264VideoStreamer" demo application, and read http://www.live555.com/liveMedia/faq.html#rtsp-needed to understand why it's valuable to use RTSP when streaming H.264 via RTP. If you do this, then our standard RTSP client applications - e.g. "testRTSPClient" or "openRTSP" could be used to receive the data. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From dbarboza at ic.uff.br Wed Oct 2 13:27:09 2013 From: dbarboza at ic.uff.br (Diego Barboza) Date: Wed, 2 Oct 2013 17:27:09 -0300 Subject: [Live-devel] Generating streaming at runtime In-Reply-To: <935AB8BE-CBAD-4401-AB43-590FD7AF6CFC@live555.com> References: <935AB8BE-CBAD-4401-AB43-590FD7AF6CFC@live555.com> Message-ID: Thanks Ross, I'll take a look into that. Diego Barboza 2013/10/1 Ross Finlayson > Hi everyone, I'm doing a research about live streaming the video output > from a game to a client. By doing so, I need to generate the streaming data > at runtime instead of reading it from a file (like most LIVE555 examples > do). Could someone point me out a good reference to start this up? > > > See http://www.live555.com/liveMedia/faq.html#liveInput-unicast > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From nambirajan.manickam at i-velozity.com Wed Oct 2 23:21:25 2013 From: nambirajan.manickam at i-velozity.com (Nambirajan M) Date: Thu, 3 Oct 2013 11:51:25 +0530 Subject: [Live-devel] Clarification Message-ID: <003f01cec000$d6fef850$84fce8f0$@manickam@i-velozity.com> Hi Ross, Do you have a provision for " Destination address parameter " setting in Transport Header Field in Setup command. Please let us know. We want to stream the data to the destination address specified there. This will allow us to decrease the load on the RTSP Server and the control monitoring can be done by a proxy there by giving us a leverage to monitor the data flow. Thanks and regards, M. Nambirajan -------------- next part -------------- An HTML attachment was scrubbed... URL: From michel.promonet at thalesgroup.com Thu Oct 3 00:26:06 2013 From: michel.promonet at thalesgroup.com (PROMONET Michel) Date: Thu, 3 Oct 2013 09:26:06 +0200 Subject: [Live-devel] openRTSP QOS mesurement Message-ID: <8116_1380785168_524D1C10_8116_7477_1_1BE8971B6CFF3A4F97AF4011882AA25501563E027990@THSONEA01CMS01P.one.grp> Hi Ross, openRTSP allow to make QOS mesurement and print it when it exits. This is nice but do you think it could be an possible evolution to print periodically ? Best Regards, Michel. [@@ THALES GROUP INTERNAL @@] -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Oct 3 02:11:53 2013 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 3 Oct 2013 02:11:53 -0700 Subject: [Live-devel] Clarification In-Reply-To: <003f01cec000$d6fef850$84fce8f0$@manickam@i-velozity.com> References: <003f01cec000$d6fef850$84fce8f0$@manickam@i-velozity.com> Message-ID: See http://lists.live555.com/pipermail/live-devel/2012-September/015884.html Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Thu Oct 3 02:12:23 2013 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 3 Oct 2013 02:12:23 -0700 Subject: [Live-devel] openRTSP QOS mesurement In-Reply-To: <8116_1380785168_524D1C10_8116_7477_1_1BE8971B6CFF3A4F97AF4011882AA25501563E027990@THSONEA01CMS01P.one.grp> References: <8116_1380785168_524D1C10_8116_7477_1_1BE8971B6CFF3A4F97AF4011882AA25501563E027990@THSONEA01CMS01P.one.grp> Message-ID: <50B473DE-19D8-400D-9C24-5FFF03CF8D70@live555.com> > openRTSP allow to make QOS mesurement and print it when it exits. > This is nice but do you think it could be an possible evolution to print periodically ? Perhaps, though it's very low priority. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From nambirajan.manickam at i-velozity.com Thu Oct 3 02:35:43 2013 From: nambirajan.manickam at i-velozity.com (Nambirajan M) Date: Thu, 3 Oct 2013 15:05:43 +0530 Subject: [Live-devel] Clarification In-Reply-To: References: <003f01cec000$d6fef850$84fce8f0$@manickam@i-velozity.com> Message-ID: <006201cec01b$f10527e0$d30f77a0$@manickam@i-velozity.com> Hi Ross, Thanks for your input. Regards, M. Nambirajan From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Thursday, October 03, 2013 2:42 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Clarification See http://lists.live555.com/pipermail/live-devel/2012-September/015884.html Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From michel.promonet at thalesgroup.com Thu Oct 3 06:06:02 2013 From: michel.promonet at thalesgroup.com (PROMONET Michel) Date: Thu, 3 Oct 2013 15:06:02 +0200 Subject: [Live-devel] multicast & linux information Message-ID: <2961_1380805564_524D6BBB_2961_2320_7_1BE8971B6CFF3A4F97AF4011882AA25501563E09919C@THSONEA01CMS01P.one.grp> Hi Ross, We discussed several times about this subject and I would like to share what I understood from exchanges with RedHat support. It seems that kernel maintainers chose to keep the backward compatibility with a default that cause receiving data from all multicast groups and not only from joined multicast group. But they add since kernel 2.6.31 an option that allow to avoid this. In a test program, I tryied adding #if defined(__linux__) && !defined(IP_MULTICAST_ALL) #warning linux is not able to filter multicast group that share same port without IP_MULTICAST_ALL option #endif #ifdef IP_MULTICAST_ALL int mc_all = 0; if ((setsockopt(sock, IPPROTO_IP, IP_MULTICAST_ALL, (void*) &mc_all, sizeof(mc_all))) < 0) { perror("setsockopt() failed"); exit(1); } #endif After this, multicast filter works "normally" (like windows & FreeBSD) Do you think it could be possible to set this option in socketJoinGroup in a future release of live555 ? Best Regards, Michel. [@@ THALES GROUP INTERNAL @@] -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Thu Oct 3 07:21:40 2013 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 3 Oct 2013 07:21:40 -0700 Subject: [Live-devel] multicast & linux information In-Reply-To: <2961_1380805564_524D6BBB_2961_2320_7_1BE8971B6CFF3A4F97AF4011882AA25501563E09919C@THSONEA01CMS01P.one.grp> References: <2961_1380805564_524D6BBB_2961_2320_7_1BE8971B6CFF3A4F97AF4011882AA25501563E09919C@THSONEA01CMS01P.one.grp> Message-ID: > It seems that kernel maintainers chose to keep the backward compatibility with a default that cause receiving data from all multicast groups and not only from joined multicast group. > But they add since kernel 2.6.31 an option that allow to avoid this. That's good to hear. (I guess this is the closest they'll ever come to admitting that the default behavior is a bug :-) > Do you think it could be possible to set this option in socketJoinGroup in a future release of live555 ? Done! I've just installed a new version (2013.10.03) of the "LIVE555 Streaming Media" code that sets this option to 0 if it's defined. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From robypomper at johnosproject.com Sun Oct 6 00:46:01 2013 From: robypomper at johnosproject.com (Roberto Pompermaier) Date: Sun, 06 Oct 2013 01:46:01 -0600 Subject: [Live-devel] Stream MP3 over TS Message-ID: <2a6533396f2582c2bbf4fa616f6c3599@johnosproject.com> Hi, I'm trying to transmit an MP3 file over Transport Stream (my goal is play the song on different device completely sync). I studied the 2 test programs testMP3Streamer and testMPEG2TransportStreamer, but I didn't understand how it can work. In the play() function of testMPEG2TransportStreamer, you read from file with a ByteStreamFileSource and then pass it to a MPEG2TransportStreamFramer, I tryed to do the same but reading the file with an MP3FileSource. The error was about the Sync Byte. In the mailing list i found a thread about it, and so I added an MPEG2TransportStreamFromESSource object to convert the MP3 file format to TransportStream format. Now it don't show me errors, but it still not working. So, what is the correct way to transmit an MP3 file over TransportStream? I, also tried to change the MPEG2TransportStreamFromESSource with the MPEG2TransportStreamFromPESSource without success. Thanks in advance From finlayson at live555.com Sun Oct 6 01:06:54 2013 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 6 Oct 2013 01:06:54 -0700 Subject: [Live-devel] Stream MP3 over TS In-Reply-To: <2a6533396f2582c2bbf4fa616f6c3599@johnosproject.com> References: <2a6533396f2582c2bbf4fa616f6c3599@johnosproject.com> Message-ID: <6AA1906C-389A-44D5-9125-1F06CE349898@live555.com> > I'm trying to transmit an MP3 file over Transport Stream Note that a "Transport Stream" is really just a container (i.e. file) format, not a network protocol. So it doesn't really make sense to talk about streaming anything over a "Transport Stream". You can, however, convert, or 'pack' some data type (including MP3 audio) into a Transport Stream file. > I studied the 2 test programs testMP3Streamer and testMPEG2TransportStreamer, but I didn't understand how it can work. Those applications are used to stream MP3 and Transport Stream files (respectively) using the standard RTP protocol. 
If you want to convert (not stream) an MP3 file into a Transport Stream file, then you should be able to do so - using our software - by feeding a "MP3FileSource" object into a Transport Stream, by doing something like:
FramedSource* audioSource = MP3FileSource::createNew(*env, mp3FileName);
MPEG2TransportStreamFromESSource* tsFrames = MPEG2TransportStreamFromESSource::createNew(*env);
tsFrames->addNewAudioSource(audioSource, mpegVersion); // where "mpegVersion" is either 1 or 2, depending on whether your MP3 audio is MPEG-1 or MPEG-2
MediaSink* outputSink = FileSink::createNew(*env, outputFileName);
outputSink->startPlaying(*tsFrames, afterPlaying, NULL);
(See the "testH264VideoToTransportStream" code for an illustration of how we can convert an H.264 video file into a Transport Stream.) However, if you just want to stream an MP3 file over the network, then by far the best way to do this is to stream it via RTP - e.g., using our "testMP3Streamer" demo application, without using Transport Streams at all. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From arash.cordi at gmail.com Sun Oct 6 08:45:21 2013 From: arash.cordi at gmail.com (Arash Cordi) Date: Sun, 6 Oct 2013 19:15:21 +0330 Subject: [Live-devel] h263reader:: buffer too small Message-ID: Hi, I'm trying to get an H.263 stream from a camera. I modified the H263plusVideoFileServerMediaSubsession class for this. Now, when I try to get a 4CIF stream from the camera, I get the following message: h263reader:: Buffer too small (60797) Everything works fine with a CIF stream. Where is this buffer, and how can I increase its size? Thanks in advance -- ArasH -------------- next part -------------- An HTML attachment was scrubbed... URL: From nambirajan.manickam at i-velozity.com Mon Oct 7 03:50:14 2013 From: nambirajan.manickam at i-velozity.com (Nambirajan M) Date: Mon, 7 Oct 2013 16:20:14 +0530 Subject: [Live-devel] Clarification In-Reply-To: References: <003f01cec000$d6fef850$84fce8f0$@manickam@i-velozity.com> Message-ID: <002001cec34b$05ef85b0$11ce9110$@manickam@i-velozity.com> Hi Ross, Thanks for your valuable input. But can you please shed some light on the denial-of-service attack? How would this be a security risk on the server side? Thanks and regards, M. Nambirajan From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Thursday, October 03, 2013 2:42 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Clarification See http://lists.live555.com/pipermail/live-devel/2012-September/015884.html Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Oct 7 05:48:04 2013 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 7 Oct 2013 05:48:04 -0700 Subject: [Live-devel] Clarification In-Reply-To: <002001cec34b$05ef85b0$11ce9110$@manickam@i-velozity.com> References: <003f01cec000$d6fef850$84fce8f0$@manickam@i-velozity.com> <002001cec34b$05ef85b0$11ce9110$@manickam@i-velozity.com> Message-ID: <315BDAA5-FF54-44A4-95EB-9BF5FE7F9ECA@live555.com> > Thanks for your valuable input. But can you please shed some light on the denial-of-service attack? How would this be a security risk on the server side? Because a client A can ask a server B to transmit a stream of packets towards a third party C - who never asked for this.
That's why we don't allow a server to do this, by default. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From bbischan at watchtower-security.com Mon Oct 7 05:45:32 2013 From: bbischan at watchtower-security.com (Bob Bischan) Date: Mon, 7 Oct 2013 07:45:32 -0500 Subject: [Live-devel] Proxy Server REGISTER option Message-ID: Hi, I have been trying to use the new REGISTER option with Proxy Server and have not been able to make this work. Has anyone used this feature? If so, any input on how to implement this feature would be greatly appreciated. When running an instance of Proxy Server with -V -R options set I do not see any output when making a client connection using the REGISTER or REGISTER_REMOTE method with rtsp://stream provided. I do receive a 200 / Ok response, but nothing else to indicate if the stream was successfully registered. Thanks, Bob -------------- next part -------------- An HTML attachment was scrubbed... URL: From marco.caverzaghi at italtel.com Mon Oct 7 06:03:50 2013 From: marco.caverzaghi at italtel.com (Caverzaghi Marco) Date: Mon, 7 Oct 2013 15:03:50 +0200 Subject: [Live-devel] Streaming H.264/SVC Message-ID: Hello, i want to know if the Live555 Streaming Media libraries support the video stream H.264/SVC (Annex G extension of H.264/AVC). Thanks in advance, Marco -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Oct 7 06:39:44 2013 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 7 Oct 2013 06:39:44 -0700 Subject: [Live-devel] Streaming H.264/SVC In-Reply-To: References: Message-ID: > i want to know if the Live555 Streaming Media libraries support the video stream H.264/SVC (Annex G extension of H.264/AVC). No, not at present. However, if you can point me at an example of a H.264/SVC Elementary Stream file (i.e., containing a sequence of H.264 and H.264/SVC NAL units), then I'll take a closer look at what would be required to support streaming this. (Are you interested in server support (i.e., streaming H.264/SVC data), client support (i.e., receiving H.264/SVC data), or both?) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From rob.krakora at messagenetsystems.com Mon Oct 7 06:54:57 2013 From: rob.krakora at messagenetsystems.com (Robert Krakora) Date: Mon, 7 Oct 2013 09:54:57 -0400 Subject: [Live-devel] Streaming H.264/SVC In-Reply-To: References: Message-ID: Here you go... http://www.acceptv.com/page/products_svc_test_streams On Mon, Oct 7, 2013 at 9:39 AM, Ross Finlayson wrote: > i want to know if the *Live555 Streaming Media* libraries support the > video stream H.264/SVC (Annex G extension of H.264/AVC). > > > No, not at present. However, if you can point me at an example of a > H.264/SVC Elementary Stream file (i.e., containing a sequence of H.264 and > H.264/SVC NAL units), then I'll take a closer look at what would be > required to support streaming this. > > (Are you interested in server support (i.e., streaming H.264/SVC data), > client support (i.e., receiving H.264/SVC data), or both?) > > > Ross Finlayson > Live Networks, Inc. 
> http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -- Rob Krakora, Senior Software Engineer MessageNet Systems 101 E Carmel Dr, Suite 105 Carmel, IN 46032 MessageNetSystems.com Rob.Krakora at MessageNetSystems.com P: 317.566.1677, 212 F: 317.663.0808 For the latest news, information, and blogs, please be sure to visit, follow, and like us... -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Oct 7 07:17:26 2013 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 7 Oct 2013 07:17:26 -0700 Subject: [Live-devel] Streaming H.264/SVC In-Reply-To: References: Message-ID: On Oct 7, 2013, at 6:54 AM, Robert Krakora wrote: > Here you go... > > http://www.acceptv.com/page/products_svc_test_streams How about a file that we can download without having to fill out an intrusive form first! Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Oct 7 07:26:14 2013 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 7 Oct 2013 07:26:14 -0700 Subject: [Live-devel] Proxy Server REGISTER option In-Reply-To: References: Message-ID: > I have been trying to use the new REGISTER option with Proxy Server and have not been able to make this work. Has anyone used this feature? If so, any input on how to implement this feature would be greatly appreciated. > > When running an instance of Proxy Server with -V -R options set I do not see any output when making a client connection using the REGISTER or REGISTER_REMOTE method with rtsp://stream provided. I do receive a 200 / Ok response, but nothing else to indicate if the stream was successfully registered. It turns out that the proxy server *was* handling the "REGISTER" command OK; however due to a bug in the code, it was not properly handling the 'verbosity' option ("-V") in this case, so the proxy server wasn't telling you what it was doing. I've now installed a new version (2013.10.07) of the "LIVE555 Streaming Media" that fixes this, so the "-V -R" options to the "LIVE555 Proxy Server" should work properly now. Thanks for the report. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From robypomper at johnosproject.com Mon Oct 7 08:50:06 2013 From: robypomper at johnosproject.com (Roberto Pompermaier) Date: Mon, 07 Oct 2013 09:50:06 -0600 Subject: [Live-devel] Stream MP3 over TS Message-ID: <000574a7f87ce03eb0a10f47dfb5a3fb@johnosproject.com> Hi, thanks for your fast replay. Now I can stream my MP3 packed in MPEG-TS container over RTP :) Now it transform my mp3 in to new mpeg-ts file and then transmit it (the mpeg-ts file) over rtp. I would like to transform and transmit at the same time, without creating an temporary file. If I understand correctly, the FileSink write the result of startPlaying() in to the file, and the SimpleRTPSink transmit the same results to the network using the RTP protocol. 
I tried this modification to your code:
FramedSource* audioSource = MP3FileSource::createNew(*env, mp3FileName);
MPEG2TransportStreamFromESSource* tsFrames = MPEG2TransportStreamFromESSource::createNew(*env);
tsFrames->addNewAudioSource(audioSource, mpegVersion); // where "mpegVersion" is either 1 or 2
MediaSink* outputSink = SimpleRTPSink::createNew(*env, &rtpGroupsock, 33, 90000, "video", "MP2T", 1, True, False /*no 'M' bit*/);
outputSink->startPlaying(*tsFrames, afterPlaying, NULL);
Unfortunately it doesn't work. I also tried changing the rtpPayloadFormat and other parameters in the SimpleRTPSink constructor, but it doesn't work either. Many thanks for your patience. Roberto PS: My English is not so good, so sometimes it's hard for me to explain what I mean From rob.krakora at messagenetsystems.com Mon Oct 7 08:07:18 2013 From: rob.krakora at messagenetsystems.com (Robert Krakora) Date: Mon, 7 Oct 2013 11:07:18 -0400 Subject: [Live-devel] Streaming H.264/SVC In-Reply-To: References: Message-ID: Only ones I know of, unfortunately... I just received a Logitech C930e webcam that is supposed to support H.264 SVC... it is UVC 1.5, though, and v4l2 only supports UVC 1.1. I'll have to do some development to get the H.264 SVC stream out of it. On Mon, Oct 7, 2013 at 10:17 AM, Ross Finlayson wrote: > > On Oct 7, 2013, at 6:54 AM, Robert Krakora < > rob.krakora at messagenetsystems.com> wrote: > > Here you go... > > http://www.acceptv.com/page/products_svc_test_streams > > > How about a file that we can download without having to fill out an > intrusive form first! > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -- Rob Krakora, Senior Software Engineer MessageNet Systems 101 E Carmel Dr, Suite 105 Carmel, IN 46032 MessageNetSystems.com Rob.Krakora at MessageNetSystems.com P: 317.566.1677, 212 F: 317.663.0808 For the latest news, information, and blogs, please be sure to visit, follow, and like us... -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Oct 7 09:24:55 2013 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 7 Oct 2013 09:24:55 -0700 Subject: [Live-devel] Stream MP3 over TS In-Reply-To: <000574a7f87ce03eb0a10f47dfb5a3fb@johnosproject.com> References: <000574a7f87ce03eb0a10f47dfb5a3fb@johnosproject.com> Message-ID: <889C7BEB-4E33-4676-BEAA-E57B218ADEF2@live555.com> > thanks for your fast replay. Now I can stream my MP3 packed in MPEG-TS container over RTP :) > > Now it transform my mp3 in to new mpeg-ts file and then transmit it (the mpeg-ts file) over rtp. Once again: If you want to transmit MP3 audio via RTP, the best way to do this is to do so directly - e.g., using our "testMP3Streamer" demo application (for multicast streaming), or "testOnDemandRTSPServer" or "live555MediaServer" (for unicast streaming). Packing the MP3 stream into a Transport Stream and then transmitting the Transport Stream is much less efficient, and is less tolerant of packet loss. The only reason why anyone should be transmitting Transport Stream data via RTP is if the source data is in Transport Stream format to begin with. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed...
URL: From bbischan at watchtower-security.com Mon Oct 7 09:33:09 2013 From: bbischan at watchtower-security.com (Bob Bischan) Date: Mon, 7 Oct 2013 11:33:09 -0500 Subject: [Live-devel] Proxy Server REGISTER option In-Reply-To: References: Message-ID: Ross, Thanks for the update...I am now getting debug output. I see that the rtsp://url I am registering adds a new "ProxyServerMediaSubsession with periodic OPTION requests from the Proxy Server to the stream. Everything appears normal, but I'm not sure how to access the proxy stream now. Do REGISTERED streams have names like proxyStream-1, proxyStream-2...proxyStream-N? Thanks, Bob On Mon, Oct 7, 2013 at 9:26 AM, Ross Finlayson wrote: > I have been trying to use the new REGISTER option with Proxy Server and > have not been able to make this work. Has anyone used this feature? If so, > any input on how to implement this feature would be greatly appreciated. > > When running an instance of Proxy Server with -V -R options set I do not > see any output when making a client connection using the REGISTER or > REGISTER_REMOTE method with rtsp://stream provided. I do receive a 200 / > Ok response, but nothing else to indicate if the stream was successfully > registered. > > > It turns out that the proxy server *was* handling the "REGISTER" command > OK; however due to a bug in the code, it was not properly handling the > 'verbosity' option ("-V") in this case, so the proxy server wasn't telling > you what it was doing. > > I've now installed a new version (2013.10.07) of the "LIVE555 Streaming > Media" that fixes this, so the "-V -R" options to the "LIVE555 Proxy > Server" should work properly now. Thanks for the report. > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -- Bob Bischan Manager (Operations/Software Development) *WATCHTOWER SECURITY, INC.* 10760 TRENTON SAINT LOUIS, MISSOURI 63132 314 427 4586 office 314 330 9001 cell 314 427 8823 fax www.watchtower-security.com *?Protecting your community and those you value most.?* -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Oct 7 09:46:46 2013 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 7 Oct 2013 09:46:46 -0700 Subject: [Live-devel] Proxy Server REGISTER option In-Reply-To: References: Message-ID: <0CCE7104-D1A5-47D8-A9A0-6BFE090E9993@live555.com> > Thanks for the update...I am now getting debug output. > > I see that the rtsp://url I am registering adds a new "ProxyServerMediaSubsession with periodic OPTION requests from the Proxy Server to the stream. Everything appears normal, but I'm not sure how to access the proxy stream now. Do REGISTERED streams have names like proxyStream-1, proxyStream-2...proxyStream-N? No, in this case, the proxy stream will have the same URL suffix as the 'back end' stream. So, if your 'back end' stream has the URL rtsp://host.example.com/foobar then the proxy stream should be accessible as rtsp://proxy-server-name-or-address:portnum/foobar Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From bbischan at watchtower-security.com Mon Oct 7 11:07:27 2013 From: bbischan at watchtower-security.com (Bob Bischan) Date: Mon, 7 Oct 2013 13:07:27 -0500 Subject: [Live-devel] Proxy Server REGISTER option In-Reply-To: <0CCE7104-D1A5-47D8-A9A0-6BFE090E9993@live555.com> References: <0CCE7104-D1A5-47D8-A9A0-6BFE090E9993@live555.com> Message-ID: Ross, Ok...that makes sense. Any idea how Proxy Server would handle the following url? rtsp://:/axis-media/media.amp?resolution=480x270&fps=3&compression=40&videokeyframeinterval=18 >From debug I can see ProxyServer handles the new subsession like this: ProxyServerMediaSession["rtsp://99.58.86.198:11031/axis-media/media.amp/"] Proxy Server DESCRIBE on registered stream returns sdp: a=control:rtsp://:/axis-media/media.amp?resolution=480x270&fps=3&compression=40&videokeyframeinterval=18 a=control:rtsp://:/axis-media/media.amp/trackID=1?resolution=480x270&fps=3&compression=40&videokeyframeinterval=18 >From the client side I have tried DESCRIBE against the Proxy Server using all possible suffix's with no success. Seems I might have to handle this differently. Either way I am much further along with understanding how to implement this feature. Thanks, Bob On Mon, Oct 7, 2013 at 11:46 AM, Ross Finlayson wrote: > Thanks for the update...I am now getting debug output. > > I see that the rtsp://url I am registering adds a new > "ProxyServerMediaSubsession with periodic OPTION requests from the Proxy > Server to the stream. Everything appears normal, but I'm not sure how to > access the proxy stream now. Do REGISTERED streams have names like > proxyStream-1, proxyStream-2...proxyStream-N? > > > No, in this case, the proxy stream will have the same URL suffix as the > 'back end' stream. So, if your 'back end' stream has the URL > rtsp://host.example.com/foobar > then the proxy stream should be accessible as > rtsp://proxy-server-name-or-address:portnum/foobar > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -- Bob Bischan Manager (Operations/Software Development) *WATCHTOWER SECURITY, INC.* 10760 TRENTON SAINT LOUIS, MISSOURI 63132 314 427 4586 office 314 330 9001 cell 314 427 8823 fax www.watchtower-security.com *?Protecting your community and those you value most.?* -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Oct 7 23:17:51 2013 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 7 Oct 2013 23:17:51 -0700 Subject: [Live-devel] Proxy Server REGISTER option In-Reply-To: References: <0CCE7104-D1A5-47D8-A9A0-6BFE090E9993@live555.com> Message-ID: <06F6F70B-67D9-492F-A8C6-5B6F906B0B15@live555.com> > Ok...that makes sense. Any idea how Proxy Server would handle the following url? > > rtsp://:/axis-media/media.amp?resolution=480x270&fps=3&compression=40&videokeyframeinterval=18 OK, I hadn't anticipated complicated back-end stream URLs like this. So I've gone ahead and changed the way that the proxy server creates stream names for proxied streams that have been created via "REGISTER". These now use the stream name "registeredProxyStream-N", where "N" is a counter, incremented for each new registered stream. The proxy server also now displays the URL for each new registered proxy, so you'll now know how to access it. 
This is done in a new version 2013.10.08 of the "LIVE555 Streaming Media" software, available now. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From bbischan at watchtower-security.com Tue Oct 8 06:30:48 2013 From: bbischan at watchtower-security.com (Bob Bischan) Date: Tue, 8 Oct 2013 08:30:48 -0500 Subject: [Live-devel] Proxy Server REGISTER option In-Reply-To: <06F6F70B-67D9-492F-A8C6-5B6F906B0B15@live555.com> References: <0CCE7104-D1A5-47D8-A9A0-6BFE090E9993@live555.com> <06F6F70B-67D9-492F-A8C6-5B6F906B0B15@live555.com> Message-ID: Ross, Thanks for your time and consideration with making the modifications. I have tested the new code and have a few observations to pass along: I see "Play this stream using the URL: rtsp://:8554/registeredProxyStream-1" when I send the REGISTER_REMOTE command. Proxy Server connects to the stream and output appears normal When I connect to the registered stream using VLC, I see "ProxyServerMediaSubsession["H264"]::createNewStreamSource(session id 3742543045)" with a request to PLAY. Next response is "RTSP/1.0 454 Session Not Found" Just to make sure that the stream works with Proxy Server I tried from the command line using the same rtsp:// that I was registering and was able to connect using VLC. Thanks, Bob On Tue, Oct 8, 2013 at 1:17 AM, Ross Finlayson wrote: > Ok...that makes sense. Any idea how Proxy Server would handle the > following url? > > > rtsp://:/axis-media/media.amp?resolution=480x270&fps=3&compression=40&videokeyframeinterval=18 > > > OK, I hadn't anticipated complicated back-end stream URLs like this. > > So I've gone ahead and changed the way that the proxy server creates > stream names for proxied streams that have been created via "REGISTER". > These now use the stream name "registeredProxyStream-N", where "N" is a > counter, incremented for each new registered stream. > > The proxy server also now displays the URL for each new registered proxy, > so you'll now know how to access it. > > This is done in a new version 2013.10.08 of the "LIVE555 Streaming Media" > software, available now. > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -- Bob Bischan Manager (Operations/Software Development) *WATCHTOWER SECURITY, INC.* 10760 TRENTON SAINT LOUIS, MISSOURI 63132 314 427 4586 office 314 330 9001 cell 314 427 8823 fax www.watchtower-security.com *?Protecting your community and those you value most.?* -------------- next part -------------- An HTML attachment was scrubbed... URL: From joao_dealmeida at hotmail.com Tue Oct 8 01:56:21 2013 From: joao_dealmeida at hotmail.com (Joao Almeida) Date: Tue, 8 Oct 2013 08:56:21 +0000 Subject: [Live-devel] Streaming H.264/SVC In-Reply-To: References: , Message-ID: openSVC share some SVC streams http://sourceforge.net/projects/opensvcdecoder/files/?source=navbar From: finlayson at live555.com Date: Mon, 7 Oct 2013 06:39:44 -0700 To: live-devel at ns.live555.com Subject: Re: [Live-devel] Streaming H.264/SVC i want to know if the Live555 Streaming Media libraries support the video stream H.264/SVC (Annex G extension of H.264/AVC). No, not at present. 
However, if you can point me at an example of a H.264/SVC Elementary Stream file (i.e., containing a sequence of H.264 and H.264/SVC NAL units), then I'll take a closer look at what would be required to support streaming this. (Are you interested in server support (i.e., streaming H.264/SVC data), client support (i.e., receiving H.264/SVC data), or both?) Ross Finlayson Live Networks, Inc. http://www.live555.com/ _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Oct 8 08:30:49 2013 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 8 Oct 2013 08:30:49 -0700 Subject: [Live-devel] Proxy Server REGISTER option In-Reply-To: References: <0CCE7104-D1A5-47D8-A9A0-6BFE090E9993@live555.com> <06F6F70B-67D9-492F-A8C6-5B6F906B0B15@live555.com> Message-ID: <922C62EC-EE65-480C-8AF7-DF654373D676@live555.com> > When I connect to the registered stream using VLC FYI, it's better (and easier) to use "openRTSP" (or "testRTSPClient") as your client when testing a server, because you'll also get to see the RTSP protocol exchange at the client end. > I see "ProxyServerMediaSubsession["H264"]::createNewStreamSource(session id 3742543045)" with a request to PLAY. > > Next response is "RTSP/1.0 454 Session Not Found" This may be a problem with the back-end server. Is it using our software, or someone else's? To find out what's going on, please do the following: 1/ Add #define DEBUG 1 to the start of "liveMedia/RTSPServer.cpp" 2/ Recompile liveMedia, and then recompile "live555ProxyServer" 3/ Rerun "live555ProxyServer", with the -V and -R options, as before 4/ Send the "REGISTER" (or "REGISTER_REMOTE") command 5/ Try to play the proxy stream from a RTSP client 6/ Post the complete diagnostic output from the proxy server (up to the 454 error) to the mailing list Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jkordani at lsa2.com Tue Oct 8 09:20:13 2013 From: jkordani at lsa2.com (Joshua Kordani) Date: Tue, 08 Oct 2013 12:20:13 -0400 Subject: [Live-devel] using live555 with in memory data Message-ID: <525430BD.6040304@lsa2.com> Greetings all, I wish to use live555 to deliver in memory nals created with x264. In my application, I spawn off the live555 event loop into a separate thread from the one responsible for doing the encoding. I am configuring my encoder to output nals that will fit inside the standard mtu. In order to pass data from the encoder thread to the live555 thread, I am anticipating implementing a series of buffers, with a control scheme to switch back and forth to allow for minimal blocking, ie, the live555 thread will always have a static buffer of known content length to read from, the encoder thread will always have a similar buffer to write to, and there will be a back-buffer with a complete buffer of data ready to go should there be a point where all data has been read but the writer is still writing. I have a couple questions. Given what I've read so far, I should subclass ServerMediaSubsession and implement createNewStreamSource, making use of ByteStreamMemorySource class in some way. Given that I'll be changing the buffer to be read from, does this mean that I will have to create a new instance of the custom sms every time there is a new buffer to read from? 
Or instead, will my custom sms need to handle the setup and teardown of ByteStreamMemorySources? Or else, how is it anticipated that an in memory location be used to pass data to the live555 event loop when the data is sourced from a different thread? Would it be easier to simply memmove the data to be read into the readbuffer instead of change the readbuffer's location? Also, I notice that the createNewStreamSource call returns a framed source object, of which H264FUAFragmenter seems to implement, and whose methods seem to suggest that it is intending to be used to feed nals to some upper layer. Given also that I am already creating nals small enough to be packed inside an individual rtp packet, and the FUAFragmenter class seems to have code in it to handle nals of varrying sizes, is it the correct class for use in implementing the createNewStreamSource as mentioned above? I've read through some examples, but what I'm supposed to make happen inside of a createNewStreamSource hasn't clicked with me yet. Thank you very much for your time. -- Joshua Kordani LSA Autonomy -------------- next part -------------- An HTML attachment was scrubbed... URL: From bbischan at watchtower-security.com Tue Oct 8 10:48:25 2013 From: bbischan at watchtower-security.com (Bob Bischan) Date: Tue, 8 Oct 2013 12:48:25 -0500 Subject: [Live-devel] Proxy Server REGISTER option In-Reply-To: <922C62EC-EE65-480C-8AF7-DF654373D676@live555.com> References: <0CCE7104-D1A5-47D8-A9A0-6BFE090E9993@live555.com> <06F6F70B-67D9-492F-A8C6-5B6F906B0B15@live555.com> <922C62EC-EE65-480C-8AF7-DF654373D676@live555.com> Message-ID: Attached is the diagnostic output. Looks like the client is sending a TEARDOWN... Thanks, Bob On Tue, Oct 8, 2013 at 10:30 AM, Ross Finlayson wrote: > When I connect to the registered stream using VLC > > > FYI, it's better (and easier) to use "openRTSP" (or "testRTSPClient") as > your client when testing a server, because you'll also get to see the RTSP > protocol exchange at the client end. > > I see "ProxyServerMediaSubsession["H264"]::createNewStreamSource(session > id 3742543045)" with a request to PLAY. > > > Next response is "RTSP/1.0 454 Session Not Found" > > > This may be a problem with the back-end server. Is it using our software, > or someone else's? > > To find out what's going on, please do the following: > 1/ Add > #define DEBUG 1 to the start of "liveMedia/RTSPServer.cpp" > 2/ Recompile liveMedia, and then recompile "live555ProxyServer" > 3/ Rerun "live555ProxyServer", with the -V and -R options, as before > 4/ Send the "REGISTER" (or "REGISTER_REMOTE") command > 5/ Try to play the proxy stream from a RTSP client > 6/ Post the complete diagnostic output from the proxy server (up to the > 454 error) to the mailing list > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... 
Name: debug.out Type: application/octet-stream Size: 21226 bytes Desc: not available URL: From finlayson at live555.com Tue Oct 8 12:11:09 2013 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 8 Oct 2013 12:11:09 -0700 Subject: [Live-devel] using live555 with in memory data In-Reply-To: <525430BD.6040304@lsa2.com> References: <525430BD.6040304@lsa2.com> Message-ID: > I wish to use live555 to deliver in memory nals created with x264. In my application, I spawn off the live555 event loop into a separate thread from the one responsible for doing the encoding. I am configuring my encoder to output nals that will fit inside the standard mtu. You don't have to do that. The LIVE555 code automatically takes care of fragmenting large NAL units into appropriate-sized RTP packets. Nonetheless, small NAL units are still a good idea, because they reduce the effect of network packet loss. It's always a good idea, for example, to break large 'I-frame' NAL units into multiple slices; they just don't need to be as small as a standard MTU (1500 bytes), because our software will take care of fragmenting them if they happen to be larger. > Given what I've read so far, I should subclass ServerMediaSubsession and implement createNewStreamSource Yes. > making use of ByteStreamMemorySource class in some way. The "ByteStreamMemoryBufferSource" (sic) class is used to implement a memory buffer that acts like a file - i.e., with bytes that are read from it sequentially, until it reaches its end. You could, in theory, use this to implement your H.264 streaming, feeding it into a "H264VideoStreamFramer" (i.e., in your "createNewStreamSource()" implementation). However, the fact that the buffer is a fixed size is a problem. If you haven't created all of your NAL units in advance, then that's a problem. You could instead just use an OS pipe as input, and read it using a regular "ByteStreamFileSource". However, because the size of the NAL units (created by you) are known in advance, it would be more efficient to have your own memory buffer - that contains just a single NAL unit at a time - and feed it into a "H264VideoStreamDiscreteFramer" (*not* a "H264VideoStreamFramer". (Note that each NAL unit that you feed into a "H264VideoStreamDiscreteFramer" must *not* begin with a 0x000001 'start code'.) > Given that I'll be changing the buffer to be read from, does this mean that I will have to create a new instance of the custom sms every time there is a new buffer to read from? Absolutely not! > Or instead, will my custom sms need to handle the setup and teardown of ByteStreamMemorySources? That's why I suggest not using "ByteStreamMemoryBufferSource" (see above). > Or else, how is it anticipated that an in memory location be used to pass data to the live555 event loop when the data is sourced from a different thread? Would it be easier to simply memmove the data to be read into the readbuffer instead of change the readbuffer's location? I suggest that you look at the "DeviceSource" code ("liveMedia/DeviceSource.cpp"), and use that as a model for how to implement your "FramedSource" subclass (an instance of which you'll create in your implementation of "createNewStreamSource()"). Note in particular the code (the function "signalNewFrameData()") at the end of the file. That is something that you could call from your non-LIVE555 thread to signal the availability of a new NAL unit. 
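For readers following this thread, here is a minimal sketch of the "DeviceSource"-style source described above. The class name "LiveNALSource" and the two shared-buffer helpers are hypothetical placeholders for your own double-buffering code; the FramedSource members, the event-trigger calls, and afterGetting() are the standard LIVE555 APIs used by "liveMedia/DeviceSource.cpp" itself, so treat this as a starting sketch rather than a definitive implementation:

#include "FramedSource.hh"
#include <GroupsockHelper.hh> // for gettimeofday()
#include <string.h>           // for memmove()

// "LiveNALSource" and the *SharedBuffer* helpers below are hypothetical names for
// your own code; everything else follows the pattern in liveMedia/DeviceSource.cpp.
class LiveNALSource: public FramedSource {
public:
  static LiveNALSource* createNew(UsageEnvironment& env) { return new LiveNALSource(env); }
  static EventTriggerId eventTriggerId;

  // Call this from the encoder thread after it has finished writing one NAL unit
  // (without a 0x000001 start code) into the shared buffer:
  static void signalNewNALUnit(UsageEnvironment& env, LiveNALSource* source) {
    env.taskScheduler().triggerEvent(eventTriggerId, source);
  }

protected:
  LiveNALSource(UsageEnvironment& env): FramedSource(env) {
    if (eventTriggerId == 0) {
      eventTriggerId = envir().taskScheduler().createEventTrigger(deliverFrame0);
    }
  }

private:
  virtual void doGetNextFrame() {
    // If a NAL unit is already waiting, deliver it now; otherwise wait for the
    // encoder thread to call signalNewNALUnit(), which triggers deliverFrame0().
    if (nalUnitIsAvailable()) deliverFrame();
  }

  static void deliverFrame0(void* clientData) {
    ((LiveNALSource*)clientData)->deliverFrame();
  }

  void deliverFrame() {
    if (!isCurrentlyAwaitingData()) return; // the downstream object hasn't asked for data yet

    u_int8_t* nal = NULL; unsigned nalSize = 0;
    getNALUnitFromSharedBuffer(nal, nalSize);
    if (nal == NULL || nalSize == 0) return;

    if (nalSize > fMaxSize) { // shouldn't happen with small slices, but stay safe
      fFrameSize = fMaxSize;
      fNumTruncatedBytes = nalSize - fMaxSize;
    } else {
      fFrameSize = nalSize;
    }
    gettimeofday(&fPresentationTime, NULL); // or carry the encoder's capture time through
    // fDurationInMicroseconds stays 0 for a live source
    memmove(fTo, nal, fFrameSize);

    FramedSource::afterGetting(this); // hand the NAL unit to the downstream object
  }

  // Placeholders for your double-buffer scheme (replace with your real code):
  Boolean nalUnitIsAvailable() { return False; }
  void getNALUnitFromSharedBuffer(u_int8_t*& nal, unsigned& nalSize) { nal = NULL; nalSize = 0; }
};

EventTriggerId LiveNALSource::eventTriggerId = 0;

The key property of this pattern is that only triggerEvent() is called from the encoder thread; all actual delivery still happens inside the LIVE555 event loop.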
> Also, I notice that the createNewStreamSource call returns a framed source object, of which H264FUAFragmenter seems to implement, and whose methods seem to suggest that it is intending to be used to feed nals to some upper layer. Given also that I am already creating nals small enough to be packed inside an individual rtp packet, and the FUAFragmenter class seems to have code in it to handle nals of varrying sizes, is it the correct class for use in implementing the createNewStreamSource as mentioned above? Yes, the object created by your implementation of the "createNewRTPSink()" virtual function should be a "H264VideoRTPSink". As I noted above, it automatically takes care of fragmenting large NAL units, if needed. > I've read through some examples, but what I'm supposed to make happen inside of a createNewStreamSource hasn't clicked with me yet. I suggest looking at the "H264VideoFileServerMediaSubsession" code as a model for how to implement your "ServerMediaSubsession" subclass. The big difference will be the implementation of "createNewStreamSource()". Your implementation should create an instance of your own "FramedSource" subclass (instead of "ByteStreamFileSource"), and feed it into a "H264VideoStreamDiscreteFramer" (instead of a "H264VideoStreamFramer"). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Oct 8 12:25:34 2013 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 8 Oct 2013 12:25:34 -0700 Subject: [Live-devel] Proxy Server REGISTER option In-Reply-To: References: <0CCE7104-D1A5-47D8-A9A0-6BFE090E9993@live555.com> <06F6F70B-67D9-492F-A8C6-5B6F906B0B15@live555.com> <922C62EC-EE65-480C-8AF7-DF654373D676@live555.com> Message-ID: On Oct 8, 2013, at 10:48 AM, Bob Bischan wrote: > Attached is the diagnostic output. > > Looks like the client is sending a TEARDOWN... Yes, your client (VLC) is sending a TEARDOWN. I don't know why; did you click the 'stop' button? Or perhaps it sent a TEARDOWN because it didn't receive any RTP data, for some reason. I notice that you're using an old version of VLC - one that uses a very old version (2011.12.23) of our client library. You should upgrade it. Or else (because this is not a VLC mailing list :-) use one of our own RTSP client applications instead - e.g., "openRTSP" (perhaps with the "-t" option if you want RTP-over-TCP). See http://www.live555.com/openRTSP/ Apart from that, the proxy server appears to be working OK. (I didn't see any 454 errors.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From bbischan at watchtower-security.com Tue Oct 8 14:00:44 2013 From: bbischan at watchtower-security.com (Bob Bischan) Date: Tue, 8 Oct 2013 16:00:44 -0500 Subject: [Live-devel] Proxy Server REGISTER option In-Reply-To: References: <0CCE7104-D1A5-47D8-A9A0-6BFE090E9993@live555.com> <06F6F70B-67D9-492F-A8C6-5B6F906B0B15@live555.com> <922C62EC-EE65-480C-8AF7-DF654373D676@live555.com> Message-ID: Many thanks for the on-going dialog! My evaluation of the Proxy Server / Live Media is probably going to lead to total dependence :-) openRTSP and the entire live555 library is very much on our radar screen. I'm still becoming familiar with the various pieces and parts. I did spend some time looking at openRTSP and the sample test programs, however, I did not see a way to send a REGISTER command to the Proxy Server. 
I'm not a c++ programmer, so most the of code itself is beyond my reach. At this point i'm just prototyping a few sample use cases and a high-level over-view for further discussion within our organization. In your response you mentioned the -t option. I have been using this option during all parts of this thread of discussion. I have found that the -t option is necessary to make connections across our routers. Before inquiring about the REGISTER option I had been successful with using the Proxy Server from the command line and everything worked as expected...using the same client (VLC) and the same stream. At one point I had 50 proxy streams running and was making multiple VLC connections to stress test and was very pleased with performance. Using current code I just now tried this test by running from command line ./live555ProxyServer -t -v "rtsp://:/axis-media/media.amp?resolution=480x270&fps=3&compression=40&videokeyframeinterval=18". Connect from VLC works perfectly!! Is there any chance that the -t option is not being processed for streams that are being registered? Is there any debugging I can due to confirm this? Thanks again for your time and patience. Bob On Tue, Oct 8, 2013 at 2:25 PM, Ross Finlayson wrote: > > On Oct 8, 2013, at 10:48 AM, Bob Bischan > wrote: > > Attached is the diagnostic output. > > Looks like the client is sending a TEARDOWN... > > > Yes, your client (VLC) is sending a TEARDOWN. I don't know why; did you > click the 'stop' button? Or perhaps it sent a TEARDOWN because it didn't > receive any RTP data, for some reason. I notice that you're using an old > version of VLC - one that uses a very old version (2011.12.23) of our > client library. You should upgrade it. Or else (because this is not a VLC > mailing list :-) use one of our own RTSP client applications instead - > e.g., "openRTSP" (perhaps with the "-t" option if you want RTP-over-TCP). > See http://www.live555.com/openRTSP/ > > Apart from that, the proxy server appears to be working OK. (I didn't see > any 454 errors.) > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -- Bob Bischan Manager (Operations/Software Development) *WATCHTOWER SECURITY, INC.* 10760 TRENTON SAINT LOUIS, MISSOURI 63132 314 427 4586 office 314 330 9001 cell 314 427 8823 fax www.watchtower-security.com *?Protecting your community and those you value most.?* -------------- next part -------------- An HTML attachment was scrubbed... URL: From jkordani at lsa2.com Tue Oct 8 15:33:49 2013 From: jkordani at lsa2.com (Joshua Kordani) Date: Tue, 08 Oct 2013 18:33:49 -0400 Subject: [Live-devel] using live555 with in memory data In-Reply-To: References: <525430BD.6040304@lsa2.com> Message-ID: <5254884D.9000904@lsa2.com> Ross, Thank you for your detailed response! I have responded inline. On 10/8/13 3:11 PM, Ross Finlayson wrote: > You don't have to do that. The LIVE555 code automatically takes care > of fragmenting large NAL units into appropriate-sized RTP packets. > > Nonetheless, small NAL units are still a good idea, because they > reduce the effect of network packet loss. It's always a good idea, > for example, to break large 'I-frame' NAL units into multiple slices; > they just don't need to be as small as a standard MTU (1500 bytes), > because our software will take care of fragmenting them if they happen > to be larger. 
> > As I was originally reading warnings from your framework in regards to my passing of oversized NALs (or at least, of large NALs that were encompassing an entire frame), and in anticipation that the software will be used in packet lossy environments, I figured that I'd try to keep their size to a minimum. Being new to both of these domains makes it hard to ask the right questions, so I guess I will try this, are there large nals that can be split by your software, for whom losing a slice does not result in the loss of the whole frame, in addition to other large nals for which the opposite is true? I've just naively reduced the size of all nals because I didn't know any better. > The "ByteStreamMemoryBufferSource" (sic) class is used to implement a > memory buffer that acts like a file - i.e., with bytes that are read > from it sequentially, until it reaches its end. You could, in theory, > use this to implement your H.264 streaming, feeding it into a > "H264VideoStreamFramer" (i.e., in your "createNewStreamSource()" > implementation). However, the fact that the buffer is a fixed size is > a problem. If you haven't created all of your NAL units in advance, > then that's a problem. You could instead just use an OS pipe as > input, and read it using a regular "ByteStreamFileSource". However, > because the size of the NAL units (created by you) are known in > advance, it would be more efficient to have your own memory buffer - > that contains just a single NAL unit at a time - and feed it into a > "H264VideoStreamDiscreteFramer" (*not* a "H264VideoStreamFramer". > (Note that each NAL unit that you feed into > a "H264VideoStreamDiscreteFramer" must *not* begin with a 0x000001 > 'start code'.) So myFramedSource will be responsible for passing individual nals up to the DiscreteFramer, via deliverFrame, but currently, after my encode call, I have quite a few nals that I need to send over. I either need to call the triggerEvent function successively after loading each nal into the input memory location (which sounds like the wrong thing to do), or.. I can see how in DeviceSource how we return without writing in the event that there is nothing to ship out, but I don't see where we continue to get back into deliverFrame if there is more data to write. My encoder thread either... waits till all nals have been consumed and then continues? Or leaves the nals somewhere where live555 can continue to consume them at its leisure? Also, I'm not expecting the nals that I pass into live555 to represent full frames, does this change the suitability of the DiscreteFramer for this task? Again, thank you very much for your responses, and I appreciate your time. -- Joshua Kordani LSA Autonomy -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Oct 8 16:13:31 2013 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 8 Oct 2013 16:13:31 -0700 Subject: [Live-devel] using live555 with in memory data In-Reply-To: <5254884D.9000904@lsa2.com> References: <525430BD.6040304@lsa2.com> <5254884D.9000904@lsa2.com> Message-ID: <398B4209-13E4-4CBC-B2E4-87DFC577EFD9@live555.com> >> Nonetheless, small NAL units are still a good idea, because they reduce the effect of network packet loss. It's always a good idea, for example, to break large 'I-frame' NAL units into multiple slices; they just don't need to be as small as a standard MTU (1500 bytes), because our software will take care of fragmenting them if they happen to be larger. 
>> >> > As I was originally reading warnings from your framework in regards to my passing of oversized NALs (or at least, of large NALs that were encompassing an entire frame), and in anticipation that the software will be used in packet lossy environments, I figured that I'd try to keep their size to a minimum. Being new to both of these domains makes it hard to ask the right questions, so I guess I will try this, are there large nals that can be split by your software, for whom losing a slice does not result in the loss of the whole frame No, assuming that by "loss of the whole frame", you meant to say "loss of the whole NAL unit (slice)". Whenever a NAL unit exceeds the network MTU, it will be split (by our software, in accordance with the IETF RTP packet format for H.264) into multiple outgoing RTP packets. However, if any one of these RTP packets is lost, then the whole NAL unit will be lost. OTOH, presumably there is some cost/overhead involved in generating very small NAL units. So, your decision will be to choose NAL unit (slice) sizes that are reasonably small, but not so small as to involve unreasonable overhead. > So myFramedSource will be responsible for passing individual nals up to the DiscreteFramer, via deliverFrame Yes. > , but currently, after my encode call, I have quite a few nals that I need to send over. I either need to call the triggerEvent function successively after loading each nal into the input memory location (which sounds like the wrong thing to do) No, that's exactly the right thing to do. The call to "triggerEvent()" simply informs the LIVE555 thread's event loop that there's a new event to be handled. If course, you need to be aware that the LIVE555 thread will be executing the "deliverFrame()" function soon afterwards (to grab and deliver the data), so you need to make sure that any data structures that are shared by this function and your encoding thread (i.e., the thread that calls "triggerEvent()") are locked appropriately. > Also, I'm not expecting the nals that I pass into live555 to represent full frames, does this change the suitability of the DiscreteFramer for this task? No; it deals only with NAL units. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jkordani at lsa2.com Tue Oct 8 16:44:51 2013 From: jkordani at lsa2.com (Joshua Kordani) Date: Tue, 08 Oct 2013 19:44:51 -0400 Subject: [Live-devel] using live555 with in memory data In-Reply-To: <398B4209-13E4-4CBC-B2E4-87DFC577EFD9@live555.com> References: <525430BD.6040304@lsa2.com> <5254884D.9000904@lsa2.com> <398B4209-13E4-4CBC-B2E4-87DFC577EFD9@live555.com> Message-ID: <525498F3.5000506@lsa2.com> That all sounds clear now, thank you! -- Joshua Kordani LSA Autonomy -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Oct 8 17:09:10 2013 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 8 Oct 2013 17:09:10 -0700 Subject: [Live-devel] Proxy Server REGISTER option In-Reply-To: References: <0CCE7104-D1A5-47D8-A9A0-6BFE090E9993@live555.com> <06F6F70B-67D9-492F-A8C6-5B6F906B0B15@live555.com> <922C62EC-EE65-480C-8AF7-DF654373D676@live555.com> Message-ID: <1D76D634-D536-4F08-A6FF-DA4DFBB9652F@live555.com> > I did not see a way to send a REGISTER command to the Proxy Server. We have code that allows a RTSP server to register one of its *own* streams with a proxy server (or with a client application). 
We don't have code that allows some 3rd-party application to send a "REGISTER_REMOTE" (not "REGISTER") command to a proxy server (or client application). We might provide this at some point, but it's not a high priority. The main motivation for this functionality was to allow a server - located behind a NAT in a non-public portion of the Internet (i.e., without a publically accessible URL) - to advertise its stream to a (publically-accessible) proxy server (and thereby to the public Internet). > Is there any chance that the -t option is not being processed for streams that are being registered? Yes, you're correct - this was a bug/deficiency in the code. I've just installed a new version (2013.10.09) of the "LIVE555 Streaming Media" code that should fix this. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From bbischan at watchtower-security.com Wed Oct 9 06:10:19 2013 From: bbischan at watchtower-security.com (Bob Bischan) Date: Wed, 9 Oct 2013 08:10:19 -0500 Subject: [Live-devel] Proxy Server REGISTER option In-Reply-To: <1D76D634-D536-4F08-A6FF-DA4DFBB9652F@live555.com> References: <0CCE7104-D1A5-47D8-A9A0-6BFE090E9993@live555.com> <06F6F70B-67D9-492F-A8C6-5B6F906B0B15@live555.com> <922C62EC-EE65-480C-8AF7-DF654373D676@live555.com> <1D76D634-D536-4F08-A6FF-DA4DFBB9652F@live555.com> Message-ID: Ross, Works perfectly! Just a few thoughts for your consideration: 1. Do you think the naming convention (registeredProxyStream-N) for REGISTERED streams will adequately support most users / use cases? This has been convenient in my particular case and allows me to proceed with my current efforts, however, I'm not sure it is the best approach. Would having an option that allows the stream name to be specified in the REGISTER command make sense? Default behavior would then fall back to Proxy Server parsing the URL into prefix/suffix as originally implemented. 2. A small item to note. During the course of my testing (using several different client implementations) I found a problem when trying to REGISTER_REMOTE since it is not explicitly list in the OPTIONS list from Proxy Server. Some client's (3rd party applications) may fail with "Method Not Allowed". For me it is not an issue...I simply use a client implementation that does not retrieve the OPTIONS list before sending the REGISTER_REMOTE command. Thanks again for your time and efforts. I'm going to close this thread for now...will open new ones as I move forward with testing / evaluation. Bob On Tue, Oct 8, 2013 at 7:09 PM, Ross Finlayson wrote: > I did not see a way to send a REGISTER command to the Proxy Server. > > > We have code that allows a RTSP server to register one of its *own* > streams with a proxy server (or with a client application). We don't have > code that allows some 3rd-party application to send a "REGISTER_REMOTE" > (not "REGISTER") command to a proxy server (or client application). We > might provide this at some point, but it's not a high priority. > > The main motivation for this functionality was to allow a server - located > behind a NAT in a non-public portion of the Internet (i.e., without a > publically accessible URL) - to advertise its stream to a > (publically-accessible) proxy server (and thereby to the public Internet). > > > Is there any chance that the -t option is not being processed for streams > that are being registered? > > > Yes, you're correct - this was a bug/deficiency in the code. 
> > I've just installed a new version (2013.10.09) of the "LIVE555 Streaming > Media" code that should fix this. > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Oct 9 06:58:39 2013 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 9 Oct 2013 06:58:39 -0700 Subject: [Live-devel] Proxy Server REGISTER option In-Reply-To: References: <0CCE7104-D1A5-47D8-A9A0-6BFE090E9993@live555.com> <06F6F70B-67D9-492F-A8C6-5B6F906B0B15@live555.com> <922C62EC-EE65-480C-8AF7-DF654373D676@live555.com> <1D76D634-D536-4F08-A6FF-DA4DFBB9652F@live555.com> Message-ID: > 1. Do you think the naming convention (registeredProxyStream-N) for REGISTERED streams will adequately support most users / use cases? This has been convenient in my particular case and allows me to proceed with my current efforts, however, I'm not sure it is the best approach. Would having an option that allows the stream name to be specified in the REGISTER command make sense? Default behavior would then fall back to Proxy Server parsing the URL into prefix/suffix as originally implemented. Having an optional parameter in the "REGISTER" or "REGISTER_REMOTE" command is an interesting idea. I might include this when I write up my forthcoming IETF Internet-Draft that specifies these commands. Having the default behavior be to use the back-end URL's suffix, however, would be problematic, as we discovered. (Another issue to consider is: What would the proxy server do if it received two separate "REGISTER" (or "REGISTER_REMOTE") requests for different back-end URLs with the same URL suffix?) Note that - if desired - one could subclass "RTSPServerWithREGISTERProxying" and easlly reimplement the virtual function "implementCmd_REGISTER()" however you wish. > 2. A small item to note. During the course of my testing (using several different client implementations) I found a problem when trying to REGISTER_REMOTE since it is not explicitly list in the OPTIONS list from Proxy Server. Some client's (3rd party applications) may fail with "Method Not Allowed". For me it is not an issue...I simply use a client implementation that does not retrieve the OPTIONS list before sending the REGISTER_REMOTE command. That was an oversight in our implementation of "OPTIONS"; it will be fixed in the next release of the software. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From arn at bestmx.ru Thu Oct 10 01:57:18 2013 From: arn at bestmx.ru (Anton Yabchinskiy) Date: Thu, 10 Oct 2013 12:57:18 +0400 Subject: [Live-devel] Crash in parseRTSPRequestString() Message-ID: <20131010125718.676df3ea@doriath> Hello, First of all, I'm aware that this message is kind of sparse in detail, and there is no possibility to gather more information now (the customers problem is solved and he's lost). But maybe it'll lead you to some clues. The RTSP server is an IP camera, connected with RTP/AVP/TCP. 
Looks like the actual crash happened on this line of RTSPCommon.cpp, in parseRTSPRequestString(): while (k2 <= k1 - 1) resultURLPreSuffix[n++] = reqStr[k2++]; The quick fix was to remove the call to handleIncomingRequest() in RTSPClient.cpp: if (!parseResponseCode(lineStart, responseCode, responseStr)) { // This does not appear to be a RTSP response; perhaps it's a RTSP request instead? handleIncomingRequest(); break; // we're done with this data } That's all that's known. I'll provide more info if we'll encounter it again, or if there will be access to the camera. If it's of no help at all, please ignore this message. Regards. From finlayson at live555.com Thu Oct 10 15:11:08 2013 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 10 Oct 2013 15:11:08 -0700 Subject: [Live-devel] Crash in parseRTSPRequestString() In-Reply-To: <20131010125718.676df3ea@doriath> References: <20131010125718.676df3ea@doriath> Message-ID: <72E99C9E-1E1B-4B5E-B0E2-47A4020D2E85@live555.com> > That's all that's known. I'll provide more info if we'll encounter it > again, or if there will be access to the camera. I suggest setting the "verbosityLevel" in "RTSPClient::createNew()" to 1, so you can see the RTSP protocol exchange (including, I hope, the data that's causing your crash). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From kingaceck at 163.com Thu Oct 10 02:08:35 2013 From: kingaceck at 163.com (kingaceck) Date: Thu, 10 Oct 2013 17:08:35 +0800 Subject: [Live-devel] live555 can't transport UDP packet through NAT Message-ID: <201310101708342977431@163.com> HI I put the live555MediaServer in a CENTOS computer that ip is 129.1.7.201. Then put testRTSPClient also in a CENTOS computer that it is behind a NAT(TP-LINK wireless router,a NAPT device) and its ip is 192.168.1.1. I run ./testRTSPClient rtsp://129.1.7.201/test.mpg command and the testRTSPClient can't receive UDP packet. live555 Server S (129.1.7.201) | | | NAPT A (WAN IP:129.1.7.100 LAN IP:192.168.1.1) | | | testRTSPClient A (192.168.0.20:4000) I have found out the reasons of this problem: live555 Server S (129.1.7.201) | ^ Session 1 ^ | | 129.1.7.201:6000 | | v 129.1.7.100:10060 v | | NAPT A (WAN IP:129.1.7.100 LAN IP:192.168.1.1) ^ Session 1 ^ | | 129.1.7.201:6000 | | v 192.168.0.20:4000 v | | testRTSPClient A (192.168.0.20:4000) Cient will send client RTP/RTCP port(such as 4000-4001) messge in the SETUP request to the server. But When testRTSPClient send a UDP packet to the server using RTP/RTCP port(4000-4001) after receiving SETUP response the NAT will rewrite the ip and these RTP/RTCP ports to its WAN ip and ports(such as rewriting to 10060-10061).Then server will send UDP packet to these port(4000-4001) after receiving PLAY command.But the NAT don't know the port(4000-4001) ,because these port have already been rewrited to other port(10060-10061).So the testRTSPClient can't receive UDP packet. Is it right that the server should send UDP packet to the port(10060-10061) what have been rewrited to after receiving PLAY command? 2013-10-10 kingaceck -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From srimugunth at csa.iisc.ernet.in Thu Oct 10 23:36:24 2013 From: srimugunth at csa.iisc.ernet.in (srimugunth at csa.iisc.ernet.in) Date: Fri, 11 Oct 2013 12:06:24 +0530 Subject: [Live-devel] testRTSPclient stops after 2 mins Message-ID: <783915c3eb8b03c815faa35eb4cd6904.squirrel@clmail.csa.iisc.ernet.in> Hi all, I downloaded Live555 latest code from http://www.live555.com/liveMedia/public/ I compiled with ./genMakefiles linux configuration. I tried the testPRog testRTSPclient to receive stream from IP camera. I used the following command: ./testRTSPClient rtsp://admin:4321 at 107.108.205.254/profile2/media.smp Within 2 to 3 mins the testRTSPclient stops with the following error. "Stream "rtsp://107.108.205.230:554/profile2/media.smp/"; video/H264: Received 4239 bytes. Presentation time: 1381472674.733871! Stream "rtsp://107.108.205.230:554/profile2/media.smp/"; video/H264: Received 124 bytes. Presentation time: 1381472674.733871! Stream "rtsp://107.108.205.230:554/profile2/media.smp/"; video/H264: Received 4210 bytes. Presentation time: 1381472674.763904! Stream "rtsp://107.108.205.230:554/profile2/media.smp/"; video/H264: Received 230 bytes. Presentation time: 1381472674.763904! Received 149 new bytes of response data. Received a complete (unknown) response: RTSP/1.0 408 Request Time-out CSeq: 6 Date: Fri Oct 11 11:55:56 2013 GMT Expires: Fri Oct 11 11:55:56 2013 GMT Cache-Control: must-revalidate" Isn't the testRTSPclient supposed to run infinitely? Doing ps ax shows the testRTSPclient process to be active " 13666 pts/4 S+ 0:00 ./testRTSPClient rtsp://admin:4321 at 107.108.205.254/profile2/media.smp" Am i missing something? Thanks in advance for replying.
-mugunthan -- This message has been scanned for viruses and dangerous content by MailScanner, and is believed to be clean. From finlayson at live555.com Thu Oct 10 23:49:31 2013 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 10 Oct 2013 23:49:31 -0700 Subject: [Live-devel] testRTSPclient stops after 2 mins In-Reply-To: <783915c3eb8b03c815faa35eb4cd6904.squirrel@clmail.csa.iisc.ernet.in> References: <783915c3eb8b03c815faa35eb4cd6904.squirrel@clmail.csa.iisc.ernet.in> Message-ID: <6550C5C8-9AE4-4807-89EE-FC2A45BEEC50@live555.com> > Isn't the testRTSPclient supposed to run infinitely? The problem is not the RTSP client; the problem is the RTSP server (i.e., the IP camera). For some reason it is sending back the "408" error response, and closing the connection. To see more clearly what is happening, please run "openRTSP" (see http://www.live555.com/openRTSP/) rather than "testRTSPClient", and send us the diagnostic output that it prints. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Oct 10 23:54:52 2013 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 10 Oct 2013 23:54:52 -0700 Subject: [Live-devel] testRTSPclient stops after 2 mins In-Reply-To: <783915c3eb8b03c815faa35eb4cd6904.squirrel@clmail.csa.iisc.ernet.in> References: <783915c3eb8b03c815faa35eb4cd6904.squirrel@clmail.csa.iisc.ernet.in> Message-ID: <6D894E2F-9CE5-4F1E-9DE0-C7D0D726C5F1@live555.com> > Stream "rtsp://107.108.205.230:554/profile2/media.smp/"; > video/H264: Received 230 bytes. Presentation time: 1381472674.763904! One more thing: The "!" at the end of the presentation time indicates that it has not yet been synchronized via RTCP - i.e., the client has not yet received a RTCP "SR" packet. This often happens at the beginning of a stream, but if it's still happening after 2 minutes, then that suggests that the server (IP camera) is not sending RTCP "SR" packets, and is therefore non-standards-compliant. You cannot expect our software to run with non-standards-compliant servers. You should replace (or upgrade) your IP camera. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Oct 10 23:56:34 2013 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 10 Oct 2013 23:56:34 -0700 Subject: [Live-devel] live555 can't transport UDP packet through NAT In-Reply-To: <201310101708342977431@163.com> References: <201310101708342977431@163.com> Message-ID: Yes, you've discovered that the RTSP protocol often does not work across NAT. To overcome this, either get rid of your NAT box, or else have your client request RTP/RTCP-over-TCP. In the "testRTSPClient" demo application, you can do this by changing line 229 of "testProgs/testRTSPClient.cpp" from #define REQUEST_STREAMING_OVER_TCP False to #define REQUEST_STREAMING_OVER_TCP True Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From krishnaks at iwavesystems.com Fri Oct 11 00:10:28 2013 From: krishnaks at iwavesystems.com (Krishna) Date: Fri, 11 Oct 2013 12:40:28 +0530 Subject: [Live-devel] FrameSource:Getnextframe error while streaming PCM frames Message-ID: Hi Ross, I have problems streaming live PCM audio. The audio comes directly from a microphone (16-bit LE); the sampling frequency is 8 kHz, mono (1 channel).
I receive a buffer in the thread and use event trigger to signal my live555 thread. I've created class based on DeviceSource that inherit from AudioInputDevice and delivers the Frame on trigger. I am using uLawFromPCMAudioSource to convert to 8-bit u-law audio I am getting following error if I am giving audio format as WA_PCM: FramedSource ::getNextFrame():attempting to read more than once at the same time. One thing I observed here is FramedSource::getNextFrame is getting called twice at a time( uLawFromPCMAudioSource is calling it again) If I change audio format to WA_PCMU, I am able to stream without any error ( As FramedSource::getNextFrame is getting called once at a time), and VLC also able to play with some noise. Where I am going wrong ? Thanks in advance Regards, Krishna -------------- next part -------------- An HTML attachment was scrubbed... URL: From srimugunth at csa.iisc.ernet.in Fri Oct 11 00:33:52 2013 From: srimugunth at csa.iisc.ernet.in (srimugunth at csa.iisc.ernet.in) Date: Fri, 11 Oct 2013 13:03:52 +0530 Subject: [Live-devel] testRTSPclient stops after 2 mins In-Reply-To: <6D894E2F-9CE5-4F1E-9DE0-C7D0D726C5F1@live555.com> References: <783915c3eb8b03c815faa35eb4cd6904.squirrel@clmail.csa.iisc.ernet.in> <6D894E2F-9CE5-4F1E-9DE0-C7D0D726C5F1@live555.com> Message-ID: >> Stream "rtsp://107.108.205.230:554/profile2/media.smp/"; >> video/H264: Received 230 bytes. Presentation time: 1381472674.763904! > > One more thing: The "!" at the end of the presentation time indicates that > it has not yet been synchronized via RTCP - i.e., the client has not yet > received a RTCP "SR" packet. This often happens at the beginning of a > stream, but if it's still happening after 2 minutes, than that suggests > that the server (IP camera) is not sending RTCP "SR" packets, and is > therefore non standards-compliant. > > You cannot expect our software to run with non-standards-compliant > servers. You should replace (or upgrade) your IP camera. I tried with another IP camera and it had the same output. While i understand that you will not be interested in supporting non-standard IP cameras, is there any workaround in Live555 which i can change to get continued output for atleast 15mins. Will appreciate your inputs. And, the console output with openRTSP is as follows: $ ./openRTSP rtsp://admin:4321 at 107.108.205.230/profile2/media.smp Opening connection to 107.108.205.230, port 554... ...remote connection opened Sending request: OPTIONS rtsp://admin:4321 at 107.108.205.230/profile2/media.smp RTSP/1.0 CSeq: 2 User-Agent: ./openRTSP (LIVE555 Streaming Media v2013.10.09) Received 228 new bytes of response data. Received a complete OPTIONS response: RTSP/1.0 401 Unauthorized CSeq: 2 Date: Fri Oct 11 12:46:54 2013 GMT Expires: Fri Oct 11 12:46:54 2013 GMT Cache-Control: must-revalidate WWW-Authenticate: Digest realm="iPOLiS", nonce="0000000000000000000000000594D5CE" Resending... Sending request: OPTIONS rtsp://admin:4321 at 107.108.205.230/profile2/media.smp RTSP/1.0 CSeq: 3 Authorization: Digest username="admin", realm="iPOLiS", nonce="0000000000000000000000000594D5CE", uri="rtsp://admin:4321 at 107.108.205.230/profile2/media.smp", response="46d86f1c4f5311b163fe33ddc2dbd613" User-Agent: ./openRTSP (LIVE555 Streaming Media v2013.10.09) Received 222 new bytes of response data. 
Received a complete OPTIONS response: RTSP/1.0 200 OK CSeq: 3 Date: Fri Oct 11 12:46:55 2013 GMT Expires: Fri Oct 11 12:46:55 2013 GMT Cache-Control: must-revalidate Public: DESCRIBE, SETUP, TEARDOWN, PLAY, PAUSE, OPTIONS, GET_PARAMETER, SET_PARAMETER Sending request: DESCRIBE rtsp://admin:4321 at 107.108.205.230/profile2/media.smp RTSP/1.0 CSeq: 4 Authorization: Digest username="admin", realm="iPOLiS", nonce="0000000000000000000000000594D5CE", uri="rtsp://admin:4321 at 107.108.205.230/profile2/media.smp", response="df54b418781a7249ee9f544745aa8d82" User-Agent: ./openRTSP (LIVE555 Streaming Media v2013.10.09) Accept: application/sdp Received 1024 new bytes of response data. Received a complete DESCRIBE response: RTSP/1.0 200 OK CSeq: 4 Date: Fri Oct 11 12:46:55 2013 GMT Expires: Fri Oct 11 12:46:55 2013 GMT Content-Base: rtsp://107.108.205.230:554/profile2/media.smp/ Content-Type: application/sdp Content-Length: 712 x-Accept-Retransmit: our-retransmit x-Accept-Dynamic-Rate: 1 Cache-Control: must-revalidate v=0 o=- 0 0 IN IP4 107.108.212.53 s=Media Presentation i=samsung c=IN IP4 0.0.0.0 b=AS:384128 t=0 0 a=control:rtsp://107.108.205.230:554/profile2/media.smp a=range:npt=now- m=video 40080 RTP/AVP 98 b=AS:384000 a=rtpmap:98 H264/90000 a=control:rtsp://107.108.205.230:554/profile2/media.smp/trackID=v a=cliprect:0,0,600,800 a=framesize:98 800-600 a=framerate:30.0 a=fmtp:98 packetization-mode=1;profile-level-id=640028;sprop-parameter-sets=Z2QAKK2EBUViuKxUdCAqKxXFYqOhAVFYrisVHQgKisVxWKjoQFRWK4rFR0ICorFcVio6ECSFITk8nyfk/k/J8nm5s00IEkKQnJ5Pk/J/J+T5PNzZprQGQJvyoA==,aM48sA== m=audio 40082 RTP/AVP 0 b=AS:64 a=rtpmap:0 PCMU/8000 a=control:rtsp://107.108.205.230:554/profile2/media.smp/trackID=a Opened URL "rtsp://admin:4321 at 107.108.205.230/profile2/media.smp", returning a SDP description: v=0 o=- 0 0 IN IP4 107.108.212.53 s=Media Presentation i=samsung c=IN IP4 0.0.0.0 b=AS:384128 t=0 0 a=control:rtsp://107.108.205.230:554/profile2/media.smp a=range:npt=now- m=video 40080 RTP/AVP 98 b=AS:384000 a=rtpmap:98 H264/90000 a=control:rtsp://107.108.205.230:554/profile2/media.smp/trackID=v a=cliprect:0,0,600,800 a=framesize:98 800-600 a=framerate:30.0 a=fmtp:98 packetization-mode=1;profile-level-id=640028;sprop-parameter-sets=Z2QAKK2EBUViuKxUdCAqKxXFYqOhAVFYrisVHQgKisVxWKjoQFRWK4rFR0ICorFcVio6ECSFITk8nyfk/k/J8nm5s00IEkKQnJ5Pk/J/J+T5PNzZprQGQJvyoA==,aM48sA== m=audio 40082 RTP/AVP 0 b=AS:64 a=rtpmap:0 PCMU/8000 a=control:rtsp://107.108.205.230:554/profile2/media.smp/trackID=a Created receiver for "video/H264" subsession (client ports 40080-40081) Created receiver for "audio/PCMU" subsession (client ports 40082-40083) Sending request: SETUP rtsp://107.108.205.230:554/profile2/media.smp/trackID=v RTSP/1.0 CSeq: 5 Authorization: Digest username="admin", realm="iPOLiS", nonce="0000000000000000000000000594D5CE", uri="rtsp://107.108.205.230:554/profile2/media.smp/", response="43aaa7995d688cfb3c7e6ebe69574966" User-Agent: ./openRTSP (LIVE555 Streaming Media v2013.10.09) Transport: RTP/AVP;unicast;client_port=40080-40081 Received 239 new bytes of response data. 
Received a complete SETUP response: RTSP/1.0 200 OK CSeq: 5 Date: Fri Oct 11 12:46:55 2013 GMT Expires: Fri Oct 11 12:46:55 2013 GMT Session: 13;timeout=60 Cache-Control: must-revalidate Transport: RTP/AVP/UDP;unicast;client_port=40080-40081;server_port=40080-40081 Setup "video/H264" subsession (client ports 40080-40081) Sending request: SETUP rtsp://107.108.205.230:554/profile2/media.smp/trackID=a RTSP/1.0 CSeq: 6 Authorization: Digest username="admin", realm="iPOLiS", nonce="0000000000000000000000000594D5CE", uri="rtsp://107.108.205.230:554/profile2/media.smp/", response="43aaa7995d688cfb3c7e6ebe69574966" User-Agent: ./openRTSP (LIVE555 Streaming Media v2013.10.09) Transport: RTP/AVP;unicast;client_port=40082-40083 Session: 13 Received 239 new bytes of response data. Received a complete SETUP response: RTSP/1.0 200 OK CSeq: 6 Date: Fri Oct 11 12:46:55 2013 GMT Expires: Fri Oct 11 12:46:55 2013 GMT Session: 13;timeout=60 Cache-Control: must-revalidate Transport: RTP/AVP/UDP;unicast;client_port=40082-40083;server_port=40082-40083 Setup "audio/PCMU" subsession (client ports 40082-40083) Created output file: "video-H264-1" Created output file: "audio-PCMU-2" Sending request: PLAY rtsp://107.108.205.230:554/profile2/media.smp RTSP/1.0 CSeq: 7 Authorization: Digest username="admin", realm="iPOLiS", nonce="0000000000000000000000000594D5CE", uri="rtsp://107.108.205.230:554/profile2/media.smp/", response="ca50a936358f4e1ae53251aa956535b1" User-Agent: ./openRTSP (LIVE555 Streaming Media v2013.10.09) Session: 13 Range: npt=0.000- Received 321 new bytes of response data. Received a complete PLAY response: RTSP/1.0 200 OK CSeq: 7 Date: Fri Oct 11 12:46:55 2013 GMT Expires: Fri Oct 11 12:46:55 2013 GMT Session: 13 Cache-Control: must-revalidate Range: npt=0.000- RTP-Info: url=rtsp://107.108.205.230:554/profile2/media.smp/trackID=v;seq=0,url=rtsp://107.108.205.230:554/profile2/media.smp/trackID=a;seq=0,url=;seq=0 Started playing session Receiving streamed data (signal with "kill -HUP 14006" or "kill -USR1 14006" to terminate)... Received 149 new bytes of response data. Received a complete (unknown) response: RTSP/1.0 408 Request Time-out CSeq: 7 Date: Fri Oct 11 12:48:56 2013 GMT Expires: Fri Oct 11 12:48:56 2013 GMT Cache-Control: must-revalidate -- This message has been scanned for viruses and dangerous content by MailScanner, and is believed to be clean. From marcin at speed666.info Fri Oct 11 02:15:12 2013 From: marcin at speed666.info (Marcin) Date: Fri, 11 Oct 2013 11:15:12 +0200 Subject: [Live-devel] testRTSPclient stops after 2 mins In-Reply-To: References: <783915c3eb8b03c815faa35eb4cd6904.squirrel@clmail.csa.iisc.ernet.in> <6D894E2F-9CE5-4F1E-9DE0-C7D0D726C5F1@live555.com> Message-ID: <5257C1A0.1080409@speed666.info> Hi, As i can see - this is Samsung iPolis IP Camera - try to upgrade it to latest firmware - should work well as it supports RTCP in new firmwares. Marcin WebCamera.pl W dniu 2013-10-11 09:33, srimugunth at csa.iisc.ernet.in pisze: >>> Stream "rtsp://107.108.205.230:554/profile2/media.smp/"; >>> video/H264: Received 230 bytes. Presentation time: 1381472674.763904! >> One more thing: The "!" at the end of the presentation time indicates that >> it has not yet been synchronized via RTCP - i.e., the client has not yet >> received a RTCP "SR" packet. This often happens at the beginning of a >> stream, but if it's still happening after 2 minutes, than that suggests >> that the server (IP camera) is not sending RTCP "SR" packets, and is >> therefore non standards-compliant. 
>> >> You cannot expect our software to run with non-standards-compliant >> servers. You should replace (or upgrade) your IP camera. > I tried with another IP camera and it had the same output. > While i understand that you will not be interested in supporting > non-standard IP cameras, is there any workaround in Live555 which i can > change > to get continued output for atleast 15mins. > > Will appreciate your inputs. > > And, the console output with openRTSP is as follows: > > > $ ./openRTSP rtsp://admin:4321 at 107.108.205.230/profile2/media.smp > > > Opening connection to 107.108.205.230, port 554... > ...remote connection opened > Sending request: OPTIONS > rtsp://admin:4321 at 107.108.205.230/profile2/media.smp RTSP/1.0 > CSeq: 2 > User-Agent: ./openRTSP (LIVE555 Streaming Media v2013.10.09) > > > Received 228 new bytes of response data. > Received a complete OPTIONS response: > RTSP/1.0 401 Unauthorized > CSeq: 2 > Date: Fri Oct 11 12:46:54 2013 GMT > Expires: Fri Oct 11 12:46:54 2013 GMT > Cache-Control: must-revalidate > WWW-Authenticate: Digest realm="iPOLiS", > nonce="0000000000000000000000000594D5CE" > > > Resending... > Sending request: OPTIONS > rtsp://admin:4321 at 107.108.205.230/profile2/media.smp RTSP/1.0 > CSeq: 3 > Authorization: Digest username="admin", realm="iPOLiS", > nonce="0000000000000000000000000594D5CE", > uri="rtsp://admin:4321 at 107.108.205.230/profile2/media.smp", > response="46d86f1c4f5311b163fe33ddc2dbd613" > User-Agent: ./openRTSP (LIVE555 Streaming Media v2013.10.09) > > > Received 222 new bytes of response data. > Received a complete OPTIONS response: > RTSP/1.0 200 OK > CSeq: 3 > Date: Fri Oct 11 12:46:55 2013 GMT > Expires: Fri Oct 11 12:46:55 2013 GMT > Cache-Control: must-revalidate > Public: DESCRIBE, SETUP, TEARDOWN, PLAY, PAUSE, OPTIONS, GET_PARAMETER, > SET_PARAMETER > > > Sending request: DESCRIBE > rtsp://admin:4321 at 107.108.205.230/profile2/media.smp RTSP/1.0 > CSeq: 4 > Authorization: Digest username="admin", realm="iPOLiS", > nonce="0000000000000000000000000594D5CE", > uri="rtsp://admin:4321 at 107.108.205.230/profile2/media.smp", > response="df54b418781a7249ee9f544745aa8d82" > User-Agent: ./openRTSP (LIVE555 Streaming Media v2013.10.09) > Accept: application/sdp > > > Received 1024 new bytes of response data. 
> Received a complete DESCRIBE response: > RTSP/1.0 200 OK > CSeq: 4 > Date: Fri Oct 11 12:46:55 2013 GMT > Expires: Fri Oct 11 12:46:55 2013 GMT > Content-Base: rtsp://107.108.205.230:554/profile2/media.smp/ > Content-Type: application/sdp > Content-Length: 712 > x-Accept-Retransmit: our-retransmit > x-Accept-Dynamic-Rate: 1 > Cache-Control: must-revalidate > > v=0 > o=- 0 0 IN IP4 107.108.212.53 > s=Media Presentation > i=samsung > c=IN IP4 0.0.0.0 > b=AS:384128 > t=0 0 > a=control:rtsp://107.108.205.230:554/profile2/media.smp > a=range:npt=now- > m=video 40080 RTP/AVP 98 > b=AS:384000 > a=rtpmap:98 H264/90000 > a=control:rtsp://107.108.205.230:554/profile2/media.smp/trackID=v > a=cliprect:0,0,600,800 > a=framesize:98 800-600 > a=framerate:30.0 > a=fmtp:98 > packetization-mode=1;profile-level-id=640028;sprop-parameter-sets=Z2QAKK2EBUViuKxUdCAqKxXFYqOhAVFYrisVHQgKisVxWKjoQFRWK4rFR0ICorFcVio6ECSFITk8nyfk/k/J8nm5s00IEkKQnJ5Pk/J/J+T5PNzZprQGQJvyoA==,aM48sA== > m=audio 40082 RTP/AVP 0 > b=AS:64 > a=rtpmap:0 PCMU/8000 > a=control:rtsp://107.108.205.230:554/profile2/media.smp/trackID=a > > Opened URL "rtsp://admin:4321 at 107.108.205.230/profile2/media.smp", > returning a SDP description: > v=0 > o=- 0 0 IN IP4 107.108.212.53 > s=Media Presentation > i=samsung > c=IN IP4 0.0.0.0 > b=AS:384128 > t=0 0 > a=control:rtsp://107.108.205.230:554/profile2/media.smp > a=range:npt=now- > m=video 40080 RTP/AVP 98 > b=AS:384000 > a=rtpmap:98 H264/90000 > a=control:rtsp://107.108.205.230:554/profile2/media.smp/trackID=v > a=cliprect:0,0,600,800 > a=framesize:98 800-600 > a=framerate:30.0 > a=fmtp:98 > packetization-mode=1;profile-level-id=640028;sprop-parameter-sets=Z2QAKK2EBUViuKxUdCAqKxXFYqOhAVFYrisVHQgKisVxWKjoQFRWK4rFR0ICorFcVio6ECSFITk8nyfk/k/J8nm5s00IEkKQnJ5Pk/J/J+T5PNzZprQGQJvyoA==,aM48sA== > m=audio 40082 RTP/AVP 0 > b=AS:64 > a=rtpmap:0 PCMU/8000 > a=control:rtsp://107.108.205.230:554/profile2/media.smp/trackID=a > > Created receiver for "video/H264" subsession (client ports 40080-40081) > Created receiver for "audio/PCMU" subsession (client ports 40082-40083) > Sending request: SETUP > rtsp://107.108.205.230:554/profile2/media.smp/trackID=v RTSP/1.0 > CSeq: 5 > Authorization: Digest username="admin", realm="iPOLiS", > nonce="0000000000000000000000000594D5CE", > uri="rtsp://107.108.205.230:554/profile2/media.smp/", > response="43aaa7995d688cfb3c7e6ebe69574966" > User-Agent: ./openRTSP (LIVE555 Streaming Media v2013.10.09) > Transport: RTP/AVP;unicast;client_port=40080-40081 > > > Received 239 new bytes of response data. > Received a complete SETUP response: > RTSP/1.0 200 OK > CSeq: 5 > Date: Fri Oct 11 12:46:55 2013 GMT > Expires: Fri Oct 11 12:46:55 2013 GMT > Session: 13;timeout=60 > Cache-Control: must-revalidate > Transport: > RTP/AVP/UDP;unicast;client_port=40080-40081;server_port=40080-40081 > > > Setup "video/H264" subsession (client ports 40080-40081) > Sending request: SETUP > rtsp://107.108.205.230:554/profile2/media.smp/trackID=a RTSP/1.0 > CSeq: 6 > Authorization: Digest username="admin", realm="iPOLiS", > nonce="0000000000000000000000000594D5CE", > uri="rtsp://107.108.205.230:554/profile2/media.smp/", > response="43aaa7995d688cfb3c7e6ebe69574966" > User-Agent: ./openRTSP (LIVE555 Streaming Media v2013.10.09) > Transport: RTP/AVP;unicast;client_port=40082-40083 > Session: 13 > > > Received 239 new bytes of response data. 
> Received a complete SETUP response: > RTSP/1.0 200 OK > CSeq: 6 > Date: Fri Oct 11 12:46:55 2013 GMT > Expires: Fri Oct 11 12:46:55 2013 GMT > Session: 13;timeout=60 > Cache-Control: must-revalidate > Transport: > RTP/AVP/UDP;unicast;client_port=40082-40083;server_port=40082-40083 > > > Setup "audio/PCMU" subsession (client ports 40082-40083) > Created output file: "video-H264-1" > Created output file: "audio-PCMU-2" > Sending request: PLAY rtsp://107.108.205.230:554/profile2/media.smp RTSP/1.0 > CSeq: 7 > Authorization: Digest username="admin", realm="iPOLiS", > nonce="0000000000000000000000000594D5CE", > uri="rtsp://107.108.205.230:554/profile2/media.smp/", > response="ca50a936358f4e1ae53251aa956535b1" > User-Agent: ./openRTSP (LIVE555 Streaming Media v2013.10.09) > Session: 13 > Range: npt=0.000- > > > Received 321 new bytes of response data. > Received a complete PLAY response: > RTSP/1.0 200 OK > CSeq: 7 > Date: Fri Oct 11 12:46:55 2013 GMT > Expires: Fri Oct 11 12:46:55 2013 GMT > Session: 13 > Cache-Control: must-revalidate > Range: npt=0.000- > RTP-Info: > url=rtsp://107.108.205.230:554/profile2/media.smp/trackID=v;seq=0,url=rtsp://107.108.205.230:554/profile2/media.smp/trackID=a;seq=0,url=;seq=0 > > > Started playing session > Receiving streamed data (signal with "kill -HUP 14006" or "kill -USR1 > 14006" to terminate)... > Received 149 new bytes of response data. > Received a complete (unknown) response: > RTSP/1.0 408 Request Time-out > CSeq: 7 > Date: Fri Oct 11 12:48:56 2013 GMT > Expires: Fri Oct 11 12:48:56 2013 GMT > Cache-Control: must-revalidate > > > From david.verbeiren at intel.com Fri Oct 11 09:20:35 2013 From: david.verbeiren at intel.com (Verbeiren, David) Date: Fri, 11 Oct 2013 16:20:35 +0000 Subject: [Live-devel] MPEG1or2VideoRTPSource slice begin/end interpretation Message-ID: Hi, ? While testing my application that uses Live555 for MPEG2 Video input ("MPV"), I noticed MPEG1or2VideoRTPSource presents some slice data with the expected size, but others are presented as individual 1384 bytes chunks, just like at the network level. ? Looking into the code, I see the following fragment which looks suspicious (MPEG1or2VideoRTPSource::processSpecialHeader() in MPEG1or2VideoRTPSource.cpp, line 50): ? u_int32_t sBit = header&0x00002000; // sequence-header-present u_int32_t bBit = header&0x00001000; // beginning-of-slice u_int32_t eBit = header&0x00000800; // end-of-slice ? fCurrentPacketBeginsFrame = (sBit|bBit) != 0; fCurrentPacketCompletesFrame = ((sBit&~bBit)|eBit) != 0; ? sBit and bBit are looking at different bits of the header, and as each can only have the one specific bit set and all other bits are always 0, (sBit&~bBit) is in fact identical to sBit (*). ? I believe the operation should be performed at the logical level rather than bitwise: fCurrentPacketCompletesFrame = ((sBit != 0) && (bBit == 0)) || (eBit != 0); ? Changing the code as above did allow me to receive slice data correctly (if my understanding is correct) aggregated according to the S, B and E bits of the header. Is the above correct and if so, could this change be applied for a next release? Thanks, -David (*) in sBit, all bits are 0 except - possibly - the 'S' bit; hence the only bit that could be set in (sBit&~bBit) is the 'S' bit; and in ~bBit, all other bits than the 'B' bit are always 1, hence the 'S' bit in ~bBit is always 1. Intel Corporation NV/SA Kings Square, Veldkant 31 2550 Kontich RPM (Bruxelles) 0415.497.718. 
Citibank, Brussels, account 570/1031255/09 This e-mail and any attachments may contain confidential material for the sole use of the intended recipient(s). Any review or distribution by others is strictly prohibited. If you are not the intended recipient, please contact the sender and delete all copies. From vasudevan.madubhushi at siemens.com Fri Oct 11 11:15:53 2013 From: vasudevan.madubhushi at siemens.com (Madubhushi, Vasudevan) Date: Fri, 11 Oct 2013 18:15:53 +0000 Subject: [Live-devel] Live 555 2013 Licensing Message-ID: <0E446C12F0F2D647B0F74D1F55AF6F94D88D8A@USLZUA0EM24MSX.ww017.siemens.net> Hi Ross, Just wanted some clarification on licensing obligations for using Live555 Streaming Media library 2013 versions. I have read the http://www.live555.com/liveMedia/#license and http://www.live555.com/liveMedia/faq.html#copyright-and-license pages to get a better understanding of the terms and conditions. However there is an old message posted in 2006 (by you ) related to the 2010 version of Live555 stating that some of the restrictions placed by LGPL are not valid. The link to that post is below : http://lists.live555.com/pipermail/live-devel/2006-February/003993.html http://comments.gmane.org/gmane.comp.multimedia.live555.devel/6223 Do you still maintain the position that LGPL in its strictest form is not applicable ? Or do you have a different take on that now ? Thanks, Vasu This message and any attachments are solely for the use of intended recipients. The information contained herein may include trade secrets, protected health or personal information, privileged or otherwise confidential information. Unauthorized review, forwarding, printing, copying, distributing, or using such information is strictly prohibited and may be unlawful. If you are not an intended recipient, you are hereby notified that you received this email in error, and that any review, dissemination, distribution or copying of this email and any attachment is strictly prohibited. If you have received this email in error, please contact the sender and delete the message and any attachment from your system. Thank you for your cooperation -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Oct 11 11:40:24 2013 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 11 Oct 2013 11:40:24 -0700 Subject: [Live-devel] MPEG1or2VideoRTPSource slice begin/end interpretation In-Reply-To: References: Message-ID: > I believe the operation should be performed at the logical level rather than bitwise: > fCurrentPacketCompletesFrame = ((sBit != 0) && (bBit == 0)) || (eBit != 0); Yes, you've discovered a bug (actually, a very old bug, because MPEG-1 or 2 video - especially with slices - is rarely used these days). I've just installed a new version (2013.10.11) of the "LIVE555 Streaming Media" software that corrects this. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Oct 11 11:54:36 2013 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 11 Oct 2013 11:54:36 -0700 Subject: [Live-devel] Live 555 2013 Licensing In-Reply-To: <0E446C12F0F2D647B0F74D1F55AF6F94D88D8A@USLZUA0EM24MSX.ww017.siemens.net> References: <0E446C12F0F2D647B0F74D1F55AF6F94D88D8A@USLZUA0EM24MSX.ww017.siemens.net> Message-ID: I have never said that "some of the restrictions placed by LGPL are not valid". The LGPL license remains fully valid. 
What I have said, however, is that I do not personally have an objection to "LIVE555 Streaming Media" code being linked statically, especially within iOS applications - which apparently is the only way that iOS applications can make use of the code. However, I obviously can't make any guarantees about how others might view this - e.g., if this software (or company) were to change ownership sometime down the road. In any case, I also consider it very important that any developer of a product that uses this software be willing to - upon request - upgrade it to use the latest version of the software. In general, questions about software copyright and licensing are best handled by personal email - after consulting your company's legal department - rather than on this mailing list (which is intended mainly for programming questions, and whose members are usually programmers rather than lawyers). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From fantasyvideo at 126.com Sun Oct 13 19:52:44 2013 From: fantasyvideo at 126.com (=?gb2312?B?w867w7mk1/fK0g==?=) Date: Mon, 14 Oct 2013 10:52:44 +0800 Subject: [Live-devel] Regarding the sps/pps in live555 Message-ID: <027a01cec888$765b1070$63113150$@126.com> HI , I use live555 to setup one rtsp server which provides h264 stream. In the GetNextVideoFrame I use the following code. void RecorderSession::GetNextVideoFrame(bool first,char* to, unsigned int maxsize, unsigned int *framesize, unsigned int* fNumTruncatedBytes) { int size = 0; static char startcode[]={0x00,0x00,0x00,0x01}; if(first) { memcpy(to,startcode,4); memcpy(to+4,sps.buf,sps.len); memcpy(to+sps.len+4,startcode,4); memcpy(to+8+sps.len,pps.buf,pps.len); size = pps.len+8+sps.len; } unsigned int len = 0; _videoqueue->Get(to+size,&len); *framesize = len +size; if(*framesize>maxsize) { *fNumTruncatedBytes = (*framesize)-maxsize; } } As you see, when the first request comes, I fill the sps/pps info to it. But when I use vlc to play it, it also said it's waiting sps/pps. Could you give me some suggestion about it? THKS. -------------- next part -------------- An HTML attachment was scrubbed... URL: From bbischan at watchtower-security.com Mon Oct 14 14:15:27 2013 From: bbischan at watchtower-security.com (Bob Bischan) Date: Mon, 14 Oct 2013 16:15:27 -0500 Subject: [Live-devel] Proxy Server (Back-end Session) Message-ID: Ross, I have run into an issue where the session between Proxy Server and the back-end server has timed-out, but the Proxy Server is still receiving responses to the periodic OPTION requests. In this situation when a request (Play / Pause) is made from a client you see "454 Session Not Found", however on-going OPTION requests from Proxy Server to back-end server continue to be successful. Should Proxy Server reset the connection in these cases? I tested to see how Proxy Server behaves when there is total connection loss (reboot camera) to the back-end; Proxy detects connection loss and resets "lost connection to server ('errno': 115). Resetting..." and establishes new session with back-end. Bob -- Bob Bischan Manager (Operations/Software Development) *WATCHTOWER SECURITY, INC.* 10760 TRENTON SAINT LOUIS, MISSOURI 63132 314 427 4586 office 314 330 9001 cell 314 427 8823 fax www.watchtower-security.com *?Protecting your community and those you value most.?* -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Mon Oct 14 18:21:41 2013 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 14 Oct 2013 18:21:41 -0700 Subject: [Live-devel] Proxy Server (Back-end Session) In-Reply-To: References: Message-ID: <607CB263-F432-43E5-B5B3-9A4D3FD92AE7@live555.com> > I have run into an issue where the session between Proxy Server and the back-end server has timed-out, but the Proxy Server is still receiving responses to the periodic OPTION requests. In this situation when a request (Play / Pause) is made from a client you see "454 Session Not Found", however on-going OPTION requests from Proxy Server to back-end server continue to be successful. We don't check the response to the proxy server's "PLAY" command, because the periodic "OPTIONS" command is supposed to be sufficient to (1) keep the connection alive (in case the server is non-compliant and doesn't use RTCP "RR" to indicate liveness), and (2) check whether the connection has closed. So I don't understand how the server can continue to be handling the "OPTIONS" commands, yet have timed-out the session. Please send an example of the output (after running the proxy server with the "-V" option) that illustrates what you're seeing. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From smirnov at rastr.natm.ru Mon Oct 14 22:51:43 2013 From: smirnov at rastr.natm.ru (=?koi8-r?B?883J0s7P1yDuycvPzMHKIOnXwc7P18ne?=) Date: Tue, 15 Oct 2013 08:51:43 +0300 Subject: [Live-devel] DeviceSource Translate large frames (fNumTruncatedBytes) Message-ID: <000601cec96a$a91057e0$fb3107a0$@rastr.natm.ru> Hi! I try translate H264 video stream by Live555. The hardware video source is the video camera (USB). Class for Live555 video source derived from DeviceSource. The problem in function DeviceSource::deliverFrame : When the newFrameSize > fMaxSize i set fNumTruncatedBytes, but only result is the message: "The input frame data was too large for our buffer size :.. bytes of trailing data was dropped!" This mean (as I understand) that the truncated part of frame is dropped. Is this mean that the truncated frame is dropped? So, all large frames (>fMaxSize) really will be skiped. I try looped the afterGetting() : u_int8_t* newFrameDataStart = (u_int8_t*)ourFrame; do { unsigned newFrameSize = hugeFrameSize; if (newFrameSize > fMaxSize) { fFrameSize = fMaxSize; fNumTruncatedBytes = newFrameSize - fMaxSize; } else { fNumTruncatedBytes =0; fFrameSize = newFrameSize; } : memmove( fTo, newFrameDataStart, fFrameSize); newFrameDataStart += fFrameSize; hugeFrameSize -= fFrameSize; FramedSource::afterGetting(this); } while( fNumTruncatedBytes); This is bad idea - the program dead. In message we have recommendation: "Correct this by increasing \"OutPacketBuffer::maxSize\" to at least" Is any way to sending large frames (H264 key frames , or other), except the HUGE fMaxSize? Sorry for my English! Nick -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Oct 14 22:59:02 2013 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 14 Oct 2013 22:59:02 -0700 Subject: [Live-devel] DeviceSource Translate large frames (fNumTruncatedBytes) In-Reply-To: <000601cec96a$a91057e0$fb3107a0$@rastr.natm.ru> References: <000601cec96a$a91057e0$fb3107a0$@rastr.natm.ru> Message-ID: <7A47CB8A-C72E-4085-84FF-D33C363A1C59@live555.com> > I try translate H264 video stream by Live555. 
The hardware video source is the video camera (USB). > Class for Live555 video source derived from DeviceSource. > The problem in function DeviceSource::deliverFrame : > When the newFrameSize > fMaxSize i set fNumTruncatedBytes, but only result is the message: > > "The input frame data was too large for our buffer size ... bytes of trailing data was dropped!" > > This mean (as I understand) that the truncated part of frame is dropped. > Is this mean that the truncated frame is dropped? Yes. If the input frame is larger than the buffer space that the downstream object provides, then you will *not* be able to deliver all of the data. The remaining data will be truncated (i.e., dropped, lost).
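To make the buffer handling above concrete: the looped delivery in the question calls FramedSource::afterGetting() several times for one doGetNextFrame() request, which is what brings the program down. The model in liveMedia/DeviceSource.cpp delivers at most one frame per request, truncating whatever does not fit. A minimal sketch of that shape; 'ourFrame' and 'hugeFrameSize' stand in for the captured encoded frame, and the f-prefixed members come from FramedSource:

// One delivery per request: copy at most fMaxSize bytes, record what was
// dropped in fNumTruncatedBytes, and call afterGetting() exactly once.
void MyDeviceSource::deliverFrame() {
  if (!isCurrentlyAwaitingData()) return;      // the sink has not asked for data yet

  u_int8_t* newFrameDataStart = (u_int8_t*)ourFrame;
  unsigned newFrameSize = hugeFrameSize;

  if (newFrameSize > fMaxSize) {               // fMaxSize = space offered downstream
    fFrameSize = fMaxSize;
    fNumTruncatedBytes = newFrameSize - fMaxSize;
  } else {
    fFrameSize = newFrameSize;
    fNumTruncatedBytes = 0;
  }
  gettimeofday(&fPresentationTime, NULL);      // needs <sys/time.h>
  memmove(fTo, newFrameDataStart, fFrameSize);

  FramedSource::afterGetting(this);            // exactly one call per doGetNextFrame()
}

If truncation itself is the problem, the usual remedy (as the library's warning message says) is to raise OutPacketBuffer::maxSize before the RTP sink is created, so that fMaxSize becomes large enough for the biggest frame.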
In message we have recommendation: "Correct this by increasing \"OutPacketBuffer::maxSize\" to at least" Is any way to sending large frames (H264 key frames , or other), except the HUGE fMaxSize? No. The downstream object's buffer must be large enough to receive the frame. However, this shows why very large H.264 key frames are a bad idea. Even if you have a large enough buffer to stream these frames, each one will be packed into many outgoing RTP packets. If *any* of these packets gets lost, the receiver will be unable to reconstruct the frame. Instead, it is much better if you can break up each 'key frame' into several 'slices' (each of which would be its own H.264 NAL unit). Each of these slices (NAL units) would be delivered separately. Ross Finlayson Live Networks, Inc. http://www.live555.com/ __________ Information from ESET NOD32 Antivirus, version of virus signature database 8913 (20131014) __________ The message was checked by ESET NOD32 Antivirus. ????????? ??????????? ????? - - O? ????????? ??????????? ????? - - O? ???????? ??? ????? 00026.txt - - O? http://www.esetnod32.ru/.ml -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Oct 15 04:10:43 2013 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 15 Oct 2013 04:10:43 -0700 Subject: [Live-devel] DeviceSource Translate large frames (fNumTruncatedBytes) In-Reply-To: <002201cec97b$b0a66ba0$11f342e0$@rastr.natm.ru> References: <000601cec96a$a91057e0$fb3107a0$@rastr.natm.ru> <7A47CB8A-C72E-4085-84FF-D33C363A1C59@live555.com> <002201cec97b$b0a66ba0$11f342e0$@rastr.natm.ru> Message-ID: > ?*any* of these packets gets lost,? > > Some questions: > Does it mean that RPT protocol is unreliable? Yes. RTP packets sent over UDP (as they usually are) are unreliable. > So, if 'slices' (NAL units) would be delivered through RPT, can they be lost too? > Why the 'slices' way is more reliable? Because even if one slice of a picture is lost, the other slices of the picture will get delivered, and (depending on the decoder) should be able to get rendered and displayed. > Is a unreliable transmission normal (reality)? It depends upon your network. Usually, lost packets happen infrequently. > In what object I must realize 'slices' technology? You would reconfigure your encoder to generate each 'key frame' as a sequence of 'slice' NAL units, instead of as a single NAL unit. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From bbischan at watchtower-security.com Tue Oct 15 10:04:37 2013 From: bbischan at watchtower-security.com (Bob Bischan) Date: Tue, 15 Oct 2013 12:04:37 -0500 Subject: [Live-devel] Proxy Server (Back-end Session) In-Reply-To: <607CB263-F432-43E5-B5B3-9A4D3FD92AE7@live555.com> References: <607CB263-F432-43E5-B5B3-9A4D3FD92AE7@live555.com> Message-ID: Ross, Attached is a log file showing the issue. I have edited the log due to size limits on the mailing list. Bob On Mon, Oct 14, 2013 at 8:21 PM, Ross Finlayson wrote: > I have run into an issue where the session between Proxy Server and the > back-end server has timed-out, but the Proxy Server is still receiving > responses to the periodic OPTION requests. In this situation when a request > (Play / Pause) is made from a client you see "454 Session Not Found", > however on-going OPTION requests from Proxy Server to back-end server > continue to be successful. 
> > > We don't check the response to the proxy server's "PLAY" command, because > the periodic "OPTIONS" command is supposed to be sufficient to (1) keep the > connection alive (in case the server is non-compliant and doesn't use RTCP > "RR" to indicate liveness), and (2) check whether the connection has > closed. So I don't understand how the server can continue to be handling > the "OPTIONS" commands, yet have timed-out the session. > > Please send an example of the output (after running the proxy server with > the "-V" option) that illustrates what you're seeing. > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -- Bob Bischan Manager (Operations/Software Development) *WATCHTOWER SECURITY, INC.* 10760 TRENTON SAINT LOUIS, MISSOURI 63132 314 427 4586 office 314 330 9001 cell 314 427 8823 fax www.watchtower-security.com *?Protecting your community and those you value most.?* -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: proxy.log Type: application/octet-stream Size: 13727 bytes Desc: not available URL: From finlayson at live555.com Tue Oct 15 17:29:04 2013 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 15 Oct 2013 17:29:04 -0700 Subject: [Live-devel] Proxy Server (Back-end Session) In-Reply-To: <607CB263-F432-43E5-B5B3-9A4D3FD92AE7@live555.com> References: <607CB263-F432-43E5-B5B3-9A4D3FD92AE7@live555.com> Message-ID: <92DC2528-A5FE-4082-87A7-E33FF024AB3A@live555.com> Unfortunately I can't tell why the 'back-end' server (an Axis camera) is timing out the sessions. (It is apparently a bug in the cameras.) However, in hope of preventing this from happening, I've now installed a new version (2013.10.16) of the "LIVE555 Streaming Media" code that now includes "Session:" header in each "OPTIONS" request (if we're currently part of a session). I'm hoping that this will either cause your back-end server (Axis camera) to keep its session alive, or - if it doesn't - at least cause the camera to return an error response for the "OPTIONS" command after it (for whatever reason) times its session out. Please let me know if this fixes your problem. If it doesn't, then Axis will need to fix their cameras to become more standards compliant. (As always, I'm willing to help Axis (and any other network camera manufacturer) make their systems more compliant. There's at least one Axis employee on this mailing list; you know how to get hold of me :-) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From bobemail at foxmail.com Tue Oct 15 18:52:09 2013 From: bobemail at foxmail.com (=?gb2312?B?z+jX0w==?=) Date: Wed, 16 Oct 2013 09:52:09 +0800 Subject: [Live-devel] LiveMedia NG When Time Changed Message-ID: <201310160952081256836@foxmail.com> Hi, When time changed, The stream stopped. Reason 1: If time value become bigger, "uSecondsToGo" in MultiFramedRTPSink::sendPacketIfNecessary is bigger, "scheduleDelayedTask" need too long time to call "sendNext". Reason 2: If time value become smaller, stream is stopped also. But I don't know why. Time Change is unavoidable, What should I do ? Thanks. Bob -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Wed Oct 16 01:21:23 2013 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 16 Oct 2013 01:21:23 -0700 Subject: [Live-devel] LiveMedia NG When Time Changed In-Reply-To: <201310160952081256836@foxmail.com> References: <201310160952081256836@foxmail.com> Message-ID: <7B83AD2F-5A3F-4628-8498-FD2097D7DF2D@live555.com> > When time changed, The stream stopped. > Reason 1: > If time value become bigger, "uSecondsToGo" in MultiFramedRTPSink::sendPacketIfNecessary is bigger, "scheduleDelayedTask" need too long time to call "sendNext". > Reason 2: > If time value become smaller, stream is stopped also. But I don't know why. It's probably because you're not setting "fDurationInMicroseconds" properly in your upstream object (the object that you're feeding to your "RTPSink" (subclass) object). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From bbischan at watchtower-security.com Wed Oct 16 08:38:34 2013 From: bbischan at watchtower-security.com (Bob Bischan) Date: Wed, 16 Oct 2013 10:38:34 -0500 Subject: [Live-devel] Proxy Server (Back-end Session) In-Reply-To: <92DC2528-A5FE-4082-87A7-E33FF024AB3A@live555.com> References: <607CB263-F432-43E5-B5B3-9A4D3FD92AE7@live555.com> <92DC2528-A5FE-4082-87A7-E33FF024AB3A@live555.com> Message-ID: Ross, Thanks for the code update. Looks like sending the session id in the OPTION header solved the problem! After my original post yesterday, I did determine that Axis cameras have a session time-out (default 60 sec) setting that was tearing down the session. The only thing I cannot explain is why the log file from yesterday did not have the following: ProxyServerMediaSubsession["H264"]: received RTCP "BYE". (The back-end stream has ended.) ProxyServerMediaSubsession["H264"]::~ProxyServerMediaSubsession() Opening connection to 99.58.86.198, port 11034... In tests that I ran later in the day I saw this message every time I waited 60 secs between test cycles. So, it appears that the camera does send a message on time-out...and based on my observations Proxy Server handles this correctly. I'm at a loss as to why "BYE" was not received in the original case. I suppose there could have been dropped packets, loss of connectivuty...etc. Bob On Tue, Oct 15, 2013 at 7:29 PM, Ross Finlayson wrote: > Unfortunately I can't tell why the 'back-end' server (an Axis camera) is > timing out the sessions. (It is apparently a bug in the cameras.) > > However, in hope of preventing this from happening, I've now installed a > new version (2013.10.16) of the "LIVE555 Streaming Media" code that now > includes "Session:" header in each "OPTIONS" request (if we're currently > part of a session). I'm hoping that this will either cause your back-end > server (Axis camera) to keep its session alive, or - if it doesn't - at > least cause the camera to return an error response for the "OPTIONS" > command after it (for whatever reason) times its session out. > > Please let me know if this fixes your problem. If it doesn't, then Axis > will need to fix their cameras to become more standards compliant. (As > always, I'm willing to help Axis (and any other network camera > manufacturer) make their systems more compliant. There's at least one Axis > employee on this mailing list; you know how to get hold of me :-) > > > Ross Finlayson > Live Networks, Inc. 
> http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Oct 16 10:41:14 2013 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 16 Oct 2013 10:41:14 -0700 Subject: [Live-devel] Proxy Server (Back-end Session) In-Reply-To: References: <607CB263-F432-43E5-B5B3-9A4D3FD92AE7@live555.com> <92DC2528-A5FE-4082-87A7-E33FF024AB3A@live555.com> Message-ID: > Looks like sending the session id in the OPTION header solved the problem! > > After my original post yesterday, I did determine that Axis cameras have a session time-out (default 60 sec) setting that was tearing down the session. The Axis camera should also be using incoming RTCP "RR" packets (from the proxy server) to indicate that the session is still alive. NOTE TO AXIS (and, in particular, to the Axis employee who's on this mailing list); Please fix this! > The only thing I cannot explain is why the log file from yesterday did not have the following: > > ProxyServerMediaSubsession["H264"]: received RTCP "BYE". (The back-end stream has ended.) > ProxyServerMediaSubsession["H264"]::~ProxyServerMediaSubsession() > Opening connection to 99.58.86.198, port 11034... > > In tests that I ran later in the day I saw this message every time I waited 60 secs between test cycles. The "BYE" packet from the camera probably got lost in that one case. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Oct 18 03:03:30 2013 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 18 Oct 2013 03:03:30 -0700 Subject: [Live-devel] Modifications to the RTSP "REGISTER" command and API Message-ID: <2C58E155-E836-4E27-9A6C-A0C1BDA6CD8B@live555.com> FYI, I have just submitted to the IETF a new Internet-Draft document that describes our new custom "REGISTER" RTSP command (that we use in our proxy server implementation). You can find a copy online at http://tools.ietf.org/html/draft-finlayson-rtsp-register-command-00 I have also released a new version (2013.10.18) of the "LIVE555 Streaming Media" code that changes the implementation to conform to this document. In particular: - There are no longer separate "REGISTER" and "REGISTER_REMOTE" commands. Now, there's only "REGISTER". An optional Boolean parameter "reuse_connection" - in the RTSP "Transport:" header - can be used to specify whether the recipient should reuse the TCP connection. - There is a new optional parameter "proxy_url_suffix" (again, in the RTSP "Transport:" header) that can be used to specify the URL suffix that the receiving proxy server should use to advertise the proxied stream. If you are "REGISTER"ing one of your "RTSPServer"s own streams (e.g., with a proxy server), then note that the "RTSPServer::registerStream()" API has changed. If you are using a 3rd-party application to construct and send "REGISTER" commands - as Bob Bischan is doing - then note that - The command to send is now "REGISTER", not "REGISTER_REMOTE", and - There are optional parameters that you can specify in a RTSP "Transport:" header. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From bbischan at watchtower-security.com Fri Oct 18 08:57:12 2013 From: bbischan at watchtower-security.com (Bob Bischan) Date: Fri, 18 Oct 2013 10:57:12 -0500 Subject: [Live-devel] Modifications to the RTSP "REGISTER" command and API In-Reply-To: <2C58E155-E836-4E27-9A6C-A0C1BDA6CD8B@live555.com> References: <2C58E155-E836-4E27-9A6C-A0C1BDA6CD8B@live555.com> Message-ID: Ross, Very nice...I look forward to seeing how this evolves. I noticed that you implemented the ability to specify stream suffix! I will be setting up a new test environment with the latest code base to work through these API changes. I'll provide feedback, should I encounter anything of interest. Since this an initial draft, I assume your still open for comments/suggestions :-) - Proxy Server -T option does not allow for specifying a unique port per stream. In NAT cases streams would have different ports. Would it be possible to have this option for streams that will be using RTSP over HTTP. - Not sure if this would make sense, but would an UN-REGISTER method further enhance the capabilities of Proxy Server? This would allow for a perpetually running server that could dynamically REGISTER/UN-REGISTER streams as needed. Bob On Fri, Oct 18, 2013 at 5:03 AM, Ross Finlayson wrote: > FYI, I have just submitted to the IETF a new Internet-Draft document that > describes our new custom "REGISTER" RTSP command (that we use in our proxy > server implementation). You can find a copy online at > http://tools.ietf.org/html/draft-finlayson-rtsp-register-command-00 > > I have also released a new version (2013.10.18) of the "LIVE555 Streaming > Media" code that changes the implementation to conform to this document. > In particular: > - There are no longer separate "REGISTER" and "REGISTER_REMOTE" commands. > Now, there's only "REGISTER". An optional Boolean parameter > "reuse_connection" - in the RTSP "Transport:" header - can be used to > specify whether the recipient should reuse the TCP connection. > - There is a new optional parameter "proxy_url_suffix" (again, in the > RTSP "Transport:" header) that can be used to specify the URL suffix that > the receiving proxy server should use to advertise the proxied stream. > > If you are "REGISTER"ing one of your "RTSPServer"s own streams (e.g., with > a proxy server), then note that the "RTSPServer::registerStream()" API has > changed. > > If you are using a 3rd-party application to construct and send "REGISTER" > commands - as Bob Bischan is doing - then note that > - The command to send is now "REGISTER", not "REGISTER_REMOTE", and > - There are optional parameters that you can specify in a RTSP > "Transport:" header. > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Oct 18 10:34:59 2013 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 18 Oct 2013 10:34:59 -0700 Subject: [Live-devel] Modifications to the RTSP "REGISTER" command and API In-Reply-To: References: <2C58E155-E836-4E27-9A6C-A0C1BDA6CD8B@live555.com> Message-ID: > - Proxy Server -T option does not allow for specifying a unique port per stream. In NAT cases streams would have different ports. Would it be possible to have this option for streams that will be using RTSP over HTTP. 
FYI, right now back-end RTP/RTCP-over-RTSP-over-HTTP tunneling works only for 'back-end' streams that are specified on the command line. It does *not* work for 'back-end' streams that have been "REGISTER"ed (even if the "-T " option was given on the command line). > - Not sure if this would make sense, but would an UN-REGISTER method further enhance the capabilities of Proxy Server? Perhaps, but this is low-priority. You can always program your own proxy server that does this itself, using the "RTSPServer::removeServerMediaSession()" method. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From mail at chernowii.com Fri Oct 18 10:49:55 2013 From: mail at chernowii.com (Konrad) Date: Fri, 18 Oct 2013 19:49:55 +0200 Subject: [Live-devel] Question regarding GoPro Live Streaming Message-ID: Hello! I am a GoPro camera user, and I am a bit dissapointed because of the lag produced in the WiFi Live Streaming. The GoPro streams to a server (10.5.5.9:8080) and the preview is here: http://10.5.5.9:8080/live/amba.m3u8 I googled and I found the open source site from that camera (1), I found a sheet containig live streaming information (2). There I found Live555. Is there any way to get rid of the lag? Best regards Konrad Iturbe http://chernowii.com (1): http://gopro.com/support/open-source (2): http://wpcdn.gopro.com.s3.amazonaws.com/wp-content/uploads/2013/01/live.2012.02.04.tar.gz -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Oct 18 11:01:10 2013 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 18 Oct 2013 11:01:10 -0700 Subject: [Live-devel] Modifications to the RTSP "REGISTER" command and API In-Reply-To: References: <2C58E155-E836-4E27-9A6C-A0C1BDA6CD8B@live555.com> Message-ID: <2D3E6306-F543-4D80-BA18-FA94B101B45D@live555.com> >> - Proxy Server -T option does not allow for specifying a unique port per stream. In NAT cases streams would have different ports. Would it be possible to have this option for streams that will be using RTSP over HTTP. > > FYI, right now back-end RTP/RTCP-over-RTSP-over-HTTP tunneling works only for 'back-end' streams that are specified on the command line. It does *not* work for 'back-end' streams that have been "REGISTER"ed (even if the "-T " option was given on the command line). The reason for this, BTW, is that the primary intended purpose for the new "REGISTER" command was to allow a back-end server to register it's *own* stream, using the "RTSPServer::registerStream()" method, and to allow the receiving client (or proxy server) to reuse the TCP connection on which the "REGISTER" command was sent. If this is done, then you don't need RTP/RTCP-over-RTSP-over-HTTP tunneling at all, because you already have a TCP connection set up. You can just to RTP/RTCP-over-RTSP tunneling (over that same TCP connection) instead. The way that you're using the "REGISTER" command - to allow a 3rd party to register a back-end stream with a proxy server - is a nice way to use the mechanism, but it wasn't my primary purpose for developing it. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
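On the "UN-REGISTER" idea raised above: a custom proxy built on the library can already withdraw a previously registered back-end stream at runtime, using the RTSPServer::removeServerMediaSession() method that the reply mentions. A minimal sketch, assuming the ServerMediaSession pointer (for example the ProxyServerMediaSession created for the back-end stream) was kept when it was added:

// Sketch: dynamically withdraw a stream from a running RTSP/proxy server.
// 'sms' is the ServerMediaSession that was added earlier with
// addServerMediaSession(); new DESCRIBE/SETUP requests for its name will then
// fail, while client sessions that are already running finish normally.
void unregisterStream(RTSPServer* rtspServer, ServerMediaSession* sms) {
  if (rtspServer == NULL || sms == NULL) return;
  rtspServer->removeServerMediaSession(sms);
}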
URL: From finlayson at live555.com Fri Oct 18 11:08:50 2013 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 18 Oct 2013 11:08:50 -0700 Subject: [Live-devel] Question regarding GoPro Live Streaming In-Reply-To: References: Message-ID: <1AE337C4-0375-46C3-82BC-1D0715C199BB@live555.com> > I googled and I found the open source site from that camera (1), I found a sheet containig live streaming information (2). There I found Live555. [...] > (1): http://gopro.com/support/open-source > (2): http://wpcdn.gopro.com.s3.amazonaws.com/wp-content/uploads/2013/01/live.2012.02.04.tar.gz Please contact "GoPro", and tell them that - under the terms of the LGPL - they must update their software to use the latest version of the "LIVE555 Streaming Media" software, or else tell you how you can perform this upgrade yourself. In any case, though, it's far from clear how they are using our software, and what - if anything - it has to do with their 'WiFi Live Streaming" mechanism, and/or the 'lag' that you are seeing. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From bbischan at watchtower-security.com Fri Oct 18 12:31:50 2013 From: bbischan at watchtower-security.com (Bob Bischan) Date: Fri, 18 Oct 2013 14:31:50 -0500 Subject: [Live-devel] Modifications to the RTSP "REGISTER" command and API In-Reply-To: <2D3E6306-F543-4D80-BA18-FA94B101B45D@live555.com> References: <2C58E155-E836-4E27-9A6C-A0C1BDA6CD8B@live555.com> <2D3E6306-F543-4D80-BA18-FA94B101B45D@live555.com> Message-ID: Understanding the author's original intent and design philosophy is key, especially when it comes to using their software :-) In regards to the registering RTSP-over-HTTP back-end servers...I do not anticipate a need for this, however, your clarification on it's implementation within Proxy Server is good to know. RTP/RTCP over UDP is what I'm currently using and it seems to work the best in my particular case. Embedding code on the Axis camera to call home is an excellent idea ! As a side note: Proxy Server is running quite well ! Bob On Fri, Oct 18, 2013 at 1:01 PM, Ross Finlayson wrote: > - Proxy Server -T option does not allow for specifying a unique port per > stream. In NAT cases streams would have different ports. Would it be > possible to have this option for streams that will be using RTSP over HTTP. > > > FYI, right now back-end RTP/RTCP-over-RTSP-over-HTTP tunneling works only > for 'back-end' streams that are specified on the command line. It does > *not* work for 'back-end' streams that have been "REGISTER"ed (even if the > "-T " option was given on the command line). > > > The reason for this, BTW, is that the primary intended purpose for the new > "REGISTER" command was to allow a back-end server to register it's *own* > stream, using the "RTSPServer::registerStream()" method, and to allow the > receiving client (or proxy server) to reuse the TCP connection on which the > "REGISTER" command was sent. If this is done, then you don't need > RTP/RTCP-over-RTSP-over-HTTP tunneling at all, because you already have a > TCP connection set up. You can just to RTP/RTCP-over-RTSP tunneling (over > that same TCP connection) instead. > > The way that you're using the "REGISTER" command - to allow a 3rd party to > register a back-end stream with a proxy server - is a nice way to use the > mechanism, but it wasn't my primary purpose for developing it. > > > Ross Finlayson > Live Networks, Inc. 
> http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Oct 18 12:42:02 2013 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 18 Oct 2013 12:42:02 -0700 Subject: [Live-devel] Modifications to the RTSP "REGISTER" command and API In-Reply-To: References: <2C58E155-E836-4E27-9A6C-A0C1BDA6CD8B@live555.com> <2D3E6306-F543-4D80-BA18-FA94B101B45D@live555.com> Message-ID: <666E71C9-49AF-41DC-B561-9C79356F9FE1@live555.com> > Embedding code on the Axis camera to call home is an excellent idea ! Feel free to suggest this to Axis; I'd be happy to work with them to help make this happen. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From ssingh at neurosoft.in Fri Oct 18 12:34:53 2013 From: ssingh at neurosoft.in (ssingh at neurosoft.in) Date: Sat, 19 Oct 2013 01:04:53 +0530 Subject: [Live-devel] MPEG4 streaming using ffmpeg as encoder Message-ID: <2e0a8ca4ed3317e548d9541e86e5f493@neurosoft.in> Hi, I spend almost last 3 days banging my head with RFCs and other documents. Here is what I want to achieve. I have raw BGRA frames and I want to stream them using live555. This is what I am doing I created a subclass of FramedSource which is responsible to encode the raw BGRA frame into MPEG4 using ffmpeg and then copy it to fTo. the brief source code is given below: Here the m_queue contains the AVPacket directly from ffmpeg which is output from av_encode_video function /*****************************************************/ void MyStreamingDeviceSource::deliverFrame() { EsUtil::getTimeOfDay(&fPresentationTime, NULL); AVPacket pkt; m_queue.getPacket(&pkt, 1); unsigned int newFrameSize = pkt.size; if(newFrameSize == 0) return; if(newFrameSize > fMaxSize) { fFrameSize = fMaxSize; fNumTruncatedBytes = newFrameSize - fMaxSize; } else { fFrameSize = newFrameSize; } memcpy(fTo, pkt.data, fFrameSize); av_free_packet(&pkt); FramedSource::afterGetting(this); } /*****************************************************/ My filter chain looks like this Raw BGRA -> Encode to MPEG4 using ffmpeg -> MyStreamingDeviceSource -> MPEG4VideoStreamDiscreteFramer -> MPEG4ESVideoRTPSink when i am doing that I am not getting anything in VLC and testRTCPClient and debugging just print some information after every 1-1.25 seconds. it just print "Sending REPORT" and "sending RTCP packet" I suspect that I did not send the configuration data before sending the actual data. I saw that my AVPAcket has "00000001B6" in starting and I think i somehow need to send "00000001B0" to signal start of data. I also noticed that when i encode using FFMpeg in codec->extradata I have some header information that has "00000001B0" in that but i dont know how to pack all this information to create a valid RTP packet and how frequently i need to send this header and encoded packets. I might be doing something very wrong . Please help me in order to figure out how to send the FFMpeg encoded data using live555. 
Thanks, Caduceus From bbischan at watchtower-security.com Fri Oct 18 13:12:58 2013 From: bbischan at watchtower-security.com (Bob Bischan) Date: Fri, 18 Oct 2013 15:12:58 -0500 Subject: [Live-devel] Modifications to the RTSP "REGISTER" command and API In-Reply-To: <666E71C9-49AF-41DC-B561-9C79356F9FE1@live555.com> References: <2C58E155-E836-4E27-9A6C-A0C1BDA6CD8B@live555.com> <2D3E6306-F543-4D80-BA18-FA94B101B45D@live555.com> <666E71C9-49AF-41DC-B561-9C79356F9FE1@live555.com> Message-ID: Hmm....I have a better idea :-) On Fri, Oct 18, 2013 at 2:42 PM, Ross Finlayson wrote: > Embedding code on the Axis camera to call home is an excellent idea ! > > > Feel free to suggest this to Axis; I'd be happy to work with them to help > make this happen. > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -- Bob Bischan Manager (Operations/Software Development) *WATCHTOWER SECURITY, INC.* 10760 TRENTON SAINT LOUIS, MISSOURI 63132 314 427 4586 office 314 330 9001 cell 314 427 8823 fax www.watchtower-security.com *?Protecting your community and those you value most.?* -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Oct 18 13:24:59 2013 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 18 Oct 2013 13:24:59 -0700 Subject: [Live-devel] MPEG4 streaming using ffmpeg as encoder In-Reply-To: <2e0a8ca4ed3317e548d9541e86e5f493@neurosoft.in> References: <2e0a8ca4ed3317e548d9541e86e5f493@neurosoft.in> Message-ID: <8F5969F3-F5AE-4041-8694-A9E6BEE5DE79@live555.com> Your "deliverFrame()" function looks OK - provided that it gets called whenever a new MPEG-4 frame is available. Note that each call to "MyStreamingDeviceSource::doGetNextFrame()" must be followed (eventually) by a call to "FramedSource::afterGetting(this)". Therefore, if you return from "deliverFrame()" at the "if(newFrameSize == 0) return;" statement (because no new frame data is available), you must ensure that "deliverFrame()" somehow gets called again later when new frame data *is* available. > when i am doing that I am not getting anything in VLC and testRTCPClient You should use "testRTSPClient" (sic) for testing first; only after that's working should you bother using VLC. > I suspect that I did not send the configuration data before sending the actual data. I saw that my AVPAcket has "00000001B6" in starting and I think i somehow need to send "00000001B0" to signal start of data. > > I also noticed that when i encode using FFMpeg in codec->extradata I have some header information that has "00000001B0" in that but i dont know how to pack all this information Yes, this 'extra data' is important. You should treat it just as you would a video frame, and make sure that it is the first 'frame' that you feed to your "MyStreamingDeviceSource::deliverFrame()" is this 'extra data'. I suspect that this may be enough to make your code work. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
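Pulling the advice in this thread together, here is a condensed sketch of the delivery function for an encoder-fed source, following the liveMedia/DeviceSource.cpp model: the codec's configuration bytes (AVCodecContext::extradata, the data beginning with the 0x000001B0 header mentioned above) go out as the very first 'frame', and each delivery ends with exactly one afterGetting() call. The m_queue, m_eventID and m_codecCtx plumbing and the m_sentConfig flag are assumptions standing in for the poster's own code:

// Condensed sketch; the liveMedia-facing parts follow DeviceSource.cpp.
void MyStreamingDeviceSource::deliverFrame() {
  if (!isCurrentlyAwaitingData()) return;     // no client request outstanding yet

  AVPacket pkt;
  u_int8_t const* data;
  unsigned size;
  bool fromQueue = false;

  if (!m_sentConfig) {
    // First delivery: the encoder's configuration bytes, so that the
    // downstream MPEG4VideoStreamDiscreteFramer can record them.
    data = m_codecCtx->extradata;
    size = (unsigned)m_codecCtx->extradata_size;
    m_sentConfig = true;
  } else {
    m_queue.getPacket(&pkt, 1);               // the poster's own queue API
    data = pkt.data;
    size = (unsigned)pkt.size;
    fromQueue = true;
  }
  if (size == 0) { if (fromQueue) av_free_packet(&pkt); return; }

  if (size > fMaxSize) {
    fFrameSize = fMaxSize;
    fNumTruncatedBytes = size - fMaxSize;
  } else {
    fFrameSize = size;
    fNumTruncatedBytes = 0;
  }
  gettimeofday(&fPresentationTime, NULL);
  memcpy(fTo, data, fFrameSize);
  if (fromQueue) av_free_packet(&pkt);

  FramedSource::afterGetting(this);           // exactly once per doGetNextFrame()
}

Because isCurrentlyAwaitingData() gates every delivery, the configuration 'frame' is not lost when no client is connected yet: it simply stays pending until the first client's request arrives.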
URL: From ssingh at neurosoft.in Fri Oct 18 14:24:41 2013 From: ssingh at neurosoft.in (ssingh at neurosoft.in) Date: Sat, 19 Oct 2013 02:54:41 +0530 Subject: [Live-devel] MPEG4 streaming using ffmpeg as encoder In-Reply-To: <8F5969F3-F5AE-4041-8694-A9E6BEE5DE79@live555.com> References: <2e0a8ca4ed3317e548d9541e86e5f493@neurosoft.in> <8F5969F3-F5AE-4041-8694-A9E6BEE5DE79@live555.com> Message-ID: <54cb33da2b4c74e8c0217dead00ecbb8@neurosoft.in> Thank you for the clarification. I will try to send this extra data as first packet. I have another confusion. if i send this packet as first packet when there was no client attached to it and later on client attached to it than how will that client get this packet as it will be lost. Right now the way I am using FrameSource is: /****************************************************************/ void MyStreamingDeviceSource::streamPacket(AVPacket pkt) { m_queue.putPacket(&pkt); envir().taskScheduler().triggerEvent(m_eventID, this); } /***************************************************************/ so it does not matter if a client is attached or not I keep on sending the data to network. Thanks, Caduceus On 2013-10-19 01:54, Ross Finlayson wrote: > Your "deliverFrame()" function looks OK - provided that it gets called > whenever a new MPEG-4 frame is available. Note that each call to > "MyStreamingDeviceSource::doGetNextFrame()" must be followed > (eventually) by a call to "FramedSource::afterGetting(this)". > Therefore, if you return from "deliverFrame()" at the "if(newFrameSize > == 0) return;" statement (because no new frame data is available), you > must ensure that "deliverFrame()" somehow gets called again later when > new frame data *is* available. > >> when i am doing that I am not getting anything in VLC and >> testRTCPClient > > You should use "testRTSPClient" (sic) for testing first; only after > that's working should you bother using VLC. > >> I suspect that I did not send the configuration data before sending >> the actual data. I saw that my AVPAcket has "00000001B6" in starting >> and I think i somehow need to send "00000001B0" to signal start of >> data. >> >> I also noticed that when i encode using FFMpeg in codec->extradata I >> have some header information that has "00000001B0" in that but i >> dont know how to pack all this information > > Yes, this 'extra data' is important. You should treat it just as you > would a video frame, and make sure that it is the first 'frame' that > you feed to your "MyStreamingDeviceSource::deliverFrame()" is this > 'extra data'. I suspect that this may be enough to make your code > work. > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ [1] > > > Links: > ------ > [1] http://www.live555.com/ > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel From finlayson at live555.com Fri Oct 18 15:17:03 2013 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 18 Oct 2013 15:17:03 -0700 Subject: [Live-devel] MPEG4 streaming using ffmpeg as encoder In-Reply-To: <54cb33da2b4c74e8c0217dead00ecbb8@neurosoft.in> References: <2e0a8ca4ed3317e548d9541e86e5f493@neurosoft.in> <8F5969F3-F5AE-4041-8694-A9E6BEE5DE79@live555.com> <54cb33da2b4c74e8c0217dead00ecbb8@neurosoft.in> Message-ID: <41E68741-A530-4BD8-8875-471D9265E6E8@live555.com> > Thank you for the clarification. I will try to send this extra data as first packet. I have another confusion. 
if i send this packet as first packet when there was no client attached to it and later on client attached to it than how will that client get this packet As long as you deliver this extra data as the first frame - when you do your first delivery - then it will get remembered (as special 'configuration data') by the downstream "MPEG4VideoStreamDiscreteFramer" object. No problem. I assume that your "ServerMediaSubsession" subclass (for your RTSP server) is similar to the code for "MPEG4VideoFileServerMediaSubsession", in that you are implementing not only the virtual functions "createNewStreamSource()" and "createNewRTPSink()", but also the virtual function "getAuxSDPLine()" (in the same way as is done in "MPEG4VideoFileServerMediaSubsession.cpp"). > Right now the way I am using FrameSource is: > > /****************************************************************/ > void MyStreamingDeviceSource::streamPacket(AVPacket pkt) > { > m_queue.putPacket(&pkt); > envir().taskScheduler().triggerEvent(m_eventID, this); > } > /***************************************************************/ > > so it does not matter if a client is attached or not I keep on sending the data to network. OK, so you'll need to add this line to the start of your "deliverFrame()" function: if (!isCurrentlyAwaitingData()) return; // we're not ready for the data yet (see the example in "DeviceSource.cpp") Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From ssingh at neurosoft.in Fri Oct 18 15:19:55 2013 From: ssingh at neurosoft.in (ssingh at neurosoft.in) Date: Sat, 19 Oct 2013 03:49:55 +0530 Subject: [Live-devel] MPEG4 streaming using ffmpeg as encoder In-Reply-To: <54cb33da2b4c74e8c0217dead00ecbb8@neurosoft.in> References: <2e0a8ca4ed3317e548d9541e86e5f493@neurosoft.in> <8F5969F3-F5AE-4041-8694-A9E6BEE5DE79@live555.com> <54cb33da2b4c74e8c0217dead00ecbb8@neurosoft.in> Message-ID: <800ebbb8bca3a2b6d2a8b65bd4045387@neurosoft.in> I tried sending this extra data as first packet and also tried sending it with each packet but no use. I debuggedit and found that when i send this packet my MPEG4discreteFrame found all the information it needed (debugged function analyzeVOLHeader()) but still my tetsRTSPClient does not show any packets received. Thanks On 2013-10-19 02:54, ssingh at neurosoft.in wrote: > Thank you for the clarification. I will try to send this extra data as > first packet. I have another confusion. if i send this packet as first > packet when there was no client attached to it and later on client > attached to it than how will that client get this packet as it will be > lost. Right now the way I am using FrameSource is: > > /****************************************************************/ > void MyStreamingDeviceSource::streamPacket(AVPacket pkt) > { > m_queue.putPacket(&pkt); > envir().taskScheduler().triggerEvent(m_eventID, this); > } > /***************************************************************/ > > so it does not matter if a client is attached or not I keep on sending > the data to network. > > Thanks, > Caduceus > > On 2013-10-19 01:54, Ross Finlayson wrote: >> Your "deliverFrame()" function looks OK - provided that it gets called >> whenever a new MPEG-4 frame is available. Note that each call to >> "MyStreamingDeviceSource::doGetNextFrame()" must be followed >> (eventually) by a call to "FramedSource::afterGetting(this)". 
>> Therefore, if you return from "deliverFrame()" at the "if(newFrameSize >> == 0) return;" statement (because no new frame data is available), you >> must ensure that "deliverFrame()" somehow gets called again later when >> new frame data *is* available. >> >>> when i am doing that I am not getting anything in VLC and >>> testRTCPClient >> >> You should use "testRTSPClient" (sic) for testing first; only after >> that's working should you bother using VLC. >> >>> I suspect that I did not send the configuration data before sending >>> the actual data. I saw that my AVPAcket has "00000001B6" in starting >>> and I think i somehow need to send "00000001B0" to signal start of >>> data. >>> >>> I also noticed that when i encode using FFMpeg in codec->extradata I >>> have some header information that has "00000001B0" in that but i >>> dont know how to pack all this information >> >> Yes, this 'extra data' is important. You should treat it just as you >> would a video frame, and make sure that it is the first 'frame' that >> you feed to your "MyStreamingDeviceSource::deliverFrame()" is this >> 'extra data'. I suspect that this may be enough to make your code >> work. >> >> Ross Finlayson >> Live Networks, Inc. >> http://www.live555.com/ [1] >> >> >> Links: >> ------ >> [1] http://www.live555.com/ >> >> _______________________________________________ >> live-devel mailing list >> live-devel at lists.live555.com >> http://lists.live555.com/mailman/listinfo/live-devel > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel From finlayson at live555.com Fri Oct 18 15:42:32 2013 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 18 Oct 2013 15:42:32 -0700 Subject: [Live-devel] MPEG4 streaming using ffmpeg as encoder In-Reply-To: <800ebbb8bca3a2b6d2a8b65bd4045387@neurosoft.in> References: <2e0a8ca4ed3317e548d9541e86e5f493@neurosoft.in> <8F5969F3-F5AE-4041-8694-A9E6BEE5DE79@live555.com> <54cb33da2b4c74e8c0217dead00ecbb8@neurosoft.in> <800ebbb8bca3a2b6d2a8b65bd4045387@neurosoft.in> Message-ID: <19DEFE4C-0473-4CA8-881E-7864FF88102E@live555.com> > I tried sending this extra data as first packet and also tried sending it with each packet but no use. I debuggedit and found that when i send this packet my MPEG4discreteFrame found all the information it needed (debugged function analyzeVOLHeader()) but still my tetsRTSPClient does not show any packets received. Make sure that your "getAuxSDPLine()" implementation is working OK. In particular, you should make sure that you eventually return from the call to envir().taskScheduler().doEventLoop(&fDoneFlag); and thereby eventually return from the function. If you use the same code that's in "MPEG4VideoFileServerMediaSubsession.cpp", then it should work. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
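For reference, the pattern in "MPEG4VideoFileServerMediaSubsession.cpp" that the reply above points to has roughly this shape. fAuxSDPLine, fDummyRTPSink and fDoneFlag are members as in that file; setDoneFlag() just sets fDoneFlag to a non-zero value, and fCheckTask is an extra member assumed here for the polling task token:

// Start a dummy sink playing from the live source, poll until the sink has
// seen the stream's configuration data, then hand back the SDP line.
static void checkForAuxSDPLine(void* clientData) {
  ((MySubsession*)clientData)->checkForAuxSDPLine1();
}
static void afterPlayingDummy(void* clientData) {
  ((MySubsession*)clientData)->setDoneFlag();   // the dummy playback ended
}

void MySubsession::checkForAuxSDPLine1() {
  char const* dasl;
  if (fAuxSDPLine != NULL) {
    setDoneFlag();                              // already have it
  } else if (fDummyRTPSink != NULL && (dasl = fDummyRTPSink->auxSDPLine()) != NULL) {
    fAuxSDPLine = strDup(dasl);                 // the sink has now seen the config data
    fDummyRTPSink = NULL;
    setDoneFlag();
  } else {
    fCheckTask = envir().taskScheduler().scheduleDelayedTask(
        100000 /* 100 ms */, (TaskFunc*)checkForAuxSDPLine, this);
  }
}

char const* MySubsession::getAuxSDPLine(RTPSink* rtpSink, FramedSource* inputSource) {
  if (fAuxSDPLine != NULL) return fAuxSDPLine;
  if (fDummyRTPSink == NULL) {
    fDummyRTPSink = rtpSink;
    fDummyRTPSink->startPlaying(*inputSource, afterPlayingDummy, this);
    checkForAuxSDPLine(this);
  }
  envir().taskScheduler().doEventLoop(&fDoneFlag);  // returns once setDoneFlag() runs
  return fAuxSDPLine;
}

If this loop never returns, it is because auxSDPLine() keeps returning NULL (the configuration data has still not reached the framer), which is the situation diagnosed later in this thread.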
URL: From ssingh at neurosoft.in Fri Oct 18 16:24:45 2013 From: ssingh at neurosoft.in (ssingh at neurosoft.in) Date: Sat, 19 Oct 2013 04:54:45 +0530 Subject: [Live-devel] MPEG4 streaming using ffmpeg as encoder In-Reply-To: <41E68741-A530-4BD8-8875-471D9265E6E8@live555.com> References: <2e0a8ca4ed3317e548d9541e86e5f493@neurosoft.in> <8F5969F3-F5AE-4041-8694-A9E6BEE5DE79@live555.com> <54cb33da2b4c74e8c0217dead00ecbb8@neurosoft.in> <41E68741-A530-4BD8-8875-471D9265E6E8@live555.com> Message-ID: <48f6283eb5ecf131060d752566019c66@neurosoft.in> Thanks for the input and i rellay appreciate your help. If i subclass ServerMediaSubsession than I need to implement other functions too. Is there a another class which will do most for me. I noticed that OnDemandMediaSubsession is being used for some subsession implementation but is it only for unicast or can i use it for multicast too. I am little confused as for what the ondemand media subsession is used. Thanks, On 2013-10-19 03:47, Ross Finlayson wrote: >> Thank you for the clarification. I will try to send this extra data >> as first packet. I have another confusion. if i send this packet as >> first packet when there was no client attached to it and later on >> client attached to it than how will that client get this packet > > As long as you deliver this extra data as the first frame - when you > do your first delivery - then it will get remembered (as special > 'configuration data') by the downstream > "MPEG4VideoStreamDiscreteFramer" object. No problem. > > I assume that your "ServerMediaSubsession" subclass (for your RTSP > server) is similar to the code for > "MPEG4VideoFileServerMediaSubsession", in that you are implementing > not only the virtual functions "createNewStreamSource()" and > "createNewRTPSink()", but also the virtual function "getAuxSDPLine()" > (in the same way as is done in > "MPEG4VideoFileServerMediaSubsession.cpp"). > >> Right now the way I am using FrameSource is: >> >> /****************************************************************/ >> void MyStreamingDeviceSource::streamPacket(AVPacket pkt) >> { >> m_queue.putPacket(&pkt); >> envir().taskScheduler().triggerEvent(m_eventID, this); >> } >> /***************************************************************/ >> >> so it does not matter if a client is attached or not I keep on >> sending the data to network. > > OK, so you'll need to add this line to the start of your > "deliverFrame()" function: > > if (!isCurrentlyAwaitingData()) return; // we're not ready for the > data yet > > (see the example in "DeviceSource.cpp") > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ [1] > > > Links: > ------ > [1] http://www.live555.com/ > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel From finlayson at live555.com Fri Oct 18 16:48:26 2013 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 18 Oct 2013 16:48:26 -0700 Subject: [Live-devel] MPEG4 streaming using ffmpeg as encoder In-Reply-To: <48f6283eb5ecf131060d752566019c66@neurosoft.in> References: <2e0a8ca4ed3317e548d9541e86e5f493@neurosoft.in> <8F5969F3-F5AE-4041-8694-A9E6BEE5DE79@live555.com> <54cb33da2b4c74e8c0217dead00ecbb8@neurosoft.in> <41E68741-A530-4BD8-8875-471D9265E6E8@live555.com> <48f6283eb5ecf131060d752566019c66@neurosoft.in> Message-ID: > Thanks for the input and i rellay appreciate your help. If i subclass ServerMediaSubsession than I need to implement other functions too. 
Is there a another class which will do most for me. I noticed that OnDemandMediaSubsession is being used for some subsession implementation but is it only for unicast or can i use it for multicast too. I am little confused as for what the ondemand media subsession is used. Yes, for streaming via unicast, you should subclass "OnDemandServerMediaSubsession", and implement the virtual functions "createNewStreamSource()", "createNewRTPSink()", and also (because MPEG-4 is a codec that uses special 'configuration' data) "getAuxSDPLine()". Also, in your subclass's constructor, when it calls the "OnDemandServerMediaSubsession" constructor, be sure to set the "reuseFirstSource" parameter to True (because you're streaming from a live source, rather than from a file). See also http://www.live555.com/liveMedia/faq.html#liveInput-unicast If instead you want to stream via multicast, then use the "testMPEG4VideoStreamer" code as a model, and see http://www.live555.com/liveMedia/faq.html#liveInput Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From ssingh at neurosoft.in Fri Oct 18 17:22:04 2013 From: ssingh at neurosoft.in (ssingh at neurosoft.in) Date: Sat, 19 Oct 2013 05:52:04 +0530 Subject: [Live-devel] MPEG4 streaming using ffmpeg as encoder In-Reply-To: References: <2e0a8ca4ed3317e548d9541e86e5f493@neurosoft.in> <8F5969F3-F5AE-4041-8694-A9E6BEE5DE79@live555.com> <54cb33da2b4c74e8c0217dead00ecbb8@neurosoft.in> <41E68741-A530-4BD8-8875-471D9265E6E8@live555.com> <48f6283eb5ecf131060d752566019c66@neurosoft.in> Message-ID: <05270ecde0d1cabe662507640083e956@neurosoft.in> I tried the unicast server based on onDemand sample but my function "getAuxSDPLine()" never returns. When i debugged i found that function "checkForAuxSDPLine1()" calls the function "fDummyRTPSink->auxSDPLine()" which is the function implemented in MPEG4RTPSink which in turn tried to get the pointer to MPEG4VideoStreamFramer which seems wrong as I put the MPEG4VideoStreamDiscreteFramer as input to RTPSink. Am i doing anything wrong or is it suppose to be like that. It never gets the data required from Framer so it is stuck checking that. Also another question if i need to implement multicast I dont need to subclass Subsession and implement 3 virtual functions. Is my understanding correct? Sorry to bother you that much but I really appreciate your help. On 2013-10-19 05:18, Ross Finlayson wrote: >> Thanks for the input and i rellay appreciate your help. If i >> subclass ServerMediaSubsession than I need to implement other >> functions too. Is there a another class which will do most for me. I >> noticed that OnDemandMediaSubsession is being used for some >> subsession implementation but is it only for unicast or can i use it >> for multicast too. I am little confused as for what the ondemand >> media subsession is used. > > Yes, for streaming via unicast, you should subclass > "OnDemandServerMediaSubsession", and implement the virtual functions > "createNewStreamSource()", "createNewRTPSink()", and also (because > MPEG-4 is a codec that uses special 'configuration' data) > "getAuxSDPLine()". Also, in your subclass's constructor, when it calls > the "OnDemandServerMediaSubsession" constructor, be sure to set the > "reuseFirstSource" parameter to True (because you're streaming from a > live source, rather than from a file). 
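A skeleton of the subclass described above might look like the following; class and helper names are placeholders, MyStreamingDeviceSource is the encoder-fed source discussed in this thread, and the getAuxSDPLine() override follows the MPEG4VideoFileServerMediaSubsession.cpp pattern sketched earlier:

class MyLiveMPEG4Subsession: public OnDemandServerMediaSubsession {
public:
  static MyLiveMPEG4Subsession* createNew(UsageEnvironment& env) {
    return new MyLiveMPEG4Subsession(env);
  }

protected:
  MyLiveMPEG4Subsession(UsageEnvironment& env)
    : OnDemandServerMediaSubsession(env, True /* reuseFirstSource: one live encoder */) {}

  virtual FramedSource* createNewStreamSource(unsigned /*clientSessionId*/,
                                              unsigned& estBitrate) {
    estBitrate = 1000;  // kbps; a rough estimate is fine
    MyStreamingDeviceSource* src = MyStreamingDeviceSource::createNew(envir());
    return MPEG4VideoStreamDiscreteFramer::createNew(envir(), src);
  }

  virtual RTPSink* createNewRTPSink(Groupsock* rtpGroupsock,
                                    unsigned char rtpPayloadTypeIfDynamic,
                                    FramedSource* /*inputSource*/) {
    return MPEG4ESVideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic);
  }

  // Because MPEG-4 video needs its 'config' bytes in the SDP, getAuxSDPLine()
  // is also overridden, following MPEG4VideoFileServerMediaSubsession.cpp.
  virtual char const* getAuxSDPLine(RTPSink* rtpSink, FramedSource* inputSource);
};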
See also > http://www.live555.com/liveMedia/faq.html#liveInput-unicast [1] > > If instead you want to stream via multicast, then use the > "testMPEG4VideoStreamer" code as a model, and see > http://www.live555.com/liveMedia/faq.html#liveInput [2] > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ [3] > > > Links: > ------ > [1] http://www.live555.com/liveMedia/faq.html#liveInput-unicast > [2] http://www.live555.com/liveMedia/faq.html#liveInput > [3] http://www.live555.com/ > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel From finlayson at live555.com Fri Oct 18 18:56:21 2013 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 18 Oct 2013 18:56:21 -0700 Subject: [Live-devel] MPEG4 streaming using ffmpeg as encoder In-Reply-To: <05270ecde0d1cabe662507640083e956@neurosoft.in> References: <2e0a8ca4ed3317e548d9541e86e5f493@neurosoft.in> <8F5969F3-F5AE-4041-8694-A9E6BEE5DE79@live555.com> <54cb33da2b4c74e8c0217dead00ecbb8@neurosoft.in> <41E68741-A530-4BD8-8875-471D9265E6E8@live555.com> <48f6283eb5ecf131060d752566019c66@neurosoft.in> <05270ecde0d1cabe662507640083e956@neurosoft.in> Message-ID: <0E973C19-6A36-49F5-AB06-3E732CB50E16@live555.com> > I tried the unicast server based on onDemand sample but my function "getAuxSDPLine()" never returns. When i debugged i found that function "checkForAuxSDPLine1()" calls the function "fDummyRTPSink->auxSDPLine()" which is the function implemented in MPEG4RTPSink which in turn tried to get the pointer to MPEG4VideoStreamFramer which seems wrong as I put the MPEG4VideoStreamDiscreteFramer as input to RTPSink. Am i doing anything wrong or is it suppose to be like that. It never gets the data required from Framer so it is stuck checking that. Yes, and that's your problem. Note "MPEG4ESVideoRTPSink.cpp", lines 113 and 116. Either the call to framerSource->profile_and_level_indication() is returning 0, or the call to framerSource->getConfigBytes(configLength) is returning NULL. This means that your "MPEG4VideoStreamDiscreteFramer" object (which, BTW, is a subclass of "MPEG4VideoStreamFramer") *did not* receive correct configuration data. In particular, it shows that the code in "MPEG4VideoStreamDiscreteFramer.cpp" from lines 74-94 is not getting executed, despite the fact that you say that you're feeding a chunk of configuration data - beginning with 0xB0 - to your "MPEG4VideoStreamDiscreteFramer" as the first frame of data. So you need to figure out why that's not working. > Also another question if i need to implement multicast I dont need to subclass Subsession and implement 3 virtual functions. Is my understanding correct? Yes. However, you'd still need to have proper configuration data being fed to your "MPEG4VideoStreamDiscreteFramer" object (i.e., the same problem you're having with unicast), otherwise your RTSP server still would not be able to produce a correct SDP description for the stream (and no decoder would ever be able to decode it). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From moco.sun at gmail.com Thu Oct 17 18:34:34 2013 From: moco.sun at gmail.com (yuke sun) Date: Fri, 18 Oct 2013 09:34:34 +0800 Subject: [Live-devel] [help] trick play when play a mkv file by live555 Message-ID: It seems that live555 support RTSP 'trick play' operations for some media types. 
Recently, when I attempted to stream a mkv file through live555, I found that VLC player's trick play functionality doesn't work well. When I seeked to time 1:00, it got the data correctly (this means live555 had seeked to the right file position), but the progress bar of VLC reached 2:00. If I seeked to time 4:00, the progress bar reached 7:30! Is there any bug in live555, or is it VLC's bug? Thanks for the help. Version info: live555: version 0.78, 0.77 vlc: 2.08, 2.10 -------------- next part -------------- An HTML attachment was scrubbed... URL: From fantasyvideo at 126.com Fri Oct 18 02:58:06 2013 From: fantasyvideo at 126.com (=?gb2312?B?w867w7mk1/fK0g==?=) Date: Fri, 18 Oct 2013 17:58:06 +0800 Subject: [Live-devel] Regarding the h264 video stream's rtp packet timestamp Message-ID: <007301cecbe8$8bce5710$a36b0530$@126.com> Hi, In the videoframesource's getNextFrame(), if the buffer is a NAL unit rather than a complete frame, should fPresentationTime and fDurationInMicroseconds only be set when the buffer is the last NAL unit in the current frame? Is that right? -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sun Oct 20 01:23:58 2013 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 20 Oct 2013 01:23:58 -0700 Subject: [Live-devel] [help] trick play when play a mkv file by live555 In-Reply-To: References: Message-ID: On Oct 17, 2013, at 6:34 PM, yuke sun wrote: > It seems that live555 supports RTSP 'trick play' operations for some media types. > Recently, when I attempted to stream a mkv file through live555, I found that VLC player's trick play functionality doesn't work well. When I seeked to time 1:00, it got the data correctly (this means live555 had seeked to the right file position), but the progress bar of VLC reached 2:00. If I seeked to time 4:00, the progress bar reached 7:30! > Is there any bug in live555, or is it VLC's bug? If the seeking appears to be working OK (i.e., the resulting media is coming from the correct time in the file), but the only problem is that VLC's progress bar is wrong, then most likely this is a problem with VLC. To be sure, though, why not put your file on a (publicly accessible) web server, and send us the URL, so we can download and test it for ourselves? Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sun Oct 20 01:29:31 2013 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 20 Oct 2013 01:29:31 -0700 Subject: [Live-devel] Regarding the h264 video stream's rtp packet timestamp In-Reply-To: <007301cecbe8$8bce5710$a36b0530$@126.com> References: <007301cecbe8$8bce5710$a36b0530$@126.com> Message-ID: > In the videoframesource's getNextFrame(), if the buffer is a NAL unit rather than a complete frame, should fPresentationTime and fDurationInMicroseconds only be set when the buffer is the last NAL unit in the current frame? > Is that right? Not quite. "fPresentationTime" should be set for every NAL unit that you deliver. However, for NAL units that make up the same access unit, the "fPresentationTime" value will be the same. Also, if you are streaming from a live source (i.e., from an encoder), rather than from a file, then you don't need to set "fDurationInMicroseconds" at all. 
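For a live source, that might look roughly like the following fragment of a "FramedSource" subclass (a sketch only: the class name, the delivery function, and the "fCurrentAccessUnitTime" struct timeval member are placeholders; "fTo", "fMaxSize", "fFrameSize", "fNumTruncatedBytes", "fPresentationTime" and "fDurationInMicroseconds" are inherited from "FramedSource"):

// Deliver one NAL unit per request, from a live encoder. All NAL units that
// belong to the same access unit share one presentation time.
void MyH264NALUnitSource::deliverNALUnit(u_int8_t* nal, unsigned nalSize,
                                         Boolean firstNALUnitOfAccessUnit) {
  if (firstNALUnitOfAccessUnit) {
    gettimeofday(&fCurrentAccessUnitTime, NULL); // stamp the access unit once
  }

  if (nalSize > fMaxSize) {
    fNumTruncatedBytes = nalSize - fMaxSize;
    nalSize = fMaxSize;
  }
  memmove(fTo, nal, nalSize);
  fFrameSize = nalSize;

  fPresentationTime = fCurrentAccessUnitTime; // same value for every NAL unit of this access unit
  fDurationInMicroseconds = 0;                // live source: leave at the default of 0

  FramedSource::afterGetting(this);
}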
If, however, you are streaming pre-recorded video (e.g., from a file), then you will need to set "fDurationInMicroseconds" for the last NAL unit of the access unit (and leave "fDurationInMicroseconds" for the other NAL units at the default value of 0). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From morten at softwarehuset.dk Mon Oct 21 01:27:23 2013 From: morten at softwarehuset.dk (Morten S. Laursen) Date: Mon, 21 Oct 2013 10:27:23 +0200 Subject: [Live-devel] How to tie libvpx together with live555 Message-ID: Hi A streaming video newbie here, so please be gentle. I'm currently in a project where I'm trying to tie together live555 and libvpx for a live stream. I am however a little confused about how exactly to proceed. I have looked at the testOnDemandRTSPServer which uses the MatroskaFileServerDemux, which I've used for the general outline, however I'm a little confused about to generate the media subsession. As far as I have understood I'm supposed to create a new class describing the subsession inheriting from the OnDemandServerMediaSubsession class. In this class I shall then implement a new method createNewRTPSink, which instantiates and returns a VP8VideoRTPSink. But after this I'm a bit unclear how to tie the data packets from libvpx (vpx_codec_get_cx_data) into the VP8VideoRTPSink, I might be searching using the wrong terms, but the only thing I could find on the mailing list ( http://article.gmane.org/gmane.comp.video.livedotcom.devel/12178) is that I should not use a framer object. Could anyone give me a hint in the right direction? Thank you in advance Morten S. Laursen -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Oct 21 01:55:32 2013 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 21 Oct 2013 01:55:32 -0700 Subject: [Live-devel] How to tie libvpx together with live555 In-Reply-To: References: Message-ID: > I'm currently in a project where I'm trying to tie together live555 and libvpx for a live stream. > I am however a little confused about how exactly to proceed. > I have looked at the testOnDemandRTSPServer which uses the MatroskaFileServerDemux, which I've used for the general outline Because you're streaming from a video-only source, rather than from a video+audio file, it would be better to use the "H.264" code (i.e., on lines 89-101 of "testOnDemandRTSPServer") as a model. > , however I'm a little confused about to generate the media subsession. See http://www.live555.com/liveMedia/faq.html#liveInput-unicast > As far as I have understood I'm supposed to create a new class describing the subsession inheriting from the OnDemandServerMediaSubsession class. > In this class I shall then implement a new method createNewRTPSink, which instantiates and returns a VP8VideoRTPSink. Yes, but you *also* need to implement another virtual function - "createNewStreamSource()". This will create an object of a new "FramedSource" subclass (that you must define) that delivers VP8 frames - one at a time. As noted in the FAQ, it is recommended that you use the "DeviceSource" code (see "liveMedia/DeviceSource.cpp") as a model for this. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From morten at softwarehuset.dk Mon Oct 21 05:29:53 2013 From: morten at softwarehuset.dk (Morten S. 
Laursen) Date: Mon, 21 Oct 2013 14:29:53 +0200 Subject: [Live-devel] How to tie libvpx together with live555 In-Reply-To: References: Message-ID: 2013/10/21 Ross Finlayson > As far as I have understood I'm supposed to create a new class describing > the subsession inheriting from the OnDemandServerMediaSubsession class. > In this class I shall then implement a new method createNewRTPSink, which > instantiates and returns a VP8VideoRTPSink. > > > Yes, but you *also* need to implement another virtual function - > "createNewStreamSource()". This will create an object of a new > "FramedSource" subclass (that you must define) that delivers VP8 frames - > one at a time. As noted in the FAQ, it is recommended that you use the > "DeviceSource" code (see "liveMedia/DeviceSource.cpp") as a model for this. > > Thank you, that helped me quite a lot, now It seams as though the server is running, there still seems to be some problem with the metadata of the videostream preventing mplayer and vlc from showing the stream though, but I'm sure that is just a minor issue that takes a little more fiddling around. (Mplayer: unknown MPlayer format code for MIME type "video/VP8", VLC: no suitable decoder module for fourcc `undf'). Kind regards Morten S. Laursen -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Oct 21 08:39:34 2013 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 21 Oct 2013 08:39:34 -0700 Subject: [Live-devel] How to tie libvpx together with live555 In-Reply-To: References: Message-ID: > (Mplayer: unknown MPlayer format code for MIME type "video/VP8", VLC: no suitable decoder module for fourcc `undf'). You will need to raise this issue with the developers of the respective media players (MPlayer and VLC). But first, make sure that you're using the most up-to-date version of those players. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From ssingh at neurosoft.in Mon Oct 21 10:58:20 2013 From: ssingh at neurosoft.in (ssingh at neurosoft.in) Date: Mon, 21 Oct 2013 23:28:20 +0530 Subject: [Live-devel] MPEG4 streaming using ffmpeg as encoder In-Reply-To: <0E973C19-6A36-49F5-AB06-3E732CB50E16@live555.com> References: <2e0a8ca4ed3317e548d9541e86e5f493@neurosoft.in> <8F5969F3-F5AE-4041-8694-A9E6BEE5DE79@live555.com> <54cb33da2b4c74e8c0217dead00ecbb8@neurosoft.in> <41E68741-A530-4BD8-8875-471D9265E6E8@live555.com> <48f6283eb5ecf131060d752566019c66@neurosoft.in> <05270ecde0d1cabe662507640083e956@neurosoft.in> <0E973C19-6A36-49F5-AB06-3E732CB50E16@live555.com> Message-ID: <5536c26af515784adba4d2a21b7635c0@neurosoft.in> Thank you Ross for all the help. I finally managed to get it working and now i have both audio and video being streamed. I dont know but somehow the refresh rate (as seen on vlc) is not good. My video frames comes incomplete and it takes time for vlc to paint the whole frame and audio stops after about 1 second. I thought it has something to do with my router which is wireless and is about 54Mbps. I had same issue running testMPEG1or2AudiovideoStreamer.exe with VLC as client and when i increased the router speed to 100 Mbps the issue went away for test program but its still there in my program. I dont know what I am doing wrong. I tried messing with estimateBandWidth varibale and also fDurationInMicroseconds but did not solve this issue. Do you know any possible reason for this. 
Thanks for all the help and I appreciate you inputs and time spend on ansewring the questions. On 2013-10-19 07:26, Ross Finlayson wrote: >> I tried the unicast server based on onDemand sample but my function >> "getAuxSDPLine()" never returns. When i debugged i found that >> function "checkForAuxSDPLine1()" calls the function >> "fDummyRTPSink->auxSDPLine()" which is the function implemented in >> MPEG4RTPSink which in turn tried to get the pointer to >> MPEG4VideoStreamFramer which seems wrong as I put the >> MPEG4VideoStreamDiscreteFramer as input to RTPSink. Am i doing >> anything wrong or is it suppose to be like that. It never gets the >> data required from Framer so it is stuck checking that. > > Yes, and that's your problem. Note "MPEG4ESVideoRTPSink.cpp", lines > 113 and 116. Either the call to > framerSource->profile_and_level_indication() > is returning 0, or the call to > framerSource->getConfigBytes(configLength) > is returning NULL. This means that your > "MPEG4VideoStreamDiscreteFramer" object (which, BTW, is a subclass of > "MPEG4VideoStreamFramer") *did not* receive correct configuration > data. In particular, it shows that the code in > "MPEG4VideoStreamDiscreteFramer.cpp" from lines 74-94 is not getting > executed, despite the fact that you say that you're feeding a chunk of > configuration data - beginning with 0xB0 - to your > "MPEG4VideoStreamDiscreteFramer" as the first frame of data. So you > need to figure out why that's not working. > >> Also another question if i need to implement multicast I dont need >> to subclass Subsession and implement 3 virtual functions. Is my >> understanding correct? > > Yes. However, you'd still need to have proper configuration data being > fed to your "MPEG4VideoStreamDiscreteFramer" object (i.e., the same > problem you're having with unicast), otherwise your RTSP server still > would not be able to produce a correct SDP description for the stream > (and no decoder would ever be able to decode it). > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ [1] > > > Links: > ------ > [1] http://www.live555.com/ > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel From ssingh at neurosoft.in Mon Oct 21 12:35:11 2013 From: ssingh at neurosoft.in (ssingh at neurosoft.in) Date: Tue, 22 Oct 2013 01:05:11 +0530 Subject: [Live-devel] MPEG4 streaming using ffmpeg as encoder In-Reply-To: <5536c26af515784adba4d2a21b7635c0@neurosoft.in> References: <2e0a8ca4ed3317e548d9541e86e5f493@neurosoft.in> <8F5969F3-F5AE-4041-8694-A9E6BEE5DE79@live555.com> <54cb33da2b4c74e8c0217dead00ecbb8@neurosoft.in> <41E68741-A530-4BD8-8875-471D9265E6E8@live555.com> <48f6283eb5ecf131060d752566019c66@neurosoft.in> <05270ecde0d1cabe662507640083e956@neurosoft.in> <0E973C19-6A36-49F5-AB06-3E732CB50E16@live555.com> <5536c26af515784adba4d2a21b7635c0@neurosoft.in> Message-ID: <58bb23acee2c47c5a36a718ce30c762e@neurosoft.in> Another thing I noticed is that the VLC shows lower bitrate. When I use tets programs it shows > 1000 and with my program its showing around 100-200, which can be a source of problem. Which factor determimes bitrate in live555. In one of your post you said its entirely 'fPresentationTime'. I am setting it to getTimeOfDay(). Is that wrong? On 2013-10-21 23:28, ssingh at neurosoft.in wrote: > Thank you Ross for all the help. I finally managed to get it working > and now i have both audio and video being streamed. 
I dont know but > somehow the refresh rate (as seen on vlc) is not good. My video frames > comes incomplete and it takes time for vlc to paint the whole frame > and audio stops after about 1 second. I thought it has something to do > with my router which is wireless and is about 54Mbps. I had same issue > running testMPEG1or2AudiovideoStreamer.exe with VLC as client and when > i increased the router speed to 100 Mbps the issue went away for test > program but its still there in my program. I dont know what I am doing > wrong. I tried messing with estimateBandWidth varibale and also > fDurationInMicroseconds but did not solve this issue. Do you know any > possible reason for this. > > Thanks for all the help and I appreciate you inputs and time spend on > ansewring the questions. > > > On 2013-10-19 07:26, Ross Finlayson wrote: >>> I tried the unicast server based on onDemand sample but my function >>> "getAuxSDPLine()" never returns. When i debugged i found that >>> function "checkForAuxSDPLine1()" calls the function >>> "fDummyRTPSink->auxSDPLine()" which is the function implemented in >>> MPEG4RTPSink which in turn tried to get the pointer to >>> MPEG4VideoStreamFramer which seems wrong as I put the >>> MPEG4VideoStreamDiscreteFramer as input to RTPSink. Am i doing >>> anything wrong or is it suppose to be like that. It never gets the >>> data required from Framer so it is stuck checking that. >> >> Yes, and that's your problem. Note "MPEG4ESVideoRTPSink.cpp", lines >> 113 and 116. Either the call to >> framerSource->profile_and_level_indication() >> is returning 0, or the call to >> framerSource->getConfigBytes(configLength) >> is returning NULL. This means that your >> "MPEG4VideoStreamDiscreteFramer" object (which, BTW, is a subclass of >> "MPEG4VideoStreamFramer") *did not* receive correct configuration >> data. In particular, it shows that the code in >> "MPEG4VideoStreamDiscreteFramer.cpp" from lines 74-94 is not getting >> executed, despite the fact that you say that you're feeding a chunk of >> configuration data - beginning with 0xB0 - to your >> "MPEG4VideoStreamDiscreteFramer" as the first frame of data. So you >> need to figure out why that's not working. >> >>> Also another question if i need to implement multicast I dont need >>> to subclass Subsession and implement 3 virtual functions. Is my >>> understanding correct? >> >> Yes. However, you'd still need to have proper configuration data being >> fed to your "MPEG4VideoStreamDiscreteFramer" object (i.e., the same >> problem you're having with unicast), otherwise your RTSP server still >> would not be able to produce a correct SDP description for the stream >> (and no decoder would ever be able to decode it). >> >> Ross Finlayson >> Live Networks, Inc. >> http://www.live555.com/ [1] >> >> >> Links: >> ------ >> [1] http://www.live555.com/ >> >> _______________________________________________ >> live-devel mailing list >> live-devel at lists.live555.com >> http://lists.live555.com/mailman/listinfo/live-devel > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel From ssingh at neurosoft.in Mon Oct 21 16:48:57 2013 From: ssingh at neurosoft.in (ssingh at neurosoft.in) Date: Mon, 21 Oct 2013 16:48:57 -0700 Subject: [Live-devel] True push DeviceSource Message-ID: Hi, I am confused as how the event mechanism works in live555. 
I have a source that is fed with video and audio frames and I want to trigger doGetNextFrame() of my custom DeviceSource so that those frames are streamed using live555. For this I am using m_eventID = envir().taskScheduler().createEventTrigger(deliverFrame0); envir().taskScheduler().triggerEvent(m_eventID, this); and void MyStreamingDeviceSource::deliverFrame0(void* clientData) { ((MyStreamingDeviceSource*)clientData)->doGetNextFrame(); } But doGetFrame is called when I called videoSink->startPlaying() too which is not valid for me as I dont have any data yet to stream. I dont know what shall i do in this scenario. I am also trying to use isCurrentlyAwaitingData() but I am not sure what this function does. I checked in source and its just checking one boolean variable. Also if i return prematurely after checking that i dont have any data it gives internalError() until i call FrameSource::afterGetting() which again is not valid as i dont have any data and there is no point telling the sink to stream null data. So are the DeviceSources acts as Pull source or push source in live555. I am interested in push source which i can derive depending upon the data i receieve from other modules. Thanks, Caduceus From michel.promonet at thalesgroup.com Tue Oct 22 01:01:57 2013 From: michel.promonet at thalesgroup.com (PROMONET Michel) Date: Tue, 22 Oct 2013 10:01:57 +0200 Subject: [Live-devel] RTSP over HTTP in URL Message-ID: <27559_1382428922_526630FA_27559_242_1_1BE8971B6CFF3A4F97AF4011882AA25501563E77ABEF@THSONEA01CMS01P.one.grp> Hi Ross, By now selecting RTSP over http in the RTSP client need to specify the port in its constructor and it works fine. In such a situation the port specified in the URL is ignored. I wondering whether it could be more convenient to select the HTTP tunneling through the URL and decode the port. Perhaps : - rtspoverhttp://ip:port/... - rtsp:http://ip:port/... It did not find something standard, some cameras use a ?Transport=HTTP argument. Best Regards, Michel. [@@ THALES GROUP INTERNAL @@] -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Oct 22 02:09:33 2013 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 22 Oct 2013 02:09:33 -0700 Subject: [Live-devel] RTSP over HTTP in URL In-Reply-To: <27559_1382428922_526630FA_27559_242_1_1BE8971B6CFF3A4F97AF4011882AA25501563E77ABEF@THSONEA01CMS01P.one.grp> References: <27559_1382428922_526630FA_27559_242_1_1BE8971B6CFF3A4F97AF4011882AA25501563E77ABEF@THSONEA01CMS01P.one.grp> Message-ID: <307AC5FD-222A-44BA-9E9E-67D45B82CD32@live555.com> > By now selecting RTSP over http in the RTSP client need to specify the port in its constructor and it works fine. > > In such a situation the port specified in the URL is ignored. That's correct. > I wondering whether it could be more convenient to select the HTTP tunneling through the URL and decode the port. > Perhaps : > - rtspoverhttp://ip:port/? > - rtsp:http://ip:port/? Neither of those are standard, and the current mechanism works OK. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From morten at softwarehuset.dk Tue Oct 22 03:58:50 2013 From: morten at softwarehuset.dk (Morten S. 
Laursen) Date: Tue, 22 Oct 2013 12:58:50 +0200 Subject: [Live-devel] How to tie libvpx together with live555 In-Reply-To: References: Message-ID: Hi Ross, Just tested with the latest version from the respective repository, and it does indeed seem to be working with these. Thank you very much for your suggestions, I really appreciate it. I did assume it was something on my end, as I thought support was more widespread? It even seems like google's own API for android has problems playing back the stream, even though Google explicitly lists support for both RTSP and VP8 (https://developer.android.com/guide/appendix/media-formats.html). Is there some defacto standard normally asuming the video to be wrapped in a container or do you have any other ideas about the compatibility issues? Kind regards Morten S. Laursen 2013/10/21 Ross Finlayson > (Mplayer: unknown MPlayer format code for MIME type "video/VP8", VLC: no > suitable decoder module for fourcc `undf'). > > > You will need to raise this issue with the developers of the respective > media players (MPlayer and VLC). But first, make sure that you're using > the most up-to-date version of those players. > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Oct 22 05:54:41 2013 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 22 Oct 2013 05:54:41 -0700 Subject: [Live-devel] How to tie libvpx together with live555 In-Reply-To: References: Message-ID: <1954D29F-9300-4053-9541-125C6114664E@live555.com> > It even seems like google's own API for android has problems playing back the stream, even though Google explicitly lists support for both RTSP and VP8 (https://developer.android.com/guide/appendix/media-formats.html). Why don't you ask them (i.e., Google) about this? Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Oct 22 06:05:39 2013 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 22 Oct 2013 06:05:39 -0700 Subject: [Live-devel] FYI: Updates to the RTSP "REGISTER" mechanism Message-ID: I have installed a new version (2013.10.22) of the "LIVE555 Streaming Media" software that updates and improves our custom RTSP "REGISTER" mechanism. In particular: - In the "testProgs" directory, there is now a utility application "registerRTSPStream" that registers a given "rtsp://" URL with a remote RTSP client or proxy server. (Bob Bischan may find this useful.) - The "LIVE555 Proxy Server" now takes an optional command-line argument -U that allows you to specify access control for incoming "REGISTER" requests. If you are using the 'remote registration' feature of the proxy server (i.e., with the -R option), and your proxy server is accessible on the public Internet, then it is recommended that you also use the "-U " option, to stop random people from registering streams with your proxy server without your knowledge. (Note: The "registerRTSPStream" application noted above also lets you specify a username and password.) - The API to "RTSPServer::registerStream()" has changed (including the addition of optional username and password parameters). Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Oct 22 06:13:36 2013 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 22 Oct 2013 06:13:36 -0700 Subject: [Live-devel] True push DeviceSource In-Reply-To: References: Message-ID: <1413CC54-024C-4A2E-BBC1-60BEE969E30C@live555.com> > I am confused as how the event mechanism works in live555. I have a source that is fed with video and audio frames and I want to trigger doGetNextFrame() of my custom DeviceSource so that those frames are streamed using live555. For this I am using > > > m_eventID = envir().taskScheduler().createEventTrigger(deliverFrame0); > envir().taskScheduler().triggerEvent(m_eventID, this); The "this" in the "triggerEvent()" call is wrong, because you should not be calling "triggerEvent()" from within one of your 'DeviceSource' class's member functions. "triggerEvent()" should be called from a *separate thread* - the thread that is doing your encoding. Because this separate thread is not the LIVE555 thread, then "triggerEvent()" is the *only* LIVE555 code that it is allowed to be calling. > void MyStreamingDeviceSource::deliverFrame0(void* clientData) > { > ((MyStreamingDeviceSource*)clientData)->doGetNextFrame(); > } No, don't do this. "deliverFrame0()" should call "deliverFrame()", as illustrated in the "DeviceSource" code. > But doGe[Next]tFrame is called when I called videoSink->startPlaying() too which is not valid for me as I dont have any data yet to stream. That's OK. When data later *does* become available, then your separate 'encoder' thread will call "triggerEvent()", and then "deliverFrame()" will be called. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From bbischan at watchtower-security.com Tue Oct 22 10:43:38 2013 From: bbischan at watchtower-security.com (Bob Bischan) Date: Tue, 22 Oct 2013 12:43:38 -0500 Subject: [Live-devel] Proxy Server (Multiple Client / Recorded File Issue) Message-ID: When running Proxy Server in a multiple client configuration there appears to be an obscure problem (html5: media error) with recording h264/mp4 files. If I run a single recording application (VLC VLM based) pulling streams from Proxy Server I do not encounter this error. Files work just fine on desktop (VLC,Mplayer, Quicktime..etc) and in html5 video tag. When a second application (crtmpserver / Flash server) is run simultaneously pulling the same streams from Proxy Server, the h264/mp4 recordings from the VLC/VLM based recording no longer validate as html5 video. This observation also appears to depend on OS / Browser type and version??? The one consistent variable has been...If one client is connected to Proxy Server the problem goes away. Debugging of the files has been difficult as they will always play in desktop client, but I did find the following info in chrome debug: MEDIA_ERR_ABORTED: 1 MEDIA_ERR_DECODE: 3 MEDIA_ERR_ENCRYPTED: 5 MEDIA_ERR_NETWORK: 2 MEDIA_ERR_SRC_NOT_SUPPORTED: 4 Despite this issue, Proxy Server is working well with both applications...with no errors or other performance issues. Live555 version: 2013.10.18 I'm at a loss to explain this issue. Below is a link to sample files should anyone have time to take a look :-) http://184.175.65.70/temp/ html5_bad.mp4 html5_ok.mp4 Thanks, Bob -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Tue Oct 22 11:11:33 2013 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 22 Oct 2013 11:11:33 -0700 Subject: [Live-devel] Proxy Server (Multiple Client / Recorded File Issue) In-Reply-To: References: Message-ID: <495ADFC3-D85D-4B7C-BDE4-CFC13C28405A@live555.com> Unfortunately your description of your problem wasn't particularly clear; it seems, though, that you should be investigating your recording application(s), to figure out why they're not giving you what you want. (In particular, you could investigate how well they handle packet (i.e., data) loss, which is more likely when you have more streaming. However, you do realize - I hope - that when you stream through our proxy server, there's only *one* stream coming from the back-end server (to the proxy server), regardless of how many front-end clients are currently accessing the proxy server. That means, therefore, that if your back-end stream comes from a file (rather than from a live source like a network camera), then only the first front-end client will get to see the stream starting from the beginning of the file. Subsequent front-end clients will (obviously) start receiving the stream from some later point. If that's important to your recording application(s), then they shouldn't be receiving the stream this way. (Instead, just copy the file directly :-) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From bbischan at watchtower-security.com Tue Oct 22 11:42:18 2013 From: bbischan at watchtower-security.com (Bob Bischan) Date: Tue, 22 Oct 2013 13:42:18 -0500 Subject: [Live-devel] Proxy Server (Multiple Client / Recorded File Issue) In-Reply-To: <495ADFC3-D85D-4B7C-BDE4-CFC13C28405A@live555.com> References: <495ADFC3-D85D-4B7C-BDE4-CFC13C28405A@live555.com> Message-ID: The back-end servers in this case are feeding a live stream to Proxy Server. From there I have one application that connects to Proxy Server to convert the rtsp:// streams to rtmp:// for flash presentation. Simultaneously, there is a second application that connects to Proxy Server to record the streams to file. Both of these applications are behaving very well with Proxy Server (rtmp:// streams are working....recording is consistent)....just the unusual file issue in the case where both applications are running together. If I shutdown crtmpserver and leave just recording application running the recorded files are back to normal. It's somewhat like a Rubik's Cube when putting things together in this fashion, but overall it's working better than I would have expected. I guess my question would be this: Assuming all back-end servers are live rtsp:// streams going to Proxy Server. Would each front-end client connection to Proxy Server be isolated from the other...would the characteristics of the stream provided by Proxy Server be as if each application was connected in isolation? Bob On Tue, Oct 22, 2013 at 1:11 PM, Ross Finlayson wrote: > Unfortunately your description of your problem wasn't particularly clear; > it seems, though, that you should be investigating your recording > application(s), to figure out why they're not giving you what you want. > (In particular, you could investigate how well they handle packet (i.e., > data) loss, which is more likely when you have more streaming. 
> > However, you do realize - I hope - that when you stream through our proxy > server, there's only *one* stream coming from the back-end server (to the > proxy server), regardless of how many front-end clients are currently > accessing the proxy server. That means, therefore, that if your back-end > stream comes from a file (rather than from a live source like a network > camera), then only the first front-end client will get to see the stream > starting from the beginning of the file. Subsequent front-end clients will > (obviously) start receiving the stream from some later point. If that's > important to your recording application(s), then they shouldn't be > receiving the stream this way. (Instead, just copy the file directly :-) > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Oct 22 11:54:07 2013 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 22 Oct 2013 11:54:07 -0700 Subject: [Live-devel] Proxy Server (Multiple Client / Recorded File Issue) In-Reply-To: References: <495ADFC3-D85D-4B7C-BDE4-CFC13C28405A@live555.com> Message-ID: <4841A6F1-0CB5-48F0-9EF3-E9BB9BD7637B@live555.com> > I guess my question would be this: Assuming all back-end servers are live rtsp:// streams going to Proxy Server. Would each front-end client connection to Proxy Server be isolated from the other...would the characteristics of the stream provided by Proxy Server be as if each application was connected in isolation? Yes, it should. In fact, when streaming to >1 front-end clients, the proxy server transmits the exact same RTP/RTCP packets to each client. So the only difference should be the somewhat increased probability of packet loss (due to the Nx increase in packets being sent). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From bbischan at watchtower-security.com Tue Oct 22 12:39:32 2013 From: bbischan at watchtower-security.com (Bob Bischan) Date: Tue, 22 Oct 2013 14:39:32 -0500 Subject: [Live-devel] Proxy Server (Multiple Client / Recorded File Issue) In-Reply-To: <4841A6F1-0CB5-48F0-9EF3-E9BB9BD7637B@live555.com> References: <495ADFC3-D85D-4B7C-BDE4-CFC13C28405A@live555.com> <4841A6F1-0CB5-48F0-9EF3-E9BB9BD7637B@live555.com> Message-ID: That makes sense. I think I need to change my environment to get a better sense of what the underlying problem might be. Currently, I have one application residing on the same server as the proxy and the other is out on amazon cloud. in this scenario proxy server is feeding 2 clients with vastly different network characteristics. In the first case the client would have very little packet loss and low latency and the latter more of both. I will try running both clients on the same server as proxy to see if the outcome is different. thanks for the input...much appreciated. bob On Oct 22, 2013 2:09 PM, "Ross Finlayson" wrote: > I guess my question would be this: Assuming all back-end servers are live > rtsp:// streams going to Proxy Server. Would each front-end client > connection to Proxy Server be isolated from the other...would the > characteristics of the stream provided by Proxy Server be as if each > application was connected in isolation? > > > Yes, it should. 
In fact, when streaming to >1 front-end clients, the > proxy server transmits the exact same RTP/RTCP packets to each client. So > the only difference should be the somewhat increased probability of packet > loss (due to the Nx increase in packets being sent). > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From fantasyvideo at 126.com Tue Oct 22 02:18:10 2013 From: fantasyvideo at 126.com (Tony) Date: Tue, 22 Oct 2013 17:18:10 +0800 (CST) Subject: [Live-devel] Regarding the h264 video stream's rtp packet timestamp In-Reply-To: References: <007301cecbe8$8bce5710$a36b0530$@126.com> Message-ID: <165d1067.3256e.141df750527.Coremail.fantasyvideo@126.com> Thanks for your answer. There is another question, about scheduleDelayedTask(duration,x,x): how should I set the duration so that the audio and video will be in sync? Currently, each of my audio frames is 20000ms, and the video frame rate is 25fps. Now I set the audio's next getframe time to 20000ms and the video's next getframe time to 8000ms. In that case, when I use VLC to access the stream, it reports that the audio "PTS is out of range"; it seems that the audio is too late. If I slow the video send rate instead, the result is that the video "PTS is out of range". So is there any solution to this? At 2013-10-20 16:29:31, "Ross Finlayson" wrote: In the videoframesource's getNextFrame(), if the buffer is a NAL unit rather than a complete frame, should fPresentationTime and fDurationInMicroseconds only be set when the buffer is the last NAL unit in the current frame? Is that right? Not quite. "fPresentationTime" should be set for every NAL unit that you deliver. However, for NAL units that make up the same access unit, the "fPresentationTime" value will be the same. Also, if you are streaming from a live source (i.e., from an encoder), rather than from a file, then you don't need to set "fDurationInMicroseconds" at all. If, however, you are streaming pre-recorded video (e.g., from a file), then you will need to set "fDurationInMicroseconds" for the last NAL unit of the access unit (and leave "fDurationInMicroseconds" for the other NAL units at the default value of 0). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From ssingh at neurosoft.in Wed Oct 23 11:49:06 2013 From: ssingh at neurosoft.in (ssingh at neurosoft.in) Date: Wed, 23 Oct 2013 11:49:06 -0700 Subject: [Live-devel] True push DeviceSource In-Reply-To: <1413CC54-024C-4A2E-BBC1-60BEE969E30C@live555.com> References: <1413CC54-024C-4A2E-BBC1-60BEE969E30C@live555.com> Message-ID: <8233b9d2a724a88c383a4ec9a27e71f8@neurosoft.in> Thank you Ross for the clarification; it's clearer now. The issue I am now facing is that I have a separate thread that pushes audio packets for my device source to stream, and I trigger the event each time I push a packet to that queue. I noticed that on VLC my audio plays for about a second and then stops. When I debugged my code, I found that I have more than 1000 packets in my audio queue waiting to be streamed by the device source. I think what is happening is that whenever I trigger an event while that event is already pending, it gets ignored - is that correct? What is the correct way to handle this? 
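One common way to handle this - sketched below along the general lines of "liveMedia/DeviceSource.cpp", with placeholder names such as "MyDeviceSource", "EncodedFrame" and "fFrameQueue" - is to leave the frames in the (thread-safe) queue, and have both the triggered handler and "doGetNextFrame()" deliver one queued frame whenever the downstream sink is actually awaiting data, so that a backlog drains as fast as the sink asks for it:

// Encoder thread: called after a frame has been enqueued, to signal the
// LIVE555 thread. ("eventTriggerId" is a static member created earlier with
// "createEventTrigger(deliverFrame0)", as in "DeviceSource.cpp".)
void signalNewFrameData(TaskScheduler* ourScheduler, MyDeviceSource* ourSource) {
  if (ourScheduler != NULL) {
    ourScheduler->triggerEvent(MyDeviceSource::eventTriggerId, ourSource);
  }
}

void MyDeviceSource::doGetNextFrame() {
  // If a frame is already waiting, deliver it now rather than waiting for the
  // next event trigger (triggers may be coalesced):
  if (!fFrameQueue.empty()) deliverFrame();
}

void MyDeviceSource::deliverFrame0(void* clientData) {
  ((MyDeviceSource*)clientData)->deliverFrame();
}

void MyDeviceSource::deliverFrame() {
  if (!isCurrentlyAwaitingData()) return; // the sink hasn't asked for data yet

  EncodedFrame* f = fFrameQueue.pop(); // placeholder: oldest queued frame, or NULL
  if (f == NULL) return;

  unsigned size = f->size;
  if (size > fMaxSize) {
    fNumTruncatedBytes = size - fMaxSize;
    size = fMaxSize;
  }
  memmove(fTo, f->data, size);
  fFrameSize = size;
  fPresentationTime = f->captureTime; // stamped by the encoder thread
  delete f;

  FramedSource::afterGetting(this); // the sink will ask again for the next frame
}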
I think the audio packets should be streamed at the same rate as they are being encoded from live source. Thanks On 2013-10-22 06:13, Ross Finlayson wrote: >> I am confused as how the event mechanism works in live555. I have a >> source that is fed with video and audio frames and I want to trigger >> doGetNextFrame() of my custom DeviceSource so that those frames are >> streamed using live555. For this I am using >> >> m_eventID = >> envir().taskScheduler().createEventTrigger(deliverFrame0); >> envir().taskScheduler().triggerEvent(m_eventID, this); > > The "this" in the "triggerEvent()" call is wrong, because you should > not be calling "triggerEvent()" from within one of your 'DeviceSource' > class's member functions. "triggerEvent()" should be called from a > *separate thread* - the thread that is doing your encoding. Because > this separate thread is not the LIVE555 thread, then "triggerEvent()" > is the *only* LIVE555 code that it is allowed to be calling. > >> void MyStreamingDeviceSource::deliverFrame0(void* clientData) >> { >> ((MyStreamingDeviceSource*)clientData)->doGetNextFrame(); >> } > > No, don't do this. "deliverFrame0()" should call "deliverFrame()", as > illustrated in the "DeviceSource" code. > >> But doGe[Next]tFrame is called when I called >> videoSink->startPlaying() too which is not valid for me as I dont >> have any data yet to stream. > > That's OK. When data later *does* become available, then your separate > 'encoder' thread will call "triggerEvent()", and then "deliverFrame()" > will be called. > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ [1] > > > Links: > ------ > [1] http://www.live555.com/ > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel From bbischan at watchtower-security.com Wed Oct 23 14:16:49 2013 From: bbischan at watchtower-security.com (Bob Bischan) Date: Wed, 23 Oct 2013 16:16:49 -0500 Subject: [Live-devel] Proxy Server (Multiple Client / Recorded File Issue) In-Reply-To: <4841A6F1-0CB5-48F0-9EF3-E9BB9BD7637B@live555.com> References: <495ADFC3-D85D-4B7C-BDE4-CFC13C28405A@live555.com> <4841A6F1-0CB5-48F0-9EF3-E9BB9BD7637B@live555.com> Message-ID: Ross, Wanted to follow-up on this issue. I have gone through additional testing throughout the day and have greatly simplified my testing scenario, however I'm still experiencing the strange file issue as described previously. After making changes in my test environment and making debug observations, I'm quite sure there is something amiss. The problem does not lend itself to easy explanation, but I'll try my best to communicate what I'm observing in my testing. Test Environment: Single back-end server (Axis camera) Proxy Server running on an isolated server Two test clients running on separate computers Proxy Server / Back-end server / clients all connected on local LAN Test Steps: 1. Start ProxyServer with -V -R options 2. Registered the single back-end server (works as expected; using new transport header options) . BTW, the option for setting suffix is a nice additon! 3. Connected a single client (VLC) to record the proxy stream to file. (Connects fine and I see debug output recording the connection) 4. Connect another client (VLC) to proxy stream to record a second file. (Connects fine, but I do not see any debug output showing this connection???) 5. Stop Recording on first client. 6 Stop recording on second client. 
Both clients produce mp4 files that have a problem with being validated as html5 video. If I run the same test scenario with just one client the file is different. It now validates as html5 video. I can reproduce this over and over with no change in the outcome. It almost seems like when running simultaneously the 2 clients are sharing the same rtsp session? Could it be that when one client sends STOP (stopping recording) and the other client is still playing/recording the stream Proxy Server does not respond to the STOP from the other client? I have attached a debug file that reflects steps (1-6). If you could take a quick look to see if there something obvious that would be great. From what I can see it seems given the steps outlined that the debug file would have been different. Also, if you have any suggestions on additional steps or debugging that I can do that would be great. Bob On Tue, Oct 22, 2013 at 1:54 PM, Ross Finlayson wrote: > I guess my question would be this: Assuming all back-end servers are live > rtsp:// streams going to Proxy Server. Would each front-end client > connection to Proxy Server be isolated from the other...would the > characteristics of the stream provided by Proxy Server be as if each > application was connected in isolation? > > > Yes, it should. In fact, when streaming to >1 front-end clients, the > proxy server transmits the exact same RTP/RTCP packets to each client. So > the only difference should be the somewhat increased probability of packet > loss (due to the Nx increase in packets being sent). > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: proxy-dev.log Type: text/x-log Size: 18259 bytes Desc: not available URL: From finlayson at live555.com Wed Oct 23 14:48:40 2013 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 23 Oct 2013 14:48:40 -0700 Subject: [Live-devel] Regarding the h264 video stream's rtp packet timestamp In-Reply-To: <165d1067.3256e.141df750527.Coremail.fantasyvideo@126.com> References: <007301cecbe8$8bce5710$a36b0530$@126.com> <165d1067.3256e.141df750527.Coremail.fantasyvideo@126.com> Message-ID: > So how should I set the duration, then the audio and video would be sync. You don't. The way you ensure that your audio and video streams are in sync is by having your server give each (audio and video) frame accurate presentation times - i.e., in the setting of "fPresentationTime" by the (audio and video) "FramedSource" subclasses. If you do this, then the receiving client(s) will properly synchronize audio and video. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From parkchan1960 at gmail.com Wed Oct 23 00:18:41 2013 From: parkchan1960 at gmail.com (Pak Man Chan) Date: Wed, 23 Oct 2013 15:18:41 +0800 Subject: [Live-devel] HTTP Live Streaming Message-ID: Hi, I've come across the following problems with HTTP Live Streaming, 1. Replies to HTTP GET requests are sometime truncated. 
As an example, curl http://serverip/somets.ts will sometimes result in only part of the playlist I've traced this to fNumBytesToStream is not being initialized when created in handleHTTPCmd_StreamingGET, this caused doGetNextFrame to sometimes ended prematurely. Should fNumBytesToStream be initialized to bufferSize too? 2. When streaming is in progress and client disconnected in the middle of a transfer, mediaServer will keep trying sending to the disconnected client in TCPStreamSink::processBuffer and will not respond to further requests. Should there be a EPIPE check in the send call there? Thank you, Park. -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Oct 23 23:36:23 2013 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 23 Oct 2013 23:36:23 -0700 Subject: [Live-devel] Proxy Server (Multiple Client / Recorded File Issue) In-Reply-To: References: <495ADFC3-D85D-4B7C-BDE4-CFC13C28405A@live555.com> <4841A6F1-0CB5-48F0-9EF3-E9BB9BD7637B@live555.com> Message-ID: <381AB845-D1BB-46F2-BC3E-0D35C97308F7@live555.com> > 2. Registered the single back-end server (works as expected; using new transport header options) . BTW, the option for setting suffix is a nice additon! FYI, you can now use our new application "registerRTSPStream" (in the "testProgs" directory) for this. > 3. Connected a single client (VLC) to record the proxy stream to file. (Connects fine and I see debug output recording the connection) > > 4. Connect another client (VLC) to proxy stream to record a second file. (Connects fine, but I do not see any debug output showing this connection???) That's normal, and expected. What's important to understand is that the diagnostic output generated by the "LIVE555 Proxy Server" (when you give it the "-V" option) is only for the proxying functionality (between the proxy server and the back-end server). Proxying starts (by sending "SETUP" and "PLAY" commands) once the first front-end client connects, and stops (by sending a "PAUSE" command) once the last front-end client disconnects. During that time, the proxy server will continue to serve an arbitrary number of connecting/disconnecting front-end clients, but won't have any effect on the connection between the proxy server and the back-end server - until the last front-end client disconnects. That's why you don't see any additional diagnostic output when the second front-end client connects - but data will still be streamed to that front-end client, as expected. > I have attached a debug file that reflects steps (1-6). If you could take a quick look to see if there something obvious that would be great. Your file looks OK; the proxy server appears to be working normally. Once again, what I suspect is happening is that when you have N>1 front-end clients, you are experiencing some packet loss (because, in this case, each outgoing RTP packet from the proxy server is being transmitted N separate times (once to each front-end client)). And apparently your stream recording software is not tolerant of data loss. You can verify this by instead running "openRTSP" with the "-Q" (report QOS) option as your front-end clients. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Wed Oct 23 23:51:19 2013 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 23 Oct 2013 23:51:19 -0700 Subject: [Live-devel] HTTP Live Streaming In-Reply-To: References: Message-ID: <276BF549-8ED2-4C4E-8AF2-42418E8CDB06@live555.com> > 1. Replies to HTTP GET requests are sometime truncated. As an example, > curl http://serverip/somets.ts will sometimes result in only part of the playlist > > I've traced this to fNumBytesToStream is not being initialized when created in handleHTTPCmd_StreamingGET, this caused doGetNextFrame to sometimes ended prematurely. Are you sure?? If you are referring to the "fNumBytesToStream" variable in the "ByteStreamMemoryBufferSource" class, then note that this variable is checked only if "fLimitNumBytesToStream" is set, and in that case "fNumBytesToStream" will always be set (see "ByteStreamMemoryBufferSource.cpp", line 55). I'm not saying that there can't be a bug in this code, but if there is, I don't think it's what you're describing. > 2. When streaming is in progress and client disconnected in the middle of a transfer, mediaServer will keep trying sending to the disconnected client in TCPStreamSink::processBuffer and will not respond to further requests. I don't see how that can happen, because the socket is non-blocking. If the client disconnects, then the call to "send()" should return (with less than the expected number of bytes written). At that point, the server won't try "send()"ing to the socket again until it becomes writable again (which won't ever happen if the client has disconnected). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From fantasyvideo at 126.com Wed Oct 23 20:54:24 2013 From: fantasyvideo at 126.com (Tony) Date: Thu, 24 Oct 2013 11:54:24 +0800 (CST) Subject: [Live-devel] Regarding the h264 video stream's rtp packet timestamp In-Reply-To: References: <007301cecbe8$8bce5710$a36b0530$@126.com> <165d1067.3256e.141df750527.Coremail.fantasyvideo@126.com> Message-ID: <41e3a3f2.20a94.141e8995385.Coremail.fantasyvideo@126.com> I saw the FAQ in live555 website. It said the live555 sync the audio and video by RTCP's SR packets. So I should create RTCP instance for each RTP source explititly? At 2013-10-24 05:48:40,"Ross Finlayson" wrote: So how should I set the duration, then the audio and video would be sync. You don't. The way you ensure that your audio and video streams are in sync is by having your server give each (audio and video) frame accurate presentation times - i.e., in the setting of "fPresentationTime" by the (audio and video) "FramedSource" subclasses. If you do this, then the receiving client(s) will properly synchronize audio and video. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Oct 24 01:35:56 2013 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 24 Oct 2013 01:35:56 -0700 Subject: [Live-devel] Regarding the h264 video stream's rtp packet timestamp In-Reply-To: <41e3a3f2.20a94.141e8995385.Coremail.fantasyvideo@126.com> References: <007301cecbe8$8bce5710$a36b0530$@126.com> <165d1067.3256e.141df750527.Coremail.fantasyvideo@126.com> <41e3a3f2.20a94.141e8995385.Coremail.fantasyvideo@126.com> Message-ID: <4CCD8B37-39FC-4EB8-9D76-C48781B1328E@live555.com> > I saw the FAQ in live555 website. > It said the live555 sync the audio and video by RTCP's SR packets. 
> So I should create RTCP instance for each RTP source explititly? No, because (assuming that you are controlling the streaming using RTSP) this is done implicitly. (In the RTSP server, this is done when the stream starts playing; in the RTSP client, it is done in the implementation of the "initiate()" function.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From parkchan1960 at gmail.com Thu Oct 24 01:53:12 2013 From: parkchan1960 at gmail.com (Pak Man Chan) Date: Thu, 24 Oct 2013 16:53:12 +0800 Subject: [Live-devel] HTTP Live Streaming In-Reply-To: <276BF549-8ED2-4C4E-8AF2-42418E8CDB06@live555.com> References: <276BF549-8ED2-4C4E-8AF2-42418E8CDB06@live555.com> Message-ID: On Thu, Oct 24, 2013 at 2:51 PM, Ross Finlayson wrote: > 1. Replies to HTTP GET requests are sometime truncated. As an example, > curl http://serverip/somets.ts will sometimes result in only part of the > playlist > > I've traced this to fNumBytesToStream is not being initialized when > created in handleHTTPCmd_StreamingGET, this caused doGetNextFrame to > sometimes ended prematurely. > > > Are you sure?? If you are referring to the "fNumBytesToStream" variable > in the "ByteStreamMemoryBufferSource" class, then note that this variable > is checked only if "fLimitNumBytesToStream" is set, and in that case > "fNumBytesToStream" will always be set (see > "ByteStreamMemoryBufferSource.cpp", line 55). > > I'm not saying that there can't be a bug in this code, but if there is, I > don't think it's what you're describing. > > I missed the check on "fLimitNumBytesToStream". In this case, should "fLimitNumBytesToStream" be initialized to False? It is some non-zero value when testing the code. > 2. When streaming is in progress and client disconnected in the middle of > a transfer, mediaServer will keep trying sending to the disconnected client > in TCPStreamSink::processBuffer and will not respond to further requests. > > > I don't see how that can happen, because the socket is non-blocking. If > the client disconnects, then the call to "send()" should return (with less > than the expected number of bytes written). At that point, the server > won't try "send()"ing to the socket again until it becomes writable again > (which won't ever happen if the client has disconnected). > > Thanks for the clarifications, I've looked further and found that send() is returning -1, with errno being set to EPIPE. The problem is in "select" in BasicTaskScheduler::SingleStep still indicating a writable socket even when it is disconnected. Is this a libc problem? I'm working on Fedora 17 if that matters. Thank you. Park. -------------- next part -------------- An HTML attachment was scrubbed... URL: From krishnaks at iwavesystems.com Thu Oct 24 05:19:52 2013 From: krishnaks at iwavesystems.com (Krishna) Date: Thu, 24 Oct 2013 17:49:52 +0530 Subject: [Live-devel] FrameSource:Getnextframe error while streaming PCMframes In-Reply-To: References: Message-ID: <7DA4F08D95CC4047A2DB0557E4064009@IWAVE> Hi Ross, I found the problem that uLawFromPCMAudioSource afterGettingFrame is not getting called when I use DeviceSource based design and triggering concept. i.e. If I am calling FramedSource::afterGetting(this) in doGetNextFrame itself , it is calling afterGettingFrame function in uLawFromPCMAudioSource followed by calling afterGettingFrame function in MultiFramedRTPSink. 
If I am calling FramedSource::afterGetting(this) in deliverFrame(which will called by trigger event), then it is calling only afterGettingFrame function in MultiFramedRTPSink and not uLawFromPCMAudioSource afterGettingFrame function. That's why I am getting FramedSource ::getNextFrame():attempting to read more than once at the same time. Where I am going wrong? Can you please help on that? Thanks in advance From: Krishna Sent: Friday, October 11, 2013 12:40 PM To: live-devel at ns.live555.com Subject: [Live-devel] FrameSource:Getnextframe error while streaming PCMframes Hi Ross, I have problems streaming live PCM audio. Audio comes either directly from microphone (16-bit LE) Sampling frequency is 8k, mono(1 channel). I receive a buffer in the thread and use event trigger to signal my live555 thread. I've created class based on DeviceSource that inherit from AudioInputDevice and delivers the Frame on trigger. I am using uLawFromPCMAudioSource to convert to 8-bit u-law audio I am getting following error if I am giving audio format as WA_PCM: FramedSource ::getNextFrame():attempting to read more than once at the same time. One thing I observed here is FramedSource::getNextFrame is getting called twice at a time( uLawFromPCMAudioSource is calling it again) If I change audio format to WA_PCMU, I am able to stream without any error ( As FramedSource::getNextFrame is getting called once at a time), and VLC also able to play with some noise. Where I am going wrong ? Thanks in advance Regards, Krishna -------------------------------------------------------------------------------- _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Oct 24 05:53:13 2013 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 24 Oct 2013 05:53:13 -0700 Subject: [Live-devel] FrameSource:Getnextframe error while streaming PCMframes In-Reply-To: <7DA4F08D95CC4047A2DB0557E4064009@IWAVE> References: <7DA4F08D95CC4047A2DB0557E4064009@IWAVE> Message-ID: <9C79FCD8-021B-4853-91EA-EC43D9FDF8DA@live555.com> > I found the problem that uLawFromPCMAudioSource afterGettingFrame is not getting called when I use DeviceSource based design and triggering concept. > i.e. > If I am calling FramedSource::afterGetting(this) in doGetNextFrame itself , it is calling afterGettingFrame function in uLawFromPCMAudioSource followed by calling afterGettingFrame function in MultiFramedRTPSink. > > If I am calling FramedSource::afterGetting(this) in deliverFrame(which will called by trigger event), then it is calling only afterGettingFrame function in MultiFramedRTPSink and not uLawFromPCMAudioSource afterGettingFrame function. > That's why I am getting FramedSource ::getNextFrame():attempting to read more than once at the same time. > > Where I am going wrong? I can't tell what's wrong, without seeing your code. Please post the code for your "OnDemandServerMediaSubsession" subclass, and for your "DeviceSource" based class. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Thu Oct 24 06:15:34 2013 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 24 Oct 2013 06:15:34 -0700 Subject: [Live-devel] HTTP Live Streaming In-Reply-To: References: <276BF549-8ED2-4C4E-8AF2-42418E8CDB06@live555.com> Message-ID: > I missed the check on "fLimitNumBytesToStream". In this case, should "fLimitNumBytesToStream" be initialized to False? You're right - this is a bug. I've just installed a new version (2013.10.24) of the code that fixes this. Thanks again for the report. > Thanks for the clarifications, I've looked further and found that send() is returning -1, with errno being set to EPIPE. The problem is in "select" in BasicTaskScheduler::SingleStep still indicating a writable socket even when it is disconnected. Is this a libc problem? If "select()" is reporting that a socket is writable, when it's not, then that would appear to be a bug... Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From bbischan at watchtower-security.com Thu Oct 24 06:15:29 2013 From: bbischan at watchtower-security.com (Bob Bischan) Date: Thu, 24 Oct 2013 08:15:29 -0500 Subject: [Live-devel] Proxy Server (Multiple Client / Recorded File Issue) In-Reply-To: <381AB845-D1BB-46F2-BC3E-0D35C97308F7@live555.com> References: <495ADFC3-D85D-4B7C-BDE4-CFC13C28405A@live555.com> <4841A6F1-0CB5-48F0-9EF3-E9BB9BD7637B@live555.com> <381AB845-D1BB-46F2-BC3E-0D35C97308F7@live555.com> Message-ID: Ross, Thanks for your patience and time responding to this issue....I do understand that many of my questions and inquiries are probably outside the scope of this development list. With that said, I will be brief in my comments. With your input and running through numerous permutations from an implementation / testing perspective I believe I now know what the issue may be. If N=1 SETUP = SETUP PLAY = PLAY PAUSE = PAUSE TEARDOWN = TEARDOWN When N=1 I do not encounter the problem (irregardless of number of back-end streams). Client is able to close the recording file as expected. Works in desktop players and validates as html5 compatible video. My particular client (VLC) is definitely sending a TEARDOWN when closing the recording file. If N > 1 SETUP == SETUP (maybe slightly different than N=1) PLAY == PLAY PAUSE == PAUSE TEARDOWN = PAUSE (Here is what I believe is the issue) For N > 1 clients there is no actual TEARDOWN?? When N > 1 client sends TEARDOWN request, the response from ProxyServer is not precisely the same as if it where an actual TEARDOWN?? Another possibility would be that ProxyServer SETUP response is not the same for N > 1 client??? The recording files for N > 1 clients work fine with every desktop player I use (VLC, Totem, QuickTime, ffplay, Mplayer...etc), however the file no longer validates as html5 video. This is definitely not a critical issue...just a nuance I would like to better understand. Thanks, Bob On Thu, Oct 24, 2013 at 1:36 AM, Ross Finlayson wrote: > 2. Registered the single back-end server (works as expected; using new > transport header options) . BTW, the option for setting suffix is a nice > additon! > > > FYI, you can now use our new application "registerRTSPStream" (in the > "testProgs" directory) for this. > > > 3. Connected a single client (VLC) to record the proxy stream to file. > (Connects fine and I see debug output recording the connection) > > 4. Connect another client (VLC) to proxy stream to record a second file. 
> (Connects fine, but I do not see any debug output showing this > connection???) > > > That's normal, and expected. What's important to understand is that the > diagnostic output generated by the "LIVE555 Proxy Server" (when you give it > the "-V" option) is only for the proxying functionality (between the proxy > server and the back-end server). Proxying starts (by sending "SETUP" and > "PLAY" commands) once the first front-end client connects, and stops (by > sending a "PAUSE" command) once the last front-end client disconnects. > During that time, the proxy server will continue to serve an arbitrary > number of connecting/disconnecting front-end clients, but won't have any > effect on the connection between the proxy server and the back-end server - > until the last front-end client disconnects. That's why you don't see any > additional diagnostic output when the second front-end client connects - > but data will still be streamed to that front-end client, as expected. > > > I have attached a debug file that reflects steps (1-6). If you could take > a quick look to see if there something obvious that would be great. > > > Your file looks OK; the proxy server appears to be working normally. > > Once again, what I suspect is happening is that when you have N>1 > front-end clients, you are experiencing some packet loss (because, in this > case, each outgoing RTP packet from the proxy server is being transmitted N > separate times (once to each front-end client)). And apparently your > stream recording software is not tolerant of data loss. You can verify > this by instead running "openRTSP" with the "-Q" (report QOS) option as > your front-end clients. > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Oct 24 06:46:42 2013 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 24 Oct 2013 06:46:42 -0700 Subject: [Live-devel] Proxy Server (Multiple Client / Recorded File Issue) In-Reply-To: References: <495ADFC3-D85D-4B7C-BDE4-CFC13C28405A@live555.com> <4841A6F1-0CB5-48F0-9EF3-E9BB9BD7637B@live555.com> <381AB845-D1BB-46F2-BC3E-0D35C97308F7@live555.com> Message-ID: <6D846BF5-474A-43AC-A1FA-C291A3162C12@live555.com> > The recording files for N > 1 clients work fine with every desktop player I use (VLC, Totem, QuickTime, ffplay, Mplayer...etc), however the file no longer validates as html5 video. Why don't you try to find out why that is? In any case, this does not appear to be a problem that I can spend any more time on for free on this mailing list... Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From krishnaks at iwavesystems.com Thu Oct 24 07:11:17 2013 From: krishnaks at iwavesystems.com (Krishna) Date: Thu, 24 Oct 2013 19:41:17 +0530 Subject: [Live-devel] FrameSource:Getnextframe error while streamingPCMframes In-Reply-To: <9C79FCD8-021B-4853-91EA-EC43D9FDF8DA@live555.com> References: <7DA4F08D95CC4047A2DB0557E4064009@IWAVE> <9C79FCD8-021B-4853-91EA-EC43D9FDF8DA@live555.com> Message-ID: Hi Ross, I have attached 1. my Device source file Wavsource.cpp 2. 
WaveStreamer .cpp( took a reference from testWavAudioStreamer.cpp) where I have thread to read the samples and have code for initialization and starting the session. Regards From: Ross Finlayson Sent: Thursday, October 24, 2013 6:23 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] FrameSource:Getnextframe error while streamingPCMframes I found the problem that uLawFromPCMAudioSource afterGettingFrame is not getting called when I use DeviceSource based design and triggering concept. i.e. If I am calling FramedSource::afterGetting(this) in doGetNextFrame itself , it is calling afterGettingFrame function in uLawFromPCMAudioSource followed by calling afterGettingFrame function in MultiFramedRTPSink. If I am calling FramedSource::afterGetting(this) in deliverFrame(which will called by trigger event), then it is calling only afterGettingFrame function in MultiFramedRTPSink and not uLawFromPCMAudioSource afterGettingFrame function. That's why I am getting FramedSource ::getNextFrame():attempting to read more than once at the same time. Where I am going wrong? I can't tell what's wrong, without seeing your code. Please post the code for your "OnDemandServerMediaSubsession" subclass, and for your "DeviceSource" based class. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------------------------------------------------------------------------- _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: WaveStreamer.cpp URL: -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: WAVSource.cpp URL: From bbischan at watchtower-security.com Thu Oct 24 07:50:09 2013 From: bbischan at watchtower-security.com (Bob Bischan) Date: Thu, 24 Oct 2013 09:50:09 -0500 Subject: [Live-devel] Proxy Server (Multiple Client / Recorded File Issue) In-Reply-To: <6D846BF5-474A-43AC-A1FA-C291A3162C12@live555.com> References: <495ADFC3-D85D-4B7C-BDE4-CFC13C28405A@live555.com> <4841A6F1-0CB5-48F0-9EF3-E9BB9BD7637B@live555.com> <381AB845-D1BB-46F2-BC3E-0D35C97308F7@live555.com> <6D846BF5-474A-43AC-A1FA-C291A3162C12@live555.com> Message-ID: Fair enough :-) I will contact you outside the dev-list to explore options. thanks, bob On Oct 24, 2013 8:58 AM, "Ross Finlayson" wrote: > The recording files for N > 1 clients work fine with every desktop player > I use (VLC, Totem, QuickTime, ffplay, Mplayer...etc), however the file no > longer validates as html5 video. > > > Why don't you try to find out why that is? > > In any case, this does not appear to be a problem that I can spend any > more time on for free on this mailing list... > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From piers.hawksley at panogenics.com Thu Oct 24 10:00:13 2013 From: piers.hawksley at panogenics.com (Piers Hawksley) Date: Thu, 24 Oct 2013 18:00:13 +0100 Subject: [Live-devel] Changing from Multicast to Unicast Message-ID: <5269521D.2050403@panogenics.com> Hi Ross, Using the following code I can stop a server media session and restart it (with different parameters such as multicast address & port). I can also change from unicast to multicast and back. However when I change from multicast to unicast the multicast stream continues until I request an RTSP stream (with VLC). Am I missing a step in the remove code ? Note that I am calling the functions this code is in using events triggered by triggerEvent so I can have a sleep in the calling thread of 500ms between removing and adding the server media session. Note also that the RTSP server is still (potentially) streaming other server media sessions, so I can't stop and restart that. Looking at wireshark - the RTCP packets continue after the change of settings - is this an indication of what the issue might be ? To remove rtspServer->deleteServerMediaSession(stream->sms); stream->sms->deleteAllSubsessions(); To add stream->sms = ServerMediaSession::createNew(*env, ... ); (For Multicast) const Port rtpPort(stream->rtpPort); Groupsock *rtpGroupsock = new Groupsock(*env, destaddr, rtpPort, stream->ttl); rtpGroupsock->multicastSendOnly(); const Port rtcpPort(stream->rtcpPort); Groupsock *rtcpGroupsock = new Groupsock(*env, destaddr, rtcpPort, stream->ttl); rtcpGroupsock->multicastSendOnly(); RTPSink *sink = SimpleRTPSink::createNew(*env, rtpGroupsock, 33, 90000, "video", "MP2T", 1, True, False /*no 'M' bit*/); RTCPInstance *rtcp = RTCPInstance::createNew(*env, rtcpGroupsock, estimatedSessionBandwidth, CNAME, sink, NULL, True); stream->sms->addSubsession(PassiveServerMediaSubsession::createNew(*sink, rtcp)); MPEG2TransportStreamFromESSource *vidSrc = MPEG2TransportStreamFromESSource::createNew(*env); vidSrc->addNewVideoSource(source, 2); MPEG2TransportStreamFramer *videoSource = MPEG2TransportStreamFramer::createNew(*env, vidSrc); (For Unicast) (in createNewStreamSource) MPEG2TransportStreamFromESSource *videoSource = MPEG2TransportStreamFromESSource::createNew(envir()); videoSource->addNewVideoSource(source, 2); return MPEG2TransportStreamFramer::createNew(envir(), videoSource); (in createNewRTPSink) return SimpleRTPSink::createNew(envir(), rtpGroupsock, 33, 90000, "video", "MP2T", 1, True, False /*no 'M' bit*/); (then for both unicast & multicast) rtspServer->addServerMediaSession(stream->sms); My encoder produces MPEG2 Elementary Streams, so some of the code converts these to Transport Streams (thanks for the library calls to do this and http://lists.live555.com/pipermail/live-devel/2011-July/013674.html for showing me the way). Many Thanks, Piers Hawksley From ssingh at neurosoft.in Thu Oct 24 11:35:27 2013 From: ssingh at neurosoft.in (ssingh at neurosoft.in) Date: Thu, 24 Oct 2013 11:35:27 -0700 Subject: [Live-devel] True push DeviceSource In-Reply-To: <8233b9d2a724a88c383a4ec9a27e71f8@neurosoft.in> References: <1413CC54-024C-4A2E-BBC1-60BEE969E30C@live555.com> <8233b9d2a724a88c383a4ec9a27e71f8@neurosoft.in> Message-ID: <4e60ee563237db1866b60773ef0df240@neurosoft.in> Any input on this? On 2013-10-23 11:49, ssingh at neurosoft.in wrote: > Thank you Ross for clarification, its more clear now. Now I am facing > issue that i have separate thread that pushes audio packets for my > device source to stream. 
I trigger event each time I push packet to > that queue. I noticed that on VLC my audio comes for about a second > and then stops. When I debugged my code I found that I have more than > 1000 packets in my audio queue waiting to be streamed by devicesource. > > I think what is happening is that whenever I trigger an event and that > event is already happening it ignores it, is it correct? Whats is the > correct way to handle this. I think the audio packets should be > streamed at the same rate as they are being encoded from live source. > > Thanks > > > On 2013-10-22 06:13, Ross Finlayson wrote: >>> I am confused as how the event mechanism works in live555. I have a >>> source that is fed with video and audio frames and I want to trigger >>> doGetNextFrame() of my custom DeviceSource so that those frames are >>> streamed using live555. For this I am using >>> >>> m_eventID = >>> envir().taskScheduler().createEventTrigger(deliverFrame0); >>> envir().taskScheduler().triggerEvent(m_eventID, this); >> >> The "this" in the "triggerEvent()" call is wrong, because you should >> not be calling "triggerEvent()" from within one of your 'DeviceSource' >> class's member functions. "triggerEvent()" should be called from a >> *separate thread* - the thread that is doing your encoding. Because >> this separate thread is not the LIVE555 thread, then "triggerEvent()" >> is the *only* LIVE555 code that it is allowed to be calling. >> >>> void MyStreamingDeviceSource::deliverFrame0(void* clientData) >>> { >>> ((MyStreamingDeviceSource*)clientData)->doGetNextFrame(); >>> } >> >> No, don't do this. "deliverFrame0()" should call "deliverFrame()", as >> illustrated in the "DeviceSource" code. >> >>> But doGe[Next]tFrame is called when I called >>> videoSink->startPlaying() too which is not valid for me as I dont >>> have any data yet to stream. >> >> That's OK. When data later *does* become available, then your separate >> 'encoder' thread will call "triggerEvent()", and then "deliverFrame()" >> will be called. >> >> Ross Finlayson >> Live Networks, Inc. >> http://www.live555.com/ [1] >> >> >> Links: >> ------ >> [1] http://www.live555.com/ >> >> _______________________________________________ >> live-devel mailing list >> live-devel at lists.live555.com >> http://lists.live555.com/mailman/listinfo/live-devel From finlayson at live555.com Thu Oct 24 13:03:19 2013 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 24 Oct 2013 13:03:19 -0700 Subject: [Live-devel] True push DeviceSource In-Reply-To: <8233b9d2a724a88c383a4ec9a27e71f8@neurosoft.in> References: <1413CC54-024C-4A2E-BBC1-60BEE969E30C@live555.com> <8233b9d2a724a88c383a4ec9a27e71f8@neurosoft.in> Message-ID: > Thank you Ross for clarification, its more clear now. Now I am facing issue that i have separate thread that pushes audio packets for my device source to stream. I trigger event each time I push packet to that queue. I noticed that on VLC my audio comes for about a second and then stops. When I debugged my code I found that I have more than 1000 packets in my audio queue waiting to be streamed by devicesource. > > I think what is happening is that whenever I trigger an event and that event is already happening it ignores it, is it correct? Whats is the correct way to handle this. I think the audio packets should be streamed at the same rate as they are being encoded from live source. 
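(The pattern being discussed here is the one in liveMedia's "DeviceSource.cpp". A condensed recap of how the pieces are meant to fit together, so the replies below are easier to follow; "MyDeviceSource" is an illustrative name, not a library class:)

  // Separate encoder thread: after queueing a frame, the *only* LIVE555 call it makes is
  //   scheduler->triggerEvent(MyDeviceSource::eventTriggerId, myDeviceSource);

  void MyDeviceSource::doGetNextFrame() {
    // Deliver immediately if a frame is already queued; otherwise just return, and let
    // the encoder thread's trigger cause deliverFrame0() to run later on the LIVE555 thread.
    if (frameIsAvailable() /*a thread-safe check of the shared queue*/) deliverFrame();
  }

  void MyDeviceSource::deliverFrame0(void* clientData) { // the triggered event handler
    ((MyDeviceSource*)clientData)->deliverFrame();
  }

  // deliverFrame() copies one queued frame into fTo (setting fFrameSize, fPresentationTime and
  // fNumTruncatedBytes) and then calls FramedSource::afterGetting(this) -- exactly once per
  // doGetNextFrame() call, with no artificial delay.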
This should all be OK, provided that your "doGetNextFrame()" implementation knows to check for (queued) data that is immediately available, rather than always waiting for a triggered event. Note this code on lines 87-90 of "DeviceSource.cpp": // If a new frame of data is immediately available to be delivered, then do this now: if (0 /* a new frame of data is immediately available to be delivered*/ /*%%% TO BE WRITTEN %%%*/) { deliverFrame(); } Replace the "0" with some test that checks whether queued data is immediately available. (Remember that because your queue data structure is also being written to by the encoding thread, it needs to be 'thread safe'.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Oct 24 15:01:08 2013 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 24 Oct 2013 15:01:08 -0700 Subject: [Live-devel] FrameSource:Getnextframe error while streamingPCMframes In-Reply-To: References: <7DA4F08D95CC4047A2DB0557E4064009@IWAVE> <9C79FCD8-021B-4853-91EA-EC43D9FDF8DA@live555.com> Message-ID: <7B21282B-5A10-4F46-B5AE-C82221F2DED1@live555.com> I think your problem is here: void triggerLive555Scheduler(void) { scheduler->triggerEvent(WAVSource::s_frameReceivedTrigger, sessionState.source); } The problem with this is the second parameter to "triggerEvent()". It needs to be a pointer to a "WAVSource" object. If you are streaming raw PCM audio, and inserting a "uLawFromPCMAudioSource" filter object in front of it, then "sessionState.source" will point to that filter object, which is the wrong thing to be passing to "triggerEvent()". So, you should change the second parameter to be "wavSource". Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Oct 24 15:03:26 2013 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 24 Oct 2013 15:03:26 -0700 Subject: [Live-devel] True push DeviceSource In-Reply-To: <4e60ee563237db1866b60773ef0df240@neurosoft.in> References: <1413CC54-024C-4A2E-BBC1-60BEE969E30C@live555.com> <8233b9d2a724a88c383a4ec9a27e71f8@neurosoft.in> <4e60ee563237db1866b60773ef0df240@neurosoft.in> Message-ID: On Oct 24, 2013, at 11:35 AM, ssingh at neurosoft.in wrote: > Any input on this? Because of this violation of basic email netiquette - i.e., posting the same question to the mailing list multiple times - future postings from you will be moderated. Please don't do this again, otherwise you'll be banned from the mailing list! Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From parkchan1960 at gmail.com Thu Oct 24 08:15:06 2013 From: parkchan1960 at gmail.com (Pak Man Chan) Date: Thu, 24 Oct 2013 23:15:06 +0800 Subject: [Live-devel] HTTP Live Streaming In-Reply-To: References: <276BF549-8ED2-4C4E-8AF2-42418E8CDB06@live555.com> Message-ID: On Thu, Oct 24, 2013 at 9:15 PM, Ross Finlayson wrote: > Thanks for the clarifications, I've looked further and found that send() > is returning -1, with errno being set to EPIPE. The problem is in "select" > in BasicTaskScheduler::SingleStep still indicating a writable socket even > when it is disconnected. Is this a libc problem? > > > If "select()" is reporting that a socket is writable, when it's not, then > that would appear to be a bug... 
> > > I've found a reference on what would be the proper behavior, In Unix Network Programming, Volume 1: The Sockets Networking API (3rd Edition), it is noted that a socket will be ready for writing if the write half of the connection is closed, but a write will generate a SIGPIPE. So a check on EPIPE in the send call will be needed. Park. -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Oct 24 18:20:08 2013 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 24 Oct 2013 18:20:08 -0700 Subject: [Live-devel] HTTP Live Streaming In-Reply-To: References: <276BF549-8ED2-4C4E-8AF2-42418E8CDB06@live555.com> Message-ID: > I've found a reference on what would be the proper behavior, In Unix Network Programming, Volume 1: The Sockets Networking API (3rd Edition), it is noted that a socket will be ready for writing if the write half of the connection is closed, but a write will generate a SIGPIPE. So a check on EPIPE in the send call will be needed. OK, I've now updated the code to check for this. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Oct 25 00:04:12 2013 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 25 Oct 2013 00:04:12 -0700 Subject: [Live-devel] Changing from Multicast to Unicast In-Reply-To: <5269521D.2050403@panogenics.com> References: <5269521D.2050403@panogenics.com> Message-ID: <9DB86C2F-7B83-4A89-A771-4746483343D6@live555.com> > Am I missing a step in the remove code ? Yes. What you're missing is that a "PassiveServerMediaSubsession" object refers to a stream that exists independently (as opposed to an "OnDemandServerMediaSubsession", that creates (and destroys) its own stream, on demand). Therefore, for the multicast case, you need to not only create the stream separately, you also need to destroy it separately. In particular, to destroy it, you should delete objects in this order: Medium::close(rtcp); Medium::close(sink); Medium::close(videoSource); // note that this will also automatically delete the "vidSrc"("MPEG2TransportStreamFromESSource") and "source" objects delete rtcpGroupsock; delete rtpGroupsock; Also: > To remove > rtspServer->deleteServerMediaSession(stream->sms); > stream->sms->deleteAllSubsessions(); The second statement is unnecessary. Even worse, it can cause a crash, because the first statement deletes the "stream->sms" object! > (For Multicast) > const Port rtpPort(stream->rtpPort); > Groupsock *rtpGroupsock = new Groupsock(*env, destaddr, rtpPort, stream->ttl); > rtpGroupsock->multicastSendOnly(); > const Port rtcpPort(stream->rtcpPort); > Groupsock *rtcpGroupsock = new Groupsock(*env, destaddr, rtcpPort, stream->ttl); > rtcpGroupsock->multicastSendOnly(); > RTPSink *sink = SimpleRTPSink::createNew(*env, rtpGroupsock, 33, 90000, "video", "MP2T", 1, True, False /*no 'M' bit*/); > RTCPInstance *rtcp = RTCPInstance::createNew(*env, rtcpGroupsock, estimatedSessionBandwidth, CNAME, sink, NULL, True); > stream->sms->addSubsession(PassiveServerMediaSubsession::createNew(*sink, rtcp)); > MPEG2TransportStreamFromESSource *vidSrc = MPEG2TransportStreamFromESSource::createNew(*env); > vidSrc->addNewVideoSource(source, 2); > MPEG2TransportStreamFramer *videoSource = MPEG2TransportStreamFramer::createNew(*env, vidSrc); Of course, there's also a call to sink<-startPlaying(*videoSource, ...); Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From krishnaks at iwavesystems.com Fri Oct 25 01:54:54 2013 From: krishnaks at iwavesystems.com (Krishna) Date: Fri, 25 Oct 2013 14:24:54 +0530 Subject: [Live-devel] FrameSource:Getnextframe error whilestreamingPCMframes In-Reply-To: <7B21282B-5A10-4F46-B5AE-C82221F2DED1@live555.com> References: <7DA4F08D95CC4047A2DB0557E4064009@IWAVE> <9C79FCD8-021B-4853-91EA-EC43D9FDF8DA@live555.com> <7B21282B-5A10-4F46-B5AE-C82221F2DED1@live555.com> Message-ID: <507B1833C8F24A6AAE626E19FF5BBA73@IWAVE> HI Ross, Thanks. Now afterGettingFrame function is getting called. Regards, From: Ross Finlayson Sent: Friday, October 25, 2013 3:31 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] FrameSource:Getnextframe error whilestreamingPCMframes I think your problem is here: void triggerLive555Scheduler(void) { scheduler->triggerEvent(WAVSource::s_frameReceivedTrigger, sessionState.source); } The problem with this is the second parameter to "triggerEvent()". It needs to be a pointer to a "WAVSource" object. If you are streaming raw PCM audio, and inserting a "uLawFromPCMAudioSource" filter object in front of it, then "sessionState.source" will point to that filter object, which is the wrong thing to be passing to "triggerEvent()". So, you should change the second parameter to be "wavSource". Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------------------------------------------------------------------------- _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From piers.hawksley at panogenics.com Fri Oct 25 06:25:01 2013 From: piers.hawksley at panogenics.com (Piers Hawksley) Date: Fri, 25 Oct 2013 14:25:01 +0100 Subject: [Live-devel] Changing from Multicast to Unicast In-Reply-To: <5269521D.2050403@panogenics.com> References: <5269521D.2050403@panogenics.com> Message-ID: <526A712D.5080108@panogenics.com> >> Am I missing a step in the remove code ? >Yes. What you're missing is that a "PassiveServerMediaSubsession" object refers to a stream that exists independently (as opposed to an "OnDemandServerMediaSubsession", that creates (and destroys) its own stream, on demand). Therefore, for the multicast case, you need to not only create the stream separately, you also need to destroy it separately. Thanks for the explanation and code - it sounds obvious now you've explained it. >> To remove >> rtspServer->deleteServerMediaSession(stream->sms); >> stream->sms->deleteAllSubsessions(); >The second statement is unnecessary. Even worse, it can cause a crash, because the first statement deletes the "stream->sms" object! Thanks - second statement removed. >Of course, there's also a call to sink<-startPlaying(*videoSource, ...); Sorry - I did have a call to startPlaying ... just failed to add it to the question ! Thanks for correcting me - my application now works correctly. 
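(Putting the two halves of that answer together, the complete 'remove' path for the multicast case looks roughly like this; "stream" and its members are the variables from Piers's snippet, shown here only for illustration:)

  // Stop advertising the session (this call also deletes the ServerMediaSession itself):
  rtspServer->deleteServerMediaSession(stream->sms);

  // Then tear down the independently-created multicast stream, in this order:
  Medium::close(rtcp);
  Medium::close(sink);
  Medium::close(videoSource); // also deletes the chained "vidSrc" and "source" objects
  delete rtcpGroupsock;
  delete rtpGroupsock;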
Cheers, Piers From piers.hawksley at panogenics.com Fri Oct 25 09:40:44 2013 From: piers.hawksley at panogenics.com (Piers Hawksley) Date: Fri, 25 Oct 2013 17:40:44 +0100 Subject: [Live-devel] multicast & linux information Message-ID: <526A9F0C.9050303@panogenics.com> >> It seems that kernel maintainers chose to keep the backward compatibility with a default that cause receiving data from all multicast groups and not only from joined multicast group. >> But they add since kernel 2.6.31 an option that allow to avoid this. > That's good to hear. (I guess this is the closest they'll ever come to admitting that the default behavior is a bug :-) >> Do you think it could be possible to set this option in socketJoinGroup in a future release of live555 ? > Done! I've just installed a new version (2013.10.03) of the "LIVE555 Streaming Media" code that sets this option to 0 if it's defined. Attached is the changes I made to Linux 2.6.29 to port the IP_MULTICAST_ALL socket option from 2.6.31. Sorry for this being an svn diff - but it isn't a big change, only a few lines to 5 files. Hopefully it is of interest and not too off-topic. I've tested it within the limits of what it is meant to do (and only on one (custom ARM) embedded Linux platform) so ymmv. Cheers, Piers Hawksley -------------- next part -------------- Index: include/net/inet_sock.h =================================================================== --- include/net/inet_sock.h (revision 5965) +++ include/net/inet_sock.h (working copy) @@ -130,7 +130,8 @@ freebind:1, hdrincl:1, mc_loop:1, - transparent:1; + transparent:1, + mc_all:1; // From 2.6.31 int mc_index; __be32 mc_addr; struct ip_mc_socklist *mc_list; Index: include/linux/in.h =================================================================== --- include/linux/in.h (revision 5965) +++ include/linux/in.h (working copy) @@ -107,6 +107,7 @@ #define MCAST_JOIN_SOURCE_GROUP 46 #define MCAST_LEAVE_SOURCE_GROUP 47 #define MCAST_MSFILTER 48 +#define IP_MULTICAST_ALL 49 // From 2.6.31 #define MCAST_EXCLUDE 0 #define MCAST_INCLUDE 1 Index: net/ipv4/af_inet.c =================================================================== --- net/ipv4/af_inet.c (revision 5965) +++ net/ipv4/af_inet.c (working copy) @@ -376,6 +376,7 @@ inet->uc_ttl = -1; inet->mc_loop = 1; inet->mc_ttl = 1; + inet->mc_all = 1; // From 2.6.31 inet->mc_index = 0; inet->mc_list = NULL; Index: net/ipv4/ip_sockglue.c =================================================================== --- net/ipv4/ip_sockglue.c (revision 5965) +++ net/ipv4/ip_sockglue.c (working copy) @@ -449,6 +449,7 @@ (1<= sizeof(int)) { @@ -895,6 +896,13 @@ kfree(gsf); break; } + case IP_MULTICAST_ALL: // From 2.6.31 + if (optlen < 1) + goto e_inval; + if (val != 0 && val != 1) + goto e_inval; + inet->mc_all = val; + break; case IP_ROUTER_ALERT: err = ip_ra_control(sk, val ? 
1 : 0, NULL); break; @@ -1147,6 +1155,9 @@ release_sock(sk); return err; } + case IP_MULTICAST_ALL: // From 2.6.31 + val = inet->mc_all; + break; case IP_PKTOPTIONS: { struct msghdr msg; Index: net/ipv4/igmp.c =================================================================== --- net/ipv4/igmp.c (revision 5965) +++ net/ipv4/igmp.c (working copy) @@ -2196,7 +2196,7 @@ break; } if (!pmc) - return 1; + return inet->mc_all; // From 2.6.31 psl = pmc->sflist; if (!psl) return pmc->sfmode == MCAST_EXCLUDE; From finlayson at live555.com Fri Oct 25 12:41:54 2013 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 25 Oct 2013 12:41:54 -0700 Subject: [Live-devel] multicast & linux information In-Reply-To: <526A9F0C.9050303@panogenics.com> References: <526A9F0C.9050303@panogenics.com> Message-ID: > Attached is the changes I made to Linux 2.6.29 to port the IP_MULTICAST_ALL socket option from 2.6.31. Out of curiosity - why didn't you just use the 2.6.31 kernel? (It presumably has other fixes/improvements that you would find useful.) Ross. From ssingh at neurosoft.in Thu Oct 24 18:18:27 2013 From: ssingh at neurosoft.in (ssingh at neurosoft.in) Date: Thu, 24 Oct 2013 18:18:27 -0700 Subject: [Live-devel] True push DeviceSource In-Reply-To: References: <1413CC54-024C-4A2E-BBC1-60BEE969E30C@live555.com> <8233b9d2a724a88c383a4ec9a27e71f8@neurosoft.in> Message-ID: <5d5ecc8e603d612648c4c340721cf3da@neurosoft.in> Thank you Ross, That solved my issue with audio packets being properly transmitted. I am still facing issue with Audio at reception side. When I try with VLC it gives ausio for half a second and then stops. If i stop and start VLC again it agian gives audiio for half a second and stops. Then i tried with ffmpeg. FFmpeg reports that the audio codec is (null) and shows mono channels whereas VLC shows proper codec which is AC3 and stereo channels. I am using AC3 codec to encode audio and then stream it using AC3AudioStreamFramer and AC3AudioRTPSink. Am i doing anything wrong ? I also found that somone used MPEG4GenericRTPSink for audio. Is it right? PS: I apologize for my last repeated email. Thanks On 2013-10-24 13:03, Ross Finlayson wrote: >> Thank you Ross for clarification, its more clear now. Now I am >> facing issue that i have separate thread that pushes audio packets >> for my device source to stream. I trigger event each time I push >> packet to that queue. I noticed that on VLC my audio comes for about >> a second and then stops. When I debugged my code I found that I have >> more than 1000 packets in my audio queue waiting to be streamed by >> devicesource. >> >> I think what is happening is that whenever I trigger an event and >> that event is already happening it ignores it, is it correct? Whats >> is the correct way to handle this. I think the audio packets should >> be streamed at the same rate as they are being encoded from live >> source. > > This should all be OK, provided that your "doGetNextFrame()" > implementation knows to check for (queued) data that is immediately > available, rather than always waiting for a triggered event. Note this > code on lines 87-90 of "DeviceSource.cpp": > > // If a new frame of data is immediately available to be delivered, > then do this now: > if (0 /* a new frame of data is immediately available to be > delivered*/ /*%%% TO BE WRITTEN %%%*/) { > deliverFrame(); > } > > Replace the "0" with some test that checks whether queued data is > immediately available. 
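(One concrete way to write that test and the delivery step -- a sketch only, assuming the queue shared with the encoder thread is a mutex-protected std::queue of byte vectors; all names below are illustrative and none come from the library. It needs <queue>, <vector>, <pthread.h>, <string.h> and <sys/time.h> in addition to the usual liveMedia headers.)

  // Shared with the encoder thread:
  //   std::queue<std::vector<u_int8_t> > fQueue;   protected by   pthread_mutex_t fQueueMutex;

  Boolean MyDeviceSource::frameIsAvailable() {
    pthread_mutex_lock(&fQueueMutex);
    Boolean result = !fQueue.empty();
    pthread_mutex_unlock(&fQueueMutex);
    return result;
  }

  void MyDeviceSource::deliverFrame() {
    if (!isCurrentlyAwaitingData()) return; // the downstream object hasn't asked for data yet

    pthread_mutex_lock(&fQueueMutex);
    if (fQueue.empty()) { pthread_mutex_unlock(&fQueueMutex); return; }
    std::vector<u_int8_t> frame = fQueue.front();
    fQueue.pop();
    pthread_mutex_unlock(&fQueueMutex);

    fFrameSize = frame.size();
    if (fFrameSize > fMaxSize) {            // deliver what fits; report the rest as truncated
      fNumTruncatedBytes = fFrameSize - fMaxSize;
      fFrameSize = fMaxSize;
    } else {
      fNumTruncatedBytes = 0;
    }
    memmove(fTo, &frame[0], fFrameSize);
    gettimeofday(&fPresentationTime, NULL); // or, better, use the capture time recorded by the encoder
    FramedSource::afterGetting(this);       // complete the delivery immediately
  }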
(Remember that because your queue data > structure is also being written to by the encoding thread, it needs to > be 'thread safe'.) > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ [1] > > > Links: > ------ > [1] http://www.live555.com/ > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel From fantasyvideo at 126.com Fri Oct 25 00:49:03 2013 From: fantasyvideo at 126.com (Tony) Date: Fri, 25 Oct 2013 15:49:03 +0800 (CST) Subject: [Live-devel] Regarding the h264 video stream's rtp packet timestamp In-Reply-To: <4CCD8B37-39FC-4EB8-9D76-C48781B1328E@live555.com> References: <007301cecbe8$8bce5710$a36b0530$@126.com> <165d1067.3256e.141df750527.Coremail.fantasyvideo@126.com> <41e3a3f2.20a94.141e8995385.Coremail.fantasyvideo@126.com> <4CCD8B37-39FC-4EB8-9D76-C48781B1328E@live555.com> Message-ID: But if I use the vlc to play it, the audio's pts is out of range, and it's dropped. but it works by ffplay. I don't know where is wrong. Could you help me to check it where is wrong. I'm crazy caused by this problem. Here is my code. audio is g711, each buffer is 160 bytes. void AudioFrameSource::doGetNextFrame() { unsigned acquiredFrameSize=0; if(m_session!=NULL) { m_session->GetNextAudioFrame((char*)fTo,fMaxSize,&acquiredFrameSize,&fNumTruncatedBytes); if(acquiredFrameSize!=0) { if(_isFirst) { m_session->GetTimeScale(&_timeval); _isFirst = false; } else { _timeval.tv_usec += 20000; if(_timeval.tv_usec>=1000000) { _timeval.tv_sec ++; _timeval.tv_usec -= 1000000; } } fFrameSize = acquiredFrameSize; fPresentationTime = _timeval; // fDurationInMicroseconds = 20000; } } nextTask() = envir().taskScheduler().scheduleDelayedTask(20000,(TaskFunc*)FramedSource::afterGetting, this); } video's getnextframe each buffer is h264 nalu, and the stream's average framerate is 25fps. void VideoFrameSource::doGetNextFrame() { unsigned int framesize=0; if(m_session!=NULL) { bool lastnalu=false; m_session->GetNextVideoFrame(_firstframe,lastnalu,(char*)fTo,fMaxSize,&framesize,&fNumTruncatedBytes); fFrameSize = framesize; if(framesize!=0) { if(_firstframe) { m_session->GetTimeScale(&_timescale); _firstframe = false; } else if(lastnalu) { _timescale.tv_usec += 40000; if(_timescale.tv_usec>=1000000) { _timescale.tv_sec ++; _timescale.tv_usec -= 1000000; } } fPresentationTime = _timescale; // fDurationInMicroseconds = 40000; } } nextTask() = envir().taskScheduler().scheduleDelayedTask(8000,(TaskFunc*)FramedSource::afterGetting, this); } At 2013-10-24 16:35:56,"Ross Finlayson" wrote: I saw the FAQ in live555 website. It said the live555 sync the audio and video by RTCP's SR packets. So I should create RTCP instance for each RTP source explititly? No, because (assuming that you are controlling the streaming using RTSP) this is done implicitly. (In the RTSP server, this is done when the stream starts playing; in the RTSP client, it is done in the implementation of the "initiate()" function.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Fri Oct 25 21:05:12 2013 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 25 Oct 2013 21:05:12 -0700 Subject: [Live-devel] Regarding the h264 video stream's rtp packet timestamp In-Reply-To: References: <007301cecbe8$8bce5710$a36b0530$@126.com> <165d1067.3256e.141df750527.Coremail.fantasyvideo@126.com> <41e3a3f2.20a94.141e8995385.Coremail.fantasyvideo@126.com> <4CCD8B37-39FC-4EB8-9D76-C48781B1328E@live555.com> Message-ID: <59F2830B-40C2-438A-B4E7-A274B39B2B28@live555.com> > nextTask() = envir().taskScheduler().scheduleDelayedTask(20000,(TaskFunc*)FramedSource::afterGetting, this); [...] > nextTask() = envir().taskScheduler().scheduleDelayedTask(8000,(TaskFunc*)FramedSource::afterGetting, this); This is wrong. Once you've delivered a frame of data (using your "GetNextAudioFrame()"/"GetNextVideoFrame()" calls) to the downstream object, you shouldn't be delaying at all before you complete delivery of the object. Instead, just call FramedSource::afterGetting(this): directly, in each case. Note, however, that you call this ***only if*** you were able to successfully deliver a frame of data - i.e., if the acquired frame size was >0. If, instead, you were not able to deliver a frame of data (i.e., the acquired frame size was 0), then you must arrange for the delivery (and the call to "FramedSource::afterGetting(this):") to take place later, when a frame of data becomes available. Once again, I suggest using the "DeviceSource" code as a model. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sat Oct 26 01:13:22 2013 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 26 Oct 2013 01:13:22 -0700 Subject: [Live-devel] True push DeviceSource In-Reply-To: <5d5ecc8e603d612648c4c340721cf3da@neurosoft.in> References: <1413CC54-024C-4A2E-BBC1-60BEE969E30C@live555.com> <8233b9d2a724a88c383a4ec9a27e71f8@neurosoft.in> <5d5ecc8e603d612648c4c340721cf3da@neurosoft.in> Message-ID: > I am using AC3 codec to encode audio and then stream it using AC3AudioStreamFramer and AC3AudioRTPSink. Am i doing anything wrong ? Yes. You should *not* be using an "AC3AudioStreamFramer". That's intended only for streaming AC3 audio that comes in a byte stream - e.g., from a file. Because your AC3 audio input comes from an encoder - one frame at a time - you should feed it directly into a "AC3AudioRTPSink". (Also, don't forget to set accurate presentation times ("fPresentationTime") for each frame.) Also, you should test your stream first using "testRTSPClient", before trying VLC (or any other media player). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From piers.hawksley at panogenics.com Sat Oct 26 03:39:16 2013 From: piers.hawksley at panogenics.com (Piers Hawksley) Date: Sat, 26 Oct 2013 11:39:16 +0100 Subject: [Live-devel] multicast & linux information Message-ID: <05a1c5aa-ceb8-4093-8d7d-21e6ff3e45bc@email.android.com> >> Attached is the changes I made to Linux 2.6.29 to port the IP_MULTICAST_ALL socket option from 2.6.31. > Out of curiosity - why didn't you just use the 2.6.31 kernel? (It presumably has other fixes/improvements that you would find useful.) The ARM SOC that I'm developing for was provided with 2.6.29 but the changes to the kernel were not mainlined. 
Sadly this is not uncommon, hopefully these changes may be of use to others using live555 on embedded Linux 2.6.29. -------------- next part -------------- An HTML attachment was scrubbed... URL: From david.cassany at i2cat.net Tue Oct 29 01:51:55 2013 From: david.cassany at i2cat.net (David Cassany Viladomat) Date: Tue, 29 Oct 2013 09:51:55 +0100 Subject: [Live-devel] Frame Buffer Message-ID: Hi all, I may ask a very simple question, but I couldn't manage to figure it out by my self as I don't know livemedia library arch deeply. I was wondering if there is any kind of frame buffer implementation or this is out of livemedia scope. I am thinking of moving my code to livemedia555 for an RTP streaming app (receiver and sender sides), but one of my major concerns goes about the buffer, where packets are stored (after being parsed, imagine h264 payload) and where coded frames are stored (a collection of packets). At which classes should I have a look to understand how the buffers works? I Saw FrameSource and MediaSoure or RTPSource but I couldn't manage to undestand if they actually work as a frame buffer, as I could not understand which data units they manage. Thanks in advance. Kind Regards, David -------------- next part -------------- An HTML attachment was scrubbed... URL: From fantasyvideo at 126.com Tue Oct 29 00:55:33 2013 From: fantasyvideo at 126.com (Tony) Date: Tue, 29 Oct 2013 15:55:33 +0800 (CST) Subject: [Live-devel] Regarding the h264 video stream's rtp packet timestamp In-Reply-To: <59F2830B-40C2-438A-B4E7-A274B39B2B28@live555.com> References: <007301cecbe8$8bce5710$a36b0530$@126.com> <165d1067.3256e.141df750527.Coremail.fantasyvideo@126.com> <41e3a3f2.20a94.141e8995385.Coremail.fantasyvideo@126.com> <4CCD8B37-39FC-4EB8-9D76-C48781B1328E@live555.com> <59F2830B-40C2-438A-B4E7-A274B39B2B28@live555.com> Message-ID: <17ccc7e7.8899.1420335e659.Coremail.fantasyvideo@126.com> If I did as you said. VLC always said the input buffer is empty. At 2013-10-26 12:05:12,"Ross Finlayson" wrote: nextTask() = envir().taskScheduler().scheduleDelayedTask(20000,(TaskFunc*)FramedSource::afterGetting, this); [...] nextTask() = envir().taskScheduler().scheduleDelayedTask(8000,(TaskFunc*)FramedSource::afterGetting, this); This is wrong. Once you've delivered a frame of data (using your "GetNextAudioFrame()"/"GetNextVideoFrame()" calls) to the downstream object, you shouldn't be delaying at all before you complete delivery of the object. Instead, just call FramedSource::afterGetting(this): directly, in each case. Note, however, that you call this ***only if*** you were able to successfully deliver a frame of data - i.e., if the acquired frame size was >0. If, instead, you were not able to deliver a frame of data (i.e., the acquired frame size was 0), then you must arrange for the delivery (and the call to "FramedSource::afterGetting(this):") to take place later, when a frame of data becomes available. Once again, I suggest using the "DeviceSource" code as a model. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Oct 29 14:10:32 2013 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 29 Oct 2013 14:10:32 -0700 Subject: [Live-devel] Frame Buffer In-Reply-To: References: Message-ID: > I may ask a very simple question, but I couldn't manage to figure it out by my self as I don't know livemedia library arch deeply. 
I was wondering if there is any kind of frame buffer implementation or this is out of livemedia scope. > > I am thinking of moving my code to livemedia555 for an RTP streaming app (receiver and sender sides), but one of my major concerns goes about the buffer, where packets are stored (after being parsed, imagine h264 payload) and where coded frames are stored (a collection of packets). At which classes should I have a look to understand how the buffers works? I'm not sure I completely understand your question, but 'reading between the lines', I think you are asking this because you want to *display* an incoming video stream (rather than just receive packets, as the "testRTSPClient" demo application does). If that's the case, then you should first look at a media player like VLC , which uses the "LIVE555 Streaming Media" library's RTSP client implementation to receive a RTSP/RTP video (and/or audio) stream, and display it. If you want to program something like this yourself, however, then note that you will need decoding (i.e., codec) software (or hardware) to translate the (compressed, encoded) H.264 video NAL units into displayable video frames. Note, however, that the "LIVE555 Streaming Media" software *does not* include any codec (video decoding) software; you would need to get that from somewhere else. However, if you have video decoding software (or hardware), then you can easily fit this into a LIVE555-based receiver application. See, for example, the comments at line 133 of "testProgs/testRTSPClient.cpp". Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From pgorazda at gmail.com Tue Oct 29 02:59:21 2013 From: pgorazda at gmail.com (Pawel Gorazda) Date: Tue, 29 Oct 2013 10:59:21 +0100 Subject: [Live-devel] patch for Amino STB - JMACX Message-ID: <526F86F9.3080700@gmail.com> Hello, I am using live media server for streaming to Amino STBs. I found out that some Amino API calls could be easily satisfied. Please find attached patch to latest code (live555-latest.tar.gz). It implements support for GetDuration and GetPosition (GetPos) Amino API. Pawel Gorazda -------------- next part -------------- A non-text attachment was scrubbed... Name: live-amino.patch Type: text/x-patch Size: 14707 bytes Desc: not available URL: From david.cassany at i2cat.net Wed Oct 30 01:54:32 2013 From: david.cassany at i2cat.net (David Cassany Viladomat) Date: Wed, 30 Oct 2013 09:54:32 +0100 Subject: [Live-devel] Frame Buffer In-Reply-To: References: Message-ID: Thanks Ross, I will have a look at VLC as you suggest. Sorry if I was not clear enough, in my app (written in plain C) we work at frame level (compressed or not, we already wrapped libavcodec, so we work at AVFrame level rather than at NAL Unit level for h264) in order to transform the image and resend again, so we deliver frame by frame to the decoder as the in following sequence: coded frame -> decoder -> process raw frame -> encoder -> coded frame. So my concern and my question was more about recieving/sending frames than recieving/sending packets, so that I was asking if inside livemedia there is any buffer that stores packets, sorts them and compile them in frames. In our actual code we have a frame buffer (C linked list) where each node represents a frame which is composed by another buffer (C linked list again) that holds all the frame packets. We use linked linked lists because we sort packets dynamically as they arrive. 
Imagine I would like to start an RTPH264VideoSource to gather frames and resend them with RTPH264VideoSink, just dummy test, without decoding and rencoding. I guess I should use doGetNextFrame() method to retrieve the frames but I can't manage to understand how to control it (how do I know when the frame is ready? what happens if there is no new frame? etc.). Thanks for your time Ross, David 2013/10/29 Ross Finlayson > I may ask a very simple question, but I couldn't manage to figure it out > by my self as I don't know livemedia library arch deeply. I was wondering > if there is any kind of frame buffer implementation or this is out of > livemedia scope. > > I am thinking of moving my code to livemedia555 for an RTP streaming app > (receiver and sender sides), but one of my major concerns goes about the > buffer, where packets are stored (after being parsed, imagine h264 payload) > and where coded frames are stored (a collection of packets). At which > classes should I have a look to understand how the buffers works? > > > I'm not sure I completely understand your question, but 'reading between > the lines', I think you are asking this because you want to *display* an > incoming video stream (rather than just receive packets, as the > "testRTSPClient" demo application does). > > If that's the case, then you should first look at a media player like VLC < > http://www.videolan.org/vlc/>, which uses the "LIVE555 Streaming Media" > library's RTSP client implementation to receive a RTSP/RTP video (and/or > audio) stream, and display it. > > If you want to program something like this yourself, however, then note > that you will need decoding (i.e., codec) software (or hardware) to > translate the (compressed, encoded) H.264 video NAL units into displayable > video frames. Note, however, that the "LIVE555 Streaming Media" software > *does not* include any codec (video decoding) software; you would need to > get that from somewhere else. > > However, if you have video decoding software (or hardware), then you can > easily fit this into a LIVE555-based receiver application. See, for > example, the comments at line 133 of "testProgs/testRTSPClient.cpp". > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Oct 30 05:44:38 2013 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 30 Oct 2013 05:44:38 -0700 Subject: [Live-devel] Frame Buffer In-Reply-To: References: Message-ID: <842FE277-E3A3-44AA-B610-7F5A21976718@live555.com> > So my concern and my question was more about recieving/sending frames than recieving/sending packets, so that I was asking if inside livemedia there is any buffer that stores packets, sorts them and compile them in frames. Sort of. First, our RTP reception code automatically ensures that incoming data is delivered in the correct order, even if packets happen to get reordered on the network. So that's not something that you ever need to worry about. Our code also deals with the fact that H.264 NAL units may be fragmented over multiple RTP packets. Our RTP reception code ("H264VideoRTPSource", for H.264 video) always delivers either a complete NAL unit, or (if one of the fragment packets happened to have been lost) no NAL unit at all. 
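(At the receiving end, the 'one complete NAL unit per delivery' behaviour described above is what the "DummySink" class in "testProgs/testRTSPClient.cpp" demonstrates. A condensed variant, with hypothetical names, that a decoder could be hooked into:)

  class NALUnitSink: public MediaSink {
  public:
    static NALUnitSink* createNew(UsageEnvironment& env) { return new NALUnitSink(env); }
  private:
    NALUnitSink(UsageEnvironment& env): MediaSink(env), fBufferSize(100000) {
      fBuffer = new u_int8_t[fBufferSize];
    }
    virtual ~NALUnitSink() { delete[] fBuffer; }

    static void afterGettingFrame(void* clientData, unsigned frameSize,
                                  unsigned /*numTruncatedBytes*/,
                                  struct timeval presentationTime,
                                  unsigned /*durationInMicroseconds*/) {
      NALUnitSink* sink = (NALUnitSink*)clientData;
      // fBuffer[0 .. frameSize-1] now holds one complete NAL unit, already in decode order.
      // A real application would prepend a 0x00000001 start code and pass it, together with
      // "presentationTime", to its decoder here.
      sink->continuePlaying(); // ask for the next NAL unit
    }

    virtual Boolean continuePlaying() {
      if (fSource == NULL) return False;
      fSource->getNextFrame(fBuffer, fBufferSize, afterGettingFrame, this,
                            onSourceClosure, this);
      return True;
    }

    u_int8_t* fBuffer;
    unsigned const fBufferSize;
  };

  // Started, for each video subsession, with something like:
  //   sink->startPlaying(*subsession->readSource(), yourAfterPlayingFunc, sink);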
(Note that if a H.264 video 'access unit' has been split into several slices (each one being a NAL unit), then these slices will be delivered - in order, and completely (or not at all) - to the recipient.) Also, the incoming NAL units will be sorted into 'decode order' - because that's the order (according to the H.264/RTP payload format standard) in which they were originally transmitted. Finally, we have a Boolean function "RTPSource::curPacketMarkerBit()" that the receiving code can call to check whether or not the most recently-received NAL unit ends an 'access unit'. The bottom line is that our code automatically deals with the RTP packetization/depacketization. All you (as a programmer) need to deal with is 'NAL units'. You can illustrate this by running the "testRTSPClient" demo application on a H.264 RTSP stream. See also http://www.live555.com/liveMedia/faq.html#testRTSPClient-how-to-decode-data (ps. When responding to an email, please remember to properly trim the text of the quoted message that you're responding to.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From nambirajan.manickam at i-velozity.com Wed Oct 30 10:23:23 2013 From: nambirajan.manickam at i-velozity.com (Nambirajan M) Date: Wed, 30 Oct 2013 22:53:23 +0530 Subject: [Live-devel] Reagarding normal play time (NPT ) in Live555Media Sever Message-ID: <000001ced594$c30a20a0$491e61e0$@manickam@i-velozity.com> Hi Ross, Is it possible to get the normal play time for each PLAY request from the RTSP Client without RANGE HEADER information included in the client request. We checked the code in handleCmd_PLAY in RTSPServer.cpp. In this function, if the Range Header is not in the client request, the Live555MediaServer Sends the response as Range: npt = 0.000. But when the PLAY request is sent from the client after the PAUSE request, the Server sends the normal play time Correctly. We want to implement the response from the Live555Server to include the normal play time from the Server to PLAY request from the RTSP Client even Without the RANGE info in the Client PLAY request. Thanks and regards, M. Nambirajan -------------- next part -------------- An HTML attachment was scrubbed... URL: From ssingh at neurosoft.in Wed Oct 30 11:35:38 2013 From: ssingh at neurosoft.in (ssingh at neurosoft.in) Date: Thu, 31 Oct 2013 00:05:38 +0530 Subject: [Live-devel] True push DeviceSource In-Reply-To: References: <1413CC54-024C-4A2E-BBC1-60BEE969E30C@live555.com> <8233b9d2a724a88c383a4ec9a27e71f8@neurosoft.in> <5d5ecc8e603d612648c4c340721cf3da@neurosoft.in> Message-ID: <4df4c57a617163f57deb13d0fb512d08@neurosoft.in> Hi, Thanks for the guidance and I tried it. Now I can see the packets being transmitted in testRTSPClient and also in VLC. VLC shows packets being decoded for both audio and video. But still somehow video is running smooth but audio just comes and goes. Its like something is choking audio. It comes for a second and then goes back. What area should I see for possible issue. I tried changingthe estimate bandwidth for video and currently set it to 4500 and for audio set it to 200 but still same issue. I checked time stamps but they seems fine in testRTSPClient. Thanks, Caduceus On 2013-10-26 13:43, Ross Finlayson wrote: >> I am using AC3 codec to encode audio and then stream it using >> AC3AudioStreamFramer and AC3AudioRTPSink. Am i doing anything wrong >> ? > > Yes. You should *not* be using an "AC3AudioStreamFramer". 
That's > intended only for streaming AC3 audio that comes in a byte stream - > e.g., from a file. Because your AC3 audio input comes from an encoder > - one frame at a time - you should feed it directly into a > "AC3AudioRTPSink". (Also, don't forget to set accurate presentation > times ("fPresentationTime") for each frame.) > > Also, you should test your stream first using "testRTSPClient", before > trying VLC (or any other media player). > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ [1] > > > Links: > ------ > [1] http://www.live555.com/ > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel From michel.promonet at thalesgroup.com Wed Oct 30 10:18:05 2013 From: michel.promonet at thalesgroup.com (PROMONET Michel) Date: Wed, 30 Oct 2013 18:18:05 +0100 Subject: [Live-devel] Matroska BANK_SIZE overflow Message-ID: <31396_1383153571_52713FA3_31396_3808_1_1BE8971B6CFF3A4F97AF4011882AA25501563E9ED5C5@THSONEA01CMS01P.one.grp> Hi Ross, Searching in the mailing list, I saw some discussion about the BANK_SIZE, the last one I found seems to say that it is no more needed at least for H264 parsing. I reach this limit parsing an MKV file with an H264 stream. It's annoying that parsing a file makes abort the process with the following backtrace : #2 0x00000000008c7b61 in UsageEnvironment::internalError (this=0x138fd60) at UsageEnvironment.cpp:46 #3 0x00000000008bc809 in StreamParser::ensureValidBytes1 (this=0x13e0530, numBytesNeeded=289877) at StreamParser.cpp:151 #4 0x00000000008bf1c1 in StreamParser::ensureValidBytes (this=0x13e0530, numBytesNeeded=289877) at StreamParser.hh:118 #5 0x00000000008bf12a in StreamParser::skipBytes (this=0x13e0530, numBytes=289877) at StreamParser.hh:92 #6 0x00000000008b4760 in MatroskaFileParser::parseBlock (this=0x13e0530) at MatroskaFileParser.cpp:769 #7 0x00000000008b3179 in MatroskaFileParser::parse (this=0x13e0530) at MatroskaFileParser.cpp:155 #8 0x00000000008b3006 in MatroskaFileParser::continueParsing (this=0x13e0530) at MatroskaFileParser.cpp:96 ... Could be possible in a future release to add a way to customize this value and return a parsing error instead of aborting ? I guess it could be possible to customize the buffer size setting a value in the UsageEnvironement, and return a parsing error when the buffer overflow ? Best Regards, Michel. -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Oct 30 11:58:01 2013 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 30 Oct 2013 11:58:01 -0700 Subject: [Live-devel] Reagarding normal play time (NPT ) in Live555Media Sever In-Reply-To: <000001ced594$c30a20a0$491e61e0$@manickam@i-velozity.com> References: <000001ced594$c30a20a0$491e61e0$@manickam@i-velozity.com> Message-ID: <07C1A623-CBCA-4510-9318-E278139E73E7@live555.com> > Is it possible to get the normal play time for each PLAY request from the RTSP Client without RANGE HEADER information included in the client request. I'm not sure I understand your question. The client should always be able to get the 'normal play time' by calling "MediaSubsession:getNormalPlayTime()". At the server end: If the "PLAY" request did not contain a "Range:" header, then the server will still send back a "Range:" header - starting with the current 'normal play time', but (unless you're resuming from a "PAUSE") this 'normal play time' will obviously be 0. 
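(On the client side, the call Ross refers to is a one-liner. A sketch, assuming "subsession" points to the MediaSubsession being received and "pts" is the presentation time of a just-received frame:)

  double npt = subsession->getNormalPlayTime(pts);
  // Maps the frame's presentation time to the stream's current normal play time, using the
  // information carried in the RTP timestamps, RTCP "SR" packets and the "RTP-Info:" header.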
From finlayson at live555.com Wed Oct 30 12:01:58 2013
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 30 Oct 2013 12:01:58 -0700
Subject: [Live-devel] Matroska BANK_SIZE overflow
In-Reply-To: <31396_1383153571_52713FA3_31396_3808_1_1BE8971B6CFF3A4F97AF4011882AA25501563E9ED5C5@THSONEA01CMS01P.one.grp>
References: <31396_1383153571_52713FA3_31396_3808_1_1BE8971B6CFF3A4F97AF4011882AA25501563E9ED5C5@THSONEA01CMS01P.one.grp>
Message-ID:

> I hit this limit parsing an MKV file with an H264 stream.

So please put this file on a (publicly accessible) web server, and send us the URL, so we can download and test it for ourselves.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From finlayson at live555.com Wed Oct 30 15:23:27 2013
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 30 Oct 2013 15:23:27 -0700
Subject: [Live-devel] patch for Amino STB - JMACX
In-Reply-To: <526F86F9.3080700@gmail.com>
References: <526F86F9.3080700@gmail.com>
Message-ID: <7BEC0991-9686-495E-824F-5653B9CCD335@live555.com>

Thanks for the note. However, I won't be adding these changes to the supplied "LIVE555 Streaming Media" code, unless I am contacted *directly* by 'Amino Corporation' (not just by some intermediary), explaining why they continue to violate established Internet standards, and why (in spite of this) I should continue to spend time modifying our code to conform to their hacked, standards-ignorant hardware. In particular:
- There should be no reason for servers to return their custom "a=X-duration:" header, when the standard "a=range:" header does the same job.
- There is a standard way for a server to convey the stream's 'normal play time' to the client - namely, via RTP timestamps, combined with RTCP "SR" packets, and the RTSP "RTP-Info:" header. But wait - Amino had earlier decided not to use RTP/RTCP at all, but instead to transmit their streams via raw-UDP. So, they are now 'reaping what they have sown', because - without RTP/RTCP - their new, nonstandard "GET_PARAMETER position:" hack is the only way for them to get the stream's NPT from the server. (I also note that - without RTP/RTCP - they have no way to handle network packet reordering (whereas this comes for free with RTP).)
I would be thrilled if 'Amino' were to upgrade their products to be more standards-compliant (and I would be happy to help them do so, should they desire). Failing this, however, I recommend that, instead of using 'Amino' set-top box clients, people use other, more standards-compliant products instead. (Note that if anyone does choose to make these (or any other) modifications to the LIVE555 code, then they are bound by the terms of the LGPL; see .)

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From piers.hawksley at panogenics.com Wed Oct 30 16:02:35 2013
From: piers.hawksley at panogenics.com (Piers Hawksley)
Date: Wed, 30 Oct 2013 23:02:35 +0000
Subject: [Live-devel] MPEG2 Transport Stream Packet ID
Message-ID: <5271900B.2080001@panogenics.com>

Hi Ross, Can the MPEG2 transport stream packet ID be set?
The code appears (in MPEG2TransportStreamFromESSource::addNewVideoSource) to set this to 0xE0. Changing this causes MPEG2TransportStreamMultiplexor::handleNewBuffer to fail 'if ((stream_id&0xF0) == 0xE0) { // video', and thus set 'streamType = 0x81; // private'. For test purposes I've hard-coded: u_int8_t streamId = 0x20 | (fVideoSourceCounter++&0x0F); in MPEG2TransportStreamFromESSource::addNewVideoSource and changed the if statement mentioned above (in MPEG2TransportStreamMultiplexor::handleNewBuffer) to: if (((stream_id&0xF0) == 0xE0)||((stream_id&0xF0) == 0x20)) { // video to get the packet ID to be 0x20. Am I understanding the code or the MPEG transport stream format wrongly? I realise 0xE0 is the first stream in a Packetized Elementary Stream, but I *think* this is separate from the Transport Stream Packet ID. Best Regards, Piers Hawksley

From finlayson at live555.com Wed Oct 30 16:26:56 2013
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 30 Oct 2013 16:26:56 -0700
Subject: [Live-devel] MPEG2 Transport Stream Packet ID
In-Reply-To: <5271900B.2080001@panogenics.com>
References: <5271900B.2080001@panogenics.com>
Message-ID: <4317369D-E003-4DDE-9D63-49F8A4FF0F2E@live555.com>

As you noted, to simplify the code - when constructing a new Transport Stream - we reuse the PES packet's "stream id" to also be the Transport Stream "PID". This has worked OK for everyone so far, so I wasn't planning on changing this, unless there's a compelling reason to do so...

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From piers.hawksley at panogenics.com Wed Oct 30 16:33:58 2013
From: piers.hawksley at panogenics.com (Piers Hawksley)
Date: Wed, 30 Oct 2013 23:33:58 +0000
Subject: [Live-devel] RTSP Interleaved Streams
Message-ID: <52719766.8010600@panogenics.com>

Hi Ross, One of our customers is trying to get RTSP interleaved streams from our live555 server. I have posted the 178KB Wireshark trace online at http://www.hawksley42.co.uk/amg-dump.pcapng. Frame 177 has a block of 112 bytes on the end which does not appear to be valid data. Do I need to change how I set up the unicast OnDemandServerMediaSubsession if the client may request interleaved streams? Can you suggest what I may try to test this further? I have not yet found a way to get VLC to request RTSP interleaved data, so I am limited in how to test this at present. The version of Live555 that is in use is not up-to-date, as our customer is using an old version of our software; we try very hard to release with the most recent version of live555. Best Regards, Piers Hawksley

From bbischan at watchtower-security.com Wed Oct 30 18:37:27 2013
From: bbischan at watchtower-security.com (Bob Bischan)
Date: Wed, 30 Oct 2013 20:37:27 -0500
Subject: [Live-devel] RTSP Interleaved Streams
In-Reply-To: <52719766.8010600@panogenics.com>
References: <52719766.8010600@panogenics.com>
Message-ID:

Piers,

> I have not yet found a way to get VLC to request RTSP interleaved data, so I am limited in how to test this at present.

VLC does support RTSP-over-TCP. This option may be configured from the GUI or from the VLC config file.

GUI: Tools > Preferences > Input/Codecs > Demuxers > RTP/RTSP > Use RTP over RTSP (TCP)

Config File (Linux): ~/.config/VLC/vlcrc
# Use RTP over RTSP (TCP) (boolean)
rtsp-tcp=1

On Windows you might have to search around for this file.

> Frame 177 has a block of 112 bytes on the end which does not appear to be valid data.
I believe this might be related to VLC sending GET_PARAMETER requests for keep-alive. live555 appears to have data in the body of the response (LIVEMEDIA_LIBRARY_VERSION_STRING). If you run VLC from the CLI like this:

cvlc -vvv rtsp://some-ip:port/somestream --sout=file/mp4:file.mp4

do you see this or something similar?

Have received 112 total bytes of a GET_PARAMETER RTSP response; awaiting 2 bytes more.

I'm not sure whether this is significant or not, but I thought it might be helpful to pass on my observation. I wish VLC would get up to date with the live555 code base! Bob

On Wed, Oct 30, 2013 at 6:33 PM, Piers Hawksley <piers.hawksley at panogenics.com> wrote:
> Hi Ross,
> One of our customers is trying to get RTSP interleaved streams from our live555 server. I have posted the 178KB Wireshark trace online at http://www.hawksley42.co.uk/amg-dump.pcapng.
> Frame 177 has a block of 112 bytes on the end which does not appear to be valid data.
> Do I need to change how I set up the unicast OnDemandServerMediaSubsession if the client may request interleaved streams?
> Can you suggest what I may try to test this further? I have not yet found a way to get VLC to request RTSP interleaved data, so I am limited in how to test this at present.
> The version of Live555 that is in use is not up-to-date, as our customer is using an old version of our software; we try very hard to release with the most recent version of live555.
> Best Regards,
> Piers Hawksley

--
Bob Bischan
Manager (Operations/Software Development)
WATCHTOWER SECURITY, INC.
2418 Northline Industrial Drive
Maryland Heights, MISSOURI 63043
314 427 4586 office 314 330 9001 cell 314 427 8823 fax
www.watchtower-security.com
"Protecting your community and those you value most."

From finlayson at live555.com Wed Oct 30 18:51:44 2013
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 30 Oct 2013 18:51:44 -0700
Subject: [Live-devel] RTSP Interleaved Streams
In-Reply-To: <52719766.8010600@panogenics.com>
References: <52719766.8010600@panogenics.com>
Message-ID: <37EED4AF-38F2-47C4-91F5-515842EC3601@live555.com>

> Do I need to change how I set up the unicast OnDemandServerMediaSubsession if the client may request interleaved streams?

No. The server will serve this automatically, iff the client requests it.

> The version of Live555 that is in use is not up-to-date, as our customer is using an old version of our software.

That's likely your problem. There have been *many* bugs in the implementation of RTP/RTCP-over-TCP (for both clients and servers) that have been fixed in recent months. Both your client and your server should upgrade to the latest version of the "LIVE555 Streaming Media" code.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
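For completeness, a LIVE555-based test client can also request TCP-interleaved streaming programmatically. A minimal sketch, illustrative only: the helper name "setupSubsessionOverTCP" is made up, the REQUEST_STREAMING_OVER_TCP constant mirrors the one in the "testRTSPClient" demo, and "continueAfterSETUP" stands for the application's own response handler.

#include "liveMedia.hh"

#define REQUEST_STREAMING_OVER_TCP True

// Ask the server to stream this subsession interleaved over the RTSP TCP
// connection, rather than over separate UDP ports:
void setupSubsessionOverTCP(RTSPClient* rtspClient, MediaSubsession* subsession,
                            RTSPClient::responseHandler* continueAfterSETUP) {
  rtspClient->sendSetupCommand(*subsession, continueAfterSETUP,
                               False /*streamOutgoing*/,
                               REQUEST_STREAMING_OVER_TCP /*streamUsingTCP*/);
}

Flipping the same flag to False falls back to ordinary UDP streaming, which makes it easy to compare the two transports against the same server.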
From ssingh at neurosoft.in Wed Oct 30 17:07:12 2013
From: ssingh at neurosoft.in (ssingh at neurosoft.in)
Date: Thu, 31 Oct 2013 05:37:12 +0530
Subject: [Live-devel] True push DeviceSource
In-Reply-To: <4df4c57a617163f57deb13d0fb512d08@neurosoft.in>
References: <1413CC54-024C-4A2E-BBC1-60BEE969E30C@live555.com> <8233b9d2a724a88c383a4ec9a27e71f8@neurosoft.in> <5d5ecc8e603d612648c4c340721cf3da@neurosoft.in> <4df4c57a617163f57deb13d0fb512d08@neurosoft.in>
Message-ID: <34986c04be36052138c034c559752364@neurosoft.in>

Hi, I tried to use openRTSP.exe on the output with the -F option to save the audio and video, and I found that the audio is saved just fine. I was not able to play back the video in the saved file, but at least I know that my audio is being streamed. However, I am not sure why VLC is not rendering the audio. Is there something that openRTSP must be doing that VLC is not? -Caduceus

On 2013-10-31 00:05, ssingh at neurosoft.in wrote:
> Hi,
> Thanks for the guidance; I tried it. Now I can see the packets being transmitted in testRTSPClient and also in VLC. VLC shows packets being decoded for both audio and video. But somehow the video runs smoothly while the audio just comes and goes - it is as if something is choking the audio: it plays for a second and then drops out again. What area should I look at for the possible issue? I tried changing the estimated bandwidth (currently set to 4500 for video and 200 for audio), but the issue remains. I checked the timestamps and they seem fine in testRTSPClient.
> Thanks,
> Caduceus
>
> On 2013-10-26 13:43, Ross Finlayson wrote:
>>> I am using AC3 codec to encode audio and then stream it using AC3AudioStreamFramer and AC3AudioRTPSink. Am I doing anything wrong?
>>
>> Yes. You should *not* be using an "AC3AudioStreamFramer". That's intended only for streaming AC3 audio that comes in a byte stream - e.g., from a file. Because your AC3 audio input comes from an encoder - one frame at a time - you should feed it directly into a "AC3AudioRTPSink". (Also, don't forget to set accurate presentation times ("fPresentationTime") for each frame.)
>>
>> Also, you should test your stream first using "testRTSPClient", before trying VLC (or any other media player).
>>
>> Ross Finlayson
>> Live Networks, Inc.
>> http://www.live555.com/

From ssingh at neurosoft.in Wed Oct 30 18:26:39 2013
From: ssingh at neurosoft.in (ssingh at neurosoft.in)
Date: Thu, 31 Oct 2013 06:56:39 +0530
Subject: [Live-devel] fNumchannels hardcoded
Message-ID: <9facf41c3cc083cdb570679fe9ccc074@neurosoft.in>

Hi, I found that when I check my stream using ffmpeg, ffmpeg displays the channels as 1 and the audio codec as (null). I am using AC3AudioRTPSink in live555. I tried to debug the live555 source and found that fNumChannels is not set when the AC3AudioRTPSink object is created (in the call to the base class AudioRTPSink). Is that intentional, or something that we need to correct? I think it should be configurable, with a setter in the AC3AudioRTPSink class. Also, I am not sure why ffmpeg displays the audio codec as (null).
When I look at the debug info from ffmpeg, it displays:
===============================================
[rtsp @ 0085e9a0] SDP:
v=0
o=- 1383179174903374 1 IN IP4 10.20.22.20
s=Session streamed by "Ekomsys"
i=Streaming channel using Ekomsys streaming server
t=0 0
a=tool:LIVE555 Streaming Media v2011.12.23
a=type:broadcast
a=control:*
a=range:npt=0-
a=x-qt-text-nam:Session streamed by "Ekomsys"
a=x-qt-text-inf:Streaming channel using Ekomsys streaming server
m=audio 6666 RTP/AVP 96
c=IN IP4 239.255.42.42/255
b=AS:200
a=rtpmap:96 AC3/48000
a=control:track1
m=video 8888 RTP/AVP 96
c=IN IP4 239.255.42.42/255
b=AS:45000
a=rtpmap:96 MP4V-ES/90000
a=fmtp:96 profile-level-id=1;config=000001B001000001B58913000001000000012000C48D8800F50644191463000001B24C61766335352E31382E313032
a=control:track2
[rtsp @ 0085e9a0] audio codec set to: (null)
[rtsp @ 0085e9a0] audio samplerate set to: 48000
[rtsp @ 0085e9a0] audio channels set to: 1
[rtsp @ 0085e9a0] video codec set to: mpeg4
=====================================================
which makes me think that it recognizes that it's AC3 format but still reports (null) as the audio codec. I used the following command: "ffplay -rtsp_transport udp_multicast rtsp://10.20.22.20:8554/Ekomsys_stream -loglevel debug" Thanks, Caduceus

From finlayson at live555.com Wed Oct 30 18:57:18 2013
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 30 Oct 2013 18:57:18 -0700
Subject: [Live-devel] True push DeviceSource
In-Reply-To: <34986c04be36052138c034c559752364@neurosoft.in>
References: <1413CC54-024C-4A2E-BBC1-60BEE969E30C@live555.com> <8233b9d2a724a88c383a4ec9a27e71f8@neurosoft.in> <5d5ecc8e603d612648c4c340721cf3da@neurosoft.in> <4df4c57a617163f57deb13d0fb512d08@neurosoft.in> <34986c04be36052138c034c559752364@neurosoft.in>
Message-ID:

> I tried to use openRTSP.exe on the output with the -F option to save the audio and video, and I found that the audio is saved just fine. I was not able to play back the video in the saved file.

Note - the saved video file is just a H.264 Elementary Stream file. If you rename it to have a ".h264" filename suffix (*not* ".264"), then VLC may be able to play it.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From finlayson at live555.com Wed Oct 30 19:37:33 2013
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 30 Oct 2013 19:37:33 -0700
Subject: [Live-devel] fNumchannels hardcoded
In-Reply-To: <9facf41c3cc083cdb570679fe9ccc074@neurosoft.in>
References: <9facf41c3cc083cdb570679fe9ccc074@neurosoft.in>
Message-ID: <530DF121-833A-48E8-9335-900FD186A856@live555.com>

> a=tool:LIVE555 Streaming Media v2011.12.23

Why are you using such an old version of the "LIVE555 Streaming Media" code? You should upgrade to the latest version (the only version that we support).

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From piers.hawksley at panogenics.com Thu Oct 31 00:29:37 2013
From: piers.hawksley at panogenics.com (Piers Hawksley)
Date: Thu, 31 Oct 2013 07:29:37 +0000
Subject: [Live-devel] RTSP Interleaved Streams
In-Reply-To:
References: <52719766.8010600@panogenics.com>
Message-ID: <2b47484d-ba40-4890-ad44-5e62242a2096@email.android.com>

Thanks Bob, Thanks for the pointers on how to test with VLC. Very useful info on GET_PARAMETER. I'll post my findings early next week. Cheers, Piers

-- Sent from my Android phone with K-9 Mail. Please excuse my brevity.
-------- Original Message --------
From: Bob Bischan
Sent: Thu Oct 31 01:37:27 GMT 2013
To: LIVE555 Streaming Media - development & use
Subject: Re: [Live-devel] RTSP Interleaved Streams
From piers.hawksley at panogenics.com Thu Oct 31 00:39:49 2013
From: piers.hawksley at panogenics.com (Piers Hawksley)
Date: Thu, 31 Oct 2013 07:39:49 +0000
Subject: [Live-devel] RTSP Interleaved Streams
In-Reply-To: <37EED4AF-38F2-47C4-91F5-515842EC3601@live555.com>
References: <52719766.8010600@panogenics.com> <37EED4AF-38F2-47C4-91F5-515842EC3601@live555.com>
Message-ID:

Thanks Ross, I'll test the current live555 code on Monday against the most recent VLC and get that out to my customers who reported the issue, with a note that they should update their client. Thanks for all your help and a great library. Cheers, Piers

-- Sent from my Android phone with K-9 Mail. Please excuse my brevity.

From ssingh at neurosoft.in Thu Oct 31 00:40:31 2013
From: ssingh at neurosoft.in (ssingh at neurosoft.in)
Date: Thu, 31 Oct 2013 13:10:31 +0530
Subject: [Live-devel] fNumchannels hardcoded
In-Reply-To: <530DF121-833A-48E8-9335-900FD186A856@live555.com>
References: <9facf41c3cc083cdb570679fe9ccc074@neurosoft.in> <530DF121-833A-48E8-9335-900FD186A856@live555.com>
Message-ID: <62da04dedcf103e443ae3ea307749af2@neurosoft.in>

Hi Ross, I tried the latest live555 but still have the same issue.
==========================================
v=0
o=- 1383204769034542 1 IN IP4 192.168.1.104
s=Session streamed by "Ekomsys"
i=Streaming channel using Ekomsys streaming server
t=0 0
a=tool:LIVE555 Streaming Media v2013.10.03
a=type:broadcast
a=control:*
a=range:npt=0-
a=x-qt-text-nam:Session streamed by "Ekomsys"
a=x-qt-text-inf:Streaming channel using Ekomsys streaming server
m=audio 6666 RTP/AVP 96
c=IN IP4 239.255.42.42/255
b=AS:200
a=rtpmap:96 AC3/48000
a=control:track1
m=video 8888 RTP/AVP 96
c=IN IP4 239.255.42.42/255
b=AS:45000
a=rtpmap:96 MP4V-ES/90000
a=fmtp:96 profile-level-id=1;config=000001B001000001B58913000001000000012000C48D8800F50644191463000001B24C61766335352E31382E313032
a=control:track2
==========================================
Is my understanding that fNumChannels should be configurable correct or not? -Caduceus

On 2013-10-31 08:07, Ross Finlayson wrote:
>> a=tool:LIVE555 Streaming Media v2011.12.23
>
> Why are you using such an old version of the "LIVE555 Streaming Media" code? You should upgrade to the latest version (the only version that we support).
>
> Ross Finlayson
> Live Networks, Inc.
> http://www.live555.com/

From finlayson at live555.com Thu Oct 31 01:15:19 2013
From: finlayson at live555.com (Ross Finlayson)
Date: Thu, 31 Oct 2013 01:15:19 -0700
Subject: [Live-devel] fNumchannels hardcoded
In-Reply-To: <62da04dedcf103e443ae3ea307749af2@neurosoft.in>
References: <9facf41c3cc083cdb570679fe9ccc074@neurosoft.in> <530DF121-833A-48E8-9335-900FD186A856@live555.com> <62da04dedcf103e443ae3ea307749af2@neurosoft.in>
Message-ID:

> Is my understanding that fNumChannels should be configurable correct or not?

Not for AC-3 audio, because AC-3 frames are 'self-describing'. In particular, the number of channels is determined by the AC-3 content. If 'ffmpeg' (or VLC) has problems playing your stream, then you will need to use a mailing list for that software - not this mailing list.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From marcin at speed666.info Thu Oct 31 01:33:09 2013
From: marcin at speed666.info (Marcin)
Date: Thu, 31 Oct 2013 09:33:09 +0100
Subject: [Live-devel] Problems with openRTSP on ARM device
In-Reply-To: <524C2815.8000304@speed666.info>
References: <524C1994.50509@speed666.info> <524C2815.8000304@speed666.info>
Message-ID: <527215C5.9090405@speed666.info>

Hi, Again I will reply to myself: the correct fix was to increase net.core.rmem_max with sysctl. Marcin

On 2013-10-02 16:05, Marcin wrote:
> Hi all,
> I will reply to myself because I forgot to mention: the camera produces a good, working H264 stream.
> Proof: http://www.speed666.info/testok.h264
> I can record it via the openRTSP command from a different host without problems, so it looks like a compiler or Linux limitation - but where should I search?
> Marcin
>
> On 2013-10-02 15:03, Marcin wrote:
>> Hello,
>> I wanted to ask you because I have no idea where the problem may sit. I have compiled openRTSP, which I use to dump H264 streams from an RTSP server over localhost. This approach works fine with many IP cameras that I use for my purpose. Suddenly I found a device that produces weird files.
>>
>> The incoming buffer size by default is 100000 - and everything works great as long as an I-frame is not larger than that. Normally, if I set a bitrate of around 4-6 Mbit, I-frames are bigger, so I increase the buffer size to 300000, for example, and everything works fine.
>>
>> But this time the exported file is broken - not playable, and its structure looks broken. Meanwhile, the live view plays perfectly via VLC over RTSP.
>>
>> Produced files:
>> http://www.speed666.info/good.h264 - this is below the 100000 limit
>> http://www.speed666.info/bad.h264 - this is a bitrate where the buffer is too small
>>
>> I tried to increase the buffer size, but this only produces bad files - nothing more.
>>
>> Where should I look for the problem? The same binary running on a different camera with the same CPU works well. Any clues?
>>
>> Marcin
>> WebCamera.pl

From michel.promonet at thalesgroup.com Thu Oct 31 02:55:18 2013
From: michel.promonet at thalesgroup.com (PROMONET Michel)
Date: Thu, 31 Oct 2013 10:55:18 +0100
Subject: [Live-devel] ServerMediaSession destructor should be protected ?
Message-ID: <4021_1383213323_5272290B_4021_551_1_1BE8971B6CFF3A4F97AF4011882AA25501563EA2EDE3@THSONEA01CMS01P.one.grp>

Hi Ross, I spent some time tracking down a memory leak that I caused in our code. Unfortunately I deleted a ServerMediaSession directly; the BasicUsageEnvironment reclaim then doesn't call the destructor, because some media are still registered in the liveMediaPriv table. Obviously, replacing the delete call with Medium::close fixes the memory leak. It would be nice to have the ServerMediaSession destructor protected, in order to prevent mistakes like mine. Best Regards, Michel. [@@ THALES GROUP INTERNAL @@]
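To make the distinction concrete, a minimal sketch (illustrative only: "releaseSession" is a made-up helper, and the comments simply restate the behaviour described above):

#include "liveMedia.hh"

// LIVE555 'Medium' objects - ServerMediaSession included - should be released
// via Medium::close(), not with a bare C++ delete:
void releaseSession(ServerMediaSession* sms) {
  // delete sms;        // leaks: bypasses the library's liveMediaPriv bookkeeping
  Medium::close(sms);   // unregisters the object and reclaims it properly
}

(If the session has been added to an RTSPServer, the server's removeServerMediaSession() call is the usual way to detach it, and it appears to handle the release itself.)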
From finlayson at live555.com Thu Oct 31 06:38:23 2013
From: finlayson at live555.com (Ross Finlayson)
Date: Thu, 31 Oct 2013 06:38:23 -0700
Subject: [Live-devel] ServerMediaSession destructor should be protected ?
In-Reply-To: <4021_1383213323_5272290B_4021_551_1_1BE8971B6CFF3A4F97AF4011882AA25501563EA2EDE3@THSONEA01CMS01P.one.grp>
References: <4021_1383213323_5272290B_4021_551_1_1BE8971B6CFF3A4F97AF4011882AA25501563EA2EDE3@THSONEA01CMS01P.one.grp>
Message-ID: <170F8423-262E-4F66-AD0B-2579EF735C75@live555.com>

> It would be nice to have the ServerMediaSession destructor protected, in order to prevent mistakes like mine.

Thanks for the note. This will be fixed in the next release of the software.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From admin at music.vt.edu Thu Oct 31 12:07:40 2013
From: admin at music.vt.edu (VTMusicAdmin)
Date: Thu, 31 Oct 2013 15:07:40 -0400
Subject: [Live-devel] doubled framerate necessary to correctly capture RTSP stream using openRTSP
Message-ID:

I am experiencing an issue with openRTSP for which I've not yet been able to find an explanation. If there is a more appropriate forum for this or a FAQ I have missed, please let me know. I'm attempting to use openRTSP to capture and record the RTSP stream from an Extron SME-100 live streaming hardware encoder. The encoder is configured with a frame rate / GOP length of 30 fps, yet if I set openRTSP with a frame rate of 30 like this:

openRTSP -d 300 -4 -w 640 -h 360 -f 30 -Q rtsp://URL > test.mp4 (or)
openRTSP -d 300 -q -w 640 -h 360 -f 30 -Q rtsp://URL > test.mov

The video is the incorrect speed and does not match the audio. If however, I set openRTSP with a frame rate of 60 with:

openRTSP -d 300 -4 -w 640 -h 360 -f 60 -Q rtsp://URL > test.mp4 (or)
openRTSP -d 300 -q -w 640 -h 360 -f 60 -Q rtsp://URL > test.mov

The video is captured correctly and is synchronous with the audio for the duration of the capture. This is a perfectly viable work-around, but I'm curious about why openRTSP only seems to work as expected when set to capture at a 2x (actual) frame rate. Am I misunderstanding something about the use of openRTSP or the process by any chance? Any insight that can be offered into this situation would be greatly appreciated; thank you!

-- Michael Dunston
-- Music and Technology
-- School of Performing Arts
-- Music | Theatre | Cinema

From finlayson at live555.com Thu Oct 31 18:19:13 2013
From: finlayson at live555.com (Ross Finlayson)
Date: Thu, 31 Oct 2013 18:19:13 -0700
Subject: [Live-devel] doubled framerate necessary to correctly capture RTSP stream using openRTSP
In-Reply-To:
References:
Message-ID: <5F93D876-4E68-4FCA-8EA4-A22AB077E287@live555.com>

> The encoder is configured with a frame rate / GOP length of 30 fps, yet if I set openRTSP with a frame rate of 30 like this:
>
> openRTSP -d 300 -4 -w 640 -h 360 -f 30 -Q rtsp://URL > test.mp4 (or)
> openRTSP -d 300 -q -w 640 -h 360 -f 30 -Q rtsp://URL > test.mov
>
> The video is the incorrect speed and does not match the audio. If however, I set openRTSP with a frame rate of 60 with:
>
> openRTSP -d 300 -4 -w 640 -h 360 -f 60 -Q rtsp://URL > test.mp4 (or)
> openRTSP -d 300 -q -w 640 -h 360 -f 60 -Q rtsp://URL > test.mov
>
> The video is captured correctly and is synchronous with the audio for the duration of the capture.
> This is a perfectly viable work-around, but I'm curious about why openRTSP only seems to work as expected when set to capture at a 2x (actual) frame rate.

Yes, I also find this strange. If anyone out there wants to take a look at this, the source code module to look at is "QuickTimeFileSink" (in the "liveMedia" directory).

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
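For anyone who does take a look, this is roughly where openRTSP's -w, -h and -f options end up. This is a hedged sketch only: the parameter order and defaults of QuickTimeFileSink::createNew() are reproduced from memory and should be checked against QuickTimeFileSink.hh, and "env"/"session" are assumed to have been set up in the usual openRTSP way.

#include "liveMedia.hh"

// Sketch: openRTSP's "-w 640 -h 360 -f 30 -4" translate roughly into a call
// like this. The movieFPS argument feeds the output file's timing, which is
// presumably where the doubled-frame-rate symptom comes from.
QuickTimeFileSink* createQuickTimeSink(UsageEnvironment& env, MediaSession& session) {
  return QuickTimeFileSink::createNew(env, session, "test.mp4",
                                      100000,   // bufferSize
                                      640, 360, // movieWidth, movieHeight ("-w", "-h")
                                      30,       // movieFPS ("-f")
                                      False,    // packetLossCompensate
                                      False,    // syncStreams
                                      False,    // generateHintTracks
                                      True);    // generateMP4Format ("-4")
}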