From ds at viddiga.com Fri Mar 2 02:10:20 2012 From: ds at viddiga.com (David Scravaglieri) Date: Fri, 2 Mar 2012 11:10:20 +0100 Subject: [Live-devel] Questions about project to dump Images frames from H264 stream using live555 and ffmpeg Message-ID: <39A4CBE0-F701-49A0-A358-CE9671D9F051@viddiga.com> Hi, Well I am really stuck. My project is to dump images from a live H264 stream using live555 and ffmpeg. I started with the testRTSPClient program, and in the afterGettingFrame method I want to send the fReceiveBuffer to a class that will decode each frame with ffmpeg and dump all images to a directory. But I can't find any clear sample code or how-to explaining how to do that. I have read some posts on the mailing list, but it's not clear. How do I create a suitable ffmpeg codec context for consuming the fReceiveBuffer? What about using parseSPropParameterSets? Do I have to use ffmpeg's av_parser_parse2 method? Any help would be appreciated. Thanks in advance, David. From bstump at codemass.com Fri Mar 2 09:05:18 2012 From: bstump at codemass.com (Barry Stump) Date: Fri, 2 Mar 2012 09:05:18 -0800 Subject: [Live-devel] Questions about project to dump Images frames from H264 stream using live555 and ffmpeg In-Reply-To: <39A4CBE0-F701-49A0-A358-CE9671D9F051@viddiga.com> References: <39A4CBE0-F701-49A0-A358-CE9671D9F051@viddiga.com> Message-ID: At a minimum, you will probably need to add the MPEG start code 0x00000001 to each frame before passing it to your decoder. See H264VideoFileSink::afterGettingFrame1() for an example of this. You may or may not need to deal with parsing the SProp string depending on whether your H.264 stream contains SPS/PPS NAL units. The mailing list has past discussions on this topic. While configuring and using FFmpeg is outside the scope of this list, your primary decode function will probably be avcodec_decode_video2() from libavcodec. -Barry On Fri, Mar 2, 2012 at 2:10 AM, David Scravaglieri wrote: > Hi, > > Well I am really stuck. 
> > My project is to dump Images from a live H264 stream using live555 and > ffmpeg. > > I started with the testRTSPClient prog and in the afterGettingFrame method > I want to send the fReceiveBuffer to a class that will decode each frames > with ffmpeg and dump all images to a directory. But I can't find any clear > sample code or clear HowTo method in order to do that. > > I have read some posts in the mailinglist. But it's not clear. > How to create a suitable ffmpeg codec context for consuming the > fReceiveBuffer ? > What about using parseSPropParameterSets ? > Do I have to use ffmpeg av_parser_parse2 method ? > > Any help would be appreciated. > > Thanks in advance, > David. > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Mar 2 09:08:09 2012 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 2 Mar 2012 09:08:09 -0800 Subject: [Live-devel] Questions about project to dump Images frames from H264 stream using live555 and ffmpeg In-Reply-To: <39A4CBE0-F701-49A0-A358-CE9671D9F051@viddiga.com> References: <39A4CBE0-F701-49A0-A358-CE9671D9F051@viddiga.com> Message-ID: <54CC000C-9F5A-45A7-85FA-E175A007A7A1@live555.com> > What about using parseSPropParameterSets ? Yes. You should take the SDP 'configuration' string (from "MediaSubsession::fmtp_spropparametersets()"), and parse this string into a set of SPS and PPS NAL units, using the function "parseSPropParameterSets()". You should then insert these NAL units into your decoder (before the NAL units that come from the RTP stream). I'll let other people answer with specific tips about "ffmpeg" (because I don't use this, and it's not our software). 
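[Editor's note: the sprop-parameter-sets string that Ross refers to is simply a comma-separated list of Base64-encoded NAL units, which is the decoding job that live555's parseSPropParameterSets() performs. As background, here is a minimal, self-contained sketch of that step in plain C++ with no live555 dependency; the helper names decodeSProp and base64Decode are mine, not part of any library.]

```cpp
#include <cstdint>
#include <string>
#include <sstream>
#include <vector>

// Decode one Base64 record into raw bytes (stops at '=' padding).
static std::vector<uint8_t> base64Decode(const std::string& s) {
    static const std::string tbl =
        "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";
    std::vector<uint8_t> out;
    int val = 0, bits = 0;
    for (char c : s) {
        if (c == '=') break;                     // padding: done
        auto idx = tbl.find(c);
        if (idx == std::string::npos) continue;  // skip stray characters
        val = (val << 6) | static_cast<int>(idx);
        bits += 6;
        if (bits >= 8) {
            bits -= 8;
            out.push_back(static_cast<uint8_t>((val >> bits) & 0xFF));
        }
    }
    return out;
}

// Split a comma-separated sprop-parameter-sets string and decode each
// record -- the same job live555's parseSPropParameterSets() does.
std::vector<std::vector<uint8_t>> decodeSProp(const std::string& sprop) {
    std::vector<std::vector<uint8_t>> records;
    std::stringstream ss(sprop);
    std::string record;
    while (std::getline(ss, record, ',')) {
        if (!record.empty()) records.push_back(base64Decode(record));
    }
    return records;
}
```

[For example, the string "Z0IAH6aAyBLk,aM4wpIA=" (the sprop value that appears in an SDP later in this thread) decodes to two records whose first bytes are 0x67 (an SPS NAL unit, type 7) and 0x68 (a PPS NAL unit, type 8). In a real client you would of course call parseSPropParameterSets() rather than hand-rolling this.]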
However, I've heard that you need to prepend each NAL unit (*including* the SPS and PPS NAL units that you get from calling "parseSPropParameterSets()") with 0x00 0x00 0x00 0x01, before feeding them to the "ffmpeg" H.264 decoding function. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From ricardo at kafemeeting.com Fri Mar 2 09:21:05 2012 From: ricardo at kafemeeting.com (Ricardo Acosta) Date: Fri, 2 Mar 2012 18:21:05 +0100 Subject: [Live-devel] RTCP functions when using BasicUDPSource In-Reply-To: <939658E7-96A8-4C8D-B6CD-0ABE8050926C@live555.com> References: <939658E7-96A8-4C8D-B6CD-0ABE8050926C@live555.com> Message-ID: Hi Ross, With your email, I found how to do it Thank you ! Ricardo On Wed, Feb 29, 2012 at 4:00 PM, Ross Finlayson wrote: > I would like to know what is the best way to get some of the RTCP info > when using UDP in the server side. > > Server side : we are using BasicUDPSource and StreamReplicator to send > replicas towards the client apps. > > > So, is your server's input data RTP/UDP, or raw-UDP? I.e., is your > intention to: > 1/ Make a direct copy of incoming RTP/UDP packets into outgoing RTP/UDP > packets (i.e., keeping the RTP headers exactly the same), or > 2/ Copy data from incoming raw-UDP packets (which are *not* RTP packets) > into outgoing RTP/UDP packets? > > If you're trying to do 1/ (a simple 'UDP relay'), then you should also be > copying the RTCP stream (that comes from the same source as the input RTP > stream). Note that this RTCP stream will (normally) be using the RTP > stream's port number +1; and you should do the same for the output > ('relayed') RTCP packets. And, ideally, you should also 'relay' RTCP > packets from the receiver back to the original source (i.e., also set up a > 'relay' for RTCP packets that come in the reverse direction). 
> > But if you're trying to do 2/ (a 'raw-UDP-to-RTP relay'), then your server > should be using an appropriate "RTPSink" subclass, *not* a "BasicUDPSink". > And then you should also be creating a "RTCPInstance", tied to this > "RTPSink". And once again, the output "RTPSink" should use an > even-numbered port, and the corresponding "RTCPInstance" should use that > port number +1 (i.e., odd-numbered). > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ds at viddiga.com Fri Mar 2 09:23:18 2012 From: ds at viddiga.com (David Scravaglieri) Date: Fri, 2 Mar 2012 18:23:18 +0100 Subject: [Live-devel] Questions about project to dump Images frames from H264 stream using live555 and ffmpeg In-Reply-To: References: <39A4CBE0-F701-49A0-A358-CE9671D9F051@viddiga.com> Message-ID: Thank you Barry, I will try. David. On 2 Mar 2012, at 18:05, Barry Stump wrote: > At a minimum, you will probably need to add the MPEG start code 0x00000001 to each frame before passing it to your decoder. See H264VideoFileSink::afterGettingFrame1() for an example of this. > > You may or may not need to deal with parsing the SProp string depending on whether your H.264 stream contains SPS/PPS NAL units. The mailing list has past discussions on this topic. > > While configuring and using FFmpeg is outside the scope of this list, your primary decode function will probably be avcodec_decode_video2() from libavcodec. > > -Barry > > On Fri, Mar 2, 2012 at 2:10 AM, David Scravaglieri wrote: > Hi, > > Well I am really stuck. > > My project is to dump Images from a live H264 stream using live555 and > ffmpeg. > > I started with the testRTSPClient prog and in the afterGettingFrame method > I want to send the fReceiveBuffer to a class that will decode each frames > with ffmpeg and dump all images to a directory. But I can't find any clear > sample code or clear HowTo method in order to do that. > > I have read some posts in the mailinglist. But it's not clear. 
> How to create a suitable ffmpeg codec context for consuming the fReceiveBuffer ? > What about using parseSPropParameterSets ? > Do I have to use ffmpeg av_parser_parse2 method ? > > Any help would be appreciated. > > Thanks in advance, > David. > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From ds at viddiga.com Fri Mar 2 10:08:37 2012 From: ds at viddiga.com (David Scravaglieri) Date: Fri, 2 Mar 2012 19:08:37 +0100 Subject: [Live-devel] Questions about project to dump Images frames from H264 stream using live555 and ffmpeg In-Reply-To: <54CC000C-9F5A-45A7-85FA-E175A007A7A1@live555.com> References: <39A4CBE0-F701-49A0-A358-CE9671D9F051@viddiga.com> <54CC000C-9F5A-45A7-85FA-E175A007A7A1@live555.com> Message-ID: I have added the following lines to the "afterGettingFrame" method:

unsigned numSPropRecords;
SPropRecord* sPropRecords = parseSPropParameterSets(fSubsession.fmtp_spropparametersets(), numSPropRecords);

numSPropRecords is always 0, and sPropRecords is always NULL. For information, my class inherits from MediaSink. David. On 2 Mar 2012, at 18:08, Ross Finlayson wrote: >> What about using parseSPropParameterSets ? > > Yes. You should take the SDP 'configuration' string (from "MediaSubsession::fmtp_spropparametersets()"), and parse this string into a set of SPS and PPS NAL units, using the function "parseSPropParameterSets()". You should then insert these NAL units into your decoder (before the NAL units that come from the RTP stream). > > I'll let other people answer with specific tips about "ffmpeg" (because I don't use this, and it's not our software). 
However, I've heard that you need to prepend each NAL unit (*including* the SPS and PPS NAL units that you get from calling "parseSPropParameterSets()") with 0x00 0x00 0x00 0x01, before feeding them to the "ffmpeg" H.264 decoding function. > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From brado at bighillsoftware.com Fri Mar 2 10:13:53 2012 From: brado at bighillsoftware.com (Brad O'Hearne) Date: Fri, 2 Mar 2012 11:13:53 -0700 Subject: [Live-devel] Questions about project to dump Images frames from H264 stream using live555 and ffmpeg In-Reply-To: References: <39A4CBE0-F701-49A0-A358-CE9671D9F051@viddiga.com> Message-ID: Sounds like a few of us ought to get together and conjure up a sample for this use case -- it seems this use case is becoming increasingly common, or at least it ought to be, given that there's no native real-time video option on either iOS or Android. Brad Brad O'Hearne Founder / Lead Developer Big Hill Software LLC http://www.bighillsoftware.com On Mar 2, 2012, at 10:23 AM, David Scravaglieri wrote: > Thank you Barry I will try. > > David. > > On 2 Mar 2012, at 18:05, Barry Stump wrote: > >> At a minimum, you will probably need to add the MPEG start code 0x00000001 to each frame before passing it to your decoder. See H264VideoFileSink::afterGettingFrame1() for an example of this. >> >> You may or may not need to deal with parsing the SProp string depending on whether your H.264 stream contains SPS/PPS NAL units. The mailing list has past discussions on this topic. >> >> While configuring and using FFmpeg is outside the scope of this list, your primary decode function will probably be avcodec_decode_video2() from libavcodec. 
>> >> -Barry >> >> On Fri, Mar 2, 2012 at 2:10 AM, David Scravaglieri wrote: >> Hi, >> >> Well I am really stuck. >> >> My project is to dump Images from a live H264 stream using live555 and ffmpeg. >> >> I started with the testRTSPClient prog and in the afterGettingFrame method I want to send the fReceiveBuffer to a class that will decode each frames with ffmpeg and dump all images to a directory. But I can't find any clear sample code or clear HowTo method in order to do that. >> >> I have read some posts in the mailinglist. But it's not clear. >> How to create a suitable ffmpeg codec context for consuming the fReceiveBuffer ? >> What about using parseSPropParameterSets ? >> Do I have to use ffmpeg av_parser_parse2 method ? >> >> Any help would be appreciated. >> >> Thanks in advance, >> David. >> _______________________________________________ >> live-devel mailing list >> live-devel at lists.live555.com >> http://lists.live555.com/mailman/listinfo/live-devel >> >> _______________________________________________ >> live-devel mailing list >> live-devel at lists.live555.com >> http://lists.live555.com/mailman/listinfo/live-devel > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Mar 2 12:14:33 2012 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 2 Mar 2012 12:14:33 -0800 Subject: [Live-devel] Questions about project to dump Images frames from H264 stream using live555 and ffmpeg In-Reply-To: References: <39A4CBE0-F701-49A0-A358-CE9671D9F051@viddiga.com> <54CC000C-9F5A-45A7-85FA-E175A007A7A1@live555.com> Message-ID: > I have added the following lines into "afterGettingFrame" method. 
> > unsigned numSPropRecords; > SPropRecord* sPropRecords = parseSPropParameterSets(fSubsession.fmtp_spropparametersets(), numSPropRecords); > > numSPropRecords is always 0 > and sPropsRecords is always NULL That's probably because "fmtp_spropparametersets()" returned NULL, which suggests that the stream's SDP description - for some reason - did not contain a proper "a=fmtp: ..." line. To help figure out what's going wrong, please let us know the stream's SDP description, and also the result of calling "fmtp_spropparametersets()". Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From ds at viddiga.com Fri Mar 2 13:27:35 2012 From: ds at viddiga.com (David Scravaglieri) Date: Fri, 2 Mar 2012 22:27:35 +0100 Subject: [Live-devel] Questions about project to dump Images frames from H264 stream using live555 and ffmpeg In-Reply-To: References: <39A4CBE0-F701-49A0-A358-CE9671D9F051@viddiga.com> <54CC000C-9F5A-45A7-85FA-E175A007A7A1@live555.com> Message-ID: <81287A79-4076-4987-BF3B-47E11AC1CAC0@viddiga.com> If I add the following code:

const char* pSpropParamSets = fSubsession.fmtp_spropparametersets();
if (pSpropParamSets != NULL)
    cout << "pSpropParamSets = " << pSpropParamSets << endl;
else
    cout << "pSpropParamSets = NULL" << endl;

const char* pSavedSDPLines = fSubsession.savedSDPLines();
if (pSavedSDPLines != NULL)
    cout << "pSavedSDPLines = " << pSavedSDPLines << endl;
else
    cout << "pSavedSDPLines = NULL" << endl;

the output is:

pSpropParamSets = NULL
pSavedSDPLines = m=video 0 RTP/AVP 33
a=control:rtsp://mafreebox.freebox.fr/fbxtv_pub/stream?namespace=1&service=620&flavour=ld

On 2 Mar 2012, at 21:14, Ross Finlayson wrote: >> I have added the following lines into "afterGettingFrame" method. 
>> >> unsigned numSPropRecords; >> SPropRecord* sPropRecords = parseSPropParameterSets(fSubsession.fmtp_spropparametersets(), numSPropRecords); >> >> numSPropRecords is always 0 >> and sPropsRecords is always NULL > > That's probably because "fmtp_spropparametersets()" returned NULL, which suggests that the stream's SDP description - for some reason - did not contain a proper "a=fmtp: ..." line. To help figure out what's going wrong, please let us know the stream's SDP description, and also the result of calling "fmtp_spropparametersets()". > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Mar 2 13:39:17 2012 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 2 Mar 2012 13:39:17 -0800 Subject: [Live-devel] Questions about project to dump Images frames from H264 stream using live555 and ffmpeg In-Reply-To: <81287A79-4076-4987-BF3B-47E11AC1CAC0@viddiga.com> References: <39A4CBE0-F701-49A0-A358-CE9671D9F051@viddiga.com> <54CC000C-9F5A-45A7-85FA-E175A007A7A1@live555.com> <81287A79-4076-4987-BF3B-47E11AC1CAC0@viddiga.com> Message-ID: No, please let us know the *complete* SDP description - the string that you passed to "MediaSession::createNew()". Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Fri Mar 2 13:42:32 2012 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 2 Mar 2012 13:42:32 -0800 Subject: [Live-devel] Questions about project to dump Images frames from H264 stream using live555 and ffmpeg In-Reply-To: <81287A79-4076-4987-BF3B-47E11AC1CAC0@viddiga.com> References: <39A4CBE0-F701-49A0-A358-CE9671D9F051@viddiga.com> <54CC000C-9F5A-45A7-85FA-E175A007A7A1@live555.com> <81287A79-4076-4987-BF3B-47E11AC1CAC0@viddiga.com> Message-ID: <56639D41-7E76-4733-B238-F3AE493A8260@live555.com> > pSavedSDPLines = m=video 0 RTP/AVP 33 Whoa! Hold on - this line tells me that your input stream is *not* a H.264 RTP stream. Instead, it's a MPEG Transport Stream (that may, or may not, contain H.264, or MPEG-4, or MPEG-2, or a number of other things). This is your problem: You're treating your input stream as if it were a H.264 stream - but it's not. Because your data is MPEG Transport stream, the way in which you decode it is completely different. (You don't deal with NAL units, or anything like that, because that's all hidden within the Transport Stream.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From ds at viddiga.com Fri Mar 2 13:54:43 2012 From: ds at viddiga.com (David Scravaglieri) Date: Fri, 2 Mar 2012 22:54:43 +0100 Subject: [Live-devel] Questions about project to dump Images frames from H264 stream using live555 and ffmpeg In-Reply-To: References: <39A4CBE0-F701-49A0-A358-CE9671D9F051@viddiga.com> <54CC000C-9F5A-45A7-85FA-E175A007A7A1@live555.com> <81287A79-4076-4987-BF3B-47E11AC1CAC0@viddiga.com> Message-ID: <29C1BEE8-3E6D-4395-823C-EECB7E9E1A53@viddiga.com> The SDP description is:

v=0
o=leCDN 1330725225 1330725225 IN IP4 kapoueh.proxad.net
s=unknown
i=unknown
c=IN IP4 0.0.0.0
t=0 0
m=video 0 RTP/AVP 33
a=control:rtsp://mafreebox.freebox.fr/fbxtv_pub/stream?namespace=1&service=620&flavour=ld

On 2 Mar 2012, at 22:39, Ross Finlayson wrote: > No, please let us know the *complete* SDP description - the string that you passed to "MediaSession::createNew()". > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From ds at viddiga.com Fri Mar 2 14:06:23 2012 From: ds at viddiga.com (David Scravaglieri) Date: Fri, 2 Mar 2012 23:06:23 +0100 Subject: [Live-devel] Questions about project to dump Images frames from H264 stream using live555 and ffmpeg In-Reply-To: <56639D41-7E76-4733-B238-F3AE493A8260@live555.com> References: <39A4CBE0-F701-49A0-A358-CE9671D9F051@viddiga.com> <54CC000C-9F5A-45A7-85FA-E175A007A7A1@live555.com> <81287A79-4076-4987-BF3B-47E11AC1CAC0@viddiga.com> <56639D41-7E76-4733-B238-F3AE493A8260@live555.com> Message-ID: So what do I have to do now? My VLC player can read this stream, and the stream info says it's H264. My understanding is that I must check how to handle this with ffmpeg. Am I wrong? On 2 Mar 2012, at 22:42, Ross Finlayson wrote: >> pSavedSDPLines = m=video 0 RTP/AVP 33 > > Whoa! Hold on - this line tells me that your input stream is *not* a H.264 RTP stream. Instead, it's a MPEG Transport Stream (that may, or may not, contain H.264, or MPEG-4, or MPEG-2, or a number of other things). > > This is your problem: You're treating your input stream as if it were a H.264 stream - but it's not. Because your data is MPEG Transport stream, the way in which you decode it is completely different. (You don't deal with NAL units, or anything like that, because that's all hidden within the Transport Stream.) > > > Ross Finlayson > Live Networks, Inc. 
> http://www.live555.com/ > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Mar 2 14:08:45 2012 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 2 Mar 2012 14:08:45 -0800 Subject: [Live-devel] Questions about project to dump Images frames from H264 stream using live555 and ffmpeg In-Reply-To: References: <39A4CBE0-F701-49A0-A358-CE9671D9F051@viddiga.com> <54CC000C-9F5A-45A7-85FA-E175A007A7A1@live555.com> <81287A79-4076-4987-BF3B-47E11AC1CAC0@viddiga.com> <56639D41-7E76-4733-B238-F3AE493A8260@live555.com> Message-ID: <7D6437EE-EBB6-43B7-A1E8-2D85DAC10B10@live555.com> > So what do I have to do now ? You do whatever you need to do to decode a MPEG Transport Stream. But (because we don't do decoding or encoding) that's off-topic for this mailing list. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From ds at viddiga.com Fri Mar 2 14:25:57 2012 From: ds at viddiga.com (David Scravaglieri) Date: Fri, 2 Mar 2012 23:25:57 +0100 Subject: [Live-devel] Questions about project to dump Images frames from H264 stream using live555 and ffmpeg In-Reply-To: <7D6437EE-EBB6-43B7-A1E8-2D85DAC10B10@live555.com> References: <39A4CBE0-F701-49A0-A358-CE9671D9F051@viddiga.com> <54CC000C-9F5A-45A7-85FA-E175A007A7A1@live555.com> <81287A79-4076-4987-BF3B-47E11AC1CAC0@viddiga.com> <56639D41-7E76-4733-B238-F3AE493A8260@live555.com> <7D6437EE-EBB6-43B7-A1E8-2D85DAC10B10@live555.com> Message-ID: <769511C7-66AF-4B39-9148-D793C1DB114F@viddiga.com> OK, thank you Ross. David. On 2 Mar 2012, at 23:08, Ross Finlayson wrote: >> So what do I have to do now ? > > You do whatever you need to do to decode a MPEG Transport Stream. 
But (because we don't do decoding or encoding) that's off-topic for this mailing list. > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From Layne.Berge.1 at my.ndsu.edu Sat Mar 3 19:29:36 2012 From: Layne.Berge.1 at my.ndsu.edu (Layne Berge) Date: Sun, 4 Mar 2012 03:29:36 +0000 Subject: [Live-devel] openRTSP Test Program Fails to Export a File In-Reply-To: <7EE52B71-E0B0-4F36-ABB3-18F5D47D153C@live555.com> References: <97C2C9BC-7CED-4F6D-9344-E887E50E44EC@my.ndsu.edu> <7EE52B71-E0B0-4F36-ABB3-18F5D47D153C@live555.com> Message-ID: <4509644E-3C70-4FEE-B6D3-979428173A78@my.ndsu.edu> >> Is it possible to not specify a frame-rate? > > Of course. However - due to the limitations of the ".mov"/".mp4" file format - if you omit this parameter (or if the frame rate varies), it's unlikely that you'll end up with a file that you'll be able to play. Thanks to your help, I have a working client to record the live stream. However, I haven't found a way to make the camera I'm using as a source output a constant frame rate; it keeps varying because of network traffic. As a solution, I want to look at the time between received frames. If it is greater than the time corresponding to a set frame-rate, say 30 fps, I will simply insert the previous frame 'n' times in order to make up the time difference. As for non-integer values of 'n', I'll just keep track of the residual and insert another when it becomes > 1. The question I have for you is whether this is a good idea, or is there a better method you can think of? Any advice is appreciated. 
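[Editor's note: Layne's residual-tracking idea above can be sketched as a small, self-contained helper. This is a sketch only; the FrameRepeater name is mine and not part of live555 or openRTSP.]

```cpp
// Decide how many copies of the current frame to emit, given the time gap
// since the previous frame, so that output approximates a constant frame
// rate. A fractional "residual" of frame slots is carried forward so that
// non-integer gaps average out correctly over time.
class FrameRepeater {
public:
    explicit FrameRepeater(double targetFps) : interval_(1.0 / targetFps) {}

    // gapSeconds: measured time since the previous received frame.
    // Returns the total number of times the current frame should be written.
    int copiesFor(double gapSeconds) {
        residual_ += gapSeconds / interval_;  // gap measured in frame slots
        int n = static_cast<int>(residual_);  // whole slots owed
        if (n < 1) n = 1;                     // always emit the frame once
        residual_ -= n;                       // carry the fraction forward
        return n;
    }

private:
    double interval_;        // seconds per output frame
    double residual_ = 0.0;  // fractional frame slots not yet emitted
};
```

[For example, at a 4 fps target (0.25 s per frame), a gap of 0.625 s owes 2.5 frame slots: the frame is written twice and the 0.5-slot remainder is carried into the next gap, exactly the bookkeeping Layne describes. A negative residual after a shorter-than-expected gap likewise means fewer repeats later.]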
From brado at bighillsoftware.com Mon Mar 5 10:18:37 2012 From: brado at bighillsoftware.com (Brad O'Hearne) Date: Mon, 5 Mar 2012 11:18:37 -0700 Subject: [Live-devel] Questions about project to dump Images frames from H264 stream using live555 and ffmpeg In-Reply-To: <769511C7-66AF-4B39-9148-D793C1DB114F@viddiga.com> References: <39A4CBE0-F701-49A0-A358-CE9671D9F051@viddiga.com> <54CC000C-9F5A-45A7-85FA-E175A007A7A1@live555.com> <81287A79-4076-4987-BF3B-47E11AC1CAC0@viddiga.com> <56639D41-7E76-4733-B238-F3AE493A8260@live555.com> <7D6437EE-EBB6-43B7-A1E8-2D85DAC10B10@live555.com> <769511C7-66AF-4B39-9148-D793C1DB114F@viddiga.com> Message-ID: David -- If you don't mind, I'm going to piggy-back on your thread -- I have the exact same use case (H.264 stream over RTSP > Live 555 > ffmpeg), and it would appear maybe a very similar problem (though I have a different SDP description). To all -- I have implemented a MediaSink subclass (based on an adaptation of the DummySink in the testRTSPClient sample), which takes the receive buffer and hands it off to ffmpeg to decode the frame. Note that I've received varying input on whether you need to prepend a start code to the frame first -- I've tried it both with and without, but with the same result. In my case, the problem is that the decoding step in ffmpeg doesn't return any bytes, and logs the following message to the console: [h264 @ 0xe2dbc00] no frame! I know this isn't an ffmpeg support forum, but I'm not seeking ffmpeg support with this question -- I'm merely mentioning it to give a context for the problem. My suspicions are that the problem is in the handling taking place in Live555, not in ffmpeg -- that the decode failure is merely a downstream symptom of a problem handling the stream. A few questions: 1. What is the exact nature of the data in the receive buffer at the time that the afterGettingFrame() method of the MediaSink subclass is called, when an H.264 stream from RTSP is in play? 2. 
When an H.264 stream from RTSP is in play, is there any massaging of data in the receive buffer that needs to take place prior to decoding an H.264 frame in ffmpeg? Does it need a start code or not, and if yes, does it need a start code every time, or just conditionally? 3. Ross, you seem to know your stuff, so I'd be very appreciative if you or anyone else could take a look at my SDP description and let me know if you see any red flags. My description is below: [URL:"rtsp://192.168.1.100:8080/test.sdp"]: Got a SDP description: v=0 o=- 13553521053896953521 13553521053896953521 IN IP4 boundary s=N/A u=http c=IN IP4 0.0.0.0 t=0 0 a=control:rtsp://192.168.1.100:8080/test.sdp m=video 0 RTP/AVP 96 a=rtpmap:96 H264/90000 a=fmtp:96 packetization-mode=1;profile-level-id=42001f;sprop-parameter-sets=Z0IAH6aAyBLk,aM4wpIA=; a=control:rtsp://192.168.1.100:8080/test.sdp/trackID=0 Thanks, Brad Brad O'Hearne Founder / Lead Developer Big Hill Software LLC http://www.bighillsoftware.com On Mar 2, 2012, at 3:25 PM, David Scravaglieri wrote: > Ok, > > Thank you Ross. > > David. > > Le 2 mars 2012 ? 23:08, Ross Finlayson a ?crit : > >>> So what do I have to do now ? >> >> You do whatever you need to do to decode a MPEG Transport Stream. But (because we don't do decoding or encoding) that's off-topic for this mailing list. >> >> Ross Finlayson >> Live Networks, Inc. >> http://www.live555.com/ >> >> _______________________________________________ >> live-devel mailing list >> live-devel at lists.live555.com >> http://lists.live555.com/mailman/listinfo/live-devel > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Mon Mar 5 11:10:51 2012 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 5 Mar 2012 11:10:51 -0800 Subject: [Live-devel] Questions about project to dump Images frames from H264 stream using live555 and ffmpeg In-Reply-To: References: <39A4CBE0-F701-49A0-A358-CE9671D9F051@viddiga.com> <54CC000C-9F5A-45A7-85FA-E175A007A7A1@live555.com> <81287A79-4076-4987-BF3B-47E11AC1CAC0@viddiga.com> <56639D41-7E76-4733-B238-F3AE493A8260@live555.com> <7D6437EE-EBB6-43B7-A1E8-2D85DAC10B10@live555.com> <769511C7-66AF-4B39-9148-D793C1DB114F@viddiga.com> Message-ID: > 1. What is the exact nature of the data in the receive buffer at the time that the afterGettingFrame() method of the MediaSink subclass is called, when an H.264 stream from RTSP is in play? It's a H.264 NAL unit. > 2. When an H.264 stream from RTSP is in play, is there any massaging of data in the receive buffer that needs to take place prior to decoding an H.264 frame in ffmpeg? Does it need a start code Yes, you will need to prepend 0x00 0x00 0x00 0x01 to each NAL unit before passing it to "ffmpeg" for decoding. You should also take the SDP 'configuration' string (from "MediaSubsession::fmtp_spropparametersets()"), and parse this string into a set of SPS and PPS NAL units, using the function "parseSPropParameterSets()". You should then insert these NAL units (with start codes) into your decoder (before the NAL units that come from the RTP stream). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From sambhav at saranyu.in Mon Mar 5 13:56:11 2012 From: sambhav at saranyu.in (Kumar Sambhav) Date: Tue, 6 Mar 2012 03:26:11 +0530 Subject: [Live-devel] Live555 server and VLC playback problem Message-ID: <62564685-F146-474B-A045-BAE715ADF7BE@saranyu.in> Hi, I have Live555 RTSP server running on a machine (Amazon EC2) that has a private IP address and an internet routable public IP address. 
When I try to play this stream using VLC (version 1.1.7) it says "live555 warning. no data received for 10s. Switching to TCP". One issue I found is that the SDP contained the private IP address: during SDP creation, getsockname() is called to get the IP address, and it was returning the private IP address. I modified this to use the public IP address. Even after this modification, I am still not able to play the stream via UDP; it waits for a timeout period and switches to TCP. From debug prints I see that the RTSP server is sending out UDP data, and there is no firewall on the machine running VLC. Any idea what could be going wrong? Regards, Sambhav -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Mar 5 14:24:16 2012 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 5 Mar 2012 14:24:16 -0800 Subject: [Live-devel] Live555 server and VLC playback problem In-Reply-To: <62564685-F146-474B-A045-BAE715ADF7BE@saranyu.in> References: <62564685-F146-474B-A045-BAE715ADF7BE@saranyu.in> Message-ID: > From debug prints I see that RTSP Server is sending out UDP data. > There is no firewall on the machine running VLC. Nonetheless, if RTP-over-UDP streaming doesn't work, but RTP-over-TCP streaming does work, then there is probably a firewall *somewhere* between the server and client that is blocking UDP. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From claudiotezzin at gmail.com Mon Mar 5 09:23:42 2012 From: claudiotezzin at gmail.com (Claudio Tezzin) Date: Mon, 5 Mar 2012 14:23:42 -0300 Subject: [Live-devel] OpenRTSP Pause and Resume(re-play) Issue In-Reply-To: References: Message-ID: Hi, I am trying to implement pause and resume in openRTSP, but I am getting a problem when I try to send the play command after pause. 
The PAUSE is working fine, and I read on the live555 web site that we need to call PLAY after a PAUSE with -1 as the "start" parameter, so the player will resume at the point where it stopped. That works, but when the video starts to play I don't have the audio anymore, and the video is no longer smooth. Do you have an idea about how to fix that issue? Is there any way to get the current position (in time) of the movie? Regards, -- Claudio Tezzin Jeremias -- +55 19 91992160 -------------- next part -------------- An HTML attachment was scrubbed... URL: From sambhav at saranyu.in Mon Mar 5 21:52:23 2012 From: sambhav at saranyu.in (Kumar Sambhav) Date: Tue, 6 Mar 2012 11:22:23 +0530 Subject: [Live-devel] Live555 server and VLC playback problem In-Reply-To: References: <62564685-F146-474B-A045-BAE715ADF7BE@saranyu.in> Message-ID: I updated to VLC version 2.0 and it is working fine with RTP over UDP. The problem exists only with older versions. FFplay has the same issue: it times out and switches to TCP. I also tested with QuickTime and gstreamer; RTP over UDP works fine with these clients. Since the behavior is client-specific, is the server missing something that is ignored by some clients but not by others? On Mar 6, 2012, at 3:54 AM, Ross Finlayson wrote: >> From debug prints I see that RTSP Server is sending out UDP data. >> There is no firewall on the machine running VLC. > > Nonetheless, if RTP-over-UDP streaming doesn't work, but RTP-over-TCP streaming does work, then there is probably a firewall *somewhere* between the server and client that is blocking UDP. > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed...
URL: From drollinson at logostech.net Tue Mar 6 11:18:16 2012 From: drollinson at logostech.net (Rollinson, Derek) Date: Tue, 6 Mar 2012 11:18:16 -0800 Subject: [Live-devel] Change sockets Message-ID: <8CD7A9204779214D9FDC255DE48B9521BCA23281@EXPMBX105-1.exch.logostech.net> What would it take, if possible, to swap out the current sockets used in live555? Say I wanted to use something like AMQP or ZeroMQ. If possible, what kind of effort are we talking about? Thanks, Derek R. -------------- next part -------------- An HTML attachment was scrubbed... URL: From brado at bighillsoftware.com Tue Mar 6 13:25:31 2012 From: brado at bighillsoftware.com (Brad O'Hearne) Date: Tue, 6 Mar 2012 14:25:31 -0700 Subject: [Live-devel] Questions about project to dump Images frames from H264 stream using live555 and ffmpeg In-Reply-To: References: <39A4CBE0-F701-49A0-A358-CE9671D9F051@viddiga.com> <54CC000C-9F5A-45A7-85FA-E175A007A7A1@live555.com> <81287A79-4076-4987-BF3B-47E11AC1CAC0@viddiga.com> <56639D41-7E76-4733-B238-F3AE493A8260@live555.com> <7D6437EE-EBB6-43B7-A1E8-2D85DAC10B10@live555.com> <769511C7-66AF-4B39-9148-D793C1DB114F@viddiga.com> Message-ID: <3E2C9033-7732-45C9-A92A-E9B02031C1D7@bighillsoftware.com> Ross, Barry, and Jon, I want to thank you for your assistance with your answers on this mailing list. I am far from complete with the app I am working on, and will certainly be posting more questions as development progresses, but I now have a working prototype moving real-time video across a network through Live555 and ffmpeg to display on an iOS device. Your help has been appreciated more than you know. I'm sure I'm not the first to suggest this, and I mean this entirely in the spirit of helpful feedback, but it would be really helpful to start making available some current documentation and addressing various use cases.
In particular, I believe the use case addressed by this thread (real-time H.264 to display, on mobile devices) is a very common use case, given that no native ability to do so exists in either iOS or Android, so I think it would be great to start seeing attention to these uses take shape. I don't know what the typical / desired approach is for making new info available... perhaps, for the least intrusion, a posting on my own blog might be the best route. Anyway, I wanted to say thanks again guys.... Sincerely, Brad Brad O'Hearne Founder / Lead Developer Big Hill Software LLC http://www.bighillsoftware.com (480) 280-1468 On Mar 5, 2012, at 12:10 PM, Ross Finlayson wrote: >> 1. What is the exact nature of the data in the receive buffer at the time that the afterGettingFrame() method of the MediaSink subclass is called, when an H.264 stream from RTSP is in play? > > It's a H.264 NAL unit. > > >> 2. When an H.264 stream from RTSP is in play, is there any massaging of data in the receive buffer that needs to take place prior to decoding an H.264 frame in ffmpeg? Does it need a start code > > Yes, you will need to prepend 0x00 0x00 0x00 0x01 to each NAL unit before passing it to "ffmpeg" for decoding. > > You should also take the SDP 'configuration' string (from "MediaSubsession::fmtp_spropparametersets()"), and parse this string into a set of SPS and PPS NAL units, using the function "parseSPropParameterSets()". You should then insert these NAL units (with start codes) into your decoder (before the NAL units that come from the RTP stream). > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed...
URL: From finlayson at live555.com Tue Mar 6 13:31:09 2012 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 6 Mar 2012 13:31:09 -0800 Subject: [Live-devel] Change sockets In-Reply-To: <8CD7A9204779214D9FDC255DE48B9521BCA23281@EXPMBX105-1.exch.logostech.net> References: <8CD7A9204779214D9FDC255DE48B9521BCA23281@EXPMBX105-1.exch.logostech.net> Message-ID: > What would it take, if possible, to swap out the current sockets used in live555. Say I wanted to use something like amqp or zeromq. If possible what kind of effort are we talking about? It would be an enormous undertaking, unfortunately. At some point, I plan to change the way that 'network interfaces' are treated by the code - making them 'source' and 'sink' objects, like the rest of the "liveMedia" code. This should make it easy for developers to plug in their own implementations of network interfaces (including 'simulated' network interfaces). (It's also how we'll get IPv6 support.) However, this too will be a major undertaking, requiring a major upheaval in the code (including possibly changing the way that buffering is done), and there is no ETA for this. Sorry. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From brado at bighillsoftware.com Tue Mar 6 14:08:36 2012 From: brado at bighillsoftware.com (Brad O'Hearne) Date: Tue, 6 Mar 2012 15:08:36 -0700 Subject: [Live-devel] MediaSink / Memory leak Message-ID: All, I have created a MediaSink subclass.
In the afterGettingFrame() method, I have the following code:

if (!fHaveWrittenFirstFrame) {
    // If we have PPS/SPS NAL units encoded in a "sprop parameter string", prepend these to the file:
    unsigned numSPropRecords;
    SPropRecord* sPropRecords = parseSPropParameterSets(fSPropParameterSetsStr, numSPropRecords);
    for (unsigned i = 0; i < numSPropRecords; ++i) {
        addData(start_code, 4, presentationTime);
        addData(sPropRecords[i].sPropBytes, sPropRecords[i].sPropLength, presentationTime);
    }
    delete[] sPropRecords;
    fHaveWrittenFirstFrame = True; // for next time
}

This particular code snippet was taken directly from the Live555 H264VideoFileSink class implementation of its afterGettingFrame() method, so the exact code and usage context should be identical. Anyway, I have taken my app and run it in Xcode, and this line pretty quickly crashes the app:

delete[] sPropRecords;

with the error that the "pointer being freed was not allocated". I then figured that there was something internal to parseSPropParameterSets which might conditionally allocate sPropRecords. So I wrapped the delete in a conditional:

if (sPropRecords)
{
    delete[] sPropRecords;
}

The error was the same, and I confirmed in the debugger that sPropRecords was indeed a valid pointer. I then deleted the offending delete line entirely, and ran the app. The app ran without crashing. However, when I profiled the app in Instruments, it reported a memory leak on this code line:

SPropRecord* sPropRecords = parseSPropParameterSets(fSPropParameterSetsStr, numSPropRecords);

...which is obviously because there's no associated delete. So I believe it reasonable to conclude that there's a memory leak within this entity. My question -- does anyone have any insight into what the memory leak is and how to correct it? Thanks, Brad Brad O'Hearne Founder / Lead Developer Big Hill Software LLC http://www.bighillsoftware.com -------------- next part -------------- An HTML attachment was scrubbed...
URL: From finlayson at live555.com Tue Mar 6 14:16:35 2012 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 6 Mar 2012 14:16:35 -0800 Subject: [Live-devel] MediaSink / Memory leak In-Reply-To: References: Message-ID: <9C1D6718-E9BD-45D3-B10B-9D7B70DEC4EA@live555.com> > This particular code snippet was taken directly from the Live555 H264VideoFileSink class implementation of its afterGettingFrame() method, so the exact code and usage context should be identical. > > Anyway, I have taken my app run it on Xcode, and this line pretty quickly crashes the app: > > delete[] sPropRecords; > > with the error that the "pointer being freed was not allocated". I don't understand why you're getting an error here; the code looks OK. > I then figured that there was something internal to the parseSPropParameterSets which might conditionally allocate sPropRecords. So I wrapped the delete in a conditional: > > if (sPropRecords) > { > delete[] sPropRecords; > } > > The error was the same FYI, in C++ "delete[]"ing (or "delete"ing) a NULL pointer is not an error (it has no effect). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From brado at bighillsoftware.com Tue Mar 6 14:31:47 2012 From: brado at bighillsoftware.com (Brad O'Hearne) Date: Tue, 6 Mar 2012 15:31:47 -0700 Subject: [Live-devel] MediaSink / Memory leak In-Reply-To: <9C1D6718-E9BD-45D3-B10B-9D7B70DEC4EA@live555.com> References: <9C1D6718-E9BD-45D3-B10B-9D7B70DEC4EA@live555.com> Message-ID: <3B430B79-97F7-4376-94EB-53523AC10F84@bighillsoftware.com> Ross, >> delete[] sPropRecords; >> >> with the error that the "pointer being freed was not allocated". > > I don't understand why you're getting an error here; the code looks OK. Yeah, I know the feeling! ;-) But it would seem more than an issue of style, it outright crashes the entire app... 
> FYI, in C++ "delete[]"ing (or "delete"ing) a NULL pointer is not an error (it has no effect). Yeah...I've been wondering if I'm dealing with some compiler nuance in part. I did that to see if it would have any marked effect on the outcome. The only way I can get a run that doesn't crash is to remove the >> delete[] sPropRecords; line entirely, and as a result, the memory.... Brad On Mar 6, 2012, at 3:16 PM, Ross Finlayson wrote: >> This particular code snippet was taken directly from the Live555 H264VideoFileSink class implementation of its afterGettingFrame() method, so the exact code and usage context should be identical. >> >> Anyway, I have taken my app run it on Xcode, and this line pretty quickly crashes the app: >> >> delete[] sPropRecords; >> >> with the error that the "pointer being freed was not allocated". > > I don't understand why you're getting an error here; the code looks OK. > > >> I then figured that there was something internal to the parseSPropParameterSets which might conditionally allocate sPropRecords. So I wrapped the delete in a conditional: >> >> if (sPropRecords) >> { >> delete[] sPropRecords; >> } >> >> The error was the same > > FYI, in C++ "delete[]"ing (or "delete"ing) a NULL pointer is not an error (it has no effect). > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From brado at bighillsoftware.com Tue Mar 6 15:41:07 2012 From: brado at bighillsoftware.com (Brad O'Hearne) Date: Tue, 6 Mar 2012 16:41:07 -0700 Subject: [Live-devel] MediaSink / Memory leak In-Reply-To: <3B430B79-97F7-4376-94EB-53523AC10F84@bighillsoftware.com> References: <9C1D6718-E9BD-45D3-B10B-9D7B70DEC4EA@live555.com> <3B430B79-97F7-4376-94EB-53523AC10F84@bighillsoftware.com> Message-ID: <0F1D20FF-15A1-4E51-BEF1-C6C03104C12B@bighillsoftware.com> Ross, I had another thought here -- and I open this to anyone else trying to use Live555 inside the scope of Xcode. I implemented my MediaSink subclass as a .cpp file (as a matter of fact, I had to do the same with my RTSPClient subclass), because if I attempted to use a .mm file (typical for compiling C++ in Xcode), I received the following compilation error (as a result of Live555 header includes): Boolean.hh: Typedef redefinition with different types ('unsigned int' vs 'unsigned char') This is presumably due to type definition conflicts between Objective-C boolean-related declarations and those declared in Boolean.hh. As I went through the Live555 source and consulted Googlepalooza's returned forum posts on the matter, I was a bit skittish about tweaking anything as fundamental as a base type definition without knowing in more detail the possible ramifications. I'm sure the answer is probably a simple one, but Ross, you seem to have insight into the entire code-base, so if you or anyone else can advise on how to address this so as to make Live555 play nice in Xcode with Objective-C, I'd greatly appreciate it. Bottom line, I'm wondering what effect this might be having on the runtime issue below. It is really the only environmental difference I can think of, other than the use of Apple's LLVM and LLDB.
Thanks, Brad Brad O'Hearne Founder / Lead Developer Big Hill Software LLC http://www.bighillsoftware.com On Mar 6, 2012, at 3:31 PM, Brad O'Hearne wrote: > Ross, > >>> delete[] sPropRecords; >>> >>> with the error that the "pointer being freed was not allocated". >> >> I don't understand why you're getting an error here; the code looks OK. > > Yeah, I know the feeling! ;-) But it would seem more than an issue of style, it outright crashes the entire app... > >> FYI, in C++ "delete[]"ing (or "delete"ing) a NULL pointer is not an error (it has no effect). > > Yeah...I've been wondering if I'm dealing with some compiler nuance in part. I did that to see if it would have any marked effect on the outcome. > > The only way I can get a run that doesn't crash is to remove the > >>> delete[] sPropRecords; > > > line entirely, and as a result, the memory.... > > Brad > > On Mar 6, 2012, at 3:16 PM, Ross Finlayson wrote: > >>> This particular code snippet was taken directly from the Live555 H264VideoFileSink class implementation of its afterGettingFrame() method, so the exact code and usage context should be identical. >>> >>> Anyway, I have taken my app run it on Xcode, and this line pretty quickly crashes the app: >>> >>> delete[] sPropRecords; >>> >>> with the error that the "pointer being freed was not allocated". >> >> I don't understand why you're getting an error here; the code looks OK. >> >> >>> I then figured that there was something internal to the parseSPropParameterSets which might conditionally allocate sPropRecords. So I wrapped the delete in a conditional: >>> >>> if (sPropRecords) >>> { >>> delete[] sPropRecords; >>> } >>> >>> The error was the same >> >> FYI, in C++ "delete[]"ing (or "delete"ing) a NULL pointer is not an error (it has no effect). >> >> Ross Finlayson >> Live Networks, Inc. 
>> http://www.live555.com/ >> >> _______________________________________________ >> live-devel mailing list >> live-devel at lists.live555.com >> http://lists.live555.com/mailman/listinfo/live-devel > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From hfsamb-cel at yahoo.com.br Tue Mar 6 09:26:40 2012 From: hfsamb-cel at yahoo.com.br (hfsamb-cel at yahoo.com.br) Date: Tue, 6 Mar 2012 09:26:40 -0800 (PST) Subject: [Live-devel] Help with RTP/RTCP relay implementation Message-ID: <1331054800.8422.YahooMailNeo@web113619.mail.gq1.yahoo.com> I am creating an RTP/UDP relay that encrypts raw MPEG2TS streams. On the incoming side I used an RTPSource plus an RTCPInstance in client mode and on the outgoing side I used an RTPSink and an RTCPInstance in server mode. The encrypting function is implemented in a subclass of framed filter which I use as a source to the RTPSink. I have tested the encryption implementation using a FileSink and it seems to be working fine. I have been using VLC to send the RTP stream to the relay but in the client the same video as the source is seen without encryption. I guess that the RTCP is being relayed but somehow the RTP is going straight from the source to the client. Is that a possible explanation? Am I missing something in my implementation? Best regards, Henrique -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Tue Mar 6 19:37:24 2012 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 6 Mar 2012 19:37:24 -0800 Subject: [Live-devel] Help with RTP/RTCP relay implementation In-Reply-To: <1331054800.8422.YahooMailNeo@web113619.mail.gq1.yahoo.com> References: <1331054800.8422.YahooMailNeo@web113619.mail.gq1.yahoo.com> Message-ID: <9F89BA3A-43CE-452B-9E05-153D6C438ECA@live555.com> > I am creating an RTP/UDP relay that encrypts raw MPEG2TS streams. On the incoming side I used an RTPSource plus an RTCPInstance in client mode and on the outgoing side I used an RTPSink and an RTCPInstance in server mode. > > The encrypting function is implemented in a subclass of framed filter which I use as a source to the RTPSink. I have tested the encryption implementation using a FileSink and it seems to be working fine. > > I have been using VLC to send the RTP stream to the relay but in the client the same video as the source is seen without encryption. > > I guess that the RTCP is being relayed but somehow the RTP is going straight from the source to the client. Is that a possible explanation? Am I missing something in my implementation? It's hard to say, but make sure that: 1/ You use separate "groupsock" objects for receiving and sending, and 2/ You use separate port numbers for receiving and sending. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From hfsamb-cel at yahoo.com.br Wed Mar 7 05:56:05 2012 From: hfsamb-cel at yahoo.com.br (hfsamb-cel at yahoo.com.br) Date: Wed, 7 Mar 2012 05:56:05 -0800 (PST) Subject: [Live-devel] Help with RTP/RTCP relay implementation In-Reply-To: <9F89BA3A-43CE-452B-9E05-153D6C438ECA@live555.com> References: <1331054800.8422.YahooMailNeo@web113619.mail.gq1.yahoo.com> <9F89BA3A-43CE-452B-9E05-153D6C438ECA@live555.com> Message-ID: <1331128565.24591.YahooMailNeo@web113601.mail.gq1.yahoo.com> Hi Ross, I think I found what the problem was. The receiver misses the initial packets that contain the PAT, and without the PAT the encrypting function never starts encrypting; it just copies the stream. I am using unicast mode only, and after some testing with the test tools I found out that: 1/ testMPEG2TransportReceiver must be started after testMPEG2TransportStreamer, otherwise the transfer will not begin 2/ If I first start testMPEG2TransportStreamer, the initial packets are lost, as the streaming is not blocked until the receiver is connected Is that expected? How can I change it? Regards, Henrique ________________________________ From: Ross Finlayson To: LIVE555 Streaming Media - development & use Sent: Wednesday, 7 March 2012 0:37 Subject: Re: [Live-devel] Help with RTP/RTCP relay implementation > I am creating an RTP/UDP relay that encrypts raw MPEG2TS streams. On the incoming side I used an RTPSource plus an RTCPInstance in client mode and on the outgoing side I used an RTPSink and an RTCPInstance in server mode. > The encrypting function is implemented in a subclass of framed filter which I use as a source to the RTPSink. I have tested the encryption implementation using a FileSink and it seems to be working fine. > I have been using VLC to send the RTP stream to the relay but in the client the same video as the source is seen without encryption. > I guess that the RTCP is being relayed but somehow the RTP is going straight from the source to the client.
Is that a possible explanation? Am I missing something in my implementation? > It's hard to say, but make sure that: 1/ You use separate "groupsock" objects for receiving and sending, and 2/ You use separate port numbers for receiving and sending. Ross Finlayson Live Networks, Inc. http://www.live555.com/ _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From brado at bighillsoftware.com Wed Mar 7 22:01:17 2012 From: brado at bighillsoftware.com (Brad O'Hearne) Date: Wed, 7 Mar 2012 23:01:17 -0700 Subject: [Live-devel] Boolean.hh type redefinition conflict when compiling in Xcode Message-ID: All, I have run into a conflict trying to link Live555 into a project within Xcode. Typically, the approach for referencing C++ code within an Objective-C source file is to name the implementation with a .mm extension. However, when I try to include various Live555 headers in a .mm file, I receive the following compilation error in Xcode: Boolean.hh: Typedef redefinition with different types ('unsigned int' vs 'unsigned char') This is presumably due to type definition conflicts between Objective-C boolean-related declarations and those declared in Boolean.hh. I am reluctant to change anything as fundamental as a base type definition without knowing in more detail the possible ramifications. I have been able to get around it by using a file extension of .cpp, but I suspect that is causing other problems elsewhere. I'd like to resolve this conflict so that I can reference Live555 headers within .mm files. If anyone can advise on how to effectively resolve the type declaration problem without detrimental effects, I would greatly appreciate it.
Thanks, Brad Brad O'Hearne Founder / Lead Developer Big Hill Software LLC http://www.bighillsoftware.com From nick at incension.com Thu Mar 8 04:36:57 2012 From: nick at incension.com (Nick Byrne) Date: Thu, 8 Mar 2012 12:36:57 +0000 Subject: [Live-devel] MPEG2TransportStreamMultiplexor::setProgramStreamMap Message-ID: Hi, I'm currently using MPEG2TransportStreamFromESSource successfully to generate a transport stream; however, I would like to mux additional programs into the stream, with additional PMT tables and PAT entries. Currently only one PMT is generated, with a program number of 1, which contains all the elementary stream PIDs. Whilst looking at what I needed to do, I noticed the setProgramStreamMap method and wondered if anyone has used it - I'm not clear on what exactly it's used for, or rather how it's supposed to be used; any insight appreciated. Thanks Nick -------------- next part -------------- An HTML attachment was scrubbed... URL: From brado at bighillsoftware.com Thu Mar 8 18:27:51 2012 From: brado at bighillsoftware.com (Brad O'Hearne) Date: Thu, 8 Mar 2012 19:27:51 -0700 Subject: [Live-devel] RTSPClient w/significant frame loss Message-ID: <10238C60-3AD3-41EE-9FEF-5519E1E9E749@bighillsoftware.com> Hello gang.... I am soldiering right along with my (now working) video processing pipeline, which constitutes pulling H.264 streaming video across the network via RTSP, and then decoding it using ffmpeg. The lion's share of my Live555 implementation resides in an RTSPClient subclass and a MediaSink subclass, pretty much standard in relation to the testRTSPClient example out there. So here's my latest hurdle. I have completely unhooked the ffmpeg side of the equation, and commented out all processing that was taking place in my MediaSink subclass's afterGettingFrame() method -- the only thing this method implementation is doing is logging NAL unit type counts to the console.
My purpose in doing this was to determine the frame rate being received by the RTSPClient, since I control the server side of this and know exactly what is crossing the wire (or more properly put, the wireless). Here's the configuration: - The server is pumping out 30 frames of video per second over a closed Wifi LAN (by closed, I mean there is only one client on the network, my app, in close proximity, so there should be next to no latency). More specifically, the server is sending 1 NAL unit of type 7, 1 NAL unit of type 8, and 1 NAL unit of type 5 (the actual data), and sending that sequence of NAL units continually. There are 30 frames a second (and when I say "frames" I am not talking about all NAL units, I'm talking about NAL unit type 5's, the actual video data). So, to my knowledge (as far as my subclasses are concerned), my RTSPClient should be getting close to 30fps. I'm not -- I'm getting around 9fps, so roughly 70% of the frames are being dropped. Again, my subclasses are doing zero processing, just counting NALs and occasionally logging the number to the console -- and there is no downstream decoding or processing of any kind taking place. My questions: 1. What factors / culprits could be causing this massive frame loss? 2. Is there anything in the base RTSPClient or MediaSink subclasses (or elsewhere) that can be configured to reduce frame loss? 3. Are there any other known factors that contribute to performance that could be acting on this use case? Thanks for your help!
Cheers, Brad Brad O'Hearne Founder / Lead Developer Big Hill Software LLC http://www.bighillsoftware.com From finlayson at live555.com Thu Mar 8 19:57:39 2012 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 8 Mar 2012 19:57:39 -0800 Subject: [Live-devel] MPEG2TransportStreamMultiplexor::setProgramStreamMap In-Reply-To: References: Message-ID: As you've discovered, our Transport Stream multiplexing code ("MPEG2TransportStreamMultiplexor.cpp") was intended primarily for creating a single-program Transport Stream (because this is primarily a 'streaming media' library, and a single-program Transport Stream is the kind that most people will want to stream). At first glance, I'm not sure what would be involved in making this support the creation of multi-program Transport Streams, but the changes might end up not being significant. As always, people are welcome to look into this code and suggest changes. In any case, the "setProgramStreamMap()" function is probably a 'red herring', because that function is used to *read* information from a 'Program Map Table' that happens to be in the input data; it's not used to *write* a Program Map Table. For that, the function "deliverPMTPacket()" is used. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From lideshi at gmail.com Thu Mar 8 16:55:32 2012 From: lideshi at gmail.com (Deshi li) Date: Thu, 8 Mar 2012 19:55:32 -0500 Subject: [Live-devel] openRTSP Question Message-ID: Hi everyone, I was wondering if someone could help me understand the technology behind cam.ly and dropCam. I wish to do something similar using openRTSP. Their basic idea is to store videos from an IP camera to the cloud. I was wondering what the best way of achieving this is. My idea was to use openRTSP to receive the RTSP stream the IP camera provides and save it to disk. So the IP camera would act as an RTSP server while openRTSP acts as a client.
But then I feel this method might not be reliable. If anyone has any knowledge on how to store IP camera videos to the cloud reliably, feel free to chime in. Thanks in advance, Jason -------------- next part -------------- An HTML attachment was scrubbed... URL: From jshanab at smartwire.com Sat Mar 10 06:34:24 2012 From: jshanab at smartwire.com (Jeff Shanab) Date: Sat, 10 Mar 2012 14:34:24 +0000 Subject: [Live-devel] Rtsp connection issues Message-ID: <615FD77639372542BF647F5EBAA2DBC20B1E04A2@IL-BOL-EXCH01.smartwire.com> I am using live555 in my project and am connecting to lots of cameras. I used to connect to all of the cameras before I rewrote to use the async interface. I closely modeled the playCommon.cpp used in openRTSP. For some reason about 25-50% of the cameras are now having connection issues, and I am trying to figure out what I have done wrong. It seems like it may be a latency or timing issue, as it goes quickly through the OPTIONS, DESCRIBE, SETUP, and PLAY, but never starts the background task for the incoming-data handler. By way of comparison, if I connect with VLC, it takes 15 secs or more to connect. If I use VLC on Linux, it connects in under 2 seconds. Any help would be greatly appreciated, thanks. I sent this last night from my phone to the wrong address :( I have been debugging and I see in wireshark the video is sent. I see from breakpoints that it is incrementing fNumActiveSourcesSinceLastReset, but then it resets a few seconds later. Connected by DROID on Verizon Wireless Jeff Shanab, Manager, Software Engineering D 630.633.4515 | C 630.453.7764 | F 630.633.4815 | jshanab at smartwire.com
-------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sat Mar 10 22:08:05 2012 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 10 Mar 2012 22:08:05 -0800 Subject: [Live-devel] RTSPClient w/significant frame loss In-Reply-To: <10238C60-3AD3-41EE-9FEF-5519E1E9E749@bighillsoftware.com> References: <10238C60-3AD3-41EE-9FEF-5519E1E9E749@bighillsoftware.com> Message-ID: > 1. What factors / culprits could be causing this massive frame loss? > > 2. Is there anything in the base RTSPClient or MediaSink subclasses (or elsewhere) that can be configured to reduce frame loss? See You should also, of course, make sure that your receiver ("MediaSink" subclass)'s buffer is big enough for the incoming NAL units. (You can do this by checking whether "numTruncatedBytes" is ever >0.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed...
From jshanab at smartwire.com Sun Mar 11 19:13:22 2012
From: jshanab at smartwire.com (Jeff Shanab)
Date: Mon, 12 Mar 2012 02:13:22 +0000
Subject: [Live-devel] Rtsp connection issues
In-Reply-To: <615FD77639372542BF647F5EBAA2DBC20B1E04A2@IL-BOL-EXCH01.smartwire.com>
References: <615FD77639372542BF647F5EBAA2DBC20B1E04A2@IL-BOL-EXCH01.smartwire.com>
Message-ID: <615FD77639372542BF647F5EBAA2DBC20B1E21DF@IL-BOL-EXCH01.smartwire.com>

I guess I had better answer this myself. After a bit of effort this turned out to be, I think, a NAT traversal issue that was very particular: it depended on camera model, ISP, and router brand. Trying openRTSP from 11.20.2011, when I changed to the async interface, also showed this error. But updating to 2012.02.29 fixed this and a few performance issues we did not realize were related. I read through the changelog, and it looks like there has been a great job done recently on multi-thread tolerance and NAT handling. Data is really flowing now!
From zhanm at join.net.cn Mon Mar 12 06:32:33 2012
From: zhanm at join.net.cn (Zhanming)
Date: Mon, 12 Mar 2012 21:32:33 +0800
Subject: [Live-devel] Shutdown stream question
Message-ID: <000001cd0054$9517f300$bf47d900$@net.cn>

Hi, my input source is a live source from a DVR. My question is: at my client side, I call shutdownStream() to notify my RTSP server to stop playing the stream. At the server side, where should I put my statement to stop the DVR before I delete the stream (i.e., I have to stop the DVR playing first, then delete the stream at the server side?): in the derived OnDemandServerSubSession class, or some other place? In my class derived from FramedSource, I keep the information about the DVR, so that when I shut down a live stream, I can use it to stop the DVR first. Any advice will be appreciated. Zhanming

From finlayson at live555.com Mon Mar 12 07:10:21 2012
From: finlayson at live555.com (Ross Finlayson)
Date: Mon, 12 Mar 2012 07:10:21 -0700
Subject: [Live-devel] Shutdown stream question
In-Reply-To: <000001cd0054$9517f300$bf47d900$@net.cn>
References: <000001cd0054$9517f300$bf47d900$@net.cn>
Message-ID:

> My input source is a live source from a DVR. My question is: at my client side, I call shutdownStream() to notify my RTSP server to stop playing the stream. At the server side, where should I put my statement to stop the DVR before I delete the stream?

You should do this in the destructor of your "FramedSource" subclass. (This destructor should get called automatically when the stream is closed.)

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
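[Editor's note: Ross's advice - release the DVR in the "FramedSource" subclass destructor - can be sketched as follows. The FramedSource base here is a stand-in stub (the real one lives in liveMedia), and the DVR handle and dvr_stop_playback() call are hypothetical, standing in for whatever DVR API the poster has.]

```cpp
#include <cstdio>

// Stand-in stub for liveMedia's FramedSource; only the virtual
// destructor matters for this sketch.
struct FramedSource {
    virtual ~FramedSource() {}
};

// Hypothetical DVR API: not part of live555, shown for illustration only.
struct DvrHandle { bool playing; };
static void dvr_stop_playback(DvrHandle* h) { h->playing = false; }

class DVRSource : public FramedSource {
public:
    explicit DVRSource(DvrHandle* dvr) : fDvr(dvr) {}

    // When the client tears down the stream, live555 closes the stream's
    // source chain, which runs this destructor - the right place to stop
    // DVR playback before the rest of the stream state is deleted.
    ~DVRSource() override {
        dvr_stop_playback(fDvr);
        std::fprintf(stderr, "DVR playback stopped\n");
    }

private:
    DvrHandle* fDvr; // DVR state kept in the source, as the poster describes
};
```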
From finlayson at live555.com Mon Mar 12 07:13:46 2012
From: finlayson at live555.com (Ross Finlayson)
Date: Mon, 12 Mar 2012 07:13:46 -0700
Subject: [Live-devel] Rtsp connection issues
In-Reply-To: <615FD77639372542BF647F5EBAA2DBC20B1E21DF@IL-BOL-EXCH01.smartwire.com>
References: <615FD77639372542BF647F5EBAA2DBC20B1E04A2@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B1E21DF@IL-BOL-EXCH01.smartwire.com>
Message-ID:

> I guess I had better answer this myself. After a bit of effort this turned out to be, I think, a NAT traversal issue that was very particular. It depended on camera model, ISP, and router brand.
>
> Trying openRTSP from 11.20.2011, when I changed to the async interface, also showed this error. But updating to 2012.02.29 fixed this and a few performance issues we did not realize were related.

Moral of the story (for everyone): If you're having problems with the code, ***the first thing*** you should do is make sure that you're using the latest version.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From naresh at vizexperts.com Mon Mar 12 10:48:07 2012
From: naresh at vizexperts.com (Naresh Sankapelly)
Date: Mon, 12 Mar 2012 23:18:07 +0530
Subject: [Live-devel] OpenRTSP application query
Message-ID: <000001cd0078$496f2a30$dc4d7e90$@com>

Hey All, I am using the openRTSP sample application provided with the LIVE555 Streaming Media library. I used the following command to stream from an RTSP server:

openRTSP.exe -d 10 rtsp://192.168.1.90/MediaInput/mpeg4

I was able to get two files, named video-MP4V-ES-1 and audio-G726-16-2. I'm hereby attaching the output file. But I'm not quite sure how to open them. Can anyone tell me how I can open them? What should I rename them to (because I tried various options, but in vain)? Thanks in advance Naresh Ph. 8884199804
-------------- next part --------------
An embedded and charset-unspecified text was scrubbed...
Name: rtsp_output.txt

From zhanm at join.net.cn Mon Mar 12 18:13:36 2012
From: zhanm at join.net.cn (Zhanming)
Date: Tue, 13 Mar 2012 09:13:36 +0800
Subject: [Live-devel] Re: Shutdown stream question
In-Reply-To:
References: <000001cd0054$9517f300$bf47d900$@net.cn>
Message-ID: <001801cd00b6$849704f0$8dc50ed0$@net.cn>

Thanks for your reply. I will try it.

From albert.staubin at gmail.com Mon Mar 12 07:37:39 2012
From: albert.staubin at gmail.com (AJ St. Aubin)
Date: Mon, 12 Mar 2012 10:37:39 -0400
Subject: [Live-devel] Live555 Media Server in 64 bit Windows 7
Message-ID:

I am attempting to build the live555 MediaServer on x64 Windows 7. I have built VS2008 projects to build the 64-bit libraries and they are compiling, but for some reason I have been unable to get the mediaServer to build with these new libraries. I have another executable that I have been working on that is a server using the live555 libraries.
It will build and run, but has some bugs with the 64-bit libs that it does not have with the 32-bit libs built with the makefile. I would like to use the media server built with the 64-bit libs to test the libs and make sure it is an error on my part and not something with the libs' build. I have not been able to find a 64-bit Windows makefile, so I have been forced to attempt to build the libs and executables in a VS project of my own. Has anyone had success on Windows with 64-bit live555? If so, how did you approach it, and have you seen issues similar to what I am seeing? Thank you in advance for your help.

From ashfaque at iwavesystems.com Mon Mar 12 23:58:06 2012
From: ashfaque at iwavesystems.com (Ashfaque)
Date: Tue, 13 Mar 2012 12:28:06 +0530
Subject: [Live-devel] RAM requirement - Live555
Message-ID: <47592C11BF8F4011AA36AF1C10A57290@iwns10>

Hi everyone, we are designing a system with a camera interface, streaming the video with Live555 over WiFi. The receiver will be another identical system with a WiFi module, displaying the video to a client PC. I want to know the minimum memory requirement for Live555. We have decided on the following memory layout with 64 MB of RAM:

Heap - 3 MB
Data section - 3 MB
Stack - 3 MB
Kernel binary - 5 MB
File system, libraries and executables - 50 MB

We need to stream to more than one client system at the same time. Is 64 MB of RAM sufficient? I had gone through the Live555 FAQs but did not find any answers. Could anyone please tell me the memory requirements?

Thanks & Regards, Ashfaque
From ashfaque at iwavesystems.com Tue Mar 13 01:13:37 2012
From: ashfaque at iwavesystems.com (Ashfaque)
Date: Tue, 13 Mar 2012 13:43:37 +0530
Subject: [Live-devel] UPnP support in Live555
Message-ID: <24C3794EB6E14B5491DBA1BB2A0169CD@iwns10>

Hello everyone, I need a UPnP-capable media server for our project. The client devices should be able to connect to the media server and register themselves for streaming. Does Live555 support UPnP? If not, what other options are available to add this feature? Are there other UPnP libraries we can use along with Live555, or would a UPnP-capable media server library be better than Live555 (I got to know about MediaTomb)? Live555 is able to support all our requirements except UPnP, which I am not sure of yet. Could someone please tell me how this feature can be supported? Any links to a library or sample source code would be highly appreciated. Thanks in advance.

Thanks & Regards, Ashfaque

From finlayson at live555.com Tue Mar 13 01:55:53 2012
From: finlayson at live555.com (Ross Finlayson)
Date: Tue, 13 Mar 2012 01:55:53 -0700
Subject: [Live-devel] UPnP support in Live555
In-Reply-To: <24C3794EB6E14B5491DBA1BB2A0169CD@iwns10>
References: <24C3794EB6E14B5491DBA1BB2A0169CD@iwns10>
Message-ID: <9DFC517E-F83C-4691-BEEC-C8B71C53E291@live555.com>

> Does Live555 support UPnP?

No.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
From finlayson at live555.com Tue Mar 13 12:30:39 2012
From: finlayson at live555.com (Ross Finlayson)
Date: Tue, 13 Mar 2012 12:30:39 -0700
Subject: [Live-devel] RAM requirement - Live555
In-Reply-To: <47592C11BF8F4011AA36AF1C10A57290@iwns10>
References: <47592C11BF8F4011AA36AF1C10A57290@iwns10>
Message-ID: <5B2CC542-9478-44B6-812E-AD678E8CFDF8@live555.com>

> We are designing a system with a camera interface, streaming the video with Live555 over WiFi. The receiver will be another identical system with a WiFi module, displaying the video to a client PC.
>
> I want to know the minimum memory requirement for Live555.

It's difficult, if not impossible, to answer questions like this, because the answer depends on so many things:
- What CPU architecture you're using
- What OS you're using
- What compiler you're using
- What other runtime libraries you're using
and, most importantly,
- What specifically you want to do with the LIVE555 code.

Realistically, the only thing you can do is compile and build one of the demo applications (or the "LIVE555 Media Server") - similar to an application that you want to build yourself - for your environment, and examine its memory usage yourself.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From naresh at vizexperts.com Tue Mar 13 13:17:04 2012
From: naresh at vizexperts.com (Naresh Sankapelly)
Date: Wed, 14 Mar 2012 01:47:04 +0530
Subject: [Live-devel] OpenRTSP application
Message-ID: <000901cd0156$430b4640$c921d2c0$@com>

Hi All, I'm using openRTSP for streaming the output of a Panasonic IP camera. I have to use it in a DirectShow application. Any clues on how I can do that? Thanks Naresh Ph. 8884199804
From Jeremiah.Morrill at econnect.tv Tue Mar 13 14:33:14 2012
From: Jeremiah.Morrill at econnect.tv (Jeremiah Morrill)
Date: Tue, 13 Mar 2012 21:33:14 +0000
Subject: [Live-devel] OpenRTSP application
In-Reply-To: <000901cd0156$430b4640$c921d2c0$@com>
References: <000901cd0156$430b4640$c921d2c0$@com>
Message-ID: <80C795F72B3CB241A9256DABF0A04EC5022F22F4@CH1PRD0710MB391.namprd07.prod.outlook.com>

You can probably use openRTSP as a good starting point for the code that does the RTSP, but it's a lot more work than just that. Here's an overview of how I've done this in the past.

1.) Create a DirectShow source filter (example in the Windows SDK: C:\Program Files\microsoft sdks\Windows\v7.1\Samples\multimedia\directshow\filters\pushsource)
2.) Have your source filter connect to an RTSP stream via something like openRTSP, and create output pins for each MediaSubsession you are interested in. Each sink will correlate to an output pin, and you will have to configure the output pins with the correct media types (major type, subtype, etc.).
3.) When you receive media samples, you need to deliver them via your output pins, stamped with the correct timestamp (timeval -> 100ns units), to the downstream filters connected.
4.) If you want to be able to open the filter via URL (e.g., from Windows Media Player, open something like myrtsp://server/path/etc), you want to implement IFileSourceFilter on your filter COM object.
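[Editor's note: the timestamp conversion in step 3 above is mechanical but easy to get wrong. A sketch, with REFERENCE_TIME typedef'd locally so it builds outside the Windows SDK; a real filter would additionally subtract the stream-start time so sample times begin near zero.]

```cpp
#include <cstdint>

// DirectShow reference time: 100-nanosecond units (normally from strmif.h).
typedef int64_t REFERENCE_TIME;

// Minimal stand-in for struct timeval (sys/time.h on POSIX).
struct TimeVal { long tv_sec; long tv_usec; };

// live555 presentation times are struct timeval (seconds + microseconds);
// DirectShow wants 100ns units, so: seconds * 10^7 + microseconds * 10.
REFERENCE_TIME toReferenceTime(const TimeVal& tv) {
    return static_cast<REFERENCE_TIME>(tv.tv_sec) * 10000000LL
         + static_cast<REFERENCE_TIME>(tv.tv_usec) * 10LL;
}
```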
From finlayson at live555.com Tue Mar 13 14:38:28 2012
From: finlayson at live555.com (Ross Finlayson)
Date: Tue, 13 Mar 2012 14:38:28 -0700
Subject: [Live-devel] OpenRTSP application
In-Reply-To: <80C795F72B3CB241A9256DABF0A04EC5022F22F4@CH1PRD0710MB391.namprd07.prod.outlook.com>
References: <000901cd0156$430b4640$c921d2c0$@com> <80C795F72B3CB241A9256DABF0A04EC5022F22F4@CH1PRD0710MB391.namprd07.prod.outlook.com>
Message-ID: <784E6DD6-B2EC-41E8-A8C7-7939DD4C89C3@live555.com>

> You can probably use openRTSP as a good starting point for the code that does the RTSP

FYI, the code for the new "testRTSPClient" demo application is probably a better model to use than "openRTSP" (which has lots of extra 'bells and whistles').

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From naresh at vizexperts.com Wed Mar 14 01:24:07 2012
From: naresh at vizexperts.com (Naresh Sankapelly)
Date: Wed, 14 Mar 2012 13:54:07 +0530
Subject: [Live-devel] OpenRTSP application
In-Reply-To: <80C795F72B3CB241A9256DABF0A04EC5022F22F4@CH1PRD0710MB391.namprd07.prod.outlook.com>
References: <000901cd0156$430b4640$c921d2c0$@com> <80C795F72B3CB241A9256DABF0A04EC5022F22F4@CH1PRD0710MB391.namprd07.prod.outlook.com>
Message-ID: <003701cd01bb$d3fae600$7bf0b200$@com>

Thanks a lot!

From rakesh.kumar at procubedinc.com Wed Mar 14 04:06:22 2012
From: rakesh.kumar at procubedinc.com (Rakesh Kumar)
Date: Wed, 14 Mar 2012 16:36:22 +0530
Subject: [Live-devel] [Information Needed]
In-Reply-To:
References:
Message-ID: <003f01cd01d2$7e0a0c00$7a1e2400$@kumar@procubedinc.com>

Hi All, I am a complete beginner to this type of application. I need to understand whether I can feed a live buffer to "testH264VideoStreamer.cpp" or not. If not, what is the alternative? I have a camera which gives me an encoded (H264) buffer, and I want to transmit it using RTSP. I think this is very basic, but sorry to say I do not know how to do it. If I write the encoded output to a file and then give the file to testH264VideoStreamer.cpp, it works fine. I do not know how to make it work at runtime.
~rakesh kumar

From finlayson at live555.com Wed Mar 14 10:13:34 2012
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 14 Mar 2012 10:13:34 -0700
Subject: [Live-devel] [Information Needed]
In-Reply-To: <003f01cd01d2$7e0a0c00$7a1e2400$@kumar@procubedinc.com>
References: <003f01cd01d2$7e0a0c00$7a1e2400$@kumar@procubedinc.com>
Message-ID: <3E2CD3BF-2B8F-402F-81D2-E8863C627568@live555.com>

> I am a complete beginner to this type of application.

Then it's especially important that you read the FAQ - as you were asked to do before posting to this mailing list.

> I need to understand whether I can feed a live buffer to "testH264VideoStreamer.cpp" or not.
> If not, what is the alternative?
>
> I have a camera which gives me an encoded (H264) buffer, and I want to transmit it using RTSP.
>
> I think this is very basic, but sorry to say I do not know how to do it.

There is an entry in the FAQ that specifically refers to this: http://www.live555.com/liveMedia/faq.html#liveInput

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From albert.staubin at gmail.com Wed Mar 14 08:26:15 2012
From: albert.staubin at gmail.com (AJ)
Date: Wed, 14 Mar 2012 11:26:15 -0400
Subject: [Live-devel] Load new trick play index files (.tsx) without restarting the server
Message-ID:

What would I need to do to have the index files (.tsx) load without restarting the server? Thank you in advance for your help.

From finlayson at live555.com Wed Mar 14 17:55:51 2012
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 14 Mar 2012 17:55:51 -0700
Subject: [Live-devel] Load new trick play index files (.tsx) without restarting the server
In-Reply-To:
References:
Message-ID:

> What would I need to do to have the index files (.tsx) load without restarting the server?

Are you using the "LIVE555 Media Server": ?
The "DynamicRTSPServer" class that it uses is set up so that files (including Transport Stream index files) are not read until the corresponding stream is first accessed. You won't need to 'restart the server' for this. The only time you would ever need to read an index file more than once is if the underlying Transport Stream file (and thus also its index file) changes without the file name changing. But I don't recommend that you do this; for starters, it wouldn't work if there is already a client streaming the old file.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From vigosslive at rambler.ru Wed Mar 14 22:26:05 2012
From: vigosslive at rambler.ru (Rustam)
Date: Thu, 15 Mar 2012 09:26:05 +0400
Subject: [Live-devel] VLC player (for RTP, RTSP sending) uses your libraries?
Message-ID: <192295353.1331789165.110687272.15849@mperl103.rambler.ru>

Is it true?

From finlayson at live555.com Wed Mar 14 22:55:34 2012
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 14 Mar 2012 22:55:34 -0700
Subject: [Live-devel] VLC player (for RTP, RTSP sending) uses your libraries?
In-Reply-To: <192295353.1331789165.110687272.15849@mperl103.rambler.ru>
References: <192295353.1331789165.110687272.15849@mperl103.rambler.ru>
Message-ID:

> Is it true?

It's true only when VLC is being used as a RTSP *client* - i.e., when VLC *receives* a RTSP/RTP stream. (When VLC is used as a RTSP server - i.e., sending RTSP/RTP streams - then it uses its own implementation, not our libraries.)

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
From Ivan.Roubicek at zld.cz Thu Mar 15 02:07:28 2012
From: Ivan.Roubicek at zld.cz (Ivan Roubíček)
Date: Thu, 15 Mar 2012 10:07:28 +0100
Subject: [Live-devel] Misordered frames with synchronized rtp source
Message-ID:

Hi, I have a problem with misordered frames when fSubsession.rtpSource()->hasBeenSynchronizedUsingRTCP() is true. How is that possible? Is it a bug, or should I reorder the frames myself again? My console output looks like this:

Stream "rtsp://192.168.0.25"; video/H264: Received 15227 bytes. Presentation time: 1331800902.812053
Stream "rtsp://192.168.0.25"; video/H264: Received 2477 bytes. Presentation time: 1331800902.775320 <--
Stream "rtsp://192.168.0.25"; video/H264: Received 15019 bytes. Presentation time: 1331800902.885520
Stream "rtsp://192.168.0.25"; video/H264: Received 2564 bytes. Presentation time: 1331800902.848787 <--
Stream "rtsp://192.168.0.25"; video/H264: Received 15093 bytes. Presentation time: 1331800902.958987
Stream "rtsp://192.168.0.25"; video/H264: Received 2522 bytes. Presentation time: 1331800902.922254 <--
Stream "rtsp://192.168.0.25"; video/H264: Received 15084 bytes. Presentation time: 1331800903.032454
Stream "rtsp://192.168.0.25"; video/H264: Received 2599 bytes. Presentation time: 1331800902.995721 <--
Stream "rtsp://192.168.0.25"; video/H264: Received 14441 bytes. Presentation time: 1331800903.105921
Stream "rtsp://192.168.0.25"; video/H264: Received 2419 bytes. Presentation time: 1331800903.069188 <--
Stream "rtsp://192.168.0.25"; video/H264: Received 15504 bytes. Presentation time: 1331800903.179388
Stream "rtsp://192.168.0.25"; video/H264: Received 2664 bytes. Presentation time: 1331800903.142655 <--
Stream "rtsp://192.168.0.25"; video/H264: Received 14927 bytes. Presentation time: 1331800903.252866
Stream "rtsp://192.168.0.25"; video/H264: Received 2443 bytes. Presentation time: 1331800903.216133 <--
Stream "rtsp://192.168.0.25"; video/H264: Received 15067 bytes. Presentation time: 1331800903.326333
Stream "rtsp://192.168.0.25"; video/H264: Received 2502 bytes. Presentation time: 1331800903.289600 <--
Stream "rtsp://192.168.0.25"; video/H264: Received 15082 bytes. Presentation time: 1331800903.399800
Stream "rtsp://192.168.0.25"; video/H264: Received 2516 bytes. Presentation time: 1331800903.363067 <--
Stream "rtsp://192.168.0.25"; video/H264: Received 14706 bytes. Presentation time: 1331800903.473278
Stream "rtsp://192.168.0.25"; video/H264: Received 2549 bytes. Presentation time: 1331800903.436534 <--
Stream "rtsp://192.168.0.25"; video/H264: Received 14920 bytes. Presentation time: 1331800903.546734
Stream "rtsp://192.168.0.25"; video/H264: Received 2465 bytes. Presentation time: 1331800903.510001 <--
Stream "rtsp://192.168.0.25"; video/H264: Received 15065 bytes. Presentation time: 1331800903.620201
Stream "rtsp://192.168.0.25"; video/H264: Received 2178 bytes. Presentation time: 1331800903.583479 <--
Stream "rtsp://192.168.0.25"; video/H264: Received 14770 bytes. Presentation time: 1331800903.693679

As you can see, every second frame is misordered, so when I play the video I see a lot of flashbacks. If I save each frame to a file and then play them correctly ordered, the video plays without any problem. I've even tried to increase the packet reordering threshold time from 0.2 to 10 seconds, but without any success. I have the latest version of live555, and I'm using a BOSCH NBC-265-P camera. Thank you for any help. Best regards Ivan

From finlayson at live555.com Thu Mar 15 05:28:56 2012
From: finlayson at live555.com (Ross Finlayson)
Date: Thu, 15 Mar 2012 05:28:56 -0700
Subject: [Live-devel] Misordered frames with synchronized rtp source
In-Reply-To:
References:
Message-ID:

> I have a problem with misordered frames when fSubsession.rtpSource()->hasBeenSynchronizedUsingRTCP() is true. How is that possible?

It's not.
The LIVE555 code delivers the contents of RTP packets in the correct order (based on the sequence numbers in the RTP packets). (Note that the 'packet reordering threshold' takes effect only when packets get lost, which is probably not happening in your case, and - in any case - still causes data to get delivered in the correct order.) So you can assume that your data is being delivered in the correct order. Note, however, that 'presentation times' are not necessarily monotonically increasing. It is common for presentation times to be out-of-order for video codecs (such as MPEG-2, MPEG-4, and H.264) which can have 'B' frames. Note that frames for these codecs are sent - over RTP - in *decoding* order (i.e., the order in which frames are supposed to be fed into a decoder), *not* display order (the order in which the decoded frames are supposed to be displayed).

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From freezer at smsnet.pl Thu Mar 15 06:05:36 2012
From: freezer at smsnet.pl (freezer at smsnet.pl)
Date: Thu, 15 Mar 2012 14:05:36 +0100
Subject: [Live-devel] How can I make RTSPServer to stream from file to multicast address ?
Message-ID:

Hi all! I am trying to make a test using the live555 RTSP server and the openRTSP client to:

- Keep the RTSP session between client and server using TCP (server : 192.168.0.1 <-> 192.168.0.10 : client).
- Send from the server a raw-UDP transport stream to a multicast address on the desired port (192.168.0.1 -> 230.0.0.1:1234).

However, I am not able to do this using the openRTSP client (there is no such option when running the binary). How can I achieve this? Best regards.
From finlayson at live555.com Thu Mar 15 06:31:14 2012
From: finlayson at live555.com (Ross Finlayson)
Date: Thu, 15 Mar 2012 06:31:14 -0700
Subject: [Live-devel] How can I make RTSPServer to stream from file to multicast address ?
In-Reply-To:
References:
Message-ID:

> I am trying to make a test using the live555 RTSP server and the openRTSP client to:
>
> - Keep the RTSP session between client and server using TCP (server : 192.168.0.1 <-> 192.168.0.10 : client).
>
> - Send from the server a raw-UDP transport stream to a multicast address

You can't do this with the "LIVE555 Media Server" (or "testOnDemandRTSPServer"), because that is a unicast server (i.e., delivering streams via unicast only). If you want to stream a Transport Stream file via multicast, then I suggest using the "testMPEG2TransportStreamer" demo application (in "testProgs") instead. One change that you will need to make to that code is to uncomment the line

#define IMPLEMENT_RTSP_SERVER 1

on line 34 of "testMPEG2TransportStreamer.cpp", so that you get a RTSP server that clients can connect to to receive the stream. (Note that the stream will be delivered via RTP/UDP, not raw UDP.)

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From rakesh.kumar at procubedinc.com Thu Mar 15 07:11:56 2012
From: rakesh.kumar at procubedinc.com (Rakesh Kumar)
Date: Thu, 15 Mar 2012 19:41:56 +0530
Subject: [Live-devel] [Information Needed]
In-Reply-To: <3E2CD3BF-2B8F-402F-81D2-E8863C627568@live555.com>
References: <003f01cd01d2$7e0a0c00$7a1e2400$@kumar@procubedinc.com> <3E2CD3BF-2B8F-402F-81D2-E8863C627568@live555.com>
Message-ID: <001801cd02b5$95137970$bf3a6c50$@kumar@procubedinc.com>

Hi All, I have gone through the link you mentioned:

"Yes. The easiest way to do this is to change the appropriate "test*Streamer.
cpp" file to read from "stdin" (instead of "test.*"), and then pipe the output of your encoder to (your modified) "test*Streamer" application."

I have changed char const* inputFileName = "test.264"; to char const* inputFileName = "myfifo"; but it does not work. How do I specify that the input has to be read from the FIFO [rather than from "stdin"]? I am creating a FIFO, putting the encoder output into the FIFO, and then I want to give that same FIFO to testH264VideoStreamer.cpp. It would be a great help if you let me know everywhere I have to change. Any support or guidance will be highly appreciated.

Thanks !!!! ~rakesh [+91 - 97418 33005] PROcubed Inc. Bangalore, INDIA.
From rakesh.kumar at procubedinc.com Thu Mar 15 07:41:10 2012 From: rakesh.kumar at procubedinc.com (Rakesh Kumar) Date: Thu, 15 Mar 2012 20:11:10 +0530 Subject: [Live-devel] [Need help for live streaming changed for H264] Message-ID: <002401cd02b9$aa2d8e50$fe88aaf0$@kumar@procubedinc.com>

Hi All, I have gone to the link you mentioned:

> Yes. The easiest way to do this is to change the appropriate "test*Streamer.cpp" file to read from "stdin" (instead of "test.*"), and then pipe the output of your encoder to (your modified) "test*Streamer" application.

I have changed char const* inputFileName = "test.264"; to char const* inputFileName = "stdin"; but it does not play. I get the following print messages:

Running Server Application [root at Linux rtsp]#Error in opening filePlay this stream using the URL "rtsp://192.168.2.103:8554/testStream" Beginning streaming... Creating filesource inside play..... Beginning to read from file... ...done reading from file Closing the playnback [root at Linux rtsp]#

I am creating a FIFO and putting the encoder output into the FIFO, and then I want to give the same FIFO to testH264Streamer.cpp. I have the following questions: 1) How do I specify that the input has to be read from the FIFO (myrtspfifo)? 2) Which files do I need to modify? I have only modified "test*Streamer.cpp" as mentioned above. It would be a great help if you let me know everywhere I have to make changes. Any support or guidance will be highly appreciated.

----------------------------------------- Thanks !!!! ~rakesh [+91 - 97418 33005] PROcubed Inc. Bangalore, INDIA.
From finlayson at live555.com Thu Mar 15 11:27:27 2012 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 15 Mar 2012 11:27:27 -0700 Subject: [Live-devel] [Need help for live streaming changed for H264] In-Reply-To: <002401cd02b9$aa2d8e50$fe88aaf0$@kumar@procubedinc.com> References: <002401cd02b9$aa2d8e50$fe88aaf0$@kumar@procubedinc.com> Message-ID: <15CC0F74-B37D-4A32-8758-013C30304E2D@live555.com>

> I have gone to the link you have mentioned
> Yes. The easiest way to do this is to change the appropriate "test*Streamer.cpp" file to read from "stdin" (instead of "test.*"), and then pipe the output of your encoder to (your modified) "test*Streamer" application.
>
> I have changed char const* inputFileName = "test.264";
>
> to char const* inputFileName = "stdin"; but does not play.

You apparently do not understand what 'stdin' and 'pipe' mean. This is something that all Unix (including Linux) developers (and anyone who uses this software) should know.

Is your FIFO accessible as a named file? If so, you should be able to access the fifo by changing 'inputFileName' from "test.264" to the file name of your FIFO.

Alternatively, is your FIFO an application that outputs to 'stdout'? If so, then change 'inputFileName' to "stdin", and pipe your FIFO application to "testH264VideoStreamer". (Once again, if you don't understand what 'stdout', 'stdin', and 'pipe' mean, then you should not be using Linux, or our software.)

If neither of these are true, then you will need to write your own "FramedSource" subclass that implements your FIFO, delivering NAL units one-at-a-time. I suggest that you use our "DeviceSource" code (in "liveMedia") as a model for this. Also, because you will be delivering discrete NAL units, rather than a byte stream, you will need to change the "testH264VideoStreamer" code to use a "H264VideoStreamDiscreteFramer", rather than a "H264VideoStreamFramer".
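[Editor's note] The first suggestion above - that a named FIFO can be opened and read exactly like an ordinary file, which is why changing 'inputFileName' to the FIFO's path can be enough - can be sanity-checked outside of live555 with a few lines of POSIX C++. This is a hedged, self-contained illustration only (not live555 code); the path and payload are made up for the demo:

```cpp
// Sketch: create a FIFO with mkfifo(), have a child process play the
// "encoder" (writer), and read from it with the same open()/read() calls
// used for a regular file such as "test.264".
#include <fcntl.h>
#include <sys/stat.h>
#include <sys/wait.h>
#include <unistd.h>
#include <cassert>
#include <cstring>
#include <string>

std::string readOnceFromFifo(const char* path, const char* payload) {
    ::mkfifo(path, 0600);                  // create the FIFO node (ignore EEXIST)
    pid_t pid = ::fork();
    if (pid == 0) {                        // child: the writing ("encoder") end
        int wfd = ::open(path, O_WRONLY);  // blocks until a reader opens the FIFO
        ::write(wfd, payload, std::strlen(payload));
        ::close(wfd);
        ::_exit(0);
    }
    char buf[64] = {0};
    int rfd = ::open(path, O_RDONLY);      // same call as for an ordinary file
    ssize_t n = ::read(rfd, buf, sizeof buf - 1);
    ::close(rfd);
    ::waitpid(pid, nullptr, 0);
    ::unlink(path);
    return std::string(buf, n > 0 ? static_cast<size_t>(n) : 0);
}
```

If open() on the FIFO fails (as in the "Unable to open file" error reported later in this thread), the usual causes are a wrong path or the FIFO not existing in the program's working directory.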
Ross Finlayson Live Networks, Inc. http://www.live555.com/

From freezer at smsnet.pl Thu Mar 15 07:37:44 2012 From: freezer at smsnet.pl (freezer at smsnet.pl) Date: Thu, 15 Mar 2012 15:37:44 +0100 Subject: [Live-devel] How can I make RTSPServer to stream from file to multicast address ? In-Reply-To: <08B52F11-9926-4B00-BD86-597C5EB09AE1@smsnet.pl> References: <08B52F11-9926-4B00-BD86-597C5EB09AE1@smsnet.pl> Message-ID:

On 15.03.2012 at 15:05, Grzegorz Gaj wrote:
>> You can't do this with the "LIVE555 Media Server" (or "testOnDemandRTSPServer"), because that is a unicast server (i.e., delivering streams via unicast only). If you want to stream a Transport Stream file via multicast, then I suggest using the "testMPEG2TransportStreamer" demo application (in "testProgs") instead. One change that you will need to make to that code is uncomment the line #define IMPLEMENT_RTSP_SERVER 1 on line 34 of "testMPEG2TransportStreamer.cpp", so that you get a RTSP server that clients can connect to to receive the stream. (Note that the stream will be delivered via RTP/UDP, not raw-UDP.)
>
>> Ok. It almost worked but..
>>
>> When I am sending :
>>
>> SETUP rtsp://127.0.0.1/testStream/track1 RTSP/1.0
>> CSeq: 4
>> User-Agent: ./openRTSP (LIVE555 Streaming Media v2012.02.29)
>> Transport: RTP/AVP;multicast;destination=230.1.1.1;client_port=7000-7001
>>
>> the testMPEG2TransportStreamer responds with:
>>
>> RTSP/1.0 200 OK
>> CSeq: 4
>> Date: Thu, Mar 15 2012 14:03:51 GMT
>> Transport: RTP/AVP;multicast;destination=239.255.42.42;source=xxx.xxx.xxx.xxx;port=1234-1235;ttl=7
>> Session: 0F290F2E
>>
>> As you can see, the destination address is set permanently on the server side. I would like to change it through SETUP. How can I achieve this ?
>>
>> Thanks

-------------- next part -------------- An HTML attachment was scrubbed...
URL: From finlayson at live555.com Thu Mar 15 11:49:23 2012 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 15 Mar 2012 11:49:23 -0700 Subject: [Live-devel] How can I make RTSPServer to stream from file to multicast address ? In-Reply-To: References: <08B52F11-9926-4B00-BD86-597C5EB09AE1@smsnet.pl> Message-ID: <08A5F998-2085-49F3-9628-0594E2D81E93@live555.com> >>> When I am sending : >>> >>> SETUP rtsp://127.0.0.1/testStream/track1 RTSP/1.0 >>> CSeq: 4 >>> User-Agent: ./openRTSP (LIVE555 Streaming Media v2012.02.29) >>> Transport: RTP/AVP;multicast;destination=230.1.1.1;client_port=7000-7001 >>> >>> >>> the testMPEG2TransportStreamer responds with: >>> >>> RTSP/1.0 200 OK >>> CSeq: 4 >>> Date: Thu, Mar 15 2012 14:03:51 GMT >>> Transport: RTP/AVP;multicast;destination=239.255.42.42;source=xxx.xxx.xxx.xxx;port=1234-1235;ttl=7 >>> Session: 0F290F2E >>> >>> >>> As u can see destination address is set permanently on server side. I would like to change it thru SETUP. How can I achieve this ? You can't do this with our server code (for multicast streams). Instead, it's the server, not the client, that has to choose the multicast IP address and port numbers. (It's unusual for the client to be requesting a specific destination address (and port numbers) - especially for multicast streams, which may have many clients - because this poses a potential security risk (for 'denial of service' attacks). Instead, it's usually the server, not the client, that chooses multicast addresses/ports.) However, if you know - in advance - that the client will always request the address 230.1.1.1, and the ports 7000-7001, then you can probably support this by changing the "testMPEG2TransportStreamer" code as follows: - Change the address in line 58 from 239.255.42.42 to 230.1.1.1 - Change "rtpPortNum" on line 64 from 1234 to 7000 Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
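[Editor's note] The destination= value the client requests, and the one the server answers with, are just two semicolon-separated parameters of the RTSP Transport header. A small, hypothetical helper (not part of live555; the function name is invented for this sketch) shows how such a header value breaks down:

```cpp
// Split an RTSP "Transport:" header value into its ';'-separated
// parameters; "key=value" parts become map entries, bare tokens
// (e.g. "multicast") map to an empty string. Illustration only.
#include <cassert>
#include <map>
#include <sstream>
#include <string>

std::map<std::string, std::string> parseTransport(const std::string& value) {
    std::map<std::string, std::string> params;
    std::istringstream in(value);
    std::string part;
    while (std::getline(in, part, ';')) {
        std::string::size_type eq = part.find('=');
        if (eq == std::string::npos) params[part] = "";
        else params[part.substr(0, eq)] = part.substr(eq + 1);
    }
    return params;
}
```

For the SETUP request in this thread, the client's header parses to destination "230.1.1.1", while the server's 200 OK carries its own destination "239.255.42.42" - which is why the reply, not the request, tells you where the packets will actually be sent.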
URL: From rakesh.kumar at procubedinc.com Thu Mar 15 12:00:48 2012 From: rakesh.kumar at procubedinc.com (Rakesh Kumar) Date: Fri, 16 Mar 2012 00:30:48 +0530 Subject: [Live-devel] [Need help for live streaming changed for H264] In-Reply-To: <15CC0F74-B37D-4A32-8758-013C30304E2D@live555.com> References: <002401cd02b9$aa2d8e50$fe88aaf0$@kumar@procubedinc.com> <15CC0F74-B37D-4A32-8758-013C30304E2D@live555.com> Message-ID: <005101cd02dd$efd3de40$cf7b9ac0$@kumar@procubedinc.com>

Hi,

> I have gone to the link you have mentioned
> Yes. The easiest way to do this is to change the appropriate "test*Streamer.cpp" file to read from "stdin" (instead of "test.*"), and then pipe the output of your encoder to (your modified) "test*Streamer" application.
> I have changed char const* inputFileName = "test.264"; to char const* inputFileName = "stdin"; but does not play.

> You apparently do not understand what 'stdin' and 'pipe' mean. This is something that all Unix (including Linux) developers (and anyone who uses this software) should know.

This is my first Linux application, so I do accept that my understanding is weak; that is why I am seeking help in this forum.

> Is your FIFO accessible as a named file? If so, you should be able to access the fifo by changing 'inputFileName' from "test.264" to the file name of your FIFO.

Yes, my FIFO is a named pipe, myrtspfifo, and I changed inputFileName = myrtspfifo, but it reports the error "Unable to open file \"" << inputFileName << "\" as a byte-stream file source\n";

> Alternatively, is your FIFO an application that outputs to 'stdout'? If so, then change 'inputFileName' to "stdin", and pipe your FIFO application to "testH264VideoStreamer". (Once again, if you don't understand what 'stdout', 'stdin', and 'pipe' mean, then you should not be using Linux, or our software.)

I have to use it, as I need RTSP; whether or not you support this is a different issue.
I came to this forum only because I have little time to deliver this, and I expected some quick solutions.

> If neither of these are true, then you will need to write your own "FramedSource" subclass that implements your FIFO, delivering NAL units one-at-a-time. I suggest that you use our "DeviceSource" code (in "liveMedia") as a model for this. Also, because you will be delivering discrete NAL units, rather than a byte stream, you will need to change the "testH264VideoStreamer" code to use a "H264VideoStreamDiscreteFramer", rather than a "H264VideoStreamFramer".

Yes, I kept that as a last option, but I read blogs and found a few people who have already got this working (H264 streaming using a named FIFO), so I thought it should work for me with some help from the forum.

~rakesh

From sambhav at saranyu.in Thu Mar 15 13:03:50 2012 From: sambhav at saranyu.in (Kumar Sambhav) Date: Fri, 16 Mar 2012 01:33:50 +0530 Subject: [Live-devel] RTCP BYE from the RTSP Server Message-ID: <019BF303-ABD3-485F-A312-2A9F8E6A47FD@saranyu.in>

Hi, How do I invoke a RTCP BYE message to the client from the RTSP server application (e.g. testOnDemandRTSPServer)? Regards, Sambhav

From finlayson at live555.com Thu Mar 15 13:23:05 2012 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 15 Mar 2012 13:23:05 -0700 Subject: [Live-devel] RTCP BYE from the RTSP Server In-Reply-To: <019BF303-ABD3-485F-A312-2A9F8E6A47FD@saranyu.in> References: <019BF303-ABD3-485F-A312-2A9F8E6A47FD@saranyu.in> Message-ID: <101E0DE1-63EC-4C36-840E-FA12DA0D0591@live555.com>

> How to invoke a RTCP BYE message to the client from the RTSP Server application (e.g testOnDemandRTSPServer) ?

This will happen automatically when the stream ends - i.e., when the server reaches the end of the file that's being streamed. There is nothing that you need to do to get this; the server will send this automatically. There are exceptions to this, however.
If the file being streamed has a known 'range' - as reported in the SDP description that the server sends in response to the RTSP "DESCRIBE" - then the server will not send a RTCP "BYE" when it reaches the end of the file. The reason for this is that files with a known range are typically also 'seekable'. By not sending a RTCP "BYE" when the server reaches the end of this kind of file, the stream will be kept alive, which allows the client - if desired - to seek backwards in the stream, to replay part or all of it. In our current implementation, the following file types are 'seekable', have a known 'range', and thus our server will *not* send a RTCP "BYE" when it reaches the end of a file: - DV video files - MP3 audio files - MPEG Transport Stream files (with corresponding 'index' files) - WAV audio files A client that is receiving this kind of file therefore can't expect to receive a RTCP "BYE" to signal 'end of stream'. Instead, it should call "MediaSession::playEndTime()" (and "MediaSession::playStartTime()") to figure out the duration of the stream, and set a timer for this duration. (See, for example, the code for "testRTSPClient".) Other kinds of files - e.g., AC-3, AAC, AMR, H.264, MPEG-4 - do not have a known 'range', are not 'seekable', and thus - for such files - the server *will* send a RTCP "BYE" when it reaches the end of the file. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From sambhav at saranyu.in Thu Mar 15 14:06:28 2012 From: sambhav at saranyu.in (Kumar Sambhav) Date: Fri, 16 Mar 2012 02:36:28 +0530 Subject: [Live-devel] RTCP BYE from the RTSP Server In-Reply-To: <101E0DE1-63EC-4C36-840E-FA12DA0D0591@live555.com> References: <019BF303-ABD3-485F-A312-2A9F8E6A47FD@saranyu.in> <101E0DE1-63EC-4C36-840E-FA12DA0D0591@live555.com> Message-ID: <778E5627-7BFA-4D91-BFC8-45F3DE4262B8@saranyu.in> Thanks for the detailed Information. 
I am using a live RTP source as the stream source for a subclass of OnDemandServerMediaSubsession. In this case, when the RTP source stops sending data, the application gets a message, upon which I want to close the session.

On Mar 16, 2012, at 1:53 AM, Ross Finlayson wrote:

>> How to invoke a RTCP BYE message to the client from the RTSP Server application (e.g testOnDemandRTSPServer) ?
>
> This will happen automatically when the stream ends - i.e., when the server reaches the end of the file that's being streamed. There is nothing that you need to do to get this; the server will send this automatically.
>
> There are exceptions to this, however. If the file being streamed has a known 'range' - as reported in the SDP description that the server sends in response to the RTSP "DESCRIBE" - then the server will not send a RTCP "BYE" when it reaches the end of the file. The reason for this is that files with a known range are typically also 'seekable'. By not sending a RTCP "BYE" when the server reaches the end of this kind of file, the stream will be kept alive, which allows the client - if desired - to seek backwards in the stream, to replay part or all of it.
>
> In our current implementation, the following file types are 'seekable', have a known 'range', and thus our server will *not* send a RTCP "BYE" when it reaches the end of a file:
> - DV video files
> - MP3 audio files
> - MPEG Transport Stream files (with corresponding 'index' files)
> - WAV audio files
>
> A client that is receiving this kind of file therefore can't expect to receive a RTCP "BYE" to signal 'end of stream'. Instead, it should call "MediaSession::playEndTime()" (and "MediaSession::playStartTime()") to figure out the duration of the stream, and set a timer for this duration. (See, for example, the code for "testRTSPClient".)
> Other kinds of files - e.g., AC-3, AAC, AMR, H.264, MPEG-4 - do not have a known 'range', are not 'seekable', and thus - for such files - the server *will* send a RTCP "BYE" when it reaches the end of the file.
>
> Ross Finlayson
> Live Networks, Inc.
> http://www.live555.com/

From ricardo at kafemeeting.com Thu Mar 15 14:11:13 2012 From: ricardo at kafemeeting.com (Ricardo Acosta) Date: Thu, 15 Mar 2012 22:11:13 +0100 Subject: [Live-devel] StreamReplicator . Best way to remove and recreate a replica Message-ID:

Hi Ross,

I create the replicator with the false option, so that it does not delete the input Source when removing all replicas: StreamReplicator::createNew(*env, Source, false)

Then I create the source to feed into the sink: FramedSource* source = replicator->createStreamReplica();

What would be the best way to remove a replica, given that we need to create it again towards another port/address? Using StreamReplicator::removeStreamReplica(*source) gives me an error, because source is a FramedSource and it expects a StreamReplica... maybe I am missing something. Or should I close the medium source and sink (in the destructor)?

Thank you Ricardo

From albert.staubin at gmail.com Thu Mar 15 14:14:23 2012 From: albert.staubin at gmail.com (AJ) Date: Thu, 15 Mar 2012 17:14:23 -0400 Subject: [Live-devel] TS Streaming from a port Message-ID:

Is there any test code to send a TS file over a socket to the Live555 server and then serve that up as a stream? Thank you in advance.

-------------- next part -------------- An HTML attachment was scrubbed...
URL: From finlayson at live555.com Thu Mar 15 15:37:52 2012 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 15 Mar 2012 15:37:52 -0700 Subject: [Live-devel] StreamReplicator . Best way to remove and recreate a replica In-Reply-To: References: Message-ID: <4F7918D6-FFA2-482C-9382-6ED0DECC449E@live555.com> > I create the replicator with the false option to avoid it deletes the input Source when removing all replicas. > StreamReplicator::createNew(*env, Source, false) > > then create the source to feed it into the sink > FramedSource* source = replicator->createStreamReplica(); > > What would be the best way to remove a replica, knowing that we need to create it again towards another port/address . You delete a replica in the same way that you delete any other "Medium"-subclass object - by calling Medium::close(pointer-to-replica); > Using the > StreamReplicator::removeStreamReplica(*source) No, you can't call that, because it's a private function (used only in the implementation of these classes). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Mar 15 15:42:37 2012 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 15 Mar 2012 15:42:37 -0700 Subject: [Live-devel] TS Streaming from a port In-Reply-To: References: Message-ID: <28DD43CC-B267-436D-AA67-52D21B6972E0@live555.com> > Is there any test code to send a TS file over a socket to the Live555 server and then provide that up as a stream? Yes, look at the following two demo apps in the "testProgs" directory: 1/ testMPEG2TransportStreamer 2/ testOnDemandRTSPServer Note the message: "mpeg2TransportStreamFromUDPSourceTest" stream, from a UDP Transport Stream input source and the corresponding code in "testProgs/testOnDemandRTSPServer.cpp" Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
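[Editor's note] For the UDP-input route mentioned above, each received datagram is normally expected to carry whole 188-byte MPEG Transport Stream packets, each beginning with the sync byte 0x47. The following is a hedged, standalone helper (not live555 code; the function name is invented) for checking that property on a received buffer:

```cpp
// Return the number of whole 188-byte TS packets in 'buf', or -1 if the
// buffer is not a clean sequence of packets (wrong total length, or a
// packet that does not start with the MPEG-TS sync byte 0x47).
#include <cassert>
#include <cstddef>

constexpr std::size_t TS_PACKET_SIZE = 188;

int countTsPackets(const unsigned char* buf, std::size_t len) {
    if (len == 0 || len % TS_PACKET_SIZE != 0) return -1;
    for (std::size_t off = 0; off < len; off += TS_PACKET_SIZE) {
        if (buf[off] != 0x47) return -1;
    }
    return static_cast<int>(len / TS_PACKET_SIZE);
}
```

A common RTP/UDP payload of 1316 bytes is exactly 7 such packets; a result of -1 on incoming datagrams usually means the sender is not packet-aligned.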
URL: From developpeur02 at kafemeeting.com Sat Mar 17 06:59:40 2012 From: developpeur02 at kafemeeting.com (Rodolophe Fouquet) Date: Sat, 17 Mar 2012 14:59:40 +0100 Subject: [Live-devel] QoS records with RTPReceptionStatsDB for a RTP stream Message-ID: Hello, I'm trying to implement QoS records with RTPReceptionStatsDB, taking example from what's implemented in the playCommon.cpp file. So, when I init' my RTPsource, I also init my QoS recorder. However, when I do so, the application freezes. I figured out that it seems that when I call RTPReceptionStatsDB::Iterator statsIter(rtpSource->receptionStatsDB()), before my RTPsource receives the stream, it behaves like a lock. The problem doesn't seem to be linked to the taskScheduler, since it freezes before the call to env->taskScheduler().scheduleDelayedTask(...). So, does consulting RTP reception stats before the reception of any packets (and after the RTPsource initialisation, of course) really act like a lock? Best Regards, Rodolphe Fouquet. -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sat Mar 17 19:37:01 2012 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 17 Mar 2012 19:37:01 -0700 Subject: [Live-devel] QoS records with RTPReceptionStatsDB for a RTP stream In-Reply-To: References: Message-ID: > I'm trying to implement QoS records with RTPReceptionStatsDB, taking example from what's implemented in the playCommon.cpp file. > So, when I init' my RTPsource, I also init my QoS recorder. > > However, when I do so, the application freezes. I figured out that it seems that when I call RTPReceptionStatsDB::Iterator statsIter(rtpSource->receptionStatsDB()), before my RTPsource receives the stream, it behaves like a lock. > > The problem doesn't seem to be linked to the taskScheduler, since it freezes before the call to env->taskScheduler().scheduleDelayedTask(...). 
> So, does consulting RTP reception stats before the reception of any packets (and after the RTPsource initialisation, of course) really act like a lock?

I don't know what, if anything, in the code would cause this behavior. So you're going to have to track this down yourself, unfortunately. Let us know what you find.

Ross Finlayson Live Networks, Inc. http://www.live555.com/

From zhanm at join.net.cn Mon Mar 19 01:25:19 2012 From: zhanm at join.net.cn (=?gb2312?B?1bLD9w==?=) Date: Mon, 19 Mar 2012 16:25:19 +0800 Subject: [Live-devel] What is the maximum number of concurrent sessions a live555 server can support? Is there a limit? In-Reply-To: References: Message-ID: <00fc01cd05a9$d23f5430$76bdfc90$@net.cn>

So far, I can have 32 sessions (streams) playing at the same time and no more. When I try to play the 33rd stream, my client sends the DESCRIBE command but gets no response. I wonder if there is a maximum limit on the number of sessions (subsessions)? Or is it because the response on the socket timed out?

From finlayson at live555.com Mon Mar 19 01:29:11 2012 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 19 Mar 2012 01:29:11 -0700 Subject: [Live-devel] What is the maximum number of concurrent sessions a live555 server can support? Is there a limit? In-Reply-To: <00fc01cd05a9$d23f5430$76bdfc90$@net.cn> References: <00fc01cd05a9$d23f5430$76bdfc90$@net.cn> Message-ID:

See http://www.live555.com/liveMedia/faq.html#scalability

Ross Finlayson Live Networks, Inc. http://www.live555.com/

From jayeshpk1 at gmail.com Sat Mar 17 07:43:01 2012 From: jayeshpk1 at gmail.com (Jayesh Parayali) Date: Sat, 17 Mar 2012 10:43:01 -0400 Subject: [Live-devel] H264VideoFileSink Message-ID:

I am trying to write to a new H264 file every 20 seconds.
As a first step, I have modified testRTSPClient as below. Now it captures H264 to a file, but when I play the file it plays back too fast. http://pastebin.com/jeJJgswh Appreciate any help. Is there another way I can capture a new file every n seconds?

From vigosslive at rambler.ru Sun Mar 18 12:30:00 2012 From: vigosslive at rambler.ru (Rustam) Date: Sun, 18 Mar 2012 23:30:00 +0400 Subject: [Live-devel] Why do you send two AC-3 packets in a single RTP packet? Message-ID: <90006103.1332099000.163697064.97369@mperl104.rambler.ru>

Hi, Ross. Your RTSP server for VOB uses a single RTP packet to send two AC-3 packets. Why? Where in the code should this be changed, so that a single RTP packet carries only one AC-3 packet? Thanks.

From trn200190 at gmail.com Mon Mar 19 00:43:15 2012 From: trn200190 at gmail.com (i m what i m ~~~~) Date: Mon, 19 Mar 2012 13:13:15 +0530 Subject: [Live-devel] sir Message-ID:

Sir, I am new to the live media library. I am trying to learn from your live media library programs but could not understand much. I don't know how to give command-line arguments to the openRTSP program. Can you guide me through the steps and explain what this program does? Thank you in advance.

From finlayson at live555.com Mon Mar 19 02:31:10 2012 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 19 Mar 2012 02:31:10 -0700 Subject: [Live-devel] Why do you send two AC-3 packets in a single RTP packet? In-Reply-To: <90006103.1332099000.163697064.97369@mperl104.rambler.ru> References: <90006103.1332099000.163697064.97369@mperl104.rambler.ru> Message-ID: <12053566-5F67-49C5-808F-F8BACF83D626@live555.com>

> Your RTSP server for VOB uses a single RTP packet to send two AC-3 packets.

No it doesn't.
Each outgoing RTP packet - in our implementation - contains only one AC-3 audio frame. (Note: The AC-3 RTP payload format used by our code changed (for both sending and receiving) in version 2011.06.16. You should ensure that both the server *and* the client are using a version of the code that's >= 2011.06.16.)

Ross Finlayson Live Networks, Inc. http://www.live555.com/

From finlayson at live555.com Mon Mar 19 02:42:00 2012 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 19 Mar 2012 02:42:00 -0700 Subject: [Live-devel] sir In-Reply-To: References: Message-ID:

> I am new to the live media library. I am trying to learn from your live media library programs but could not understand much.
> I don't know how to give command-line arguments to the openRTSP program. Can you guide me through the steps and what this program does?

"openRTSP" - including its command-line options - is described in detail at http://www.live555.com/openRTSP/ Note, however, that if you are trying to learn how to develop your *own* RTSP client application, then a better application to study is "testRTSPClient" (see "testProgs/testRTSPClient.cpp").

Ross Finlayson Live Networks, Inc. http://www.live555.com/

From david at appstv.cl Mon Mar 19 10:37:03 2012 From: david at appstv.cl (David Rodríguez) Date: Mon, 19 Mar 2012 14:37:03 -0300 Subject: [Live-devel] Amino 130 playing RTSP/Multicast Message-ID:

Dear all, I'm trying to play, on my Amino 130, an RTSP stream from a Linksys WVC80N camera. Do you know of any extra or additional requirements from Amino in order to play an RTSP stream, or even a multicast stream? A stream from Live555 is OK and I can watch it on the Amino, but if I play directly from the camera (multicast or RTSP) there is no video or audio. Please help.
Thanks a lot, David

-- David Rodríguez Lagomarsino, CEO. Contacto: +56 9 63404781. Web: www.appstv.cl Miguel Claro 2550, Providencia, Santiago de Chile

From finlayson at live555.com Mon Mar 19 10:52:40 2012 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 19 Mar 2012 10:52:40 -0700 Subject: [Live-devel] Amino 130 playing RTSP/Multicast In-Reply-To: References: Message-ID: <5A7E75A7-C506-4359-9EB8-C64D854E7A3B@live555.com>

> I'm trying to play, on my Amino 130, an RTSP stream from a Linksys WVC80N camera.
>
> Do you know of any extra or additional requirements from Amino in order to play an RTSP stream, or even a multicast stream?
>
> A stream from Live555 is OK and I can watch it on the Amino, but if I play directly from the camera (multicast or RTSP) there is no video or audio.

I suspect that the problem is that the Amino client uses a weird, non-standard variant of RTSP. Our LIVE555 RTSP server code supports this, but other servers - such as your Linksys camera - probably do not. Unfortunately you're going to have to contact Amino and/or Linksys about this; it's not something that we can help you with.

Ross Finlayson Live Networks, Inc. http://www.live555.com/

From david at appstv.cl Mon Mar 19 11:31:56 2012 From: david at appstv.cl (David Rodríguez) Date: Mon, 19 Mar 2012 15:31:56 -0300 Subject: [Live-devel] Amino 130 playing RTSP/Multicast In-Reply-To: <5A7E75A7-C506-4359-9EB8-C64D854E7A3B@live555.com> References: <5A7E75A7-C506-4359-9EB8-C64D854E7A3B@live555.com> Message-ID:

Good, but if I save the video to a pipe (Linux), maybe then I will be able to encode it again using Live555? Thanks for the answer.

On 19 March 2012 at 14:52, Ross Finlayson wrote:
> I'm trying to play, on my Amino 130, an RTSP stream from a Linksys WVC80N camera.
> Do you know of any extra or additional requirements from Amino in order to play an RTSP stream, or even a multicast stream?
>
> A stream from Live555 is OK and I can watch it on the Amino, but if I play directly from the camera (multicast or RTSP) there is no video or audio.

> I suspect that the problem is that the Amino client uses a weird, non-standard variant of RTSP. Our LIVE555 RTSP server code supports this, but other servers - such as your Linksys camera - probably do not.
>
> Unfortunately you're going to have to contact Amino and/or Linksys about this; it's not something that we can help you with.
>
> Ross Finlayson
> Live Networks, Inc.
> http://www.live555.com/

-- David Rodríguez Lagomarsino, CEO. Contacto: +56 9 63404781. Web: www.appstv.cl Miguel Claro 2550, Providencia, Santiago de Chile

-------------- next part -------------- An HTML attachment was scrubbed...
URL: From david at appstv.cl Mon Mar 19 11:53:57 2012 From: david at appstv.cl (David Rodríguez) Date: Mon, 19 Mar 2012 15:53:57 -0300 Subject: [Live-devel] Amino 130 playing RTSP/Multicast In-Reply-To: <92E349F6-4083-46FF-B3A4-1AE64A5A5121@live555.com> References: <5A7E75A7-C506-4359-9EB8-C64D854E7A3B@live555.com> <92E349F6-4083-46FF-B3A4-1AE64A5A5121@live555.com> Message-ID:

Thanks a lot. I will try that. Regards, David

On 19 March 2012 at 15:42, Ross Finlayson wrote:
> Good, but if I save the video to a pipe (Linux), maybe then I will be able to encode it again using Live555?
>
> LIVE555 doesn't do any 'encoding'. But yes, in principle, if you can (somehow) get video from your camera, convert it into a Transport Stream, and use this Transport Stream as input to the LIVE555 RTSP server, then you *may* be able to get the Amino client to play this.
>
> Ross Finlayson
> Live Networks, Inc.
> http://www.live555.com/

-- David Rodríguez Lagomarsino, CEO. Contacto: +56 9 63404781. Web: www.appstv.cl Miguel Claro 2550, Providencia, Santiago de Chile

From sambhav at saranyu.in Mon Mar 19 12:39:23 2012 From: sambhav at saranyu.in (Kumar Sambhav) Date: Tue, 20 Mar 2012 01:09:23 +0530 Subject: [Live-devel] Live555 RTSP Client and Panasonic BL-C210 IP camera Message-ID: <16E24B4C-15CA-4041-91D8-2905C9E283AC@saranyu.in>

Hi, I am using the openRTSP client to receive RTSP streams from a Panasonic BL-C210 IP camera. After ~65 seconds openRTSP stops receiving data. Are there any known issues with the live555 RTSP client accessing Panasonic cameras? The server stops sending data if it does not receive RTCP packets within the timeout period. Is this correct?
I captured data using Wireshark and saw that RTCP packets are sent and received by both server and client (as no ICMP packets were seen).

Regards,
Sambhav

From finlayson at live555.com Mon Mar 19 13:47:36 2012
From: finlayson at live555.com (Ross Finlayson)
Date: Mon, 19 Mar 2012 13:47:36 -0700
Subject: [Live-devel] Live555 RTSP Client and Panasonic BL-C210 IP camera
In-Reply-To: <16E24B4C-15CA-4041-91D8-2905C9E283AC@saranyu.in>
References: <16E24B4C-15CA-4041-91D8-2905C9E283AC@saranyu.in>
Message-ID:

> I am using the openRTSP client to receive RTSP streams from a Panasonic BL-C210 IP camera. After ~65 seconds openRTSP stops receiving data.
> Any known issues with the live555 RTSP client accessing Panasonic cameras?

No.

Does "openRTSP" send a "TEARDOWN" command at all (to explicitly close the stream)? If not, the problem is not that 'openRTSP stops receiving data'; the problem is that 'the server (your Panasonic camera) stops sending data'.

It appears that the server (the camera) is timing out the connection, for some reason. However, because you say that the server is receiving periodic RTCP "RR" packets from the client, there should be no reason for the server to time out the connection.

You should contact your server manufacturer (Panasonic) to find out why the server stops sending data.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From sambhav at saranyu.in Mon Mar 19 21:08:01 2012
From: sambhav at saranyu.in (Kumar Sambhav)
Date: Tue, 20 Mar 2012 09:38:01 +0530
Subject: [Live-devel] Live555 RTSP Client and Panasonic BL-C210 IP camera
In-Reply-To: References: <16E24B4C-15CA-4041-91D8-2905C9E283AC@saranyu.in>
Message-ID: <74A9B4AB-283D-486A-BAD7-C97F47F7C1AD@saranyu.in>

openRTSP does not send a TEARDOWN. Yes, the camera stops sending data. From the network capture, I don't see any RTP data being sent by the camera. However, openRTSP continues to send RTCP packets.
Will contact the vendor and check with them.

On Mar 20, 2012, at 2:17 AM, Ross Finlayson wrote:

>> I am using the openRTSP client to receive RTSP streams from a Panasonic BL-C210 IP camera. After ~65 seconds openRTSP stops receiving data.
>> Any known issues with the live555 RTSP client accessing Panasonic cameras?
>
> No.
>
> Does "openRTSP" send a "TEARDOWN" command at all (to explicitly close the stream)? If not, the problem is not that 'openRTSP stops receiving data'; the problem is that 'the server (your Panasonic camera) stops sending data'.
>
> It appears that the server (the camera) is timing out the connection, for some reason. However, because you say that the server is receiving periodic RTCP "RR" packets from the client, there should be no reason for the server to time out the connection.
>
> You should contact your server manufacturer (Panasonic) to find out why the server stops sending data.
>
> Ross Finlayson
> Live Networks, Inc.
> http://www.live555.com/

From vigosslive at rambler.ru Mon Mar 19 06:48:40 2012
From: vigosslive at rambler.ru (Rustam)
Date: Mon, 19 Mar 2012 17:48:40 +0400
Subject: [Live-devel] Where in the code you demux VOB and with the help of some class then send via rtsp server?
Message-ID: <968130450.1332164920.179144024.38534@mcgi-wr-11.rambler.ru>

Hi, Ross!

I can't understand how you demux VOB. I need your help: which class performs the VOB demux?
And I can't find the place in the code where, after demuxing the VOB, you send this data via the RTSP server.

Thanks.

From trn200190 at gmail.com Mon Mar 19 07:57:14 2012
From: trn200190 at gmail.com (i m what i m ~~~~)
Date: Mon, 19 Mar 2012 20:27:14 +0530
Subject: [Live-devel] Error while building testH264streamer?
Message-ID:

I opened the testH264VideoStreamer.cpp file in VS2010 Express and added the additional include directories for all the header files (liveMedia.hh, BasicUsageEnvironment.hh and Groupsock.hh). After building the program I got the following errors:

1>------ Build started: Project: try1h264, Configuration: Debug Win32 ------
1> testH264VideoStreamer.cpp
1>c:\tarun\tarun\docs\live media\live555-latest.tar\live\testprogs\testh264videostreamer.cpp(81): warning C4996: 'getch': The POSIX name for this item is deprecated. Instead, use the ISO C++ conformant name: _getch. See online help for details.
1> c:\program files\microsoft visual studio 10.0\vc\include\conio.h(128) : see declaration of 'getch'
1>c:\tarun\tarun\docs\live media\live555-latest.tar\live\testprogs\testh264videostreamer.cpp(94): warning C4996: 'getch': The POSIX name for this item is deprecated. Instead, use the ISO C++ conformant name: _getch. See online help for details.
1> c:\program files\microsoft visual studio 10.0\vc\include\conio.h(128) : see declaration of 'getch'
1>c:\tarun\tarun\docs\live media\live555-latest.tar\live\testprogs\testh264videostreamer.cpp(116): warning C4996: 'getch': The POSIX name for this item is deprecated. Instead, use the ISO C++ conformant name: _getch. See online help for details.
1> c:\program files\microsoft visual studio 10.0\vc\include\conio.h(128) : see declaration of 'getch' 1>testH264VideoStreamer.obj : error LNK2019: unresolved external symbol "public: virtual __thiscall Groupsock::~Groupsock(void)" (??1Groupsock@ @UAE at XZ) referenced in function _main 1>testH264VideoStreamer.obj : *error LNK2019*: unresolved external symbol "public: char * __thiscall RTSPServer::rtspURL(class ServerMediaSession const *,int)const " (?rtspURL at RTSPServer@@QBEPADPBVServerMediaSession@@H at Z) referenced in function _main 1>testH264VideoStreamer.obj : *error LNK2019*: unresolved external symbol "public: void __thiscall RTSPServer::addServerMediaSession(class ServerMediaSession *)" (?addServerMediaSession at RTSPServer @@QAEXPAVServerMediaSession@@@Z) referenced in function _main 1>testH264VideoStreamer.obj : *error LNK2019*: unresolved external symbol "public: unsigned int __thiscall ServerMediaSession::addSubsession(class ServerMediaSubsession *)" (?addSubsession at ServerMediaSession @@QAEIPAVServerMediaSubsession@@@Z) referenced in function _main 1>testH264VideoStreamer.obj : *error LNK2019*: unresolved external symbol "public: static class PassiveServerMediaSubsession * __cdecl PassiveServerMediaSubsession::createNew(class RTPSink &,class RTCPInstance *)" (?createNew at PassiveServerMediaSubsession@@SAPAV1 at AAVRTPSink @@PAVRTCPInstance@@@Z) referenced in function _main 1>testH264VideoStreamer.obj :* error LNK2019*: unresolved external symbol "public: static class ServerMediaSession * __cdecl ServerMediaSession::createNew(class UsageEnvironment &,char const *,char const *,char const *,unsigned int,char const *)" (?createNew at ServerMediaSession@@SAPAV1 at AAVUsageEnvironment@@PBD11I1 at Z) referenced in function _main 1>testH264VideoStreamer.obj : *error LNK2019:* unresolved external symbol "public: static class RTSPServer * __cdecl RTSPServer::createNew(class UsageEnvironment &,class Port,class UserAuthenticationDatabase *,unsigned int)" 
(?createNew at RTSPServer@@SAPAV1 at AAVUsageEnvironment@@VPort@ @PAVUserAuthenticationDatabase@@I at Z) referenced in function _main 1>testH264VideoStreamer.obj : *error LNK2019*: unresolved external symbol "public: static class RTCPInstance * __cdecl RTCPInstance::createNew(class UsageEnvironment &,class Groupsock *,unsigned int,unsigned char const *,class RTPSink *,class RTPSource const *,unsigned int)" (?createNew at RTCPInstance@@SAPAV1 at AAVUsageEnvironment@@PAVGroupsock@ @IPBEPAVRTPSink@@PBVRTPSource@@I at Z) referenced in function _main 1>testH264VideoStreamer.obj :* error LNK2019*: unresolved external symbol _gethostname at 8 referenced in function _main 1>testH264VideoStreamer.obj : *error LNK2019*: unresolved external symbol "public: static class H264VideoRTPSink * __cdecl H264VideoRTPSink::createNew(class UsageEnvironment &,class Groupsock *,unsigned char)" (?createNew at H264VideoRTPSink@@SAPAV1 at AAVUsageEnvironment @@PAVGroupsock@@E at Z) referenced in function _main 1>testH264VideoStreamer.obj : *error LNK2001*: unresolved external symbol "public: static unsigned int OutPacketBuffer::maxSize" (?maxSize at OutPacketBuffer@@2IA) 1>testH264VideoStreamer.obj : error LNK2019: unresolved external symbol "public: void __thiscall Groupsock::multicastSendOnly(void)" (?multicastSendOnly at Groupsock@@QAEXXZ) referenced in function _main 1>testH264VideoStreamer.obj : error LNK2019: unresolved external symbol "public: __thiscall Groupsock::Groupsock(class UsageEnvironment &,struct in_addr const &,class Port,unsigned char)" (??0Groupsock@ @QAE at AAVUsageEnvironment@@ABUin_addr@@VPort@@E at Z) referenced in function _main 1>testH264VideoStreamer.obj : error LNK2019: unresolved external symbol "public: __thiscall Port::Port(unsigned short)" (??0Port@@QAE at G@Z) referenced in function _main 1>testH264VideoStreamer.obj : error LNK2019: unresolved external symbol "unsigned int __cdecl chooseRandomIPv4SSMAddress(class UsageEnvironment &)" 
(?chooseRandomIPv4SSMAddress@@YAIAAVUsageEnvironment@@@Z) referenced in function _main 1>testH264VideoStreamer.obj : error LNK2019: unresolved external symbol "public: static class BasicUsageEnvironment * __cdecl BasicUsageEnvironment::createNew(class TaskScheduler &)" (?createNew at BasicUsageEnvironment@@SAPAV1 at AAVTaskScheduler@@@Z) referenced in function _main 1>testH264VideoStreamer.obj : error LNK2019: unresolved external symbol "public: static class BasicTaskScheduler * __cdecl BasicTaskScheduler::createNew(void)" (?createNew at BasicTaskScheduler @@SAPAV1 at XZ) referenced in function _main 1>testH264VideoStreamer.obj : error LNK2019: unresolved external symbol "public: static void __cdecl Medium::close(class Medium *)" (?close at Medium @@SAXPAV1@@Z) referenced in function "void __cdecl afterPlaying(void *)" (?afterPlaying@@YAXPAX at Z) 1>testH264VideoStreamer.obj : error LNK2019: unresolved external symbol "public: unsigned int __thiscall MediaSink::startPlaying(class MediaSource &,void (__cdecl*)(void *),void *)" (?startPlaying at MediaSink @@QAEIAAVMediaSource@@P6AXPAX at Z1@Z) referenced in function "void __cdecl play(void)" (?play@@YAXXZ) 1>testH264VideoStreamer.obj : error LNK2019: unresolved external symbol "public: static class H264VideoStreamFramer * __cdecl H264VideoStreamFramer::createNew(class UsageEnvironment &,class FramedSource *,unsigned int)" (?createNew at H264VideoStreamFramer @@SAPAV1 at AAVUsageEnvironment@@PAVFramedSource@@I at Z) referenced in function "void __cdecl play(void)" (?play@@YAXXZ) 1>testH264VideoStreamer.obj : error LNK2019: unresolved external symbol "public: static class ByteStreamFileSource * __cdecl ByteStreamFileSource::createNew(class UsageEnvironment &,char const *,unsigned int,unsigned int)" (?createNew at ByteStreamFileSource @@SAPAV1 at AAVUsageEnvironment@@PBDII at Z) referenced in function "void __cdecl play(void)" (?play@@YAXXZ) 1>testH264VideoStreamer.obj : error LNK2019: unresolved external symbol "class 
DelayInterval __cdecl operator*(short,class DelayInterval const &)" (??D at YA?AVDelayInterval@@FABV0@@Z) referenced in function "void __cdecl `dynamic initializer for 'DELAY_MINUTE''(void)" (??__EDELAY_MINUTE@@YAXXZ)
1>testH264VideoStreamer.obj : error LNK2001: unresolved external symbol "class DelayInterval const DELAY_SECOND" (?DELAY_SECOND@@3VDelayInterval@@B)
1>c:\documents and settings\administrator\my documents\visual studio 2010\Projects\try1h264\Debug\try1h264.exe : fatal error LNK1120: 23 unresolved externals
========== Build: 0 succeeded, 1 failed, 0 up-to-date, 0 skipped ==========

Can anyone help me resolve this?

From albert.staubin at gmail.com Mon Mar 19 12:15:13 2012
From: albert.staubin at gmail.com (AJ)
Date: Mon, 19 Mar 2012 15:15:13 -0400
Subject: [Live-devel] TS streamed through unicast instead of multicast
Message-ID:

I was planning to wait for a response to my last email, but it is still stuck in moderation, and the situation has evolved. I am currently able to stream over multicast, but I would like to stream a file with "testMPEG2TransportStreamer" using unicast. I have followed the FAQ and the suggestions in the source files, but I seem to be coming up short. I have done the following so far:

- Changed the destinationAddressStr in "testMPEG2TransportStreamer" to equal my destination IP (my local IP).
- Changed the inputAddrStr to NULL in my RTSP server or receiver.

With these changes the RTSP server seems to not be receiving anything from the transport streamer, as it was with multicast. I have also noticed in Wireshark that there is no traffic to the destination port while I am attempting to use unicast. I have also compiled and run testReceiver; it receives some data but eventually gets overwhelmed and crashes. Am I missing something in my configuration? The FAQ and the comments make it seem like a simple enough change.
Are there any suggestions on how to debug the issue?

From renatomauro at libero.it Tue Mar 20 04:28:35 2012
From: renatomauro at libero.it (Renato MAURO (Libero))
Date: Tue, 20 Mar 2012 12:28:35 +0100
Subject: [Live-devel] Live555 RTSP Client and Panasonic BL-C210 IP camera
References: <16E24B4C-15CA-4041-91D8-2905C9E283AC@saranyu.in>
Message-ID: <97C53BB9530E4B5FBED3AA4A68C8D1EB@CSystemDev>

Hi Sambhav.

I had the same problem with some Samsung devices, since the servers on some cameras and DVRs don't handle RTCP messages; instead, Samsung decided to manage liveness with the GET_PARAMETER RTSP command. Samsung clearly states this (non-standard) behaviour in the developer's manual.

Hope this helps,

Renato

----- Original Message -----
From: "Kumar Sambhav"
To: "LIVE555 Streaming Media - development & use"
Sent: Monday, March 19, 2012 8:39 PM
Subject: [Live-devel] Live555 RTSP Client and Panasonic BL-C210 IP camera

> Hi,
>
> I am using the openRTSP client to receive RTSP streams from a Panasonic BL-C210 IP camera. After ~65 seconds openRTSP stops receiving data.
> Any known issues with the live555 RTSP client accessing Panasonic cameras?
>
> The server stops sending data if it does not receive RTCP packets within the timeout period. Is this correct?
>
> I captured data using Wireshark and saw that RTCP packets are sent and received by both server and client (as no ICMP packets were seen).
>
> Regards,
> Sambhav

From finlayson at live555.com Tue Mar 20 08:39:55 2012
From: finlayson at live555.com (Ross Finlayson)
Date: Tue, 20 Mar 2012 08:39:55 -0700
Subject: [Live-devel] Where in the code you demux VOB and with the help of some class then send via rtsp server?
In-Reply-To: <968130450.1332164920.179144024.38534@mcgi-wr-11.rambler.ru>
References: <968130450.1332164920.179144024.38534@mcgi-wr-11.rambler.ru>
Message-ID: <2AF3B5E0-1A6C-4C9A-B379-2FC76640A5FD@live555.com>

> I can't understand how you demux VOB. I need your help: which class performs the VOB demux?
> And I can't find the place in the code where, after demuxing the VOB, you send this data via the RTSP server.

A VOB file is just a special kind of MPEG Program Stream file - one that contains AC-3 audio instead of MPEG-1 or 2 audio. It is demultiplexed using a "MPEG1or2Demux". If you're curious, I suggest that you review the code for the "testVOBStreamer" demo application (in "testProgs").

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From vigosslive at rambler.ru Tue Mar 20 21:03:35 2012
From: vigosslive at rambler.ru (Rustam)
Date: Wed, 21 Mar 2012 08:03:35 +0400
Subject: [Live-devel] absent audio when I try a VOB file
Message-ID: <1239347272.1332302615.197066600.31386@mperl104.rambler.ru>

> Please put your file on a publicly-accessible web (or FTP) server, and post the URL (not the file itself)
> to this mailing list, so we can download it and take a look at it.

http://84.54.100.231:8080/advert.VOB

From vigosslive at rambler.ru Tue Mar 20 21:22:51 2012
From: vigosslive at rambler.ru (Rustam)
Date: Wed, 21 Mar 2012 08:22:51 +0400
Subject: [Live-devel] Where in the code you demux VOB and with the help of some class then send via rtsp server?
Message-ID: <297272191.1332303771.200686920.27299@mcgi-wr-24.rambler.ru>

> If you're curious, I suggest that you review the code for the "testVOBStreamer" demo application (in "testProgs").

I tried using this file, but when I select the "-a" option I can't hear any sound.
I decided to capture the RTP packets (via the RTSP server) and put them in a file, and then play them via VLC, but the sound had many defects. I therefore thought that VLC couldn't play such sound from a stream, but when I capture the RTP packets from VLC's own RTSP server the sound is excellent. Can you look into this problem? Thanks.

From sambhav at saranyu.in Wed Mar 21 00:49:03 2012
From: sambhav at saranyu.in (Kumar Sambhav)
Date: Wed, 21 Mar 2012 13:19:03 +0530
Subject: [Live-devel] Live555 RTSP Client and Panasonic BL-C210 IP camera
In-Reply-To: <97C53BB9530E4B5FBED3AA4A68C8D1EB@CSystemDev>
References: <16E24B4C-15CA-4041-91D8-2905C9E283AC@saranyu.in> <97C53BB9530E4B5FBED3AA4A68C8D1EB@CSystemDev>
Message-ID:

Hi Renato,

Thanks for sharing the information. The Panasonic manuals do not mention any such thing. Will check with their support.

Regards,
Sambhav

On Mar 20, 2012, at 4:58 PM, Renato MAURO (Libero) wrote:

> Hi Sambhav.
>
> I had the same problem with some Samsung devices, since the servers on some cameras and DVRs don't handle RTCP messages; instead, Samsung decided to manage liveness with the GET_PARAMETER RTSP command.
> Samsung clearly states this (non-standard) behaviour in the developer's manual.
>
> Hope this helps,
>
> Renato
>
> ----- Original Message ----- From: "Kumar Sambhav"
> To: "LIVE555 Streaming Media - development & use"
> Sent: Monday, March 19, 2012 8:39 PM
> Subject: [Live-devel] Live555 RTSP Client and Panasonic BL-C210 IP camera
>
>> Hi,
>>
>> I am using the openRTSP client to receive RTSP streams from a Panasonic BL-C210 IP camera. After ~65 seconds openRTSP stops receiving data.
>> Any known issues with the live555 RTSP client accessing Panasonic cameras?
>>
>> The server stops sending data if it does not receive RTCP packets within the timeout period. Is this correct?
>>
>> I captured data using Wireshark and saw that RTCP packets are sent and received by both server and client.
>> (as no ICMP packets were seen)
>>
>> Regards,
>> Sambhav

From finlayson at live555.com Wed Mar 21 00:56:36 2012
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 21 Mar 2012 00:56:36 -0700
Subject: [Live-devel] Where in the code you demux VOB and with the help of some class then send via rtsp server?
In-Reply-To: <297272191.1332303771.200686920.27299@mcgi-wr-24.rambler.ru>
References: <297272191.1332303771.200686920.27299@mcgi-wr-24.rambler.ru>
Message-ID: <61A5F030-E1CB-4483-A363-821C22AC31C0@live555.com>

> > If you're curious, I suggest that you review the code for the "testVOBStreamer" demo application (in "testProgs").
>
> I tried using this file, but when I select the "-a" option I can't hear any sound. I decided to capture the RTP packets (via the RTSP server) and put them in a file, and then play them via VLC, but the sound had many defects.

Thanks for the note. Yes, I've been able to reproduce this. I'll investigate this issue some more. I don't know yet whether this is a problem with our software, or with VLC.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From sebastien-devel at celeos.eu Wed Mar 21 01:32:28 2012
From: sebastien-devel at celeos.eu (Sébastien Escudier)
Date: Wed, 21 Mar 2012 09:32:28 +0100
Subject: [Live-devel] misleading error message
Message-ID: <1332318748.3939.3.camel@stim-desktop>

Hi Ross,

In RTSPClient.cpp::handleResponseBytes, newBytesRead is cast to unsigned, but this value can be negative when a socket error occurs.
In this case we will see the error message:

RTSP response was truncated. Increase "RTSPClient::responseBufferSize"

But that is not the case; the real error is a socket error.

Regards,
Sébastien.

From finlayson at live555.com Wed Mar 21 07:41:17 2012
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 21 Mar 2012 07:41:17 -0700
Subject: [Live-devel] misleading error message
In-Reply-To: <1332318748.3939.3.camel@stim-desktop>
References: <1332318748.3939.3.camel@stim-desktop>
Message-ID: <676D8CE8-77ED-49CB-B463-668F1BCFB6D1@live555.com>

Thanks for the report. I'll fix this in the next release of the software.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From enadler at WatchGuardVideo.com Wed Mar 21 12:54:53 2012
From: enadler at WatchGuardVideo.com (Eric Nadler)
Date: Wed, 21 Mar 2012 14:54:53 -0500
Subject: [Live-devel] Patch for uClibc segfault while streaming H.264
Message-ID:

I thought it would be good to mention here that I have submitted a patch to uClibc to avoid a segfault while streaming H.264 video with testOnDemandRTSPServer. The bug report and patch can be found here:

https://bugs.busybox.net/show_bug.cgi?id=4964

--
Sincerely,
Eric Nadler
Senior Software Engineer
WatchGuard Video

From finlayson at live555.com Wed Mar 21 16:52:25 2012
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 21 Mar 2012 16:52:25 -0700
Subject: [Live-devel] Patch for uClibc segfault while streaming H.264
In-Reply-To: References: Message-ID: <5AE1EAD6-01E3-40F2-9439-A3F9835C3DCB@live555.com>

> I thought it would be good to mention here that I have submitted a patch to
> uClibc to avoid a segfault while streaming H.264 video with
> testOnDemandRTSPServer. The bug report and patch can be found here:
>
> https://bugs.busybox.net/show_bug.cgi?id=4964

Eric,

Thanks for mentioning this.
Do you know if there's anything that we could do in our code ("liveMedia/Locale.cpp") to work around this bug, or will developers who use uClibc just need to wait for the bug to get fixed? Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From brado at bighillsoftware.com Wed Mar 21 18:38:53 2012 From: brado at bighillsoftware.com (Brad O'Hearne) Date: Wed, 21 Mar 2012 18:38:53 -0700 Subject: [Live-devel] RTSPClient w/significant frame loss In-Reply-To: References: <10238C60-3AD3-41EE-9FEF-5519E1E9E749@bighillsoftware.com> Message-ID: <29FE2E61-E347-41AE-8B46-E58F5F536889@bighillsoftware.com> Ross, Thanks for the reply. I appreciate the pointer to the additional information. I did read it and followed its advice, but in a nutshell, I'm still experiencing around 60% packet loss though doing very little processing in the afterGettingFrame() method callback in my MediaSink subclass. The FAQ makes the statement: "It's important to understand that because a LIVE555 Streaming Media application runs as a single thread - never writing to, or reading from, sockets concurrently - if packet loss occurs, then it must be happening either (i) on the network, or (ii) in the operating system of the sender or receiver. There's nothing in our code that can be 'losing' packets." I do not dispute that, but I need to know more about the implications of Live555 behavior, because even though it might not explicitly lose packets, the implications of its approach may result in the same for any app actually doing any processing. I hope there's something I'm missing, which can improve performance. I'll explain. Earlier in the thread, I mentioned that the big picture of my use case is receiving an H.264 video stream via RTSP with Live555, then decoding the received data with ffmpeg, and displaying the resulting video frame to the screen. 
I originally thought the data loss was occurring in the ffmpeg decoding and rendering to the screen, until I removed decoding and rendering entirely, and discovered the frame loss was experienced before even leaving the afterGettingFrame method. Here are the specifics of what I did:

1. I followed the FAQ's recommendations, which increased the receiveBuffer to (VLC's recommended) 2000000. (Note, I also tried 400000, same results as 2000000; also note there are no truncated bytes.)

2. I removed all ffmpeg decoding, rendering to the screen, and all processing whatsoever from the afterGettingFrame() callback on the MediaSink subclass, except for logging counts of frames received. Combined with the receiveBuffer increase in 1, this resulted in receipt of 28-29fps, which tracks with the 30fps being sent by the server. At this point, I was fairly certain that I had the Live555 side of the processing pipeline figured out, and there was nothing left to do.

3. I added back in the minimal processing I was doing in the afterGettingFrame() callback of the MediaSink subclass *but didn't add any decoding/displaying* -- all I am doing is copying a 4-byte start code and the data in the client buffer to another buffer to send it downstream. Specifically these two chunks of code:

// add the start code
len = 4;
memcpy((newFrame + offset), start_code, len);
offset += len;

// add the buffer data
memcpy((newFrame + offset), fReceiveBuffer, frameSize);

When the code above was commented out, I get no frame loss, around 28-29fps. When the code is included, I only get 10-14fps, i.e. around 60% frame loss. Note, nothing is being done in afterGettingFrame() or anywhere else with this buffer; after this code above, continuePlaying() is called. The reason I put this code in was for the sole purpose of gauging what effect, if any, this had on frame loss. Clearly, it has a huge impact. This now probably warrants some explanation of our stream being processed.

Our H.264 stream is very simple.
gop size is 1, each one with 3 sequential NAL units being sent repeatedly: NAL 7 | NAL 8 | NAL 5 NAL units 7 and 8 are 4-9 bytes each whose general purpose is just to communicate frame size. NAL unit 5 is consistently around 23K and is the frame image data. So in reference to the code snippets above, when afterGettingFrame() is called, I copy both a start code and the actual data into a new buffer. (Now in actual processing I'll copy prop records info as well, and then pass this on to ffmpeg, but this was all commented out for this test). That memcpy is basically copying 15 bytes of data for the first two afterGettingFrame calls and 23K on the third afterGettingFrame call for what essentially is one video frame. This brings us to the heart of the matter: the only processing being done here is just simply the creation of a new buffer. Why would this result in a 60% frame loss? No decoding or displaying of any kind is even being attempted. My assumption is that I'm missing or misunderstanding something fundamental about Live555, and there's a knob or switch to be turned or flipped somewhere that will solve the problem. Because as it is, I'm not seeing how any processing at all of this stream is realistically feasible (even handing off the processing) without losing over half the stream data. So I return to the statement in the FAQ -- if the culprit in frame loss is essentially afterGettingFrame() processing, is there any metric or quantification on whether any real processing (or even handing off of data) can actually be done without prohibitive data loss? There's got to be a way to receive a video stream and actually do something minimal with it without losing so much data. Ross, yours (and anyone else's) expertise and ideas here are greatly appreciated. Thanks, Brad Brad O'Hearne Founder / Lead Developer Big Hill Software LLC http://www.bighillsoftware.com On Mar 10, 2012, at 11:08 PM, Ross Finlayson wrote: >> 1. 
What factors / culprits could be causing this massive frame loss? >> >> 2. Is there anything in the base RTSPClient or MediaSink subclasses (or elsewhere) that can be configured to reduce frame loss? > > See > > You should also, of course, make sure that your receiver ("MediaSink" subclass)'s buffer is big enough for the incoming NAL units. (You can do this by checking whether "numTruncatedBytes" is ever >0.) > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From brado at bighillsoftware.com Wed Mar 21 19:15:20 2012 From: brado at bighillsoftware.com (Brad O'Hearne) Date: Wed, 21 Mar 2012 19:15:20 -0700 Subject: [Live-devel] RTSPClient w/significant frame loss In-Reply-To: <29FE2E61-E347-41AE-8B46-E58F5F536889@bighillsoftware.com> References: <10238C60-3AD3-41EE-9FEF-5519E1E9E749@bighillsoftware.com> <29FE2E61-E347-41AE-8B46-E58F5F536889@bighillsoftware.com> Message-ID: <9E002469-1E5E-4C73-9E26-CC3C6CE3FB8A@bighillsoftware.com> One quick point of clarification about GOP size....I think it is probably clear, but just in case it isn't: we are sending I-frames in our NAL 5s -- no differentials. The stream looks like this [7][8][5][7][8][5][7][8][5]... Each NAL 5 is an i-frame, 23K each. I hope that clarifies. Brad On Mar 21, 2012, at 6:38 PM, Brad O'Hearne wrote: > Ross, > > Thanks for the reply. I appreciate the pointer to the additional information. I did read it and followed its advice, but in a nutshell, I'm still experiencing around 60% packet loss though doing very little processing in the afterGettingFrame() method callback in my MediaSink subclass. 
The FAQ makes the statement: > > "It's important to understand that because a LIVE555 Streaming Media application runs as a single thread - never writing to, or reading from, sockets concurrently - if packet loss: occurs, then it must be happening either (i) on the network, or (ii) in the operating system of the sender or receiver. There's nothing in our code that can be 'losing' packets." > > I do not dispute that, but I need to know more about the implications of Live555 behavior, because even though it might not explicitly lose packets, the implications of its approach may result in the same for any app actually doing any processing. I hope there's something I'm missing, which can improve performance. I'll explain. > > Earlier in the thread, I mentioned that the big picture of my use case is receiving an H.264 video stream via RTSP with Live555, then decoding the received data with ffmpeg, and displaying the resulting video frame to the screen. I originally thought the data loss was occurring in the ffmpeg decoding and rendering to the screen, until I removed decoding and rendering entirely, and discovered the frame loss was experienced before even leaving the afterGettingFrame method. Here's the specifics of what I did: > > 1. I followed the FAQ's recommendations, which increased the receiveBuffer to (VLC's recommended) 2000000. (Note, I also tried 400000, same results as 2000000, also note there are no truncated bytes). > > 2. I removed all ffmpeg decoding, rendering to the screen, and all processing whatsoever from the afterGettingFrame() callback on the MediaSink subclass, except for logging counts of frames received. Combined with the receiveBuffer increase in 1, this resulted in receipt of 28-29fps, which tracks with the 30fps being sent by the server. At this point, I was fairly certain that I had the Live555 side of the processing pipeline figured out, and there was nothing left to do. > > 3. 
I added back in the minimal processing I was doing in the afterGettingFrame() callback of the MediaSink subclass *but didn't add any decoding/displaying* -- all I am doing is copying a 4-byte start code and the data in the client buffer to another buffer to send it downstream. Specifically these two chunks of code: > > // add the start code > len = 4; > memcpy((newFrame + offset), start_code, len); > offset += len; > > // add the buffer data > memcpy((newFrame + offset), fReceiveBuffer, frameSize); > > When the code above was commented out, I got no frame loss, around 28-29fps. When the code is included, I only get 10-14fps, i.e. around 60% frame loss. Note, nothing is being done in afterGettingFrame() or anywhere else with this buffer...after this code above, continuePlaying() is called. The reason I put this code in was for the sole purpose of gauging what effect, if any, this had on frame loss. Clearly, it has a huge impact. This now probably warrants some explanation of the stream being processed. > > Our H.264 stream is very simple. GOP size is 1, each one with 3 sequential NAL units being sent repeatedly: > > NAL 7 | NAL 8 | NAL 5 > > NAL units 7 and 8 are 4-9 bytes each, whose general purpose is just to communicate frame size. NAL unit 5 is consistently around 23K and is the frame image data. So in reference to the code snippets above, when afterGettingFrame() is called, I copy both a start code and the actual data into a new buffer. (Now in actual processing I'll copy prop records info as well, and then pass this on to ffmpeg, but this was all commented out for this test). That memcpy is basically copying 15 bytes of data for the first two afterGettingFrame calls and 23K on the third afterGettingFrame call for what essentially is one video frame. > > This brings us to the heart of the matter: the only processing being done here is simply the creation of a new buffer. Why would this result in a 60% frame loss?
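[Editor's note: the copy described in the snippets above can be written as a self-contained fragment. The names below (fReceiveBuffer, frameSize) mirror the post but are illustrative here, not part of the LIVE555 API.]

```cpp
#include <cstdint>
#include <cstring>
#include <vector>

// Sketch: prepend the 4-byte Annex-B start code 0x00000001 to a received
// NAL unit before handing it to a decoder, as described in the post.
std::vector<uint8_t> makeAnnexBFrame(const uint8_t* fReceiveBuffer,
                                     std::size_t frameSize) {
  static const uint8_t start_code[4] = {0x00, 0x00, 0x00, 0x01};
  std::vector<uint8_t> newFrame(sizeof(start_code) + frameSize);
  std::memcpy(newFrame.data(), start_code, sizeof(start_code)); // add the start code
  std::memcpy(newFrame.data() + sizeof(start_code),             // add the buffer data
              fReceiveBuffer, frameSize);
  return newFrame;
}
```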
No decoding or displaying of any kind is even being attempted. > > My assumption is that I'm missing or misunderstanding something fundamental about Live555, and there's a knob or switch to be turned or flipped somewhere that will solve the problem. Because as it is, I'm not seeing how any processing at all of this stream is realistically feasible (even handing off the processing) without losing over half the stream data. > > So I return to the statement in the FAQ -- if the culprit in frame loss is essentially afterGettingFrame() processing, is there any metric or quantification on whether any real processing (or even handing off of data) can actually be done without prohibitive data loss? There's got to be a way to receive a video stream and actually do something minimal with it without losing so much data. > > Ross, yours (and anyone else's) expertise and ideas here are greatly appreciated. > > Thanks, > > Brad > > Brad O'Hearne > Founder / Lead Developer > Big Hill Software LLC > http://www.bighillsoftware.com > > On Mar 10, 2012, at 11:08 PM, Ross Finlayson wrote: > >>> 1. What factors / culprits could be causing this massive frame loss? >>> >>> 2. Is there anything in the base RTSPClient or MediaSink subclasses (or elsewhere) that can be configured to reduce frame loss? >> >> See >> >> You should also, of course, make sure that your receiver ("MediaSink" subclass)'s buffer is big enough for the incoming NAL units. (You can do this by checking whether "numTruncatedBytes" is ever >0.) >> >> >> Ross Finlayson >> Live Networks, Inc. 
>> http://www.live555.com/ >> >> _______________________________________________ >> live-devel mailing list >> live-devel at lists.live555.com >> http://lists.live555.com/mailman/listinfo/live-devel > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Mar 21 23:09:43 2012 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 21 Mar 2012 23:09:43 -0700 Subject: [Live-devel] RTSPClient w/significant frame loss In-Reply-To: <29FE2E61-E347-41AE-8B46-E58F5F536889@bighillsoftware.com> References: <10238C60-3AD3-41EE-9FEF-5519E1E9E749@bighillsoftware.com> <29FE2E61-E347-41AE-8B46-E58F5F536889@bighillsoftware.com> Message-ID: > 1. I followed the FAQ's recommendations, which increased the receiveBuffer to (VLC's recommended) 2000000. (Note, I also tried 400000, same results as 2000000, also note there are no truncated bytes). Note also that the "increaseReceiveBufferTo()" function returns the actual resulting OS socket buffer size. (It does this by doing a getsockopt(..., SOL_SOCKET) on the socket.) So, if you haven't already done so, you should check the return value from these calls, just to make sure that the OS is actually setting the socket buffer size that you're requesting. In some OS's - e.g., Linux - you have to configure the maximum possible buffer size, before you call "increaseReceiveBufferTo()". (This is described in the FAQ entry.) Apart from the OS's socket buffering, the only other possible cause for 'dropped packets' is actual packet loss occurring on the network. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Thu Mar 22 02:06:32 2012 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 22 Mar 2012 02:06:32 -0700 Subject: [Live-devel] Where in the code you demux VOB and with the help of some class then send via rtsp server? In-Reply-To: <297272191.1332303771.200686920.27299@mcgi-wr-24.rambler.ru> References: <297272191.1332303771.200686920.27299@mcgi-wr-24.rambler.ru> Message-ID: <47C60F3B-A5D6-4AFA-8EC8-0D256E07ED9F@live555.com> I've now installed a new version (2012.03.22) of the "LIVE555 Streaming Media" code that fixes a bug that could cause invalid AC3 data to be streamed from VOB files. (Thanks to Rustam for reporting this problem.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From enadler at WatchGuardVideo.com Thu Mar 22 08:46:09 2012 From: enadler at WatchGuardVideo.com (Eric Nadler) Date: Thu, 22 Mar 2012 10:46:09 -0500 Subject: [Live-devel] Patch for uClibc segfault while streaming H.264 In-Reply-To: <5AE1EAD6-01E3-40F2-9439-A3F9835C3DCB@live555.com> Message-ID: Hi Ross, I recompiled live555 with -DLOCALE_NOT_USED in COMPILE_OPTS. This also avoided the segfault, and could be a workaround for those that don't want to or can't modify uClibc. Eric On 3/21/12 6:52 PM, "Ross Finlayson" wrote: >> I thought it would be good to mention here that I have submitted a patch to >> uClibc to avoid a segfault while streaming H.264 video with >> testOnDemandRTSPServer. The bug report and patch can be found here: >> >> https://bugs.busybox.net/show_bug.cgi?id=4964 > > Eric, > > Thanks for mentioning this. Do you know if there's anything that we could do > in our code ("liveMedia/Locale.cpp") to work around this bug, or will > developers who use uClibc just need to wait for the bug to get fixed? > > > Ross Finlayson > Live Networks, Inc.
> http://www.live555.com/ > > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From brado at bighillsoftware.com Thu Mar 22 15:24:09 2012 From: brado at bighillsoftware.com (Brad O'Hearne) Date: Thu, 22 Mar 2012 15:24:09 -0700 Subject: [Live-devel] RTSPClient w/significant frame loss In-Reply-To: References: <10238C60-3AD3-41EE-9FEF-5519E1E9E749@bighillsoftware.com> <29FE2E61-E347-41AE-8B46-E58F5F536889@bighillsoftware.com> Message-ID: <0F1AFE50-E67D-41B2-9D8C-2D4397D925CE@bighillsoftware.com> Ross, Thank you for your reply. To your comments: 1. I am fairly confident the increase of the receive buffer is taking effect, because that change alone eliminated 100% of frame loss when the afterGettingFrame() method did nothing but log NAL counts. I was getting near 30fps. 2. I do not think the network has anything to do with it. The network is a closed LAN (i.e. not connected to the Internet or WAN), there are no other machines on the network (zero traffic), and the client is sitting about three feet from the Wifi router. Plus, as stated in 1, if the network were affecting something, I wouldn't be seeing frame loss. So I eliminated all memory allocation and memcpy of the 23K of data into the new frame buffer from the receive buffer, and I was back to getting almost zero frame loss. But as soon as I added in anything else, I virtually immediately returned to about 2/3 (66%) frame loss again. But now I have a good specific example to discuss -- ffmpeg decoding. Now actually, I did not add everything back in that is required in processing, just a single call to avcodec_decode_video2 -- no scaling, color conversion, or rendering was done. The processing time for this one call averaged around ~19 milliseconds, which resulted in 66% frame loss again.
I'm starting to sense that it is immaterial what is done in afterGettingFrame(), that it is on the brink of the limit of its ability to process 30fps before afterGettingFrame() is even called, and anything done in afterGettingFrame() is going to result in frame loss. I guess in light of what I've found, it makes me wonder if the fact that the testRTSPClient example doesn't actually do any processing in its afterGettingFrame() method might be more than coincidence. For comparison, I cracked open the VLC code which uses LIVE555 and found that they don't appear to go the MediaSink route at all, which makes me wonder if the general approach is flawed, and I need to go farther upstream to streamline the RTSPClient. Two questions: 1. Are you (or anyone) aware of what actual processing can realistically be done in the afterGettingFrame() method without losing data? 2. Do you (or anybody) actually have a sample that is architected like the testRTSPClient example which actually does meaningful processing and doesn't lose major data by processing in afterGettingFrame()? If there's any example out there that shows it is possible to actually process video in afterGettingFrame() without major frame loss, I think that would help immensely. Thanks, Brad Brad O'Hearne Founder / Lead Developer Big Hill Software LLC http://www.bighillsoftware.com On Mar 21, 2012, at 11:09 PM, Ross Finlayson wrote: >> 1. I followed the FAQ's recommendations, which increased the receiveBuffer to (VLC's recommended) 2000000. (Note, I also tried 400000, same results as 2000000, also note there are no truncated bytes). > > Note also that the "increaseReceiveBufferTo()" function returns the actual resulting OS socket buffer size. (It does this by doing a getsockopt(..., SOL_SOCKET) on the socket.) So, if you haven't already done so, you should check the return value from these calls, just to make sure that the OS is actually setting the socket buffer size that you're requesting.
In some OS's - e.g., Linux - you have to configure the maximum possible buffer size, before you call "increaseReceiveBufferTo()". (This is described in the FAQ entry.) > > Apart from the OS's socket buffering, the only other possible cause for 'dropped packets' is actual packet loss occurring on the network. > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From sambhav at saranyu.in Thu Mar 22 20:00:59 2012 From: sambhav at saranyu.in (Kumar Sambhav) Date: Fri, 23 Mar 2012 08:30:59 +0530 Subject: [Live-devel] RTSPClient w/significant frame loss In-Reply-To: <0F1AFE50-E67D-41B2-9D8C-2D4397D925CE@bighillsoftware.com> References: <10238C60-3AD3-41EE-9FEF-5519E1E9E749@bighillsoftware.com> <29FE2E61-E347-41AE-8B46-E58F5F536889@bighillsoftware.com> <0F1AFE50-E67D-41B2-9D8C-2D4397D925CE@bighillsoftware.com> Message-ID: Hi Brad, I had done processing in afterGettingFrame() by copying the data to a Linux named PIPE on a ~300MHz ARM based system. You can try a few experiments. 1. Did you profile the time taken for the memcpy in afterGettingFrame()? 2. In every afterGettingFrame() call, instead of using the new incoming data, use pre-initialized data (e.g. a black frame) and do the processing with it. 3. You can move the processing to be done in afterGettingFrame() to a different thread. You can run live555 in one thread and ffmpeg in another and share data using PIPES or ring buffers. FFmpeg may internally use threads to decode (not 100% sure on this). 4. Try with a low FPS / bitrate clip. 5. Try requesting TCP data from the server instead of UDP to see if the behavior changes. Regards, Sambhav On Mar 23, 2012, at 3:54 AM, Brad O'Hearne wrote: > Ross, > > Thank you for your reply.
To your comments: > > 1. I am fairly confident the increase of the receive buffer is taking effect, because that change alone eliminated 100% of frame loss when the afterGettingFrame() method did nothing but log NAL counts. I was getting near 30fps. > > 2. I do not think the network has anything to do with it. The network is a closed LAN (i.e. not connected to the Internet or WAN), there are no other machines on the network (zero traffic), and the client is sitting about three feet from the Wifi router. Plus, as stated in 1, if the network were affecting something, I wouldn't be seeing frame loss. > > So I eliminated all memory allocation and memcpy of the 23K of data into the new frame buffer from the receive buffer, and I was back to getting almost zero frame loss. But as soon as I added in anything else, I virtually immediately returned to about 2/3 (66%) frame loss again. But now I have a good specific example to discuss -- ffmpeg decoding. Now actually, I did not add everything back in that is required in processing, just a single call to avcodec_decode_video2 -- no scaling, color conversion, or rendering was done. The processing time for this one call averaged around ~19 milliseconds, which resulted in 66% frame loss again. I'm starting to sense that it is immaterial what is done in afterGettingFrame(), that it is on the brink of the limit of its ability to process 30fps before afterGettingFrame() is even called, and anything done in afterGettingFrame() is going to result in frame loss. > > I guess in light of what I've found, it makes me wonder if the fact that the testRTSPClient example doesn't actually do any processing in its afterGettingFrame() method might be more than coincidence. For comparison, I cracked open the VLC code which uses LIVE555 and found that they don't appear to go the MediaSink route at all, which makes me wonder if the general approach is flawed, and I need to go farther upstream to streamline the RTSPClient. > > Two questions: > > 1.
Are you (or anyone) aware of what actual processing can realistically be done in the afterGettingFrame() method without losing data? > > 2. Do you (or anybody) actually have a sample that is architected like the testRTSPClient example which actually does meaningful processing and doesn't lose major data by processing in afterGettingFrame()? > > If there's any example out there that shows it is possible to actually process video in afterGettingFrame() without major frame loss, I think that would help immensely. > > Thanks, > > Brad > > Brad O'Hearne > Founder / Lead Developer > Big Hill Software LLC > http://www.bighillsoftware.com > > On Mar 21, 2012, at 11:09 PM, Ross Finlayson wrote: > >>> 1. I followed the FAQ's recommendations, which increased the receiveBuffer to (VLC's recommended) 2000000. (Note, I also tried 400000, same results as 2000000, also note there are no truncated bytes). >> >> Note also that the "increaseReceiveBufferTo()" function returns the actual resulting OS socket buffer size. (It does this by doing a getsockopt(..., SOL_SOCKET) on the socket.) So, if you haven't already done so, you should check the return value from these calls, just to make sure that the OS is actually setting the socket buffer size that you're requesting. In some OS's - e.g., Linux - you have to configure the maximum possible buffer size, before you call "increaseReceiveBufferTo()". (This is described in the FAQ entry.) >> >> Apart from the OS's socket buffering, the only other possible cause for 'dropped packets' is actual packet loss occurring on the network. >> >> >> Ross Finlayson >> Live Networks, Inc.
>> http://www.live555.com/ >> >> _______________________________________________ >> live-devel mailing list >> live-devel at lists.live555.com >> http://lists.live555.com/mailman/listinfo/live-devel > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Mar 22 21:23:54 2012 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 22 Mar 2012 21:23:54 -0700 Subject: [Live-devel] RTSPClient w/significant frame loss In-Reply-To: <0F1AFE50-E67D-41B2-9D8C-2D4397D925CE@bighillsoftware.com> References: <10238C60-3AD3-41EE-9FEF-5519E1E9E749@bighillsoftware.com> <29FE2E61-E347-41AE-8B46-E58F5F536889@bighillsoftware.com> <0F1AFE50-E67D-41B2-9D8C-2D4397D925CE@bighillsoftware.com> Message-ID: <7923988C-3F0D-4878-B84E-56FC90683E95@live555.com> This is getting silly. You're doing a lot of mental 'flailing around'; yet the answer is right in front of you: > So I eliminated all memory allocation and memcpy of the 23K of data into the new frame buffer from the receive buffer, and I was back to getting almost zero frame loss. But as soon as I added in anything else, I virtually immediately returned to about 2/3 (66%) frame loss again. But now I have a good specific example to discuss -- ffmpeg decoding. Now actually, I did not add everything back in that is required in processing, just a single call to avcodec_decode_video2 -- no scaling, color conversion, or rendering was done. The processing time for this one call averaged around ~19 milliseconds There's the problem. It's taking your CPU a whopping 19ms to decode each frame. That means that - even if your CPU did absolutely nothing else - you couldn't handle any more than 50 frames-per-second. In practice, your limit will be a lot less than this. It's clear that your CPU is woefully underpowered.
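[Editor's note: the 50 fps ceiling stated above follows directly from the measured decode time; a quick back-of-envelope sketch, using only the figures quoted in this thread:]

```cpp
// Rough throughput arithmetic from the figures quoted in the thread:
// a 19 ms per-frame decode caps a single-threaded receive+decode loop at
// 1000/19 ~= 52.6 fps before ANY other work (receive, copy, display).
double maxFpsGivenDecodeMs(double decodeMs) { return 1000.0 / decodeMs; }

// At a 30 fps arrival rate each loop iteration must finish within
// 1000/30 ~= 33.3 ms; a 19 ms decode leaves only ~14 ms for everything else.
double perFrameBudgetMs(double arrivalFps) { return 1000.0 / arrivalFps; }
```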
It's really quite simple: *Every* LIVE555-based application that receives and processes an RTP stream is structured as follows: while (1) { Step A: Get a frame of data from a socket; Step B: Do something with this frame of data; } That's it. "Step A" is what the LIVE555 library does. "Step B" is the new code that you add. Every receiving application is structured like this - including VLC. (The reason the VLC code looks a bit different is because it uses its own event loop, not ours. But its structure is essentially the same.) Note that your computer needs to be powerful enough so that it can execute each iteration of this loop (i.e., Step A + Step B) at least as fast as frames arrive. If your computer can't handle frames faster than they arrive, then you will lose data. End of story. If this happens, then increasing the OS's socket buffer size can't save you. Note also that - for each frame - "Step A" needs to be done before "Step B". Thus the loop. There's no way you can restructure this to make it more efficient. (But wait, I hear some people say. What if you used threads, so that one thread did while (1) { Step A: Get a frame of data from a socket; } and another thread did while (1) { Step B: Do something with this frame of data; } ? That wouldn't help you. Because each "Step B" relies upon a frame that was generated by "Step A", you'd need to write your own (thread-safe) queueing mechanism to buffer the frame data that gets passed between the two threads. But this would just be duplicating - less efficiently - the socket buffering that's already inside the operating system. So it would actually end up being less efficient than the single-threaded loop. Note also that "Step A" consumes relatively little CPU, so you'd be unlikely to get any benefit even if you have multiple cores.)
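[Editor's note: for concreteness, the (thread-safe) queueing mechanism described above, and suggested for experiment in Sambhav's earlier message, would look roughly like the generic C++11 sketch below. This is not LIVE555 code, and per the caveat above it only pays off if "Step B" can genuinely run on a second core.]

```cpp
#include <condition_variable>
#include <cstdint>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

// Generic sketch of a queue sitting between "Step A" (network receive)
// and "Step B" (decode). Not part of the LIVE555 API.
class FrameQueue {
public:
  void push(std::vector<uint8_t> frame) {
    std::lock_guard<std::mutex> lk(m_);
    q_.push(std::move(frame));
    cv_.notify_one();
  }
  // Blocks until a frame is available; returns false once closed and drained.
  bool pop(std::vector<uint8_t>& out) {
    std::unique_lock<std::mutex> lk(m_);
    cv_.wait(lk, [this] { return !q_.empty() || closed_; });
    if (q_.empty()) return false;
    out = std::move(q_.front());
    q_.pop();
    return true;
  }
  void close() {
    std::lock_guard<std::mutex> lk(m_);
    closed_ = true;
    cv_.notify_all();
  }
private:
  std::mutex m_;
  std::condition_variable cv_;
  std::queue<std::vector<uint8_t>> q_;
  bool closed_ = false;
};
```

A receiver thread would call push() from its frame-arrival callback, a decoder thread would loop on pop(), and close() lets the decoder drain and exit cleanly at teardown.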
If - as appears to be the case for you - your CPU isn't powerful enough to handle the incoming stream of frames, then you have three possible solutions: 1/ Reduce the incoming frame rate, or the size or complexity of each incoming frame. I.e., you'll need to change your encoder. 2/ Improve your implementation of "Step B" - i.e., your code. (If (and only if) you have a multi-core CPU, then this is where you *might* be able to use threads - just within your "Step B" code - to improve performance.) 3/ Use a faster CPU. The basic problem here is more fundamental. Most people's exposure to the Internet these days is solely through HTTP, which - being a TCP-based protocol - means that data never arrives faster than the receiver can process it. But datagram-based applications are different, and unfortunately many people these days - who think that the World-Wide Web is the entire Internet - don't appreciate this. They try using the LIVE555 code and then, when they see network data loss for the first time in their life, think that the LIVE555 code has to be to blame. Quite frankly, I'm getting tired of this. This will be my last posting on this particular topic. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From trn200190 at gmail.com Wed Mar 21 08:08:51 2012 From: trn200190 at gmail.com (i m what i m ~~~~) Date: Wed, 21 Mar 2012 20:38:51 +0530 Subject: [Live-devel] error message Message-ID: While running testH264VideoStreamer.cpp, an error appears in the output window of VS2010: First-chance exception at 0x7c812afb in nnew11.exe: Microsoft C++ exception: int at memory location 0x0012fa70. This error appears in the command prompt window after "Beginning to read from the file". Please help me resolve it. -------------- next part -------------- An HTML attachment was scrubbed...
URL: From mailme_vinaytyagi at yahoo.com Wed Mar 21 10:37:29 2012 From: mailme_vinaytyagi at yahoo.com (Vinay Tyagi) Date: Wed, 21 Mar 2012 10:37:29 -0700 (PDT) Subject: [Live-devel] Help on testOnDemandRTSPServer In-Reply-To: <1332156103.23624.YahooMailNeo@web113419.mail.gq1.yahoo.com> References: <1332156103.23624.YahooMailNeo@web113419.mail.gq1.yahoo.com> Message-ID: <1332351449.91755.YahooMailNeo@web113409.mail.gq1.yahoo.com> Hi, Point#1: I need to give a VLC-converted UDP/RTP stream to testOnDemandRTSPServer and need to generate an RTSP stream. Is this solution workable? Point#2: Can I change port 8554 to 554 in testOnDemandRTSPServer.cpp? If yes, then how? I tried to change it in the line RTSPServer* rtspServer = RTSPServer::createNew(*env, 554, authDB) but after compiling, while running the EXE it gives an error - failed to create RTSP, port number <554> is unknown. Please help how can I change port if it is not open in environment! Regards, Vinay -------------- next part -------------- An HTML attachment was scrubbed...
I tried to change it in > line RTSPServer* rtspServer = RTSPServer::createNew(*env, 554, authDB) but after compile while running EXE it is giving an error - Most Unix-based OSs (including Linux) don't let you use port numbers lower than 1024 unless you're running as "root". Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From sachint at hcl.com Fri Mar 23 02:43:19 2012 From: sachint at hcl.com (Sachin Taraiya - ERS, HCL Tech) Date: Fri, 23 Mar 2012 15:13:19 +0530 Subject: [Live-devel] 453 Not Enough Bandwidth Message-ID: <45F3FDEC759D21468F1F910D2B1CC43BC588003A41@NDA-HCLT-EVS04.HCLT.CORP.HCL.IN> Hi, I am using the openRTSP client for testing my own RTSPProxy server. For load testing I want to create several concurrent connections to my server using openRTSP client but I am getting the following error after I am reaching approximately 104 connections. "453 Not Enough Bandwidth" Is this a known issue? Are there any workarounds to resolve this problem? I need to test it in the range of 500 to 1000 concurrent connections. Any help greatly appreciated. Regards Sachin ________________________________ ::DISCLAIMER:: ----------------------------------------------------------------------------------------------------------------------- The contents of this e-mail and any attachment(s) are confidential and intended for the named recipient(s) only. It shall not attach any liability on the originator or HCL or its affiliates. Any views or opinions presented in this email are solely those of the author and may not necessarily reflect the opinions of HCL or its affiliates. Any form of reproduction, dissemination, copying, disclosure, modification, distribution and / or publication of this message without the prior written consent of the author of this e-mail is strictly prohibited. If you have received this email in error please delete it and notify the sender immediately.
Before opening any mail and attachments please check them for viruses and defect. ----------------------------------------------------------------------------------------------------------------------- -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Mar 23 07:00:11 2012 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 23 Mar 2012 07:00:11 -0700 Subject: [Live-devel] 453 Not Enough Bandwidth In-Reply-To: <45F3FDEC759D21468F1F910D2B1CC43BC588003A41@NDA-HCLT-EVS04.HCLT.CORP.HCL.IN> References: <45F3FDEC759D21468F1F910D2B1CC43BC588003A41@NDA-HCLT-EVS04.HCLT.CORP.HCL.IN> Message-ID: > I am using the openRTSP client for testing my own RTSPProxy server. For load testing I want to create several concurrent connections to my server using openRTSP client but I am getting the following error after I am reaching approximately 104 connections. > > "453 Not Enough Bandwidth" Our code doesn't return this error message anywhere. Therefore, because you are running an RTSP 'proxy', the error message must be getting returned by the downstream server (which must have been developed by someone else). Therefore, this is not an issue with our software (although, I suspect the error message probably means what it says...). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From albert.staubin at gmail.com Thu Mar 22 10:58:48 2012 From: albert.staubin at gmail.com (AJ) Date: Thu, 22 Mar 2012 13:58:48 -0400 Subject: [Live-devel] issues with TS video streamed through unicast Raw UDP and RTP Message-ID: I currently have a DynamicRTSP Server built off the example onDemandRtsp server. My end goal is to use Raw UDP from a "testMPEG2TransportStreamer" like program that will stream TS packets to the server.
I have the transport streamer working in two forms: the Live555 "testMPEG2TransportStreamer.exe" that uses RTP to send TS data, and a version of my own that sends the TS data over RawUDP. Now that you know where I currently am, here are my issues. Issue 1 - Unicast Transport streamer vs Multicast streamer Bug: When I stand up my RTSP server, if the "testMPEG2TransportStreamer.exe" or my Transport Streamer is already streaming video over unicast, then when a client connects and the ServerMediaSession is created, they will not receive any of the video that is being sent from the transport streamer. On the other hand, if the client connects first and then the transport streamer is started, the video is sent and received according to plan. When I stand up the same RTSP server and use "testMPEG2TransportStreamer.exe" with multicast instead of unicast, when the client connects and the ServerMediaSession is created the TS data is sent all the way through and the client is able to display it. In this case the client can connect before or after the "testMPEG2TransportStreamer.exe" starts streaming. Issue 2 - Raw UDP Unicast vs RTP Unicast client connection Bug: When I use RawUDP Unicast I run into an issue when I try to connect new users while the "testMPEG2TransportStreamer.exe" is already streaming and the first client is receiving (as long as I follow the steps mentioned in issue 1). All new clients will not receive the video until the "testMPEG2TransportStreamer.exe" or my transportStreamer is stopped and started again. This issue does not seem to exist when using RTP unicast or multicast. I currently do not have a Raw UDP multicast Transport streamer to test that setup. To sum this all up, there seems to be an issue in the handling of incoming Unicast UDP and RTP stream sources. Is this expected performance? Where should I look in the code to determine if there is an issue in the Groupsock that is being used for either of these situations?
I am still trying to debug these issues on my own, but any help would be greatly appreciated. Thank you for your help. -------------- next part -------------- An HTML attachment was scrubbed... URL: From vigosslive at rambler.ru Thu Mar 22 22:32:05 2012 From: vigosslive at rambler.ru (Rustam) Date: Fri, 23 Mar 2012 09:32:05 +0400 Subject: [Live-devel] Where in the code you demux VOB and with the help of some class then send via rtsp server? Message-ID: <837541926.1332480725.235189544.40042@mperl110.rambler.ru> Hi, Ross. Your new version (22.03.12) works when I catch the RTP packets and put them in a file (the sound turns out perfect), but when I try to play a VOB via an RTSP stream there is again no audio. I know where the mistake is: when you send the first and second RTP packets, they are correct, but when you send the third packet only the first 477 bytes are correct and the rest is trash; the fourth packet is entirely trash too; in the fifth RTP packet the first 477 bytes are incorrect but the other 943 bytes are correct. All subsequent RTP packets are completely correct. The conclusion is this: you are sending 1536 bytes of trash. When I catch the RTP packets, my application deletes this trash and the whole file turns out correct. I think this trash is why video players cannot play the sound via the RTSP stream. Can you fix this problem? Thanks. From trn200190 at gmail.com Fri Mar 23 01:30:16 2012 From: trn200190 at gmail.com (i m what i m ~~~~) Date: Fri, 23 Mar 2012 14:00:16 +0530 Subject: [Live-devel] urgent compile error Message-ID: I am compiling RTSPServerSupportingHTTPStreaming.cpp in VS2010 Express and I am getting the following error: MSVCRTD.lib(crtexe.obj) : error LNK2019: unresolved external symbol _main referenced in function ___tmainCRTStartup I have linked all the libraries and header files. Please help me out. -------------- next part -------------- An HTML attachment was scrubbed...
URL: From daniel.liu17 at gmail.com Fri Mar 23 03:59:56 2012 From: daniel.liu17 at gmail.com (=?GB2312?B?wfXB+rfJ?=) Date: Fri, 23 Mar 2012 18:59:56 +0800 Subject: [Live-devel] why RTCP BYE does not invoke client's "TEARDOWN" Message-ID: Hi, I want to close the RTSPClientSessions belonging to one ServerMediaSession from the server side. I send an RTCP "BYE" message from the server side, but it doesn't cause the client to send a "TEARDOWN". The client (openRTSP) did receive the "BYE" message, and it printed "Received RTCP "BYE"...". What's the reason? If this method cannot work, are there any other ways to shut down sessions from the server side? Thanks and Best Regards! --Daniel -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Mar 23 07:47:31 2012 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 23 Mar 2012 07:47:31 -0700 Subject: [Live-devel] Where in the code you demux VOB and with the help of some class then send via rtsp server? In-Reply-To: <837541926.1332480725.235189544.40042@mperl110.rambler.ru> References: <837541926.1332480725.235189544.40042@mperl110.rambler.ru> Message-ID: > Your new version (22.03.12) works when I capture the RTP packets and write them to a file (the sound turns out perfect), but when I try to play the VOB via an RTSP stream there is again no audio. When I run "testOnDemandRTSPServer" (built with the latest version of the code), using your file "advert.VOB" - renamed as "test.vob" - I am able to use "VLC" to play the RTSP/RTP stream - using the url rtsp://:8554/vobTest with no audio (or video) problems. If this file also works for you (using "testOnDemandRTSPServer" and "VLC"), then please give me an example of a different VOB server that doesn't work. If, however, "testOnDemandRTSPServer" and "VLC" does *not* work for you - using the file "advert.VOB", renamed as "test.vob" - then I won't be able to help you, because it works OK for me. Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Mar 23 07:59:20 2012 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 23 Mar 2012 07:59:20 -0700 Subject: [Live-devel] issues with TS video streamed through unicast Raw UDP and RTP In-Reply-To: References: Message-ID: <223D5DC9-4AD4-4C15-A548-FF08979A4E6E@live555.com> > When I stand up my RTSP server, if the "testMPEG2TransportStreamer.exe" or my Transport Streamer are already streaming video over unicast then when a client connects and the ServerMediaSession is created, they will not receive any of the video that is being sent from the transport streamer. On the other hand if the client connects first and then the transport streamer is started the video is sent and received according to plan. That's probably an issue only when the 'streamer' application and the 'receiver' application are running on the same computer. It shouldn't happen when the 'streamer' is on a separate computer from the 'receiver'. I don't know the reason for this, but it's not significant, because if the 'streamer' and the 'receiver' are running on the same computer (or on the same LAN, for that matter), then you can just use multicast rather than unicast. I can't help you with your second issue, unfortunately, because it concerns your own modifications that you made to the source code (to stream/receive raw UDP). But, because you have control over both the 'streamer' and the 'receiver', you shouldn't be streaming using raw UDP anyway. Continue to use RTP. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Mar 23 08:01:44 2012 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 23 Mar 2012 08:01:44 -0700 Subject: [Live-devel] Where in the code you demux VOB and with the help of some class then send via rtsp server? 
In-Reply-To: <837541926.1332480725.235189544.40042@mperl110.rambler.ru> References: <837541926.1332480725.235189544.40042@mperl110.rambler.ru> Message-ID: Sorry, I made a mistake in my last response. I meant to say If this file ("advert.VOB", renamed as "test.vob") also works for you (using "testOnDemandRTSPServer" and "VLC"), then please give me an example of a different VOB ***file*** that doesn't work. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Mar 23 08:07:16 2012 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 23 Mar 2012 08:07:16 -0700 Subject: [Live-devel] why RTCP BYE does not invoke client's "TEARDOWN" In-Reply-To: References: Message-ID: <98EB7589-6B2B-42A1-B619-C767FEF49430@live555.com> > I want to close the RTSPClientSessions belong to one ServerMediaSession from the server side. > I send RTCP "BYE" message from server side, but it doesn't invoke the client's sending "TEARDOWN". But the client(openRTSP) had received the "BYE" message and it had printed the "Received RTCP "BYE"...". > What's the reason? I don't know. "openRTSP" (and "testRTSPClient") should be sending a RTSP "TEARDOWN" command after it receives a RTCP "BYE" from the server. (Note that it is the client's choice whether to send a "TEARDOWN"; the server cannot 'force' the client to send a "TEARDOWN". However, our code for both "openRTSP" and "testRTSPClient" should be sending a "TEARDOWN" after it receives a "BYE".) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Mar 23 11:45:35 2012 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 23 Mar 2012 11:45:35 -0700 Subject: [Live-devel] Reminder: Please trim your responses before posting them to the mailing list. 
Message-ID: <728E7748-927C-44AD-98EF-14B8D0F335DE@live555.com> A quick reminder to everyone that if you are responding to a previous message (especially if it was a long message), then please trim your response so that you're quoting only the relevant part(s) of the original message. Don't include the entire quoted previous message in your response (unless the previous message was very short). This is basic email 'netiquette', but unfortunately I've had to reject two messages in the past 24 hours because of this. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From brado at bighillsoftware.com Fri Mar 23 12:07:33 2012 From: brado at bighillsoftware.com (Brad O'Hearne) Date: Fri, 23 Mar 2012 12:07:33 -0700 Subject: [Live-devel] RTSPClient w/significant frame loss In-Reply-To: <7923988C-3F0D-4878-B84E-56FC90683E95@live555.com> References: <10238C60-3AD3-41EE-9FEF-5519E1E9E749@bighillsoftware.com> <29FE2E61-E347-41AE-8B46-E58F5F536889@bighillsoftware.com> <0F1AFE50-E67D-41B2-9D8C-2D4397D925CE@bighillsoftware.com> <7923988C-3F0D-4878-B84E-56FC90683E95@live555.com> Message-ID: Ross, I am attempting to make sense of what I am seeing, and this statement: > This is getting silly. You're doing a lot of mental 'flailing around'; yet the answer is right in front of you: is not necessarily accurate. Up until this most recent example I provided, I reported 66% frame loss without even doing any decoding attempts at all, but instead a simple memcpy of 23K to create a new frame with the received data. For further interest, I also took 100% of the memcpy (afterGettingFrame()) work and dropped it into a block that I sent to a dispatch_async call to Grand Central Dispatch which immediately returned. It changed nothing. In other words, I cannot even dispatch any work asynchronously without suffering 2/3 frame loss. 
Your statement, > while (1) { > Step A: Get a frame of data from a socket; > Step B: Do something with this frame of data; > } Doesn't necessarily pin the issue on Step B. It pins the issue on Step A OR Step B, or how both steps do things together based on their design. Let's do the math. In the example below, I'm getting around 10fps where 19ms of that is decoding. That means that every second, step B is 190ms. It also means that Step A is 810ms. Now I'm not saying that LIVE555 is inefficient -- not saying that at all. All I'm saying is this: - There's something else very curious about the situations I've encountered where there is frame loss vs. when there isn't. When I'm only logging frame counts, i.e. doing nothing in the afterGettingFrame() method, I can get 30fps. That means that in this case, LIVE555 (Step A) is consuming a maximum of ~33ms per frame in order to achieve that frame rate. However, in the case of adding the one decode call, which dropped frame rate down to 10fps, LIVE555 (Step A) is consuming ~810ms total, or 81ms processing time per frame, because the decode consumes only 19ms. That is a 41% jump in processing time for LIVE555 (Step A) when a situation is introduced where frames cannot be processed fast enough to keep from losing frames. That would seem to suggest that as soon as frame loss occurs, LIVE555 incurs additional processing, which of course would compound the problem. Perhaps there's another explanation, but it doesn't make sense -- I would expect LIVE555's overhead to be fixed per frame. - Bottom line, all I'm doing is inquiring as to whether there might be possibilities to optimize things further upstream. I think that question is especially justified given that VLC, whom I looked at in lieu of not having any other example of a known working model, doesn't follow the testRTSPClient example approach. - I do not see any working example published that actually does something meaningful in its processing. 
The present testRTSPClient example app just outputs to a log. I can do that fine and get 30fps. Is there a known example out there that is architected this way, does decoding and does get 30fps (in other words, a video player example)? I'm not saying it can't, I'm just saying that I'd like to see it and run it myself before I throw my hands up in the air, and write off the whole thing as impossible. In conclusion, I'm not sure what the source of frustration is, but please don't take anything personally and see these questions for what they are. I'm just reporting the numbers I see, and asking questions based on them. And that would appear to be the proper protocol, as there's not much documentation either within the code or supplemental to help out here, so the only real resource is this mailing list. I also presume this is the desired scenario, as when I posted about possibly documenting out this use case, it never received a reply. FWIW, I'm not the only one in this boat, and this use case is not uncommon. I know, because since I started posting on this mailing list several weeks ago, I've been contacted offline several times by folks that have exactly the same problem they are trying to solve and aren't making much headway with the resources available. So please, don't take offense or check out...this is how problems get solved, people get educated, and code gets better. Cheers, Brad Brad O'Hearne Founder / Lead Developer Big Hill Software LLC http://www.bighillsoftware.com -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From brado at bighillsoftware.com Fri Mar 23 12:51:43 2012 From: brado at bighillsoftware.com (Brad O'Hearne) Date: Fri, 23 Mar 2012 12:51:43 -0700 Subject: [Live-devel] RTSPClient w/significant frame loss In-Reply-To: References: <10238C60-3AD3-41EE-9FEF-5519E1E9E749@bighillsoftware.com> <29FE2E61-E347-41AE-8B46-E58F5F536889@bighillsoftware.com> <0F1AFE50-E67D-41B2-9D8C-2D4397D925CE@bighillsoftware.com> <7923988C-3F0D-4878-B84E-56FC90683E95@live555.com> Message-ID: Ugh typos....this quote: > That means that in this case, LIVE555 (Step A) is consuming a maximum of ~33ms per frame in order to achieve that frame rate. However, in the case of adding the one decode call, which dropped frame rate down to 10fps, LIVE555 (Step A) is consuming ~810ms total, or 81ms processing time per frame, because the decode consumes only 19ms. That is a 41% jump in processing time for LIVE555 (Step A) when a situation is introduced where frames cannot be processed fast enough to keep from losing frames. It should read that there is a 245% jump in processing time for LIVE555 (Step A) when a situation is introduced where frames cannot be processed fast enough to keep from losing frames. 
Brad Brad O'Hearne Founder / Lead Developer Big Hill Software LLC http://www.bighillsoftware.com From finlayson at live555.com Fri Mar 23 14:01:04 2012 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 23 Mar 2012 14:01:04 -0700 Subject: [Live-devel] RTSPClient w/significant frame loss In-Reply-To: References: <10238C60-3AD3-41EE-9FEF-5519E1E9E749@bighillsoftware.com> <29FE2E61-E347-41AE-8B46-E58F5F536889@bighillsoftware.com> <0F1AFE50-E67D-41B2-9D8C-2D4397D925CE@bighillsoftware.com> <7923988C-3F0D-4878-B84E-56FC90683E95@live555.com> Message-ID: I wasn't planning on contributing to this thread anymore (because I thought that I'd already said everything that I could), but you did bring up one worthwhile point: > That would seem to suggest that as soon as frame loss occurs, LIVE555 incurs additional processing That is actually correct, because - when the LIVE555 receiving code sees a gap in the RTP sequence numbers - it doesn't know initially whether the 'missing' packet has actually been lost, or whether it has merely been reordered in the network (in which case it will arrive shortly). Allowing for possible packet reordering does, indeed increase overhead (once packet loss is seen). You *could* reduce this by not having the code allow for reordered packets at all. To do this, you could call setPacketReorderingThresholdTime(0); on your "RTPSource" object(s). This *might* improve the performance of your system once it starts seeing packet loss (and thereby reducing the amount of packet loss), but it's not going to eliminate your basic problem: The reason why you're seeing systemic packet loss - despite having large OS socket buffers - is because your CPU is underpowered. You just need to accept the reality that you need a faster CPU, a slower (i.e., lower-bandwidth) incoming packet stream, or both. > - Bottom line, all I'm doing is inquiring as to whether there might be possibilities to optimize things further upstream. 
I think that question is especially justified given that VLC, whom I looked at in lieu of not having any other example of a known working model, doesn't follow the testRTSPClient example approach. You keep saying this, but it's either irrelevant, or wrong, depending on what you mean by "the testRTSPClient example approach". As I explained last time, VLC - like other applications that receive a RTP stream - *does* follow the while (1) { Step A: Get a frame of data from a socket; Step B: Do something with this frame of data; } model. It's true that the person who wrote VLC's LIVE555 interface code chose not to use "RTPSink" subclasses when they wrote their "Step B", but that's irrelevant; it has no effect on performance. Several people have written LIVE555-based RTP receiving applications, using "RTPSink"s, that receive a large number of high-bandwith streams concurrently. (If they haven't made their application code public, it's because they're not required to under the LGPL.) But they have sufficiently powerful CPUs. You apparently don't. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From kyu at vcs.att.com Fri Mar 23 19:53:18 2012 From: kyu at vcs.att.com (Kevin Yu) Date: Fri, 23 Mar 2012 22:53:18 -0400 (EDT) Subject: [Live-devel] variable number of TS packets in a udp packet Message-ID: <16940224.95491332557598615.JavaMail.root@opportunity> Hi, First of all, thanks for the excellent work on this project. I found it has so much for building multimedia streaming solutions. I developed some code with Live555 to receive mpeg ts stream over raw UDP and send it out wrapped in RTP. It works fine for me. Now, I need to receive UDP packet containing 7 TS packets but send out the UDP/RTP packet with variable number (from 1 to 7) of TS packets in it. That means one single incoming UDP packet (with 7 TS packets payload) needs to be broken into 2 or more outgoing RTP packets. 
What's the best way to do that with Live555? Thanks in advance. From vigosslive at rambler.ru Fri Mar 23 20:56:37 2012 From: vigosslive at rambler.ru (Rustam) Date: Sat, 24 Mar 2012 07:56:37 +0400 Subject: [Live-devel] Where in the code you demux VOB and with the help of some class then send via rtsp server? Message-ID: <376111670.1332561397.93886360.31056@mperl110.rambler.ru> Ross, I can now play the VOB file. The problem was in the old version of VLC (1.13), which can't play an AC-3 stream, but the new version (2.0) can. The problem is solved. Thanks. -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Mar 23 21:18:28 2012 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 23 Mar 2012 21:18:28 -0700 Subject: [Live-devel] variable number of TS packets in a udp packet In-Reply-To: <16940224.95491332557598615.JavaMail.root@opportunity> References: <16940224.95491332557598615.JavaMail.root@opportunity> Message-ID: <29359ED8-9C6C-4C58-97D5-7C51C202664E@live555.com> > I developed some code with Live555 to receive mpeg ts stream over raw UDP and send it out wrapped in RTP. It works fine for me. Now, I need to receive UDP packet containing 7 TS packets but send out the UDP/RTP packet with variable number (from 1 to 7) of TS packets in it. That means one single incoming UDP packet (with 7 TS packets payload) needs to be broken into 2 or more outgoing RTP packets. What's the best way to do that with Live555? The best way to do this is to insert a 'filter' object between the "BasicUDPSource" object (that receives Transport Stream data over raw UDP) and the "SimpleRTPSink" object (on which you send Transport Stream data over RTP). This filter object will deliver the appropriate number of 188-byte TS packets from each chunk of 7 TS packets that it receives from its upstream "BasicUDPSource" object. 
Specifically, you would define your own subclass of "FramedFilter", and reimplement the "doGetNextFrame()" virtual member function. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From megaplace at hotmail.com Fri Mar 23 23:05:12 2012 From: megaplace at hotmail.com (Krishna Patel) Date: Sat, 24 Mar 2012 06:05:12 +0000 Subject: [Live-devel] Vivatek digest authentication Message-ID: Hi, Live555 version 2012.02.29 is not compatible with Vivotek camera digest authentication. This is what the Vivotek IP8161 camera sends when it cannot authorize the client: - Rtsp: RESPONSE, RTSP/1.0, Status Code = 401 - Unauthorized - Response: Status of response : Unauthorized ProtocolVersion: RTSP/1.0 StatusCode: 401, Unauthorized Reason: Unauthorized CSeq: 3 WWW-Authenticate: Digest qop="auth",realm="streaming_server",nonce="bfd4e04b78959b55aeb1167adfabcec5" HeaderEnd: CRLF The WWW-Authenticate response seems to be different from what Axis or Sony cameras send. Thanks, Krishna. -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sat Mar 24 00:51:31 2012 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 24 Mar 2012 00:51:31 -0700 Subject: [Live-devel] Vivatek digest authentication In-Reply-To: References: Message-ID: <5B1AD6D6-B70F-4268-A684-2B7DFF0EE004@live555.com> > Live555 version 2012.02.29 is not compatible with Vivotek camera digest authentication. (Why do I always get these sorts of reports from 3rd parties, and never from the camera manufacturers themselves?) 
> This is what the Vivotek IP8161 camera sends when it cannot authorize the client: > > - Rtsp: RESPONSE, RTSP/1.0, Status Code = 401 - Unauthorized > - Response: Status of response : Unauthorized > ProtocolVersion: RTSP/1.0 > StatusCode: 401, Unauthorized > Reason: Unauthorized > CSeq: 3 > WWW-Authenticate: Digest qop="auth",realm="streaming_server",nonce="bfd4e04b78959b55aeb1167adfabcec5" The problem is the qop="auth", part of the "WWW-Authenticate:" header. This is non-standard in RTSP, because: 1/ The RTSP specification (RFC 2326) uses RFC 2069 to define the digest authentication mechanism. 2/ RFC 2069 does not define the "qop" parameter. (Although that parameter was later defined in RFC 2617 (which updated RFC 2069), it was not defined in the version of the document that was referenced by the RTSP specification.) Therefore, Vivotek should update their cameras to remove the qop="auth", string from their "WWW-Authenticate:" strings, and thereby properly conform to the RTSP standard. (Note that if they do this, they will still be compatible with any RTSP client implementation that happens to use the RFC 2617 specification, because - in that document - qop="auth" is defined to be the default behavior, to be used if the "qop" parameter is not present.) (Feel free to forward this message to Vivotek's technical support.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From 6.45.vapuru at gmail.com Sat Mar 24 06:02:41 2012 From: 6.45.vapuru at gmail.com (Novalis Vapuru) Date: Sat, 24 Mar 2012 15:02:41 +0200 Subject: [Live-devel] Corruption of the heap and Access violation reading errors at testRTSPClient.cpp [ on windows platform ] Message-ID: Here is my "little modified" testRTSPClient.cpp [check the PS for the full source code]. 
RTSPClient* rtspClient; // global handle void Start() { // Begin by setting up our usage environment: TaskScheduler* scheduler = BasicTaskScheduler::createNew(); UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler); char* programName = "Test"; char* rtpsURL = "rtsp://192.168.3.165/video.h264"; // This is a local IP camera rtspClient = openURL(*env, programName, rtpsURL); rtspClient->sendDescribeCommand(continueAfterDESCRIBE); // All subsequent activity takes place within the event loop: env->taskScheduler().doEventLoop(); // does not return } void Stop() { shutdownStream(rtspClient); } int main(int argc, char** argv) { boost::thread StartClientThread = boost::thread(Start); boost::this_thread::sleep( boost::posix_time::seconds(10) ); // Try to stop after 10 seconds Stop(); int endMain; std::cin >> endMain; // Just so we don't exit main return 0; } First: Why did I modify the code? My motivation: I want to test whether I am able to shut down the RTSP client on request. Results: when I call Stop(): 1. Sometimes I get a heap corruption error on Windows. Full error: HEAP[RTSPClientTest.exe]: HEAP: Free Heap block 1d4428 modified at 1d44e0 after it was freed Windows has triggered a breakpoint in RTSPClientTest.exe. This may be due to a corruption of the heap, which indicates a bug in RTSPClientTest.exe or any of the DLLs it has loaded. This may also be due to the user pressing F12 while RTSPClientTest.exe has focus. What may be the problem? Any ideas? Suggestions? 2. Sometimes I get Access violation errors in the DummySink afterGettingFrame function, at the line if (fSubsession.rtpSource() != NULL && !fSubsession.rtpSource()->hasBeenSynchronizedUsingRTCP()) Full error: First-chance exception at 0x00212a85 in RTSPClientTest.exe: 0xC0000005: Access violation reading location 0xfeeeffd6. Unhandled exception at 0x00212a85 in RTSPClientTest.exe: 0xC0000005: Access violation reading location 0xfeeeffd6. 
Has anybody been able to start and stop the RTSP client successfully on the Windows platform? What is wrong with my little test? What may cause this? Any ideas or suggestions? Best Wishes Novalis PS: You can download the file at http://www.2shared.com/file/YknZUctR/testRTSPClientModified.html -------------- next part -------------- A non-text attachment was scrubbed... Name: testRTSPClientModified.cpp Type: text/x-c++src Size: 20795 bytes Desc: not available URL: From finlayson at live555.com Sat Mar 24 08:27:31 2012 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 24 Mar 2012 08:27:31 -0700 Subject: [Live-devel] Corruption of the heap and Access violation reading errors at testRTSPClient.cpp [ on windows platform ] In-Reply-To: References: Message-ID: Your problem is that you are trying to call LIVE555 code (specifically, your "Start()" and "Stop()" functions) from two separate threads. As stated very clearly in the FAQ - ***that you were asked to read before you posted to this mailing list*** - you cannot do this! Instead, you should call "Stop()" not from your 'main' thread, but instead from the 'LIVE555 event loop' thread - i.e., from the same thread in which you called "Start()". To do this, you modify "Start()" to - Create an "event trigger" with a handler function that will call "Stop()" - using the function "TaskScheduler::createEventTrigger()". - Then, from your 'main' thread - after 10 seconds have elapsed, call "TaskScheduler::triggerEvent()". (Note that "triggerEvent()" is the *only* LIVE555 function that you're permitted to call from a non-LIVE555 thread.) For details, see "UsageEnvironment/UsageEnvironment.hh". (Note also the example use of 'event triggers' in "liveMedia/DeviceSource.cpp".) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From Layne.Berge.1 at my.ndsu.edu Sat Mar 24 18:26:58 2012 From: Layne.Berge.1 at my.ndsu.edu (Layne Berge) Date: Sun, 25 Mar 2012 01:26:58 +0000 Subject: [Live-devel] Ability to manipulate individual frames Message-ID: <87091305-1E89-4FED-AF28-00188767E1A5@my.ndsu.edu> Hi Ross, I've been using live555 to stream and record video and audio from a camera with a variable frame-rate. I've gotten around the issue of needing a fixed frame-rate by sending copies of the last received image frame to the QuickTime file sink when the frame-rate from the camera slows down, thus faking a fixed frame-rate. The problem I've run into is in the playback of the file. While it does play, when it gets to the copied frames, any motion between the string of copies and the next new frame is blurred. It looks sort of like aliasing. I'm beginning to think this is just how the h.264 codec handles copied images. With this in mind, I'm thinking that by changing each frame slightly, perhaps adding one to each pixel, this may remedy whatever is happening in playback. So, my question to you is how and if this is possible with the live555 library? Layne From finlayson at live555.com Sat Mar 24 18:53:33 2012 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 24 Mar 2012 18:53:33 -0700 Subject: [Live-devel] Ability to manipulate individual frames In-Reply-To: <87091305-1E89-4FED-AF28-00188767E1A5@my.ndsu.edu> References: <87091305-1E89-4FED-AF28-00188767E1A5@my.ndsu.edu> Message-ID: <015493A1-3A5F-4BD8-BFA4-621F8DEE0181@live555.com> > I've been using live555 to stream and record video and audio from a camera with a variable frame-rate. I've gotten around the issue of needing a fixed frame-rate by sending copies of the last received image frame to the QuickTime file sink when the frame-rate from the camera slows down, thus faking a fixed frame-rate. The problem I've run into is in the playback of the file. 
While it does play, when it gets to the copied frames, any motion between the string of copies and the next new frame is blurred. It looks sort of like aliasing. I'm beginning to think this is just how the h.264 codec handles copied images. With this in mind, I'm thinking that by changing each frame slightly, perhaps adding one to each pixel, this may remedy whatever is happening in playback. > > So, my question to you is how and if this is possible with the live555 library? Well, you can do whatever you want to the data stream by inserting an appropriate 'filter' object - i.e., from a subclass of "FramedFilter" - into the stream. But it's a subclass that you'd need to write yourself. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From brado at bighillsoftware.com Sat Mar 24 20:11:53 2012 From: brado at bighillsoftware.com (Brad O'Hearne) Date: Sat, 24 Mar 2012 20:11:53 -0700 Subject: [Live-devel] RTSPClient w/significant frame loss In-Reply-To: References: <10238C60-3AD3-41EE-9FEF-5519E1E9E749@bighillsoftware.com> <29FE2E61-E347-41AE-8B46-E58F5F536889@bighillsoftware.com> <0F1AFE50-E67D-41B2-9D8C-2D4397D925CE@bighillsoftware.com> <7923988C-3F0D-4878-B84E-56FC90683E95@live555.com> Message-ID: Just in case anyone else was actually following this thread, I thought I'd make my final post in this thread the conclusion of the matter: - Setting the packet reordering threshold time very low had a noticeable impact. I originally tried setting it to zero, but oddly my benchmarks were slightly slower when setting this threshold to 0.0 than they were when it was set to 0.001 -- so I settled on the latter. My frame rate noticeably increased (by 25-30+%) when this change was made, so I suppose the moral of the story is that the further this reordering threshold gets from zero, packet loss begets more packet loss. - CPU power also made a significant difference. 
I shifted from testing on iPad 1 to the new iPad (3rd generation), and that bumped my frame rate 30-40% too. Both together were significant. After adding back in the rest of my decoding process, with some additional optimizations (which weren't part of this thread discussion), I'm now getting 27-30fps on an iPad 3rd generation, full rtsp -> LIVE555 -> ffmpeg -> iPad screen. So for anyone following along at home, it can be done. Cheers, Brad Brad O'Hearne Founder / Lead Developer Big Hill Software LLC http://www.bighillsoftware.com On Mar 23, 2012, at 2:01 PM, Ross Finlayson wrote: > I wasn't planning on contributing to this thread anymore (because I thought that I'd already said everything that I could), but you did bring up one worthwhile point: > >> That would seem to suggest that as soon as frame loss occurs, LIVE555 incurs additional processing > > That is actually correct, because - when the LIVE555 receiving code sees a gap in the RTP sequence numbers - it doesn't know initially whether the 'missing' packet has actually been lost, or whether it has merely been reordered in the network (in which case it will arrive shortly). Allowing for possible packet reordering does, indeed increase overhead (once packet loss is seen). > > You *could* reduce this by not having the code allow for reordered packets at all. To do this, you could call > setPacketReorderingThresholdTime(0); > on your "RTPSource" object(s). > > This *might* improve the performance of your system once it starts seeing packet loss (and thereby reducing the amount of packet loss), but it's not going to eliminate your basic problem: The reason why you're seeing systemic packet loss - despite having large OS socket buffers - is because your CPU is underpowered. You just need to accept the reality that you need a faster CPU, a slower (i.e., lower-bandwidth) incoming packet stream, or both. 
> > >> - Bottom line, all I'm doing is inquiring as to whether there might be possibilities to optimize things further upstream. I think that question is especially justified given that VLC, whom I looked at in lieu of not having any other example of a known working model, doesn't follow the testRTSPClient example approach. > > You keep saying this, but it's either irrelevant, or wrong, depending on what you mean by "the testRTSPClient example approach". As I explained last time, VLC - like other applications that receive a RTP stream - *does* follow the > while (1) { > Step A: Get a frame of data from a socket; > Step B: Do something with this frame of data; > } > model. It's true that the person who wrote VLC's LIVE555 interface code chose not to use "RTPSink" subclasses when they wrote their "Step B", but that's irrelevant; it has no effect on performance. > > Several people have written LIVE555-based RTP receiving applications, using "RTPSink"s, that receive a large number of high-bandwith streams concurrently. (If they haven't made their application code public, it's because they're not required to under the LGPL.) But they have sufficiently powerful CPUs. You apparently don't. > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From nick at porcaro.org Sat Mar 24 21:16:30 2012 From: nick at porcaro.org (Nick Porcaro) Date: Sat, 24 Mar 2012 21:16:30 -0700 Subject: [Live-devel] modifying WAVEAudioFileSource.cpp to take input from an iOS AU buffer Message-ID: Hey folks, Here's what I have:
- An iPhone app that reads audio input and writes to audio output
- One of these threads runs a DynamicRTSPServer session which runs a WAVAudioFileServerMediaSubsession
- When I run this session with a wav file, WAVAudioFileSource::doReadFromFile() is called, and the samples are read out of this wav file and I can send it to another app which is acting as a live555 client. Works great.
- Now, getting back to the iOS audio: A function gets called when the audio output hardware wants samples. I write these samples to a circular buffer, which can be read from other threads.
- In fact, WAVAudioFileSource::doReadFromFile() is being called right now since I already have the setup working for a .wav file. All I want to do is write the samples to fTo:

> void WAVAudioFileSource::doReadFromFile() {
>   // Try to read as many bytes as will fit in the buffer provided (or "fPreferredFrameSize" if less)
>   if (fLimitNumBytesToStream && fNumBytesToStream < fMaxSize) {
>     fMaxSize = fNumBytesToStream;
>   }
>   if (fPreferredFrameSize < fMaxSize) {
>     fMaxSize = fPreferredFrameSize;
>   }
>   unsigned bytesPerSample = (fNumChannels*fBitsPerSample)/8;
>   if (bytesPerSample == 0) bytesPerSample = 1; // because we can't read less than a byte at a time
>
>   // For 'trick play', read one sample at a time; otherwise (normal case) read samples in bulk:
>   unsigned bytesToRead = fScaleFactor == 1 ? fMaxSize - fMaxSize%bytesPerSample : bytesPerSample;
>   unsigned numBytesRead;
>   while (1) { // loop for 'trick play' only
>     if (readFromFilesSynchronously || fFidIsSeekable) {
>       numBytesRead = fread(fTo, 1, bytesToRead, fFid);
>     } else {
>       // For non-seekable files (e.g., pipes), call "read()" rather than "fread()", to ensure that the read doesn't block:
>       numBytesRead = read(fileno(fFid), fTo, bytesToRead);
>     }

I think if I change this line to copy samples out of the ring buffer to fTo I would be able to spoof the wav file reader into using these samples instead of the ones from the file.

> numBytesRead = fread(fTo, 1, bytesToRead, fFid);

Then ultimately, if this works, I think I could make a new subclass WaveAudioIOSInputSource to do the job. Preliminary experiments have been a bit frustrating, and I think it has something to do with the fTo variable. Any hints on how to proceed on this? Thanks, - Nick From finlayson at live555.com Sat Mar 24 22:18:49 2012 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 24 Mar 2012 22:18:49 -0700 Subject: [Live-devel] modifying WAVEAudioFileSource.cpp to take input from an iOS AU buffer In-Reply-To: References: Message-ID: <3B57B709-4BE5-4987-BC1E-05405FE666F4@live555.com> First, as you know (because you've read the FAQ :-), you shouldn't modify the existing code 'in place'; see: http://www.live555.com/liveMedia/faq.html#modifying-and-extending Instead, you should write your own new subclass(es) (perhaps using the existing code as a model, when necessary). In any case, the "WAVAudioFileSource" code is the wrong code to be using as a model, because most of what it does is irrelevant for your application. In particular:
- It reads and processes a WAV audio file header, which you don't need to do (because you presumably know the audio parameters (# channels, sampling frequency, etc.) in advance).
- It reads from a file, which you won't be doing (as you've noted). 
- It provides support for 'trick play' operations, which you won't support, because you'll be reading from a live input source, rather than from a static file. So, instead, you should write your own "FramedSource" subclass (not based on "WAVAudioFileSource") to encapsulate your input audio sample buffer. For this, I suggest that you use the "DeviceSource" code as a model (see "liveMedia/DeviceSource.cpp"). Also, of course, you will need to write your own "OnDemandServerMediaSubsession" subclass, and use that - instead of "WAVAudioFileServerMediaSubsession" - in your RTSP server. Although you may want to use the "WAVAudioFileServerMediaSubsession" code as a model, you'll find that you won't need most of that code. In fact, you'll probably need to implement only the "createNewStreamSource()" and "createNewRTPSink()" virtual functions (and the implementation of those will be much simpler than those in "WAVAudioFileServerMediaSubsession", because - unlike for a WAV file - you know in advance the audio parameters). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From aviadr1 at gmail.com Sun Mar 25 06:17:10 2012 From: aviadr1 at gmail.com (aviad rozenhek) Date: Sun, 25 Mar 2012 15:17:10 +0200 Subject: [Live-devel] upgrading live555 on android Message-ID: Hi, we're using live555 to stream to android 2.x clients. we're trying to update our servers from the nov 2010 version to the latest version, but we're seeing an issue where the audio stops playing after 2-3 seconds, while video continues. this does not happen with the earlier version of live555. our server code did not change, and the client is android RTSP client [does not depend on live555]. can you offer some insight into what we should check or look into to understand the issue better? -- Aviad Rozenhek -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Sun Mar 25 22:20:38 2012 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 25 Mar 2012 22:20:38 -0700 Subject: [Live-devel] upgrading live555 on android In-Reply-To: References: Message-ID: > we're using live555 to stream to android 2.x clients. Do 'we' not have our own domain name :-) > we're trying to update our servers from the nov 2010 version to the latest version, > but we're seeing an issue where the audio stops playing after 2-3 seconds, while video continues. this does not happen with the earlier version of live555. > our server code did not change, and the client is android RTSP client [does not depend on live555]. > can you offer some insight into what we should check or look into to understand the issue better? First, I suggest adding #define DEBUG 1 to the start of "RTSPServer.cpp", and recompiling. This should tell you a lot about what is happening in your server, at least wrt. RTSP protocol handling. Also, you should check whether the problem happens just with this one (Android) RTSP client, or with every RTSP client. In particular, I suggest that you try running "openRTSP" as your client, and see whether it receives all of the audio, or just 2-3 seconds worth. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From Shaheed at scansoft.co.za Sun Mar 25 23:25:40 2012 From: Shaheed at scansoft.co.za (Shaheed Abdol) Date: Mon, 26 Mar 2012 08:25:40 +0200 Subject: [Live-devel] RTSP server delivers HTTP 404 describe response to certain clients Message-ID: <002962EA5927BE45B2FFAB0B5B5D679709F8D9@SSTSVR1.sst.local> Good afternoon, I am using the StreamUCast.cpp file to run an RTSP server. I am not providing data from files, but rather providing real-time data acquired by other means (from hardware) and providing it to a subclass of ServerMediaSession. When using this approach to stream audio, everything works fine - we can use VLC to connect to the subsessions and successfully get data. I have since modified the code to support outbound streaming of live video, which is being packaged into elementary mp4 format using ffmpeg. The conversion is successful, as there is a way to get the data to convert without having to stream it. The issue I am having is that I cannot seem to connect to the video subsessions, and receive an "HTTP - 404" message, which means "stream not found". I have checked, and I am registering my streams using the correct name, and as far as I have debugged the code, everything is correct to the point where createNewStreamSource is called in my COnDemandVideoInputSubsession derived class. Further than this I cannot debug, but I can see in the code, and from VLC debug logs, that the DESCRIBE command fails, though I cannot examine the data received. I at least know that the port opens (set to default 8445) in TCP mode, and the connection to the port on the remote machine is successful. I have scoured the mailing list and have found posts from people using gmail addresses, and those posts were basically ignored. Any ideas would be greatly appreciated. 
Some more info:
- StreamUCast (Server)
- OnDemandVideoInputSubsession (subsession)
- InputDeviceSource (FramedSource - returned by createNewStreamSource in OnDemandVideoInputSubsession)
- taskScheduler().createEventTrigger(deliverFrame0) is called in the constructor of InputDeviceSource
- InputDeviceSource::doGetNextFrame is never called.

I have tried the RTSP client code provided in testProgs, but as expected I get the same "HTTP - 404" error in the continueAfterDESCRIBE function. I am pulling out my hair because within the same code, there is a case for adding an OnDemandAudioInputSubsession, and I can pull the RTSP stream from that perfectly using VLC and the testProgs RTSP client. I know the problem is in my code, but the audio and video subclasses are nearly identical in operation (except for the data source). The ultimate question leads to a matter of design: could laggy data acquisition slow the response times significantly enough for the RTSP clients to "time out" while waiting for the possibly-delayed response? If so, I could move the data acquisition to a different thread to improve response time. Thank you in advance for even reading this behemoth of a post. Regards ___________________________________ Shaheed Abdol Web: www.scansoft.co.za Tel: +27 21 913 8664 Cell: +27 79 835 8771 -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... 
Name: not available Type: image/png Size: 32497 bytes Desc: SST Email.png URL: From finlayson at live555.com Mon Mar 26 00:55:36 2012 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 26 Mar 2012 00:55:36 -0700 Subject: [Live-devel] RTSP server delivers HTTP 404 describe response to certain clients In-Reply-To: <002962EA5927BE45B2FFAB0B5B5D679709F8D9@SSTSVR1.sst.local> References: <002962EA5927BE45B2FFAB0B5B5D679709F8D9@SSTSVR1.sst.local> Message-ID: <130340B0-DA38-416B-A167-8CA05830C1AD@live555.com> > I am using the StreamUCast.cpp file to run an RTSP server "StreamUCast.cpp" is not our file. It seems that you are working from the code for someone else's RTSP server application (that happens to be using our code). If that's the case, then perhaps you should first ask whoever it was who provided you with the "StreamUCast.cpp" file? But anyway, you talk about getting "HTTP - 404" responses. You should not be trying to access the server using HTTP (unless you really know what you're doing, and are using RTSP-over-HTTP tunneling). Instead, at least at first, you should be accessing the stream using RTSP only. > I have tried the RTSP client code provided in testProgs, but as expected I get the same "HTTP - 404" error in the continueAfterDESCRIBE function. Once again, why are you getting ***HTTP*** errors? I suggest that you first try using our (unmodified!) "testRTSPClient" application. If you continue to get HTTP (rather than RTSP) responses from your server, then there is something badly wrong with the way that you (or someone else) made use of (or modified) the LIVE555 RTSP server code, in which case I'm not going to be able to help you. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From Shaheed at scansoft.co.za Mon Mar 26 04:58:25 2012 From: Shaheed at scansoft.co.za (Shaheed Abdol) Date: Mon, 26 Mar 2012 13:58:25 +0200 Subject: [Live-devel] RTSP server delivers HTTP 404 describe response to certain clients Message-ID: <002962EA5927BE45B2FFAB0B5B5D67970A8C7B@SSTSVR1.sst.local> Good morning Ross, Firstly, thank you for the notification about the file name issue. It seems the previous developer copied the testOnDemandRTSPServer code, modified it, and saved it as a file named StreamUCast. My initial subject line is incorrect - it should say "RTSP - 404 ...", which is generated in the RTSPServer class - due to the RTPSink not being created in the createNewRTPSink function. I have pored over the FAQ and mailing list and have written my own FramedSource class which encapsulates my live device source, and an MP4VideoRTPSink which does nothing. I have replaced the MP4VideoRTPSink with the MPEG4ESVideoRTPSink in the createNewRTPSink function in my COnDemandSubsession class. This allows me to connect to my subsession, but the frames I'm getting are not valid. I am using the unmodified testRTSPClient and VLC to get the data for the frames. I have found a previous issue, "Live555 Streaming from a live source", where you instruct the developers to insert an MPEG4VideoStreamDiscreteFramer downstream which will handle the data. I'm sure I must be a bother by now, but a little bit more background on this would be appreciated, since I cannot return the mp4 discrete framer from my createNewRTPSink function. I am returning a derived DeviceSource instance which allows me to get live data from the createNewStreamSource function. I'm not exactly sure how to insert the MPEG4DiscreteFramer in this sequence of filters. Should I allow my derived DeviceSource to inherit from the discrete framer, or should I simply return the discrete framer from the createNewStreamSource function? 
I do not see any method for the FramedSource to query the OnDemandVideoInputSubsession (my own class) for video data, and the DeviceSource class is (should be) the last filter in the sequence. Thank you Regards ___________________________________ Shaheed Abdol Web: www.scansoft.co.za Tel: +27 21 913 8664 Cell: +27 79 835 8771 -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: image/png Size: 32497 bytes Desc: SST Email.png URL: From finlayson at live555.com Mon Mar 26 06:37:18 2012 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 26 Mar 2012 06:37:18 -0700 Subject: [Live-devel] RTSP server delivers HTTP 404 describe response to certain clients In-Reply-To: <002962EA5927BE45B2FFAB0B5B5D67970A8C7B@SSTSVR1.sst.local> References: <002962EA5927BE45B2FFAB0B5B5D67970A8C7B@SSTSVR1.sst.local> Message-ID: I didn't really understand your message; it seems to be flailing around, talking about lots of different, unrelated things. But your immediate problem - the one that you need to fix first, before worrying about anything else - is the RTSP 404 errors. These are usually caused by the "RTSPServer::lookupServerMediaSession()" call failing. If you *haven't* subclassed "RTSPServer" (e.g., you're not using the "DynamicRTSPServer" code, or your own subclass of "RTSPServer"), then this usually just means that your RTSP client is not using the correct 'stream name' in the "rtsp://" URL. The 'stream name' is the "streamName" string - i.e., the second parameter - that was given in the call to "ServerMediaSession::createNew()". If, however, you *are* using the "DynamicRTSPServer" code, then the 404 error is probably caused by the "DynamicRTSPServer::lookupServerMediaSession()" function, so you'll need to look at that code (in "mediaServer/DynamicRTSPServer.cpp") to figure out why that is failing. 
Or if you're using your own subclass of "RTSPServer", and have reimplemented the "lookupServerMediaSession()" virtual function in your own code, then the error is probably caused by that function failing. In any case, because you're using your own custom code, I can't help you much. But Remember, You Have Complete Source Code. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From 6.45.vapuru at gmail.com Mon Mar 26 02:18:15 2012 From: 6.45.vapuru at gmail.com (Novalis Vapuru) Date: Mon, 26 Mar 2012 12:18:15 +0300 Subject: [Live-devel] Corruption of the heap and Access violation reading errors at testRTSPClient.cpp [ on windows platform ] In-Reply-To: References: Message-ID: I read the FAQ, but it seems that I did not pay much attention. You are right. I modified the code, and it works perfectly. Thanks. Best Wishes Novalis PS: Here is the modified part:

RTSPClient* rtspClient;
EventTriggerId stopTrigger;
UsageEnvironment* env;
char doEventLoopWatchVariable;
bool doesDoEventLoopEnded;

void Stop(void* myParameter) {
  shutdownStream(rtspClient);
}

void Clear() {
  std::cout << "Try to delete env and scheduler" << std::endl;
  TaskScheduler* scheduler = &(env->taskScheduler());
  bool canDelete = ((env->liveMediaPriv == NULL) && (env->groupsockPriv == NULL));
  if (canDelete) {
    std::cout << "now able to delete environment" << std::endl;
  } else {
    std::cout << "not able to delete environment" << std::endl;
  }
  env->reclaim();
  delete scheduler;
}

void Start() {
  // Begin by setting up our usage environment:
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  env = BasicUsageEnvironment::createNew(*scheduler);
  stopTrigger = env->taskScheduler().createEventTrigger(&Stop);
  char* programName = "Test";
  char* rtspURL = "rtsp://192.168.3.165/video.h264";
  rtspClient = openURL(*env, programName, rtspURL);
  rtspClient->sendDescribeCommand(continueAfterDESCRIBE);
  doEventLoopWatchVariable = 0;
  doesDoEventLoopEnded = false;
  // All subsequent activity takes place within the event loop:
  env->taskScheduler().doEventLoop(&doEventLoopWatchVariable); // does not return until the watch variable is set
  doesDoEventLoopEnded = true;
}

int main(int argc, char** argv) {
  boost::thread StartClientThread = boost::thread(Start);
  boost::this_thread::sleep(boost::posix_time::seconds(10));
  env->taskScheduler().triggerEvent(stopTrigger);
  doEventLoopWatchVariable = 1;
  while (!doesDoEventLoopEnded) {
    std::cout << "do event loop does not end " << std::endl;
  }
  std::cout << "do event loop ended " << std::endl;
  std::cout << "now try to cleanup client " << std::endl;
  Clear();
  int endMain;
  std::cin >> endMain; // just so we don't exit main immediately
  return 0;
}

From daniel.liu17 at gmail.com Mon Mar 26 02:28:23 2012 From: daniel.liu17 at gmail.com (=?GB2312?B?wfXB+rfJ?=) Date: Mon, 26 Mar 2012 17:28:23 +0800 Subject: [Live-devel] why RTCP BYE does not invoke client's "TEARDOWN" In-Reply-To: <98EB7589-6B2B-42A1-B619-C767FEF49430@live555.com> References: <98EB7589-6B2B-42A1-B619-C767FEF49430@live555.com> Message-ID: Sorry, it's my mistake. Sending RTCP "BYE" from the server side does invoke the client's "TEARDOWN" now. ps: previously, my session had two subsessions, video and audio, but I only sent "BYE" for the video substream, not for the audio, so it didn't work. ---------------------------------------------------------------------------------------------------------------- What's more, I've found a new question. It seems that the referenceCount in class ServerMediaSession is not zero after the client has "TEARDOWN"ed this session (with two subsessions). For a session having two subsessions, the increment below is called twice, because there are two "SETUP"s, one for each subsession. * in RTSPServer::RTSPClientSession::handleCmd_SETUP()* * fOurServerMediaSession->incrementReferenceCount();* But in RTSPServer::RTSPClientSession::~RTSPClientSession(), the decrement below is called only once. 
* fOurServerMediaSession->decrementReferenceCount();* And I printed the fReferenceCount; it's not zero after ~RTSPClientSession(). Is this right, or have I misunderstood the meaning of fReferenceCount? Thank you very much. 2012/3/23 Ross Finlayson > I want to close the RTSPClientSessions belonging to one ServerMediaSession > from the server side. > I send an RTCP "BYE" message from the server side, but it doesn't invoke the > client's sending "TEARDOWN". But the client (openRTSP) had received the > "BYE" message and it had printed "Received RTCP "BYE"...". > What's the reason? > > > I don't know. "openRTSP" (and "testRTSPClient") should be sending a RTSP > "TEARDOWN" command after it receives a RTCP "BYE" from the server. (Note > that it is the client's choice whether to send a "TEARDOWN"; the server > cannot 'force' the client to send a "TEARDOWN". However, our code for both > "openRTSP" and "testRTSPClient" should be sending a "TEARDOWN" after it > receives a "BYE".) > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Mar 26 22:45:37 2012 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 26 Mar 2012 22:45:37 -0700 Subject: [Live-devel] why RTCP BYE does not invoke client's "TEARDOWN" In-Reply-To: References: <98EB7589-6B2B-42A1-B619-C767FEF49430@live555.com> Message-ID: <8471C0DE-4050-49F8-A16D-6CDB069A0AF9@live555.com> > What's more, I've found a new question. > It seems that the referenceCount in class ServerMediaSession is not zero after the client has "TEARDOWN"ed this session (with two subsessions). > > For a session having two subsessions, the increment below is called twice, because there are two "SETUP"s, one for each subsession. 
> in RTSPServer::RTSPClientSession::handleCmd_SETUP() > fOurServerMediaSession->incrementReferenceCount(); > > But in RTSPServer::RTSPClientSession::~RTSPClientSession(), the below decrement is called only once. > fOurServerMediaSession->decrementReferenceCount(); > > And I printed the fReferenceCount, it's not zero after ~RTSPClientSession(). Is this right? or have I mis-understood the meaning of fReferenceCount? No, you've discovered a bug; thank you! (It's not a serious bug; it means just that if you remove multi-track "ServerMediaSession" objects from a server (something that isn't done very much), you can end up with a small memory leak. Nonetheless, it will be fixed in an upcoming release of the software.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From Aleksandar.Milenkovic at rt-rk.com Tue Mar 27 05:24:27 2012 From: Aleksandar.Milenkovic at rt-rk.com (Aleksandar Milenkovic) Date: Tue, 27 Mar 2012 14:24:27 +0200 Subject: [Live-devel] Easiest way to implement custom RTSP messages? In-Reply-To: References: Message-ID: <4F71B17B.7030906@rt-rk.com> Hello, I was unable to find a detailed / concrete answer to my question, so sorry if I haven't searched hard enough to find it... Anyways, the question is - What is the best way to implement custom RTSP messages? I want to add a new RTSP message type, and use it to transfer a list of files I can play from server. Is RTCP the way to do it like some mails have suggested? Thanks in advance, Aleksandar *Aleksandar Milenkovic* Software Engineer Phone: +381-(0)21-4801-139 Fax: +381-(0)21-450-721 Mobile: +381-(0)64-31-666-82 E-mail: Aleksandar.Milenkovic at rt-rk.com RT-RK Computer Based Systems LLC Fruskogorska 11 21000 Novi Sad, Serbia www.rt-rk.com *RT-RK* invites you to visit us @ *IBC2011*, September 9-13 2011, stand *5.A01*, Amsterdam RAI. 
For more information please visit www.bbt.rs -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Mar 27 06:13:02 2012 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 27 Mar 2012 06:13:02 -0700 Subject: [Live-devel] Easiest way to implement custom RTSP messages? 
In-Reply-To: <4F71B17B.7030906@rt-rk.com> References: <4F71B17B.7030906@rt-rk.com> Message-ID: <05A680E4-B1FB-4DD9-9DC1-B43750ED9C41@live555.com> > Anyways, the question is - What is the best way to implement custom RTSP messages? > I want to add a new RTSP message type, and use it to transfer a list of files I can play from server. RTSP is not the protocol to use for this (because there's nothing defined in the RTSP protocol standard for doing this, and any 'custom' RTSP command that you'd implement would be incompatible with all of the other clients and servers out there). What you want, instead, is a regular web page that provides a list of "rtsp://" URLs - e.g., in an XML document. For this, you'd use HTTP - i.e., using a regular web server - not RTSP. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From albert.staubin at gmail.com Mon Mar 26 13:40:10 2012 From: albert.staubin at gmail.com (AJ) Date: Mon, 26 Mar 2012 16:40:10 -0400 Subject: [Live-devel] Changing the FileSource in test TransportStreamer to BasicUDPSource issue Message-ID: I have built a test client that is nothing more than "testMPEG2TransportStreamer.exe" with a BasicUDPSource instead of a FileSource. When I make this change, the streamer runs into the following issue that it does not have when using a FileSource. Everything else about the streamer is the same. I have included the changed code. I have to have the client connected to the RTSP server before the transport streamer starts streaming, and all new clients that connect have to wait until the transport streamer is stopped and restarted to start seeing anything. I have tried moving the server to a different computer and this did not fix the issue. Between this test client and the issues I was seeing before, it seems like the BasicUDPSource is where the problem lies; I have not been able to point to an exact place in the code yet though. 
Is anyone familiar with BasicUDPSource and is this expected behavior? Am I missing something in the initialization or set up? Any help would be appreciated. I replaced the play() function with the following code in "testMPEG2TransportStreamer.cpp":

struct in_addr inputAddress;
inputAddress.s_addr = NULL; // raw UDP incoming
portNumBits inPortNumBits = 5544;
const Port inputPort(inPortNumBits);
Groupsock* inputGroupsock = new Groupsock(*env, inputAddress, inputPort, 255);
// Create the source:
FramedSource* udpSource = BasicUDPSource::createNew(*env, inputGroupsock);
// Create a 'framer' for the input source (to give us proper inter-packet gaps):
videoSource = MPEG2TransportStreamFramer::createNew(*env, udpSource);
videoSink->startPlaying(*videoSource, afterPlaying, videoSink);

-------------- next part -------------- An HTML attachment was scrubbed... URL: From yujian4newsgroup at gmail.com Mon Mar 26 18:06:30 2012 From: yujian4newsgroup at gmail.com (yujian) Date: Tue, 27 Mar 2012 09:06:30 +0800 Subject: [Live-devel] How much is the maximum Concurrency Count of Media Server? Message-ID: If the Media Server is used as an online stream server, how many clients can keep a connection with it? -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Mar 27 14:56:06 2012 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 27 Mar 2012 14:56:06 -0700 Subject: [Live-devel] How much is the maximum Concurrency Count of Media Server? In-Reply-To: References: Message-ID: On Mar 26, 2012, at 6:06 PM, yujian wrote: > If the Media Server is used as an online stream server, how many clients can keep a connection with it? Read the FAQ (as you were asked to do before posting to this mailing list)! Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
From anders.branderud at gmail.com Tue Mar 27 04:00:08 2012
From: anders.branderud at gmail.com (Anders Branderud)
Date: Tue, 27 Mar 2012 13:00:08 +0200
Subject: [Live-devel] Stream FEC-encoded data over simple UDP-protocol?
Message-ID:

Hello!

I would like some guidelines on how to combine video streaming over UDP with FEC encoding. I would like to know what changes to make - e.g., which data to take (in which classes, in which functions) and FEC-encode before sending it, and where to find the buffers with data to FEC-decode on the client side before playing it.

The server streams the data. It shouldn't keep track of any state of what data the clients have received, nor receive any acknowledgments from the clients during the video streaming. It merely keeps track of what data it has sent. I want to use a simple UDP protocol.

Thanks!

--
Kind regards,
Anders

[Personal blog] Will of the Creator: Logical reasons - based on scientific premises - for the existence of a Super intelligent and Orderly Creator and that He hasn't left His sapient creatures without an Instruction Manual - Torah ['books of Moses'] - to ascertain, and aspire to, His purpose.
[Company] Anders Branderud IT Solutions - www.abitsolutions.org

From finlayson at live555.com Tue Mar 27 20:43:57 2012
From: finlayson at live555.com (Ross Finlayson)
Date: Tue, 27 Mar 2012 20:43:57 -0700
Subject: [Live-devel] Stream FEC-encoded data over simple UDP-protocol?
In-Reply-To: References: Message-ID:

> I would like some guidelines on how to combine video streaming over UDP with FEC encoding?

Unfortunately it's difficult to answer a question like this without knowing specifically what sort of FEC mechanism/protocol you're planning to use. If you can point to a specific IETF RFC that you're planning to follow, then it might be possible to say more...

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From tbatra18 at gmail.com Tue Mar 27 03:53:36 2012
From: tbatra18 at gmail.com (Tarun Batra)
Date: Tue, 27 Mar 2012 16:23:36 +0530
Subject: [Live-devel] Error while streaming using http
Message-ID:

I compiled the live media server .cpp, ByteStreamMemoryBufferSource.cpp, RTSPserversuppotinghttp.cpp, and TcpStreamLink.cpp. My project compiled successfully, but when I requested the address http://URL:portno/filename (bipbop-gear1-all.ts), it gave the following error:

    Unhandled exception at 0x1027ceb7 (msvcr100d.dll) in httpstreaming.exe:
    0xC0000005: Access violation reading location 0xfeeefef2. in memcpy.asm

Kindly help me sort it out. Thank you.

From Aleksandar.Milenkovic at rt-rk.com Wed Mar 28 02:18:08 2012
From: Aleksandar.Milenkovic at rt-rk.com (Aleksandar Milenkovic)
Date: Wed, 28 Mar 2012 11:18:08 +0200
Subject: [Live-devel] Easiest way to implement custom RTSP messages?
In-Reply-To: References: Message-ID: <4F72D750.5090509@rt-rk.com>

>> RTSP is not the protocol to use for this (because there's nothing defined in the RTSP protocol standard for doing this, and any 'custom' RTSP command that you'd implement would be incompatible with all of the other clients and servers out there).
>>
>> What you want, instead, is a regular web page that provides a list of "rtsp://" URLs - e.g., in an XML document.
>>
>> For this, you'd use HTTP - i.e., a regular web server - not RTSP.

Thank you for the prompt reply; however, you didn't quite answer my question, which was "What is the best way to implement custom RTSP messages?", not "What would you recommend to get a list of files from the server?".
Surely the implemented custom RTSP messages will be supported by my client/server combo, so I don't see why a web server (another workaround) would be necessary :) Thanks for the suggestion though; it's something to fall back to in case this fails.

P.S. Compatibility with other clients and/or strictly sticking to the protocol is not my concern at the moment.

Thanks in advance.

Aleksandar Milenkovic
Software Engineer
Phone: +381-(0)21-4801-139
Fax: +381-(0)21-450-721
Mobile: +381-(0)64-31-666-82
E-mail: Aleksandar.Milenkovic at rt-rk.com
RT-RK Computer Based Systems LLC
Fruskogorska 11, 21000 Novi Sad, Serbia
www.rt-rk.com

RT-RK invites you to visit us @ IBC2011, September 9-13 2011, stand 5.A01, Amsterdam RAI. For more information please visit www.bbt.rs

On 3/28/2012 5:54 AM, live-devel-request at ns.live555.com wrote:
> Re: Easiest way to implement custom RTSP messages?

From finlayson at live555.com Wed Mar 28 02:30:36 2012
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 28 Mar 2012 02:30:36 -0700
Subject: [Live-devel] Easiest way to implement custom RTSP messages?
In-Reply-To: <4F72D750.5090509@rt-rk.com>
References: <4F72D750.5090509@rt-rk.com>
Message-ID:

> Thank you for the prompt reply; however, you didn't quite answer my question, which was "What is the best way to implement custom RTSP messages?"

This is not something that's supported in the code. You *might* be able to get this to work, however, by subclassing "RTSPServer" and reimplementing the virtual function "handleCmd_notSupported()", and also by subclassing "RTSPClient". You will get absolutely no help from me on this, though.

> P.S. Compatibility with other clients and/or strictly sticking to the protocol is not my concern at the moment.

But it is my concern. If you want to do this, then you're on your own. Sorry.

Ross Finlayson
Live Networks, Inc.
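[Editor's note] For anyone attempting the unsupported route anyway, a very rough sketch of the server half follows. This is an illustration only: the nesting of the client-connection class inside "RTSPServer" and the exact signature of "handleCmd_notSupported()" have changed across live555 releases, so every name below must be checked against the RTSPServer.hh you actually build against.

```cpp
// Untested sketch: intercept RTSP commands the base class doesn't recognize
// (e.g. a hypothetical custom "LISTFILES" verb).  Class/method names follow
// recent RTSPServer.hh headers and may differ in older releases.
class MyRTSPServer : public RTSPServer {
protected:
  class MyClientConnection : public RTSPServer::RTSPClientConnection {
  protected:
    // Invoked when the request parser meets an unknown command name:
    virtual void handleCmd_notSupported() {
      // Examine the request here and build a custom response for your own
      // verb; otherwise fall back to the default rejection:
      RTSPServer::RTSPClientConnection::handleCmd_notSupported();
    }
  };
  // You would also need forwarding constructors for both classes, plus an
  // override of the virtual factory that creates client-connection objects,
  // so that the server actually instantiates MyClientConnection.
};
```

A matching "RTSPClient" subclass would then have to send the custom verb; as Ross notes, such a pair interoperates only with itself.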
http://www.live555.com/

From Shaheed at scansoft.co.za Wed Mar 28 04:07:45 2012
From: Shaheed at scansoft.co.za (Shaheed Abdol)
Date: Wed, 28 Mar 2012 13:07:45 +0200
Subject: [Live-devel] Possible missing parameter in MPEG4ESVideoRTPSink::auxSDPLine()
Message-ID: <002962EA5927BE45B2FFAB0B5B5D67970A8C96@SSTSVR1.sst.local>

Good afternoon,

I have hit an interesting obstacle while streaming RTSP video data from a live source. I am using the openRTSP client to retrieve data from a testOnDemandRTSPServer, to which I have added a new OnDemandServerMediaSubsession type named OnDemandVideoInputSubsession. I am streaming MPEG-4 elementary stream data (format: 96) and have come across a strange problem.

When ServerMediaSession::generateSDPDescription is called, I trace the call stack from there down to the OnDemandServerMediaSubsession::getAuxSDPLine(RTPSink*, FramedSource*) function, which internally calls MPEG4ESVideoRTPSink::auxSDPLine(). At the point where getAuxSDPLine is called, the RTPSink* and FramedSource* parameters point to the correct classes. When entering MPEG4ESVideoRTPSink::auxSDPLine(), the MP4 video RTP sink checks a variable named fSource, which should point to the FramedSource parameter from the preceding call, yet the parameter is not passed in from the getAuxSDPLine function. Is this a bug, or is there another way that fSource gets set? Currently fSource is NULL, so the function returns prematurely.

Without it being set, the SDP description that gets sent to the client is incomplete and is missing the line which should read "a=fmtp:96 profile-level-id=3; ...". As a result I can connect to the stream with any RTSP client (openRTSP and VLC, for instance) but cannot decode the resulting data for lack of information.
I do not want to edit the code in-place, since this is part of the live555 library. Are there any suggestions for passing the parameters from OnDemandServerMediaSubsession::getAuxSDPLine(RTPSink*, FramedSource*) to MPEG4ESVideoRTPSink::auxSDPLine(), or for setting the fSource member without destroying the live555 code?

Thank you
Regards

Shaheed Abdol
Web: www.scansoft.co.za
Tel: +27 21 913 8664
Cell: +27 79 835 8771

From finlayson at live555.com Wed Mar 28 08:26:11 2012
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 28 Mar 2012 08:26:11 -0700
Subject: [Live-devel] Possible missing parameter in MPEG4ESVideoRTPSink::auxSDPLine()
In-Reply-To: <002962EA5927BE45B2FFAB0B5B5D67970A8C96@SSTSVR1.sst.local>
References: <002962EA5927BE45B2FFAB0B5B5D67970A8C96@SSTSVR1.sst.local>
Message-ID:

The problem with MPEG-4 (and H.264) video is that - to properly serve the stream - you need (in the stream's SDP description) certain 'configuration' parameters. Often, these configuration parameters can be obtained only by reading the stream itself. In this case, we have a 'chicken and egg' problem: parameters that are needed to serve the stream can be obtained only by first reading the stream.

To solve this, note how in "MPEG4VideoFileServerMediaSubsession" (a subclass of "OnDemandServerMediaSubsession") we reimplement the virtual function "getAuxSDPLine()". If you *don't* know - in advance - the configuration parameters for the stream, then you will need to reimplement "getAuxSDPLine()" and do something similar in your own subclass of "OnDemandServerMediaSubsession".

If, however, you *do* know - in advance - the configuration parameters for the stream, then there is an alternative solution: recent versions of the code (>= version 2012.03.20) have an alternative version of "MPEG4ESVideoRTPSink::createNew()" that takes the configuration parameters as arguments.
Specifically, it takes two extra arguments:

    u_int8_t profileAndLevelIndication, char const* configStr

If you know this information in advance, then you should use this new version of "MPEG4ESVideoRTPSink::createNew()" in your reimplementation of the "createNewRTPSink()" virtual function, and you then *do not* need to reimplement "getAuxSDPLine()".

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From tbatra18 at gmail.com Wed Mar 28 07:25:54 2012
From: tbatra18 at gmail.com (Tarun Batra)
Date: Wed, 28 Mar 2012 19:55:54 +0530
Subject: [Live-devel] Heap corruption
Message-ID:

I ran the RTSPserverSupportingHttpStreaming file using the live555 media server. When I opened the URL, i.e. http://url:portno/filename, I got an error in memcpy.asm:

    Unhandled exception at 0x1027ceb7 (msvcr100d.dll) in http2.exe:
    0xC0000005: Access violation reading location 0xfeeefef2.

A file (11.5 KB) is downloaded in the Safari browser, but when I open that file it also shows errors. Which step have I not followed?

From junaid1 at mweb.co.za Wed Mar 28 13:46:18 2012
From: junaid1 at mweb.co.za (Junaid Ebrahim)
Date: Wed, 28 Mar 2012 22:46:18 +0200
Subject: [Live-devel] LIVE555 Media Server choose IP to listen on
Message-ID: <4F73789A.6040308@mweb.co.za>

Hi

It would be great if it were possible to allow a user to select the IP that the server should listen on (maybe set it as an optional command-line argument). I have multiple network devices and would like to change this to test streaming on any one of them. Where in the source code would I need to look if I wanted to change this or hard-code an IP to test?
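[Editor's note] A sketch of what Ross describes might look like the following in an "OnDemandServerMediaSubsession" subclass. The class name, member-variable scheme, profile/level byte, and config string are all placeholders for values you would obtain from your encoder; check the exact "MPEG4ESVideoRTPSink::createNew()" overload in your liveMedia headers (version 2012.03.20 or later).

```cpp
// Sketch: create the RTP sink with the MPEG-4 configuration already known,
// so the base class can emit the "a=fmtp:..." SDP line without first
// reading the stream (no getAuxSDPLine() override needed).
RTPSink* MyVideoSubsession::createNewRTPSink(Groupsock* rtpGroupsock,
                                             unsigned char rtpPayloadTypeIfDynamic,
                                             FramedSource* /*inputSource*/) {
  u_int8_t profileAndLevelIndication = 3;         // placeholder value
  char const* configStr = "000001b003000001b509"; // placeholder hex config
  return MPEG4ESVideoRTPSink::createNew(envir(), rtpGroupsock,
                                        rtpPayloadTypeIfDynamic,
                                        profileAndLevelIndication, configStr);
}
```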
thanks

Regards
Junaid

From finlayson at live555.com Wed Mar 28 22:59:34 2012
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 28 Mar 2012 22:59:34 -0700
Subject: [Live-devel] LIVE555 Media Server choose IP to listen on
In-Reply-To: <4F73789A.6040308@mweb.co.za>
References: <4F73789A.6040308@mweb.co.za>
Message-ID: <5D0394B9-86AC-45FA-BFF2-A53A48458294@live555.com>

The most reliable way to set the IP address that the server uses is to first configure your OS so that that interface is the one that's used for multicast routing, i.e., the one that has a route for 224.0.0.0/4.

In software, you could also try changing the IP address by first assigning the global variables

    ReceivingInterfaceAddr
    SendingInterfaceAddr

but this is unreliable and doesn't seem to work on all OSs. Configuring your OS so that the desired interface is used for multicast routing is the most reliable method.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From Marlon at scansoft.co.za Thu Mar 29 01:36:00 2012
From: Marlon at scansoft.co.za (Marlon Reid)
Date: Thu, 29 Mar 2012 10:36:00 +0200
Subject: [Live-devel] Correct use of RTSPClient->sendTeardownCommand
Message-ID: <002962EA5927BE45B2FFAB0B5B5D67970A8CA1@SSTSVR1.sst.local>

Hi,

I am streaming a file from the server to the client. When the file ends, the subsessionByeHandler in my client is called, which in turn calls subsessionAfterPlaying in the client. What I want to do is send a TEARDOWN command back to the server at this point, so that the server knows that the file has reached its end. I had a look at RTSPClient->sendTeardownCommand, but I am not sure how to specify which function to call in the server.

My questions:
a) Is RTSPClient->sendTeardownCommand the correct way to send a teardown to the server? If so, how do I use it?
b) Is there a way for the server itself to see when a file has reached its end?
c) What do you feel is the best way to handle something like this?

Thank you again for all your assistance.

From Shaheed at scansoft.co.za Thu Mar 29 03:15:21 2012
From: Shaheed at scansoft.co.za (Shaheed Abdol)
Date: Thu, 29 Mar 2012 12:15:21 +0200
Subject: [Live-devel] Possible missing parameter in MPEG4ESVideoRTPSink::auxSDPLine()
Message-ID: <002962EA5927BE45B2FFAB0B5B5D67970A8CA2@SSTSVR1.sst.local>

Thank you Ross,

Your information was spot-on, and I have discovered that I do not need to subclass the MPEG4ESVideoRTPSink class, since the fSource member gets set when the subsession starts playing.

I have come across an interesting limitation, though: it turns out that I cannot send out data, because the sendto call in GroupsockHelper fails with a WSAEMSGSIZE error code. I have read through the code, and I see there is a helper function in GroupsockHelper called setSendBufferTo, which requires the socket number in order to set the new send-buffer size for the socket. The only problem is that this variable is declared in the RTSPServer class as the private member fRTSPServerSocket, and no way is provided to get this socket number so that I can call setSendBufferTo.

I am guessing that possibly the only solution would be to subclass the RTSPServer code and try to set the buffer size in the constructor or some such place when needed, but isn't there a more elegant solution? I am using a discrete framer, since my application is serving discrete frames, so using a byte stream is not the first option for me. I have called the MPEG4ESVideoRTPSink::setPacketSizes function, but this only gives me large enough buffers to push data into; it does not do anything to the underlying operating-system buffer (probably by design), and exposes no clear way (or helper function) to increase socket buffer sizes - or am I mistaken?
Thank you
Regards

Shaheed Abdol
Web: www.scansoft.co.za
Tel: +27 21 913 8664
Cell: +27 79 835 8771

From finlayson at live555.com Thu Mar 29 07:23:06 2012
From: finlayson at live555.com (Ross Finlayson)
Date: Thu, 29 Mar 2012 07:23:06 -0700
Subject: [Live-devel] Correct use of RTSPClient->sendTeardownCommand
In-Reply-To: <002962EA5927BE45B2FFAB0B5B5D67970A8CA1@SSTSVR1.sst.local>
References: <002962EA5927BE45B2FFAB0B5B5D67970A8CA1@SSTSVR1.sst.local>
Message-ID: <1213EBEE-E7A0-44B1-A147-6A380A41FD8C@live555.com>

> I am streaming a file from the server to the client. When the file ends, the subsessionByeHandler in my client is called, which in turn calls subsessionAfterPlaying in the client. What I want to do is send a TEARDOWN command back to the server at this point, so that the server knows that the file has reached its end. I had a look at RTSPClient->sendTeardownCommand, but I am not sure how to specify which function to call in the server.

You don't. "sendTeardownCommand()" merely tells the client to send a RTSP "TEARDOWN" command for the specified session. It's up to the server to decide what to do when it receives that command. However, our server implementation will automatically do the right thing (i.e., close the input file) when it receives the command.

In summary:
- Look at the code for "testRTSPClient" to see how "subsessionByeHandler()" ends up calling "shutdownStream()", which, in turn, calls "sendTeardownCommand()".
- You don't need to make any modifications or additions to the server code; it already handles "TEARDOWN".

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
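[Editor's note] Condensed from "testRTSPClient.cpp", the client-side flow looks roughly like this ("ourRTSPClient" and "StreamClientState" are the demo's own classes; error handling and the loop that stops each subsession's sink are omitted):

```cpp
// subsessionByeHandler() -> subsessionAfterPlaying() -> shutdownStream():
void shutdownStream(RTSPClient* rtspClient) {
  StreamClientState& scs = ((ourRTSPClient*)rtspClient)->scs;
  if (scs.session != NULL) {
    // NULL response handler: send TEARDOWN without waiting for the reply.
    rtspClient->sendTeardownCommand(*scs.session, NULL);
  }
  // The server closes its input file on TEARDOWN by itself; no server-side
  // changes are required.
  Medium::close(rtspClient);
}
```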
From finlayson at live555.com Thu Mar 29 07:47:45 2012
From: finlayson at live555.com (Ross Finlayson)
Date: Thu, 29 Mar 2012 07:47:45 -0700
Subject: [Live-devel] Possible missing parameter in MPEG4ESVideoRTPSink::auxSDPLine()
In-Reply-To: <002962EA5927BE45B2FFAB0B5B5D67970A8CA2@SSTSVR1.sst.local>
References: <002962EA5927BE45B2FFAB0B5B5D67970A8CA2@SSTSVR1.sst.local>
Message-ID:

No, you don't need to subclass "RTSPServer" to do what you think you need to do. You already implement the "createNewRTPSink()" virtual function, in which you create a "RTPSink" (subclass) object, given a "Groupsock" object. If you need to do something with the "Groupsock" object (or its socket, which you can obtain by calling "Groupsock::socketNum()"), then you can do so here.

I'm puzzled by why you're suddenly getting the "WSAEMSGSIZE" error. But if you think that it's due to insufficient socket buffering in your OS, then you can increase that by calling "increaseSendBufferTo()" on the groupsock object's socket number. (See "groupsock/include/GroupsockHelper.hh".)

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From sambhav at saranyu.in Fri Mar 30 12:39:19 2012
From: sambhav at saranyu.in (Kumar Sambhav)
Date: Sat, 31 Mar 2012 01:09:19 +0530
Subject: [Live-devel] RTP/RTCP Server Port number configuration
Message-ID:

Hi,

How does the OnDemand RTSP Server choose RTP/RTCP server ports? How can these port numbers be configured to a predefined value?

Regards,
Sambhav

From finlayson at live555.com Fri Mar 30 13:01:02 2012
From: finlayson at live555.com (Ross Finlayson)
Date: Fri, 30 Mar 2012 13:01:02 -0700
Subject: [Live-devel] RTP/RTCP Server Port number configuration
In-Reply-To: References: Message-ID: <60821355-3826-4CAA-91FC-454D294EA1A4@live555.com>

> How does the OnDemand RTSP Server choose RTP/RTCP server ports?
> How can these port numbers be configured to a predefined value?
You can't configure these port numbers to *a* predefined value, because it's possible for the server to be handling concurrent connections from several different clients, in which case the server uses a different port-number pair (RTP even; RTCP odd) for each connection.

The port-number range that the server uses depends upon the "initialPortNum" parameter to the "OnDemandServerMediaSubsession" constructor. By default, this parameter is 6970, which means that the server will try {6970,6971}, then {6972,6973}, etc., until it finds a port-number pair that isn't already used. So, if you want to change this parameter, you would do so in your subclass(es) of "OnDemandServerMediaSubsession" (when their constructors call the "OnDemandServerMediaSubsession" constructor).

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From lroels at hotmail.com Fri Mar 30 02:27:55 2012
From: lroels at hotmail.com (Luc Roels)
Date: Fri, 30 Mar 2012 09:27:55 +0000
Subject: [Live-devel] Frame loss
Message-ID:

Hi,

We seem to have occasional frame loss when connecting to an Aviglion camera streaming H.264 using RTP over RTSP: we occasionally see blurring in the video. After some inspection we see that sometimes an H.264 key frame is missing. Taking a Wireshark trace shows it's not a camera problem; the key frame is transmitted and complete (all RTP timestamps are present up to the frame with the marker bit set), so livemedia should not throw it away.

Did anyone experience a similar problem, or can anyone direct us to the location where we can debug this issue? I don't know if this is related, but when the frame loss occurs we often see an OPTIONS command being sent by livemedia to the camera.

regards,
Luc Roels