From Marlon at scansoft.co.za Tue Nov 1 00:51:29 2011 From: Marlon at scansoft.co.za (Marlon Reid) Date: Tue, 1 Nov 2011 09:51:29 +0200 Subject: [Live-devel] Payload attributes Message-ID: <002962EA5927BE45B2FFAB0B5B5D67970A897E@SSTSVR1.sst.local> Hi, I want my application to be able to stream different streams of the same type, but with different attributes. For example, I want to stream a 44100khz PCM and a 22050khz PCM. The application that receives the stream will play the one selected by the user. The question is how can I add this information to the SDP description. Should I create one for every type e.g. PCM 16 44100 stereo, PCM 8 22050 mono and add it to the lookupPayloadFormat function in MediaSession.cpp? What about the information not provided for in the function, such as kpbs? I hope the question is clear. Thanks ___________________________________ Marlon Reid Web: www.scansoft.co.za Tel: +27 21 913 8664 Cell: +27 72 359 0902 -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: image/png Size: 32497 bytes Desc: SST Logo with Shadow.png URL: From finlayson at live555.com Tue Nov 1 01:17:56 2011 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 1 Nov 2011 01:17:56 -0700 Subject: [Live-devel] Payload attributes In-Reply-To: <002962EA5927BE45B2FFAB0B5B5D67970A897E@SSTSVR1.sst.local> References: <002962EA5927BE45B2FFAB0B5B5D67970A897E@SSTSVR1.sst.local> Message-ID: <53ADB73A-96ED-4D40-8D4C-556284591732@live555.com> > I want my application to be able to stream different streams of the same type, but with different attributes. For example, I want to stream a 44100khz PCM and a 22050khz PCM. The application that receives the stream will play the one selected by the user. The question is how can I add this information to the SDP description. Should I create one for every type e.g. PCM 16 44100 stereo, PCM 8 22050 mono and add it to the lookupPayloadFormat function in MediaSession.cpp? What about the information not provided for in the function, such as kpbs? You (the server developer) don't deal with SDP descriptions directly; our software takes care of this for you. Because you want your server to support streaming different streams (selected by the user), these should be separate "ServerMediaSession" objects (each with a separate stream name). Each of these "ServerMediaSession" objects will have a single "ServerMediaSubsession" (subclass) member. You, as a server developer, will need to define and implement your own "ServerMediaSubsession" subclass for your PCM streams. You may find it useful to use the "WAVAudioFileServerMediaSubsession" code as a model for this. As always, you will need to implement the virtual functions "createNewStreamSource()" and "createNewRTPSink()". In particular, you will implement "createNewRTPSink()" by calling SimpleRTPSink::createNew() with appropriate parameters (again, you may find it useful to use the "WAVAudioFileServerMediaSubsession" code as a model). Our server code will automatically generate an appropriate SDP description for each stream, based on the parameters that you gave to "SimpleRTPSink::createNew()". Your RTSP client can then access the desired stream by including the desired stream name in the "rtsp://" URL. Note that you should not need to modify *any* of our library code to develop your server or client. Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From Marlon at scansoft.co.za Tue Nov 1 01:38:42 2011 From: Marlon at scansoft.co.za (Marlon Reid) Date: Tue, 1 Nov 2011 10:38:42 +0200 Subject: [Live-devel] Payload attributes In-Reply-To: <53ADB73A-96ED-4D40-8D4C-556284591732@live555.com> References: <002962EA5927BE45B2FFAB0B5B5D67970A897E@SSTSVR1.sst.local> <53ADB73A-96ED-4D40-8D4C-556284591732@live555.com> Message-ID: <002962EA5927BE45B2FFAB0B5B5D67970A8980@SSTSVR1.sst.local> Hi Ross, Thanks for the reply. I have already created my own ServerMediaSubsession subclasses for my application. One for PCM and one for MP3. By doing this, the listener is able to see if it is receiving a PCM stream or a MP3 stream and play it correctly. The problem however is that I need to send attributes along, for example the bits, frequency and channels of the PCM sample so that the listener knows how to play it. The bits, frequency and channels can be changed on the streamer side and this is the information that I want in the SDPDescription or any other appropriate place. I guess the main point is if I can put bits, channels and frequency in the RTP stream so that the listening application can get it out and play the sample correctly. I hope that this is more clear. ___________________________________ Marlon Reid Web: www.scansoft.co.za Tel: +27 21 913 8664 Cell: +27 72 359 0902 ________________________________ From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: 01 November 2011 10:18 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Payload attributes I want my application to be able to stream different streams of the same type, but with different attributes. For example, I want to stream a 44100khz PCM and a 22050khz PCM. The application that receives the stream will play the one selected by the user. The question is how can I add this information to the SDP description. Should I create one for every type e.g. PCM 16 44100 stereo, PCM 8 22050 mono and add it to the lookupPayloadFormat function in MediaSession.cpp? What about the information not provided for in the function, such as kpbs? You (the server developer) don't deal with SDP descriptions directly; our software takes care of this for you. Because you want your server to support streaming different streams (selected by the user), these should be separate "ServerMediaSession" objects (each with a separate stream name). Each of these "ServerMediaSession" objects will have a single "ServerMediaSubsession" (subclass) member. You, as a server developer, will need to define and implement your own "ServerMediaSubsession" subclass for your PCM streams. You may find it useful to use the "WAVAudioFileServerMediaSubsession" code as a model for this. As always, you will need to implement the virtual functions "createNewStreamSource()" and "createNewRTPSink()". In particular, you will implement "createNewRTPSink()" by calling SimpleRTPSink::createNew() with appropriate parameters (again, you may find it useful to use the "WAVAudioFileServerMediaSubsession" code as a model). Our server code will automatically generate an appropriate SDP description for each stream, based on the parameters that you gave to "SimpleRTPSink::createNew()". Your RTSP client can then access the desired stream by including the desired stream name in the "rtsp://" URL. 
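A minimal sketch of the kind of "ServerMediaSubsession" subclass described above, with "createNewRTPSink()" implemented via "SimpleRTPSink::createNew()". The class name, members and parameter choices here are hypothetical illustrations (they are not from the original posts); the real model to follow is "WAVAudioFileServerMediaSubsession" in the "liveMedia" directory:

#include "OnDemandServerMediaSubsession.hh"
#include "SimpleRTPSink.hh"

// Hypothetical subsession for live PCM; only the RTP-sink side is fleshed out here.
class PCMAudioServerMediaSubsession: public OnDemandServerMediaSubsession {
public:
  static PCMAudioServerMediaSubsession* createNew(UsageEnvironment& env,
      unsigned samplingFrequency, unsigned numChannels, unsigned bitsPerSample) {
    return new PCMAudioServerMediaSubsession(env, samplingFrequency, numChannels, bitsPerSample);
  }

protected:
  PCMAudioServerMediaSubsession(UsageEnvironment& env, unsigned samplingFrequency,
                                unsigned numChannels, unsigned bitsPerSample)
    : OnDemandServerMediaSubsession(env, True/*reuse first source*/),
      fSamplingFrequency(samplingFrequency), fNumChannels(numChannels),
      fBitsPerSample(bitsPerSample) {}

  virtual FramedSource* createNewStreamSource(unsigned /*clientSessionId*/, unsigned& estBitrate) {
    estBitrate = (fSamplingFrequency * fNumChannels * fBitsPerSample) / 1000; // kbps
    return NULL; // <-- return your own PCM capture/encoder "FramedSource" here
  }

  virtual RTPSink* createNewRTPSink(Groupsock* rtpGroupsock,
                                    unsigned char rtpPayloadTypeIfDynamic,
                                    FramedSource* /*inputSource*/) {
    // The parameters given here are what end up in the stream's SDP description:
    char const* mimeType = fBitsPerSample == 16 ? "L16" : "L8";
    return SimpleRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic,
                                    fSamplingFrequency, "audio", mimeType, fNumChannels);
  }

private:
  unsigned fSamplingFrequency, fNumChannels, fBitsPerSample;
};

Each sampling-rate/channel variant would then be wrapped in its own "ServerMediaSession" (with its own stream name), exactly as described above. Note also that "L16" audio is sent in network (big-endian) byte order, which is why the WAV-file model wraps 16-bit little-endian sources in its "EndianSwap16" filter.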
Note that you should not need to modify *any* of our library code to develop your server or client. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: image/png Size: 32497 bytes Desc: SST Logo with Shadow.png URL: From finlayson at live555.com Tue Nov 1 01:55:04 2011 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 1 Nov 2011 01:55:04 -0700 Subject: [Live-devel] Payload attributes In-Reply-To: <002962EA5927BE45B2FFAB0B5B5D67970A8980@SSTSVR1.sst.local> References: <002962EA5927BE45B2FFAB0B5B5D67970A897E@SSTSVR1.sst.local> <53ADB73A-96ED-4D40-8D4C-556284591732@live555.com> <002962EA5927BE45B2FFAB0B5B5D67970A8980@SSTSVR1.sst.local> Message-ID: <2FEF6270-F554-447A-ABC0-1AE89FB47FFF@live555.com> > I have already created my own ServerMediaSubsession subclasses for my application. One for PCM and one for MP3. By doing this, the listener is able to see if it is receiving a PCM stream or a MP3 stream and play it correctly. The problem however is that I need to send attributes along, for example the bits, frequency and channels of the PCM sample so that the listener knows how to play it. The bits, frequency and channels can be changed on the streamer side and this is the information that I want in the SDPDescription or any other appropriate place. The sampling frequency and number of channels will sometimes appear in the SDP description (in the a=rtpmap: line), if the RTP payload type is dynamic. If the RTP payload type is static, however, then these parameters are implied. For example, payload type 10 is defined to mean 44100 Hz 2-channel PCM (16-bit) audio. I agree, however, that this won't be enough for you, because you want this information to always appear explicitly in the SDP description. What you can do, however, is create a string - describing the stream's parameters - and pass it as either the "info" or the "description" parameter to "ServerMediaSession::createNew()". Then, this string will automatically appear in the SDP description, as the "i=" or the "s=" line, respectively. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From david.myers at panogenics.com Wed Nov 2 01:11:20 2011 From: david.myers at panogenics.com (David J Myers) Date: Wed, 2 Nov 2011 08:11:20 -0000 Subject: [Live-devel] Request for help Message-ID: <001e01cc9937$020d7870$06286950$@myers@panogenics.com> Hi, I'm not sure if this sort of request is allowed on the list but I'm looking for someone to help us modify and debug a Live555 embedded implementation. Specifically we have RTSP working/streaming with MPEG4 but when we switch to H.264 compression, it just won't hack it. We are based in the UK near Gatwick airport. If any of you Live555 gurus are free to come down and help us with this for a couple of weeks, please let me know. Thanks in advance. David Myers Panogenics Ltd David.myers at panogenics.com -------------- next part -------------- An HTML attachment was scrubbed... 
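Tying back to Ross's suggestion above about the "info"/"description" parameters of "ServerMediaSession::createNew()", here is a sketch of how a server might encode the PCM parameters into those strings. The helper function, its arguments, and the "PCMAudioServerMediaSubsession" class (the hypothetical subsession sketched earlier in this thread) are illustrative assumptions; the library calls themselves ("ServerMediaSession::createNew()", "addSubsession()", "addServerMediaSession()") are the real API, and the formatting of the string is entirely up to the application:

#include <cstdio>
#include "liveMedia.hh"

ServerMediaSession* addPCMStream(RTSPServer* rtspServer, UsageEnvironment& env,
                                 char const* streamName, unsigned samplingFrequency,
                                 unsigned numChannels, unsigned bitsPerSample) {
  char info[100];
  snprintf(info, sizeof info, "PCM %u-bit, %u Hz, %u channel(s)",
           bitsPerSample, samplingFrequency, numChannels);

  // "info" becomes the SDP "i=" line; "description" becomes the "s=" line:
  ServerMediaSession* sms
    = ServerMediaSession::createNew(env, streamName,
                                    info /*-> "i="*/, "PCM audio stream" /*-> "s="*/);
  sms->addSubsession(PCMAudioServerMediaSubsession::createNew(env, samplingFrequency,
                                                              numChannels, bitsPerSample));
  rtspServer->addServerMediaSession(sms);
  return sms;
}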
URL: From kvishnath at ymail.com Tue Nov 1 22:38:33 2011 From: kvishnath at ymail.com (Viswanatha Reddy) Date: Wed, 2 Nov 2011 11:08:33 +0530 (IST) Subject: [Live-devel] RTSP Server Crash with Milestone Client In-Reply-To: <2FEF6270-F554-447A-ABC0-1AE89FB47FFF@live555.com> References: <002962EA5927BE45B2FFAB0B5B5D67970A897E@SSTSVR1.sst.local> <53ADB73A-96ED-4D40-8D4C-556284591732@live555.com> <002962EA5927BE45B2FFAB0B5B5D67970A8980@SSTSVR1.sst.local> <2FEF6270-F554-447A-ABC0-1AE89FB47FFF@live555.com> Message-ID: <1320212313.41925.YahooMailNeo@web95006.mail.in2.yahoo.com> Dear Ross, I have developed an (mpeg4)RTSP streamer(server) using live555 libs. It's unicast based on OnDemandServer example module. Thanks for your priceless contribution to the media community. The application takes rgb frames from a buffer, encodes it, and passes it to the derived FramedSource module. I ran without any issues with vlc and mplayer client for weeks together. The application was now tested with Milestone Client(used Onvif standard to make the Streamer to be seen as a virtual camera by this client). But it's crashing frequently inside the eventloop. I thoroughly checked whether it's crashing in my development. It's not. Surely it's crashing at the step of eventloop. This never happens with vlc/mplayer !!. Please let me know, if the RTSP Server implementation is client dependent. Or give me a direction to solve this issue. Thanks in advance, K.Viswanatha Reddy, Bangalore, India. -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Nov 2 12:39:40 2011 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 2 Nov 2011 12:39:40 -0700 Subject: [Live-devel] RTSP Server Crash with Milestone Client In-Reply-To: <1320212313.41925.YahooMailNeo@web95006.mail.in2.yahoo.com> References: <002962EA5927BE45B2FFAB0B5B5D67970A897E@SSTSVR1.sst.local> <53ADB73A-96ED-4D40-8D4C-556284591732@live555.com> <002962EA5927BE45B2FFAB0B5B5D67970A8980@SSTSVR1.sst.local> <2FEF6270-F554-447A-ABC0-1AE89FB47FFF@live555.com> <1320212313.41925.YahooMailNeo@web95006.mail.in2.yahoo.com> Message-ID: > I have developed an (mpeg4)RTSP streamer(server) using live555 libs. It's unicast based on OnDemandServer example module. > Thanks for your priceless contribution to the media community. > > The application takes rgb frames from a buffer, encodes it, and passes it to the derived FramedSource module. > I ran without any issues with vlc and mplayer client for weeks together. > > The application was now tested with Milestone Client(used Onvif standard to make the Streamer to be seen as a virtual camera by this client). > But it's crashing frequently inside the eventloop. > I thoroughly checked whether it's crashing in my development. It's not. Surely it's crashing at the step of eventloop. > This never happens with vlc/mplayer !!. Unfortunately, because the problem seems to happen only with your custom server, we can't reproduce it ourselves, so you're going to have to identify precisely where - in our code - your server is crashing, and why. (BTW, almost everything within a LIVE555 application happens "inside the event loop", so that's probably not significant.) For starters, I suggest turning on debugging printing in the "RTSPServer" code by adding #define DEBUG 1 near the top of "liveMedia/RTSPServer.cpp", and recompiling. This should help tell you what's going wrong. 
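For reference, the debugging switch Ross mentions is just a preprocessor definition added to that one source file (the comment line here is illustrative):

// near the top of "liveMedia/RTSPServer.cpp":
#define DEBUG 1

After recompiling the "liveMedia" library, the server then logs each incoming RTSP request and the response it sends, which usually narrows down where things go wrong.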
It's conceivable that your new client has somehow (perhaps by using slightly different RTSP command syntax) uncovered an error in our server code - in which case I'd be very interested in discovering this. -------------- next part -------------- An HTML attachment was scrubbed... URL: From isambhav at gmail.com Thu Nov 3 01:36:58 2011 From: isambhav at gmail.com (Sambhav) Date: Thu, 3 Nov 2011 14:06:58 +0530 Subject: [Live-devel] error (ARM) : terminate called after throwing an instance of 'int' Message-ID: Hi, I cross compiled the live555 media server for ARM linux using the montavista toolchain. The compilation was successful. When I run the test programs the following error occurs and the program gets aborted. terminate called after throwing an instance of 'int' terminate called recursively Aborted Any solutions for the error ? Regards, Sambhav -------------- next part -------------- An HTML attachment was scrubbed... URL: From felix at embedded-sol.com Thu Nov 3 03:46:37 2011 From: felix at embedded-sol.com (Felix Radensky) Date: Thu, 03 Nov 2011 12:46:37 +0200 Subject: [Live-devel] Combining live h.264 and AAC Message-ID: <4EB2710D.4050003@embedded-sol.com> Hello, I'm trying to figure out whether it's possible to produce combined h.264 and AAC stream using live555. Both video and audio are coming from hardware encoders. Video only is streamed successfully using H264VideoStreamDiscreteFramer. I did not find any examples of reading framed AAC data and combining h.264 and AAC elementary streams. There's an example of MPEG-1 or 2 audio+video program stream, but I doubt it can help me. Any feedback is highly appreciated. Thanks. Felix. From finlayson at live555.com Thu Nov 3 07:43:36 2011 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 3 Nov 2011 07:43:36 -0700 Subject: [Live-devel] Combining live h.264 and AAC In-Reply-To: <4EB2710D.4050003@embedded-sol.com> References: <4EB2710D.4050003@embedded-sol.com> Message-ID: <606986B2-33A3-4841-901A-901AC460E03C@live555.com> > I'm trying to figure out whether it's possible to produce combined h.264 and AAC > stream using live555. Yes, you can do this, using separate RTP streams ('subsessions') within a single media 'session'. I.e., in your RTSP server, after you've created a "ServerMediaSession" object, you will add *two* separate "ServerMediaSubsession" (subclass) objects: One for your video (H.264) stream; the other for your audio (AAC) stream. Just make sure that the presentation times for your video and audio frames are 'in sync', and aligned with 'wall clock' time (i.e., times that are generated by "gettimeofday()"). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From tinitus.rockt at gmx.de Fri Nov 4 01:00:52 2011 From: tinitus.rockt at gmx.de (Simon Helwig) Date: Fri, 04 Nov 2011 09:00:52 +0100 Subject: [Live-devel] Compiling a test program Message-ID: <20111104080052.27340@gmx.net> Hallo, I'm a student from Germany and yesterday I tried to compile one of your test programs, but it didn't work. So I hope you can help me with these problems, because it's very important. First I will tell you, which steps I made, and in which order, to make the program work. So here we go (I use a Windows XP Professional System (32 Bit) with Microsoft Visual Studio 2010): First I downloaded the package from your homepage (live555-latest.tar.gz 01-Nov-2011 19:01 508K) and extracted the files in a new folder on my Desktop. 
Then I changed the path of the 'tools' directory in the file "win32config". The new path I entered was C:\Programme\Microsoft Visual Studio 10.0\VC Afterwards I opened the commando line and went to the new folder, where I extracted the files of the package. I typed the commando "genWindowsMakefiles" to generate the *.mak files in the library folders. And always on the first try it goes wrong. But right after the first attempt (in the second try) it works. So I only repeat the commando right after the first attempt. It seems strange... But the PC generated the whole *.mak files in folders. In addition to that I opened a new "Makefile Project" in Visual Studio. In this project I imported all the *.mak files I generated before. There are four files: BasicUsageEnviornment.mak, groupsok.mak, liveMedia.mak and UsageEnviornment.mak Then I built successively each of the four files (projects) first. Now I opened C++ project and took one of the test programms (testMPEG2TransportStreamer.cpp) and copied the code in it. Thereafter I tried to compile the code, but then I got a lot of mistakes. I guess, that I have made a fundamental error in my steps before. So long and best greetings Simon -- Empfehlen Sie GMX DSL Ihren Freunden und Bekannten und wir belohnen Sie mit bis zu 50,- Euro! https://freundschaftswerbung.gmx.de From felix at embedded-sol.com Fri Nov 4 01:26:32 2011 From: felix at embedded-sol.com (Felix Radensky) Date: Fri, 04 Nov 2011 10:26:32 +0200 Subject: [Live-devel] Combining live h.264 and AAC In-Reply-To: <606986B2-33A3-4841-901A-901AC460E03C@live555.com> References: <4EB2710D.4050003@embedded-sol.com> <606986B2-33A3-4841-901A-901AC460E03C@live555.com> Message-ID: <4EB3A1B8.2000006@embedded-sol.com> Hi Ross, On 11/03/2011 04:43 PM, Ross Finlayson wrote: >> I'm trying to figure out whether it's possible to produce combined >> h.264 and AAC >> stream using live555. > > Yes, you can do this, using separate RTP streams ('subsessions') > within a single media 'session'. > > I.e., in your RTSP server, after you've created a > "ServerMediaSession" object, you will add *two* separate > "ServerMediaSubsession" (subclass) objects: One for your video > (H.264) stream; the other for your audio (AAC) stream. > > Just make sure that the presentation times for your video and audio > frames are 'in sync', and aligned with 'wall clock' time (i.e., > times that are generated by "gettimeofday()"). > > Thanks a lot. Assuming my audio comes from encoder in ADTS, do I need to implement a framer class, similar to H264VideoStreamDiscreteFramer ? Felix. From francisco at j2kvideo.com Fri Nov 4 05:43:54 2011 From: francisco at j2kvideo.com (Francisco Feijoo) Date: Fri, 4 Nov 2011 13:43:54 +0100 Subject: [Live-devel] Segmentation fault in DelayQueue::removeEntry(DelayQueueEntry*) Message-ID: <55D3C50F-B844-468E-8EDB-20CEF267F19B@j2kvideo.com> Hello, We have developed a RTSP client using live555. 
#0 0x006f253d in DelayQueue::removeEntry(DelayQueueEntry*) () from /usr/lib/libvideosource.so.1 #1 0x006f2c4f in DelayQueue::handleAlarm() () from /usr/lib/libvideosource.so.1 #2 0x006f1f70 in BasicTaskScheduler::SingleStep(unsigned int) () from /usr/lib/libvideosource.so.1 #3 0x006f3920 in BasicTaskScheduler0::doEventLoop(char*) () from /usr/lib/libvideosource.so.1 Looking at the code here http://www.live555.com/liveMedia/doxygen/html/DelayQueue_8cpp-source.html I see this: 00153 void DelayQueue::removeEntry(DelayQueueEntry* entry) { 00154 if (entry == NULL || entry->fNext == NULL) return; 00155 00156 entry->fNext->fDeltaTimeRemaining += entry->fDeltaTimeRemaining; 00157 entry->fPrev->fNext = entry->fNext; 00158 entry->fNext->fPrev = entry->fPrev; 00159 entry->fNext = entry->fPrev = NULL; 00160 // in case we should try to remove it again 00161 } I think the first if could produce a wrong memory access if entry is NULL. Is that correct? Thanks in advance. -- Francisco Feijoo Software Engineer J2K Video Limited T: +44 020 8133 9388 E: francisco at j2kvideo.com W: www.j2kvideo.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Nov 4 07:10:39 2011 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 4 Nov 2011 07:10:39 -0700 Subject: [Live-devel] Segmentation fault in DelayQueue::removeEntry(DelayQueueEntry*) In-Reply-To: <55D3C50F-B844-468E-8EDB-20CEF267F19B@j2kvideo.com> References: <55D3C50F-B844-468E-8EDB-20CEF267F19B@j2kvideo.com> Message-ID: > Looking at the code here http://www.live555.com/liveMedia/doxygen/html/DelayQueue_8cpp-source.html I see this: > > 00153 void DelayQueue::removeEntry(DelayQueueEntry* entry) { > 00154 if (entry == NULL || entry->fNext == NULL) return; > 00155 > 00156 entry->fNext->fDeltaTimeRemaining += entry->fDeltaTimeRemaining; > 00157 entry->fPrev->fNext = entry->fNext; > 00158 entry->fNext->fPrev = entry->fPrev; > 00159 entry->fNext = entry->fPrev = NULL; > 00160 // in case we should try to remove it again > 00161 } > > I think the first if could produce a wrong memory access if entry is NULL. Is that correct? No, because the statement at line 154 quite clearly tests for "entry == NULL", and returns if it is. The "DelayQueue" code is very widely used and has been tested for a long time, so I don't understand why you would be seeing an error there. (I hope you're not doing something stupid like trying to use multiple threads?) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From francisco at j2kvideo.com Fri Nov 4 07:42:56 2011 From: francisco at j2kvideo.com (Francisco Feijoo) Date: Fri, 4 Nov 2011 15:42:56 +0100 Subject: [Live-devel] Segmentation fault in DelayQueue::removeEntry(DelayQueueEntry*) In-Reply-To: References: <55D3C50F-B844-468E-8EDB-20CEF267F19B@j2kvideo.com> Message-ID: Ross, thanks for the quick answer. We are connecting to one camera which needs a keep-alive command continuously. 
Apart from the doEventLoop() we call this function periodically (60s): void RtspConnection::timeout() { if ( client && session && this->getParameterSupported ) { char * psz_bye = NULL; ((RTSPClient*)client)->getMediaSessionParameter( *session, NULL, psz_bye ); } } getParameterSupported comes from: char * options = ((RTSPClient*)client)->sendOptionsCmd( this->host ); if ( strstr(options, "GET_PARAMETER") != NULL ) { this->getParameterSupported = true; } Is this a wrong way to maintain the connection to the camera? Could this be the cause of the crash? Thanks in advance. -- Francisco Feijoo Software Engineer J2K Video Limited T: +44 020 8133 9388 E: francisco at j2kvideo.com W: www.j2kvideo.com El 04/11/2011, a las 15:10, Ross Finlayson escribi?: >> Looking at the code here http://www.live555.com/liveMedia/doxygen/html/DelayQueue_8cpp-source.html I see this: >> >> 00153 void DelayQueue::removeEntry(DelayQueueEntry* entry) { >> 00154 if (entry == NULL || entry->fNext == NULL) return; >> 00155 >> 00156 entry->fNext->fDeltaTimeRemaining += entry->fDeltaTimeRemaining; >> 00157 entry->fPrev->fNext = entry->fNext; >> 00158 entry->fNext->fPrev = entry->fPrev; >> 00159 entry->fNext = entry->fPrev = NULL; >> 00160 // in case we should try to remove it again >> 00161 } >> >> I think the first if could produce a wrong memory access if entry is NULL. Is that correct? > > No, because the statement at line 154 quite clearly tests for "entry == NULL", and returns if it is. > > The "DelayQueue" code is very widely used and has been tested for a long time, so I don't understand why you would be seeing an error there. (I hope you're not doing something stupid like trying to use multiple threads?) > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Nov 4 08:05:56 2011 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 4 Nov 2011 08:05:56 -0700 Subject: [Live-devel] Segmentation fault in DelayQueue::removeEntry(DelayQueueEntry*) In-Reply-To: References: <55D3C50F-B844-468E-8EDB-20CEF267F19B@j2kvideo.com> Message-ID: <94521E97-11AA-4438-A240-9F1F361FE2D2@live555.com> > Is this a wrong way to maintain the connection to the camera? Could this be the cause of the crash? No, I don't think so. What you're doing looks OK. Just make sure, though, that you stop the periodic calls to timeout() if/when the session ends. (You can do this using "TaskScheduler::unscheduleDelayedTask()".) If you try to call your "RtspConnection::timeout()" after "client" and/or "session" have been deleted, then bad things will definitely happen. Also (although this is not the cause of your problem) you should really be using the asynchronous "RTSPClient" interface rather than the old synchronous interface. E.g., you should be calling "sendGetParameterCommand()" instead of "getMediaSessionParameter()", and "sendOptionsCommand()" instead of "sendOptionsCmd()". Etc. The synchronous "RTSPClient" interface is now deprecated, and will likely be removed completely sometime in the future. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
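To make the two suggestions above concrete (drive the keep-alive from the event loop with the asynchronous interface, and cancel it before the client/session objects are deleted), here is a rough sketch. Everything except the LIVE555 calls ("scheduleDelayedTask()", "unscheduleDelayedTask()", "sendGetParameterCommand()") is hypothetical, and it assumes the keep-alive runs entirely on the event-loop thread:

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"

// Hypothetical globals, for brevity; a real client would keep these in its own state object:
RTSPClient*   ourClient     = NULL;
MediaSession* ourSession    = NULL;
TaskToken     keepAliveTask = NULL;

void onGetParameterResponse(RTSPClient* /*client*/, int /*resultCode*/, char* resultString) {
  delete[] resultString; // response handlers are responsible for freeing this string
}

void sendKeepAlive(void* /*clientData*/) {
  if (ourClient == NULL || ourSession == NULL) return;
  // Asynchronous replacement for the deprecated "getMediaSessionParameter()":
  ourClient->sendGetParameterCommand(*ourSession, onGetParameterResponse, NULL);
  // Re-arm the timer for 60 seconds from now:
  keepAliveTask = ourClient->envir().taskScheduler()
                    .scheduleDelayedTask(60*1000000, sendKeepAlive, NULL);
}

void stopKeepAlive(UsageEnvironment& env) {
  // Call this before deleting "ourClient"/"ourSession"; otherwise the pending
  // task would later run against freed objects - the failure mode described above:
  env.taskScheduler().unscheduleDelayedTask(keepAliveTask);
  keepAliveTask = NULL;
}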
URL: From francisco at j2kvideo.com Fri Nov 4 09:33:13 2011 From: francisco at j2kvideo.com (Francisco Feijoo) Date: Fri, 4 Nov 2011 17:33:13 +0100 Subject: [Live-devel] Segmentation fault in DelayQueue::removeEntry(DelayQueueEntry*) In-Reply-To: <94521E97-11AA-4438-A240-9F1F361FE2D2@live555.com> References: <55D3C50F-B844-468E-8EDB-20CEF267F19B@j2kvideo.com> <94521E97-11AA-4438-A240-9F1F361FE2D2@live555.com> Message-ID: <9BAA26DE-5CDE-4D66-B618-F884FD003996@j2kvideo.com> OK, thanks your help. -- Francisco Feijoo Software Engineer J2K Video Limited T: +34 654967246 T: +44 020 8133 9388 E: francisco at j2kvideo.com W: www.j2kvideo.com El 04/11/2011, a las 16:05, Ross Finlayson escribi?: >> Is this a wrong way to maintain the connection to the camera? Could this be the cause of the crash? > > No, I don't think so. What you're doing looks OK. Just make sure, though, that you stop the periodic calls to timeout() if/when the session ends. (You can do this using "TaskScheduler::unscheduleDelayedTask()".) If you try to call your "RtspConnection::timeout()" after "client" and/or "session" have been deleted, then bad things will definitely happen. > > > Also (although this is not the cause of your problem) you should really be using the asynchronous "RTSPClient" interface rather than the old synchronous interface. E.g., you should be calling "sendGetParameterCommand()" instead of "getMediaSessionParameter()", and "sendOptionsCommand()" instead of "sendOptionsCmd()". Etc. The synchronous "RTSPClient" interface is now deprecated, and will likely be removed completely sometime in the future. > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From kidjan at gmail.com Fri Nov 4 11:50:40 2011 From: kidjan at gmail.com (Jeremy Noring) Date: Fri, 4 Nov 2011 11:50:40 -0700 Subject: [Live-devel] Segmentation fault in DelayQueue::removeEntry(DelayQueueEntry*) In-Reply-To: References: <55D3C50F-B844-468E-8EDB-20CEF267F19B@j2kvideo.com> Message-ID: On Fri, Nov 4, 2011 at 7:10 AM, Ross Finlayson wrote: > Looking at the code here > http://www.live555.com/liveMedia/doxygen/html/DelayQueue_8cpp-source.html I > see this: > > 00153 void DelayQueue::removeEntry (DelayQueueEntry * entry) {00154 if (entry == NULL || entry->fNext == NULL ) return;00155 00156 entry->fNext ->fDeltaTimeRemaining += entry->fDeltaTimeRemaining ;00157 entry->fPrev ->fNext = entry->fNext ;00158 entry->fNext ->fPrev = entry->fPrev ;00159 entry->fNext = entry->fPrev = NULL ;00160 // in case we should try to remove it again00161 } > > > I think the first if could produce a wrong memory access if entry is NULL. > Is that correct? > > > No, because the statement at line 154 quite clearly tests for "entry == > NULL", and returns if it is. > Is it possible that entry->fPrev is null? I notice it checks entry and fnext, but not fprev. But on line 157, it pretty clearly attempts to dereference both fPrev and fPrev->fNext. Also, it dereferences entry->fNext->fPrev, which could (in theory) be null. Not familiar with the code, so maybe there's no problem with any of this, but seems like ample opportunities for segmentation fault that aren't caught by the statement at like 154. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Fri Nov 4 14:35:54 2011 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 4 Nov 2011 14:35:54 -0700 Subject: [Live-devel] Segmentation fault in DelayQueue::removeEntry(DelayQueueEntry*) In-Reply-To: References: <55D3C50F-B844-468E-8EDB-20CEF267F19B@j2kvideo.com> Message-ID: > Is it possible that entry->fPrev is null? No, I don't think so, because the delay queue is maintained as a doubly-linked list, and delay queue entries are initialized with their "fNext" and "fPrev" links both pointing to themself. The only place where "fPrev" is set to NULL is at line 159 (quoted earlier), which we do to protect against attempting to remove the same entry more than once. But at that same line, we also set "fNext" to NULL, so the test at line 154 will catch that. I'm fairly sure that Francisco's crash was a side effect of attempting to access an object that had already been deleted. That's why I asked him to make sure that he had stopped his periodic "GET_PARAMETER" request once the stream had ended. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From kevinhilman at hotmail.com Sun Nov 6 06:32:08 2011 From: kevinhilman at hotmail.com (HilmanKevin) Date: Sun, 6 Nov 2011 14:32:08 +0000 Subject: [Live-devel] about multiplexing audio and h264 video into ts file Message-ID: hi,everyone i want to revise live/testprogs/testH264VideoToTransportStream.cpp to add the function of multiplexing audio and h264 video into ts file.i add some code here like red code below.Am i right to complete my goal?any light would be appreciated.thanks a lot. int main(int argc, char** argv) { // Begin by setting up our usage environment: TaskScheduler* scheduler = BasicTaskScheduler::createNew(); env = BasicUsageEnvironment::createNew(*scheduler); // Open the input file as a 'byte-stream file source': FramedSource* inputSource = ByteStreamFileSource::createNew(*env, inputFileName); if (inputSource == NULL) { *env << "Unable to open file \"" << inputFileName << "\" as a byte-stream file source\n"; exit(1); } H264VideoStreamFramer* framer = H264VideoStreamFramer::createNew(*env, inputSource, True/*includeStartCodeInOutput*/); MPEG2TransportStreamFromESSource* tsFrames = MPEG2TransportStreamFromESSource::createNew(*env); tsFrames->addNewVideoSource(framer, 5/*mpegVersion: H.264*/); tsFrames->addNewaudioSource(audiofileinputsource,?) ; // Open the output file as a 'file sink': MediaSink* outputSink = FileSink::createNew(*env, outputFileName); if (outputSink == NULL) { *env << "Unable to open file \"" << outputFileName << "\" as a file sink\n"; exit(1); } // Finally, start playing: *env << "Beginning to read...\n"; outputSink->startPlaying(*tsFrames, afterPlaying, NULL); env->taskScheduler().doEventLoop(); // does not return return 0; // only to prevent compiler warning } -------------- next part -------------- An HTML attachment was scrubbed... URL: From Marlon at scansoft.co.za Mon Nov 7 02:15:25 2011 From: Marlon at scansoft.co.za (Marlon Reid) Date: Mon, 7 Nov 2011 12:15:25 +0200 Subject: [Live-devel] Heap corruption in AudioInputDevice::createNew() Message-ID: <002962EA5927BE45B2FFAB0B5B5D67970A899E@SSTSVR1.sst.local> Hi everyone, I have successfully used live555 in the past for various test programs. I am now trying to incorporate live555 into another application. Live555 forms part of a plugin, which is loaded by our client application. 
The problem is that when loading this plugin that uses live555, I get a heap corruption. I spent quite sometime on this and I discovered that if I remove my call to "AudioInputDevice::createNew()" then my application does not suffer a heap corruption when loading the plugin the uses live555. With this line, a heap corruption while loading the plugin. I am using Windows 7 64bit, Visual Studio 2008 and the newest version of live555 (1st November). Any suggestions will be most appreciated. The call that trashes my heap is: FramedSource *micInput = (WindowsAudioInputDevice*)AudioInputDevice::createNew( envir(), 0, 16 /*BIT_RATE*/, 2 /*NUM_CHANNELS*/, 22050/*SAMPLE_RATE*/, 2/*GRANULARITY*/); Please note: My application never gets to this call. Simply having it in my code breaks the heap at startup Regards. -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsaintmartin at mediabroadcast-t.com Mon Nov 7 03:18:09 2011 From: bsaintmartin at mediabroadcast-t.com (Boris Saint-Martin) Date: Mon, 7 Nov 2011 12:18:09 +0100 Subject: [Live-devel] Build openRTSP Message-ID: <999059AD90FC4CF4B15FFE4CB1DC771A@Boris> Hello, I would like to record IP camera streams by using openRTSP. My problem is to compile the source with Visual Studio 2008. I have built the makefile but I can't load it in VS. It trying to convert the project but it failed... Is someone can help me to build it or simply give me the windows executable or a correct VS 2008 project? Thank you so much. Boris -------------- next part -------------- An HTML attachment was scrubbed... URL: From belloni at imavis.com Mon Nov 7 09:41:38 2011 From: belloni at imavis.com (Cristiano Belloni) Date: Mon, 07 Nov 2011 18:41:38 +0100 Subject: [Live-devel] Inhibit UDP/RTP streaming Message-ID: <4EB81852.8040604@imavis.com> Hi Ross, I got an RTSP server based on liveMedia, and a problem with some (third party, black-box) buggy clients that have problems with RTP on UDP streaming, while they work fine in RTP/TCP. The problem is that they first request UDP streams to the server, then, if UDP is not available, they fall back on TCP. What I wanted to know was if our server could always force the client to connect in TCP. As far as I know that would imply altering the m= line(s ) in the SDP description, am I correct? Would it be possible to do that easily in LiveMedia? Thank you, Cristiano. -- Belloni Cristiano Imavis Srl. www.imavis.com belloni at imavis.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From ghostjackyo at gmail.com Mon Nov 7 18:12:09 2011 From: ghostjackyo at gmail.com (=?UTF-8?B?4piG5bCP6a2a5YWS4piG?=) Date: Tue, 8 Nov 2011 10:12:09 +0800 Subject: [Live-devel] Some question with openRTSP! In-Reply-To: References: Message-ID: Hello, I have some question with openRTSP. I want to play rtsp url, and use the openRTSP , then which vedio stream can i choose?( Need use VLC ? or Live555 have the vedio stream to play RTSP?) And how to use the openRTSP? Thanks for you to see my question. -- -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Mon Nov 7 18:28:06 2011 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 7 Nov 2011 18:28:06 -0800 Subject: [Live-devel] about multiplexing audio and h264 video into ts file In-Reply-To: References: Message-ID: <4843D8FC-6727-4B85-AC5A-AA9E585301FB@live555.com> > i want to revise live/testprogs/testH264VideoToTransportStream.cpp to add the function of multiplexing audio and h264 video into ts file.i add some code here like red code below.Am i right to complete my goal?any light would be appreciated.thanks a lot. [...] > MPEG2TransportStreamFromESSource* tsFrames = MPEG2TransportStreamFromESSource::createNew(*env); > > tsFrames->addNewVideoSource(framer, 5/*mpegVersion: H.264*/); > tsFrames->addNewaudioSource(audiofileinputsource,?) ; Yes, that should work, depending upon what kind of audio (i.e., what audio codec) you plan to add. What kind of audio will this be? Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Nov 7 19:39:54 2011 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 7 Nov 2011 19:39:54 -0800 Subject: [Live-devel] Heap corruption in AudioInputDevice::createNew() In-Reply-To: <002962EA5927BE45B2FFAB0B5B5D67970A899E@SSTSVR1.sst.local> References: <002962EA5927BE45B2FFAB0B5B5D67970A899E@SSTSVR1.sst.local> Message-ID: <60338F68-5121-4888-8876-ED9D6D14F927@live555.com> On Nov 7, 2011, at 2:15 AM, Marlon Reid wrote: > The call that trashes my heap is: > > FramedSource *micInput = (WindowsAudioInputDevice*)AudioInputDevice::createNew( envir(), 0, 16 /*BIT_RATE*/, 2 /*NUM_CHANNELS*/, 22050/*SAMPLE_RATE*/, 2/*GRANULARITY*/); > Please note: > > My application never gets to this call. Simply having it in my code breaks the heap at startup > > If your application "never gets to this call", then it can't be "The call that trashes my heap". I suspect that this code is just a 'red herring', and that the memory smash is caused by something else that happened earlier. Unfortunately I can't really offer any suggestions, except to first try putting the code in its own application, rather than making it a 'plugin'. That might make it easier to figure out what's going wrong (or it might not...). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From homepuh at yandex.ru Mon Nov 7 20:11:12 2011 From: homepuh at yandex.ru (homepuh) Date: Tue, 08 Nov 2011 08:11:12 +0400 Subject: [Live-devel] RTSPClient, raw udp streaming Message-ID: <492121320725472@web2.yandex.ru> Hi there. RTSPServer now support raw udp streaming only if client sent SETUP command contains one client port, but RTSPClient always add second port number, even if it works with raw udp subsession. Could you fix that, please? Regards, Ivan. From Marlon at scansoft.co.za Mon Nov 7 22:07:54 2011 From: Marlon at scansoft.co.za (Marlon Reid) Date: Tue, 8 Nov 2011 08:07:54 +0200 Subject: [Live-devel] Heap corruption in AudioInputDevice::createNew() In-Reply-To: <60338F68-5121-4888-8876-ED9D6D14F927@live555.com> References: <002962EA5927BE45B2FFAB0B5B5D67970A899E@SSTSVR1.sst.local> <60338F68-5121-4888-8876-ED9D6D14F927@live555.com> Message-ID: <002962EA5927BE45B2FFAB0B5B5D67970A89A2@SSTSVR1.sst.local> Hi, The problem was that several functions like delete and new where defined in msvcrtd.lib instead of libcmtd. Ignoring msvcrt.lib in my project solved the issue. Regards. 
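For anyone hitting the same mixed-runtime problem: the underlying issue is that the LIVE555 .lib files and the plugin were linked against different C run-time libraries, so memory allocated by "new" in one runtime's heap was released by "delete" in another. The usual fixes are to build everything with the same "Runtime Library" compiler setting (/MT, /MTd, /MD or /MDd), or, as Marlon did, to tell the linker to ignore the conflicting default library. The exact library name depends on the Debug/Release configuration; the linker options below are shown only as an example:

/NODEFAULTLIB:msvcrt.lib     (Release builds)
/NODEFAULTLIB:msvcrtd.lib    (Debug builds)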
________________________________ From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: 08 November 2011 05:40 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Heap corruption in AudioInputDevice::createNew() On Nov 7, 2011, at 2:15 AM, Marlon Reid wrote: The call that trashes my heap is: FramedSource *micInput = (WindowsAudioInputDevice*)AudioInputDevice::createNew( envir(), 0, 16 /*BIT_RATE*/, 2 /*NUM_CHANNELS*/, 22050/*SAMPLE_RATE*/, 2/*GRANULARITY*/); Please note: My application never gets to this call. Simply having it in my code breaks the heap at startup If your application "never gets to this call", then it can't be "The call that trashes my heap". I suspect that this code is just a 'red herring', and that the memory smash is caused by something else that happened earlier. Unfortunately I can't really offer any suggestions, except to first try putting the code in its own application, rather than making it a 'plugin'. That might make it easier to figure out what's going wrong (or it might not...). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Nov 7 22:16:30 2011 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 7 Nov 2011 22:16:30 -0800 Subject: [Live-devel] Inhibit UDP/RTP streaming In-Reply-To: <4EB81852.8040604@imavis.com> References: <4EB81852.8040604@imavis.com> Message-ID: > I got an RTSP server based on liveMedia, and a problem with some (third party, black-box) buggy clients that have problems with RTP on UDP streaming, while they work fine in RTP/TCP. > > The problem is that they first request UDP streams to the server, then, if UDP is not available, they fall back on TCP. > > What I wanted to know was if our server could always force the client to connect in TCP. No, because it's the client, not the server, that chooses whether to stream via RTP/UDP or RTP/TCP. If the client asks for RTP/UDP, then that is what the server delivers. (Even if you were to modify the server code so that it tried to stream via RTP/TCP even if the client requested RTP/UDP, then it's unlikely that your client would be able to deal with this, because it's not what the client requested.) If you can't find a way to tell your client to request RTP/TCP only, then I suggest contacting your 'third party' to get them to fix it. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Nov 7 23:43:02 2011 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 7 Nov 2011 23:43:02 -0800 Subject: [Live-devel] Some question with openRTSP! In-Reply-To: References: Message-ID: <60E09C4C-54A9-4CCC-BEB9-1AC0650F7739@live555.com> > I have some question with openRTSP. > I want to play rtsp url, and use the openRTSP , then which vedio stream can i choose?( Need use VLC ? or Live555 have the vedio stream to play RTSP?) > And how to use the openRTSP? The "openRTSP" application merely records incoming audio/video data (from a RTSP/RTP stream) into files; it does not decode or 'play' this data at all. If you want to play such a stream, then just use a media player - such as VLC - and give it the "rtsp://" URL. You don't need "openRTSP" at all for this. Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Nov 7 23:54:51 2011 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 7 Nov 2011 23:54:51 -0800 Subject: [Live-devel] RTSPClient, raw udp streaming In-Reply-To: <492121320725472@web2.yandex.ru> References: <492121320725472@web2.yandex.ru> Message-ID: <32484BE3-4087-4942-8169-55BC20A85C74@live555.com> > RTSPServer now support raw udp streaming only if client sent SETUP command contains one client port, but RTSPClient always add second port number, even if it works with raw udp subsession. Could you fix that, please? No, because there's nothing to 'fix'. There is no single defined standard for how to request raw UDP streaming using RTSP; the IETF standard media streaming protocol is RTP. However, clients that wish to request raw UDP streaming typically do so in one of two ways: 1/ By specifying a protocol of "RAW/RAW/UDP" (or perhaps "MP2T/H2221/UDP") in the "Transport:" RTSP header, or 2/ Specifying only a single client port in the "client_port=" part of the "Transport:" header. Our server recognizes either of these as being a request for raw UDP streaming. Note that the usual, standard requested protocol is RTP-over-UDP (or perhaps RTP-over-TCP). For RTP-over-UDP, the "Transport:" header will contain a pair of port numbers (in the "client_port=" part of the header): One port number for RTP, and a second port number for RTCP. If a client specifies two port numbers in this way, then it is implicitly requesting RTP(and RTCP)-over-UDP. Specifying two port numbers would make no sense for requesting raw UDP streaming. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From kevinhilman at hotmail.com Tue Nov 8 01:02:11 2011 From: kevinhilman at hotmail.com (HilmanKevin) Date: Tue, 8 Nov 2011 09:02:11 +0000 Subject: [Live-devel] about h264 trick play in live555 Message-ID: hi,everyone i want to add trick play func for h264.And i know i should add some code on H264VideoFileServerMediaSubsession.cpp?reload func OnDemandServerMediaSubsession::seekStreamSource. at beginning,i think i could compute seek position of h264 according to seekNPT parameter.of course, i use file lenghth to get a rough estimate. then, from the rough estimate ,i search i frame in h264 file at last ,return the nearest I frame position.(i have a way to get I frame that is "if slice_type ==2 or 7 ,it is i frame") however,i have some troubles on the ideas: 1,except for seeking file position,which enviroment vars should be cleared for new stream on H264VideoFileServerMediaSubsession::seekStreamSource func? if no enviroment vars be cleared,could testOnDemandRTSPServer play new stream sucessfully? 2,how to compute fFileDuration for h264,and how to estimate seek position roughly? thanks for any comment and reply. if someone have h264 trick play code for live555,could show me a little? regards -------------- next part -------------- An HTML attachment was scrubbed... URL: From tinitus.rockt at gmx.de Tue Nov 8 05:54:59 2011 From: tinitus.rockt at gmx.de (Simon Helwig) Date: Tue, 08 Nov 2011 14:54:59 +0100 Subject: [Live-devel] Compatibility with Visual Studio 10.0 Message-ID: <20111108135459.257830@gmx.net> Hallo, could somebody tell me, if the whole libraries of LIVE555 Streaming Media are compatible to Microsoft Visual Studio 10.0? 
Because I've many problems, when I compile the libraries and it seems very strange. So if anybody knows an answer to this problem, please tell me and perhaps you have an instruction for me, how to build these libraries step by step in Visual Studio 10.0? I would be very happy about an answer... Best greetings Simon -- NEU: FreePhone - 0ct/min Handyspartarif mit Geld-zur?ck-Garantie! Jetzt informieren: http://www.gmx.net/de/go/freephone From norris.j at gmail.com Tue Nov 8 06:44:59 2011 From: norris.j at gmail.com (James Norris) Date: Tue, 8 Nov 2011 14:44:59 +0000 Subject: [Live-devel] H264DiscreteFramer from custom source Message-ID: Hey all, Quick question I wonder if you could help with. I would like to send H.264 frames from a custom source over RTSP, most of my code is working, just have a query with regards to the framer behaviour. >From what I gather the H264DiscreteFramer expects individual NALs (that might not individually translate to a complete frame, e.g. non-VCL NALs like SPS). Inside the DeviceSource template code I have implemented, libx264 generates a number of NALs for each frame (i.e. it contains the odd SPS/PPS), and I would like to feed these into the framer for sending. Pretty standard. Should I schedule a deliverFrame (and FramedSource::afterGetting( ..) for each NAL, i.e. making a queue of NALs and repeatedly scheduling deliverFrame in order to send a single frame. Or should I package the NALs into a single buffer somehow, and call FramedSource::afterGetting ( .. ) only once. Or something else. Much appreciated, thanks, James From norris.j at gmail.com Tue Nov 8 07:41:23 2011 From: norris.j at gmail.com (James Norris) Date: Tue, 8 Nov 2011 15:41:23 +0000 Subject: [Live-devel] Compatibility with Visual Studio 10.0 In-Reply-To: <20111108135459.257830@gmx.net> References: <20111108135459.257830@gmx.net> Message-ID: What are the problems? I'm sure I did manage it when using VS10, but moved back to 9 now (its better) James On Tue, Nov 8, 2011 at 1:54 PM, Simon Helwig wrote: > Hallo, > > could somebody tell me, if the whole libraries of LIVE555 Streaming Media are compatible to Microsoft Visual Studio 10.0? Because I've many problems, when I compile the libraries and it seems very strange. > So if anybody knows an answer to this problem, please tell me and perhaps you have an instruction for me, how to build these libraries step by step in Visual Studio 10.0? > I would be very happy about an answer... > > Best greetings > > Simon > -- > NEU: FreePhone - 0ct/min Handyspartarif mit Geld-zur?ck-Garantie! > Jetzt informieren: http://www.gmx.net/de/go/freephone > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > From finlayson at live555.com Tue Nov 8 08:37:04 2011 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 8 Nov 2011 08:37:04 -0800 Subject: [Live-devel] H264DiscreteFramer from custom source In-Reply-To: References: Message-ID: > Should I schedule a deliverFrame (and FramedSource::afterGetting( ..) > for each NAL, i.e. making a queue of NALs and repeatedly scheduling > deliverFrame in order to send a single frame. Yes, a "H264VideoStreamDiscreteFramer" expects to be fed one NAL unit at a time - *not* one frame at a time. In fact, this is especially important for SPS and PPS NAL units, because the "H264VideoStreamDiscreteFramer" code recognizes and saves a copy of those NAL units (for use in the stream's SDP 'config' string). 
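A sketch of what 'one NAL unit at a time' can look like for a push-style source wrapping libx264 output. All class and member names below are hypothetical, loosely following the "DeviceSource" template; it assumes the encode loop runs on the event-loop thread (if the encoder runs on another thread, hand the data over with "TaskScheduler::triggerEvent()" instead, as the template shows):

#include <queue>
#include <vector>
#include <cstring>
#include "FramedSource.hh"

struct EncodedNAL {                     // one NAL unit as handed back by the encoder
  std::vector<unsigned char> bytes;     // payload, possibly with an Annex-B start code
  struct timeval pts;                   // presentation time of this NAL's access unit
};

class X264NALSource: public FramedSource {
public:
  static X264NALSource* createNew(UsageEnvironment& env) { return new X264NALSource(env); }

  // Call this once per NAL unit that x264 produces (i.e. several times per encoded picture):
  void queueNAL(EncodedNAL const& nal) { fNALs.push(nal); deliverFrame(); }

protected:
  X264NALSource(UsageEnvironment& env): FramedSource(env) {}
  virtual void doGetNextFrame() { deliverFrame(); }

private:
  void deliverFrame() {
    if (!isCurrentlyAwaitingData() || fNALs.empty()) return;
    EncodedNAL const& nal = fNALs.front();
    unsigned char const* data = &nal.bytes[0];
    unsigned size = (unsigned)nal.bytes.size();
    // The discrete framer expects a bare NAL unit, so strip an Annex-B start code if present:
    if (size >= 4 && data[0] == 0 && data[1] == 0 && data[2] == 0 && data[3] == 1) { data += 4; size -= 4; }
    else if (size >= 3 && data[0] == 0 && data[1] == 0 && data[2] == 1)            { data += 3; size -= 3; }
    if (size > fMaxSize) { fNumTruncatedBytes = size - fMaxSize; fFrameSize = fMaxSize; }
    else                 { fNumTruncatedBytes = 0;               fFrameSize = size;    }
    memcpy(fTo, data, fFrameSize);
    fPresentationTime = nal.pts;
    fNALs.pop();
    FramedSource::afterGetting(this);   // exactly one call per delivered NAL unit
  }

  std::queue<EncodedNAL> fNALs;
};

The "H264VideoStreamDiscreteFramer" then sits between a source like this and the "H264VideoRTPSink", and is fed NAL by NAL rather than frame by frame, as confirmed above.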
If you can, try to make the SPS and PPS NAL units the first NAL units that come from your encoder, for each new stream. This is not essential (as long as SPS and PPS NAL units appear eventually), but it will make the server more efficient. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Nov 8 12:32:17 2011 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 8 Nov 2011 12:32:17 -0800 Subject: [Live-devel] New support for sending/receiving Vorbis audio and VP8 video, and streaming from WebM (.webm) files Message-ID: FYI, the latest version (2011.11.08) of the "LIVE555 Streaming Media" software implements the RTP payload formats for (sending and receiving) Vorbis audio and VP8 video. Our RTSP server implementation - including the "LIVE555 Media Server" and "testOnDemandRTSPServer" applications - supports streaming from WebM (".webm") files, which are a special kind of Matroska file that contain Vorbis audio and VP8 video tracks. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From homepuh at yandex.ru Tue Nov 8 19:20:16 2011 From: homepuh at yandex.ru (homepuh) Date: Wed, 09 Nov 2011 07:20:16 +0400 Subject: [Live-devel] RTSPClient, raw udp streaming In-Reply-To: <32484BE3-4087-4942-8169-55BC20A85C74@live555.com> References: <492121320725472@web2.yandex.ru> <32484BE3-4087-4942-8169-55BC20A85C74@live555.com> Message-ID: <544621320808817@web83.yandex.ru> An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Nov 8 20:04:36 2011 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 8 Nov 2011 20:04:36 -0800 Subject: [Live-devel] RTSPClient, raw udp streaming In-Reply-To: <544621320808817@web83.yandex.ru> References: <492121320725472@web2.yandex.ru> <32484BE3-4087-4942-8169-55BC20A85C74@live555.com> <544621320808817@web83.yandex.ru> Message-ID: <38F93270-0923-4D37-941D-987680E0891D@live555.com> > Please, see file OnDemandServerMediaSubsession.cpp > > if (clientRTCPPort.num() == 0) { > // We're streaming raw UDP (not RTP). Create a single groupsock: > NoReuse dummy; // ensures that we skip over ports that are already in use > for (serverPortNum = fInitialPortNum; ; ++serverPortNum) { > struct in_addr dummyAddr; dummyAddr.s_addr = 0; > > if the second port number was specified, you answer to client - Transport: RAW/RAW/UDP etc, but create RTPSink instead BasicUDPSink. > Server create BasicUDPSink _only_ if transport raw-udp requested and _one_ port specified. OK, I understand now what you're asking for. I find it strange, however, that a client would ask for "RAW/RAW/UDP", but also specify two port numbers in the "client_port=" part of the header. However, to support clients like this, please change line 714 of "liveMedia/RTSPServer.cpp" from clientRTCPPortNum = p2; to clientRTCPPortNum = streamingMode == RAW_UDP ? 0 : p2; This change will be included in the next version of the software. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
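For readers following the raw-UDP discussion: a SETUP request of the kind the server recognizes (per Ross's earlier description of the "Transport:" header) looks roughly like the following, where the URL, CSeq value and port number are placeholders:

SETUP rtsp://server.example.com/streamName/track1 RTSP/1.0
CSeq: 4
Transport: RAW/RAW/UDP;unicast;client_port=5000

Either the "RAW/RAW/UDP" protocol string or the single client port (rather than an RTP/RTCP port pair) is what signals a raw-UDP request.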
URL: From tinitus.rockt at gmx.de Tue Nov 8 23:00:07 2011 From: tinitus.rockt at gmx.de (Simon Helwig) Date: Wed, 09 Nov 2011 08:00:07 +0100 Subject: [Live-devel] Compatibility with Visual Studio 10.0 In-Reply-To: References: <20111108135459.257830@gmx.net> Message-ID: <20111109070007.135660@gmx.net> Ok, it would be very nice, if I could use VS10. May you could tell me, what you did, to compile one of the test programs step by step, because I'm a beginner and it's long ago, since I've worked with VS and I got so much mistakes, that I think, that I've made a fundamental error before. In the thread "Compiling a test program", which I posted last friday, I explained the steps I've made to make one of the test programs running. I would be very grateful for your help! Best greetings Simon -------- Original-Nachricht -------- > Datum: Tue, 8 Nov 2011 15:41:23 +0000 > Von: James Norris > An: "LIVE555 Streaming Media - development & use" > Betreff: Re: [Live-devel] Compatibility with Visual Studio 10.0 > What are the problems? I'm sure I did manage it when using VS10, but > moved back to 9 now (its better) > > James > > > > On Tue, Nov 8, 2011 at 1:54 PM, Simon Helwig wrote: > > Hallo, > > > > could somebody tell me, if the whole libraries of LIVE555 Streaming > Media are compatible to Microsoft Visual Studio 10.0? Because I've many > problems, when I compile the libraries and it seems very strange. > > So if anybody knows an answer to this problem, please tell me and > perhaps you have an instruction for me, how to build these libraries step by step > in Visual Studio 10.0? > > I would be very happy about an answer... > > > > Best greetings > > > > Simon > > -- > > NEU: FreePhone - 0ct/min Handyspartarif mit Geld-zur?ck-Garantie! > > Jetzt informieren: http://www.gmx.net/de/go/freephone > > _______________________________________________ > > live-devel mailing list > > live-devel at lists.live555.com > > http://lists.live555.com/mailman/listinfo/live-devel > > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -- NEU: FreePhone - 0ct/min Handyspartarif mit Geld-zur?ck-Garantie! Jetzt informieren: http://www.gmx.net/de/go/freephone From bsaintmartin at mediabroadcast-t.com Wed Nov 9 11:13:51 2011 From: bsaintmartin at mediabroadcast-t.com (Boris Saint-Martin) Date: Wed, 9 Nov 2011 20:13:51 +0100 Subject: [Live-devel] openRTSP beginner needs help Message-ID: <9C2BA9A8232241EF9E7E4CCE2700CC76@Boris> Hello, Beginner in openRTSP usage, I have many questions :D I'm trying to capture a RTSP stream from my internet modem. The URL is "rtsp://mafreebox.freebox.fr/fbxtv_pub/stream?namespace=1&service=202&flavour=hd". I can view the stream using VLC but not with openRTSP. Here is the log : Microsoft Windows [version 6.1.7601] Copyright (c) 2009 Microsoft Corporation. Tous droits r?serv?s. C:\Users\BobaL>C:\Users\BobaL\Documents\live\testProgs\openRTSP\Release\openRTSP .exe rtsp://mafreebox.freebox.fr/fbxtv_pub/stream?namespace=1&service=202&flavou r=hd Opening connection to 212.27.38.253, port 554... ...remote connection opened Sending request: OPTIONS rtsp://mafreebox.freebox.fr/fbxtv_pub/stream?namespace= 1 RTSP/1.0 CSeq: 2 User-Agent: C:\Users\BobaL\Documents\live\testProgs\openRTSP\Release\openRTSP.ex e (LIVE555 Streaming Media v2011.10.27) Received 127 new bytes of response data. 
Received a complete OPTIONS response: RTSP/1.0 200 OK Cseq: 2 Server: fbxrtspd/1.2 Freebox RTSP server Public: DESCRIBE, OPTIONS, SETUP, TEARDOWN, PLAY, PAUSE Sending request: DESCRIBE rtsp://mafreebox.freebox.fr/fbxtv_pub/stream?namespace =1 RTSP/1.0 CSeq: 3 User-Agent: C:\Users\BobaL\Documents\live\testProgs\openRTSP\Release\openRTSP.ex e (LIVE555 Streaming Media v2011.10.27) Accept: application/sdp Received 392 new bytes of response data. Received a complete DESCRIBE response: RTSP/1.0 200 OK Cseq: 3 Server: fbxrtspd/1.2 Freebox RTSP server Public: DESCRIBE, OPTIONS, SETUP, TEARDOWN, PLAY, PAUSE Content-Length: 191 Content-Type: application/sdp Content-Language: fr v=0 o=leCDN 1320865755 1320865755 IN IP4 kapoueh.proxad.net s=unknown i=unknown c=IN IP4 0.0.0.0 t=0 0 m=video 0 RTP/AVP 33 a=control:rtsp://mafreebox.freebox.fr/fbxtv_pub/stream?namespace=1 Opened URL "rtsp://mafreebox.freebox.fr/fbxtv_pub/stream?namespace=1", returning a SDP description: v=0 o=leCDN 1320865755 1320865755 IN IP4 kapoueh.proxad.net s=unknown i=unknown c=IN IP4 0.0.0.0 t=0 0 m=video 0 RTP/AVP 33 a=control:rtsp://mafreebox.freebox.fr/fbxtv_pub/stream?namespace=1 Created receiver for "video/MP2T" subsession (client ports 65366-65367) Sending request: SETUP rtsp://mafreebox.freebox.fr/fbxtv_pub/stream?namespace=1 RTSP/1.0 CSeq: 4 User-Agent: C:\Users\BobaL\Documents\live\testProgs\openRTSP\Release\openRTSP.ex e (LIVE555 Streaming Media v2011.10.27) Transport: RTP/AVP;unicast;client_port=65366-65367 Received 146 new bytes of response data. Received a complete SETUP response: RTSP/1.0 500 Internal Server Error Cseq: 4 Server: fbxrtspd/1.2 Freebox RTSP server Public: DESCRIBE, OPTIONS, SETUP, TEARDOWN, PLAY, PAUSE Failed to setup "video/MP2T" subsession: 500 Internal Server Error 'service' n'est pas reconnu en tant que commande interne ou externe, un programme ex?cutable ou un fichier de commandes. 'flavour' n'est pas reconnu en tant que commande interne ou externe, un programme ex?cutable ou un fichier de commandes. An idea about the problem ? If I can connect openRTSP to this stream does it will create a file ? I read this in the usage "Extracting a single stream (to 'stdout')" but I don't understand how create a file... Otherwise a have Eneo IP camera but I can't found the http or rtsp stream url. Does anyone know this kind of camera ? Thank you all. Boris Saint-Martin -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Nov 9 22:19:01 2011 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 10 Nov 2011 14:19:01 +0800 Subject: [Live-devel] openRTSP beginner needs help In-Reply-To: <9C2BA9A8232241EF9E7E4CCE2700CC76@Boris> References: <9C2BA9A8232241EF9E7E4CCE2700CC76@Boris> Message-ID: <875C6CAB-888A-4104-8D5E-4F6F0FF7F2BB@live555.com> > I'm trying to capture a RTSP stream from my internet modem. > The URL is "rtsp://mafreebox.freebox.fr/fbxtv_pub/stream?namespace=1&service=202&flavour=hd". > I can view the stream using VLC but not with openRTSP. [...] > Received a complete SETUP response: > RTSP/1.0 500 Internal Server Error > Cseq: 4 > Server: fbxrtspd/1.2 Freebox RTSP server > Public: DESCRIBE, OPTIONS, SETUP, TEARDOWN, PLAY, PAUSE Only your server manufacturer can explain why it returns "500 Internal Server Error". However, you noted that you were able to play the stream using VLC. This suggests that the problem might be that your server does not handle requests to stream using RTP/UDP, but can handle requests to stream using RTP/TCP. 
(Note that VLC first tries requesting RTP/UDP, but, if that fails, then tries requesting RTP/TCP.) So, I suggest adding the "-t" option to "openRTSP", to request RTP/TCP streaming. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From norris.j at gmail.com Wed Nov 9 01:42:11 2011 From: norris.j at gmail.com (James Norris) Date: Wed, 9 Nov 2011 09:42:11 +0000 Subject: [Live-devel] Compatibility with Visual Studio 10.0 In-Reply-To: <20111109070007.135660@gmx.net> References: <20111108135459.257830@gmx.net> <20111109070007.135660@gmx.net> Message-ID: Less excuses more throwing me bones. What are the problems specifically? It sounds like you're building the test programs, which I'm assuming means you've already built the libraries (in VS10)? If that's true it should link/work. James On Wed, Nov 9, 2011 at 7:00 AM, Simon Helwig wrote: > Ok, it would be very nice, if I could use VS10. May you could tell me, what you did, to compile one of the test programs step by step, because I'm a beginner and it's long ago, since I've worked with VS and I got so much mistakes, that I think, that I've made a fundamental error before. In the thread "Compiling a test program", which I posted last friday, I explained the steps I've made to make one of the test programs running. I would be very grateful for your help! > > Best greetings > > Simon > -------- Original-Nachricht -------- >> Datum: Tue, 8 Nov 2011 15:41:23 +0000 >> Von: James Norris >> An: "LIVE555 Streaming Media - development & use" >> Betreff: Re: [Live-devel] Compatibility with Visual Studio 10.0 > >> What are the problems? ?I'm sure I did manage it when using VS10, but >> moved back to 9 now (its better) >> >> James >> >> >> >> On Tue, Nov 8, 2011 at 1:54 PM, Simon Helwig wrote: >> > Hallo, >> > >> > could somebody tell me, if the whole libraries of LIVE555 Streaming >> Media are compatible to Microsoft Visual Studio 10.0? Because I've many >> problems, when I compile the libraries and it seems very strange. >> > So if anybody knows an answer to this problem, please tell me and >> perhaps you have an instruction for me, how to build these libraries step by step >> in Visual Studio 10.0? >> > I would be very happy about an answer... >> > >> > Best greetings >> > >> > Simon >> > -- >> > NEU: FreePhone - 0ct/min Handyspartarif mit Geld-zur?ck-Garantie! >> > Jetzt informieren: http://www.gmx.net/de/go/freephone >> > _______________________________________________ >> > live-devel mailing list >> > live-devel at lists.live555.com >> > http://lists.live555.com/mailman/listinfo/live-devel >> > >> >> _______________________________________________ >> live-devel mailing list >> live-devel at lists.live555.com >> http://lists.live555.com/mailman/listinfo/live-devel > > -- > NEU: FreePhone - 0ct/min Handyspartarif mit Geld-zur?ck-Garantie! > Jetzt informieren: http://www.gmx.net/de/go/freephone > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > From norris.j at gmail.com Wed Nov 9 01:44:11 2011 From: norris.j at gmail.com (James Norris) Date: Wed, 9 Nov 2011 09:44:11 +0000 Subject: [Live-devel] New support for sending/receiving Vorbis audio and VP8 video, and streaming from WebM (.webm) files In-Reply-To: References: Message-ID: This is great, I would be quite interested in using! 
Do you know if there will need to be a VP8DiscreteVideoFramer class to support this (for instance if wanting to stream a live source via libvpx). Is it on the roadmap? James On Tue, Nov 8, 2011 at 8:32 PM, Ross Finlayson wrote: > FYI, the latest version (2011.11.08) of the "LIVE555 Streaming Media" > software implements the RTP payload formats for (sending and receiving) > Vorbis audio and VP8 video. > Our RTSP server implementation - including the "LIVE555 Media Server" and > "testOnDemandRTSPServer" applications - supports streaming from WebM > (".webm") files, which are a special kind of Matroska file that contain > Vorbis audio and VP8 video tracks. > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > From norris.j at gmail.com Wed Nov 9 03:17:31 2011 From: norris.j at gmail.com (James Norris) Date: Wed, 9 Nov 2011 11:17:31 +0000 Subject: [Live-devel] H264DiscreteFramer from custom source In-Reply-To: References: Message-ID: Thanks Ross. The stream seems to be sending ok now, but I'm having some problems receiving. For as far as I can tell the stream appears to be correctly formatted, but neither VLC or OpenRTSP is able to create valid output from it. I'm pretty confident that the frames are encoded correctly, if I use a non-discrete framer with the startcodes attached, VLC can display the stream (but it's inefficient, dropping frames and the frameloader buffer keeps getting truncated here), anyway from the FAQ & to my knowledge the discrete framer is correct. I've put an avi of the stream here: http://www.cs.nott.ac.uk/~jzn/live/test.avi Which was generated using the command line: openRTSP.exe -4 -w 256 -h 64 -f 30 -d 10 rtsp://.... > test.avi I've also got a wireshark trace that was able to interpret the H264 stream (using an RTP filter & setting H264 protocol from preferences), here the NAL type order is shown as: SPS (7) PPS (8) IDR Picture (5) (6 packets) Slice (1) Slice (1) .... And in the footer of this e-mail there's a VLC log, it seems to buffer twice, but doesn't show any errors or display anything. Do you have any pointers for me? Realise this is a tricky one, but are there any 'common' or likely causes of these problems? Happy to provide whatever extra information might be helpful. 
Thanks again, James main debug: processing request item rtsp://127.0.0.1:49990/CamBlend node Playlist skip 0 main debug: resyncing on rtsp://127.0.0.1:49990/CamBlend main debug: rtsp://127.0.0.1:49990/CamBlend is at 2 main debug: starting new item main debug: creating new input thread main debug: Creating an input for 'rtsp://127.0.0.1:49990/CamBlend' main debug: thread (input) created at priority 1 (../.././src/input/input.c:220) main debug: thread started main debug: using timeshift granularity of 50 MiB main debug: using timeshift path 'C:\Users\jzn\AppData\Local\Temp' main debug: `rtsp://127.0.0.1:49990/CamBlend' gives access `rtsp' demux `' path `127.0.0.1:49990/CamBlend' main debug: creating demux: access='rtsp' demux='' path='127.0.0.1:49990/CamBlend' main debug: looking for access_demux module: 1 candidate live555 debug: RTP subsession 'video/H264' qt4 debug: IM: Setting an input main debug: selecting program id=0 live555 debug: setup start: 0.000000 stop:0.000000 live555 debug: We have a timeout of 60 seconds live555 debug: spawned timeout thread live555 debug: play start: 0.000000 stop:0.000000 main debug: using access_demux module "live555" main debug: TIMER module_need() : 27.000 ms - Total 27.000 ms / 1 intvls (Avg 27.000 ms) main debug: looking for decoder module: 34 candidates avcodec debug: libavcodec already initialized avcodec debug: trying to use direct rendering avcodec debug: ffmpeg codec (H264 - MPEG-4 AVC (part 10)) started main debug: using decoder module "avcodec" main debug: TIMER module_need() : 1.000 ms - Total 1.000 ms / 1 intvls (Avg 1.000 ms) main debug: looking for packetizer module: 21 candidates main debug: using packetizer module "packetizer_h264" main debug: TIMER module_need() : 0.000 ms - Total 0.000 ms / 1 intvls (Avg 0.000 ms) main debug: thread (decoder) created at priority 0 (../.././src/input/decoder.c:301) main debug: thread started main debug: looking for meta reader module: 2 candidates lua debug: Trying Lua scripts in \\keats\rapg$\jzn\Application Data\vlc\lua\meta\reader lua debug: Trying Lua scripts in C:\Program Files (x86)\vlc\lua\meta\reader lua debug: Trying Lua playlist script C:\Program Files (x86)\vlc\lua\meta\reader\filename.lua main debug: no meta reader module matching "any" could be loaded main debug: TIMER module_need() : 4.000 ms - Total 4.000 ms / 1 intvls (Avg 4.000 ms) main debug: `rtsp://127.0.0.1:49990/CamBlend' successfully opened main debug: Buffering 0% packetizer_h264 debug: found NAL_SPS (sps_id=0) main debug: Buffering 1% packetizer_h264 debug: found NAL_PPS (pps_id=0 sps_id=0) main debug: Buffering 4% main debug: Buffering 7% main debug: Buffering 9% main debug: Buffering 12% main debug: Buffering 15% main debug: Buffering 17% main debug: Buffering 20% main debug: Buffering 22% main debug: Buffering 25% main debug: Buffering 28% main debug: Buffering 30% main debug: Buffering 33% main debug: Buffering 36% main debug: Buffering 38% main debug: Buffering 41% main debug: Buffering 43% main debug: Buffering 46% main debug: Buffering 49% main debug: Buffering 51% main debug: Buffering 54% main debug: Buffering 56% main debug: Buffering 59% main debug: Buffering 62% main debug: Buffering 64% main debug: Buffering 67% main debug: Buffering 69% main debug: Buffering 72% main debug: Buffering 75% main debug: Buffering 77% main debug: Buffering 80% main debug: Buffering 82% main debug: Buffering 85% main debug: Buffering 88% main debug: Buffering 90% main debug: Buffering 93% main debug: Buffering 96% main debug: Buffering 98% 
main debug: Stream buffering done (1217 ms in 1218 ms) main debug: Decoder buffering done in 0 ms live555 debug: tk->rtpSource->hasBeenSynchronizedUsingRTCP() main debug: ES_OUT_RESET_PCR called main debug: Buffering 0% main debug: Buffering 2% main debug: Buffering 5% main debug: Buffering 7% main debug: Buffering 10% main debug: Buffering 13% main debug: Buffering 15% main debug: Buffering 18% main debug: Buffering 20% main debug: Buffering 23% main debug: Buffering 26% main debug: Buffering 28% main debug: Buffering 31% main debug: Buffering 33% main debug: Buffering 36% main debug: Buffering 39% main debug: Buffering 41% main debug: Buffering 44% main debug: Buffering 47% main debug: Buffering 49% main debug: Buffering 52% main debug: Buffering 54% main debug: Buffering 57% main debug: Buffering 60% main debug: Buffering 62% main debug: Buffering 65% main debug: Buffering 68% main debug: Buffering 70% main debug: Buffering 73% main debug: Buffering 75% main debug: Buffering 78% main debug: Buffering 81% main debug: Buffering 83% main debug: Buffering 86% main debug: Buffering 89% main debug: Buffering 91% main debug: Buffering 94% main debug: Buffering 97% main debug: Buffering 99% main debug: Stream buffering done (1227 ms in 1227 ms) main debug: Decoder buffering done in 0 ms On Tue, Nov 8, 2011 at 4:37 PM, Ross Finlayson wrote: > Should I schedule a deliverFrame (and FramedSource::afterGetting( ..) > for each NAL, i.e. making a queue of NALs and repeatedly scheduling > deliverFrame in order to send a single frame. > > Yes, a "H264VideoStreamDiscreteFramer" expects to be fed one NAL unit at a > time - *not* one frame at a time. > In fact, this is especially important for SPS and PPS NAL units, because > the?"H264VideoStreamDiscreteFramer" code recognizes and saves a copy of > those NAL units (for use in the stream's SDP 'config' string). > If you can, try to make the SPS and PPS NAL units the first NAL units that > come from your encoder, for each new stream. ?This is not essential (as long > as SPS and PPS NAL units appear eventually), but it will make the server > more efficient. > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > From czf86123 at gmail.com Wed Nov 9 04:41:50 2011 From: czf86123 at gmail.com (Zhaofei Chen) Date: Wed, 9 Nov 2011 13:41:50 +0100 Subject: [Live-devel] Question about buffer when streaming mp3 files and wav (raw pcm) files Message-ID: Hello, I've tried to use live media libraries with MPlayer for streaming audio files over RTSP and it is successful. However there's something that I'm not sure with. When a wav file or raw pcm audio data is streamed, I found that a buffer will be used to preload some data (few seconds, related with audio file's bitrate) before MPlayer starts playing. And if it is a mp3 file, there is no preloading. I would like to ask what this buffer could be and why it is used only for some specific formats. Anyone can give me some advice? Thanks! Zhaofei Chen -------------- next part -------------- An HTML attachment was scrubbed... URL: From amir.yung at gmail.com Thu Nov 10 02:31:36 2011 From: amir.yung at gmail.com (Amir Yungman) Date: Thu, 10 Nov 2011 12:31:36 +0200 Subject: [Live-devel] RTSP to iPhone iPad and so... 
In-Reply-To: <4eba47c1.8bcbe30a.48b2.ffffbb78@mx.google.com> References: <4eba47c1.8bcbe30a.48b2.ffffbb78@mx.google.com> Message-ID: <4ebba761.8307df0a.17c0.fffffc3f@mx.google.com> Hello, I've read that iPhone does not support the native RTP/RTSP protocol. Is it so? how to stream it a video without writing a special app on the iPhone. I've implemented RTSP server and it is working with any "normal" RTSP client like RealPlayer/VLC and so. Thanks -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Nov 10 08:28:10 2011 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 11 Nov 2011 00:28:10 +0800 Subject: [Live-devel] New support for sending/receiving Vorbis audio and VP8 video, and streaming from WebM (.webm) files In-Reply-To: References: Message-ID: <3EF11A5C-97BE-4A1F-8610-F96D4190085E@live555.com> > This is great, I would be quite interested in using! > > Do you know if there will need to be a VP8DiscreteVideoFramer class to > support this (for instance if wanting to stream a live source via > libvpx). No, you should be able to feed VP8 frames (one at a time) directly into a "VP8VideoRTPSink", with no separate 'framer' object required. (Just make sure that your VP8 frames have proper presentation times.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Nov 10 08:41:31 2011 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 11 Nov 2011 00:41:31 +0800 Subject: [Live-devel] H264DiscreteFramer from custom source In-Reply-To: References: Message-ID: > I've put an avi of the stream here: > > http://www.cs.nott.ac.uk/~jzn/live/test.avi > > Which was generated using the command line: > > openRTSP.exe -4 -w 256 -h 64 -f 30 -d 10 rtsp://.... > test.avi This is incorrect. The "-4" option generates a ".mp4"-format file, not an AVI file. In any case, I suggest not using this option (or "-i"). Instead, don't use any options at all (other than "-d "). You will get a raw H.264 video output file. Rename this as "test.h264", and check whether or not VLC will play it. If VLC can't play your file, then put it on a web server, and send us the URL, so I can download and look at it myself. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Nov 10 08:46:05 2011 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 11 Nov 2011 00:46:05 +0800 Subject: [Live-devel] Question about buffer when streaming mp3 files and wav (raw pcm) files In-Reply-To: References: Message-ID: <51E835D4-67A3-4ADA-BE64-EAA15B29DC1B@live555.com> > However there's something that I'm not sure with. When a wav file or raw pcm audio data is streamed, I found that a buffer will be used to preload some data (few seconds, related with audio file's bitrate) before MPlayer starts playing. This is a MPlayer issue; there is no such buffering within our libraries. In any case, I suggest using VLC rather than MPlayer. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
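To make the "proper presentation times" remark in the VP8 answer above concrete: a live source that feeds a "VP8VideoRTPSink" is just a "FramedSource" subclass whose delivery step stamps each encoded frame before handing it on. A minimal sketch follows; the class name and the fLatestFrame/fLatestFrameSize members (filled in by the encoder, e.g. libvpx) are assumptions, while fTo, fMaxSize, fFrameSize, fNumTruncatedBytes, fPresentationTime and FramedSource::afterGetting() are the standard "FramedSource" members also used by the "DeviceSource" example code.

#include <sys/time.h>
#include <string.h>
#include "FramedSource.hh"

class MyVP8Source: public FramedSource {          // hypothetical live VP8 source
public:
  MyVP8Source(UsageEnvironment& env)
    : FramedSource(env), fLatestFrame(NULL), fLatestFrameSize(0) {}

protected:
  virtual void doGetNextFrame() { deliverFrame(); }  // real code would instead wait for the encoder

  void deliverFrame() {
    if (!isCurrentlyAwaitingData() || fLatestFrame == NULL) return;

    // Each delivered frame must carry a presentation time; for a live
    // encoder, 'now' is the usual choice:
    gettimeofday(&fPresentationTime, NULL);

    fFrameSize = fLatestFrameSize;
    if (fFrameSize > fMaxSize) {                  // truncate if the sink's buffer is too small
      fNumTruncatedBytes = fFrameSize - fMaxSize;
      fFrameSize = fMaxSize;
    }
    memcpy(fTo, fLatestFrame, fFrameSize);

    FramedSource::afterGetting(this);             // hand the frame on to the VP8VideoRTPSink
  }

private:
  unsigned char* fLatestFrame;                    // latest encoded frame (assumed to be set elsewhere)
  unsigned fLatestFrameSize;
};

The same stamping applies whether the frames come from a file or a live encoder; without a sensible fPresentationTime the receiver cannot schedule playback correctly.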
URL: From tinitus.rockt at gmx.de Thu Nov 10 08:41:51 2011 From: tinitus.rockt at gmx.de (Simon Helwig) Date: Thu, 10 Nov 2011 17:41:51 +0100 Subject: [Live-devel] Compatibility with Visual Studio 10.0 In-Reply-To: References: <20111108135459.257830@gmx.net> <20111109070007.135660@gmx.net> Message-ID: <20111110164151.155430@gmx.net> Yes, that's correct. I've already built the libraries in VS10. First I downloaded the package (live555-latest.tar.gz 01-Nov-2011 19:01 508K) and extracted the files in a new folder on my Desktop. Then I changed the path of the 'tools' directory in the file "win32config". The new path I entered was C:\Programme\Microsoft Visual Studio 10.0\VC Afterwards I opened the commando line and went to the new folder, where I extracted the files of the package. Then I generated the .mak files with the commando "genWindowsMakefiles". In addition to that I opened a new "Makefile Project" in Visual Studio. In this project I imported all the *.mak files I generated before. There are four files: BasicUsageEnviornment.mak, groupsok.mak, liveMedia.mak,UsageEnviornment.mak and testProgs.mak. Then I built successively each of the four files (projects) first. Now I opened C++ project and took one of the test programms (testMPEG2TransportStreamer.cpp) and copied the code in it. Thereafter I tried to compile the code, but then I got a lot of mistakes. Are my steps I explained correct? Do you still want to know the exact wording of the error messages? Because I'm now sitting on the wrong computer, so I could tell you the wording of the mistakes only on Monday, because then I'm again at work and there are so many errors, that I'm not able to tell you one of them now. Sorry for that, but maybe you can already narrow the problem? Best greetings from Germany Simon -------- Original-Nachricht -------- > Datum: Wed, 9 Nov 2011 09:42:11 +0000 > Von: James Norris > An: "LIVE555 Streaming Media - development & use" > Betreff: Re: [Live-devel] Compatibility with Visual Studio 10.0 > Less excuses more throwing me bones. What are the problems > specifically? It sounds like you're building the test programs, which > I'm assuming means you've already built the libraries (in VS10)? If > that's true it should link/work. > > James > > > > On Wed, Nov 9, 2011 at 7:00 AM, Simon Helwig wrote: > > Ok, it would be very nice, if I could use VS10. May you could tell me, > what you did, to compile one of the test programs step by step, because I'm > a beginner and it's long ago, since I've worked with VS and I got so much > mistakes, that I think, that I've made a fundamental error before. In the > thread "Compiling a test program", which I posted last friday, I explained > the steps I've made to make one of the test programs running. I would be very > grateful for your help! > > > > Best greetings > > > > Simon > > -------- Original-Nachricht -------- > >> Datum: Tue, 8 Nov 2011 15:41:23 +0000 > >> Von: James Norris > >> An: "LIVE555 Streaming Media - development & use" > > >> Betreff: Re: [Live-devel] Compatibility with Visual Studio 10.0 > > > >> What are the problems? ?I'm sure I did manage it when using VS10, but > >> moved back to 9 now (its better) > >> > >> James > >> > >> > >> > >> On Tue, Nov 8, 2011 at 1:54 PM, Simon Helwig > wrote: > >> > Hallo, > >> > > >> > could somebody tell me, if the whole libraries of LIVE555 Streaming > >> Media are compatible to Microsoft Visual Studio 10.0? Because I've many > >> problems, when I compile the libraries and it seems very strange. 
> >> > So if anybody knows an answer to this problem, please tell me and > >> perhaps you have an instruction for me, how to build these libraries > step by step > >> in Visual Studio 10.0? > >> > I would be very happy about an answer... > >> > > >> > Best greetings > >> > > >> > Simon > >> > -- > >> > NEU: FreePhone - 0ct/min Handyspartarif mit Geld-zur?ck-Garantie! > >> > Jetzt informieren: http://www.gmx.net/de/go/freephone > >> > _______________________________________________ > >> > live-devel mailing list > >> > live-devel at lists.live555.com > >> > http://lists.live555.com/mailman/listinfo/live-devel > >> > > >> > >> _______________________________________________ > >> live-devel mailing list > >> live-devel at lists.live555.com > >> http://lists.live555.com/mailman/listinfo/live-devel > > > > -- > > NEU: FreePhone - 0ct/min Handyspartarif mit Geld-zur?ck-Garantie! > > Jetzt informieren: http://www.gmx.net/de/go/freephone > > _______________________________________________ > > live-devel mailing list > > live-devel at lists.live555.com > > http://lists.live555.com/mailman/listinfo/live-devel > > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -- Empfehlen Sie GMX DSL Ihren Freunden und Bekannten und wir belohnen Sie mit bis zu 50,- Euro! https://freundschaftswerbung.gmx.de From finlayson at live555.com Thu Nov 10 12:26:13 2011 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 11 Nov 2011 04:26:13 +0800 Subject: [Live-devel] RTSP to iPhone iPad and so... In-Reply-To: <4ebba761.8307df0a.17c0.fffffc3f@mx.google.com> References: <4eba47c1.8bcbe30a.48b2.ffffbb78@mx.google.com> <4ebba761.8307df0a.17c0.fffffc3f@mx.google.com> Message-ID: <85CC2142-DCDB-482D-8DDA-E4524CD87259@live555.com> > I've read that iPhone does not support the native RTP/RTSP protocol. > Is it so? Yes, that's true. > how to stream it a video without writing a special app on the iPhone. Our 'RTSP server' implementation supports - in addition to RTSP - the "HTTP Live Streaming" protocol that Apple uses to stream to Safari on iPhones and iPads. I.e., it uses the HTTP protocol, not RTSP - even though it's part of the "RTSPServer". For details, see http://www.live555.com/mediaServer/#http-live-streaming Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From live555 at timshackleton.com Thu Nov 10 12:28:11 2011 From: live555 at timshackleton.com (Tim J Shackleton) Date: Fri, 11 Nov 2011 09:28:11 +1300 Subject: [Live-devel] Opportunity for unpredictable behaviour in MPEG2TransportStreamFramer In-Reply-To: <4E9C8CB6.8040906@timshackleton.com> References: <4E9B5B29.9040002@timshackleton.com> <2C8117BE-A9E2-4276-AE81-4127A131B108@live555.com> <4E9C8CB6.8040906@timshackleton.com> Message-ID: <4EBC33DB.6010004@timshackleton.com> On 18/10/11 09:14, Tim J Shackleton wrote: > On 17/10/11 17:54, Ross Finlayson wrote: >> Yes, but are you sure that a wrap-around is, in fact happening (and >> is the cause of your problem)? A rough calculation shows that - at 10 >> Mbps - the "fTSPacketCount" variable (which counts 188-byte Transport >> Stream 'packets') will wrap around in around 7 days. But anyway, if >> this is a plausible occurrence, then it's probably something we >> should allow for. > > Fantastic. I am going to run side by side binaries, one with the > existing unsigned, and one with the u_int64_t. 
I have been playing > out content with a mux rate of 8,000kbit/sec and I was seeing the > problem occur after about a week and a half and by my math that seems > to agree with a calculated value of 9.3 days. Hi Ross, My test was thwarted by an unexpected shortage of electrons about halfway through, but has now completed. I have run both a task based on the current live555 source, and a task with a modified source tree using u_int64_t side by side, playing out a loop of MPEG2 content that never closes or ends via a DeviceSource. The task that isn't using 64bit integers has, as expected, lost it's mind and is spending a lot of time blocking and using every scrap of CPU available as it tries to play out the MPEG at a ridiculously high rate. The modified version is still working as per normal. The only modifications I have made are as follows: In MPEG2TransportStreamFramer.cpp, class 'PIDStatus' I have changed lastPacketNum to u_int64_t. In include/MPEG2TransportStreamFramer.hh I have changed tsPacketCount, fTSPacketCount and fTSPCRCount types to u_int64_t. Regards, Tim Shackleton From finlayson at live555.com Thu Nov 10 14:11:31 2011 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 11 Nov 2011 06:11:31 +0800 Subject: [Live-devel] Opportunity for unpredictable behaviour in MPEG2TransportStreamFramer In-Reply-To: <4EBC33DB.6010004@timshackleton.com> References: <4E9B5B29.9040002@timshackleton.com> <2C8117BE-A9E2-4276-AE81-4127A131B108@live555.com> <4E9C8CB6.8040906@timshackleton.com> <4EBC33DB.6010004@timshackleton.com> Message-ID: Thanks for the testing. These changes will appear in the next release of the software. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From yuri.timenkov at itv.ru Thu Nov 10 20:04:23 2011 From: yuri.timenkov at itv.ru (Yuri Timenkov) Date: Fri, 11 Nov 2011 08:04:23 +0400 Subject: [Live-devel] Basic HTTP authentication in liveMedia Message-ID: <4EBC9EC7.6090808@itv.ru> Dear Ross, In our server app we already have user database with encrypted (one-way) passwords. So only chance to support authentication for current users is use basic (not digest) authentication scheme. Is it possible to implement it in current liveMedia releases without modifying existing code? Regards, Yuri From finlayson at live555.com Fri Nov 11 00:21:08 2011 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 11 Nov 2011 16:21:08 +0800 Subject: [Live-devel] Basic HTTP authentication in liveMedia In-Reply-To: <4EBC9EC7.6090808@itv.ru> References: <4EBC9EC7.6090808@itv.ru> Message-ID: > In our server app we already have user database with encrypted (one-way) passwords. So only chance to support authentication for current users is use basic (not digest) authentication scheme. > > Is it possible to implement it in current liveMedia releases without modifying existing code? Yes, our RTSP client implementation should already be able to handle servers that do basic authentication. (You can verify this by running "openRTSP" on one of your server's streams.) Our RTSP server implementation, however, supports digest authentication only (or else no authentication at all). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
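A quick sanity check on the wrap-around figures quoted in the "MPEG2TransportStreamFramer" thread above ("around 7 days" at 10 Mbps, "9.3 days" at 8,000 kbit/s): with a 32-bit packet counter and 188-byte Transport Stream packets the arithmetic works out as below. This is a standalone back-of-the-envelope check, not part of the library.

#include <cstdio>

int main() {
  double packets  = 4294967296.0;       // 2^32 packets before a 32-bit counter wraps
  double bitsEach = 188.0 * 8;          // one 188-byte TS packet
  double muxRate  = 8000.0 * 1000;      // 8,000 kbit/s, as in the test described above
  double seconds  = packets * bitsEach / muxRate;
  printf("counter wraps after %.1f days\n", seconds / 86400.0);  // prints about 9.3
  return 0;
}

Re-running the same calculation with a 10 Mbit/s mux rate gives roughly 7.5 days, consistent with the "around 7 days" estimate quoted earlier, which is why widening the counters to u_int64_t removes the problem for any realistic streaming duration.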
URL: From giac.dinh at l-3com.com Fri Nov 11 08:36:11 2011 From: giac.dinh at l-3com.com (giac.dinh at l-3com.com) Date: Fri, 11 Nov 2011 11:36:11 -0500 Subject: [Live-devel] old source code Message-ID: <910E82F60F64E34D8388CAFCE988965B027D5C2F1A@mail> Does anyone have old source code release on 2007-10-31? Giac -------------- next part -------------- An HTML attachment was scrubbed... URL: From kevinhilman at hotmail.com Thu Nov 10 19:49:35 2011 From: kevinhilman at hotmail.com (HilmanKevin) Date: Fri, 11 Nov 2011 03:49:35 +0000 Subject: [Live-devel] about DeviceSource::eventTriggerId Message-ID: hi,kind expert i want to use DeviceSource.i read code of DeviceSource,DeviceSource is incomplete.i have a trouble to use it. here in DeviceSource.cpp void DeviceSource::signalNewFrameData() {//i define signalNewFrameData as a public member function of DeviceSource class in my test TaskScheduler* ourScheduler = NULL; //%%% TO BE WRITTEN %%% DeviceSource* ourDevice = NULL; //%%% TO BE WRITTEN %%% if (ourScheduler != NULL) { // sanity check ourScheduler->triggerEvent(DeviceSource::eventTriggerId, ourDevice); } } when i call myDeviceSource->signalNewFrameData from seperate thread,what happened in DeviceSource after ourScheduler->triggerEvent(DeviceSource::eventTriggerId, ourDevice); ? that is ,which function in DeviceSource would be excute after ourScheduler->triggerEvent(DeviceSource::eventTriggerId, ourDevice);? any light or any example are much appreciated. thanks a lot. -------------- next part -------------- An HTML attachment was scrubbed... URL: From norris.j at gmail.com Fri Nov 11 00:02:27 2011 From: norris.j at gmail.com (James Norris) Date: Fri, 11 Nov 2011 08:02:27 +0000 Subject: [Live-devel] Compatibility with Visual Studio 10.0 In-Reply-To: <20111110164151.155430@gmx.net> References: <20111108135459.257830@gmx.net> <20111109070007.135660@gmx.net> <20111110164151.155430@gmx.net> Message-ID: Hey, On Thu, Nov 10, 2011 at 4:41 PM, Simon Helwig wrote: > Yes, that's correct. I've already built the libraries in VS10. > First I downloaded the package (live555-latest.tar.gz ? 01-Nov-2011 19:01 ?508K) and extracted the files in a new folder on my Desktop. Then I changed the path of the 'tools' directory in the file "win32config". The new path I entered was C:\Programme\Microsoft Visual Studio 10.0\VC > Afterwards I opened the commando line and went to the new folder, where I extracted the files of the package. Then I generated the .mak files with the commando "genWindowsMakefiles". > In addition to that I opened a new "Makefile Project" in Visual Studio. In this project I imported all the *.mak files I generated before. There are four files: BasicUsageEnviornment.mak, groupsok.mak, liveMedia.mak,UsageEnviornment.mak and testProgs.mak. You've listed 5 (not 4) make files here, and if you build the testProgs.mak as you have done with the other 4, you should get a testMPEG2...exe. Do that and let us know what the *specific* errors are if any appear. > Then I built successively each of the four files (projects) first. > Now I opened C++ project and took one of the test programms (testMPEG2TransportStreamer.cpp) and copied the code in it. > Thereafter I tried to compile the code, but then I got a lot of mistakes. No need for all this, testProgs.mak builds it > > Are my steps I explained correct? > Do you still want to know the exact wording of the error messages? 
?Because I'm now sitting on the wrong computer, so I could tell you the wording of the mistakes only on Monday, because then I'm again at work and there are so many errors, that I'm not able to tell you one of them now. Sorry for that, but maybe you can already narrow the problem? > > Best greetings from Germany > > Simon > > -------- Original-Nachricht -------- >> Datum: Wed, 9 Nov 2011 09:42:11 +0000 >> Von: James Norris >> An: "LIVE555 Streaming Media - development & use" >> Betreff: Re: [Live-devel] Compatibility with Visual Studio 10.0 > >> Less excuses more throwing me bones. ?What are the problems >> specifically? ?It sounds like you're building the test programs, which >> I'm assuming means you've already built the libraries (in VS10)? ?If >> that's true it should link/work. >> >> James >> >> >> >> On Wed, Nov 9, 2011 at 7:00 AM, Simon Helwig wrote: >> > Ok, it would be very nice, if I could use VS10. May you could tell me, >> what you did, to compile one of the test programs step by step, because I'm >> a beginner and it's long ago, since I've worked with VS and I got so much >> mistakes, that I think, that I've made a fundamental error before. In the >> thread "Compiling a test program", which I posted last friday, I explained >> the steps I've made to make one of the test programs running. I would be very >> grateful for your help! >> > >> > Best greetings >> > >> > Simon >> > -------- Original-Nachricht -------- >> >> Datum: Tue, 8 Nov 2011 15:41:23 +0000 >> >> Von: James Norris >> >> An: "LIVE555 Streaming Media - development & use" >> >> >> Betreff: Re: [Live-devel] Compatibility with Visual Studio 10.0 >> > >> >> What are the problems? ?I'm sure I did manage it when using VS10, but >> >> moved back to 9 now (its better) >> >> >> >> James >> >> >> >> >> >> >> >> On Tue, Nov 8, 2011 at 1:54 PM, Simon Helwig >> wrote: >> >> > Hallo, >> >> > >> >> > could somebody tell me, if the whole libraries of LIVE555 Streaming >> >> Media are compatible to Microsoft Visual Studio 10.0? Because I've many >> >> problems, when I compile the libraries and it seems very strange. >> >> > So if anybody knows an answer to this problem, please tell me and >> >> perhaps you have an instruction for me, how to build these libraries >> step by step >> >> in Visual Studio 10.0? >> >> > I would be very happy about an answer... >> >> > >> >> > Best greetings >> >> > >> >> > Simon >> >> > -- >> >> > NEU: FreePhone - 0ct/min Handyspartarif mit Geld-zur?ck-Garantie! >> >> > Jetzt informieren: http://www.gmx.net/de/go/freephone >> >> > _______________________________________________ >> >> > live-devel mailing list >> >> > live-devel at lists.live555.com >> >> > http://lists.live555.com/mailman/listinfo/live-devel >> >> > >> >> >> >> _______________________________________________ >> >> live-devel mailing list >> >> live-devel at lists.live555.com >> >> http://lists.live555.com/mailman/listinfo/live-devel >> > >> > -- >> > NEU: FreePhone - 0ct/min Handyspartarif mit Geld-zur?ck-Garantie! >> > Jetzt informieren: http://www.gmx.net/de/go/freephone >> > _______________________________________________ >> > live-devel mailing list >> > live-devel at lists.live555.com >> > http://lists.live555.com/mailman/listinfo/live-devel >> > >> >> _______________________________________________ >> live-devel mailing list >> live-devel at lists.live555.com >> http://lists.live555.com/mailman/listinfo/live-devel > > -- > Empfehlen Sie GMX DSL Ihren Freunden und Bekannten und wir > belohnen Sie mit bis zu 50,- Euro! 
https://freundschaftswerbung.gmx.de > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > From finlayson at live555.com Sat Nov 12 07:03:05 2011 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 12 Nov 2011 23:03:05 +0800 Subject: [Live-devel] about DeviceSource::eventTriggerId In-Reply-To: References: Message-ID: <3FB2D320-D2D0-4D00-BFDF-E99B2DB181F6@live555.com> > when i call myDeviceSource->signalNewFrameData from seperate thread,what happened in DeviceSource after ourScheduler->triggerEvent(DeviceSource::eventTriggerId, ourDevice); ? that is ,which function in DeviceSource would be excute after ourScheduler->triggerEvent(DeviceSource::eventTriggerId, ourDevice);? The function that you gave - as a parameter - to the call to "createEventTrigger()" that returned "DeviceSource::eventTriggerId". In the "DeviceSource.cpp" code, this is the function "deliverFrame0()". Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From isambhav at gmail.com Mon Nov 14 04:06:49 2011 From: isambhav at gmail.com (Sambhav) Date: Mon, 14 Nov 2011 17:36:49 +0530 Subject: [Live-devel] OnDemand Server application Callbacks Message-ID: Hi, I have an application with live555 server (OnDemand) integrated. How can this application know when clients sends PLAY and TEARDOWN requests ? Is there any application level callbacks for the same ? Regards, Sambhav -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Nov 14 04:41:18 2011 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 14 Nov 2011 20:41:18 +0800 Subject: [Live-devel] OnDemand Server application Callbacks In-Reply-To: References: Message-ID: <2D1A7786-7D43-4AFF-9FCA-4B2847276C51@live555.com> > I have an application with live555 server (OnDemand) integrated. > How can this application know when clients sends PLAY and TEARDOWN requests ? > Is there any application level callbacks for the same ? Yes, the way to do this is to subclass "RTSPServer", and - in your subclass - redefine the virtual functions "handleCmd_PLAY()" and "handleCmd_TEARDOWN()". Your subclassed versions of these functions can do whatever they like, and then call the original, base-class version (defined for "RTSPServer"). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From emile.semmes at e6group.com Mon Nov 14 18:02:38 2011 From: emile.semmes at e6group.com (Emile Semmes) Date: Mon, 14 Nov 2011 18:02:38 -0800 Subject: [Live-devel] Stopping an RTP stream safely/Deleting objects Message-ID: <4EC1C83E.2090403@e6group.com> Hi all, I'm streaming an MPEG-2 TS file using code very similar to testMPEG2TransportStreamer.cpp. In one of my use cases, I need to stop the stream immediately instead of allowing it to complete. I'm doing this currently by passing a watch variable to the taskScheduler().doEventLoop() call and allowing another thread to set that variable when I need to stop playback. My question is what do I need to do after that to safely stop and delete my related objects (UsageEnvironment, RTPSink, etc) if I leave the doEventLoop() early? 
I'm assuming calling Medium::close() on the object created by MPEG2TransportStreamFramer::createNew() is correct, but I'm not sure what I should call to destroy the object created by SimpleRTPSink::createNew(). The destructor is protected so delete is forbidden. Also, should I make a call to RTPSink::stopPlaying() and if so, when? I'd like to reuse the RTPsink but give it different parameters on a subsequent use. Thanks, Emile -- -- Emile Semmes Software Engineer e6 Group, LLC www.e6group.com From finlayson at live555.com Mon Nov 14 19:33:53 2011 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 15 Nov 2011 11:33:53 +0800 Subject: [Live-devel] Stopping an RTP stream safely/Deleting objects In-Reply-To: <4EC1C83E.2090403@e6group.com> References: <4EC1C83E.2090403@e6group.com> Message-ID: > I'm streaming an MPEG-2 TS file using code very similar to testMPEG2TransportStreamer.cpp. In one of my use cases, I need to stop the stream immediately instead of allowing it to complete. I'm doing this currently by passing a watch variable to the taskScheduler().doEventLoop() call and allowing another thread to set that variable when I need to stop playback. FYI (and this is unrelated to your question below), you might find it simpler to use the new 'event trigger' mechanism instead; see "UsageEnvironment/include/UsageEnvironment.hh" > My question is what do I need to do after that to safely stop and delete my related objects (UsageEnvironment, RTPSink, etc) if I leave the doEventLoop() early? First, you can always just call "exit(0)" to leave and destroy the process (i.e., address space, i.e., application). The OS will take care of closing/reclaiming its sockets. This is by far the simplest thing to do, and it's what you should do - unless you have a very good reason for wanting to keep the process around. But, if you really want to keep the process around, you can reclaim the LIVE555 objects by doing the following (generally speaking, you're reclaiming objects in the reverse order from the order that you created them): videoSink->stopPlaying(); Medium::close(videoSource); Medium::close(videoSink); delete rtpGroupsock; delete rtcpGroupsock; env->reclaim(); delete scheduler; (I'm basing this on the "testMPEG2TransportStreamer" code, because you said you used that code as a model.) > I'd like to reuse the RTPsink but give it different parameters on a subsequent use. No, you can't do that. Just delete it (using "Medium::close()"), and create a new one next time. But again, why not just exit the process (application), and launch a new one next time? Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From kvishnath at ymail.com Tue Nov 15 01:12:33 2011 From: kvishnath at ymail.com (Viswanatha Reddy) Date: Tue, 15 Nov 2011 14:42:33 +0530 (IST) Subject: [Live-devel] RTSP Server Crash with Milestone Client Message-ID: <1321348353.68250.YahooMailNeo@web95012.mail.in2.yahoo.com> Dear Ross, Thanks for your reply. I collected the debug information while running with Milestone client and VLC client. The snapshot of the crash pointing the crash inside the Server is also given. Please find the attachment, which contains all the debug info. Thank you, K.Viswanatha Reddy, Bangalore, India ? Wed, 02 Nov 2011 12:48:43 -0700 > I have developed an (mpeg4)RTSP streamer(server) using live555 libs. It's > unicast based on OnDemandServer example module. > Thanks for your priceless contribution to the media community. 
> > The application takes rgb frames from a buffer, encodes it, and passes it to > the derived FramedSource module. > I ran without any issues with vlc and mplayer client for weeks together. > > The application was now tested with Milestone Client(used Onvif standard to > make the Streamer to be seen as a virtual camera by this client). > But it's crashing frequently inside the eventloop. > I thoroughly checked whether it's crashing in my development. It's not. > Surely it's crashing at the step of eventloop. > This never happens with vlc/mplayer !!. Unfortunately, because the problem seems to happen only with your custom server, we can't reproduce it ourselves, so you're going to have to identify precisely where - in our code - your server is crashing, and why. (BTW, almost everything within a LIVE555 application happens "inside the event loop", so that's probably not significant.) For starters, I suggest turning on debugging printing in the "RTSPServer" code by adding #define DEBUG 1 near the top of "liveMedia/RTSPServer.cpp", and recompiling. This should help tell you what's going wrong. It's conceivable that your new client has somehow (perhaps by using slightly different RTSP command syntax) uncovered an error in our server code - in which case I'd be very interested in discovering this. -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: ross.rar Type: application/octet-stream Size: 78919 bytes Desc: not available URL: From raffin at ermes-cctv.com Tue Nov 15 04:02:56 2011 From: raffin at ermes-cctv.com (Mario Raffin) Date: Tue, 15 Nov 2011 13:02:56 +0100 Subject: [Live-devel] VLC needs to send the PLAY command twice Message-ID: <4EC254F0.8020700@ermes-cctv.com> Dear developer(s), thank you for this great code. I am developing an IP camera which streams H.264 video via rtp in UNICAST. I use VLC to receive the stream. At the beginning I used the multicast SSM and all was fine. Then I implemented the unicast as described in the FAQ starting from the testOnDemandRTSPServer.cpp. Here I noticed that when using VLC under linux the stream was quite slow to start while with the VLC windows version it did not start at all. So looking at the session init, I noticed that the session never start after the first PLAY but VLC needs to close the first session and call PLAY again. When I start playing all seems OK, I see the the OPTIONS,DESCRIBE, SETUP and PLAY from the VLC client. After the PLAY the source always replies with the OK. Then, immediatly after this *first *PLAY the VLC client sends a GET_PARAMETER request, the server answers but the streams does not starts and after about 10s the client sends a TEARDOWN and closes the session (in the middle I see a couple of RTCP packets). After the *second *PLAY, the GET_PARAMETER request is not sent and the stream starts immediatly. I attached the packet description I get with Wireshark. Some idea or suggestion about the possible cause of this strange behaviour will be very appeciated. Thank you. M. Raffin -- *Mario Raffin * Development Engineer Ermes Elettronica s.r.l. via Treviso, 36 31020 San Vendemiano (TV) Italy (Map ) Mail: raffin at ermes-cctv.com Web:http://www.ermes-cctv.com/ /Tel. +39 0438 308470/ /Fax. +39 0438 492340/ -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... 
Name: MARCHIO ERMES.jpg Type: image/jpeg Size: 3980 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: rtpsessionscollapsed2.zip Type: application/x-zip-compressed Size: 1601 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: raffin.vcf Type: text/x-vcard Size: 266 bytes Desc: not available URL: From sebastien-devel at celeos.eu Tue Nov 15 04:42:11 2011 From: sebastien-devel at celeos.eu (=?ISO-8859-1?Q?S=E9bastien?= Escudier) Date: Tue, 15 Nov 2011 13:42:11 +0100 Subject: [Live-devel] VLC needs to send the PLAY command twice In-Reply-To: <4EC254F0.8020700@ermes-cctv.com> References: <4EC254F0.8020700@ermes-cctv.com> Message-ID: <1321360931.25208.7.camel@stim-desktop> On Tue, 2011-11-15 at 13:02 +0100, Mario Raffin wrote:. > When I start playing all seems OK, I see the the OPTIONS,DESCRIBE, > SETUP and PLAY from the VLC client. After the PLAY the source always > replies with the OK. Then, immediatly after this first PLAY the VLC > client sends a GET_PARAMETER request, the server answers but the > streams does not starts and after about 10s the client sends a > TEARDOWN and closes the session (in the middle I see a couple of RTCP > packets). > After the second PLAY, the GET_PARAMETER request is not sent and the > stream starts immediatly. Look at vlc logs : that's because vlc does not receive any data and fallback to TCP. From finlayson at live555.com Tue Nov 15 05:03:58 2011 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 15 Nov 2011 21:03:58 +0800 Subject: [Live-devel] RTSP Server Crash with Milestone Client In-Reply-To: <1321348353.68250.YahooMailNeo@web95012.mail.in2.yahoo.com> References: <1321348353.68250.YahooMailNeo@web95012.mail.in2.yahoo.com> Message-ID: <2A3D9DE1-F378-4D89-8B6F-F79A1C64316E@live555.com> I can't figure out exactly why your server is crashing with the "Milestone" client, but it appears to be related to the fact that this client is behaving in a very strange and non-standard way. For some reason, it is sending two different copies of each "DESCRIBE" request. In particular, it is sending, in order: - a RTSP "DESCRIBE" command, with CSeq 1. The server (correctly) responds to this with a "401 Unauthorized" response. - a different RTSP "DESCRIBE", but also with CSeq 1. This is wrong. The server also responds to this with "401 Unauthorized" - a RTSP "DESCRIBE" request, with CSeq 2, that follows up to the first "DESCRIBE", including proper cryptographic credentials. - a different RTSP "DESCRIBE" request, with CSeq 2, that follows up to the second "DESCRIBE", including proper cryptographic credentials. Again, this is wrong. I.e., this "Milestone" client is behaving in an incorrect, non-standard way. That doesn't excuse the server crashing; even incorrect client behavior should never cause the server to crash. However, you should contact your client's manufacturer, asking them to fix their client. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
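Tying together the earlier "DeviceSource::eventTriggerId" question and its answer (the trigger runs whichever handler was passed to createEventTrigger() - in "DeviceSource.cpp" that handler is deliverFrame0(), which simply calls deliverFrame() on the object passed as client data), the calling side can be condensed into the sketch below. Whether signalNewFrameData() is a member function or, as here, a free function handed the saved scheduler and device pointers is a detail; the createEventTrigger()/triggerEvent() calls are the real "TaskScheduler" API, everything else is an assumption.

#include "DeviceSource.hh"

// Done once, e.g. when the DeviceSource is created:
//   eventTriggerId = envir().taskScheduler().createEventTrigger(deliverFrame0);
// That call is what ties the trigger id to deliverFrame0().

// Called from the separate capture/encoder thread whenever new data is ready.
// ourScheduler and ourDevice are assumed to have been saved when the source
// and its environment were set up (the "%%% TO BE WRITTEN %%%" parts of the
// shipped DeviceSource.cpp).
void signalNewFrameData(TaskScheduler* ourScheduler, DeviceSource* ourDevice) {
  if (ourScheduler != NULL && ourDevice != NULL) {
    // triggerEvent() is the one call that may safely be made from another
    // thread; the event-loop thread will later invoke deliverFrame0(ourDevice),
    // which in turn calls ourDevice->deliverFrame() to push the frame downstream.
    ourScheduler->triggerEvent(DeviceSource::eventTriggerId, ourDevice);
  }
}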
URL: From finlayson at live555.com Tue Nov 15 05:32:23 2011 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 15 Nov 2011 21:32:23 +0800 Subject: [Live-devel] VLC needs to send the PLAY command twice In-Reply-To: <1321360931.25208.7.camel@stim-desktop> References: <4EC254F0.8020700@ermes-cctv.com> <1321360931.25208.7.camel@stim-desktop> Message-ID: <74549F0B-D571-4578-BC3A-86904CE301D7@live555.com> > Look at vlc logs : that's because vlc does not receive any data and > fallback to TCP. Yes, you have a firewall somewhere - between your server and client - that is blocking UDP traffic. That's why VLC, after failing to receive any RTP/UDP packets, retries, this time requesting RTP/TCP. You can get rid of this delay by having VLC request RTP/TCP only, or by fixing your firewall. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From Charles.Baudouin at cobham.com Tue Nov 15 08:57:10 2011 From: Charles.Baudouin at cobham.com (Baudouin, Charles) Date: Tue, 15 Nov 2011 16:57:10 +0000 Subject: [Live-devel] unsubsribe Message-ID: <2AA7930B7DFE3B49A6CB2D982917C1DC09078CE527@NHC0-PUR-MXC001.purple.cobham.local> This E-mail and any files transmitted with it ("E-mail") is intended solely for the addressee(s) and may contain confidential and/or legally privileged information. If you are not the addressee(s), any disclosure, reproduction, copying, distribution or other use of the E-mail is prohibited. If you have received this E-mail in error, please delete it and notify the sender immediately via our switchboard or return e-mail. Neither the company nor any subsidiary or affiliate or associated company nor any individual sending this E-mail accepts any liability in respect of the content (including errors and omissions) nor shall this e-mail be deemed to enter the company or any subsidiary or affiliate or associated company into a contract or to create any legally binding obligations unless expressly agreed to in writing under separate cover and timeliness of the E-mail which arise as a result of transmission. If verification is required, please request a hard copy version from the sender. -------------- next part -------------- An HTML attachment was scrubbed... URL: From tayeb.dotnet at gmail.com Mon Nov 14 13:14:18 2011 From: tayeb.dotnet at gmail.com (Meftah Tayeb) Date: Mon, 14 Nov 2011 23:14:18 +0200 Subject: [Live-devel] RTSP DVB streaming Message-ID: hello folks, i'm new to this list:) please can someone tel me how to stream dvb-S channels using RTSP? thank you Meftah Tayeb IT Consulting http://www.tmvoip.com/ phone: +21321656139 Mobile: +213660347746 __________ Information from ESET NOD32 Antivirus, version of virus signature database 6633 (20111115) __________ The message was checked by ESET NOD32 Antivirus. http://www.eset.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From achraf.gazdar at gmail.com Tue Nov 15 16:41:25 2011 From: achraf.gazdar at gmail.com (Achraf Gazdar) Date: Wed, 16 Nov 2011 01:41:25 +0100 Subject: [Live-devel] Relaying transport Stream UDP source to an RTPsink Message-ID: Hi, I want to know if is it possible to relay a transport stream streamed from a source using udp to a destination using RTP. The chain is : UDPSource> (TransportStream(H264 + AAC)) to be relayed to >RTPSink. Is it possible to setup an RTSP server serving this RTP stream. Thanks. -- Achraf Gazdar Associate Professor on Computer Science Hana Research Unit, CRISTAL Lab. 
http://www.hana-uma.org Tunisia -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Nov 15 17:31:24 2011 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 16 Nov 2011 09:31:24 +0800 Subject: [Live-devel] Relaying transport Stream UDP source to an RTPsink In-Reply-To: References: Message-ID: <2C5B91AC-2157-4569-95C5-10A185934305@live555.com> > I want to know if is it possible to relay a transport stream streamed from a source using udp to a destination using RTP. The chain is : > UDPSource> (TransportStream(H264 + AAC)) to be relayed to >RTPSink. Is it possible to setup an RTSP server serving this RTP stream. Yes, you can use the "testOnDemandRTSPServer" application code as a model for this. You can define your own subclass of "OnDemandServerMediaSubsession", and redefine the "createNewStreamSource()" virtual function to return a "BasicUDPSource", fed into a "MPEG2TransportStreamFramer" object. See http://www.live555.com/liveMedia/faq.html#liveInput-unicast Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From tinitus.rockt at gmx.de Tue Nov 15 23:10:45 2011 From: tinitus.rockt at gmx.de (Simon Helwig) Date: Wed, 16 Nov 2011 08:10:45 +0100 Subject: [Live-devel] Compatibility with Visual Studio 10.0 In-Reply-To: References: <20111108135459.257830@gmx.net> <20111109070007.135660@gmx.net> <20111110164151.155430@gmx.net> Message-ID: <20111116071045.285650@gmx.net> Hey, I tried to built the .mak files, but I don't get an .exe file. Maybe I started the Makefile project in a wrong way, because when you create such a project, you will be asked, to set some settings for the Debug configuration and the release configuration, but I have omitted these steps and let the lines empty. This could be the mistake, or it doesn't matter? Best greetings Simon -------- Original-Nachricht -------- > Datum: Fri, 11 Nov 2011 08:02:27 +0000 > Von: James Norris > An: "LIVE555 Streaming Media - development & use" > Betreff: Re: [Live-devel] Compatibility with Visual Studio 10.0 > Hey, > > On Thu, Nov 10, 2011 at 4:41 PM, Simon Helwig > wrote: > > Yes, that's correct. I've already built the libraries in VS10. > > First I downloaded the package (live555-latest.tar.gz ? 01-Nov-2011 > 19:01 ?508K) and extracted the files in a new folder on my Desktop. Then I > changed the path of the 'tools' directory in the file "win32config". The new > path I entered was C:\Programme\Microsoft Visual Studio 10.0\VC > > Afterwards I opened the commando line and went to the new folder, where > I extracted the files of the package. Then I generated the .mak files with > the commando "genWindowsMakefiles". > > In addition to that I opened a new "Makefile Project" in Visual Studio. > In this project I imported all the *.mak files I generated before. There > are four files: BasicUsageEnviornment.mak, groupsok.mak, > liveMedia.mak,UsageEnviornment.mak and testProgs.mak. > > You've listed 5 (not 4) make files here, and if you build the > testProgs.mak as you have done with the other 4, you should get a > testMPEG2...exe. Do that and let us know what the *specific* errors > are if any appear. > > > Then I built successively each of the four files (projects) first. > > Now I opened C++ project and took one of the test programms > (testMPEG2TransportStreamer.cpp) and copied the code in it. > > Thereafter I tried to compile the code, but then I got a lot of > mistakes. 
> > No need for all this, testProgs.mak builds it > > > > > Are my steps I explained correct? > > Do you still want to know the exact wording of the error messages? > ?Because I'm now sitting on the wrong computer, so I could tell you the wording > of the mistakes only on Monday, because then I'm again at work and there > are so many errors, that I'm not able to tell you one of them now. Sorry for > that, but maybe you can already narrow the problem? > > > > Best greetings from Germany > > > > Simon > > > > -------- Original-Nachricht -------- > >> Datum: Wed, 9 Nov 2011 09:42:11 +0000 > >> Von: James Norris > >> An: "LIVE555 Streaming Media - development & use" > > >> Betreff: Re: [Live-devel] Compatibility with Visual Studio 10.0 > > > >> Less excuses more throwing me bones. ?What are the problems > >> specifically? ?It sounds like you're building the test programs, which > >> I'm assuming means you've already built the libraries (in VS10)? ?If > >> that's true it should link/work. > >> > >> James > >> > >> > >> > >> On Wed, Nov 9, 2011 at 7:00 AM, Simon Helwig > wrote: > >> > Ok, it would be very nice, if I could use VS10. May you could tell > me, > >> what you did, to compile one of the test programs step by step, because > I'm > >> a beginner and it's long ago, since I've worked with VS and I got so > much > >> mistakes, that I think, that I've made a fundamental error before. In > the > >> thread "Compiling a test program", which I posted last friday, I > explained > >> the steps I've made to make one of the test programs running. I would > be very > >> grateful for your help! > >> > > >> > Best greetings > >> > > >> > Simon > >> > -------- Original-Nachricht -------- > >> >> Datum: Tue, 8 Nov 2011 15:41:23 +0000 > >> >> Von: James Norris > >> >> An: "LIVE555 Streaming Media - development & use" > >> > >> >> Betreff: Re: [Live-devel] Compatibility with Visual Studio 10.0 > >> > > >> >> What are the problems? ?I'm sure I did manage it when using VS10, > but > >> >> moved back to 9 now (its better) > >> >> > >> >> James > >> >> > >> >> > >> >> > >> >> On Tue, Nov 8, 2011 at 1:54 PM, Simon Helwig > >> wrote: > >> >> > Hallo, > >> >> > > >> >> > could somebody tell me, if the whole libraries of LIVE555 > Streaming > >> >> Media are compatible to Microsoft Visual Studio 10.0? Because I've > many > >> >> problems, when I compile the libraries and it seems very strange. > >> >> > So if anybody knows an answer to this problem, please tell me and > >> >> perhaps you have an instruction for me, how to build these libraries > >> step by step > >> >> in Visual Studio 10.0? > >> >> > I would be very happy about an answer... > >> >> > > >> >> > Best greetings > >> >> > > >> >> > Simon > >> >> > -- > >> >> > NEU: FreePhone - 0ct/min Handyspartarif mit Geld-zur?ck-Garantie! > >> >> > Jetzt informieren: http://www.gmx.net/de/go/freephone > >> >> > _______________________________________________ > >> >> > live-devel mailing list > >> >> > live-devel at lists.live555.com > >> >> > http://lists.live555.com/mailman/listinfo/live-devel > >> >> > > >> >> > >> >> _______________________________________________ > >> >> live-devel mailing list > >> >> live-devel at lists.live555.com > >> >> http://lists.live555.com/mailman/listinfo/live-devel > >> > > >> > -- > >> > NEU: FreePhone - 0ct/min Handyspartarif mit Geld-zur?ck-Garantie! 
> >> > Jetzt informieren: http://www.gmx.net/de/go/freephone > >> > _______________________________________________ > >> > live-devel mailing list > >> > live-devel at lists.live555.com > >> > http://lists.live555.com/mailman/listinfo/live-devel > >> > > >> > >> _______________________________________________ > >> live-devel mailing list > >> live-devel at lists.live555.com > >> http://lists.live555.com/mailman/listinfo/live-devel > > > > -- > > Empfehlen Sie GMX DSL Ihren Freunden und Bekannten und wir > > belohnen Sie mit bis zu 50,- Euro! https://freundschaftswerbung.gmx.de > > _______________________________________________ > > live-devel mailing list > > live-devel at lists.live555.com > > http://lists.live555.com/mailman/listinfo/live-devel > > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -- Empfehlen Sie GMX DSL Ihren Freunden und Bekannten und wir belohnen Sie mit bis zu 50,- Euro! https://freundschaftswerbung.gmx.de From isambhav at gmail.com Wed Nov 16 00:05:30 2011 From: isambhav at gmail.com (Sambhav) Date: Wed, 16 Nov 2011 13:35:30 +0530 Subject: [Live-devel] Stopping an RTP stream safely/Deleting objects In-Reply-To: References: <4EC1C83E.2090403@e6group.com> Message-ID: I have a similar use case in which the application has to start and stop the OnDemandRTSPServer before the EOF without the process being killed. I am using the code of h264ESVideoTest of testOnDemandRTSPServer To stop the server i set the watchVariable to a non null value. when the doEventLoop(&watchVariable); returns I am doing the following. rtspServer->removeServerMediaSession(sms); delete sms; env->reclaim(); delete scheduler; Next time when I start the server, port binding is failing. Anything I am missing here ? On Tue, Nov 15, 2011 at 9:03 AM, Ross Finlayson wrote: > I'm streaming an MPEG-2 TS file using code very similar to > testMPEG2TransportStreamer.cpp. In one of my use cases, I need to stop the > stream immediately instead of allowing it to complete. I'm doing this > currently by passing a watch variable to the taskScheduler().doEventLoop() > call and allowing another thread to set that variable when I need to stop > playback. > > > FYI (and this is unrelated to your question below), you might find it > simpler to use the new 'event trigger' mechanism instead; see > "UsageEnvironment/include/UsageEnvironment.hh" > > > My question is what do I need to do after that to safely stop and delete > my related objects (UsageEnvironment, RTPSink, etc) if I leave the > doEventLoop() early? > > > First, you can always just call "exit(0)" to leave and destroy the process > (i.e., address space, i.e., application). The OS will take care of > closing/reclaiming its sockets. This is by far the simplest thing to do, > and it's what you should do - unless you have a very good reason for > wanting to keep the process around. > > But, if you really want to keep the process around, you can reclaim the > LIVE555 objects by doing the following (generally speaking, you're > reclaiming objects in the reverse order from the order that you created > them): > > videoSink->stopPlaying(); > Medium::close(videoSource); > Medium::close(videoSink); > delete rtpGroupsock; > delete rtcpGroupsock; > env->reclaim(); > delete scheduler; > > (I'm basing this on the "testMPEG2TransportStreamer" code, because you > said you used that code as a model.) 
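A minimal sketch of that 'event trigger' mechanism, assuming the stream was set up as in "testMPEG2TransportStreamer" and that only the event-loop thread touches LIVE555 objects (the names "gVideoSink", "gStopTrigger" and "stopStreamingHandler" are illustrative, not part of the library):

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"

static MediaSink* gVideoSink = NULL;        // set once streaming has been started
static EventTriggerId gStopTrigger = 0;

static void stopStreamingHandler(void* /*clientData*/) {
  // This runs inside the event loop, so it is safe to touch LIVE555 objects here:
  if (gVideoSink != NULL) gVideoSink->stopPlaying();
  // ... then close/delete the remaining objects in the reverse order of creation,
  // exactly as in the list above, and/or set a flag that makes doEventLoop() return.
}

// At setup time, on the event-loop thread:
//   gStopTrigger = scheduler->createEventTrigger(stopStreamingHandler);
// Later, from any other thread, to request the stop:
//   scheduler->triggerEvent(gStopTrigger, NULL);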
> > > I'd like to reuse the RTPsink but give it different parameters on a > subsequent use. > > > No, you can't do that. Just delete it (using "Medium::close()"), and > create a new one next time. > > But again, why not just exit the process (application), and launch a new > one next time? > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From nunzianteantonio at gmail.com Wed Nov 16 00:41:47 2011 From: nunzianteantonio at gmail.com (antonio nunziante) Date: Wed, 16 Nov 2011 09:41:47 +0100 Subject: [Live-devel] Live555 in a Browser Plugin Message-ID: Hi all, I built a widget in Qt that play a rtsp audio/video stream with Live555 and FFMPEG. It works and now I want to include it inside a Qt BrowserPlugin. My plugin at the moment is able to start the RTSP session (i can see packets start to arrive from the camera with WireShark) but an error occurs and plugin stops with an error. Could anyone say me anything about this kind of problem? Thank you -Antonio -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Nov 16 02:06:17 2011 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 16 Nov 2011 18:06:17 +0800 Subject: [Live-devel] Stopping an RTP stream safely/Deleting objects In-Reply-To: References: <4EC1C83E.2090403@e6group.com> Message-ID: > I have a similar use case in which the application has to start and stop the OnDemandRTSPServer before the EOF without the process being killed. Why? It would be much better to just leave the server running, but add/remove "ServerMediaSubsession" objects as appropriate. > I am using the code of h264ESVideoTest of testOnDemandRTSPServer > > To stop the server i set the watchVariable to a non null value. > when the doEventLoop(&watchVariable); returns > > I am doing the following. > rtspServer->removeServerMediaSession(sms); > delete sms; > env->reclaim(); > delete scheduler; > > Next time when I start the server, port binding is failing. > Anything I am missing here ? A lot, unfortunately. First, after you call "removeServerMediaSubsession(sms)", you *must not* then do "delete sms" (or "Medium::close(sms)"), because that will cause the object to be deleted twice. Second, you shouldn't reclaim the "UsageEnvironment" object, unless you first delete all of the objects that use it. In particular, you must first do Medium::close(rtspServer); but note that if you do that, then this will automatically remove all "ServerMediaSession" objects, so you won't need to do that yourself. But once again: Rather than doing all this, why not just exit() and then recreate a new process? This would be much simpler. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jshanab at smartwire.com Wed Nov 16 08:44:17 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Wed, 16 Nov 2011 16:44:17 +0000 Subject: [Live-devel] Access violation crash in rtspclient Message-ID: <615FD77639372542BF647F5EBAA2DBC20B18AD48@IL-BOL-EXCH01.smartwire.com> I have been using the live555 client in our project for about a year and it has been working great for most devices. I just got a new device that does something wrong and causes a crash of the entire app. 
Since it handles multiple cameras this is a real problem. It is a 4 channel h264 encoder and when the server is still connected but video stops coming in, something happens in incomingReportHandler that it cannot handle. I am in windows and debugging shows at that point that it is trying to call RTCPInstance::numMembers() and all pointers are "bad pointers" Call stack BasicTaskScheduler::SingleStep-->SocketDescriptor::tcpReadHandler--> SocketDescriptor::tcpReadHandler1-->RTCPInstance::incomingReportHandler--> RTCPInstance::incomingReportHandler1-->onRecieve-->numMembers On the line Return fNumMembers It shows "access Violation and all pointers in the object are "bad pointers 0xddddddde5" Another symptom of this command is if I start the app with no video on the channel, it fails in the exact same spot. The RTSP conversation goes thru all the steps of acquiring the session and subsession but receives a BYE request which triggers a TEARDOWN. The TEARDOWN seems to go partially or completely unanswered and that causes the access violation. If you read this far, THANKS! Any help would be appreciated. I would like to just exit gracefully I am on 3/14/2011 version of live555 -------------- next part -------------- An HTML attachment was scrubbed... URL: From achraf.gazdar at gmail.com Wed Nov 16 06:34:01 2011 From: achraf.gazdar at gmail.com (Achraf Gazdar) Date: Wed, 16 Nov 2011 15:34:01 +0100 Subject: [Live-devel] Relaying HTTP source into RTPSink Message-ID: >> I want to know if is it possible to relay a transport stream streamed from a source using >>udp to a destination using RTP. The chain is :> UDPSource> (TransportStream(H264 + >>AAC)) to be relayed to >RTPSink. Is it possible to setup an RTSP server serving this >>RTP stream. >Yes, you can use the "testOnDemandRTSPServer" application code as a model for this. >You can define your own subclass of "OnDemandServerMediaSubsession", and redefine >the "createNewStreamSource()" virtual function to return a "BasicUDPSource", fed into a >"MPEG2TransportStreamFramer" object. > See http://www.live555.com/liveMedia/faq.html#liveInput-unicast > What about http source relaying ? -- Achraf Gazdar Associate Professor on Computer Science Hana Research Unit, CRISTAL Lab. http://www.hana-uma.org Tunisia -------------- next part -------------- An HTML attachment was scrubbed... URL: From norris.j at gmail.com Wed Nov 16 09:29:42 2011 From: norris.j at gmail.com (James Norris) Date: Wed, 16 Nov 2011 17:29:42 +0000 Subject: [Live-devel] H264DiscreteFramer from custom source In-Reply-To: References: Message-ID: Hey Ross, Sorry for slow reply, now back at my desk. I've put the video up at: http://www.cs.nott.ac.uk/~jzn/live/test.h264 Unfortunately VLC doesn't play it. Any idea what's wrong with the file? I also saved the SDP description via a log of OpenRTSP running (this also shows exact params used to openrtsp if there was any doubt): http://www.cs.nott.ac.uk/~jzn/live/log.txt Am I correct in saying that there should be an extra line in this SDP which translates to an interpreted SPS/PPS NAL via the discrete framer? Even without this line in the SDP, VLC should still be able to understand the stream as the SPS/PPS are inline though? Thanks again, James On Thu, Nov 10, 2011 at 4:41 PM, Ross Finlayson wrote: > I've put an avi of the stream here: > > http://www.cs.nott.ac.uk/~jzn/live/test.avi > > Which was generated using the command line: > > openRTSP.exe -4 -w 256 -h 64 -f 30 -d 10 rtsp://.... > test.avi > > This is incorrect. 
?The "-4" option generates a ".mp4"-format file, not an > AVI file. > In any case, I suggest not using this option (or "-i"). ?Instead, don't use > any options at all (other than "-d "). ?You will get a raw H.264 > video output file. ?Rename this as "test.h264", and check whether or not VLC > will play it. > If VLC can't play your file, then put it on a web server, and send us the > URL, so I can download and look at it myself. > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > From finlayson at live555.com Wed Nov 16 14:20:57 2011 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 17 Nov 2011 06:20:57 +0800 Subject: [Live-devel] Relaying HTTP source into RTPSink In-Reply-To: References: Message-ID: <3321B6D7-1009-44B2-BE43-407A5FDA0D50@live555.com> > What about http source relaying ? OK, I missed that, because your original message talked about using a "UDPSource" for input. Obviously, for HTTP, you can't do this. If your input is a HTTP source, then the easiest way to handle this is to use a 3rd-party application - like "wget" - to retrieve it, and then pipe its output into our server code (which will read from "stdin"). This is described in the FAQ. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Nov 16 14:23:39 2011 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 17 Nov 2011 06:23:39 +0800 Subject: [Live-devel] Access violation crash in rtspclient In-Reply-To: <615FD77639372542BF647F5EBAA2DBC20B18AD48@IL-BOL-EXCH01.smartwire.com> References: <615FD77639372542BF647F5EBAA2DBC20B18AD48@IL-BOL-EXCH01.smartwire.com> Message-ID: <2785FF9F-E209-4A68-B2A8-81C24E2DE3B3@live555.com> > I am on 3/14/2011 version of live555 Sorry, but no support is given for old versions of the code. Please upgrade to the latest version. It contains many bugfixes since 3/14/2011, and might (or might not) end up fixing your problem. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jshanab at smartwire.com Wed Nov 16 15:05:44 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Wed, 16 Nov 2011 23:05:44 +0000 Subject: [Live-devel] Access violation crash in rtspclient In-Reply-To: <2785FF9F-E209-4A68-B2A8-81C24E2DE3B3@live555.com> References: <615FD77639372542BF647F5EBAA2DBC20B18AD48@IL-BOL-EXCH01.smartwire.com> <2785FF9F-E209-4A68-B2A8-81C24E2DE3B3@live555.com> Message-ID: <615FD77639372542BF647F5EBAA2DBC20B18AF2C@IL-BOL-EXCH01.smartwire.com> While waiting for a reply I updated to 11/8/2011. The problem is exactly the same. I am however still using the async interface, the re-write would kill me right now. I do have more info. I have an encoder that has 4 streams. I wiresharked the connection and it connects fine when there is video. When there is no video the conversation is the same up to the PLAY DESCRIBE --> 200ok and a SDP SETUP --> 200ok and a sessionID PLAY --> No response for a period of time. Then a few bytes that causes the TEARDOWN to be sent. The TEARDOWN goes un answered and then the library crashes and in the debug session I find that It has tried to execute SingleStep() in SingleStep it fails with numMembers() as indicated before. 
The funny part is the guard in doEventLoop has a non-null watchVariable and it points to valid data. By this time the pointers inside the session all are invalid. From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Wednesday, November 16, 2011 4:24 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Access violation crash in rtspclient I am on 3/14/2011 version of live555 Sorry, but no support is given for old versions of the code. Please upgrade to the latest version. It contains many bugfixes since 3/14/2011, and might (or might not) end up fixing your problem. Ross Finlayson Live Networks, Inc. http://www.live555.com/ ________________________________ No virus found in this message. Checked by AVG - www.avg.com Version: 2012.0.1869 / Virus Database: 2092/4620 - Release Date: 11/16/11 -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Nov 16 16:14:47 2011 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 17 Nov 2011 08:14:47 +0800 Subject: [Live-devel] Access violation crash in rtspclient In-Reply-To: <615FD77639372542BF647F5EBAA2DBC20B18AF2C@IL-BOL-EXCH01.smartwire.com> References: <615FD77639372542BF647F5EBAA2DBC20B18AD48@IL-BOL-EXCH01.smartwire.com> <2785FF9F-E209-4A68-B2A8-81C24E2DE3B3@live555.com> <615FD77639372542BF647F5EBAA2DBC20B18AF2C@IL-BOL-EXCH01.smartwire.com> Message-ID: <873E8673-32F0-48A2-850E-AA9850B611E3@live555.com> > While waiting for a reply I updated to 11/8/2011. The problem is exactly the same. > I am however still using the async interface, the re-write would kill me right now. Do you mean "still using the *synchronous* interface" (i.e., the old, now-deprecated interface)? But in any case, that's OK; either interface is supposed to be working without error. First, I suggest running our "openRTSP" command-line RTSP client to access the server that's causing you problems. If "openRTSP" also crashes for you (with that server), then the job will be easier, because it shows that there's a problem even with our unmodified released code. If, however, the problem happens only with your client application, then I suggest looking at what happens - in your code - after a RTCP "BYE" arrives from the server. I presume that you have set a 'BYE handler' - using the function "RTCPInstance::setByeHandler()". Do any other of your servers send RTCP "BYE" packets (thereby triggering the calling of your 'BYE handler'), or does only your new, troublesome server send a RTCP "BYE"? If it's only your new server that sends this, then this suggests that there is a problem with your 'BYE handler' (because it had not been called before). Perhaps you are closing/deleting objects in the wrong order, or trying to delete some objects more than once? Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Nov 16 16:25:51 2011 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 17 Nov 2011 08:25:51 +0800 Subject: [Live-devel] H264DiscreteFramer from custom source In-Reply-To: References: Message-ID: > Am I correct in saying that there should be an extra line in this SDP > which translates to an interpreted SPS/PPS NAL via the discrete > framer? Even without this line in the SDP, VLC should still be able > to understand the stream as the SPS/PPS are inline though? 
Not necessarily - depending upon how smart VLC's H.264 decoder is (or isn't). But the problem here is that the SPS and PPS NAL units are *not* appearing in the stream, because the "H264VideoStreamDiscreteFramer" is not recognizing them. That's why there isn't a separate "a=fmtp:" line in the SDP description. You can verify this yourself by reviewing the code for "H264VideoStreamDiscreteFramer::afterGettingFrame1()" (in "liveMedia/H264VideoStreamDiscreteFramer.cpp"). Note the "if" statement beginning at line 67, where the code checks the "nal_unit_type". In your case, it is apparently never seeing a SPS or PPS NAL unit. That's the problem (with your input data) that you need to address. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From norris.j at gmail.com Thu Nov 17 02:27:26 2011 From: norris.j at gmail.com (James Norris) Date: Thu, 17 Nov 2011 10:27:26 +0000 Subject: [Live-devel] H264DiscreteFramer from custom source In-Reply-To: References: Message-ID: Hey Ross, I've duplicated some code from H264VideoFileServerMediaSubsession which streams into a 'dummy' RTP sink to get the SPS/PPS. Now the SDP contains the SDP description: a=fmtp:96 packetization-mode=1;profile-level-id=42C00B;sprop-parameter-sets=Z0LA C9oECaEAAAMAAQAAAwA8jxQqoA==,aM4fIA== The entire log including the full SDP is inside the same location: http://www.cs.nott.ac.uk/~jzn/live/log.txt Unfortunately I'm getting the same symptoms, the h264 file created is in-readable and VLC buffers twice, produces no errors and cannot display anything. I also checked that the discrete framer is interpreting the SPS/PPS via a couple of println's alongside 'saveCopyOfPPS'. Seems fine! I'm at a complete loss here, as far as I can see the stream is correct, but still nothing is able to interpret it. I've put another .h264 file in the same place: http://www.cs.nott.ac.uk/~jzn/live/test.h264 Thanks again, James On Thu, Nov 17, 2011 at 12:25 AM, Ross Finlayson wrote: > Am I correct in saying that there should be an extra line in this SDP > which translates to an interpreted SPS/PPS NAL via the discrete > framer? ?Even without this line in the SDP, VLC should still be able > to understand the stream as the SPS/PPS are inline though? > > Not necessarily - depending upon how smart VLC's H.264 decoder is (or > isn't). > But the problem here is that the SPS and PPS NAL units are *not* appearing > in the stream, because the "H264VideoStreamDiscreteFramer" is not > recognizing them. ?That's why there isn't a separate "a=fmtp:" line in the > SDP description. > You can verify this yourself by reviewing the code for > "H264VideoStreamDiscreteFramer::afterGettingFrame1()" (in > "liveMedia/H264VideoStreamDiscreteFramer.cpp"). ?Note the "if" statement > beginning at line 67, where the code checks the "nal_unit_type". ?In your > case, it is apparently never seeing a SPS or PPS NAL unit. ?That's the > problem (with your input data) that you need to address. > > Ross Finlayson > Live Networks, Inc. 
> http://www.live555.com/ > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > From finlayson at live555.com Thu Nov 17 02:42:42 2011 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 17 Nov 2011 18:42:42 +0800 Subject: [Live-devel] H264DiscreteFramer from custom source In-Reply-To: References: Message-ID: <5BE91CD8-D2B7-41B5-9A9F-DBF57BA9DCF7@live555.com> > I'm at a complete loss here, as far as I can see the stream is > correct, but still nothing is able to interpret it. Yes, that's what it looks like to me as well. At this point I suspect that there's a problem with your encoder. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From norris.j at gmail.com Thu Nov 17 03:22:26 2011 From: norris.j at gmail.com (James Norris) Date: Thu, 17 Nov 2011 11:22:26 +0000 Subject: [Live-devel] H264DiscreteFramer from custom source In-Reply-To: <5BE91CD8-D2B7-41B5-9A9F-DBF57BA9DCF7@live555.com> References: <5BE91CD8-D2B7-41B5-9A9F-DBF57BA9DCF7@live555.com> Message-ID: OK sorted.. The answer was indeed due to my encoder, thanks again & sorry it went a little off-topic. For others in similar situation though: Apparently the x264 annex_b needed disabling before the discrete framer would accept. I still skip the startcode in the same way as before, e.g.: for ( int i=0; i wrote: > I'm at a complete loss here, as far as I can see the stream is > correct, but still nothing is able to interpret it. > > Yes, that's what it looks like to me as well. ?At this point I suspect that > there's a problem with your encoder. > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > From norris.j at gmail.com Thu Nov 17 12:12:54 2011 From: norris.j at gmail.com (James Norris) Date: Thu, 17 Nov 2011 20:12:54 +0000 Subject: [Live-devel] MediaSubsession SDP description to NAL Message-ID: Hey all, This is a long shot, but does live555 contain any code that might convert the fmtp line from the SDP description of the MediaSubsession::savedSDPLines (), back into a NAL, in the same data format as the one which was parsed to create the line. I would like to feed it into ffmpeg in the form of AvCodecContext::extradata, seems like the only way to get it to parse the SPS/PPS without the NALs being inline. Cheers, James From finlayson at live555.com Thu Nov 17 14:18:18 2011 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 18 Nov 2011 06:18:18 +0800 Subject: [Live-devel] MediaSubsession SDP description to NAL In-Reply-To: References: Message-ID: <9DED0A32-EB75-437C-AB09-DE3E66F2D859@live555.com> > This is a long shot, but does live555 contain any code that might > convert the fmtp line from the SDP description of the > MediaSubsession::savedSDPLines (), back into a NAL, in the same data > format as the one which was parsed to create the line. It's not a 'long shot' at all. Of course we provide routines for decoding 'configuration' strings in SDP descriptions (because we supply code for receiving, as well as for sending). See the function parseSPropParameterSets() defined in "liveMedia/include/H264VideoRTPSource.hh Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From chris at gotowti.com Thu Nov 17 14:39:52 2011 From: chris at gotowti.com (Chris Richardson (WTI)) Date: Thu, 17 Nov 2011 14:39:52 -0800 Subject: [Live-devel] MediaSubsession SDP description to NAL In-Reply-To: References: Message-ID: <000601cca579$d2df30d0$789d9270$@com> Hi James, It is fairly straightforward to read the fmtp line and convert it to something FFMPEG can use. First, read the SPS and PPS from the SDP using MediaSubsession::fmtp_spropparametersets(). Then you will split the returned string at the comma, if present. The first half will be the SPS and the second half will be the PPS. Base64 decode both of these strings (using Live555's base64Decode function if you wish), then write them to a buffer, with start codes, in the following order: Start Code (00 00 01 or 00 00 00 01) SPS Start Code (00 00 01 or 00 00 00 01) PPS A couple notes: 1. I can't recall if I had to use 00 00 01 or 00 00 00 01 to make FFMPEG happy. I would expect to use 00 00 00 01, but my code currently has 00 00 01, so you may have to try both. 2. I feed the extradata to FFMPEG using a buffer that is padded with 'FF_INPUT_BUFFER_PADDING_SIZE' extra bytes. I can't recall if this is necessary or not. 3. You must malloc the AVCodecContext::extradata buffer with av_malloc, if I recall correctly. These are the basic exact steps of what my implementation does and it works fine, though my streams also have SPS, PPS and IDR NALs embedded in them. Good luck, Chris Richardson WTI -----Original Message----- From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of James Norris Sent: Thursday, November 17, 2011 12:13 PM To: live-devel at ns.live555.com Subject: [Live-devel] MediaSubsession SDP description to NAL Hey all, This is a long shot, but does live555 contain any code that might convert the fmtp line from the SDP description of the MediaSubsession::savedSDPLines (), back into a NAL, in the same data format as the one which was parsed to create the line. I would like to feed it into ffmpeg in the form of AvCodecContext::extradata, seems like the only way to get it to parse the SPS/PPS without the NALs being inline. Cheers, James _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel From emile.semmes at e6group.com Thu Nov 17 14:45:31 2011 From: emile.semmes at e6group.com (Emile Semmes) Date: Thu, 17 Nov 2011 14:45:31 -0800 Subject: [Live-devel] Stopping an RTP stream safely/Deleting objects In-Reply-To: References: Message-ID: <4EC58E8B.5040804@e6group.com> Hi Ross, Thanks for the quick and good advice. I wasn't too keen on using the watch variable to interrupt the doEventLoop() call. Using event triggers works great for me. I'm no longer 'reusing' the videosink as I'll just recreate another one when the other one is disposed as you suggested. As for exiting and restarting the process, what I'd like to do is actually have a single process with multiple thread with different RTP servers (not RTSP) in each. Is this possible using a single environment variable for the whole process but individual task schedulers per thread or should there be one of each (task scheduler and environment var) per thread. I'm betting that's what Sambhav is trying to do as well. Also, are there any thread safety issue with this approach? 
Is this why you're suggesting a single process per server and exiting to delete allocated objects? As an aside, I'm an experienced Gstreamer developer (plugin and application development) and I have to say it was significantly easier to get an RTP server to stream a file than it is do with Gstreamer; streaming live sources is Gstreamer's strength. (Hopefully 'Gstreamer' isn't a curse word here :) ). I do wish the documentation was a bit more clearer than just Doxygen output though; are you the sole maintainer? Need some help? -- Emile Semmes Software Consultant e6 Group, LLC Office: (630) 376-0626 www.e6group.com >> I have a similar use case in which the application has to start and stop the OnDemandRTSPServer before the EOF without the process being killed. > Why? It would be much better to just leave the server running, but add/remove "ServerMediaSubsession" objects as appropriate. > > >> I am using the code of h264ESVideoTest of testOnDemandRTSPServer >> >> To stop the server i set the watchVariable to a non null value. >> when the doEventLoop(&watchVariable); returns >> >> I am doing the following. >> rtspServer->removeServerMediaSession(sms); >> delete sms; >> env->reclaim(); >> delete scheduler; >> >> Next time when I start the server, port binding is failing. >> Anything I am missing here ? > A lot, unfortunately. > > First, after you call "removeServerMediaSubsession(sms)", you *must not* then do "delete sms" (or "Medium::close(sms)"), because that will cause the object to be deleted twice. > > Second, you shouldn't reclaim the "UsageEnvironment" object, unless you first delete all of the objects that use it. In particular, you must first do > Medium::close(rtspServer); > but note that if you do that, then this will automatically remove all "ServerMediaSession" objects, so you won't need to do that yourself. > > But once again: Rather than doing all this, why not just exit() and then recreate a new process? This would be much simpler. > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > From jshanab at smartwire.com Thu Nov 17 14:50:32 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Thu, 17 Nov 2011 22:50:32 +0000 Subject: [Live-devel] MediaSubsession SDP description to NAL In-Reply-To: <9DED0A32-EB75-437C-AB09-DE3E66F2D859@live555.com> References: <9DED0A32-EB75-437C-AB09-DE3E66F2D859@live555.com> Message-ID: <615FD77639372542BF647F5EBAA2DBC20B18B4CF@IL-BOL-EXCH01.smartwire.com> I am working with security cameras and 1/3 do not put in the SPS and PPS nal units in the stream and 1/3 are settable and the rest inject it. VLC puts them in and I took the same tack in my code. When I first start I parse the PPS and SPS nals and store them. Then when I transition from the 1 back to the 5 in my little state machine in my H264Filter, I create a new frame that is {7}{8}{5} and send/save it. (001 still in front of each nal) Is that what you guys mean? I use that function Ross mentions. PS. Ross. Good call on my bye handler. I was losing the subsession pointer in flight because of my class wrapper. 
I still have an issue using the environment pointer which I saved in the class.....but I haven't given up yet From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Thursday, November 17, 2011 4:18 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] MediaSubsession SDP description to NAL This is a long shot, but does live555 contain any code that might convert the fmtp line from the SDP description of the MediaSubsession::savedSDPLines (), back into a NAL, in the same data format as the one which was parsed to create the line. It's not a 'long shot' at all. Of course we provide routines for decoding 'configuration' strings in SDP descriptions (because we supply code for receiving, as well as for sending). See the function parseSPropParameterSets() defined in "liveMedia/include/H264VideoRTPSource.hh Ross Finlayson Live Networks, Inc. http://www.live555.com/ ________________________________ No virus found in this message. Checked by AVG - www.avg.com Version: 2012.0.1872 / Virus Database: 2092/4622 - Release Date: 11/17/11 -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Nov 17 14:55:27 2011 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 18 Nov 2011 06:55:27 +0800 Subject: [Live-devel] Stopping an RTP stream safely/Deleting objects In-Reply-To: <4EC58E8B.5040804@e6group.com> References: <4EC58E8B.5040804@e6group.com> Message-ID: > I'm no longer 'reusing' the videosink as I'll just recreate another one when the other one is disposed as you suggested. > As for exiting and restarting the process, what I'd like to do is actually have a single process with multiple thread with different RTP servers (not RTSP) in each. Is this possible using a single environment variable for the whole process but individual task schedulers per thread or should there be one of each (task scheduler and environment var) per thread. No! If you access the LIVE555 code from multiple threads, then these multiple threads ***must*** each have a separate "UsageEnvironment" and "TaskScheduler", and must not share (pointers to) any other LIVE555-created objects either. Read the FAQ! But once again, why not just use multiple processes?! I'm astounded that - in this day and age - so many developers seem unaware that computer systems can run multiple, concurrent processes. (This seems to be one of the reasons why so many people are falling for bullshit like 'host virtualization'.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Nov 17 14:58:32 2011 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 18 Nov 2011 06:58:32 +0800 Subject: [Live-devel] MediaSubsession SDP description to NAL In-Reply-To: <000601cca579$d2df30d0$789d9270$@com> References: <000601cca579$d2df30d0$789d9270$@com> Message-ID: > Then you will split the > returned string at the comma, if present. The first half will be the SPS > and the second half will be the PPS. Base64 decode both of these strings > (using Live555's base64Decode function if you wish) Yes, but it's simpler to call "parseSPropParameterSets()", because that does all this for you. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
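For what it's worth, a short sketch of that approach, assuming an established H.264 "MediaSubsession" and that the decoder wants the parameter sets as Annex-B data with start codes (the helper name "spropToAnnexB" is only illustrative):

#include "liveMedia.hh"
#include <vector>

std::vector<unsigned char> spropToAnnexB(MediaSubsession& subsession) {
  // parseSPropParameterSets() (declared in H264VideoRTPSource.hh) base64-decodes the
  // comma-separated "sprop-parameter-sets" value into one record per parameter set:
  unsigned numRecords = 0;
  SPropRecord* records = parseSPropParameterSets(subsession.fmtp_spropparametersets(), numRecords);

  std::vector<unsigned char> result;
  for (unsigned i = 0; i < numRecords; ++i) {
    unsigned char const startCode[4] = { 0x00, 0x00, 0x00, 0x01 };
    result.insert(result.end(), startCode, startCode + 4);
    result.insert(result.end(), records[i].sPropBytes,
                  records[i].sPropBytes + records[i].sPropLength);
  }
  delete[] records;  // the caller is responsible for the returned array
  return result;
}

The resulting buffer (SPS and PPS, each behind a start code) is what the earlier message in this thread describes handing to FFmpeg as extradata.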
URL: From finlayson at live555.com Thu Nov 17 15:26:04 2011 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 18 Nov 2011 07:26:04 +0800 Subject: [Live-devel] Stopping an RTP stream safely/Deleting objects In-Reply-To: <4EC58E8B.5040804@e6group.com> References: <4EC58E8B.5040804@e6group.com> Message-ID: <651239EE-366B-4F52-B37C-44B128A995AA@live555.com> > As for exiting and restarting the process, what I'd like to do is actually have a single process with multiple thread with different RTP servers (not RTSP) in each. Is this possible using a single environment variable for the whole process but individual task schedulers per thread or should there be one of each (task scheduler and environment var) per thread. And note also that you can have multiple servers (and/or clients) within a single process, without using multiple threads at all. LIVE555 applications use an event loop - rather than multiple threads - to provide concurrency. If you do not understand that last sentence, then you should not be using this software until you do :-) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jshanab at smartwire.com Thu Nov 17 16:49:51 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Fri, 18 Nov 2011 00:49:51 +0000 Subject: [Live-devel] Every time I update, I have to adjust boolean.hh Message-ID: <615FD77639372542BF647F5EBAA2DBC20B18B5BC@IL-BOL-EXCH01.smartwire.com> I use live555 in a browser plugin to be cross-platform, but windows for now. Everytime I update the live555 libs I go into Boolean.hh in UsageEnvironment and add a guard. I got this workaround from one of the GMANE archives I think. I was just wondering if there is a better way or if maybe this should be in the live555 code! I add the #ifndef __MSHTML_LIBRARY_DEFINED__ guard on line 34. It prevents multiple definition errors. And then I add the winsock2 and mshtml includes on 21 and 22 to make it always work in windows builds. (maybe these should be in a #ifdef WINDOWS ?) I know this is just to get around the windowsisms and the often times silly way they do things :( By making these changes in this one location, for me, it solves making changes all over. (that I never could get to work) /********** 2 This library is free software; you can redistribute it and/or modify it under 3 the terms of the GNU Lesser General Public License as published by the 4 Free Software Foundation; either version 2.1 of the License, or (at your 5 option) any later version. (See .) 6 7 This library is distributed in the hope that it will be useful, but WITHOUT 8 ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS 9 FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for 10 more details. 11 12 You should have received a copy of the GNU Lesser General Public License 13 along with this library; if not, write to the Free Software Foundation, Inc., 14 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA 15 **********/ 16 #ifndef _BOOLEAN_HH 17 #define _BOOLEAN_HH 18 19 20 //include this to force the order and help modified Boolean.hh to work. 
21 #include 22 #include 23 24 25 26 #ifdef __BORLANDC__ 27 #define Boolean bool 28 #define False false 29 #define True true 30 #else 31 32 33 typedef unsigned Boolean; 34 #ifndef __MSHTML_LIBRARY_DEFINED__ 35 #ifndef False 36 const Boolean False = 0; 37 #endif 38 #ifndef True 39 const Boolean True = 1; 40 #endif 41 42 #endif 43 #endif 44 45 #endif -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Nov 17 17:40:05 2011 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 18 Nov 2011 09:40:05 +0800 Subject: [Live-devel] Every time I update, I have to adjust boolean.hh In-Reply-To: <615FD77639372542BF647F5EBAA2DBC20B18B5BC@IL-BOL-EXCH01.smartwire.com> References: <615FD77639372542BF647F5EBAA2DBC20B18B5BC@IL-BOL-EXCH01.smartwire.com> Message-ID: <739E90AC-0023-4E08-89ED-A753D9B19189@live555.com> > I add the #ifndef __MSHTML_LIBRARY_DEFINED__ guard on line 34. It prevents multiple definition errors. OK, fair enough. I'll add this to "Boolean.hh" in future releases of the software. > And then I add the winsock2 and mshtml includes on 21 and 22 to make it always work in windows builds. (maybe these should be in a #ifdef WINDOWS ?) Yes, these #include files only exist in Windows, so they would need to be #ifdef'd. *However*, the include file "groupsock/include/NetCommon.h" - which is #included by almost everything in LIVE555, *already* includes "winsock2.h" (and other Windows-specific header files). So, apart from the "#ifndef __MSHTML_LIBRARY_DEFINED__" guard - which I'll add to the next release of the software - you should not, as far as I can tell, need to make any changes to "Boolean.hh" to get your compilation to work. In particular, you most definitely should not add #include to "Boolean.hh" - because that header file is very specific to your environment, and has nothing to do with LIVE555 code. Instead, "#include " only in your own, custom code. (I'm presuming that "MsHTML.h" defines "__MSHTML_LIBRARY_DEFINED__", so that the new "#ifndef __MSHTML_LIBRARY_DEFINED__" guard in "Boolean.hh" will work OK.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From norris.j at gmail.com Fri Nov 18 02:11:20 2011 From: norris.j at gmail.com (James Norris) Date: Fri, 18 Nov 2011 10:11:20 +0000 Subject: [Live-devel] MediaSubsession SDP description to NAL In-Reply-To: References: <000601cca579$d2df30d0$789d9270$@com> Message-ID: Thanks all, the live555 functionality here works fine. The problem is that ffmpeg requires H264 extradata in avcC format, which is annoyingly complex to put together. Manning up in progress... James On Thu, Nov 17, 2011 at 10:58 PM, Ross Finlayson wrote: > Then you will split the > returned string at the comma, if present. ?The first half will be the SPS > and the second half will be the PPS. ?Base64 decode both of these strings > (using Live555's base64Decode function if you wish) > > Yes, but it's simpler to call "parseSPropParameterSets()", because that does > all this for you. > > Ross Finlayson > Live Networks, Inc. 
> http://www.live555.com/ > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > From jshanab at smartwire.com Fri Nov 18 09:03:37 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Fri, 18 Nov 2011 17:03:37 +0000 Subject: [Live-devel] Access violation crash in rtspclient In-Reply-To: <873E8673-32F0-48A2-850E-AA9850B611E3@live555.com> References: <615FD77639372542BF647F5EBAA2DBC20B18AD48@IL-BOL-EXCH01.smartwire.com> <2785FF9F-E209-4A68-B2A8-81C24E2DE3B3@live555.com> <615FD77639372542BF647F5EBAA2DBC20B18AF2C@IL-BOL-EXCH01.smartwire.com> <873E8673-32F0-48A2-850E-AA9850B611E3@live555.com> Message-ID: <615FD77639372542BF647F5EBAA2DBC20B18B8F4@IL-BOL-EXCH01.smartwire.com> Following your suggestion I got the Bye handlers working better. But I am back to a situation that may indicate I am doing to much in my handler. The case in incomingReportHandler1 that handles the BYE falls thru to and tries to execute onRecieve with the old data. In on receive it tries to access a now dead class member I guess and throws an access violation. I am trying to search thru the openRTSP example as I discovered the much older openRTSP was the basis for our client (before I got here) But If you have any suggestions on what not to do in the bye handler chain, I would appreciate it! From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Wednesday, November 16, 2011 6:15 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Access violation crash in rtspclient While waiting for a reply I updated to 11/8/2011. The problem is exactly the same. I am however still using the async interface, the re-write would kill me right now. Do you mean "still using the *synchronous* interface" (i.e., the old, now-deprecated interface)? But in any case, that's OK; either interface is supposed to be working without error. First, I suggest running our "openRTSP" command-line RTSP client to access the server that's causing you problems. If "openRTSP" also crashes for you (with that server), then the job will be easier, because it shows that there's a problem even with our unmodified released code. If, however, the problem happens only with your client application, then I suggest looking at what happens - in your code - after a RTCP "BYE" arrives from the server. I presume that you have set a 'BYE handler' - using the function "RTCPInstance::setByeHandler()". Do any other of your servers send RTCP "BYE" packets (thereby triggering the calling of your 'BYE handler'), or does only your new, troublesome server send a RTCP "BYE"? If it's only your new server that sends this, then this suggests that there is a problem with your 'BYE handler' (because it had not been called before). Perhaps you are closing/deleting objects in the wrong order, or trying to delete some objects more than once? Ross Finlayson Live Networks, Inc. http://www.live555.com/ ________________________________ No virus found in this message. Checked by AVG - www.avg.com Version: 2012.0.1869 / Virus Database: 2092/4620 - Release Date: 11/16/11 -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From david.myers at panogenics.com Fri Nov 18 10:07:39 2011 From: david.myers at panogenics.com (David J Myers) Date: Fri, 18 Nov 2011 18:07:39 -0000 Subject: [Live-devel] H.264 live video streaming Message-ID: <000001cca61c$f60f1360$e22d3a20$@myers@panogenics.com> Hi, I am trying to use Live555 to implement an RTSP server in our IP camera. Our H.264 encoder produces NAL units preceded by a 00 00 00 01 start code. The encoding main thread then writes the NAL units to a linux pipe where they can be read by the RTSP Server thread for streaming out. My questions are:- 1. Is the linux pipe method a good enough mechanism for getting the data to the server? Has anyone else done this? 2. Which H264 framer class do I need? The new H264VideoStreamDiscreteFramer class doesn't seem right because I have these start codes on each NAL unit. If I base a new class on H264VideoStreamFramer, do I need to remove the start codes? 3. What else does the framer class need to do? Is there an example of this anywhere? Thanks and regards David -------------- next part -------------- An HTML attachment was scrubbed... URL: From jshanab at smartwire.com Fri Nov 18 13:35:45 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Fri, 18 Nov 2011 21:35:45 +0000 Subject: [Live-devel] Access violation crash in rtspclient In-Reply-To: <615FD77639372542BF647F5EBAA2DBC20B18B8F4@IL-BOL-EXCH01.smartwire.com> References: <615FD77639372542BF647F5EBAA2DBC20B18AD48@IL-BOL-EXCH01.smartwire.com> <2785FF9F-E209-4A68-B2A8-81C24E2DE3B3@live555.com> <615FD77639372542BF647F5EBAA2DBC20B18AF2C@IL-BOL-EXCH01.smartwire.com> <873E8673-32F0-48A2-850E-AA9850B611E3@live555.com> <615FD77639372542BF647F5EBAA2DBC20B18B8F4@IL-BOL-EXCH01.smartwire.com> Message-ID: <615FD77639372542BF647F5EBAA2DBC20B18BABB@IL-BOL-EXCH01.smartwire.com> After tracing through there seems to be, dare I say a bug or misleading testProgram. The bye handler calls code resulting in Medium::Close. This ends up calling the destructor on the RTCP class and deletes the fKnowMembers. The OnReceive is then attempted to be called where it ends up trying to use that now deleted pointer. It looks like The playCommon.cpp gets around the ensueing access violation by calling exit() Something I cannot do. Is this analysis correct? Or did I somehow confuse some pointers to sessions/subsessions. If so How do I handle the "BYE" message gracefully? If I move the Medium::close to the destructor for my client, then it necessarily hangs on shutdown because there is nothing to change the watchvariable. From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Jeff Shanab Sent: Friday, November 18, 2011 11:04 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Access violation crash in rtspclient Following your suggestion I got the Bye handlers working better. But I am back to a situation that may indicate I am doing to much in my handler. The case in incomingReportHandler1 that handles the BYE falls thru to and tries to execute onRecieve with the old data. In on receive it tries to access a now dead class member I guess and throws an access violation. I am trying to search thru the openRTSP example as I discovered the much older openRTSP was the basis for our client (before I got here) But If you have any suggestions on what not to do in the bye handler chain, I would appreciate it! 
From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Wednesday, November 16, 2011 6:15 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Access violation crash in rtspclient While waiting for a reply I updated to 11/8/2011. The problem is exactly the same. I am however still using the async interface, the re-write would kill me right now. Do you mean "still using the *synchronous* interface" (i.e., the old, now-deprecated interface)? But in any case, that's OK; either interface is supposed to be working without error. First, I suggest running our "openRTSP" command-line RTSP client to access the server that's causing you problems. If "openRTSP" also crashes for you (with that server), then the job will be easier, because it shows that there's a problem even with our unmodified released code. If, however, the problem happens only with your client application, then I suggest looking at what happens - in your code - after a RTCP "BYE" arrives from the server. I presume that you have set a 'BYE handler' - using the function "RTCPInstance::setByeHandler()". Do any other of your servers send RTCP "BYE" packets (thereby triggering the calling of your 'BYE handler'), or does only your new, troublesome server send a RTCP "BYE"? If it's only your new server that sends this, then this suggests that there is a problem with your 'BYE handler' (because it had not been called before). Perhaps you are closing/deleting objects in the wrong order, or trying to delete some objects more than once? Ross Finlayson Live Networks, Inc. http://www.live555.com/ ________________________________ No virus found in this message. Checked by AVG - www.avg.com Version: 2012.0.1869 / Virus Database: 2092/4620 - Release Date: 11/16/11 ________________________________ No virus found in this message. Checked by AVG - www.avg.com Version: 2012.0.1872 / Virus Database: 2092/4624 - Release Date: 11/18/11 -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Nov 18 22:24:23 2011 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 18 Nov 2011 22:24:23 -0800 Subject: [Live-devel] H.264 live video streaming In-Reply-To: <000001cca61c$f60f1360$e22d3a20$@myers@panogenics.com> References: <000001cca61c$f60f1360$e22d3a20$@myers@panogenics.com> Message-ID: <17911470-1156-43A4-B182-9E88D19AC954@live555.com> > I am trying to use Live555 to implement an RTSP server in our IP camera. > > Our H.264 encoder produces NAL units preceded by a 00 00 00 01 start code. > > The encoding main thread then writes the NAL units to a linux pipe where they can be read by the RTSP Server thread for streaming out. > > My questions are:- > 1. Is the linux pipe method a good enough mechanism for getting the data to the server? Yes. In fact, it's an excellent mechanism, because it allows the server to treat the input stream as if it were a file. This means that the existing "testOnDemandRTSPServer" code will work; you need make only minor changes (set "reuseFirstSource" to True, and change the input file name). See http://www.live555.com/liveMedia/faq.html#liveInput-unicast > > 2. Which H264 framer class do I need? The new H264VideoStreamDiscreteFramer class doesn?t seem right because I have these start codes on each NAL unit. If I base a new class on H264VideoStreamFramer, do I need to remove the start codes? 
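One defensive pattern for such a 'BYE' handler - sketched here with hypothetical names ("OurClientState", "shutdownStream"), assuming the handler was installed with "RTCPInstance::setByeHandler()" - is to avoid closing anything while RTCP code may still be on the call stack, and instead hand the real teardown to the event loop as a separate task:

#include "liveMedia.hh"

struct OurClientState {
  UsageEnvironment* env;
  MediaSession* session;
  // ... whatever else the teardown routine needs
};

static void shutdownStream(void* clientData);  // stops sinks, then Medium::close()s the objects

static void subsessionByeHandler(void* clientData) {
  OurClientState* state = (OurClientState*)clientData;
  // Don't delete anything here; queue the teardown to run on a later pass of the event loop:
  state->env->taskScheduler().scheduleDelayedTask(0, shutdownStream, state);
}

// Installed per subsession, for example:
//   if (subsession->rtcpInstance() != NULL)
//     subsession->rtcpInstance()->setByeHandler(subsessionByeHandler, state);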
Because you're reading H.264 from an unstructured byte stream, you'd use a "H264VideoStreamFramer", and the input data is assumed to include start codes at the front of each NAL unit. > 3. What else does the framer class need to do? In this case you *don't* need to write any new 'framer' code (or subclass) of your own. Just use "H264VideoStreamFramer" as is. > Is there an example of this anywhere? Yes, see "testProgs/testOnDemandRTSPServer.cpp", and "liveMedia/H264VideoFileServerMediaSubsession.cpp". (Because you're reading from a pipe (i.e., as if it were a file), then you should be able to use the "H264VideoFileServerMediaSubsession" class, without modification.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sat Nov 19 00:11:23 2011 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 19 Nov 2011 00:11:23 -0800 Subject: [Live-devel] Access violation crash in rtspclient In-Reply-To: <615FD77639372542BF647F5EBAA2DBC20B18BABB@IL-BOL-EXCH01.smartwire.com> References: <615FD77639372542BF647F5EBAA2DBC20B18AD48@IL-BOL-EXCH01.smartwire.com> <2785FF9F-E209-4A68-B2A8-81C24E2DE3B3@live555.com> <615FD77639372542BF647F5EBAA2DBC20B18AF2C@IL-BOL-EXCH01.smartwire.com> <873E8673-32F0-48A2-850E-AA9850B611E3@live555.com> <615FD77639372542BF647F5EBAA2DBC20B18B8F4@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B18BABB@IL-BOL-EXCH01.smartwire.com> Message-ID: > The bye handler calls code resulting in Medium::Close. This ends up calling the destructor on the RTCP class and deletes the fKnowMembers. Yes. > The OnReceive is then attempted to be called No, that shouldn't be happening, because the "RTCPInstance" destructor called "stopNetworkReading()", which stops any further handling of incoming RTCP packets. I think you're on a 'wild goose chase' here. Because the problem in your application seems to be caused by your 'BYE handler' routine, then why don't you tell us what that routine is doing (or trying to do)? Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From norris.j at gmail.com Sat Nov 19 02:19:40 2011 From: norris.j at gmail.com (James Norris) Date: Sat, 19 Nov 2011 10:19:40 +0000 Subject: [Live-devel] H.264 live video streaming In-Reply-To: <4ec69f71.6824440a.1bfa.ffff8b21SMTPIN_ADDED@mx.google.com> References: <4ec69f71.6824440a.1bfa.ffff8b21SMTPIN_ADDED@mx.google.com> Message-ID: If your data stream arrives as a continuous buffer (its not easy to tell where NALs start/end) use H264VideoStreamFramer, if your NALs are separated & delivered one by one, use the H264VideoStreamDiscreteFramer. You can cull the startcodes from the NALs easily enough. James On Fri, Nov 18, 2011 at 6:07 PM, David J Myers wrote: > Hi, > > I am trying to use Live555 to implement an RTSP ?server in our IP camera. > > > > Our H.264 encoder produces NAL units preceded by a 00 00 00 01 start code. > > > > The encoding main thread then writes the NAL units to a linux pipe where > they can be read by the RTSP Server thread for streaming out. > > > > My questions are:- > > 1.?????? Is the linux pipe method a good enough mechanism for getting the > data to the server? Has anyone else done this? > > 2.?????? Which H264 framer class do I need? The new > H264VideoStreamDiscreteFramer class doesn?t seem right because I have these > start codes on each NAL unit. 
If I base a new class on > H264VideoStreamFramer, do I need to remove the start codes? > > 3.?????? What else does the framer class need to do? Is there an example of > this anywhere? > > > > Thanks and regards > > David > > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > From H.Oztoprak at surrey.ac.uk Sat Nov 19 05:09:31 2011 From: H.Oztoprak at surrey.ac.uk (H.Oztoprak at surrey.ac.uk) Date: Sat, 19 Nov 2011 13:09:31 +0000 Subject: [Live-devel] Mpeg surround sound Message-ID: Hi Ross, We are interested in RTP encapsulation of AAC 5.1 audio in the context of a European Union funded project. Current version of LIVE 555 does not support that. Is there a way that we can get help from you about this or if not what is the best point to start? Regards, Huseyin From jshanab at smartwire.com Sat Nov 19 06:50:43 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Sat, 19 Nov 2011 14:50:43 +0000 Subject: [Live-devel] Access violation crash in rtspclient In-Reply-To: References: <615FD77639372542BF647F5EBAA2DBC20B18AD48@IL-BOL-EXCH01.smartwire.com> <2785FF9F-E209-4A68-B2A8-81C24E2DE3B3@live555.com> <615FD77639372542BF647F5EBAA2DBC20B18AF2C@IL-BOL-EXCH01.smartwire.com> <873E8673-32F0-48A2-850E-AA9850B611E3@live555.com> <615FD77639372542BF647F5EBAA2DBC20B18B8F4@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B18BABB@IL-BOL-EXCH01.smartwire.com> Message-ID: <615FD77639372542BF647F5EBAA2DBC20B18BD00@IL-BOL-EXCH01.smartwire.com> I know that 99% of the time it is 'adjustments to' and 'missuse' that are the problems people have with using live555, but unless they changed the way CPU's work, we do have an issue. (This issue is avoided in openRTSP by just exiting the process before the access violation could hit.) It boils down to calling the destructor of a class from a member in the class puts us in undefined territory. I call your attention to RTCP.cpp from 11/08/2011 In the switch statement in incomingReportHandler1 at line 511, we handle the case of an incoming "BYE" message. On line 525 we save the program counter and create a new stack frame to handle the jump long to the bye handler function. As you say and as the openRTSP does Medium::close is called and this calls the DOTR on the very instance we are calling from. On completion of this call the program counter is restored and execution resumes at line 526. The program counter is still valid and so is the CODE segment in memory, only the DATA segment holding the instance was erased in the dtor. The break on 532 exits the switch taking us to line 542. It passes all test and gets to 583 containing valid stack data and there is nothing to stop it from calling onReceive. In onReceive, line 593, it tries to access the memory that was returned to the OS during the dtor and an access violation attempting to read that memory is thrown. For my use case of the live555 libraries, I cannot take the process approach. It would require a re-arching of an existing project and the result would not work well cross platform. I have upto hundreds if not thousands of connected sources and as many, plus another few hundred, sinks. I have status pages that show status of these streams and notifications on some if they fail. Even then with the dynamic many to many model in this app, I am not sure it would work to have a process for every source with n connections. 
From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Saturday, November 19, 2011 2:11 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Access violation crash in rtspclient The bye handler calls code resulting in Medium::Close. This ends up calling the destructor on the RTCP class and deletes the fKnowMembers. Yes. The OnReceive is then attempted to be called No, that shouldn't be happening, because the "RTCPInstance" destructor called "stopNetworkReading()", which stops any further handling of incoming RTCP packets. I think you're on a 'wild goose chase' here. Because the problem in your application seems to be caused by your 'BYE handler' routine, then why don't you tell us what that routine is doing (or trying to do)? Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sat Nov 19 07:12:22 2011 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 19 Nov 2011 07:12:22 -0800 Subject: [Live-devel] Mpeg surround sound In-Reply-To: References: Message-ID: <42ECD5B3-AAAC-442B-9F00-E25EA0B00F7E@live555.com> > We are interested in RTP encapsulation of AAC 5.1 audio in the context of a European Union funded project. Current version of LIVE 555 does not support that. Yes we do. We support both the receiving (using "MPEG4GenericRTPSource") and sending (using "MPEG4GenericRTPSink") of AAC audio RTP streams, using the RTP payload format "audio/MPEG4-GENERIC" defined in RFC 3640. This should include support for 5.1 audio also. Why do you think that we don't support this? Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sat Nov 19 07:30:05 2011 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 19 Nov 2011 07:30:05 -0800 Subject: [Live-devel] Access violation crash in rtspclient In-Reply-To: <615FD77639372542BF647F5EBAA2DBC20B18BD00@IL-BOL-EXCH01.smartwire.com> References: <615FD77639372542BF647F5EBAA2DBC20B18AD48@IL-BOL-EXCH01.smartwire.com> <2785FF9F-E209-4A68-B2A8-81C24E2DE3B3@live555.com> <615FD77639372542BF647F5EBAA2DBC20B18AF2C@IL-BOL-EXCH01.smartwire.com> <873E8673-32F0-48A2-850E-AA9850B611E3@live555.com> <615FD77639372542BF647F5EBAA2DBC20B18B8F4@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B18BABB@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B18BD00@IL-BOL-EXCH01.smartwire.com> Message-ID: <44D103AA-BE7C-4865-955F-652342B09D81@live555.com> OK, I see what you're saying now. Yes, you're correct: There is a problem if the RTCP "BYE" handler - called from within "RTCPInstance::incomingReportHandler1()" - happens to delete the "RTCPInstance" object, because we'll continue to execute the rest of the "incomingReportHandler1()" member function afterwards. Because having the "BYE" handler routine delete the "RTCPInstance" object is something that will typically happen (e.g., it happens in both "openRTSP" and in your application), this is a bug in the LIVE555 code. The fix will be to delay the call to the "BYE" handler routine until the very end of the "incomingReportHandler1()" member function. Very shortly, I'll be releasing a new version of the LIVE555 code that fixes this. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From jshanab at smartwire.com Sat Nov 19 07:37:56 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Sat, 19 Nov 2011 15:37:56 +0000 Subject: [Live-devel] Access violation crash in rtspclient In-Reply-To: <615FD77639372542BF647F5EBAA2DBC20B18BD00@IL-BOL-EXCH01.smartwire.com> References: <615FD77639372542BF647F5EBAA2DBC20B18AD48@IL-BOL-EXCH01.smartwire.com> <2785FF9F-E209-4A68-B2A8-81C24E2DE3B3@live555.com> <615FD77639372542BF647F5EBAA2DBC20B18AF2C@IL-BOL-EXCH01.smartwire.com> <873E8673-32F0-48A2-850E-AA9850B611E3@live555.com> <615FD77639372542BF647F5EBAA2DBC20B18B8F4@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B18BABB@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B18BD00@IL-BOL-EXCH01.smartwire.com> Message-ID: <615FD77639372542BF647F5EBAA2DBC20B18BD3D@IL-BOL-EXCH01.smartwire.com> 2 adjustments to the library allow me to tiptoe thru the call stack avoiding accessing destroyed or already deleted members and unwind this stack. I do not suggest these at all, they are a hack, but it confirms for me what is happening with the call stack. Add an exit label with a dummy command at the end of the while loop beyond the onReceive call to avoid it. First modify RTCP.cpp .... onReceive(typeOfPacket, totPacketSize, reportSenderSSRC); exit: int dummy = 0; } while (0); } Then after calling the bye handler jump to it (*byeHandler)(fByeHandlerClientData); goto exit; } This gest me almost all the way out of the stack unwind. But when I return to RTSPClient::playMediaSession after the call to the event loop there is a "delete[] fResultString" It has already been deleted and causes a access violation so commenting it out gets me thru, returning false which allows the application to continue unscathed. In GCC, or Microsoft release mode this may not show up as they zero out memory for you, but in debug mode a special non-null value is placed in the heap to signify bad pointer. Either way we cannot depend on it because it is not part of c++. From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Jeff Shanab Sent: Saturday, November 19, 2011 8:51 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Access violation crash in rtspclient I know that 99% of the time it is 'adjustments to' and 'missuse' that are the problems people have with using live555, but unless they changed the way CPU's work, we do have an issue. (This issue is avoided in openRTSP by just exiting the process before the access violation could hit.) It boils down to calling the destructor of a class from a member in the class puts us in undefined territory. I call your attention to RTCP.cpp from 11/08/2011 In the switch statement in incomingReportHandler1 at line 511, we handle the case of an incoming "BYE" message. On line 525 we save the program counter and create a new stack frame to handle the jump long to the bye handler function. As you say and as the openRTSP does Medium::close is called and this calls the DOTR on the very instance we are calling from. On completion of this call the program counter is restored and execution resumes at line 526. The program counter is still valid and so is the CODE segment in memory, only the DATA segment holding the instance was erased in the dtor. The break on 532 exits the switch taking us to line 542. It passes all test and gets to 583 containing valid stack data and there is nothing to stop it from calling onReceive. 
In onReceive, line 593, it tries to access the memory that was returned to the OS during the dtor and an access violation attempting to read that memory is thrown. For my use case of the live555 libraries, I cannot take the process approach. It would require a re-arching of an existing project and the result would not work well cross platform. I have upto hundreds if not thousands of connected sources and as many, plus another few hundred, sinks. I have status pages that show status of these streams and notifications on some if they fail. Even then with the dynamic many to many model in this app, I am not sure it would work to have a process for every source with n connections. From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Saturday, November 19, 2011 2:11 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Access violation crash in rtspclient The bye handler calls code resulting in Medium::Close. This ends up calling the destructor on the RTCP class and deletes the fKnowMembers. Yes. The OnReceive is then attempted to be called No, that shouldn't be happening, because the "RTCPInstance" destructor called "stopNetworkReading()", which stops any further handling of incoming RTCP packets. I think you're on a 'wild goose chase' here. Because the problem in your application seems to be caused by your 'BYE handler' routine, then why don't you tell us what that routine is doing (or trying to do)? Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jshanab at smartwire.com Sat Nov 19 09:11:06 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Sat, 19 Nov 2011 17:11:06 +0000 Subject: [Live-devel] Windows builds Message-ID: <615FD77639372542BF647F5EBAA2DBC20B18BE00@IL-BOL-EXCH01.smartwire.com> I know there has in the past been requests for windows builds but windows changes their and breaks their own build system so much, it is not worth it. However, I use the FireBreath plugin framework (http://www.firebreath.org/display/documentation/FireBreath+Home) and I really like the way CMAKE works. He has a prep script prepmac, prepmake, prepeclipse, prep2008,prep2010....and so on and CMAKE generates the entire build directory. One make fiel definition that is very similar to traditional make systems and any number of a large group of targets. It has proven ideal for this crossplatform and even just cross Visual Studio versioning. The prep scripts for the *nix side are trivial =====prepmake.sh============= #!/bin/bash if [ "${GEN}" = "" ]; then GEN='Unix Makefiles' fi source ${0%/*}/common.sh "$@" pushd "$BUILDDIR" cmake -G "$GEN" -DFB_PROJECTS_DIR="${PROJDIR}" "$@" "${FB_ROOT}" pop =========prep2010.cmd============ @echo off & setlocal enableextensions enabledelayedexpansion set _FB_GEN="Visual Studio 10" call "%~d0%~p0\common.cmd" %* if %errorlevel% == 2 exit /b 1 call "%~d0%~p0\winprep.cmd Wouldn't such a system be great for live555? (Taxillian on freenode->firebreath is the author ) -------------- next part -------------- An HTML attachment was scrubbed... 
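Returning to the H.264-over-a-pipe question above: because a FIFO is opened exactly like a file, the stock on-demand server needs only the input name changed and reuseFirstSource set to True. A minimal sketch along the lines of testProgs/testOnDemandRTSPServer.cpp follows; the FIFO path /tmp/h264fifo and the stream name "camera" are made-up examples, and no library code needs to be modified.

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"

int main() {
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  RTSPServer* rtspServer = RTSPServer::createNew(*env, 8554);
  if (rtspServer == NULL) {
    *env << "Failed to create RTSP server: " << env->getResultMsg() << "\n";
    return 1;
  }

  // Because every client must share the single live source (the pipe),
  // reuseFirstSource must be True.
  Boolean const reuseFirstSource = True;

  ServerMediaSession* sms = ServerMediaSession::createNew(
      *env, "camera", "camera", "Live H.264 from the encoder");
  sms->addSubsession(H264VideoFileServerMediaSubsession::createNew(
      *env, "/tmp/h264fifo", reuseFirstSource));  // the FIFO, opened like a file
  rtspServer->addServerMediaSession(sms);

  *env << "Play this stream using the URL: rtsp://<camera-ip>:8554/camera\n";

  env->taskScheduler().doEventLoop();  // does not return
  return 0;
}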
URL: From david.myers at panogenics.com Sat Nov 19 10:01:33 2011 From: david.myers at panogenics.com (David J Myers) Date: Sat, 19 Nov 2011 18:01:33 -0000 Subject: [Live-devel] H.264 live video streaming In-Reply-To: <001201cca6e3$5dc684e0$19538ea0$@myers@panogenics.com> References: <001201cca6e3$5dc684e0$19538ea0$@myers@panogenics.com> Message-ID: <001301cca6e5$47782700$d6687500$@myers@panogenics.com> Reposted with less trace data... Hi Ross, Many thanks for the great support. So, I've started to use the H264VideoStreamer class directly as you suggest and I get much further. However at a certain point after a number of successful sendRTPOverTCP calls, subsequent calls seem to fail and then never recover. Here's the Live555 debug trace, sorry it's so long. Thanks again accept()ed connection from 10.26.7.96 Liveness indication from client at 10.26.7.96 Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 125 new bytes:OPTIONS rtsp://10.26.7.53:8554/stream1 RTSP/1.0 CSeq: 2 User-Agent: LibVLC/1.1.11 (LIVE555 Streaming Media v2011.05.25) parseRTSPRequestString() succeeded, returning cmdName "OPTIONS", urlPreSuffix "", urlSuffix "stream1", CSeq "2", Content-Length 0 sending response: RTSP/1.0 200 OK CSeq: 2 Date: Sat, Nov 19 2011 17:25:00 GMT Public: OPTIONS, DESCRIBE, SETUP, TEARDOWN, PLAY, PAUSE, GET_PARAMETER, SET_PARAMETER Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 151 new bytes:DESCRIBE rtsp://10.26.7.53:8554/stream1 RTSP/1.0 CSeq: 3 User-Agent: LibVLC/1.1.11 (LIVE555 Streaming Media v2011.05.25) Accept: application/sdp parseRTSPRequestString() succeeded, returning cmdName "DESCRIBE", urlPreSuffix "", urlSuffix "stream1", CSeq "3", Content-Length 0 sending response: RTSP/1.0 401 Unauthorized CSeq: 3 Date: Sat, Nov 19 2011 17:25:00 GMT WWW-Authenticate: Digest realm="LIVE555 Streaming Media", nonce="d2fa67191f913f704cf96039a83117d5" Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 349 new bytes:DESCRIBE rtsp://10.26.7.53:8554/stream1 RTSP/1.0 CSeq: 4 Authorization: Digest username="Admin", realm="LIVE555 Streaming Media", nonce="d2fa67191f913f704cf96039a83117d5", uri="rtsp://10.26.7.53:8554/stream1", response="b8d8668ce69d7dd84d08563a83f3e9ca" User-Agent: LibVLC/1.1.11 (LIVE555 Streaming Media v2011.05.25) Accept: application/sdp parseRTSPRequestString() succeeded, returning cmdName "DESCRIBE", urlPreSuffix "", urlSuffix "stream1", CSeq "4", Content-Length 0 lookupPassword(Admin) returned password Admin Unable to determine our source address: This computer has an invalid IP address: 0x0 H264VideoStreamParser::parse() EXCEPTION (This is normal behavior - *not* an error) Parsed 12-byte NAL-unit (nal_ref_idc: 1, nal_unit_type: 7 ("Sequence parameter set")) profile_idc: 66 constraint_setN_flag: 0 level_idc: 50 seq_parameter_set_id: 0 log2_max_frame_num_minus4: 10 pic_order_cnt_type: 2 max_num_ref_frames: 1 gaps_in_frame_num_value_allowed_flag: 0 pic_width_in_mbs_minus1: 39 pic_height_in_map_units_minus1: 29 frame_mbs_only_flag: 1 frame_cropping_flag: 0 vui_parameters_present_flag: 1 BEGIN vui_parameters aspect_ratio_info_present_flag: 1 aspect_ratio_idc: 2 overscan_info_present_flag: 0 video_signal_type_present_flag: 1 colour_description_present_flag: 0 chroma_loc_info_present_flag: 0 timing_info_present_flag: 0 This "Sequence Parameter Set" NAL unit contained no frame rate information, so we use a default frame rate of 25.000000 
fps Presentation time: 1321723500.943206 12 bytes @1321723500.943206, fDurationInMicroseconds: 0 ((0*1000000)/25.000000) Parsed 4-byte NAL-unit (nal_ref_idc: 1, nal_unit_type: 8 ("Picture parameter set")) Presentation time: 1321723500.943206 4 bytes @1321723500.943206, fDurationInMicroseconds: 0 ((0*1000000)/25.000000) H264VideoStreamParser::parse() EXCEPTION (This is normal behavior - *not* an error) Parsed 122203-byte NAL-unit (nal_ref_idc: 1, nal_unit_type: 5 ("Coded slice of an IDR picture")) Presentation time: 1321723500.943206 (IdrPicFlag differs in value) *****This NAL unit ends the current access unit***** 122203 bytes @1321723500.943206, fDurationInMicroseconds: 40000 ((1*1000000)/25.000000) sending response: RTSP/1.0 200 OK CSeq: 4 Date: Sat, Nov 19 2011 17:25:01 GMT Content-Base: rtsp://10.26.7.53:8554/stream1/ Content-Type: application/sdp Content-Length: 418 v=0 o=- 1321723458779236 1 IN IP4 0.0.0.0 s=H.264 video i=PanoCam t=0 0 a=tool:LIVE555 Streaming Media v2011.11.08 a=type:broadcast a=control:* a=range:npt=0- a=x-qt-text-nam:H.264 video a=x-qt-text-inf:PanoCam m=video 0 RTP/AVP 96 c=IN IP4 0.0.0.0 b=AS:2000 a=rtpmap:96 H264/90000 a=fmtp:96 packetization-mode=1;profile-level-id=420032;sprop-parameter-sets=J0IAMotoCg PTAkgE,KM48gA== a=control:track1 Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 377 new bytes:SETUP rtsp://10.26.7.53:8554/stream1/track1 RTSP/1.0 CSeq: 5 Authorization: Digest username="Admin", realm="LIVE555 Streaming Media", nonce="d2fa67191f913f704cf96039a83117d5", uri="rtsp://10.26.7.53:8554/stream1/", response="6b4af4759579c26321f93e881a4912de" User-Agent: LibVLC/1.1.11 (LIVE555 Streaming Media v2011.05.25) Transport: RTP/AVP/TCP;unicast;interleaved=0-1 parseRTSPRequestString() succeeded, returning cmdName "SETUP", urlPreSuffix "stream1", urlSuffix "track1", CSeq "5", Content-Length 0 Unable to determine our source address: This computer has an invalid IP address: 0x0 Unable to determine our source address: This computer has an invalid IP address: 0x0 sending response: RTSP/1.0 200 OK CSeq: 5 Date: Sat, Nov 19 2011 17:25:01 GMT Transport: RTP/AVP/TCP;unicast;destination=10.26.7.96;source=10.26.7.53;interleaved=0-1 Session: 320A13EA Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 360 new bytes:PLAY rtsp://10.26.7.53:8554/stream1/ RTSP/1.0 CSeq: 6 Authorization: Digest username="Admin", realm="LIVE555 Streaming Media", nonce="d2fa67191f913f704cf96039a83117d5", uri="rtsp://10.26.7.53:8554/stream1/", response="148e0c525e1b4d85716a81b7b6299f07" User-Agent: LibVLC/1.1.11 (LIVE555 Streaming Media v2011.05.25) Session: 320A13EA Range: npt=0.000- parseRTSPRequestString() succeeded, returning cmdName "PLAY", urlPreSuffix "stream1", urlSuffix "", CSeq "6", Content-Length 0 RTCPInstance[0x5674f0c8]::RTCPInstance() schedule(2.473867->1321723503.719488) H264VideoStreamParser::parse() EXCEPTION (This is normal behavior - *not* an error) sending response: RTSP/1.0 200 OK CSeq: 6 Date: Sat, Nov 19 2011 17:25:01 GMT Range: npt=0.000- Session: 320A13EA RTP-Info: url=rtsp://10.26.7.53:8554/stream1/track1;seq=19460;rtptime=878953738 H264VideoStreamParser::parse() EXCEPTION (This is normal behavior - *not* an error) Parsed 129734-byte NAL-unit (nal_ref_idc: 1, nal_unit_type: 1 ("Coded slice of a non-IDR picture")) Presentation time: 1321723501.214402 (The next NAL unit is not a VCL) *****This NAL unit ends the current access unit***** 129734 bytes @1321723501.214402, 
fDurationInMicroseconds: 40000 ((1*1000000)/25.000000) sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed . . . sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes:G sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes:E sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes:T sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes:_ sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes:P sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes:A sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes:R sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes:A sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes:M sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes:E sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes:T sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes:E sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes:R sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed Liveness indication from client at 10.26.7.96 
RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes: sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes:r sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes:t sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes:s sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes:p sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes:: sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes:/ sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes:/ sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes:1 sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes:0 sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes:. 
sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes:2 sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes:6 sendRTPOverTCP: 687 bytes over channel 0 (socket 19) sendRTPOverTCP: completed Parsed 12-byte NAL-unit (nal_ref_idc: 1, nal_unit_type: 7 ("Sequence parameter set")) profile_idc: 66 constraint_setN_flag: 0 level_idc: 50 seq_parameter_set_id: 0 log2_max_frame_num_minus4: 10 pic_order_cnt_type: 2 max_num_ref_frames: 1 gaps_in_frame_num_value_allowed_flag: 0 pic_width_in_mbs_minus1: 39 pic_height_in_map_units_minus1: 29 frame_mbs_only_flag: 1 frame_cropping_flag: 0 vui_parameters_present_flag: 1 BEGIN vui_parameters aspect_ratio_info_present_flag: 1 aspect_ratio_idc: 2 overscan_info_present_flag: 0 video_signal_type_present_flag: 1 colour_description_present_flag: 0 chroma_loc_info_present_flag: 0 timing_info_present_flag: 0 This "Sequence Parameter Set" NAL unit contained no frame rate information, so we use a default frame rate of 25.000000 fps Presentation time: 1321723501.254402 12 bytes @1321723501.254402, fDurationInMicroseconds: 0 ((0*1000000)/25.000000) sendRTPOverTCP: 24 bytes over channel 0 (socket 19) sendRTPOverTCP: completed Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes:. Parsed 5-byte NAL-unit (nal_ref_idc: 1, nal_unit_type: 8 ("Picture parameter set")) Presentation time: 1321723501.254402 5 bytes @1321723501.254402, fDurationInMicroseconds: 0 ((0*1000000)/25.000000) sendRTPOverTCP: 17 bytes over channel 0 (socket 19) sendRTPOverTCP: completed Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes:7 H264VideoStreamParser::parse() EXCEPTION (This is normal behavior - *not* an error) Parsed 20233-byte NAL-unit (nal_ref_idc: 1, nal_unit_type: 5 ("Coded slice of an IDR picture")) Presentation time: 1321723501.254402 (IdrPicFlag differs in value) *****This NAL unit ends the current access unit***** 20233 bytes @1321723501.254402, fDurationInMicroseconds: 40000 ((1*1000000)/25.000000) sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes:. 
sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes:5 sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes:3 Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes:: sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes:8 sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes:5 sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes:5 Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes:4 sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes:/ sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes:s sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes:t Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes:r sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes:e sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes:a sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: completed sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: failed! Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes:m sendRTPOverTCP: 170 bytes over channel 0 (socket 19) sendRTPOverTCP: failed! H264VideoStreamParser::parse() EXCEPTION (This is normal behavior - *not* an error) Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes:1 Parsed 98178-byte NAL-unit (nal_ref_idc: 1, nal_unit_type: 1 ("Coded slice of a non-IDR picture")) Presentation time: 1321723501.294402 first_mb_in_slice: 0 slice_type: 0 pic_parameter_set_id: 0 frame_num: 1 Next NAL unit's slice_header: first_mb_in_slice: 0 slice_type: 0 pic_parameter_set_id: 0 frame_num: 2 (frame_num differs in value) *****This NAL unit ends the current access unit***** 98178 bytes @1321723501.294402, fDurationInMicroseconds: 40000 ((1*1000000)/25.000000) sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: failed! 
Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes:/ sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: failed! Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes: sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: failed! Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes:R sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: failed! Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes:T sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: failed! sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: failed! Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes:S sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: failed! Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes:P sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: failed! sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: failed! Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes:/ sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: failed! Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes:1 sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: failed! sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: failed! Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes:. sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: failed! sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: failed! Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes:0 sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: failed! Liveness indication from client at 10.26.7.96 RTSPClientSession[0x56700490]::handleRequestBytes() read 1 new bytes: sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: failed! sendRTPOverTCP: 1448 bytes over channel 0 (socket 19) sendRTPOverTCP: failed! -------------- next part -------------- A non-text attachment was scrubbed... Name: winmail.dat Type: application/ms-tnef Size: 38254 bytes Desc: not available URL: From gabor.molnar at vidux.net Sat Nov 19 07:35:42 2011 From: gabor.molnar at vidux.net (Molnar Gabor) Date: Sat, 19 Nov 2011 16:35:42 +0100 Subject: [Live-devel] How can I bind live555MedisaServer to an IP? Message-ID: <1321716942.2798.0.camel@gmiller-laptop> Hello, I would like to use live555 streaming multiple streams from one box, but each socket must be listen to another IP. Can you tell me is there a solution yet, or how can I solve that in the easiest way? Best regards, Gabor Molnar from Hungary -------------- next part -------------- An HTML attachment was scrubbed... 
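As an aside on the interface question just asked: the groupsock library also exposes the globals SendingInterfaceAddr and ReceivingInterfaceAddr (declared in GroupsockHelper.hh in recent releases), which are commonly set before any sockets are created in order to pin them to one local address. Whether that is enough for a multi-IP box depends on the setup; the address below is purely illustrative, and the declarations should be checked against your copy of the headers.

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"
#include "GroupsockHelper.hh"  // assumed to declare the interface-address globals

int main() {
  // Pin the library's sockets to one local interface (illustrative address).
  ReceivingInterfaceAddr = our_inet_addr("192.168.1.10");
  SendingInterfaceAddr   = our_inet_addr("192.168.1.10");

  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  // ... create the RTSPServer and ServerMediaSessions exactly as usual; a
  // second server process, started with the other interface's address, could
  // serve the streams that must listen on the other IP ...
  RTSPServer* rtspServer = RTSPServer::createNew(*env, 8554);
  if (rtspServer == NULL) return 1;

  env->taskScheduler().doEventLoop();
  return 0;
}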
URL: From finlayson at live555.com Sat Nov 19 12:51:22 2011 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 19 Nov 2011 12:51:22 -0800 Subject: [Live-devel] Access violation crash in rtspclient In-Reply-To: <615FD77639372542BF647F5EBAA2DBC20B18BD3D@IL-BOL-EXCH01.smartwire.com> References: <615FD77639372542BF647F5EBAA2DBC20B18AD48@IL-BOL-EXCH01.smartwire.com> <2785FF9F-E209-4A68-B2A8-81C24E2DE3B3@live555.com> <615FD77639372542BF647F5EBAA2DBC20B18AF2C@IL-BOL-EXCH01.smartwire.com> <873E8673-32F0-48A2-850E-AA9850B611E3@live555.com> <615FD77639372542BF647F5EBAA2DBC20B18B8F4@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B18BABB@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B18BD00@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B18BD3D@IL-BOL-EXCH01.smartwire.com> Message-ID: > 2 adjustments to the library allow me to tiptoe thru the call stack avoiding accessing destroyed or already deleted members and unwind this stack. I'll be doing something similar to your first fix (but it most certainly will not be using a "goto" :-) > This gest me almost all the way out of the stack unwind. But when I return to RTSPClient::playMediaSession after the call to the event loop there is a ?delete[] fResultString? OK, that's your fault. You're using the synchronous "RTSPClient" interface - which is OK - but you're having your "BYE" handler callback routine delete your "RTSPClient" object while you're in the middle of a synchronous operation. That's bad (on your part). (It's weird that you're getting a RTCP "BYE" before you get a response to your RTSP "PLAY" command, but I suppose that's possible...) What you'll need to do, then, is have your "BYE" handler routine set a Boolean flag (but not actually delete anything then), and have your code test for this flag after it returns from "playMediaSession()". Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jshanab at smartwire.com Sat Nov 19 13:34:29 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Sat, 19 Nov 2011 21:34:29 +0000 Subject: [Live-devel] Access violation crash in rtspclient In-Reply-To: References: <615FD77639372542BF647F5EBAA2DBC20B18AD48@IL-BOL-EXCH01.smartwire.com> <2785FF9F-E209-4A68-B2A8-81C24E2DE3B3@live555.com> <615FD77639372542BF647F5EBAA2DBC20B18AF2C@IL-BOL-EXCH01.smartwire.com> <873E8673-32F0-48A2-850E-AA9850B611E3@live555.com> <615FD77639372542BF647F5EBAA2DBC20B18B8F4@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B18BABB@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B18BD00@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B18BD3D@IL-BOL-EXCH01.smartwire.com> Message-ID: <615FD77639372542BF647F5EBAA2DBC20B18BF1A@IL-BOL-EXCH01.smartwire.com> From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Saturday, November 19, 2011 2:51 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Access violation crash in rtspclient 2 adjustments to the library allow me to tiptoe thru the call stack avoiding accessing destroyed or already deleted members and unwind this stack. I'll be doing something similar to your first fix (but it most certainly will not be using a "goto" :-) Yeah it was a "hack" I looked at the code and it seems we may need to de-couple destruction from stopping, maybe a flag. Let the objects natural destruction occur when the last thing exits. 
This gest me almost all the way out of the stack unwind. But when I return to RTSPClient::playMediaSession after the call to the event loop there is a "delete[] fResultString" OK, that's your fault. You're using the synchronous "RTSPClient" interface - which is OK - but you're having your "BYE" handler callback routine delete your "RTSPClient" object while you're in the middle of a synchronous operation. That's bad (on your part). (It's weird that you're getting a RTCP "BYE" before you get a response to your RTSP "PLAY" command, but I suppose that's possible...) What can I say, legacy code. Gotta refactor when there is time. This was discovered a week before a deployment. Here is how I get the BYE request. The hardware is a 4 channel h264 encoder that takes analog camera inputs. This encoder has what I consider to be a great feature over the other encoder we have. If there is no video input, instead of compressing a fancy frame that says "no video" over and over again, wasting disk space, It times out after 30 seconds and sends the BYE Request. It goes thru the entire conversation up to that point. It probably should respond negatively to the PLAY request. If that would fit the standard better, I can have the firmware adjusted (but we need to handle these rouge servers robustly at least) It was just when I unplugged a video camera's power supply by accident that the client on my end crashed. Mild panic ensued and almost ruined my whole weekend. :) What you'll need to do, then, is have your "BYE" handler routine set a Boolean flag (but not actually delete anything then), and have your code test for this flag after it returns from "playMediaSession()". Just curious, When/if is that next release scheduled? Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sat Nov 19 22:17:43 2011 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 19 Nov 2011 22:17:43 -0800 Subject: [Live-devel] H.264 live video streaming In-Reply-To: <001301cca6e5$47782700$d6687500$@myers@panogenics.com> References: <001201cca6e3$5dc684e0$19538ea0$@myers@panogenics.com> <001301cca6e5$47782700$d6687500$@myers@panogenics.com> Message-ID: <33D87B5E-4BBE-478D-A3EF-4FAAA60F2BB6@live555.com> > So, I've started to use the H264VideoStreamer class directly as you suggest > and I get much further. However at a certain point after a number of > successful sendRTPOverTCP calls, subsequent calls seem to fail and then > never recover. Your problem here is basically that you are trying to 'cram 6 pounds into a 5-pound sack'. I.e., you are trying to send data over a TCP connection at a faster rate than the TCP connection is capable of delivering. Each network connection has a certain bandwidth limit (obviously). if you try to stream data over this connection at a faster rate, then: - If you're streaming over UDP (the ideal thing to be doing), then some packets will get lost in the network. - If you're streaming over TCP (the 'less ideal' thing to be doing), then data will start buffering up in the operating system at the sending end, and then eventually (when the sending OS's buffer fills up), sends will fail. This is what you are seeing. The important thing to understand is that there is *nothing* that you can do about this, other than to reduce the bitrate of your encoded data - which is something that you have to do at the encoder. Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sat Nov 19 22:22:04 2011 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 19 Nov 2011 22:22:04 -0800 Subject: [Live-devel] How can I bind live555MedisaServer to an IP? In-Reply-To: <1321716942.2798.0.camel@gmiller-laptop> References: <1321716942.2798.0.camel@gmiller-laptop> Message-ID: <1B763633-6F58-42DA-BF61-6ABA775541E8@live555.com> > I would like to use live555 streaming multiple streams from one box, but each socket must be listen to another IP. > Can you tell me is there a solution yet, or how can I solve that in the easiest way? Yes, the easiest way to solve this is to make sure that multicast routing is enabled on the interface that you want (and not the other one). I.e., make sure that the interface that you want to use has a route for 224.0.0.0/4 Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sun Nov 20 03:03:59 2011 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 20 Nov 2011 03:03:59 -0800 Subject: [Live-devel] Access violation crash in rtspclient In-Reply-To: <615FD77639372542BF647F5EBAA2DBC20B18BD3D@IL-BOL-EXCH01.smartwire.com> References: <615FD77639372542BF647F5EBAA2DBC20B18AD48@IL-BOL-EXCH01.smartwire.com> <2785FF9F-E209-4A68-B2A8-81C24E2DE3B3@live555.com> <615FD77639372542BF647F5EBAA2DBC20B18AF2C@IL-BOL-EXCH01.smartwire.com> <873E8673-32F0-48A2-850E-AA9850B611E3@live555.com> <615FD77639372542BF647F5EBAA2DBC20B18B8F4@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B18BABB@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B18BD00@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B18BD3D@IL-BOL-EXCH01.smartwire.com> Message-ID: <87B4613F-E22E-4E3A-B1D9-04C30AD39C32@live555.com> FYI, I have now released a new version (2011.11.20) of the "LIVE555 Streaming Media" code that should fix this problem. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From david.myers at panogenics.com Sun Nov 20 05:18:04 2011 From: david.myers at panogenics.com (David J Myers) Date: Sun, 20 Nov 2011 13:18:04 -0000 Subject: [Live-devel] H.264 live video streaming Message-ID: <000001cca786$d69643d0$83c2cb70$@myers@panogenics.com> HI Ross, Thanks again for your sterling advice. I finally got some video out to a VLC client. You were right, I had to throttle back the encoder to 500kb/s. This is not enough bandwidth for a decent image (640x480 @ 12fps) though. The funny thing is, once the stream is going, I can dynamically bump up the bandwidth to 2000kb/s or more and it stays up fine. Could it be that I need a delay of some sort to allow the stream to set up before I start sending encoded video data? Cheers, David -------------- next part -------------- An HTML attachment was scrubbed... 
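A generic illustration of the failure mode Ross describes (plain POSIX sockets, not LIVE555 code): once the kernel's TCP send buffer is full because data is being produced faster than the connection can drain it, a non-blocking send() stops accepting data, which is most likely what lies behind the run of "sendRTPOverTCP: failed!" lines in the trace above. A brief startup burst (such as a large IDR frame) can overflow the buffer even when the average bitrate would fit, which would also explain why the stream survives a later bitrate increase.

#include <sys/types.h>
#include <sys/socket.h>
#include <cerrno>
#include <cstddef>

// Not LIVE555 code: shows what a non-blocking TCP send does once the kernel
// send buffer has filled up.
ssize_t trySend(int sock, const void* data, size_t len) {
  ssize_t n = send(sock, data, len, 0);
  if (n < 0 && (errno == EWOULDBLOCK || errno == EAGAIN)) {
    // The connection cannot absorb any more right now. Waiting may help with
    // a transient burst, but a sustained overrun can only be fixed by
    // lowering the encoder bitrate so the stream fits the link.
    return 0;
  }
  return n;
}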
URL: From jshanab at smartwire.com Sun Nov 20 08:57:53 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Sun, 20 Nov 2011 16:57:53 +0000 Subject: [Live-devel] Access violation crash in rtspclient In-Reply-To: <87B4613F-E22E-4E3A-B1D9-04C30AD39C32@live555.com> References: <615FD77639372542BF647F5EBAA2DBC20B18AD48@IL-BOL-EXCH01.smartwire.com> <2785FF9F-E209-4A68-B2A8-81C24E2DE3B3@live555.com> <615FD77639372542BF647F5EBAA2DBC20B18AF2C@IL-BOL-EXCH01.smartwire.com> <873E8673-32F0-48A2-850E-AA9850B611E3@live555.com> <615FD77639372542BF647F5EBAA2DBC20B18B8F4@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B18BABB@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B18BD00@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B18BD3D@IL-BOL-EXCH01.smartwire.com> <87B4613F-E22E-4E3A-B1D9-04C30AD39C32@live555.com> Message-ID: <615FD77639372542BF647F5EBAA2DBC20B18C185@IL-BOL-EXCH01.smartwire.com> Excellent!. Thanks It built and ran first time. Now for the other issue, the one that is "my fault". I am not sure how to fix that issue. my shutdown() function calls medium::close(). This is where the first crash was noticed in onReceive that you fixed. But it also calls MediaLookupTable::remove and that calls delete medium so I still get to the crash of delete[] fResultString (unless I comment it out) Maybe I can if guard it using the same flag? I do have the ideal piece of equipment to test it. :) From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Sunday, November 20, 2011 5:04 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Access violation crash in rtspclient FYI, I have now released a new version (2011.11.20) of the "LIVE555 Streaming Media" code that should fix this problem. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sun Nov 20 15:13:23 2011 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 20 Nov 2011 15:13:23 -0800 Subject: [Live-devel] Access violation crash in rtspclient In-Reply-To: <615FD77639372542BF647F5EBAA2DBC20B18C185@IL-BOL-EXCH01.smartwire.com> References: <615FD77639372542BF647F5EBAA2DBC20B18AD48@IL-BOL-EXCH01.smartwire.com> <2785FF9F-E209-4A68-B2A8-81C24E2DE3B3@live555.com> <615FD77639372542BF647F5EBAA2DBC20B18AF2C@IL-BOL-EXCH01.smartwire.com> <873E8673-32F0-48A2-850E-AA9850B611E3@live555.com> <615FD77639372542BF647F5EBAA2DBC20B18B8F4@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B18BABB@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B18BD00@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B18BD3D@IL-BOL-EXCH01.smartwire.com> <87B4613F-E22E-4E3A-B1D9-04C30AD39C32@live555.com> <615FD77639372542BF647F5EBAA2DBC20B18C185@IL-BOL-EXCH01.smartwire.com> Message-ID: <08C9971F-BBDD-4CC1-B1C3-87C550D04588@live555.com> > It built and ran first time. Now for the other issue, the one that is ?my fault?. I am not sure how to fix that issue. my shutdown() function calls medium::close(). This is where the first crash was noticed in onReceive that you fixed. 
But it also calls MediaLookupTable::remove and that calls delete medium so I still get to the crash of delete[] fResultString (unless I comment it out) As I said before, you shouldn't be calling your shutdown() function directly from within your "BYE" handler, because of the possibility of the "BYE" handler being called while you're still in a synchronous "RTSPClient" operation. Instead, your "BYE" handler routine should just set a Boolean flag (but not call "shutdown()"). Then, keep testing this flag elsewhere (i.e., outside the synchronous "RTSPClient" calls), and only then - if the flag is True - call shutdown(). As always, you should be able to do this without making any changes to the supplied code. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From pbvelikov at gmail.com Fri Nov 18 00:50:55 2011 From: pbvelikov at gmail.com (Plamen) Date: Fri, 18 Nov 2011 08:50:55 +0000 (UTC) Subject: [Live-devel] Live555 in a Browser Plugin References: Message-ID: antonio nunziante writes: > > > Hi all,I built a widget in Qt that play a rtsp audio/video stream with Live555 and FFMPEG.It works and now I want to include it inside a Qt BrowserPlugin.My plugin at the moment is able to start the RTSP session (i can see packets start to arrive from the camera with WireShark) but an error occurs and plugin stops with an error.Could anyone say me anything about this kind of problem?Thank you-Antonio > > > > _______________________________________________ > live-devel mailing list > live-devel at ... > http://lists.live555.com/mailman/listinfo/live-devel > Hi Antonio, I am also trying to make a widget for Qt and to show video (RTSP stream) in Qt application. Could you help me and explain me how to do this? How to have the stream from OpenRTSP application and to show it on a widget (not to write a file)? Do I have to modificate the source code of OpenRTSP and how? Thank you in advance, From jshanab at smartwire.com Mon Nov 21 06:07:39 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Mon, 21 Nov 2011 14:07:39 +0000 Subject: [Live-devel] Live555 in a Browser Plugin In-Reply-To: References: Message-ID: <615FD77639372542BF647F5EBAA2DBC20B18C462@IL-BOL-EXCH01.smartwire.com> We are gonna need some more information. But having said that I have live555 in browser plugin (firebreath) I will say this. Use firefox, turn off moz crash reporter and find the setting to get it to run in one process. Then if you are in windows you can attach with Visual studio to firefox and put breakpoints in your code and catch a callback -----Original Message----- From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Plamen Sent: Friday, November 18, 2011 2:51 AM To: live-devel at ns.live555.com Subject: Re: [Live-devel] Live555 in a Browser Plugin antonio nunziante writes: > > > Hi all,I built a widget in Qt that play a rtsp audio/video stream with > Live555 and FFMPEG.It works and now I want to include it inside a Qt BrowserPlugin.My plugin at the moment is able to start the RTSP session (i can see packets start to arrive from the camera with WireShark) but an error occurs and plugin stops with an error.Could anyone say me anything about this kind of problem?Thank you-Antonio > > > > _______________________________________________ > live-devel mailing list > live-devel at ... 
> http://lists.live555.com/mailman/listinfo/live-devel > Hi Antonio, I am also trying to make a widget for Qt and to show video (RTSP stream) in Qt application. Could you help me and explain me how to do this? How to have the stream from OpenRTSP application and to show it on a widget (not to write a file)? Do I have to modificate the source code of OpenRTSP and how? Thank you in advance, _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel From jshanab at smartwire.com Mon Nov 21 11:41:37 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Mon, 21 Nov 2011 19:41:37 +0000 Subject: [Live-devel] WindowsAudioInputDevice_noMixer.hh and unicode Message-ID: <615FD77639372542BF647F5EBAA2DBC20B18C5C0@IL-BOL-EXCH01.smartwire.com> I updated to the newest version of live555 and the WindowsAudioInput folder was added. In one of my projects I cannot build because I get the error : error C2664: 'strncpy' : cannot convert parameter 2 from 'WCHAR [32]' to 'const char *' c:\Users\jshanab\archiver\src\common\RTSP\WindowsAudioInputDevice\WindowsAudioInputDevice_noMixer.cpp Obviously UNICODE must be defined for WAVEINCAPS to be set for wide. How do I get this to work with Unicode. I have seen suggestions online that I must use some MS specific function other than strncpy? -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Nov 21 14:08:58 2011 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 21 Nov 2011 14:08:58 -0800 Subject: [Live-devel] WindowsAudioInputDevice_noMixer.hh and unicode In-Reply-To: <615FD77639372542BF647F5EBAA2DBC20B18C5C0@IL-BOL-EXCH01.smartwire.com> References: <615FD77639372542BF647F5EBAA2DBC20B18C5C0@IL-BOL-EXCH01.smartwire.com> Message-ID: <90EE433F-9820-4D70-87C3-8520308E9326@live555.com> > I updated to the newest version of live555 and the WindowsAudioInput folder was added. It was added in July 2003! You've probably had it all along. (But you don't have to use it if you don't want to; i.e., if you don't want to read PCM audio input from a Windows sound card.) > In one of my projects I cannot build because I get the error : error C2664: 'strncpy' : cannot convert parameter 2 from 'WCHAR [32]' to 'const char *' c:\Users\jshanab\archiver\src\common\RTSP\WindowsAudioInputDevice\WindowsAudioInputDevice_noMixer.cpp > > > Obviously UNICODE must be defined for WAVEINCAPS to be set for wide. How do I get this to work with Unicode. I have seen suggestions online that I must use some MS specific function other than strncpy? If you find that this is the case, then please let us know, so we can update the code in future releases. (Because this code - unlike the rest of the code distributed with LIVE555 - is Windows-specific, it's OK to call Windows-specific libraries from it.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
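Purely as an illustration of the "#ifdef UNICODE" style of fix being discussed here (not the patch that actually went into the library, and not the pastebin workaround posted later in the thread), a sketch using the standard wcstombs() conversion; the helper name is made up.

#include <windows.h>
#include <mmsystem.h>
#include <cstring>
#include <cstdlib>

// Hypothetical helper: copy a wave-in device's product name into a narrow
// (char) buffer whether or not UNICODE is defined.
static void copyWaveInDeviceName(WAVEINCAPS const& caps, char* devName, size_t bufLen) {
#ifdef UNICODE
  // Under UNICODE, szPname is WCHAR[MAXPNAMELEN]; convert to narrow characters.
  wcstombs(devName, caps.szPname, bufLen);
#else
  // In an ANSI build, szPname is already char[MAXPNAMELEN].
  strncpy(devName, caps.szPname, bufLen);
#endif
  devName[bufLen - 1] = '\0';  // guarantee NUL termination in either branch
}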
URL: From jshanab at smartwire.com Mon Nov 21 14:28:35 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Mon, 21 Nov 2011 22:28:35 +0000 Subject: [Live-devel] WindowsAudioInputDevice_noMixer.hh and unicode In-Reply-To: <90EE433F-9820-4D70-87C3-8520308E9326@live555.com> References: <615FD77639372542BF647F5EBAA2DBC20B18C5C0@IL-BOL-EXCH01.smartwire.com> <90EE433F-9820-4D70-87C3-8520308E9326@live555.com> Message-ID: <615FD77639372542BF647F5EBAA2DBC20B18C6AC@IL-BOL-EXCH01.smartwire.com> What happened, I think, is that one of the files now has an include that cascaded into this one. I had updated last March, so I wasn't that far behind. There appear to be "standard" methods also. Here is what I did to get by until someone in the know can fix it: http://pastebin.com/bim5wnFg I just #ifdef'd on the UNICODE define. I have not tested this on a non-Unicode machine or a Linux box yet; it came from just web searching. From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Monday, November 21, 2011 4:09 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] WindowsAudioInputDevice_noMixer.hh and unicode I updated to the newest version of live555 and the WindowsAudioInput folder was added. It was added in July 2003! You've probably had it all along. (But you don't have to use it if you don't want to; i.e., if you don't want to read PCM audio input from a Windows sound card.) In one of my projects I cannot build because I get the error : error C2664: 'strncpy' : cannot convert parameter 2 from 'WCHAR [32]' to 'const char *' c:\Users\jshanab\archiver\src\common\RTSP\WindowsAudioInputDevice\WindowsAudioInputDevice_noMixer.cpp Obviously UNICODE must be defined for WAVEINCAPS to be set for wide. How do I get this to work with Unicode. I have seen suggestions online that I must use some MS specific function other than strncpy? If you find that this is the case, then please let us know, so we can update the code in future releases. (Because this code - unlike the rest of the code distributed with LIVE555 - is Windows-specific, it's OK to call Windows-specific libraries from it.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From chris at gotowti.com Mon Nov 21 14:47:59 2011 From: chris at gotowti.com (Chris Richardson (WTI)) Date: Mon, 21 Nov 2011 14:47:59 -0800 Subject: [Live-devel] WindowsAudioInputDevice_noMixer.hh and unicode In-Reply-To: <615FD77639372542BF647F5EBAA2DBC20B18C5C0@IL-BOL-EXCH01.smartwire.com> References: <615FD77639372542BF647F5EBAA2DBC20B18C5C0@IL-BOL-EXCH01.smartwire.com> Message-ID: <00cb01cca89f$9f43aa20$ddcafe60$@com> Hi Jeff, On Windows, when you want to build both Unicode and non-Unicode, the easiest way is to use the Microsoft-specific TCHAR functions. If you look up a particular function on MSDN you can find the TCHAR routine name. In this case it will be _tcsncpy. http://msdn.microsoft.com/en-us/library/xdsywd25(v=VS.100).aspx Chris Richardson WTI From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Jeff Shanab Sent: Monday, November 21, 2011 11:42 AM To: LIVE555 Streaming Media - development & use (live-devel at ns.live555.com) Subject: [Live-devel] WindowsAudioInputDevice_noMixer.hh and unicode I updated to the newest version of live555 and the WindowsAudioInput folder was added.
In one of my projects I cannot build because I get the error : error C2664: 'strncpy' : cannot convert parameter 2 from 'WCHAR [32]' to 'const char *' c:\Users\jshanab\archiver\src\common\RTSP\WindowsAudioInputDevice\WindowsAud ioInputDevice_noMixer.cpp Obviously UNICODE must be defined for WAVEINCAPS to be set for wide. How do I get this to work with Unicode. I have seen suggestions online that I must use some MS specific function other than strncpy? -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Nov 21 15:04:50 2011 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 21 Nov 2011 15:04:50 -0800 Subject: [Live-devel] WindowsAudioInputDevice_noMixer.hh and unicode In-Reply-To: <615FD77639372542BF647F5EBAA2DBC20B18C6AC@IL-BOL-EXCH01.smartwire.com> References: <615FD77639372542BF647F5EBAA2DBC20B18C5C0@IL-BOL-EXCH01.smartwire.com> <90EE433F-9820-4D70-87C3-8520308E9326@live555.com> <615FD77639372542BF647F5EBAA2DBC20B18C6AC@IL-BOL-EXCH01.smartwire.com> Message-ID: <0D1377F7-9715-4974-9F75-2E9A3A58E13F@live555.com> > What happened was one of the files now has an include and that cascaded to it I think. I had updated last march, so I wasn?t that far behind. Well, nothing in the "WindowsAudioInputDevice" directory has changed in years. You should not be building (or including) this stuff at all, if you don't need it. > I just ifdef?d look for the string UNICODE > > I did not test this on a non-unicode machine or linux box yet. (It's Windows-only code, so it's not for Linux or any other sane OS :-) Because this code is so old, it's possible that it now breaks with more modern Windows systems, but I'm hoping that your "#ifdef UNICODE" fixes will be enough. Unless I hear otherwise, I'll include these changes in the next release of the software. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jshanab at smartwire.com Mon Nov 21 15:09:51 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Mon, 21 Nov 2011 23:09:51 +0000 Subject: [Live-devel] WindowsAudioInputDevice_noMixer.hh and unicode In-Reply-To: <00cb01cca89f$9f43aa20$ddcafe60$@com> References: <615FD77639372542BF647F5EBAA2DBC20B18C5C0@IL-BOL-EXCH01.smartwire.com> <00cb01cca89f$9f43aa20$ddcafe60$@com> Message-ID: <615FD77639372542BF647F5EBAA2DBC20B18C6F4@IL-BOL-EXCH01.smartwire.com> I am building cross platform from one src repo. TCHAR to be avoided at all cost. :) From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Chris Richardson (WTI) Sent: Monday, November 21, 2011 4:48 PM To: 'LIVE555 Streaming Media - development & use' Subject: Re: [Live-devel] WindowsAudioInputDevice_noMixer.hh and unicode Hi Jeff, On Windows, when you want to build both Unicode and non-Unicode, the easiest way is to use the Microsoft-specific TCHAR functions. If you look up a particular function on MSDN you can find the TCHAR routine name. In this case it will be _tcsncpy. http://msdn.microsoft.com/en-us/library/xdsywd25(v=VS.100).aspx Chris Richardson WTI From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Jeff Shanab Sent: Monday, November 21, 2011 11:42 AM To: LIVE555 Streaming Media - development & use (live-devel at ns.live555.com) Subject: [Live-devel] WindowsAudioInputDevice_noMixer.hh and unicode I updated to the newest version of live555 and the WindowsAudioInput folder was added. 
In one of my projects I cannot build because I get the error : error C2664: 'strncpy' : cannot convert parameter 2 from 'WCHAR [32]' to 'const char *' c:\Users\jshanab\archiver\src\common\RTSP\WindowsAudioInputDevice\WindowsAudioInputDevice_noMixer.cpp Obviously UNICODE must be defined for WAVEINCAPS to be set for wide. How do I get this to work with Unicode. I have seen suggestions online that I must use some MS specific function other than strncpy? -------------- next part -------------- An HTML attachment was scrubbed... URL: From chris at gotowti.com Mon Nov 21 16:09:47 2011 From: chris at gotowti.com (Chris Richardson (WTI)) Date: Mon, 21 Nov 2011 16:09:47 -0800 Subject: [Live-devel] WindowsAudioInputDevice_noMixer.hh and unicode In-Reply-To: <615FD77639372542BF647F5EBAA2DBC20B18C6F4@IL-BOL-EXCH01.smartwire.com> References: <615FD77639372542BF647F5EBAA2DBC20B18C5C0@IL-BOL-EXCH01.smartwire.com> <00cb01cca89f$9f43aa20$ddcafe60$@com> <615FD77639372542BF647F5EBAA2DBC20B18C6F4@IL-BOL-EXCH01.smartwire.com> Message-ID: <00e201cca8ab$0bd78750$238695f0$@com> I also build cross-platform from one repo and understand the issues. However it is my understanding that the code you are modifying is calling mixerGetDevCaps, which is a Win32 function. Also, as Ross has already said, this code is all Windows only. Chris Richardson WTI From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Jeff Shanab Sent: Monday, November 21, 2011 3:10 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] WindowsAudioInputDevice_noMixer.hh and unicode I am building cross platform from one src repo. TCHAR to be avoided at all cost. J From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Chris Richardson (WTI) Sent: Monday, November 21, 2011 4:48 PM To: 'LIVE555 Streaming Media - development & use' Subject: Re: [Live-devel] WindowsAudioInputDevice_noMixer.hh and unicode Hi Jeff, On Windows, when you want to build both Unicode and non-Unicode, the easiest way is to use the Microsoft-specific TCHAR functions. If you look up a particular function on MSDN you can find the TCHAR routine name. In this case it will be _tcsncpy. http://msdn.microsoft.com/en-us/library/xdsywd25(v=VS.100).aspx Chris Richardson WTI From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Jeff Shanab Sent: Monday, November 21, 2011 11:42 AM To: LIVE555 Streaming Media - development & use (live-devel at ns.live555.com) Subject: [Live-devel] WindowsAudioInputDevice_noMixer.hh and unicode I updated to the newest version of live555 and the WindowsAudioInput folder was added. In one of my projects I cannot build because I get the error : error C2664: 'strncpy' : cannot convert parameter 2 from 'WCHAR [32]' to 'const char *' c:\Users\jshanab\archiver\src\common\RTSP\WindowsAudioInputDevice\WindowsAud ioInputDevice_noMixer.cpp Obviously UNICODE must be defined for WAVEINCAPS to be set for wide. How do I get this to work with Unicode. I have seen suggestions online that I must use some MS specific function other than strncpy? -------------- next part -------------- An HTML attachment was scrubbed... 
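For completeness, the TCHAR route that Chris describes looks roughly like the sketch below (the buffer and function names are mine; link against winmm.lib). Note that it only works end-to-end if the destination buffer is itself declared as TCHAR, so code that must hand a plain char* to the rest of LIVE555 still needs an explicit wide-to-narrow conversion in a UNICODE build.

#include <windows.h>
#include <mmsystem.h>   // waveInGetDevCaps(), WAVEINCAPS, MAXPNAMELEN
#include <tchar.h>      // TCHAR, _tcsncpy(), _T()

// With the TCHAR approach the buffer changes type with the build:
// char[] in an ANSI build, WCHAR[] in a UNICODE build.
static TCHAR deviceName[MAXPNAMELEN];

static void readFirstDeviceName() {
  WAVEINCAPS caps;
  if (waveInGetDevCaps(0, &caps, sizeof caps) == MMSYSERR_NOERROR) {
    _tcsncpy(deviceName, caps.szPname, MAXPNAMELEN - 1);
    deviceName[MAXPNAMELEN - 1] = _T('\0');
  }
}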
URL: From mcmordie at viionsystems.com Tue Nov 22 08:02:34 2011 From: mcmordie at viionsystems.com (Dave McMordie) Date: Tue, 22 Nov 2011 11:02:34 -0500 Subject: [Live-devel] Another question about connecting a single source to multiple sinks (eg file output and decoder) using a Frame Duplicator / Tee Sink Message-ID: <88e55256c7bf84c0a998028de0f70bc1@mail.gmail.com> Hi all, Like many others, I am interested in connecting a single source to multiple sinks in order to write the stream to disk in a playable format (AVI in my case) and also display it. I have reviewed various posts on the topic: http://lists.live555.com/pipermail/live-devel/2006-July/004648.html http://lists.live555.com/pipermail/live-devel/2006-May/004454.html http://lists.live555.com/pipermail/live-devel/2011-October/013919.html To summarize advice seems to be to simply write the output in a filter, not using the AviFileSink class, because we do not yet have a general purpose ?Frame Duplicator? class (what I would have called a Tee Sink). My questions for the community are these: 1. Would having a fully implemented Tee-sink / Frame Duplicator class offer significant advantages over ad-hoc methods of replicating a stream (eg. inside filters)? 2. How difficult would it be to implement this class (any gotchas, or should it be straightforward)? 3. Can anyone provide an outline of what this class would need to do? a. Create a MediaSink and two or more FramedSources as members b. Dispatch frame data from afterGettingFrame of the MediaSink to each of the afterGettingFunctions of the FramedSource..? I am not an expert with this library, but I might be willing to take a first crack at this class if it is not going to be a huge commitment and looks like an efficient way to solve the problem. As I look closely at this, it looks to me like a challenge may be that inheriting from both MediaSink and FramedSource could result in some clashes. Any guidance would be appreciated. Best, Dave McMordie -------------- next part -------------- An HTML attachment was scrubbed... URL: From tayeb.dotnet at gmail.com Mon Nov 21 09:11:43 2011 From: tayeb.dotnet at gmail.com (Meftah Tayeb) Date: Mon, 21 Nov 2011 19:11:43 +0200 Subject: [Live-devel] DVB to RTSP streaming Message-ID: <869BA927426B4DA59BDF84C2F79147C0@work> hello, i see my email have bean moderated and i'm not sure if is send out to the ML or no, so i'm sending it back so please don't wory. i do have a dvb multicasting machine please can someone tel me how to convert multicast streams to Unicast RTSP? thank you Meftah Tayeb IT Consulting http://www.tmvoip.com/ phone: +21321656139 Mobile: +213660347746 __________ Information from ESET NOD32 Antivirus, version of virus signature database 6652 (20111122) __________ The message was checked by ESET NOD32 Antivirus. http://www.eset.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Nov 22 12:41:23 2011 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 22 Nov 2011 12:41:23 -0800 Subject: [Live-devel] Another question about connecting a single source to multiple sinks (eg file output and decoder) using a Frame Duplicator / Tee Sink In-Reply-To: <88e55256c7bf84c0a998028de0f70bc1@mail.gmail.com> References: <88e55256c7bf84c0a998028de0f70bc1@mail.gmail.com> Message-ID: <66431C2A-E2AE-44FF-A041-791E5166A8C9@live555.com> > 1. Would having a fully implemented Tee-sink / Frame Duplicator class offer significant advantages over ad-hoc methods of replicating a stream (eg. 
inside filters)? It depends on what you want to do with each 'replica'. If you just need two 'replicas' - one going to a LIVE555 "MediaSink" (as normal); the other just being written to a file - then the simplest solution would be to just do the file writing normally (without using LIVE555 objects). What specifically do *you* want to do with the replicated streams? > 2. How difficult would it be to implement this class (any gotchas, or should it be straightforward)? Unfortunately, if it were easy, I probably would have done it already. The only real advice I can give right now is here: http://lists.live555.com/pipermail/live-devel/2006-May/004454.html > As I look closely at this, it looks to me like a challenge may be that inheriting from both MediaSink and FramedSource could result in some clashes. No, this is definitely 'barking up the wrong tree'. Multiple inheritance in C++ is (generally speaking) a bad idea, and not relevant here anyway, because a general 'frame duplicator' mechanism would not contain (or create) any "MediaSink" objects at all. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From mcmordie at viionsystems.com Tue Nov 22 13:04:24 2011 From: mcmordie at viionsystems.com (Dave McMordie) Date: Tue, 22 Nov 2011 16:04:24 -0500 Subject: [Live-devel] Another question about connecting a single source to multiple sinks (eg file output and decoder) using a Frame Duplicator / Tee Sink In-Reply-To: <66431C2A-E2AE-44FF-A041-791E5166A8C9@live555.com> References: <88e55256c7bf84c0a998028de0f70bc1@mail.gmail.com> <66431C2A-E2AE-44FF-A041-791E5166A8C9@live555.com> Message-ID: <57351b3267e7223e17708141d255c5a1@mail.gmail.com> 1. Would having a fully implemented Tee-sink / Frame Duplicator class offer significant advantages over ad-hoc methods of replicating a stream (eg. inside filters)? It depends on what you want to do with each 'replica'. If you just need two 'replicas' - one going to a LIVE555 "MediaSink" (as normal); the other just being written to a file - then the simplest solution would be to just do the file writing normally (without using LIVE555 objects). What specifically do *you* want to do with the replicated streams? 2. How difficult would it be to implement this class (any gotchas, or should it be straightforward)? Unfortunately, if it were easy, I probably would have done it already. The only real advice I can give right now is here: http://lists.live555.com/pipermail/live-devel/2006-May/004454.html As I look closely at this, it looks to me like a challenge may be that inheriting from both MediaSink and FramedSource could result in some clashes. No, this is definitely 'barking up the wrong tree'. Multiple inheritance in C++ is (generally speaking) a bad idea, and not relevant here anyway, because a general 'frame duplicator' mechanism would not contain (or create) any "MediaSink" objects at all. Thanks for the prompt reply, Ross. The issue is that I have no experience writing readable video files in the correct formats (AVI, mp4, etc.) and wanted to use your code to do so. I conclude from your input that it is simply easier to write a new AVIFileWrite class borrowing code liberally from your AVIFileSink class in order to get the job done. Certainly it is easy to write the packets out as they appear for decoding. It is just a question of getting the rest of the AVI structure (in this case) correct. Best, Dave -------------- next part -------------- An HTML attachment was scrubbed... 
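For later readers of this thread, the "write the output inside a filter" approach can be as small as a FramedFilter that dumps each delivered frame to a file before passing it downstream unchanged. This is only a sketch - the class name and the raw append-to-file behaviour are mine, and a playable AVI/MP4 would still need proper container bookkeeping, as Dave notes:

#include "FramedFilter.hh"
#include <fstream>

// Passes every frame through unchanged, but also appends the raw frame
// bytes to a file as a side effect.
class WriteThroughFilter: public FramedFilter {
public:
  static WriteThroughFilter* createNew(UsageEnvironment& env,
                                       FramedSource* inputSource,
                                       char const* fileName) {
    return new WriteThroughFilter(env, inputSource, fileName);
  }

protected:
  WriteThroughFilter(UsageEnvironment& env, FramedSource* inputSource,
                     char const* fileName)
    : FramedFilter(env, inputSource),
      fOut(fileName, std::ios::binary | std::ios::app) {}

private:
  virtual void doGetNextFrame() {
    // Ask the upstream source to deliver directly into our downstream sink's buffer:
    fInputSource->getNextFrame(fTo, fMaxSize,
                               afterGettingFrame, this,
                               FramedSource::handleClosure, this);
  }

  static void afterGettingFrame(void* clientData, unsigned frameSize,
                                unsigned numTruncatedBytes,
                                struct timeval presentationTime,
                                unsigned durationInMicroseconds) {
    ((WriteThroughFilter*)clientData)
      ->afterGettingFrame1(frameSize, numTruncatedBytes,
                           presentationTime, durationInMicroseconds);
  }

  void afterGettingFrame1(unsigned frameSize, unsigned numTruncatedBytes,
                          struct timeval presentationTime,
                          unsigned durationInMicroseconds) {
    fOut.write((char const*)fTo, frameSize);  // the 'second replica', written to disk

    // Then complete the normal delivery to whatever object is reading from us:
    fFrameSize = frameSize;
    fNumTruncatedBytes = numTruncatedBytes;
    fPresentationTime = presentationTime;
    fDurationInMicroseconds = durationInMicroseconds;
    FramedSource::afterGetting(this);
  }

  std::ofstream fOut;
};

Such a filter would sit wherever the stream is already being pulled - for example, between a subsession's readSource() and the sink that consumes it.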
URL: From jshanab at smartwire.com Tue Nov 22 13:11:09 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Tue, 22 Nov 2011 21:11:09 +0000 Subject: [Live-devel] Another question about connecting a single source to multiple sinks (eg file output and decoder) using a Frame Duplicator / Tee Sink In-Reply-To: <66431C2A-E2AE-44FF-A041-791E5166A8C9@live555.com> References: <88e55256c7bf84c0a998028de0f70bc1@mail.gmail.com> <66431C2A-E2AE-44FF-A041-791E5166A8C9@live555.com> Message-ID: <615FD77639372542BF647F5EBAA2DBC20B18CAE0@IL-BOL-EXCH01.smartwire.com> I have a similar situation, although the original program had only a sink that wrote MJPEG to disk. I now have a sink that allows subscribers. The sink in my case calls add datablock with a boost::shared pointer to the frame (after my filter) on all subscribers. Each subscriber has a ring buffer for the shared pointers and the one to disk does not throw away if poorly serviced where as the ones to display customers can drop up to the next keyframe if the data "backs up". The addition of the pointer is very fast so the overall delay of a list of subscribers is minimal. I know this deviates from the pull model. but it is a thin line when you look at the code in depth. The main thing here is the frame is allocated once and only shared pointers are moved around. With many sources and many subscribers, I could not actually afford to be copying data more than once. From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Tuesday, November 22, 2011 2:41 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Another question about connecting a single source to multiple sinks (eg file output and decoder) using a Frame Duplicator / Tee Sink 1. Would having a fully implemented Tee-sink / Frame Duplicator class offer significant advantages over ad-hoc methods of replicating a stream (eg. inside filters)? It depends on what you want to do with each 'replica'. If you just need two 'replicas' - one going to a LIVE555 "MediaSink" (as normal); the other just being written to a file - then the simplest solution would be to just do the file writing normally (without using LIVE555 objects). What specifically do *you* want to do with the replicated streams? 2. How difficult would it be to implement this class (any gotchas, or should it be straightforward)? Unfortunately, if it were easy, I probably would have done it already. The only real advice I can give right now is here: http://lists.live555.com/pipermail/live-devel/2006-May/004454.html As I look closely at this, it looks to me like a challenge may be that inheriting from both MediaSink and FramedSource could result in some clashes. No, this is definitely 'barking up the wrong tree'. Multiple inheritance in C++ is (generally speaking) a bad idea, and not relevant here anyway, because a general 'frame duplicator' mechanism would not contain (or create) any "MediaSink" objects at all. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jshanab at smartwire.com Wed Nov 23 13:29:33 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Wed, 23 Nov 2011 21:29:33 +0000 Subject: [Live-devel] Aborting RTSP session Message-ID: <615FD77639372542BF647F5EBAA2DBC20B18CF1D@IL-BOL-EXCH01.smartwire.com> I have a problematic encoder that goes thru the entire rtsp conversation and then if it has no video sends a BYE request after 30 seconds. 
I am trying to figure out how I can abort this process. It looks like after live555 sends the PLAY request, it is in the event loop (which has a fast frequency) waiting for an event. I suppose I need to create an event on a timeout and send the event to break the loop. I also see a start and end argument all the way down to the sendPlayCommand, but they are always set to 0 and -1 and I couldn't find documentation. Can someone point me to documentation for doing this? The lib is used in a browser plugin and it needs to shut down quickly if someone jumps to a different page.
-------------- next part -------------- An HTML attachment was scrubbed... URL:

From finlayson at live555.com Wed Nov 23 14:12:53 2011 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 23 Nov 2011 14:12:53 -0800 Subject: [Live-devel] Aborting RTSP session In-Reply-To: <615FD77639372542BF647F5EBAA2DBC20B18CF1D@IL-BOL-EXCH01.smartwire.com> References: <615FD77639372542BF647F5EBAA2DBC20B18CF1D@IL-BOL-EXCH01.smartwire.com> Message-ID: <81B2A325-8661-4984-95D1-B20B99FCF1D0@live555.com>

> I have a problematic encoder that goes thru the entire rtsp conversation and then if it has no video sends a BYE request after 30 seconds.
> I am trying to figure out how I can abort this process.

Just to be clear - you're talking about wanting to abort the RTSP *client* process, right (not the RTSP server that's sending the BYE)?

> It looks like after live555 sends the PLAY request, it is in the event loop (which has a fast frequency) waiting for an event.

Yes, because you're using the synchronous RTSP client interface. As I've noted in the past, you can't close your "RTSPClient" from within your "BYE" handler - because at this point you're still within the synchronous "playMediaSession()" routine. But the problem you face is that your server never responds to the RTSP "PLAY" command, so you're never going to return from "playMediaSession()". Your situation is a perfect illustration of why the synchronous RTSP client interface is deprecated (and will someday go away). You need to be using the asynchronous interface instead. That way you'll avoid the problem; you will simply be able to close the "RTSPClient" object from within your "BYE" handler.

Ross Finlayson Live Networks, Inc. http://www.live555.com/
-------------- next part -------------- An HTML attachment was scrubbed... URL:

From lovrencsics.eva.2 at gmail.com Wed Nov 23 07:20:28 2011 From: lovrencsics.eva.2 at gmail.com (Eva Lovi) Date: Wed, 23 Nov 2011 16:20:28 +0100 Subject: [Live-devel] RTSPClient example? Message-ID:

Hello! I'd like to make an RTSPClient which doesn't receive data but SENDS it (video and audio) to an RTSP server. I don't see any example like this. The openRTSP example receives data, so it's not suitable for me. Other examples, like testH264VideoStreamer, create a server, but I don't want to create one; I want to connect to a server and send video and audio. Please help me, I need a (working) example. Best regards, Evi
-------------- next part -------------- An HTML attachment was scrubbed... URL:

From jshanab at smartwire.com Wed Nov 23 14:25:28 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Wed, 23 Nov 2011 22:25:28 +0000 Subject: [Live-devel] Aborting RTSP session In-Reply-To: <81B2A325-8661-4984-95D1-B20B99FCF1D0@live555.com> References: <615FD77639372542BF647F5EBAA2DBC20B18CF1D@IL-BOL-EXCH01.smartwire.com> <81B2A325-8661-4984-95D1-B20B99FCF1D0@live555.com> Message-ID: <615FD77639372542BF647F5EBAA2DBC20B18CF74@IL-BOL-EXCH01.smartwire.com>

Thanks.
For now I am setting a little watchdog timer to change the watchvariable. I will take a look at the openRTSP example over the holidays and modify my client over to the async. From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Wednesday, November 23, 2011 4:13 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Aborting RTSP session I have a problematic encoder that goes thru the entire rtsp conversation and then if it has no video sends a BYE request after 30 seconds. I am trying to figure out how I can abort this process. Just to be clear - you're talking about wanting to abort the RTSP *client* process, right (not the RTSP server that's sending the BYE)? It looks like after live555 sends the PLAY request , it is in the event loop (which has a fast freq) waiting fro an event. Yes, because you're using the synchronous RTSP client interface. As I've noted in the past, you can't close your "RTSPClient" from within your "BYE" handler - because at this point you're still within the synchronous "playMediaSession()" routine. But the problem you face is that your server never responds to the RTSP "PLAY" command, so you're never going to return from "playMediaSession()". Your situation is a perfect illustration of why the synchronous RTSP client interface is deprecated (and will someday go away). You need to be using the asynchronous interface instead. That way you'll avoid the problem; you will simply be able to close the "RTSPClient" object from within your "BYE" handler. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Nov 23 14:50:50 2011 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 23 Nov 2011 14:50:50 -0800 Subject: [Live-devel] RTSPClient example? In-Reply-To: References: Message-ID: > I'd like to make an RTSPClient which doesn't receive but SEND data (video and audio) to an RTSP server. Most RTSP servers - including ours - do not support this (being fed data from clients). However, for those RTSP servers that do happen to support this, the RTSP "ANNOUNCE" command is used to tell the server about the session (using a SDP description), and then some (server-specific) mechanism is used to receive data. We have a mechanism - the "DarwinInjector" class - that can be used to announce/inject data into one specific kind of server - Apple's "Darwin Streaming Server", as well as a couple of demo programs: "testMPEG1or2AudioVideoToDarwin" and "testMPEG4VideoToDarwin". (However, this code is no longer actively supported.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From lanlamer at gmail.com Wed Nov 23 23:52:27 2011 From: lanlamer at gmail.com (ouyang NJUPT) Date: Thu, 24 Nov 2011 15:52:27 +0800 Subject: [Live-devel] How can I modify the openRtsp to support multi rtsp ? Message-ID: Hi,everyone! I was asked to modify the openRtsp to support multi different rtsp. I have build it and run normally on windows xp but have no idea how to modify the source code. I have surfed the office FAQ in the Live555.com but find nothing. Can anyone give me same suggestion? Thanks very much! -------------- next part -------------- An HTML attachment was scrubbed... 
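Going back to the "Aborting RTSP session" exchange above: the little watchdog Jeff mentions (a timer that flips the event loop's watch variable) can be built from the scheduler's delayed-task mechanism. A rough sketch, with the variable, function, and timeout names my own:

#include "UsageEnvironment.hh"

char eventLoopWatchVariable = 0;

// Called by the scheduler if the stream hasn't started within the timeout:
static void onWatchdogTimeout(void* /*clientData*/) {
  eventLoopWatchVariable = 1;  // any non-zero value makes doEventLoop() return
}

void runWithWatchdog(TaskScheduler& scheduler, int64_t timeoutMicroseconds) {
  TaskToken watchdogTask =
    scheduler.scheduleDelayedTask(timeoutMicroseconds, onWatchdogTimeout, NULL);

  scheduler.doEventLoop(&eventLoopWatchVariable);  // returns once the variable becomes non-zero

  scheduler.unscheduleDelayedTask(watchdogTask);   // cancel the watchdog if it hasn't fired
}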
URL: From sledz at dresearch-fe.de Thu Nov 24 05:39:15 2011 From: sledz at dresearch-fe.de (Steffen Sledz) Date: Thu, 24 Nov 2011 14:39:15 +0100 Subject: [Live-devel] older source archives Message-ID: <4ECE4903.5000103@dresearch-fe.de> It seems that at only the latest source tarball is published. Is there an URL where older tarballs are available? They are needed especially for maintained distributions (in our case OpenEmbedded[1] related distributions like ?ngstr?m[2]), where it is not acceptable to switch to a latest version every few days. Regards, Steffen Sledz [1] [2] -- DResearch Fahrzeugelektronik GmbH Otto-Schmirgal-Str. 3, 10319 Berlin, Germany Tel: +49 30 515932-237 mailto:sledz at dresearch-fe.de Fax: +49 30 515932-299 Gesch?ftsf?hrer: Dr. Michael Weber, Werner M?gle; Amtsgericht Berlin Charlottenburg; HRB 130120 B; Ust.-IDNr. DE273952058 From finlayson at live555.com Thu Nov 24 05:56:26 2011 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 24 Nov 2011 05:56:26 -0800 Subject: [Live-devel] How can I modify the openRtsp to support multi rtsp ? In-Reply-To: References: Message-ID: <1F66CD0C-6C99-4820-A4B5-55A31A2B403C@live555.com> > I was asked to modify the openRtsp to support multi different rtsp. I have build it and run normally on windows xp but have no idea how to modify the source code. I have surfed the office FAQ in the Live555.com but find nothing. Can anyone give me same suggestion? I presume that by "support multi different rtsp" you mean "support opening multiple "rtsp://" URLs concurrently". You do this by creating separate "RTSPClient" objects for each "rtsp://" URL. You can use the existing "openRTSP" code (in "testProgs/playCommon.cpp" and "testProgs/openRTSP.cpp") as a model. Note, however, the "continueAfter...()" functions (in "testProgs/playCommon.cpp"). The first parameter to these functions is a "RTSPClient*". In the current code, this first parameter is not used - because there is only one "RTSPClient" object, and it's a global variable. In your code, however, you will need to use this first parameter; it will point to the particular "RTSPClient" object that the result was for. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Nov 24 06:15:26 2011 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 24 Nov 2011 06:15:26 -0800 Subject: [Live-devel] older source archives In-Reply-To: <4ECE4903.5000103@dresearch-fe.de> References: <4ECE4903.5000103@dresearch-fe.de> Message-ID: > It seems that at only the latest source tarball is published. Is there an URL where older tarballs are available? No, see and > They are needed especially for maintained distributions (in our case OpenEmbedded[1] related distributions like ?ngstr?m[2]), where it is not acceptable to switch to a latest version every few days. I don't understand. Nobody is making you switch to the latest version whenever it comes out (although it's a good idea to stay reasonably up-to-date, and it's very easy to do so). If you are using an older version of the code, then you have it. Why do you need a web site to access a version of the code that you already have?? It's important to understand that this software - unlike some others - does not have separate 'stable' and 'experimental' releases. Instead, there's just one release, and it can be considered 'stable'. Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From sledz at dresearch-fe.de Thu Nov 24 06:42:57 2011 From: sledz at dresearch-fe.de (Steffen Sledz) Date: Thu, 24 Nov 2011 15:42:57 +0100 Subject: [Live-devel] older source archives In-Reply-To: References: <4ECE4903.5000103@dresearch-fe.de> Message-ID: <4ECE57F1.5020209@dresearch-fe.de> On 24.11.2011 15:15, Ross Finlayson wrote: >> It seems that at only the latest source tarball is published. Is there an URL where older tarballs are available? > > No, see > > and > > > >> They are needed especially for maintained distributions (in our case OpenEmbedded[1] related distributions like ?ngstr?m[2]), where it is not acceptable to switch to a latest version every few days. > > I don't understand. Nobody is making you switch to the latest version whenever it comes out (although it's a good idea to stay reasonably up-to-date, and it's very easy to do so). If you are using an older version of the code, then you have it. Why do you need a web site to access a version of the code that you already have?? The OpenEmbedded related distributions are used by many developers and companies developing for embedded devices. They are not distributed as a set of binaries or source archives, but as a set of metadata describing where to fetch the sources, and how to build, install, and package them. So someone who like/need to build a distribution needs to be able to download the specified versions over a longer time. > It's important to understand that this software - unlike some others - does not have separate 'stable' and 'experimental' releases. Instead, there's just one release, and it can be considered 'stable'. I understand that. But i think there's no need to hide older versions to others (i do not know any other project following such a strategy). On the contrary. In my opinion making older versions available helps others to identify problems and suggest solutions for it. BTW: It would be OK for us, if the project history would be available in a public code repository (e.g. svn or git). Regards, Steffen Sledz -- DResearch Fahrzeugelektronik GmbH Otto-Schmirgal-Str. 3, 10319 Berlin, Germany Tel: +49 30 515932-237 mailto:sledz at dresearch-fe.de Fax: +49 30 515932-299 Gesch?ftsf?hrer: Dr. Michael Weber, Werner M?gle; Amtsgericht Berlin Charlottenburg; HRB 130120 B; Ust.-IDNr. DE273952058 From finlayson at live555.com Thu Nov 24 11:23:25 2011 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 24 Nov 2011 11:23:25 -0800 Subject: [Live-devel] older source archives In-Reply-To: <4ECE57F1.5020209@dresearch-fe.de> References: <4ECE4903.5000103@dresearch-fe.de> <4ECE57F1.5020209@dresearch-fe.de> Message-ID: <5DB452FC-FC2E-4AC9-BE3A-573297F931ED@live555.com> >>> They are needed especially for maintained distributions (in our case OpenEmbedded[1] related distributions like ?ngstr?m[2]), where it is not acceptable to switch to a latest version every few days. >> >> I don't understand. Nobody is making you switch to the latest version whenever it comes out (although it's a good idea to stay reasonably up-to-date, and it's very easy to do so). If you are using an older version of the code, then you have it. Why do you need a web site to access a version of the code that you already have?? > > The OpenEmbedded related distributions are used by many developers and companies developing for embedded devices. 
They are not distributed as a set of binaries or source archives, but as a set of metadata describing where to fetch the sources, and how to build, install, and package them. OK, but I still don't understand. In the case of the "LIVE555 Streaming Media" software, the "metadata describing where to fetch the sources" can simply be a link to http://www.live555.com/liveMedia/public/live555-latest.tar.gz which tells you where to get the latest (and most bug-free) version of the code. Why not just do this? > So someone who like/need to build a distribution needs to be able to download the specified versions over a longer time. No, they should always be using the latest version of the software. >> It's important to understand that this software - unlike some others - does not have separate 'stable' and 'experimental' releases. Instead, there's just one release, and it can be considered 'stable'. > > I understand that. But i think there's no need to hide older versions to others I'm not 'hiding' older versions; I'm just not putting them on our web site. (Because this is ope source, other people, if they wish, may keep copies of older versions, but I'm not.) The reason is quite simple: I feel I cannot, with a clear conscience, distribute software that I know is buggy. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From kidjan at gmail.com Thu Nov 24 12:31:01 2011 From: kidjan at gmail.com (Jeremy Noring) Date: Thu, 24 Nov 2011 12:31:01 -0800 Subject: [Live-devel] older source archives In-Reply-To: <4ECE4903.5000103@dresearch-fe.de> References: <4ECE4903.5000103@dresearch-fe.de> Message-ID: On Thu, Nov 24, 2011 at 5:39 AM, Steffen Sledz wrote: > It seems that at only the > latest source tarball is published. Is there an URL where older tarballs > are available? > > They are needed especially for maintained distributions (in our case > OpenEmbedded[1] related distributions like ?ngstr?m[2]), where it is not > acceptable to switch to a latest version every few days. > > To the OP, I've been stuffing versions of Live555 into a google code project so I have SVN history: http://code.google.com/p/live555sourcecontrol/ ...mainly, my interest is in having SVN history so I know _why_ the code is, and what's changed over time. Still need to stuff the most recent version in there, I'll get to that sometime today. As for updating to the most recent version, this works okay--until there's some bug in the library you can't solve and nobody on the mailing list seems to know what's going on, and you have no source code history so it's much more difficult to understand how the code changed to precipitate that bug. Currently, this is the situation I face. Receiving RTP over TCP in current releases with multiple sub-sessions is, by and large, broken as best I can tell due to something with the new synchronous API--I have not been able to solve it, and the "workaround" I found has unacceptable results. History would help, which is why I started stuffing versions into the aforementioned project. -Jeremy -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Thu Nov 24 13:25:44 2011 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 24 Nov 2011 13:25:44 -0800 Subject: [Live-devel] Jeremy Noring's problem with RTP-over-TCP In-Reply-To: References: <4ECE4903.5000103@dresearch-fe.de> Message-ID: <42A73F20-450C-4360-BBAC-E047D3A598C8@live555.com> > Receiving RTP over TCP in current releases with multiple sub-sessions is, by and large, broken as best I can tell due to something with the new synchronous API You mean "asynchronous API". We talked about your problem a lot on the mailing list back in July. As far as I can tell, you're the only person having this problem with receiving RTP-over-TCP, so it may be inaccurate to describe it as being "broken". (I also doubt that the asynchronous API is responsible.) Back in July I wasn't able to explain the symptoms that you were seeing, but I still suspect that you might be using multiple threads incorrectly (by, perhaps inadvertently, accessing the same LIVE555 object(s) from multiple threads). If you stop using multiple threads (as I've noted several times, you don't need multiple threads to concurrently receive multiple RTSP/RTP streams within a single LIVE555 application if you use the asynchronous API), then I suspect your problems will go away. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From kidjan at gmail.com Thu Nov 24 14:32:12 2011 From: kidjan at gmail.com (Jeremy Noring) Date: Thu, 24 Nov 2011 14:32:12 -0800 Subject: [Live-devel] Jeremy Noring's problem with RTP-over-TCP In-Reply-To: <42A73F20-450C-4360-BBAC-E047D3A598C8@live555.com> References: <4ECE4903.5000103@dresearch-fe.de> <42A73F20-450C-4360-BBAC-E047D3A598C8@live555.com> Message-ID: On Thu, Nov 24, 2011 at 1:25 PM, Ross Finlayson wrote: > Receiving RTP over TCP in current releases with multiple sub-sessions is, > by and large, broken as best I can tell due to something with the new > synchronous API > > > You mean "asynchronous API". > > We talked about your problem a lot on the mailing list back in July. As > far as I can tell, you're the only person having this problem with > receiving RTP-over-TCP, so it may be inaccurate to describe it as being > "broken". (I also doubt that the asynchronous API is responsible.) Back > in July I wasn't able to explain the symptoms that you were seeing, but I > still suspect that you might be using multiple threads incorrectly (by, > perhaps inadvertently, accessing the same LIVE555 object(s) from multiple > threads). If you stop using multiple threads (as I've noted several times, > you don't need multiple threads to concurrently receive multiple RTSP/RTP > streams within a single LIVE555 application if you use the asynchronous > API), then I suspect your problems will go away. > > Yes, I meant asynchronous. My apologies. I am not using multiple threads; I've been using this library for almost four years, so I'm well aware of the threading implications. Every Live555 instance I work with is completely contained on a single thread, and the only signaling that occurs is through watch variables. Also, I could easily see someone missing this issue because in the currently library the behavior is masked. You still get marginal performance due to the workaround that was included in the library. But I can still reproduce the issue on demand, and I suspect other people A) aren't using TCP with multiple subsessions or B) aren't looking closely. 
-------------- next part -------------- An HTML attachment was scrubbed... URL: From lanlamer at gmail.com Thu Nov 24 22:41:40 2011 From: lanlamer at gmail.com (Johnnie Au-Yeung) Date: Fri, 25 Nov 2011 14:41:40 +0800 Subject: [Live-devel] How can I modify the openRtsp to support multi rtsp ? In-Reply-To: <1F66CD0C-6C99-4820-A4B5-55A31A2B403C@live555.com> References: <1F66CD0C-6C99-4820-A4B5-55A31A2B403C@live555.com> Message-ID: Thanks for your reply and kindness guide!! On Thu, Nov 24, 2011 at 9:56 PM, Ross Finlayson wrote: > I was asked to modify the openRtsp to support multi different rtsp. > I have build it and run normally on windows xp but have no idea how to > modify the source code. I have surfed the office FAQ in the Live555.combut find nothing. Can anyone give me same suggestion? > > > I presume that by "support multi different rtsp" you mean "support opening > multiple "rtsp://" URLs concurrently". You do this by creating separate > "RTSPClient" objects for each "rtsp://" URL. > > You can use the existing "openRTSP" code (in "testProgs/playCommon.cpp" > and "testProgs/openRTSP.cpp") as a model. > > Note, however, the "continueAfter...()" functions > (in "testProgs/playCommon.cpp"). The first parameter to these functions is > a "RTSPClient*". In the current code, this first parameter is not used - > because there is only one "RTSPClient" object, and it's a global variable. > In your code, however, you will need to use this first parameter; it will > point to the particular "RTSPClient" object that the result was for. > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From sledz at dresearch-fe.de Thu Nov 24 23:42:28 2011 From: sledz at dresearch-fe.de (Steffen Sledz) Date: Fri, 25 Nov 2011 08:42:28 +0100 Subject: [Live-devel] older source archives In-Reply-To: <5DB452FC-FC2E-4AC9-BE3A-573297F931ED@live555.com> References: <4ECE4903.5000103@dresearch-fe.de> <4ECE57F1.5020209@dresearch-fe.de> <5DB452FC-FC2E-4AC9-BE3A-573297F931ED@live555.com> Message-ID: <4ECF46E4.3020204@dresearch-fe.de> On 24.11.2011 20:23, Ross Finlayson wrote: >>>> They are needed especially for maintained distributions (in our case OpenEmbedded[1] related distributions like ?ngstr?m[2]), where it is not acceptable to switch to a latest version every few days. >>> >>> I don't understand. Nobody is making you switch to the latest version whenever it comes out (although it's a good idea to stay reasonably up-to-date, and it's very easy to do so). If you are using an older version of the code, then you have it. Why do you need a web site to access a version of the code that you already have?? >> >> The OpenEmbedded related distributions are used by many developers and companies developing for embedded devices. They are not distributed as a set of binaries or source archives, but as a set of metadata describing where to fetch the sources, and how to build, install, and package them. > > OK, but I still don't understand. In the case of the "LIVE555 Streaming Media" software, the "metadata describing where to fetch the sources" can simply be a link to > http://www.live555.com/liveMedia/public/live555-latest.tar.gz > which tells you where to get the latest (and most bug-free) version of the code. Why not just do this? 
> >> So someone who like/need to build a distribution needs to be able to download the specified versions over a longer time. > > No, they should always be using the latest version of the software. In theory this may be right. But in the real world outside there this is definitely wrong. It seems that you do not understand the work of a distribution maintainer. He has to pick versions of dozens or hundreds of packages and make them work *together*. This needs (depending of the kind of the distribution) a lot of testing which makes it impossible to follow your strategy. In the case of our distributions which often are used for embedded industrial devices the problem is much bigger. In some industrial environments it is necessary to run very expensive certification procedures on *each* software change. So if i have a version of a package which fulfills all requirements *in my distribution context* i definitely will not change to a newer one. Regards, Steffen -- DResearch Fahrzeugelektronik GmbH Otto-Schmirgal-Str. 3, 10319 Berlin, Germany Tel: +49 30 515932-237 mailto:sledz at dresearch-fe.de Fax: +49 30 515932-299 Gesch?ftsf?hrer: Dr. Michael Weber, Werner M?gle; Amtsgericht Berlin Charlottenburg; HRB 130120 B; Ust.-IDNr. DE273952058 From rglobisch at csir.co.za Fri Nov 25 00:21:42 2011 From: rglobisch at csir.co.za (Ralf Globisch) Date: Fri, 25 Nov 2011 10:21:42 +0200 Subject: [Live-devel] Jeremy Noring's problem with RTP-over-TCP In-Reply-To: References: Message-ID: Hi Jeremy & Ross, FWIW: Re-reading the posts from July i *might* have something to add to the discussion. We exclusively use the RTP over TCP option and also saw a strange issue with it fairly recently.?Just recapping in case it is of use to either of you: We compiled the Android version of VLC and made some changes to be able to stream over the network using RTP over RTSP. When switching from wireless to 3G/Edge the stream would fail to play. At the time I thought it might have something to do with RTP being in the same packet as the RTSP (see http://lists.live555.com/pipermail/live-devel/2011-October/013866.html) but Ross corrected this line of thought in his response and further testing?confirmed that. Further debugging shows that when things went wrong, the stream channel ID could not be matched on the client correctly: IIRC the server (also running live555) was sending video and audio using IDs 2 and 3, and the client was?trying to match 0 and 1. Also, IIRC, the latest live555 code in the client (obtained when?building the android version)?does a check and goes back into a?searching for $ state if it?fails to match the ID. Older code didn't have this check. It looked like mismatching the IDs had broken the state machine on the client. At this stage I was clueless how to proceed (esp. since this failure to match IDs only happened over 3G and *not* over wireless). I then updated our server application to use the latest version of live555 and the problem went away completely. So to conclude: somehow using a new version of the RTSP client (with async interface) over 3G/Edge?with an older version of the live555 server caused a mismatch in stream channel IDs. @Jeremy: I hope this helps. In your case, is the server also running live555 and are you able to try to upgrade the server? 
From finlayson at live555.com Fri Nov 25 02:26:51 2011 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 25 Nov 2011 02:26:51 -0800 Subject: [Live-devel] older source archives In-Reply-To: <4ECF46E4.3020204@dresearch-fe.de> References: <4ECE4903.5000103@dresearch-fe.de> <4ECE57F1.5020209@dresearch-fe.de> <5DB452FC-FC2E-4AC9-BE3A-573297F931ED@live555.com> <4ECF46E4.3020204@dresearch-fe.de> Message-ID: <7BEA81F0-500C-4EB4-B5E8-B190EFA99DC0@live555.com> This is basically a difference in philosophy. You believe - as the user of the library - that you best know which version of the library you should be using. However, I believe - as the developer and maintainer of the library - that I best know which version of the library you should be using. You may feel differently, however - in which case you are free to keep your own copy of whatever version you want to use. That's the beauty of open source software. However, I'm not going to put - on our own web site - a version of the software that I know contains bugs, and therefore that I feel people should not be using. Nor am I going to support - on this mailing list - people who are using old versions of the software. That's my choice, and I'm sticking with it. And the fact that other software that you may use does things differently is not an argument. You've found a different project. Congratulations. I remind people once again to read the FAQ - http://www.live555.com/liveMedia/faq.html - (as you were asked to do before posting to the mailing list :-). It answers a lot of common questions (such as this) about the software. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From kidjan at gmail.com Fri Nov 25 10:44:08 2011 From: kidjan at gmail.com (Jeremy Noring) Date: Fri, 25 Nov 2011 10:44:08 -0800 Subject: [Live-devel] Jeremy Noring's problem with RTP-over-TCP In-Reply-To: References: Message-ID: On Fri, Nov 25, 2011 at 12:21 AM, Ralf Globisch wrote: > @Jeremy: I hope this helps. In your case, is the server also running > live555 and are you able > to try to upgrade the server? > Yes, our server is live555 based as well. At the time, I did try updating, but that didn't appear to resolve the issue. Here's the change that would allow the library to recover: http://code.google.com/p/live555sourcecontrol/source/diff?spec=svn11&r=11&format=side&path=/trunk/liveMedia/RTPInterface.cpp ...this lets the client keep going, but some data gets tossed, so I get jerky playback. Seems like it involves the stream channel ID as well, which would corroborate well with your description? When I get a chance, I'll revisit the issue and see if latest on both sides resolves anything. -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Nov 25 20:06:28 2011 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 25 Nov 2011 20:06:28 -0800 Subject: [Live-devel] Removal of "inet_ntoa()" (and "our_inet_ntoa()") Message-ID: As you all know (because you've all read the FAQ :-), the LIVE555 libraries can be called from multiple threads, although only if each thread uses its own "TaskScheduler" and "UsageEnvironment". Even this, however, may not be totally safe, because of the possibility that the LIVE555 libraries, and/or your own code, may call system library functions that themselves are not 'thread safe'. 
The VLC developers have identified a handful of such system library functions that are currently used the LIVE555 library code, and I am currently going through the code to remove/replace them. The first such library function is "inet_ntoa()" (which, in our code, we renamed "our_inet_ntoa()"). I have just released a new version (2011.11.26) of the "LIVE555 Streaming Media" code that removes calls to this function. In its place is a new class "AddressString" - defined in "groupsock/include/NetAddress.hh". For example, now, in the code, instead of calling our_inet_addr(addr) we now call AddressString(addr).val() If your own code happens to use multiple threads, you may wish to consider replacing any calls to "inet_ntoa()" with this new mechanism. (A note to the VLC developers: This new release removes the need for the first, and largest of your 'LIVE555 patches'. I hope to do the same for your other patches also.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jshanab at smartwire.com Sat Nov 26 09:40:27 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Sat, 26 Nov 2011 17:40:27 +0000 Subject: [Live-devel] problems moving to asynchronous rtsp interface Message-ID: <615FD77639372542BF647F5EBAA2DBC20B18D93B@IL-BOL-EXCH01.smartwire.com> I just started trying to upgrade our code to use the asynchronous interface and I am a bit perplexed at the mix of C and C++. In other libraries the callbacks generally have a void pointer to clientData or 'instance' or something like that which allows me to pass the 'this' pointer, cast it, to the class and access all the particular instances' members. In the openRTSP and PlayCommon code I see the signatures for the callbacks are hard coded to foo(RTSPClient::responsehandler*,int,char*). One level deep it is ok, but as soon as one of these callbacks chains to the next callback, I have lost any reference to my instance data. I suspect the architecture of this code is to thready but I still want instances of my objects. For example. This is what causes the ourRTSPClient in the example to be a Global variable. I need to have this as a class member variable, and perhaps do something like this.... getOptions(RTSPClient::responseHandler* afterFunc, void* clientData) { myClass * instancePTR = (myClass *) clientData; instancePTR->ourClient_->sendOptionsCommand(instancePTR, ourAuthenticator, clientData); } I guess an ideal way would be to create a callback interface that we just include in our code and fill in the blanks. A pointer to the instance is then all live555 needs as it would dictate the names and signatures of the calls. What is the best way to use these callbacks in a multi threaded (multiple UsageEnvironments) c++ program.? -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sat Nov 26 11:34:56 2011 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 26 Nov 2011 11:34:56 -0800 Subject: [Live-devel] problems moving to asynchronous rtsp interface In-Reply-To: <615FD77639372542BF647F5EBAA2DBC20B18D93B@IL-BOL-EXCH01.smartwire.com> References: <615FD77639372542BF647F5EBAA2DBC20B18D93B@IL-BOL-EXCH01.smartwire.com> Message-ID: <9D34D434-89C2-4F74-9D05-44FCC2888400@live555.com> > I just started trying to upgrade our code to use the asynchronous interface and I am a bit perplexed at the mix of C and C++. In other libraries the callbacks generally have a void pointer to clientData or ?instance? 
or something like that which allows me to pass the ?this? pointer, cast it, to the class and access all the particular instances? members. In the openRTSP and PlayCommon code I see the signatures for the callbacks are hard coded to foo(RTSPClient::responsehandler*,int,char*). The "openRTSP.cpp"/"playCommon.cpp" code, unfortunately, does not provide the greatest example of how to use the asynchronous "RTSPClient" interface, precisely because it uses a single, global "RTSPClient*" variable (called "ourRTSPClient"). (Also, because "playCommon.cpp" shares code with a separate application called "playSIP".) The key thing to note is that the "RTSPClient:responseHandler()" functions all have a "RTSPClient*" as their first parameter. When the response handler gets called, its first parameter will be (a pointer to) the "RTSPClient" object that made the request. In our "playCommon.cpp" code, the first parameter to our response handler functions ends up not being needed, because its value will - in this case - be the same as our global variable "ourRTSPClient". In your case, however, you will want to use this first parameter, because it will tell you which particular "RTSPClient" object made the request. > For example. This is what causes the ourRTSPClient in the example to be a Global variable. I need to have this as a class member variable, and perhaps do something like this?. > > getOptions(RTSPClient::responseHandler* afterFunc, void* clientData) { > myClass * instancePTR = (myClass *) clientData; > instancePTR->ourClient_->sendOptionsCommand(instancePTR, ourAuthenticator, clientData); > } No, this won't work, because the signature of your call to "RTSPClient::sendOptionsCommand()" is all wrong. Instead, don't define or call a "getOptions()" function at all. Instead just call: instancePTR->ourClient->sendOptionsCommand(continueAfterOPTIONS, ourAuthenticator); And then, when the "continueAfterOPTIONS()" response handler later gets called, its first parameter will be "instancePTR->ourClient" - i.e., a "RTSPClient*". If you need more information from this "RTSPClient*" pointer (e.g., in your case, to identify it's "myClass*" parent), then you can do so by subclassing "RTSPClient", and putting the information that you want in a subclass member field. > What is the best way to use these callbacks in a multi threaded (multiple UsageEnvironments) c++ program.? You do realize, I hope, that the use of the asynchronous "RTSPClient" interface makes it even less necessary to use multiple threads. (If you use the asynchronous interface, you can access multiple RTSP streams concurrently from a single thread (running a single event loop).) But if you insist on using multiple threads (something that I still don't recommend, even though it's possible), you can do so provided that you use a separate "TaskScheduler" and "UsageEnvironment" for each. Note that - if you wish - you can call "->envir()" on your response handlers' "RTSPClient*" first parameter, if you need to find out which "UsageEnvironment" - and thus which thread - it is using. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From jshanab at smartwire.com Sat Nov 26 12:26:44 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Sat, 26 Nov 2011 20:26:44 +0000 Subject: [Live-devel] problems moving to asynchronous rtsp interface In-Reply-To: <9D34D434-89C2-4F74-9D05-44FCC2888400@live555.com> References: <615FD77639372542BF647F5EBAA2DBC20B18D93B@IL-BOL-EXCH01.smartwire.com> <9D34D434-89C2-4F74-9D05-44FCC2888400@live555.com> Message-ID: <615FD77639372542BF647F5EBAA2DBC20B18D9EE@IL-BOL-EXCH01.smartwire.com> From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Saturday, November 26, 2011 1:35 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] problems moving to asynchronous rtsp interface I just started trying to upgrade our code to use the asynchronous interface and I am a bit perplexed at the mix of C and C++. In other libraries the callbacks generally have a void pointer to clientData or 'instance' or something like that which allows me to pass the 'this' pointer, cast it, to the class and access all the particular instances' members. In the openRTSP and PlayCommon code I see the signatures for the callbacks are hard coded to foo(RTSPClient::responsehandler*,int,char*). The "openRTSP.cpp"/"playCommon.cpp" code, unfortunately, does not provide the greatest example of how to use the asynchronous "RTSPClient" interface, precisely because it uses a single, global "RTSPClient*" variable (called "ourRTSPClient"). (Also, because "playCommon.cpp" shares code with a separate application called "playSIP".) The key thing to note is that the "RTSPClient:responseHandler()" functions all have a "RTSPClient*" as their first parameter. When the response handler gets called, its first parameter will be (a pointer to) the "RTSPClient" object that made the request. In our "playCommon.cpp" code, the first parameter to our response handler functions ends up not being needed, because its value will - in this case - be the same as our global variable "ourRTSPClient". In your case, however, you will want to use this first parameter, because it will tell you which particular "RTSPClient" object made the request. >> I realized that and am working on the subclass from RTSPClient idea. Because there is a LOT of other instance data, just like in the openRTSP example, thre is a LOT of global information. But I am finding that the RTSPClient was not really designed to be subclassed. The gotcha I have now is, although it does nothing but set values in it's constructor, I cannot instanitiate my subclass without an initializer for the RTSPClient base class, but then I cannot call setBaseURL() for example, later when I know all the info. For example. This is what causes the ourRTSPClient in the example to be a Global variable. I need to have this as a class member variable, and perhaps do something like this.... getOptions(RTSPClient::responseHandler* afterFunc, void* clientData) { myClass * instancePTR = (myClass *) clientData; instancePTR->ourClient_->sendOptionsCommand(instancePTR, ourAuthenticator, clientData); } No, this won't work, because the signature of your call to "RTSPClient::sendOptionsCommand()" is all wrong. Instead, don't define or call a "getOptions()" function at all. Instead just call: instancePTR->ourClient->sendOptionsCommand(continueAfterOPTIONS, ourAuthenticator); And then, when the "continueAfterOPTIONS()" response handler later gets called, its first parameter will be "instancePTR->ourClient" - i.e., a "RTSPClient*". 
If you need more information from this "RTSPClient*" pointer (e.g., in your case, to identify it's "myClass*" parent), then you can do so by subclassing "RTSPClient", and putting the information that you want in a subclass member field. See above why this is difficult. What is the best way to use these callbacks in a multi threaded (multiple UsageEnvironments) c++ program.? You do realize, I hope, that the use of the asynchronous "RTSPClient" interface makes it even less necessary to use multiple threads. (If you use the asynchronous interface, you can access multiple RTSP streams concurrently from a single thread (running a single event loop).) But if you insist on using multiple threads (something that I still don't recommend, even though it's possible), you can do so provided that you use a separate "TaskScheduler" and "UsageEnvironment" for each. Note that - if you wish - you can call "->envir()" on your response handlers' "RTSPClient*" first parameter, if you need to find out which "UsageEnvironment" - and thus which thread - it is using. Yes but I am trying to upgrade to the newer interface without breaking everthing all at once. Especially under the threat of impending removal! Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sat Nov 26 12:57:11 2011 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 26 Nov 2011 12:57:11 -0800 Subject: [Live-devel] problems moving to asynchronous rtsp interface In-Reply-To: <615FD77639372542BF647F5EBAA2DBC20B18D9EE@IL-BOL-EXCH01.smartwire.com> References: <615FD77639372542BF647F5EBAA2DBC20B18D93B@IL-BOL-EXCH01.smartwire.com> <9D34D434-89C2-4F74-9D05-44FCC2888400@live555.com> <615FD77639372542BF647F5EBAA2DBC20B18D9EE@IL-BOL-EXCH01.smartwire.com> Message-ID: <0F6501C1-3741-4114-A0F3-739E04A5E74A@live555.com> > >> I realized that and am working on the subclass from RTSPClient idea. Because there is a LOT of other instance data, just like in the openRTSP example, thre is a LOT of global information. But I am finding that the RTSPClient was not really designed to be subclassed. The gotcha I have now is, although it does nothing but set values in it?s constructor, I cannot instanitiate my subclass without an initializer for the RTSPClient base class, but then I cannot call setBaseURL() for example, later when I know all the info. I'm not sure I understand this. Your subclass's constructor will, of course, call the "RTSPClient" constructor before initializing its own variables. One of the parameters to the "RTSPClient" constructor is a "rtsp://" URL, which the "RTSPClient" constructor uses, internally, by calling "setBaseURL()" to save its value. Are you saying that you want to call "setBaseURL()" from your subclass, to set a *different* "rtsp://" URL - one that you want to build inside your subclass constructor (rather than one that you know in advance and are giving to the subclass constructor, as is the case for "RTSPClient")? If that's the case, then yes, "setBaseURL()" will need to be protected, rather than private (and then your subclass constructor can just pass NULL as the "rtspURL" parameter to the "RTSPClient" constructor, before it creates the real URL). So, would you like me to make "setBaseURL()" protected rather than private? Any other member functions or variables as well? (This is what this mailing list is for :-) Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jshanab at smartwire.com Sat Nov 26 13:07:28 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Sat, 26 Nov 2011 21:07:28 +0000 Subject: [Live-devel] problems moving to asynchronous rtsp interface In-Reply-To: <9D34D434-89C2-4F74-9D05-44FCC2888400@live555.com> References: <615FD77639372542BF647F5EBAA2DBC20B18D93B@IL-BOL-EXCH01.smartwire.com> <9D34D434-89C2-4F74-9D05-44FCC2888400@live555.com> Message-ID: <615FD77639372542BF647F5EBAA2DBC20B18DA38@IL-BOL-EXCH01.smartwire.com> Threads != instances???. Don't I still need a way of keeping all the instance data separate? How does the callback allow me to differentiate between different instances (other than just the RTSPClient* itself? Can I lie to it? Put a pointer to a class in the that happens to have the same call back and then cast as needed to get to instance data along for the ride? If it doesn't do anything other than call a func with three args... why not make that void* ? unless you actually use the RTSPClient internally on the callbacks. From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Saturday, November 26, 2011 1:35 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] problems moving to asynchronous rtsp interface I just started trying to upgrade our code to use the asynchronous interface and I am a bit perplexed at the mix of C and C++. In other libraries the callbacks generally have a void pointer to clientData or 'instance' or something like that which allows me to pass the 'this' pointer, cast it, to the class and access all the particular instances' members. In the openRTSP and PlayCommon code I see the signatures for the callbacks are hard coded to foo(RTSPClient::responsehandler*,int,char*). The "openRTSP.cpp"/"playCommon.cpp" code, unfortunately, does not provide the greatest example of how to use the asynchronous "RTSPClient" interface, precisely because it uses a single, global "RTSPClient*" variable (called "ourRTSPClient"). (Also, because "playCommon.cpp" shares code with a separate application called "playSIP".) The key thing to note is that the "RTSPClient:responseHandler()" functions all have a "RTSPClient*" as their first parameter. When the response handler gets called, its first parameter will be (a pointer to) the "RTSPClient" object that made the request. In our "playCommon.cpp" code, the first parameter to our response handler functions ends up not being needed, because its value will - in this case - be the same as our global variable "ourRTSPClient". In your case, however, you will want to use this first parameter, because it will tell you which particular "RTSPClient" object made the request. For example. This is what causes the ourRTSPClient in the example to be a Global variable. I need to have this as a class member variable, and perhaps do something like this.... getOptions(RTSPClient::responseHandler* afterFunc, void* clientData) { myClass * instancePTR = (myClass *) clientData; instancePTR->ourClient_->sendOptionsCommand(instancePTR, ourAuthenticator, clientData); } No, this won't work, because the signature of your call to "RTSPClient::sendOptionsCommand()" is all wrong. Instead, don't define or call a "getOptions()" function at all. 
Instead just call: instancePTR->ourClient->sendOptionsCommand(continueAfterOPTIONS, ourAuthenticator); And then, when the "continueAfterOPTIONS()" response handler later gets called, its first parameter will be "instancePTR->ourClient" - i.e., a "RTSPClient*". If you need more information from this "RTSPClient*" pointer (e.g., in your case, to identify it's "myClass*" parent), then you can do so by subclassing "RTSPClient", and putting the information that you want in a subclass member field. What is the best way to use these callbacks in a multi threaded (multiple UsageEnvironments) c++ program.? You do realize, I hope, that the use of the asynchronous "RTSPClient" interface makes it even less necessary to use multiple threads. (If you use the asynchronous interface, you can access multiple RTSP streams concurrently from a single thread (running a single event loop).) But if you insist on using multiple threads (something that I still don't recommend, even though it's possible), you can do so provided that you use a separate "TaskScheduler" and "UsageEnvironment" for each. Note that - if you wish - you can call "->envir()" on your response handlers' "RTSPClient*" first parameter, if you need to find out which "UsageEnvironment" - and thus which thread - it is using. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jshanab at smartwire.com Sat Nov 26 13:17:00 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Sat, 26 Nov 2011 21:17:00 +0000 Subject: [Live-devel] problems moving to asynchronous rtsp interface In-Reply-To: <0F6501C1-3741-4114-A0F3-739E04A5E74A@live555.com> References: <615FD77639372542BF647F5EBAA2DBC20B18D93B@IL-BOL-EXCH01.smartwire.com> <9D34D434-89C2-4F74-9D05-44FCC2888400@live555.com> <615FD77639372542BF647F5EBAA2DBC20B18D9EE@IL-BOL-EXCH01.smartwire.com> <0F6501C1-3741-4114-A0F3-739E04A5E74A@live555.com> Message-ID: <615FD77639372542BF647F5EBAA2DBC20B18DA5D@IL-BOL-EXCH01.smartwire.com> I am not sure if making some functions virtual or separating construction from setting is best. My case may be too edge. I haven't gotten far enough to give you a list, that's for sure. But if the callback could use a void* for the first arg, then I can wrap the client in a myClient class, pass it's pointer and access the internal pointer to the RTSPClient. That is a different form of subclassing for sure, but it helps callbacks. For me the requiring of info on ctor doesn't happen to fit my existing interface. (not all clients in my code are live555 RTSP clients) In this legacy case, the url is built dynamically depending on the model of the encoder. I suppose it could be redone, but that would change an interface and all the classes that inherit from and use it. //thinking out loud.. What if you created an abstract interface class for c++ users. They inherit from this class an implement the callback functions. It would have c-style call wrappers kinda like the whole afterGettingFrame and afterGettingFrame1 setup. You could grant access by befriending the interface? From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Saturday, November 26, 2011 2:57 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] problems moving to asynchronous rtsp interface >> I realized that and am working on the subclass from RTSPClient idea. 
Because there is a LOT of other instance data, just like in the openRTSP example, thre is a LOT of global information. But I am finding that the RTSPClient was not really designed to be subclassed. The gotcha I have now is, although it does nothing but set values in it's constructor, I cannot instanitiate my subclass without an initializer for the RTSPClient base class, but then I cannot call setBaseURL() for example, later when I know all the info. I'm not sure I understand this. Your subclass's constructor will, of course, call the "RTSPClient" constructor before initializing its own variables. One of the parameters to the "RTSPClient" constructor is a "rtsp://" URL, which the "RTSPClient" constructor uses, internally, by calling "setBaseURL()" to save its value. Are you saying that you want to call "setBaseURL()" from your subclass, to set a *different* "rtsp://" URL - one that you want to build inside your subclass constructor (rather than one that you know in advance and are giving to the subclass constructor, as is the case for "RTSPClient")? If that's the case, then yes, "setBaseURL()" will need to be protected, rather than private (and then your subclass constructor can just pass NULL as the "rtspURL" parameter to the "RTSPClient" constructor, before it creates the real URL). So, would you like me to make "setBaseURL()" protected rather than private? Any other member functions or variables as well? (This is what this mailing list is for :-) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sat Nov 26 13:22:57 2011 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 26 Nov 2011 13:22:57 -0800 Subject: [Live-devel] problems moving to asynchronous rtsp interface In-Reply-To: <615FD77639372542BF647F5EBAA2DBC20B18DA38@IL-BOL-EXCH01.smartwire.com> References: <615FD77639372542BF647F5EBAA2DBC20B18D93B@IL-BOL-EXCH01.smartwire.com> <9D34D434-89C2-4F74-9D05-44FCC2888400@live555.com> <615FD77639372542BF647F5EBAA2DBC20B18DA38@IL-BOL-EXCH01.smartwire.com> Message-ID: <771004CE-E49A-44B2-816B-DE317F588FA6@live555.com> > Threads != instances???. Don?t I still need a way of keeping all the instance data separate? How does the callback allow me to differentiate between different instances (other than just the RTSPClient* itself? Once again, I don't understand. What's an "instance". Your "RTSPClient" subclass can contain whatever fields you like. The "RTSPClient*" pointer that you get in each of your response handler callbacks will be the same "RTSPClient*" on which you made the request; it should be able to (unambigously) give you all the state that you need. And no, don't "lie" to the interface (at least, not if you expect help on this mailing list). And yes, of course the "RTSPClient*" is used internally. Once again - if you (e.g.) call fooClient->sendOptionsCommand(continueAfterOPTIONS, ourAuthenticator); Then, when the "continueAfterOPTIONS()" response handler later gets called, its first parameter will be "fooClient" - i.e., a pointer to the same "RTSPClient" object on which you made the request. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
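[Because each response handler is told which "RTSPClient" issued the request, several clients can share one "TaskScheduler"/"UsageEnvironment" and one event loop, as noted earlier in the thread. A hedged sketch only; the URLs and the handler name are placeholders.]

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"

void continueAfterOPTIONS(RTSPClient* rtspClient, int resultCode, char* resultString) {
  // "rtspClient" tells us which of the clients below made the request:
  rtspClient->envir() << "OPTIONS result code: " << resultCode << "\n";
  delete[] resultString;
}

int main() {
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  // Two independent streams, handled by a single thread and a single event loop:
  RTSPClient* clientA = RTSPClient::createNew(*env, "rtsp://camera-a/stream", 0, "demo", 0);
  RTSPClient* clientB = RTSPClient::createNew(*env, "rtsp://camera-b/stream", 0, "demo", 0);
  clientA->sendOptionsCommand(continueAfterOPTIONS);
  clientB->sendOptionsCommand(continueAfterOPTIONS);

  env->taskScheduler().doEventLoop();  // all requests and responses are handled here
  return 0;  // not reached
}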
URL: From jshanab at smartwire.com Sat Nov 26 13:50:53 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Sat, 26 Nov 2011 21:50:53 +0000 Subject: [Live-devel] problems moving to asynchronous rtsp interface In-Reply-To: <771004CE-E49A-44B2-816B-DE317F588FA6@live555.com> References: <615FD77639372542BF647F5EBAA2DBC20B18D93B@IL-BOL-EXCH01.smartwire.com> <9D34D434-89C2-4F74-9D05-44FCC2888400@live555.com> <615FD77639372542BF647F5EBAA2DBC20B18DA38@IL-BOL-EXCH01.smartwire.com> <771004CE-E49A-44B2-816B-DE317F588FA6@live555.com> Message-ID: <615FD77639372542BF647F5EBAA2DBC20B18DA98@IL-BOL-EXCH01.smartwire.com> I think it is I who doesn't understand. :) If I ever get the time I really need to just sit down and read thru the live555 code. For me an instance is one rtsp connection to a source. It has a statistics class that is updated and displayed thru a web console. It has some control, stop start pause, shutdown. Obviously the moment the eventloop is started it dictates another thread because otherwise it would be blocked to other requests. So for example I had all the env setup and everthing in a start function. I am now trying to at least refactor that to a static env and scheduler for all connections (is that correct?) I either need to have my class intercept the call back by having the callback more generic, or I need to make my class become a RTSPClient subclass. I am using a lot of boost::shrared pointers and the instance is contained in a map at the main application level and is passed to the stream manager for that style of stream. It is subscribed to by any number of consumers the first of which usually records to disk. I am trying the subclass idea now with just the setBaseURL moved to the protected area. From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Saturday, November 26, 2011 3:23 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] problems moving to asynchronous rtsp interface Threads != instances???. Don't I still need a way of keeping all the instance data separate? How does the callback allow me to differentiate between different instances (other than just the RTSPClient* itself? Once again, I don't understand. What's an "instance". Your "RTSPClient" subclass can contain whatever fields you like. The "RTSPClient*" pointer that you get in each of your response handler callbacks will be the same "RTSPClient*" on which you made the request; it should be able to (unambigously) give you all the state that you need. And no, don't "lie" to the interface (at least, not if you expect help on this mailing list). And yes, of course the "RTSPClient*" is used internally. Once again - if you (e.g.) call fooClient->sendOptionsCommand(continueAfterOPTIONS, ourAuthenticator); Then, when the "continueAfterOPTIONS()" response handler later gets called, its first parameter will be "fooClient" - i.e., a pointer to the same "RTSPClient" object on which you made the request. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
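[For the "URL not known at construction time" case discussed above, the idea of making "setBaseURL()" protected would be used roughly as follows. This is a sketch only: it assumes that accessibility change has actually been made, and "LateUrlRTSPClient" is a made-up name.]

class LateUrlRTSPClient: public RTSPClient {
public:
  LateUrlRTSPClient(UsageEnvironment& env)
    : RTSPClient(env, NULL/*rtspURL not yet known*/, 0, "LateUrlDemo", 0) {}

  // Called once the encoder model is known and the real URL has been built:
  void setStreamURL(char const* rtspURL) {
    setBaseURL(rtspURL);  // assumes setBaseURL() is protected, per the discussion above
  }
};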
URL: From finlayson at live555.com Sat Nov 26 14:01:57 2011 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 26 Nov 2011 14:01:57 -0800 Subject: [Live-devel] problems moving to asynchronous rtsp interface In-Reply-To: <615FD77639372542BF647F5EBAA2DBC20B18DA5D@IL-BOL-EXCH01.smartwire.com> References: <615FD77639372542BF647F5EBAA2DBC20B18D93B@IL-BOL-EXCH01.smartwire.com> <9D34D434-89C2-4F74-9D05-44FCC2888400@live555.com> <615FD77639372542BF647F5EBAA2DBC20B18D9EE@IL-BOL-EXCH01.smartwire.com> <0F6501C1-3741-4114-A0F3-739E04A5E74A@live555.com> <615FD77639372542BF647F5EBAA2DBC20B18DA5D@IL-BOL-EXCH01.smartwire.com> Message-ID: > But if the callback could use a void* for the first arg, then [...] You need to abandon this line of thinking. The asynchronous "RTSPClient" interface has been in place - and in use by lots of people - for a year and a half now. It's not going to change (at least, not significantly). Each response handler callback routine gets a pointer (as its first parameter). It's not a "void*"; instead, it's a "RTSPClient*". But this one pointer is enough to give you all the state you need. Having a second (or third) extra "void*" pointer would be extraneous and unnecessary. If you want to carry around a pointer to some extra state, then you can just do so using a field in your "RTSPClient" subclass. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sat Nov 26 14:12:38 2011 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 26 Nov 2011 14:12:38 -0800 Subject: [Live-devel] problems moving to asynchronous rtsp interface In-Reply-To: <615FD77639372542BF647F5EBAA2DBC20B18DA98@IL-BOL-EXCH01.smartwire.com> References: <615FD77639372542BF647F5EBAA2DBC20B18D93B@IL-BOL-EXCH01.smartwire.com> <9D34D434-89C2-4F74-9D05-44FCC2888400@live555.com> <615FD77639372542BF647F5EBAA2DBC20B18DA38@IL-BOL-EXCH01.smartwire.com> <771004CE-E49A-44B2-816B-DE317F588FA6@live555.com> <615FD77639372542BF647F5EBAA2DBC20B18DA98@IL-BOL-EXCH01.smartwire.com> Message-ID: > I think it is I who doesn?t understand. J If I ever get the time I really need to just sit down and read thru the live555 code. Ideally, you'd need to read only the header files, and the example applications - not the library ".cpp" files. (We don't live in an ideal world, though :-) > For me an instance is one rtsp connection to a source. OK. In LIVE555, this is encapsulated by a "RTSPClient" object. But you can add more state as well (see below). > It has a statistics class that is updated and displayed thru a web console. > It has some control, stop start pause, shutdown. Obviously the moment the eventloop is started it dictates another thread because otherwise it would be blocked to other requests. > > So for example I had all the env setup and everthing in a start function. I am now trying to at least refactor that to a static env and scheduler for all connections (is that correct?) > I either need to have my class intercept the call back by having the callback more generic, or I need to make my class become a RTSPClient subclass. Or your "RTSPClient" subclass can simply contain a member field that *points to* an object of your "instance" class. That's all I'm suggesting. Subclass "RTSPClient", and add a field called "fParentInstance" (or something) that points to an object of your existing "instance" (e.g.) class. 
Then, in each of your response handlers, you just cast the first parameter to be a pointer to your "RTSPClient" subclass (because you know it is), and then access "fParentInstance". Voila! You have your state back. > I am trying the subclass idea now with just the setBaseURL moved to the protected area. I'll make this change in the next release of the software, because it's likely to be generally useful. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sat Nov 26 19:50:32 2011 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 26 Nov 2011 19:50:32 -0800 Subject: [Live-devel] New LIVE555 version - changes the implementation of "Locale" Message-ID: The VLC developers have pointed out that the system "setlocale()" function that we used to implement our "Locale" class might not be 'thread safe' in all systems. They suggested using the more modern 'xlocale.h' functions "newlocale()"/"uselocale()" instead. I've installed a new version (2011.11.27) of the "LIVE555 Streaming Media" code that changes the implementation of "Locale" accordingly. Unfortunately, though, things are not quite that simple, because some systems do not support the new 'xlocale.h' functions. For those systems, I needed to keep around an #ifdef'd version of the old implementation. One such system is FreeBSD. To support this, I had to add a new compile-time definition -DXLOCALE_NOT_USED=1 to the "config.freebsd" file. Also, because at least some Windows versions don't support the new 'xlocale.h' functions, Windows, by default, will continue to use the old implementation. If, however, you're sure that your own Windows version supports the new functions, you can instead get the new implementation by defining XLOCALE_USED at compile time. Some of you may find that some other systems out there also do not support the new 'xlocale' functions (i.e., they don't have the "xlocale.h" header file). If you find such a system, then please let us know ASAP, so we can update their "config." files also. (A note to the VLC developers: The first three of your 'LIVE555 patches' have now been taken care of. Two more to go!) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From chris at gotowti.com Mon Nov 28 11:54:28 2011 From: chris at gotowti.com (Chris Richardson (WTI)) Date: Mon, 28 Nov 2011 11:54:28 -0800 Subject: [Live-devel] MPEG-2 Program Map Table PID Message-ID: <01b301ccae07$8ac8c7c0$a05a5740$@com> Hi Ross, I am using the LIVE555 libraries to wrap H.264 video data inside an MPEG-2 transport stream. Everything is working great except for one problem: The MPEG2TransportStreamMultiplexor class is currently using a hard-coded Program Map Table PID of 0x10 (OUR_PROGRAM_MAP_PID), which is not allowed by either DVB or ATSC standards. I can obviously change the value of this constant in the library code, but I try to avoid making changes to the library code, based on both your recommendation and the fact that I don't want to constantly merge my own changes in when a new version is released. Do you have a recommended approach for changing this value? I'd be happy to implement the required changes and post a patch, if desired. Thanks, Chris Richardson WTI -------------- next part -------------- An HTML attachment was scrubbed... 
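[Regarding the "Locale" change announced above: the 'xlocale' functions it now uses follow the standard newlocale()/uselocale() idiom. A rough sketch of that idiom - not the actual "Locale" class source - for systems that provide these functions; some systems declare them in <xlocale.h> rather than <locale.h>.]

#include <locale.h>  // newlocale()/uselocale()/freelocale(), where available

void doSomethingWithPosixNumericLocale() {
  // Switch only the numeric category, and only for the calling thread:
  locale_t posixNumeric = newlocale(LC_NUMERIC_MASK, "POSIX", (locale_t)0);
  locale_t previous = uselocale(posixNumeric);

  // ... code that needs "." as the decimal separator runs here ...

  uselocale(previous);      // restore the thread's previous locale
  freelocale(posixNumeric);
}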
URL: From jshanab at smartwire.com Mon Nov 28 14:24:43 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Mon, 28 Nov 2011 22:24:43 +0000 Subject: [Live-devel] Infinit loop in DelayQue::synchronize on BasicTaskScheduler::createNew Message-ID: <615FD77639372542BF647F5EBAA2DBC20B18E277@IL-BOL-EXCH01.smartwire.com>

I am trying to use a subclass of RTSPClient and it has a static variable for the scheduler and environment so they will be available for the ctor of my subclass and therefore the base-class initializer. This seems like it would be the correct order of initialization, but the createNew for the scheduler goes into an infinite loop trying to synchronize. Debugging shows the curEntry = curEntry->next and ->prev. It looks like I need to set some variables before I call createNew?

-------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Nov 28 14:42:50 2011 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 28 Nov 2011 14:42:50 -0800 Subject: [Live-devel] Infinit loop in DelayQue::synchronize on BasicTaskScheduler::createNew In-Reply-To: <615FD77639372542BF647F5EBAA2DBC20B18E277@IL-BOL-EXCH01.smartwire.com> References: <615FD77639372542BF647F5EBAA2DBC20B18E277@IL-BOL-EXCH01.smartwire.com> Message-ID: <96C24696-56AB-46D0-A79C-86B6A5B41B00@live555.com>

> I am trying to use a subclass of RTSPClient and it has a static variable for the scheduler and environment

You don't need to do this. Each subclass of "Medium" (which includes "RTSPClient") has a member function envir() which returns its "UsageEnvironment". And "UsageEnvironment" has a member function taskScheduler() which returns its "TaskScheduler". So you shouldn't need to add any fields for these.

Note that - each time you create a "TaskScheduler"/"UsageEnvironment" pair - you first create the "TaskScheduler"; then you create a "UsageEnvironment" that uses it. (Note the numerous examples in "testProgs".) Therefore, when deleting these objects, you should do so in reverse order:
- First call "reclaim()" on the "UsageEnvironment" object
- Then "delete" the "TaskScheduler" object.
(Yes, this is rather ugly and inconsistent...)

Ross.

-------------- next part -------------- An HTML attachment was scrubbed... URL: From jshanab at smartwire.com Mon Nov 28 14:56:10 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Mon, 28 Nov 2011 22:56:10 +0000 Subject: [Live-devel] Infinit loop in DelayQue::synchronize on BasicTaskScheduler::createNew In-Reply-To: <96C24696-56AB-46D0-A79C-86B6A5B41B00@live555.com> References: <615FD77639372542BF647F5EBAA2DBC20B18E277@IL-BOL-EXCH01.smartwire.com> <96C24696-56AB-46D0-A79C-86B6A5B41B00@live555.com> Message-ID: <615FD77639372542BF647F5EBAA2DBC20B18E2B4@IL-BOL-EXCH01.smartwire.com>

I think I was not clear. I had a CRTSPClient that had a RTSPClient as a variable. What I tried to do was inherit from both the RTSPClient and my IClient interface. So I need to create the scheduler and environment before the ctor, because of the base-class initializer needing the env at the time of the ctor. So I gave the class the two class static variables and they are guaranteed to initialize before the ctor runs. It looks like this:
namespace MVS {

TaskScheduler* MVSRTSPClient::scheduler_ = BasicTaskScheduler::createNew();
UsageEnvironment* MVSRTSPClient::env_ = BasicUsageEnvironment::createNew(*scheduler_);
const char* MVSRTSPClient::clientProtocolName = "RTSP";
//boost::mutex MVSRTSPClient::handlerMutex_;

MVSRTSPClient::MVSRTSPClient(CAMERA::stats* stats)
  : IClient(),
    RTSPClient(*env_, "rtspURL", 0, "MVSRTSPClient", 0),
    durationSlop_(-1.0),
    madeProgress_(False),
    areAlreadyShuttingDown_(false),
    session_(NULL),
    setupIter_(NULL),
    sessionTimerTask_(NULL),
    qosMeasurementTimerTask_(NULL),
    arrivalCheckTimerTask_(NULL),
    interPacketGapCheckTimerTask_(NULL),
    playContinuously_(False),
    syncStreams_(True),
    notifyOnPacketArrival_(True),
    interPacketGapMaxTime_(0),
    totNumPacketsReceived_(0),
    StreamName_("unnamed source")
{
}

From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Monday, November 28, 2011 4:43 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Infinit loop in DelayQue::synchronize on BasicTaskScheduler::createNew

I am trying to use a subclass of RTSPClient and it has a static variable for the scheduler and environment

You don't need to do this. Each subclass of "Medium" (which includes "RTSPClient") has a member function envir() which returns its "UsageEnvironment". And "UsageEnvironment" has a member function taskScheduler() which returns its "TaskScheduler". So you shouldn't need to add any fields for these. Note that - each time you create a "TaskScheduler"/"UsageEnvironment" pair - you first create the "TaskScheduler"; then you create a "UsageEnvironment" that uses it. (Note the numerous examples in "testProgs".) Therefore, when deleting these objects, you should do so in reverse order: - First call "reclaim()" on the "UsageEnvironment" object - Then "delete" the "TaskScheduler" object. (Yes, this is rather ugly and inconsistent...) Ross.

-------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Nov 28 15:23:38 2011 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 28 Nov 2011 15:23:38 -0800 Subject: [Live-devel] Infinit loop in DelayQue::synchronize on BasicTaskScheduler::createNew In-Reply-To: <615FD77639372542BF647F5EBAA2DBC20B18E2B4@IL-BOL-EXCH01.smartwire.com> References: <615FD77639372542BF647F5EBAA2DBC20B18E277@IL-BOL-EXCH01.smartwire.com> <96C24696-56AB-46D0-A79C-86B6A5B41B00@live555.com> <615FD77639372542BF647F5EBAA2DBC20B18E2B4@IL-BOL-EXCH01.smartwire.com> Message-ID:

OK, I see now what you're doing. Because you've made your "TaskScheduler" and "UsageEnvironment" *static* member variables, it's not inconceivable that their initialization is taking place before the initialization of some other static variables or constants in the LIVE555 code that they happen to depend on. I can't say for sure, but because you are seeing such strange behavior (an infinite loop), then I suggest that you not do this. Instead, do what most (every) other LIVE555 application does: Create the "TaskScheduler" and "UsageEnvironment" objects at the start of the main program (or at the start of each thread if you're using multiple threads), before you create any "Medium" objects that use them.

Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed...
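[A sketch of the per-thread setup and teardown order described above. The watch variable and URL are placeholders, and the client here could equally be an "RTSPClient" subclass.]

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"

char stopEventLoop = 0;  // set to a non-zero value from a handler to leave the loop

void runOneRtspThread(char const* rtspURL) {
  // Create the scheduler first, then an environment that uses it,
  // and only then any "Medium" objects (such as the RTSPClient):
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  RTSPClient* client = RTSPClient::createNew(*env, rtspURL, 0, "demo", 0);
  // ... send commands with response handlers, then run the event loop:
  env->taskScheduler().doEventLoop(&stopEventLoop);

  // Teardown, in reverse order:
  Medium::close(client);   // close "Medium" objects first
  env->reclaim();          // then the environment...
  delete scheduler;        // ...and finally the scheduler
}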
URL: From finlayson at live555.com Mon Nov 28 15:56:15 2011 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 28 Nov 2011 15:56:15 -0800 Subject: [Live-devel] MPEG-2 Program Map Table PID In-Reply-To: <01b301ccae07$8ac8c7c0$a05a5740$@com> References: <01b301ccae07$8ac8c7c0$a05a5740$@com> Message-ID: <7D021675-AC5C-4642-A2A4-62A057A05513@live555.com>

> I am using the LIVE555 libraries to wrap H.264 video data inside an MPEG-2 transport stream. Everything is working great except for one problem: The MPEG2TransportStreamMultiplexor class is currently using a hard-coded Program Map Table PID of 0x10 (OUR_PROGRAM_MAP_PID), which is not allowed by either DVB or ATSC standards.

That's odd. I've seen several Transport Stream files that use 0x10 as the "Program_map_PID", and Table 2-3 of ISO/IEC 13818-1 says that PIDs in the range 0x10 through 0x1FFE "May be assigned as network_PID, Program-map_PID, elementary_PID, or for other purposes". But perhaps DVB and ATSC place more restrictions on what PIDs can be valid "Program_map_PID"s? (If so, then what PIDs are valid in these systems?)

> I can obviously change the value of this constant in the library code, but I try to avoid making changes to the library code, based on both your recommendation and the fact that I don't want to constantly merge my own changes in when a new version is released.

Yes, definitely. Thanks for bringing this issue to our attention.

> Do you have a recommended approach for changing this value?

The easiest solution would simply be to redefine the constant to be some different value that's valid for DVB and ATSC (as well as every other Transport Stream file). Do you have a suggestion? (I've also seen several Transport Stream files that use 0x42. Would that be valid?)

Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From chris at gotowti.com Mon Nov 28 16:30:30 2011 From: chris at gotowti.com (Chris Richardson (WTI)) Date: Mon, 28 Nov 2011 16:30:30 -0800 Subject: [Live-devel] MPEG-2 Program Map Table PID In-Reply-To: <7D021675-AC5C-4642-A2A4-62A057A05513@live555.com> References: <01b301ccae07$8ac8c7c0$a05a5740$@com> <7D021675-AC5C-4642-A2A4-62A057A05513@live555.com> Message-ID: <01f101ccae2e$1a4a3020$4ede9060$@com>

Hi Ross, Thanks for your prompt response.

>>That's odd. I've seen several Transport Stream files that use 0x10 as the "Program_map_PID", and Table 2-3 of ISO/IEC 13818-1 says that PIDs in the range 0x10 through 0x1FFE "May be assigned as network_PID, Program-map_PID, elementary_PID, or for other purposes".
>>But perhaps DVB and ATSC place more restrictions on what PIDs can be valid "Program_map_PID"s? (If so, then what PIDs are valid in these systems?)

Yep, indeed I didn't mean to imply that 0x10 is an invalid MPEG-2 PID; only that it is invalid for ATSC, as well as for the program map table in DVB. DVB reserves PID 0x10 for the "Network Information Table" (NIT), and ATSC disallows using PIDs that will conflict with DVB. The first legal ATSC Program Map Table PID is 0x30, as seen in the following document on page 23: http://www.atsc.org/cms/standards/a53/a_53-Part-3-2009.pdf

"6.9 PID Value Assignments
In order to avoid collisions with fixed PID values and ranges already established in this and other international standards, transport_packet() PID field values are restricted as follows:
- TS packets identified with PID values in the range 0x1FF0 -
0x1FFE shall only be used to transport data compliant with ATSC-recognized standards specifying fixed-value PID assignments in that range. (Informative note: One such use is A/65, which requires the use of 0x1FFB to identify packets containing certain tables defined in that standard.)
- In order to avoid collisions with fixed PID values and ranges already established in this and other international standards, PID values used to identify Transport Stream packets carrying TS_program_map_section() or program elements shall not be set below 0x0030. (Informative note: One such use is in ETS 300 468, which requires the use of 0x0011 to identify packets containing certain tables defined in that standard.)"

>>The easiest solution would simply be to redefine the constant to be some different value that's valid for DVB and ATSC (as well as every other Transport Stream file). Do you have a suggestion? (I've also seen several Transport Stream files that use 0x42. Would that be valid?)

My customer has informed me that ATSC uses PIDs 0x30, 0x40, 0x50, etc. to represent channel 3, 4, 5 etc., so I am not sure if 0x42 would be sub-channel 4.2, or if it would be allowed at all. My opinion would be to use 0x30, as it is legal for both ATSC and DVB, and seems to be valid for any transport stream. However I am not an expert here and welcome anybody else's opinions or facts on this matter. Thanks, Chris Richardson WTI

From finlayson at live555.com Mon Nov 28 17:07:13 2011 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 28 Nov 2011 17:07:13 -0800 Subject: [Live-devel] MPEG-2 Program Map Table PID In-Reply-To: <01f101ccae2e$1a4a3020$4ede9060$@com> References: <01b301ccae07$8ac8c7c0$a05a5740$@com> <7D021675-AC5C-4642-A2A4-62A057A05513@live555.com> <01f101ccae2e$1a4a3020$4ede9060$@com> Message-ID:

> My opinion would be to use 0x30, as it is legal for both ATSC and DVB, and seems to be valid for > any transport stream.

Ok, thanks. I've just installed a new version (2011.11.29) of the "LIVE555 Streaming Media" code that changes this constant.

Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From Mahesh.Duvarapu at corpus.com Mon Nov 28 23:05:04 2011 From: Mahesh.Duvarapu at corpus.com (Mahesh Duvarapu) Date: Tue, 29 Nov 2011 07:05:04 +0000 Subject: [Live-devel] live streaming from a file Message-ID: <5F3F612A7EC7744D992CA39123BA0ED4B3B988@COLOCORMBWS8-04.corpusinc.corp>

Hi, I need a live MPEG-TS stream for testing some media players. I would like to simulate live streaming by looping a *.ts file on my server (for lack of encoders and a live source), so that connecting clients will not always start playing the stream from the beginning of the file. I have set up live555 to stream the *.ts file on demand, but the players always play the stream from the beginning of the file. Please let me know how to simulate a live stream from a *.ts file stored on my server's HDD. Mahesh.DL -------------- next part -------------- An HTML attachment was scrubbed...
URL: From sr at coexsi.fr Tue Nov 29 00:24:04 2011 From: sr at coexsi.fr (=?iso-8859-1?Q?S=E9bastien_RAILLARD_=28COEXSI=29?=) Date: Tue, 29 Nov 2011 09:24:04 +0100 Subject: [Live-devel] MPEG-2 Program Map Table PID In-Reply-To: References: <01b301ccae07$8ac8c7c0$a05a5740$@com> <7D021675-AC5C-4642-A2A4-62A057A05513@live555.com> <01f101ccae2e$1a4a3020$4ede9060$@com> Message-ID: <002201ccae70$42387c30$c6a97490$@coexsi.fr> For DVB, PID up to 0x1F are reserved, see the table #1 in page #18 of the norm : http://www.etsi.org/deliver/etsi_en/300400_300499/300468/01.11.01_60/en_3004 68v011101p.pdf Sebastien. From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: mardi 29 novembre 2011 02:07 To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] MPEG-2 Program Map Table PID My opinion would be to use 0x30, as it is legal for both ATSC and DVB, and seems to be valid for any transport stream. Ok, thanks. I've just installed a new version (2011.11.29) of the "LIVE555 Streaming Media" code that changes this constant. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Nov 29 00:49:59 2011 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 29 Nov 2011 00:49:59 -0800 Subject: [Live-devel] live streaming from a file In-Reply-To: <5F3F612A7EC7744D992CA39123BA0ED4B3B988@COLOCORMBWS8-04.corpusinc.corp> References: <5F3F612A7EC7744D992CA39123BA0ED4B3B988@COLOCORMBWS8-04.corpusinc.corp> Message-ID: <91B58F7C-1FD0-41EF-944A-F7CB072D9DD9@live555.com> > My requirement needs a live mpeg ts stream for testing some media players. I would like to simulate a live streaming by looping a *.ts file in my server (lack of encoders and live source), so that the connecting clients will not start playing the stream from beginning of the file always. The easiest way to do this is to develop a server that appears to be reading from a live input source (see http://www.live555.com/liveMedia/faq.html#liveInput-unicast for details) but have the input source be your own subclass (that you would write) of "ByteStreamFileSource" that handles reads ("doGetNextFrame()") by looping through the file, and never actually generating "file closed" events. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From felix at embedded-sol.com Tue Nov 29 04:57:29 2011 From: felix at embedded-sol.com (Felix Radensky) Date: Tue, 29 Nov 2011 14:57:29 +0200 Subject: [Live-devel] Passing H.264 RTP stream to hardware decoder Message-ID: <4ED4D6B9.3040008@embedded-sol.com> Hi, I'd like to create an application that gets H.264 stream over RTP and passes it to hardware decoder. The decoder API expects a buffer containing H.264 frame and frame size. Will the combination of H264VideoRTPSource and H264VideoStreamDiscreteFramer will do what I need ? Thanks. Felix. From finlayson at live555.com Tue Nov 29 05:26:34 2011 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 29 Nov 2011 05:26:34 -0800 Subject: [Live-devel] Passing H.264 RTP stream to hardware decoder In-Reply-To: <4ED4D6B9.3040008@embedded-sol.com> References: <4ED4D6B9.3040008@embedded-sol.com> Message-ID: <1B0B51AE-845D-4EA2-AE0C-C52FF4089AA1@live555.com> > I'd like to create an application that gets H.264 stream over RTP and > passes it to hardware decoder. 
The decoder API expects a buffer containing > H.264 frame and frame size. > > Will the combination of H264VideoRTPSource and H264VideoStreamDiscreteFramer > will do what I need ? You probably need just a "H264VideoRTPSource". (A "H264VideoStreamDiscreteFramer" is needed only when you are *transmitting* H.264 over RTP.) You may also need to call "parseSPropParameterSets()" to generate SPS and PPS NAL units to feed to your decoder, in case these don't appear in-line in the stream (or even if they do, if they don't appear frequently). See "liveMedia/include/H264VideoRTPSource.hh" Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From isambhav at gmail.com Tue Nov 29 05:36:13 2011 From: isambhav at gmail.com (Sambhav) Date: Tue, 29 Nov 2011 19:06:13 +0530 Subject: [Live-devel] RTSP latency on Android Clients Message-ID: Hi, On Android clients RTSP playback has a latency of around 5-6 seconds. The client buffers data for this duration and then starts playing. For live streaming this latency is very high. Does the client use the bandwidth information (b=AS:) to calculate the buffer size. e.g 5-6sec in this case? How is the bandwidth computed in the Live555 ? Did some search and few people suggested to do a buffer blasting initially when the session starts. i.e send data at a high rate at start so that the client buffer fills up fast and eventually reduce the startup latency. Can streaming framerate be controlled at runtime in Live555 ? Regards, Sambhav -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Nov 29 06:00:16 2011 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 29 Nov 2011 06:00:16 -0800 Subject: [Live-devel] RTSP latency on Android Clients In-Reply-To: References: Message-ID: <2676030A-89DD-4E44-B6B3-79BAF55ABBE3@live555.com> > On Android clients RTSP playback has a latency of around 5-6 seconds. The client buffers data for this duration and then starts playing. > For live streaming this latency is very high. > > Does the client use the bandwidth information (b=AS:) to calculate the buffer size. e.g 5-6sec in this case? How would we know? You'll have to ask whoever developed the client. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From felix at embedded-sol.com Tue Nov 29 06:32:06 2011 From: felix at embedded-sol.com (Felix Radensky) Date: Tue, 29 Nov 2011 16:32:06 +0200 Subject: [Live-devel] Passing H.264 RTP stream to hardware decoder In-Reply-To: <1B0B51AE-845D-4EA2-AE0C-C52FF4089AA1@live555.com> References: <4ED4D6B9.3040008@embedded-sol.com> <1B0B51AE-845D-4EA2-AE0C-C52FF4089AA1@live555.com> Message-ID: <4ED4ECE6.6090204@embedded-sol.com> Hi Ross, On 11/29/2011 03:26 PM, Ross Finlayson wrote: >> I'd like to create an application that gets H.264 stream over RTP and >> passes it to hardware decoder. The decoder API expects a buffer >> containing >> H.264 frame and frame size. >> >> Will the combination of H264VideoRTPSource and >> H264VideoStreamDiscreteFramer >> will do what I need ? > > You probably need just a "H264VideoRTPSource". (A > "H264VideoStreamDiscreteFramer" is needed only when you are > *transmitting* H.264 over RTP.) 
> > You may also need to call "parseSPropParameterSets()" to generate > SPS and PPS NAL units to feed to your decoder, in case these don't > appear in-line in the stream (or even if they do, if they don't > appear frequently). See "liveMedia/include/H264VideoRTPSource.hh" > > Thanks for a prompt reply. I guess I'm still a bit confused. How would my application access H264 frame received by H264VideoRTPSource ? One of the requirements I have, is that decoder will ask for a frame when it's ready to process one, so I need to have a routine that returns the last H264 frame and its size. Thanks. Felix. From finlayson at live555.com Tue Nov 29 06:45:45 2011 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 29 Nov 2011 06:45:45 -0800 Subject: [Live-devel] Passing H.264 RTP stream to hardware decoder In-Reply-To: <4ED4ECE6.6090204@embedded-sol.com> References: <4ED4D6B9.3040008@embedded-sol.com> <1B0B51AE-845D-4EA2-AE0C-C52FF4089AA1@live555.com> <4ED4ECE6.6090204@embedded-sol.com> Message-ID: <837C281C-A0FD-49C6-A98B-4C3E4A525A79@live555.com> > Thanks for a prompt reply. I guess I'm still a bit confused. How would my application access H264 frame > received by H264VideoRTPSource ? You would write a "MediaSink" subclass that encapsulates your decoder, and then call yourDecoderMediaSink->startPlaying(yourH264VideoRTPSource, ... ); and then env->taskScheduler().doEventLoop(); to enter the event loop. (See the numerous examples of this in the "testProgs" directory.) > One of the requirements I have, is that decoder will ask for a frame > when it's ready to process one, so I need to have a routine that returns the last H264 frame and its size. Yes, your decoder "MediaSink" subclass would do this by implementing the "continuePlaying()" virtual function by calling "getNextFrame()" on its input source (which, in this case, will be "yourH264VideoRTPSource"). I suggest that you look at the code for "FileSink" (another "MediaSink" subclass) for hints on how to do this. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From isambhav at gmail.com Tue Nov 29 06:48:07 2011 From: isambhav at gmail.com (Sambhav) Date: Tue, 29 Nov 2011 20:18:07 +0530 Subject: [Live-devel] RTSP latency on Android Clients In-Reply-To: <2676030A-89DD-4E44-B6B3-79BAF55ABBE3@live555.com> References: <2676030A-89DD-4E44-B6B3-79BAF55ABBE3@live555.com> Message-ID: Sorry. I was asking the question in more general sense about RTSP clients interpreting the bandwidth parameter of the SDP. The other question of controlling streaming framerate, where can I modify this parameter ? On Tue, Nov 29, 2011 at 7:30 PM, Ross Finlayson wrote: > On Android clients RTSP playback has a latency of around 5-6 seconds. The > client buffers data for this duration and then starts playing. > For live streaming this latency is very high. > > Does the client use the bandwidth information (b=AS:) to > calculate the buffer size. e.g 5-6sec in this case? > > > How would we know? You'll have to ask whoever developed the client. > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -------------- next part -------------- An HTML attachment was scrubbed... 
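[To illustrate the "MediaSink" subclass pattern described in the reply to Felix above, here is a minimal sketch modelled loosely on "FileSink". It is not code from this thread; "DecoderSink" and the commented-out decode call are placeholders for a real hardware-decoder API.]

#include "liveMedia.hh"

class DecoderSink: public MediaSink {
public:
  static DecoderSink* createNew(UsageEnvironment& env, unsigned bufferSize = 200000) {
    return new DecoderSink(env, bufferSize);
  }

protected:
  DecoderSink(UsageEnvironment& env, unsigned bufferSize)
    : MediaSink(env), fBufferSize(bufferSize) {
    fBuffer = new unsigned char[bufferSize];
  }
  virtual ~DecoderSink() { delete[] fBuffer; }

  // Called by startPlaying(), and again after each delivered NAL unit:
  virtual Boolean continuePlaying() {
    if (fSource == NULL) return False;
    fSource->getNextFrame(fBuffer, fBufferSize,
                          afterGettingFrame, this,
                          onSourceClosure, this);
    return True;
  }

private:
  static void afterGettingFrame(void* clientData, unsigned frameSize,
                                unsigned numTruncatedBytes,
                                struct timeval presentationTime,
                                unsigned durationInMicroseconds) {
    DecoderSink* sink = (DecoderSink*)clientData;
    // Hand fBuffer/frameSize (plus presentationTime) to the hardware decoder
    // here - placeholder call:
    // hwDecoderSubmit(sink->fBuffer, frameSize, presentationTime);
    sink->continuePlaying();  // then request the next NAL unit
  }

  unsigned char* fBuffer;
  unsigned fBufferSize;
};

// Usage, once the "H264VideoRTPSource" (here "h264RTPSource") exists:
//   DecoderSink* sink = DecoderSink::createNew(*env);
//   sink->startPlaying(*h264RTPSource, NULL, NULL);
//   env->taskScheduler().doEventLoop();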
URL: From finlayson at live555.com Tue Nov 29 07:22:17 2011 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 29 Nov 2011 07:22:17 -0800 Subject: [Live-devel] RTSP latency on Android Clients In-Reply-To: References: <2676030A-89DD-4E44-B6B3-79BAF55ABBE3@live555.com> Message-ID:

> Sorry. I was asking the question in more general sense about RTSP clients interpreting the bandwidth parameter of the SDP.

Media player applications often have a "buffer duration" parameter that you can adjust...

> The other question of controlling streaming framerate, where can I modify this parameter ?

Because a stream's frame rate (and bit rate) is a property of the stream data, it can be modified only at the server (perhaps by changing the data that's streamed).

Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From alvarezgalberto at uniovi.es Tue Nov 29 08:58:53 2011 From: alvarezgalberto at uniovi.es (Alberto Alvarez) Date: Tue, 29 Nov 2011 17:58:53 +0100 Subject: [Live-devel] Passing H.264 RTP stream to hardware decode Message-ID:

Hello, I am working on parsing H264 too - an RTP H264 stream. I found that it is a common approach of decoders (hardware and software) to request an entire AU (or frame). However, the code in live is parsing the streams on a NAL-by-NAL basis. At the (file) sink every NAL is appended to a file, with a start_code added accordingly. It is, indeed, working marvelously, thanks to your efforts. Still, I am fighting to migrate this approach to an AU-by-AU basis. For that, I think the (H264VideoRTP) source should detect the end of an AU (not a straightforward matter, though) and then instruct the FramedSource to write the entire AU to the sink (afterGetting). For that, I think the H264VideoRTPSource should handle the start_codes (before each NAL) instead of doing it at the sink. Right now I am looking for a way to implement this. If my idea is not entirely misguided, where should I add the start_codes in the source?

Best regards Alberto Álvarez González alvarezgalberto at uniovi.es -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Nov 29 10:08:32 2011 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 29 Nov 2011 10:08:32 -0800 Subject: [Live-devel] Passing H.264 RTP stream to hardware decode In-Reply-To: References: Message-ID: <6634FB8C-1025-47C9-8F10-E0D66B300EAD@live555.com>

> I am working on parsing H264 too. A RTP H264 stream > I found that it is a common approach of decoders (hard and soft) to request an entire AU (or frame). > However, the code in live is parsing the streams at NAL-by-NAL basis.

If you're talking about *receiving* H.264/RTP streams, then the only LIVE555 code that's involved is "H264VideoRTPSource", which doesn't do any 'parsing' at all. It just 'delivers' - in this case NAL units, because that is what the H.264 RTP payload format defines to be the units that are packed into RTP packets. (Note also that, at least in principle, decoders can decode NAL units at a time, not just entire "access units" at a time; that's why it makes sense to deliver NAL units at a time.)

> Still, I am fighting to migrate this approach to a AU-by-AU basis. For that, i think the (H264VideoRTP) source should detect the end of an AU (not an straightforward matter though) and then instruct the FramedSource to write the entire AU to the sink (aftergetting).

We do that for many other video RTP payload formats.
For H.264, however, the units of delivery - as noted above - are NAL units, not "access units".

What you can do, however, is - after you've read each NAL unit from your "H264VideoRTPSource" - call "RTPSource::curPacketMarkerBit()" to check whether or not the RTP "M" bit was set on the most recently-received packet. This bit is used to indicate the end of an "access unit" (not just a NAL unit).

You might also want to look at the VLC code, because they've done H.264 RTP receiving/decoding using the LIVE555 libraries for several years now.

Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jshanab at smartwire.com Tue Nov 29 12:30:36 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Tue, 29 Nov 2011 20:30:36 +0000 Subject: [Live-devel] Passing H.264 RTP stream to hardware decode In-Reply-To: <6634FB8C-1025-47C9-8F10-E0D66B300EAD@live555.com> References: <6634FB8C-1025-47C9-8F10-E0D66B300EAD@live555.com> Message-ID: <615FD77639372542BF647F5EBAA2DBC20B18E72A@IL-BOL-EXCH01.smartwire.com>

I have a filter that receives NAL units and spits out frames. It is a simple state machine on NAL type. Watch out for the embedded/unembedded SPS and PPS packets; some sources have them and some don't. If they don't, I build them from the SDP and then insert them so clients are happy.

From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Tuesday, November 29, 2011 12:09 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Passing H.264 RTP stream to hardware decode

I am working on parsing H264 too. A RTP H264 stream I found that it is a common approach of decoders (hard and soft) to request an entire AU (or frame). However, the code in live is parsing the streams at NAL-by-NAL basis.

If you're talking about *receiving* H.264/RTP streams, then the only LIVE555 code that's involved is "H264VideoRTPSource", which doesn't do any 'parsing' at all. It just 'delivers' - in this case NAL units, because that is what the H.264 RTP payload format defines to be the units that are packed into RTP packets. (Note also that, at least in principle, decoders can decode NAL units at a time, not just entire "access units" at a time; that's why it makes sense to deliver NAL units at a time.)

Still, I am fighting to migrate this approach to a AU-by-AU basis. For that, i think the (H264VideoRTP) source should detect the end of an AU (not an straightforward matter though) and then instruct the FramedSource to write the entire AU to the sink (aftergetting).

We do that for many other video RTP payload formats. For H.264, however, the units of delivery - as noted above - are NAL units, not "access units". What you can do, however, is - after you've read each NAL unit from your "H264VideoRTPSource" - call "RTPSource::curPacketMarkerBit()" to check whether or not the RTP "M" bit was set on the most recently-received packet. This bit is used to indicate the end of an "access unit" (not just a NAL unit). You might also want to look at the VLC code, because they've done H.264 RTP receiving/decoding using the LIVE555 libraries for several years now.

Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed...
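[A short, hedged sketch of the two techniques discussed above: recovering SPS/PPS NAL units from the SDP's "sprop-parameter-sets", and using the RTP marker bit to spot the end of an access unit. The decoder call is a placeholder, and "subsession" is assumed to be the already-set-up H.264 "MediaSubsession".]

#include "liveMedia.hh"

// Feed SPS/PPS (from the SDP) to a decoder that does not receive them in-band:
void feedParameterSets(MediaSubsession& subsession) {
  unsigned numRecords = 0;
  SPropRecord* records =
      parseSPropParameterSets(subsession.fmtp_spropparametersets(), numRecords);
  for (unsigned i = 0; i < numRecords; ++i) {
    // Each record is one NAL unit (SPS or PPS) with no start code; if the
    // decoder expects Annex-B input, prepend 0x00 0x00 0x00 0x01 first:
    // decoderFeed(records[i].sPropBytes, records[i].sPropLength);  // placeholder
  }
  delete[] records;  // the array is dynamically allocated for the caller
}

// Inside a sink's per-NAL-unit callback, after handing the NAL unit over,
// the end of an access unit can be detected from the marker bit:
Boolean accessUnitComplete(MediaSubsession& subsession) {
  RTPSource* src = subsession.rtpSource();
  return src != NULL && src->curPacketMarkerBit();
}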
URL: From egoltzman at gmail.com Tue Nov 29 13:26:17 2011 From: egoltzman at gmail.com (eyal goltzman) Date: Tue, 29 Nov 2011 23:26:17 +0200 Subject: [Live-devel] StreamParser internal error (149999+ 4 > 150000) Message-ID:

Hello, I'm using live as an RTSP server and it worked great with the test videos that come with the package; I also succeeded with some other sample videos. The version I'm using is 0.72 2011.10.27. I tried to stream the attached sample video ( http://dl.dropbox.com/u/12151823/2mp.264), which has the following mediainfo:

General
Complete name : C:\Documents and Settings\Eyalg\My Documents\2mp.264
Format : AVC
Format/Info : Advanced Video Codec
File size : 7.83 MiB

Video
Format : AVC
Format/Info : Advanced Video Codec
Format profile : Main at L5.1
Format settings, CABAC : Yes
Format settings, ReFrames : 1 frame
Format settings, GOP : M=1, N=50
Bit rate : 1 000 Kbps
Width : 1 600 pixels
Height : 1 200 pixels
Display aspect ratio : 4:3
Frame rate : 25.000 fps
Color space : YUV
Chroma subsampling : 4:2:0
Bit depth : 8 bits
Scan type : Progressive
Bits/(Pixel*Frame) : 0.021
Writing library : x264 core 116 r2074 2641b9e
Encoding settings : cabac=1 / ref=1 / deblock=0:0:0 / analyse=0x1:0x111 / me=hex / subme=7 / psy=1 / psy_rd=1.00:0.00 / mixed_ref=0 / me_range=16 / chroma_me=0 / trellis=1 / 8x8dct=0 / cqm=0 / deadzone=21,11 / fast_pskip=1 / chroma_qp_offset=-2 / threads=3 / sliced_threads=0 / nr=0 / decimate=1 / interlaced=0 / bluray_compat=0 / constrained_intra=0 / bframes=0 / weightp=0 / keyint=50 / keyint_min=2 / scenecut=0 / intra_refresh=0 / rc_lookahead=40 / rc=abr / mbtree=1 / bitrate=1000 / ratetol=1.0 / qcomp=0.60 / qpmin=10 / qpmax=51 / qpstep=4 / ip_ratio=1.40 / aq=1:1.00

I found it to be the same as test.264 other than the higher resolution, but the application stops working with the following error:

StreamParser internal error (149999+ 4 > 150000)
Aborted (core dumped)

Thanks for any help in advance, Eyal -------------- next part -------------- An HTML attachment was scrubbed... URL: From alvarezgalberto at uniovi.es Tue Nov 29 13:40:45 2011 From: alvarezgalberto at uniovi.es (Alberto Alvarez) Date: Tue, 29 Nov 2011 22:40:45 +0100 Subject: [Live-devel] live-devel Digest, Vol 97, Issue 39 In-Reply-To: References: Message-ID:

> > Date: Tue, 29 Nov 2011 10:08:32 -0800 > From: Ross Finlayson > To: LIVE555 Streaming Media - development & use > > Subject: Re: [Live-devel] Passing H.264 RTP stream to hardware decode > Message-ID: <6634FB8C-1025-47C9-8F10-E0D66B300EAD at live555.com> > Content-Type: text/plain; charset="iso-8859-1" > > > I am working on parsing H264 too. A RTP H264 stream > > I found that it is a common approach of decoders (hard and soft) to > request an entire AU (or frame). > > However, the code in live is parsing the streams at NAL-by-NAL basis. > > If you're talking about *receiving* H.264/RTP streams, then the only > LIVE555 code that's involved is "H264VideoRTPSource", which doesn't do any > 'parsing' at all. It just 'delivers' - in this case NAL units, because > that is what the H.264 RTP payload format defines to be the units that are > packed into RTP packets. (Note also that, at least in principle, decoders > can decode NAL units at a time, not just entire "access units" at a time; > that's why it makes sense to deliver NAL units at a time.) > > > > Still, I am fighting to migrate this approach to a AU-by-AU basis.
For > that, I think the (H264VideoRTP) source should detect the end of an AU (not > a straightforward matter, though) and then instruct the FramedSource to > write the entire AU to the sink (afterGetting). > > We do that for many other video RTP payload formats. For H.264, however, > the units of delivery - as noted above - are NAL units, not "access units". > > What you can do, however, is - after you've read each NAL unit from your > "H264VideoRTPSource" - call "RTPSource::curPacketMarkerBit()" to check > whether or not the RTP "M" bit was set on the most recently-received > packet. This bit is used to indicate the end of an "access unit" (not just > a NAL unit). > > I already use the marker bit, along with the NAL type, to identify the end of an AU. (I still need to prove it to be a secure detection under losses.) > You might also want to look at the VLC code, because they've done H.264 > RTP receiving/decoding using the LIVE555 libraries for several years now. > > VLC, AFAIK, does not implement SVC decoding, which I pursue. MPlayer does, but the parsing is again NAL-by-NAL and their code is a bit heavy for my taste. I do like your code more. I am trying to insert the start codes once I detect a packet that begins a new NAL (not always true), then continue doing so until I detect a new AU and there mark fCurrentPacketCompletesFrame so that afterGetting is executed. Is this correct? Thank you very much. -------------- next part -------------- An HTML attachment was scrubbed... URL: From alvarezgalberto at uniovi.es Tue Nov 29 13:42:51 2011 From: alvarezgalberto at uniovi.es (Alberto Alvarez) Date: Tue, 29 Nov 2011 22:42:51 +0100 Subject: [Live-devel] Passing H.264 RTP stream to hardware decode Message-ID: Sorry for the bad subject!
-------------- next part -------------- An HTML attachment was scrubbed... URL: From alvarezgalberto at uniovi.es Tue Nov 29 13:51:00 2011 From: alvarezgalberto at uniovi.es (Alberto Alvarez) Date: Tue, 29 Nov 2011 22:51:00 +0100 Subject: [Live-devel] Passing H.264 RTP stream to hardware decode Message-ID: > > > Date: Tue, 29 Nov 2011 20:30:36 +0000 > From: Jeff Shanab > To: LIVE555 Streaming Media - development & use > > Subject: Re: [Live-devel] Passing H.264 RTP stream to hardware decode > Message-ID: > < > 615FD77639372542BF647F5EBAA2DBC20B18E72A at IL-BOL-EXCH01.smartwire.com> > Content-Type: text/plain; charset="us-ascii" > > I have a filter that receives NAL units and spits out frames. > It is a simple state machine on NAL type. > I merely use the RTP marker bit and NAL unit type 6 (payload type 10), which for my encoder configuration is working pretty nicely so far. However, I have not tested it against heavy losses or with other encoders. > > Watch out for the embedded/unembedded SPS and PPS packets; some sources > have them and some don't. If they don't, I build them from the SDP and then > insert them in so clients are happy. > I should investigate that. My encoder configuration produces those at the start of the stream, but I have not thought about losing them yet. Thank you for your help. Best Regards > > From: live-devel-bounces at ns.live555.com [mailto: > live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson > Sent: Tuesday, November 29, 2011 12:09 PM > To: LIVE555 Streaming Media - development & use > Subject: Re: [Live-devel] Passing H.264 RTP stream to hardware decode > > I am working on parsing H.264 too (an RTP H.264 stream). > I found that it is a common approach of decoders (hard and soft) to > request an entire AU (or frame). > However, the code in live is parsing the streams on a NAL-by-NAL basis. > > If you're talking about *receiving* H.264/RTP streams, then the only > LIVE555 code that's involved is "H264VideoRTPSource", which doesn't do any > 'parsing' at all. It just 'delivers' - in this case NAL units, because > that is what the H.264 RTP payload format defines to be the units that are > packed into RTP packets. (Note also that, at least in principle, decoders > can decode NAL units at a time, not just entire "access units" at a time; > that's why it makes sense to deliver NAL units at a time.) > > > > Still, I am fighting to migrate this approach to an AU-by-AU basis. For > that, I think the (H264VideoRTP) source should detect the end of an AU (not > a straightforward matter, though) and then instruct the FramedSource to > write the entire AU to the sink (afterGetting). > > We do that for many other video RTP payload formats. For H.264, however, > the units of delivery - as noted above - are NAL units, not "access units".
> > What you can do, however, is - after you've read each NAL unit from your > "H264VideoRTPSource" - call "RTPSource::curPacketMarkerBit()" to check > whether or not the RTP "M" bit was set on the most recently-received > packet. This bit is used to indicate the end of an "access unit" (not just > a NAL unit). > > You might also want to look at the VLC code, because they've done H.264 > RTP receiving/decoding using the LIVE555 libraries for several years now. > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > -------------- next part -------------- > An HTML attachment was scrubbed... > URL: < > http://lists.live555.com/pipermail/live-devel/attachments/20111129/692908f0/attachment.html > > > > ------------------------------ > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > > End of live-devel Digest, Vol 97, Issue 39 > ****************************************** > -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Nov 29 16:58:09 2011 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 29 Nov 2011 16:58:09 -0800 Subject: [Live-devel] StreamParser internal error (149999+ 4 > 150000) In-Reply-To: References: Message-ID: <0B428036-277D-430B-8664-FEB4B992EE31@live555.com> The problem here is the extremely large H.264 frame (about 280,000 bytes in size) that you have in this video. This was too big for our stream parsing code to handle. You can fix this by changing the constant BANK_SIZE in "liveMedia/StreamParser.cpp" from 150000 to 300000. (I'll also make this change in the next release of the software.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From tuan.dn at anlab.vn Wed Nov 30 01:55:12 2011 From: tuan.dn at anlab.vn (Tuan DN) Date: Wed, 30 Nov 2011 16:55:12 +0700 Subject: [Live-devel] option to make audio louder Message-ID: Hi everyone, I am a newbie with live555 and I would like to ask a question. I use live555 to record a media stream with the following command: testProgs/openRTSP.exe -d 10 -4 rtsp://192.168.1.174:5544 >live555.mp4 Is there any option to make the audio volume of the output file live555.mp4 bigger or smaller? Thanks -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Nov 30 05:05:36 2011 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 30 Nov 2011 05:05:36 -0800 Subject: [Live-devel] option to make audio louder In-Reply-To: References: Message-ID: > I use live555 to record a media stream with the following command: > > testProgs/openRTSP.exe -d 10 -4 rtsp://192.168.1.174:5544 >live555.mp4 > > Is there any option to make the audio volume of the output file live555.mp4 bigger or smaller? No, not in our software. If it's possible at all, it would only be by (somehow) editing the ".mp4" file after you've recorded it. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL:
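For reference, the StreamParser fix that Ross describes above for the "StreamParser internal error (149999+ 4 > 150000)" abort is a one-line change. BANK_SIZE appears as a #define near the top of liveMedia/StreamParser.cpp (verify that against the copy being edited), so the edit would look like this, followed by a rebuild of the liveMedia library:

// liveMedia/StreamParser.cpp
// #define BANK_SIZE 150000   // previous value: too small for this file's ~280,000-byte frame
#define BANK_SIZE 300000      // enlarged as suggested above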
From jshanab at smartwire.com Wed Nov 30 10:07:36 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Wed, 30 Nov 2011 18:07:36 +0000 Subject: [Live-devel] problems moving to asynchronous rtsp interface In-Reply-To: References: <615FD77639372542BF647F5EBAA2DBC20B18D93B@IL-BOL-EXCH01.smartwire.com> <9D34D434-89C2-4F74-9D05-44FCC2888400@live555.com> <615FD77639372542BF647F5EBAA2DBC20B18DA38@IL-BOL-EXCH01.smartwire.com> <771004CE-E49A-44B2-816B-DE317F588FA6@live555.com> <615FD77639372542BF647F5EBAA2DBC20B18DA98@IL-BOL-EXCH01.smartwire.com> Message-ID: <615FD77639372542BF647F5EBAA2DBC20B18EB14@IL-BOL-EXCH01.smartwire.com> I have almost gotten the migration to the asynchronous RTSP interface working in my code. I get to the point where the play command is sent and I get a reply and it prints out the Receiving Streaming Data, but then it just sits there, looping endlessly, never detecting arrival of packets. I have the identical checkForPacketArrival() as openRTSP, and it finds one subsession, and then src->receptionStatsDB().numActiveSourcesSinceLastReset() is always zero, so it exits the while, reschedules and tries again. When I wireshark and follow the stream I have the full conversation and then a stream of binary data. What did I miss? From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Saturday, November 26, 2011 4:13 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] problems moving to asynchronous rtsp interface I think it is I who doesn't understand. :) If I ever get the time I really need to just sit down and read through the live555 code. Ideally, you'd need to read only the header files, and the example applications - not the library ".cpp" files. (We don't live in an ideal world, though :-) For me an instance is one rtsp connection to a source. OK. In LIVE555, this is encapsulated by a "RTSPClient" object. But you can add more state as well (see below). It has a statistics class that is updated and displayed through a web console. It has some control: stop, start, pause, shutdown. Obviously the moment the event loop is started it dictates another thread, because otherwise it would be blocked to other requests. So for example I had all the env setup and everything in a start function. I am now trying to at least refactor that to a static env and scheduler for all connections (is that correct?). I either need to have my class intercept the callback by making the callback more generic, or I need to make my class become an RTSPClient subclass. Or your "RTSPClient" subclass can simply contain a member field that *points to* an object of your "instance" class. That's all I'm suggesting. Subclass "RTSPClient", and add a field called "fParentInstance" (or something) that points to an object of your existing "instance" (e.g.) class. Then, in each of your response handlers, you just cast the first parameter to be a pointer to your "RTSPClient" subclass (because you know it is), and then access "fParentInstance". Voila! You have your state back. I am trying the subclass idea now with just the setBaseURL moved to the protected area. I'll make this change in the next release of the software, because it's likely to be generally useful. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL:
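As a rough illustration of the subclassing idea Ross describes in the quoted reply above: "ourRTSPClient" and "StreamInstance" are made-up names (StreamInstance standing in for the application's per-connection class with the statistics and controls), and the RTSPClient constructor arguments shown follow the 2011-era asynchronous interface, so check them against the RTSPClient.hh actually being used.

#include "liveMedia.hh"

class StreamInstance;  // the application's per-connection class (statistics, control, ...)

class ourRTSPClient: public RTSPClient {
public:
  static ourRTSPClient* createNew(UsageEnvironment& env, char const* rtspURL,
                                  StreamInstance* parentInstance) {
    return new ourRTSPClient(env, rtspURL, parentInstance);
  }

  StreamInstance* fParentInstance;  // the extra per-connection state carried through callbacks

protected:
  ourRTSPClient(UsageEnvironment& env, char const* rtspURL, StreamInstance* parentInstance)
    : RTSPClient(env, rtspURL, 1/*verbosityLevel*/, "myApp", 0/*tunnelOverHTTPPortNum*/),
      fParentInstance(parentInstance) {}
};

// Response handlers keep the standard signature; because the first parameter is really an
// "ourRTSPClient*", the per-connection state can be recovered with a cast:
void continueAfterDESCRIBE(RTSPClient* rtspClient, int resultCode, char* resultString) {
  StreamInstance* instance = ((ourRTSPClient*)rtspClient)->fParentInstance;
  // ... check resultCode, create the MediaSession from resultString, update "instance", etc.
  delete[] resultString;
}

The application would then do something like ourRTSPClient::createNew(env, url, myInstance)->sendDescribeCommand(continueAfterDESCRIBE); the same cast works in every later handler.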
From finlayson at live555.com Wed Nov 30 10:45:24 2011 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 30 Nov 2011 10:45:24 -0800 Subject: Re: [Live-devel] problems moving to asynchronous rtsp interface In-Reply-To: <615FD77639372542BF647F5EBAA2DBC20B18EB14@IL-BOL-EXCH01.smartwire.com> References: <615FD77639372542BF647F5EBAA2DBC20B18D93B@IL-BOL-EXCH01.smartwire.com> <9D34D434-89C2-4F74-9D05-44FCC2888400@live555.com> <615FD77639372542BF647F5EBAA2DBC20B18DA38@IL-BOL-EXCH01.smartwire.com> <771004CE-E49A-44B2-816B-DE317F588FA6@live555.com> <615FD77639372542BF647F5EBAA2DBC20B18DA98@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B18EB14@IL-BOL-EXCH01.smartwire.com> Message-ID: > I have almost gotten the migration to the asynchronous RTSP interface working in my code. > I get to the point where the play command is sent and I get a reply and it prints out the Receiving Streaming Data, but then it just sits there, looping endlessly, never detecting arrival of packets. [...] > What did I miss? My guess: You're forgetting to call "startPlaying()" on each input source. (You should do this between getting the response to "SETUP", and the sending of "PLAY".) (Note that the RTSP "PLAY" command merely starts the streaming at the server end. You still need to - at the client end - call "startPlaying()" on each input source, to actually receive the data.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL:
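A sketch of the ordering Ross describes, in asynchronous-interface terms. The helpers declared at the top ("createOurSink()", "subsessionAfterPlaying()", "continueAfterPLAY()") and the two extern variables are stand-ins for the application's own code, and the example assumes a single subsession for brevity; the LIVE555 calls themselves (startPlaying(), readSource(), sendPlayCommand()) are the standard ones.

#include "liveMedia.hh"

// Hypothetical application-side pieces (not part of LIVE555):
MediaSink* createOurSink(UsageEnvironment& env, MediaSubsession& subsession); // builds the frame consumer
void subsessionAfterPlaying(void* clientData);                                // called when a subsession ends
void continueAfterPLAY(RTSPClient* rtspClient, int resultCode, char* resultString);
extern MediaSession* ourSession;           // created earlier from the DESCRIBE response
extern MediaSubsession* currentSubsession; // the subsession that the last "SETUP" was sent for

void continueAfterSETUP(RTSPClient* rtspClient, int resultCode, char* resultString) {
  if (resultCode == 0) {
    // Start receiving on this subsession now. The RTSP "PLAY" command only starts the
    // streaming at the server end; without this call, arriving packets are never consumed:
    currentSubsession->sink = createOurSink(rtspClient->envir(), *currentSubsession);
    currentSubsession->sink->startPlaying(*(currentSubsession->readSource()),
                                          subsessionAfterPlaying, currentSubsession);
  }
  delete[] resultString;

  // Once every subsession has been set up (and its sink started), send "PLAY":
  rtspClient->sendPlayCommand(*ourSession, continueAfterPLAY);
}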
From jshanab at smartwire.com Wed Nov 30 11:26:02 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Wed, 30 Nov 2011 19:26:02 +0000 Subject: Re: [Live-devel] problems moving to asynchronous rtsp interface In-Reply-To: References: <615FD77639372542BF647F5EBAA2DBC20B18D93B@IL-BOL-EXCH01.smartwire.com> <9D34D434-89C2-4F74-9D05-44FCC2888400@live555.com> <615FD77639372542BF647F5EBAA2DBC20B18DA38@IL-BOL-EXCH01.smartwire.com> <771004CE-E49A-44B2-816B-DE317F588FA6@live555.com> <615FD77639372542BF647F5EBAA2DBC20B18DA98@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B18EB14@IL-BOL-EXCH01.smartwire.com> Message-ID: <615FD77639372542BF647F5EBAA2DBC20B18EBAF@IL-BOL-EXCH01.smartwire.com> In the hour+ it took for the post to go, I found something even more "stupid": the Sink was not set due to a bad if condition. :) I still have some errors but I am getting further now. So it seems clear that having one environment with a single scheduler is the preferred method. The side effect is of course that things printed to the env get all jumbled up. (Windows console is over 100 times slower than a *nix one) I can protect these with a mutex, but should we consider internally protecting the environment streaming calls? From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Wednesday, November 30, 2011 12:45 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] problems moving to asynchronous rtsp interface I have almost gotten the migration to the asynchronous RTSP interface working in my code. I get to the point where the play command is sent and I get a reply and it prints out the Receiving Streaming Data, but then it just sits there, looping endlessly, never detecting arrival of packets. [...] What did I miss? My guess: You're forgetting to call "startPlaying()" on each input source. (You should do this between getting the response to "SETUP", and the sending of "PLAY".) (Note that the RTSP "PLAY" command merely starts the streaming at the server end. You still need to - at the client end - call "startPlaying()" on each input source, to actually receive the data.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Nov 30 13:40:11 2011 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 30 Nov 2011 13:40:11 -0800 Subject: Re: [Live-devel] problems moving to asynchronous rtsp interface In-Reply-To: <615FD77639372542BF647F5EBAA2DBC20B18EBAF@IL-BOL-EXCH01.smartwire.com> References: <615FD77639372542BF647F5EBAA2DBC20B18D93B@IL-BOL-EXCH01.smartwire.com> <9D34D434-89C2-4F74-9D05-44FCC2888400@live555.com> <615FD77639372542BF647F5EBAA2DBC20B18DA38@IL-BOL-EXCH01.smartwire.com> <771004CE-E49A-44B2-816B-DE317F588FA6@live555.com> <615FD77639372542BF647F5EBAA2DBC20B18DA98@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B18EB14@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B18EBAF@IL-BOL-EXCH01.smartwire.com> Message-ID: <8B45D5CD-A93A-4859-B97C-412E13A6C276@live555.com> > So it seems clear that having one environment with a single scheduler is the preferred method. The side effect is of course that things printed to the env get all jumbled up. (Windows console is over 100 times slower than a *nix one) I can protect these with a mutex, but should we consider internally protecting the environment streaming calls? Remember that the various "operator<<" functions defined on a "UsageEnvironment" are pure virtual functions, and can be defined to do whatever you want in a subclass. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL:
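To illustrate Ross's point about the "operator<<" functions: because they are virtual, a subclass can serialise them. The sketch below uses a made-up class name, assumes a C++11 std::mutex (any platform lock would do), and assumes that BasicUsageEnvironment's protected constructor can be reached from a subclass as in current headers; the overridden operators are the ones declared in UsageEnvironment.hh.

#include "BasicUsageEnvironment.hh"
#include <mutex>

class LockedUsageEnvironment: public BasicUsageEnvironment {
public:
  static LockedUsageEnvironment* createNew(TaskScheduler& taskScheduler) {
    return new LockedUsageEnvironment(taskScheduler);
  }

  // The output operators, wrapped with a lock shared by all instances so that output
  // written from different threads does not interleave character by character:
  virtual UsageEnvironment& operator<<(char const* str) { std::lock_guard<std::mutex> g(fLock); return BasicUsageEnvironment::operator<<(str); }
  virtual UsageEnvironment& operator<<(int i)           { std::lock_guard<std::mutex> g(fLock); return BasicUsageEnvironment::operator<<(i); }
  virtual UsageEnvironment& operator<<(unsigned u)      { std::lock_guard<std::mutex> g(fLock); return BasicUsageEnvironment::operator<<(u); }
  virtual UsageEnvironment& operator<<(double d)        { std::lock_guard<std::mutex> g(fLock); return BasicUsageEnvironment::operator<<(d); }
  virtual UsageEnvironment& operator<<(void* p)         { std::lock_guard<std::mutex> g(fLock); return BasicUsageEnvironment::operator<<(p); }

protected:
  LockedUsageEnvironment(TaskScheduler& taskScheduler): BasicUsageEnvironment(taskScheduler) {}

private:
  static std::mutex fLock;
};

std::mutex LockedUsageEnvironment::fLock;

One caveat: a single diagnostic message is usually built from several chained "<<" calls, so per-call locking only prevents interleaving within each call; fully atomic lines would need per-thread buffering or a coarser lock around whole messages.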
From jshanab at smartwire.com Wed Nov 30 16:13:25 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Thu, 1 Dec 2011 00:13:25 +0000 Subject: Re: [Live-devel] problems moving to asynchronous rtsp interface In-Reply-To: <8B45D5CD-A93A-4859-B97C-412E13A6C276@live555.com> References: <615FD77639372542BF647F5EBAA2DBC20B18D93B@IL-BOL-EXCH01.smartwire.com> <9D34D434-89C2-4F74-9D05-44FCC2888400@live555.com> <615FD77639372542BF647F5EBAA2DBC20B18DA38@IL-BOL-EXCH01.smartwire.com> <771004CE-E49A-44B2-816B-DE317F588FA6@live555.com> <615FD77639372542BF647F5EBAA2DBC20B18DA98@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B18EB14@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B18EBAF@IL-BOL-EXCH01.smartwire.com> <8B45D5CD-A93A-4859-B97C-412E13A6C276@live555.com> Message-ID: <615FD77639372542BF647F5EBAA2DBC20B18ED03@IL-BOL-EXCH01.smartwire.com> I went back to my thread model. I am afraid threads are already in existence and reworking everything at the moment is not possible. It is running great on rtsp, my custom push-rtsp and my http->restreamer. I think I have accumulated enough knowledge to write a multistream version of openRTSP. :) Would that be a good testProg? From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Wednesday, November 30, 2011 3:40 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] problems moving to asynchronous rtsp interface So it seems clear that having one environment with a single scheduler is the preferred method. The side effect is of course that things printed to the env get all jumbled up. (Windows console is over 100 times slower than a *nix one) I can protect these with a mutex, but should we consider internally protecting the environment streaming calls? Remember that the various "operator<<" functions defined on a "UsageEnvironment" are pure virtual functions, and can be defined to do whatever you want in a subclass. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From lanlamer at gmail.com Wed Nov 30 22:30:22 2011 From: lanlamer at gmail.com (Johnnie Au-Yeung) Date: Thu, 1 Dec 2011 14:30:22 +0800 Subject: [Live-devel] Duplicate data while create two rtsp clients to access the same live555 media server. Message-ID: Hi everyone! I was asked to write a multiview media streaming client that can play multiple RTSP video streams concurrently. I encapsulated openRTSP into a C++ class that has its own environment/task scheduler/RTSPClient and event loop. When connecting to the latest LIVE555 media server (testOnDemandRTSPServer.cpp), it works fine. But when accessing an IP camera that was developed based on the LIVE555 media server 2008.04.02 version, the subsequent threads all received data duplicating the first thread (as shown in the following log, all threads/instances received the same data size, which was not the case when accessing the latest LIVE555 media server). I also find that two VideoLAN processes can access the IP camera concurrently and play the video fine. Can you please give me some information/suggestions on how to debug this issue? Which part of the code may cause this problem? Thanks very much! RtspMpeg4Sink::addData RtspMpeg4Sink::addData RtspMpeg4Sink::addData RtspMpeg4Sink::addData RtspMpeg4Sink::addData RtspMpeg4Sink::addData RtspMpeg4Sink::addData RtspMpeg4Sink::addData RtspMpeg4Sink::addData RtspMpeg4Sink::addData RtspMpeg4Sink::addData RtspMpeg4Sink::addData RtspMpeg4Sink::addData RtspMpeg4Sink::addData RtspMpeg4Sink::addData RtspMpeg4Sink::addData RtspMpeg4Sink::addData RtspMpeg4Sink::addData RtspMpeg4Sink::addData RtspMpeg4Sink::addData RtspMpeg4Sink::addData RtspMpeg4Sink::addData RtspMpeg4Sink::addData RtspMpeg4Sink::addData RtspMpeg4Sink::addData RtspMpeg4Sink::addData RtspMpeg4Sink::addData RtspMpeg4Sink::addData RtspMpeg4Sink::addData RtspMpeg4Sink::addData RtspMpeg4Sink::addData RtspMpeg4Sink::addData -------------- next part -------------- An HTML attachment was scrubbed... URL: