From dny at vicon.co.il Sun Mar 1 07:39:52 2009 From: dny at vicon.co.il (Danny Halamish) Date: Sun, 01 Mar 2009 17:39:52 +0200 Subject: [Live-devel] Can someone please explain the control flow? Message-ID: <200903011541.n21Ffovv015374@ns.live555.com> Hello, I'm trying to integrate a TI h264 codec with the live streamer (so far without success...). I am struggling with the control flow of the live streamer. Every function I look at seems to call another function, ad infinitum... The two-line control flow explanation in the FAQ helps, but not enough... Can someone please explain the control flow from a source to a sink, i.e. where, exactly, should the "meat" of the program go? What functions do I need to add to the h264 stub? Do I need to create a new source to enclose my codec (DeviceSource)? Thanks in advance, Danny From joeflin at 126.com Sun Mar 1 11:56:21 2009 From: joeflin at 126.com (joeflin) Date: Mon, 2 Mar 2009 03:56:21 +0800 (CST) Subject: [Live-devel] audio streaming Message-ID: <23787787.261171235937381090.JavaMail.coremail@bj126app26.126.com> Hi, Can the existing code in live555 latest release (2009-02-13) handle AAC streaming (for both HE and LC) ? what about G711? Thanks -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sun Mar 1 20:35:49 2009 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 2 Mar 2009 14:35:49 +1000 Subject: [Live-devel] audio streaming In-Reply-To: <23787787.261171235937381090.JavaMail.coremail@bj126app26.126.com> References: <23787787.261171235937381090.JavaMail.coremail@bj126app26.126.com> Message-ID: >Can the existing code in live555 latest release (2009-02-13) The latest release is 2009-02-23 > handle AAC streaming (for both HE and LC) ? Yes. > what about G711? Yes. -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ From finlayson at live555.com Sun Mar 1 21:47:19 2009 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 2 Mar 2009 15:47:19 +1000 Subject: [Live-devel] Can someone please explain the control flow? In-Reply-To: <200903011541.n21Ffovv015374@ns.live555.com> References: <200903011541.n21Ffovv015374@ns.live555.com> Message-ID: >Can someone please explain the control flow from a source to a sink, >i.e. where, exactly, should the >"meat" of the program go? You can use one of the existing demo applications (in the "testProgs" directory) as a model. If you want to stream using multicast, use one of the "test*Streamer" applications as a model. If you want to stream unicast on-demand, use "testOnDemandRTSPServer" as a model. To understand how an application that uses this library works, you need to understand event-driven programming, and, in particular, the concept of an event loop. 'Control flow' is not really the right way to think about such applications. >What functions do I need to add to the h264 stub? Search for "H.264" in the FAQ. > Do I need to create a >new source to enclose my codec (DeviceSource)? Yes. You can use the "DeviceSource.cpp" code as a model. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From dny at vicon.co.il Sun Mar 1 23:59:20 2009 From: dny at vicon.co.il (Danny Halamish) Date: Mon, 02 Mar 2009 09:59:20 +0200 Subject: [Live-devel] Can someone please explain the control flow? Message-ID: <200903020801.n2281KFi036171@ns.live555.com> 02/03/2009 07:47:19, Ross Finlayson wrote: >>Can someone please explain the control flow from a source to a sink... > >You can use one of the existing demo applications (in the "testProgs" >directory) as a model. If you want to stream using multicast, use >one of the "test*Streamer" applications as a model. If you want to >stream unicast on-demand, use "testOnDemandRTSPServer" as a model.
I have looked at these for some time, and I must admit - like I said - that it simply eludes me. Every function I look at seems to defer to another function, etc. >To understand how an application that uses this library works, you >need to understand event-driven programming, and, in particular, the >concept of an event loop. 'Control flow' is not really the right way >to think about such applications. I understand these concepts, and have written such programs before. Like I said, the part I'm struggling with is the inter-relation between the different classes, etc. >>What functions do I need to add to the h264 stub? > >Search for "H.264" in the FAQ. I have, and I understand the principle, but I do not seem to be able to translate it into code. It's not that I don't understand what the code needs to do, I just don't quite see where each part should go. >> Do I need to create a >>new source to enclose my codec (DeviceSource)? > >Yes. You can use the "DeviceSource.cpp" code as a model. I understand, but I don't quite see what needs to go into the "doGetNextFrame" function - will it defer to another class? Will it defer some action to another class? Which part? I want to use background processing, of course, and my device does not use a file handle. I guess I will need to use "scheduleDelayedTask" to poll for data if it is not available; but what will the deferred function do? Again, I understand the principle of it, and I have written such code before; but the interrelations between the classes, and the control flow, elude me. I understand that the code is event driven, and I am not looking for the "main loop"; but event-driven systems have a control flow as well, for each event - and this is the thing which eludes me.
Help would be appreciated, Danny From robert.klotzner at bluetechnix.at Mon Mar 2 11:52:46 2009 From: robert.klotzner at bluetechnix.at (Robert Klotzner) Date: Mon, 2 Mar 2009 20:52:46 +0100 Subject: [Live-devel] H264 File Streaming In-Reply-To: References: Message-ID: <1236023566.4048.7.camel@localhost> Hi all! I just finished my work on a working H264VideoStreamFramer for files containing plain H.264. I'd like to post it here; maybe it can be integrated somehow into the main tree. Best regards, Robert ---------------------------------------- Bluetechnix Mechatronische Systeme GmbH Robert Klotzner Waidhausenstr. 3/19 1140 Wien AUSTRIA Development Office (Delivery Address): Lainzerstr. 162/32 1130 Wien AUSTRIA Tel: +43 (1) 914 20 91 x DW3 Fax: +43 (1) 914 20 91 x 99 Email: robert.klotzner at bluetechnix.at Website: www.bluetechnix.com --------------------------------------- -------------- next part -------------- A non-text attachment was scrubbed... Name: live-h264.patch Type: text/x-patch Size: 14860 bytes Desc: not available URL: From joeflin at 126.com Mon Mar 2 23:19:40 2009 From: joeflin at 126.com (joeflin) Date: Tue, 3 Mar 2009 15:19:40 +0800 (CST) Subject: [Live-devel] audio streaming In-Reply-To: References: <23787787.261171235937381090.JavaMail.coremail@bj126app26.126.com> Message-ID: <16951616.268491236064780188.JavaMail.coremail@bj126app29.126.com> hi, Ross, I only see the AAC file server; do we have to write our own code for streaming live AAC, using MPEG4Generic? Could you please add a simple one? Thanks! At 2009-03-02 12:35:49, "Ross Finlayson" wrote: >>Can the existing code in live555 latest release (2009-02-13) > >The latest release is 2009-02-23 > >> handle AAC streaming (for both HE and LC) ? > >Yes. > >> what about G711? > >Yes. >-- > >Ross Finlayson >Live Networks, Inc.
>http://www.live555.com/ >_______________________________________________ >live-devel mailing list >live-devel at lists.live555.com >http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From robert.klotzner at bluetechnix.at Tue Mar 3 08:49:09 2009 From: robert.klotzner at bluetechnix.at (Robert Klotzner) Date: Tue, 3 Mar 2009 17:49:09 +0100 Subject: [Live-devel] h264 unicast Message-ID: <1236098949.9969.12.camel@localhost> Hi all! I've added support for unicast streaming. Best regards, Robert P.S.: The only_unicast patch is a diff to my last version. -- -------------------------------------- Bluetechnix Mechatronische Systeme GmbH Robert Klotzner Waidhausenstr. 3/19 1140 Wien AUSTRIA Development Office (Delivery Address): Lainzerstr. 162/32 1130 Wien AUSTRIA Tel: +43 (1) 914 20 91 x DW3 Fax: +43 (1) 914 20 91 x 99 Email: robert.klotzner at bluetechnix.at Website: www.bluetechnix.com --------------------------------------- -------------- next part -------------- A non-text attachment was scrubbed... Name: live-h264-only_unicast.patch Type: text/x-patch Size: 15186 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: live-h264+unicast.patch Type: text/x-patch Size: 30326 bytes Desc: not available URL: From asnow at pathfindertv.net Tue Mar 3 16:46:56 2009 From: asnow at pathfindertv.net (Austin Snow (pftv)) Date: Tue, 3 Mar 2009 19:46:56 -0500 Subject: [Live-devel] Difficulties with a live source Message-ID: <2F73633F-0445-41B2-BF09-088DC4476D9C@pathfindertv.net> Hello All, I'm having some difficulty getting samples from a live source (QTCaptureSession) into MPEG4VideoStreamDiscreteFramer via a DeviceSource.cpp. The live data does not contain any 000001B3xxx000001B6..... (group-of-VOP start code) or 000001B0.... (visual object sequence start code). The only header info in the stream is the 000001B6...... the VOP start code.
Can you tell me if the MPEG4VideoStreamDiscreteFramer or MPEG4VideoStreamFramer can be used with this type of live source? Thank you Austin From finlayson at live555.com Tue Mar 3 17:04:20 2009 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 4 Mar 2009 11:04:20 +1000 Subject: [Live-devel] audio streaming In-Reply-To: <16951616.268491236064780188.JavaMail.coremail@bj126app29.126.com> References: <23787787.261171235937381090.JavaMail.coremail@bj126app26.126.com> <16951616.268491236064780188.JavaMail.coremail@bj126app29.126.com> Message-ID: >I only see the AAC file server, do we have to write our own code for >streaming live AAC? using MPEG4Generic? To stream live AAC (unicast, on-demand) you would need to write your own subclass of "OnDemandServerMediaSubsession", similar to the existing "ADTSAudioFileServerMediaSubsession" class. In particular, you would need to implement the two virtual functions: - "createNewStreamSource()", to create an instance of an object that encapsulates your input device. (This object will implement "doGetNextFrame()" by delivering a discrete AAC audio frame each time. If your input device is an open file, and your audio frames are fixed-size, then you can just use an instance of "ByteStreamFileSource" for this.) - "createNewRTPSink()". This will work similarly to the existing "ADTSAudioFileServerMediaSubsession::createNewRTPSink()", by creating a new instance of the "MPEG4GenericRTPSink" class (with appropriate parameters to designate AAC audio). Also, because you are streaming from a live input source, don't forget to set "reuseFirstSource" to True. -- Ross Finlayson Live Networks, Inc.
http://www.live555.com/ From finlayson at live555.com Tue Mar 3 20:13:50 2009 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 4 Mar 2009 14:13:50 +1000 Subject: [Live-devel] Difficulties with a live source In-Reply-To: <2F73633F-0445-41B2-BF09-088DC4476D9C@pathfindertv.net> References: <2F73633F-0445-41B2-BF09-088DC4476D9C@pathfindertv.net> Message-ID: >Hello All, >I having some difficulty getting samples from a live source >(QTCaptureSession) into MPEG4VideoStreamDiscreteFramer via a >DeviceSource.cpp. > >The live data does not contain any 000001B3xxx000001B6..... (group >of vop start code) or 000001B0.... (visual object sequence start >code). >The only header info in the stream is the 000001B6...... the vop start code. Then it appears that your data is not standards compliant (or perhaps, not regular MPEG-4 Video data at all). Are you sure that it's not something else (e.g., not H.264)? > >Can you tell me if the MPEG4VideoStreamDiscreteFramer or >MPEG4VideoStreamFramer can be used with this type of live source? No. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From Alan.Roberts at e2v.com Wed Mar 4 02:02:14 2009 From: Alan.Roberts at e2v.com (Roberts, Alan) Date: Wed, 4 Mar 2009 10:02:14 -0000 Subject: [Live-devel] openRTSP saving to a microSD card Message-ID: <8821BCD7410B064BA4414E4F8080E5EB043A2D15@whl46.e2v.com> Hi. A brain-teaser for your coffee break... (but I don't know the solution!)... I'm using openRTSP to read incoming streaming mpeg4 video over ethernet and store it on a micro SD card. The aim is to then download the mp4 file(s) over the USB port onto a host pc for subsequent viewing. The input video stream comes from our thermal imaging camera: 320x240 at 10fps. The Linux target is an ARM-based 3rd party embedded card with ethernet, SD card and USB connections. It's "so nearly" working! I can successfully record video onto the micro SD card and play it back reliably... which proves it's all possible. 
The devil is in the detail though... Once I've started openRTSP running in the background from the command line it seems to be that: IF (the recording is >= 2 minutes long) then it is successfully saved onto the micro SD card... I can turn off the camera straight after stopping the recording and "all is well" when I come to play back the file over USB ELSE a short recording (< 2 minutes) is only successfully saved onto the micro SD card if I wait about 2 minutes from starting the recording to subsequently stopping it and turning off the camera. The only code I've written is a short bit of C that establishes if the record button has been pressed, a bit of filename control and ethernet link checking. Oh, and the following command line that gets run if (and only if) the record button has been pressed and the ethernet is up and running (established by a successful "ping"): strcpy(start_command_str,"\n"); strcat(start_command_str, "./openRTSP -V -b 50000 -4 -w 320 -h 240 -f 10 rtsp://10.133.1.20:554/tic04 > "); strcat(start_command_str, "/mnt/uSDcard/"); strcat(start_command_str, new_filename); strcat(start_command_str," &\n"); system(start_command_str); On further investigation it appears that the micro SD card is "somehow" put into read-only mode for about 2 minutes from the point at which I invoke openRTSP. I can't even write to a text file on the micro SD card during that magic 2-minute period. Saying that though, in order to be able to stop the recording session I redirect the output of "ps" to a text file (after about 1 second) and use fscanf to find the Process ID of the existing openRTSP process from this file. So, creating this file within the 2 minutes worked, but I can't seem to write to any others. I fopen it as read-only and fclose it once I've found the Process ID (pid_str). Confusing.
Once the record button has been pressed to stop recording (it's polled many times a second) I issue the following: strcpy(stop_command_str,"kill -HUP "); strcat(stop_command_str, pid_str); strcat(stop_command_str, " \n"); system(stop_command_str); This all seems to work fine. It's the "2 minute thing" that's getting me. I haven't got the luxury of making software changes on the camera. I'd really appreciate any help you may be able to offer and I'm more than happy to provide more details on the implementation, background, output logs etc. I'm not an expert in C or Linux - but I'm not usually defeated. Many thanks and I look forward to hearing from you soon. Alan Sent by E2V TECHNOLOGIES PLC or a member of the E2V group of companies. A company registered in England and Wales. Company number; 04439718. Registered address; 106 Waterhouse Lane, Chelmsford, Essex, CM1 2QU, UK. ______________________________________________________________________ This email has been scanned by the MessageLabs Email Security System. For more information please visit http://www.messagelabs.com/email ______________________________________________________________________ -------------- next part -------------- An HTML attachment was scrubbed... URL: From bartha_adam at yahoo.com Wed Mar 4 07:08:31 2009 From: bartha_adam at yahoo.com (Bartha Adam) Date: Wed, 4 Mar 2009 07:08:31 -0800 (PST) Subject: [Live-devel] realtime streaming questions Message-ID: <45576.15673.qm@web110315.mail.gq1.yahoo.com> Hello all! I'm new to live555; I just compiled and tried some examples. I need to transmit live data with live555; the source is produced by an OpenGL application. The OS used is Windows XP. I'm a little bit confused: those examples are streaming from a static file. How can the connection between my application and live555 be made? I am planning to develop the encoding part using CUDA. Another question: what codec should be used? I will need to transmit at a huge resolution, 2 Mpixel/frame.
After some research, my conclusion was that even with hardware (GPU) acceleration, realtime H.264 coding is not possible at a big resolution. Could MPEG-2 make it? My apologies if my questions were already answered; I searched the archive for several hours, but with no results. Regards, Adam -------------- next part -------------- An HTML attachment was scrubbed... URL: From hailinwudi at 126.com Wed Mar 4 19:31:49 2009 From: hailinwudi at 126.com (hailinwudi) Date: Thu, 5 Mar 2009 11:31:49 +0800 (CST) Subject: [Live-devel] some problems of streaming a h.264 file Message-ID: <20288331.134291236223909389.JavaMail.coremail@bj126app87.126.com> hi, I have built the "testh264" and "h264frame" and so on. When I stream the h264 file using "testh264", it runs as follows: Play this stream using the URL "rtsp://192.168.7.240:8554/testStream" Beginning streaming... Beginning to read from file... ...done reading from file Beginning to read from file... ...done reading from file Beginning to read from file... ...done reading from file Beginning to read from file... ...done reading from file But, on another PC, when I used VLC to take over and play the h264 file, VLC can't open the h264 file. Can somebody tell me the reason? -------------- next part -------------- An HTML attachment was scrubbed... URL: From robert.klotzner at bluetechnix.at Wed Mar 4 23:52:28 2009 From: robert.klotzner at bluetechnix.at (Robert Klotzner) Date: Thu, 5 Mar 2009 08:52:28 +0100 Subject: [Live-devel] some problems of streaming a h.264 file In-Reply-To: <20288331.134291236223909389.JavaMail.coremail@bj126app87.126.com> References: <20288331.134291236223909389.JavaMail.coremail@bj126app87.126.com> Message-ID: <1236239548.3990.0.camel@vhios.dyndns.org> For me a newer version of vlc worked. Best regards, Robert On Thursday, 05.03.2009, at 04:31 +0100, hailinwudi wrote: > hi, I have built the "testh264" and "h264frame" and so on.
> > When I stream the h264 file using "testh264", it runs as follows: > > Play this stream using the URL "rtsp://192.168.7.240:8554/testStream" > > Beginning streaming... > > Beginning to read from file... > > ...done reading from file > > Beginning to read from file... > > ...done reading from file > > Beginning to read from file... > > ...done reading from file > > Beginning to read from file... > > ...done reading from file > > > > But, on another PC, when I used VLC to take over and play the > h264 file, VLC can't open the h264 file. > > Can somebody tell me the reason? From gampa.harsha at gmail.com Thu Mar 5 05:16:07 2009 From: gampa.harsha at gmail.com (harsha gampa) Date: Thu, 5 Mar 2009 18:46:07 +0530 Subject: [Live-devel] Maximum no of clients live555MediaServer can handle ? In-Reply-To: <27116a4e0903040826j1ac0970enf3d626cab4bcff2c@mail.gmail.com> References: <27116a4e0903040826j1ac0970enf3d626cab4bcff2c@mail.gmail.com> Message-ID: <27116a4e0903050516x1b815d56j30c43de03e038b1@mail.gmail.com> Hi Ross, I have read one of the previous solutions in the mailing lists, where you have said that the maximum number of clients the live555Media server can handle is defined by LISTEN_BACKLOG_SIZE in the RTSPServer.cpp file. By default it has the value '20', but when the server is run with 50 client requests for connection, I see the wireshark traces showing 50 established connections. Please tell me where I'm going wrong, as I expected the server to reject the last 30 connections, which didn't happen. Is there any other factor I need to consider while testing this?
Thanks, Harsha From aditya.vikram at aricent.com Fri Mar 6 01:24:50 2009 From: aditya.vikram at aricent.com (Aditya Vikram) Date: Fri, 6 Mar 2009 14:54:50 +0530 Subject: [Live-devel] Difficulties with a live source In-Reply-To: References: <2F73633F-0445-41B2-BF09-088DC4476D9C@pathfindertv.net> Message-ID: Hi, I am receiving audio and video RTP streams using RTSP and filling two queues from the audio and video buffer sinks. After this, I am playing the video and audio from these queues using DirectShow. How can I synchronize audio and video while playing? Regards Aditya "DISCLAIMER: This message is proprietary to Aricent and is intended solely for the use of the individual to whom it is addressed. It may contain privileged or confidential information and should not be circulated or used for any purpose other than for what it is intended. If you have received this message in error, please notify the originator immediately. If you are not the intended recipient, you are notified that you are strictly prohibited from using, copying, altering, or disclosing the contents of this message. Aricent accepts no responsibility for loss or damage arising from the use of the information transmitted by this email including damage from virus." From finlayson at live555.com Sat Mar 7 12:25:19 2009 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 8 Mar 2009 06:25:19 +1000 Subject: [Live-devel] Any feedback re: Change to 'trick play' Transport Stream generation - reduces output bit rate ?? Message-ID: Does anyone have any feedback about the following change (which was put into place for the most recent version of the code)? A few months ago, several people were complaining about the high bitrate of 'trick play' streams (fast-forward or reverse play) for Transport Stream data. Does this new version of the code work better for you? Ross Finlayson Live Networks, Inc.
>Date: Mon, 23 Feb 2009 20:20:33 +1000 >To: live-devel at lists.live555.com >From: Ross Finlayson >Subject: Change to 'trick play' Transport Stream generation - >reduces output bit rate > >Some people have reported having problems with 'trick play' >operations on Transport Streams, due to the high bitrate of the >'trick play' output streams. > >By popular demand, I have now released a new version (2009.02.23) of >the "LIVE555 Streaming Media" software that changes the way that >Transport Streams are generated for 'trick play' operations >(fast-forward or reverse play). Now, each I-frame (i.e., key frame) >appears no more than once in the output Transport Stream for 'trick >play' operations. This will have the effect of reducing the average >output bitrate for 'trick play' streams, except for high 'scale' >values. > >For those of you who have been having problems with the high bit >rate of 'trick play' Transport Stream data - please try this new >code, and let us know if this new version of the code improves >things. Note that because these changes are experimental, I have >not yet changed the prebuilt binary versions of the "LIVE555 Media >Server" application - therefore, if you use this application, you >will need to build your own version from the new source code. > >If - for whatever reason - you wish to go back to the old behavior >(in which we always keep the original frame rate, even if it means >duplicating I-frames), then you can do so by changing the definition >of "KEEP_ORIGINAL_FRAME_RATE" in >"liveMedia/MPEG2TransportStreamTrickModeFilter.cpp" to "True". >However, if you find you need to do this, please let us know why. >This change to the code is experimental, and I will back it out if >people end up having problems with it. >-- > > >Ross Finlayson >Live Networks, Inc. >http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Sat Mar 7 12:39:36 2009 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 8 Mar 2009 06:39:36 +1000 Subject: [Live-devel] Maximum no of clients live555MediaServer can handle ? In-Reply-To: <27116a4e0903050516x1b815d56j30c43de03e038b1@mail.gmail.com> References: <27116a4e0903040826j1ac0970enf3d626cab4bcff2c@mail.gmail.com> <27116a4e0903050516x1b815d56j30c43de03e038b1@mail.gmail.com> Message-ID: >I have read one of the previous solutions in the mailing lists , where u >have said that the maximum number of clients live555Media server can handle >is defined by LISTEN_BACKLOG_SIZE in the RTSPServer.cpp file. No. That constant defines how many clients may *connect* to the server simultaneously, but does not affect how many clients the server can handle at the same time (even if the clients connected to the server at different times). Our code does not impose any limit on how many clients the server can handle at the same time. Any such limit would be imposed by the underlying operating system's limit of the maximum number of open files that it can support. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Sat Mar 7 16:20:43 2009 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 8 Mar 2009 10:20:43 +1000 Subject: [Live-devel] Difficulties with a live source In-Reply-To: References: <2F73633F-0445-41B2-BF09-088DC4476D9C@pathfindertv.net> Message-ID: >I am receiving audio and video rtp streams using RTSP and filling >two queue from audio and video buffersink , after this I am playing >the video and audio from these queues using directshow, how can I >synchronize audio and video while playing. Assuming that your client is also implementing RTCP (as it should), then you can synchronize the audio and video streams using the presentation timestamp values that are delivered along with each audio or video frame. See -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ From SRawling at pelco.com Sat Mar 7 17:21:27 2009 From: SRawling at pelco.com (Rawling, Stuart) Date: Sat, 07 Mar 2009 17:21:27 -0800 Subject: [Live-devel] Receiving in multiple threads In-Reply-To: Message-ID: Hi Guys, I have an application that uses the RTSPServer code to send streams out to clients. The source of what it sends varies and is pluggable. All of these plugins end up writing to a unix pipe, which the RTSPServer code reads and streams out. Even though the app and the plugins are multi-threaded, I have taken great care to protect the integrity and single-threaded nature of the live555 event loop. And it works very well. However, I am integrating with some new plugins that are also using live555 (they are restreaming existing rtsp streams) and I am seeing some strange behaviours. First of all, sometimes I see the wrong video coming in the wrong pipe. I believe the cause of this is that although the plugins have been written to instantiate their own UsageEnvironment and TaskScheduler, they are all using RTSPClient. I am theorizing that in some circumstances, such as connecting to another RTSP server of the same type, they are actually initiating streams being sent to the same port, and the 2 instances of the live555 event loops are competing over the incoming packets. Streaming a single source is fine, and mixing other streams from plugins not based on live555 is fine too. To address this I am looking at making these plugins share the same UsageEnvironment and event loop. Can you comment on 3 things: 1. My understanding is that so long as each instance of the library uses its own UsageEnvironment and event loop, it is ok to have multiple instances in multiple threads (although I agree this is really not advised). 2. Whether my hypothesis about the RTSPClient initiating streams from similar sources that stream to similar port numbers could be the cause of the above issue, and whether my proposal makes sense. 3.
Would it make sense to add an optional base port for the client to use, similar to RTSPServer? Thanks for any help provided. Stuart - ------------------------------------------------------------------------------ Confidentiality Notice: The information contained in this transmission is legally privileged and confidential, intended only for the use of the individual(s) or entities named above. This email and any files transmitted with it are the property of Pelco. If the reader of this message is not the intended recipient, or an employee or agent responsible for delivering this message to the intended recipient, you are hereby notified that any review, disclosure, copying, distribution, retention, or any action taken or omitted to be taken in reliance on it is prohibited and may be unlawful. If you receive this communication in error, please notify us immediately by telephone call to +1-559-292-1981 or forward the e-mail to administrator at pelco.com and then permanently delete the e-mail and destroy all soft and hard copies of the message and any attachments. Thank you for your cooperation. - ------------------------------------------------------------------------------ -------------- next part -------------- An HTML attachment was scrubbed... URL: From SRawling at pelco.com Sat Mar 7 18:14:40 2009 From: SRawling at pelco.com (Rawling, Stuart) Date: Sat, 07 Mar 2009 18:14:40 -0800 Subject: [Live-devel] Receiving in multiple threads In-Reply-To: Message-ID: Apologies, Ignore point 3, I found the necessary call in MediaSubsession (setClientPortNum). I have been working mainly in the server domain, and this is my first real play with the Client side code. Comments on the rest of the email are still requested tho. Stuart On 3/7/09 5:21 PM, "Rawling, Stuart" wrote: > Hi Guys, > > I have an application that uses the RTSPServer code to send streams out to > clients. The source of what it sends varies and is pluggable. 
All of these > plugins end up writing to a unix pipe, which the RTSPServer code reads and > streams out. Even tho the app and the plugins are multi threaded, I have > taken great care to protect the integrity and single threaded nature of the > live555 event loop. And it works very well. > > However, I am integrating with some new plugins that are also using live555 > (they are restreaming existing rtsp streams) and I am seeing some strange > behaviours. First of all sometimes I see the wrong video coming in the wrong > pipe. I believe the cause of this is because although the plugins have been > written to instantiate their own usageenvironement and taskscheduler, they are > all using RTSP client. I am theorizing that in some circumstances, such as > connecting to another RTSP server of the same type, they are actually > initiating streams being sent to the same port, and the 2 instances of the > live555 event loops are competing over the incoming packets. Streaming a > single source is fine, and mixing other streams from plugins not based on > live555 is fine too. > > To address this I am looking at making these plugins share the same > usageenvironment and event loop. > > Can you comment on 3 things: > > 1. My understanding is that so long as each instance of the library uses its > own usageenvironment and event loop, it is ok to have multiple instances in > multiple threads (although I agree this is really not advised). > 2. My hypothesis about the RTSPClient initiating streams from similar sources > that stream to similar port numbers could be the cause of the above issue, and > my proposal would make sense. > 3. Would it make sense to add an optional base port for the client to use, > similar to RTSPServer? > > > Thanks for any help provided. 
> > Stuart -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sat Mar 7 18:09:38 2009 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 8 Mar 2009 12:09:38 +1000 Subject: [Live-devel] Receiving in multiple threads In-Reply-To: References: Message-ID: >1. My understanding is that so long as each instance of the >library uses its own usageenvironment and event loop, it is ok to >have multiple instances in multiple threads (although I agree this >is really not advised). Yes, this should work, in principle. However, it has not been extensively tested, and I can't guarantee that there's not still some inadvertently shared state somewhere. (If anyone finds some, though, please let us know, and I'll try to remove it.) Have you tried putting your RTSP server in a separate *process* from your input plugins?
(Since your plugins are communicating with the server via pipes, this should be straightforward, I think...) >3. Would it make sense to add an optional base port for the >client to use, similar to RTSPServer? No, because there's no standard port number for RTSP *clients*. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From SRawling at pelco.com Sun Mar 8 22:06:47 2009 From: SRawling at pelco.com (Rawling, Stuart) Date: Sun, 08 Mar 2009 21:06:47 -0800 Subject: [Live-devel] Receiving in multiple threads In-Reply-To: Message-ID: I just wanted to update with my findings. As I suspected, the issue at hand was that occasionally 2 different RTSPClients in the same process would be assigned the same client_port (either from the server or the algorithm in MediaSession). Basically, my finding is that if the 2 RTSPClients (in separate threads and usage environments) initiated at about the same time, they would both be assigned the same port in their bind call. I notice that in GroupsockHelper the library does set SO_REUSEADDR and SO_REUSEPORT, so multiple binds on the same port will work. Although on the same port in the same process, in my experience only 1 receiver will be able to process the incoming UDP packets. I have not tried this in multiple processes yet as you suggested; I will try to do that and report back my findings. I will be changing the plugin model to share a single UsageEnvironment and TaskScheduler across many plugins to work around these issues. This definitely appears to be the better way forward, and the way the library was designed to be used. Stuart On 3/7/09 6:09 PM, "Ross Finlayson" wrote: >>> 1. My understanding is that so long as each instance of the library >>> uses its own usageenvironment and event loop, it is ok to have multiple >>> instances in multiple threads (although I agree this is really not advised).
> > Yes, this should work, in principle. However, it has not been extensively > tested, and I can't guarantee that there's not still some inadvertently > shared state somewhere. (If anyone finds some, though, please let us know, > and I'll try to remove it.) > > Have you tried putting your RTSP server in a separate *process* from your > input plugins. (Since your plugins are communicating with the server via > pipes, this should be straightforward, I think...) > >>> 3. Would it make sense to add an optional base port for the client to >>> use, similar to RTSPServer? > > No, because there's no standard port number for RTSP *clients*. - ------------------------------------------------------------------------------ Confidentiality Notice: The information contained in this transmission is legally privileged and confidential, intended only for the use of the individual(s) or entities named above. This email and any files transmitted with it are the property of Pelco. If the reader of this message is not the intended recipient, or an employee or agent responsible for delivering this message to the intended recipient, you are hereby notified that any review, disclosure, copying, distribution, retention, or any action taken or omitted to be taken in reliance on it is prohibited and may be unlawful. If you receive this communication in error, please notify us immediately by telephone call to +1-559-292-1981 or forward the e-mail to administrator at pelco.com and then permanently delete the e-mail and destroy all soft and hard copies of the message and any attachments. Thank you for your cooperation. - ------------------------------------------------------------------------------ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From guido.marelli at intraway.com Mon Mar 9 11:03:49 2009 From: guido.marelli at intraway.com (Guido Marelli) Date: Mon, 09 Mar 2009 16:03:49 -0200 Subject: [Live-devel] setupDatagramSocket - SO_REUSEADDR problems Message-ID: <49B55A05.8040709@intraway.com> Hi, I've found that in some cases (it seems to be random) the method MediaSubsession::initiate will use the same UDP ports for both a video stream and an audio stream. A sample output for the openRTSP program is the following: Created receiver for "video/MP4V-ES" subsession (client ports 34394-34395) Created receiver for "audio/PCMU" subsession (client ports 34394-34395) Setup "video/MP4V-ES" subsession (client ports 34394-34395) Setup "audio/PCMU" subsession (client ports 34394-34395) The problem seems to be the SO_REUSEPORT socket option in the setupDatagramSocket function. So, my question is: Is it safe to disable that option? I look forward to hearing from you, Regards, -- Guido Marelli Intraway Corp. AR Office: +54 (11) 4393-2091 CO Office: +57 (1) 750-4929 US Office: +1 (516) 620-3890 Fax: +54 (11) 5258-2631 MSN: guido.marelli at intraway.com Visit our website at http://www.intraway.com From matt at schuckmannacres.com Mon Mar 9 11:33:41 2009 From: matt at schuckmannacres.com (Matt Schuckmann) Date: Mon, 09 Mar 2009 11:33:41 -0700 Subject: [Live-devel] setupDatagramSocket - SO_REUSEADDR problems In-Reply-To: <49B55A05.8040709@intraway.com> References: <49B55A05.8040709@intraway.com> Message-ID: <49B56105.7010201@schuckmannacres.com> We have seen similar problems, i.e. the same port numbers being assigned to 2 different streams.
Basically our client would work for several successive instantiations and shutdowns of the client app, and each time the OS (or whatever it is that hands out the next available port number) would increment the assigned port numbers until eventually it hit the limit of assignable ports (something like max short, pretty close to the 34394 that you're seeing), and then it would always assign the same port numbers to both streams. I think the only way to clear the problem and get it working again was to reboot the system. We did check, and there were no zombie instances of the client running, and we didn't see any left-over connections in the socket tables. I don't think we've figured this one out yet. Matt S. Guido Marelli wrote: > Hi, > I've found that in some cases (it seems to be random) the method > MediaSubsession::initiate will use the same UDP ports for both a video > stream and an audio stream. A sample output for the openRTSP program > is the following: > > > Created receiver for "video/MP4V-ES" subsession (client ports > 34394-34395) > Created receiver for "audio/PCMU" subsession (client ports 34394-34395) > Setup "video/MP4V-ES" subsession (client ports 34394-34395) > Setup "audio/PCMU" subsession (client ports 34394-34395) > > > The problem seems to be the SO_REUSEPORT socket option in the > setupDatagramSocket function. > So, my question is: Is it safe to disable that option? > > > I look forward to hearing from you, > Regards, > From asnow at pathfindertv.net Mon Mar 9 12:07:56 2009 From: asnow at pathfindertv.net (Austin Snow (pftv)) Date: Mon, 9 Mar 2009 15:07:56 -0400 Subject: [Live-devel] Difficulties with a live source In-Reply-To: References: <2F73633F-0445-41B2-BF09-088DC4476D9C@pathfindertv.net> Message-ID: <7F767851-EB3F-4E07-9EE5-B73FC9F79178@pathfindertv.net> Thanks Ross, I'm now generating the header with the proper visual object sequence start code and supporting information.
000001B003000001B509000001000000012000C8888007D05841214103 Can I send this information into the MPEG4VideoStreamDiscreteFramer and/or MPEG4VideoStreamFramer without the next frame info and data? Or must they be sent together for the first frame to initialize things? Also, I wanted to know, can the MPEG4VideoStreamDiscreteFramer and MPEG4VideoStreamFramer function without the group of VOP start code (000001B3xxxxxx)? This is defined as optional in the MPEG-4 standard. Thanks Austin On Mar 3, 2009, at 11:13 PM, Ross Finlayson wrote: >> Hello All, >> I'm having some difficulty getting samples from a live source >> (QTCaptureSession) into MPEG4VideoStreamDiscreteFramer via a >> DeviceSource.cpp. >> >> The live data does not contain any 000001B3xxx000001B6..... (group >> of vop start code) or 000001B0.... (visual object sequence start >> code). The only header info in the stream is the 000001B6...... >> the vop start code. > > Then it appears that your data is not standards compliant (or > perhaps, not regular MPEG-4 Video data at all). Are you sure that > it's not something else (e.g., not H.264)? > >> >> Can you tell me if the MPEG4VideoStreamDiscreteFramer or >> MPEG4VideoStreamFramer can be used with this type of live source? > > No. > -- > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed...
Name: PGP.sig Type: application/pgp-signature Size: 195 bytes Desc: This is a digitally signed message part URL: From matt at schuckmannacres.com Mon Mar 9 15:14:46 2009 From: matt at schuckmannacres.com (Matt Schuckmann) Date: Mon, 09 Mar 2009 15:14:46 -0700 Subject: [Live-devel] Live555 RTSP Client never sees RTCP BYE message from Live555 Server Message-ID: <49B594D6.5040706@schuckmannacres.com> I've been trying to determine why my Live555 based RTSP client is never seeing the RTCP BYE messages from the LIVE555 server object (i.e. my bye handler is never getting called). In reading the code it looks like the server RTCP code always combines the BYE packet with the SR packet, and it looks like the RTCP packet handler code on the client sees the SR packet, reads the SR stuff, then reads the stuff that is the same as the RR stuff, then ignores anything else that might be in the packet. Do I have this correct, or am I missing something? Note pretty much the same appears to be true for the server receiving the BYE messages from the client. I'm having a difficult time debugging this (probably because I'm starting to feel sick, darn kids keep bringing these germs home) so any help or hints as to what might be going on would be helpful. Thanks Matt S. From bitter at vtilt.com Mon Mar 9 14:02:58 2009 From: bitter at vtilt.com (Brad Bitterman) Date: Mon, 9 Mar 2009 17:02:58 -0400 Subject: [Live-devel] Difficulties with a live source In-Reply-To: <7F767851-EB3F-4E07-9EE5-B73FC9F79178@pathfindertv.net> References: <2F73633F-0445-41B2-BF09-088DC4476D9C@pathfindertv.net> <7F767851-EB3F-4E07-9EE5-B73FC9F79178@pathfindertv.net> Message-ID: <0181B8A6-FD99-46E9-A565-F669D3AFDDC3@vtilt.com> I have several live sources that do not have GOVs and they work perfectly with live555. You should not have a problem.....
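Whether the optional codes are actually present in a header can be checked mechanically. Below is a small sketch (my code, not part of live555) that scans a byte string for MPEG-4 `00 00 01 xx` start codes; run over the header bytes Austin posted, it finds the visual_object_sequence (0xB0), visual_object (0xB5), video_object (0x00), and video_object_layer (0x20) codes, and confirms there is no group-of-VOP code (0xB3) in it:

```python
# Start-code names per ISO/IEC 14496-2 (only the ones relevant here).
NAMES = {0xB0: "visual_object_sequence", 0xB5: "visual_object",
         0x00: "video_object", 0x20: "video_object_layer",
         0xB3: "group_of_vop", 0xB6: "vop"}

def scan_start_codes(data):
    """Return (offset, code_byte) for every 00 00 01 xx start code."""
    out, i = [], 0
    while i + 4 <= len(data):
        if data[i] == 0 and data[i + 1] == 0 and data[i + 2] == 1:
            out.append((i, data[i + 3]))
            i += 4  # skip past the code just matched
        else:
            i += 1
    return out

header = bytes.fromhex(
    "000001B003000001B509000001000000012000C8888007D05841214103")
for off, code in scan_start_codes(header):
    print(off, hex(code), NAMES.get(code, "other"))
```

This is only a syntax check; it says nothing about whether a particular framer class will accept the stream, which is the question for the list.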
BTW: Tell Rich Brad says hello :) - Brad On Mar 9, 2009, at 3:07 PM, Austin Snow (pftv) wrote: > Thanks Ross, > I'm now generating the header with the proper visual object sequence > start code and supporting information. > 000001B003000001B509000001000000012000C8888007D05841214103 > > Can I send this information into the MPEG4VideoStreamDiscreteFramer > and/or MPEG4VideoStreamFramer without the next frame info and data? > Or must they be sent together for the first frame to initialize > things? > > Also, I wanted to know, can the MPEG4VideoStreamDiscreteFramer and > MPEG4VideoStreamFramer function without the group of VOP start code > (000001B3xxxxxx)? This is defined as optional in the mpeg4 standard. > > Thanks > Austin > > On Mar 3, 2009, at 11:13 PM, Ross Finlayson wrote: > >>> Hello All, >>> I having some difficulty getting samples from a live source >>> (QTCaptureSession) into MPEG4VideoStreamDiscreteFramer via a >>> DeviceSource.cpp. >>> >>> The live data does not contain any 000001B3xxx000001B6..... (group >>> of vop start code) or 000001B0.... (visual object sequence start >>> code). The only header info in the stream is the 000001B6...... >>> the vop start code. >> >> Then it appears that your data is not standards compliant (or >> perhaps, not regular MPEG-4 Video data at all). Are you sure that >> it's not something else (e.g., not H.264)? >> >>> >>> Can you tell me if the MPEG4VideoStreamDiscreteFramer or >>> MPEG4VideoStreamFramer can be used with this type of live source? >> >> No. >> -- >> >> Ross Finlayson >> Live Networks, Inc. 
>> http://www.live555.com/ >> _______________________________________________ >> live-devel mailing list >> live-devel at lists.live555.com >> http://lists.live555.com/mailman/listinfo/live-devel > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From gbzbz at yahoo.com Mon Mar 9 23:12:37 2009 From: gbzbz at yahoo.com (gather bzbz) Date: Mon, 9 Mar 2009 23:12:37 -0700 (PDT) Subject: [Live-devel] frame rate supported Message-ID: <512257.7348.qm@web51311.mail.re2.yahoo.com> Does current live555 support 60 fps? say 720p60? any parameters in this area? From asnow at pathfindertv.net Tue Mar 10 11:23:21 2009 From: asnow at pathfindertv.net (Austin Snow (pftv)) Date: Tue, 10 Mar 2009 14:23:21 -0400 Subject: [Live-devel] Live input MPEG4 Video input Message-ID: Hello All, I'm having an issue getting a live MPEG4 video source into Live555, any help would be great. This is how I defined my live input. In DeviceSource.hh, removed:

//private: //void deliverFrame();

added (ags):

public: void deliverFrame(void *data, int len, int dur);

In DeviceSource.cpp, changed:

//void DeviceSource::deliverFrame() {

to:

void DeviceSource::deliverFrame(void *data, int len, int dur) {  // <-- passing my data in here

and added:

// Deliver the data here:
if (fMaxSize < (unsigned)len) {
  fNumTruncatedBytes = len - fMaxSize;
  len = fMaxSize;
  printf("Frame size truncated\n");
}
gettimeofday(&fPresentationTime, NULL);
fDurationInMicroseconds = dur;
fFrameSize = len;
memcpy(fTo, data, len);
printf("Frame sent, len=%d\n", len);
printf("dur=%d\n", dur);
printf("fPresentationTime.tv_sec=%ld\n", (long)fPresentationTime.tv_sec);
printf("fPresentationTime.tv_usec=%ld\n", (long)fPresentationTime.tv_usec);

The printfs indicate that the data is only accepted approximately every 1 to 1.5 seconds on average. Is this the proper way to add a live input?
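For a fixed-rate source, the duration and presentation-time fields being filled in above are normally derived from the nominal frame rate rather than measured per call; if the printfs fire only every 1 to 1.5 seconds, frames are reaching deliverFrame() that rarely, and no timestamp arithmetic will fix that. The arithmetic itself is simple; a sketch in plain Python, independent of live555 (`frame_timing` is a made-up name):

```python
def frame_timing(fps, n_frames, start_us=0):
    """(presentation_time_us, duration_us) for each frame of a
    fixed-rate source: the per-frame duration is just one second
    divided by the frame rate (16666 us at 60 fps, 33333 us at 30)."""
    dur_us = 1_000_000 // fps
    return [(start_us + i * dur_us, dur_us) for i in range(n_frames)]

print(frame_timing(60, 3))  # [(0, 16666), (16666, 16666), (33332, 16666)]
```

In live555 terms these would correspond to fPresentationTime (offset from a wall-clock start) and fDurationInMicroseconds; a genuinely live source would instead stamp each frame with gettimeofday() as the code above already does.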
Thanks Austin _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel Austin -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: PGP.sig Type: application/pgp-signature Size: 195 bytes Desc: This is a digitally signed message part URL: From matt at schuckmannacres.com Tue Mar 10 11:21:40 2009 From: matt at schuckmannacres.com (Matt Schuckmann) Date: Tue, 10 Mar 2009 11:21:40 -0700 Subject: [Live-devel] Live555 RTSP Client never sees RTCP BYE message from Live555 Server In-Reply-To: <49B594D6.5040706@schuckmannacres.com> References: <49B594D6.5040706@schuckmannacres.com> Message-ID: <49B6AFB4.204@schuckmannacres.com> Ok, I'm feeling better now and I've determined that the server is attempting to send the RTCP BYE packet in the RTCPInstance destructor, but by that time all the destinations have been removed from the GroupSocket so no data is actually sent (I've confirmed this with WireShark). I had thought that the client could use the receipt of the BYE packet, for each stream, from the server as a signal that the session had been closed (either the video ended or the server had forcibly closed the session (my UI supports this)) and the client could clean up and notify the user appropriately. Now I'm not so sure anymore. I did find a reference on the forums that the server will send the BYE packet if the source supports seeking. Why can't it send the BYE packet when the session is ending too? Seems like it would be the right thing to do. The shutdown code for the streams is very confusing and I'm not sure how to proceed to get things working the way I'd like; can you make any suggestions? Thanks, Matt S.
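When checking with a packet sniffer whether a BYE actually went out, it helps to know exactly what one looks like inside a compound RTCP packet. A minimal BYE encoder following RFC 3550 (my sketch, not live555 code):

```python
import struct

def rtcp_bye(ssrcs):
    """Build a minimal RTCP BYE packet (RFC 3550 section 6.6, PT = 203).

    First byte: version 2 in the top two bits, SSRC count in the low
    five bits. The length field is the packet length in 32-bit words
    minus one, which for a bare BYE equals the SSRC count."""
    if not 0 < len(ssrcs) <= 31:
        raise ValueError("a BYE carries 1..31 SSRCs")
    header = struct.pack("!BBH", (2 << 6) | len(ssrcs), 203, len(ssrcs))
    return header + b"".join(struct.pack("!I", s) for s in ssrcs)

print(rtcp_bye([0x12345678]).hex())  # 81cb000112345678
```

In a compound SR+BYE packet these 8 bytes simply follow the SR; a receiver that stops parsing after the SR/RR report blocks would walk right past them, which would be consistent with the behaviour described above.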
Matt Schuckmann wrote: > I've been trying to determine why my Live555 based RTSP client is > never seeing the RTCP BYE messages from the LIVE555 server object > (i.e. my bye handler is never getting called). > > In reading the code it looks like the server RTCP code always combines > the BYE packet with the SR packet, and it looks like the RTCP packet > handler code on the client sees the SR packet, reads the SR stuff, then > reads the stuff that is the same as the RR stuff, then ignores anything > else that might be in the packet. Do I have this correct, or am I > missing something? Note pretty much the same appears to be true for > the server receiving the BYE messages from the client. > > I'm having a difficult time debugging this (probably because I'm > starting to feel sick, darn kids keep bringing these germs home) so any > help or hints as to what might be going on would be helpful. > > Thanks > Matt S. > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > From finlayson at live555.com Tue Mar 10 14:19:54 2009 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 10 Mar 2009 14:19:54 -0700 Subject: [Live-devel] frame rate supported In-Reply-To: <512257.7348.qm@web51311.mail.re2.yahoo.com> References: <512257.7348.qm@web51311.mail.re2.yahoo.com> Message-ID: >Does current live555 support 60 fps? say 720p60? any parameters in this area? Frame rate and dimension parameters are carried within the video data itself (and therefore are specific to the video codec, and have nothing to do with RTSP or RTP). -- Ross Finlayson Live Networks, Inc.
http://www.live555.com/ From matt at schuckmannacres.com Tue Mar 10 15:02:38 2009 From: matt at schuckmannacres.com (Matt Schuckmann) Date: Tue, 10 Mar 2009 15:02:38 -0700 Subject: [Live-devel] Live555 RTSP Client never sees RTCP BYE message from Live555 Server In-Reply-To: <49B6AFB4.204@schuckmannacres.com> References: <49B594D6.5040706@schuckmannacres.com> <49B6AFB4.204@schuckmannacres.com> Message-ID: <49B6E37E.7080604@schuckmannacres.com> Ok, I added a call to RTCPInstance::sendBYE() at the very start of StreamState::endPlaying, and that seems to get the BYE sent to the client. Although I'm not sure what this would do in a multiple-client scenario: I think everybody would get the BYE message, which may not be right; might want to add a way to send the BYE to just the destinations that are being removed. What do you think? This also means that in some cases 2 BYE messages for each stream could be sent, which may or may not be a problem. Matt S. PS. You also should note that the BYE handler code in OpenRTSP causes all the streams to be deleted and the RTCPInstance objects with them; the problem is the RTCPInstance object is in the process of handling a packet. It's not a problem for OpenRTSP since the BYE handler never actually exits in this case (i.e. exit(0) is called in shutdown(), which is called from sessionAfterPlaying()), but anybody using this code as an example for an application that persists after the session has ended could be in a heap (pun intended) of trouble. I'm considering solving this in my case by scheduling a shutdown task after the last BYE message is received. Matt Schuckmann wrote: > Ok, I'm feeling better now and I've determined that the server is > attempting to send the RTCP BYE packet in the RTCPInstance destructor > but by that time all the destinations have been removed from the > GroupSocket so no data is actually sent (I've confirmed this with > WireShark).
> > I had thought that the client could use the receipt of the BYE packet, > for each stream, from the server as a signal that the session had been > closed (either the video ended or the server had force-ably closed the > session (my UI supports this) and the client could clean up and notify > the user appropriately. > Now I'm not so sure anymore. > > I did find a reference on the forums that the server will send the BYE > packet if the source supports seeking. Why can't it send the BYE > packet when the session is ending too? Seems like it would be the > right thing to do. > > The shutdown code for the streams is very confusing and I'm not sure > how to proceed to get things working the way I'd like, can you make > any suggestions? > > Thanks, > Matt S. > > > Matt Schuckmann wrote: >> I've been trying to determine why my Live555 based RTSP client is >> never seeing the RTCP BYE messages from the LIVE555 server object >> (i.e. my bye handler is never getting called). >> >> In reading the code it looks like the server RTCP code aways combines >> the BYE packet with the SR packet and it looks like the RTCP packet >> handler code on the client sees the SR packet reads the SR stuff then >> reads the stuff that is the same as the RR stuff then ignores >> anything else that might be in the packet. Do I have this correct? >> or am I missing something. Note pretty much the same appears to be >> true for the server receiving the BYE messages from the client. >> >> I'm having a difficult time debugging this (probably because I'm >> starting to feel sick, darn kids keep bring these germs home) so any >> help or hints as to what might be going on would be helpful. >> >> Thanks >> Matt S. 
>> _______________________________________________ >> live-devel mailing list >> live-devel at lists.live555.com >> http://lists.live555.com/mailman/listinfo/live-devel >> > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > From gbzbz3 at yahoo.com Tue Mar 10 16:57:24 2009 From: gbzbz3 at yahoo.com (Gbzbz Gbzbz) Date: Tue, 10 Mar 2009 16:57:24 -0700 (PDT) Subject: [Live-devel] frame rate supported Message-ID: <549762.37537.qm@web111102.mail.gq1.yahoo.com> I thought the fDuration is related to the frame rate? We have a hardware encoder and we save the contents to a file before we stream them out. When I play the file locally with VLC, it looks like 720p60, but at the remote VLC RTSP client side it is 720p30 (or less) - visually it is not as smooth. I am not familiar with live555 (or C++ in general), so I am not sure if scheduleDelayedTask or fDuration has anything to do with the above. May not?!?! --- On Tue, 3/10/09, Ross Finlayson wrote: > From: Ross Finlayson > Subject: Re: [Live-devel] frame rate supported > To: "LIVE555 Streaming Media - development & use" > Date: Tuesday, March 10, 2009, 9:19 PM > > Does current live555 support 60 > fps? say 720p60? any parameters in this area? > > Frame rate and dimension parameters are carried within the > video data itself (and therefore are specific to the video > codec, and have nothing to do with RTSP or RTP). > -- > Ross Finlayson > Live Networks, Inc.
> http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > From finlayson at live555.com Tue Mar 10 17:29:32 2009 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 10 Mar 2009 17:29:32 -0700 Subject: [Live-devel] frame rate supported In-Reply-To: <549762.37537.qm@web111102.mail.gq1.yahoo.com> References: <549762.37537.qm@web111102.mail.gq1.yahoo.com> Message-ID: >I thought the fDuration is related to the frame rate? It is. You (as the programmer of a source object) set it (actually, "fDurationInMicroseconds") for each frame, as appropriate. But the 'frame rate' itself is not carried as a parameter anywhere within RTP or RTSP. > We have a hardware encode Who's "we"? Yahoo? I don't think so. Sorry, but if you use a "From:" address like "From: Gbzbz Gbzbz ", then I'm going to assume that you're just a child, using your parents' computer while they're away. If you want to be taken seriously on this mailing list, use a real, professional email address - not "@yahoo.com" or "@gmail.com". >When play the file locally with VLC, it looks a 720P60, but at the >remote VLC RTSP client side it is 720P30(or less) - visually it is >not as smooth. Perhaps you're just seeing packet loss? -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Tue Mar 10 17:45:01 2009 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 10 Mar 2009 17:45:01 -0700 Subject: [Live-devel] Live555 RTSP Client never sees RTCP BYE message from Live555 Server In-Reply-To: <49B6E37E.7080604@schuckmannacres.com> References: <49B594D6.5040706@schuckmannacres.com> <49B6AFB4.204@schuckmannacres.com> <49B6E37E.7080604@schuckmannacres.com> Message-ID: In general, it's hard to respond to alleged bug reports on modified code. 
The best bug reports are those that apply to the original, unmodified code, so we can (hopefully) reproduce the problem (if any) ourselves. >PS. You also should note that the BYE handler code in OpenRTSP >causes all the streams to be deleted and the RTCPInstance objects >with them, the problem is the RTCPInstance object is in the process >of handling a packet. Remember that this is all single-threaded code. If an "RTCPInstance" object is deleted, then it's not also 'in the process' of doing anything else. The only way it could also be involved in 'handling a packet' would be if this packet handling happened later, as a result of an 'incoming packet' event in the event loop. But that should never happen, because - as a result of deleting the "RTCPInstance" object - "TaskScheduler::turnOffBackgroundReadHandling()" gets called. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Tue Mar 10 18:56:29 2009 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 10 Mar 2009 18:56:29 -0700 Subject: [Live-devel] setupDatagramSocket - SO_REUSEADDR problems In-Reply-To: <49B55A05.8040709@intraway.com> References: <49B55A05.8040709@intraway.com> Message-ID: >Hi, >I've found that in some cases (it seems to be random) the method >mediaSubsession::initiate will use the same UDP ports for both a >video stream and an audio stream. A sample output for the openRTSP >program is the following: > > >Created receiver for "video/MP4V-ES" subsession (client ports 34394-34395) >Created receiver for "audio/PCMU" subsession (client ports 34394-34395) >Setup "video/MP4V-ES" subsession (client ports 34394-34395) >Setup "audio/PCMU" subsession (client ports 34394-34395) > > >The problem seems to be the SO_REUSEPORT socket option on the >setupDatagramSocket function. >So, my question is: Is it safe to disable that option? The SO_REUSEPORT option is intended to allow more than one process - on the same host - to use the same (explicit) socket number. 
For example, it allows more than one application to receive the same multicast stream, or, in the case of a unicast stream, it allows a different process to receive it than the actual RTSP client (e.g., the "-r" option to "openRTSP"). It should not be allowing the OS to hand out the same ephemeral port number more than once. If your OS is doing this, then it may be buggy. If you're not anticipating any other process on the client host receiving the stream, then you could probably get away with not setting SO_REUSEPORT. However, fixing your buggy OS is probably the better option :-) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From SRawling at pelco.com Tue Mar 10 20:31:37 2009 From: SRawling at pelco.com (Rawling, Stuart) Date: Tue, 10 Mar 2009 20:31:37 -0700 Subject: [Live-devel] setupDatagramSocket - SO_REUSEADDR problems In-Reply-To: Message-ID: Hi, I tried an experiment running 2 openRTSP processes in the testProgs folder. They connected to the same URL (an Axis camera), and they both used the same client_ports. The server handled the connection on different ports, but the client_ports reported in the Transport section were the same. Should this be the case in separate processes? Stuart On 3/10/09 6:56 PM, "Ross Finlayson" wrote: >> >Hi, >> >I've found that in some cases (it seems to be random) the method >> >mediaSubsession::initiate will use the same UDP ports for both a >> >video stream and an audio stream. A sample output for the openRTSP >> >program is the following: >> > >> > >> >Created receiver for "video/MP4V-ES" subsession (client ports 34394-34395) >> >Created receiver for "audio/PCMU" subsession (client ports 34394-34395) >> >Setup "video/MP4V-ES" subsession (client ports 34394-34395) >> >Setup "audio/PCMU" subsession (client ports 34394-34395) >> > >> > >> >The problem seems to be the SO_REUSEPORT socket option on the >> >setupDatagramSocket function. >> >So, my question is: Is it safe to disable that option?
> The SO_REUSEPORT option is intended to allow more than one process - > on the same host - to use the same (explicit) socket number. For > example, it allows more than one application to receive the same > multicast stream, or, in the case of a unicast stream, it allows a > different process to receive it than the actual RTSP client (e.g., > the "-r" option to "openRTSP"). It should not be allowing the OS to > hand out the same ephemeral port number more than once. If your OS > is doing this, then it may be buggy. > > If you're not anticipating any other process on the client host > receiving the stream, then you could probably get away with not > setting SO_REUSEPORT. However, fixing your buggy OS is probably the > better option :-) > > -- > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel >
-------------- next part -------------- An HTML attachment was scrubbed... URL: From guido.marelli at intraway.com Tue Mar 10 21:52:32 2009 From: guido.marelli at intraway.com (Guido Marelli) Date: Wed, 11 Mar 2009 01:52:32 -0300 Subject: [Live-devel] setupDatagramSocket - SO_REUSEADDR problems In-Reply-To: References: <49B55A05.8040709@intraway.com> Message-ID: <977F5CDB71D1408B8AF55F8297D86DDA@ricardoc466e9e> Hi, I'm afraid I have to disagree with you. It seems that the code in MediaSubsession::initiate will cause the effect I'm reporting when the OS offers the same odd port number for both the video and the audio stream. The method looks like this:

while (1) {
  unsigned short rtpPortNum = fClientPortNum&~1;
  ...
  fRTPSocket = new Groupsock(env(), tempAddr, rtpPortNum, 255);
  ...
  fClientPortNum = ntohs(clientPort.num());
  ...
  fClientPortNum += 2;
}

So you force the OS to give you an explicit port (rtpPortNum), a port that it will always give you because you are using SO_REUSEPORT. Regards, Guido -----Original Message----- From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Tuesday, March 10, 2009 22:56 To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] setupDatagramSocket - SO_REUSEADDR problems >Hi, >I've found that in some cases (it seems to be random) the method >MediaSubsession::initiate will use the same UDP ports for both a video >stream and an audio stream. A sample output for the openRTSP program is >the following: > > >Created receiver for "video/MP4V-ES" subsession (client ports >34394-34395) Created receiver for "audio/PCMU" subsession (client ports >34394-34395) Setup "video/MP4V-ES" subsession (client ports >34394-34395) Setup "audio/PCMU" subsession (client ports 34394-34395) > > >The problem seems to be the SO_REUSEPORT socket option in the >setupDatagramSocket function.
>So, my question is: Is it safe to disable that option?

The SO_REUSEPORT option is intended to allow more than one process - on
the same host - to use the same (explicit) socket number. For example,
it allows more than one application to receive the same multicast
stream, or, in the case of a unicast stream, it allows a different
process to receive it than the actual RTSP client (e.g., the "-r"
option to "openRTSP"). It should not be allowing the OS to hand out the
same ephemeral port number more than once. If your OS is doing this,
then it may be buggy.

If you're not anticipating any other process on the client host
receiving the stream, then you could probably get away with not setting
SO_REUSEPORT. However, fixing your buggy OS is probably the better
option :-)
-- 
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
_______________________________________________
live-devel mailing list
live-devel at lists.live555.com
http://lists.live555.com/mailman/listinfo/live-devel

From zhangjm at uusee.com  Tue Mar 10 22:52:38 2009
From: zhangjm at uusee.com (Jimmy Zhang)
Date: Wed, 11 Mar 2009 13:52:38 +0800
Subject: [Live-devel] RtpOverTcp problem
Message-ID: 

hi,
As an RTSP server, I try to use RTP-over-TCP, which uses the same
socket as the "RTSPClientSession" object. When playing starts, the
RTCPInstance socket read handler will replace RTSPClientSession's
handler. If there is any subsequent RTSP request, such as "PAUSE" or
"TEARDOWN", RTSPClientSession will not handle the request.
How can I resolve this problem?
Thank you.
-------------- next part --------------
An HTML attachment was scrubbed...
URL: From finlayson at live555.com Tue Mar 10 23:06:58 2009 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 10 Mar 2009 23:06:58 -0700 Subject: [Live-devel] setupDatagramSocket - SO_REUSEADDR problems In-Reply-To: <977F5CDB71D1408B8AF55F8297D86DDA@ricardoc466e9e> References: <49B55A05.8040709@intraway.com> <977F5CDB71D1408B8AF55F8297D86DDA@ricardoc466e9e> Message-ID: >It seems that the code on MediaSubsession::initiate will cause the effect >I'am reporting when the OS offers the same odd port number for both the >video and the audio stream. Yes, you're right. This bug got introduced in version 2008.12.20 when I changed the port number selection code in response to another bug that some people were seeing. (Before, the code was always letting the OS choose the port number, and this was sometimes causing a loop whereby the same (odd) port number would get chosen over and over again.) From what I can tell, the problem occurs only if we end up making the code - rather than the OS - choose a port number. (So, SO_REUSEPORT is not the problem, because even if this were not set, we'd end up getting an error when we tried to create the socket with the same port number the second time.) It seems that I need to change the code again so that it always lets the OS choose the port number, but be smarter about doing so, so we don't end up in an infinite loop. Stay tuned... -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Tue Mar 10 23:15:17 2009 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 10 Mar 2009 23:15:17 -0700 Subject: [Live-devel] RtpOverTcp problem In-Reply-To: References: Message-ID: >hi, >As a RTSP server, I try to use RtpOverTcp which has the same socket >as "RTSPClientSession" objet. >When start playing, RTCPInstance socket read handler will replace >RTSPClientSession's handler. >If there is any other sequent RTSP Request, such as "PAUSE, >TEARDOWN", RTSPClientSession will not handle the request. 
>How can I resolve this problem?

Be patient. This is a known problem, and will be fixed soon.
-- 
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From yangliu.seu at gmail.com  Wed Mar 11 02:28:25 2009
From: yangliu.seu at gmail.com (liu yang)
Date: Wed, 11 Mar 2009 17:28:25 +0800
Subject: [Live-devel] my performance benchmark of livemedia library, not satisfactory
Message-ID: <55d43d1d0903110228m55c51089o3365ebd6a4dd3672@mail.gmail.com>

Hi,
I plan to develop an application which may need to support 500+ or even
1000+ RTP sessions simultaneously. Can anybody tell me whether
livemedia can support such a load?
BTW, I did some tests based on the testWAV sample program. The result
is not satisfactory, frankly speaking. The FAQ told me livemedia is a
single-threaded framework, in which all logic is processed sequentially
in a single thread.

Below are my test statistics.
Hardware: Intel Celeron 2G + 1G memory
Scenario: streaming one WAV file to one external IP on the same subnet
as the test box, with 100 instances (100 RTPSources, 100 RTPSinks,
1 TaskScheduler, 1 UsageEnvironment)
Result: CPU usage ~30%, average jitter ~30ms

Besides those figures, I also saw significant packet loss.

So do you have any insightful thoughts on where we can optimize, to
make livemedia a high-performance RTP streaming stack that can
withstand heavy load?
Thanks
Kandy

From amadorim at vdavda.com  Wed Mar 11 03:31:39 2009
From: amadorim at vdavda.com (Marco Amadori)
Date: Wed, 11 Mar 2009 11:31:39 +0100
Subject: [Live-devel] my performance benchmark of livemedia library, not satisfactory
In-Reply-To: <55d43d1d0903110228m55c51089o3365ebd6a4dd3672@mail.gmail.com>
References: <55d43d1d0903110228m55c51089o3365ebd6a4dd3672@mail.gmail.com>
Message-ID: <200903111131.39306.amadorim@vdavda.com>

On Wednesday 11 March 2009, 10:28:25, liu yang wrote:
> I plan to develop an application which may support 500+ or even 1000+
> rtp session simultaneously. So anybody could tell me whether livemedia
> could support such load?
> BTW, I did some test based on testWAV sample program. The result is
> not satisfactory, frankly speaking.

On a bigger machine (dual Xeon) with 6 RAID5 15K SAS disks, streaming
4 Mbit/s MPEG-2 TS, I found that I could not stream more than 95
streams without artifacts on screen.

My bold analysis (on an early 2008 release of livemedia) was that the
problem wasn't I/O bound but CPU bound (95%+). Also, the network
(400 Mbps) wasn't problematic, since we had tried both a single gigabit
interface and a bonding of 4 interfaces, seeing very little load on
both the server and router side.

But I just did a quick analysis, so I could be entirely wrong or
misled.

> FAQ told me livemedia is a single threaded framework, which all logics
> are processed in single thread sequentially.

This could be a problem (a known one) in our case, since the multiple
Xeon cores and CPUs were not used. Launching another instance of
live555MediaServer helped in adding another 95 streams to our tests...
so this could be a hint for looking for optimization interventions. Do
some profiling, and if some computational effort is really needed,
parallelize the code where possible in order to use multiple
cores/CPUs.

> So do you have any insightful thoughts of where we can optimize to
> enhance livemeida as a high-performance rtp streaming stack which
> could undergo heavy load.
This is of real interest to me too.

-- 
ESC:wq

-- 
This message has been scanned for viruses and
dangerous content by MailScanner, and is
believed to be clean.

From Alan.Roberts at e2v.com  Wed Mar 11 04:01:44 2009
From: Alan.Roberts at e2v.com (Roberts, Alan)
Date: Wed, 11 Mar 2009 11:01:44 -0000
Subject: [Live-devel] openRTSP saving to a microSD card
Message-ID: <8821BCD7410B064BA4414E4F8080E5EB043A2D2E@whl46.e2v.com>

Hi all,
Could anyone point me in the right direction please? I suspect that
somewhere within openRTSP there's some sort of "link timeout" to cope
with timing issues related to networking. I need to take a closer look.

I've got a much better idea of my problem now - my incoming streaming
MPEG-4 video turns off at pretty much the same time as my record OFF
signal... unfortunately I have no control over this (i.e. I can't
change it).

If the stream stops after the record OFF then I don't have a problem -
I can successfully kill the openRTSP process and unmount my SD card
(thanks patbob). However, if the stream turns off before the record OFF
signal, then some sort of 30-second timeout looks like it's getting
invoked within openRTSP - maybe some sort of "waiting for the link to
come up again" thing. During this time a kill -HUP has no effect -
which is my problem. I need to kill it straight away and unmount the SD
card, because under some scenarios the power to the unit could be
removed within 3 seconds. If I leave it running then the process gets
successfully killed, I can unmount the SD card, and all is well. Like I
say though - I haven't got the luxury of just leaving it to do its own
thing.

I have a point-to-point Ethernet connection with (hopefully) very
little fluctuation in data rate... and when the link is up... it's up.
If I could knock the 30 seconds down to, say, 2 seconds then life would
be rosy.

Any ideas?
Kind regards,
Alan

PS: If it's an Ethernet driver thing then I apologise for troubling
you, but I genuinely think it's some sort of networking robustness
built into openRTSP.

Sent by E2V TECHNOLOGIES PLC or a member of the E2V group of companies.
A company registered in England and Wales. Company number; 04439718.
Registered address; 106 Waterhouse Lane, Chelmsford, Essex, CM1 2QU, UK.
______________________________________________________________________
This email has been scanned by the MessageLabs Email Security System.
For more information please visit http://www.messagelabs.com/email
______________________________________________________________________
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From daijun88 at gmail.com  Wed Mar 11 08:45:45 2009
From: daijun88 at gmail.com (dai jun)
Date: Wed, 11 Mar 2009 23:45:45 +0800
Subject: [Live-devel] What's the DirectedNetInterface used for?
Message-ID: 

I'm reading the groupSock source code these days. I found
DirectedNetInterface is a pure abstract class. In
groupSock::outputToAllMemberExcept(...), its member functions are
called, but I cannot find any implementation in the project.
I wonder what this class is used for? I guess it can be extended and
used for stream forwarding, or application-layer multicast?

Daly
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From matt at schuckmannacres.com  Wed Mar 11 10:02:06 2009
From: matt at schuckmannacres.com (Matt Schuckmann)
Date: Wed, 11 Mar 2009 10:02:06 -0700
Subject: [Live-devel] Live555 RTSP Client never sees RTCP BYE message from Live555 Server
In-Reply-To: 
References: <49B594D6.5040706@schuckmannacres.com> <49B6AFB4.204@schuckmannacres.com> <49B6E37E.7080604@schuckmannacres.com>
Message-ID: <49B7EE8E.2010805@schuckmannacres.com>

I understand that it's hard to test bugs on modified code. I'd submit
my modifications to the project, but you've already told me that you
won't accept some of them (I understand your reasons), and I'm not
ready to submit the others. I'm only trying to help you out by
reporting what I see; I haven't changed any of the code in the areas
I'm referring to here.

I'm pretty sure that this would happen with unmodified code: just see
if your bye handler is called in OpenRTSP after sending a TEARDOWN
message while the source is still playing. Or, better yet, stick a
network sniffer on a test setup and you'll see the server never sends
the BYE message after receiving a TEARDOWN from the client.

Also note the call sequence when the server is tearing things down:
The server deletes the clientSession, which causes reclaimStreamStates
to be called, which causes OnDemandServerMediaSubsession::deleteStream().
The first thing OnDemandServerMediaSubsession::deleteStream() does is
call StreamState::endPlaying() on the streamState, with the destination
addresses to stop playing for. endPlaying() removes the destinations
from the groupsock associated with the RTCPInstance object.
OnDemandServerMediaSubsession::deleteStream() then goes through and
decrements the refcounts for all the streamStates, and deletes all the
streamStates with a reference count of 0 (which, for a unicast session,
is all of them). The destructor for the streamState objects calls
StreamState::reclaim, which calls Medium::close() for the RTCPInstance,
which deletes the RTCPInstance.
The destructor for RTCPInstance calls SendBYE(), which on the surface
appears to work but in fact doesn't do anything at all, because all the
destinations in the groupsock have already been removed, so the BYE
message never gets sent to the client.
Make sense?

Ross Finlayson wrote:
> In general, it's hard to respond to alleged bug reports on modified
> code. The best bug reports are those that apply to the original,
> unmodified code, so we can (hopefully) reproduce the problem (if any)
> ourselves.
>
>> PS. You also should note that the BYE handler code in OpenRTSP causes
>> all the streams to be deleted and the RTCPInstance objects with them,
>> the problem is the RTCPInstance object is in the process of handling
>> a packet.
>
> Remember that this is all single-threaded code. If an "RTCPInstance"
> object is deleted, then it's not also 'in the process' of doing
> anything else. The only way it could also be involved in 'handling a
> packet' would be if this packet handling happened later, as a result
> of an 'incoming packet' event in the event loop. But that should
> never happen, because - as a result of deleting the "RTCPInstance"
> object - "TaskScheduler::turnOffBackgroundReadHandling()" gets called.

I'm very aware that it's single-threaded, and I've made sure to work
within that context. The fact that it's a single-threaded library
doesn't preclude one from calling delete from within one of that
object's methods, or, worse yet, while such a method is on the call
stack, which is the case here.

If you look at the BYE case in RTCPInstance::incomingReportHandler1(),
you see that the byeHandler is called on or about line 522 (I've
changed all the fprintf's to calls to envir() << to improve debugging
in a non-command-line app, so my line numbers might not match yours;
that's the only change I've made to this code).
If you follow the call sequence for OpenRTSP, you'll see that this
becomes a call to subsessionByeHandler in PlayCommon.cpp. If all the
subsessions have been stopped, subsessionByeHandler() then calls
sessionAfterPlaying() (also in PlayCommon.cpp). The
sessionAfterPlaying() function then calls Shutdown(), which calls
closeMediaSinks() and tearDownStreams(), and closes the session before
calling exit(). I think it's closeMediaSinks() or tearDownStreams()
that causes all the streams, and their respective RTCPInstances, to be
deleted. Now, if you remove the call to exit(), you'd see that
eventually the call stack will unwind all the way back up to
RTCPInstance::incomingReportHandler1() - but, oops, the object
associated with this call has been deleted, so if any of the execution
after the call to the byeHandler tries to use any of that object's
state, you'll be in trouble. That's what I mean by "the RTCPInstance is
deleted while it's in the process of handling the BYE message". I
generally consider it bad form to call delete on an object while one of
its methods is on the call stack. In this particular case I don't think
that RTCPInstance::incomingReportHandler1() should be changed; it can't
know that a client is going to want to delete it - in fact, it should
assume that it won't. More likely, OpenRTSP should be changed to clean
things up outside of the call sequence originating from its byeHandler.
In my case I just scheduled a delayed task with the taskScheduler to do
the clean-up.

I discovered this when:
1. I got the server to send BYE messages by adding a call to SendBYE as
I described before.
2. I changed all the fprintf's in RTCPInstance to envir() << and my app
would crash when RTCPInstance::incomingReportHandler1() would try to
log something after the call to the byeHandler.

I hope this all makes sense and it helps you and other users of this
very useful library out.

Matt S.
From guido.marelli at intraway.com  Wed Mar 11 10:33:01 2009
From: guido.marelli at intraway.com (Guido Marelli)
Date: Wed, 11 Mar 2009 15:33:01 -0200
Subject: [Live-devel] setupDatagramSocket - SO_REUSEADDR problems
In-Reply-To: 
References: <49B55A05.8040709@intraway.com> <977F5CDB71D1408B8AF55F8297D86DDA@ricardoc466e9e>
Message-ID: <49B7F5CD.8060501@intraway.com>

Hi,
I think that SO_REUSEPORT will continue introducing problems, because
we need to force a port when creating a socket for the RTCP channel
(RTP port + 1).

I believe that the best approach is to forget about SO_REUSEPORT, so we
can be sure that every socket requested from the OS will work just
fine.

Regards!

Ross Finlayson wrote:
>> It seems that the code on MediaSubsession::initiate will cause the
>> effect I'am reporting when the OS offers the same odd port number for
>> both the video and the audio stream.
>
> Yes, you're right. This bug got introduced in version 2008.12.20 when
> I changed the port number selection code in response to another bug
> that some people were seeing. (Before, the code was always letting
> the OS choose the port number, and this was sometimes causing a loop
> whereby the same (odd) port number would get chosen over and over
> again.)
>
> From what I can tell, the problem occurs only if we end up making the
> code - rather than the OS - choose a port number. (So, SO_REUSEPORT
> is not the problem, because even if this were not set, we'd end up
> getting an error when we tried to create the socket with the same
> port number the second time.)
>
> It seems that I need to change the code again so that it always lets
> the OS choose the port number, but be smarter about doing so, so we
> don't end up in an infinite loop. Stay tuned...

-- 
Guido Marelli
Intraway Corp.
Oficina AR: +54 (11)  4393-2091
Oficina CO: +57 (1)   750-4929
Oficina US: +1  (516) 620-3890
       Fax: +54 (11)  5258-2631
       MSN: guido.marelli at intraway.com

Visit our website at http://www.intraway.com

From finlayson at live555.com  Wed Mar 11 08:57:50 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 11 Mar 2009 08:57:50 -0700
Subject: [Live-devel] What's the DirectedNetInterface used for?
In-Reply-To: 
References: 
Message-ID: 

>I'm reading the groupSock source code these days.
>I found DirectedNetInterface is a pure abstract class,
>In groupSock::outputToAllMemberExcept(...), the member functions are
>called, but I cannot find any implementation in the project.
>I wonder what's this class used for?
>I guess it can be extended and used for stream forwarding? or
>application layer multicast?

Yes, we use this for our implementation of UMTP (the UDP Multicast
Tunneling Protocol). Otherwise, this code ends up not being used.
-- 
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From martin at martin.st  Wed Mar 11 12:36:14 2009
From: martin at martin.st (Martin Storsjö)
Date: Wed, 11 Mar 2009 21:36:14 +0200 (EET)
Subject: [Live-devel] [PATCH] RTSPClient::recordMediaSession?
Message-ID: 

Hi,

I noticed that the RTSPClient class lacks a method for sending RECORD
requests for a whole media session; there's only a method for sending
RECORD requests for individual subsessions. Is this an intentional
omission, or has it just not been needed yet?

I implemented such a method, see the attached patch. This seems to work
fine for me.

Regards,
// Martin Storsjö
-------------- next part --------------
A non-text attachment was scrubbed...
Name: live555-record-media-session.diff
Type: text/x-diff
Size: 2587 bytes
Desc: 
URL: 

From finlayson at live555.com  Wed Mar 11 12:45:20 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 11 Mar 2009 12:45:20 -0700
Subject: [Live-devel] [PATCH] RTSPClient::recordMediaSession?
In-Reply-To: 
References: 
Message-ID: 

>I noticed that the RTSPClient class lacks a method for sending
>RECORD requests for a whole media session; there's only a method for
>sending RECORD requests for individual subsessions. Is this an
>intentional omission, or has it just not been needed yet?

The latter.

>I implemented such a method, see the attached patch. This seems to
>work fine for me.

I will likely add this when I update the "RTSPClient" code.
-- 
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From bitter at vtilt.com  Wed Mar 11 13:09:15 2009
From: bitter at vtilt.com (Brad Bitterman)
Date: Wed, 11 Mar 2009 16:09:15 -0400
Subject: [Live-devel] my performance benchmark of livemedia library, not satisfactory
In-Reply-To: <200903111131.39306.amadorim@vdavda.com>
References: <55d43d1d0903110228m55c51089o3365ebd6a4dd3672@mail.gmail.com> <200903111131.39306.amadorim@vdavda.com>
Message-ID: <5A4BFC47-6A00-4FA4-8B6C-36DB7FE7E204@vtilt.com>

I found that under Linux a single-threaded process, such as one using
live555, only runs on one core of a multi-core CPU. My suggestion is to
run multiple processes if possible. This will let Linux distribute the
processes to the different cores.

- Brad

On Mar 11, 2009, at 6:31 AM, Marco Amadori wrote:

> On Wednesday 11 March 2009, 10:28:25, liu yang wrote:
>
>> I plan to develop an application which may support 500+ or even 1000+
>> rtp session simultaneously. So anybody could tell me whether
>> livemedia could support such load?
>> BTW, I did some test based on testWAV sample program. The result is
>> not satisfactory, frankly speaking.
>
> On a bigger machine (dual xeon) with 6 raid5 15K SAS disks streaming
> 4mbits MPEG2 ts I found that I could not stream more than 95 streams
> without artifacts on screen.
>
> My bold analisys (on a early 2008 release of livemedia) was that the
> problem wasn't IO bound but CPU bound (95%+).
> Also the network (400Mbps) wasn't problematic since we had tried both
> a single gigabit and a bonding of 4 interfaces sawing both server and
> router side very little load.
>
> But I just did a quick analisys, so I could be enterely wrong or
> misleaded.
>
>> FAQ told me livemedia is a single threaded framework, which all
>> logics are processed in single thread sequentially.
>
> This could be a problem (a known one) in our case since multiple Xeon
> cores and CPUs was not used. Launching another session of
> live555MediaServer helper in adding another 95 streams to our tests..
> so this could be a hint for looking for optimization interventions. Do
> some profiling and if some computation effort is really needed,
> parallelize the code as possible in order to use multiples cores/CPUs.
>
>> So do you have any insightful thoughts of where we can optimize to
>> enhance livemeida as a high-performance rtp streaming stack which
>> could undergo heavy load.
>
> This is of real interest to me too.
>
> -- 
> ESC:wq
>
> -- 
> This message has been scanned for viruses and
> dangerous content by MailScanner, and is
> believed to be clean.
>
> _______________________________________________
> live-devel mailing list
> live-devel at lists.live555.com
> http://lists.live555.com/mailman/listinfo/live-devel

From patbob at imoveinc.com  Wed Mar 11 17:10:03 2009
From: patbob at imoveinc.com (Patrick White)
Date: Wed, 11 Mar 2009 17:10:03 -0700
Subject: [Live-devel] frame rate supported
In-Reply-To: <549762.37537.qm@web111102.mail.gq1.yahoo.com>
References: <549762.37537.qm@web111102.mail.gq1.yahoo.com>
Message-ID: <200903111710.03977.patbob@imoveinc.com>

Could you be running into sleep quantization issues? 60 fps is down in
sleep quantization territory. The buffering on the receiver should take
care of things, but maybe it isn't, or the timestamps are not getting
generated correctly?
patbob

On Tuesday 10 March 2009 4:57 pm, Gbzbz Gbzbz wrote:
> I thought the fDuration is related to the frame rate? We have a
> hardware encoder and we save the contents to a file before we stream
> them out. When play the file locally with VLC, it looks a 720P60, but
> at the remote VLC RTSP client side it is 720P30 (or less) - visually
> it is not as smooth.
>
> I am not familiar with the live555 (or C++ in general). So I am not
> sure if the schedulTask or fDuration has anything to do with the
> above. May not?!?!
>
> --- On Tue, 3/10/09, Ross Finlayson wrote:
> > From: Ross Finlayson
> > Subject: Re: [Live-devel] frame rate supported
> > To: "LIVE555 Streaming Media - development & use"
> > Date: Tuesday, March 10, 2009, 9:19 PM
> >
> > > Does current live555 support 60 fps? say 720p60? any parameters
> > > in this area?
> >
> > Frame rate and dimension parameters are carried within the video
> > data itself (and therefore is specific to the video codec, and has
> > nothing to do with RTSP or RTP).
> > -- 
> > Ross Finlayson
> > Live Networks, Inc.
> > http://www.live555.com/
> > _______________________________________________
> > live-devel mailing list
> > live-devel at lists.live555.com
> > http://lists.live555.com/mailman/listinfo/live-devel
>
> _______________________________________________
> live-devel mailing list
> live-devel at lists.live555.com
> http://lists.live555.com/mailman/listinfo/live-devel

From yangliu.seu at gmail.com  Wed Mar 11 19:52:15 2009
From: yangliu.seu at gmail.com (liu yang)
Date: Thu, 12 Mar 2009 10:52:15 +0800
Subject: [Live-devel] my performance benchmark of livemedia library, not satisfactory
Message-ID: <55d43d1d0903111952l36d6b42eubd6ac9dc85f91152@mail.gmail.com>

Thanks for your replies. But starting multiple processes doesn't suit
my case, because I use livemedia as one independent component of my
application, which has only one instance.
My application is concerned about performance, high capacity, real-time packet delivery. Seem livemedia is not designed for such purpose. BTW, I plan to do some change to make livemedia threadsafe and multi-threaded. So any part of code you think I need change? As far as I know now, what I think needs be modified as below :- 1. static member variable "tokenCounter" needs be changed to class member variable to isolate eachTaskScheduler instance which is the main entry of thread polling. 2. IMHO, current delta timing mechanism to trigger timer is not very efficient because every time when invoking "synchronize", whole queue will be traversed. Why not just keep it simple to maintain a sorted timeval-eventProc map? Scheduler just compares the absolute now time with the expected timeval, if expires, fire the proc. 3. Is it possible to make delayed task polling and trigger as a separate thread? I mean, a separate thread runs dedicatedly to maintain the timer queue and do event trigger. We know, most logic processing of livemedia is timer triggered ( DelayTask in livemedia's jargon ) Welcome you guys' thoughts and comments. I think, most of you also concern about livemdia performance. Hope we can make it better together. Thanks Kandy On Thu, Mar 12, 2009 at 8:12 AM, wrote: > Send live-devel mailing list submissions to > ? ? ? ?live-devel at lists.live555.com > > To subscribe or unsubscribe via the World Wide Web, visit > ? ? ? ?http://lists.live555.com/mailman/listinfo/live-devel > or, via email, send a message with subject or body 'help' to > ? ? ? ?live-devel-request at lists.live555.com > > You can reach the person managing the list at > ? ? ? ?live-devel-owner at lists.live555.com > > When replying, please edit your Subject line so it is more specific > than "Re: Contents of live-devel digest..." > > > Today's Topics: > > ? 1. What's the DirectedNetInterface used for? (dai jun) > ? 2. Re: Live555 RTSP Client never sees RTCP BYE message from > ? ? 
?Live555 Server (Matt Schuckmann) > ? 3. Re: setupDatagramSocket - SO_REUSEADDR ?problems (Guido Marelli) > ? 4. Re: What's the DirectedNetInterface used for? (Ross Finlayson) > ? 5. [PATCH] RTSPClient::recordMediaSession? (Martin Storsj?) > ? 6. Re: [PATCH] RTSPClient::recordMediaSession? (Ross Finlayson) > ? 7. Re: my performance benchmark of livemedia library, ? ? ? ?not > ? ? ?satisfactory (Brad Bitterman) > ? 8. Re: frame rate supported (Patrick White) > > > ---------------------------------------------------------------------- > > Message: 1 > Date: Wed, 11 Mar 2009 23:45:45 +0800 > From: dai jun > Subject: [Live-devel] What's the DirectedNetInterface used for? > To: live-devel at ns.live555.com > Message-ID: > ? ? ? ? > Content-Type: text/plain; charset="iso-8859-1" > > I'm reading the groupSock source code these days.I found > DirectedNetInterface is a pure abstract class, > In groupSock::outputToAllMemberExcept(...), the member functions are called, > but I cannot find any implementation in the project. > I wonder what's this class used for? > I guess it can be extended and used for stream forwarding? or application > layer multicast? > > Daly > -------------- next part -------------- > An HTML attachment was scrubbed... > URL: > > ------------------------------ > > Message: 2 > Date: Wed, 11 Mar 2009 10:02:06 -0700 > From: Matt Schuckmann > Subject: Re: [Live-devel] Live555 RTSP Client never sees RTCP BYE > ? ? ? ?message from ? ?Live555 Server > To: LIVE555 Streaming Media - development & use > ? ? ? ? > Message-ID: <49B7EE8E.2010805 at schuckmannacres.com> > Content-Type: text/plain; charset=ISO-8859-1; format=flowed > > I understand that it's hard to test bugs on modified code, I'd submit my > modifications to the project but you've already told me that you won't > accept some of them (I understand your reasons), and I'm not ready to > submit the others. 
I'm only trying to help you out by reporting what I > see, I haven't changed any of the code in the areas I'm referring to here. > > I'm pretty sure that this would happen with unmodified code, just see if > your bye handler is called in OpenRTSP after sending or a teardown > message while the source is still playing. Or better yet stick a network > sniffer on a test setup and you'll see the server never sends the bye > message after receiving a teardown from the client. > > Also note the call sequence when the server is tearing things down: > The server deletes the clientSession which causes reclaimStreamStates to > be called which causes OnDemandServerMediaSubsession::deleteStream(). > The first thing OnDemandServerMediaSubsession::deleteStream() does is > call StreamState::endPlaying() on the streamState with destination > addresses to stop playing for. > End playing removes the destinations from the groupSock associated with > the RTCPInstance object. > OnDemandServerMediaSubsession::deleteStream() goes through and > decrements the refcounts for all the streamStates and deletes all the > streamState's with a reference count of 0 (which a unicast session is > all of them). The destructor for the streamState objects calls > StreamState::reclaim which calls Medium::close() for the RTCPinstance , > which deletes the RTCPInstance. The destructor for RTCPInstance calls > SendBYE() which on the surface appears to work but in fact doesn't do > anything at all, because all the destinations in the groupsock have been > removed already so the BYE message never gets sent to the server. > Make sense? > > Ross Finlayson wrote: >> In general, it's hard to respond to alleged bug reports on modified >> code. ?The best bug reports are those that apply to the original, >> unmodified code, so we can (hopefully) reproduce the problem (if any) >> ourselves. >> >> >>> PS. 
You also should note that the BYE handler code in OpenRTSP causes >>> all the streams to be deleted and the RTCPInstance objects with them, >>> the problem is the RTCPInstance object is in the process of handling >>> a packet. >> >> Remember that this is all single-threaded code. ?If an "RTCPInstance" >> object is deleted, then it's not also 'in the process' of doing >> anything else. ?The only way it could also be involved in 'handling a >> packet' would be if this packet handling happened later, as a result >> of an 'incoming packet' event in the event loop. ?But that should >> never happen, because - as a result of deleting the "RTCPInstance" >> object - "TaskScheduler::turnOffBackgroundReadHandling()" gets called. > > I'm very aware that it's single threaded and I've made sure to work > within that context. The fact that it's a single thread library doesn't > preclude one from calling delete from within one of that objects > methods, or worse yet while such a method is on the call stack, which is > the case here. > > If you look at the BYE case in RTCPInstance::incomingReportHandler1() > you see that the byeHandler is called on or about line 522 (I've changed > all the fprintf's to calls to envir() << to improve debugging in a non > command line app, so my line numbers might not match yours, that's the > only change I've made to this code). > If you follow the call sequence for OpenRTSP you'll see that this > becomes a call to subsessionByeHandler in PlayCommon.cpp, ?If all the > subsessions have been stopped subsesionByeHandler() then calls > sessionAfterPlaying() (also in PlayCommon.cpp). ?The > sessionAfterPlaying() function then calls Shutdown() which calls > closeMediaSinks() and tearDownStreams() , and closes the session before > calling exit(). It think it's closeMediaSink() or tearDownStreams() that > causes all the streams and there respective RTCPInstance's to be > deleted. 
Now if you remove the call to exit() you'd see that eventually > the call stack will unwind all the way back up to > RTCPInstance::incomingReportHandler1() - but oops, the object associated > with this call has been deleted, so if any of the execution after the > call to the byeHandler tries to use any of that object's state you'll be > in trouble. That's what I mean by the RTCPInstance is deleted while it's > in the process of handling the BYE message. I generally consider it bad > form to call delete on an object while one of its methods is on the > call stack. In this particular case I don't think that > RTCPInstance::incomingReportHandler1() should be changed; it can't > know that a client is going to want to delete it, and in fact it should > assume that it won't. More likely OpenRTSP should be changed to clean > things up outside of the call sequence originating from its byeHandler. > In my case I just scheduled a delayed task with the taskScheduler to do the > cleanup. > > I discovered this when: > 1. I got the server to send BYE messages by adding a call to SendBYE as > I described before. > 2. I changed all the fprintf's in RTCPInstance to envir() << and my app > would crash when RTCPInstance::incomingReportHandler1() would try to log > something after the call to the byeHandler. > > I hope this all makes sense and it helps you and other users of this > very useful library out. > > Matt S. > > > ------------------------------ > > Message: 3 > Date: Wed, 11 Mar 2009 15:33:01 -0200 > From: Guido Marelli > Subject: Re: [Live-devel] setupDatagramSocket - SO_REUSEADDR problems > To: LIVE555 Streaming Media - development & use > Message-ID: <49B7F5CD.8060501 at intraway.com> > Content-Type: text/plain; charset=ISO-8859-1; format=flowed > > Hi, > I think that SO_REUSEPORT will continue introducing problems because we > need to force a port when creating a socket for the RTCP channel (RTP > port + 1).
> > I believe that the best approach is to forget about SO_REUSEPORT so we > can be sure that every socket requested to the OS will work just fine. > > > Regards! > > > Ross Finlayson wrote: >>> It seems that the code in MediaSubsession::initiate will cause the >>> effect >>> I'm reporting when the OS offers the same odd port number for both the >>> video and the audio stream. >> >> Yes, you're right. This bug got introduced in version 2008.12.20 when >> I changed the port number selection code in response to another bug >> that some people were seeing. (Before, the code was always letting >> the OS choose the port number, and this was sometimes causing a loop >> whereby the same (odd) port number would get chosen over and over again.) >> >> From what I can tell, the problem occurs only if we end up making the >> code - rather than the OS - choose a port number. (So, SO_REUSEPORT >> is not the problem, because even if this were not set, we'd end up >> getting an error when we tried to create the socket with the same port >> number the second time.) >> >> It seems that I need to change the code again so that it always lets >> the OS choose the port number, but be smarter about doing so, so we >> don't end up in an infinite loop. Stay tuned... >> > > -- > Guido Marelli > Intraway Corp. > > Office AR: +54 (11) 4393-2091 > Office CO: +57 (1) 750-4929 > Office US: +1 (516) 620-3890 > Fax: +54 (11) 5258-2631 > MSN: guido.marelli at intraway.com > > Visit our web site at http://www.intraway.com > > > > > ------------------------------ > > Message: 4 > Date: Wed, 11 Mar 2009 08:57:50 -0700 > From: Ross Finlayson > Subject: Re: [Live-devel] What's the DirectedNetInterface used for? > To: LIVE555 Streaming Media - development & use > Message-ID: > Content-Type: text/plain; charset="us-ascii"; format="flowed" > >>I'm reading the groupSock source code these days.
>>I found DirectedNetInterface is a pure abstract class, >>In groupSock::outputToAllMemberExcept(...), the member functions are >>called, but I cannot find any implementation in the project. >>I wonder what's this class used for? >>I guess it can be extended and used for stream forwarding? or >>application layer multicast? > > Yes, we use this for our implementation of UMTP > > > Otherwise, this code ends up not being used. > -- > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > ------------------------------ > > Message: 5 > Date: Wed, 11 Mar 2009 21:36:14 +0200 (EET) > From: Martin Storsjö > Subject: [Live-devel] [PATCH] RTSPClient::recordMediaSession? > To: live-devel at ns.live555.com > Message-ID: > Content-Type: text/plain; charset="iso-8859-1"; format="flowed" > > Hi, > > I noticed that the RTSPClient class lacks a method for sending RECORD > requests for a whole media session; there's only a method for sending > RECORD requests for individual subsessions. Is this an intentional > omission, or has it just not been needed yet? > > I implemented such a method, see the attached patch. This seems to work > fine for me. > > Regards, > // Martin Storsjö > -------------- next part -------------- > A non-text attachment was scrubbed... > Name: live555-record-media-session.diff > Type: text/x-diff > Size: 2587 bytes > Desc: > URL: > > ------------------------------ > > Message: 6 > Date: Wed, 11 Mar 2009 12:45:20 -0700 > From: Ross Finlayson > Subject: Re: [Live-devel] [PATCH] RTSPClient::recordMediaSession? > To: LIVE555 Streaming Media - development & use > Message-ID: > Content-Type: text/plain; charset="us-ascii"; format="flowed" > >>I noticed that the RTSPClient class lacks a method for sending >>RECORD requests for a whole media session; there's only a method for >>sending RECORD requests for individual subsessions. Is this an >>intentional omission, or has it just not been needed yet? > > The latter.
> > >>I implemented such a method, see the attached patch. This seems to >>work fine for me. > > I will likely add this when I update the "RTSPClient" code. > -- > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > ------------------------------ > > Message: 7 > Date: Wed, 11 Mar 2009 16:09:15 -0400 > From: Brad Bitterman > Subject: Re: [Live-devel] my performance benchmark of livemedia > library, not satisfactory > To: LIVE555 Streaming Media - development & use > Message-ID: <5A4BFC47-6A00-4FA4-8B6C-36DB7FE7E204 at vtilt.com> > Content-Type: text/plain; charset=US-ASCII; format=flowed; delsp=yes > > I found that under Linux a single-threaded process, such as one using > live555, only runs on one core of a multi-core CPU. My suggestion is to > run multiple processes if possible. This will let Linux distribute the > processes to the different cores. > > - Brad > > On Mar 11, 2009, at 6:31 AM, Marco Amadori wrote: > >> On Wednesday 11 March 2009, 10:28:25, liu yang wrote: >> >>> I plan to develop an application which may support 500+ or even 1000+ >>> RTP sessions simultaneously. So could anybody tell me whether >>> livemedia >>> could support such a load? >>> BTW, I did some tests based on the testWAV sample program. The result is >>> not satisfactory, frankly speaking. >> >> On a bigger machine (dual Xeon) with 6 RAID5 15K SAS disks streaming >> 4Mbit >> MPEG2 TS, I found that I could not stream more than 95 streams without >> artifacts on screen. >> >> My bold analysis (on an early 2008 release of livemedia) was that the >> problem >> wasn't IO-bound but CPU-bound (95%+). Also the network (400Mbps) >> wasn't >> problematic, since we had tried both a single gigabit interface and a bonding >> of 4 >> interfaces, seeing very little load on both the server and router side. >> >> But I just did a quick analysis, so I could be entirely wrong or >> misled.
>> >>> The FAQ told me livemedia is a single-threaded framework, in which all >>> logic >>> is processed sequentially in a single thread. >> >> This could be a problem (a known one) in our case, since the multiple >> Xeon cores >> and CPUs were not used. Launching another instance of >> live555MediaServer helped >> in adding another 95 streams to our tests... so this could be a hint >> for >> looking for optimization interventions. Do some profiling, and if some >> computation effort is really needed, parallelize the code as much as >> possible in >> order to use multiple cores/CPUs. >> >>> So do you have any insightful thoughts on where we can optimize to >>> enhance livemedia as a high-performance RTP streaming stack which >>> could undergo heavy load? >> >> This is of real interest to me too. >> >> -- >> ESC:wq >> >> -- >> This message has been scanned for viruses and >> dangerous content by MailScanner, and is >> believed to be clean. >> >> _______________________________________________ >> live-devel mailing list >> live-devel at lists.live555.com >> http://lists.live555.com/mailman/listinfo/live-devel > > > > ------------------------------ > > Message: 8 > Date: Wed, 11 Mar 2009 17:10:03 -0700 > From: Patrick White > Subject: Re: [Live-devel] frame rate supported > To: LIVE555 Streaming Media - development & use > Message-ID: <200903111710.03977.patbob at imoveinc.com> > Content-Type: text/plain; charset="iso-8859-1" > > > Could you be running into sleep quantization issues? 60 fps is down in > sleep quantization issueland. The buffering on the receiver should take > care of things, but maybe it isn't, or the timestamps are not getting > generated correctly? > > patbob > > > On Tuesday 10 March 2009 4:57 pm, Gbzbz Gbzbz wrote: >> I thought the fDuration is related to the frame rate? We have a hardware >> encoder and we save the contents to a file before we stream them out.
When >> playing the file locally with VLC, it looks like 720p60, but at the remote VLC >> RTSP client side it is 720p30 (or less) - visually it is not as smooth. >> >> I am not familiar with live555 (or C++ in general), so I am not sure if >> scheduleDelayedTask or fDuration has anything to do with the above. May not?!?! >> >> --- On Tue, 3/10/09, Ross Finlayson wrote: >> > From: Ross Finlayson >> > Subject: Re: [Live-devel] frame rate supported >> > To: "LIVE555 Streaming Media - development & use" >> > Date: Tuesday, March 10, 2009, 9:19 PM >> > >> > > Does current live555 support 60 >> > >> > fps? say 720p60? any parameters in this area? >> > >> > Frame rate and dimension parameters are carried within the >> > video data itself (and are therefore specific to the video >> > codec, and have nothing to do with RTSP or RTP). >> > -- >> > Ross Finlayson >> > Live Networks, Inc. >> > http://www.live555.com/ >> > _______________________________________________ >> > live-devel mailing list >> > live-devel at lists.live555.com >> > http://lists.live555.com/mailman/listinfo/live-devel >> >> _______________________________________________ >> live-devel mailing list >> live-devel at lists.live555.com >> http://lists.live555.com/mailman/listinfo/live-devel > > > ------------------------------ > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > > End of live-devel Digest, Vol 65, Issue 9 > ***************************************** > From finlayson at live555.com Wed Mar 11 23:03:03 2009 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 11 Mar 2009 23:03:03 -0700 Subject: [Live-devel] my performance benchmark of livemedia library, not satisfactory In-Reply-To: <55d43d1d0903111952l36d6b42eubd6ac9dc85f91152@mail.gmail.com> References: <55d43d1d0903111952l36d6b42eubd6ac9dc85f91152@mail.gmail.com> Message-ID: When replying to a digest, PLEASE trim your post!
If you fail to do this again (especially because - as evidenced by your use of a "@gmail.com" email address - you are just a casual hobbyist), you risk being permanently banned from the list. >BTW, I plan to do some changes to make livemedia threadsafe and >multi-threaded. So is there any part of the code you think I need to change? For Christ's sake, PLEASE read the FAQ!!! Can anyone help me understand this? I'm baffled by why so many people are failing to read the FAQ before posting to the mailing list. On our web page - where you first learn about the mailing list - you are clearly told, in boldface, to read the FAQ first. Also, the email message that you receive after you've joined the list reminds you to read the FAQ before posting to the list. >1. static member variable "tokenCounter" needs to be changed In principle that's true. In practice, though, it wouldn't be a problem (because the only way it's ever modified is by incrementing it). >2. IMHO, the current delta timing mechanism to trigger timers is not very >efficient, because every time "synchronize" is invoked, the whole queue >will be traversed. No, that's not true at all. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Thu Mar 12 01:11:27 2009 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 12 Mar 2009 01:11:27 -0700 Subject: [Live-devel] Live555 RTSP Client never sees RTCP BYE message from Live555 Server In-Reply-To: <49B7EE8E.2010805@schuckmannacres.com> References: <49B594D6.5040706@schuckmannacres.com> <49B6AFB4.204@schuckmannacres.com> <49B6E37E.7080604@schuckmannacres.com> <49B7EE8E.2010805@schuckmannacres.com> Message-ID: >I'm pretty sure that this would happen with unmodified code, just >see if your bye handler is called in OpenRTSP after sending a >TEARDOWN message while the source is still playing. It is, *provided that* the original source file does not have a known duration.
When streaming from a live source, or from a file for which the server does not know the duration, the server sends a RTCP "BYE" packet when the input source ends. However, when streaming from a file for which the server knows the file's duration (currently, MP3, .MPG, .TS and .AVI files), the server *does not* send a RTCP "BYE" packet when it reaches the end of the file. The reason for this is to keep the stream alive, and therefore allow the client - if it desires - to continue playing the file, by seeking back within it and replaying from an earlier time. Because - in this case - the client was told the stream's duration beforehand, it doesn't need to see a RTCP "BYE" in order to know when it ends. Note the code for the function "afterPlayingStreamState()" in "liveMedia/OnDemandServerMediaSubsession.cpp". And yes, I've tested this on the installed code. If you suspect a bug in the code, then please test this using the original, released code; not your modifications to the code. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Thu Mar 12 01:43:17 2009 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 12 Mar 2009 01:43:17 -0700 Subject: [Live-devel] my performance benchmark of livemedia library, not satisfactory In-Reply-To: <5A4BFC47-6A00-4FA4-8B6C-36DB7FE7E204@vtilt.com> References: <55d43d1d0903110228m55c51089o3365ebd6a4dd3672@mail.gmail.com> <200903111131.39306.amadorim@vdavda.com> <5A4BFC47-6A00-4FA4-8B6C-36DB7FE7E204@vtilt.com> Message-ID: Regarding packet loss: It's important to understand that because a LIVE555 application runs as a single thread, never writing to, or reading from, sockets concurrently, then if packet loss occurs, then it *must* be happening either (i) on the network, or (ii) in the operating system of the sender or receiver. There's nothing in our code that can be 'losing' packets. 
If you're sure that you have sufficient network bandwidth for your stream(s), then the packet losses must be occurring inside the (sender or receiver) OS, most likely because of insufficient internal packet buffering. Note , which describes how to increase the receiver's OS buffering. There's also a function "increaseSendBufferTo()" that you can use to increase the *sender's* OS buffering. You might find this useful. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From guido.marelli at intraway.com Thu Mar 12 05:28:12 2009 From: guido.marelli at intraway.com (Guido Marelli) Date: Thu, 12 Mar 2009 10:28:12 -0200 Subject: [Live-devel] setupDatagramSocket - SO_REUSEADDR problems In-Reply-To: <49B7F5CD.8060501@intraway.com> References: <49B55A05.8040709@intraway.com> <977F5CDB71D1408B8AF55F8297D86DDA@ricardoc466e9e> <49B7F5CD.8060501@intraway.com> Message-ID: <49B8FFDC.5030305@intraway.com> I've been testing this and the problem isn't SO_REUSEPORT (indeed we aren't setting this option). Instead, the problem is the SO_REUSEADDR socket option ... Guido Marelli wrote: > Hi, > I think that SO_REUSEPORT will continue introducing problems because we > need to force a port when creating a socket for the RTCP channel (RTP > port + 1). > > I believe that the best approach is to forget about SO_REUSEPORT so > we can be sure that every socket requested to the OS will work just fine. > > > Regards! > > > Ross Finlayson wrote: >>> It seems that the code in MediaSubsession::initiate will cause the >>> effect >>> I'm reporting when the OS offers the same odd port number for both the >>> video and the audio stream. >> >> Yes, you're right. This bug got introduced in version 2008.12.20 >> when I changed the port number selection code in response to another >> bug that some people were seeing.
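Ross's "increaseSendBufferTo()" (and its receive-side counterpart "increaseReceiveBufferTo()", declared in GroupsockHelper.hh) are thin wrappers over the OS socket-buffer options. A self-contained sketch of the underlying idea, written directly against the BSD sockets API rather than live555 (POSIX-only; how much buffer the OS actually grants is system-dependent, and Linux typically doubles the requested value):

```cpp
#include <cassert>
#include <sys/socket.h>
#include <sys/types.h>
#include <unistd.h>

// Ask the kernel for a larger send buffer on a socket, then read back what
// was actually granted. The receive-side version is identical with SO_RCVBUF.
static int setSendBuffer(int sock, int requestedSize) {
    if (setsockopt(sock, SOL_SOCKET, SO_SNDBUF,
                   &requestedSize, sizeof requestedSize) < 0) return -1;
    int granted = 0;
    socklen_t len = sizeof granted;
    if (getsockopt(sock, SOL_SOCKET, SO_SNDBUF, &granted, &len) < 0) return -1;
    return granted;  // what the OS actually gave us (may differ from the request)
}
```

In live555 itself you would instead call increaseSendBufferTo() / increaseReceiveBufferTo() with the usage environment and the groupsock's socket number; check GroupsockHelper.hh in your release for the exact signatures.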
(Before, the code was always >> letting the OS choose the port number, and this was sometimes causing >> a loop whereby the same (odd) port number would get chosen over and >> over again.) >> >> From what I can tell, the problem occurs only if we end up making the >> code - rather than the OS - choose a port number. (So, SO_REUSEPORT >> is not the problem, because even if this were not set, we'd end up >> getting an error when we tried to create the socket with the same >> port number the second time.) >> >> It seems that I need to change the code again so that it always lets >> the OS choose the port number, but be smarter about doing so, so we >> don't end up in an infinite loop. Stay tuned... >> > -- Guido Marelli Intraway Corp. Office AR: +54 (11) 4393-2091 Office CO: +57 (1) 750-4929 Office US: +1 (516) 620-3890 Fax: +54 (11) 5258-2631 MSN: guido.marelli at intraway.com Visit our web site at http://www.intraway.com From asnow at pathfindertv.net Thu Mar 12 06:52:56 2009 From: asnow at pathfindertv.net (Austin Snow (pftv)) Date: Thu, 12 Mar 2009 09:52:56 -0400 Subject: [Live-devel] Fwd: Live input MPEG4 Video input References: Message-ID: <4A51B7D4-A811-43F3-8810-2F3BF13E7CFD@pathfindertv.net> Hello Ross, I was hoping to get an answer on the implementation below so we can try and figure out why we cannot get the data into live555. Thank you Austin Begin forwarded message: > From: "Austin Snow (pftv)" > Date: March 10, 2009 2:23:21 PM EDT > To: Post-Live555 > Subject: [Live-devel] Live input MPEG4 Video input > Reply-To: LIVE555 Streaming Media - development & use > > > Hello All, > I'm having an issue getting a live MPEG4 video source into Live555, > any help would be great.
> > This is how I defined my live input; > in DeviceSource.hh; > removed > //private: > //void deliverFrame(); //ags > added > public: > void deliverFrame(void *data, int len, int dur); > > in DeviceSource.cpp; > changed > //void DeviceSource::deliverFrame() { > to > void DeviceSource::deliverFrame(void *data, int len, int dur) > { <--passing my data in here > added > // Deliver the data here: > if(fMaxSize < len){ > fNumTruncatedBytes = len - fMaxSize; > len = fMaxSize; > printf("Frame size truncated\n"); > } > gettimeofday(&fPresentationTime, NULL); > fDurationInMicroseconds = dur; > fFrameSize = len; > memcpy(fTo, data, len); > printf("Frame sent, len=%d\n", len); > printf("dur=%d\n", dur); > printf("fPresentationTime.tv_sec=%d\n", fPresentationTime.tv_sec); > printf("fPresentationTime.tv_usec=%d\n", fPresentationTime.tv_usec); > > The printfs indicate that the data is only accepted approximately > every 1 to 1.5 seconds on average. > > Is this the proper way to add a live input? > > Thanks > Austin > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > > Austin > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... 
Name: PGP.sig Type: application/pgp-signature Size: 195 bytes Desc: This is a digitally signed message part URL: From finlayson at live555.com Thu Mar 12 07:47:39 2009 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 12 Mar 2009 07:47:39 -0700 Subject: [Live-devel] setupDatagramSocket - SO_REUSEADDR problems In-Reply-To: <49B7F5CD.8060501@intraway.com> References: <49B55A05.8040709@intraway.com> <977F5CDB71D1408B8AF55F8297D86DDA@ricardoc466e9e> <49B7F5CD.8060501@intraway.com> Message-ID: >I think that SO_REUSEPORT will continue introducing problems because >we need to force a port when creating a socket for the RTCP channel >(RTP port + 1). That's why the new code won't be choosing port numbers explicitly (or, as you say, 'forcing a port'). Once again, stay tuned... -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From matt at schuckmannacres.com Thu Mar 12 11:22:38 2009 From: matt at schuckmannacres.com (Matt Schuckmann) Date: Thu, 12 Mar 2009 11:22:38 -0700 Subject: [Live-devel] Live555 RTSP Client never sees RTCP BYE message from Live555 Server In-Reply-To: References: <49B594D6.5040706@schuckmannacres.com> <49B6AFB4.204@schuckmannacres.com> <49B6E37E.7080604@schuckmannacres.com> <49B7EE8E.2010805@schuckmannacres.com> Message-ID: <49B952EE.9070608@schuckmannacres.com> Did you actually read what I said? It's in the line you quoted below. When the *TEARDOWN* message is sent by the client to the server and the server is shutting down the session, the server does not successfully send a BYE message, and it appears that it should. This behavior is the same regardless of whether it's a file with a known duration or a live feed. You've coded it to try to send a BYE message, and the RTP spec says that when a participant leaves a session a BYE packet should be sent; I'd say that the server responding to a TEARDOWN message and shutting down the session constitutes leaving a session.
The afterPlayingStreamState() method is not called in the case of a TEARDOWN; it appears that it is only called as an indirect result of calling handleClosure() on the FramedSource. I have now tested this with the 1/26/2009 release without modifications, and the behavior I describe is the case for the release code. Here is what I did: On one system I ran the testOnDemandRTSPServer application serving up a .mpg file. On another system I ran openRTSP with the command openRTSP -m rtsp://paine:8554/mpeg1or2AudioVideoTest I ran a network sniffer (Wireshark) on both the server and the client systems. After letting the client receive for a few seconds, I made the client shut down so that shutdown() in playCommon.cpp is called; this causes a TEARDOWN to be sent to the server. (I did this by adding Ctrl+C signal handling to PlayCommon.cpp for the Windows build.) I inspected the traffic in Wireshark and noted that the client sends a BYE message but the server does not. I also noted that the client did not print out the message that a BYE was received. Therefore we can conclude that the server does not send BYE messages on receipt of a TEARDOWN message. By inspecting the code, as I described before, it appears that you intended the server to send a BYE message on a TEARDOWN. Furthermore, the RTP spec indicates that a participant should send a BYE message when it leaves a session; I believe that this is regardless of the reason for leaving the session. Matt S. Ross Finlayson wrote: >> I'm pretty sure that this would happen with unmodified code, just see >> if your bye handler is called in OpenRTSP after sending a TEARDOWN >> message while the source is still playing. > > It is, *provided that* the original source file does not have a known > duration. When streaming from a live source, or from a file for which > the server does not know the duration, the server sends an RTCP "BYE" > packet when the input source ends.
> > However, when streaming from a file for which the server knows the > file's duration (currently, MP3, .MPG, .TS and .AVI files), the server > *does not* send a RTCP "BYE" packet when it reaches the end of the > file. The reason for this is to keep the stream alive, and therefore > allow the client - if it desires - to continue playing the file, by > seeking back within it and replaying from an earlier time. Because - > in this case - the client was told the stream's duration beforehand, > it doesn't need to see a RTCP "BYE" in order to know when it ends. > > Note the code for the function "afterPlayingStreamState()" in > "liveMedia/OnDemandServerMediaSubsession.cpp". > > And yes, I've tested this on the installed code. > > If you suspect a bug in the code, then please test this using the > original, released code; not your modifications to the code. From matt at schuckmannacres.com Thu Mar 12 11:34:06 2009 From: matt at schuckmannacres.com (Matt Schuckmann) Date: Thu, 12 Mar 2009 11:34:06 -0700 Subject: [Live-devel] Live555 RTSP Client never sees RTCP BYE message from Live555 Server In-Reply-To: References: <49B594D6.5040706@schuckmannacres.com> <49B6AFB4.204@schuckmannacres.com> <49B6E37E.7080604@schuckmannacres.com> <49B7EE8E.2010805@schuckmannacres.com> Message-ID: <49B9559E.1090005@schuckmannacres.com> You have inadvertently pointed out something we missed, and that is how you *intended* the server should force a shutdown of an active session or a live feed. We had been simply destroying the RTSPClientSession, which is what happens when a TEARDOWN message has been received. And this is how I discovered the aforementioned problem. By back-tracing to see how afterPlayingStreamState() could have been called, I see that we should have been calling handleClosure() on each of the FramedSources; that would cause the fNoFramesLeft flag in MultiFramedRTPSink to be set to true and cause the subsession to end, as you intended.
I will modify our code to do this when the server is forcing shutdown of a session. I still believe that a BYE message should be sent from the server when the client initiates the session ending via a TEARDOWN (or other such) message. Matt S. Ross Finlayson wrote: >> I'm pretty sure that this would happen with unmodified code, just see >> if your bye handler is called in OpenRTSP after sending or a teardown >> message while the source is still playing. > > It is, *provided that* the original source file does not have a known > duration. When streaming from a live source, or from a file for which > the server does not know the duration, the server sends a RTCP "BYE" > packet when the input source ends. > > However, when streaming from a file for which the server knows the > file's duration (currently, MP3, .MPG, .TS and .AVI files), the server > *does not* send a RTCP "BYE" packet when it reaches the end of the > file. The reason for this is to keep the stream alive, and therefore > allow the client - if it desires - to continue playing the file, by > seeking back within it and replaying from an earlier time. Because - > in this case - the client was told the stream's duration beforehand, > it doesn't need to see a RTCP "BYE" in order to know when it ends. > > Note the code for the function "afterPlayingStreamState()" in > "liveMedia/OnDemandServerMediaSubsession.cpp". > > And yes, I've tested this on the installed code. > > If you suspect a bug in the code, then please test this using the > original, released code; not your modifications to the code. 
From matt at schuckmannacres.com Thu Mar 12 12:02:22 2009 From: matt at schuckmannacres.com (Matt Schuckmann) Date: Thu, 12 Mar 2009 12:02:22 -0700 Subject: Re: [Live-devel] my performance benchmark of livemedia library, not satisfactory In-Reply-To: References: <55d43d1d0903111952l36d6b42eubd6ac9dc85f91152@mail.gmail.com> Message-ID: <49B95C3E.4030104@schuckmannacres.com> I'm highly suspicious of running live555 in multiple threads even if you do follow the FAQ. The FAQ basically suggests that you can run 2 independent copies of Live555 in separate threads, and those 2 copies can *NOT* interact except via global variables. I don't know for sure, but I don't think that is what Liu is after. Furthermore, I'd be very suspicious of even doing what is suggested in the FAQ. I know of at least one place in the code where things could go very wrong: The RTPInterface.cpp code uses 2 static hash tables when using RTP over TCP. Should you run 2 separate threads of execution of Live555 in the same process, both threads would use the same copies of these static hash tables, and it would be very easy for them both to try to modify them at the same time, or one to read while the other is writing, and cause those hash tables to be in an inconsistent state and probably cause a crash. I don't know what other little bombs like this are out there waiting to be discovered. Matt S. From matt at schuckmannacres.com Thu Mar 12 14:54:22 2009 From: matt at schuckmannacres.com (Matt Schuckmann) Date: Thu, 12 Mar 2009 14:54:22 -0700 Subject: Re: [Live-devel] my performance benchmark of livemedia library, not satisfactory In-Reply-To: <49B95C3E.4030104@schuckmannacres.com> References: <55d43d1d0903111952l36d6b42eubd6ac9dc85f91152@mail.gmail.com> <49B95C3E.4030104@schuckmannacres.com> Message-ID: <49B9848E.6090908@schuckmannacres.com> Matt Schuckmann wrote: > > Furthermore, I'd be very suspicious of even doing what is suggested > in the FAQ.
> I know of at least one place in the code where things could go very > wrong: > The RTPInterface.cpp code uses 2 static hash tables when using RTP > over TCP. Should you run 2 separate threads of execution of Live555 in > the same process, both threads would use the same copies of these > static hash tables, and it would be very easy for them both to try to > modify them at the same time, or one to read while the other is writing, > and cause those hash tables to be in an inconsistent state and > probably cause a crash. > > I don't know what other little bombs like this are out there waiting > to be discovered. Never mind - as someone here just pointed out, the tables come from the UsageEnvironment, so that should be OK. Don't know why I didn't see that the first time I went through this file. Sorry for any confusion. Matt S. > > Matt S. > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > From patbob at imoveinc.com Thu Mar 12 14:54:44 2009 From: patbob at imoveinc.com (Patrick White) Date: Thu, 12 Mar 2009 14:54:44 -0700 Subject: [Live-devel] Failed authentication notification? In-Reply-To: <49B95C3E.4030104@schuckmannacres.com> References: <55d43d1d0903111952l36d6b42eubd6ac9dc85f91152@mail.gmail.com> <49B95C3E.4030104@schuckmannacres.com> Message-ID: <200903121454.44730.patbob@imoveinc.com> Is there any consistent way for a client to know that authentication has failed? It needs to know this so it can prompt the user for their login/password. I've been looking at the RTSPClient.cpp code, but I can't see any consistent way that it reports an authentication failure.
thanks, patbob From joeflin at 126.com Thu Mar 12 11:02:57 2009 From: joeflin at 126.com (joeflin) Date: Fri, 13 Mar 2009 02:02:57 +0800 (CST) Subject: [Live-devel] H.264 + Transport Stream Message-ID: <30673775.3501236880977788.JavaMail.coremail@bj126app106.126.com> Hi, does the current release of live555 support Transport Stream with H264 encapsulation? Do we need to change live555 to do this, instead of writing codes cleanly on top of live555? Thanks -- Joe -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Mar 12 15:21:20 2009 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 12 Mar 2009 15:21:20 -0700 Subject: [Live-devel] H.264 + Transport Stream In-Reply-To: <30673775.3501236880977788.JavaMail.coremail@bj126app106.126.com> References: <30673775.3501236880977788.JavaMail.coremail@bj126app106.126.com> Message-ID: >Hi, does the current release of live555 support Transport Stream with H264 encapsulation? > Yes. (However, indexing and 'trick play' support on such files are not yet supported.) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Thu Mar 12 15:29:33 2009 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 12 Mar 2009 15:29:33 -0700 Subject: [Live-devel] Live555 RTSP Client never sees RTCP BYE message from Live555 Server In-Reply-To: <49B9559E.1090005@schuckmannacres.com> References: <49B594D6.5040706@schuckmannacres.com> <49B6AFB4.204@schuckmannacres.com> <49B6E37E.7080604@schuckmannacres.com> <49B7EE8E.2010805@schuckmannacres.com> <49B9559E.1090005@schuckmannacres.com> Message-ID: >I still believe that a BYE message should be sent from the server >when the client initiates the session ending via a TEARDOWN (or >other such) message. Maybe, but in practice I don't think this matters one iota. If the client sends a "TEARDOWN", it is because it has asked the stream to end. 
And the subsequent RTSP response gives confirmation that the stream has ended. The client doesn't also need to receive a "BYE" (which, as a datagram, could never be 100% reliable anyway) in order to tell it that the stream has ended. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Thu Mar 12 15:39:03 2009 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 12 Mar 2009 15:39:03 -0700 Subject: [Live-devel] Failed authentication notification? In-Reply-To: <200903121454.44730.patbob@imoveinc.com> References: <55d43d1d0903111952l36d6b42eubd6ac9dc85f91152@mail.gmail.com> <49B95C3E.4030104@schuckmannacres.com> <200903121454.44730.patbob@imoveinc.com> Message-ID: >Is there any consistent way for a client to know that authentication has >failed? It needs to know this so it can prompt the user for their >login/password. I've been looking at the RTSPClient.cpp code, but I can't >see any consistent way that it reports an authentication failure. This is something that the current code doesn't handle very well, unfortunately. For now, you can do what VLC's LIVE555 interface code (see "modules_demux_live555.cpp" in the VLC code) does: Look at the result of "describeURL()" call. If it's NULL, then get the error message (using "getResultMsg()"), and parse it to get the RTSP response code. Then, if that response code is 401, then prompt/get your username,password, and retry the "DESCRIBE", using either "describeURL()" with a filled-in authenticator, or using "describeWithPassword()". Ugh - this is gross. It's yet another thing I'm going to have to fix when I rewrite the "RTSPClient" class. -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ From joeflin at 126.com Thu Mar 12 23:39:04 2009 From: joeflin at 126.com (joeflin) Date: Fri, 13 Mar 2009 14:39:04 +0800 (CST) Subject: [Live-devel] H.264 + Transport Stream In-Reply-To: References: <30673775.3501236880977788.JavaMail.coremail@bj126app106.126.com> Message-ID: <15722037.249231236926344407.JavaMail.coremail@bj126app52.126.com> Thanks, Ross. That is great. I studied the code for a little while, but now I am somewhat lost. It seems that we start from MPEG2TSFromESSource.cpp; what do we use for the mpegVersion? What else do we need to do? What functions do we need to write specifically for H.264? Thanks! At 2009-03-13 06:21:20, "Ross Finlayson" wrote: >>Hi, does the current release of live555 support Transport Stream with H264 encapsulation? >> > >Yes. (However, indexing and 'trick play' support on such files are >not yet supported.) >-- > >Ross Finlayson >Live Networks, Inc. >http://www.live555.com/ >_______________________________________________ >live-devel mailing list >live-devel at lists.live555.com >http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Mar 13 00:06:23 2009 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 13 Mar 2009 00:06:23 -0700 Subject: [Live-devel] H.264 + Transport Stream In-Reply-To: <15722037.249231236926344407.JavaMail.coremail@bj126app52.126.com> References: <30673775.3501236880977788.JavaMail.coremail@bj126app106.126.com> <15722037.249231236926344407.JavaMail.coremail@bj126app52.126.com> Message-ID: >It seems that we >start from MPEG2TSFromESSource.cpp; what do we use for the >mpegVersion? What else do we need to do? >What functions do we need to write specifically for H.264? Thanks! What specifically are you trying to do? -- Ross Finlayson Live Networks, Inc.
http://www.live555.com/ From joeflin at 126.com Fri Mar 13 00:20:17 2009 From: joeflin at 126.com (joeflin) Date: Fri, 13 Mar 2009 15:20:17 +0800 (CST) Subject: [Live-devel] H.264 + Transport Stream Message-ID: <17830621.278391236928817547.JavaMail.coremail@bj126app52.126.com> I have an H.264 capture card (PCI) whose output I want to stream out in a TS, so I can play it back with VLC on another PC. Thanks At 2009-03-13 15:06:23, "Ross Finlayson" wrote: >>It seems that we >>start from MPEG2TSFromESSource.cpp; what do we use for the >>mpegVersion? What else do we need to do? >>What functions do we need to write specifically for H.264? Thanks! > >What specifically are you trying to do? >-- > >Ross Finlayson >Live Networks, Inc. >http://www.live555.com/ >_______________________________________________ >live-devel mailing list >live-devel at lists.live555.com >http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From renatomauro at libero.it Fri Mar 13 03:23:01 2009 From: renatomauro at libero.it (Renato MAURO (Libero)) Date: Fri, 13 Mar 2009 11:23:01 +0100 Subject: [Live-devel] my performance benchmark of livemedia library, not satisfactory References: <55d43d1d0903111952l36d6b42eubd6ac9dc85f91152@mail.gmail.com> <49B95C3E.4030104@schuckmannacres.com> <49B9848E.6090908@schuckmannacres.com> Message-ID: <454FFEA33FC84510A85DA0AC923E9494@CSystemDev> A global array (and its index) is used by our_random(). Take care. Renato MAURO ----- Original Message ----- From: "Matt Schuckmann" To: "LIVE555 Streaming Media - development & use" Sent: Thursday, March 12, 2009 10:54 PM Subject: Re: [Live-devel] my performance benchmark of livemedia library, not satisfactory > > Matt Schuckmann wrote: >> >> Furthermore I'd be very suspicious of even doing what is suggested in >> the FAQ.
>> I know of at least one place in the code where things could go very >> wrong: >> The RTPInterface.cpp code uses 2 static hash tables when using RTP over >> TCP. Should you run 2 separate threads of execution of Live555 in the >> same process both threads would use the same copies of these static hash >> tables and it would be very easy for them both to try to modify them at >> the same time or one read while the other is writing and cause those hash >> tables to be in an inconsistent state and probably cause a crash. >> >> I don't know what other little bombs like this are out there waiting to >> be discovered. > > Never mind as someone here just pointed out the tables come from the > usageenvironment so that should be ok. > Don't know, why I didn't see that the first time I went through this file. > > Sorry for any confusion. > > Matt S. > > >> >> Matt S. >> >> _______________________________________________ >> live-devel mailing list >> live-devel at lists.live555.com >> http://lists.live555.com/mailman/listinfo/live-devel >> > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > From waqasdaar at gmail.com Fri Mar 13 07:40:49 2009 From: waqasdaar at gmail.com (Waqas Daar) Date: Fri, 13 Mar 2009 15:40:49 +0100 Subject: [Live-devel] Live555 + streaming directory Message-ID: <11f113780903130740s126a8f23i56af5167352cd5dd@mail.gmail.com> Hi, I have downloaded the live555 streaming server and compiled it, it works perfectly. I want to know how can I change its behavior so that it streams media files from specific folder where all media files are. currently It streams files which are in the current directory where live555mediastream file is. Can you please tell me in detail because I am new in this domain. Thanks in Advanced -- With Regards, M. Waqas Ahmad Daar MS Internetworking KTH, Stockholm Sweden. 
Cell: +46704058221 e-mail: waqasdaar at gmail.com daar at kth.se -------------- next part -------------- An HTML attachment was scrubbed... URL: From SRawling at pelco.com Fri Mar 13 08:59:00 2009 From: SRawling at pelco.com (Rawling, Stuart) Date: Fri, 13 Mar 2009 08:59:00 -0700 Subject: [Live-devel] Live555 + streaming directory In-Reply-To: <11f113780903130740s126a8f23i56af5167352cd5dd@mail.gmail.com> Message-ID: Check out the mediaServer folder in the source tree. This implements an on demand server which may be all you are needing. On 3/13/09 7:40 AM, "Waqas Daar" wrote: > Hi,? > > I have downloaded the live555 streaming server and compiled it, it works > perfectly. I want to know how can I change its?behavior?so that it streams > media files from specific folder where all media files are. currently It > streams files which are in the?current?directory where live555mediastream file > is. Can you please tell me in detail?because?I am new in this domain.? > > Thanks in Advanced - ------------------------------------------------------------------------------ Confidentiality Notice: The information contained in this transmission is legally privileged and confidential, intended only for the use of the individual(s) or entities named above. This email and any files transmitted with it are the property of Pelco. If the reader of this message is not the intended recipient, or an employee or agent responsible for delivering this message to the intended recipient, you are hereby notified that any review, disclosure, copying, distribution, retention, or any action taken or omitted to be taken in reliance on it is prohibited and may be unlawful. If you receive this communication in error, please notify us immediately by telephone call to +1-559-292-1981 or forward the e-mail to administrator at pelco.com and then permanently delete the e-mail and destroy all soft and hard copies of the message and any attachments. Thank you for your cooperation. 
- ------------------------------------------------------------------------------ -------------- next part -------------- An HTML attachment was scrubbed... URL: From waqasdaar at gmail.com Fri Mar 13 09:11:44 2009 From: waqasdaar at gmail.com (Waqas Daar) Date: Fri, 13 Mar 2009 17:11:44 +0100 Subject: [Live-devel] Live555 + streaming directory In-Reply-To: References: <11f113780903130740s126a8f23i56af5167352cd5dd@mail.gmail.com> Message-ID: <11f113780903130911l48cbe971k1eb20dcad052a6f5@mail.gmail.com> Yes, I have added these lines to the "DynamicRTSPServer.cpp" file in the mediaServer folder:

char DAVSLive555Path[] = "/home/waqasdaar/Videos/live555/%s";
char* DAVSStream = NULL;
int length = strlen(streamName) + strlen(DAVSLive555Path);
DAVSStream = (char*)malloc(sizeof(char)*length);
snprintf(DAVSStream, length, DAVSLive555Path, streamName);
FILE* fid = fopen(DAVSStream, "rb");
Boolean fileExists = fid != NULL;

It's working now. Can you please tell me how I can test live streaming? Until now I have only tested on-demand streaming. Thanks in advance. 2009/3/13 Rawling, Stuart > Check out the mediaServer folder in the source tree. This implements an > on-demand server, which may be all you are needing. > > > On 3/13/09 7:40 AM, "Waqas Daar" wrote: > > Hi, > > I have downloaded the live555 streaming server and compiled it, and it works > perfectly. I want to know how I can change its behavior so that it streams > media files from a specific folder where all the media files are. Currently it > streams files which are in the current directory where the live555mediastream > file is. Can you please tell me in detail, because I am new in this domain. > > Thanks in advance > > > - > ------------------------------------------------------------------------------ > Confidentiality Notice: The information contained in this transmission is > legally privileged and confidential, intended only for the use of the > individual(s) or entities named above.
This email and any files transmitted > with it are the property of Pelco. If the reader of this message is not the > intended recipient, or an employee or agent responsible for delivering this > message to the intended recipient, you are hereby notified that any review, > disclosure, copying, distribution, retention, or any action taken or omitted > to be taken in reliance on it is prohibited and may be unlawful. If you > receive this communication in error, please notify us immediately by > telephone call to +1-559-292-1981 or forward the e-mail to > administrator at pelco.com and then permanently delete the e-mail and destroy > all soft and hard copies of the message and any attachments. Thank you for > your cooperation. > - > ------------------------------------------------------------------------------ > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -- With Regards, M. Waqas Ahmad Daar MS Internetworking KTH, Stockholm Sweden. Cell: +46704058221 e-mail: waqasdaar at gmail.com daar at kth.se -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Mar 13 09:53:35 2009 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 13 Mar 2009 09:53:35 -0700 Subject: [Live-devel] Live555 + streaming directory In-Reply-To: <11f113780903130911l48cbe971k1eb20dcad052a6f5@mail.gmail.com> References: <11f113780903130740s126a8f23i56af5167352cd5dd@mail.gmail.com> <11f113780903130911l48cbe971k1eb20dcad052a6f5@mail.gmail.com> Message-ID: >Can you please tell me if I want to test for live streaming how can >I test it. Because until now I have test on demand streaming. The "LIVE555 Media Server" product (currently) isn't set up to (easily) support live input sources. However, you can do this with the "testOnDemandRTSPServer" demo application. See the FAQ. -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ From bartha_adam at yahoo.com Fri Mar 13 10:02:19 2009 From: bartha_adam at yahoo.com (Bartha Adam) Date: Fri, 13 Mar 2009 10:02:19 -0700 (PDT) Subject: [Live-devel] Live555 + streaming directory Message-ID: <934957.25435.qm@web110303.mail.gq1.yahoo.com> Try this: http://www.live555.com/liveMedia/faq.html#liveInput-unicast I am also trying to stream some MPEG-2 from a live source. Has anyone implemented a subclass of OnDemandServerMediaSubsession for an MPEG-2 live source? Regards, Gr3go --- On Fri, 3/13/09, Waqas Daar wrote: From: Waqas Daar Subject: Re: [Live-devel] Live555 + streaming directory To: "LIVE555 Streaming Media - development & use" Date: Friday, March 13, 2009, 9:11 AM Yes, I have added these lines to the "DynamicRTSPServer.cpp" file in the mediaServer folder: char DAVSLive555Path[]="/home/waqasdaar/Videos/live555/%s"; char *DAVSStream=NULL; int length=strlen(streamName)+strlen(DAVSLive555Path); DAVSStream=(char*)malloc(sizeof(char)*length); snprintf(DAVSStream,length,DAVSLive555Path,streamName); FILE* fid = fopen(DAVSStream, "rb"); Boolean fileExists = fid != NULL; It's working now. Can you please tell me how I can test live streaming? Until now I have only tested on-demand streaming. Thanks in advance. 2009/3/13 Rawling, Stuart Check out the mediaServer folder in the source tree. This implements an on-demand server, which may be all you are needing. On 3/13/09 7:40 AM, "Waqas Daar" wrote: Hi, I have downloaded the live555 streaming server and compiled it, and it works perfectly. I want to know how I can change its behavior so that it streams media files from a specific folder where all the media files are. Currently it streams files which are in the current directory where the live555mediastream file is. Can you please tell me in detail, because I am new in this domain.
Thanks in Advanced - ------------------------------------------------------------------------------ Confidentiality Notice: The information contained in this transmission is legally privileged and confidential, intended only for the use of the individual(s) or entities named above. This email and any files transmitted with it are the property of Pelco. If the reader of this message is not the intended recipient, or an employee or agent responsible for delivering this message to the intended recipient, you are hereby notified that any review, disclosure, copying, distribution, retention, or any action taken or omitted to be taken in reliance on it is prohibited and may be unlawful. If you receive this communication in error, please notify us immediately by telephone call to +1-559-292-1981 or forward the e-mail to administrator at pelco.com and then permanently delete the e-mail and destroy all soft and hard copies of the message and any attachments. Thank you for your cooperation. - ------------------------------------------------------------------------------ _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel -- With Regards, M. Waqas Ahmad Daar MS Internetworking KTH, Stockholm Sweden. Cell: +46704058221 e-mail: waqasdaar at gmail.com ? ? ? ? ? daar at kth.se -----Inline Attachment Follows----- _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Fri Mar 13 22:51:01 2009 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 13 Mar 2009 22:51:01 -0700 Subject: [Live-devel] H.264 + Transport Stream In-Reply-To: <17830621.278391236928817547.JavaMail.coremail@bj126app52.126.com> References: <17830621.278391236928817547.JavaMail.coremail@bj126app52.126.com> Message-ID: >I have an H.264 capture card (PCI) whose output I want to stream out in a TS, >so I can play it back with VLC on another PC. Quite honestly, I'm not sure whether or not our "MPEG2TSFromESSource" class - as it currently exists - will be able to embed H.264 video data in a Transport Stream in a way that a media player will understand. (It was designed assuming that the input data is MPEG-1 or MPEG-2 audio or video data.) I suppose you could try it (with a "mpegVersion" value of 4) to check. (You may also need to update the parent class ("MPEG2TransportStreamMultiplexor") slightly to handle this.) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Fri Mar 13 23:15:27 2009 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 13 Mar 2009 23:15:27 -0700 Subject: [Live-devel] my performance benchmark of livemedia library, not satisfactory In-Reply-To: <454FFEA33FC84510A85DA0AC923E9494@CSystemDev> References: <55d43d1d0903111952l36d6b42eubd6ac9dc85f91152@mail.gmail.com> <49B95C3E.4030104@schuckmannacres.com> <49B9848E.6090908@schuckmannacres.com> <454FFEA33FC84510A85DA0AC923E9494@CSystemDev> Message-ID: >A global array (and its index) is used by our_random(). Yes. However, by default (if "USE_OUR_RANDOM" is not defined), our code in "our_random()" is not used. Instead, the standard "random()" and "srandom()" functions from the C runtime are used. -- Ross Finlayson Live Networks, Inc.
http://www.live555.com/ From lhhajs at gmail.com Sat Mar 14 09:30:08 2009 From: lhhajs at gmail.com (Liu Hua) Date: Sat, 14 Mar 2009 12:30:08 -0400 Subject: [Live-devel] Real Capture Message-ID: <97ed97ef0903140930o9b39869n92f7dbcfd85e45c8@mail.gmail.com> Hi, I want to use liveMedia to send real-time encoded data over RTSP, which means: capture -> encode -> send. I know that I could create a subclass of DeviceSource to capture data, but where should I add the encoding code? Can someone help me? Thanks very much. From vanevery at walking-productions.com Sat Mar 14 12:41:35 2009 From: vanevery at walking-productions.com (Shawn Van Every) Date: Sat, 14 Mar 2009 15:41:35 -0400 Subject: [Live-devel] QuickTime file generation errors Message-ID: Hi Folks, I am attempting to narrow down an issue that I am having. I am using openRTSP to record from Axis P3301 IP cameras (H.264/AAC via RTSP). In general things are working out great. Unfortunately, about 1% of the files (5 to 6 minutes long) have issues opening (in QuickTime or any other software: VLC, MPlayer, and so on). Generally the error I get has something to do with a missing or corrupt atom. Here is the command I am using: openRTSP -q -f 30 -w 640 -h 480 -t -Q -b 200000 -y rtsp://10.0.1.1/axis-media/media.amp?videocodec=h264&resolution=640x480&audio=1&duration=0&fps=30&videobitrate=10000&videomaxbitrate=10000&videobitratepriority=framerate&videokeyframeinterval=2&compression=10&color=1&clock=0&date=0&text=0 > recording.mov 2>output.log & echo $! The QoS reports seem fine. Sometimes the packet loss is high, but that doesn't seem to directly correspond to corrupt files. (One thing that is strange to me is that the packet loss is sometimes high even though I am using -t for TCP, which I would expect to be more reliable, although slower, than UDP. If I use the -l flag things get really strange: I seem to double or even triple the bitrate and get very strange-looking video with flashes and so forth.)
I am stopping the recording by issuing a kill -1 PID. I am glad to dig through the source and try to identify and correct the issue, I am just not sure where to begin. Any thoughts would be appreciated. Thanks! -shawn ps. Should I provide a sample file? From finlayson at live555.com Sun Mar 15 14:54:18 2009 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 15 Mar 2009 14:54:18 -0700 Subject: [Live-devel] QuickTime file generation errors In-Reply-To: References: Message-ID: >I am glad to dig through the source and try to identify and correct >the issue, I am just not sure where to begin. "liveMedia/QuickTimeFileSink.cpp" Any bug fixes to this code would certainly be appreciated. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Sun Mar 15 15:05:28 2009 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 15 Mar 2009 15:05:28 -0700 Subject: [Live-devel] Real Capture In-Reply-To: <97ed97ef0903140930o9b39869n92f7dbcfd85e45c8@mail.gmail.com> References: <97ed97ef0903140930o9b39869n92f7dbcfd85e45c8@mail.gmail.com> Message-ID: >I want to use liveMedia to send the real time encoded data over rtsp, >which means: capture -> encode ->send. > >I know that I could create a subclass of DeviceSource to capture data No, this would not be a *subclass* of "DeviceSource", but a subclass of "FramedSource" that would (perhaps) be modeled on the "DeviceSource.cpp" code. (See ) > but where to add the encode code?? You would write your own subclass of "FramedFilter" that implements the "doGetNextFrame()" virtual function by (i) calling "getNextFrame()" on its upstream source, and (ii) encode the received data before delivering it to the downstream object. 
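The pull-then-encode arrangement Ross describes above can be sketched in code. Note that the classes below are simplified, hypothetical stand-ins (the real live555 "FramedSource"/"FramedFilter" classes deliver asynchronously via "afterGetting()" callbacks rather than returning frames directly); this only illustrates the control flow of a filter that (i) pulls from its upstream source and (ii) encodes before delivering downstream:

```cpp
#include <cassert>
#include <vector>

// Simplified stand-in for live555's FramedSource: a downstream object
// asks the source for its next frame. (Hypothetical API, for illustration.)
struct Frame { std::vector<unsigned char> data; };

class FramedSourceSketch {
public:
    virtual ~FramedSourceSketch() {}
    virtual bool getNextFrame(Frame& out) = 0; // false when no more data
};

// Stand-in for a capture device source (cf. DeviceSource.cpp):
// produces a few fake 'raw' frames.
class CaptureSourceSketch : public FramedSourceSketch {
    int fFrameCount = 0;
public:
    bool getNextFrame(Frame& out) override {
        if (fFrameCount >= 3) return false;               // device exhausted
        out.data.assign(8, (unsigned char)fFrameCount++); // fake raw frame
        return true;
    }
};

// Stand-in for the FramedFilter subclass: (i) pulls raw data from its
// upstream source, then (ii) 'encodes' it before delivering it downstream.
class EncoderFilterSketch : public FramedSourceSketch {
    FramedSourceSketch& fInputSource; // the upstream (capture) source
public:
    explicit EncoderFilterSketch(FramedSourceSketch& src) : fInputSource(src) {}
    bool getNextFrame(Frame& out) override {
        Frame raw;
        if (!fInputSource.getNextFrame(raw)) return false; // (i) pull upstream
        out.data = raw.data;
        for (auto& b : out.data) b ^= 0xFF; // (ii) trivial placeholder 'encode'
        return true;
    }
};
```

In real live555 code, the "ServerMediaSubsession" would construct the capture source first and hand it to the filter's constructor, exactly as in the paragraph that follows; the XOR stands in for a call into a real encoder such as an ffmpeg codec.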
Your "ServerMediaSubsession" subclass would then implement the "createNewStreamSource()" virtual function by (i) creating an object of your 'data source' class, (ii) creating an object of your 'encoding filter' class (which takes the 'data source' object as an input parameter), and then returning a pointer to your 'encoding filter' object. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From skramer at inbeeld.eu Mon Mar 16 01:18:41 2009 From: skramer at inbeeld.eu (Steven Kramer) Date: Mon, 16 Mar 2009 09:18:41 +0100 Subject: [Live-devel] QuickTime file generation errors In-Reply-To: <14CB5C44CF90E54394ABE97A1D61CC8F05C9587CCE@TPEX01.internal.triple-p.nl> References: <14CB5C44CF90E54394ABE97A1D61CC8F05C9587CCE@TPEX01.internal.triple-p.nl> Message-ID: > > I am stopping the recording by issuing a kill -1 PID. > Shouldn't that have been a kill -HUP ? From finlayson at live555.com Mon Mar 16 01:24:09 2009 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 16 Mar 2009 01:24:09 -0700 Subject: [Live-devel] QuickTime file generation errors In-Reply-To: References: <14CB5C44CF90E54394ABE97A1D61CC8F05C9587CCE@TPEX01.internal.triple-p.nl> Message-ID: >>I am stopping the recording by issuing a kill -1 PID. >> > >Shouldn't that have been a kill -HUP ? They mean the same thing. The signal number for SIGHUP is 1. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From lizur at wp.pl Tue Mar 17 03:15:46 2009 From: lizur at wp.pl (Piotr Lizurek) Date: Tue, 17 Mar 2009 11:15:46 +0100 Subject: [Live-devel] MPEG4 and UDP Message-ID: <49bf785288ecb8.79936777@wp.pl> Hello! I am trying to create a simple app that takes the video stream from my webcam and sends it through UDP to another computer on the Internet. Could you give me any hints on doing it? Even small code snippets would be appreciated. Best Regards, Piotr Lizurek, Poland. ---------------------------------------------------- Trójmiejskie Targi Pracy 2009 (Tricity Job Fair) - We are tackling the crisis! Join us!
See: http://klik.wp.pl/?adr=http%3A%2F%2Fcorto.www.wp.pl%2Fas%2Ftargipracy09.html&sid=664 From finlayson at live555.com Tue Mar 17 08:45:21 2009 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 17 Mar 2009 08:45:21 -0700 Subject: [Live-devel] MPEG4 and UDP In-Reply-To: <49bf785288ecb8.79936777@wp.pl> References: <49bf785288ecb8.79936777@wp.pl> Message-ID: >I am trying to create a simple app that takes the video stream from my webcam and >sends it through UDP to another computer on the Internet. Could you give me >any hints on doing it? Even small code snippets would be appreciated. Look at the "testMPEG4VideoStreamer" demo application (in the "testProgs" directory) - and read the FAQ! -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From galaxy-huang at 163.com Wed Mar 18 07:16:42 2009 From: galaxy-huang at 163.com (galaxy-huang) Date: Wed, 18 Mar 2009 22:16:42 +0800 (CST) Subject: [Live-devel] Can LiveMedia be used over wireless network? Message-ID: <6454949.1083611237385802898.JavaMail.coremail@bj163app80.163.com> Hello, everyone! I have recently been doing some research on streaming over wireless networks, and I wonder whether LiveMedia is appropriate for a wireless network; if not, can it be adapted easily to meet my need? -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Mar 18 07:22:45 2009 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 18 Mar 2009 07:22:45 -0700 Subject: [Live-devel] Can LiveMedia be used over wireless network? In-Reply-To: <6454949.1083611237385802898.JavaMail.coremail@bj163app80.163.com> References: <6454949.1083611237385802898.JavaMail.coremail@bj163app80.163.com> Message-ID: >I have recently been doing some research on streaming over wireless >networks, and I wonder whether LiveMedia is appropriate for a wireless >network Yes, it can be used over any IP network. -- Ross Finlayson Live Networks, Inc.
http://www.live555.com/ From gabriele.deluca at hotmail.com Wed Mar 18 07:47:02 2009 From: gabriele.deluca at hotmail.com (Gabriele De Luca) Date: Wed, 18 Mar 2009 15:47:02 +0100 Subject: [Live-devel] MultiFramedRTPSink Message-ID: Hi Ross, I have the RTP packets on my UNIX socket, and I want to read them and send them to my sinks! So, I have a question: must I work on MultiFramedRTPSink.cpp or on RTPSink.cpp? I am asking for the shortest way to achieve my purpose: my RTP packets are ready on my UNIX socket, so I don't build a header or payload; I must simply send the packets as is. Thanks in advance _________________________________________________________________ Not just mail: discover the new Windows Live services. http://clk.atdmt.com/GBL/go/136430530/direct/01/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From debargha.mukherjee at hp.com Wed Mar 18 14:39:42 2009 From: debargha.mukherjee at hp.com (Mukherjee, Debargha) Date: Wed, 18 Mar 2009 21:39:42 +0000 Subject: [Live-devel] Framed Source Issue Message-ID: <73833378E80044458EC175FF8C1E63D56DFACD5940@GVW0433EXB.americas.hpqcorp.net> Hi, I am developing a multi-threaded application using the live libraries that receives audio and video from several Axis IP cameras concurrently as RTSP clients, decodes them into raw data, processes the data, creates a single outbound audio/video stream, re-encodes it, and streams it out to a remote receiver after setting up an RTSP server. I am using the VLC player to play back the stream on the receiving end. I am having an issue on the streaming-out side. The audio and video encoders read raw data from shared buffers using two derived FramedSource classes modeled after DeviceSource.cpp. The deliverFrame() function in these derived classes reads raw audio and video from the respective shared buffers, encodes them using the ffmpeg libraries, fills up the buffers, and sets the other parameters appropriately before returning.
Occasionally, when a shared buffer is accessed for a read, there isn't enough data available to read, possibly due to jitter in the processing time on the write side of the shared buffers. What is the right action in that case? If my buffer reader waits a few milliseconds until there is enough data available to read (by using Events or otherwise), the receiver-side VLC player freezes. If I return with fFrameSize = 0, the application crashes. The only thing that seems to work is if I re-encode the previous frame (for video) and encode all zeros (for audio), and fill up the buffers and other parameters the normal way. Even in this case, the receiving VLC player freezes every few minutes or so. What am I doing wrong? Thanks, Debargha From finlayson at live555.com Wed Mar 18 14:58:26 2009 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 18 Mar 2009 14:58:26 -0700 Subject: [Live-devel] Framed Source Issue In-Reply-To: <73833378E80044458EC175FF8C1E63D56DFACD5940@GVW0433EXB.americas.hpqcorp.net> References: <73833378E80044458EC175FF8C1E63D56DFACD5940@GVW0433EXB.americas.hpqcorp.net> Message-ID: >I am having an issue on the streaming-out side. The audio and video >encoders read raw data from shared buffers using two derived >FramedSource classes modeled after DeviceSource.cpp. The >deliverFrame() function in these derived classes reads raw audio and >video from the respective shared buffers, encodes them using the ffmpeg >libraries, fills up the buffers, and sets the other parameters >appropriately before returning. Occasionally, when a shared buffer >is accessed for a read, there isn't enough data available to read, >possibly due to jitter in the processing time on the write side of the >shared buffers. What is the right action in that case? If your "doGetNextFrame()" implementation can't deliver data (to its downstream object) immediately, it should instead just return (*without* calling "FramedSource::afterGetting()", because you haven't delivered any data in this case).
In this case, the future availability of sufficient data must be handled via an event in the event loop (delayed 'polling' using "TaskScheduler::scheduleDelayedTask()" is one way to do this, if you can't make the arrival of new data an 'event' in any other way). > If my buffer reader waited a few milli-secs until there is enough >data available to read (by using Events or otherwise), the receiver >side VLC player freezes. If I return with fFramesize = 0 No, don't do this. >, the application crashes. The only thing that seems to work is if I >re-encoded the previous frame (for video) and encoded all-zero (for >audio), and filled up the buffers and other parameters the normal >way. Even in this case, the receiving VLC player freezes every few >min or so. This 'freezing' suggests one of two possibilities: 1/ You might not be setting presentation times correctly on the data, before it gets fed to a "MultiFramedRTPSink" (subclass). Your "doGetNextFrame()" implementation should set "fPresentationTime", before calling "FramedSource::afterGetting()". 2/ Your "doGetNextFrame()" implementation might not be setting "fDurationInMicroseconds" correctly, thereby causing the downstream "MultiFramedRTPSink" to delay excessively (after sending a RTP packet) before requesting new data. Because you are streaming from a live source - rather than from a prerecorded file - you might be able to get away with not setting "fDurationInMicroseconds" at all (which will cause it to keep its default value of zero). I hope this helps. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Wed Mar 18 15:12:13 2009 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 18 Mar 2009 15:12:13 -0700 Subject: [Live-devel] MultiFramedRTPSink In-Reply-To: References: Message-ID: >I have the RTP packets on my UNIX SOCKET. I want read them and send >them to my sinks! >So, I have a question: >I must work on multiframedrtpsink.cpp or on rtpsink.cpp? 
You don't need to 'work on' either of these. To receive RTP packets from your socket, you should create a "Groupsock" object (with the socket's port number as parameter), and then use this to create an appropriate "RTPSource" subclass (depending on the RTP packets' payload format). Then, to output the RTP packets (i.e., without adding new RTP headers), you would use a "BasicUDPSink", *not* a "RTPSink" (subclass). -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From charlie at sensoray.com Thu Mar 19 10:05:23 2009 From: charlie at sensoray.com (Charlie X. Liu) Date: Thu, 19 Mar 2009 10:05:23 -0700 Subject: [Live-devel] Can LiveMedia be used over wireless network? In-Reply-To: References: <6454949.1083611237385802898.JavaMail.coremail@bj163app80.163.com> Message-ID: <00d201c9a8b4$e4ebad40$aec307c0$@com> Yes, we started using LiveMedia for wireless applications a couple of years ago. It has been working great. Charlie X. Liu Dept. of Engineering Sensoray Company, Inc. 7313 SW Tech Center Dr. Tigard, OR 97223 Phone: (503) 684-8073 Fax: (503) 684-8164 Web: http://www.sensoray.com -----Original Message----- From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Wednesday, March 18, 2009 7:23 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Can LiveMedia be used over wireless network? >I recently did some research about streaming over the wireless >network, and I wonder if LiveMedia is appropriate over a wireless >network Yes, it can be used over any IP network. -- Ross Finlayson Live Networks, Inc.
http://www.live555.com/ _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel From debargha.mukherjee at hp.com Thu Mar 19 11:48:48 2009 From: debargha.mukherjee at hp.com (Mukherjee, Debargha) Date: Thu, 19 Mar 2009 18:48:48 +0000 Subject: [Live-devel] Framed Source Issue Message-ID: <73833378E80044458EC175FF8C1E63D56DFACD5F10@GVW0433EXB.americas.hpqcorp.net> Thanks Ross for the pointers. First, is it possible to explain the use of TaskScheduler::scheduleDelayedTask(), as you suggested, a little better? Second, I also suspect that the freezing issue has to do with the timestamps and the duration. I am setting the duration in microseconds as 26122 (for 44.1 KHz, MP3 frames of 1152 samples) for audio, and 33333 (30 fps) for video. The presentation time is obtained from gettimeofday(). However, I find that the audio is called much more often than every 26122 microseconds -- at intervals of only a few thousand microseconds (about 10 times more often than it should be). Can you explain what may be going on? I am using MP3 in MPEG1or2AudioRTPSink. Thanks for your help. -------------------------------------------------------------------- >I am having an issue on the streaming-out side. The audio and video >encoders read raw data from shared buffers using two derived >FramedSource classes modeled after DeviceSource.cpp. The >deliverFrame() function in these derived classes reads raw audio and >video from the respective shared buffers, encodes them using ffmpeg >libraries, fills up the buffers and sets other parameters >appropriately, before returning. Occasionally, when a shared buffer >is accessed for read, there isn't enough data available to read, >possibly due to jitter in processing time on the write side of the >shared buffers. What is the right action in that case?
If your "doGetNextFrame()" implementation can't deliver data (to its downstream object) immediately, it should instead just return (*without* calling "FramedSource::afterGetting()", because you haven't delivered any data in this case). In this case, the future availability of sufficient data must be handled via an event in the event loop (delayed 'polling' using "TaskScheduler::scheduleDelayedTask()" is one way to do this, if you can't make the arrival of new data an 'event' in any other way). > If my buffer reader waited a few milli-secs until there is enough >data available to read (by using Events or otherwise), the receiver >side VLC player freezes. If I return with fFramesize = 0 No, don't do this. >, the application crashes. The only thing that seems to work is if I >re-encoded the previous frame (for video) and encoded all-zero (for >audio), and filled up the buffers and other parameters the normal >way. Even in this case, the receiving VLC player freezes every few >min or so. This 'freezing' suggests one of two possibilities: 1/ You might not be setting presentation times correctly on the data, before it gets fed to a "MultiFramedRTPSink" (subclass). Your "doGetNextFrame()" implementation should set "fPresentationTime", before calling "FramedSource::afterGetting()". 2/ Your "doGetNextFrame()" implementation might not be setting "fDurationInMicroseconds" correctly, thereby causing the downstream "MultiFramedRTPSink" to delay excessively (after sending a RTP packet) before requesting new data. Because you are streaming from a live source - rather than from a prerecorded file - you might be able to get away with not setting "fDurationInMicroseconds" at all (which will cause it to keep its default value of zero). I hope this helps. -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ From gcote at matrox.com Thu Mar 19 12:56:08 2009 From: gcote at matrox.com (=?ISO-8859-1?Q?Georges_C=F4t=E9?=) Date: Thu, 19 Mar 2009 15:56:08 -0400 Subject: [Live-devel] H.264 streaming -- Not receiving all packets In-Reply-To: <73833378E80044458EC175FF8C1E63D56DFACD5F10@GVW0433EXB.americas.hpqcorp.net> References: <73833378E80044458EC175FF8C1E63D56DFACD5F10@GVW0433EXB.americas.hpqcorp.net> Message-ID: <49C2A358.5050903@matrox.com> Hello, First of all, I'd like to thank everyone involved in the project. This is quite helpful for newbies like myself. I have a server streaming H.264 and AAC. The video resolution is 1080i @ 29.97 fps. The bit rate is around 5 Mbits/sec. I'm currently working on the client side. I'm using the office LAN to test. Both server and client are connected at 1 Gb/s. I also tested using a D-Link 1 Gb/s managed switch. I based my code on the H.264 tutorial. I get corruption once in a while. The H/W encoder is configured to generate one IDR and 14 forward frames, no backward frames (I, P and B in MPEG-2 terminology); I'm not sure of the H.264 terminology. What I see is that the reference frames are quite large (> 150 KB) while the other frames are around 15 KB. Most of the time, the client is called with the right size. Once in a while, I will be missing part of an IDR or even the whole reference frame. If I use Wireshark on the client side, I see that I'm receiving the "missing" packets. I haven't dug into the code to investigate yet! On the server side, when the frame is larger than the destination buffer, I copy as much as I can. The remaining data will be copied when doGetNextFrame is called again. Incomplete parts have the right presentation time, but I set the duration to 0. The last part has the same presentation time, but I set the duration according to the right frame rate. Any clues you might think of? Where should I start looking? NTSC @ 2 Mb/s is much more stable. Thank you!
Georges From finlayson at live555.com Thu Mar 19 16:57:38 2009 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 19 Mar 2009 16:57:38 -0700 Subject: [Live-devel] Framed Source Issue In-Reply-To: <73833378E80044458EC175FF8C1E63D56DFACD5F10@GVW0433EXB.americas.hpqcorp.net> References: <73833378E80044458EC175FF8C1E63D56DFACD5F10@GVW0433EXB.americas.hpqcorp.net> Message-ID: >First, is it possible to explain using >TaskScheduler::scheduleDelayedTask(), as you suggested, a little >better? In your implementation of "doGetNextFrame()", if no input data is currently available to be delivered to the downstream object, then you could just call "TaskScheduler::scheduleDelayedTask()" with a delay parameter of 10000 (i.e., 10 ms), and passing a pointer to a function (that you would write) that would call "doGetNextFrame()" once again, to retry. Then, after calling "TaskScheduler::scheduleDelayedTask()", just return. For example, you could look at the code on lines 58-74 of "liveMedia/MPEG4VideoFileServerMediaSubsession.cpp", which does something slightly similar. >Second, I also suspect that the freezing issue has to do with the >timestamps and the duration. >I am setting the duration in microseconds as 26122 (for 44.1 KHz, MP3 >frames of 1152 samples) for audio, and 33333 (30 fps) for video. The >presentation time is obtained from gettimeofday(). However, I find >that the audio is called much more often than every 26122 microseconds. Audio >is called at intervals of only a few thousand microseconds (about 10 >times more often than it should be). Can you explain what may be >going on? I am using MP3 in MPEG1or2AudioRTPSink. Are you *sure* your "doGetNextFrame()" implementation is setting the "fDurationInMicroseconds" (and "fFrameSize" and "fPresentationTime") variables before it calls "FramedSource::afterGetting()"? One way you can check this is by looking at the parameters to the (subsequent) call to "MultiFramedRTPSink::afterGettingFrame()", and checking that they are correct.
-- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From matt at schuckmannacres.com Thu Mar 19 14:21:32 2009 From: matt at schuckmannacres.com (Matt Schuckmann) Date: Thu, 19 Mar 2009 14:21:32 -0700 Subject: [Live-devel] H.264 streaming -- Not receiving all packets In-Reply-To: <49C2A358.5050903@matrox.com> References: <73833378E80044458EC175FF8C1E63D56DFACD5F10@GVW0433EXB.americas.hpqcorp.net> <49C2A358.5050903@matrox.com> Message-ID: <49C2B75C.3080809@schuckmannacres.com> Chances are your socket receive buffers in the OS are too small. Try increasing them with calls to setReceiveBufferTo() or increaseReceiveBufferTo(); I think you can find examples of this in the openRTSP example, and I think there are some references to this in the FAQ. Check out the FAQ, because there may be some registry/config settings (in the case of Windows) you need to change to allow bigger buffers. There are companion methods for the send buffers on the server side, but I don't think you're having a problem with that if Wireshark shows all the data is making it to the client. > > On the server side, when the frame is larger than the destination > buffer, I copy as much as I can. The remaining data will be copied > when doGetNextFrame is called again. I'd be interested to know if this works, because I got the impression that it doesn't. There is a similar buffer on the client side that is used to pass data to the afterGettingFrame() method of your videoSink; if the data is too big for that buffer then the numTruncatedBytes parameter is set to the number of bytes that are lost, and as far as I can tell that data is gone. I don't think I've come across a case where this has occurred, but in theory it could happen; I'm still not sure how you'd increase this buffer size. Another thing you might try is to have your H.264 encoder slice up the frames into multiple slices; I think this will push your NAL packet sizes down, which should reduce the buffer size requirements.
I haven't tried this either, but I've been meaning to, just to see what happens. Matt S. iMove Inc. mschuck at imoveinc.com From finlayson at live555.com Thu Mar 19 18:12:27 2009 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 19 Mar 2009 18:12:27 -0700 Subject: [Live-devel] H.264 streaming -- Not receiving all packets In-Reply-To: <49C2A358.5050903@matrox.com> References: <73833378E80044458EC175FF8C1E63D56DFACD5F10@GVW0433EXB.americas.hpqcorp.net> <49C2A358.5050903@matrox.com> Message-ID: See -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From galaxy-huang at 163.com Thu Mar 19 21:52:02 2009 From: galaxy-huang at 163.com (galaxy-huang) Date: Fri, 20 Mar 2009 12:52:02 +0800 (CST) Subject: [Live-devel] Does LiveMedia take these conditions into account? Message-ID: <8474359.748921237524722171.JavaMail.coremail@bj163app26.163.com> Does LiveMedia take the following conditions into account: jitter caused by the network, skew due to the asynchrony of two clocks, and the adaptation of the playout point to compensate for the aforementioned conditions, etc.? We know these are all very common in the network; a robust implementation should take them into account. Where do you achieve these in LiveMedia, and how? Thanks in advance. -------------- next part -------------- An HTML attachment was scrubbed...
That means each NAL unit is preceded by a 4-byte length, and then the NAL unit follows (without the Annex B start code). Most filters supporting H.264 do this the same way (Haali splitter, ffdshow, ...) e.g. 00 00 00 37 | 45 ... (another 0x36 bytes of the IDR NAL unit) If more NAL units are delivered they look like this: <4byte SIZE>NAL<4byte SIZE>NAL 2) "00 00 00 01" is not in there 3) When you skip the first 4 bytes you get the exact start of the NAL unit 4) PPS and SPS are encoded in the mediatype format structure AFTER the VideoInfo or MPEG_VIDEO_INFO structures. They are encoded in this form: <2byte size>NAL<2byte size>NAL This might look like this: 00 1B 67 42 C0 33 AB 40 5A 09 37 FE 00 20 00 1E 20 00 00 03 00 20 00 00 06 51 E3 06 54 00 04 68 CE 3C 80 >> What I want to do is stream this video using Live555. I've created a framer that inherits from H264VideoStreamFramer. I see that it does "stream" something out over the network, but I can't get VLC to render it correctly. I see that my NAL units come with a 4-byte header specifying the length of the NAL unit. Each of my NAL units appears to roughly correspond to a single frame of video (i.e. one frame = one NAL unit, so the buffers are sort of big). My question is: do I need to package this according to RFC 3984? Also, do I need to break it up (my NAL units are decidedly larger than 1400 bytes...), or can I count on the H264FUAFragmenter to do that for me? I'm definitely confused about how I need to format the output of my encoder. If it helps, I can definitely post some code snippets. One final (unrelated) question: I'm confused about fPresentationTime. What are the units of this value? Is it an absolute time or a relative time, and do I need to set any "base" times or anything like that? I get timestamps from my encoder, but I'm not sure how they relate to fPresentationTime. Any advice is much appreciated; thank you in advance! -------------- next part -------------- An HTML attachment was scrubbed...
URL: From finlayson at live555.com Thu Mar 19 23:05:21 2009 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 19 Mar 2009 23:05:21 -0700 Subject: [Live-devel] Does LiveMedia take these conditions into account? In-Reply-To: <8474359.748921237524722171.JavaMail.coremail@bj163app26.163.com> References: <8474359.748921237524722171.JavaMail.coremail@bj163app26.163.com> Message-ID: >Does LiveMedia take the following conditions into account: jitter >caused by the network, skew due to the asynchrony of two clocks, and >the adaptation of the playout point to compensate for the >aforementioned conditions, etc.? > We know these are all very common in the network; a robust >implementation should take them into account. Where do you achieve >these in LiveMedia and how? Our software, in accordance with the RTP/RTCP standard, delivers - to each client - data frames with an accurate presentation time. These presentation times are times in the *sender's* clock. If the receiver's clock runs at a different rate than the sender's clock (the clock used by the data presentation times), then the receiving application may need to compensate for this. The mechanism that the application uses to do so is (necessarily) application- and codec-specific, and best handled by media decoders (which are not part of our software); therefore it is not something that we do in our libraries. (For example, the receiving application might choose to drop some decoded audio samples, or to duplicate some decoded audio samples, as appropriate.) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Thu Mar 19 23:30:46 2009 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 19 Mar 2009 23:30:46 -0700 Subject: [Live-devel] H.264 NAL confusion In-Reply-To: References: Message-ID: >My question is: do I need to package this according to RFC 3984? No.
Our software (in particular, the "H264VideoRTPSink" class) already implements the RTP payload format described in RFC 3984. All you need to do is deliver - one at a time - NAL units to it (*without* a preceding 'length' field). > Also, do I need to break it up (my NAL units are decidedly larger >than 1400 bytes...) No - we do this all for you. >One final (unrelated) question: I'm confused about >fPresentationTime. What are the units of this value? Is it an >absolute time or a relative time, and do I need to set any "base" >times or anything like that? I get timestamps from my encoder, but >I'm not sure how they relate to fPresentationTime. "fPresentationTime" should be aligned with 'wall clock' time - i.e., the time that you get by calling "gettimeofday()". Because your incoming NAL units have their own timestamps, I suggest that you set "fPresentationTime" - for each NAL unit - as follows:

struct timeval presentationTimeBase;
Boolean isFirstNALUnit = True;

if (isFirstNALUnit) {
  struct timeval timeNow;
  gettimeofday(&timeNow, NULL);
  presentationTimeBase = timeNow - theCurrentNALUnitsTimestamp;
  isFirstNALUnit = False;
}
fPresentationTime = presentationTimeBase + theCurrentNALUnitsTimestamp;

-- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From jaroslav.hurdes at nitta-systems.cz Fri Mar 20 02:11:12 2009 From: jaroslav.hurdes at nitta-systems.cz (Jaroslav Hurdes) Date: Fri, 20 Mar 2009 10:11:12 +0100 Subject: [Live-devel] Tutorial for use Classes of livemedia library Message-ID: <49C35DB0.2060606@nitta-systems.cz> Hello, I'm looking for a library that I can use to program a frame grabber for FireWire and IP cameras. I'm looking at LiveMedia; is there a tutorial that describes the relations and usage of the classes in the LiveMedia library? Where can I find this tutorial? Thank you. Jaroslav Hurdes PS: Excuse me for bad english.
From kvolaa at seznam.cz Fri Mar 20 02:41:27 2009 From: kvolaa at seznam.cz (=?us-ascii?Q?Karel=20Volejnik?=) Date: Fri, 20 Mar 2009 10:41:27 +0100 (CET) Subject: [Live-devel] Tutorial for use Classes of livemedia library In-Reply-To: <49C35DB0.2060606@nitta-systems.cz> Message-ID: <2830.7321-18925-499761786-1237542087@seznam.cz> Hi, look at the test programs: the best tutorial is the source code itself. :-) > ------------ Original message ------------ > From: Jaroslav Hurdes > Subject: [Live-devel] Tutorial for use Classes of livemedia library > Date: 20.3.2009 10:28:06 > ---------------------------------------- > Hello, Im finding library, which i want use for programming frame > grabber from FireWire and IP cameras. Im look on LiveMedia and finding > any tutorial, which describe realtions and usage of Classes on LiveMedia > library? Where i find this tutorial? > > Thank you. Jaroslav Hurdes > > PS: Excuse me for bad english. > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > > From finlayson at live555.com Fri Mar 20 02:23:01 2009 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 20 Mar 2009 02:23:01 -0700 Subject: [Live-devel] Tutorial for use Classes of livemedia library In-Reply-To: <49C35DB0.2060606@nitta-systems.cz> References: <49C35DB0.2060606@nitta-systems.cz> Message-ID: >Hello, Im finding library, which i want use for programming frame >grabber from FireWire and IP cameras. Im look on LiveMedia and >finding any tutorial, which describe realtions and usage of Classes >on LiveMedia library? Where i find this tutorial? http://www.live555.com/liveMedia/ http://www.live555.com/liveMedia/faq.html -- Ross Finlayson Live Networks, Inc.
http://www.live555.com/ From gcote at matrox.com Fri Mar 20 06:48:07 2009 From: gcote at matrox.com (=?ISO-8859-1?Q?Georges_C=F4t=E9?=) Date: Fri, 20 Mar 2009 09:48:07 -0400 Subject: [Live-devel] H.264 streaming -- Not receiving all packets In-Reply-To: <49C2B75C.3080809@schuckmannacres.com> References: <73833378E80044458EC175FF8C1E63D56DFACD5F10@GVW0433EXB.americas.hpqcorp.net> <49C2A358.5050903@matrox.com> <49C2B75C.3080809@schuckmannacres.com> Message-ID: <49C39E97.6020509@matrox.com> An HTML attachment was scrubbed... URL: From bendeguy at gmail.com Fri Mar 20 08:55:40 2009 From: bendeguy at gmail.com (Miklos Szeles) Date: Fri, 20 Mar 2009 16:55:40 +0100 Subject: [Live-devel] Using DigestAuthentication Message-ID: Hi, Can anybody explain to me how I can use DigestAuthentication? How can I get the required realm and nonce parameters? Many thanks, Miki From patbob at imoveinc.com Fri Mar 20 08:56:57 2009 From: patbob at imoveinc.com (Patrick White) Date: Fri, 20 Mar 2009 08:56:57 -0700 Subject: [Live-devel] H.264 streaming -- Not receiving all packets In-Reply-To: <49C39E97.6020509@matrox.com> References: <73833378E80044458EC175FF8C1E63D56DFACD5F10@GVW0433EXB.americas.hpqcorp.net> <49C2B75C.3080809@schuckmannacres.com> <49C39E97.6020509@matrox.com> Message-ID: <200903200856.57661.patbob@imoveinc.com> I just fixed this issue in our code last night -- the I-frame from the H.264 encoder was getting truncated because it was too big to fit into the stock OutPacketBuffer buffer (only ~60000 bytes). The short answer is that the entire outgoing packet must fit into a buffer -- there's no way for the LIVE555 code to output the correct packets if it doesn't. Matt's idea of having the codec produce smaller NALs has a good chance of working.. but I just upped the buffers for now: On the server end, before you instantiate RTSPServer, set the global variable OutPacketBuffer::maxSize to your desired output buffer size. The output buffer(s) will be automatically allocated.
On the client side you have to call getNextFrame() with a big enough buffer -- that call happens from code we've written, so I don't know what you might have to do to make it work, or even if you need to do anything. I upped both buffers because I'm in a hurry -- perhaps only the outgoing buffer is sufficient. Long answer: Over in H264FUAFragmenter::doGetNextFrame(), it uses case 2 to send the initial FU-A packet after it gets a new buffer full, then case 3 to send the balance of that buffer via FU-B packets. Near as I can tell, there's no way to jump back into case 3 with another buffer full.. ergo, the entire frame must fit into a single buffer full to be sent out as the proper sequence of FU-A/FU-B packets. At least, that's the state of the code as of a month or two ago when we last updated... Ross, the code's a little convoluted down there, is my interpretation correct?, did I misunderstand the logic flow? or have you made recent changes down there to fix it? Regardless, upping the buffer sizes fixed the issue for us. Hope that helps. patbob On Thu Mar 19 12:56:08 2009, Georges C?t? wrote: >.. >I based my code on the H.264 tutorial. > >I get corruption once in a while. The H/W encoder is configured to >generate one IDR and 14 forward frames, no backward frames (I, P and B >in mpeg2 terminology). I'm not sure of the H.264 terminology. >What I see is that the reference frames are quite large > 150 KB while >the other frames are around 15 KB. > >Most of the times, the client is called with the right size. Once in a >while, I will be missing part of a IDR or even the whole reference >frame. If I use Wireshark on the client side, I see that I'm receiving >the "missing" packets. I haven't digged in the code to investigate yet! > >On the server side, when the frame is larger than the destination >buffer, I copy as much as I can. The remaining data will be copied when >doGetNextFrame is called again. 
> >Incomplete parts have the right presentation time but I set the duration >to 0. >The last part has the same presentation time but I set the duration >according to the right frame rate. >.. On Friday 20 March 2009 6:48 am, Georges C?t? wrote: > Thank you Matt and Ross. > > My code is already calling increaseReceiveBufferTo(2000000) for the video > and 100000 for the audio. I added a call to increaseSendBufferTo(20000000) > on the server side but it didn't make a difference. > > My preliminary investigation tells me that I'm receiving all the packets > (I added TRACEs in MultiFramedRTPSource::networkReadHandler). I will look > into MultiFramedRTPSource::doGetNextFrame1. > > See below. > > Matt Schuckmann wrote: > Chances are your socket receiver buffers in the OS are too small. > Try increasing them with calls to setReceiveBufferTo() or > increaseReceiveBufferTo(), I think you can find examples of this in the > OpenRTSP example and I think there are some references to this in the FAQ. > Check out the FAQ because there maybe some registry/config settings (in the > case of windows) you need to change to allow bigger buffers. > > There are companion methods for the send buffers on the server side but I > don't think your having a problem with that if WireShark shows all the data > is making it to the client. > > > On the server side, when the frame is larger than the destination buffer, > I copy as much as I can. The remaining data will be copied when > doGetNextFrame is called again. > > I'd be interested to know if this works because I got the impression that > it doesn't. It seems to be working. I modified unsigned > OutPacketBuffer::maxSize = 600000 instead of 60000. But it works with the > smaller value. 
> > > > There is a similar buffer on the client side that is used to pass data to > the afterGettingFrame() method of your videoSink, if the data is too big > for that buffer then the numTrucatedBytes parameter is set to number of > bytes that are lost and as far as I can tell that data is gone. I don't > think I've come across a case where this has occurred but in theory it > could happen, I'm still not sure how you'd increase this buffer size. > > Another thing you might try is to have your H.264 encoder slice up the > frames into multiple slices, I think this will push your NAL packet sizes > down which should reduce the buffer size requirements. I haven't tired this > either but I've been meaning to just to see what happens. > > I currently can't configure it to encode multiple slices which is a pain > since my S/W decoder is multi-threaded. > > Regards, > > Georges > > > Matt S. > iMove Inc. > mschuck at imoveinc.com > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel From debargha.mukherjee at hp.com Fri Mar 20 11:28:34 2009 From: debargha.mukherjee at hp.com (Mukherjee, Debargha) Date: Fri, 20 Mar 2009 18:28:34 +0000 Subject: [Live-devel] How to know if server died Message-ID: <73833378E80044458EC175FF8C1E63D56DFACD6563@GVW0433EXB.americas.hpqcorp.net> Hi, What is the best way to check from a RTSP/RTP client using live libraries that the server has dropped the connection, or if one or more of the RTP streams have died? Thanks, Debargha. ****************************************************** Debargha Mukherjee, Ph.D. 
Senior Research Scientist Hewlett Packard Laboratories Multimedia Communications and Networking Lab 1501 Page Mill Road, MS 1181, Palo Alto, CA 94304 Tel.: 650-236-8058, FAX: 650-852-3791 Email: debargha at hpl.hp.com WWW: http://www.hpl.hp.com/personal/Debargha_Mukherjee ****************************************************** From gcote at matrox.com Fri Mar 20 11:50:42 2009 From: gcote at matrox.com (=?UTF-8?B?R2VvcmdlcyBDw7R0w6k=?=) Date: Fri, 20 Mar 2009 14:50:42 -0400 Subject: [Live-devel] H.264 streaming -- Not receiving all packets In-Reply-To: <200903200856.57661.patbob@imoveinc.com> References: <73833378E80044458EC175FF8C1E63D56DFACD5F10@GVW0433EXB.americas.hpqcorp.net> <49C2B75C.3080809@schuckmannacres.com> <49C39E97.6020509@matrox.com> <200903200856.57661.patbob@imoveinc.com> Message-ID: <49C3E582.1010800@matrox.com> An HTML attachment was scrubbed... URL: From matt at schuckmannacres.com Fri Mar 20 15:37:33 2009 From: matt at schuckmannacres.com (Matt Schuckmann) Date: Fri, 20 Mar 2009 15:37:33 -0700 Subject: [Live-devel] H.264 streaming -- Not receiving all packets In-Reply-To: <49C3E582.1010800@matrox.com> References: <73833378E80044458EC175FF8C1E63D56DFACD5F10@GVW0433EXB.americas.hpqcorp.net> <49C2B75C.3080809@schuckmannacres.com> <49C39E97.6020509@matrox.com> <200903200856.57661.patbob@imoveinc.com> <49C3E582.1010800@matrox.com> Message-ID: <49C41AAD.4020604@schuckmannacres.com> An HTML attachment was scrubbed... 
URL: From dmcneill at vtilt.com Fri Mar 20 16:06:08 2009 From: dmcneill at vtilt.com (Don McNeill) Date: Fri, 20 Mar 2009 19:06:08 -0400 Subject: [Live-devel] Trap in exit() Message-ID: <71F71449-E8F0-4D52-98F8-943C88260E4E@vtilt.com> Hi, I've been working with a playback session with QuickTime and implementing the calls associated with playback, per recommendation:

void PlaybackSubsession::seekStreamSource(FramedSource* inputSource, double seekNPT);
void PlaybackSubsession::setSourceStreamScale(FramedSource* inputSource, float scale);
void PlaybackSubsession::testScaleFactor(float& scale);
float PlaybackSubsession::duration(void);

So far, so good -- QuickTime displays the slider and I am able to move it around and see the video changing positions, and the time display change on the left for the QT client. VERY seldom, I get a trap in FramedSource::getNextFrame( ... ) where the flag fIsCurrentlyAwaitingData is true, thus causing the exit() function to be called(!). I went back and double-checked everything and made sure that the PlaybackSource class had coverage in the doGetNextFrame method:

if ( isCurrentlyAwaitingData() )
  readFrame( this, 0 );
return;

Note that this happens quite rarely... There is also coverage in doGetNextFrame() for the EOF condition, where handleClosure(this) is called on end of file or other error. Do you have any recommendations I could try?
Regards, Don McNeill From finlayson at live555.com Fri Mar 20 22:36:12 2009 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 20 Mar 2009 22:36:12 -0700 Subject: [Live-devel] H.264 streaming -- Not receiving all packets In-Reply-To: <49C41AAD.4020604@schuckmannacres.com> References: <73833378E80044458EC175FF8C1E63D56DFACD5F10@GVW0433EXB.americas.hpqcorp.net> <49C2B75C.3080809@schuckmannacres.com> <49C39E97.6020509@matrox.com> <200903200856.57661.patbob@imoveinc.com> <49C3E582.1010800@matrox.com> <49C41AAD.4020604@schuckmannacres.com> Message-ID: >The way I understand it and read the code, the >currentNALUnitEndsAccessUnit() method returning true marks the end >of a frame, not a NAL unit. (Remember, a NAL unit does not always >equal a frame.) Yes, "currentNALUnitEndsAccessUnit()" should return True iff (if and only if) the most recently-delivered NAL unit marks the end of a video frame (i.e., a complete screen). >question; if you have a NAL unit that can't fit in the buffer >provided in doGetNextFrame() can you hold on to the part of the NAL >unit that didn't fit and pass it in the next time doGetNextFrame() >is called? No. Each NAL unit must fit completely in the output buffer (at the server end), and the input buffer (at the receiving end). (Our software takes care of fragmenting a large NAL unit across several *network* (RTP) packets, and reassembling at the receiver end, so you don't need to concern yourself with this.) However, each NAL unit must fit completely within an output (and input) buffer. Very large NAL units are not a good idea, because they get sent in several consecutive RTP packets, and if any one of these packets gets lost, then the whole NAL unit will (effectively) be lost. Therefore, if possible, you should reduce the size of your NAL units - e.g., using slices. -- Ross Finlayson Live Networks, Inc.
http://www.live555.com/ From finlayson at live555.com Fri Mar 20 22:46:51 2009 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 20 Mar 2009 22:46:51 -0700 Subject: [Live-devel] How to know if server died In-Reply-To: <73833378E80044458EC175FF8C1E63D56DFACD6563@GVW0433EXB.americas.hpqcorp.ne t> References: <73833378E80044458EC175FF8C1E63D56DFACD6563@GVW0433EXB.americas.hpqcorp.ne t> Message-ID: >What is the best way to check from a RTSP/RTP client using live >libraries that the server has dropped the connection, or if one or >more of the RTP streams have died? If a server *voluntarily* closes a connection, it will send a RTCP "BYE" packet, which the client can handle (note, for example, the call to "setByeHandler()" in the "openRTSP" code). If, however, a stream dies *involuntarily* (e.g., because the server crashes), then the only way to detect this is either to 1/ Detect the closing of the RTSP TCP connection (see ), and/or 2/ Notice that no more RTP packets have arrived in the last few seconds. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Sat Mar 21 02:45:35 2009 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 21 Mar 2009 02:45:35 -0700 Subject: [Live-devel] Using DigestAuthentication In-Reply-To: References: Message-ID: >Can anybody explain to me how I can use DigestAuthentication? How can I >get the required realm and nonce parameters? Are you a RTSP client or a RTSP server? If you're a RTSP client, you don't have to know what type of authentication the server is requesting; just pass the 'username' and 'password' parameters to "RTSPClient::describeWithPassword()", and our library will do the right thing. If you're a RTSP server, then look at the "testOnDemandRTSPServer.cpp" code for an example of how to do this. See the code bracketed with #ifdef ACCESS_CONTROL #endif -- Ross Finlayson Live Networks, Inc.
http://www.live555.com/ From finlayson at live555.com Sat Mar 21 02:52:46 2009 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 21 Mar 2009 02:52:46 -0700 Subject: [Live-devel] Trap in exit() In-Reply-To: <71F71449-E8F0-4D52-98F8-943C88260E4E@vtilt.com> References: <71F71449-E8F0-4D52-98F8-943C88260E4E@vtilt.com> Message-ID: >VERY seldom, I get a trap in FramedSource::getNextFrame( ... ) where >the flag fIsCurrentlyAwaitingData is true, thus causing the exit() This "exit()" indicates that there's a serious error - in your code. It means exactly what it says: You are trying to read from the same object more than once simultaneously. The cause of this is usually that your code is calling "FramedSource::afterGetting()" when it shouldn't. Your "doGetNextFrame()" implementation must call "FramedSource::afterGetting()" exactly once (and no more) each time. It must do this only after you have successfully delivered incoming data to the downstream object. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Sun Mar 22 15:33:47 2009 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 22 Mar 2009 15:33:47 -0700 Subject: [Live-devel] setupDatagramSocket - SO_REUSEADDR problems In-Reply-To: <49B8FFDC.5030305@intraway.com> References: <49B55A05.8040709@intraway.com> <977F5CDB71D1408B8AF55F8297D86DDA@ricardoc466e9e> <49B7F5CD.8060501@intraway.com> <49B8FFDC.5030305@intraway.com> Message-ID: I have now installed a new version (2009.03.22) of the "LIVE555 Streaming Media" code that - I think - now fixes this problem (in "MediaSession::initiate()"). (Please check this out, and if you still see this problem, let me know.) -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ From msn at conecom.dk Mon Mar 23 06:53:26 2009 From: msn at conecom.dk (Michael Skaastrup Nielsen) Date: Mon, 23 Mar 2009 14:53:26 +0100 Subject: [Live-devel] Compiling on solaris 64 bit Message-ID: <5DEAE73D1DDDB24F976671CEEBA7CDED05CE59@cecdom01.conecom.internal> Hello, I am trying to run live555 server on a Solaris 64-bit box. It runs fine as a 32-bit application, but I am unable to read or stream any files over 2 GB. VLC streaming works fine but live555 server is required for this application. Can anyone help me, either by directing me to something pre-compiled or instructing me how to compile for 64-bit Solaris? TIA Venlig hilsen Michael Skaastrup Web: www.conecom.dk Systemkonsulent Email: msn at conecom.dk Con E Com A/S Tlf: 36 93 25 57 Hjallesegade 45 Mob: +45 20 31 41 90 5260 Odense S Tlf: +45 70 11 20 19 Fax: +45 70 11 20 18 Vi gør opmærksom på, at denne e-mail kan indeholde fortrolig information. Hvis du ved en fejltagelse modtager e-mailen, beder vi dig venligst informere afsender om fejlen ved at bruge svar-funktionen. Samtidig beder vi dig slette e-mailen i dit system uden at videresende eller kopiere den. Selv om e-mailen og ethvert vedhæftet bilag efter vores overbevisning er fri for virus og andre fejl, som kan påvirke computeren eller it-systemet, hvori den modtages og læses, åbnes den på modtagerens eget ansvar. Vi påtager os ikke noget ansvar for tab og skade, som er opstået i forbindelse med at modtage og bruge e-mailen. _______________ Please note that this message may contain confidential information. If you have received this message by mistake, please inform the sender of the mistake by sending a reply, then delete the message from your system without making, distributing or retaining any copies of it.
Although we believe that the message and any attachments are free from viruses and other errors that might affect the computer or IT system where it is received and read, the recipient opens the message at his or her own risk. We assume no responsibility for any loss or damage arising from the receipt or use of this message. From finlayson at live555.com Mon Mar 23 09:39:53 2009 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 23 Mar 2009 09:39:53 -0700 Subject: [Live-devel] Compiling on solaris 64 bit In-Reply-To: <5DEAE73D1DDDB24F976671CEEBA7CDED05CE59@cecdom01.conecom.internal> References: <5DEAE73D1DDDB24F976671CEEBA7CDED05CE59@cecdom01.conecom.internal> Message-ID: >I am trying to run live555 server on an Solaris 64 bit box. > >It runs fine as a 32 bit application, but I am unable to read or >stream any files over 2 GB. VLC streaming works fine but live555 >server is required for this application. > >Can anyone help me, either by directing me to something pre compiled >or instruct me how to compile for 64 bit solaris? The usual method - i.e. genMakefiles solaris is supposed to work for both 32-bit and 64-bit Solaris. But if that's not the case, and there's some change that could be made to the "config.solaris" file in order to fix this, then please let us know. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From bitter at vtilt.com Mon Mar 23 18:29:36 2009 From: bitter at vtilt.com (Brad Bitterman) Date: Mon, 23 Mar 2009 21:29:36 -0400 Subject: [Live-devel] Access to RTP Extension Header Message-ID: Is there an easy way to access the RTP header when using live555 as a client? A video source I'm using kindly decided to insert important information into the RTP header using the RTP extension header. 
Thanks, Brad From finlayson at live555.com Mon Mar 23 21:50:28 2009 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 23 Mar 2009 21:50:28 -0700 Subject: [Live-devel] Access to RTP Extension Header In-Reply-To: References: Message-ID: >Is there an easy way to access the RTP header when using live555 as a client? Not yet, unfortunately. (The code currently checks for a RTP extension header, but then just ignores (skips over) it if it sees it.) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From msn at conecom.dk Tue Mar 24 00:16:37 2009 From: msn at conecom.dk (Michael Skaastrup Nielsen) Date: Tue, 24 Mar 2009 08:16:37 +0100 Subject: [Live-devel] Compiling on solaris 64 bit Message-ID: <5DEAE73D1DDDB24F976671CEEBA7CDED243A1B@cecdom01.conecom.internal> >I am trying to run live555 server on a Solaris 64-bit box. > >It runs fine as a 32-bit application, but I am unable to read or >stream any files over 2 GB. VLC streaming works fine but live555 >server is required for this application. > >Can anyone help me, either by directing me to something pre-compiled >or instructing me how to compile for 64-bit Solaris? The usual method - i.e. genMakefiles solaris is supposed to work for both 32-bit and 64-bit Solaris. But if that's not the case, and there's some change that could be made to the "config.solaris" file in order to fix this, then please let us know. -- Hi. Sorry, I should have stated that. The above mentioned method works fine and streams fine. But the result is a 32-bit binary. And I am having problems reading files that are over 2 GB, so therefore I speculated that this is because of the 32-bitness of the binary. To explain it further: If I rip a whole DVD as MPEG2 TS I can play it locally on Mplayer and VLC (Linux laptop).
If I scp the file to the Solaris 2008.11 (AMD 64 bit) server I get a 404 from live555 when connecting through mplayer (which gives the best error output) mplayer rtsp://server/dvd.ts If I rip a portion of the same DVD below 2 GB, the server streams it perfectly. Still as MPEG2 TS. So I think I narrowed it down to the ability to read files over 2 GB. I am not a developer so my experience with compilers is limited to tracing missing libraries and things like that. Thanks for the reply. Hope this gives further info. If I am all wrong please correct me. Kind regards Michael Skaastrup Ross Finlayson Live Networks, Inc. http://www.live555.com/ _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel From chengjianweiok at gmail.com Tue Mar 24 00:39:59 2009 From: chengjianweiok at gmail.com (Danny Cheng) Date: Tue, 24 Mar 2009 15:39:59 +0800 Subject: [Live-devel] about the openRTSP Message-ID: <8cd741830903240039q412b8139vfd231192870395b@mail.gmail.com> Hello, This is my first letter to you! I have a question about the openRTSP. At the end of the function startPlayingStreams, there is:

// Watch for incoming packets (if desired):
checkForPacketArrival(NULL);
checkInterPacketGaps(NULL);

My question is whether the video or audio packets have been received here. If the packets are received here, then what is the difference between the role of the function startPlayingStreams and the function *taskScheduler().doEventLoop()* in receiving the packets. Thank you! Danny -- -------------- next part -------------- An HTML attachment was scrubbed... URL: From peter.gejgus at gmail.com Tue Mar 24 01:02:38 2009 From: peter.gejgus at gmail.com (Peter Gejgus) Date: Tue, 24 Mar 2009 09:02:38 +0100 Subject: [Live-devel] timeout doesn't work in RTSPClient Message-ID: Hello, Does the timeout work in RTSPClient? I slightly modified openRTSP.cpp in order to use a timeout value (in seconds).
I replaced at line 32:

return rtspClient->sendOptionsCmd(url, username, password);

with

return rtspClient->sendOptionsCmd(url, username, password, NULL, 5);

and at line 45:

result = rtspClient->describeURL(url);

with

result = rtspClient->describeURL(url, NULL, False, 5);

But after this change, the functions getOptionsResponse and getSDPDescriptionFromURL always returned NULL. It seems the socket wasn't created. Did I make some mistake, or does the timeout parameter in RTSPClient::openConnectionFromURL not work? Best Regards, Builder -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Mar 24 01:06:25 2009 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 24 Mar 2009 01:06:25 -0700 Subject: [Live-devel] Compiling on solaris 64 bit In-Reply-To: <5DEAE73D1DDDB24F976671CEEBA7CDED243A1B@cecdom01.conecom.internal> References: <5DEAE73D1DDDB24F976671CEEBA7CDED243A1B@cecdom01.conecom.internal> Message-ID: >The usual method - i.e. > genMakefiles solaris >is supposed to work for both 32-bit and 64-bit Solaris. > >But if that's not the case, and there's some change that could be >made to the "config.solaris" file in order to fix this, then please >let us know. >-- > >Hi. > >Sorry, should have stated that. The above mentioned method works fine >and streams fine. > >But the result is a 32 bit binary. > >And I am having problems reading files that are over 2 GB so therefore I >speculated that this is because of the 32 bitness of the binary. OK, so you (or someone else who's knowledgeable about compiling/linking for Solaris) is going to have to figure out what changes should be made to the "config.solaris" file in order to properly compile/link applications for 64-bit Solaris. I'm not a Solaris expert (and don't have a Solaris machine available for testing anyway), so someone else is going to have to figure this out for us. -- Ross Finlayson Live Networks, Inc.
http://www.live555.com/ From finlayson at live555.com Tue Mar 24 01:24:32 2009 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 24 Mar 2009 01:24:32 -0700 Subject: [Live-devel] about the openRTSP In-Reply-To: <8cd741830903240039q412b8139vfd231192870395b@mail.gmail.com> References: <8cd741830903240039q412b8139vfd231192870395b@mail.gmail.com> Message-ID: >I have a question about the openRTSP. In the end of the function >startPlayingStreams, there is : > // Watch for incoming packets (if desired): > checkForPacketArrival(NULL); > checkInterPacketGaps(NULL); > > My question is whether the video or audio packets haved been >rececived here. The first time that these functions are called, no RTP (audio or video) packets will have been received yet (because the event loop hasn't yet started; see below). Notice, though, that each of these functions - at the very end of their code - reschedules another call to itself, from the event loop, after a short delay. I.e., each of these functions gets called periodically. > If the packets are received here, then what is >the difference between the role of the function >startPlayingStreams and the function taskScheduler().doEventLoop() >in reveiving the packets. No incoming packets actually get handled until "doEventLoop()" is called (because the arrival of each incoming packet is handled as an 'event' from within the event loop). -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From msn at conecom.dk Tue Mar 24 01:45:13 2009 From: msn at conecom.dk (Michael Skaastrup Nielsen) Date: Tue, 24 Mar 2009 09:45:13 +0100 Subject: [Live-devel] Compiling on solaris 64 bit References: <5DEAE73D1DDDB24F976671CEEBA7CDED243A1B@cecdom01.conecom.internal> Message-ID: <5DEAE73D1DDDB24F976671CEEBA7CDED05CE5D@cecdom01.conecom.internal> >The usual method - i.e. > genMakefiles solaris >is supposed to work for both 32-bit and 64-bit Solaris. 
> >But if that's not the case, and there's some change that could be >made to the "config.solaris" file in order to fix this, then please >let us know. >-- > >Hi. > >Sorry, should have stated that. The above mentioned method works fine >and streams fine. > >But the result is a 32 bit binary. > >And I am having problems reading files that are over 2 GB so therefore I >speculated that this is because of the 32 bitness of the binary. OK, so you (or someone else who's knowledgeable about compiling/linking for Solaris) is going to have to figure out what changes should be made to the "config.solaris" file in order to properly compile/link applications for 64-bit Solaris. I'm not a Solaris expert (and don't have a Solaris machine available for testing anyway), so someone else is going to have to figure this out for us. -- Hi. I will try to dig deeper. Contacting SUN right now. If anything turns up I will inform you via this list. Thank you for taking time.. If anybody else on the list has anything to add, please do not hesitate. Kind regards Michael Skaastrup Ross Finlayson Live Networks, Inc. http://www.live555.com/ _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/ms-tnef Size: 3369 bytes Desc: not available URL: From msn at conecom.dk Tue Mar 24 02:59:31 2009 From: msn at conecom.dk (Michael Skaastrup Nielsen) Date: Tue, 24 Mar 2009 10:59:31 +0100 Subject: [Live-devel] Compiling on solaris 64 bit -maybe fixed References: <5DEAE73D1DDDB24F976671CEEBA7CDED243A1B@cecdom01.conecom.internal> <5DEAE73D1DDDB24F976671CEEBA7CDED05CE5D@cecdom01.conecom.internal> Message-ID: <5DEAE73D1DDDB24F976671CEEBA7CDED05CE5F@cecdom01.conecom.internal> Hi I will just reply to my own mail. Figured it out. I am attaching the config.solaris inline. 
I've just added -m64 in three places to instruct the compiler and linker to make 64-bit files.

bash-3.00# file software/live/mediaServer/live555MediaServer
software/live/mediaServer/live555MediaServer: ELF 64-bit LSB executable AMD64 Version 1, dynamically linked, not stripped, no debugging information available

config.solaris:

COMPILE_OPTS = $(INCLUDES) -m64 -I. -O -DSOLARIS -DSOCKLEN_T=socklen_t
C = c
C_COMPILER = cc
C_FLAGS = $(COMPILE_OPTS)
CPP = cpp
CPLUSPLUS_COMPILER = c++
CPLUSPLUS_FLAGS = $(COMPILE_OPTS) -Wall
OBJ = o
LINK = c++ -m64 -o
LINK_OPTS = -L.
CONSOLE_LINK_OPTS = $(LINK_OPTS)
LIBRARY_LINK = ld -o
LIBRARY_LINK_OPTS = $(LINK_OPTS) -64 -r -dn
LIB_SUFFIX = a
LIBS_FOR_CONSOLE_APPLICATION = -lsocket -lnsl
LIBS_FOR_GUI_APPLICATION = $(LIBS_FOR_CONSOLE_APPLICATION)
EXE =

As I said earlier, I am not a developer, so this is based on works-for-me info. Kind regards Michael Skaastrup >The usual method - i.e. > genMakefiles solaris >is supposed to work for both 32-bit and 64-bit Solaris. > >But if that's not the case, and there's some change that could be >made to the "config.solaris" file in order to fix this, then please >let us know. >-- > >Hi. > >Sorry, should have stated that. The above mentioned method works fine >and streams fine. > >But the result is a 32 bit binary. > >And I am having problems reading files that are over 2 GB so therefore I >speculated that this is because of the 32 bitness of the binary. OK, so you (or someone else who's knowledgeable about compiling/linking for Solaris) is going to have to figure out what changes should be made to the "config.solaris" file in order to properly compile/link applications for 64-bit Solaris. I'm not a Solaris expert (and don't have a Solaris machine available for testing anyway), so someone else is going to have to figure this out for us. -- Hi. I will try to dig deeper. Contacting SUN right now. If anything turns up I will inform you via this list. Thank you for taking the time.
If anybody else on the list has anything to add, please do not hesitate. Kind regards Michael Skaastrup Ross Finlayson Live Networks, Inc. http://www.live555.com/ _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/ms-tnef Size: 4110 bytes Desc: not available URL: From gcote at matrox.com Tue Mar 24 08:04:39 2009 From: gcote at matrox.com (=?ISO-8859-1?Q?Georges_C=F4t=E9?=) Date: Tue, 24 Mar 2009 11:04:39 -0400 Subject: [Live-devel] H.264 streaming -- Not receiving all packets In-Reply-To: References: <73833378E80044458EC175FF8C1E63D56DFACD5F10@GVW0433EXB.americas.hpqcorp.ne t> <49C2B75C.3080809@schuckmannacres.com> <49C39E97.6020509@matrox.com> <200903200856.57661.patbob@imoveinc.com> <49C3E582.1010800@matrox.com> <49C41AAD.4020604@schuckmannacres.com> Message-ID: <49C8F687.9010100@matrox.com> If my H/W encoder is single-slice, am I doomed? It seems to be working well on a reliable network for 1080i @ 5 Mb/s. Anything higher (10 Mb/s) is not as stable, but I haven't had time to investigate why. For the reference frames, the codec generates a buffer containing 3 NAL units (SPS, PPS and the reference frame) but since SPS and PPS NAL units are so small, I don't bother sending them by themselves. The other buffers contain 1 NAL unit also. Thank you! Georges Ross Finlayson wrote: >> The way I understand it and read the code, the >> currentNALUnitEndsAccessUnit() method returning true marks the end of >> a frame, not a NAL unit. (Remember, a NAL unit does not always equal >> a frame.) > > Yes, "currentNALUnitEndsAccessUnit()" should return True iff (if and > only if) the most recently-delivered NAL unit marks the end of a video > frame (i.e., a complete screen).
> > >> Question: if you have a NAL unit that can't fit in the buffer >> provided in doGetNextFrame(), can you hold on to the part of the NAL >> unit that didn't fit and pass it in the next time doGetNextFrame() is >> called? > > No. Each NAL unit must fit completely in the output buffer (at the > server end), and the input buffer (at the receiving end). (Our > software takes care of fragmenting a large NAL unit across several > *network* (RTP) packets, and reassembling them at the receiver end, so you > don't need to concern yourself with this.) However, each NAL unit must > fit completely within an output (and input) buffer. > > Very large NAL units are not a good idea, because they get sent in > several consecutive RTP packets, and if any one of these packets gets > lost, then the whole NAL unit will (effectively) be lost. Therefore, > if possible, you should reduce the size of your NAL units - e.g., > by using slices. > From bendeguy at gmail.com Tue Mar 24 03:39:08 2009 From: bendeguy at gmail.com (Miklos Szeles) Date: Tue, 24 Mar 2009 11:39:08 +0100 Subject: [Live-devel] Using DigestAuthentication In-Reply-To: References: Message-ID: I'm using RTSP as a client. Unfortunately, the authentication fails. I found out that live555 implements the RFC 2069 specification. This specification was replaced by RFC 2617, and unfortunately our camera uses the features of the new standard. Do you plan to implement the RFC 2617 digest authentication in the near future? Best Regards, Miki On Sat, Mar 21, 2009 at 10:45 AM, Ross Finlayson wrote: >> Can anybody explain to me how I can use DigestAuthentication? How can I >> get the required realm and nonce parameters? > > Are you a RTSP client or a RTSP server? If you're a RTSP client, you don't > have to know what type of authentication the server is requesting; just pass > the 'username' and 'password' parameters to > "RTSPClient::describeWithPassword()", and our library will do the right > thing.
> > If you're a RTSP server, then look at the "testOnDemandRTSPServer.cpp" code > for an example of how to do this. See the code bracketed with > #ifdef ACCESS_CONTROL > #endif > > -- > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > From finlayson at live555.com Tue Mar 24 14:44:59 2009 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 24 Mar 2009 14:44:59 -0700 Subject: [Live-devel] Compiling on solaris 64 bit -maybe fixed In-Reply-To: <5DEAE73D1DDDB24F976671CEEBA7CDED05CE5F@cecdom01.conecom.internal> References: <5DEAE73D1DDDB24F976671CEEBA7CDED243A1B@cecdom01.conecom.internal> <5DEAE73D1DDDB24F976671CEEBA7CDED05CE5D@cecdom01.conecom.internal> <5DEAE73D1DDDB24F976671CEEBA7CDED05CE5F@cecdom01.conecom.internal> Message-ID: >Figured it out. I am attaching the config.solaris inline. I've just >added -m64 three places to instruct compiler and linker to make 64 >bit files. Thanks. One question, though: What happens if someone uses this new "config.solaris" to compile/link an application on (and for) a 32-bit Solaris system? Will the resulting binary still run OK? Or should there be two config files for Solaris: "config.solaris-32bit" and "config.solaris-64bit"? Please let me know, so I can make the appropriate changes for the next release of the software. -- Ross Finlayson Live Networks, Inc.
http://www.live555.com/ From finlayson at live555.com Tue Mar 24 15:12:34 2009 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 24 Mar 2009 15:12:34 -0700 Subject: [Live-devel] H.264 streaming -- Not receiving all packets In-Reply-To: <49C8F687.9010100@matrox.com> References: <73833378E80044458EC175FF8C1E63D56DFACD5F10@GVW0433EXB.americas.hpqcorp.ne t> <49C2B75C.3080809@schuckmannacres.com> <49C39E97.6020509@matrox.com> <200903200856.57661.patbob@imoveinc.com> <49C3E582.1010800@matrox.com> <49C41AAD.4020604@schuckmannacres.com> <49C8F687.9010100@matrox.com> Message-ID: >If my H/W encoder is single-slice, am I doomed? No, of course you're not 'doomed', but if your 'reference frames' are very large (and you didn't say how large they typically are), then they are going to be especially vulnerable to packet loss. If you could split them up into 'slice' NAL units in software (I'm not a H.264 expert, so I don't know how easy/hard that would be), and instead deliver - to the rest of the LIVE555 code - one 'slice' NAL unit at a time, then that would be better. > Seems to be working well on a reliable network for 1080i @ 5 Mb/s. >Anything higher (10 Mb/s) is not as stable but I haven't had time >investigating why. It's probably because (assuming your frame rate remains constant) your 'reference frames' are doubling in size when you switch from 5 Mbps to 10 Mbps. Once again, if such a frame takes up a single NAL unit, then the loss of *any* network packet will render it unusable. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From bitter at vtilt.com Tue Mar 24 16:52:45 2009 From: bitter at vtilt.com (Brad Bitterman) Date: Tue, 24 Mar 2009 19:52:45 -0400 Subject: [Live-devel] Access to RTP Extension Header In-Reply-To: References: Message-ID: Can you point me to where the header gets skipped (where the RTP header gets parsed)? I'll look and see if there is a clean way to allow access to the data. Any suggestions on the best way to add access? 
Thanks, Brad On Mar 24, 2009, at 12:50 AM, Ross Finlayson wrote: >> Is there an easy way to access the RTP header when using live555 as >> a client? > > Not yet, unfortunately. (The code currently checks for a RTP > extension header, but then just ignores (skips over) it if it sees it.) > -- > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel From SRawling at pelco.com Tue Mar 24 18:12:15 2009 From: SRawling at pelco.com (Rawling, Stuart) Date: Tue, 24 Mar 2009 18:12:15 -0700 Subject: [Live-devel] RTCPInstance questions Message-ID: Hi all, I am connecting a client to an RTP server. I instantiate the groupsocks, and the RTPSources and the RTCP instance all fine, and the stream comes in great. However, I have a couple of questions about the RTCPInstance class. 1. Should we be sending RTCP messages when the connection is in a paused state? My initial thoughts were no, but then I thought that without the RTCPInstance running and sending messages, how would a server know the client has not gone away? 2. I have debug turned on and I see RTCP messages being received from myself. Ie: saw incoming RTCP packet (from address 10.221.224.6, port 7999). The client is running on 10.221.224.6, and the server is on a separate box.
Regards, Stuart - ------------------------------------------------------------------------------ Confidentiality Notice: The information contained in this transmission is legally privileged and confidential, intended only for the use of the individual(s) or entities named above. This email and any files transmitted with it are the property of Pelco. If the reader of this message is not the intended recipient, or an employee or agent responsible for delivering this message to the intended recipient, you are hereby notified that any review, disclosure, copying, distribution, retention, or any action taken or omitted to be taken in reliance on it is prohibited and may be unlawful. If you receive this communication in error, please notify us immediately by telephone call to +1-559-292-1981 or forward the e-mail to administrator at pelco.com and then permanently delete the e-mail and destroy all soft and hard copies of the message and any attachments. Thank you for your cooperation. - ------------------------------------------------------------------------------ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Mar 24 18:24:03 2009 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 24 Mar 2009 18:24:03 -0700 Subject: [Live-devel] RTCPInstance questions In-Reply-To: References: Message-ID: >1. Should we be sending RTCP messages when the connection is in a >paused state. My initial thoughts were no, but then I thought that >without the RTCPInstance running and sending messages, how would a >server know the client has not gone away. Yes, RTCP continues, as usual, even when a RTSP session is 'paused'. > >2. I have debug turned on and I see RTCP messages being received >from myself. Ie: saw incoming RTCP packet (from address >10.221.224.6, port 7999). The client is running on 10.221.224.6, >and the server is on a separate box. 
It looks to me like the client >is receiving its own messages, but is not chomping them like I would >expect. I think the call to see if the packet was from ourselves >(RTCP.cpp line 336) is failing for some reason. I have no other >live555 processes running on the same box, and so would expect the >packet to be ignored. I presume this is a multicast session (if it were a unicast session, then the RTCP packets should be sent back directly to the server, and the client would never get a chance to receive them itself). I didn't see a question in the paragraph above - but Remember, You Have Complete Source Code. >3. Occasionally, I will suddenly see the RTCPInstance schedule >function get called in a tight loop. This seems very unlikely, but if you discover a specific problem, please let us know. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Tue Mar 24 18:44:37 2009 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 24 Mar 2009 18:44:37 -0700 Subject: [Live-devel] Using DigestAuthentication In-Reply-To: References: Message-ID: >I'm using RTSP as a client. Unfortunately, the authentication fails. I >found out that live555 implements the RFC 2069 specification. This >specification was replaced by RFC 2617, and unfortunately our camera >uses the features of the new standard. Thanks for the information. Could you please send us an example of the diagnostic output printed by the RTSP client (to get this, set "verbosityLevel" to 1 in "RTSPClient::createNew()"), so I can see specifically what your server is sending that's different from what we currently implement? -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From SRawling at pelco.com Tue Mar 24 20:16:07 2009 From: SRawling at pelco.com (Rawling, Stuart) Date: Tue, 24 Mar 2009 20:16:07 -0700 Subject: [Live-devel] RTCPInstance questions In-Reply-To: Message-ID: >> >2. I have debug turned on and I see RTCP messages being received >> >from myself.
>> >I presume this is a multicast session (if it were a unicast session, >> >then the RTCP packets should be sent back directly to the server, and >> >the client would never get a chance to receive them itself). I >> >didn't see a question in the paragraph above - but Remember, You Have >> >Complete Source Code.
> > It turns out I was handling the Groupsocks wrong.
> I have an RtpReceiver class that takes 3 parameters: sourceIP, destinationIP, and > destinationPort.
> In multicast mode, the destinationIP would be the multicast address; in > unicast mode it would be the IP of the interface the application would receive > the stream on.
> > I was setting the groupAddr of the RTCP socket to be the destination address > always (which would mean I would send the packets to myself in unicast, and > to the multicast group in multicast). Obviously this is wrong, as I should > always be sending directly back to the source. I have made this change and > things are looking good.
> > As I was checking into how to set up the groupsocks, I seem to see that in the > MediaSession class the groupAddr passed to the changeDestination call is the > multicast address when streaming multicast, but the source address of the > server when streaming unicast. Is this correct? If so, how does one account > for multi-homed systems? I am presuming it will filter on the SSRC of the > incoming packet?
> > Thanks for the reply btw; figured it out pretty quickly after that.
> > Stuart

From live.davinci at gmail.com Wed Mar 25 02:41:06 2009 From: live.davinci at gmail.com (davinci white) Date: Wed, 25 Mar 2009 17:41:06 +0800 Subject: [Live-devel] transmission delay using Live555 Message-ID: <70eda3d30903250241uedd116ctfd33ca449faf33a@mail.gmail.com>

Hello All, I have successfully transmitted live MPEG4 video using Live555, but have a problem. I connect the IP camera to a monitor, and to a DSP evaluation board too. The video in the VLC player is delayed by almost 2 seconds compared to the video on the monitor. Where can I modify the source code to reduce the transmission time? I use the example testMPEG4VideoStreamer.cpp, and I modified it to use the class MPEG4VideoStreamDiscreteFramer. Any help would be great. Thank You, komac

From bendeguy at gmail.com Wed Mar 25 03:43:53 2009 From: bendeguy at gmail.com (Miklos Szeles) Date: Wed, 25 Mar 2009 11:43:53 +0100 Subject: [Live-devel] Using DigestAuthentication In-Reply-To: References: Message-ID:

Hi Ross, Sorry for my mistake, but I found that the camera uses the "qop" option only in HTTP digest authentication. The RTSP authentication failure was caused by using the wrong username :). Thanks for your help.
BR, Miki

On Wed, Mar 25, 2009 at 2:44 AM, Ross Finlayson wrote: >> I'm using RTSP as a client. Unfortunately the authentication fails. I >> found out that live555 implements the RFC 2069 specification. This >> specification was replaced by RFC 2617, and unfortunately our camera >> uses the features of the new standard. > > Thanks for the information. Could you please send us an example of the > diagnostic output printed by the RTSP client (to get this, set > "verbosityLevel" to 1 in "RTSPClient::createNew()"), so I can see > specifically what your server is sending that's different from what we > currently implement? > -- > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel >

From sebastien-devel at celeos.eu Wed Mar 25 05:30:26 2009 From: sebastien-devel at celeos.eu (Sébastien Escudier) Date: Wed, 25 Mar 2009 13:30:26 +0100 Subject: [Live-devel] transmission delay using Live555 In-Reply-To: <70eda3d30903250241uedd116ctfd33ca449faf33a@mail.gmail.com> References: <70eda3d30903250241uedd116ctfd33ca449faf33a@mail.gmail.com> Message-ID: <1237984226.1109.6.camel@sebastien-office.USERPC>

Hi, On Wed, 2009-03-25 at 17:41 +0800, davinci white wrote: > The video in the VLC player is delayed by almost 2 seconds compared to the video > on the monitor.

Are you sure it is a transmission delay? I would say this is a VLC question. You could try the --*-caching options, like --rtsp-caching=100 to get 100 ms of caching in VLC's RTSP module. But it won't decrease the total delay to 100 ms, because VLC caches frames in different places. I noticed that there are at least 3 frames cached in VLC, in addition to the rtsp-caching value. So the delay will also depend on your frame rate.
From finlayson at live555.com Wed Mar 25 05:52:05 2009 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 25 Mar 2009 05:52:05 -0700 Subject: [Live-devel] transmission delay using Live555 In-Reply-To: <70eda3d30903250241uedd116ctfd33ca449faf33a@mail.gmail.com> References: <70eda3d30903250241uedd116ctfd33ca449faf33a@mail.gmail.com> Message-ID:

>I have successfully transmitted live MPEG4 video using Live555, but have >a problem. >I connect the IP camera to a monitor, and to a DSP evaluation board too. > >The video in the VLC player is delayed by almost 2 seconds compared to the video >on the monitor. > >Where can I modify the source code to reduce the transmission time?

You can't, because there is no significant delay in the "LIVE555 Streaming Media" code - at either the sending end or the receiving (VLC) end. However, VLC does have a separate jitter buffer that - by default - adds 1.2 seconds (1200 ms) at the receiving end. You can reduce this by changing VLC's Preferences->Interface(all)->Input/Codecs->Demuxers->RTP/RTSP->Caching value (ms) option. Note, though, that if you reduce this value too much, your image quality (in VLC) might suffer.

Also, you haven't said how you are transferring video data from your camera into our "MPEG4VideoStreamDiscreteFramer" class. Perhaps there is some delay there - e.g., if you are using an OS pipe, then you might be able to reduce its buffer size also. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/

From msn at conecom.dk Thu Mar 26 01:02:10 2009 From: msn at conecom.dk (Michael Skaastrup) Date: Thu, 26 Mar 2009 09:02:10 +0100 Subject: [Live-devel] Compiling on solaris 64 bit - maybe fixed In-Reply-To: References: <5DEAE73D1DDDB24F976671CEEBA7CDED243A1B@cecdom01.conecom.internal> <5DEAE73D1DDDB24F976671CEEBA7CDED05CE5D@cecdom01.conecom.internal> <5DEAE73D1DDDB24F976671CEEBA7CDED05CE5F@cecdom01.conecom.internal> Message-ID: <1238054530.3462.10.camel@msn>

On Tue, 24 03 2009 at 14:44 -0700, Ross Finlayson wrote: > >Figured it out.
I am attaching the config.solaris inline. I've just > >added -m64 in three places to instruct the compiler and linker to make 64-bit > >files. > > Thanks. > > One question, though: What happens if someone uses this new > "config.solaris" to compile/link an application on (and for) a 32-bit > Solaris system. Will the resulting binary still run OK?

Hello, Sorry for the delay, had some work to do ;-) A 64-bit binary cannot run on a 32-bit Solaris as far as I know, so better to make a config.solaris64.

On the project I worked on, I made some load tests. This was on a ferry which has been converted to a floating hotel with 88 cabins. I tested the live555 server successfully with 40 simultaneous clients, showing different MPEG-2 TS files ripped from DVDs. The server seems single-threaded, and connections max out at about 45 clients connecting to 5 different MPEGs. So I started another instance, which listens on port 8554 as the code defaults to. Cool - another 45 clients, I thought. But sadly the server maxed out at about 60 clients, still 28 short of the theoretical max. So the question is: are there any other tricks I could use? The system is a quad-core AMD at 2.5 GHz, 8 GB RAM, and 14 SAS disks on ZFS.

> > Or should there be two config files for Solaris: > "config.solaris-32bit" and "config.solaris-64bit"? Please let me > know, so I can make the appropriate changes for the next release of > the software.

Kind regards, Michael Skaastrup Web: www.conecom.dk Systems Consultant Email: msn at conecom.dk Con E Com A/S Tlf: 36 93 25 57 Hjallesegade 45 Mob: +45 20 31 41 90 5260 Odense S Tlf: +45 70 11 20 19 Fax: +45 70 11 20 18 _______________ Please note that this message may contain confidential information. If you have received this message by mistake, please inform the sender of the mistake by sending a reply, then delete the message from your system without making, distributing or retaining any copies of it. Although we believe that the message and any attachments are free from viruses and other errors that might affect the computer or IT system where it is received and read, the recipient opens the message at his or her own risk. We assume no responsibility for any loss or damage arising from the receipt or use of this message.

From live.davinci at gmail.com Thu Mar 26 00:49:28 2009 From: live.davinci at gmail.com (davinci white) Date: Thu, 26 Mar 2009 15:49:28 +0800 Subject: [Live-devel] transmission delay using Live555 Message-ID: <70eda3d30903260049g623dc60ej742eb29fc11e4f09@mail.gmail.com>

Thank you, the problem is solved. Just change VLC's Preferences->Interface(all)->Input/Codecs->Demuxers->RTP/RTSP->Caching value (ms). I reduced it to 80 ms.
From gabriele.deluca at hotmail.com Thu Mar 26 06:44:36 2009 From: gabriele.deluca at hotmail.com (Gabriele De Luca) Date: Thu, 26 Mar 2009 14:44:36 +0100 Subject: [Live-devel] Duplicate Packets Message-ID:

Hi Ross, how does the library behave in the presence of duplicate packets in its input? How is the duplication of packets taken into account in the QoSMeasurement class (negative values)? Thanks in advance

From finlayson at live555.com Thu Mar 26 07:43:56 2009 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 26 Mar 2009 07:43:56 -0700 Subject: [Live-devel] Duplicate Packets In-Reply-To: References: Message-ID:

>how does the library behave in the presence of duplicate packets in its input?

Our software automatically detects, and discards, duplicate RTP packets.

>How is the duplication of packets taken into account in the QoSMeasurement class (negative values)?

Yes. If a stream contains only duplicate RTP packets, with no packets lost, then this will be reported as negative packet loss. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/

From blanes at eps.udg.edu Thu Mar 26 09:22:35 2009 From: blanes at eps.udg.edu (Josep Blanes) Date: Thu, 26 Mar 2009 17:22:35 +0100 Subject: [Live-devel] looping stream Message-ID: <49CBABCB.5070800@eps.udg.edu>

Dear Sirs, I'm using live555MediaServer to stream an MPEG-TS-encapsulated H.264 video to an external set-top-box (from Exterity). All is working fine; the only problem is that I want to loop the video (rewind and replay) once it has finished. Other streaming servers allow this option, but I don't know how to do it with yours. I've seen some software that edits a *.ts file (mpeg2-ts) and generates a loop in the video file, but I'm not really sure whether this would work or not.
Any help appreciated. Thanks a lot. -- Josep Blanes Responsable Seccio Informatica EPS E-mail: josep.blanes at udg.edu Avda. Lluis Santalo, s/n. Escola Politecnica Superior, Edifici P-I Universitat de Girona ---------------------------------------

From kidjan at gmail.com Thu Mar 26 10:29:26 2009 From: kidjan at gmail.com (Jeremy Noring) Date: Thu, 26 Mar 2009 10:29:26 -0700 Subject: [Live-devel] H.264 NAL confusion Message-ID:

> No. Our software (in particular, the "H264VideoRTPSink" class) > already implements the RTP payload format described in RFC 3984. > All you need to do is deliver - one at a time - NAL units to it > (*without* a preceding 'length' field).

Ross, Thank you for the clarification. A few follow-up questions:

1. I have my stream running, but I can't seem to get VLC to digest it. Should VLC be able to play it? If I use UDP streaming, I can see the incoming packets, but it never manages to figure out that the stream is H.264. If I use RTP, it doesn't seem to do much of anything. Any thoughts?

2. Do I need to do anything with PPS and SPS information? My encoder supplies it, but I don't currently do anything with it. This could be my problem (?).

Any help is greatly appreciated; my goal is to get the output of my application to render correctly in VLC, so any advice would be great.

From finlayson at live555.com Thu Mar 26 17:30:29 2009 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 26 Mar 2009 17:30:29 -0700 Subject: [Live-devel] H.264 NAL confusion In-Reply-To: References: Message-ID:

>1. I have my stream running, but I can't seem to get VLC to digest >it. Should VLC be able to play it?

Yes, but this is not a VLC mailing list.

>2. Do I need to do anything with PPS and SPS information?

Yes.
You need to Base-64-encode each of these special NAL units, concatenate the resulting strings together with the comma (',') character, and pass the resulting string as the "sprop_parameter_sets_str" parameter to "H264VideoRTPSink::createNew()".

> My encoder supplies it but I don't currently do anything with it. >This could be my problem (?).

Yes. (I wonder: what were you passing as the "sprop_parameter_sets_str" parameter before? That parameter is there for a reason.) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/

From debargha.mukherjee at hp.com Thu Mar 26 17:37:03 2009 From: debargha.mukherjee at hp.com (Mukherjee, Debargha) Date: Fri, 27 Mar 2009 00:37:03 +0000 Subject: [Live-devel] Framed Source Issue In-Reply-To: References: <73833378E80044458EC175FF8C1E63D56DFACD5F10@GVW0433EXB.americas.hpqcorp.net> Message-ID: <73833378E80044458EC175FF8C1E63D57245CCDADB@GVW0433EXB.americas.hpqcorp.net>

Hi Ross, Thanks for the pointers, but I am still struggling with the issue of audio being called much more often than it needs to be. Please find below my derived audio encoding class implementation. My constructor takes in a structure with encoding parameters, along with two pointers to external functions that I use to fill the buffers and check if there is enough data to read. I am using MPEG1or2AudioStreamFramer with MP3 encoding (single channel at 16000 Hz). Also, it seems to me that the MPEG1or2AudioStreamFramer class does not use the fDurationInMicroseconds parameter at all. Any help will be appreciated.
-----------------------------------------------------------------------------
#include "AudioEncSource.hh"

void doGetNextFrame_AudioEncSource(void* clientData) {
  AudioEncSource *a = (AudioEncSource*)clientData;
  a->doGetNextFrame();
}

AudioEncSource* AudioEncSource::createNew(UsageEnvironment& env, AudioEncParameters params,
                                          int (*auReadFn)(short *, int), int (*auIsOpenFn)()) {
  return new AudioEncSource(env, params, auReadFn, auIsOpenFn);
}

AudioEncSource::AudioEncSource(UsageEnvironment& env, AudioEncParameters params,
                               int (*auReadFn)(short *, int), int (*auIsOpenFn)())
  : FramedSource(env), fParams(params), audio_buf(NULL),
    audioReadData(auReadFn), audioIsOpen(auIsOpenFn) {
  // Any initialization of the device would be done here
  //avcodec_init(); // Register all formats and codecs
  //av_register_all();
  c = avcodec_alloc_context();
  avcodec_get_context_defaults2(c, CODEC_TYPE_AUDIO);
  c->codec_id = params.codec_id;
  /* put sample parameters */
  c->bit_rate = params.bitrate;
  c->channels = fParams.numchannels;
  c->sample_rate = fParams.sampfreq; /* frames per second */
  codec = avcodec_find_encoder(c->codec_id);
  if (codec == NULL) {
    printf("Audio encoder could not be found\n");
    av_free(c);
    c = NULL;
    exit(1);
  }
  if (avcodec_open(c, codec) < 0) {
    printf("Audio encoder could not be opened\n");
    av_free(c);
    c = NULL;
    codec = NULL;
    exit(1);
  }
  if (c) {
    framesize = c->frame_size * c->channels;
    audio_buf = (short*)malloc(2 * framesize);
    outbuf_size = 100000;
    outbuf = (uint8_t*)malloc(outbuf_size);
    frameDurationInMicroseconds = (unsigned int)(1000000.0*c->frame_size/c->sample_rate + 0.5);
  }
}

AudioEncSource::~AudioEncSource() {
  if (c) {
    avcodec_close(c);
    av_free(c);
    free(audio_buf);
    free(outbuf);
  }
}

void AudioEncSource::doGetNextFrame() {
  // Arrange here for our "deliverFrame" member function to be called
  // when the next frame of data becomes available from the device.
  // This must be done in a non-blocking fashion - i.e., so that we
  // return immediately from this function even if no data is
  // currently available.
  //
  // If the device can be implemented as a readable socket, then one easy
  // way to do this is using a call to
  //   envir().taskScheduler().turnOnBackgroundReadHandling( ... )
  // (See examples of this call in the "liveMedia" directory.)

  // If, for some reason, the source device stops being readable
  // (e.g., it gets closed), then you do the following:
  if (audioIsOpen) {
    if (!audioIsOpen() /* the source stops being readable */) {
      handleClosure(this);
      return;
    }
  }
  deliverFrame();
}

void AudioEncSource::deliverFrame() {
  // This would be called when new frame data is available from the device.
  // This function should deliver the next frame of data from the device,
  // using the following parameters (class members):
  // 'in' parameters (these should *not* be modified by this function):
  //   fTo: The frame data is copied to this address.
  //     (Note that the variable "fTo" is *not* modified. Instead,
  //     the frame data is copied to the address pointed to by "fTo".)
  //   fMaxSize: This is the maximum number of bytes that can be copied
  //     (If the actual frame is larger than this, then it should
  //     be truncated, and "fNumTruncatedBytes" set accordingly.)
  // 'out' parameters (these are modified by this function):
  //   fFrameSize: Should be set to the delivered frame size (<= fMaxSize).
  //   fNumTruncatedBytes: Should be set iff the delivered frame would have been
  //     bigger than "fMaxSize", in which case it's set to the number of bytes
  //     that have been omitted.
  //   fPresentationTime: Should be set to the frame's presentation time
  //     (seconds, microseconds).
  //   fDurationInMicroseconds: Should be set to the frame's duration, if known.
  if (!isCurrentlyAwaitingData()) return; // we're not ready for the data yet

  // Deliver the data here:
  //fread(audio_buf, 1, framesize, fInFid);
  gettimeofday(&fPresentationTime, NULL);
  int x = audioReadData(audio_buf, framesize);
  if (!x) {
    printf("Audio read zero at time %lu.%lu (%d)\n", fPresentationTime.tv_sec,
           fPresentationTime.tv_usec, frameDurationInMicroseconds);
    envir().taskScheduler().scheduleDelayedTask(frameDurationInMicroseconds>>2,
        (TaskFunc*)doGetNextFrame_AudioEncSource, (void*)this);
    return;
  }
  int out_size = avcodec_encode_audio(c, outbuf, outbuf_size, audio_buf);
  fFrameSize = (out_size <= fMaxSize ? out_size : fMaxSize);
  memcpy(fTo, outbuf, fFrameSize);
  fNumTruncatedBytes = out_size - fFrameSize;
  fDurationInMicroseconds = frameDurationInMicroseconds;

  // After delivering the data, inform the reader that it is now available:
  printf("Audio %d in, %d out at time %lu.%lu (%d)\n", x, fFrameSize,
         fPresentationTime.tv_sec, fPresentationTime.tv_usec, fDurationInMicroseconds);
  FramedSource::afterGetting(this);
}

> -----Original Message----- > From: live-devel-bounces at ns.live555.com > [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson > Sent: Thursday, March 19, 2009 4:58 PM > To: LIVE555 Streaming Media - development & use > Subject: Re: [Live-devel] Framed Source Issue > > >First, is it possible to explain using > >TaskScheduler::scheduleDelayedTask(), as you suggested, a little > >better? > > In your implementation of "doGetNextFrame()", if no input data is > currently available to be delivered to the downstream object, then > you could just call "TaskScheduler::scheduleDelayedTask()" with a > delay parameter of 10000 (i.e., 10 ms), and pass a pointer to a > function (that you would write) that would call "doGetNextFrame()" > once again, to retry. Then, after calling > "TaskScheduler::scheduleDelayedTask()", just return.
> > For example, you could look at the code on lines 58-74 of > "liveMedia/MPEG4VideoFileServerMediaSubsession.cpp", which does > something slightly similar. > > > >Second, I also suspect that the freezing issue has to do with the > >timestamps and the duration. > >I am setting the duration in microsecs as 26122 (for 44.1 kHz, MP3 > >frames of 1152 samples) for audio, and 33333 (30 fps) for video. The > >presentation time is obtained from gettimeofday(). However, I find > >that the audio is called much more often than every 26122 microsecs. Audio > >is called at intervals of only a few thousand microsecs (about 10 > >times more often than it should be). Can you explain what may be > >going on? I am using MP3 in MPEG1or2AudioRTPSink. > > Are you *sure* your "doGetNextFrame()" implementation is setting the > "fDurationInMicroseconds" (and "fFrameSize" and "fPresentationTime") > variables, before it calls "FramedSource::afterGetting()"? One way > you can check this is by looking at the parameters to the > (subsequent) call to "MultiFramedRTPSink::afterGettingFrame()", > and checking that they are correct. > -- > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel >

From finlayson at live555.com Thu Mar 26 17:35:33 2009 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 26 Mar 2009 17:35:33 -0700 Subject: [Live-devel] looping stream In-Reply-To: <49CBABCB.5070800@eps.udg.edu> References: <49CBABCB.5070800@eps.udg.edu> Message-ID:

>Dear Sirs, > >I'm using live555MediaServer to stream an MPEG-TS-encapsulated H.264 >video to an external set-top-box (from Exterity). > >All is working fine; the only problem is that I want to loop the >video (rewind and replay) once it has finished. Other streaming >servers allow this option, but I don't know how to do it with yours.
There's no really easy way to do this, unfortunately. Probably the easiest way would be for you to modify the "ByteStreamFileSource" class so that it handles end-of-file transparently by reseeking the open file back to the beginning, and then re-reading - i.e., so that the higher-level software has no knowledge that the file has looped. You may still run into problems with your TS file's PCR timestamps (and thus your RTP timestamps) wrapping around - depending on how robust your client is. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/

From finlayson at live555.com Thu Mar 26 17:59:59 2009 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 26 Mar 2009 17:59:59 -0700 Subject: [Live-devel] Framed Source Issue In-Reply-To: <73833378E80044458EC175FF8C1E63D57245CCDADB@GVW0433EXB.americas.hpqcorp.net> References: <73833378E80044458EC175FF8C1E63D56DFACD5F10@GVW0433EXB.americas.hpqcorp.net> <73833378E80044458EC175FF8C1E63D57245CCDADB@GVW0433EXB.americas.hpqcorp.net> Message-ID:

>Also it seems to me that the MPEG1or2AudioStreamFramer class does >not use the fDurationInMicroseconds parameter at all.

Yes it does: it *sets* this parameter (see line 144). Once again, look at the parameters that are passed to each call to "void MultiFramedRTPSink::afterGettingFrame()". (These parameters include the frame size and duration; the 'duration' is what our code uses to decide when to ask for the next frame of data.) This should help tell you what's going wrong. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/

From onramp123 at yahoo.com Thu Mar 26 21:19:51 2009 From: onramp123 at yahoo.com (Steve DeLaney) Date: Thu, 26 Mar 2009 20:19:51 -0800 Subject: [Live-devel] Long buffering latency with testMPEG4VideoStreamer on QuickTime Player In-Reply-To: Message-ID: <03f101c9ae93$47084680$6501a8c0@sdelaney2>

Hi there, I ran some initial tests with testMPEG4VideoStreamer on FC10, using QuickTime Player 7.6 on WinXP.
This is going OK so far, but I ran into a couple of initial problems to sort out.

First, after starting RTSP/SDP PLAY, QuickTime Player buffers for a LONG time before playing. It looks like it actually waits until the clip file cycles back to the start before it actually kicks in. For example, if you have a 30-second clip, the average wait would be something like 15 seconds before the clip plays in QuickTime.
It seems the longer the clip, the longer the average wait.

This happens using any of the test clip files "para.m4e", "petrov.m4e" or "wwe.m4e".

Sorry, but I couldn't reproduce this problem with QuickTime Player (version 7.6 (472)) on Mac OS, nor with VLC. Perhaps it's a problem specific to Windows? I suggest that you try using VLC instead of QuickTime Player, to see if that works better for you.

>A secondary and not so urgent issue: I'm testing on a multihomed server and >attempting to restrict multicast output to a specific subnet.

That's something that you have to do in your OS (usually by changing routing tables). It has nothing to do with our software (which, of course, runs above the OS). -- Ross Finlayson Live Networks, Inc. http://www.live555.com/

From iskaz at intracomdefense.com Fri Mar 27 05:59:30 2009 From: iskaz at intracomdefense.com (Ioannis Skazikis) Date: Fri, 27 Mar 2009 14:59:30 +0200 Subject: [Live-devel] How to add remove a ServerMediaSubssesion on the fly? Message-ID: <49CCCDB2.8090606@intracomdefense.com>

Hi Ross, I have written an MPEG-4 streamer based on live555 using my (My)DeviceSource class, (My)VideoFileServerMediaSubsession class, and the MPEG4VideoStreamDiscreteFramer (based on your testOnDemandRTSPServer example). Everything works fine, but I would like to ask you the following: While an RTSP client is connected to the server, I would like to remove the mediaSession and set up another one. How to do that?
I have a function which adds an sms:

fnct_add() {
  ServerMediaSession* sms = ServerMediaSession::createNew(*env, streamName, streamName, descriptionString);
  sms->addSubsession(XvidVideoFileServerMediaSubsession::createNew(*env, inputFileName, reuseFirstSource));
  rtspServer->addServerMediaSession(sms);
}

and one function which removes the sms:

fnct_remove() {
  sms->close(*env, streamName);
  sms->~ServerMediaSession();
}

I run env->taskScheduler().doEventLoop(&watch_event_loop); in its own process (new thread), and I use the watch_event_loop + a dummyTask(Null) to control the event loop.

In order to remove an sms I do: 1) set watch_event_loop = 1 in order to stop the endless loop; 2) remove the sms (sms->close(*env, streamName); and sms->~ServerMediaSession();); 3) stop the thread which runs env->taskScheduler().doEventLoop(&watch_event_loop);.

In order to add a new sms I do: 1) add a new sms with ServerMediaSession* sms = ServerMediaSession::createNew(*env, streamName, streamName, descriptionString); sms->addSubsession(XvidVideoFileServerMediaSubsession::createNew(*env, inputFileName, reuseFirstSource)); rtspServer->addServerMediaSession(sms); 2) set watch_event_loop = 0; 3) create a new thread which runs env->taskScheduler().doEventLoop(&watch_event_loop);.

Generally this process seems to work, but while the client (mplayer) is connected and the server is streaming, if we remove the sms, the client stops playing the video stream but does not close the connection with the server, and therefore live555 does not call the destructor of the (My)DeviceSource class (which it does if I stop the client first). If I call the function which adds a new sms again, then and only then does live555 call the destructor of the (My)DeviceSource class first (which stops my frame grabber card) and then initialize a new instance of (My)DeviceSource (which starts my frame grabber card) when a client attempts to connect to the server again.
Why does the execution of sms->~ServerMediaSession(); not call the destructor of the (My)DeviceSource class? Why does the connection between client and server not close when I remove the sms? Which is the right way to add/remove an sms during a connection between server and client?

From iskaz at intracomdefense.com Fri Mar 27 06:21:59 2009 From: iskaz at intracomdefense.com (Ioannis Skazikis) Date: Fri, 27 Mar 2009 15:21:59 +0200 Subject: [Live-devel] Sorry about that, but I forgot my name... (How to add remove a ServerMediaSubssesion on the fly?) Message-ID: <49CCD2F7.2010509@intracomdefense.com>

Hi Ross, I have written an MPEG-4 streamer based on live555 using my (My)DeviceSource class, (My)VideoFileServerMediaSubsession class, and the MPEG4VideoStreamDiscreteFramer (based on your testOnDemandRTSPServer example). Everything works fine, but I would like to ask you the following: While an RTSP client is connected to the server, I would like to remove the mediaSession and set up another one. How to do that?

I have a function which adds an sms:

fnct_add() {
  ServerMediaSession* sms = ServerMediaSession::createNew(*env, streamName, streamName, descriptionString);
  sms->addSubsession(XvidVideoFileServerMediaSubsession::createNew(*env, inputFileName, reuseFirstSource));
  rtspServer->addServerMediaSession(sms);
}

and one function which removes the sms:

fnct_remove() {
  sms->close(*env, streamName);
  sms->~ServerMediaSession();
}

I run env->taskScheduler().doEventLoop(&watch_event_loop); in its own process (new thread), and I use the watch_event_loop + a dummyTask(Null) to control the event loop.
In order to remove an sms I do: 1) set watch_event_loop = 1 in order to stop the endless loop; 2) remove the sms (sms->close(*env, streamName) and sms->~ServerMediaSession()); 3) stop the thread which runs env->taskScheduler().doEventLoop(&watch_event_loop). In order to add a new sms I do: 1) add a new sms with ServerMediaSession* sms = ServerMediaSession::createNew(*env, streamName, streamName, descriptionString); sms->addSubsession(XvidVideoFileServerMediaSubsession::createNew(*env, inputFileName, reuseFirstSource)); rtspServer->addServerMediaSession(sms); 2) set watch_event_loop = 0; 3) create a new thread which runs env->taskScheduler().doEventLoop(&watch_event_loop). Generally this process seems to work, but while the client (mplayer) is connected and the server is streaming, if we remove the sms the client stops playing the video stream but does not close its connection with the server, and therefore live555 does not call the destructor of the (My)DeviceSource class (which it does if I stop the client first). If I then call the function which adds a new sms, then and only then does live555 call the destructor of the (My)DeviceSource class first (which stops my frame grabber card) and then initialize a new instance of (My)DeviceSource (which starts my frame grabber card) when a client attempts to connect to the server again. Why doesn't the execution of sms->~ServerMediaSession(); call the destructor of the (My)DeviceSource class? Why doesn't the connection between client and server close when I remove the sms? What is the right way to add/remove an sms during a connection between server and client? Thanks in advance, Ioannis Skazikis Intracom Defense Electronics. From atulrtripathi at gmail.com Fri Mar 27 05:33:59 2009 From: atulrtripathi at gmail.com (atul tripathi) Date: Fri, 27 Mar 2009 18:03:59 +0530 Subject: [Live-devel] Streaming live AMR Message-ID: <879b22720903270533v1bed18e2we07fa958ac2ec8e@mail.gmail.com> Hi, I am new to the live555 world.
I need to stream live AMR data using live555. For this I have written a buffer (a subclass of FramedSource). I am converting the live raw audio to AMR (sampling rate: 8000 Hz, bit rate: 12200) using ffmpeg and writing it to the buffer. I have modified *AMRAudioFileServerMediaSubsession* to take my buffer as the source instead of AMRAudioFileSource. I also modified *AMRAudioRTPSink* to take my buffer as the source instead of AMRAudioSource. But when I start streaming, I get some musical sound or silence. When I write the same data (encoded by ffmpeg) to a .amr file, it is fine. I am reading the frames in the same way as *AMRAudioFileSource* does in doGetNextFrame(). Do we need some different read logic for live AMR data? Thanks in advance, Atul -------------- next part -------------- An HTML attachment was scrubbed... URL: From debargha.mukherjee at hp.com Fri Mar 27 09:28:10 2009 From: debargha.mukherjee at hp.com (Mukherjee, Debargha) Date: Fri, 27 Mar 2009 16:28:10 +0000 Subject: [Live-devel] Framed Source Issue In-Reply-To: References: <73833378E80044458EC175FF8C1E63D56DFACD5F10@GVW0433EXB.americas.hpqcorp.net> <73833378E80044458EC175FF8C1E63D57245CCDADB@GVW0433EXB.americas.hpqcorp.net> Message-ID: <73833378E80044458EC175FF8C1E63D57245CCDD98@GVW0433EXB.americas.hpqcorp.net> Hi Ross, I am still somewhat confused. The parameter fDurationInMicroseconds is being set correctly by me in the deliverFrame() function of my AudioEncSource class before the call to FramedSource::afterGetting(this). Could you point me to where in your code it is actually used to decide when to make the next call? At line 144 of MPEG1or2AudioStreamFramer::continueReadProcessing(), as you mentioned, it is *set* anyway by your code, so however I set it in my class does not seem to matter at all. Thanks, Debargha.
> -----Original Message----- > From: live-devel-bounces at ns.live555.com > [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson > Sent: Thursday, March 26, 2009 6:00 PM > To: LIVE555 Streaming Media - development & use > Subject: Re: [Live-devel] Framed Source Issue > > >Also it seems to me that the MPEG1or2AudioStreamFramer class does > >not use the fDurationInMicroseconds parameter at all. > > Yes it does, it *sets* this parameter (see line 144). > > Once again, look at the parameters that are passed to each call to > "void MultiFramedRTPSink::afterGettingFrame()". (These parameters > include the frame size and duration; the 'duration' is what our code > uses to decide when to ask for the next frame of data.) This should > help tell you what's going wrong. > -- > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > From finlayson+mobile at live555.com Fri Mar 27 14:24:38 2009 From: finlayson+mobile at live555.com (Ross Finlayson) Date: Fri, 27 Mar 2009 14:24:38 -0700 Subject: [Live-devel] How to add/remove a ServerMediaSubsession on the fly? In-Reply-To: <49CCCDB2.8090606@intracomdefense.com> References: <49CCCDB2.8090606@intracomdefense.com> Message-ID: Ross. ---------- Sent from my jesusPhone > While an rtsp client is connected to the server I would like to > remove the mediaSession and set up another one. How to do that? Just call "RTSPServer::removeServerMediaSession()". You can do this even if clients are already streaming from it. (Any ongoing streams will run until completion, but no new clients will be able to use it.)
From finlayson+mobile at live555.com Fri Mar 27 14:42:22 2009 From: finlayson+mobile at live555.com (Ross Finlayson) Date: Fri, 27 Mar 2009 14:42:22 -0700 Subject: [Live-devel] Framed Source Issue Message-ID: <9EC7F671-9F1D-4AD6-A1E2-EABDCB3CFAD8@live555.com> On Mar 27, 2009, at 9:28 AM, "Mukherjee, Debargha" wrote: > Hi Ross, > > I am still somewhat confused. > The parameter fDurationInMicroseconds is being set correctly by me > in the deliverFrame() function of my AudioEncSource class before the > call to FramedSource::afterGetting(this). Could you point me to > where in your code it is actually used to decide when to make the > next call? > > In line 144 of MPEG1or2AudioStreamFramer::continueReadProcessing(), > as you mentioned, it is *set* anyway by your code. So however I set > it in my class does not seem to matter at all. If your audio source object delivers discrete audio frames (one at a time), then you should *not* feed it into a "MPEG1or2AudioStreamFramer". (That class is for parsing MPEG audio from an unstructured byte stream.) Instead, feed your input source directly to your "MPEG1or2AudioRTPSink". From debargha.mukherjee at hp.com Fri Mar 27 15:22:11 2009 From: debargha.mukherjee at hp.com (Mukherjee, Debargha) Date: Fri, 27 Mar 2009 22:22:11 +0000 Subject: [Live-devel] Framed Source Issue In-Reply-To: <9EC7F671-9F1D-4AD6-A1E2-EABDCB3CFAD8@live555.com> References: <9EC7F671-9F1D-4AD6-A1E2-EABDCB3CFAD8@live555.com> Message-ID: <73833378E80044458EC175FF8C1E63D57245CCE11C@GVW0433EXB.americas.hpqcorp.net> Thanks. How about the MPEG4 video? I am currently encoding video frames into MPEG4 and then using the MPEG4VideoStreamFramer class before feeding into MPEG4ESVideoRTPSink. Is that correct?
> -----Original Message----- > From: live-devel-bounces at ns.live555.com > [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson > Sent: Friday, March 27, 2009 2:42 PM > To: LIVE555 Streaming Media - development & use > Subject: Re: [Live-devel] Framed Source Issue > > On Mar 27, 2009, at 9:28 AM, "Mukherjee, Debargha" > > wrote: > > > Hi Ross, > > > > I am still somewhat confused. > > The parameter fDurationInMicrosecondss is being set > correctly by me > > in the deliverFrame() function of my AudioEncSource class > before the > > call to FramedSource::afterGetting(this). Could you point me to > > where in your code it is actually used to decide when to make the > > next call? > > > > In line 144 of > MPEG1or2AudioStreamFramer::continueReadProcessing(), > > as you mentioned, it is *set* anyways by your code. So > however I set > > it in my class does not seem to matter at all. > >> > >> > >> > >> > > If your audio source object delivers discrete audio frames (one at a > time), then you should *not* feed it into a > "MPEG1or2AudioStreamFramer". (That class is for parsing MPEG audio > from an unstructured byte stream.). Instead, feed your input source > directly to your "MPEG1or2AudioRTPSink". > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > From finlayson+mobile at live555.com Fri Mar 27 15:40:56 2009 From: finlayson+mobile at live555.com (Ross Finlayson) Date: Fri, 27 Mar 2009 15:40:56 -0700 Subject: [Live-devel] Framed Source Issue In-Reply-To: <73833378E80044458EC175FF8C1E63D57245CCE11C@GVW0433EXB.americas.hpqcorp.net> References: <9EC7F671-9F1D-4AD6-A1E2-EABDCB3CFAD8@live555.com> <73833378E80044458EC175FF8C1E63D57245CCE11C@GVW0433EXB.americas.hpqcorp.net> Message-ID: <40CE587C-0014-4776-82D8-D9BCAFB6ACBC@live555.com> On Mar 27, 2009, at 3:22 PM, "Mukherjee, Debargha" wrote: > Thanks. How about the MPEG4 video? 
I am currently encoding video frames into MPEG4 and then using the MPEG4VideoStreamFramer class before feeding into MPEG4ESVideoRTPSink. Is that correct? No. Because your input source delivers discrete frames, you should feed it into "MPEG4VideoStreamDiscreteFramer" instead. From finlayson at live555.com Sat Mar 28 13:05:42 2009 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 28 Mar 2009 13:05:42 -0700 Subject: [Live-devel] Streaming live AMR In-Reply-To: <879b22720903270533v1bed18e2we07fa958ac2ec8e@mail.gmail.com> References: <879b22720903270533v1bed18e2we07fa958ac2ec8e@mail.gmail.com> Message-ID: >I need to stream live AMR data using live555. > >For this I have written a buffer (a subclass of FramedSource). This should be a subclass of "AMRAudioSource", not just "FramedSource". >I am converting the live raw audio to AMR (sampling rate: 8000 Hz, bit rate: >12200) using ffmpeg and writing it to the buffer. Also, your class (a subclass of "AMRAudioSource") should properly set the fields Boolean fIsWideband; unsigned fNumChannels; u_int8_t fLastFrameHeader; which you inherit from "AMRAudioSource". >I have modified AMRAudioFileServerMediaSubsession to take my buffer >as the source instead of AMRAudioFileSource. No, you shouldn't modify the existing code; it works just fine for its purpose. Instead, you should write a *new* class - a subclass of "ServerMediaSubsession" - that does what you want. (It will likely copy much of the code from the existing "AMRAudioFileServerMediaSubsession" class, which you should not change.) >I also modified AMRAudioRTPSink to take my buffer as the source instead >of AMRAudioSource. No, you definitely should not modify "AMRAudioRTPSink"; there is nothing wrong with it. Just use this class as is. (Because your new input class is a subclass of "AMRAudioSource", the existing code will continue to work.) -- Ross Finlayson Live Networks, Inc.
http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From joeflin at 126.com Sun Mar 29 00:38:04 2009 From: joeflin at 126.com (joeflin) Date: Sun, 29 Mar 2009 15:38:04 +0800 (CST) Subject: [Live-devel] openRTSP + H264 Message-ID: <32061428.121831238312284909.JavaMail.coremail@bj126app27.126.com> Hi, openRTSP saved the stream to a file, and I added the SPS and PPS according to the SDP info. When I ran the "file" command on the saved file, I got "video-H264-2: JVT NAL sequence, H.264 video, baseline @ L 31", but mplayer and vlc cannot play this file. vlc complains about "nothing to play", and mplayer says "libavformat file format detected.", where it should say "H264-ES file format detected.". Any hints? Joe BTW, the stream itself can be played by both mplayer and vlc. -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sun Mar 29 02:05:08 2009 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 29 Mar 2009 02:05:08 -0700 Subject: [Live-devel] openRTSP + H264 In-Reply-To: <32061428.121831238312284909.JavaMail.coremail@bj126app27.126.com> References: <32061428.121831238312284909.JavaMail.coremail@bj126app27.126.com> Message-ID: >openRTSP saved the stream to a file, and I added the SPS and PPS >according to the SDP info. When I ran the "file" command on the saved >file, I got >"video-H264-2: JVT NAL sequence, H.264 video, baseline @ L 31", but >mplayer and vlc cannot play this file. vlc complains about "nothing >to play", and mplayer says "libavformat file format detected.", >where it should say "H264-ES file format detected.". Any hints? Yes - contact the authors of those two applications, asking them why they do not (or cannot) play this type of file. -- Ross Finlayson Live Networks, Inc.
http://www.live555.com/ From patbob at imoveinc.com Mon Mar 30 09:03:33 2009 From: patbob at imoveinc.com (Patrick White) Date: Mon, 30 Mar 2009 09:03:33 -0700 Subject: [Live-devel] Long buffering latency with testMPEG4VideoStreamer on quicktime player In-Reply-To: References: <03f101c9ae93$47084680$6501a8c0@sdelaney2> Message-ID: <200903300903.33732.patbob@imoveinc.com> > >A secondary and not so urgent issue; I'm testing on a multihomed server > > and attempting to restrict multicast output to a specific subnet. > > That's something that you have to do in your OS (usually by changing > routing tables). It has nothing to do with our software (which, of > course, runs above the OS). If I understand the code right, the global variable SendingInterfaceAddr limits which physical interface the multicast goes out on.. its not subnet control, but could that help in this instance? From debargha.mukherjee at hp.com Mon Mar 30 10:19:36 2009 From: debargha.mukherjee at hp.com (Mukherjee, Debargha) Date: Mon, 30 Mar 2009 17:19:36 +0000 Subject: [Live-devel] Framed Source Issue In-Reply-To: <40CE587C-0014-4776-82D8-D9BCAFB6ACBC@live555.com> References: <9EC7F671-9F1D-4AD6-A1E2-EABDCB3CFAD8@live555.com> <73833378E80044458EC175FF8C1E63D57245CCE11C@GVW0433EXB.americas.hpqcorp.net> <40CE587C-0014-4776-82D8-D9BCAFB6ACBC@live555.com> Message-ID: <73833378E80044458EC175FF8C1E63D572464B823D@GVW0433EXB.americas.hpqcorp.net> Thanks Ross - By making the changes in audio and video sink connections as you suggested, the intermittent crashing/freezing problem seems to have disappered. However, audio still gets called much more often than it needs to as per the fDurationInMicroseconds prameter. Two more questions: 1. If I encoded discrete video frames using H.264 (say using ffmpeg), should I directly feed to H264VideoRTPSink, or should I use H264VideoStreamFramer in between? 2. 
If I used a "proprietary" discrete video frame encoder and decoder, what is the best way to use the live libraries to stream and receive? I am planning to use a derived FileSink class at the client end to receive and decode the elementary stream, but I am not sure about the server side. Best Regards, Debargha. > -----Original Message----- > From: live-devel-bounces at ns.live555.com > [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson > Sent: Friday, March 27, 2009 2:42 PM > To: LIVE555 Streaming Media - development & use > Subject: Re: [Live-devel] Framed Source Issue > > On Mar 27, 2009, at 9:28 AM, "Mukherjee, Debargha" > > wrote: > > > Hi Ross, > > > > I am still somewhat confused. > > The parameter fDurationInMicroseconds is being set > correctly by me > > in the deliverFrame() function of my AudioEncSource class > before the > > call to FramedSource::afterGetting(this). Could you point me to > > where in your code it is actually used to decide when to make the > > next call? > > > > In line 144 of > MPEG1or2AudioStreamFramer::continueReadProcessing(), > > as you mentioned, it is *set* anyway by your code. So > however I set > > it in my class does not seem to matter at all. > > If your audio source object delivers discrete audio frames (one at a > time), then you should *not* feed it into a > "MPEG1or2AudioStreamFramer". (That class is for parsing MPEG audio > from an unstructured byte stream.) Instead, feed your input source > directly to your "MPEG1or2AudioRTPSink". > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > From finlayson+mobile at live555.com Fri Mar 27 15:40:56 2009 From: finlayson+mobile at live555.com (Ross Finlayson) Date: Fri, 27 Mar 2009 15:40:56 -0700 Subject: [Live-devel] Framed Source Issue In-Reply-To: <73833378E80044458EC175FF8C1E63D57245CCE11C@GVW0433EXB.americas.hpqcorp.net> References: <9EC7F671-9F1D-4AD6-A1E2-EABDCB3CFAD8@live555.com> <73833378E80044458EC175FF8C1E63D57245CCE11C@GVW0433EXB.americas.hpqcorp.net> Message-ID: <40CE587C-0014-4776-82D8-D9BCAFB6ACBC@live555.com> On Mar 27, 2009, at 3:22 PM, "Mukherjee, Debargha" wrote: > Thanks. How about the MPEG4 video?
Name: live.2009.02.23.patch Type: application/octet-stream Size: 2390 bytes Desc: not available URL: -------------- next part -------------- On Mar 24, 2009, at 12:50 AM, Ross Finlayson wrote: >> Is there an easy way to access the RTP header when using live555 as >> a client? > > Not yet, unfortunately. (The code currently checks for an RTP > extension header, but then just ignores (skips over) it if it sees it.) > -- > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel From kidjan at gmail.com Mon Mar 30 14:48:14 2009 From: kidjan at gmail.com (Jeremy Noring) Date: Mon, 30 Mar 2009 14:48:14 -0700 Subject: [Live-devel] H.264 NAL confusion In-Reply-To: References: Message-ID: > Yes. (I wonder, what were you passing as the "sprop_parameter_sets_str" parameter before?? > That parameter is there for a reason.) Before, just "h264", which is what the sample code I started with used (I was hoping I could just start with an existing sample and move forward, but apparently it's more complicated than that). I don't doubt it's there for a reason; I didn't even realize it was there, or the associated profile level id. I'm confused about what information I need to supply to the framer to get all this to work correctly. So far it seems I need to: 1. Supply NAL units 2. Implement currentNALUnitEndsAccessUnit() (for me, this always returns true since each NAL unit I have seems to correspond with a whole encoded frame--does that sound right to you?) 3. Supply SPS/PPS information, which is communicated in sprop_parameter_sets_str (just for clarification: this looks like "(SPS as base64),(PPS as base64)", right?) 4. Supply profile_level_id (any hints about this one? I've reviewed RFC 3984, and I'm still a little fuzzy on what this value should look like) 5. Anything else?
I really appreciate your help on this; I have no experience with H.264 or streaming per RFC 3984, so your advice has been a big help. I know for a fact my H.264 video stream is valid (I can decode/render it) and Live555 is sending traffic over the wire (I see it coming in VLC, in task manager, etc.). My goal is to get to the point where VLC can render it, and I think I'm pretty close. -------------- next part -------------- An HTML attachment was scrubbed... URL: From yorksun at freescale.com Mon Mar 30 14:58:48 2009 From: yorksun at freescale.com (York Sun) Date: Mon, 30 Mar 2009 16:58:48 -0500 Subject: [Live-devel] stream m4e Message-ID: <1238450328.5328.178.camel@oslab-l1> I am new to this list. Tried to search the archive but didn't find a good way. In short, I have difficulty to stream m4e by testMPEG4VideoStreamer. The m4e comes from openRTSP (renamed from video-MP4V-ES-1), captured from my Axis camera, 640x480. I can see the picture with wrong color. vlc and mplayer display different wrong color. I don't know which part is wrong. More info, if adding -i or -4 to openRTSP, the avi or mp4 file can be correctly played by vlc and mplayer. I don't know how to verify the m4e file. I can stream the sample para.m4e file. Only the very beginning shows wrong color. Any help is appreciated. York From bitter at vtilt.com Mon Mar 30 17:51:12 2009 From: bitter at vtilt.com (Brad Bitterman) Date: Mon, 30 Mar 2009 20:51:12 -0400 Subject: [Live-devel] stream m4e In-Reply-To: <1238450328.5328.178.camel@oslab-l1> References: <1238450328.5328.178.camel@oslab-l1> Message-ID: You can use ffmpeg to wrap the file into a mp4 container. I think the command is just the following: ./ffmpeg -i video-MP4V-ES-1 out.mp4 You should then be able to open the file with VLC. Also, ffmpeg will usually tell you if there are problems with the bitstream. I have also just renamed the file with a .m4v extension and VLC can play it directly without any container. 
I have used openRTSP in the past to capture a stream from an Axis camera and it worked fine. You might also want to turn the debug log level up in vlc and see if it reports any errors. - Brad On Mar 30, 2009, at 5:58 PM, York Sun wrote: > I am new to this list. Tried to search the archive but didn't find a > good way. > > In short, I have difficulty to stream m4e by testMPEG4VideoStreamer. > The > m4e comes from openRTSP (renamed from video-MP4V-ES-1), captured > from my > Axis camera, 640x480. I can see the picture with wrong color. vlc and > mplayer display different wrong color. I don't know which part is > wrong. > > More info, if adding -i or -4 to openRTSP, the avi or mp4 file can be > correctly played by vlc and mplayer. I don't know how to verify the > m4e > file. I can stream the sample para.m4e file. Only the very beginning > shows wrong color. > > Any help is appreciated. > > York > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel From yorksun at freescale.com Mon Mar 30 19:34:06 2009 From: yorksun at freescale.com (sun york-R58495) Date: Mon, 30 Mar 2009 20:34:06 -0600 Subject: [Live-devel] stream m4e References: <1238450328.5328.178.camel@oslab-l1> Message-ID: Thanks for the hint. vlc can play the file with .m4v extension. That at least proves the openRTSP works OK. I still don't know why testMPEG4VideoStreamer sends the vague picture with wrong color. Is it resolution issue? The sample para.m4e file streams OK. It has low resolution, though. York -----Original Message----- From: live-devel-bounces at ns.live555.com on behalf of Brad Bitterman Sent: Mon 3/30/2009 19:51 To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] stream m4e You can use ffmpeg to wrap the file into a mp4 container. I think the command is just the following: ./ffmpeg -i video-MP4V-ES-1 out.mp4 You should then be able to open the file with VLC. 
Also, ffmpeg will usually tell you if there are problems with the bitstream. I have also just renamed the file with a .m4v extension and VLC can play it directly without any container. I have used openRTSP in the past to capture a stream from an Axis camera and it worked fine. You might also want to turn the debug log level up in vlc and see if it reports any errors. - Brad On Mar 30, 2009, at 5:58 PM, York Sun wrote: > I am new to this list. Tried to search the archive but didn't find a > good way. > > In short, I have difficulty to stream m4e by testMPEG4VideoStreamer. > The > m4e comes from openRTSP (renamed from video-MP4V-ES-1), captured > from my > Axis camera, 640x480. I can see the picture with wrong color. vlc and > mplayer display different wrong color. I don't know which part is > wrong. > > More info, if adding -i or -4 to openRTSP, the avi or mp4 file can be > correctly played by vlc and mplayer. I don't know how to verify the > m4e > file. I can stream the sample para.m4e file. Only the very beginning > shows wrong color. > > Any help is appreciated. > > York > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Mon Mar 30 20:51:52 2009 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 30 Mar 2009 20:51:52 -0700 Subject: [Live-devel] Framed Source Issue In-Reply-To: <73833378E80044458EC175FF8C1E63D572464B823D@GVW0433EXB.americas.hpqcorp.net> References: <9EC7F671-9F1D-4AD6-A1E2-EABDCB3CFAD8@live555.com> <73833378E80044458EC175FF8C1E63D57245CCE11C@GVW0433EXB.americas.hpqcorp.net> <40CE587C-0014-4776-82D8-D9BCAFB6ACBC@live555.com> <73833378E80044458EC175FF8C1E63D572464B823D@GVW0433EXB.americas.hpqcorp.net> Message-ID: >However, audio still gets called much more often than it needs to as >per the fDurationInMicroseconds parameter. Remember, You Have Complete Source Code. I have told you *repeatedly* what you need to look at to figure out why your problem is happening. Why do you keep ignoring my advice?? >Two more questions: > >1. If I encode discrete video frames using H.264 (say using >ffmpeg), should I feed them directly to H264VideoRTPSink, or should I use >H264VideoStreamFramer in between? No, you must use (your own subclass of) "H264VideoStreamFramer" in between your H.264 video source object and your "H264VideoRTPSink". > >2. If I used a "proprietary" discrete video frame encoder and >decoder, what is the best way to use the live libraries to stream >and receive? I am planning to use a derived FileSink class at the >client end to receive and decode the elementary stream, but I am not >sure about the server side. If by "proprietary" you mean a proprietary codec (rather than just a proprietary implementation of a public codec), then the answer is that you can't - unless an RTP payload format has been defined for this codec. -- Ross Finlayson Live Networks, Inc.
http://www.live555.com/ From finlayson at live555.com Mon Mar 30 21:23:09 2009 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 30 Mar 2009 21:23:09 -0700 Subject: [Live-devel] H.264 NAL confusion In-Reply-To: References: Message-ID: >I'm confused about what information I need to supply to the framer >to get all this to work correctly. So far it seems I need to: > >1. Supply NAL units > >2. Implement currentNALUnitEndsAccessUnit() (for me, this always >returns true since each NAL unit I have seems to correspond with a >whole encoded frame--does that sound right to you?) Yes. > >3. supply SPS/PPS information which is communicated in >sprop_parameter_sets_str (just for clarification: this looks like >"(SPS as base64),(PPS as base64)", right?) Yes. > >4. supply profile_level_id (any hints about this one? I've reviewed >rfc 3984, and i'm still a little fuzzy on what this value should >look like) I suggest consulting the documentation for your H.264 encoder. > >5. Anything else? Remember to set "fPresentationTime" and "fDurationInMicroseconds". -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Mon Mar 30 21:58:44 2009 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 30 Mar 2009 21:58:44 -0700 Subject: [Live-devel] Long buffering latency with testMPEG4VideoStreamer on quicktime player In-Reply-To: <200903300903.33732.patbob@imoveinc.com> References: <03f101c9ae93$47084680$6501a8c0@sdelaney2> <200903300903.33732.patbob@imoveinc.com> Message-ID: > > >A secondary and not so urgent issue; I'm testing on a multihomed server >> > and attempting to restrict multicast output to a specific subnet. >> >> That's something that you have to do in your OS (usually by changing >> routing tables). It has nothing to do with our software (which, of >> course, runs above the OS). > >If I understand the code right, the global variable SendingInterfaceAddr >limits which physical interface the multicast goes out on.. 
it's not subnet >control, but could that help in this instance? Perhaps, but for that to work the OS probably would still need to route multicast on that interface. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From RGlobisch at csir.co.za Tue Mar 31 02:56:08 2009 From: RGlobisch at csir.co.za (Ralf Globisch) Date: Tue, 31 Mar 2009 11:56:08 +0200 Subject: [Live-devel] End RTSP client session cleanly References: <49D0DCDA0200004D0002EB14@pta-emo.csir.co.za> <49D0E2AF0200004D0002EB27@pta-emo.csir.co.za> <49D204D90200004D0002EBDE@pta-emo.csir.co.za> Message-ID: <49D204D3.5DA9.004D.0@csir.co.za> Hi Ross, We are using the liveMedia RTSP server to deliver media using a source based on the DeviceSource class. What is the correct way to initiate the shutdown of a specific RTSP client session from the server side? This is to cater for the scenario in which an administrator wants to disconnect a specific client (e.g. based on IP). I have located the device source and tried simply calling Medium::close(pDeviceSource). This does end the delivery of media to the connected client, but it causes an access violation later on once the RTSPClientSession times out (as the source pointer is no longer valid). Is there a "correct" way to end an RTSP client session from the server side? Thanks, Kind regards, Ralf -- This message is subject to the CSIR's copyright terms and conditions, e-mail legal notice, and implemented Open Document Format (ODF) standard. The full disclaimer details can be found at http://www.csir.co.za/disclaimer.html. This message has been scanned for viruses and dangerous content by MailScanner, and is believed to be clean. MailScanner thanks Transtec Computers for their support.
From finlayson at live555.com Tue Mar 31 06:05:43 2009 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 31 Mar 2009 06:05:43 -0700 Subject: [Live-devel] Searching this mailing list's archives, FYI In-Reply-To: <1238450328.5328.178.camel@oslab-l1> References: <1238450328.5328.178.camel@oslab-l1> Message-ID: >Tried to search the archive but didn't find a good way. FYI, you can search this mailing list's archives using Google, by adding site:lists.live555.com to your search query. Other search engines probably also let you do something similar. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Tue Mar 31 06:16:03 2009 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 31 Mar 2009 06:16:03 -0700 Subject: [Live-devel] End RTSP client session cleanly In-Reply-To: <49D204D3.5DA9.004D.0@csir.co.za> References: <49D0DCDA0200004D0002EB14@pta-emo.csir.co.za> <49D0E2AF0200004D0002EB27@pta-emo.csir.co.za> <49D204D90200004D0002EBDE@pta-emo.csir.co.za> <49D204D3.5DA9.004D.0@csir.co.za> Message-ID: >What is the correct way to initiate the shutdown of a specific >RTSP client session from the server side? Right now there isn't a clean way to do this, unfortunately. If you know the "RTSPServer::RTSPClientSession" object, then you could try deleting it; that will likely work. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From yorksun at freescale.com Tue Mar 31 07:16:59 2009 From: yorksun at freescale.com (York Sun) Date: Tue, 31 Mar 2009 09:16:59 -0500 Subject: [Live-devel] Searching this mailing list's archives, FYI In-Reply-To: References: <1238450328.5328.178.camel@oslab-l1> Message-ID: <1238509019.3109.0.camel@oslab-l1> On Tue, 2009-03-31 at 06:05 -0700, Ross Finlayson wrote: > >Tried to search the archive but didn't find a good way. > > FYI, you can search this mailing list's archives using Google, by adding > site:lists.live555.com > to your search query. 
> > Other search engines probably also let you do something similar. Great tips. It is much easier to search now. York From patbob at imoveinc.com Tue Mar 31 08:36:45 2009 From: patbob at imoveinc.com (Patrick White) Date: Tue, 31 Mar 2009 08:36:45 -0700 Subject: [Live-devel] End RTSP client session cleanly In-Reply-To: References: <49D0DCDA0200004D0002EB14@pta-emo.csir.co.za> <49D204D3.5DA9.004D.0@csir.co.za> Message-ID: <200903310836.45050.patbob@imoveinc.com> On Tuesday 31 March 2009 6:16 am, Ross Finlayson wrote: > >What is the correct way to initiate the shutdown of a specific > >RTSP client session from the server side? > > Right now there isn't a clean way to do this, unfortunately. If you > know the "RTSPServer::RTSPClientSession" object, then you could try > deleting it; that will likely work. That does work -- it's what we do and how inactive sessions are terminated by the server. From my examination of the code, it looks like it cleans everything up properly too :). However, the client gets no notification that it happened -- the RTSP stream is just rudely closed and RTP data stops arriving. This gets back to the RTCP BYE message not getting sent issue Matt was talking about a few weeks ago. FYI, you'll have to invent a mechanism to be able to get a pointer to the running RTSPClientSession instance so you can call its destructor... and of course you can only do it safely via a scheduled task. From bitter at vtilt.com Tue Mar 31 12:49:52 2009 From: bitter at vtilt.com (Brad Bitterman) Date: Tue, 31 Mar 2009 15:49:52 -0400 Subject: [Live-devel] stream m4e In-Reply-To: References: <1238450328.5328.178.camel@oslab-l1> Message-ID: Not sure what would cause a color issue like what you're describing.... Live555 doesn't modify the frame data in any way so chroma info should not be modified. - Brad On Mar 30, 2009, at 10:34 PM, sun york-R58495 wrote: > Thanks for the hint. vlc can play the file with .m4v extension. That > at least proves the openRTSP works OK.
I still don't know why > testMPEG4VideoStreamer sends the vague picture with wrong color. Is > it resolution issue? The sample para.m4e file streams OK. It has low > resolution, though. > > York > > > > -----Original Message----- > From: live-devel-bounces at ns.live555.com on behalf of Brad Bitterman > Sent: Mon 3/30/2009 19:51 > To: LIVE555 Streaming Media - development & use > Subject: Re: [Live-devel] stream m4e > > You can use ffmpeg to wrap the file into a mp4 container. I think the > command is just the following: > > ./ffmpeg -i video-MP4V-ES-1 out.mp4 > > You should then be able to open the file with VLC. Also, ffmpeg will > usually tell you if there are problems with the bitstream. I have also > just renamed the file with a .m4v extension and VLC can play it > directly without any container. > > I have used openRTSP in the past to capture a stream from an Axis > camera and it worked fine. You might also want to turn the debug log > level up in vlc and see if it reports any errors. > > - Brad > > On Mar 30, 2009, at 5:58 PM, York Sun wrote: > > > I am new to this list. Tried to search the archive but didn't find a > > good way. > > > > In short, I have difficulty to stream m4e by testMPEG4VideoStreamer. > > The > > m4e comes from openRTSP (renamed from video-MP4V-ES-1), captured > > from my > > Axis camera, 640x480. I can see the picture with wrong color. vlc > and > > mplayer display different wrong color. I don't know which part is > > wrong. > > > > More info, if adding -i or -4 to openRTSP, the avi or mp4 file can > be > > correctly played by vlc and mplayer. I don't know how to verify the > > m4e > > file. I can stream the sample para.m4e file. Only the very beginning > > shows wrong color. > > > > Any help is appreciated. 
> > > > York From yorksun at freescale.com Tue Mar 31 13:24:54 2009 From: yorksun at freescale.com (York Sun) Date: Tue, 31 Mar 2009 15:24:54 -0500 Subject: [Live-devel] stream m4e In-Reply-To: References: <1238450328.5328.178.camel@oslab-l1> Message-ID: <1238531094.23779.8.camel@oslab-l1> Brad, I should be clear about the color. It is not random color. It is all green, with a little bit of shape here and there under vlc. mplayer shows mostly black. I think it is something related to multicast. Just found an old post in the archive http://lists.live555.com/pipermail/live-devel/2008-August/009323.html. Ross seems to believe the missing key frame causes the green. I tried live555MediaServer and it streams OK. Here is what I am trying to do. I want to capture RTSP streams from multiple IP cameras (MPEG4 or H.264), save the streams to the hard drive, and also re-stream the video to unicast and/or multicast addresses. I am hoping I can use openRTSP and testMPEG4VideoStreamer as templates and put them together. Now it seems to be a little bit difficult for multicasting. Please advise if it is even possible, or if someone already did it. Thanks, York On Tue, 2009-03-31 at 15:49 -0400, Brad Bitterman wrote: > Not sure what would cause a color issue like what you're > describing.... Live555 doesn't modify the frame data in any way so > chroma info should not be modified.
> > > - Brad > > On Mar 30, 2009, at 10:34 PM, sun york-R58495 wrote: > > > Thanks for the hint. vlc can play the file with .m4v extension. That > > at least proves the openRTSP works OK. I still don't know why > > testMPEG4VideoStreamer sends the vague picture with wrong color. Is > > it resolution issue? The sample para.m4e file streams OK. It has low > > resolution, though. > > > > York > > > > > > > > -----Original Message----- > > From: live-devel-bounces at ns.live555.com on behalf of Brad Bitterman > > Sent: Mon 3/30/2009 19:51 > > To: LIVE555 Streaming Media - development & use > > Subject: Re: [Live-devel] stream m4e > > > > You can use ffmpeg to wrap the file into a mp4 container. I think > > the > > command is just the following: > > > > ./ffmpeg -i video-MP4V-ES-1 out.mp4 > > > > You should then be able to open the file with VLC. Also, ffmpeg > > will > > usually tell you if there are problems with the bitstream. I have > > also > > just renamed the file with a .m4v extension and VLC can play it > > directly without any container. > > > > I have used openRTSP in the past to capture a stream from an Axis > > camera and it worked fine. You might also want to turn the debug > > log > > level up in vlc and see if it reports any errors. > > > > - Brad > > > > On Mar 30, 2009, at 5:58 PM, York Sun wrote: > > > > > I am new to this list. Tried to search the archive but didn't find > > a > > > good way. > > > > > > In short, I have difficulty to stream m4e by > > testMPEG4VideoStreamer. > > > The > > > m4e comes from openRTSP (renamed from video-MP4V-ES-1), captured > > > from my > > > Axis camera, 640x480. I can see the picture with wrong color. vlc > > and > > > mplayer display different wrong color. I don't know which part is > > > wrong. > > > > > > More info, if adding -i or -4 to openRTSP, the avi or mp4 file can > > be > > > correctly played by vlc and mplayer. I don't know how to verify > > the > > > m4e > > > file. 
I can stream the sample para.m4e file. Only the very > > beginning > > > shows wrong color. > > > > > > Any help is appreciated. > > > > > > York > > > > > > _______________________________________________ > > > live-devel mailing list > > > live-devel at lists.live555.com > > > http://lists.live555.com/mailman/listinfo/live-devel > > > > _______________________________________________ > > live-devel mailing list > > live-devel at lists.live555.com > > http://lists.live555.com/mailman/listinfo/live-devel > > > > > > > > > > > > _______________________________________________ > > live-devel mailing list > > live-devel at lists.live555.com > > http://lists.live555.com/mailman/listinfo/live-devel > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel From bitter at vtilt.com Tue Mar 31 13:55:17 2009 From: bitter at vtilt.com (Brad Bitterman) Date: Tue, 31 Mar 2009 16:55:17 -0400 Subject: [Live-devel] stream m4e In-Reply-To: <1238531094.23779.8.camel@oslab-l1> References: <1238450328.5328.178.camel@oslab-l1> <1238531094.23779.8.camel@oslab-l1> Message-ID: <64DCF11D-D0F4-4078-B4F6-F7876FED062A@vtilt.com> Oh! I think I know exactly what your seeing. I have seen this before many times with VLC. What happens for me is that the time between i- frames is very long. When I connect with VLC i only get p-frames for a while. The green is because the initial decoded picture buffer in VLC is filled with all zeros which in YUV land is green. The little shapes you see are the motion vectors that are being decoded against the green buffer. How long is the clip you captured? What is the key frame interval set to on the Axis camera? - Brad On Mar 31, 2009, at 4:24 PM, York Sun wrote: > Brad, > > I should be clear about the color. It is not random color. It is all > green, with a little bit shape here and there under vlc. mplayer shows > most black. 
> > [...]
>>>> York From yorksun at freescale.com Tue Mar 31 14:12:07 2009 From: yorksun at freescale.com (York Sun) Date: Tue, 31 Mar 2009 16:12:07 -0500 Subject: [Live-devel] stream m4e In-Reply-To: <64DCF11D-D0F4-4078-B4F6-F7876FED062A@vtilt.com> References: <1238450328.5328.178.camel@oslab-l1> <1238531094.23779.8.camel@oslab-l1> <64DCF11D-D0F4-4078-B4F6-F7876FED062A@vtilt.com> Message-ID: <1238533927.23779.13.camel@oslab-l1> Brad, My camera is an AXIS M1011 Network Camera. I have no idea about the key frame setting on the camera. My recording is 100 seconds long. However, with testMPEG4VideoStreamer and vlc as a client, I can only see the green picture for about 5 seconds, then the picture freezes and the vlc message says ffmpeg decoder error: more than 5 seconds of late video -> dropping frame (computer too slow ?) On the other hand, vlc can play the recorded file if I rename it to .m4v. York On Tue, 2009-03-31 at 16:55 -0400, Brad Bitterman wrote: > Oh! I think I know exactly what you're seeing. I have seen this before > many times with VLC. What happens for me is that the time between > I-frames is very long.
> [...]
> >>>> York From patbob at imoveinc.com Tue Mar 31 13:50:06 2009 From: patbob at imoveinc.com (Patrick White) Date: Tue, 31 Mar 2009 13:50:06 -0700 Subject: [Live-devel] stream m4e In-Reply-To: <1238531094.23779.8.camel@oslab-l1> References: <1238450328.5328.178.camel@oslab-l1> <1238531094.23779.8.camel@oslab-l1> Message-ID: <200903311350.07007.patbob@imoveinc.com> It's been a while since I looked into these sorts of issues, but green is significant in YUV-encoded imagery, such as JPEG, MPEG-4 and (I believe) H.264. It often means the decoding failed and the maximum amount of blue & red is being subtracted from the resulting image, making it green (i.e. sick) :) It's possible the key frame was missing or had a decode error and was therefore green, and you're seeing deltas applied to that green key frame, hence the little bits of "shape".
Maybe some initialization data didn't make it? Hope that's a useful little tidbit. patbob On Tuesday 31 March 2009 1:24 pm, York Sun wrote: > Brad, > > I should be clear about the color. It is not random color. It is all > green, with a little bit shape here and there under vlc. mplayer shows > most black. > > I think it is something related to multicast. Just found an old post in > archive > http://lists.live555.com/pipermail/live-devel/2008-August/009323.html. > Ross seems to believe the missing key frame causes the green. > > I tried live555MediaServer and it streams OK. > > Here is what I am trying to do. I want to capture rtsp steams from > multiple IP cameras (MPEG4 or H.264), save the stream to the hard drive, > and also re-stream the video to unicast and/or multicast addresses. I am > hoping I can use the openRTSP and testMPEG4VideoStreamer as template and > put them together. Now it seems to be a little bit difficult for > multicasting. Please advise if it is even possible, or someone already > did it. > > Thanks, > > York > > On Tue, 2009-03-31 at 15:49 -0400, Brad Bitterman wrote: > > Not sure what would cause a color issue like what you're > > describing.... Live555 doesn't modify the frame data in any way so > > chroma info should not be modified. > > > > > > - Brad > > > > On Mar 30, 2009, at 10:34 PM, sun york-R58495 wrote: > > > Thanks for the hint. vlc can play the file with .m4v extension. That > > > at least proves the openRTSP works OK. I still don't know why > > > testMPEG4VideoStreamer sends the vague picture with wrong color. Is > > > it resolution issue? The sample para.m4e file streams OK. It has low > > > resolution, though. 
> > > [...]
> > > > > > > > York > > > > > > > > _______________________________________________ > > > > live-devel mailing list > > > > live-devel at lists.live555.com > > > > http://lists.live555.com/mailman/listinfo/live-devel > > > > > > _______________________________________________ > > > live-devel mailing list > > > live-devel at lists.live555.com > > > http://lists.live555.com/mailman/listinfo/live-devel > > > > > > > > > > > > > > > > > > _______________________________________________ > > > live-devel mailing list > > > live-devel at lists.live555.com > > > http://lists.live555.com/mailman/listinfo/live-devel > > > > _______________________________________________ > > live-devel mailing list > > live-devel at lists.live555.com > > http://lists.live555.com/mailman/listinfo/live-devel > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel From finlayson at live555.com Tue Mar 31 14:25:29 2009 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 31 Mar 2009 14:25:29 -0700 Subject: [Live-devel] End RTSP client session cleanly In-Reply-To: <200903310836.45050.patbob@imoveinc.com> References: <49D0DCDA0200004D0002EB14@pta-emo.csir.co.za> <49D204D3.5DA9.004D.0@csir.co.za> <200903310836.45050.patbob@imoveinc.com> Message-ID: > > >What is the correct way to initiate the shutdown of a specific >> >RTSP client session from the server side? >> >> Right now there isn't a clean way to do this, unfortunately. If you >> know the "RTSPServer::RTSPClientSession" object, then you could try >> deleting it; that will likely work. > >That does work -- it's what we do and how inactive sessions are terminated by >the server. From my examining of the code, it looks like it cleans >everything up properly too :). However, the client gets no notification that >it happened -- the RTSP stream is just rudely closed and RTP data stops >arriving. 
This gets back to the RTCP BYE message not getting sent issue Matt >was talking about a few weeks ago. Yes. I'll need to fix this... >FYI, you'll have to invent a mechanism to be able to get a pointer to the >running RTSPClientSession instance so you can call its destructor... The trouble is, more than one client (and thus, more than one "RTSPClientSession") can be using the same "ServerMediaSession" simultaneously. So if you're starting just from the "ServerMediaSession" (e.g., you want to both remove it from the RTSP server, *and* shut down all streams that use it), then you need to do this for all clients (but that's probably what you want anyway). Again, this is a mechanism that I'll need to provide. Stay tuned... > and of >course you can only do it safely via a scheduled task. That's not a problem; in LIVE555 *everything* is a scheduled task. From spider.karma+live555.com at gmail.com Tue Mar 31 13:10:51 2009 From: spider.karma+live555.com at gmail.com (Gordon Smith) Date: Tue, 31 Mar 2009 14:10:51 -0600 Subject: [Live-devel] Read from device file problem Message-ID: <2df568dc0903311310h1aa1c3bcnba70f1a03d0417a9@mail.gmail.com> Hello - After modification for input of a filename, testMPEG2TransportStreamer can read from a pipe, but not directly from a device file. The fread() in ByteStreamFileSource returns EIO and reads zero bytes for a direct read. Device file /dev/video2 is a saa7134-empress device on Linux debian 2.6.26-1-686, live-2009-03-22, latest v4l-dvb modules. Using cat redirected to a file and live555MediaServer works perfectly for at least 12 minutes (the longest test so far). $ cat /dev/video2 > test-video2.ts $ ./live555MediaServer VLC: rtsp://debian:8554/test-video2.ts Reading /dev/video2 directly results in EIO and zero bytes read. $ sudo ./testMPEG2TransportStreamer /dev/video2 Play this stream using the URL "rtsp://192.168.168.24/testStream" Beginning streaming... Beginning to read from file...
ByteStreamFileSource::doReadFromFile: fread returned 0 ByteStreamFileSource::doReadFromFile: ferror = 5 (Input/output error) FYI, using cat piped to testMPEG2TransportStreamer starts well, but begins to continuously lose data after about 2 to 4 minutes. Using setvbuf to increase the buffer to 32k or 64k did not help. $ cat /dev/video2 | sudo ./testMPEG2TransportStreamer stdin VLC: rtsp://debian/testStream Any thoughts as to why using fread on a device file could fail? A v4l2 example capture program uses read() on a device file successfully. Would writing a replacement for ByteStreamFileSource that mimics the v4l2 code be useful? Thank you, Gordon From RGlobisch at csir.co.za Tue Mar 31 21:47:07 2009 From: RGlobisch at csir.co.za (Ralf Globisch) Date: Wed, 01 Apr 2009 06:47:07 +0200 Subject: [Live-devel] End RTSP client session cleanly In-Reply-To: References: Message-ID: <49D30DE6.5DA9.004D.0@csir.co.za> Ross and Patrick, thanks for the info, that's what I needed to know. ;) -- This message is subject to the CSIR's copyright terms and conditions, e-mail legal notice, and implemented Open Document Format (ODF) standard. The full disclaimer details can be found at http://www.csir.co.za/disclaimer.html.