From mailme_vinaytyagi at yahoo.com Tue May 1 03:58:28 2012 From: mailme_vinaytyagi at yahoo.com (Vinay Tyagi) Date: Tue, 1 May 2012 03:58:28 -0700 (PDT) Subject: [Live-devel] SDP file access In-Reply-To: <1334122388.94963.YahooMailNeo@web113403.mail.gq1.yahoo.com> References: <1333970463.50831.YahooMailNeo@web113408.mail.gq1.yahoo.com> <1334122388.94963.YahooMailNeo@web113403.mail.gq1.yahoo.com> Message-ID: <1335869908.84179.YahooMailNeo@web113401.mail.gq1.yahoo.com> Hi, ? Please guide me how to call .sdp sample files in a test program given in testprogs folder. ? Regards, Vinay -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue May 1 14:23:13 2012 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 1 May 2012 14:23:13 -0700 Subject: [Live-devel] SDP file access In-Reply-To: <1335869908.84179.YahooMailNeo@web113401.mail.gq1.yahoo.com> References: <1333970463.50831.YahooMailNeo@web113408.mail.gq1.yahoo.com> <1334122388.94963.YahooMailNeo@web113403.mail.gq1.yahoo.com> <1335869908.84179.YahooMailNeo@web113401.mail.gq1.yahoo.com> Message-ID: > Please guide me how to call .sdp sample files in a test program given in testprogs folder. Each ".sdp" file contains a SDP descriptions for the multicast streams that's sent by the corresponding "test*Streamer" demo application. For example, the file "testMPEG2Transport.sdp" contains a SDP description for the multicast stream that's sent by the "testMPEG2TransportStreamer" demo application. You don't have to use these ".sdp" files. You could instead just run the corresponding "test*Receiver" application, or enable the "test*Streamer" application's built-in RTSP server, and then use a RTSP client to receive the stream. But the ".sdp" files give you another possible way to receive these streams. If you wish, you can use a media player (such as VLC) to open a ".sdp" file, and receive the multicast stream that way. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From csavtche at gmail.com Tue May 1 17:05:25 2012 From: csavtche at gmail.com (Constantin Savtchenko) Date: Tue, 1 May 2012 20:05:25 -0400 Subject: [Live-devel] x264 Video Stream Stops In-Reply-To: References: Message-ID: Hello all, After even further examination of the code, I set Groupsock::DebugLevel = 3. Following is the output once the data stream seems to abruptly stop. Each line appears at around 0.5-1 seconds after each other, and continues endlessly: ??:??:?? Groupsock(376: 192.168.40.131, 49219, 255): read 80 bytes from 192.168.40.131 ??:??:?? Groupsock(376: 192.168.40.131, 49219, 255): wrote 52 bytes, ttl 255 ??:??:?? Groupsock(376: 192.168.40.131, 49219, 255): read 80 bytes from 192.168.40.131 ??:??:?? Groupsock(376: 192.168.40.131, 49219, 255): read 80 bytes from 192.168.40.131 ??:??:?? Groupsock(376: 192.168.40.131, 49219, 255): wrote 28 bytes, ttl 255 ??:??:?? Groupsock(376: 192.168.40.131, 49219, 255): read 60 bytes from 192.168.40.131 ??:??:?? Groupsock(376: 192.168.40.131, 49219, 255): wrote 28 bytes, ttl 255 ??:??:?? Groupsock(376: 192.168.40.131, 49219, 255): read 60 bytes from 192.168.40.131 ??:??:?? Groupsock(376: 192.168.40.131, 49219, 255): wrote 28 bytes, ttl 255 ??:??:?? Groupsock(376: 192.168.40.131, 49219, 255): read 60 bytes from 192.168.40.131 ??:??:?? Groupsock(376: 192.168.40.131, 49219, 255): wrote 28 bytes, ttl 255 ??:??:?? 
Groupsock(376: 192.168.40.131, 49219, 255): read 60 bytes from 192.168.40.131 ??:??:?? Groupsock(376: 192.168.40.131, 49219, 255): wrote 28 bytes, ttl 255 ??:??:?? Groupsock(376: 192.168.40.131, 49219, 255): read 60 bytes from 192.168.40.131 ??:??:?? Groupsock(376: 192.168.40.131, 49219, 255): wrote 28 bytes, ttl 255 Thanks, Constantin On Tue, May 1, 2012 at 6:56 PM, Constantin Savtchenko wrote: > Hello all, > > I will be using Live555 to read in an RTSP xh264 Video stream. I > started with the testRTSPClient. It runs well for 1-10 seconds (variably), > then stops. The program is still running, but it's not getting any new > frames. > > Examination of the incoming packets shows the following pattern --- > packet 1 (2 bytes), packet 2 (~700 bytes), packet 3 (~700 bytes), repeat > packet 1 (2 bytes), packet 4 (~700 bytes), packet 5(~700 bytes). These 2 > recurring bytes have values of 9 and 48. I am extremely unfamiliar with > h264 but from examining the code, I am guessing they are some form of NAL > types? Start of frame or something? > > Anyways, when testRTSPClient stops getting data, it is always right > before the 2 byte packet is expected. Is it possible something is > occurring in doNextFrame1() and thus the afterGettingFrame callback is > never being fired? Any advice would be appreciated. Thank you. > > Constantin S > -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue May 1 18:04:07 2012 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 1 May 2012 18:04:07 -0700 Subject: [Live-devel] x264 Video Stream Stops In-Reply-To: References: Message-ID: <2C2B824B-E85E-478E-88D8-4E6E0C0A2E3A@live555.com> > I will be using Live555 to read in an RTSP xh264 Video stream. What is "xh264"? Or did you just mean to say "h264"? > I started with the testRTSPClient. It runs well for 1-10 seconds (variably), then stops. The program is still running, but it's not getting any new frames. Where is this H.264 RTSP stream coming from? I.e., what hardware/software is generating it? Is the stream accessible via a public "rtsp://" URL? If so, let us know, and we'll take a look at it, to try to figure out what's wrong with it. (But if the stream is not publically accessible, then it's hard to say what might be wrong...) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From warren at etr-usa.com Tue May 1 19:57:07 2012 From: warren at etr-usa.com (Warren Young) Date: Tue, 01 May 2012 20:57:07 -0600 Subject: [Live-devel] RTP/raw UDP stream joining after stream starts In-Reply-To: <851195B3-339D-42B8-8E3D-336482A67375@live555.com> References: <4F9EE557.60209@etr-usa.com> <851195B3-339D-42B8-8E3D-336482A67375@live555.com> Message-ID: <4FA0A283.9020601@etr-usa.com> On 4/30/2012 4:41 PM, Ross Finlayson wrote: > I suspect that the video contains a MPEG "Video Sequence Header" only at > the beginning Good call! I wrote an MPEG start code scanner and it indeed found only one 0x000001B3 in the file. I have the scanner munching on our DV file library now. :) From csavtche at gmail.com Tue May 1 19:28:15 2012 From: csavtche at gmail.com (Constantin Savtchenko) Date: Tue, 1 May 2012 22:28:15 -0400 Subject: [Live-devel] x264 Video Stream Stops In-Reply-To: <2C2B824B-E85E-478E-88D8-4E6E0C0A2E3A@live555.com> References: <2C2B824B-E85E-478E-88D8-4E6E0C0A2E3A@live555.com> Message-ID: Hey Ross, Thank you for the quick response. 
Yes I meant h264, but I think I put it together with the x264 encoder in my mind while I wrote the email... It has been a long day. The stream is coming from a gstreamer server, written by a colleague. As of right now it is not publicly accessible. I was hoping to get some advice or suggestions, as opposed to the answer on a silver platter. Some thoughts and questions I had: 1) Does my second email have any significance. What is Live555 writing to the socket now and then in the form of 60 bytes. Note that this is after the RTP connection has been negotiated through RTSP. Is this a KeepAlive type mechanism? After examining the code, and doing some light reading, it seems a KeepAlive might be needed as the communication between the server and my testRTSPclient is over UDP. This brings me to my second thought. 2) Do I need a KeepAlive mechanism? Do I implement it with RTSP client's SendGetParameter call with a NULL body? Or does Live555 do this for us. (I dug through the source code but I couldn't find where that 60 bytes is sent, could you actually direct me to the class?) 3) If you were to speculate, and assume that the server is working correctly encoding h264 packets to RTP, what are the consistent 60 bytes reads? (As a sidenote, I am unsure the server is correct, I will sit down tomorrow with my colleague to review.) I hope this wasn't too much at once. I appreciate all the work you've done with the library. Regards, Constantin On Tue, May 1, 2012 at 9:04 PM, Ross Finlayson wrote: > I will be using Live555 to read in an RTSP xh264 Video stream. >> > > What is "xh264"? Or did you just mean to say "h264"? > > > I started with the testRTSPClient. It runs well for 1-10 seconds >> (variably), then stops. The program is still running, but it's not getting >> any new frames. >> > > Where is this H.264 RTSP stream coming from? I.e., what hardware/software > is generating it? > > Is the stream accessible via a public "rtsp://" URL? If so, let us know, > and we'll take a look at it, to try to figure out what's wrong with it. > (But if the stream is not publically accessible, then it's hard to say > what might be wrong...) > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue May 1 23:33:00 2012 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 1 May 2012 23:33:00 -0700 Subject: [Live-devel] x264 Video Stream Stops In-Reply-To: References: <2C2B824B-E85E-478E-88D8-4E6E0C0A2E3A@live555.com> Message-ID: <0296C7FE-6C14-4CFB-856E-A8DFD7A2B37A@live555.com> > 1) Does my second email have any significance. What is Live555 writing to the socket now and then in the form of 60 bytes. RTCP "Reception Report" ("RR") packets. But anyway, because your colleague is not using our software to develop your server, I can't help you with it. I suggest that you instead use our "testOnDemandRTSPServer" (or "live555MediaServer") - reading from a H.264 file - as an example of a server that works properly. Rest assured, however, that our "testRTSPClient" application is working properly. > 2) Do I need a KeepAlive mechanism? Do I implement it with RTSP client's SendGetParameter call with a NULL body? Or does Live555 do this for us. Yes, "testRTSPClient" does this automatically (by sending RTCP "RR" packets). 
Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From csavtche at gmail.com Wed May 2 13:49:12 2012 From: csavtche at gmail.com (Constantin Savtchenko) Date: Wed, 2 May 2012 16:49:12 -0400 Subject: [Live-devel] Returning Subsession/Session in Response Handlers Message-ID: Hey All, I had a design question. It seems that RequestRecords are contained and held until the Response comes in, at which point the "foundRequest" is used to call the handler. How hard would it be to pass the fSubsession or fSession that was passed into the request during the fHandler call? I am interested in the design consideration here. Thank you. Constantin -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed May 2 18:03:23 2012 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 2 May 2012 18:03:23 -0700 Subject: [Live-devel] Returning Subsession/Session in Response Handlers In-Reply-To: References: Message-ID: > I had a design question. It seems that RequestRecords are contained and held until the Response comes in, at which point the "foundRequest" is used to call the handler. How hard would it be to pass the fSubsession or fSession that was passed into the request during the fHandler call? I am interested in the design consideration here. Thank you. As a general design principle, functions should not have unnecessary (or redundant) parameters. Because each "RTSPClient" is for a single stream only, and because each RTSP command will be processed and replied to in order, a "RTSPClient" will always be able to figure out which "ServerMediaSession" (or "ServerMediaSubsession") it needs to use when processing the response handler function. Therefore passing them as parameters to the response handler would be unnecessary. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From trn200190 at gmail.com Thu May 3 00:51:36 2012 From: trn200190 at gmail.com (i m what i m ~~~~) Date: Thu, 3 May 2012 13:21:36 +0530 Subject: [Live-devel] please tell Message-ID: Hi Everyone, I made a client for the LIVE555 media server using Silverlight (TCP socket and network stream). I am able to read the first chunk (the response to the GET request), which contains info like:- HTTP/1.1 200 OK\r\nDate: Thu, May 03 2012 07:47:59 GMT\r\nServer: LIVE555 Streaming Media v2011.11.29\r\nLast-Modified: Mon, Mar 26 2012 07:22:03 GMT\r\nContent-Length: 6073\r\nContent-Type: application/vnd.apple.mpegurl\r\n\r\n" After this I am able to read 6073 bytes, then I receive nothing; I am streaming a 5MB file through the server. Please help to resolve this issue... -------------- next part -------------- An HTML attachment was scrubbed... URL: From bruno.abreu at livingdata.pt Thu May 3 02:33:57 2012 From: bruno.abreu at livingdata.pt (Bruno Abreu) Date: Thu, 03 May 2012 10:33:57 +0100 Subject: [Live-devel] StreamReplicator bug deactivating a replica? In-Reply-To: References: <20120430173439.M10739@livingdata.pt> Message-ID: <4FA25105.4080107@livingdata.pt> Hello Ross, sorry for the late reply. I'm afraid replacing the 2 "for" loops with the code you provided generates a segmentation fault in our app. Of course, it may be something in our application so I'll run more tests today and will provide you with more information as soon as I can. Thank you for your attention. 
Bruno Abreu On 05/01/2012 02:32 AM, Ross Finlayson wrote: > Thanks for the report. Yes, there is a bug in that code, although it > wasn't quite what you thought. Try replacing those two "for" loops with > the following: > > for (StreamReplica* r1 = fReplicasAwaitingCurrentFrame; r1 != NULL;) { > if (r1 == replicaBeingDeactivated) { > if (r1 == fReplicasAwaitingCurrentFrame) fReplicasAwaitingCurrentFrame = > r1->fNext; > r1 = r1->fNext; > replicaBeingDeactivated->fNext = NULL; > break; > } else { > r1 = r1->fNext; > } > } > for (StreamReplica* r2 = fReplicasAwaitingNextFrame; r2 != NULL;) { > if (r2 == replicaBeingDeactivated) { > if (r2 == fReplicasAwaitingNextFrame) fReplicasAwaitingNextFrame = > r2->fNext; > r2 = r2->fNext; > replicaBeingDeactivated->fNext = NULL; > break; > } else { > r2 = r2->fNext; > } > } > } > > If you find any problem with this, then please let us know ASAP. > Otherwise I'll release a new version of the code with this change. > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > !DSPAM:500,4f9f3e21140493728916153! > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > > !DSPAM:500,4f9f3e21140493728916153! -- Living Data - Sistemas de Informa??o e Apoio ? Decis?o, Lda. Rua Lu?s de Cam?es, N? 133, 1? B Phone: +351 213622163 1300-357 LISBOA Fax: +351 213622165 Portugal URL: www.livingdata.pt From michel.promonet at thalesgroup.com Thu May 3 09:16:11 2012 From: michel.promonet at thalesgroup.com (PROMONET Michel) Date: Thu, 3 May 2012 18:16:11 +0200 Subject: [Live-devel] OutPacketBuffer::maxsize and multithreading Message-ID: <13013_1336061818_4FA2AF7A_13013_8894_1_1BE8971B6CFF3A4F97AF4011882AA2550155FBD52B16@THSONEA01CMS01P.one.grp> Hi Ross, It seems that we meet, times to times, a problem related to the static variable OutPacketBuffer::maxsize between multiple live555 environment running in different threads. Basically it seems that creating a RTCPClient change the static OutPacketBuffer::maxsize, but if a RTSPSink is created the static is not set to the value we expect. Do you think this scenario could occurs ? Thanks in advance for your support. Best Regards, Michel. [@@THALES GROUP RESTRICTED@@] -------------- next part -------------- An HTML attachment was scrubbed... URL: From michel.promonet at thalesgroup.com Thu May 3 09:18:50 2012 From: michel.promonet at thalesgroup.com (PROMONET Michel) Date: Thu, 3 May 2012 18:18:50 +0200 Subject: [Live-devel] OutPacketBuffer::maxsize and multithreading Message-ID: <6756_1336061978_4FA2B01A_6756_16537_1_1BE8971B6CFF3A4F97AF4011882AA2550155FBD52B27@THSONEA01CMS01P.one.grp> Hi Ross, It seems that we meet, times to times, a problem related to the static variable OutPacketBuffer::maxsize between multiple live555 environment running in different threads. Basically it seems that creating a RTCPClient change the static OutPacketBuffer::maxsize, but if a RTSPSink is created the static is not set to the value we expect. Do you think this scenario could occurs ? Thanks in advance for your support. Best Regards, Michel. [@@THALES GROUP RESTRICTED@@] -------------- next part -------------- An HTML attachment was scrubbed... 
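For context on the global being asked about here: "OutPacketBuffer::maxSize" is, in the library versions of this era, a single static variable shared by every "UsageEnvironment" in the process, which is why two threads that each want a different value can step on each other. A common interim workaround, shown below only as a sketch (the 300000 figure is an arbitrary example, not taken from the thread), is to pick the largest value any stream needs and assign it once at startup, before any sinks are created:

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"

int main() {
  // Process-wide setting: MultiFramedRTPSink (and its subclasses) size their
  // output packet buffers from this static, so assign it once, up front,
  // rather than from several threads with different values.
  OutPacketBuffer::maxSize = 300000; // bytes; large enough for the biggest expected frame

  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  // ... create RTSP clients/servers and their sinks as usual ...

  return 0;
}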
URL: From finlayson at live555.com Thu May 3 22:11:49 2012 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 3 May 2012 22:11:49 -0700 Subject: [Live-devel] OutPacketBuffer::maxsize and multithreading In-Reply-To: <6756_1336061978_4FA2B01A_6756_16537_1_1BE8971B6CFF3A4F97AF4011882AA2550155FBD52B27@THSONEA01CMS01P.one.grp> References: <6756_1336061978_4FA2B01A_6756_16537_1_1BE8971B6CFF3A4F97AF4011882AA2550155FBD52B27@THSONEA01CMS01P.one.grp> Message-ID: (First, please do not post the same question to the list multiple times.) Yes, this is an issue, and is a good illustration why it's ill-advised to structure a LIVE555-based application using multiple threads. However, in some future release, I'll make the "OutPacketBuffer::maxSize" variable a per-UsageEnvironment value instead of a global variable; this will overcome this problem. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From mwright3 at slb.com Fri May 4 07:37:03 2012 From: mwright3 at slb.com (Martin Wright) Date: Fri, 4 May 2012 14:37:03 +0000 Subject: [Live-devel] raw PCM from TCP socket to RTP In-Reply-To: <96345891-60FC-4035-B5C9-F4DC4F8AC5AE@live555.com> References: <96345891-60FC-4035-B5C9-F4DC4F8AC5AE@live555.com> Message-ID: Hi Ross I know you won't be able to debug my code but can you say if I'm going in the right direction or if there is anything obviously missing from this description? I have written PCMSource.cpp, based on DeviceSource.cpp, in which createNew() creates a socket and connects to the remote endpoint, and doGetNextFrame() does the recv() and then calls deliverFrame() to do the memmove(fTo, receivedData). PCMStreamServerMediaSubsession.cpp is based on WAVAudioFileServerMediaSubsession.cpp but the createNewStreamSource() can hardcode the bits/sample, sampling frequency and number of channels information that would normally come from the .wav file header. These new files compile on CentOS 5. Adding an extra section to testOnDemandRTSPServer.cpp to create a stream name allows me to enter the url and reproduce the speech from the original mp3 podcast I'm using to test this with VLC from a Win 7 workstation. Nice; the raw samples are being converted to a recognisable RTP stream. Unfortunately, I am only getting the first second of a minute's worth of audio data. openRTSP -Q rtsp://134.32.45.50:8554/pcmAudioTest shows that a variable number of packets are being lost in a sequence of tests. I haven't done anything with the TaskScheduler code. I have changed MediaSink.cpp to set OutPacketBuffer::maxSize appropriately for the amount of data I am sending. Thanks Martin From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: 25 April 2012 15:45 To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] DeviceSource.cpp socket implementation I'm wondering if anybody has a simple socket reading version of the testOnDemandRTSPServer program. What you really mean to ask for is "a subclass of the 'OnDemandServerMediaSubsession' class that reads from a socket". There is such a class that you can use as a model: "MPEG2TransportUDPServerMediaSubsession". Note also how this is used in "testOnDemandRTSPServer". 
If, however, you are asking about reading from a *stream* (i.e., TCP) socket, rather than from a datagram socket, then you should be able to convert the socket into a "FILE*" (assuming that you're not using Windoze), and instead use "WAVAudioFileServerMediaSubsession" as a model for your new "OnDemandServerMediaSubsession" subclass. (Either way, you will need to write your own "OnDemandServerMediaSubsession" subclass.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri May 4 13:18:54 2012 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 4 May 2012 13:18:54 -0700 Subject: [Live-devel] raw PCM from TCP socket to RTP In-Reply-To: References: <96345891-60FC-4035-B5C9-F4DC4F8AC5AE@live555.com> Message-ID: Martin, First, because you're delivering PCM audio, where the frames are very small, you shouldn't need to modify "OutPacketBuffer::maxSize". (That variable needs to be increased only when you are transmitting media with exceptionally large frames (larger than the default value of 60000 bytes). Because that's not the case for you, I don't recommend changing this value - at least not until you fix your other problems.) > I have written PCMSource.cpp, based on DeviceSource.cpp, in which createNew() creates a socket and connects to the remote endpoint, and doGetNextFrame() does the recv() and then calls deliverFrame() to do the memmove(fTo, receivedData). > > PCMStreamServerMediaSubsession.cpp is based on WAVAudioFileServerMediaSubsession.cpp but the createNewStreamSource() can hardcode the bits/sample, sampling frequency and number of channels information that would normally come from the .wav file header. > > These new files compile on CentOS 5. > > Adding an extra section to testOnDemandRTSPServer.cpp to create a stream name allows me to enter the url and reproduce the speech from the original mp3 podcast I?m using to test this with VLC from a Win 7 workstation. Nice; the raw samples are being converted to a recognisable RTP stream. Unfortunately, I am only getting the first second of a minute?s worth of audio data. I suggest that you begin by ensuring that your "PCMSource" class is delivering 'correct' data to its downstream object. To do this, I suggest that - rather than starting with a RTSP server (which is a complex application) - you begin by writing an application that feeds a "PCMSource" into a "FileSink". I.e., write a simple application that - creates a "FileSink" - creates a "PCMSource" - calls "FileSink" -> startPlaying("PCMSource", ...) - calls "doEventLoop();" to enter the LIVE555 event loop. Once you've done this, you should be able to look at the output file to figure out whether or not your "PCMSource" is delivering correct data. (Perhaps also add a WAV header to the beginning of the file, and try playing it.) If your "PCMSource" seems to be delivering correct data, then you should next make sure that it's the right size. Because your "PCMSource" will be delivering into a "RTPSink" (subclass) - i.e., packing the data into outgoing RTP packets - you should make sure that "fFrameSize" is large enough for each delivery. Don't just deliver one audio sample at a time. Instead, deliver as much data as you can, up to the limit of "fMaxSize". Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
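A minimal sketch of the test harness described above, assuming "PCMSource" is Martin's own DeviceSource-style class (its header name and createNew() signature here are only illustrative), might look like this:

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"
#include "PCMSource.hh" // the poster's own class; not part of the library

char eventLoopWatchVariable = 0;

static void afterPlaying(void* /*clientData*/) {
  eventLoopWatchVariable = 1; // stop the event loop once the source closes
}

int main() {
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  // Hypothetical factory call - adapt to however PCMSource is actually constructed:
  FramedSource* source = PCMSource::createNew(*env);

  // Dump whatever the source delivers straight to a file for inspection:
  FileSink* sink = FileSink::createNew(*env, "test.pcm");
  sink->startPlaying(*source, afterPlaying, NULL);

  env->taskScheduler().doEventLoop(&eventLoopWatchVariable);
  return 0;
}

If the resulting "test.pcm" (optionally prefixed with a WAV header) plays back cleanly, the source is delivering correct data, and attention can then turn to the fFrameSize/fMaxSize sizing that Ross mentions.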
URL: From shiyong.zhang.cn at gmail.com Fri May 4 08:58:03 2012 From: shiyong.zhang.cn at gmail.com (=?GB2312?B?1cXKwNPC?=) Date: Fri, 4 May 2012 23:58:03 +0800 Subject: [Live-devel] A bug when interpreting RTP extend packet. Message-ID: Hi Here is maybe a bug when interpret RTP extend packet. Version live.2012.05.03 File: MultiFrameRTPSource.cpp line 224: void MultiFrameRTPSource::networkReadHandler1() { line 225: ...... ...... line 268: unsigned remExtSize = 4*(extHdr&0xFFFF); Pls pay attension to *line 268.* Here you calculated RTP packet extension size with extHdr length directly, but it should be network bytes order, need to be transfered to host byte order firstly. E.g: unsigned remExtSize = 4 * *ntohs*(extHdr & 0xFFFF); How do you think ? Br Shiyong Zhang -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri May 4 14:16:32 2012 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 4 May 2012 14:16:32 -0700 Subject: [Live-devel] A bug when interpreting RTP extend packet. In-Reply-To: References: Message-ID: > line 268: unsigned remExtSize = 4*(extHdr&0xFFFF); > Pls pay attension to line 268. Here you calculated RTP packet extension size with extHdr length directly, > but it should be network bytes order, need to be transfered to host byte order firstly. > > E.g: unsigned remExtSize = 4 * ntohs(extHdr & 0xFFFF); > > How do you think ? No, because we already converted the data to host byte order when we assigned the "extHdr" field, in the previous line (line 267): unsigned extHdr = ntohl(*(u_int32_t*)(bPacket->data())); ADVANCE(4); Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From ricardo at kafemeeting.com Mon May 7 07:08:49 2012 From: ricardo at kafemeeting.com (Ricardo Acosta) Date: Mon, 7 May 2012 16:08:49 +0200 Subject: [Live-devel] increaseReceiveBufferTo or setReceiveBuffersize double buffer Size in BasicUDPSource class Message-ID: Hi Ross We are calling the class BasicUDPSource in our code, when we use increaseReceiveBufferTo or setReceiveBufferTo we got double the size that we put. Just to be sure we printed out before and after doing the increase/set. std::cerr << "Buffer size before " << getReceiveBufferSize(*env,udpGroupSock->socketNum()) << std::endl; setReceiveBufferTo(*env,udpGroupSock->socketNum(),500*1024); std::cerr << "Buff' size after " << getReceiveBufferSize(*env,udpGroupSock->socketNum()) << std::endl; We got the following informations Buffer size before 126976 (In fact this amount is more than the standard size of 50 x 1024 declared in the BasicUDPSource class) Buffer size after 1024000 which is the double of 500K We are doing something wrong ? Since we are having some artefacts in our video, we dont know if the buffer is cutting up some packets. Thank you Ricardo -------------- next part -------------- An HTML attachment was scrubbed... URL: From trn200190 at gmail.com Mon May 7 05:02:03 2012 From: trn200190 at gmail.com (i m what i m ~~~~) Date: Mon, 7 May 2012 17:32:03 +0530 Subject: [Live-devel] http problem Message-ID: i got some raw h264 files from the link http://www.live555.com/liveMedia/public/264/ and converted it into .ts and .tsx and tryed to stream it using live media server using RTSP tunneling over HTTP,my vlc player doesn't play the files when i stream it using server.My vlc player is able to play the .ts files if i play them them individually .i.e.not streaming through live media server. 
What is the problem.??? -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon May 7 16:25:01 2012 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 7 May 2012 16:25:01 -0700 Subject: [Live-devel] http problem In-Reply-To: References: Message-ID: > i got some raw h264 files from the link http://www.live555.com/liveMedia/public/264/ and converted it into .ts and .tsx and tryed to stream it using live media server using RTSP tunneling over HTTP,my vlc player doesn't play the files when i stream it using server.My vlc player is able to play the .ts files if i play them them individually .i.e.not streaming through live media server. > What is the problem.??? At least some versions of VLC have a problem receiving RTSP/RTP streams tunneled over HTTP. (Unfortunately, whenever I mention this to the VLC folks, they're not able to reproduce the problem themselves, so the problem doesn't get fixed :-( However, the server still supports this OK; you can verify this by running the "openRTSP" client with the "-T " option. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon May 7 16:55:38 2012 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 7 May 2012 16:55:38 -0700 Subject: [Live-devel] increaseReceiveBufferTo or setReceiveBuffersize double buffer Size in BasicUDPSource class In-Reply-To: References: Message-ID: <67E1DDAE-D0C1-4327-A7E8-3BE4890AD42A@live555.com> > We are doing something wrong ? I don't think so. "getReceiveBufferSize()" does, indeed, return the actual OS internal buffer size (obtained by calling "getsockopt(..., SOL_SOCKET, ...)".) > Since we are having some artefacts in our video, we dont know if the buffer is cutting up some packets. Sometimes network packet loss really happens. UDP is not TCP. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From trn200190 at gmail.com Tue May 8 01:19:11 2012 From: trn200190 at gmail.com (i m what i m ~~~~) Date: Tue, 8 May 2012 01:19:11 -0700 Subject: [Live-devel] http problem1 Message-ID: Hey Ross Can i know which version of vlc player all the h264 files that are converted into ts through the live media h264 to transport stream program??? Is there any other player that support the above issue? Thanks Tarun -------------- next part -------------- An HTML attachment was scrubbed... URL: From ricardo at kafemeeting.com Tue May 8 06:02:12 2012 From: ricardo at kafemeeting.com (Ricardo Acosta) Date: Tue, 8 May 2012 15:02:12 +0200 Subject: [Live-devel] increaseReceiveBufferTo or setReceiveBuffersize double buffer Size in BasicUDPSource class In-Reply-To: <67E1DDAE-D0C1-4327-A7E8-3BE4890AD42A@live555.com> References: <67E1DDAE-D0C1-4327-A7E8-3BE4890AD42A@live555.com> Message-ID: > > I don't think so. "getReceiveBufferSize()" does, indeed, return the > actual OS internal buffer size (obtained by calling "getsockopt(..., > SOL_SOCKET, ...)".) > > OK, But If buffer's size is 126976 (standard size before any increaseReceiveBuffer or setReceiveBuffer) and we made setReceiveBufferTo(*env,udpGroupSock->socketNum(),500*1024); Final buffer's size should be 500*1024 = 512.000 not 1.024.000 as we had when we ask "getReceiveBufferSize(*env,udpGroupSock->socketNum()) " Thank you -------------- next part -------------- An HTML attachment was scrubbed... 
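One note on the 'doubling' observed here: on Linux, the kernel doubles whatever value is set with SO_RCVBUF to leave room for its own bookkeeping overhead (see socket(7)), so getsockopt() reporting roughly twice the requested size is expected and does not by itself mean packets are being truncated. The setReceiveBufferTo()/getReceiveBufferSize() helpers used in the thread are thin wrappers over those socket options, so they show the same effect. A standalone sketch of the underlying OS behaviour (plain sockets, no LIVE555 involved):

#include <stdio.h>
#include <sys/socket.h>
#include <unistd.h>

int main() {
  int fd = socket(AF_INET, SOCK_DGRAM, 0);

  int requested = 500 * 1024;
  setsockopt(fd, SOL_SOCKET, SO_RCVBUF, &requested, sizeof requested);

  int actual = 0;
  socklen_t len = sizeof actual;
  getsockopt(fd, SOL_SOCKET, SO_RCVBUF, &actual, &len);

  // On Linux this typically prints twice the requested size.
  printf("requested %d bytes, kernel reports %d bytes\n", requested, actual);

  close(fd);
  return 0;
}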
URL: From michel.promonet at thalesgroup.com Wed May 9 01:48:07 2012 From: michel.promonet at thalesgroup.com (PROMONET Michel) Date: Wed, 9 May 2012 10:48:07 +0200 Subject: [Live-devel] OutPacketBuffer::maxsize and multithreading In-Reply-To: References: <6756_1336061978_4FA2B01A_6756_16537_1_1BE8971B6CFF3A4F97AF4011882AA2550155FBD52B27@THSONEA01CMS01P.one.grp> Message-ID: <2453_1336553346_4FAA2F82_2453_4765_1_1BE8971B6CFF3A4F97AF4011882AA2550155FBE30520@THSONEA01CMS01P.one.grp> Hi Ross, Sorry for double post, this is due to mishunderstandin between live-devel at lists.live555.com and live-devel at ns.live555.com. It seems both point to same mailing lists. We will wait for per-UsageEnviroment implementation of OutPacketBuffer::maxSize. Thanks, Michel. [@@THALES GROUP RESTRICTED@@] De : live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] De la part de Ross Finlayson Envoy? : vendredi 4 mai 2012 07:12 ? : LIVE555 Streaming Media - development & use Objet : Re: [Live-devel] OutPacketBuffer::maxsize and multithreading (First, please do not post the same question to the list multiple times.) Yes, this is an issue, and is a good illustration why it's ill-advised to structure a LIVE555-based application using multiple threads. However, in some future release, I'll make the "OutPacketBuffer::maxSize" variable a per-UsageEnvironment value instead of a global variable; this will overcome this problem. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From trn200190 at gmail.com Wed May 9 01:38:38 2012 From: trn200190 at gmail.com (i m what i m ~~~~) Date: Wed, 9 May 2012 14:08:38 +0530 Subject: [Live-devel] testh264streamer error Message-ID: Hey Ross just tell me one thing i am streaming a file through "testh264VideoStreamer.cpp",when i receive open the network stream through vlc player or mplayer i am getting the following responses:- mplayer response:- Playing rtsp://192.168.0.1/testStream. Connecting to server 192.168.0.1[192.168.0.1]: 554... librtsp: server responds: 'RTSP/1.0 400 Bad Request' STREAM_LIVE555, URL: rtsp://192.168.0.1/testStream Stream not seekable! file format detected. ID_VIDEO_ID=0 Initiated "video/MP2T" RTP subsession on port 18888 No Transport Stream sync byte in data.Stream url is not set! Invalid seek to negative position fffffffffffffff8! Invalid seek to negative position ffffffffffffffff! Stream url is not set! 
Vlc player response:- main debug: adding item `rtsp://192.168.0.1/testStream' ( rtsp:// 192.168.0.1/testStream ) qt4 debug: Adding a new MRL to recent ones: rtsp://192.168.0.1/testStream main debug: rebuilding array of current - root Playlist main debug: rebuild done - 2 items, index 0 main debug: processing request item: rtsp://192.168.0.1/testStream, node: null, skip: 0 main debug: no fetch required for (null) (art currently (null)) main debug: resyncing on rtsp://192.168.0.1/testStream main debug: rtsp://192.168.0.1/testStream is at 1 main debug: starting playback of the new playlist item main debug: creating new input thread main debug: Creating an input for 'rtsp://192.168.0.1/testStream' main debug: using timeshift granularity of 50 MiB, in path 'C:\DOCUME~1\IISU05\LOCALS~1\Temp' main debug: `rtsp://192.168.0.1/testStream' gives access `rtsp' demux `' path `192.168.0.1/testStream' main debug: creating demux: access='rtsp' demux='' location=' 192.168.0.1/testStream' file='\\192.168.0.1\testStream' main debug: looking for access_demux module: 1 candidate live555 debug: version 2011.12.23 qt4 debug: IM: Setting an input live555 debug: RTP subsession 'video/MP2T' live555 error: SETUP of'video/MP2T' failed 461 Unsupported Transport live555 debug: setup start: 0.000000 stop:0.000000 live555 error: Nothing to play for rtsp://192.168.0.1/testStream main debug: no access_demux module matching "rtsp" could be loaded main debug: TIMER module_need() : 471.120 ms - Total 471.120 ms / 1 intvls (Avg 471.120 ms) main debug: creating access 'rtsp' location='192.168.0.1/testStream', path='\\192.168.0.1\testStream' main debug: looking for access module: 1 candidate main debug: net: connecting to 192.168.0.1 port 554 main debug: connection succeeded (socket = 1656) access_realrtsp debug: rtsp connected access_realrtsp warning: only real/helix rtsp servers supported for now main debug: no access module matching "rtsp" could be loaded main debug: TIMER module_need() : 1001.017 ms - Total 1001.017 ms / 1 intvls (Avg 1001.017 ms) main error: open of `rtsp://192.168.0.1/testStream' failed main debug: finished input main debug: dead input main debug: changing item without a request (current 1/2) main debug: nothing to play qt4 debug: IM: Deleting the input main debug: TIMER input launching for 'rtsp://192.168.0.1/testStream' : 1484.100 ms - Total 1484.100 ms / 1 intvls (Avg 1484.100 ms) *what might be happening and what should i do to resolve.* *Actually i am on a one to one network(i have 2 machines in one to one network one one machine i am streaming and on other i am receiving).* -------------- next part -------------- An HTML attachment was scrubbed... URL: From sudan at koperasw.com Wed May 9 07:13:02 2012 From: sudan at koperasw.com (Sudan Landge - Kopera) Date: Wed, 09 May 2012 19:43:02 +0530 Subject: [Live-devel] [testH264VideoStreamer] removing schedulerTickTask and incomingReportHandler Message-ID: <4FAA7B6E.9080105@koperasw.com> Hi Ross, I am taking reference of the sample app testH264VideoStreamer. I found that there are three processes are scheduled for this app which are: -> schedulerTickTask -> RTCPInstance -> MultiFramedRTPSink My requirement is only of getting h264 data and streaming it, so I did a little experiment: I commented the call to RTCPInstance::incomingConnectionHandler1, I commented scheduling of schedulerTickTask and RTCPInstance::onExpire After doing this I compiled the code and tested it using vlc on the client side. It worked as before. 
So now my question is that: if these tasks are unnecessary, can I remove them? if not (i.e. if these tasks are required) then, could you please brief me on the impact, by doing the above changes? Thanks and regards, Sudan From finlayson at live555.com Wed May 9 13:44:35 2012 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 9 May 2012 13:44:35 -0700 Subject: [Live-devel] testh264streamer error In-Reply-To: References: Message-ID: > live555 error: SETUP of'video/MP2T' failed 461 Unsupported Transport That's your problem. The server is returning this error because your client asked to receive a multicast stream via RTP-over-TCP. You can't do this for multicast streams. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed May 9 13:51:24 2012 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 9 May 2012 13:51:24 -0700 Subject: [Live-devel] [testH264VideoStreamer] removing schedulerTickTask and incomingReportHandler In-Reply-To: <4FAA7B6E.9080105@koperasw.com> References: <4FAA7B6E.9080105@koperasw.com> Message-ID: <7DF1B6E3-20A7-4D95-9EDD-DC95140B5C65@live555.com> > My requirement is only of getting h264 data and streaming it, so I did a little experiment: > I commented the call to RTCPInstance::incomingConnectionHandler1, > I commented scheduling of schedulerTickTask and RTCPInstance::onExpire Our code does not contain any function named "incomingConnectionHandler1", or "schedulerTickTask". You appear to be using an old version of the code - which we do not support. Please upgrade to the latest version of the code, and ask your question again. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From warren at etr-usa.com Wed May 9 15:59:09 2012 From: warren at etr-usa.com (Warren Young) Date: Wed, 09 May 2012 16:59:09 -0600 Subject: [Live-devel] http problem1 In-Reply-To: References: Message-ID: <4FAAF6BD.2060800@etr-usa.com> On 5/8/2012 2:19 AM, i m what i m ~~~~ wrote: > > Can i know which version of vlc player all the h264 files that are > converted into ts through the live media h264 to transport stream program??? Please rephrase this in English. (I know you think you wrote English, but those of us who speak it fluently disagree.) And, I think you should probably do so on the VLC list, because I don't see anything in what you asked that's Live555 specific. From sudan at koperasw.com Wed May 9 23:25:43 2012 From: sudan at koperasw.com (Sudan Landge - Kopera) Date: Thu, 10 May 2012 11:55:43 +0530 Subject: [Live-devel] [testH264VideoStreamer] removing schedulerTickTask and incomingReportHandler Message-ID: <4FAB5F67.7050008@koperasw.com> Hello Ross, Thanks for your reply. I am sorry for the typo; In the mail below, I meant incomingConnectionHandlerRTSP1 and not incomingConnectionHandler1 which is defined in liveMedia/RTSPServer.cpp (liveMedia/include/RTSPServer.hh) and schedulerTickTask is defined in BasicUsageEnvironment/BasicTaskScheduler.cpp I had downloaded the code from http://www.live555.com/liveMedia/public/ and the code has LIVEMEDIA_LIBRARY_VERSION_STRING "2012.04.18". If I am using a wrong version of the code could you please point me to the latest version? 
Thanks and regards, Sudan Subject: Re: [Live-devel] [testH264VideoStreamer] removing schedulerTickTask and incomingReportHandler Date: Wed, 9 May 2012 13:51:24 -0700 From: Ross Finlayson Reply-To: LIVE555 Streaming Media - development & use To: LIVE555 Streaming Media - development & use > My requirement is only of getting h264 data and streaming it, so I did > a little experiment: > I commented the call to RTCPInstance::incomingConnectionHandler1, > I commented scheduling of schedulerTickTask and RTCPInstance::onExpire Our code does not contain any function named "incomingConnectionHandler1", or "schedulerTickTask". You appear to be using an old version of the code - which we do not support. Please upgrade to the latest version of the code, and ask your question again. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed May 9 23:49:09 2012 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 9 May 2012 23:49:09 -0700 Subject: [Live-devel] [testH264VideoStreamer] removing schedulerTickTask and incomingReportHandler In-Reply-To: <4FAB5F67.7050008@koperasw.com> References: <4FAB5F67.7050008@koperasw.com> Message-ID: > I am sorry for the typo; In the mail below, I meant incomingConnectionHandlerRTSP1 and not incomingConnectionHandler1 which is defined in liveMedia/RTSPServer.cpp (liveMedia/include/RTSPServer.hh) OK. > schedulerTickTask is defined in BasicUsageEnvironment/BasicTaskScheduler.cpp Yes, my mistake. > I had downloaded the code from http://www.live555.com/liveMedia/public/ and the code has LIVEMEDIA_LIBRARY_VERSION_STRING "2012.04.18". > If I am using a wrong version of the code could you please point me to the latest version? The latest version of the code is "2012.05.03" - but that's irrelevant to your particular question. Anyway - OF COURSE you shouldn't disable "incomingConnectionHandlerRTSP1()" or "schedulerTickTask()". Why would you even think about doing this?? Those functions weren't just put there on a whim. Once again, a reminder that if you modify the supplied source code, you are much less likely to get any support on this mailing list. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From sidprice at softtools.com Thu May 10 09:55:31 2012 From: sidprice at softtools.com (Sid Price) Date: Thu, 10 May 2012 10:55:31 -0600 Subject: [Live-devel] Archive access Message-ID: <015c01cd2ecd$b7533770$25f9a650$@softtools.com> Hello, I am just starting to evaluate live555 and would like to know if there is a way to access the list archives so I can check if my questions have already been answered before I ask here. Thanks in advance, Sid. -------------- next part -------------- An HTML attachment was scrubbed... URL: From bstump at codemass.com Thu May 10 12:05:11 2012 From: bstump at codemass.com (Barry Stump) Date: Thu, 10 May 2012 12:05:11 -0700 Subject: [Live-devel] Archive access In-Reply-To: <015c01cd2ecd$b7533770$25f9a650$@softtools.com> References: <015c01cd2ecd$b7533770$25f9a650$@softtools.com> Message-ID: > > I am just starting to evaluate live555 and would like to know if there is > a way to access the list archives so I can check if my questions have > already been answered before I ask here. 
> The archives are available here: http://lists.live555.com/pipermail/live-devel/ The easiest way to search them is to use "site:lists.live555.com" along with your search term in Google. -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu May 10 12:12:39 2012 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 10 May 2012 12:12:39 -0700 Subject: [Live-devel] Archive access In-Reply-To: <015c01cd2ecd$b7533770$25f9a650$@softtools.com> References: <015c01cd2ecd$b7533770$25f9a650$@softtools.com> Message-ID: <4EFD43C5-7AED-4304-864C-ED522C439864@live555.com> > I am just starting to evaluate live555 and would like to know if there is a way to access the list archives so I can check if my questions have already been answered before I ask here. Yes, the archives are available at http://lists.live555.com/pipermail/live-devel/ (You can also search these archives using Google, by adding "site:lists.live555.com" to your search query.) Also, of course, don't forget to review the FAQ. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From enadler at WatchGuardVideo.com Thu May 10 12:22:00 2012 From: enadler at WatchGuardVideo.com (Eric Nadler) Date: Thu, 10 May 2012 14:22:00 -0500 Subject: [Live-devel] Archive access In-Reply-To: <015c01cd2ecd$b7533770$25f9a650$@softtools.com> Message-ID: You can use Google search with ?site:lists.live555.com?. -- Sincerely, Eric Nadler Senior Software Engineer WatchGuard Video (469) 867-3735 Mobile (972) 423-9778 Fax ENadler at WatchGuardVideo.com http://www.WatchGuardVideo.com Visit our New Blog! 415 Century Parkway, Allen, TX 75013 On 5/10/12 11:55 AM, "Sid Price" wrote: > Hello, > > I am just starting to evaluate live555 and would like to know if there is a > way to access the list archives so I can check if my questions have already > been answered before I ask here. > > Thanks in advance, > Sid. > > > > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From bruno.abreu at livingdata.pt Thu May 10 19:55:00 2012 From: bruno.abreu at livingdata.pt (Bruno Abreu) Date: Fri, 11 May 2012 03:55:00 +0100 Subject: [Live-devel] StreamReplicator bug deactivating a replica? In-Reply-To: References: <20120430173439.M10739@livingdata.pt> Message-ID: <4FAC7F84.9090109@livingdata.pt> On 05/01/2012 02:32 AM, Ross Finlayson wrote: > Try replacing those two "for" loops with the following: > > for (StreamReplica* r1 = fReplicasAwaitingCurrentFrame; r1 != NULL;) { > if (r1 == replicaBeingDeactivated) { > if (r1 == fReplicasAwaitingCurrentFrame) fReplicasAwaitingCurrentFrame = > r1->fNext; > r1 = r1->fNext; > replicaBeingDeactivated->fNext = NULL; > break; > } else { > r1 = r1->fNext; > } > } > for (StreamReplica* r2 = fReplicasAwaitingNextFrame; r2 != NULL;) { > if (r2 == replicaBeingDeactivated) { > if (r2 == fReplicasAwaitingNextFrame) fReplicasAwaitingNextFrame = > r2->fNext; > r2 = r2->fNext; > replicaBeingDeactivated->fNext = NULL; > break; > } else { > r2 = r2->fNext; > } > } > > If you find any problem with this, then please let us know ASAP. > Otherwise I'll release a new version of the code with this change. Hello Ross. 
After some days of testing I'm afraid I'm now pretty sure this solution (released on 2012.05.03) is still buggy. Sorry I didn't report it earlier, but I also didn't see it sooner: can you see how, if the replica being removed is not the first on the list, the one preceding it will be left pointing at the removed one? It will! The r1 and r2 pointers don't update the lists. That is done only if the replica being removed is the first element (r1 == fReplicasAwaitingCurrentFrame) or (r2 == fReplicasAwaitingNextFrame). Using this solution we either get a segmentation fault or the following message on standard error: "StreamReplicator::deliverReceivedFrame() Internal Error 2(2,2)!" We've been successfully using the solution I proposed on my first message on this subject and which, I believe, is the most elegant (using a pointer to pointer to iterate over the lists), but the following has also worked flawlessly, although it is longer: if (fReplicasAwaitingCurrentFrame != NULL) { if (replicaBeingDeactivated == fReplicasAwaitingCurrentFrame) { fReplicasAwaitingCurrentFrame = replicaBeingDeactivated->fNext; replicaBeingDeactivated->fNext = NULL; } else { for (StreamReplica* r1 = fReplicasAwaitingCurrentFrame; r1->fNext != NULL; r1 = r1->fNext) { if (r1->fNext == replicaBeingDeactivated) { r1->fNext = replicaBeingDeactivated->fNext; replicaBeingDeactivated->fNext = NULL; break; } } } } if (fReplicasAwaitingNextFrame != NULL) { if (replicaBeingDeactivated == fReplicasAwaitingNextFrame) { fReplicasAwaitingNextFrame = replicaBeingDeactivated->fNext; replicaBeingDeactivated->fNext = NULL; } else { for (StreamReplica* r2 = fReplicasAwaitingNextFrame; r2->fNext != NULL; r2 = r2->fNext) { if (r2->fNext == replicaBeingDeactivated) { r2->fNext = replicaBeingDeactivated->fNext; replicaBeingDeactivated->fNext = NULL; break; } } } } Once again, sorry I didn't see this in time for the current release. Hope the next one will fix this issue. Thank you, Bruno Abreu -- Living Data - Sistemas de Informa??o e Apoio ? Decis?o, Lda. LxFactory - Rua Rodrigues de Faria, 103, edif?cio I - 4? piso Phone: +351 213622163 1300-501 LISBOA Fax: +351 213622165 Portugal URL: www.livingdata.pt From finlayson at live555.com Thu May 10 20:26:45 2012 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 10 May 2012 20:26:45 -0700 Subject: [Live-devel] StreamReplicator bug deactivating a replica? In-Reply-To: <4FAC7F84.9090109@livingdata.pt> References: <20120430173439.M10739@livingdata.pt> <4FAC7F84.9090109@livingdata.pt> Message-ID: <9B632EB7-B857-478A-83DA-414F362F338E@live555.com> > After some days of testing I'm afraid I'm now pretty sure this solution (released on 2012.05.03) is still buggy. Yes, you're right. (Stupidity on my part...) I've now released a new version of the code that includes your proposed solution. Thanks again for your help. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From sudan at koperasw.com Fri May 11 00:38:11 2012 From: sudan at koperasw.com (Sudan Landge - Kopera) Date: Fri, 11 May 2012 13:08:11 +0530 Subject: [Live-devel] [live_devel] input from buffer and not file in testH264ViedoStreamer Message-ID: <4FACC1E3.6000607@koperasw.com> Hello Ross, The current "testH264VidesoStreamer" takes input from a file. 
I want to feed the streamer from a buffer source and not from a file; I read the FAQs which had a similar query of taking H264 data from encoder and for this the DeviceSource.cpp file has to be modified (approach of flushing and taking data from stdin is not possible in my case). Before starting to work on this, I wanted to know if there was any reference code that I could use (i.e. example of implementation of DeviceSource.cpp). If there is a ready reference then it would be easy to start my work. Also could you please let me if my approach (of starting from DeviceSource.cpp) is correct or if there is a better way of doing it. My requirement is reading data from a buffer and not a file. The read pointer of buffer will be given to the streamer each time the buffer has enough data. Thanks and regards, Sudan From Marlon at scansoft.co.za Fri May 11 06:56:45 2012 From: Marlon at scansoft.co.za (Marlon Reid) Date: Fri, 11 May 2012 15:56:45 +0200 Subject: [Live-devel] AudioInputDevice and different audio host APIs Message-ID: <002962EA5927BE45B2FFAB0B5B5D67970A8D8D@SSTSVR1.sst.local> Hi Ross, I am using the AudioInputDevice class to stream data from a microphone. I am using the getPortNames function to return the names of the microphones connected to the Windows machine so that the user can select the desired microphone. I have however noticed that the getPortNames function only returns the devices that belong to the system default audio host API, which in my case is MME. My questions are Can the getPortNames function return all the audio input irrespective of which audio host API they belong to? and if so, Can the AudioInputDevice ::createNew function make use of an input that is not part of the system default audio host API? Thank you for all your assistance. ___________________________________ Marlon Reid Web: www.scansoft.co.za Tel: +27 21 913 8664 -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: image/png Size: 32497 bytes Desc: att83e78.png URL: From nikolai.vorontsov at quadrox.be Fri May 11 07:42:03 2012 From: nikolai.vorontsov at quadrox.be (Nikolai Vorontsov) Date: Fri, 11 May 2012 16:42:03 +0200 Subject: [Live-devel] [live_devel] input from buffer and not file intestH264ViedoStreamer In-Reply-To: <4FACC1E3.6000607@koperasw.com> References: <4FACC1E3.6000607@koperasw.com> Message-ID: Hello Sudan, Please find attached sample of buffered stream. You need to make your own ServerMediaSubsession like: FramedSource* CQxServerMediaSubsession::createNewStreamSource(unsigned /*clientSessionId*/, unsigned& estBitrate) { //TBD estBitrate = 500; // kbps, estimate // Create the video source: if ((m_pSource = CQxByteStreamSource::createNew(envir(), m_Token)) == NULL) return NULL; // Create a framer for the H264 Stream: return H264VideoStreamFramer::createNew(envir(), m_pSource); } and add code to get stream to the CQxByteStreamSource::doGetNextFrame() It works for me. Nikolai. -------------- next part -------------- A non-text attachment was scrubbed... Name: QxByteStreamSource.h Type: application/octet-stream Size: 675 bytes Desc: QxByteStreamSource.h URL: -------------- next part -------------- A non-text attachment was scrubbed... 
Name: QxByteStreamSource.cpp Type: application/octet-stream Size: 2192 bytes Desc: QxByteStreamSource.cpp URL: From gordu at dvr2010.com Fri May 11 12:27:39 2012 From: gordu at dvr2010.com (Gord Umphrey) Date: Fri, 11 May 2012 15:27:39 -0400 Subject: [Live-devel] How to receive h264 frames in testRTSPClient In-Reply-To: References: Message-ID: <43DF616349A345B0A6D2889A81196508@SmokeyPC> Hi; Correct me if I am wrong, but currently testRTSPClient receives RTP packets. Is there a way to receive the individual h264 frames, Ie. parse the incoming packets and provide each individual h264 frame (each i frame and p frames). Thanks, Gord. From bstump at codemass.com Fri May 11 14:27:57 2012 From: bstump at codemass.com (Barry Stump) Date: Fri, 11 May 2012 14:27:57 -0700 Subject: [Live-devel] How to receive h264 frames in testRTSPClient In-Reply-To: <43DF616349A345B0A6D2889A81196508@SmokeyPC> References: <43DF616349A345B0A6D2889A81196508@SmokeyPC> Message-ID: > > Correct me if I am wrong, but currently testRTSPClient receives RTP > packets. Is there a way to receive the individual h264 frames, Ie. parse > the incoming packets and provide each individual h264 frame (each i frame > and p frames). > > In your MediaSink derived class (called "DummySink" in the testRTSPClient code), the afterGettingFrame() method gets called for each H.264 NAL unit (frame). The Live555 library takes care of removing the RTP headers and reassembling fragmented frames for you. If you need to know the type of frame, look at the NAL unit type, using something like this: int nal_type = fReceiveBuffer[0] &0x1F; envir() << "afterGettingFrame() received NAL unit type " << nal_type << "\n"; -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri May 11 17:20:46 2012 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 11 May 2012 17:20:46 -0700 Subject: [Live-devel] AudioInputDevice and different audio host APIs In-Reply-To: <002962EA5927BE45B2FFAB0B5B5D67970A8D8D@SSTSVR1.sst.local> References: <002962EA5927BE45B2FFAB0B5B5D67970A8D8D@SSTSVR1.sst.local> Message-ID: <59627DF4-8248-4E3B-B89E-6AFBB30AAE10@live555.com> > I am using the AudioInputDevice class to stream data from a microphone. I am using the getPortNames function to return the names of the microphones connected to the Windows machine so that the user can select the desired microphone. I have however noticed that the getPortNames function only returns the devices that belong to the system default audio host API, which in my case is MME. > > My questions are > > Can the getPortNames function return all the audio input irrespective of which audio host API they belong to? > > and if so, > > Can the AudioInputDevice ::createNew function make use of an input that is not part of the system default audio host API? Remember, You Have Complete Source Code. "AudioInputDevice" is an abstract base class, implemented by a subclass. You are (I presume) using the "WindowsAudioInputDevice" subclass, whose source code is in the "WindowsAudioInputDevice" directory (and has two possible implementations - one that uses Windows' built-in software 'mixer', and one that doesn't). You can review that code to figure out how it works. If you are not happy with either of the current implementations, then you can - if you wish - provide your own. This would be another subclass of "AudioInputDevice" (ideally with a different name than "WindowsAudioInputDevice", to avoid confusion). 
You would reimplement the "AudioInputDevice::createNew()" function to create an instance of your new subclass, rather than "WindowsAudioInputDevice". Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From trn200190 at gmail.com Fri May 11 08:29:13 2012 From: trn200190 at gmail.com (i m what i m ~~~~) Date: Fri, 11 May 2012 20:59:13 +0530 Subject: [Live-devel] is there any provision Message-ID: Hello Ross In your testh264videostreamer application is there any provision that we can give the streaming URL .i.e."rtsp://URL/testStream" ourself by creating a dialog box and that ask for the URL and stream at that URL. Thanks Tarun -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri May 11 19:47:25 2012 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 11 May 2012 19:47:25 -0700 Subject: [Live-devel] is there any provision In-Reply-To: References: Message-ID: <18F82672-9682-4854-B9EF-5ECA0084EBE4@live555.com> > In your testh264videostreamer application is there any provision that we can give the streaming URL .i.e."rtsp://URL/testStream" ourself by creating a dialog box and that ask for the URL and stream at that URL. The "streamName" parameter in "ServerMediaSession::createNew()" specifies the stream name that gets put in the "rtsp://" URL. You can set that to whatever you want. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From michel.promonet at thalesgroup.com Mon May 14 02:32:56 2012 From: michel.promonet at thalesgroup.com (PROMONET Michel) Date: Mon, 14 May 2012 11:32:56 +0200 Subject: [Live-devel] MPEG4GenericRTPSink.cpp:44:65: error: 'tolower' was not declared in this scope In-Reply-To: References: Message-ID: <31953_1336988047_4FB0D18F_31953_2102_1_1BE8971B6CFF3A4F97AF4011882AA2550155FBF41AF9@THSONEA01CMS01P.one.grp> I don't know if you find a fix for this ? I faced same problem, and it seems adding #include in MPEG4GenericRTPSink.cpp allow to build live555 again. Regads, Michel. [@@THALES GROUP RESTRICTED@@] -----Message d'origine----- De?: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] De la part de Josh Envoy??: jeudi 26 avril 2012 04:01 ??: live-devel at ns.live555.com Objet?: [Live-devel] MPEG4GenericRTPSink.cpp:44:65: error: 'tolower' was not declared in this scope g++ -c -Iinclude -I../UsageEnvironment/include -I../groupsock/include -I. -O -DSOCKLEN_T=int -DLOCAL E_NOT_USED -DRTSPCLIENT_SYNCHRONOUS_INTERFACE=1 -D__MINGW32__ -Wall -Wno-deprecated MPEG4GenericRTPS ink.cpp In file included from MPEG4GenericRTPSink.cpp:22:0: include/Locale.hh:52:44: warning: 'typedef' was ignored in this declaration [enabled by default] MPEG4GenericRTPSink.cpp: In constructor 'MPEG4GenericRTPSink::MPEG4GenericRTPSink(UsageEnvironment&, Groupsock*, u_int8_t, u_int32_t, const char*, const char*, const char*, unsigned int)': MPEG4GenericRTPSink.cpp:44:65: error: 'tolower' was not declared in this scope make[1]: *** [MPEG4GenericRTPSink.o] Error 1 make[1]: Leaving directory `/home/Joshua/mplayer/live/liveMedia' make: *** [all] Error 2 I'm getting this error attempting to compile under MinGW/GCC 4.6.2 (from mingw.org). I'm compiling it for MPlayer. 
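(The name of the header file was evidently lost from Michel's message above when the HTML was scrubbed. tolower() is declared in the standard header <ctype.h> (or <cctype>, as std::tolower, in C++), so the suggested one-line fix was presumably along these lines:)

/* Presumed fix, added to liveMedia/MPEG4GenericRTPSink.cpp -- the exact header
   name was scrubbed from the message above, but tolower() is declared in <ctype.h>: */
#include <ctype.h>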
_______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel From finlayson at live555.com Mon May 14 02:53:56 2012 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 14 May 2012 02:53:56 -0700 Subject: [Live-devel] MPEG4GenericRTPSink.cpp:44:65: error: 'tolower' was not declared in this scope In-Reply-To: <31953_1336988047_4FB0D18F_31953_2102_1_1BE8971B6CFF3A4F97AF4011882AA2550155FBF41AF9@THSONEA01CMS01P.one.grp> References: <31953_1336988047_4FB0D18F_31953_2102_1_1BE8971B6CFF3A4F97AF4011882AA2550155FBF41AF9@THSONEA01CMS01P.one.grp> Message-ID: <70113381-9864-49F0-BA65-8E1DC4C6551C@live555.com> > I faced same problem, and it seems adding #include in MPEG4GenericRTPSink.cpp allow to build live555 again. OK, I'll add this to the next release of the software. Thanks for the note. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From trn200190 at gmail.com Mon May 14 05:34:53 2012 From: trn200190 at gmail.com (i m what i m ~~~~) Date: Mon, 14 May 2012 18:04:53 +0530 Subject: [Live-devel] can the same example be implemented for testh264videostreamer.cpp Message-ID: Hello Ross In your TestOndemandRTSPserver.cpp u have given an example " // A MPEG-2 Transport Stream, coming from a live UDP (raw-UDP or RTP/UDP) source:" Can WE do the same example for testh264videostreamer application if yes then please tell how??' Thanks Tarun -------------- next part -------------- An HTML attachment was scrubbed... URL: From michel.promonet at thalesgroup.com Mon May 14 10:15:16 2012 From: michel.promonet at thalesgroup.com (PROMONET Michel) Date: Mon, 14 May 2012 19:15:16 +0200 Subject: [Live-devel] RTSPserver rangeStart > duration with duration=0 Message-ID: <13429_1337015786_4FB13DEA_13429_12703_1_1BE8971B6CFF3A4F97AF4011882AA2550155FBF93553@THSONEA01CMS01P.one.grp> Hi Ross, Could you help us to understand the behaviour of RTSP server play when no duration is specified. In RTSPServer.cpp : // Make sure that "rangeStart" and "rangeEnd" (from the client's "Range:" header) have sane values // before we send back our own "Range:" header in our response: if (rangeStart < 0.0) rangeStart = 0.0; else if (rangeStart > duration) rangeStart = duration; if (rangeEnd < 0.0) rangeEnd = 0.0; else if (rangeEnd > duration) rangeEnd = duration; This makes that when client send RTSP play with "Range: npt=25-", the 25 is ignored. Is it possible to change this line with something like " else if (rangeStart > duration && duration!=0) rangeStart = duration;" Thanks for your support. Best Regards, Michel. [@@THALES GROUP RESTRICTED@@] -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon May 14 14:16:40 2012 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 14 May 2012 14:16:40 -0700 Subject: [Live-devel] RTSPserver rangeStart > duration with duration=0 In-Reply-To: <13429_1337015786_4FB13DEA_13429_12703_1_1BE8971B6CFF3A4F97AF4011882AA2550155FBF93553@THSONEA01CMS01P.one.grp> References: <13429_1337015786_4FB13DEA_13429_12703_1_1BE8971B6CFF3A4F97AF4011882AA2550155FBF93553@THSONEA01CMS01P.one.grp> Message-ID: <5D3678FE-7EEC-4DC6-9726-01E8A7D4B7A0@live555.com> > Could you help us to understand the behaviour of RTSP server play when no duration is specified. Quite simply: If a stream has no duration, then seeking is not supported - at all. 
(Streams without a duration are usually either 'live' streams, or else are from a file for which (because of the media type) we could not determine its duration. In either case, 'seeking' wouldn't really make sense.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon May 14 17:19:23 2012 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 14 May 2012 17:19:23 -0700 Subject: [Live-devel] can the same example be implemented for testh264videostreamer.cpp In-Reply-To: References: Message-ID: <05820A12-7B2A-4210-BFEF-9B664748EA75@live555.com> > In your TestOndemandRTSPserver.cpp u have given an example " // A MPEG-2 Transport Stream, coming from a live UDP (raw-UDP or RTP/UDP) source:" > > Can WE do the same example for testh264videostreamer application if yes then please tell how??' Yes, you can do this, provided that the H.264 input stream uses RTP/UDP (i.e., not raw-UDP). You would change the way that the input source is created (in the function "play()"). Instead of creating a "ByteStreamFileSource" and feeding it into a "H264VideoStreamFramer", you would create a "H264VideoRTPSource", and feed it into a "H264VideoStreamDiscreteFramer". Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From sudan at koperasw.com Mon May 14 23:05:40 2012 From: sudan at koperasw.com (Sudan Landge - Kopera) Date: Tue, 15 May 2012 11:35:40 +0530 Subject: [Live-devel] [live_devel] H264FUAFragmenter buffer Message-ID: <4FB1F234.8090303@koperasw.com> Hello Ross, We have done study to see how testH264VideoStreamer works. After step debugging through the code we found that three buffer were created in classes - H264VideoRTPSink - H264FUAFragmenter and - StreamParser Just wanted to know the significance of H264FUAFragmenter buffer. From code it looks like there is data redundancy while copying data in H264FUAFragmenter buffer, as the same work could have been done without it; This is my understanding from a high level view of the code. I might be missing a few points in this as well; could you please correct me if i am wrong. Awaiting for your reply. Thanks Sudan From finlayson at live555.com Tue May 15 00:21:44 2012 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 15 May 2012 00:21:44 -0700 Subject: [Live-devel] [live_devel] H264FUAFragmenter buffer In-Reply-To: <4FB1F234.8090303@koperasw.com> References: <4FB1F234.8090303@koperasw.com> Message-ID: <27D8E779-F486-4E5A-8EE6-0E2302E29CA1@live555.com> The "H264FUAFragmenter" class was introduced - as an intermediary between "H264VideoRTPSink" and its input source - to handle the (common) case of input H.264 NAL units that are too large to fit into an outgoing RTP packet. In this case, the data gets fragmented over multiple RTP packets. Normally, this can be handled by the "MultiFramedRTPSink" subclass (e.g., "H264VideoRTPSink") only, without requiring an extra intermediate object. However, for some reason that I can't quite remember, the RTP payload format for H.264 video made it necessary to use an intermediate object to handle the fragmentation. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: 

From sudan at koperasw.com  Tue May 15 01:56:02 2012
From: sudan at koperasw.com (Sudan Landge - Kopera)
Date: Tue, 15 May 2012 14:26:02 +0530
Subject: [Live-devel] [live_devel] H264FUAFragmenter buffer
Message-ID: <4FB21A22.7080608@koperasw.com>

Hello Ross,

Thanks for the previous reply. Could you please let us know if there is an alternate solution available to stream H.264 video without using the FUAFragmenter's intermediate buffer? If there is no solution available, what can we do to achieve it (like writing a subclass or so)?

Thanks and regards,
Sudan

Subject: Re: [Live-devel] [live_devel] H264FUAFragmenter buffer
Date: Tue, 15 May 2012 00:21:44 -0700
From: Ross Finlayson
Reply-To: LIVE555 Streaming Media - development & use
To: LIVE555 Streaming Media - development & use

The "H264FUAFragmenter" class was introduced - as an intermediary between "H264VideoRTPSink" and its input source - to handle the (common) case of input H.264 NAL units that are too large to fit into an outgoing RTP packet. In this case, the data gets fragmented over multiple RTP packets. Normally, this can be handled by the "MultiFramedRTPSink" subclass (e.g., "H264VideoRTPSink") only, without requiring an extra intermediate object. However, for some reason that I can't quite remember, the RTP payload format for H.264 video made it necessary to use an intermediate object to handle the fragmentation.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From mwright3 at slb.com  Tue May 15 02:00:25 2012
From: mwright3 at slb.com (Martin Wright)
Date: Tue, 15 May 2012 09:00:25 +0000
Subject: [Live-devel] raw PCM from TCP socket to RTP
In-Reply-To: 
References: <96345891-60FC-4035-B5C9-F4DC4F8AC5AE@live555.com>
Message-ID: 

Hi Ross

Thanks for your past help with this. I now have a trivial but awkward C++ compilation error:

PCMSource.cpp: In destructor 'virtual PCMSource::~PCMSource()':
PCMSource.cpp:192: error: invalid conversion from 'int' to 'Medium*'
PCMSource.cpp:192: error: initializing argument 1 of 'static void Medium::close(Medium*)'

I want to close the socket but have a name conflict with Medium::close(). The old library functions are not in namespaces and I can't figure out how, or find the technique on the web, to use the standard file descriptor/socket close() function rather than the Medium::close() method. Is there a simple solution?

Martin

From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson
Sent: 04 May 2012 21:19
To: LIVE555 Streaming Media - development & use
Subject: Re: [Live-devel] raw PCM from TCP socket to RTP

Martin,

First, because you're delivering PCM audio, where the frames are very small, you shouldn't need to modify "OutPacketBuffer::maxSize". (That variable needs to be increased only when you are transmitting media with exceptionally large frames (larger than the default value of 60000 bytes). Because that's not the case for you, I don't recommend changing this value - at least not until you fix your other problems.)

I have written PCMSource.cpp, based on DeviceSource.cpp, in which createNew() creates a socket and connects to the remote endpoint, and doGetNextFrame() does the recv() and then calls deliverFrame() to do the memmove(fTo, receivedData).
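(A minimal sketch of what such a delivery routine might look like, in the style of DeviceSource.cpp - this is not Martin's actual PCMSource.cpp; the member name "fSocket" is assumed, deliverFrame() is folded into doGetNextFrame() for brevity, and it follows the advice further down in this message about filling up to fMaxSize rather than delivering one sample at a time:)

void PCMSource::doGetNextFrame() {
  // Read as much PCM data as the downstream object can accept in one delivery.
  // fTo and fMaxSize are provided by the FramedSource base class; fSocket is
  // this class's (assumed) connected TCP socket.
  int n = recv(fSocket, (char*)fTo, fMaxSize, 0);
  if (n <= 0) {
    handleClosure(this); // connection closed (or error): signal end-of-stream
    return;
  }
  fFrameSize = (unsigned)n;
  fNumTruncatedBytes = 0;
  gettimeofday(&fPresentationTime, NULL); // stamp the data with the current time

  // Inform the downstream object (ultimately the RTP sink) that data is ready.
  FramedSource::afterGetting(this);
  // Note: a production version would register the socket with the event loop
  // (TaskScheduler::setBackgroundHandling()), so that a recv() with no data
  // available never blocks the LIVE555 event loop.
}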
PCMStreamServerMediaSubsession.cpp is based on WAVAudioFileServerMediaSubsession.cpp but the createNewStreamSource() can hardcode the bits/sample, sampling frequency and number of channels information that would normally come from the .wav file header. These new files compile on CentOS 5. Adding an extra section to testOnDemandRTSPServer.cpp to create a stream name allows me to enter the url and reproduce the speech from the original mp3 podcast I'm using to test this with VLC from a Win 7 workstation. Nice; the raw samples are being converted to a recognisable RTP stream. Unfortunately, I am only getting the first second of a minute's worth of audio data. I suggest that you begin by ensuring that your "PCMSource" class is delivering 'correct' data to its downstream object. To do this, I suggest that - rather than starting with a RTSP server (which is a complex application) - you begin by writing an application that feeds a "PCMSource" into a "FileSink". I.e., write a simple application that - creates a "FileSink" - creates a "PCMSource" - calls "FileSink" -> startPlaying("PCMSource", ...) - calls "doEventLoop();" to enter the LIVE555 event loop. Once you've done this, you should be able to look at the output file to figure out whether or not your "PCMSource" is delivering correct data. (Perhaps also add a WAV header to the beginning of the file, and try playing it.) If your "PCMSource" seems to be delivering correct data, then you should next make sure that it's the right size. Because your "PCMSource" will be delivering into a "RTPSink" (subclass) - i.e., packing the data into outgoing RTP packets - you should make sure that "fFrameSize" is large enough for each delivery. Don't just deliver one audio sample at a time. Instead, deliver as much data as you can, up to the limit of "fMaxSize". Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue May 15 02:09:16 2012 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 15 May 2012 02:09:16 -0700 Subject: [Live-devel] [live_devel] H264FUAFragmenter buffer In-Reply-To: <4FB21A22.7080608@koperasw.com> References: <4FB21A22.7080608@koperasw.com> Message-ID: > Could you please let us know if, there is there an alternate solution available to stream H.264 video without using the FUAFragmenter's intermediate buffer? No, not unless all of the input NAL units are small enough to fit (with RTP packetization) into a single outgoing UDP packet. Again, I can't recall all of the details/decisions that were behind this design, but I'm generally not in the habit of writing unnecessary code. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue May 15 02:24:08 2012 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 15 May 2012 02:24:08 -0700 Subject: [Live-devel] raw PCM from TCP socket to RTP In-Reply-To: References: <96345891-60FC-4035-B5C9-F4DC4F8AC5AE@live555.com> Message-ID: <1C3E7876-5B1C-4475-8435-7538FCF2DC7E@live555.com> > I want to close the socket but have a name conflict with Medium::close(). The old library functions are not in name spaces and I can?t figure out how, or find the technique on the web, to use the standard file descriptor/socket close() function rather than the Media::close() method. Is there a simple solution? 
You could try calling
    ::close(socket);
If that doesn't work, then just do something like

static int closeSocket(int d) {
  return close(d);
}

YourMediumSubclass::memberFunction() {
  closeSocket(socket);
}

Or, if you have a FILE* for the socket, call
    fclose(fid);
instead.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From michel.promonet at thalesgroup.com  Tue May 15 06:37:04 2012
From: michel.promonet at thalesgroup.com (PROMONET Michel)
Date: Tue, 15 May 2012 15:37:04 +0200
Subject: [Live-devel] RTSPserver rangeStart > duration with duration=0
In-Reply-To: <5D3678FE-7EEC-4DC6-9726-01E8A7D4B7A0@live555.com>
References: <13429_1337015786_4FB13DEA_13429_12703_1_1BE8971B6CFF3A4F97AF4011882AA2550155FBF93553@THSONEA01CMS01P.one.grp> <5D3678FE-7EEC-4DC6-9726-01E8A7D4B7A0@live555.com>
Message-ID: <10350_1337089097_4FB25C49_10350_17903_1_1BE8971B6CFF3A4F97AF4011882AA2550155FBFD3191@THSONEA01CMS01P.one.grp>

Hi Ross,

The problem we have is not exactly this: the answer to the DESCRIBE does send the duration to the player, but when the player then sends a PLAY with a range of "25-" (omitting the duration, not setting it to 0), the range is cleared by the RTSPServer implementation.

But perhaps it would be better to answer the PLAY with a correct range, instead of resetting it, using the input range?

Best Regards,
Michel.

[@@THALES GROUP RESTRICTED@@]

From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On behalf of Ross Finlayson
Sent: Monday, 14 May 2012 23:17
To: LIVE555 Streaming Media - development & use
Subject: Re: [Live-devel] RTSPserver rangeStart > duration with duration=0

Could you help us to understand the behaviour of RTSP server play when no duration is specified.

Quite simply: If a stream has no duration, then seeking is not supported - at all.

(Streams without a duration are usually either 'live' streams, or else are from a file for which (because of the media type) we could not determine its duration. In either case, 'seeking' wouldn't really make sense.)

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From ashfaque at iwavesystems.com  Tue May 15 06:59:38 2012
From: ashfaque at iwavesystems.com (Ashfaque)
Date: Tue, 15 May 2012 19:29:38 +0530
Subject: [Live-devel] Failed to join group: Invalid IP Address
Message-ID: <3804BF7B625B4C37962BF9101F5666CE@iwns10>

Hi Ross,

We are using Live555 on an embedded system, with the device acting as a Soft Access Point (server). No external LAN needs to be connected to it. Following is the ifconfig output on the device:

uap0      Link encap:Ethernet  HWaddr 00:22:58:77:80:62
          inet addr:192.168.25.2  Bcast:192.168.25.255  Mask:255.255.255.0
          UP BROADCAST MULTICAST  MTU:1500  Metric:1
          RX packets:49982 errors:0 dropped:0 overruns:0 frame:0
          TX packets:49641 errors:0 dropped:0 overruns:0 carrier:0
          collisions:0 txqueuelen:1000
          RX bytes:3191690 (3.0 MiB)  TX bytes:2680334 (2.5 MiB)

The client is an iPad application, with Wifi connected to the above device's access point. We need to stream video captured on the above device to the iPad. The test reference used was testH264VideoStreamer. The streaming happens successfully when the device is connected to the LAN, but when it is configured as an access point and a similar static IP address is assigned to the iPad, the connection is not established.
It shows following error messages: 00:07:36 Groupsock(8: 232.202.29.84, 18888, 255): failed to join group: setsockopt(IP_ADD_MEMBERSHIP) error: No such device Unable to determine our source address: This computer has an invalid IP address: 0.0.0.0 00:07:36 Groupsock(9: 232.202.29.84, 18889, 255): failed to join group: setsockopt(IP_ADD_MEMBERSHIP) error: No such device Unable to determine our source address: This computer has an invalid IP address: 0.0.0.0 Beginning streaming... RTPStreamer::play()---- Play this stream using the URL "rtsp://0.0.0.0:8554/testStream" As IP address is getting printed as 0, feels like groupSock interface is not able to get the device IP Address. What is the necessary change or the configuration needed so that Access point IP Address is used for streaming. The basic socket function which is failing is setsockopt. Please provide necessary input to get the streaming with a access point device. Thanks & Regards, Ashfaque -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue May 15 08:35:05 2012 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 15 May 2012 08:35:05 -0700 Subject: [Live-devel] RTSPserver rangeStart > duration with duration=0 In-Reply-To: <10350_1337089097_4FB25C49_10350_17903_1_1BE8971B6CFF3A4F97AF4011882AA2550155FBFD3191@THSONEA01CMS01P.one.grp> References: <13429_1337015786_4FB13DEA_13429_12703_1_1BE8971B6CFF3A4F97AF4011882AA2550155FBF93553@THSONEA01CMS01P.one.grp> <5D3678FE-7EEC-4DC6-9726-01E8A7D4B7A0@live555.com> <10350_1337089097_4FB25C49_10350_17903_1_1BE8971B6CFF3A4F97AF4011882AA2550155FBFD3191@THSONEA01CMS01P.one.grp> Message-ID: <5C40F6D1-6392-4D3D-84DB-BA2DB32510B0@live555.com> > The problem we have is not exactly this, if the answer to the DESCRIBE send to the player the duration, when the player send a PLAY with a range "25-" (ommiting duration not setting it to 0) the range is cleared by the RTSPServer implementation. > > But perhaps it's better to answer to the PLAY with a correct range instead of reseting it using the input range ? But the server *is* answering the "PLAY" request with a correct range. It's not doing any seeking at all (because the stream has no duration, seeking to 25 seconds makes no sense); therefore it's response says what it actually does. A response "range of "0-" correct here; a response range of "25-" certainly would not be. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue May 15 08:40:54 2012 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 15 May 2012 08:40:54 -0700 Subject: [Live-devel] Failed to join group: Invalid IP Address In-Reply-To: <3804BF7B625B4C37962BF9101F5666CE@iwns10> References: <3804BF7B625B4C37962BF9101F5666CE@iwns10> Message-ID: <009AB714-FB5E-41AC-BFC7-E04BB64103A8@live555.com> > It shows following error messages: > > 00:07:36 Groupsock(8: 232.202.29.84, 18888, 255): failed to join group: setsockopt(IP_ADD_MEMBERSHIP) error: No such device It sounds like you don't have any multicast route set up on this device. (The code uses multicast, initially, to figure out the system's own IP address.) Make sure that you have a route for 224.0.0.0/4 set up on your network interface. Then the code should work. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From renatomauro at libero.it Tue May 15 07:38:57 2012 From: renatomauro at libero.it (Renato MAURO (Libero)) Date: Tue, 15 May 2012 16:38:57 +0200 Subject: [Live-devel] RTP over TCP - What happens if a client closes the RTP socket gracelessly? Message-ID: Hello Ross. If an RTSP client (not developed by me and sadly not based on Live555) asks for a video, streaming with RTP over TCP, and after some minutes closes the RTP socket gracelessly (bug or black-out), doesn't (or can't) send the RTSP Teardown command, then the Live555 RTSP Server (OnDemandServerMediaSubSession): 1) waits only for the usual liveness timeout due to the lack of arrival of the RTCP RR command and, in case, terminates the matching RTPClientSession; 2) manages the usual liveness timeout; but it terminates the matching RTPClientSession also if the select() function, before the usual liveness timeout has occured, understands that the other end-point is dead (since the TCP level acknowledge is not received even after some retransmissions). Thank you very much, Renato MAURO -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue May 15 09:06:31 2012 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 15 May 2012 09:06:31 -0700 Subject: [Live-devel] RTP over TCP - What happens if a client closes the RTP socket gracelessly? In-Reply-To: References: Message-ID: > If an RTSP client (not developed by me and sadly not based on Live555) asks for a video, streaming with RTP over TCP, and after some minutes closes the RTP socket gracelessly (bug or black-out), doesn't (or can't) send the RTSP Teardown command, then the Live555 RTSP Server (OnDemandServerMediaSubSession): > > 1) waits only for the usual liveness timeout due to the lack of arrival of the RTCP RR command and, in case, terminates the matching RTPClientSession; > > 2) manages the usual liveness timeout; but it terminates the matching RTPClientSession also if the select() function, before the usual liveness timeout has occured, understands that the other end-point is dead (since the TCP level acknowledge is not received even after some retransmissions). Just 1. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From rcagley at toyon.com Tue May 15 11:01:21 2012 From: rcagley at toyon.com (Richard Cagley) Date: Tue, 15 May 2012 11:01:21 -0700 Subject: [Live-devel] HTTP Live Streaming from a camera advice Message-ID: I'm interested in streaming from a camera using HLS. I've looked at the live555MediaServer application and the liveMedia/DeviceSource.cpp code. I can't seem to modify the live555MediaServer app in order to not make it use a .tsx file, which sorta makes me think the FAQ link http://www.live555.com/liveMedia/faq.html#liveInput is the way to go. As I'm completely new to live555 I was hoping someone could say HLS from a camera is possible and I'm heading in the right direction. Sorry if this is in the mailing lists somewhere. I looked on the key words I thought would reference this and couldn't find what I was looking for. Thanks, -rich From finlayson at live555.com Tue May 15 13:14:02 2012 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 15 May 2012 13:14:02 -0700 Subject: [Live-devel] HTTP Live Streaming from a camera advice In-Reply-To: References: Message-ID: > I'm interested in streaming from a camera using HLS. 
I've looked at the > live555MediaServer application and the liveMedia/DeviceSource.cpp code. > I can't seem to modify the live555MediaServer app in order to not make > it use a .tsx file, which sorta makes me think the FAQ link > http://www.live555.com/liveMedia/faq.html#liveInput is the way to go. > As I'm completely new to live555 I was hoping someone could say HLS from > a camera is possible No, unfortunately it's not currently possible to use our server code to support HLS from a live source. The reason for this is that our implementation implements HLS by seeking within the input source (as specified by the segment denoted in the 'm3u' playlist), and live input sources are not seekable. Note line 124 of "liveMedia/RTSPServerSupportingHTTPStreaming.cpp". This is why we currently support HLS only on streams that come from seekable files. (And Transport Stream files, to be seekable, have to have a corresponding ".tsx" file.) To implement HLS on a live source, one would have to develop an appropriate "ServerMediaSubsession" subclass that sits in front of the live source, caching recent data, and presenting this cached data - to the server - as a seekable object. (You'd also have to make changes to the playlist.) This is probably doable, but would be a lot of work. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From trn200190 at gmail.com Wed May 16 00:36:38 2012 From: trn200190 at gmail.com (i m what i m ~~~~) Date: Wed, 16 May 2012 13:06:38 +0530 Subject: [Live-devel] Streamings through http Message-ID: Hi Ross I made an an application i which i am streaming the .ts file over http and an RTSP client ,Now i have an two queries:- 1>Can We increase the rate of sending frames through our server??? 2>when i stream the file,when the end of file comes the packets received my client are very slow,in the start and end of file it works great... Can u help me on above two problems.. Thanks Tarun -------------- next part -------------- An HTML attachment was scrubbed... URL: From michel.promonet at thalesgroup.com Wed May 16 01:37:49 2012 From: michel.promonet at thalesgroup.com (PROMONET Michel) Date: Wed, 16 May 2012 10:37:49 +0200 Subject: [Live-devel] RTSPserver rangeStart > duration with duration=0 In-Reply-To: <5C40F6D1-6392-4D3D-84DB-BA2DB32510B0@live555.com> References: <13429_1337015786_4FB13DEA_13429_12703_1_1BE8971B6CFF3A4F97AF4011882AA2550155FBF93553@THSONEA01CMS01P.one.grp> <5D3678FE-7EEC-4DC6-9726-01E8A7D4B7A0@live555.com> <10350_1337089097_4FB25C49_10350_17903_1_1BE8971B6CFF3A4F97AF4011882AA2550155FBFD3191@THSONEA01CMS01P.one.grp> <5C40F6D1-6392-4D3D-84DB-BA2DB32510B0@live555.com> Message-ID: <5596_1337157543_4FB367A7_5596_1141_1_1BE8971B6CFF3A4F97AF4011882AA2550155FC00C53C@THSONEA01CMS01P.one.grp> Hi Ross, To precise our situation, we are registering through RTSPClient and we stream through RTSPServer. Then the end of the available stream is moving (beginning also). So we try to find a solution to seek in the stream, that is compatible with RTSP RFC. One of possibility could be using clock in the Range ? or npt with "xxx-now" ? What's your feeling ? Thanks and Regards, Michel. [@@THALES GROUP RESTRICTED@@] De : live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] De la part de Ross Finlayson Envoy? : mardi 15 mai 2012 17:35 ? 
: LIVE555 Streaming Media - development & use Objet : Re: [Live-devel] RTSPserver rangeStart > duration with duration=0 The problem we have is not exactly this, if the answer to the DESCRIBE send to the player the duration, when the player send a PLAY with a range "25-" (ommiting duration not setting it to 0) the range is cleared by the RTSPServer implementation. But perhaps it's better to answer to the PLAY with a correct range instead of reseting it using the input range ? But the server *is* answering the "PLAY" request with a correct range. It's not doing any seeking at all (because the stream has no duration, seeking to 25 seconds makes no sense); therefore it's response says what it actually does. A response "range of "0-" correct here; a response range of "25-" certainly would not be. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed May 16 02:23:24 2012 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 16 May 2012 02:23:24 -0700 Subject: [Live-devel] RTSPserver rangeStart > duration with duration=0 In-Reply-To: <5596_1337157543_4FB367A7_5596_1141_1_1BE8971B6CFF3A4F97AF4011882AA2550155FC00C53C@THSONEA01CMS01P.one.grp> References: <13429_1337015786_4FB13DEA_13429_12703_1_1BE8971B6CFF3A4F97AF4011882AA2550155FBF93553@THSONEA01CMS01P.one.grp> <5D3678FE-7EEC-4DC6-9726-01E8A7D4B7A0@live555.com> <10350_1337089097_4FB25C49_10350_17903_1_1BE8971B6CFF3A4F97AF4011882AA2550155FBFD3191@THSONEA01CMS01P.one.grp> <5C40F6D1-6392-4D3D-84DB-BA2DB32510B0@live555.com> <5596_1337157543_4FB367A7_5596_1141_1_1BE8971B6CFF3A4F97AF4011882AA2550155FC00C53C@THSONEA01CMS01P.one.grp> Message-ID: > To precise our situation, we are registering through RTSPClient and we stream through RTSPServer. > Then the end of the available stream is moving (beginning also). Sorry, but this makes no sense to me. I just don't understand what it is that you're trying to seek into, and how. But in any case, the code - in lots of places - has the assumption that "ability to be seeked into" == "has a non-zero duration" so I'm not planning to make any changes to the code. Sorry. However, if you want to implement seeking into your data (and thus have the server return this seek time in its "Range:" response), then I suggest that you reimplement the virtual function virtual float duration() const; in your "ServerMediaSubsession" subclass, so that it returns some value >0. Then you can also reimplement the virtual function virtual void seekStreamSource( ... ) to do whatever you want, and the server should behave how you expect it to. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From sudan at koperasw.com Wed May 16 07:06:42 2012 From: sudan at koperasw.com (Sudan Landge - Kopera) Date: Wed, 16 May 2012 19:36:42 +0530 Subject: [Live-devel] simple NAL unit data streamer Message-ID: <4FB3B472.4060006@koperasw.com> Hello Ross, I have a streamer source that continuously gives me NAL unit data. I want to stream this data using live555 streamer. I went through the testH264VideoStreamer application and found out that, it has the following components: ByteStreamFileSource -> StreamPaser -> H264FUAFragmenter -> RTPSink Now the output of "H264FUAFragmenter" code gives you = > H264 data + headers + identifiers (if required i.e. 
large NAL unit data are present); this output (NAL unit data) is fed to the RTPSink which through RTPInterface sends them over UDP. We already have a source which gives us this data(NAL unit data) so, our requirement is, a streamer which will get this NAL unit data and stream it. In short you can say that we need a streamer which has a NAL unit data source instead of byteStreamFileSource (or something similar to it). Could you please point me to an application that simply takes NAL unit data and streams it ? if there is no such application available then, could you please provide me with guidelines to implement one (maybe writing a subclass or something)? Please correct me wherever I am wrong. Thanks and regards, Sudan From finlayson at live555.com Wed May 16 07:19:47 2012 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 16 May 2012 07:19:47 -0700 Subject: [Live-devel] simple NAL unit data streamer In-Reply-To: <4FB3B472.4060006@koperasw.com> References: <4FB3B472.4060006@koperasw.com> Message-ID: <566DAC68-A11D-4AB6-AC18-912A203116CD@live555.com> > We already have a source which gives us this data(NAL unit data) so, our requirement is, a streamer which will get this NAL unit data and stream it. If your data source object delivers *discrete* NAL units - i.e., one at a time - instead of in an unstructured byte stream, then you must feed this source into a "H264VideoStreamDIscreteFramer", not a "H264VideoStreamFramer". Of course, in this case, your input data source object will *not* be a "ByteStreamFileSource". Instead, it will be a custom subclass of "FramedSource" (perhaps similar to the "DeviceSource" code) that you have written yourself. Note also that the H.264 NAL units that you deliver to the "H264VideoStreamDIscreteFramer" MUST NOT begin with a MPEG 'start code' (i.e., 0x00000001). To transmit this data via RTP, you use a "H264VideoRTPSink" (unmodified), as usual. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From trn200190 at gmail.com Wed May 16 04:46:34 2012 From: trn200190 at gmail.com (i m what i m ~~~~) Date: Wed, 16 May 2012 17:16:34 +0530 Subject: [Live-devel] issue with testOnDemandRTSPServer Message-ID: Hello Ross I am using your testOnDemandRTSPServer.cpp in which i am using the following part:- // A MPEG-2 Transport Stream, coming from a live UDP (raw-UDP or RTP/UDP) source: { char const* streamName = "mpeg2TransportStreamFromUDPSourceTest"; char const* inputAddressStr = "239.255.42.42"; // This causes the server to take its input from the stream sent by the "testMPEG2TransportStreamer" demo application. // (Note: If the input UDP source is unicast rather than multicast, then change this to NULL.) portNumBits const inputPortNum = 1234; // This causes the server to take its input from the stream sent by the "testMPEG2TransportStreamer" demo application. 
Boolean const inputStreamIsRawUDP = False; ServerMediaSession* sms = ServerMediaSession::createNew(*env, streamName, streamName, descriptionString); sms->addSubsession(MPEG2TransportUDPServerMediaSubsession ::createNew(*env, inputAddressStr, inputPortNum, inputStreamIsRawUDP)); rtspServer->addServerMediaSession(sms); char* url = rtspServer->rtspURL(sms); *env << "\n\"" << streamName << "\" stream, from a UDP Transport Stream input source \n\t("; if (inputAddressStr != NULL) { *env << "IP multicast address " << inputAddressStr << ","; } else { *env << "unicast;"; } *env << " port " << inputPortNum << ")\n"; *env << "Play this stream using the URL \"" << url << "\"\n"; delete[] url; } for this application i first run the "*testMPEG2TransportStreamer.exe*" ant then run this *testOnDemandRTSPServer.cpp*,it works well for rtsp streaming then i added the *RTSPServerSupportingHTTPStreaming.cpp* and *ByteStream.cpp * then i again ran the *testMPEG2TransportStreamer.exe* and then the * testOnDemandRTSPServer.exe* including the* tunneling file *now when i try to receive the data through* http://URL/mpeg2TransportStreamFromUDPSourceTest,* *my vlc player says:- VLC is unable to open the MRL.* *Do i have to attach any other file to the project???Am i doing something wrong??* Thanks Tarun -------------- next part -------------- An HTML attachment was scrubbed... URL: From jshanab at smartwire.com Wed May 16 06:01:46 2012 From: jshanab at smartwire.com (Jeff Shanab) Date: Wed, 16 May 2012 13:01:46 +0000 Subject: [Live-devel] HTTP Live Streaming from a camera advice In-Reply-To: References: Message-ID: <615FD77639372542BF647F5EBAA2DBC2251E9C78@IL-BOL-EXCH01.smartwire.com> I subclassed the MediaSInk to receive from MPEG2traonsportStreamFramer I gorilla subclassed MPEG2TransportStreamMultiplexor calling it MPEG2TransportStreamMultiplexor4ios and MPEG2TransportStreamFromESSource calling it MPEG2TransportStreamFromESSource4iOS The trick I used was to set up a standard filter chain and then modify the classes to insert the necessary PES headers at "apple" defined intervals instead of timed. The media sink then just segments and serves them out. They live in memory only and never hit the disk. Hitme at http://206.205.9.143:9000/application/845.stream/playlist.m3u8 with safari on mac or ipad/iphone This is a wireless security camera in my front window pushed(camera initiates conversation thru firewall) to a desktop PC at work and restreamed HLS. From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Tuesday, May 15, 2012 3:14 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] HTTP Live Streaming from a camera advice I'm interested in streaming from a camera using HLS. I've looked at the live555MediaServer application and the liveMedia/DeviceSource.cpp code. I can't seem to modify the live555MediaServer app in order to not make it use a .tsx file, which sorta makes me think the FAQ link http://www.live555.com/liveMedia/faq.html#liveInput is the way to go. As I'm completely new to live555 I was hoping someone could say HLS from a camera is possible No, unfortunately it's not currently possible to use our server code to support HLS from a live source. The reason for this is that our implementation implements HLS by seeking within the input source (as specified by the segment denoted in the 'm3u' playlist), and live input sources are not seekable. Note line 124 of "liveMedia/RTSPServerSupportingHTTPStreaming.cpp". 
This is why we currently support HLS only on streams that come from seekable files. (And Transport Stream files, to be seekable, have to have a corresponding ".tsx" file.) To implement HLS on a live source, one would have to develop an appropriate "ServerMediaSubsession" subclass that sits in front of the live source, caching recent data, and presenting this cached data - to the server - as a seekable object. (You'd also have to make changes to the playlist.) This is probably doable, but would be a lot of work. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From trn200190 at gmail.com Wed May 16 11:04:10 2012 From: trn200190 at gmail.com (i m what i m ~~~~) Date: Wed, 16 May 2012 11:04:10 -0700 Subject: [Live-devel] Does Test On Demand server support rtsp over http tunneling?? Message-ID: Hello Ross Does Test On Demand server support rtsp over http tunneling in case of A MPEG-2 Transport Stream, coming from a live UDP (raw-UDP or RTP/UDP) source: ?? If yes,which files do we have to add to the solution??? Thanks tarun -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed May 16 12:15:20 2012 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 16 May 2012 12:15:20 -0700 Subject: [Live-devel] Does Test On Demand server support rtsp over http tunneling?? In-Reply-To: References: Message-ID: <6680D32E-A5C0-4ED6-A94C-809CC04F6524@live555.com> Please stop posting the same message - or close to the same message - to the mailing list multiple times! "testOnDemandRTSPServer" (and the "live555MediaServer") supports RTSP-over-HTTP tunneling (which is really RTSP/RTP/RTCP-over-HTTP tunneling), for *all* streams that it serves. The HTTP port number is displayed at the bottom, after "testOnDemandRTSPServer" starts up. You can verify that RTSP-over-HTTP tunneling works, by running the "openRTSP" client, with the "-T " option. HOWEVER, RTSP-over-HTTP tunneling currently does not work correctly for the VLC media player. We can't do anything about this, because VLC is not our application. Please DO NOT send ANY MORE emails about this, otherwise you will be banned from posting to this mailing list. (I will not tolerate 'denial of service' attacks from people who use @gmail.com email addresses. This software is not intended for you.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From bruno.abreu at livingdata.pt Thu May 17 04:02:42 2012 From: bruno.abreu at livingdata.pt (Bruno Abreu) Date: Thu, 17 May 2012 12:02:42 +0100 Subject: [Live-devel] Two possible problems in RTSPServer class In-Reply-To: <64C2C7BE-B757-4B9D-B8D6-3C04798BB289@live555.com> References: <64C2C7BE-B757-4B9D-B8D6-3C04798BB289@live555.com> Message-ID: <4FB4DAD2.5020900@livingdata.pt> On 04/22/2012 06:46 AM, Ross Finlayson wrote: >> 2. Also I have seen possible ServerMediaSession object relations bug >> in RTSPServer::addServerMediaSession/RTSPServer::removeServerMediaSession >> methods. >> Method addServerMediaSession adds a ServerMediaSession object to >> hashmap fServerMediaSessions and call removeServerMediaSession on >> ServerMediaSession object previously set in the hashmap (if both >> objects have same stream name). Method removeServerMediaSession >> removes from the hashmap new ServerMediaSession object just added by >> addServerMediaSession. > > Yes, this is a bug. 
It will get fixed in a future release.

I'm sorry, but I fail to see how this was a bug. Seems to me that RTSPServer::addServerMediaSession was removing an existing ServerMediaSession with the same name as the one being created, if any. Which sounded about right.

The current fix, however, has introduced a nasty bug to our application. We are running a streaming server in which we use a DynamicRTSPServer, similar to the one of the LIVE555 Media Server application, on which it is based.

With the current solution, the DynamicRTSPServer::lookupServerMediaSession method calls RTSPServer::addServerMediaSession (as before) which calls RTSPServer::removeServerMediaSession(char const* streamName) which, in turn, calls DynamicRTSPServer::lookupServerMediaSession again. This happens recursively until our application crashes.

I tested the live555MediaServer app to see if it was a problem with our implementation, but it exhibits a similar behavior, although in this case recursion only happens about 1000 times and then stops. I don't know why recursion stops in one case and not in the other, but I don't think it was supposed to happen at all, anyway.

I added a log message to RTSPServer::addServerMediaSession and LIVE555 Media Server's DynamicRTSPServer::lookupServerMediaSession and this was the output:

LIVE555 Media Server version 0.74 (LIVE555 Streaming Media library version 2012.05.11).
...
accept()ed connection from 192.168.1.10
DynamicRTSPServer::lookupServerMediaSession: picolo.264
RTSPServer::addServerMediaSession: picolo.264
DynamicRTSPServer::lookupServerMediaSession: picolo.264
RTSPServer::addServerMediaSession: picolo.264
DynamicRTSPServer::lookupServerMediaSession: picolo.264
RTSPServer::addServerMediaSession: picolo.264
... last 2 messages repeated about 1000 times

So, I have a question: is this a bug, or isn't DynamicRTSPServer::lookupServerMediaSession supposed to call RTSPServer::addServerMediaSession? If it isn't a bug we'll have to fix our app, and I guess the same will have to be done to LIVE555 Media Server, although it still works. Otherwise I think the current solution still needs some improvement.

Thank you,

Bruno Abreu

--
Living Data - Sistemas de Informação e Apoio à Decisão, Lda.
LxFactory - Rua Rodrigues de Faria, 103, edifício I - 4º piso
Phone: +351 213622163    1300-501 LISBOA
Fax: +351 213622165      Portugal
URL: www.livingdata.pt

From ashfaque at iwavesystems.com  Thu May 17 04:35:10 2012
From: ashfaque at iwavesystems.com (Ashfaque)
Date: Thu, 17 May 2012 17:05:10 +0530
Subject: [Live-devel] Failed to join group: Invalid IP Address
In-Reply-To: <009AB714-FB5E-41AC-BFC7-E04BB64103A8@live555.com>
References: <3804BF7B625B4C37962BF9101F5666CE@iwns10> <009AB714-FB5E-41AC-BFC7-E04BB64103A8@live555.com>
Message-ID: 

Thanks a lot Ross. I added the route and it worked fine. Now able to stream from the platform to the iPad.

But there is another issue we are facing after streaming is started: the Wifi connection is getting disconnected after streaming for some 5 minutes (ping response fails). We are doubting the application, as the devices are able to ping for a long duration when the streaming application is not started.

Console print (not sure whether it will help).

Can you please provide some inputs on this.
/* Debug prints on connection lost on console */ ------------[ cut here ]------------ WARNING: at net/sched/sch_generic.c:258 dev_watchdog+0x17c/0x284() NETDEV WATCHDOG: uap0 (wlan_sdio): transmit queue 0 timed out Modules linked in: em28xx saa7115 videobuf_vmalloc videobuf_core tveeprom sd8xxx mlan(P) [<8002f4c0>] (unwind_backtrace+0x0/0xf0) from [<8004f9b0>] (warn_slowpath_common+0x4c/0x64) [<8004f9b0>] (warn_slowpath_common+0x4c/0x64) from [<8004fa48>] (warn_slowpath_fmt+0x2c/0x3c) [<8004fa48>] (warn_slowpath_fmt+0x2c/0x3c) from [<8033bc54>] (dev_watchdog+0x17c/0x284) [<8033bc54>] (dev_watchdog+0x17c/0x284) from [<8005a7a0>] (run_timer_softirq+0x16c/0x23c) [<8005a7a0>] (run_timer_softirq+0x16c/0x23c) from [<80054e58>] (__do_softirq+0x70/0xf8) [<80054e58>] (__do_softirq+0x70/0xf8) from [<80054f24>] (irq_exit+0x44/0xa8) [<80054f24>] (irq_exit+0x44/0xa8) from [<8002a070>] (asm_do_IRQ+0x70/0x8c) [<8002a070>] (asm_do_IRQ+0x70/0x8c) from [<8002aa8c>] (__irq_svc+0x4c/0xcc) Exception stack(0x807d7f78 to 0x807d7fc0) 7f60: 807e4cb4 c3c5dc13 7f80: 00000001 00000000 807d6000 807daa18 80821f84 807daa10 700224b8 412fc085 7fa0: 0000001f 00000000 00000003 807d7fc0 8002ba84 8002ba88 60000013 ffffffff [<8002aa8c>] (__irq_svc+0x4c/0xcc) from [<8002ba88>] (default_idle+0x24/0x28) [<8002ba88>] (default_idle+0x24/0x28) from [<8002bf6c>] (cpu_idle+0x48/0xa0) [<8002bf6c>] (cpu_idle+0x48/0xa0) from [<80008938>] (start_kernel+0x234/0x284) [<80008938>] (start_kernel+0x234/0x284) from [<70008034>] (0x70008034) ---[ end trace 3ea9d70442c4aedb ]--- 4294961629 : Tx timeout, bss_index=1 Regards, Ashfaque From: Ross Finlayson Sent: Tuesday, May 15, 2012 9:10 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Failed to join group: Invalid IP Address It shows following error messages: 00:07:36 Groupsock(8: 232.202.29.84, 18888, 255): failed to join group: setsockopt(IP_ADD_MEMBERSHIP) error: No such device It sounds like you don't have any multicast route set up on this device. (The code uses multicast, initially, to figure out the system's own IP address.) Make sure that you have a route for 224.0.0.0/4 set up on your network interface. Then the code should work. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------------------------------------------------------------------------- _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu May 17 07:13:08 2012 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 17 May 2012 07:13:08 -0700 Subject: [Live-devel] Two possible problems in RTSPServer class In-Reply-To: <4FB4DAD2.5020900@livingdata.pt> References: <64C2C7BE-B757-4B9D-B8D6-3C04798BB289@live555.com> <4FB4DAD2.5020900@livingdata.pt> Message-ID: <7867716E-71CF-4809-A0B5-FE9F107F837E@live555.com> >>> 2. Also I have seen possible ServerMediaSession object relations bug >>> in RTSPServer::addServerMediaSession/RTSPServer::removeServerMediaSession >>> methods. >>> Method addServerMediaSession adds a ServerMediaSession object to >>> hashmap fServerMediaSessions and call removeServerMediaSession on >>> ServerMediaSession object previously set in the hashmap (if both >>> objects have same stream name). Method removeServerMediaSession >>> removes from the hashmap new ServerMediaSession object just added by >>> addServerMediaSession. >> >> Yes, this is a bug. 
It will get fixed in a future release. > > I'm sorry, but I fail to see how this was a bug. Seems to me that RTSPServer::addServerMediaSession was removing an existing ServerMediaSession with the same name as the one being created, if any. No, the old code was doing (when implementing "RTSPServer::addServerMediaSession()"): ServerMediaSession* existingSession = (ServerMediaSession*)(fServerMediaSessions->Add(sessionName, (void*)serverMediaSession)); removeServerMediaSession(existingSession); // if any which was a bug, because - if "existingSession" was not NULL, then the call to removeServerMediaSession(existingSession); // if any happened to remove - from the hash table - the *new* "serverMediaSession" that had just been added - because of the way that RTSPServer::removeServerMediaSession(ServerMediaSession* existingSession) is implemented by calling Remove(existingSession->streamName()); on the internal hash table. The new code does: removeServerMediaSession(sessionName); // in case an existing "ServerMediaSession" with this name already exists fServerMediaSessions->Add(sessionName, (void*)serverMediaSession); which is correct, except for the fact that void RTSPServer::removeServerMediaSession(char const* sessionName) is implemented by calling removeServerMediaSession(lookupServerMediaSession(sessionName)); i.e., by calling the "lookupServerMediaSession()" *virtual function*. That's a bug. The code should just be doing a hash table lookup here; not calling the "lookupServerMediaSession()" virtual function. I've now installed a new version (2012.05.17) of the code that fixes this. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu May 17 07:14:55 2012 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 17 May 2012 07:14:55 -0700 Subject: [Live-devel] Failed to join group: Invalid IP Address In-Reply-To: References: <3804BF7B625B4C37962BF9101F5666CE@iwns10> <009AB714-FB5E-41AC-BFC7-E04BB64103A8@live555.com> Message-ID: > But there is another issue we are facing after streaming is started, Wifi connection is getting disconnected after streaming for some 5 minutes duration (ping response fails). > We are doubting the application as the devices are able to ping for a long duration also(when streaming application is not started). > > Console print (Not sure whether it will help). > > Can you please provide some inputs on this. > > /* Debug prints on connection lost on console */ > ------------[ cut here ]------------ > WARNING: at net/sched/sch_generic.c:258 dev_watchdog+0x17c/0x284() > NETDEV WATCHDOG: uap0 (wlan_sdio): transmit queue 0 timed out This is an operating system (network packet driver) issue, that has nothing to do with our code. Sorry. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From bruno.abreu at livingdata.pt Thu May 17 09:45:18 2012 From: bruno.abreu at livingdata.pt (Bruno Abreu) Date: Thu, 17 May 2012 17:45:18 +0100 Subject: [Live-devel] Two possible problems in RTSPServer class In-Reply-To: <7867716E-71CF-4809-A0B5-FE9F107F837E@live555.com> References: <64C2C7BE-B757-4B9D-B8D6-3C04798BB289@live555.com> <4FB4DAD2.5020900@livingdata.pt> <7867716E-71CF-4809-A0B5-FE9F107F837E@live555.com> Message-ID: <20120517163103.M61003@livingdata.pt> On Thu, 17 May 2012 07:13:08 -0700, Ross Finlayson wrote > > I'm sorry, but I fail to see how this was a bug. 
Seems to me that RTSPServer::addServerMediaSession was removing an existing ServerMediaSession with the same name as the one being created, if any. Which sounded about right.

> No, the old code was doing (when implementing "RTSPServer::addServerMediaSession()"):
>
>   ServerMediaSession* existingSession
>     = (ServerMediaSession*)(fServerMediaSessions->Add(sessionName, (void*)serverMediaSession));
>   removeServerMediaSession(existingSession); // if any
>
> which was a bug, because - if "existingSession" was not NULL, then the call to
>
>   removeServerMediaSession(existingSession); // if any
>
> happened to remove - from the hash table - the *new* "serverMediaSession" that had just been added - because of the way that
>
>   RTSPServer::removeServerMediaSession(ServerMediaSession* existingSession)
>
> is implemented by calling
>
>   Remove(existingSession->streamName());
>
> on the internal hash table.

OK, I get it now.

> The new code does:
>
>   removeServerMediaSession(sessionName); // in case an existing "ServerMediaSession" with this name already exists
>   fServerMediaSessions->Add(sessionName, (void*)serverMediaSession);
>
> which is correct, except for the fact that
>
>   void RTSPServer::removeServerMediaSession(char const* sessionName)
>
> is implemented by calling
>
>   removeServerMediaSession(lookupServerMediaSession(sessionName));
>
> i.e., by calling the "lookupServerMediaSession()" *virtual function*. That's a bug. The code should just be doing a hash table lookup here; not calling the "lookupServerMediaSession()" virtual function.
>
> I've now installed a new version (2012.05.17) of the code that fixes this.

And it is working perfectly. Thank you so much, Ross.

Bruno Abreu

--
Living Data - Sistemas de Informação e Apoio à Decisão, Lda.
LxFactory - Rua Rodrigues de Faria, 103, edifício I - 4º piso
Phone: +351 213622163    1300-501 LISBOA
Fax: +351 213622165      Portugal
URL: www.livingdata.pt

From csavtche at gmail.com  Thu May 17 12:08:07 2012
From: csavtche at gmail.com (Constantin Savtchenko)
Date: Thu, 17 May 2012 15:08:07 -0400
Subject: [Live-devel] Compiling LiveMedia For Debug In Windows
Message-ID: 

Hello all,

I was wondering if anyone had recompiled the Live555 libraries with debug symbols? Do I simply set the _ITERATOR_DEBUG_LEVEL in the *.mak files? Is it using some default level from when it includes ntwin32.mak, or will I be okay with just setting my own?

Thank you.

Constantin S

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From christopher.compagnon at gmail.com  Thu May 17 22:08:51 2012
From: christopher.compagnon at gmail.com (Christopher COMPAGNON)
Date: Fri, 18 May 2012 07:08:51 +0200
Subject: [Live-devel] liveMedia/FreeBSD : compilation fails on ar command
In-Reply-To: <4FB5D902.5040501@gmail.com>
References: <4FB5D902.5040501@gmail.com>
Message-ID: <4FB5D963.1050805@gmail.com>

> Hello !
>
> I ran genMakefiles and then I tried to compile liveMedia on FreeBSD (8 & 9: fresh-installed systems). But I have an issue:
But I have an issue : > > /ar crlibliveMedia.a Media.o MediaSource.o FramedSource.o > FramedFileSource.o FramedFilter.o ByteStreamFileSource.o > ByteStreamMultiFileSource.o ByteStreamMemoryBufferSource.o > BasicUDPSource.o DeviceSource.o AudioInputDevice.o > WAVAudioFileSource.o MPEG1or2Demux.o MPEG1or2DemuxedElementaryStream.o > MPEGVideoStreamFramer.o MPEG1or2VideoStreamFramer.o > MPEG1or2VideoStreamDiscreteFramer.o MPEG4VideoStreamFramer.o > MPEG4VideoStreamDiscreteFramer.o H264VideoStreamFramer.o > H264VideoStreamDiscreteFramer.o MPEGVideoStreamParser.o > MPEG1or2AudioStreamFramer.o MPEG1or2AudioRTPSource.o > MPEG4LATMAudioRTPSource.o MPEG4ESVideoRTPSource.o > MPEG4GenericRTPSource.o MP3FileSource.o MP3Transcoder.o MP3ADU.o > MP3ADUdescriptor.o MP3ADUinterleaving.o MP3ADUTranscoder.o > MP3StreamState.o MP3Internals.o MP3InternalsHuffman.o > MP3InternalsHuffmanTable.o MP3ADURTPSource.o MPEG1or2VideoRTPSource.o > MPEG2TransportStreamMultiplexor.o MPEG2TransportStreamFromPESSource.o > MPEG2TransportStreamFromESSource.o MPEG2TransportStreamFramer.o > ADTSAudioFileSource.o H263plusVideoRTPSource.o > H263plusVideoStreamFramer.o H263plusVideoStreamParser.o > AC3AudioStreamFramer.o AC3AudioRTPSource.o DVVideoStreamFramer.o > DVVideoRTPSource.o JPEGVideoSource.o AMRAudioSource.o > AMRAudioFileSource.o InputFile.o StreamReplicator.o MediaSink.o > FileSink.o BasicUDPSink.o AMRAudioFileSink.o H264VideoFileSink.o > MPEG1or2AudioRTPSink.o MP3ADURTPSink.o MPEG1or2VideoRTPSink.o > MPEG4LATMAudioRTPSink.o MPEG4GenericRTPSink.o MPEG4ESVideoRTPSink.o > H263plusVideoRTPSink.o H264VideoRTPSink.o DVVideoRTPSink.o > AC3AudioRTPSink.o VorbisAudioRTPSink.o VP8VideoRTPSink.o > GSMAudioRTPSink.o JPEGVideoRTPSink.o SimpleRTPSink.o AMRAudioRTPSink.o > T140TextRTPSink.o TCPStreamSink.o OutputFile.o uLawAudioFilter.o > RTPSource.o MultiFramedRTPSource.o SimpleRTPSource.o > H261VideoRTPSource.o H264VideoRTPSource.o QCELPAudioRTPSource.o > AMRAudioRTPSource.o JPEGVideoRTPSource.o VorbisAudioRTPSource.o > VP8VideoRTPSource.o RTPSink.o MultiFramedRTPSink.o AudioRTPSink.o > VideoRTPSink.o TextRTPSink.o RTPInterface.o RTCP.o rtcp_from_spec.o > RTSPServer.o RTSPClient.o RTSPCommon.o > RTSPServerSupportingHTTPStreaming.o SIPClient.o MediaSession.o > ServerMediaSession.o PassiveServerMediaSubsession.o > OnDemandServerMediaSubsession.o FileServerMediaSubsession.o > MPEG4VideoFileServerMediaSubsession.o > H264VideoFileServerMediaSubsession.o > H263plusVideoFileServerMediaSubsession.o > WAVAudioFileServerMediaSubsession.o > AMRAudioFileServerMediaSubsession.o > MP3AudioFileServerMediaSubsession.o > MPEG1or2VideoFileServerMediaSubsession.o MPEG1or2FileServerDemux.o > MPEG1or2DemuxedServerMediaSubsession.o > MPEG2TransportFileServerMediaSubsession.o > ADTSAudioFileServerMediaSubsession.o > DVVideoFileServerMediaSubsession.o AC3AudioFileServerMediaSubsession.o > MPEG2TransportUDPServerMediaSubsession.o ProxyServerMediaSession.o > QuickTimeFileSink.o QuickTimeGenericRTPSource.o AVIFileSink.o > MPEG2IndexFromTransportStream.o MPEG2TransportStreamIndexFile.o > MPEG2TransportStreamTrickModeFilter.o MatroskaFile.o > MatroskaFileParser.o EBMLNumber.o MatroskaDemuxedTrack.o > MatroskaFileServerDemux.o H264VideoMatroskaFileServerMediaSubsession.o > VP8VideoMatroskaFileServerMediaSubsession.o > AACAudioMatroskaFileServerMediaSubsession.o > AC3AudioMatroskaFileServerMediaSubsession.o > MP3AudioMatroskaFileServerMediaSubsession.o > VorbisAudioMatroskaFileServerMediaSubsession.o > T140TextMatroskaFileServerMediaSubsession.o DarwinInjector.o > 
BitVector.o StreamParser.o DigestAuthentication.o our_md5.o > our_md5hl.o Base64.o Locale.o > ar: invalid option -- e > usage: ar -d [-Tjsvz] archive file ... > ar -m [-Tjsvz] archive file ... > ar -m [-Tabijsvz] position archive file ... > ar -p [-Tv] archive [file ...] > ar -q [-TcDjsvz] archive file ... > ar -r [-TcDjsuvz] archive file ... > ar -r [-TabcDijsuvz] position archive file ... > ar -s [-jz] archive > ar -t [-Tv] archive [file ...] > ar -x [-CTouv] archive [file ...] > ar -V > *** Error code 64 > > Stop in /usr/local/live/liveMedia./ > > > I think that the issue comes from the absence of a space between "cr" > and /"libliveMedia.a"/ > For solving the issue I modified the Makefile files. > > The Makefile files contain the following command : > $(LIBRARY_LINK)$@ $(LIBRARY_LINK_OPTS) > It might be $(LIBRARY_LINK) $@ $(LIBRARY_LINK_OPTS) > > Now the compilation works without issue, but do not complile. I have > no progs and no" make clean"... > I see only .cpp files, hh files, etc. but no prog files. > > / > /Do you know what I have to change for compiling with success ? > > Thanks and regards. > C. COMPAGNON > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri May 18 00:37:22 2012 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 18 May 2012 00:37:22 -0700 Subject: [Live-devel] liveMedia/FreeBSD : compilation fails on ar command In-Reply-To: <4FB5D963.1050805@gmail.com> References: <4FB5D902.5040501@gmail.com> <4FB5D963.1050805@gmail.com> Message-ID: <20F8EC52-CCAC-471E-A9B8-9FFC3C9D1690@live555.com> >> I ran genMakefiles and then I tried to compile liveMedia on FreeBSD (8 & 9 : fresh installed systems). But I have an issue : >> >> ar crlibliveMedia.a The problem is the missing space after "ar cr". If that's then, then "ar" should work. But I don't understand why/how your Makefile could be missing that space, if you ran genMakefiles freebsd to generate the Makefiles... Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From trn200190 at gmail.com Fri May 18 00:49:04 2012 From: trn200190 at gmail.com (i m what i m ~~~~) Date: Fri, 18 May 2012 13:19:04 +0530 Subject: [Live-devel] Can test on demand server be implemented to handle multiple clients Message-ID: Hello Ross Remarkable work "The Live Media". Ross i want to ask u something is this possible or not:- In your test on demand server implementation there is an option in which server listens to the incoming mpeg2 stream(client), Can this server be implemented to handle multiple clients(as we have to hard code the ip for each client how can we make this server to handle multiple clients). Please guide me on this. Thanks Tarun -------------- next part -------------- An HTML attachment was scrubbed... URL: From christopher.compagnon at gmail.com Fri May 18 01:12:01 2012 From: christopher.compagnon at gmail.com (Christopher COMPAGNON) Date: Fri, 18 May 2012 10:12:01 +0200 Subject: [Live-devel] liveMedia/FreeBSD : compilation fails on ar command In-Reply-To: References: <4FB5D902.5040501@gmail.com> <4FB5D963.1050805@gmail.com> <20F8EC52-CCAC-471E-A9B8-9FFC3C9D1690@live555.com> Message-ID: Note : I solved the ar issue the following solution : In Makefile.tail, I added a space in the line $(LIBRARY_LINK)$@ $(LIBRARY_LINK_OPTS) \ between $(LIBRARY_LINK) and $@... Best regards. C.COMPAGNON 2012/5/18 Christopher COMPAGNON > Hello ! 
> > Like you, I do not understand why the space is missing (2-3 years ago, I > compiled this application with success). > I ran genMakefiles as explain. > I tried compile as usual as explain.--> error > > Then, I added the space manually in each Makefile. > > I have just seen a new version of source. > I will try to compile with this new version... > > Thanks and regards. > C.COMPAGNON > > 2012/5/18 Ross Finlayson > >> I ran genMakefiles and then I tried to compile liveMedia on FreeBSD (8 & >> 9 : fresh installed systems). But I have an issue : >> >> *ar crlibliveMedia.a* >> >> >> The problem is the missing space after "ar cr". If that's then, then >> "ar" should work. >> >> But I don't understand why/how your Makefile could be missing that space, >> if you ran >> genMakefiles freebsd >> to generate the Makefiles... >> >> >> Ross Finlayson >> Live Networks, Inc. >> http://www.live555.com/ >> >> >> _______________________________________________ >> live-devel mailing list >> live-devel at lists.live555.com >> http://lists.live555.com/mailman/listinfo/live-devel >> >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri May 18 02:13:59 2012 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 18 May 2012 02:13:59 -0700 Subject: [Live-devel] Can test on demand server be implemented to handle multiple clients In-Reply-To: References: Message-ID: <9F44C456-5FE2-43FD-88F7-D7A0844067B3@live555.com> > In your test on demand server implementation there is an option in which server listens to the incoming mpeg2 stream(client), > Can this server be implemented to handle multiple clients Yes, all of our RTSP servers handle multiple concurrent clients automatically. You don't need to do anything extra to get this. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri May 18 02:26:41 2012 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 18 May 2012 02:26:41 -0700 Subject: [Live-devel] liveMedia/FreeBSD : compilation fails on ar command In-Reply-To: References: <4FB5D902.5040501@gmail.com> <4FB5D963.1050805@gmail.com> <20F8EC52-CCAC-471E-A9B8-9FFC3C9D1690@live555.com> Message-ID: > I solved the ar issue the following solution : > In Makefile.tail, I added a space in the line > $(LIBRARY_LINK)$@ $(LIBRARY_LINK_OPTS) \ > between $(LIBRARY_LINK) and $@... But you shouldn't have to do this, because the space is already present - at the end of the "LIBRARY_LINK" line in the "config.freebsd" file. Noone else has had problems building the code for Freebsd after just doing "genMakefiles freebsd". I don't know why it's not working for you... Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From trn200190 at gmail.com Fri May 18 02:53:26 2012 From: trn200190 at gmail.com (i m what i m ~~~~) Date: Fri, 18 May 2012 15:23:26 +0530 Subject: [Live-devel] Can test on demand server be implemented to handle multiple clients In-Reply-To: <9F44C456-5FE2-43FD-88F7-D7A0844067B3@live555.com> References: <9F44C456-5FE2-43FD-88F7-D7A0844067B3@live555.com> Message-ID: Hey Ross In this case clients are not the vlc player,clients are the testMPEG2TransportStreamer, Sorry if i was not able to explain question earlier,My ques is i want to run 2 or 3 testMPEG2TransportStreamer application and can these multiple applications be handled by test on demand server??? On Fri, May 18, 2012 at 2:43 PM, Ross Finlayson wrote: > In your test on demand server implementation there is an option in which > server listens to the incoming mpeg2 stream(client), > Can this server be implemented to handle multiple clients > > > Yes, all of our RTSP servers handle multiple concurrent clients > automatically. You don't need to do anything extra to get this. > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri May 18 05:59:04 2012 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 18 May 2012 05:59:04 -0700 Subject: [Live-devel] Can test on demand server be implemented to handle multiple clients In-Reply-To: References: <9F44C456-5FE2-43FD-88F7-D7A0844067B3@live555.com> Message-ID: <647D4B10-4260-47A4-B03F-C0516A413AC4@live555.com> > In this case clients are not the vlc player,clients are the testMPEG2TransportStreamer, > Sorry if i was not able to explain question earlier,My ques is i want to run 2 or 3 testMPEG2TransportStreamer application and can these multiple applications be handled by test on demand server??? Yes, you can do this, by duplicating the code block // A MPEG-2 Transport Stream, coming from a live UDP (raw-UDP or RTP/UDP) source: { ... } for each input stream. In each case, you would use a different "streamName", "inputAddressStr" (which must be for a multicast address), and "inputPortNum". Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From trn200190 at gmail.com Fri May 18 08:28:34 2012 From: trn200190 at gmail.com (i m what i m ~~~~) Date: Fri, 18 May 2012 20:58:34 +0530 Subject: [Live-devel] Can test on demand server be implemented to handle multiple clients In-Reply-To: <647D4B10-4260-47A4-B03F-C0516A413AC4@live555.com> References: <9F44C456-5FE2-43FD-88F7-D7A0844067B3@live555.com> <647D4B10-4260-47A4-B03F-C0516A413AC4@live555.com> Message-ID: Hey Ross Thats obvious sir if we duplicate the code block and write source of different test mpeg2 streamer. Can the test on demand server file be act as following:- the server should always be in listening mode and every time when i run the test mpeg2 streamer the server creates a thread(thread containing the code of A MPEG-2 Transport Stream, coming from a live UDP (raw-UDP or RTP/UDP) source:) and thread binds to a particular a particular test mpeg 2 streamer...?? 
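For reference, the duplication that Ross suggests (quoted below) might look roughly like the following inside the testOnDemandRTSPServer.cpp main(). This is only a sketch, not the actual demo code: the stream names, multicast addresses and port numbers are placeholders, and "env", "rtspServer" and "descriptionString" are the variables the demo already defines. It uses the MPEG2TransportUDPServerMediaSubsession class that the demo uses for a live UDP Transport Stream input:

  // First incoming MPEG-2 Transport Stream (live UDP source):
  {
    char const* streamName = "inputStream1";          // placeholder
    char const* inputAddressStr = "239.255.42.42";    // placeholder multicast address
    portNumBits const inputPortNum = 1234;            // placeholder port
    Boolean const inputStreamIsRawUDP = False;        // input is RTP/UDP
    ServerMediaSession* sms
      = ServerMediaSession::createNew(*env, streamName, streamName, descriptionString);
    sms->addSubsession(MPEG2TransportUDPServerMediaSubsession::createNew(
        *env, inputAddressStr, inputPortNum, inputStreamIsRawUDP));
    rtspServer->addServerMediaSession(sms);
  }
  // Second incoming stream: the same block again, with a different
  // "streamName", "inputAddressStr" and "inputPortNum":
  {
    char const* streamName = "inputStream2";          // placeholder
    char const* inputAddressStr = "239.255.42.43";    // placeholder
    portNumBits const inputPortNum = 1236;            // placeholder
    Boolean const inputStreamIsRawUDP = False;
    ServerMediaSession* sms
      = ServerMediaSession::createNew(*env, streamName, streamName, descriptionString);
    sms->addSubsession(MPEG2TransportUDPServerMediaSubsession::createNew(
        *env, inputAddressStr, inputPortNum, inputStreamIsRawUDP));
    rtspServer->addServerMediaSession(sms);
  }

Each block just registers one more ServerMediaSession with the same RTSP server; the single event loop in doEventLoop() then serves all of the inputs (and all connected RTSP clients) without any extra threads.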
On Fri, May 18, 2012 at 6:29 PM, Ross Finlayson wrote: > In this case clients are not the vlc player,clients are the > testMPEG2TransportStreamer, > Sorry if i was not able to explain question earlier,My ques is i want to > run 2 or 3 testMPEG2TransportStreamer application and can these multiple > applications be handled by test on demand server??? > > > Yes, you can do this, by duplicating the code block > // A MPEG-2 Transport Stream, coming from a live UDP (raw-UDP or RTP/UDP) > source: > { > ... > } > for each input stream. In each case, you would use a different > "streamName", "inputAddressStr" (which must be for a multicast address), > and "inputPortNum". > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From csavtche at gmail.com Fri May 18 10:57:14 2012 From: csavtche at gmail.com (Constantin Savtchenko) Date: Fri, 18 May 2012 13:57:14 -0400 Subject: [Live-devel] MediaSession Destructor Is Protected Message-ID: Hello All, I am trying to manage my resources correctly, and I've discovered you cannot delete a MediaSession... What is the proper way to free a MediaSession's resources. An scenario would be when you wish to reinitialize a MediaSession with a new DESCRIBE command. Thank you. Constantin -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri May 18 11:16:30 2012 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 18 May 2012 11:16:30 -0700 Subject: [Live-devel] Can test on demand server be implemented to handle multiple clients In-Reply-To: References: <9F44C456-5FE2-43FD-88F7-D7A0844067B3@live555.com> <647D4B10-4260-47A4-B03F-C0516A413AC4@live555.com> Message-ID: <0D616ADD-0D76-4462-A1EC-607BDED2CA88@live555.com> > Thats obvious sir if we duplicate the code block and write source of different test mpeg2 streamer. > Can the test on demand server file be act as following:- > the server should always be in listening mode and > every time when i run the test mpeg2 streamer the server creates a thread(thread containing the code of A MPEG-2 Transport Stream, coming from a live UDP (raw-UDP or RTP/UDP) source:) and thread binds to a particular a particular test mpeg 2 streamer...?? What is your problem?? You've been told to read the FAQ, so you (should) know that LIVE555-based applications use an event loop, not multiple threads, for concurrency. If you write your server the way that I told you - by duplicating the code block for reading from a particular multicast address+port - then the server will be able to handle multiple incoming streams concurrently. You don't need threads to do this. This is the 'last straw'. You've been posting a ridiculous number of messages to the mailing list, despite the fact (as you show by your email address) that you're just a casual hobbyist. No more. You have now been removed from the mailing list, and banned from resubscribing. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Fri May 18 11:23:42 2012 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 18 May 2012 11:23:42 -0700 Subject: [Live-devel] MediaSession Destructor Is Protected In-Reply-To: References: Message-ID: > I am trying to manage my resources correctly, and I've discovered you cannot delete a MediaSession... What is the proper way to free a MediaSession's resources. "MediaSession" - like all subclasses of "Medium" - objects cannot be deleted directly. Instead, you reclaim such an object by calling Medium:close(subsession); Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From shiyong.zhang.cn at gmail.com Sat May 19 00:10:04 2012 From: shiyong.zhang.cn at gmail.com (=?GB2312?B?1cXKwNPC?=) Date: Sat, 19 May 2012 15:10:04 +0800 Subject: [Live-devel] Problem in function RTSPClient::handleResponseBytes and RTSPClient::incomingDataHandler1() Message-ID: Hello In RTSPClient line 1312 void RTSPClient::incomingDataHandler1() { line 1313 struct sockaddr_in dummy; // 'from' address - not used line 1314 line 1315 int bytesRead = readSocket(envir(), fInputSocketNum, (unsigned char*)&fResponseBuffer[fResponseBytesAlreadySeen], fResponseBufferBytesLeft, dummy); line 1316 handleResponseBytes(bytesRead); line 1317 } line 1338 void RTSPClient::handleResponseBytes(int newBytesRead) { line 1339 do { line 1340 if (newBytesRead > 0 && (unsigned)newBytesRead < fResponseBufferBytesLeft) break; // data was read OK; process it below line 1341 line 1342 if (newBytesRead >= (int)fResponseBufferBytesLeft) { line 1343 // We filled up our response buffer. Treat this as an error (for the first response handler): line 1344 envir().setResultMsg("RTSP response was truncated. Increase \"RTSPClient::responseBufferSize\""); line 1345 } ... line 1370 fResponseBuffer[fResponseBytesAlreadySeen] = '\0'; In line 1340, You expect newBytesRead < fResponseBufferBytesLeft, but it's possible newBytesRead == fResponseBufferBytesLeft. Though you expectd to increase responseBufferSize to contain network date, but it cann't avoid it.In some scenario it's still happen, especially in RTP over RTSP. Would you mind to change function RTSPClient::incomingDataHandler1, line 1315 as the following: int bytesRead = readSocket(envir(), fInputSocketNum, (unsigned char*)&fResponseBuffer[fResponseBytesAlreadySeen], fResponseBufferBytesLeft- 1, dummy); At this time, codes between line 1339-1366 maybe useless. I don't think it can be avoid this problem at its root by change the value of TSPClient::responseBufferSize. How do you think ? Br Shiyong Zhang -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sat May 19 00:45:26 2012 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 19 May 2012 00:45:26 -0700 Subject: [Live-devel] Problem in function RTSPClient::handleResponseBytes and RTSPClient::incomingDataHandler1() In-Reply-To: References: Message-ID: No, I believe that the current code is correct. We want the response buffer to be large enough to hold all of the response data; if the buffer is not large enough, then that's an error. That's why we have the code: if (newBytesRead >= (int)fResponseBufferBytesLeft) { // We filled up our response buffer. Treat this as an error (for the first response handler): envir().setResultMsg("RTSP response was truncated. 
Increase \"RTSPClient::responseBufferSize\""); } However, the default response buffer size of 20000 bytes should, in practice, be large enough for any conceivable RTSP response, so I don't understand why you would be filling up the buffer. (Note that the data that we're receiving here should only be RTSP responses, even for RTP-over-RTSP streams.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From ranshalit at gmail.com Sun May 20 06:16:45 2012 From: ranshalit at gmail.com (Ran Shalit) Date: Sun, 20 May 2012 16:16:45 +0300 Subject: [Live-devel] codecs supported Message-ID: Hello, I?was searching for a list of supported codecs in the web page. It is said "The libraries can also be used to stream, receive, and process MPEG, H.264, H.263+, DV or JPEG video, and several audio codecs." Is MPEG-2 (ISO 13818-2) also supported ? Thanks, Ran From finlayson at live555.com Sun May 20 09:20:59 2012 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 20 May 2012 09:20:59 -0700 Subject: [Live-devel] codecs supported In-Reply-To: References: Message-ID: <81740DFA-53DC-48D8-9EB0-BFE8C1F6297E@live555.com> > I was searching for a list of supported codecs in the web page. > It is said "The libraries can also be used to stream, receive, and > process MPEG, H.264, H.263+, DV or JPEG video, and several audio > codecs." > Is MPEG-2 (ISO 13818-2) also supported ? Yes. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From ranshalit at gmail.com Sun May 20 10:14:21 2012 From: ranshalit at gmail.com (Ran Shalit) Date: Sun, 20 May 2012 19:14:21 +0200 Subject: [Live-devel] codecs supported In-Reply-To: <81740DFA-53DC-48D8-9EB0-BFE8C1F6297E@live555.com> References: <81740DFA-53DC-48D8-9EB0-BFE8C1F6297E@live555.com> Message-ID: Thanks Ross, Now, I understand that there is no documentary about the live555 streamer (from the faq page), But Is there somewhere a complete list of all supported codecs (MPEG-2 was not part of the list of codecs from below). Best Regards, Ran On Sun, May 20, 2012 at 6:20 PM, Ross Finlayson wrote: > I?was searching for a list of supported codecs in the web page. > It is said "The libraries can also be used to stream, receive, and > process MPEG, H.264, H.263+, DV or JPEG video, and several audio > codecs." > Is MPEG-2 (ISO 13818-2) also supported ? > > > Yes. > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > From finlayson at live555.com Sun May 20 10:52:51 2012 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 20 May 2012 10:52:51 -0700 Subject: [Live-devel] codecs supported In-Reply-To: References: <81740DFA-53DC-48D8-9EB0-BFE8C1F6297E@live555.com> Message-ID: <6C5D0FCB-E7B4-4A68-9D52-40F9C116187E@live555.com> > Now, I understand that there is no documentary about the live555 > streamer (from the faq page), But Is there somewhere a complete list > of all supported codecs To see the 'complete list' of supported codecs, list the files in the "liveMedia" directory. Files named "*RTPSource" implement the reception of RTP packets that use the named codec. Files named "*RTPSink" implement the transmission of RTP packets that use the named codec. (For most codecs, we implement both reception and transmission over RTP.) 
> (MPEG-2 was not part of the list of codecs from below). Oh for heavens sake, we said "MPEG", which meant "MPEG-1", "MPEG-2", and "MPEG-4". Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From ashfaque at iwavesystems.com Mon May 21 01:19:27 2012 From: ashfaque at iwavesystems.com (Ashfaque) Date: Mon, 21 May 2012 13:49:27 +0530 Subject: [Live-devel] Streaming over Wifi with no receiver Message-ID: Hi Ross, I am facing a strange issue when streaming the frames over Wifi interface, when there is no receiver to receive the frames. I am guessing like frames are being queued up in the network buffer and causing the wifi module to collapse. As indicated by below prints: NETDEV WATCHDOG: uap0 (wlan_sdio): transmit queue 0 timed out Please provide answers to the following questions: 1. When streaming is started without receiver being connected whether frames will get queued up in the lower network layers of Live555? If yes, how to avoid that? If sending rate is more than receiving rate, can this also cause the same problem? 2. Can we enable streaming only if receiver is available? Will this solve the problem? How to do this? 3. If server disconnected, how receiver will come to know the status? Any event or mechanism available? Thanks & Regards, Ashfaque -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon May 21 02:13:13 2012 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 21 May 2012 12:13:13 +0300 Subject: [Live-devel] Streaming over Wifi with no receiver In-Reply-To: References: Message-ID: Network transmission (over WiFi, or any other network interface) is handled by the operating system and its device drivers, and has nothing to do with our software, which operates at a higher level. However, I suspect that your problems are being caused by you attempting to transmit packets too quickly (causing congestion/buffering problems in your OS's device driver). You haven't said what your LIVE555-based application does, but if you are feeding something into a "RTPSink" subclass, you must make sure that the "fDurationInMicroseconds" member variable is set before you complete delivery of each frame (unless you are feeding the "RTPSink" subclass object from a live source). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From ashfaque at iwavesystems.com Mon May 21 02:29:49 2012 From: ashfaque at iwavesystems.com (Ashfaque) Date: Mon, 21 May 2012 14:59:49 +0530 Subject: [Live-devel] Streaming over Wifi with no receiver In-Reply-To: References: Message-ID: Thanks for the reply Ross. Our application reads frames from a camera at the rate of 33ms (30fps) which are encoded by a Hardware Codec and then fed to the H264Source::deliverFrame method. But when receiver is not connected ping is gives below response after some duration with streaming: ping ?s ?IP? says No buffer space is available. I am not setting any value to fDurationInMicroseconds as frames are being read from a live camera. Without running the application ping between the systems over Wifi has been verified for a very long duration. The receiver application is again a live555 based iOS application. 
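For reference, the delivery pattern Ross describes is the one from the "DeviceSource" example: copy one encoded frame into the sink's buffer, stamp it, and then decide whether "fDurationInMicroseconds" needs to be set. The sketch below is only an illustration - "CameraH264Source", "frameBuf" and "frameSize" are placeholder names, while the "f..." members and "afterGetting()" are the standard FramedSource ones (the class derives from FramedSource, and gettimeofday() needs <sys/time.h>):

  void CameraH264Source::deliverFrame() {
    if (!isCurrentlyAwaitingData()) return;   // the downstream sink hasn't asked for data yet

    // Copy at most fMaxSize bytes of the encoded frame into the sink's buffer:
    if (frameSize > fMaxSize) {
      fFrameSize = fMaxSize;
      fNumTruncatedBytes = frameSize - fMaxSize;
    } else {
      fFrameSize = frameSize;
      fNumTruncatedBytes = 0;
    }
    memmove(fTo, frameBuf, fFrameSize);

    gettimeofday(&fPresentationTime, NULL);   // better: use the encoder's own timestamp, if available

    // Live camera input: leave fDurationInMicroseconds at 0, so each frame is sent
    // as soon as it becomes available.
    // Non-live input (a file or pre-encoded buffer): set it instead, e.g. for 30 fps:
    //   fDurationInMicroseconds = 1000000 / 30;

    FramedSource::afterGetting(this);         // tell the downstream object the frame is ready
  }

For a non-live source the sink uses that duration to pace its output; a live source is paced by the arrival of the frames themselves.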
From: Ross Finlayson Sent: Monday, May 21, 2012 2:43 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Streaming over Wifi with no receiver Network transmission (over WiFi, or any other network interface) is handled by the operating system and its device drivers, and has nothing to do with our software, which operates at a higher level. However, I suspect that your problems are being caused by you attempting to transmit packets too quickly (causing congestion/buffering problems in your OS's device driver). You haven't said what your LIVE555-based application does, but if you are feeding something into a "RTPSink" subclass, you must make sure that the "fDurationInMicroseconds" member variable is set before you complete delivery of each frame (unless you are feeding the "RTPSink" subclass object from a live source). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------------------------------------------------------------------------- _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From michel.promonet at thalesgroup.com Mon May 21 08:40:08 2012 From: michel.promonet at thalesgroup.com (PROMONET Michel) Date: Mon, 21 May 2012 17:40:08 +0200 Subject: [Live-devel] How to specify Range in RTSPClient that are not using npt Message-ID: <12147_1337614896_4FBA6230_12147_1419_1_1BE8971B6CFF3A4F97AF4011882AA2550155FC0C4BDC@THSONEA01CMS01P.one.grp> Hi Ross, We try to find a way to send a range in absolute, it seems that a way could be to specify in the RTSP PLAY command : - Range: clock=19961108T142300Z-19961108T143520Z Reading the clock of RTSPClient it seems that no overide is possible. In RTSPClient.cpp this seems done by "static char* createRangeString(double start, double end)" and then we cannot modify its behaviour. Is it possible to define this method in the RTSPClient class as virtual in order to extend it ? Best Regards, Michel. [@@THALES GROUP RESTRICTED@@] -------------- next part -------------- An HTML attachment was scrubbed... URL: From michel.promonet at thalesgroup.com Mon May 21 10:00:31 2012 From: michel.promonet at thalesgroup.com (PROMONET Michel) Date: Mon, 21 May 2012 19:00:31 +0200 Subject: [Live-devel] OutPacketBuffer::maxsize and multithreading In-Reply-To: References: <6756_1336061978_4FA2B01A_6756_16537_1_1BE8971B6CFF3A4F97AF4011882AA2550155FBD52B27@THSONEA01CMS01P.one.grp> Message-ID: <8570_1337619716_4FBA7504_8570_6228_1_1BE8971B6CFF3A4F97AF4011882AA2550155FC0C4D56@THSONEA01CMS01P.one.grp> We have make a try using thread local storage that is quite simple. Adding "__thread" before OutPacketBuffer::maxSize declaration and definition give a different variable for each threads. Do you think this fix is acceptable ? Michel. [@@THALES GROUP RESTRICTED@@] De : live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] De la part de Ross Finlayson Envoy? : vendredi 4 mai 2012 07:12 ? : LIVE555 Streaming Media - development & use Objet : Re: [Live-devel] OutPacketBuffer::maxsize and multithreading (First, please do not post the same question to the list multiple times.) Yes, this is an issue, and is a good illustration why it's ill-advised to structure a LIVE555-based application using multiple threads. 
However, in some future release, I'll make the "OutPacketBuffer::maxSize" variable a per-UsageEnvironment value instead of a global variable; this will overcome this problem. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue May 22 01:28:22 2012 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 22 May 2012 11:28:22 +0300 Subject: [Live-devel] Streaming over Wifi with no receiver In-Reply-To: References: Message-ID: <744FFD7D-D3D6-41C3-81AE-DE517AC4F4F9@live555.com> It looks to me like your output data rate exceeds the capacity of your network (in this case, a WiFi network). This is especially likely if you are streaming via multicast, because the bandwidth of multicast over WiFi tends to be significantly lower than unicast. If you are streaming via multicast, then you might want to try streaming via unicast instead. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From chrish at cantab.net Mon May 21 05:37:19 2012 From: chrish at cantab.net (Chris Harding) Date: Mon, 21 May 2012 14:37:19 +0200 Subject: [Live-devel] Problem converting H264 to TS References: <59C1609E-364D-4776-B6D4-1764578F86C0@cantab.net> Message-ID: <9B2E1F11-D3A1-4F8B-8D56-23C553A8FF4F@cantab.net> Hello all, Tried to convert an h264 stream (available here: http://zvps.co/in.264) to a TS using the test program provided with live555 (testH264VideoToTransportStream). Everything seems OK, but when I open the .ts in VLC I get quite severe artefacts. If I box up the stream into an MP4 file then I can play with Quicktime, which gives me very noticeable but slightly less severe artefacts. When creating the MP4, MP4Box gives me the warnings (which may or may not be significant): [MPEG-2 TS] PID 224 - same PTS 4707874055 for two consecutive PES packets [MPEG-2 TS] PID 224 - same DTS 4707874055 for two consecutive PES packets I should also add that the source .264 stream has been extracted from a .MOV file (recorded on iPhone), again using MP4Box. Any suggestions? Help would be greatly appreciated, thanks. From krishnaks at iwavesystems.com Mon May 21 19:31:52 2012 From: krishnaks at iwavesystems.com (Krishna) Date: Mon, 21 May 2012 19:31:52 -0700 Subject: [Live-devel] Streaming over Wifi with no receiver References: Message-ID: <003c01cd37c3$0c697280$2a02a8c0@iwdtp219> Hi Ross, We are using Live555 library for H264 video streaming over Wi-Fi. But If transmitter is disconnected or turned off, How Receiver will come to know? Is there any events /signals to get that information. We used TestRTSPClient file a a reference on receiver side. Regards, KP -------------- next part -------------- An HTML attachment was scrubbed... URL: From ashfaque at iwavesystems.com Tue May 22 01:58:14 2012 From: ashfaque at iwavesystems.com (Ashfaque) Date: Tue, 22 May 2012 14:28:14 +0530 Subject: [Live-devel] Streaming over Wifi with no receiver In-Reply-To: <744FFD7D-D3D6-41C3-81AE-DE517AC4F4F9@live555.com> References: <744FFD7D-D3D6-41C3-81AE-DE517AC4F4F9@live555.com> Message-ID: Hi Ross, Now we are able to stream over Wifi with receiver application running on iOS device. But we are getting a very large propagation delay ranging from 200ms initially when application to more than 1.5 sec after some 5-10 minutes duration. We are using a camera which produces 30fps. 
We had calculated the time needed for capture and H264 encode the frame which is always lesser than the frame rate (33ms for 30fps) Trying to find where the delay is, Server side or the receiver side. Does live555 internal layer waits until the frames arrive in time order (As fPresentationTime is used for each frame), or can we drop frames if they are not in order? Can this help us to overcome the issue of propagation delay? Please suggest any other areas which can cause the propagation delay? Thanks in advance. Regards, Ashfaque From: Ross Finlayson Sent: Tuesday, May 22, 2012 1:58 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Streaming over Wifi with no receiver It looks to me like your output data rate exceeds the capacity of your network (in this case, a WiFi network). This is especially likely if you are streaming via multicast, because the bandwidth of multicast over WiFi tends to be significantly lower than unicast. If you are streaming via multicast, then you might want to try streaming via unicast instead. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------------------------------------------------------------------------- _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From renatomauro at libero.it Mon May 21 11:20:30 2012 From: renatomauro at libero.it (Renato MAURO (Libero)) Date: Mon, 21 May 2012 20:20:30 +0200 Subject: [Live-devel] Is it possible to override the RTP port chosen by the server? Message-ID: <42B632EF4EFA4C5E9FB204B3F9DD7287@CSystemDev> Hello Ross. Some Samsung cameras (SNB-5000, SND-5080, SNO-5080) choose the RTP destination port; kindly, the Live555 RTSPClient respect this request. Instead Samsung, unkindly, doens't let us disable this feature or set the desired port up via the camera's configuration web page; so sometimes two or more cameras (and all after power on) choose the same port: so some streams starve and other ones receive frames coming from two or more cameras. Could you please tell me if and where subclassing in order to make Live555 refuse the port chosen by the camera and answer with a port chosen by the OS (hoping the camera will accept the change; if not, sadly I'll have to contact Samsung or change supplier)? Here linked you find two files reporting the RTSP messages. Thank you very much, Renato MAURO -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: RTPPortByOS.txt URL: -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: RTPPortByServer.txt URL: From qrundong at gmail.com Tue May 22 02:31:28 2012 From: qrundong at gmail.com (rundong qiu) Date: Tue, 22 May 2012 17:31:28 +0800 Subject: [Live-devel] Fail in receiving data from a IPcamera via multicast Message-ID: Good Afternoon! I am trying to get a H.264 stream using the "openRTSP" command-line client. The stream is come from a IP camera (that is the DM368IPNC, or, the TMDXIPCAM8127J3) via multicast, and my OS is Windows7. As a result, the RTSP protocol exchange works OK, but the client cannot receive data from the specified port(6032) of my PC, so the resulting data file(s) are empty. 
With the help of the "Wireshark", I am sure that the stream from a camera does arrive the specified port. And if I asked for the stream using the "VLC player", it can successfully receive the stream and play it. So, could you tell me what is wrong, or give me any advise? Thank you very much! -------------- next part -------------- An HTML attachment was scrubbed... URL: From daniel.martinez at sacnet.es Tue May 22 03:01:46 2012 From: daniel.martinez at sacnet.es (Daniel Martinez Ramos) Date: Tue, 22 May 2012 12:01:46 +0200 Subject: [Live-devel] Streaming over Wifi with no receiver In-Reply-To: References: <744FFD7D-D3D6-41C3-81AE-DE517AC4F4F9@live555.com> Message-ID: <4FBB640A.5000208@sacnet.es> Hi everybody I've found a project quite similar to yours encoding h264 and streaming via wifi, it also cover how to adjust bitrate depending on the bandwidth. link here: http://es.scribd.com/Akavashi/d/71789823-spyPanda-Thesis-Report Regards. Daniel M. El 22/05/2012 10:58, Ashfaque escribi?: > Hi Ross, > Now we are able to stream over Wifi with receiver application running > on iOS device. > But we are getting a very large propagation delay ranging from 200ms > initially when application to more than 1.5 sec after some 5-10 > minutes duration. We are using a camera which produces 30fps. > We had calculated the time needed for capture and H264 encode the > frame which is always lesser than the frame rate (33ms for 30fps) > Trying to find where the delay is, Server side or the receiver side. > Does live555 internal layer waits until the frames arrive in time > order (As fPresentationTime is used for each frame), or can we drop > frames if they are not in order? Can this help us to overcome the > issue of propagation delay? > Please suggest any other areas which can cause the propagation delay? > Thanks in advance. > Regards, > Ashfaque > *From:* Ross Finlayson > *Sent:* Tuesday, May 22, 2012 1:58 PM > *To:* LIVE555 Streaming Media - development & use > > *Subject:* Re: [Live-devel] Streaming over Wifi with no receiver > It looks to me like your output data rate exceeds the capacity of your > network (in this case, a WiFi network). This is especially likely if > you are streaming via multicast, because the bandwidth of multicast > over WiFi tends to be significantly lower than unicast. If you are > streaming via multicast, then you might want to try streaming via > unicast instead. > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > ------------------------------------------------------------------------ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From tbatra18 at gmail.com Tue May 22 07:55:23 2012 From: tbatra18 at gmail.com (Tarun Batra) Date: Tue, 22 May 2012 20:25:23 +0530 Subject: [Live-devel] might be an specification mismatch in testH264VideoToTransportStream Message-ID: Hello Sir I am new to live media,i used live media server to stream over http using a transport stream file made from testH264VideoToTransportStream.exe, on my client side when i used to receive the file i was not able to receive it properly(i mean i doesn't play properly) but if i used the same file made from ts muxer software available from internet and receive the file using my client it plays properly,i made my receiver program i am removing ts header and tcp header.i have made my receiver program acc to following specification "Packets are 188 bytes in length,[5] but the communication medium may add some error correction bytes to the packet. ISDB-T and DVB-T/C/S uses 204 bytes and ATSC 8-VSB, 208 bytes as the size of emission packets (transport stream packet + FEC data). ATSC transmission adds 20 bytes of Reed-Solomon forward error correction to create a packet that is 208 bytes long.[8]" The following specification is available on the wikipedia" http://en.wikipedia.org/wiki/MPEG_transport_stream" Can u tell me what might be a problem or what should i do to receive the transport stream file made from testH264VideoToTransportStream.exe. Thanks -------------- next part -------------- An HTML attachment was scrubbed... URL: From chris at gotowti.com Tue May 22 08:59:41 2012 From: chris at gotowti.com (Chris Richardson (WTI)) Date: Tue, 22 May 2012 08:59:41 -0700 Subject: [Live-devel] Fail in receiving data from a IPcamera via multicast In-Reply-To: References: Message-ID: <002301cd3833$e6ec3620$b4c4a260$@com> Hi, If you are on a computer with multiple network cards, you probably just need to specify the interface to use to receive multicast traffic. Since you are using OpenRTSP you can specify the interface with the "-I" option: [-I ] Chris Richardson WTI From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of rundong qiu Sent: Tuesday, May 22, 2012 2:31 AM To: live-devel at ns.live555.com Subject: [Live-devel] Fail in receiving data from a IPcamera via multicast Good Afternoon! I am trying to get a H.264 stream using the "openRTSP" command-line client. The stream is come from a IP camera (that is the DM368IPNC, or, the TMDXIPCAM8127J3) via multicast, and my OS is Windows7. As a result, the RTSP protocol exchange works OK, but the client cannot receive data from the specified port(6032) of my PC, so the resulting data file(s) are empty. With the help of the "Wireshark", I am sure that the stream from a camera does arrive the specified port. And if I asked for the stream using the "VLC player", it can successfully receive the stream and play it. So, could you tell me what is wrong, or give me any advise? Thank you very much! -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Wed May 23 03:35:25 2012 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 23 May 2012 13:35:25 +0300 Subject: [Live-devel] How to specify Range in RTSPClient that are not using npt In-Reply-To: <12147_1337614896_4FBA6230_12147_1419_1_1BE8971B6CFF3A4F97AF4011882AA2550155FC0C4BDC@THSONEA01CMS01P.one.grp> References: <12147_1337614896_4FBA6230_12147_1419_1_1BE8971B6CFF3A4F97AF4011882AA2550155FC0C4BDC@THSONEA01CMS01P.one.grp> Message-ID: <1B61CE9B-0457-449B-914B-EEA0066B0D2A@live555.com> > We try to find a way to send a range in absolute, it seems that a way could be to specify in the RTSP PLAY command : > - Range: clock=19961108T142300Z-19961108T143520Z Why? Once again, I don't see the point. With a "Range:" header like this (in fact, any "Range:" header other than "0-" or "now-"), you're telling the server that you want to seek. But if the server's not going to be seeking within the stream, then there's no point in sending a header like this. (And in fact, our server code doesn't even handle this kind of "Range:" header (and there are no plans to extend it to do so, because noone ever uses it.) I don't understand why you keep "tilting at this windmill". > Is it possible to define this method in the RTSPClient class as virtual in order to extend it ? I have no plans to do this. Again, I just don't see the point... Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From warren at etr-usa.com Wed May 23 03:45:09 2012 From: warren at etr-usa.com (Warren Young) Date: Wed, 23 May 2012 04:45:09 -0600 Subject: [Live-devel] Streaming over Wifi with no receiver In-Reply-To: References: <744FFD7D-D3D6-41C3-81AE-DE517AC4F4F9@live555.com> Message-ID: <4FBCBFB5.6070601@etr-usa.com> > On 5/22/2012 2:28 AM, Ross Finlayson wrote: > It looks to me like your output data rate exceeds the capacity of > your network (in this case, a WiFi network). [snip] On 5/22/2012 2:58 AM, Ashfaque wrote: > ...iOS device. If this Wifi network involves an Apple Airport, you probably haven't increased the default multicast rate limit in Airport Utility from the default 2 Mbit/s. From finlayson at live555.com Wed May 23 03:53:56 2012 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 23 May 2012 13:53:56 +0300 Subject: [Live-devel] OutPacketBuffer::maxsize and multithreading In-Reply-To: <8570_1337619716_4FBA7504_8570_6228_1_1BE8971B6CFF3A4F97AF4011882AA2550155FC0C4D56@THSONEA01CMS01P.one.grp> References: <6756_1336061978_4FA2B01A_6756_16537_1_1BE8971B6CFF3A4F97AF4011882AA2550155FBD52B27@THSONEA01CMS01P.one.grp> <8570_1337619716_4FBA7504_8570_6228_1_1BE8971B6CFF3A4F97AF4011882AA2550155FC0C4D56@THSONEA01CMS01P.one.grp> Message-ID: <54DE9DD5-A784-4B0D-BE62-BAD32365D1A4@live555.com> On May 21, 2012, at 8:00 PM, PROMONET Michel wrote: > We have make a try using thread local storage that is quite simple. > Adding "__thread" before OutPacketBuffer::maxSize declaration and definition give a different variable for each threads. > > Do you think this fix is acceptable ? No, because likely not every person's C++ compiler supports "__thread". As I noted before, the solution (to be included in some future release) will make the "OutPacketBuffer::maxSize" variable a per-UsageEnvironment value. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From sidprice at softtools.com Wed May 23 11:19:48 2012 From: sidprice at softtools.com (Sid Price) Date: Wed, 23 May 2012 12:19:48 -0600 Subject: [Live-devel] Building Media Server for WINdows CE 6 Message-ID: <020b01cd3910$a4c64690$ee52d3b0$@softtools.com> Hello, I am building the Media Server for Windows CE 6 (ARM) and I am getting the following linker errors when I build the application: 1>------ Build started: Project: Live555_CE, Configuration: Debug VistaMax-9G20 (ARMV4I) ------ 1>Linking... 1>corelibc.lib(mainwcrt.obj) : error LNK2019: unresolved external symbol wmain referenced in function "void __cdecl mainCRTStartupHelper(struct HINSTANCE__ *,unsigned short const *)" (?mainCRTStartupHelper@@YAXPAUHINSTANCE__@@PBG at Z) 1>livemedia.lib(RTSPServerSupportingHTTPStreaming.obj) : error LNK2019: unresolved external symbol stat referenced in function "char const * __cdecl lastModifiedHeader(char const *)" (?lastModifiedHeader@@YAPBDPBD at Z) 1>livemedia.lib(ByteStreamFileSource.obj) : error LNK2019: unresolved external symbol "int __cdecl _read(int,void *,unsigned int)" (?_read@@YAHHPAXI at Z) referenced in function "protected: void __cdecl ByteStreamFileSource::doReadFromFile(void)" (?doReadFromFile at ByteStreamFileSource@@IAAXXZ) 1>livemedia.lib(WAVAudioFileSource.obj) : error LNK2001: unresolved external symbol "int __cdecl _read(int,void *,unsigned int)" (?_read@@YAHHPAXI at Z) 1>livemedia.lib(MP3InternalsHuffman.obj) : error LNK2019: unresolved external symbol "void __cdecl abort(void)" (?abort@@YAXXZ) referenced in function "void __cdecl initialize_huffman(void)" (?initialize_huffman@@YAXXZ) 1>usageenvironment.lib(UsageEnvironment.obj) : error LNK2001: unresolved external symbol "void __cdecl abort(void)" (?abort@@YAXXZ) 1>VistaMax-9G20 (ARMV4I)\Debug/Live555_CE.exe : fatal error LNK1120: 4 unresolved externals 1>Build log was saved at "file://c:\Data_Root\Projects\Harris\Live555 Media Server\Live555_CE\Live555_CE\VistaMax-9G20 (ARMV4I)\Debug\BuildLog.htm" 1>Live555_CE - 7 error(s), 0 warning(s) ========== Build: 0 succeeded, 1 failed, 4 up-to-date, 0 skipped ========== It appears that maybe a library reference is missing or something like that. Any help would be much appreciated. The CE device is a headless device and we have a command shell available, not sure if that is an issue here? Thanks, Sid. -------------- next part -------------- A non-text attachment was scrubbed... Name: winmail.dat Type: application/ms-tnef Size: 4082 bytes Desc: not available URL: From itf-freak at gmx.de Wed May 23 09:03:21 2012 From: itf-freak at gmx.de (=?ISO-8859-15?Q?Christian_Br=FCmmer?=) Date: Wed, 23 May 2012 18:03:21 +0200 Subject: [Live-devel] OnDemandLiveStream-Server for h264 media Message-ID: <4FBD0A49.5070706@gmx.de> Hi, i wrote my own Subclass of FramedSource and OnDemandServerMediaSubsession to stream a h264 video encoded by libav(ffmpeg). I used these classes the same way testOnDemandRTSPServer.cpp does (as you can see in my main). When i try to connect via vlc to the rtsp server my framed source gets created and after that destroyed directly (deliverFrame0() and doGetNextFrame() are not being called). 
I dont know what im doing wrong so here is my code: imLiveStreamSource.cpp // derivec from framedsource ################################################### #include "imLiveStreamSource.h" #include // for gettimeofday() EventTriggerId imLiveStreamSource::eventTriggerId = 0; unsigned imLiveStreamSource::mReferenceCount = 0; imLiveStreamSource* imLiveStreamSource::createNew(UsageEnvironment& env, imLiveStreamParameters params) { return new imLiveStreamSource(env, params); } imLiveStreamSource::imLiveStreamSource(UsageEnvironment& env, imLiveStreamParameters param) : FramedSource(env), mReady(true), mParameters(param), mEncodedVideoFrame(NULL), mEncodedVideoFrameSize(0), // mIOService(new boost::asio::io_service()), // mWork(new boost::asio::io_service::work(*mIOService)), // mTimer(*mIOService), mEncodingEnabled(true), mNextEncodedVideoFrameWanted(false) { if(mReferenceCount == 0) { av_register_all(); mOutputFormat = av_guess_format(NULL, "test.h264", NULL); if(!mOutputFormat) { std::cout << "Cannot guess output format! Using mpeg!" << std::endl; mOutputFormat = av_guess_format("mpeg", NULL, NULL); } if(!mOutputFormat) { std::cout << "Could not find suitable output format." << std::endl; mReady = false; } mContext = avformat_alloc_context(); if(!mContext) { std::cout << "Cannot allocate avformat memory." << std::endl; mReady = false; } mContext->oformat = mOutputFormat; mVideoStream = NULL; mOutputFormat->audio_codec = CODEC_ID_NONE; mVideoStream = addVideoStream(mContext, mOutputFormat->video_codec); if(mVideoStream) openVideo(mContext, mVideoStream); for (int x = 0; x < NUMBER_OF_THREADS; x++) { //mWorkerThreads.create_thread(boost::bind(&imLiveStreamSource::workerThread, this)); } //mTimer.expires_from_now(boost::posix_time::seconds((int)(1/mParameters.mFrameRate))); //mTimer.async_wait(boost::bind(&imLiveStreamSource::encodingThread, this, _1)); } ++mReferenceCount; // TODO: local init stuff if(eventTriggerId == 0) { eventTriggerId = envir().taskScheduler().createEventTrigger(deliverFrame0); } } imLiveStreamSource::~imLiveStreamSource() { // Any instance-specific 'destruction' (i.e., resetting) of the device would be done here: //%%% TO BE WRITTEN %%% --mReferenceCount; if(mReferenceCount == 0) { //! Free video encoding stuff if(mVideoStream) closeVideo(mContext, mVideoStream); for(int i = 0; i < mContext->nb_streams; i++) { av_freep(&mContext->streams[i]->codec); av_freep(&mContext->streams[i]); } av_free(mContext); //! Video streaming stuff envir().taskScheduler().deleteEventTrigger(eventTriggerId); eventTriggerId = 0; } } void imLiveStreamSource::doGetNextFrame() { // This function is called (by our 'downstream' object) when it asks for new data. // Note: If, for some reason, the source device stops being readable (e.g., it gets closed), then you do the following: if(!mReady) { handleClosure(this); return; } // If a new frame of data is immediately available to be delivered, then do this now: if (mNextEncodedVideoFrame) { write_video_frame(mContext, mVideoStream); deliverFrame(); } else mNextEncodedVideoFrameWanted = true; // No new data is immediately available to be delivered. We don't do anything more here. // Instead, our event trigger must be called (e.g., from a separate thread) when new data becomes available. 
} void imLiveStreamSource::deliverFrame0(void* clientData) { ((imLiveStreamSource*)clientData)->deliverFrame(); } void imLiveStreamSource::deliverFrame() { if (!isCurrentlyAwaitingData()) return; // we're not ready for the data yet u_int8_t* newFrameDataStart = mEncodedVideoFrame; unsigned int newFrameSize = (int)(mEncodedVideoFrameSize); // Deliver the data here: if (newFrameSize > fMaxSize) { fFrameSize = fMaxSize; fNumTruncatedBytes = newFrameSize - fMaxSize; } else { fFrameSize = newFrameSize; } gettimeofday(&fPresentationTime, NULL); // If you have a more accurate time - e.g., from an encoder - then use that instead. // If the device is *not* a 'live source' (e.g., it comes instead from a file or buffer), then set "fDurationInMicroseconds" here. memmove(fTo, newFrameDataStart, fFrameSize); mNextEncodedVideoFrame = false; // After delivering the data, inform the reader that it is now available: FramedSource::afterGetting(this); } ###################################################### imLiveStreamMediaSubsession.cpp //derived from OnDemandServerMediaSubsession ###################################################### imLiveStreamMediaSubsession::imLiveStreamMediaSubsession(UsageEnvironment& env, char const* fileName, Boolean reuseFirstSource) : OnDemandServerMediaSubsession(env, reuseFirstSource) { } imLiveStreamMediaSubsession::~imLiveStreamMediaSubsession() { } imLiveStreamMediaSubsession* imLiveStreamMediaSubsession::createNew(UsageEnvironment& env, char const* fileName, Boolean reuseFirstSource) { return new imLiveStreamMediaSubsession(env, fileName, reuseFirstSource); } FramedSource* imLiveStreamMediaSubsession::createNewStreamSource(unsigned clientSessionId, unsigned& estBitrate) { estBitrate = 400; // kbps, estimate ?? imLiveStreamParameters param; param.mBitRate = 400000; param.mCodec = "x264"; param.mFrameRate = 24; param.mHeight = 480; param.mWidth = 800; // Create a framer for the Video Elementary Stream: return imLiveStreamSource::createNew(envir(), param); } RTPSink* imLiveStreamMediaSubsession::createNewRTPSink(Groupsock* rtpGroupsock, unsigned char rtpPayloadTypeIfDynamic, FramedSource* /*inputSource*/) { return H264VideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic); } ###################################################### main.cpp ###################################################### int main(int argc, char** argv) { // Begin by setting up our usage environment: TaskScheduler* scheduler = BasicTaskScheduler::createNew(); env = BasicUsageEnvironment::createNew(*scheduler); UserAuthenticationDatabase* authDB = NULL; // Create the RTSP server: RTSPServer* rtspServer = RTSPServer::createNew(*env, 8554, authDB); if (rtspServer == NULL) { *env << "Failed to create RTSP server: " << env->getResultMsg() << "\n"; exit(1); } char const* descriptionString = "Session streamed by \"INGAme\""; // A H.264 video elementary stream: { char const* streamName = "h264ESVideoTest"; char const* inputFileName = "test.264"; ServerMediaSession* sms = ServerMediaSession::createNew(*env, streamName, streamName, descriptionString); sms->addSubsession(imLiveStreamMediaSubsession::createNew(*env, inputFileName, reuseFirstSource)); rtspServer->addServerMediaSession(sms); announceStream(rtspServer, sms, streamName, inputFileName); } // Also, attempt to create a HTTP server for RTSP-over-HTTP tunneling. // Try first with the default HTTP port (80), and then with the alternative HTTP // port numbers (8000 and 8080). 
if (rtspServer->setUpTunnelingOverHTTP(80) || rtspServer->setUpTunnelingOverHTTP(8000) || rtspServer->setUpTunnelingOverHTTP(8080)) { *env << "\n(We use port " << rtspServer->httpServerPortNum() << " for optional RTSP-over-HTTP tunneling.)\n"; } else { *env << "\n(RTSP-over-HTTP tunneling is not available.)\n"; } env->taskScheduler().doEventLoop(); // does not return return 0; // only to prevent compiler warning } static void announceStream(RTSPServer* rtspServer, ServerMediaSession* sms, char const* streamName, char const* inputFileName) { char* url = rtspServer->rtspURL(sms); UsageEnvironment& env = rtspServer->envir(); env << "\n\"" << streamName << "\" stream, from the file \"" << inputFileName << "\"\n"; env << "Play this stream using the URL \"" << url << "\"\n"; delete[] url; } ######################################### Thank you for your time reading this! Christian -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed May 23 21:17:25 2012 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 24 May 2012 07:17:25 +0300 Subject: [Live-devel] Is it possible to override the RTP port chosen by the server? In-Reply-To: <42B632EF4EFA4C5E9FB204B3F9DD7287@CSystemDev> References: <42B632EF4EFA4C5E9FB204B3F9DD7287@CSystemDev> Message-ID: > Some Samsung cameras (SNB-5000, SND-5080, SNO-5080) choose the RTP destination port; kindly, the Live555 RTSPClient respect this request. > Instead Samsung, unkindly, doens't let us disable this feature or set the desired port up via the camera's configuration web page; so sometimes two or more cameras (and all after power on) choose the same port: so some streams starve and other ones receive frames coming from two or more cameras. > > Could you please tell me if and where subclassing in order to make Live555 refuse the port chosen by the camera and answer with a port chosen by the OS (hoping the camera will accept the change That's the big question - i.e., if, for these cameras, the client were to choose its ports itself, would the server (the camera) accept this? Therefore, I suggest that you first run the following test: - First, upgrade to the latest version of the "LIVE555 Streaming Media" code. (The version (2005.10.05) that you are using now is *extremely* out of date!) - Change line 596 of "liveMedia/MediaSession.cpp" from if (fClientPortNum != 0) { to if (fClientPortNum != 0 && IsMulticastAddress(tempAddr.s_addr)) { This will force your client code to choose its own port numbers, even if the server specified a port number in the SDP description. The problem here is that your camera is specifying - in the SDP description - a port number for unicast streaming. This is unusual. It is more common for the server to specify its port number later, when it handles the RTSP "SETUP" command. (Specifying a port number in the SDP description is usually done only for multicast streams (because, for multicast streams, the port number will be the same for both clients and the server.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From sudan at koperasw.com Wed May 23 23:05:21 2012 From: sudan at koperasw.com (Sudan Landge - Kopera) Date: Thu, 24 May 2012 11:35:21 +0530 Subject: [Live-devel] streaming each file from start Message-ID: <4FBDCFA1.1040106@koperasw.com> Hi Ross, We noticed that whenever a new client is connected to the live555 streamer, it doesn't start streaming the file from beginning. 
Could you please help us configure live555 streamer so that, whenever a new client is connected, it should stream the file (which is being requested) from the start. Please let me know if, I am missing out on something or if there is already an example application that, does this. Thanks and regards, Sudan From nikolai.vorontsov at quadrox.be Thu May 24 07:33:24 2012 From: nikolai.vorontsov at quadrox.be (Nikolai Vorontsov) Date: Thu, 24 May 2012 16:33:24 +0200 Subject: [Live-devel] Proper bool type for MSVC Message-ID: Hello Ross, I'm using your library with MS Visual Studio 2008 and I've found the following issue: The Boolean type in your library is defined as unsigned char. However, VS2008 fully supports proper bool/true/false type. Could you please consider the following changes in code (library from 2012-05-17) 1. Enable bool support for MSVC: in the UsageEnvironment\include\Boolean.hh file replace: #ifdef __BORLANDC__ #define Boolean bool by #if defined(__BORLANDC__) || (defined(_MSC_VER) && _MSC_VER >= 1400) // MSVC++ 8.0, Visual Studio 2005 and higher #define Boolean bool That enables bool for Boolean in the library. After that I've found some places where the Boolean variables are improperly used. Here is the list of places and proposed fixes: File MPEG4LATMAudioRTPSource.cpp in line 175 replace audioMuxVersion = 0; allStreamsSameTimeFraming = 1; by audioMuxVersion = false; allStreamsSameTimeFraming = true; in line 187-190 change audioMuxVersion = (nextByte&0x80)>>7; if (audioMuxVersion != 0) break; allStreamsSameTimeFraming = (nextByte&0x40)>>6; by audioMuxVersion = (nextByte&0x80) != 0; if (audioMuxVersion) break; allStreamsSameTimeFraming = (nextByte&0x40)>>6 != 0; File MP3InternalsHuffman.cpp in line 552 replace scaleFactorsLength = getScaleFactorsLength(gr, isMPEG2); by scaleFactorsLength = getScaleFactorsLength(gr, isMPEG2 != 0); Actually it's more accurate to change the isMPEG2 type to Boolean, but I'm note sure what else this change might need. File MP3Internals.cpp in line 176 replace hasCRC = ((hdr>>16)&0x1)^0x1; by hasCRC = ((hdr>>16)&0x1) == 0; I hope compiler could optimize it to (hdr & 0x10000) == 0 in line 227 replace framesize /= samplingFreq< From renatomauro at libero.it Thu May 24 09:51:49 2012 From: renatomauro at libero.it (Renato MAURO (Libero)) Date: Thu, 24 May 2012 18:51:49 +0200 Subject: [Live-devel] Is it possible to override the RTP port chosen bythe server? References: <42B632EF4EFA4C5E9FB204B3F9DD7287@CSystemDev> Message-ID: <1B0388F7B4AC40649B139939D37A4E46@CSystemDev> Hello Ross. Thank you very much for your suggestions. Now it works. Actually I wrote if (fClientPortNum != 0 && (IsMulticastAddress(tempAddr.s_addr) || isSSM() ) { >>hoping the camera will accept the change >That's the big question - i.e., if, for these cameras, the client were to choose its ports itself, would the server (the camera) accept this? Now only (sorry), reading the RFC2326 C.1.2, I learned that "If the session is unicast, the port number serves as a recommendation from the server to the client; the client still has to include it in its SETUP request and may ignore this recommendation.". So, I suppose, any server must accept the change and, kindly, Samsung cameras do. Thank you again, Renato MAURO -------------- next part -------------- An HTML attachment was scrubbed... 
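(For reference, the test quoted above is missing one closing parenthesis as written; a balanced form of the modified check in "liveMedia/MediaSession.cpp" - the line number cited earlier in this thread, 596, refers to the 2012-05 source and may drift in other releases - would read roughly as follows. Only the condition is shown; the body of the "if" stays exactly as in the original file.)

    if (fClientPortNum != 0 &&
        (IsMulticastAddress(tempAddr.s_addr) || isSSM())) {
      // ... use the client port number from the SDP description, as before ...
    }
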
URL: From finlayson at live555.com Thu May 24 11:06:52 2012 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 24 May 2012 21:06:52 +0300 Subject: [Live-devel] might be an specification mismatch in testH264VideoToTransportStream In-Reply-To: References: Message-ID: <76B8C10B-0BB7-4BD2-96CD-CE4707100361@live555.com> No, because when Transport Stream data is stored in a file, or transmitted in RTP packets - which is what our software deals with - then the 'transport packet size' is always 188 bytes. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu May 24 11:07:16 2012 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 24 May 2012 21:07:16 +0300 Subject: [Live-devel] Is it possible to override the RTP port chosen bythe server? In-Reply-To: <1B0388F7B4AC40649B139939D37A4E46@CSystemDev> References: <42B632EF4EFA4C5E9FB204B3F9DD7287@CSystemDev> <1B0388F7B4AC40649B139939D37A4E46@CSystemDev> Message-ID: <4A78AE62-75BA-4C91-9A78-F8A1082EE937@live555.com> > Thank you very much for your suggestions. Now it works. OK, that's good to hear. > > Actually I wrote > if > (fClientPortNum != > 0 && (IsMulticastAddress(tempAddr.s_addr) || isSSM() ) { The "|| isSSM()" turns out to be unnecessary, BTW, because the "IsMulticastAddress()" test checks for multicast addresses in the SSM range also. I'll include this change in the next release of the software. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From mark at engine12.com Thu May 24 12:59:41 2012 From: mark at engine12.com (mark at engine12.com) Date: Thu, 24 May 2012 12:59:41 -0700 Subject: [Live-devel] Speex decoding Message-ID: <8467d3de6cd4e2bba82f3acfe262c7b0@engine12.com> Hello, I'm trying to receive a speex stream and play it back in real-time. I can receive the rtsp stream with openRTSP and play it back with speexdec or mplayer. But when I just use mplayer to receive and play the stream I have no luck :( I dug into mplayer and it looks like speex is not a recognized mime type (see rtpCodecInitialize_audio). I added the audio/SPEEX mime type, but this leads to more problems within mplayer. I then tried to pipe the stdout of openRTSP to mplayer (i.e. openRTSP -a rtsp://192.168.1.165:7070/stream | mplayer - -nocache ) This worked, but there was a twenty second plus latency. The same 20 sec. lag is present when decoding with speexdec --> openRTSP rtsp://192.168.1.165:7070/mjpg_streamer | speexdec - I looked at the openRTSP code, but didn't see anything that would be causing the problem, I must have overlooked something. Please assist with solving the playback of a speex rtp stream in real-time. Thanks, Mark From finlayson at live555.com Thu May 24 16:46:52 2012 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 25 May 2012 02:46:52 +0300 Subject: [Live-devel] Speex decoding In-Reply-To: <8467d3de6cd4e2bba82f3acfe262c7b0@engine12.com> References: <8467d3de6cd4e2bba82f3acfe262c7b0@engine12.com> Message-ID: Mark, It's important to understand that our software does not do any audio/video decoding (or encoding). 
Therefore, if a 3rd-party media player application - like "MPlayer" - has problems decoding a RTSP/RTP media stream, but you know that you can receive this stream OK (by testing it using "testRTSPClient" or "openRTSP"), then that's not something that we can help you with; you'll need to contact the developers of that media player (in this case "MPlayer") for advice. However, I suggest that you also try another 3rd-party media player application: VLC, . It is often more reliable than "MPlayer". Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From sidprice at softtools.com Thu May 24 17:08:29 2012 From: sidprice at softtools.com (Sid Price) Date: Thu, 24 May 2012 18:08:29 -0600 Subject: [Live-devel] Windows CE 6 Message-ID: <038901cd3a0a$84f25390$8ed6fab0$@softtools.com> I have built the Media Server for Windows CE 6 R3 and I am now trying to test it out. One issue is that CE does not have a concept of "current working directory" so I was wondering if anyone else has addressed this issue already? I certainly don't see any changes in the code base to use some variable that holds the location of the media files. Thank you, Sid. -------------- next part -------------- A non-text attachment was scrubbed... Name: winmail.dat Type: application/ms-tnef Size: 2738 bytes Desc: not available URL: From finlayson at live555.com Thu May 24 19:03:56 2012 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 25 May 2012 05:03:56 +0300 Subject: [Live-devel] Windows CE 6 In-Reply-To: <038901cd3a0a$84f25390$8ed6fab0$@softtools.com> References: <038901cd3a0a$84f25390$8ed6fab0$@softtools.com> Message-ID: > I have built the Media Server for Windows CE 6 R3 and I am now trying to > test it out. One issue is that CE does not have a concept of "current > working directory" so I was wondering if anyone else has addressed this > issue already? Well, as long as the code can look up file names specified by relative path names (i.e., a file name without an initial "/" or "\" character), then it will still work OK. Whichever directory gets searched when the code looks up such a file name will effectively be the "current working directory". Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From sidprice at softtools.com Thu May 24 19:22:55 2012 From: sidprice at softtools.com (Sid Price) Date: Thu, 24 May 2012 20:22:55 -0600 Subject: [Live-devel] Windows CE 6 In-Reply-To: References: <038901cd3a0a$84f25390$8ed6fab0$@softtools.com> Message-ID: <03ac01cd3a1d$4ceb3d00$e6c1b700$@softtools.com> Ross, I appreciate your comments and already had that knowledge. The problem is that the location of the default directory is apparently not under my control, therefore one cannot "point" the server to known location to serve files from. Since it appears that others have run the server with CE I had hoped that this issue had already been resolved. In truth it is not difficult, I was just trying to work with the "off-the-shelf" package rather than having to update it. Many thanks for the obviously huge effort that you have put into this library! Sid. 
From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Thursday, May 24, 2012 8:04 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Windows CE 6 I have built the Media Server for Windows CE 6 R3 and I am now trying to test it out. One issue is that CE does not have a concept of "current working directory" so I was wondering if anyone else has addressed this issue already? Well, as long as the code can look up file names specified by relative path names (i.e., a file name without an initial "/" or "\" character), then it will still work OK. Whichever directory gets searched when the code looks up such a file name will effectively be the "current working directory". Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu May 24 21:04:21 2012 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 25 May 2012 07:04:21 +0300 Subject: [Live-devel] Windows CE 6 In-Reply-To: <03ac01cd3a1d$4ceb3d00$e6c1b700$@softtools.com> References: <038901cd3a0a$84f25390$8ed6fab0$@softtools.com> <03ac01cd3a1d$4ceb3d00$e6c1b700$@softtools.com> Message-ID: > I appreciate your comments and already had that knowledge. The problem is that the location of the default directory is apparently not under my control, therefore one cannot ?point? the server to known location to serve files from. OK, so maybe that means that you won't be able to run the general "LIVE555 Media Server" application on WinCE. But if your intention is instead to just serve a single file, or a 'hard-wired' set of files, then you can run a variant of the "testOnDemandRTSPServer" demo application instead. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From sidprice at softtools.com Thu May 24 21:20:54 2012 From: sidprice at softtools.com (Sid Price) Date: Thu, 24 May 2012 22:20:54 -0600 Subject: [Live-devel] Windows CE 6 In-Reply-To: References: <038901cd3a0a$84f25390$8ed6fab0$@softtools.com> <03ac01cd3a1d$4ceb3d00$e6c1b700$@softtools.com> Message-ID: <03bd01cd3a2d$c7f194d0$57d4be70$@softtools.com> Thanks Ross, I will take a look at that demo application, Sid. From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Thursday, May 24, 2012 10:04 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Windows CE 6 I appreciate your comments and already had that knowledge. The problem is that the location of the default directory is apparently not under my control, therefore one cannot "point" the server to known location to serve files from. OK, so maybe that means that you won't be able to run the general "LIVE555 Media Server" application on WinCE. But if your intention is instead to just serve a single file, or a 'hard-wired' set of files, then you can run a variant of the "testOnDemandRTSPServer" demo application instead. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- A non-text attachment was scrubbed... 
Name: winmail.dat Type: application/ms-tnef Size: 4082 bytes Desc: not available URL: From nikolai.vorontsov at quadrox.be Fri May 25 04:29:14 2012 From: nikolai.vorontsov at quadrox.be (Nikolai Vorontsov) Date: Fri, 25 May 2012 13:29:14 +0200 Subject: [Live-devel] 4 bytes loss during transmission Message-ID: Hello Ross, could you suggest anything in the following problem: I'm using slightly modified live555MediaServer to deliver h264 stream to the client. It looks like 4 bytes constantly missed and I-frames is mostly not delivered: On the server side (frame size, fMaxSize, not-fit bytes) Delivered 7226 (150000) bytes. Left 0 bytes Delivered 7244 (142774) bytes. Left 0 bytes Delivered 7001 (135530) bytes. Left 0 bytes Delivered 7079 (128529) bytes. Left 0 bytes Delivered 7033 (121450) bytes. Left 0 bytes Delivered 7097 (114417) bytes. Left 0 bytes Delivered 61350 (107320) bytes. Left 0 byte << I-frame Delivered 9784 (45970) bytes. Left 0 bytes Delivered 8937 (36186) bytes. Left 0 bytes Delivered 8258 (27249) bytes. Left 0 bytes Delivered 7798 (18991) bytes. Left 0 bytes Delivered 7664 (11193) bytes. Left 0 bytes Delivered 3529 (3529) bytes. Left 3819 bytes On the client (testRTSPClient with increased DUMMY_SINK_RECEIVE_BUFFER_SIZE and OutPacketBuffer::maxSize to 1MB): Received 7222 bytes. Presentation time: 1337944307.315058! Received 7240 bytes. Presentation time: 1337944307.355058! Received 6997 bytes. Presentation time: 1337944307.395058! Received 7075 bytes. Presentation time: 1337944307.435058! Received 7029 bytes. Presentation time: 1337944307.475058! Received 7093 bytes. Presentation time: 1337944307.515058! Received 20 bytes. Presentation time: 1337944307.555058! << Oops, where is my key frame? Received 4 bytes. Presentation time: 1337944307.555058! Received 9780 bytes. Presentation time: 1337944307.595058! Received 8933 bytes. Presentation time: 1337944307.635058! Received 8254 bytes. Presentation time: 1337944307.675058! Received 7794 bytes. Presentation time: 1337944307.715058! Received 7660 bytes. Presentation time: 1337944307.755058! Received 7344 bytes. Presentation time: 1337944307.795058! As I see, the received size is always less by 4 bytes than the sent one and the I-frame cames as 20 and 4 bytes... Is there any obvious thing that I missed? Like packet fragmentation or whatever? My void CQxByteStreamSource::doGetNextFrame() do the following (I already posted this code here couple of days ago): if (m_nFilled > 0) // Have some (m_nFilled is the amount of data in the "buffer" we need to transmit to the client) { size_t nToWrite = m_nFilled; // How much do we need to write? if (nToWrite > fMaxSize) // Not too much nToWrite = fMaxSize; memcpy(fTo, m_pBuf, nToWrite); // Data fFrameSize = nToWrite; // Update properties fNumTruncatedBytes = 0; if (m_nFilled > nToWrite) // Move buffer content if necessary memmove(m_pBuf, m_pBuf + nToWrite, m_nFilled - nToWrite); m_nFilled -= nToWrite; // Decrease amount of rest bytes DiagLog((dbg::LOG_TRACE, 20, __FUNCTION__" Delivered %4d (%4d) bytes. Left %4d bytes", nToWrite, fMaxSize, m_nFilled)); // Go go go! nextTask() = envir().taskScheduler().scheduleDelayedTask(0, (TaskFunc*)FramedSource::afterGetting, this); return true; } else return false; Thanks in advance, Nikolai _________________________________________________________ Nikolai Vorontsov Quadrox nv Duigemhofstraat 101 3020 HERENT Belgium Tel: +32 16582585 Fax: +32 16582586 > -------------- next part -------------- An HTML attachment was scrubbed... 
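(A minimal sketch of how the delivery step above could skip a leading Annex-B start code before copying into fTo - relevant when the object downstream is a "H264VideoStreamDiscreteFramer", which expects each NAL unit without the 0x00 0x00 0x00 0x01 prefix. The member names are taken from the code above, and the sketch assumes m_pBuf holds exactly one complete NAL unit per call; the m_nFilled bookkeeping from the original loop is omitted.)

    u_int8_t* nal = (u_int8_t*)m_pBuf;
    unsigned nalSize = (unsigned)m_nFilled;
    // Skip a 4-byte (or 3-byte) Annex-B start code, if one is present:
    if (nalSize >= 4 && nal[0] == 0 && nal[1] == 0 && nal[2] == 0 && nal[3] == 1) {
      nal += 4; nalSize -= 4;
    } else if (nalSize >= 3 && nal[0] == 0 && nal[1] == 0 && nal[2] == 1) {
      nal += 3; nalSize -= 3;
    }
    fFrameSize = nalSize;
    if (fFrameSize > fMaxSize) {           // truncate if the sink's buffer is too small
      fNumTruncatedBytes = fFrameSize - fMaxSize;
      fFrameSize = fMaxSize;
    } else {
      fNumTruncatedBytes = 0;
    }
    memcpy(fTo, nal, fFrameSize);
    nextTask() = envir().taskScheduler().scheduleDelayedTask(0,
        (TaskFunc*)FramedSource::afterGetting, this);
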
URL: From nikolai.vorontsov at quadrox.be Fri May 25 08:59:44 2012 From: nikolai.vorontsov at quadrox.be (Nikolai Vorontsov) Date: Fri, 25 May 2012 17:59:44 +0200 Subject: [Live-devel] Media subsession lifetime Message-ID: Hello Ross, could you please shed a little light on the following item: I'm using testOnDemandRTSPServer-based application to stream h264 stream. The strange thing that I don't wee when the subsession is destroyed when the client is disconnected. What I see: xRTSPServer::createNew() xRTSPServer::lookupServerMediaSession() looks up and creates SMS xRTSPServer::createNewSMS() so, here it creates SMS, reuseFirstSource = false xMediaSubsession::xMediaSubsession() then adds a mediasubsession, fine xMediaSubsession::createNewStreamSource() it creates source xStreamSource::~xStreamSource() and kills it immediately xRTSPServer::lookupServerMediaSession() again lookup (successfully this time, no new SHS creation) xMediaSubsession::createNewStreamSource() it creates source again -- it starts to deliver frames here -- when the client gracefully shuts down I see: xStreamSource::~xStreamSource() so, the source stream is closed and destroyed. But even after 10 minutes I don't see any subsession close. Do I need to close it manually (how? who and how should call ::close(), do I need to remove it from SMS)? Or do I need to keep the subsession? I\ve noticed that if I connect a client again with the same request it tries to create a new source (for the same subsession I guess). What's the relation between MediaSession, subsession(s) and streams? Am I right that (assuming unicast): MediaSession - single entity for particular let say "camera". Subsession - one (or two if audio enabled) entities for this "camera". Stream - practical implementation of the client connection. ? Thanks in advance, Nikolai. _________________________________________________________ Nikolai Vorontsov Quadrox nv Duigemhofstraat 101 3020 HERENT Belgium Tel: +32 16582585 Fax: +32 16582586 > -------------- next part -------------- An HTML attachment was scrubbed... URL: From sambhav at saranyu.in Fri May 25 09:12:53 2012 From: sambhav at saranyu.in (Kumar Sambhav) Date: Fri, 25 May 2012 21:42:53 +0530 Subject: [Live-devel] RTSP Client TCP mode Message-ID: <99C24018-122A-466E-9DC6-1BD6E02528F6@saranyu.in> Hi , I have a live555 RTSP server serving a 3mbps mpeg2ts file. The server is hosted with public internet connection. When openRTSP is used in RTP/UDP mode its able to receive the file properly. However when RTP/TCP mode, openRTSP prints "Missing sync byte!" and stops getting data. On the server the Receiver Report shows lot of packet loss. For low bitrate content TCP mode works fine. For TCP mode is there additional settings required, like buffersize to support high bitrate content etc ? Regards, Sambhav From finlayson at live555.com Fri May 25 11:21:24 2012 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 25 May 2012 21:21:24 +0300 Subject: [Live-devel] streaming each file from start In-Reply-To: <4FBDCFA1.1040106@koperasw.com> References: <4FBDCFA1.1040106@koperasw.com> Message-ID: <32C31467-862E-4AFF-BEC6-DD2AB6880D19@live555.com> > We noticed that whenever a new client is connected to the live555 streamer, it doesn't start streaming the file from beginning. > Could you please help us configure live555 streamer so that, whenever a new client is connected, it should stream the file (which is being requested) from the start. 
This depends on the value of the "reuseFirstSource" parameter in the "OnDemandServerMediaSubsession" constructor. If you have written your own subclass of "OnDemandServerMediaSubsession", then you should make sure that its constructor - when it calls the (parent) "OnDemandServerMediaSubsession" constructor, sets the "reuseFirstSource" parameter appropriately. Or, if you are using one of the many "OnDemandServerMediaSubsession" subclasses that we have written for you - each of which also has a "reuseFirstSource" parameter - then you should make sure that the subclass's constructor sets the "reuseFirstSource" parameter appropriately. If "reuseFirstSource" is set to "False", then if multiple clients request the same file, then each client will receive the stream starting from the beginning of the file. (This is the behavior that is programmed for the "LIVE555 Media Server" and the "testOnDemandRTSPServer" demo application). If "reuseFirstSource" is set to "True", then if multiple clients request the same file, then the file is read only once; the first client will receive the stream starting from the beginning of the file, but the second (and any subsequent) clients will receive the stream starting at a later point. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From bhavesh.dhameliya at matrixcomsec.com Fri May 25 02:25:53 2012 From: bhavesh.dhameliya at matrixcomsec.com (Bhavesh Dhameliya) Date: Fri, 25 May 2012 09:25:53 +0000 (UTC) Subject: [Live-devel] RTCP at wrong end in http tunneling Message-ID: Dear sir, I have tested one of my camera with live 555 for RTSP over http. But i face some problem on RTCP side. The "RR" packet was sent over GET side not on the POST side. Also i have varified this on wireshardk. Because of that problem server stops sending data after session time out. So, can you please provide a solution for that. From sidprice at softtools.com Fri May 25 14:21:17 2012 From: sidprice at softtools.com (Sid Price) Date: Fri, 25 May 2012 15:21:17 -0600 Subject: [Live-devel] Windows CE 6 In-Reply-To: References: <038901cd3a0a$84f25390$8ed6fab0$@softtools.com> <03ac01cd3a1d$4ceb3d00$e6c1b700$@softtools.com> Message-ID: <046901cd3abc$53d6c860$fb845920$@softtools.com> Ross, Just a short note to say that the testOnDemandRTSPServer application is working on my platform. There seem to be some buffering issues because I get some "drop-outs" in the playback, however the MP3 I am playing is located in some slow flash memory. My plan now is to mount an SD card and see if that is better. Many thanks again for the pointers, Sid. From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Thursday, May 24, 2012 10:04 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Windows CE 6 I appreciate your comments and already had that knowledge. The problem is that the location of the default directory is apparently not under my control, therefore one cannot "point" the server to known location to serve files from. OK, so maybe that means that you won't be able to run the general "LIVE555 Media Server" application on WinCE. But if your intention is instead to just serve a single file, or a 'hard-wired' set of files, then you can run a variant of the "testOnDemandRTSPServer" demo application instead. Ross Finlayson Live Networks, Inc. 
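(For example - a minimal sketch using one of the ready-made file-based subsessions; "env" and "rtspServer" are assumed to exist as in the "testOnDemandRTSPServer" demo, and the stream and file names are only placeholders:)

    Boolean const reuseFirstSource = False; // every new client starts at the beginning of the file
    ServerMediaSession* sms
      = ServerMediaSession::createNew(*env, "h264ESVideoTest", "test.264",
                                      "Session streamed by a test server");
    sms->addSubsession(H264VideoFileServerMediaSubsession
                       ::createNew(*env, "test.264", reuseFirstSource));
    rtspServer->addServerMediaSession(sms);
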
http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri May 25 23:34:15 2012 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 26 May 2012 09:34:15 +0300 Subject: [Live-devel] OnDemandLiveStream-Server for h264 media In-Reply-To: <4FBD0A49.5070706@gmx.de> References: <4FBD0A49.5070706@gmx.de> Message-ID: At least part of the problem here is that you are trying to feed a "imLiveStreamSource" (your new data source class) directly into a "H264VideoRTPSink". You can't do this. "H264VideoRTPSink" objects must be fed from a "H264VideoStreamFramer" (or a "H264VideoStreamDiscreteFramer"). Because your data source object is (I presume) delivering H.264 NAL units, your "createNewStreamSource()" function must return a "H264VideoStreamDiscreteFramer". So instead of doing return imLiveStreamSource::createNew(envir(), param); you should do FramedSource* h264NALSource = imLiveStreamSource::createNew(envir(), param); return H264VideoStreamDiscreteFramer::createNew(envir(), h264NALSource); Note also that your "imLiveStreamSource" object must deliver NAL units, one at a time, *without* any preceding MPEG start code - i.e., *without* any preceding 0x00 0x00 0x00 0x01 bytes. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From qrundong at gmail.com Thu May 24 02:31:31 2012 From: qrundong at gmail.com (rundong qiu) Date: Thu, 24 May 2012 17:31:31 +0800 Subject: [Live-devel] Fail in receiving data from a IPcamera via Message-ID: Good afternoon, As you suggested , I specified the parameter with "-I 192.168.1.2" (IP of my PC). But it does not work. Is there anything wrong? I am new to live555, so maybe I have ignored something or make something basic wrong. Thanks very much. Ryan. -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sat May 26 00:40:18 2012 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 26 May 2012 10:40:18 +0300 Subject: [Live-devel] Fail in receiving data from a IPcamera via In-Reply-To: References: Message-ID: <14F7BFB3-E452-4FAB-862A-4CDEA71EE19B@live555.com> If you haven't already done so, try adding the "-t" option to "openRTSP" (to tell it to request RTP-over-TCP streaming). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sat May 26 02:42:32 2012 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 26 May 2012 12:42:32 +0300 Subject: [Live-devel] Proper bool type for MSVC In-Reply-To: References: Message-ID: <3AC7D098-DE26-453F-9AE3-F11E340B5898@live555.com> > The Boolean type in your library is defined as unsigned char. However, VS2008 fully supports proper bool/true/false type. Could you please consider the following changes in code (library from 2012-05-17) > > 1. Enable bool support for MSVC: in the UsageEnvironment\include\Boolean.hh file replace: > > #ifdef __BORLANDC__ > #define Boolean bool > > by > > #if defined(__BORLANDC__) || (defined(_MSC_VER) && _MSC_VER >= 1400) // MSVC++ 8.0, Visual Studio 2005 and higher > #define Boolean bool > > That enables bool for Boolean in the library. OK, I'll make this change in the next release of the software. > After that I've found some places where the Boolean variables are improperly used. 
Here is the list of places and proposed fixes: > > File MPEG4LATMAudioRTPSource.cpp > > in line 175 replace > > audioMuxVersion = 0; > allStreamsSameTimeFraming = 1; > > by > > audioMuxVersion = false; > allStreamsSameTimeFraming = true; OK (except that it'll be "False" and "True" rather than "false" and "true"). > in line 187-190 change > > audioMuxVersion = (nextByte&0x80)>>7; > if (audioMuxVersion != 0) break; > allStreamsSameTimeFraming = (nextByte&0x40)>>6; > > by > > audioMuxVersion = (nextByte&0x80) != 0; > if (audioMuxVersion) break; > allStreamsSameTimeFraming = (nextByte&0x40)>>6 != 0; OK. > File MP3InternalsHuffman.cpp > > in line 552 replace > > scaleFactorsLength = getScaleFactorsLength(gr, isMPEG2); > > by > > scaleFactorsLength = getScaleFactorsLength(gr, isMPEG2 != 0); > > Actually it's more accurate to change the isMPEG2 type to Boolean Yes, that's the right thing to do. > File MP3Internals.cpp > > in line 176 replace > > hasCRC = ((hdr>>16)&0x1)^0x1; > > by > > hasCRC = ((hdr>>16)&0x1) == 0; > > I hope compiler could optimize it to (hdr & 0x10000) == 0 OK, I'll just change it to that; thanks. > in line 227 replace > > framesize /= samplingFreq< > by > > framesize /= samplingFreq<< (isMPEG2 ? 1 : 0); OK. > File MediaSession.cpp > > in lines 1114-116 and 1121-1123 replace > > fReadSource = > AMRAudioRTPSource::createNew(env(), fRTPSocket, fRTPSource, > fRTPPayloadFormat, 0 /*isWideband*/, > fNumChannels, fOctetalign, fInterleaving, > fRobustsorting, fCRC); > // Note that fReadSource will differ from fRTPSource in this case > } else if (strcmp(fCodecName, "AMR-WB") == 0) { // AMR audio (wideband) > fReadSource = > AMRAudioRTPSource::createNew(env(), fRTPSocket, fRTPSource, > fRTPPayloadFormat, 1 /*isWideband*/, > fNumChannels, fOctetalign, fInterleaving, > fRobustsorting, fCRC); > > by > > fReadSource = > AMRAudioRTPSource::createNew(env(), fRTPSocket, fRTPSource, > fRTPPayloadFormat, false /*isWideband*/, > fNumChannels, fOctetalign != 0, fInterleaving, > fRobustsorting != 0, fCRC != 0); > // Note that fReadSource will differ from fRTPSource in this case > } else if (strcmp(fCodecName, "AMR-WB") == 0) { // AMR audio (wideband) > fReadSource = > AMRAudioRTPSource::createNew(env(), fRTPSocket, fRTPSource, > fRTPPayloadFormat, true /*isWideband*/, > fNumChannels, fOctetalign != 0, fInterleaving, > fRobustsorting != 0, fCRC != 0); OK (except again that it'll be "False" and "True" rather than "false" and "true"). > File H264VideoStreamFramer.cpp > > here is 4 places where the BitVector::get1bit() result is assigned to the Boolean variable. The easiest solution is to assign the get1bit() != 0 to Boolean, but more nicer is to make inline get1BitAsBoolean() member and use it. Agreed; I'll do this. Thanks again for the suggestions. I'll make these changes in the next release of the software. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sat May 26 02:48:37 2012 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 26 May 2012 12:48:37 +0300 Subject: [Live-devel] 4 bytes loss during transmission In-Reply-To: References: Message-ID: > I'm using slightly modified live555MediaServer to deliver h264 stream to the client. 
It looks like 4 bytes constantly missed and I-frames is mostly not delivered: If your input source is delivering discrete NAL units (i.e., one at a time), then note that you must feed them into a "H264VideoStreamDiscreteFramer", and must deliver them *without* any preceding MPEG start code - i.e., *without* any preceding 0x00 0x00 0x00 0x01 bytes. (If your input NAL units contain these bytes at the start, then be sure to set 'fFrameSize" to take account of the fact that you're not delivering those 4 bytes). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sat May 26 02:55:43 2012 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 26 May 2012 12:55:43 +0300 Subject: [Live-devel] RTSP Client TCP mode In-Reply-To: <99C24018-122A-466E-9DC6-1BD6E02528F6@saranyu.in> References: <99C24018-122A-466E-9DC6-1BD6E02528F6@saranyu.in> Message-ID: > I have a live555 RTSP server serving a 3mbps mpeg2ts file. The server is hosted with public internet connection. > > When openRTSP is used in RTP/UDP mode its able to receive the file properly. Yes, but if your network's capacity is less than 3 Mbps, then you will lose data, but the losses will occur on Transport Stream 'sync byte' boundaries. > However when RTP/TCP mode, openRTSP prints "Missing sync byte!" and stops getting data. > On the server the Receiver Report shows lot of packet loss. > > For low bitrate content TCP mode works fine. Your stream is exceeding the bandwidth of the network. Therefore, you are going to lose data. End of story. Note that - in this case - RTP-over-TCP streaming will not save you. If you try to stream over a too-low-bandwidth network using TCP, you'll still get data loss, but it'll occur at the sender (because of its OS buffer overflowing), rather than at the receiver. This is the important distinction between live media streaming, and on-demand streaming (e.g., using the World-Wide Web). If you use our software, it's important to understand this distinction. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sat May 26 03:15:01 2012 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 26 May 2012 13:15:01 +0300 Subject: [Live-devel] Media subsession lifetime In-Reply-To: References: Message-ID: <4336E1D1-8A89-41FA-879B-08A541C82C34@live555.com> > could you please shed a little light on the following item: > > I'm using testOnDemandRTSPServer-based application to stream h264 stream. The strange thing that I don't wee when the subsession is destroyed when the client is disconnected. A "ServerMediaSubsession" (note, not a "MediaSubsession"; that's something that's used only by clients) object represents a media 'track' (i.e., an audio, video, text substream within a possibly compound stream), *regardless* of which clients, if any, are currently receiving it. Therefore, it does not get deleted when a client disconnects. Instead, it stays in existence, in case other, new clients also want to receive it later. Note that "ServerMediaSubsession" objects are contained within "ServerMediaSession" objects. A "ServerMediaSession" object represents a complete media stream (consisting of one or more "ServerMediaSubsession"s (i.e., 'tracks')). Each "ServerMediaSubsession" object is automatically reclaimed when its parent "ServerMediaSession" object is reclaimed. 
"ServerMediaSession" (and thus also "ServerMediaSubsession") objects are automatically reclaimed when the "RTSPServer" object is closed. However, you can also, if you wish, remove a "ServerMediaSession" object (and thus also its "ServerMediaSubsession"s) from the server yourself, using the "RTSPServer::removeServerMediaSession()" function. This is rarely needed, but you would do this, for example, if you decide that you don't want any more clients to be able access this particular stream (but want the server to stay running, to continue to serve other streams). To summarize: You usually don't need to worry about reclaiming "ServerMediaSession" or "ServerMediaSubsession" objects. They will get reclaimed automatically after you call "Medium::close()" on the "RTSPServer" object. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From itf-freak at gmx.de Sat May 26 06:12:19 2012 From: itf-freak at gmx.de (=?ISO-8859-1?Q?Christian_Br=FCmmer?=) Date: Sat, 26 May 2012 15:12:19 +0200 Subject: [Live-devel] OnDemandLiveStream-Server for h264 media In-Reply-To: References: <4FBD0A49.5070706@gmx.de> Message-ID: <4FC0D6B3.9060306@gmx.de> Am 26.05.2012 08:34, schrieb Ross Finlayson: > At least part of the problem here is that you are trying to feed a > "imLiveStreamSource" (your new data source class) directly into a > "H264VideoRTPSink". You can't do this. "H264VideoRTPSink" objects > must be fed from a "H264VideoStreamFramer" (or a > "H264VideoStreamDiscreteFramer"). > > Because your data source object is (I presume) delivering H.264 NAL > units, your "createNewStreamSource()" function must return > a "H264VideoStreamDiscreteFramer". So instead of doing > return imLiveStreamSource::createNew(envir(), param); > you should do > FramedSource* h264NALSource = imLiveStreamSource::createNew(envir(), > param); > return H264VideoStreamDiscreteFramer::createNew(envir(), h264NALSource); > > Note also that your "imLiveStreamSource" object must deliver NAL > units, one at a time, *without* any preceding MPEG start code - i.e., > *without* any preceding 0x00 0x00 0x00 0x01 bytes. > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel Its working! My fautl.. windows firewall cancled network connection so that was the problem. VLC plays the video but the video Stucks from time to time (looks like it misses some frames). i encode the frame in the same thread when doGetNextFrame is called - is it recommended encoding the video in another thread? I noticed that the imLiveStreamSource constructor is called two times, and after the first time the destructor is called instantly. dont think that is a right behavior!?! im happy anyway for my first results :)! thanks so much! =] -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sat May 26 07:13:05 2012 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 26 May 2012 17:13:05 +0300 Subject: [Live-devel] OnDemandLiveStream-Server for h264 media In-Reply-To: <4FC0D6B3.9060306@gmx.de> References: <4FBD0A49.5070706@gmx.de> <4FC0D6B3.9060306@gmx.de> Message-ID: <5BA8AC30-E035-48C2-88B2-26A06441DF9D@live555.com> > is it recommended encoding the video in another thread? It's probably a good idea, especially if you have a multi-core CPU. 
Of course, this separate, encoding thread shouldn't call any LIVE555 code, except for "triggerEvent()". > I noticed that the imLiveStreamSource constructor is called two times, and after the first time the destructor is called instantly. dont think that is a right behavior!?! Actually it is the correct behavior for media - like H.264 and MPEG-4 video - that require that the server parse the data for 'configuration' information before it can be streamed. The first construction (and later destruction) is used, by the server, to read just enough input data to get this configuration information. So in this respect, at least, your server is working correctly. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From michel.promonet at thalesgroup.com Mon May 28 02:05:19 2012 From: michel.promonet at thalesgroup.com (PROMONET Michel) Date: Mon, 28 May 2012 11:05:19 +0200 Subject: [Live-devel] How to specify Range in RTSPClient that are not using npt In-Reply-To: <1B61CE9B-0457-449B-914B-EEA0066B0D2A@live555.com> References: <12147_1337614896_4FBA6230_12147_1419_1_1BE8971B6CFF3A4F97AF4011882AA2550155FC0C4BDC@THSONEA01CMS01P.one.grp> <1B61CE9B-0457-449B-914B-EEA0066B0D2A@live555.com> Message-ID: <17816_1338195958_4FC33FF6_17816_12541_1_1BE8971B6CFF3A4F97AF4011882AA2550155FD28317D@THSONEA01CMS01P.one.grp> Hi Ross, Perhaps Don Quixote wrote RFC 2326 chapter 12.19... More seriously we need to seek in absolute mode because we would like to query video according to the time we recorded it (and not a relative time relative to begin of RTSP session). I try to find ways to reduce live555 patching (I would like to use live555 wihtou any patch). But without entry point in RTSPClient, I guess we have no others way that patching it ? Thanks & Regards, Michel. [@@THALES GROUP RESTRICTED@@] De : live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] De la part de Ross Finlayson Envoy? : mercredi 23 mai 2012 12:35 ? : LIVE555 Streaming Media - development & use Objet : Re: [Live-devel] How to specify Range in RTSPClient that are not using npt We try to find a way to send a range in absolute, it seems that a way could be to specify in the RTSP PLAY command : - Range: clock=19961108T142300Z-19961108T143520Z Why? Once again, I don't see the point. With a "Range:" header like this (in fact, any "Range:" header other than "0-" or "now-"), you're telling the server that you want to seek. But if the server's not going to be seeking within the stream, then there's no point in sending a header like this. (And in fact, our server code doesn't even handle this kind of "Range:" header (and there are no plans to extend it to do so, because noone ever uses it.) I don't understand why you keep "tilting at this windmill". Is it possible to define this method in the RTSPClient class as virtual in order to extend it ? I have no plans to do this. Again, I just don't see the point... Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From nikolai.vorontsov at quadrox.be Wed May 30 02:36:27 2012 From: nikolai.vorontsov at quadrox.be (Nikolai Vorontsov) Date: Wed, 30 May 2012 11:36:27 +0200 Subject: [Live-devel] RTSP Client TCP mode In-Reply-To: References: <99C24018-122A-466E-9DC6-1BD6E02528F6@saranyu.in> Message-ID: Hello Ross, does it mean that we don't have any means to handle tool-low-bandwidth network case? 
I can imagine for instance for the live view the following scenario - we monitor the output queue length and when it's overflown or close to the top - start to skip all incoming frames till the next key frame (or till the queue is lowering)? That makes video not ideal, but at least customer see something (with gaps). For the playback usually it's possible to ask the source part to hold on frame delivery and I guess, again I can monitor output queue length and tame my backend server stream. Does it make sense in live555? Nikolai ________________________________ From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Saturday, May 26, 2012 11:56 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] RTSP Client TCP mode I have a live555 RTSP server serving a 3mbps mpeg2ts file. The server is hosted with public internet connection. When openRTSP is used in RTP/UDP mode its able to receive the file properly. Yes, but if your network's capacity is less than 3 Mbps, then you will lose data, but the losses will occur on Transport Stream 'sync byte' boundaries. However when RTP/TCP mode, openRTSP prints "Missing sync byte!" and stops getting data. On the server the Receiver Report shows lot of packet loss. For low bitrate content TCP mode works fine. Your stream is exceeding the bandwidth of the network. Therefore, you are going to lose data. End of story. Note that - in this case - RTP-over-TCP streaming will not save you. If you try to stream over a too-low-bandwidth network using TCP, you'll still get data loss, but it'll occur at the sender (because of its OS buffer overflowing), rather than at the receiver. This is the important distinction between live media streaming, and on-demand streaming (e.g., using the World-Wide Web). If you use our software, it's important to understand this distinction. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed May 30 03:38:39 2012 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 30 May 2012 13:38:39 +0300 Subject: [Live-devel] RTCP at wrong end in http tunneling In-Reply-To: References: Message-ID: > I have tested one of my camera with live 555 for RTSP over http. > But i face some problem on RTCP side. > The "RR" packet was sent over GET side not on the POST side. > Also i have varified this on wireshardk. > Because of that problem server stops sending data after session time out. You have run into an ambiguous part of the RTSP-over-HTTP 'specification': - it doesn't mention RTCP at all, let alone how RTCP "RR" and "SR" packets are carried. It seems clear that RTCP "SR" packets from the server should be sent - along with the stream's RTP packets - on the "GET" channel. But it's not clear what should be done with RTCP "RR" packets from the client. On the one hand, the presence of separate "GET" and "POST" channels - and the suggestion (though not the explicit statement) from this diagram (HTTP GET) |----<<<< data <<<<< ---- client -----| |---- server |-- >>>> data >>>>-------| (HTTP POST) that the "GET" channel is used only for data flowing from the server to the client, and the "POST" channel is used only for data flowing from the client to the server - suggests that RTCP "RR" packets from the client are sent over the "POST" channel. 
On the other hand, the specification suggests (but again, does not explicitly state) that the "POST" channel is intended only for Base64-encoded RTSP requests from the client, and it does explicitly say that "The client may at its option close the POST connection at any given time". From this, when I implemented RTSP-over-HTTP tunneling in the "LIVE555 Streaming Media" code, I inferred that RTCP "RR" packets should be sent over the "GET" channel (making it a two-way channel). I seem to recall (though I'm not 100% sure) that this also made the code compatible with Apple's QuickTime Streaming Server. > So, can you please provide a solution for that. If you can't modify your camera to listen for RTCP "RR" packets on the "GET" channel, then one thing you could do is update your client to periodically (i.e., using "TaskScheduler::scheduleDelayedTask()") send RTSP "GET_PARAMETER" commands. The camera might end up using this to detect the client's 'liveness'. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From av at bsbc.nb.ca Wed May 30 06:22:05 2012 From: av at bsbc.nb.ca (Anthony Brown) Date: Wed, 30 May 2012 10:22:05 -0300 Subject: [Live-devel] MPEG2TransportStreamIndexer and pipes Message-ID: <4FC61EFD.7060107@bsbc.nb.ca> Is it possible to use MPEG2TransportStreamIndexer either reading from a pipe (named or unnamed) or from stdin? A brief look at the source suggests that a named pipe would work, unless seeking is used somewhere. If it would work from a named pipe then rewriting main() to use stdin would allow it to be used on the end of a pipe or redirection.... yes? My ultimate goal here is to record a transport stream from firewire and write it to disk while simultaneously generating an index file to allow me to stream the file with trick play capability while it is still being recorded. -- Anthony Brown Audiovisual coordinator Brunswick Street Baptist Church Telephone: (506)-458-8348 (leave message) Email: av at bsbc.nb.ca -------------- next part -------------- A non-text attachment was scrubbed... Name: av.vcf Type: text/x-vcard Size: 163 bytes Desc: not available URL: From Marlon at scansoft.co.za Wed May 30 07:19:34 2012 From: Marlon at scansoft.co.za (Marlon Reid) Date: Wed, 30 May 2012 16:19:34 +0200 Subject: [Live-devel] Stopping the server doesn't actually stop Message-ID: <002962EA5927BE45B2FFAB0B5B5D67970A8E13@SSTSVR1.sst.local> Hi, I have an application that streams audio from a microphone to other machines on the network via RTSP. Users control the stream with a "Start" and "Stop" button. When making the first "call" the quality is perfect (calls start by clicking "start" and end by clicking "stop"). The second call has some slight noise in it, and the third call is completely rubbish. I have discovered that only the first stop actually stops the server from streaming. The second call's stop, even though it calls the same function, does not stop the server from streaming. I know this because my program continuously hits a breakpoint in the AfterGettingFrame function of my FramedFilter, even though the server is supposed to be stopped. 
My stop function looks like this //------------------------------------------------- map::iterator itr2; itr2 = m_odsList.find(streamID); if (itr2 == m_odsList.end()) return false; m_odsList.erase(itr2); map::iterator itr; itr = m_smsList.find(streamID); if (itr == m_smsList.end()) return false; m_rtspServer->removeServerMediaSession((*itr).second); m_smsList.erase(itr); //-------------------------------------------------- The first thing it does is look up the OnDemandSubsession for this stream by my streamID, then I erase it from the list. After that I lookup the ServerMediaSession based on the streamID, remove it from the RTSP server and erase it from the list. This procedure works well for the first call's stop, but it seems to fail for the second call's stop ( As I mentioned before, I know this because my program still enters AfterGettingFrame). Could the problem be that I am not deleting the OnDemandSubsession and ServerMediaSession? I tried that but then Live555 crashes in RTSPServer::RTSPCLientSession::reclaimStreamStates(), so I assumed that those are deleted in Live555 and should not be deleted in my code. Once again I need some advice. Why does the server not actually stop on the second call's stop? Thank you for all you assistance. Regards. -------------- next part -------------- An HTML attachment was scrubbed... URL: From warren at etr-usa.com Wed May 30 11:34:51 2012 From: warren at etr-usa.com (Warren Young) Date: Wed, 30 May 2012 12:34:51 -0600 Subject: [Live-devel] MPEG2TransportStreamIndexer and pipes In-Reply-To: <4FC61EFD.7060107@bsbc.nb.ca> References: <4FC61EFD.7060107@bsbc.nb.ca> Message-ID: <4FC6684B.9030806@etr-usa.com> On 5/30/2012 7:22 AM, Anthony Brown wrote: > rewriting main() to use stdin > would allow it to be used on the end of a pipe or redirection.... yes? This much looks easy to do. The indexer gets its input data from a ByteStreamFileSource, and one of its virtual ctors (createNew()) takes a FILE* instead of the file name used by the stock version. Just pass "stdin" here. (http://goo.gl/ymW1e) > My ultimate goal here is to record a transport stream from firewire and > write it to disk while simultaneously generating an index file to allow > me to stream the file with trick play capability while it is still being > recorded. This is the hard part. I assume you're using the live555MediaServer RTSP server here. It has a couple of weaknesses that fight what you want to do. First, it doesn't keep reloading an index file as it changes on disk. It just loads it up once when it creates the stream. Second and worse, live555MediaServer keeps a cache of stream objects indexed by URL, so as long as the URL doesn't change, it will reuse existing stream objects. This means that if the .tsx file doesn't exist when the stream starts the first time, each time you request that URL and thereby cause the server to re-use its cached copy instead of rebuilding it, you don't get trick play support. Maybe there are ways to kick stale streams out of the cache, but our workaround so far has been to restart the RTSP server after generating .tsx files for .ts files in use. To get live555MediaServer to do what you want, I'd suggest modifying the code to keep the .tsx file's modification time at load time, then before each attempt to use info parsed from the .tsx file, check the mtime again and reload the data if it's changed. I hope you provide it as a patch instead of forking your copy, if only because it would be a widely useful improvement. 
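(A minimal sketch of that modification-time check - the helper below is not part of LIVE555, and keeping the cached time in a caller-supplied variable is just one possible choice; the real patch would hang it off whatever object owns the parsed index data:)

    #include <sys/stat.h>
    #include <time.h>

    // Returns true when the index file's modification time has changed since the
    // previous call, i.e. when the cached, parsed .tsx data should be discarded
    // and re-read from disk.
    static bool indexFileHasChanged(char const* tsxFileName, time_t& cachedMTime) {
      struct stat sb;
      if (stat(tsxFileName, &sb) != 0) return false; // can't stat it; keep what we have
      if (sb.st_mtime == cachedMTime) return false;  // unchanged since the last load
      cachedMTime = sb.st_mtime;
      return true;
    }
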
Take a look at this before starting: http://www.suacommunity.com/dictionary/stat-entry.php Notice that there is no _fstat(). From finlayson at live555.com Wed May 30 07:02:33 2012 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 30 May 2012 17:02:33 +0300 Subject: [Live-devel] How to specify Range in RTSPClient that are not using npt In-Reply-To: <17816_1338195958_4FC33FF6_17816_12541_1_1BE8971B6CFF3A4F97AF4011882AA2550155FD28317D@THSONEA01CMS01P.one.grp> References: <12147_1337614896_4FBA6230_12147_1419_1_1BE8971B6CFF3A4F97AF4011882AA2550155FC0C4BDC@THSONEA01CMS01P.one.grp> <1B61CE9B-0457-449B-914B-EEA0066B0D2A@live555.com> <17816_1338195958_4FC33FF6_17816_12541_1_1BE8971B6CFF3A4F97AF4011882AA2550155FD28317D@THSONEA01CMS01P.one.grp> Message-ID: <57ECA55A-C081-4017-968E-EA49167C8AC8@live555.com> > Perhaps Don Quixote wrote RFC 2326 chapter 12.19? I think you mean chapter 12.29. (But anyway...) > More seriously we need to seek in absolute mode because we would like to query video according to the time we recorded it (and not a relative time relative to begin of RTSP session). This is the first time I've ever heard of someone doing this (or even wanting to do this). Because it's allowed for (but optional) in the RTSP specification, it would be nice if we (at the client and server end) supported this. But because it's so rarely used, it's not going to be a high-priority feature, unfortunately. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed May 30 12:02:16 2012 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 30 May 2012 22:02:16 +0300 Subject: [Live-devel] RTSP Client TCP mode In-Reply-To: References: <99C24018-122A-466E-9DC6-1BD6E02528F6@saranyu.in> Message-ID: I'm not sure I totally understand your question, but I'm pretty sure the answer is "no". Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From christopher.compagnon at gmail.com Sun May 27 05:55:54 2012 From: christopher.compagnon at gmail.com (Christopher COMPAGNON) Date: Sun, 27 May 2012 14:55:54 +0200 Subject: [Live-devel] liveMedia/FreeBSD : compilation fails on ar command In-Reply-To: References: <4FB5D902.5040501@gmail.com> <4FB5D963.1050805@gmail.com> <20F8EC52-CCAC-471E-A9B8-9FFC3C9D1690@live555.com> Message-ID: <4FC2245A.4060802@gmail.com> Hello ! I've just tried to complie the latest (2012-05-17) live555 streaming server on FreeBSD 7.0 (32bits) (this compilation fails also on Fr'eeBSD 9.0 64bits fresh installed). It failed with the same "ar" issue. I tried with old version I still have on my servers : compilation of the version of 2009-08-28 --> KO compilation of the version of 2009-04-20 --> Success Something changed between 2009-04-20 and 2009-08-28 for compilation's makefiles on FreeBSD... Best regards. C.COMPAGNON Le 18/05/2012 11:26, Ross Finlayson a ?crit : >> I solved the ar issue the following solution : >> In Makefile.tail, I added a space in the line >> $(LIBRARY_LINK)$@ $(LIBRARY_LINK_OPTS) \ >> between $(LIBRARY_LINK) and $@... > > But you shouldn't have to do this, because the space is already > present - at the end of the "LIBRARY_LINK" line in the > "config.freebsd" file. > > Noone else has had problems building the code for Freebsd after just > doing "genMakefiles freebsd". I don't know why it's not working for > you... > > > Ross Finlayson > Live Networks, Inc. 
> http://www.live555.com/ > > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From ketangholap1990 at gmail.com Mon May 28 07:09:53 2012 From: ketangholap1990 at gmail.com (Ketan Gholap) Date: Mon, 28 May 2012 19:39:53 +0530 Subject: [Live-devel] Byte rate of live media server Message-ID: HELLO SIR I am trying to study live media and as a beginner i was wondering how to find out the bit/byte rate of streaming of live media server,that i can print on the console??? Thanks -------------- next part -------------- An HTML attachment was scrubbed... URL: From itf-freak at gmx.de Mon May 28 07:41:33 2012 From: itf-freak at gmx.de (=?ISO-8859-15?Q?Christian_Br=FCmmer?=) Date: Mon, 28 May 2012 16:41:33 +0200 Subject: [Live-devel] Android device don't accept my own OnDemandLiveStream Message-ID: <4FC38E9D.2080707@gmx.de> Hi, i am coding my own OnDemandLiveStream-Server (as discussed here: http://lists.live555.com/pipermail/live-devel/2012-May/015178.html) using h264 encoded videoframes. I can play the videostream using VLC but my Samsung Galaxy SII running ICS (Android 4.03) can't open the stream. If i use the "mediaServer" and a *.264 videofile my smartphone is able to play the video. Using my own server FramedSource is being created and destroyed again (as i know thats a right behavior - for configuration transfer) but other functions like doGetNextFrame() never been called and the smartphone canceled the connection right after the configuration transmission (as you can see in the log file). Since my encoding functions never been called it must be a bad rtsp-configuration. But i dont know what happens behind OnDemandServerMediaSubsession (my media subsession) and H264VideoFileServerMediaSubsession (mediaServer) eg. what are the differences. You can see my whole code here http://lists.live555.com/pipermail/live-devel/2012-May/015178.html. The lines /05-28 16:00:34.540: I/ASessionDescription(24294): a=fmtp:96 packetization-mode=1;profile-level-id=4D4033;sprop-parameter-sets=Z01AM5JUDAS0IAAAAwBAAAAM0eMGVA==,aO48gA== 05-28 16:00:34.540: I/ASessionDescription(24294): a=control:track1/ existing only in the mediaServer-Log - this may be the problem but i dont what it means! 
Best regards, Christian Android-Log for my own server: #### 05-28 15:59:09.710: I/ARTSPConnection(24294): status: RTSP/1.0 200 OK 05-28 15:59:09.710: W/MyHandler(24294): OPTIONS completed with result 0 (Success) 05-28 15:59:09.715: I/ARTSPConnection(24294): status: RTSP/1.0 200 OK 05-28 15:59:09.715: I/MyHandler(24294): DESCRIBE completed with result 0 (Success) 05-28 15:59:09.715: I/ASessionDescription(24294): v=0 05-28 15:59:09.715: I/ASessionDescription(24294): o=- 1338213491406907 1 IN IP4 192.168.0.198 05-28 15:59:09.715: I/ASessionDescription(24294): s=Session streamed by "INGAme" 05-28 15:59:09.715: I/ASessionDescription(24294): i=h264.3gp 05-28 15:59:09.715: I/ASessionDescription(24294): t=0 0 05-28 15:59:09.715: I/ASessionDescription(24294): a=tool:LIVE555 Streaming Media v2012.04.21 05-28 15:59:09.715: I/ASessionDescription(24294): a=type:broadcast 05-28 15:59:09.715: I/ASessionDescription(24294): a=control:* 05-28 15:59:09.715: I/ASessionDescription(24294): a=range:npt=0- 05-28 15:59:09.715: I/ASessionDescription(24294): a=x-qt-text-nam:Session streamed by "INGAme" 05-28 15:59:09.715: I/ASessionDescription(24294): a=x-qt-text-inf:h264.3gp 05-28 15:59:09.715: I/ASessionDescription(24294): m=video 0 RTP/AVP 96 05-28 15:59:09.715: I/ASessionDescription(24294): c=IN IP4 0.0.0.0 05-28 15:59:09.715: I/ASessionDescription(24294): b=AS:480 05-28 15:59:09.715: I/ASessionDescription(24294): a=rtpmap:96 H264/90000 05-28 15:59:09.715: I/ASessionDescription(24294): a=control:track1 05-28 15:59:09.715: W/MyHandler(24294): mBaseURL is change to rtsp://192.168.0.198/h264.3gp/ from 'content-base' 05-28 15:59:09.715: W/MyHandler(24294): Property [net.connectivity.qosbw] NOT Found, bwQoS=2147483647 05-28 15:59:09.715: W/APacketSource(24294): Format:video 0 RTP/AVP 96 / MIME-Type:H264/90000 *05-28 15:59:09.715: W/MyHandler(24294): Unsupported format. 
Ignoring track #1.* #### Android-Log for mediaServer (working): #### 05-28 16:00:34.535: I/ARTSPConnection(24294): status: RTSP/1.0 200 OK 05-28 16:00:34.540: I/MyHandler(24294): DESCRIBE completed with result 0 (Success) 05-28 16:00:34.540: I/ASessionDescription(24294): v=0 05-28 16:00:34.540: I/ASessionDescription(24294): o=- 1338213636753913 1 IN IP4 192.168.0.198 05-28 16:00:34.540: I/ASessionDescription(24294): s=H.264 Video, streamed by the LIVE555 Media Server 05-28 16:00:34.540: I/ASessionDescription(24294): i=working.264 05-28 16:00:34.540: I/ASessionDescription(24294): t=0 0 05-28 16:00:34.540: I/ASessionDescription(24294): a=tool:LIVE555 Streaming Media v2011.11.20 05-28 16:00:34.540: I/ASessionDescription(24294): a=type:broadcast 05-28 16:00:34.540: I/ASessionDescription(24294): a=control:* 05-28 16:00:34.540: I/ASessionDescription(24294): a=range:npt=0- 05-28 16:00:34.540: I/ASessionDescription(24294): a=x-qt-text-nam:H.264 Video, streamed by the LIVE555 Media Server 05-28 16:00:34.540: I/ASessionDescription(24294): a=x-qt-text-inf:working.264 05-28 16:00:34.540: I/ASessionDescription(24294): m=video 0 RTP/AVP 96 05-28 16:00:34.540: I/ASessionDescription(24294): c=IN IP4 0.0.0.0 05-28 16:00:34.540: I/ASessionDescription(24294): b=AS:500 05-28 16:00:34.540: I/ASessionDescription(24294): a=rtpmap:96 H264/90000 05-28 16:00:34.540: I/ASessionDescription(24294): a=fmtp:96 packetization-mode=1;profile-level-id=4D4033;sprop-parameter-sets=Z01AM5JUDAS0IAAAAwBAAAAM0eMGVA==,aO48gA== 05-28 16:00:34.540: I/ASessionDescription(24294): a=control:track1 05-28 16:00:34.540: W/MyHandler(24294): mBaseURL is change to rtsp://192.168.0.198/working.264/ from 'content-base' 05-28 16:00:34.540: W/MyHandler(24294): Property [net.connectivity.qosbw] NOT Found, bwQoS=2147483647 05-28 16:00:34.540: W/APacketSource(24294): Format:video 0 RTP/AVP 96 / MIME-Type:H264/90000 05-28 16:00:34.540: I/APacketSource(24294): dimensions 384x288 *05-28 16:00:34.540: I/ARTPConnection(24294): Start:16202* #### -------------- next part -------------- An HTML attachment was scrubbed... URL: From jillruth234 at gmail.com Wed May 30 01:59:30 2012 From: jillruth234 at gmail.com (jill ruth) Date: Wed, 30 May 2012 14:29:30 +0530 Subject: [Live-devel] Reducing load of wis-streamer Message-ID: Hi, I have a network camera with a modified version of wis-streamer. I wanted to know if any optimizations could be done to reduce the load that wis streamer puts on the camera's processor. Regards, Jill -------------- next part -------------- An HTML attachment was scrubbed... URL: From skygml at gmail.com Wed May 30 04:16:46 2012 From: skygml at gmail.com (Vitaliy Kozlov) Date: Wed, 30 May 2012 15:16:46 +0400 Subject: [Live-devel] Problem with a=x-dimensions LIVE555 Proxy Server Message-ID: Hello all. After the RTSP connection IP-based cameras to send a reply with the line "a=x-dimensions:640,480", but when you connect to LIVE555 Proxy Server, this line is missing. For some decoders need to know the width and height, and this is a problem. Best Regards, Vitaliy, Sarapul Systems, Ltd. From tbatra18 at gmail.com Tue May 29 04:34:14 2012 From: tbatra18 at gmail.com (Tarun Batra) Date: Tue, 29 May 2012 17:04:14 +0530 Subject: [Live-devel] Is there any way in testMPEG2TransportStreamer.cpp to know the frame rate of the file it is streaming Message-ID: Hello sir Is there any way in testMPEG2TransportStreamer.cpp to know the frame rate of the file it is streaming???? If so please tell me.... 
Thanks

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From maratshch at gmail.com Tue May 29 08:59:15 2012
From: maratshch at gmail.com (Marat Shchuchinsky)
Date: Tue, 29 May 2012 18:59:15 +0300
Subject: [Live-devel] best way on server side for detect new client connection
Message-ID:

Hi, I work with live555 and stream H.264-coded video via RTP. I need to force an IDR frame on the encoder each time a new client connects to my RTSP server. Please let me know how I can do it. What is the best way (on the server side) in live555 to detect client connection / disconnection on my RTSP server? My server can work in 3 modes: UDP unicast, ASM multicast and SSM multicast. Thanks a lot.

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From felix at embedded-sol.com Thu May 31 03:24:12 2012
From: felix at embedded-sol.com (Felix Radensky)
Date: Thu, 31 May 2012 13:24:12 +0300
Subject: [Live-devel] Integrating live555 RTSP client
Message-ID: <4FC746CC.7050408@embedded-sol.com>

Hi, I have to integrate the live555 RTSP client into an existing application that decodes H.264 video and sends it to display. The application is multithreaded, with separate threads for the decoding and display tasks. I was thinking of using the testRTSPClient code as a base, putting it into a separate thread, and notifying the decoding thread when the next frame is ready. My idea was to use a condition variable embedded in the DummySink object and wait on the condvar in another thread. Does that sound reasonable? Thanks a lot. Felix.

From TWiser at logostech.net Thu May 31 07:21:06 2012
From: TWiser at logostech.net (Wiser, Tyson)
Date: Thu, 31 May 2012 07:21:06 -0700
Subject: [Live-devel] How to specify Range in RTSPClient that are not using npt
In-Reply-To: <57ECA55A-C081-4017-968E-EA49167C8AC8@live555.com>
References: <12147_1337614896_4FBA6230_12147_1419_1_1BE8971B6CFF3A4F97AF4011882AA2550155FC0C4BDC@THSONEA01CMS01P.one.grp> <1B61CE9B-0457-449B-914B-EEA0066B0D2A@live555.com> <17816_1338195958_4FC33FF6_17816_12541_1_1BE8971B6CFF3A4F97AF4011882AA2550155FD28317D@THSONEA01CMS01P.one.grp> <57ECA55A-C081-4017-968E-EA49167C8AC8@live555.com>
Message-ID: <8CD7A9204779214D9FDC255DE48B9521C6BDF0D8@EXPMBX105-1.exch.logostech.net>

More seriously we need to seek in absolute mode because we would like to query video according to the time we recorded it (and not a relative time relative to begin of RTSP session).

This is the first time I've ever heard of someone doing this (or even wanting to do this). Because it's allowed for (but optional) in the RTSP specification, it would be nice if we (at the client and server end) supported this. But because it's so rarely used, it's not going to be a high-priority feature, unfortunately.

We also had the same requirement of being able to request video based on absolute record time, not relative time. When we saw that Live555 didn't support this we ended up doing some subclassing so that we could override the handling of SET_PARAMETER messages. We then had the client pass the desired play time as a parameter to the server before sending the PLAY command and had our server check that parameter when it started playing. It ended up working well for us and also made it so that we could change the play time simply by calling SET_PARAMETER again without issuing PLAY, PAUSE, or STOP commands. The downside is that now only our client can control the play time since no other RTSP client would know about our specific parameter. In theory any client would be able to play our live stream (using the "now-" form of the npt PLAY command) but it wouldn't be able to get anything other than live.

-------------- next part --------------
An HTML attachment was scrubbed...
URL:
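A rough sketch of the client side of the SET_PARAMETER approach described above, using LIVE555's asynchronous "RTSPClient" interface (untested; the parameter name "x-play-start-time", its value format, and the handler names are placeholders, and the matching server-side code that interprets the parameter was custom to that application and is not shown):

#include "liveMedia.hh"

static MediaSession* ourSession = NULL; // set up beforehand, as in testRTSPClient

static void afterPlay(RTSPClient* rtspClient, int resultCode, char* resultString) {
  delete[] resultString; // nothing else to do in this sketch
}

static void afterSetParameter(RTSPClient* rtspClient, int resultCode, char* resultString) {
  delete[] resultString;
  if (resultCode != 0) return; // the server rejected (or doesn't understand) the parameter
  // The server now knows the desired absolute start time; issue an ordinary PLAY:
  rtspClient->sendPlayCommand(*ourSession, afterPlay);
}

// Call this once SETUP has completed for all subsessions:
static void playFromAbsoluteTime(RTSPClient* rtspClient, char const* recordTime) {
  rtspClient->sendSetParameterCommand(*ourSession, afterSetParameter,
                                      "x-play-start-time", recordTime);
}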
From matt at schuckmannacres.com Thu May 31 10:00:27 2012
From: matt at schuckmannacres.com (Matt Schuckmann)
Date: Thu, 31 May 2012 10:00:27 -0700
Subject: [Live-devel] How to specify Range in RTSPClient that are not using npt
In-Reply-To: <8CD7A9204779214D9FDC255DE48B9521C6BDF0D8@EXPMBX105-1.exch.logostech.net>
References: <12147_1337614896_4FBA6230_12147_1419_1_1BE8971B6CFF3A4F97AF4011882AA2550155FC0C4BDC@THSONEA01CMS01P.one.grp> <1B61CE9B-0457-449B-914B-EEA0066B0D2A@live555.com> <17816_1338195958_4FC33FF6_17816_12541_1_1BE8971B6CFF3A4F97AF4011882AA2550155FD28317D@THSONEA01CMS01P.one.grp> <57ECA55A-C081-4017-968E-EA49167C8AC8@live555.com> <8CD7A9204779214D9FDC255DE48B9521C6BDF0D8@EXPMBX105-1.exch.logostech.net>
Message-ID: <4FC7A3AB.7040908@schuckmannacres.com>

I can actually see using this feature as well.
Matt S.

On Thursday, May 31, 2012 7:21:06 AM, Wiser, Tyson wrote:
> More seriously we need to seek in absolute mode because we would like
> to query video according to the time we recorded it (and not a
> relative time relative to begin of RTSP session).
>
> This is the first time I've ever heard of someone doing this (or even
> wanting to do this). Because it's allowed for (but optional) in the
> RTSP specification, it would be nice if we (at the client and server
> end) supported this. But because it's so rarely used, it's not going
> to be a high-priority feature, unfortunately.
>
> We also had the same requirement of being able to request video based
> on absolute record time, not relative time. When we saw that Live555
> didn't support this we ended up doing some subclassing so that we
> could override the handling of SET_PARAMETER messages. We then had the
> client pass the desired play time as a parameter to the server before
> sending the PLAY command and had our server check that parameter when
> it started playing. It ended up working well for us and also made it
> so that we could change the play time simply by calling SET_PARAMETER
> again without issuing PLAY, PAUSE, or STOP commands. The downside is
> that now only our client can control the play time since no other RTSP
> client would know about our specific parameter. In theory any client
> would be able to play our live stream (using the "now-" form of the
> npt PLAY command) but it wouldn't be able to get anything other than live.
>
>
>
> _______________________________________________
> live-devel mailing list
> live-devel at lists.live555.com
> http://lists.live555.com/mailman/listinfo/live-devel

From tayeb at tmxvoip.com Thu May 31 12:07:26 2012
From: tayeb at tmxvoip.com (Meftah Tayeb)
Date: Thu, 31 May 2012 22:07:26 +0300
Subject: [Live-devel] MPEG2 to H.264
Message-ID:

Hello guys, I have several MPEG2-TS streams coming in through multicast (IPTV). I want to convert them to unicast (HTTP or RTSP) but transcode them to H.264/AAC. Any good, appropriate tool to use? I have been fighting VLC all these weeks but with no luck at all. Any help is welcome. Thank you,

Meftah Tayeb
IT Consulting
http://www.tmvoip.com/
phone: +21321656139
Mobile: +213660347746

__________ Information from ESET NOD32 Antivirus, version of virus signature database 6830 (20120126) __________
The message was checked by ESET NOD32 Antivirus.
http://www.eset.com From dekarl at spaetfruehstuecken.org Thu May 31 13:51:18 2012 From: dekarl at spaetfruehstuecken.org (Karl Dietz) Date: Thu, 31 May 2012 22:51:18 +0200 Subject: [Live-devel] RTSP Client TCP mode In-Reply-To: References: <99C24018-122A-466E-9DC6-1BD6E02528F6@saranyu.in> Message-ID: <4FC7D9C6.2040502@spaetfruehstuecken.org> On 30.05.2012 11:36, Nikolai Vorontsov wrote: > does it mean that we don't have any means to handle tool-low-bandwidth > network case? I can imagine for instance for the live view the following > scenario - we monitor the output queue length and when it's overflown or > close to the top - start to skip all incoming frames till the next key > frame (or till the queue is lowering)? That makes video not ideal, but > at least customer see something (with gaps). Sounds like you want something like RTP over SCTP with proper splitting of the data into more (intra frames and audio) or less (inter frames) important bits. I'm not sure if there is any production deployment of that already. Or you could use http-live-streaming with multiple bandwidths and let the client switch between them. > For the playback usually it's possible to ask the source part to hold on > frame delivery and I guess, again I can monitor output queue length and > tame my backend server stream. > Does it make sense in live555? Regards, Karl From warren at etr-usa.com Thu May 31 15:56:18 2012 From: warren at etr-usa.com (Warren Young) Date: Thu, 31 May 2012 16:56:18 -0600 Subject: [Live-devel] liveMedia/FreeBSD : compilation fails on ar command In-Reply-To: References: <4FB5D902.5040501@gmail.com> <4FB5D963.1050805@gmail.com> <20F8EC52-CCAC-471E-A9B8-9FFC3C9D1690@live555.com> Message-ID: <4FC7F712.3040202@etr-usa.com> On 5/18/2012 3:26 AM, Ross Finlayson wrote: >> I solved the ar issue the following solution : >> In Makefile.tail, I added a space in the line >> $(LIBRARY_LINK)$@ $(LIBRARY_LINK_OPTS) \ >> between $(LIBRARY_LINK) and $@... > > But you shouldn't have to do this, because the space is already present > - at the end of the "LIBRARY_LINK" line in the "config.freebsd" file. This is a GNU vs BSD make issue. 'make' is BSD make by default on FreeBSD, and it doesn't preserve trailing whitespace when assigning to a variable. GNU make does. Attached is a simple testcase showing this. You may have 'make' aliased to 'gmake' on FreeBSD, which would explain why you haven't seen this problem. POSIX doesn't say what make must do with this (http://goo.gl/Xi4b8). SUSv4 appears to have copied that text exactly. (http://goo.gl/GQ5EZ) GNU make's manual does document the behavior, saying "...trailing space characters are not stripped from variable values..." in section 6.2. But, BSD make's manpage neither agrees nor disagrees with this; it's left unspecified. The potto book (http://goo.gl/lk1is) documents the GNU make behavior on p.45, but as of the 3rd edition the book dropped coverage of non-GNU makes. I may still have a 2nd edition copy at home, but I'll probably forget to check. :) I think you should accept the patch, Ross. The current way appears to rely on behavior specific to GNU make. 
-------------- next part --------------
TEST = seven 
test:
	echo "$(TEST)" > test
	wc -c test

From warren at etr-usa.com Thu May 31 16:08:05 2012
From: warren at etr-usa.com (Warren Young)
Date: Thu, 31 May 2012 17:08:05 -0600
Subject: [Live-devel] Byte rate of live media server
In-Reply-To: References:
Message-ID: <4FC7F9D5.6050903@etr-usa.com>

On 5/28/2012 8:09 AM, Ketan Gholap wrote:
>
> i was wondering how to
> find out the bit/byte rate of streaming of live media server,that i can
> print on the console???

If you're streaming using an RTPSink object, call its octetCount() method at two known times. Then:

double mbytes_sent = (o2 - o1) / 1024.0 / 1024.0 / (s2 - s1);

where o2 and o1 are the octetCount() values at time_t s2 and s1. (Multiply by 8 if you want megabits rather than megabytes per second.)

Ross, could you please lift this method up a level, to MediaSink? I wanted it when playing around with raw UDP streaming a few weeks ago. It would also avoid the need to dynamic_cast<> the sink object in certain circumstances, such as when accessing MediaSubsession::sink.

From warren at etr-usa.com Thu May 31 16:25:19 2012
From: warren at etr-usa.com (Warren Young)
Date: Thu, 31 May 2012 17:25:19 -0600
Subject: [Live-devel] Is there any way in testMPEG2TransportStreamer.cpp to know the frame rate of the file it is streaming
In-Reply-To: References:
Message-ID: <4FC7FDDF.6090106@etr-usa.com>

On 5/29/2012 5:34 AM, Tarun Batra wrote:
>
> Is there any way in testMPEG2TransportStreamer.cpp to know the frame
> rate of the file it is streaming????

Try subclassing MPEG2TransportStreamFramer and keeping track of when its getNextFrame() method is called. Between that and gettimeofday(), you should be able to calculate this.

From warren at etr-usa.com Thu May 31 16:26:48 2012
From: warren at etr-usa.com (Warren Young)
Date: Thu, 31 May 2012 17:26:48 -0600
Subject: [Live-devel] MPEG2 to H.264
In-Reply-To: References:
Message-ID: <4FC7FE38.3060407@etr-usa.com>

On 5/31/2012 1:07 PM, Meftah Tayeb wrote:
> any good, appropriate tool to use?

Perhaps ffmpeg. But that wouldn't be on topic here, would it?

From sidprice at softtools.com Thu May 31 17:21:27 2012
From: sidprice at softtools.com (Sid Price)
Date: Thu, 31 May 2012 18:21:27 -0600
Subject: [Live-devel] Streaming a WAV file
Message-ID: <020a01cd3f8c$7d762b00$78628100$@softtools.com>

Ross, I am running the OnDemandRTSPServer test application with WAV files and it seems that the WAV file processing expects the first SubChunk of the file to always be the "fmt " SubChunk. This is not the case for the files I am using; they have other SubChunks, e.g. "LIST". What is the best way to get this into the library? Should I implement and submit my changes to you? Sid.

-------------- next part --------------
An HTML attachment was scrubbed...
URL:
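On the WAV question above: each RIFF chunk is a 4-character ID followed by a 4-byte little-endian size and the (even-padded) data, so a reader can skip chunks it doesn't care about until it reaches "fmt ". A minimal sketch of that idea, for anyone patching this locally (untested, generic RIFF parsing only - not the library's actual WAV-parsing code):

#include <stdio.h>
#include <string.h>

// Skip RIFF chunks until the "fmt " chunk is found; returns its size, or 0 on failure.
// Assumes the file position is just past the 12-byte "RIFF<size>WAVE" header.
static unsigned seekToFmtChunk(FILE* fid) {
  unsigned char hdr[8];
  while (fread(hdr, 1, 8, fid) == 8) {
    unsigned chunkSize = hdr[4] | (hdr[5] << 8) | (hdr[6] << 16) | ((unsigned)hdr[7] << 24);
    if (memcmp(hdr, "fmt ", 4) == 0) return chunkSize;
    // Some other chunk (e.g. "LIST"); skip its data, padded to an even length:
    if (fseek(fid, chunkSize + (chunkSize & 1), SEEK_CUR) != 0) break;
  }
  return 0;
}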
From finlayson at live555.com Thu May 31 06:18:22 2012
From: finlayson at live555.com (Ross Finlayson)
Date: Thu, 31 May 2012 16:18:22 +0300
Subject: [Live-devel] Stopping the server doesn't actually stop
In-Reply-To: <002962EA5927BE45B2FFAB0B5B5D67970A8E13@SSTSVR1.sst.local>
References: <002962EA5927BE45B2FFAB0B5B5D67970A8E13@SSTSVR1.sst.local>
Message-ID: <51B4B101-848A-4BCB-9428-DABA7FC31BE7@live555.com>

> m_rtspServer->removeServerMediaSession((*itr).second);

Calling "RTSPServer::removeServerMediaSession()" merely removes the "ServerMediaSession" (and thus also its "ServerMediaSubsession"s) from the RTSP server, so that clients can no longer start any new streams using its name. It does not stop any streaming to any client(s) that might currently be underway.

If you want to stop - from the server - the actual streaming from the server to a client, then you'll need to delete the corresponding "RTSPServer::RTSPClientSession" object. (If you want to stop the streaming from the client end, of course, you simply send an RTSP "TEARDOWN" command, which causes the same thing to happen at the server end - i.e., it causes the "RTSPServer::RTSPClientSession" object to be deleted.)

Actually, if you want clients to be able to restart the stream afterwards, then you probably shouldn't be calling "RTSPServer::removeServerMediaSession()" at all. There's no real point in calling "removeServerMediaSession()", and then "addServerMediaSession()" afterwards to reinstate it.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From ketangholap1990 at gmail.com Thu May 31 07:59:36 2012
From: ketangholap1990 at gmail.com (Ketan Gholap)
Date: Thu, 31 May 2012 20:29:36 +0530
Subject: [Live-devel] testMpeg2videostreamer
Message-ID:

Hello sir, I am using your testMpeg2videostreamer application to stream my TS file to a particular IP, and your test on-demand RTSP server to catch the stream. Is there any way in testMpeg2videostreamer to stream the TS file at a faster rate? I am using a one-to-one connection between my machines running testMpeg2videostreamer and the test on-demand RTSP server. Thanks, Ketan

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From itf-freak at gmx.de Thu May 31 08:27:08 2012
From: itf-freak at gmx.de (=?ISO-8859-1?Q?Christian_Br=FCmmer?=)
Date: Thu, 31 May 2012 17:27:08 +0200
Subject: [Live-devel] Android device don't accept my own OnDemandLiveStream
In-Reply-To: <4FC38E9D.2080707@gmx.de>
References: <4FC38E9D.2080707@gmx.de>
Message-ID: <4FC78DCC.6000507@gmx.de>

On 28.05.2012 16:41, Christian Brümmer wrote:
> Hi,
>
> i am coding my own OnDemandLiveStream-Server (as discussed here:
> http://lists.live555.com/pipermail/live-devel/2012-May/015178.html)
> using h264 encoded videoframes. I can play the videostream using VLC
> but my Samsung Galaxy SII running ICS (Android 4.03) can't open the
> stream.
> If i use the "mediaServer" and a *.264 videofile my smartphone is
> able to play the video.
> Using my own server FramedSource is being created and destroyed again
> (as i know thats a right behavior - for configuration transfer) but
> other functions like doGetNextFrame() never been called and the
> smartphone canceled the connection right after the configuration
> transmission (as you can see in the log file). Since my encoding
> functions never been called it must be a bad rtsp-configuration. But i
> dont know what happens behind OnDemandServerMediaSubsession (my media
> subsession) and H264VideoFileServerMediaSubsession (mediaServer) eg.
> what are the differences.
> > Best regards, > Christian > > Android-Log for my own server: > #### > 05-28 15:59:09.710: I/ARTSPConnection(24294): status: RTSP/1.0 200 OK > 05-28 15:59:09.710: W/MyHandler(24294): OPTIONS completed with result > 0 (Success) > 05-28 15:59:09.715: I/ARTSPConnection(24294): status: RTSP/1.0 200 OK > 05-28 15:59:09.715: I/MyHandler(24294): DESCRIBE completed with result > 0 (Success) > 05-28 15:59:09.715: I/ASessionDescription(24294): v=0 > 05-28 15:59:09.715: I/ASessionDescription(24294): o=- 1338213491406907 > 1 IN IP4 192.168.0.198 > 05-28 15:59:09.715: I/ASessionDescription(24294): s=Session streamed > by "INGAme" > 05-28 15:59:09.715: I/ASessionDescription(24294): i=h264.3gp > 05-28 15:59:09.715: I/ASessionDescription(24294): t=0 0 > 05-28 15:59:09.715: I/ASessionDescription(24294): a=tool:LIVE555 > Streaming Media v2012.04.21 > 05-28 15:59:09.715: I/ASessionDescription(24294): a=type:broadcast > 05-28 15:59:09.715: I/ASessionDescription(24294): a=control:* > 05-28 15:59:09.715: I/ASessionDescription(24294): a=range:npt=0- > 05-28 15:59:09.715: I/ASessionDescription(24294): > a=x-qt-text-nam:Session streamed by "INGAme" > 05-28 15:59:09.715: I/ASessionDescription(24294): a=x-qt-text-inf:h264.3gp > 05-28 15:59:09.715: I/ASessionDescription(24294): m=video 0 RTP/AVP 96 > 05-28 15:59:09.715: I/ASessionDescription(24294): c=IN IP4 0.0.0.0 > 05-28 15:59:09.715: I/ASessionDescription(24294): b=AS:480 > 05-28 15:59:09.715: I/ASessionDescription(24294): a=rtpmap:96 H264/90000 > 05-28 15:59:09.715: I/ASessionDescription(24294): a=control:track1 > 05-28 15:59:09.715: W/MyHandler(24294): mBaseURL is change to > rtsp://192.168.0.198/h264.3gp/ from 'content-base' > 05-28 15:59:09.715: W/MyHandler(24294): Property > [net.connectivity.qosbw] NOT Found, bwQoS=2147483647 > 05-28 15:59:09.715: W/APacketSource(24294): Format:video 0 RTP/AVP 96 > / MIME-Type:H264/90000 > *05-28 15:59:09.715: W/MyHandler(24294): Unsupported format. 
Ignoring
> track #1.*
> ####
>
> Android-Log for mediaServer (working):
> ####
> 05-28 16:00:34.535: I/ARTSPConnection(24294): status: RTSP/1.0 200 OK
> 05-28 16:00:34.540: I/MyHandler(24294): DESCRIBE completed with result
> 0 (Success)
> 05-28 16:00:34.540: I/ASessionDescription(24294): v=0
> 05-28 16:00:34.540: I/ASessionDescription(24294): o=- 1338213636753913
> 1 IN IP4 192.168.0.198
> 05-28 16:00:34.540: I/ASessionDescription(24294): s=H.264 Video,
> streamed by the LIVE555 Media Server
> 05-28 16:00:34.540: I/ASessionDescription(24294): i=working.264
> 05-28 16:00:34.540: I/ASessionDescription(24294): t=0 0
> 05-28 16:00:34.540: I/ASessionDescription(24294): a=tool:LIVE555
> Streaming Media v2011.11.20
> 05-28 16:00:34.540: I/ASessionDescription(24294): a=type:broadcast
> 05-28 16:00:34.540: I/ASessionDescription(24294): a=control:*
> 05-28 16:00:34.540: I/ASessionDescription(24294): a=range:npt=0-
> 05-28 16:00:34.540: I/ASessionDescription(24294):
> a=x-qt-text-nam:H.264 Video, streamed by the LIVE555 Media Server
> 05-28 16:00:34.540: I/ASessionDescription(24294):
> a=x-qt-text-inf:working.264
> 05-28 16:00:34.540: I/ASessionDescription(24294): m=video 0 RTP/AVP 96
> 05-28 16:00:34.540: I/ASessionDescription(24294): c=IN IP4 0.0.0.0
> 05-28 16:00:34.540: I/ASessionDescription(24294): b=AS:500
> 05-28 16:00:34.540: I/ASessionDescription(24294): a=rtpmap:96 H264/90000
> 05-28 16:00:34.540: I/ASessionDescription(24294): a=fmtp:96
> packetization-mode=1;profile-level-id=4D4033;sprop-parameter-sets=Z01AM5JUDAS0IAAAAwBAAAAM0eMGVA==,aO48gA==
> 05-28 16:00:34.540: I/ASessionDescription(24294): a=control:track1
> 05-28 16:00:34.540: W/MyHandler(24294): mBaseURL is change to
> rtsp://192.168.0.198/working.264/ from 'content-base'
> 05-28 16:00:34.540: W/MyHandler(24294): Property
> [net.connectivity.qosbw] NOT Found, bwQoS=2147483647
> 05-28 16:00:34.540: W/APacketSource(24294): Format:video 0 RTP/AVP 96
> / MIME-Type:H264/90000
> 05-28 16:00:34.540: I/APacketSource(24294): dimensions 384x288
> *05-28 16:00:34.540: I/ARTPConnection(24294): Start:16202*
> ####
>
>
>
> _______________________________________________
> live-devel mailing list
> live-devel at lists.live555.com
> http://lists.live555.com/mailman/listinfo/live-devel

I finally got it! I dug into the source code, especially the parseSPropParameterSets() function in H264VideoRTPSource.cpp. My mistake was the representation of the sprop string: only the "bytestream" without the qualifier "sprop-parameter-sets=" is needed. I created the right substring and now the setSPSandPPS() function works as expected! So here is what I've done:

FramedSource* imLiveStreamMediaSubsession::createNewStreamSource(unsigned clientSessionId, unsigned& estBitrate) {
  (...)
  FramedSource* h264NALSource = imLiveStreamSource::createNew(envir(), param);
  H264VideoStreamFramer* framer = H264VideoStreamFramer::createNew(envir(), h264NALSource);
  framer->setSPSandPPS(mSPropParameterSet); // mSPropParameterSet without "sprop-parameter-sets="
  return framer;
}

RTPSink* imLiveStreamMediaSubsession::createNewRTPSink(Groupsock* rtpGroupsock, unsigned char rtpPayloadTypeIfDynamic, FramedSource* /*inputSource*/) {
  return H264VideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic, mSPropParameterSet); // mSPropParameterSet without "sprop-parameter-sets="
}

Hope that will help somebody in the future! Thanks for reading, best regards, Christian

-------------- next part --------------
An HTML attachment was scrubbed...
URL:
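To make the fix concrete: LIVE555's parseSPropParameterSets() (declared in H264VideoRTPSource.hh) expects only the comma-separated Base64 payload from the SDP "a=fmtp:" line, which is why the "sprop-parameter-sets=" prefix has to be stripped first. A small sketch of how that string is consumed (untested; the value below is just the one from the log above):

#include "liveMedia.hh"

void showSPropRecords(UsageEnvironment& env) {
  // Just the Base64 payload, without the "sprop-parameter-sets=" qualifier:
  char const* sProp = "Z01AM5JUDAS0IAAAAwBAAAAM0eMGVA==,aO48gA==";
  unsigned numRecords;
  SPropRecord* records = parseSPropParameterSets(sProp, numRecords);
  // For H.264 this normally yields two records: the SPS and the PPS NAL units.
  for (unsigned i = 0; i < numRecords; ++i) {
    env << "record " << i << ": " << records[i].sPropLength << " bytes\n";
  }
  delete[] records;
}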
From csavtche at gmail.com Thu May 31 10:40:42 2012
From: csavtche at gmail.com (Constantin Savtchenko)
Date: Thu, 31 May 2012 13:40:42 -0400
Subject: [Live-devel] Integrating live555 RTSP client
In-Reply-To: <4FC746CC.7050408@embedded-sol.com>
References: <4FC746CC.7050408@embedded-sol.com>
Message-ID:

Hi Felix, I am doing a similar project. My approach has been exactly as you described it. The documentation notes that Live555 runs in a single thread and that library calls should not be made from multiple threads, so I have the BasicTaskScheduler running in its own thread and hogging it. Just like you, I will use a condition variable to extract the newest decoded frame. As an additional point, I am doing the decoding within the Live555 framework by subclassing a MediaSink. This is an interesting design question. I have a feeling that the expected way to do it, though, would be to subclass the TaskScheduler per your application needs instead of using the BasicTaskScheduler. Constantin

On Thu, May 31, 2012 at 6:24 AM, Felix Radensky wrote:
> Hi,
>
> I have to integrate the live555 RTSP client into an existing application
> that decodes H.264 video and sends it to display. The application
> is multithreaded, with separate threads for the decoding and display
> tasks.
>
> I was thinking of using the testRTSPClient code as a base, putting
> it into a separate thread, and notifying the decoding thread when the
> next frame is ready. My idea was to use a condition variable embedded
> in the DummySink object and wait on the condvar in another thread. Does
> that sound reasonable?
>
> Thanks a lot.
>
> Felix.
>
>
>
>
> _______________________________________________
> live-devel mailing list
> live-devel at lists.live555.com
> http://lists.live555.com/mailman/listinfo/live-devel
>

-------------- next part --------------
An HTML attachment was scrubbed...
URL:
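A bare-bones sketch of the hand-off described in this thread (untested; the queue class and its names are placeholders, not part of LIVE555): the LIVE555 event loop runs in one thread and pushes each frame it receives - for example from a DummySink-style afterGettingFrame() - while the decoder thread blocks on a condition variable until a frame is available.

#include <pthread.h>
#include <deque>
#include <vector>

// Single producer (the LIVE555 thread) / single consumer (the decoder thread).
class FrameQueue {
public:
  FrameQueue() { pthread_mutex_init(&fMutex, NULL); pthread_cond_init(&fCond, NULL); }
  ~FrameQueue() { pthread_cond_destroy(&fCond); pthread_mutex_destroy(&fMutex); }

  // Called from the LIVE555 thread with the bytes just delivered to the sink:
  void push(unsigned char const* data, unsigned size) {
    pthread_mutex_lock(&fMutex);
    fFrames.push_back(std::vector<unsigned char>(data, data + size));
    pthread_cond_signal(&fCond);
    pthread_mutex_unlock(&fMutex);
  }

  // Called from the decoder thread; blocks until a frame is available:
  std::vector<unsigned char> pop() {
    pthread_mutex_lock(&fMutex);
    while (fFrames.empty()) pthread_cond_wait(&fCond, &fMutex);
    std::vector<unsigned char> frame = fFrames.front();
    fFrames.pop_front();
    pthread_mutex_unlock(&fMutex);
    return frame;
  }

private:
  pthread_mutex_t fMutex;
  pthread_cond_t fCond;
  std::deque<std::vector<unsigned char> > fFrames;
};

The constraint both posters mention still applies: every LIVE555 call (creating the sink, running doEventLoop(), tearing down) stays in the one thread that owns the scheduler; the queue above is the only thing the two threads share.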