From tuan.dn at anlab.vn Thu Dec 1 01:19:29 2011
From: tuan.dn at anlab.vn (Tuan DN)
Date: Thu, 1 Dec 2011 16:19:29 +0700
Subject: [Live-devel] option to make audio louder
In-Reply-To: References: Message-ID:

Thanks, Ross Finlayson. I have been searching for an mp4 container atom that defines a sound-volume value, but I could not find anything. Is there any atom in the mp4 container that holds a sound-volume value? By "edit the .mp4 file", did you mean editing each audio frame, or editing the mp4 container? Please show me a way.

On Wed, Nov 30, 2011 at 8:05 PM, Ross Finlayson wrote:
> > I use live555 to record a media stream with the following command:
> >
> > testProgs/openRTSP.exe -d 10 -4 rtsp://192.168.1.174:5544 > live555.mp4
> >
> > Is there any option to make the audio volume of the output file live555.mp4 bigger or smaller?
>
> No, not in our software. If it's possible at all, it would only be by (somehow) editing the ".mp4" file after you've recorded it.
>
> Ross Finlayson
> Live Networks, Inc.
> http://www.live555.com/
>
> _______________________________________________
> live-devel mailing list
> live-devel at lists.live555.com
> http://lists.live555.com/mailman/listinfo/live-devel

From david.myers at panogenics.com Thu Dec 1 01:34:53 2011
From: david.myers at panogenics.com (David J Myers)
Date: Thu, 1 Dec 2011 09:34:53 -0000
Subject: [Live-devel] StreamParser internal error (149999+ 4 >
Message-ID: <001501ccb00c$7b798780$726c9680$@myers@panogenics.com>

Hi Ross,
Further to your answer on this: we have an image size of 2144x1944 pixels and have already increased BANK_SIZE beyond 300000, to 450000. However, there is no guarantee that any particular number is sufficient. Surely the answer is to test the frame size from the encoder before passing the frame into Live555, and to truncate it if necessary. My questions are:
1. Is it possible to retrieve BANK_SIZE from the stream parser, to test against?
2.
Is simply truncating the over-long frame the best way to avoid crashing the system, or should the whole frame or GOP be dropped, or something else (e.g. partial frames)?
David

From Marlon at scansoft.co.za Thu Dec 1 04:58:59 2011
From: Marlon at scansoft.co.za (Marlon Reid)
Date: Thu, 1 Dec 2011 14:58:59 +0200
Subject: [Live-devel] Session management
Message-ID: <002962EA5927BE45B2FFAB0B5B5D67970A8A3C@SSTSVR1.sst.local>

Hi everyone,
I was wondering whether there is something built into Live555 that can be used for session management and initiation. For example: a server application using live555 sends a command to a client application using live555, to signal the client to start listening to the server. After the media has been transported, the server sends a signal to the client to indicate that it should stop listening.
Regards.

From finlayson at live555.com Thu Dec 1 05:13:13 2011
From: finlayson at live555.com (Ross Finlayson)
Date: Thu, 1 Dec 2011 05:13:13 -0800
Subject: [Live-devel] StreamParser internal error (149999+ 4 >
In-Reply-To: <001501ccb00c$7b798780$726c9680$@myers@panogenics.com>
References: <001501ccb00c$7b798780$726c9680$@myers@panogenics.com>
Message-ID:

> Surely the answer is to test the frame-size from the encoder before passing it into Live555

Unfortunately not, because we don't know the frame size in advance; we can figure it out only by parsing/copying the data, looking for the next 'start code'. And it is this step that overflows our stream-parsing code. On reflection, however, I think I can fix our H.264 parsing code to avoid this problem (the fix would be for H.264 parsing only). However, you should also reconsider generating such huge NAL units. The problem with NAL units this size is that if you stream them, then each such NAL unit will get fragmented over multiple RTP packets.
That's OK in itself, but the loss of just one of those packets will make the entire NAL unit undecodable by the receiver - and the larger the NAL unit, the more likely that is to happen. If you have control over your encoder, it would be far better to have it generate such large frames as multiple 'slices' - i.e., one NAL unit per slice, rather than one NAL unit for the whole frame.
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From finlayson at live555.com Thu Dec 1 05:18:25 2011
From: finlayson at live555.com (Ross Finlayson)
Date: Thu, 1 Dec 2011 05:18:25 -0800
Subject: [Live-devel] Session management
In-Reply-To: <002962EA5927BE45B2FFAB0B5B5D67970A8A3C@SSTSVR1.sst.local>
References: <002962EA5927BE45B2FFAB0B5B5D67970A8A3C@SSTSVR1.sst.local>
Message-ID:

> I was wondering if there was something built into Live555 that can be used for session management and initiation.

No, because (as far as I know) there's no standardized protocol for doing this.

> After the media has been transported, the server sends a signal to the client to indicate that it should stop listening.

This is done already, using RTCP "BYE" packets (which the receiver can handle, as appropriate). Even simpler, though: if the client knows the duration of the stream (it can often learn this from the stream's SDP description), then it will know when it needs to "stop listening". Our "openRTSP" client application does both of these things, FYI.
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
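[Editor's note: the "duration from the SDP description" that Ross mentions comes from the SDP "a=range:npt=<start>-<end>" attribute (LIVE555 clients get the parsed value from the MediaSession API rather than parsing it themselves). Purely to illustrate where the number comes from, here is a standalone sketch; durationFromSdpRange is a hypothetical helper, not LIVE555 code.]

```cpp
#include <string>
#include <cstdlib>

// Extract a stream duration, in seconds, from an SDP line of the form
// "a=range:npt=<start>-<end>".  Returns -1.0 when the range is open-ended
// (e.g. "npt=0-", a live stream) or the line is not a usable range attribute.
double durationFromSdpRange(const std::string& line) {
    const std::string prefix = "a=range:npt=";
    if (line.compare(0, prefix.size(), prefix) != 0) return -1.0;

    std::string::size_type dash = line.find('-', prefix.size());
    if (dash == std::string::npos) return -1.0;

    std::string endStr = line.substr(dash + 1);
    if (endStr.empty()) return -1.0;                 // "npt=0-" => unbounded

    double start = std::strtod(line.c_str() + prefix.size(), nullptr);
    char* stop = nullptr;
    double end = std::strtod(endStr.c_str(), &stop);
    if (stop == endStr.c_str()) return -1.0;         // nothing numeric after '-'

    return end - start;                              // client can stop after this long
}
```

A client that obtains a positive duration this way can simply schedule its own teardown that many seconds after PLAY, instead of waiting for an RTCP "BYE".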
From jshanab at smartwire.com Thu Dec 1 05:44:46 2011
From: jshanab at smartwire.com (Jeff Shanab)
Date: Thu, 1 Dec 2011 13:44:46 +0000
Subject: [Live-devel] StreamParser internal error (149999+ 4 >
In-Reply-To: References: <001501ccb00c$7b798780$726c9680$@myers@panogenics.com>
Message-ID: <615FD77639372542BF647F5EBAA2DBC20B18EED7@IL-BOL-EXCH01.smartwire.com>

I thought the point of NAL units was to manage size. The key frame is the only worry, and it gets broken into slices. On one camera with higher resolution I get a stream of NALs like this: [7][8][5][5][5][1][1][1][1]... where the 7 and 8 are the SPS and PPS (which are very small), the 5s are slices of the large keyframe, and the 1s are the much smaller difference frames.

From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson
Sent: Thursday, December 01, 2011 7:13 AM
To: LIVE555 Streaming Media - development & use
Subject: Re: [Live-devel] StreamParser internal error (149999+ 4 >

> Surely the answer is to test the frame-size from the encoder before passing it into Live555

Unfortunately not, because we don't know the frame size in advance; we can figure it out only by parsing/copying the data, looking for the next 'start code'. And it is this step that overflows our stream-parsing code. On reflection, however, I think I can fix our H.264 parsing code to avoid this problem (the fix would be for H.264 parsing only). However, you should also reconsider generating such huge NAL units. The problem with NAL units this size is that if you stream them, then each such NAL unit will get fragmented over multiple RTP packets. That's OK in itself, but the loss of just one of those packets will make the entire NAL unit undecodable by the receiver - and the larger the NAL unit, the more likely that is to happen.
If you have control over your encoder, it would be far better to have it generate such large frames as multiple 'slices' - i.e., one NAL unit per slice, rather than one NAL unit for the whole frame.
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From david.myers at panogenics.com Thu Dec 1 06:02:05 2011
From: david.myers at panogenics.com (David J Myers)
Date: Thu, 1 Dec 2011 14:02:05 -0000
Subject: [Live-devel] StreamParser internal error (149999+ 4 >
Message-ID: <006801ccb031$cfa47750$6eed65f0$@myers@panogenics.com>

Hi Ross,
>> Surely the answer is to test the frame-size from the encoder before passing it into Live555
> Unfortunately not, because we don't know the frame size in advance; we can figure it out only by parsing/copying the data, looking for the next 'start code'. And it's this that's overflowing our stream parsing code.

In our camera software we get the data back from the encoder before passing it to Live555 via a Linux pipe. We therefore know the size of the frame, and we could change/truncate it to stay within the limits of BANK_SIZE. However, as you say, this may make the frame undecodable at the receiver. Is there a way to perform this truncation so that the frame still remains valid?
David Myers
Panogenics Ltd

From warren at etr-usa.com Thu Dec 1 09:20:45 2011
From: warren at etr-usa.com (Warren Young)
Date: Thu, 01 Dec 2011 10:20:45 -0700
Subject: [Live-devel] option to make audio louder
In-Reply-To: References: Message-ID: <4ED7B76D.7090907@etr-usa.com>

On 12/1/2011 2:19 AM, Tuan DN wrote:
> Is there any atom of mp4 container which contain sound volume value?
Digital audio doesn't have volume, any more than the transcript of a speech includes tone of voice. The closest thing I can think of to what you're asking for would be to somehow reference an audio-effects filter from within the stream, so that the downstream decompressor does the audio processing you need to raise the audio level. A bit of quick Googling for QuickTime effects suggests that the 'geff' atom might be employed for this purpose, but that atom doesn't exist in the more tightly scoped MP4 format. (See http://www.mp4ra.org/atoms.html) Ross is saying that you will probably have to decompress the audio, scale the samples, then recompress the resulting audio stream. You're not going to find a byte to diddle in the stream that magically boosts the volume.

From tuan.dn at anlab.vn Thu Dec 1 17:42:47 2011
From: tuan.dn at anlab.vn (Tuan DN)
Date: Fri, 2 Dec 2011 08:42:47 +0700
Subject: [Live-devel] option to make audio louder
In-Reply-To: <4ED7B76D.7090907@etr-usa.com>
References: <4ED7B76D.7090907@etr-usa.com>
Message-ID:

Now I know what to do next. Thank you very much.

On Fri, Dec 2, 2011 at 12:20 AM, Warren Young wrote:
> On 12/1/2011 2:19 AM, Tuan DN wrote:
> >
> > Is there any atom of mp4 container which contain sound volume value?
>
> Digital audio doesn't have volume any more than the transcript of a speech includes tone of voice.
>
> The closest thing I could think of to what you're asking for would be to somehow reference an audio effects filter from within the stream, so that the downstream decompressor does the audio processing you need to increase the audio level.
>
> A bit of quick Googling for QuickTime effects suggests that the 'geff' atom might be employed for this purpose, but that doesn't exist in the more tightly scoped MP4 format. (See http://www.mp4ra.org/atoms.html)
>
> Ross is saying that you will probably have to decompress the audio, scale the samples, then recompress the resulting audio stream.
> You're not going to find a byte to diddle in the stream to magically boost the volume.

From Marlon at scansoft.co.za Fri Dec 2 05:57:41 2011
From: Marlon at scansoft.co.za (Marlon Reid)
Date: Fri, 2 Dec 2011 15:57:41 +0200
Subject: [Live-devel] Multiple streams - same port
Message-ID: <002962EA5927BE45B2FFAB0B5B5D67970A8A47@SSTSVR1.sst.local>

Hi,
I have multiple streams working, but only if each is on its own port. I would like to have multiple streams on the same port, distinguished by the subsession URL. Am I correct in saying that I have to create one RTSP server, create one ServerMediaSession, add multiple OnDemandServerMediaSubsessions to the ServerMediaSession, and use addServerMediaSession to add the ServerMediaSession to the RTSPServer? If so, how do I remove OnDemandServerMediaSubsessions from a ServerMediaSession? Is that the correct way to implement multiple streams on one port, or is there a better way?
Thanks

From finlayson at live555.com Fri Dec 2 08:46:10 2011
From: finlayson at live555.com (Ross Finlayson)
Date: Fri, 2 Dec 2011 08:46:10 -0800
Subject: [Live-devel] Multiple streams - same port
In-Reply-To: <002962EA5927BE45B2FFAB0B5B5D67970A8A47@SSTSVR1.sst.local>
References: <002962EA5927BE45B2FFAB0B5B5D67970A8A47@SSTSVR1.sst.local>
Message-ID:

> I got multiple streams working but only if they are on their own port.
>
> I would like to have multiple streams on the same port, and have them distinguished by the subsession URL.

It's not clear to me what you mean by this. When you talk about "port" here, are you referring to the port that's used by the RTSP protocol - e.g., 554 or 8554 by default?
> Am I correct to say that I have to create one RTSP server, create one ServerMediaSession and then add multiple OnDemandServerMediaSubsessions to the ServerMediaSession and use addServerMediaSession to add the ServerMediaSession to the RTSPServer?

Once again, it's not clear exactly what you want, but I think the answer is "no". Instead, what you probably want to do is:
- create one RTSP server;
- create multiple "ServerMediaSession"s - one for each stream that you want to support - distinguished by stream name;
- for each stream that you want to support, add one "OnDemandServerMediaSubsession" (subclass) object to the corresponding "ServerMediaSession" object for each 'substream' (e.g., audio, video) in the stream.
This should become clear from looking at the code for "OnDemandServerMediaSubsession".
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From egoltzman at gmail.com Thu Dec 1 12:54:57 2011
From: egoltzman at gmail.com (eyal goltzman)
Date: Thu, 1 Dec 2011 22:54:57 +0200
Subject: [Live-devel] StreamParser internal error (149999+ 4 > 150000)
In-Reply-To: References: Message-ID:

>Date: Tue, 29 Nov 2011 16:58:09 -0800
>From: Ross Finlayson
>To: LIVE555 Streaming Media - development & use
>Subject: Re: [Live-devel] StreamParser internal error (149999+ 4 > 150000)
>Message-ID: <0B428036-277D-430B-8664-FEB4B992EE31 at live555.com>
>Content-Type: text/plain; charset="iso-8859-1"
>
>The problem here is the extremely large H.264 frame (about 280,000 bytes in size) that you have in this video. This was too big for our stream parsing code to handle.
>
>You can fix this by changing the constant BANK_SIZE in "liveMedia/StreamParser.cpp" from 150000 to 300000. (I'll also make this change in the next release of the software.)
>
>Ross Finlayson
>Live Networks, Inc.
>http://www.live555.com/

Thank you for your accurate answer - the size of the frame was indeed the problem, and increasing BANK_SIZE solved it! Just wanted to say that after I fixed it I got these messages:

MultiFramedRTPSink::afterGettingFrame1(): The input frame data was too large for our buffer size (301172). 21730 bytes of trailing data was dropped! Correct this by increasing "OutPacketBuffer::maxSize" to at least 321730, *before* creating this 'RTPSink'. (Current value is 300000.)

but the error message also gave the solution, so it was easy...
Thanks again,
Eyal Goltzman

From finlayson at live555.com Fri Dec 2 13:56:45 2011
From: finlayson at live555.com (Ross Finlayson)
Date: Fri, 2 Dec 2011 13:56:45 -0800
Subject: [Live-devel] StreamParser internal error (149999+ 4 > 150000)
In-Reply-To: References: Message-ID: <89A4C2D5-E99E-47D2-B41F-906360974F2F@live555.com>

FYI, I've now installed a new version (2011.12.02) of the "LIVE555 Streaming Media" code that avoids this problem (for H.264 parsing). You should no longer need to modify the "BANK_SIZE" constant.
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From michel.promonet at thalesgroup.com Mon Dec 5 05:54:25 2011
From: michel.promonet at thalesgroup.com (PROMONET Michel)
Date: Mon, 5 Dec 2011 14:54:25 +0100
Subject: [Live-devel] memory ownership in UserAuthenticationDatabase liveMedia/RTSPServer.cpp
Message-ID: <15797_1323093282_4EDCCD22_15797_13039_1_1BE8971B6CFF3A4F97AF4011882AA2550155F735E7EF@THSONEA01CMS01P.one.grp>

Hi,
In UserAuthenticationDatabase::addUserRecord, the value inserted in the map is allocated through strDup(password). Since the class allocates the memory, I think the class owns it, so the destructor should free it.
Is it possible to insert into the destructor something like:

  UserAuthenticationDatabase::~UserAuthenticationDatabase() {
    delete[] fRealm;
  + char* password = NULL;
  + while ((password = (char*)fTable->RemoveNext()) != NULL) {
  +   delete[] password;
  + }
    delete fTable;
  }

It is possible to free the memory through removeUserRecord, but that way one would need to keep the keys in another container in order to free them. What's your feeling about this?
Regards,
Michel.

From david.myers at panogenics.com Mon Dec 5 09:28:24 2011
From: david.myers at panogenics.com (David J Myers)
Date: Mon, 5 Dec 2011 17:28:24 -0000
Subject: [Live-devel] StreamParser internal error (149999+ 4 >
Message-ID: <008301ccb373$4bde1bc0$e39a5340$@myers@panogenics.com>

Hi Ross,
>> FYI, I've now installed a new version (2011.12.02) of the "LIVE555 Streaming Media" code that avoids this problem (for H.264 parsing). You should no longer need to modify the "BANK_SIZE" constant.

I guess I should download the patch and look at the new code, but could you just elaborate on how this now works? Can it handle any size of NAL frame? Some of the frames from our encoder can easily be over 1MB. Is the BANK_SIZE constant now redundant, or does it still matter?
Regards
David

From finlayson at live555.com Mon Dec 5 21:41:31 2011
From: finlayson at live555.com (Ross Finlayson)
Date: Mon, 5 Dec 2011 22:41:31 -0700
Subject: [Live-devel] StreamParser internal error (149999+ 4 >
In-Reply-To: <008301ccb373$4bde1bc0$e39a5340$@myers@panogenics.com>
References: <008301ccb373$4bde1bc0$e39a5340$@myers@panogenics.com>
Message-ID:

> >> FYI, I've now installed a new version (2011.12.02) of the "LIVE555 Streaming Media" code that avoids this problem (for H.264 parsing). You should no longer need to modify the "BANK_SIZE" constant.
> > I guess I should download the patch and look at the new code but could you just elaborate on how this now works? Can it handle any size of NAL frame?

Yes. You should no longer need to increase "BANK_SIZE" (at least, not for H.264 parsing).

> Some of the frames from our encoder can easily be over 1MB.

If you plan to stream these, then it would be much better to encode them as multiple 'slice' NAL units instead.
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From finlayson at live555.com Mon Dec 5 22:18:23 2011
From: finlayson at live555.com (Ross Finlayson)
Date: Mon, 5 Dec 2011 23:18:23 -0700
Subject: [Live-devel] memory ownership in UserAuthenticationDatabase liveMedia/RTSPServer.cpp
In-Reply-To: <15797_1323093282_4EDCCD22_15797_13039_1_1BE8971B6CFF3A4F97AF4011882AA2550155F735E7EF@THSONEA01CMS01P.one.grp>
References: <15797_1323093282_4EDCCD22_15797_13039_1_1BE8971B6CFF3A4F97AF4011882AA2550155F735E7EF@THSONEA01CMS01P.one.grp>
Message-ID: <070727A9-0B1E-4C08-A723-F43205F5E209@live555.com>

Yes, you've found a memory leak. The allocated 'password' strings should be deleted when the "UserAuthenticationDatabase" is deleted - but currently they aren't. This will be fixed in the next release of the software.
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From kristen.eisenberg at yahoo.com Mon Dec 5 09:05:21 2011
From: kristen.eisenberg at yahoo.com (Kristen Eisenberg)
Date: Mon, 5 Dec 2011 09:05:21 -0800 (PST)
Subject: [Live-devel] How to bypass streamlimit causes by EventTriggerIDs
Message-ID: <1323104721.94574.YahooMailNeo@web122309.mail.ne1.yahoo.com>

Hello,
I am using live555 to stream several network cameras. For that I create one RTSP server and, for every camera, a subsession on this server with a new URL.
To signal the TaskScheduler that there is a new frame for a stream, I use an EventTriggerId; every stream has its own EventTriggerId. Now I have the problem that the EventTriggerId is generated from a bitmask (0x80000000), and the line

"m_EventTriggerId = envir().taskScheduler().createEventTrigger(deliverFrame0);"

generates only 32 EventTriggerIds, so that I have a maximum of 32 stream receivers. Now my question: is it possible to solve this problem without creating more RTSP servers with different TaskSchedulers on different ports? Can you suggest how I can bypass this limit on the maximum number of stream receivers?
Thank you for your efforts.
Kristen Eisenberg
Billige Flüge Marketing GmbH
Emanuelstr. 3, 10317 Berlin
Deutschland
Telefon: +49 (33) 5310967
Email: utebachmeier at gmail.com
Site: http://flug.airego.de - Billige Flüge vergleichen

From finlayson at live555.com Tue Dec 6 19:24:55 2011
From: finlayson at live555.com (Ross Finlayson)
Date: Tue, 6 Dec 2011 20:24:55 -0700
Subject: [Live-devel] How to bypass streamlimit causes by EventTriggerIDs
In-Reply-To: <1323104721.94574.YahooMailNeo@web122309.mail.ne1.yahoo.com>
References: <1323104721.94574.YahooMailNeo@web122309.mail.ne1.yahoo.com>
Message-ID:

> "m_EventTriggerId = envir().taskScheduler().createEventTrigger(deliverFrame0);"
>
> generates only 32 EventTriggerIds, so that I have a maximum of 32 stream receivers.
>
> Now my question: is it possible to solve this problem without creating more RTSP servers with different TaskSchedulers on different ports? Can you suggest how I can bypass this limit on the maximum number of stream receivers?

Note that the limit of 32 event triggers is just in the implementation of the particular "TaskScheduler" subclass - "BasicTaskScheduler" - that we provide with the released code.
If you wish, you could implement your own subclass of "TaskScheduler" or "BasicTaskScheduler" that does something different. However, in your case I don't think you need more than 32 event triggers; in fact, I don't think you need more than one. If you are calling the same handler function - "deliverFrame0()" - each time, then you need just one event trigger for this. You can use the "clientData" parameter to specify the particular target of the event.
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From alvarezgalberto at uniovi.es Wed Dec 7 01:24:55 2011
From: alvarezgalberto at uniovi.es (Alberto Alvarez)
Date: Wed, 7 Dec 2011 10:24:55 +0100
Subject: [Live-devel] Write completed Nal / frame to sink when a new begins
Message-ID:

Hi all,
I have been looking at the doGetNextFrame1() code in MultiFramedRTPSource. As far as I can tell, this function currently sends the completed frame to its sink whenever "currentPacketCompletesFrame" is flagged, which makes perfect sense. However, I would like to be able to send it to the sink once I detect the beginning of a new frame/NAL. That would require 'passing' to the sink all appended packets but the latest, which would become part of the next delivery to the sink. Is it possible, with a few changes, to accomplish this? Also, I am a bit lost as to how the data is transferred between source and sink. Where is the data appended to the buffer from which the sink reads - is it in nextPacket->use()?
Kind Regards,
Alberto Álvarez González
alvarezgalberto at uniovi.es
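[Editor's note: the single-trigger pattern Ross describes can be shown without any LIVE555 types. The sketch below is illustrative only: DeviceSourceSim is a hypothetical stand-in for a FramedSource subclass, and the registration/signal calls appear only in comments.]

```cpp
#include <string>

// One event-trigger callback shared by every stream; the per-stream object is
// recovered from the opaque clientData pointer.  DeviceSourceSim stands in
// for a per-camera FramedSource subclass.
struct DeviceSourceSim {
    std::string name;
    int framesDelivered = 0;
    void deliverFrame() { ++framesDelivered; }   // would copy frame data out here
};

// The single shared handler, registered ONCE, e.g.:
//   eventTriggerId = scheduler.createEventTrigger(deliverFrame0);
void deliverFrame0(void* clientData) {
    static_cast<DeviceSourceSim*>(clientData)->deliverFrame();
}

// Each camera thread then signals the SAME trigger with its own pointer, e.g.:
//   scheduler.triggerEvent(eventTriggerId, &thatCamerasSource);
// so one trigger id serves any number of streams.
```

The design point is that the 32-entry bitmask limits the number of distinct handler functions, not the number of streams: the clientData argument carries the per-stream identity.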
From finlayson at live555.com Wed Dec 7 02:51:26 2011
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 7 Dec 2011 03:51:26 -0700
Subject: [Live-devel] Write completed Nal / frame to sink when a new begins
In-Reply-To: References: Message-ID: <7CD1D862-74C6-4234-8697-17C5A0DEF703@live555.com>

I'm sorry, but I don't really understand your question. However, rather than messing around modifying the supplied code - which is something you shouldn't be doing, especially since you are "a bit lost at how the data is transferred between source and sink" - why don't you instead treat the existing "MultiFramedRTPSource" code as an opaque 'black box', and tell us what you want *your* application code to do? I.e., assuming, for now at least, that you cannot modify the existing code (because you will get no support from me if you do), what problem do you have - or think you have - with it?
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From alvarezgalberto at uniovi.es Wed Dec 7 03:21:19 2011
From: alvarezgalberto at uniovi.es (Alberto Alvarez)
Date: Wed, 7 Dec 2011 12:21:19 +0100
Subject: [Live-devel] Write completed Nal / frame to sink when a new begins
In-Reply-To: References: Message-ID:

Hi,
I'm sorry - I did not explain myself. First of all, I am aware that I must not touch existing, working code; I have not made any modifications. I am just trying to understand how it works. And the code has no problem; I just want to adapt it to my needs. Let's say I would like to have a video RTP source that identifies complete frames (it now works on a NAL-by-NAL basis). The source would then wait until the frame is complete, and write it to the sink all at once. As you explained to me already, I can use the markerBit (at RTPSource) to check for the end of a frame, using it with the "completesFrame" flag (leaving MultiFramedRTPSource as a black box). I have done this and it works.
Thanks for that. But what if I can't rely on the markerBit (which the RFC already advises against)? What I have is a way to detect the beginning of a new frame, but not its end. Do you know how I can use the beginsFrame/completesFrame flags to write the frame to the sink in that situation?
Thanks - your comments are really helpful.
Alberto Álvarez González
alvarezgalberto at uniovi.es

> I'm sorry, but I don't really understand your question. However, rather than messing around modifying the supplied code, which is something that you shouldn't be doing, especially since you are "a bit lost at how the data is transferred between source and sink", why don't you instead treat the existing "MultiFramedRTPSource" code as being an opaque 'black box', and tell us what you want *your* application code to do? I.e., assuming, for now at least, that you cannot modify the existing code (because you will get no support from me if you do), what problem do you have - or think you have - with it?
>
> Ross Finlayson
> Live Networks, Inc.
> http://www.live555.com/

On Wed, Dec 7, 2011 at 10:24 AM, Alberto Alvarez wrote:
> Hi all,
>
> I have been looking at the doGetNextFrame1() code in MultiFramedRTPSource. As far as I can tell, this function currently sends the completed frame to its sink whenever "currentPacketCompletesFrame" is flagged, which makes perfect sense.
>
> However, I would like to be able to send it to the sink once I detect the beginning of a new frame/NAL. That would require 'passing' to the sink all appended packets but the latest, which would become part of the next delivery to the sink.
>
> Is it possible, with a few changes, to accomplish this?
>
> Also, I am a bit lost as to how the data is transferred between source and sink. Where is the data appended to the buffer from which the sink reads - is it in nextPacket->use()?
>
> Kind Regards,
>
> Alberto Álvarez González
> alvarezgalberto at uniovi.es
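[Editor's note: the 'flush the previous frame when the next one begins' idea can be done downstream of "H264VideoRTPSource", in the object that receives whole NAL units. The sketch below is an assumption-laden illustration, not LIVE555 code: the startsNewAccessUnit() test only checks that a coded-slice NAL's first Exp-Golomb field (first_mb_in_slice) is 0, ignoring emulation-prevention bytes and the other conditions the H.264 spec lists for access-unit boundaries.]

```cpp
#include <vector>
#include <cstdint>

using Nal = std::vector<uint8_t>;  // one NAL unit, starting with its header byte

// Crude new-picture test: a coded-slice NAL (types 1-5) whose slice header
// begins with first_mb_in_slice == 0, i.e. the ue(v) codeword "1" (top bit of
// the byte after the NAL header).  A real implementation must fully parse the
// slice header.
static bool startsNewAccessUnit(const Nal& nal) {
    if (nal.size() < 2) return false;
    uint8_t type = nal[0] & 0x1F;
    if (type < 1 || type > 5) return false;   // not a coded slice (SPS/PPS/SEI/...)
    return (nal[1] & 0x80) != 0;              // first_mb_in_slice == 0
}

class AccessUnitAssembler {
public:
    // Feed one NAL unit; returns the access unit (possibly empty) that was
    // completed by the arrival of this NAL.
    std::vector<Nal> addNal(const Nal& nal) {
        std::vector<Nal> finished;
        if (startsNewAccessUnit(nal) && sawSlice_) {
            finished.swap(pending_);          // previous frame is now complete
            sawSlice_ = false;
        }
        uint8_t type = nal.empty() ? 0 : (nal[0] & 0x1F);
        if (type >= 1 && type <= 5) sawSlice_ = true;
        pending_.push_back(nal);
        return finished;
    }
private:
    std::vector<Nal> pending_;                // NALs of the frame in progress
    bool sawSlice_ = false;                   // pending_ contains a coded slice
};
```

Note the inherent one-frame latency: the last frame of the stream is only flushed by an explicit end-of-stream call, which is one reason the marker bit (when trustworthy) is preferable.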
From finlayson at live555.com Wed Dec 7 04:09:24 2011
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 7 Dec 2011 05:09:24 -0700
Subject: [Live-devel] Write completed Nal / frame to sink when a new begins
In-Reply-To: References: Message-ID: <48A6C707-2D05-46CF-B9DD-A3E6B1670786@live555.com>

> First of all, I am aware that I must not be touching existent and working code. I have not done any modifications. I am just trying to understand how it works. And the code has no problem. I just want to adapt it to my needs.

You should be thinking in terms of "subclassing the code for your needs", not "adapting the code to your needs".

> Let's say I would like to have a Video RTP source that identifies complete frames (now it does so on a NAL-by-NAL basis). Then the source waits until the frame is completed, to write it to the sink at once. As you explained to me already, I can use the markerBit (at RTPSource) to check the end of a frame, using it with the "completesFrame" flag (leaving MultiFramedRTPSource as a black box). I have done it and it works. Thanks for that.
>
> But what if I can't rely on the markerBit (which the RFC already advises against)? What I have is a way to detect the beginning of a new frame, but not its end. Do you know how I can use the beginsFrame/completesFrame flags to write the frame to the sink in that situation?

You can't, because the "H264VideoRTPSource" code sets these flags based on the beginning/end of each NAL unit - because that (i.e., one NAL unit at a time) is what it delivers to its downstream object. The right place to detect "video frame" (i.e., "access unit") boundaries is in your H.264 decoder - or whatever downstream object receives NAL units from your "H264VideoRTPSource" - because that is what decoders do (i.e., render video frames). The RTP receiving code, on the other hand, would be the *wrong* place to do this, because, for H.264, RTP packets contain "NAL units", not "access units" ('video frames').
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From jshanab at smartwire.com Wed Dec 7 08:36:58 2011
From: jshanab at smartwire.com (Jeff Shanab)
Date: Wed, 7 Dec 2011 16:36:58 +0000
Subject: [Live-devel] shutdown rtsp client subclass without stopping program.
Message-ID: <615FD77639372542BF647F5EBAA2DBC20B1A2787@IL-BOL-EXCH01.smartwire.com>

I am trying to shut down my subclassed RTSP client, and because of the migration to async I am having a hard time avoiding access violations. It seems the object is destroyed when I call Medium::close(), and then it tries to handle an incoming response or data and crashes when there is no RequestQueue - so fHead is invalid and fHead->next crashes. I must not be doing this correctly. I am following the shutdown logic in openRTSP, but that uses exit(), so it is a bit different. What is the proper way? Medium::close(ourClient) now closes ourselves, since we subclassed.

From finlayson at live555.com Wed Dec 7 08:57:53 2011
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 7 Dec 2011 09:57:53 -0700
Subject: [Live-devel] shutdown rtsp client subclass without stopping program.
In-Reply-To: <615FD77639372542BF647F5EBAA2DBC20B1A2787@IL-BOL-EXCH01.smartwire.com>
References: <615FD77639372542BF647F5EBAA2DBC20B1A2787@IL-BOL-EXCH01.smartwire.com>
Message-ID:

> I am trying to shut down my subclassed RTSP client, and because of the migration to async I am having a hard time avoiding access violations.
>
> It seems the object is destroyed when I call Medium::close(), and then it tries to handle an incoming response or data and crashes when there is no RequestQueue. So fHead is invalid and fHead->next crashes.
>
> I must not be doing this correctly. I am following the shutdown logic in openRTSP, but that uses exit(), so it is a bit different.
The code in the "continueAfterTEARDOWN()" function is what you should be doing, I think; just leave off the call to "exit()" at the end. If you use this code (after you've done a RTSP "TEARDOWN"), then I don't see how you could still be receiving any incoming packets for that session. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jshanab at smartwire.com Wed Dec 7 09:41:11 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Wed, 7 Dec 2011 17:41:11 +0000 Subject: [Live-devel] shutdown rtsp client subclass without stopping program. In-Reply-To: References: <615FD77639372542BF647F5EBAA2DBC20B1A2787@IL-BOL-EXCH01.smartwire.com> Message-ID: <615FD77639372542BF647F5EBAA2DBC20B1A27D6@IL-BOL-EXCH01.smartwire.com> That is exactly what I am trying. :( From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Wednesday, December 07, 2011 10:58 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] shutdown rtsp client subclass without stopping program. I am trying to shut down my subclassed RTSP client and, because of the migration to async, I am having a hard time avoiding access violations. It seems like the object is destroyed when I call Medium::close, and then it tries to handle an incoming response or data and crashes when there is no RequestQueue. So fHead is invalid and fHead->next crashes. I must not be doing this correctly. I am following the shutdown logic in openRTSP, but it uses exit() so it is a bit different. The code in the "continueAfterTEARDOWN()" function is what you should be doing, I think; just leave off the call to "exit()" at the end. If you use this code (after you've done a RTSP "TEARDOWN"), then I don't see how you could still be receiving any incoming packets for that session. Ross Finlayson Live Networks, Inc.
http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From 6.45.vapuru at gmail.com Wed Dec 7 08:26:45 2011 From: 6.45.vapuru at gmail.com (6.45 6.45.Vapuru) Date: Wed, 7 Dec 2011 18:26:45 +0200 Subject: [Live-devel] Creating multiple OpenRtspClient [ or destroying current client in a proper way ] Message-ID: Hi, In openRTSP I modified the shutdown() method: I removed "exit(exitValue)", because I want to create a new client without exiting the program. When I use the live555 client I see that the env object is not destroyed when the client's [modified] shutdown method is called. So I decided to use env->reclaim(), but it does not destroy the object either, since env->liveMediaPriv is not NULL. And I cannot delete any FileSink object using delete, because their destructors are not public... The FileSink objects remain... I want a "new born" client without exiting the program... How can I get a real "shutdown" method which destroys all objects and gives me a new one? Has anyone modified the code so that it can create multiple clients? Best Wishes -------------- next part -------------- An HTML attachment was scrubbed... URL: From tayeb.dotnet at gmail.com Tue Dec 6 11:40:24 2011 From: tayeb.dotnet at gmail.com (Meftah Tayeb) Date: Tue, 6 Dec 2011 21:40:24 +0200 Subject: [Live-devel] auto playlist generation Message-ID: hello, I have a hundred MP3 files that I want to stream to mobile devices, but it is hard to generate an m3u or any other kind of playlist. Can the live555 media server do that? Or is there any URL to stream them all, one by one? thank you Meftah Tayeb IT Consulting http://www.tmvoip.com/ phone: +21321656139 Mobile: +213660347746 -------------- next part -------------- An HTML attachment was scrubbed...
URL: From finlayson at live555.com Wed Dec 7 19:55:28 2011 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 7 Dec 2011 20:55:28 -0700 Subject: [Live-devel] Creating multiple OpenRtspClient [ or destroying current client in a proper way ] In-Reply-To: References: Message-ID: <376A29B8-A134-4157-85F7-867BA3357B86@live555.com> This software is not intended to be modified by people who use "@gmail.com" email addresses. (Generally speaking, such people lack the necessary technical sophistication.) But in your case, you probably don't need to modify the "openRTSP" code at all. Because you want to just open multiple "rtsp://" URLs, one after the other, you can do so simply by running (the original, unmodified) "openRTSP" application multiple times, in succession - e.g., from a shell script. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Dec 7 19:57:15 2011 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 7 Dec 2011 20:57:15 -0700 Subject: [Live-devel] auto playlist generation In-Reply-To: References: Message-ID: <74D378F9-9FCC-4294-84D5-51C92951428B@live555.com> > I have a hundred MP3 files > that I want to stream to mobile devices > but it is hard to generate an m3u or any kind of playlist > can the live555 media server do that? No, not unless you give the client a playlist. The simplest way to do what you want is simply to concatenate together all of your MP3 files into a single new file, and just make that file available to be streamed by the server. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From Marlon at scansoft.co.za Thu Dec 8 00:00:04 2011 From: Marlon at scansoft.co.za (Marlon Reid) Date: Thu, 8 Dec 2011 10:00:04 +0200 Subject: [Live-devel] Stop specific streams.
Message-ID: <002962EA5927BE45B2FFAB0B5B5D67970A8A64@SSTSVR1.sst.local> Hi Ross, Thanks for all your assistance in the past; I am now able to play multiple streams from the same port. Everything is working as I need it to, but I have another question: How do I stop a specific stream if multiple streams are running? Let me first tell you how I am creating these multiple streams: 1. Create an RTSPServer, if none exists. I make use of only 1 RTSP server. 2. Create a new ServerMediaSession for each stream. This ServerMediaSession is saved in a map with a unique identifier for the stream. 3. Create a new OnDemandMediaSubsession for each stream, depending on the type of media. 4. Add the OnDemandMediaSubsession to the ServerMediaSession. 5. Add the ServerMediaSession to the RTSP Server. 6. Run doEventLoop if it has not been started yet. If it has, do nothing. Note that I have only one RTSP server and one doEventLoop, regardless of how many streams run. I am aware that in order to stop a stream, a watchVariable is used. I am able to stop ALL streams using this method, but I cannot stop a specific stream. I cannot see how it is possible to stop only a specific stream if I make use of only one doEventLoop. If I have a doEventLoop for every stream, then the app crashes, so I assume that is not the correct way to do it. Any suggestions will be appreciated. Regards. -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Dec 8 00:46:24 2011 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 8 Dec 2011 01:46:24 -0700 Subject: [Live-devel] Stop specific streams. In-Reply-To: <002962EA5927BE45B2FFAB0B5B5D67970A8A64@SSTSVR1.sst.local> References: <002962EA5927BE45B2FFAB0B5B5D67970A8A64@SSTSVR1.sst.local> Message-ID: <968E6CAF-1641-4C33-A1F4-F1B0C4C57593@live555.com> > Let me first tell you how I am creating these multiple streams: > > 1. Create an RTSPServer, if none exists. I make use of only 1 RTSP server. > 2.
Create a new ServerMediaSession for each stream. This ServerMediaSession is saved in a map with a unique identifier for the stream. > 3. Create a new OnDemandMediaSubsession for each stream, depending on the type of media. > 4. Add the OnDemandMediaSubsession to the ServerMediaSession. > 5. Add the ServerMediaSession to the RTSP Server. > 6. Run doEventLoop if it has not been started yet. If it has, do nothing. > > Note that I have only one RTSP server and one doEventLoop, regardless of how many streams run. > > I am aware that in order to stop a stream, a watchVariable is used. I am able to stop ALL streams using this method, but I cannot stop a specific stream. First, you may find it better to use the new "event trigger" mechanism - see "UsageEnvironment/include/UsageEnvironment.hh" - rather than the "watchVariable" mechanism. (If you use "event triggers", then the "clientData" parameter to "triggerEvent()" could be used to (somehow) indicate a specific stream that you want to 'stop'.) But alternatively, if you want to still use the "watchVariable" mechanism, then you can do so by having your 'signalling' thread (i.e., the one that sets the 'watch variable', but does not otherwise access any LIVE555 objects) also set some global variable in such a way as to indicate which specific stream you want to 'stop'. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From Marlon at scansoft.co.za Thu Dec 8 05:16:47 2011 From: Marlon at scansoft.co.za (Marlon Reid) Date: Thu, 8 Dec 2011 15:16:47 +0200 Subject: [Live-devel] Stop specific streams. In-Reply-To: <968E6CAF-1641-4C33-A1F4-F1B0C4C57593@live555.com> References: <002962EA5927BE45B2FFAB0B5B5D67970A8A64@SSTSVR1.sst.local> <968E6CAF-1641-4C33-A1F4-F1B0C4C57593@live555.com> Message-ID: <002962EA5927BE45B2FFAB0B5B5D67970A8A66@SSTSVR1.sst.local> Hi Ross, Thanks for the reply.
The part that I don't understand is that if I only have one doEventLoop for any number of streams, how can I stop that event loop without stopping all of the streams? Surely if I stop my one and only eventLoop, then none of the streams will work. I have been experimenting with closing the desired ServerMediaSession and the OnDemandMediaSession, but so far no success. Any advice would be appreciated. ________________________________ From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: 08 December 2011 10:46 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Stop specific streams. Let me first tell you how I am creating these multiple streams: 1. Create an RTSPServer, if none exists. I make use of only 1 RTSP server. 2. Create a new ServerMediaSession for each stream. This ServerMediaSession is saved in a map with a unique identifier for the stream. 3. Create a new OnDemandMediaSubsession for each stream, depending on the type of media. 4. Add the OnDemandMediaSubsession to the ServerMediaSession. 5. Add the ServerMediaSession to the RTSP Server. 6. Run doEventLoop if it has not been started yet. If it has, do nothing. Note that I have only one RTSP server and one doEventLoop, regardless of how many streams run. I am aware that in order to stop a stream, a watchVariable is used. I am able to stop ALL streams using this method, but I cannot stop a specific stream. First, you may find it better to use the new "event trigger" mechanism - see "UsageEnvironment/include/UsageEnvironment.hh" - rather than the "watchVariable" mechanism. (If you use "event triggers", then the "clientData" parameter to "triggerEvent()" could be used to (somehow) indicate a specific stream that you want to 'stop'.)
But alternatively, if you want to still use the "watchVariable" mechanism, then you can do so by having your 'signalling' thread (i.e., the one that sets the 'watch variable', but does not otherwise access any LIVE555 objects) also set some global variable in such a way as to indicate which specific stream you want to 'stop'. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jshanab at smartwire.com Thu Dec 8 06:16:05 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Thu, 8 Dec 2011 14:16:05 +0000 Subject: [Live-devel] Creating multiple OpenRtspClient [ or destroying current client in a proper way ] In-Reply-To: <376A29B8-A134-4157-85F7-867BA3357B86@live555.com> References: <376A29B8-A134-4157-85F7-867BA3357B86@live555.com> Message-ID: <615FD77639372542BF647F5EBAA2DBC20B1A2BB6@IL-BOL-EXCH01.smartwire.com> I too am having difficulty shutting down my RTSP clients. When I went to async, I ended up with things happening after the dtor occasionally, or a memory leak. I realized the openRTSP test app avoids this with an exit(), which is not an option for me. I need one app that manages multiple concurrent RTSP connections that may independently go up and down at will as well as on demand - hundreds or even thousands of RTSP sources! Refactoring for multiple processes is not in the budget. I think this is a great library, but I must take exception to the gmail statement; I have seen it many times before on this list and I think it is just no longer true. (Kinda like how you couldn't validate your new VISA card with a cell phone in the past.) I have my old earthlink account, but that is tied to one machine. I have the email of my current job, and I have a long-standing gmail that I use for everything else because it works on every computer and my smartphone.
I had originally wished I had signed up for this email list on gmail so I didn't have to drive 10 miles to see if there had been a reply (or remote-desktop in, or otherwise POLL for a response) - until I saw these statements. I do NOT want this to start a big conversation on gmail vs foo and pollute this list; I just want to point out that the statement may need to be re-evaluated. Gmail is not Hotmail, and the "generally speaking" conclusion has no relation to a person's email address. (jshanab42 at gmail.com) From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Wednesday, December 07, 2011 9:55 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Creating multiple OpenRtspClient [ or destroying current client in a proper way ] This software is not intended to be modified by people who use "@gmail.com" email addresses. (Generally speaking, such people lack the necessary technical sophistication.) But in your case, you probably don't need to modify the "openRTSP" code at all. Because you want to just open multiple "rtsp://" URLs, one after the other, you can do so simply by running (the original, unmodified) "openRTSP" application multiple times, in succession - e.g., from a shell script. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Dec 8 07:42:36 2011 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 8 Dec 2011 08:42:36 -0700 Subject: [Live-devel] Stop specific streams.
In-Reply-To: <002962EA5927BE45B2FFAB0B5B5D67970A8A66@SSTSVR1.sst.local> References: <002962EA5927BE45B2FFAB0B5B5D67970A8A64@SSTSVR1.sst.local> <968E6CAF-1641-4C33-A1F4-F1B0C4C57593@live555.com> <002962EA5927BE45B2FFAB0B5B5D67970A8A66@SSTSVR1.sst.local> Message-ID: <47DBD27E-2965-4EC4-82CD-DB4145BC3CF3@live555.com> > The part that I don't understand is that if I only have one doEventLoop for any number of streams, how can I stop that event loop without stopping all of the streams? Surely if I stop my one and only eventLoop, then none of the streams will work. Yes, but the way that programmers typically use "doEventLoop()" with a 'watch variable' is inside a loop - e.g. while (1) { env.taskScheduler().doEventLoop(&watchVariable); // handle the setting of the 'watch variable', then reenter the event loop } This is why it's better to use the new "event trigger" mechanism. That way, you can call "doEventLoop()" just once, and never leave it. (In this case, the 'trigger handler' routines are all called from within the event loop.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Dec 8 08:09:28 2011 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 8 Dec 2011 09:09:28 -0700 Subject: [Live-devel] Creating multiple OpenRtspClient [ or destroying current client in a proper way ] In-Reply-To: <615FD77639372542BF647F5EBAA2DBC20B1A2BB6@IL-BOL-EXCH01.smartwire.com> References: <376A29B8-A134-4157-85F7-867BA3357B86@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1A2BB6@IL-BOL-EXCH01.smartwire.com> Message-ID: > I too am having difficulty shutting down my RTSP clients. When I went to async, I ended up with things happening after the dtor occasionally, or a memory leak.
If things worked OK when you were using the synchronous interface, but fail only now that you are using the asynchronous interface, then this is probably a programming error on your part. Most likely, you are inadvertently closing objects that are associated with streams that are still running. The only real difference between the synchronous interface and the asynchronous interface is that the latter allows you to do things while RTSP requests are still 'in progress'. If, instead, you don't do anything with a stream until you get a RTSP response, then you have exactly the same functionality as the synchronous interface. > I think this is a great library but I must take exception to the gmail statement That's fine, but I will not back down on this. Using a "@gmail" (or "@yahoo" or "@hotmail" etc.) email address is the modern-day equivalent of wearing a dunce cap. I will continue to use such email addresses as a first-level "bozo filter" when deciding which messages are worth responding to. If you look through the mailing list archives and note the widespread cluelessness displayed by most people who use such addresses, you'll see what I mean. Note that you can, if you wish, use Google's 'GMail' with your own domain name; that's fine with me. (I don't care what mail delivery/storage system you use for your email; only the domain name that your email address uses - i.e., the face that you're presenting to the world.) This is a professional mailing list, and if you want to be taken seriously on it, you should use a professional email address. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed...
URL: From buivuhoang15 at gmail.com Thu Dec 8 02:33:56 2011 From: buivuhoang15 at gmail.com (Hoang Bui Vu) Date: Thu, 8 Dec 2011 17:33:56 +0700 Subject: [Live-devel] How to create a custom sink that streams to both the network and a local file Message-ID: I'm currently doing a live camera streaming project that needs a history function. I've done some searching and come to the solution of creating a custom sink that is a combination of MPEG4ESVideoRTPSink and FileSink, which will stream to both a local file and the network. What I've done is create a new class called RTPFileSink which is the merged version of the two classes. I modified the constructor and the createNew function to take parameters from both classes. However, the resulting class seems to be in conflict, as the server continuously tries to read from the source and fails. The output looks like this: Beginning to read from file... ...done reading from file Beginning to read from file... ...done reading from file Beginning to read from file... ...done reading from file Beginning to read from file... ...done reading from file Beginning to read from file... ...done reading from file and it goes on forever. My guess is that the merged code contains two different source readers and streamers, so they prevent each other from reading the source. I am still very new to this framework; any help is much appreciated. -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Dec 8 11:41:59 2011 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 8 Dec 2011 12:41:59 -0700 Subject: [Live-devel] How to create a custom sink that streams to both the network and a local file In-Reply-To: References: Message-ID: <900D5E18-98A6-446B-B025-8A078FB91691@live555.com> > I'm currently doing a live camera streaming project that needs a history function.
I've done some searching and come to the solution of creating a custom sink that is a combination of MPEG4ESVideoRTPSink and FileSink, which will stream to both a local file and the network. A simpler solution would be to write a new 'filter' class (i.e., a subclass of "FramedFilter") that simply delivers its input to its output, but also writes to a file. Then, add an object of this class to the end of your data stream (i.e., before you feed it into a sink). That way, you won't need to create (or modify) any "MediaSink" class at all. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From Marlon at scansoft.co.za Fri Dec 9 01:04:45 2011 From: Marlon at scansoft.co.za (Marlon Reid) Date: Fri, 9 Dec 2011 11:04:45 +0200 Subject: [Live-devel] Stop specific streams. In-Reply-To: <47DBD27E-2965-4EC4-82CD-DB4145BC3CF3@live555.com> References: <002962EA5927BE45B2FFAB0B5B5D67970A8A64@SSTSVR1.sst.local><968E6CAF-1641-4C33-A1F4-F1B0C4C57593@live555.com><002962EA5927BE45B2FFAB0B5B5D67970A8A66@SSTSVR1.sst.local> <47DBD27E-2965-4EC4-82CD-DB4145BC3CF3@live555.com> Message-ID: <002962EA5927BE45B2FFAB0B5B5D67970A8A6C@SSTSVR1.sst.local> Hi, So I am trying to create an event trigger, but I am unsure how to proceed. The taskScheduler->createEventTrigger() function expects a TaskFunc object and I am not sure what this is. Is it similar to a callback? If so, how do I cast my function to a TaskFunc object? I currently have something like this, which is way off: void killStream() {}; taskSched->createEventTrigger(&killStream); I am also still unsure about the mechanisms for actually stopping the streams. In the past killing the eventLoop was fine, but if I want to stop a single stream and leave the eventLoop running, how do I do it? Do I Medium::close() the appropriate ServerMediaSubsession? Do I remove it from the rtspServer? I tried these things but none seem to work.
Thanks ________________________________ From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: 08 December 2011 05:43 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Stop specific streams. The part that I don't understand is that if I only have one doEventLoop for any number of streams, how can I stop that event loop without stopping all of the streams? Surely if I stop my one and only eventLoop, then none of the streams will work. Yes, but the way that programmers typically use "doEventLoop()" with a 'watch variable' is inside a loop - e.g. while (1) { env.taskScheduler().doEventLoop(&watchVariable); // handle the setting of the 'watch variable', then reenter the event loop } This is why it's better to use the new "event trigger" mechanism. That way, you can call "doEventLoop()" just once, and never leave it. (In this case, the 'trigger handler' routines are all called from within the event loop.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Dec 9 08:40:35 2011 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 9 Dec 2011 09:40:35 -0700 Subject: [Live-devel] Stop specific streams. In-Reply-To: <002962EA5927BE45B2FFAB0B5B5D67970A8A6C@SSTSVR1.sst.local> References: <002962EA5927BE45B2FFAB0B5B5D67970A8A64@SSTSVR1.sst.local><968E6CAF-1641-4C33-A1F4-F1B0C4C57593@live555.com><002962EA5927BE45B2FFAB0B5B5D67970A8A66@SSTSVR1.sst.local> <47DBD27E-2965-4EC4-82CD-DB4145BC3CF3@live555.com> <002962EA5927BE45B2FFAB0B5B5D67970A8A6C@SSTSVR1.sst.local> Message-ID: <4AAFBCEB-BAE4-4DBF-A501-BBE201A5B799@live555.com> > So I am trying to create an event trigger, but I am unsure how to proceed. The taskScheduler->createEventTrigger() function expects a TaskFunc object and I am not sure what this is. It's quite simple.
It's this: typedef void TaskFunc(void* clientData); I.e., it's a void function that takes a "void*" as argument. > Is it similar to a callback? Yes, but it's 'called back' from within the event loop, when the event is 'triggered'. > If so, how do I cast my function to a TaskFunc object? I currently have something like this, which is way off: > > void killStream() {}; "killStream()" needs a single void* parameter. Don't you want to specify a specific stream that you want to 'kill'? If so, then you can use that as your parameter (cast it to a void*) when you later call "triggerEvent()". > taskSched->createEventTrigger(&killStream); Yes, but you'll need to remember the result "EventTriggerId" of this call, so you can later use it in your call to "triggerEvent()". If you're still unsure about how to use event triggers, then you can see an example in "liveMedia/DeviceSource.cpp". > I am also still unsure about the mechanisms for actually stopping the streams. OK, this is a completely different question. To remove (and delete) a "ServerMediaSession" object from the server, simply call RTSPServer::removeServerMediaSession() ***Do not*** call "Medium::close()" on the "ServerMediaSession" object; that is done automatically by "removeServerMediaSession()". Note, however, that although this will prevent any future clients from accessing this stream, it will not stop any streaming that's currently ongoing for this "ServerMediaSession". If you want to do that, you would need to delete the "RTSPServer::RTSPClientSession" object for each currently active client. (This has the same effect as the client having done a RTSP "TEARDOWN".) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jshanab at smartwire.com Fri Dec 9 10:09:35 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Fri, 9 Dec 2011 18:09:35 +0000 Subject: [Live-devel] Stop specific streams.
In-Reply-To: <4AAFBCEB-BAE4-4DBF-A501-BBE201A5B799@live555.com> References: <002962EA5927BE45B2FFAB0B5B5D67970A8A64@SSTSVR1.sst.local><968E6CAF-1641-4C33-A1F4-F1B0C4C57593@live555.com><002962EA5927BE45B2FFAB0B5B5D67970A8A66@SSTSVR1.sst.local> <47DBD27E-2965-4EC4-82CD-DB4145BC3CF3@live555.com> <002962EA5927BE45B2FFAB0B5B5D67970A8A6C@SSTSVR1.sst.local> <4AAFBCEB-BAE4-4DBF-A501-BBE201A5B799@live555.com> Message-ID: <615FD77639372542BF647F5EBAA2DBC20B1A305D@IL-BOL-EXCH01.smartwire.com> I happen to use the following data class because I want to hang onto a few items.

// Callback instance and session pointer container.
class MVSRTSPClient;
class MVSClientData {
public:
    MVSClientData(MVSRTSPClient* instancePtr, MediaSubsession* subsessionPtr)
        : instancePtr_(instancePtr), subsessionPtr_(subsessionPtr) {}
    MVSRTSPClient* instancePtr_;
    MediaSubsession* subsessionPtr_;
};

Then I just cast it when I get called back, i.e.:

void callbackSubsessionAfterPLAYING(void* vclient) {
    boost::mutex::scoped_lock lock(MVS::CMVSRTSPClient::AFTERPlayMutex_);

    // Begin by closing this media subsession's stream:
    MVSClientData* client = (MVSClientData*) vclient;
    MediaSubsession* subsession = (MediaSubsession*) client->subsessionPtr_;
    Medium::close(subsession->sink);
    subsession->sink = NULL;

    // Next, check whether *all* subsessions' streams have now been closed:
    MediaSession& session = subsession->parentSession();
    MediaSubsessionIterator iter(session);
    while ((subsession = iter.next()) != NULL) {
        if (subsession->sink != NULL) return; // this subsession is still active
    }

    // All subsessions' streams have now been closed
    MVSRTSPClient* mySelf = (MVSRTSPClient*) client->instancePtr_;
    mySelf->sessionAfterPlaying();
}

The locking is because I have many instances and many threads. From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Friday, December 09, 2011 10:41 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel]
Stop specific streams. So I am trying to create an event trigger, but I am unsure how to proceed. The taskScheduler->createEventTrigger() function expects a TaskFunc object and I am not sure what this is. It's quite simple. It's this: typedef void TaskFunc(void* clientData); I.e., it's a void function that takes a "void*" as argument. Is it similar to a callback? Yes, but it's 'called back' from within the event loop, when the event is 'triggered'. If so, how do I cast my function to a TaskFunc object? I currently have something like this, which is way off: void killStream() {}; "killStream()" needs a single void* parameter. Don't you want to specify a specific stream that you want to 'kill'? If so, then you can use that as your parameter (cast it to a void*) when you later call "triggerEvent()". taskSched->createEventTrigger(&killStream); Yes, but you'll need to remember the result "EventTriggerId" of this call, so you can later use it in your call to "triggerEvent()". If you're still unsure about how to use event triggers, then you can see an example in "liveMedia/DeviceSource.cpp". I am also still unsure about the mechanisms for actually stopping the streams. OK, this is a completely different question. To remove (and delete) a "ServerMediaSession" object from the server, simply call RTSPServer::removeServerMediaSession() ***Do not*** call "Medium::close()" on the "ServerMediaSession" object; that is done automatically by "removeServerMediaSession()". Note, however, that although this will prevent any future clients from accessing this stream, it will not stop any streaming that's currently ongoing for this "ServerMediaSession". If you want to do that, you would need to delete the "RTSPServer::RTSPClientSession" object for each currently active client. (This has the same effect as the client having done a RTSP "TEARDOWN".) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed...
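[Editor's note] The TaskFunc/event-trigger idiom discussed in this thread can be illustrated outside LIVE555 with a toy scheduler. `MiniScheduler` is a hypothetical stand-in, not the real TaskScheduler class; what it shows is the shape of the API: handlers of type `void (*)(void*)` are registered under an id, and `triggerEvent()` later invokes one with a caller-supplied clientData pointer, which is how a specific stream can be identified:

```cpp
#include <cassert>
#include <map>
#include <string>

// The callback type: a void function taking a single void* argument,
// matching the TaskFunc typedef quoted above.
typedef void TaskFunc(void* clientData);
typedef unsigned EventTriggerId;

// Toy scheduler (illustrative only). In LIVE555 the handler would be
// invoked from within doEventLoop(); here we invoke it directly.
class MiniScheduler {
    std::map<EventTriggerId, TaskFunc*> handlers;
    EventTriggerId nextId;
public:
    MiniScheduler() : nextId(1) {}

    EventTriggerId createEventTrigger(TaskFunc* f) {
        handlers[nextId] = f;
        return nextId++;       // caller must remember this id
    }

    void triggerEvent(EventTriggerId id, void* clientData) {
        handlers[id](clientData);
    }
};

// A handler with the required signature; clientData identifies the stream.
static std::string lastKilled;
void killStream(void* clientData) {
    lastKilled = *static_cast<std::string*>(clientData);
    // ...here you would remove the ServerMediaSession for that stream...
}
```

Note that `killStream` takes a `void*`, which is exactly the fix Ross points out: the stream identifier travels through that pointer, not through the function's name.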
URL: From TWiser at logostech.net Fri Dec 9 15:38:02 2011 From: TWiser at logostech.net (Wiser, Tyson) Date: Fri, 9 Dec 2011 15:38:02 -0800 Subject: [Live-devel] Live555 limiting network traffic? Message-ID: <8CD7A9204779214D9FDC255DE48B9521ADB23061@EXPMBX105-1.exch.logostech.net> This is kind of a shot in the dark, but I have been puzzling over a problem for several days and I am hoping someone here can either rule out Live555 or confirm that it could be contributing. I am running an RTSP server application based on Live555 that receives live H.264 encoded video from a hardware encoder (which gets its data from a camera), muxes it into an MPEG2-TS and then sends it out as multicast raw UDP data (our customer's requirements do not allow us to use RTP/RTCP). My setup is as follows. I get a signal from a camera that a new frame is available, which triggers the encoder to encode the frame and then pass it off to my server. After my server muxes the data it writes it to a Linux pipe that is used as the input to a ByteStreamFileSource. The ByteStreamFileSource is used as the input to an MPEG2TransportStreamFramer. The MPEG2TransportStreamFramer is returned by an OnDemandServerMediaSubsession subclass's createNewStreamSource function. This has been working very well for us, though we have noticed what we have assumed were decoder buffering problems when viewing the live video. Recently, however, I was analyzing a Wireshark capture and noticed a very strange traffic pattern (see attached image) that I am now convinced has caused most if not all of what we were considering decoder buffering problems. The graph plots the number of packets received per millisecond. The blue bars indicate the receipt of the signal that a new frame is available from the camera. The red bars indicate the receipt of the resulting multicast stream packets. 
For about the first quarter of the graph you will see the expected pattern of frame available signal followed by a burst of multicast traffic followed by nothing until the next frame available signal (our camera frame rate is about 7Hz). The remaining three quarters of the graph, however, show the pattern that we get most of the time. After a signal we will occasionally get a burst of multicast traffic, but most often we seem to be limited to one multicast packet every 10 milliseconds or so. Occasionally it will clear up and go back to the expected pattern for a brief time period but then it will return to the bad pattern. It is definitely not a firewall, bandwidth, or networking equipment issue. Is there anything in BasicUDPSink (which is created by OnDemandServerMediaSubsession for raw UDP transfer), MPEG2TransportStreamFramer, or ByteStreamFileSource that could somehow be limiting the rate at which the UDP packets are sent under any circumstances? Thanks in advance for your attention and response. Tyson Wiser -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: multicast-traffic-graph.png Type: image/png Size: 10341 bytes Desc: multicast-traffic-graph.png URL: From finlayson at live555.com Sat Dec 10 02:48:47 2011 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 10 Dec 2011 03:48:47 -0700 Subject: [Live-devel] Live555 limiting network traffic? In-Reply-To: <8CD7A9204779214D9FDC255DE48B9521ADB23061@EXPMBX105-1.exch.logostech.net> References: <8CD7A9204779214D9FDC255DE48B9521ADB23061@EXPMBX105-1.exch.logostech.net> Message-ID: > Is there anything in BasicUDPSink (which is created by OnDemandServerMediaSubsession for raw UDP transfer), MPEG2TransportStreamFramer, or ByteStreamFileSource that could somehow be limiting the rate at which the UDP packets are sent under any circumstances? Yes. 
"BasicUDPSink" (and its RTP equivalent) delays after sending each packet, depending upon the "duration" (specifically, "fDurationInMicroseconds") parameter that has been set for the chunk of data that it has just received. If, however, this duration parameter is zero (its default value), then "BasicUDPSink" - after sending out each packet - immediately asks for new data from its upstream object. This "duration" parameter is important when you're streaming from a pre-recorded file, because otherwise the file data would be streamed as fast as possible, which is not what you want. (When you're streaming from a file, you want to stream it out at its 'natural rate'.) For Transport Stream data, the "MPEG2TransportStreamFramer" object is used to parse the Transport Stream data to estimate the "duration" of each chunk of data that gets passed to the "BasicUDPSink" (or an equivalent RTP sink). However, because you're streaming from a live input source, you don't need to compute this "duration" parameter. Instead, you can just send out data as soon as it's generated. This means that you don't need a "MPEG2TransportStreamFramer"; you can therefore omit this in your implementation of the "createNewStreamSource()" virtual function (in your "OnDemandServerMediaSubsession" subclass). Instead, you should set the (normally optional) "preferredFrameSize" parameter to "ByteStreamFileSource::createNew()", to ensure that you read properly-sized chunks of data (a multiple of the 188-byte Transport 'packet' size) each time. (It's likely that your receivers care about this.) I suggest setting this parameter to 7*188 (==1316), to ensure that your outgoing UDP packets (probably) won't get fragmented in the network. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed...
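The 7*188 figure above follows from ordinary MTU arithmetic; a minimal sketch (the constant names are illustrative, and the 28-byte header overhead assumes IPv4 without options plus UDP):

```cpp
// Why 7*188: each Transport Stream packet is 188 bytes, and 7 of them
// (1316 bytes) is the largest whole multiple of 188 that still fits in a
// typical 1500-byte Ethernet MTU after the 20-byte IPv4 and 8-byte UDP
// headers, so the datagram (probably) avoids IP fragmentation.
constexpr int kTsPacketSize   = 188;
constexpr int kEthernetMtu    = 1500;
constexpr int kIpv4UdpHeaders = 20 + 8;  // leaves 1472 bytes of UDP payload

constexpr int preferredTsChunkSize() {
  // Largest multiple of 188 that fits in the available UDP payload.
  return ((kEthernetMtu - kIpv4UdpHeaders) / kTsPacketSize) * kTsPacketSize;
}
```

Passing this value as the preferredFrameSize keeps every read aligned to whole TS packets, which receivers generally expect.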
URL: From michel.promonet at thalesgroup.com Mon Dec 12 01:46:24 2011 From: michel.promonet at thalesgroup.com (PROMONET Michel) Date: Mon, 12 Dec 2011 10:46:24 +0100 Subject: [Live-devel] TR: NoReuse::NoReuse static variable and multithreading Message-ID: <4877_1323683195_4EE5CD7B_4877_13256_1_1BE8971B6CFF3A4F97AF4011882AA2550155F789344E@THSONEA01CMS01P.one.grp> From: PROMONET Michel Sent: Monday, 5 December 2011 15:07 To: 'live-devel-bounces at ns.live555.com' Subject: NoReuse::NoReuse static variable and multithreading Hi, I read on the mailing list that it is not recommended to use live555 in a multithreaded way, but is it acceptable to run several live555 event loops in different threads? In such a case the static variable in NoReuse is annoying, because it could be set/reset by a concurrent thread. To tackle this we implemented a workaround that introduces this information in the UsageEnvironment. What's your feeling? Best Regards, Michel. [@@THALES GROUP RESTRICTED@@] -------------- next part -------------- An HTML attachment was scrubbed... URL: From ftriboix at falcon-one.com Mon Dec 12 13:49:16 2011 From: ftriboix at falcon-one.com (Fabrice Triboix) Date: Mon, 12 Dec 2011 21:49:16 +0000 Subject: [Live-devel] Add an audio sub-session makes the video stop Message-ID: <4EE676DC.1090001@falcon-one.com> Hello everyone, I hope somebody can point me in the right direction to investigate. I am trying to modify an existing RTSP server based on live555. It streams live video without problems, and I have to add a live audio sub-stream for each video stream. As a first step, I wanted to stream an MP3 file, so I created an audio source class based on the "DeviceSource" template. Every time the doGetNextFrame() function is called, I read 10000 bytes from the file, update the data members of the class accordingly, and then call the static FramedSource::afterGetting() method.
I normally connected the output of my class to an "MPEG1or2AudioStreamFramer" by calling CMySource* src = CMySource::createNew(...) MPEG1or2AudioStreamFramer::createNew(envir(), src) In the existing code, there is also a class based on OnDemandServerMediaSubsession, let's call it CMySubsession. It implements: - createNewStreamSource(): which returns an MPEG1or2AudioStreamFramer* created as above - createNewRTPSink(): which returns an MPEG1or2AudioRTPSink Now, if I don't add the audio sub-session, the video plays fine with VLC. If I add the audio sub-stream, the destination address in the "groupsock" for the video stream stays at 0.0.0.0, and thus nothing is sent. I also noticed on the VLC side that no SDP is sent and the server closes the RTSP TCP connection after about 10 seconds. The MP3 file looks OK; I can stream it using testMP3Streamer. Thanks a lot for any help! Best regards, Fabrice From finlayson at live555.com Tue Dec 13 05:36:24 2011 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 13 Dec 2011 06:36:24 -0700 Subject: [Live-devel] NoReuse::NoReuse static variable and multithreading In-Reply-To: <4877_1323683195_4EE5CD7B_4877_13256_1_1BE8971B6CFF3A4F97AF4011882AA2550155F789344E@THSONEA01CMS01P.one.grp> References: <4877_1323683195_4EE5CD7B_4877_13256_1_1BE8971B6CFF3A4F97AF4011882AA2550155F789344E@THSONEA01CMS01P.one.grp> Message-ID: <6C4ADA06-0183-4989-B711-82B23DC27BDB@live555.com> > I read on the mailing list that it is not recommended to use live555 in a multithreaded way, but is it acceptable to run several live555 event loops in different threads? I.e., provided that each thread uses its own "TaskScheduler" and "UsageEnvironment", yes. > In such a case the static variable in NoReuse is annoying, because it could be set/reset by a concurrent thread. > > To tackle this we implemented a workaround that introduces this information in the UsageEnvironment. > > What's your feeling? You're correct. It should be moved to the "UsageEnvironment"'s "groupsockPriv" structure.
This will be fixed in the next release of the software. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From michel.promonet at thalesgroup.com Tue Dec 13 05:52:27 2011 From: michel.promonet at thalesgroup.com (PROMONET Michel) Date: Tue, 13 Dec 2011 14:52:27 +0100 Subject: [Live-devel] memory leaks in DelayQueue ? Message-ID: <3507_1323784362_4EE758AA_3507_2763_1_1BE8971B6CFF3A4F97AF4011882AA2550155F79848B1@THSONEA01CMS01P.one.grp> Hi, I am continuing to analyse memory leaks at exit, and I have another valgrind report to submit to the mailing list: 192 bytes in 3 blocks are definitely lost in loss record 1 of 1 (see: http://valgrind.org/docs/manual/mc-manual.html#mc-manual.leaks) at 0x4C27CC1: operator new(unsigned long) (vg_replace_malloc.c:261) by 0xDF883C: BasicTaskScheduler0::scheduleDelayedTask(long, void (*)(void*), void*) (BasicTaskScheduler0.cpp:64) by 0xDF6228: schedulerTickTask(void*) (BasicTaskScheduler.cpp:37) by 0xDF8BFA: AlarmHandler::handleTimeout() (BasicTaskScheduler0.cpp:34) by 0xDF75B9: DelayQueue::handleAlarm() (DelayQueue.cpp:180) by 0xDF6BAF: BasicTaskScheduler::SingleStep(unsigned int) (BasicTaskScheduler.cpp:190) by 0xDF87C4: BasicTaskScheduler0::doEventLoop(char*) (BasicTaskScheduler0.cpp:81) I propose a fix along these lines: In BasicUsageEnvironment/DelayQueue.cpp DelayQueue::~DelayQueue() { - while (fNext != this) removeEntry(fNext); + while (fNext != this) { DelayQueueEntry* entry = fNext; removeEntry(fNext); delete entry; } } I haven't deeply investigated how this linked list works, but it seems to make valgrind happy. Do you think this is acceptable? Regards, Michel. [@@THALES GROUP RESTRICTED@@] -------------- next part -------------- An HTML attachment was scrubbed...
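The pattern in the proposed fix (save the pointer, unlink it, then delete it) can be illustrated with a self-contained circular list; the Entry/Counted types and helper names below are stand-ins for this sketch, not the actual DelayQueue classes:

```cpp
#include <cassert>

// Minimal stand-in for a DelayQueue-style container: a circular doubly-linked
// list whose head is a sentinel node (fNext == &head means "empty").
struct Entry {
  Entry* fNext;
  Entry* fPrev;
  Entry() : fNext(this), fPrev(this) {}
  virtual ~Entry() {}
};

int gLiveEntries = 0;  // counts heap entries still alive, to expose leaks

struct Counted : Entry {
  Counted() { ++gLiveEntries; }
  ~Counted() override { --gLiveEntries; }
};

void addEntry(Entry& head, Entry* e) {
  e->fNext = head.fNext; e->fPrev = &head;
  head.fNext->fPrev = e; head.fNext = e;
}

// Like the original removeEntry(): it only unlinks, it does NOT free.
// Draining with removeEntry() alone therefore leaks every entry.
void removeEntry(Entry* e) {
  e->fPrev->fNext = e->fNext; e->fNext->fPrev = e->fPrev;
  e->fNext = e->fPrev = nullptr;
}

// The fixed destructor body: save the pointer before unlinking, then delete.
void drainAndDelete(Entry& head) {
  while (head.fNext != &head) {
    Entry* entry = head.fNext;
    removeEntry(entry);
    delete entry;
  }
}
```

The key point is that after removeEntry() the node is unreachable from the list, so the saved local pointer is the only remaining handle for deletion.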
URL: From finlayson at live555.com Tue Dec 13 05:53:31 2011 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 13 Dec 2011 06:53:31 -0700 Subject: [Live-devel] Add an audio sub-session makes the video stop In-Reply-To: <4EE676DC.1090001@falcon-one.com> References: <4EE676DC.1090001@falcon-one.com> Message-ID: <49706EE7-BF32-4673-82CF-5F83E9C53FB8@live555.com> > I am trying to modify an existing RTSP server based on live555. It streams live video without problems, and I have to add live audio sub-streams for each video streams. > > As a first step, I wanted to stream an MP3 file, so I created an audio source class based on the "DeviceSource" template. Before you do this, you should first just add a "MP3AudioFileServerMediaSubsession" with the MP3 file - just to make sure that this (an audio subsession with data coming from a file) works OK for you. > Every time the doGetNextFrame() function is called, I read 10000 bytes from the file and update the data members of the class accordingly, and then call the static FramedSource::afterGetting() method. > I normally connected the output of my class to an "MPEG1or2AudioStreamFramer" by calling > CMySource* src = CMySource::createNew(...) > MPEG1or2AudioStreamFramer::createNew(envir(), src) > > In the existing code, there is also a class based on OnDemandServerMediaSubsession, let's call it CMySubsession. It implements: > - createNewStreamSource(): which returns a MPEG1or2AudioStreamFramer* created as above > - createNewRTPSink(): which returns an MPEG1or2AudioRTPSink > > Now, if I don't add the audio sub-session, the video plays fine with VLC. If I add the audio substream, the destination address in the "groupsock" for the video stream stays at 0.0.0.0, and thus nothing is sent. I also noticed on the VLC side that no SDP is sent and the server closes the RTSP TCP connection after about 10 seconds. That's strange. Before using VLC, I suggest using "openRTSP" as a client. 
That should give you a little more information about where things are going wrong. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Dec 13 06:06:06 2011 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 13 Dec 2011 07:06:06 -0700 Subject: [Live-devel] memory leaks in DelayQueue ? In-Reply-To: <3507_1323784362_4EE758AA_3507_2763_1_1BE8971B6CFF3A4F97AF4011882AA2550155F79848B1@THSONEA01CMS01P.one.grp> References: <3507_1323784362_4EE758AA_3507_2763_1_1BE8971B6CFF3A4F97AF4011882AA2550155F79848B1@THSONEA01CMS01P.one.grp> Message-ID: > I continue to analyse memory leaks Yes, but you should do so on an up-to-date version of the code! > In BasicUsageEnvironment/DelayQueue.cpp > > DelayQueue::~DelayQueue() { > - while (fNext != this) removeEntry(fNext); > + while (fNext != this) { DelayQueueEntry* entry = fNext; removeEntry(fNext); delete entry; } > } That was fixed back in the 2011.06.12 version! To save everyone's time (especially yours), you should be working with the latest version of the code! Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From michel.promonet at thalesgroup.com Tue Dec 13 09:17:53 2011 From: michel.promonet at thalesgroup.com (PROMONET Michel) Date: Tue, 13 Dec 2011 18:17:53 +0100 Subject: [Live-devel] memory leaks in DelayQueue ? In-Reply-To: References: <3507_1323784362_4EE758AA_3507_2763_1_1BE8971B6CFF3A4F97AF4011882AA2550155F79848B1@THSONEA01CMS01P.one.grp> Message-ID: <9368_1323796687_4EE788CF_9368_624_1_1BE8971B6CFF3A4F97AF4011882AA2550155F7984F3E@THSONEA01CMS01P.one.grp> Hi, You're right; we are using 2011.01.10. I upgraded a couple of weeks ago from 2009.06.02 to 2011.01.10, but I did not take the latest release, to minimize changes. Thanks, I will do the work to upgrade to a newer release. Regards, Michel.
[@@THALES GROUP RESTRICTED@@] From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Tuesday, 13 December 2011 15:06 To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] memory leaks in DelayQueue ? I continue to analyse memory leaks Yes, but you should do so on an up-to-date version of the code! In BasicUsageEnvironment/DelayQueue.cpp DelayQueue::~DelayQueue() { - while (fNext != this) removeEntry(fNext); + while (fNext != this) { DelayQueueEntry* entry = fNext; removeEntry(fNext); delete entry; } } That was fixed back in the 2011.06.12 version! To save everyone's time (especially yours), you should be working with the latest version of the code! Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From TWiser at logostech.net Tue Dec 13 09:55:51 2011 From: TWiser at logostech.net (Wiser, Tyson) Date: Tue, 13 Dec 2011 09:55:51 -0800 Subject: [Live-devel] Live555 limiting network traffic? In-Reply-To: References: <8CD7A9204779214D9FDC255DE48B9521ADB23061@EXPMBX105-1.exch.logostech.net> Message-ID: <8CD7A9204779214D9FDC255DE48B9521ADC06E57@EXPMBX105-1.exch.logostech.net> Ross, Thank you very much for your response. Removing MPEG2TransportStreamFramer from the chain seemed to do the trick. Thanks for your great work. Tyson From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Saturday, December 10, 2011 5:49 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Live555 limiting network traffic? Is there anything in BasicUDPSink (which is created by OnDemandServerMediaSubsession for raw UDP transfer), MPEG2TransportStreamFramer, or ByteStreamFileSource that could somehow be limiting the rate at which the UDP packets are sent under any circumstances? Yes.
"BasicUDPSink" (and its RTP equivalent) delays after sending each packet, depending upon the "duration" (specifically, "fDurationInMicroseconds") parameter that has been set for the chunk of data that it has just received. If, however, this duration parameter is zero (its default value), then "BasicUDPSink" - after sending out each packet - immediately asks for new data from its upstream object. This "duration" parameter is important when you're streaming from a pre-recorded file, because otherwise the file data would be streamed as fast as possible, which is not what you want. (When you're streaming from a file, you want to stream it out at its 'natural rate'.) For Transport Stream data, the "MPEG2TransportStreamFramer" object is used to parse the Transport Stream data to estimate the "duration" of each chunk of data that gets passed to the "BasicUDPSink" (or an equivalent RTP sink). However, because you're streaming from a live input source, you don't need to compute this "duration" parameter. Instead, you can just send out data as soon as it's generated. This means that you don't need a "MPEG2TransportStreamFramer"; you can therefore omit this in your implementation of the "createNewStreamSource()" virtual function (in your "OnDemandServerMediaSubsession" subclass). Instead, you should set the (normally optional) "preferredFrameSize" parameter to "ByteStreamFileSource::createNew()", to ensure that you read properly-sized chunks of data (a multiple of the 188-byte Transport 'packet' size) each time. (It's likely that your receivers care about this.) I suggest setting this parameter to 7*188 (==1316), to ensure that your outgoing UDP packets (probably) won't get fragmented in the network. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed...
URL: From ftriboix at falcon-one.com Tue Dec 13 14:29:19 2011 From: ftriboix at falcon-one.com (Fabrice Triboix) Date: Tue, 13 Dec 2011 22:29:19 +0000 Subject: [Live-devel] Add an audio sub-session makes the video stop In-Reply-To: References: Message-ID: <4EE7D1BF.50205@falcon-one.com> Dear Ross, > Before you do this, you should first just add a "MP3AudioFileServerMediaSubsession" with the MP3 file - just to make sure that this (an audio subsession with data coming from a file) works OK for you. Thanks for the tip! I just tried it instead of our custom subsession class, and it works. Well, the audio works, but the video is still... > That's strange. Before using VLC, I suggest using "openRTSP" as a client. That should give you a little more information about where things are going wrong. I made the observation using Wireshark, and the server clearly disconnected the RTSP TCP session by sending a FIN packet. I'll try to have a deeper look at our media subsession class. Thanks for the help; I'll come again if I get stuck! Best regards, Fabrice From jshanab at smartwire.com Tue Dec 13 15:12:03 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Tue, 13 Dec 2011 23:12:03 +0000 Subject: [Live-devel] mpeg2transportstreamfromessource Message-ID: <615FD77639372542BF647F5EBAA2DBC20B1A41EE@IL-BOL-EXCH01.smartwire.com> I have an existing multi-RTSP-source application that records video to disk and streams saved video and live video across our own HTTP protocol. I am now trying to add HTTP Live Streaming for portable devices. I got to the point where all the connections are happening and index files and .ts files are created, but the .ts files containing the H264 video are not playable. I don't know what I am doing wrong. I have an environment and scheduler and an MPEG2TransportStreamFromESSource instance with an ESSource modeled after the DeviceSource and a class that inherits from MediaSink for the sink.
Frames are added into the source and everything flows, with 188-byte packets coming out the other side. The event loop calls my sink and I add the 188-byte packets into the buffer that represents the .ts file. (It is all in memory.) The web server creates the index file (.m3u8 playlist) and the client comes back for the file. It downloads the file to disk in Firefox on Windows and tries to play the video on the iPad. Do I need to wrap the ES in RTP and then go to TS? -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Dec 13 19:07:53 2011 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 13 Dec 2011 19:07:53 -0800 Subject: [Live-devel] mpeg2transportstreamfromessource In-Reply-To: <615FD77639372542BF647F5EBAA2DBC20B1A41EE@IL-BOL-EXCH01.smartwire.com> References: <615FD77639372542BF647F5EBAA2DBC20B1A41EE@IL-BOL-EXCH01.smartwire.com> Message-ID: <612E6398-8BA5-418B-8A3A-C248D145AD73@live555.com> > I have an existing multi-RTSP-source application that records video to disk and streams saved video and live video across our own HTTP protocol. I am now trying to add HTTP Live Streaming for portable devices. I got to the point where all the connections are happening and index files and .ts files are created, but the .ts files containing the H264 video are not playable. I don't know what I am doing wrong. If you stream these (indexed) files using the (original, unmodified) "LIVE555 Media Server" application, can iPhones or iPads receive the stream OK? If not, then the problem is probably that the iPhone/iPad is unhappy with some feature of the H.264 (e.g., the H.264 "profile"). (If that's the case, then there's nothing we can do about this; you need to modify your encoding.) > I have an environment and scheduler and an MPEG2TransportStreamFromESSource instance with an ESSource modeled after the DeviceSource and a class that inherits from MediaSink for the sink.
Frames are added into the source and everything flows, with 188-byte packets coming out the other side. The event loop calls my sink and I add the 188-byte packets into the buffer that represents the .ts file. (It is all in memory.) The web server creates the index file (.m3u8 playlist) and the client comes back for the file. It downloads the file to disk in Firefox on Windows and tries to play the video on the iPad. ??? I don't understand what this has to do with HTTP Live Streaming. For our server code to do HTTP Live Streaming, you *must* be streaming *indexed* Transport Stream *files*. Our implementation of HTTP Live Streaming *cannot* work with a live input source, even if you (somehow) have made the data look like a file, because the file also needs to be indexed, and that's something that you can do only with a prerecorded file. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jshanab at smartwire.com Wed Dec 14 05:48:32 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Wed, 14 Dec 2011 13:48:32 +0000 Subject: [Live-devel] mpeg2transportstreamfromessource In-Reply-To: <612E6398-8BA5-418B-8A3A-C248D145AD73@live555.com> References: <615FD77639372542BF647F5EBAA2DBC20B1A41EE@IL-BOL-EXCH01.smartwire.com> <612E6398-8BA5-418B-8A3A-C248D145AD73@live555.com> Message-ID: <615FD77639372542BF647F5EBAA2DBC20B1A4421@IL-BOL-EXCH01.smartwire.com> I am not using the "LIVE555 Media Server" because I didn't think it could do HTTP Live Streaming.** I have my own web server and generate the index on the fly. This is very simple and works well; it is just the contents of the .ts file that are incorrect. I think this is because the saved .ts file is also unplayable with VLC, when other recorded files from the same H264 source play fine. I do not know if the .ts file is just a container, or if it needs to wrap RTP that in turn contains the elementary stream data.
Or perhaps the frames coming to the MPEG2TransportStreamFromESSource are looked at to determine the metadata and non-payload data. I have an aggregated keyframe that contains the SPS, PPS and the keyframe slices. I am gonna have VLC dump a .ts file and do the byte-by-byte comparison; I just thought I was maybe missing something fundamental. **HTTP Live Streaming is delayed by usually 3 - 10 second chunks or 30 seconds. The index (playlist) does not contain the end marker and the client just goes back for more when it runs out. So ... HTTP almost Live Streaming. From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Tuesday, December 13, 2011 9:08 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] mpeg2transportstreamfromessource I have an existing multi-RTSP-source application that records video to disk and streams saved video and live video across our own HTTP protocol. I am now trying to add HTTP Live Streaming for portable devices. I got to the point where all the connections are happening and index files and .ts files are created, but the .ts files containing the H264 video are not playable. I don't know what I am doing wrong. If you stream these (indexed) files using the (original, unmodified) "LIVE555 Media Server" application, can iPhones or iPads receive the stream OK? If not, then the problem is probably that the iPhone/iPad is unhappy with some feature of the H.264 (e.g., the H.264 "profile"). (If that's the case, then there's nothing we can do about this; you need to modify your encoding.) I have an environment and scheduler and an MPEG2TransportStreamFromESSource instance with an ESSource modeled after the DeviceSource and a class that inherits from MediaSink for the sink.
The eventloop calls my sink and I add the 188 byte packets into the buffer that represents the .ts file. (it is all in memory). The web server creates the index file (.m3u8 playlist) and the client comes back for the file. It downloads the file to disk in Firefox on windows and tries to play the video on the iPad. ??? I don't understand what this has to do with HTTP Live Streaming. For our server code to do HTTP Live Streaming, you *must* be streaming *indexed* Transport Stream *files*. Our implementation of HTTP Live Streaming *cannot* work with a live input source, even if you (somehow) have made the data look like a file, because the file also needs to be indexed, and that's something that you can do only with a prerecorded file. Ross Finlayson Live Networks, Inc. http://www.live555.com/ ________________________________ No virus found in this message. Checked by AVG - www.avg.com Version: 2012.0.1873 / Virus Database: 2108/4679 - Release Date: 12/13/11 -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Dec 14 06:23:45 2011 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 14 Dec 2011 06:23:45 -0800 Subject: [Live-devel] mpeg2transportstreamfromessource In-Reply-To: <615FD77639372542BF647F5EBAA2DBC20B1A4421@IL-BOL-EXCH01.smartwire.com> References: <615FD77639372542BF647F5EBAA2DBC20B1A41EE@IL-BOL-EXCH01.smartwire.com> <612E6398-8BA5-418B-8A3A-C248D145AD73@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1A4421@IL-BOL-EXCH01.smartwire.com> Message-ID: <81E16AD9-FB46-4608-9B09-38B52155AD08@live555.com> > I am not using the ?LIVE555 Media Server? because I didn?t think it could do Http Live Streaming. It does, as of July this year - though only for pre-recorded (and indexed) H.264 Transport Stream files. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
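Two self-contained sanity checks can help with the byte-by-byte comparison of the generated .ts buffer that Jeff plans (the helper names below are an illustrative debugging sketch, not live555 code): every Transport Stream packet must begin with the 0x47 sync byte, and a correctly muxed stream should contain the 00 00 01 E0 video PES start code that appears later in this thread in the good file but not the bad one.

```cpp
#include <cstdint>
#include <cstddef>

// Every Transport Stream packet is 188 bytes and starts with sync byte 0x47.
bool hasValidTsSync(const uint8_t* data, size_t len) {
  if (len == 0 || len % 188 != 0) return false;
  for (size_t i = 0; i < len; i += 188)
    if (data[i] != 0x47) return false;
  return true;
}

// A muxed video PES packet starts with 00 00 01 followed by a stream id in
// the range 0xE0-0xEF. If this prefix never appears, the elementary stream
// was likely never wrapped in PES headers before being packetized into TS.
bool containsVideoPesStartCode(const uint8_t* data, size_t len) {
  for (size_t i = 0; i + 3 < len; ++i)
    if (data[i] == 0x00 && data[i + 1] == 0x00 && data[i + 2] == 0x01 &&
        (data[i + 3] & 0xF0) == 0xE0)
      return true;
  return false;
}
```

A buffer that fails the first check is not a valid TS file at all; one that passes the first but fails the second likely carries raw ES payload without PES framing, which matches the symptom described in this thread.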
URL: From jshanab at smartwire.com Wed Dec 14 06:45:06 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Wed, 14 Dec 2011 14:45:06 +0000 Subject: [Live-devel] mpeg2transportstreamfromessource In-Reply-To: <81E16AD9-FB46-4608-9B09-38B52155AD08@live555.com> References: <615FD77639372542BF647F5EBAA2DBC20B1A41EE@IL-BOL-EXCH01.smartwire.com> <612E6398-8BA5-418B-8A3A-C248D145AD73@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1A4421@IL-BOL-EXCH01.smartwire.com> <81E16AD9-FB46-4608-9B09-38B52155AD08@live555.com> Message-ID: <615FD77639372542BF647F5EBAA2DBC20B1A44DF@IL-BOL-EXCH01.smartwire.com> I understand, but I need the architecture I have because of the hundreds or more of sources and hundreds of viewing clients, so I can't write files, then index, then serve. It has to be a bit more dynamic than that. So... back to the issue, can you tell me the steps the data goes through for this? Encoder data-->H264DiscreteFramer-->essource.addData-->MPEG2TransportStreamFromESSource-->.ts file Or did I miss a step? From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Wednesday, December 14, 2011 8:24 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] mpeg2transportstreamfromessource I am not using the "LIVE555 Media Server" because I didn't think it could do HTTP Live Streaming. It does, as of July this year - though only for pre-recorded (and indexed) H.264 Transport Stream files. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed...
URL: From finlayson at live555.com Wed Dec 14 07:07:43 2011 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 14 Dec 2011 07:07:43 -0800 Subject: [Live-devel] mpeg2transportstreamfromessource In-Reply-To: <615FD77639372542BF647F5EBAA2DBC20B1A44DF@IL-BOL-EXCH01.smartwire.com> References: <615FD77639372542BF647F5EBAA2DBC20B1A41EE@IL-BOL-EXCH01.smartwire.com> <612E6398-8BA5-418B-8A3A-C248D145AD73@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1A4421@IL-BOL-EXCH01.smartwire.com> <81E16AD9-FB46-4608-9B09-38B52155AD08@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1A44DF@IL-BOL-EXCH01.smartwire.com> Message-ID: <4D0CDDF0-D0E5-4619-9775-C2EEB1B57562@live555.com> > So... back to the issue, can you tell me the steps the data goes through for this? > > Encoder data-->H264DiscreteFramer-->essource.addData-->MPEG2TransportStreamFromESSource-->.ts file In this case, you don't need any H.264 'framer' class (you need that only if your source is a byte stream, or if you're feeding into a "H264VideoRTPSink").
URL: From jshanab at smartwire.com Wed Dec 14 07:32:09 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Wed, 14 Dec 2011 15:32:09 +0000 Subject: [Live-devel] mpeg2transportstreamfromessource In-Reply-To: <4D0CDDF0-D0E5-4619-9775-C2EEB1B57562@live555.com> References: <615FD77639372542BF647F5EBAA2DBC20B1A41EE@IL-BOL-EXCH01.smartwire.com> <612E6398-8BA5-418B-8A3A-C248D145AD73@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1A4421@IL-BOL-EXCH01.smartwire.com> <81E16AD9-FB46-4608-9B09-38B52155AD08@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1A44DF@IL-BOL-EXCH01.smartwire.com> <4D0CDDF0-D0E5-4619-9775-C2EEB1B57562@live555.com> Message-ID: <615FD77639372542BF647F5EBAA2DBC20B1A457F@IL-BOL-EXCH01.smartwire.com> OK, I may be set for version 4 :( But if I do pass the data in full frames, will it still work? My current arch has "subscribers": some subscribers restream, some write to disk, this one is gonna HTTP Live Stream and the next will be RTMP. I would rather only do the discrete framing once. I may have no choice; I do need to know when each GOP starts. I was hoping that the beginning of payload plus the first 4 bytes of payload could give me that. Looking at the bytes, I do see 00 00 01 E0 in a good exported .ts stream and I do not see that in the bad one. Almost like the H264 stream is inside an MPEG4 stream??? From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Wednesday, December 14, 2011 9:08 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] mpeg2transportstreamfromessource So... back to the issue, can you tell me the steps the data goes through for this? Encoder data-->H264DiscreteFramer-->essource.addData-->MPEG2TransportStreamFromESSource-->.ts file In this case, you don't need any H.264 'framer' class (you need that only if your source is a byte stream, or if you're feeding into a "H264VideoRTPSink").
You should be able to feed your encoder data directly to a "MPEG2TransportStreamFromESSource", by calling "MPEG2TransportStreamFromESSource::addNewVideoSource()", with "mpegVersion" == 5. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jshanab at smartwire.com Wed Dec 14 07:45:21 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Wed, 14 Dec 2011 15:45:21 +0000 Subject: [Live-devel] mpeg2transportstreamfromessource In-Reply-To: <615FD77639372542BF647F5EBAA2DBC20B1A457F@IL-BOL-EXCH01.smartwire.com> References: <615FD77639372542BF647F5EBAA2DBC20B1A41EE@IL-BOL-EXCH01.smartwire.com> <612E6398-8BA5-418B-8A3A-C248D145AD73@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1A4421@IL-BOL-EXCH01.smartwire.com> <81E16AD9-FB46-4608-9B09-38B52155AD08@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1A44DF@IL-BOL-EXCH01.smartwire.com> <4D0CDDF0-D0E5-4619-9775-C2EEB1B57562@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1A457F@IL-BOL-EXCH01.smartwire.com> Message-ID: <615FD77639372542BF647F5EBAA2DBC20B1A45BB@IL-BOL-EXCH01.smartwire.com> Nope, set for 5. But looking at Wikipedia, it is not a PES; it is just the H.264 elementary stream. So how do I get it into the Packetized Elementary Stream? From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Jeff Shanab Sent: Wednesday, December 14, 2011 9:32 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] mpeg2transportstreamfromessource OK, I may be set for version 4 :( But if I do pass the data in full frames, will it still work? My current arch has "subscribers": some subscribers restream, some write to disk, this one is gonna HTTP Live Stream and the next will be RTMP.
I would rather only do the discrete framing once. I may have no choice, I do need to know when each GOP starts. I was hoping that the beginning of payload plus the first 4 bytes of payload could give me that. Looking at the bytes I do see 00 00 01 E0 in a good exported .ts stream and I do not see that in the bad one. Almost like the H264 stream is inside a MPEG4 stream??? From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Wednesday, December 14, 2011 9:08 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] mpeg2transportstreamfromessource So... back to the issue, can you tell me the steps the data goes thru for this? Encoder data-->H264VideoStreamDiscreteFramer-->essource.addData-->MPEG2TransportStreamFromESSource->.ts file In this case, you don't need any H.264 'framer' class (you need that only if your source is a byte stream, or if you're feeding into a "H264VideoRTPSink"). You should be able to feed your encoder data directly to a "MPEG2TransportStreamFromESSource", by calling "MPEG2TransportStreamFromESSource::addNewVideoSource()", with "mpegVersion" == 5. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed...
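When feeding discrete frames into the Transport Stream multiplexor this way, each NAL unit generally needs Annex B framing in front of it (a point Ross makes explicitly later in this thread). A minimal sketch of that check, using a hypothetical helper name that is not part of the live555 API:

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

// Prepend the Annex B start code (0x00 0x00 0x00 0x01) to a NAL unit
// unless it already begins with a 3- or 4-byte start code.
// Hypothetical illustrative helper; not part of live555.
static std::vector<uint8_t> ensureStartCode(const std::vector<uint8_t>& nal) {
  bool has4 = nal.size() >= 4 &&
              nal[0] == 0 && nal[1] == 0 && nal[2] == 0 && nal[3] == 1;
  bool has3 = nal.size() >= 3 &&
              nal[0] == 0 && nal[1] == 0 && nal[2] == 1;
  if (has4 || has3) return nal; // already Annex B framed
  std::vector<uint8_t> out = {0x00, 0x00, 0x00, 0x01};
  out.insert(out.end(), nal.begin(), nal.end());
  return out;
}
```

In a real source class this would run on each frame just before handing it to the multiplexor's input.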
URL: From jshanab at smartwire.com Wed Dec 14 08:05:13 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Wed, 14 Dec 2011 16:05:13 +0000 Subject: [Live-devel] mpeg2transportstreamfromessource In-Reply-To: <615FD77639372542BF647F5EBAA2DBC20B1A45BB@IL-BOL-EXCH01.smartwire.com> References: <615FD77639372542BF647F5EBAA2DBC20B1A41EE@IL-BOL-EXCH01.smartwire.com> <612E6398-8BA5-418B-8A3A-C248D145AD73@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1A4421@IL-BOL-EXCH01.smartwire.com> <81E16AD9-FB46-4608-9B09-38B52155AD08@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1A44DF@IL-BOL-EXCH01.smartwire.com> <4D0CDDF0-D0E5-4619-9775-C2EEB1B57562@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1A457F@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B1A45BB@IL-BOL-EXCH01.smartwire.com> Message-ID: <615FD77639372542BF647F5EBAA2DBC20B1A45F3@IL-BOL-EXCH01.smartwire.com> How about this ESSource->MPEG2TransportStreamMultiplexor->MPEG2TransportStreamFromPESsource->mySink Instead of ESSource->MPEG2TransportStreamFromESsource->mySink Is that what is needed? PS is there a description of what each class does? From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Jeff Shanab Sent: Wednesday, December 14, 2011 9:45 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] mpeg2transportstreamfromessource Nope, set for 5. But looking at Wikipedia it is not a PES, it is just the h264 Elementary stream. So how to get it in the Packetized Elementary Stream? From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Jeff Shanab Sent: Wednesday, December 14, 2011 9:32 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] mpeg2transportstreamfromessource OK. I may be set for version 4 :( But if I do pass the data in full frames, will it still work?
My current arch has "subscribers": some subscribers restream, some write to disk, this one is gonna HTTP live stream and the next will be RTMP. I would rather only do the discrete framing once. I may have no choice, I do need to know when each GOP starts. I was hoping that the beginning of payload plus the first 4 bytes of payload could give me that. Looking at the bytes I do see 00 00 01 E0 in a good exported .ts stream and I do not see that in the bad one. Almost like the H264 stream is inside a MPEG4 stream??? From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Wednesday, December 14, 2011 9:08 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] mpeg2transportstreamfromessource So... back to the issue, can you tell me the steps the data goes thru for this? Encoder data-->H264VideoStreamDiscreteFramer-->essource.addData-->MPEG2TransportStreamFromESSource->.ts file In this case, you don't need any H.264 'framer' class (you need that only if your source is a byte stream, or if you're feeding into a "H264VideoRTPSink"). You should be able to feed your encoder data directly to a "MPEG2TransportStreamFromESSource", by calling "MPEG2TransportStreamFromESSource::addNewVideoSource()", with "mpegVersion" == 5. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed...
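The byte pattern Jeff is hunting for, 00 00 01 followed by a video stream_id in the 0xE0-0xEF range, marks an MPEG-2 video PES packet header. A self-contained sketch of that check on a dumped file's bytes (illustrative only, not part of live555):

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>
#include <vector>

// Count MPEG-2 video PES packet headers in a raw byte buffer: the
// start-code prefix 00 00 01 followed by a video stream_id (0xE0-0xEF).
// A dumped Transport Stream with zero hits has no video PES layer.
// Illustrative helper; not part of live555.
static size_t countVideoPesHeaders(const std::vector<uint8_t>& buf) {
  size_t count = 0;
  for (size_t i = 0; i + 3 < buf.size(); ++i) {
    if (buf[i] == 0x00 && buf[i + 1] == 0x00 && buf[i + 2] == 0x01 &&
        (buf[i + 3] & 0xF0) == 0xE0) {
      ++count;
    }
  }
  return count;
}
```

Note that Annex B NAL start codes also begin with 00 00 01, but a following byte of 0xE0-0xEF would have the H.264 forbidden_zero_bit set, so false positives are unlikely in practice.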
URL: From david.myers at panogenics.com Wed Dec 14 10:08:10 2011 From: david.myers at panogenics.com (David J Myers) Date: Wed, 14 Dec 2011 18:08:10 -0000 Subject: [Live-devel] Live555 EventLoop crash Message-ID: <004001ccba8b$573037a0$0590a6e0$@myers@panogenics.com> Hi Ross, I've modified my Live source Server application in the following way:- Where I was using Linux pipes to get data from the main thread to the Live555 event thread, I now cycle through a series of shared buffers. Linux pipes were too small (64k only) for my big 5MP images, and they also hang the main writing thread whilst writing to the pipes. I've also switched from background-processing to using an Event trigger to call my DeliverFrame routine. Now, after streaming a number of frames to the client, I get the following warning (the actual byte counts vary) StreamParser::afterGettingBytes() warning: read 9828 bytes; expected no more than 4142 Shortly after that I get a SIGSEGV and my server crashes somewhere in the Event loop. I think that my previous pipe-using version was self-synchronising and you could never write enough data to the pipe to keep the receiving thread busy. This version will write frames much quicker and I seem to be hitting the BANK_SIZE limit. Any advice appreciated, David -------------- next part -------------- An HTML attachment was scrubbed... 
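The 'cycle through a series of shared buffers' replacement for a pipe that David describes might look like the following single-threaded sketch (hypothetical names; real code would add locking or the event-trigger handoff he mentions):

```cpp
#include <cassert>
#include <cstdint>
#include <cstring>
#include <vector>

// Minimal sketch of a pool of shared frame buffers: the encoder side
// writes into the next free slot, the delivery side consumes slots in
// order. Unlike a 64k pipe, slot size is chosen to fit a whole frame,
// and a full ring drops a frame instead of blocking the writer.
// Hypothetical illustrative code; not part of live555.
struct FramePool {
  FramePool(size_t slots, size_t slotSize)
      : buffers(slots, std::vector<uint8_t>(slotSize)), sizes(slots, 0) {}

  // Returns false (frame dropped) when the ring is full or the frame
  // does not fit in a slot.
  bool push(const uint8_t* data, size_t len) {
    if (count == buffers.size() || len > buffers[writeIdx].size()) return false;
    std::memcpy(buffers[writeIdx].data(), data, len);
    sizes[writeIdx] = len;
    writeIdx = (writeIdx + 1) % buffers.size();
    ++count;
    return true;
  }

  // Returns the size of the consumed frame, or 0 if the ring is empty.
  size_t pop(uint8_t* dst) {
    if (count == 0) return 0;
    size_t len = sizes[readIdx];
    std::memcpy(dst, buffers[readIdx].data(), len);
    readIdx = (readIdx + 1) % buffers.size();
    --count;
    return len;
  }

  std::vector<std::vector<uint8_t>> buffers;
  std::vector<size_t> sizes;
  size_t writeIdx = 0, readIdx = 0, count = 0;
};
```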
URL: From finlayson at live555.com Wed Dec 14 10:35:08 2011 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 14 Dec 2011 10:35:08 -0800 Subject: [Live-devel] mpeg2transportstreamfromessource In-Reply-To: <615FD77639372542BF647F5EBAA2DBC20B1A457F@IL-BOL-EXCH01.smartwire.com> References: <615FD77639372542BF647F5EBAA2DBC20B1A41EE@IL-BOL-EXCH01.smartwire.com> <612E6398-8BA5-418B-8A3A-C248D145AD73@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1A4421@IL-BOL-EXCH01.smartwire.com> <81E16AD9-FB46-4608-9B09-38B52155AD08@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1A44DF@IL-BOL-EXCH01.smartwire.com> <4D0CDDF0-D0E5-4619-9775-C2EEB1B57562@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1A457F@IL-BOL-EXCH01.smartwire.com> Message-ID: > OK. I may be set for version 4 :( WTF is that supposed to mean? If you want people to help you, you're going to have to learn to write coherently. > My current arch has "subscribers": some subscribers restream, some write to disk, this one is gonna HTTP live stream and the next will be RTMP. I would rather only do the discrete framing once. Ditto. > But if I do pass the data in full frames, will it still work? Yes, probably, provided that the receivers can decode this OK. (A Transport Stream is just that - a container for 'transporting' data, in any form.) However, you will need to prepend each H.264 frame with 0x00 0x00 0x00 0x01 (if those 4 bytes are not already there). > I may have no choice, I do need to know when each GOP starts. I was hoping that the beginning of payload plus the first 4 bytes of payload could give me that. > Looking at the bytes I do see 00 00 01 E0 in a good exported .ts stream and I do not see that in the bad one. Almost like the H264 stream is inside a MPEG4 stream??? Perhaps you're not looking at a proper H.264-in-Transport-Stream file?
If you want to see a good one, look at http://www.live555.com/liveMedia/public/h264-in-mp2t/ (BTW, the file there ("bipbop-gear1-all.ts"), along with its index file, can also be used to test HTTP Live Streaming.) > How about this > > ESSource->MPEG2TransportStreamMultiplexor->MPEG2TransportStreamFromPESsource->mySink > > Instead of > > ESSource->MPEG2TransportStreamFromESsource->mySink > > Is that what is needed? No! Note that "MPEG2TransportStreamFromPESSource" is a subclass of "MPEG2TransportStreamMultiplexor". I've already told you what you need to do: Feed your encoded H.264 data directly to a "MPEG2TransportStreamFromESSource", by calling "MPEG2TransportStreamFromESSource::addNewVideoSource()", with "mpegVersion" == 5. (Also, make sure that your H.264 data source generates proper presentation times for each generated frame - by setting "fPresentationTime" properly.) But, as I noted above, you may need to prepend each H.264 frame with 0x00 0x00 0x00 0x01 before feeding it to the "MPEG2TransportStreamFromESSource" (Sigh... You're much too 'high maintenance' for my taste :-) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Dec 14 10:41:12 2011 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 14 Dec 2011 10:41:12 -0800 Subject: [Live-devel] Live555 EventLoop crash In-Reply-To: <004001ccba8b$573037a0$0590a6e0$@myers@panogenics.com> References: <004001ccba8b$573037a0$0590a6e0$@myers@panogenics.com> Message-ID: <6DCD92BB-65D0-4EB2-87D9-E23F18F67DD8@live555.com> > Now, after streaming a number of frames to the client, I get the following warning (the actual byte counts vary) > StreamParser::afterGettingBytes() warning: read 9828 bytes; expected no more than 4142 That error message indicates that your input source object did not set "fFrameSize" properly. In particular, it set it to a value greater than "fMaxSize" - bad!
A data source object *must* check "fMaxSize" before delivering data and setting "fFrameSize". In particular, it needs to do something like this (note: This code taken from "liveMedia/DeviceSource.cpp"):

  unsigned newFrameSize = THE_FRAME_SIZE_THAT_YOU_WANT_TO_DELIVER;

  // Deliver the data here:
  if (newFrameSize > fMaxSize) {
    fFrameSize = fMaxSize;
    fNumTruncatedBytes = newFrameSize - fMaxSize;
  } else {
    fFrameSize = newFrameSize;
  }
  gettimeofday(&fPresentationTime, NULL); // If you have a more accurate time - e.g., from an encoder - then use that instead.
  // If the device is *not* a 'live source' (e.g., it comes instead from a file or buffer), then set "fDurationInMicroseconds" here.
  memmove(fTo, newFrameDataStart, fFrameSize);

> Shortly after that I get a SIGSEGV and my server crashes somewhere in the Event loop. > I think that my previous pipe-using version was self-synchronising and you could never write enough data to the pipe to keep the receiving thread busy. This version will write frames much quicker and I seem to be hitting the BANK_SIZE limit. > Any advice appreciated, David > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed...
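The clamping step in that snippet can be factored into a small standalone function, which makes the arithmetic behind the warning above easy to check (hypothetical helper name, not live555 API):

```cpp
#include <cassert>
#include <utility>

// The fMaxSize check from the DeviceSource.cpp pattern, factored out:
// given the frame size we want to deliver and the downstream buffer
// size, return {bytesToDeliver, bytesTruncated}.
// Illustrative helper; not part of live555.
static std::pair<unsigned, unsigned> clampFrame(unsigned newFrameSize,
                                                unsigned maxSize) {
  if (newFrameSize > maxSize) {
    return {maxSize, newFrameSize - maxSize}; // deliver what fits; rest is lost
  }
  return {newFrameSize, 0};
}
```

With the byte counts from the warning message (a 9828-byte frame against a 4142-byte buffer), 4142 bytes would be delivered and 5686 truncated.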
URL: From jshanab at smartwire.com Wed Dec 14 11:06:48 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Wed, 14 Dec 2011 19:06:48 +0000 Subject: [Live-devel] mpeg2transportstreamfromessource In-Reply-To: References: <615FD77639372542BF647F5EBAA2DBC20B1A41EE@IL-BOL-EXCH01.smartwire.com> <612E6398-8BA5-418B-8A3A-C248D145AD73@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1A4421@IL-BOL-EXCH01.smartwire.com> <81E16AD9-FB46-4608-9B09-38B52155AD08@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1A44DF@IL-BOL-EXCH01.smartwire.com> <4D0CDDF0-D0E5-4619-9775-C2EEB1B57562@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1A457F@IL-BOL-EXCH01.smartwire.com> Message-ID: <615FD77639372542BF647F5EBAA2DBC20B1A473E@IL-BOL-EXCH01.smartwire.com> From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Wednesday, December 14, 2011 12:35 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] mpeg2transportstreamfromessource OK. I may be set for version 4 :( >>Sorry. It was in response to the possibility of setting the mpegVersion to 5. WTF is that supposed to mean? If you want people to help you, you're going to have to learn to write coherently. My current arch has "subscribers": some subscribers restream, some write to disk, this one is gonna HTTP live stream and the next will be RTMP. I would rather only do the discrete framing once. Ditto. But if I do pass the data in full frames, will it still work? Yes, probably, provided that the receivers can decode this OK. (A Transport Stream is just that - a container for 'transporting' data, in any form.) However, you will need to prepend each H.264 frame with 0x00 0x00 0x00 0x01 (if those 4 bytes are not already there). >> all NALs have the frame headers (avcodec would not decode them without it) I may have no choice, I do need to know when each GOP starts.
I was hoping that the beginning of payload plus the first 4 bytes of payload could give me that. Looking at the bytes I do see 00 00 01 E0 in a good exported .ts stream and I do not see that in the bad one. Almost like the H264 stream is inside a MPEG4 stream??? Perhaps you're not looking at a proper H.264-in-Transport-Stream file? If you want to see a good one, look at http://www.live555.com/liveMedia/public/h264-in-mp2t/ (BTW, the file there ("bipbop-gear1-all.ts"), along with its index file, can also be used to test HTTP Live Streaming.) >> that is exactly the "good" file I was comparing and it is obvious now that it has PES headers and mine does not. How about this ESSource->MPEG2TransportStreamMultiplexor->MPEG2TransportStreamFromPESsource->mySink Instead of ESSource->MPEG2TransportStreamFromESsource->mySink Is that what is needed? No! Note that "MPEG2TransportStreamFromPESSource" is a subclass of "MPEG2TransportStreamMultiplexor". I've already told you what you need to do: Feed your encoded H.264 data directly to a "MPEG2TransportStreamFromESSource", by calling "MPEG2TransportStreamFromESSource::addNewVideoSource()", with "mpegVersion" == 5. (Also, make sure that your H.264 data source generates proper presentation times for each generated frame - by setting "fPresentationTime" properly.) But, as I noted above, you may need to prepend each H.264 frame with 0x00 0x00 0x00 0x01 before feeding it to the "MPEG2TransportStreamFromESSource" (Sigh... You're much too 'high maintenance' for my taste :-) >>Ross. I did all that was instructed before the first email to the list. I thought I explained the problem clearly that the .ts resulting file was not playable. I continued to work on it, guessing at options finally discovering that the PES headers are not in there. Sorry if I was not clear in that first email. Just getting nervous as my deadline approaches. :) Thanks for confirming my initial code was the correct architecture.
I will debug and check the fPresentationTime variable. Is there anything that can cause MPEG2TransportStreamFromESSource to disable the PES headers? Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Dec 14 13:00:57 2011 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 14 Dec 2011 13:00:57 -0800 Subject: [Live-devel] mpeg2transportstreamfromessource In-Reply-To: <615FD77639372542BF647F5EBAA2DBC20B1A473E@IL-BOL-EXCH01.smartwire.com> References: <615FD77639372542BF647F5EBAA2DBC20B1A41EE@IL-BOL-EXCH01.smartwire.com> <612E6398-8BA5-418B-8A3A-C248D145AD73@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1A4421@IL-BOL-EXCH01.smartwire.com> <81E16AD9-FB46-4608-9B09-38B52155AD08@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1A44DF@IL-BOL-EXCH01.smartwire.com> <4D0CDDF0-D0E5-4619-9775-C2EEB1B57562@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1A457F@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B1A473E@IL-BOL-EXCH01.smartwire.com> Message-ID: <156F74A2-C059-4E16-909C-AA497C2D4031@live555.com> > Is there anything that can cause MPEG2TransportStreamFromESSource to disable the PES headers? No; it automatically adds a PES header at the front of each chunk of data, before writing it to the resulting Transport Stream. The PES headers will be there; you just need to look closer :-) FYI, note that we have a demo application "testH264VideoToTransportStream" - in "testProgs" - that will convert a H.264 Elementary Stream file into a Transport Stream. If you run this application on a H.264 file that you've gotten from your encoder, you should get a playable Transport Stream file.
(The main difference between "testH264VideoToTransportStream" and your application is that your application's input source delivers discrete H.264 frames, and so does not need a 'framer' class.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From david.myers at panogenics.com Wed Dec 14 13:10:58 2011 From: david.myers at panogenics.com (David J Myers) Date: Wed, 14 Dec 2011 21:10:58 -0000 Subject: [Live-devel] Live555 EventLoop crash Message-ID: <008401ccbaa4$e151a950$a3f4fbf0$@myers@panogenics.com> >That error message indicates that your input source object did not set "fFrameSize" properly. In particular, it set it to a value greater than "fMaxSize" - bad! >A data source object *must* check "fMaxSize" before delivering data and setting "fFrameSize". Ok, this is surely what I'm doing wrong, but I don't quite understand what happens to the truncated bytes when the frame is bigger than fMaxSize, they seem to just get thrown away. Does fMaxSize change each time? DeliverFrame will get called again immediately but this will get the next frame. Don't we need to send the rest of the frame on the next call to DeliverFrame? Am I missing something obvious here, sorry? - David -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Dec 14 13:27:38 2011 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 14 Dec 2011 13:27:38 -0800 Subject: [Live-devel] Live555 EventLoop crash In-Reply-To: <008401ccbaa4$e151a950$a3f4fbf0$@myers@panogenics.com> References: <008401ccbaa4$e151a950$a3f4fbf0$@myers@panogenics.com> Message-ID: > >That error message indicates that your input source object did not set "fFrameSize" properly. In particular, it set it to a value greater than "fMaxSize" - bad! > > >A data source object *must* check "fMaxSize" before delivering data and setting "fFrameSize". 
> > Ok, this is surely what I'm doing wrong, but I don't quite understand what happens to the truncated bytes when the frame is bigger than fMaxSize, they seem to just get thrown away. Yes, exactly. That's what "truncated" means :-) > Does fMaxSize change each time? DeliverFrame will get called again immediately but this will get the next frame. Don't we need to send the rest of the frame on the next call to DeliverFrame? No, you must send only one frame at a time, because downstream objects expect 'frames', not 'portions of frames'. "fMaxSize" is the size of the buffer that the downstream object specified when it called "getNextFrame()" on your input source object. If your frames are larger than "fMaxSize" (which is what you are seeing), then that simply means that your downstream object's buffer is too small. You need to increase it. (Unfortunately, you didn't say what your downstream object is, so until you do, I can't really tell you how to increase the buffer size.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From buivuhoang15 at gmail.com Sat Dec 10 06:44:50 2011 From: buivuhoang15 at gmail.com (Hoang Bui Vu) Date: Sat, 10 Dec 2011 21:44:50 +0700 Subject: [Live-devel] How to create a custom sink that streams to both the network and a local file In-Reply-To: <900D5E18-98A6-446B-B025-8A078FB91691@live555.com> References: <900D5E18-98A6-446B-B025-8A078FB91691@live555.com> Message-ID: Hi Ross, I have been playing with the FramedFilter class for a while, and created a simple filter that will just deliver its input to its output. However, it is not working for some reason. Can you point me to the point that I'm missing?
HistoryFilter::HistoryFilter(UsageEnvironment& env, FramedSource* inputSource)
  : FramedFilter(env, inputSource) {
}

HistoryFilter::~HistoryFilter() {
  Medium::close(fInputSource);
}

HistoryFilter* HistoryFilter::createNew(UsageEnvironment& env, FramedSource* inputSource, char const* fileName) {
  return new HistoryFilter(env, inputSource);
}

void HistoryFilter::doGetNextFrame() {
  fFrameSize = 0;
  fInputSource->getNextFrame(fTo, fMaxSize, afterGettingFrame, this, FramedSource::handleClosure, this);
}

void HistoryFilter::afterGettingFrame(void* clientData, unsigned frameSize, unsigned /*numTruncatedBytes*/, struct timeval presentationTime, unsigned /*durationInMicroseconds*/) {
  HistoryFilter* filter = (HistoryFilter*)clientData;
  filter->afterGettingFrame1(frameSize, presentationTime);
}

void HistoryFilter::afterGettingFrame1(unsigned frameSize, struct timeval presentationTime) {
  fFrameSize = frameSize;
  fPresentationTime = presentationTime;
  afterGetting(this);
}

In my main server play() function, I put in this code:

FramedSource* videoES = fileSource;
// Create a framer for the Video Elementary Stream:
videoSource = MPEG4VideoStreamFramer::createNew(*env, videoES);
historyFilter = HistoryFilter::createNew(*env, videoSource, "history.mp4");
// Finally, start playing:
*env << "Beginning to read from file...\n";
videoSink->startPlaying(*historyFilter, afterPlaying, videoSink);

Thank you for your time and patience. On Fri, Dec 9, 2011 at 2:41 AM, Ross Finlayson wrote: > I'm currently doing a live camera streaming project that needs a history > function. I've done some search and come to the solution of creating a > custom sink that is the combination of MPEG4ESVideoRTPSink and FileSink, > which will stream to both a local file and the network. > > > A simpler solution would be to write a new 'filter' class (i.e., a > subclass of "FramedFilter") that simply delivers its input to its output, > but also writes to a file.
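The HistoryFilter above forwards frames but never writes them anywhere: the "also writes to a file" step is missing. The tee step itself is trivial and can be sketched standalone, with a std::ostream standing in for the file handle (hypothetical, simplified; in the real filter this would sit at the top of afterGettingFrame1(), writing fTo/frameSize to a file opened in the constructor):

```cpp
#include <cassert>
#include <cstdint>
#include <ostream>
#include <sstream>

// Write a just-received frame to the history output, then let the
// caller hand the unmodified frame downstream (afterGetting() in the
// real filter). The downstream sink still sees every byte.
// Illustrative helper; not part of live555.
static void teeFrame(const uint8_t* frame, size_t frameSize, std::ostream& history) {
  history.write(reinterpret_cast<const char*>(frame),
                static_cast<std::streamsize>(frameSize));
}
```

Note that for a playable ".mp4" history file the raw frames would still need a container writer; a raw dump like this only matches formats that are plain byte streams.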
Then, add an object of this class to the end of > your data stream (i.e., before you feed it into a sink). That way, you > won't need to create (or modify) any "MediaSink" class at all. > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From jmm55 at psu.edu Thu Dec 15 08:14:35 2011 From: jmm55 at psu.edu (Justin Miller) Date: Thu, 15 Dec 2011 11:14:35 -0500 Subject: [Live-devel] Audio Sync Issue from Axis Camera Message-ID: We are trying to use an Axis Q6034 Camera that sends out a video and audio stream in H.264 and AAC. The problem we are having is that the audio and video do not sync no matter what we try. We have tried the -y hook and that actually causes a drift in the audio after a period of time. We have tried changing the bit rate and GOV settings; I have been able to get the closest results by setting the camera to 24fps and the software to 23.95, but we get inconsistent results. Typically the audio is ahead of the video by 2 to 6 frames. Also this is not a bandwidth issue as we have tested that, there is also no packet loss and the time clock on the camera matches the machine, as it syncs to the machine. Any suggestions? Thanks, Justin M. Miller Multimedia Specialist Media Commons Project Manager 212A Rider bld. University Park, PA 16802 814-863-7764 http://mediacommons.psu.edu -------------- next part -------------- An HTML attachment was scrubbed...
URL: From ben at europa-network.com Thu Dec 15 10:32:09 2011 From: ben at europa-network.com (Ben Wheway) Date: Thu, 15 Dec 2011 19:32:09 +0100 Subject: [Live-devel] Multicast to rtsp with Amino A125 Message-ID: <989C7F23-38B8-4F9E-BC41-1F0EE7DCA9FC@europa-network.com> Hi All, I'm hoping you can help us, it would be much appreciated if you can. We have many Amino A125 STBs. The server we currently use to RTSP to them is outdated and we need a new system to stream to them. Your software appears to be able to do this but I'm struggling to find guides to achieve this. Here is a background of what we have as in streams etc: We multicast from our encoders to our current streaming server. This server then RTSPs out to the Amino STB. The multicast input stream is MPEG4/h264 TS UDP. We then RTSP over UDP unicast out to the STB. So we need to input UDP multicast to live555 server and then RTSP UDP unicast to the Amino STB. Could you help me out with how to achieve this or point me to the correct site page etc. We can pay for someone to help us achieve this? Thanks Ben From jshanab at smartwire.com Thu Dec 15 14:26:35 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Thu, 15 Dec 2011 22:26:35 +0000 Subject: [Live-devel] HTTP Live Streaming Message-ID: <615FD77639372542BF647F5EBAA2DBC20B1A4CF5@IL-BOL-EXCH01.smartwire.com> I have been working on HTTP live streaming and have run across a snag. I was hoping someone here could offer some ideas. The setup is that I am pulling an RTSP H264 stream in baseline profile from a security camera and am using the MPEG2TransportStreamFromESSource to convert it into a .ts file. My web server creates the needed segments and indices on the fly and serves them out. If I connect with a browser on my desktop I can get the URLs for the segments from the playlist and then put those in the browser and save the segments to disk. These .ts files play in VLC.
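Before chasing profile differences, one cheap structural check on segments that VLC tolerates but stricter clients reject is packet alignment: a valid Transport Stream segment is a whole number of 188-byte packets, each starting with the sync byte 0x47, and a segmenter that cuts at arbitrary byte offsets breaks exactly this. A self-contained sketch of the check (illustrative; not part of live555):

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>
#include <vector>

// Check that a buffer is a whole number of 188-byte Transport Stream
// packets, each starting with the sync byte 0x47. A segment cut at an
// arbitrary byte offset fails this even though the bytes themselves
// look fine in a byte editor.
static bool isAlignedTs(const std::vector<uint8_t>& seg) {
  if (seg.empty() || seg.size() % 188 != 0) return false;
  for (size_t i = 0; i < seg.size(); i += 188) {
    if (seg[i] != 0x47) return false;
  }
  return true;
}
```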
When I connect with the iPad or a Mac notebook it races: it gets the index file and then the segments, and then returns (quickly) for an updated index and more segments. It appears to be blowing through the .ts segment files as unplayable and then just asks for the next one. The same encoder, restreamed to HTTP Live Streaming through Wowza, is playable on the iPad. I have used a byte editor and a .ts analyzer and I cannot see a difference. All of this, except the fact that the restream through Wowza works, would point to an incompatible H.264 profile. Anyone run across something like this? -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Dec 15 17:31:29 2011 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 15 Dec 2011 17:31:29 -0800 Subject: [Live-devel] Audio Sync Issue from Axis Camera In-Reply-To: References: Message-ID: > We are trying to use an Axis Q6034 Camera that sends out a video and audio stream in H.264 and AAC. The problem we are having is that the audio and video do not sync no matter what we try. Is this a general problem with audio and video from this camera being out of sync, or just a problem with audio/video sync if you try to record it to a file using "openRTSP"? (You didn't mention "openRTSP" in your message!) I.e., if you view the stream directly using VLC (i.e., not from a file), then is the audio and video in sync? If not, then there's probably nothing we can do. If, however, you're seeing a/v sync problems only when you try to record the stream to a '.mov' or '.mp4' file using "openRTSP", then the problem is basically that the '.mov' and '.mp4' formats are ill-suited for recording incoming audio/video streams, and "openRTSP" is trying to do the best that it can. To have any hope for this to work, however, you must give "openRTSP" an accurate 'frame rate' parameter (using "-f"). Ross Finlayson Live Networks, Inc.
http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Dec 15 17:57:06 2011 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 15 Dec 2011 17:57:06 -0800 Subject: [Live-devel] Multicast to rtsp with Amino A125 In-Reply-To: <989C7F23-38B8-4F9E-BC41-1F0EE7DCA9FC@europa-network.com> References: <989C7F23-38B8-4F9E-BC41-1F0EE7DCA9FC@europa-network.com> Message-ID: <478B2874-502A-4003-BC00-56AF5FC95F8E@live555.com> > We have many Amino A125 STBs. The server we currently use to RTSP to them is outdated and we need a new system to stream to them. Your software appears to be able to do this but I'm struggling to find guides to achieve this. Here is a background of what we have as in streams etc: > > We multicast from our encoders to our current streaming server. This server then RTSPs out to the Amino STB. The multicast input stream is MPEG4/h264 TS UDP. We then RTSP over UDP unicast out to the STB. > > So we need to input UDP multicast to live555 server and then RTSP UDP unicast to the Amino STB. Yes, you should be able to do this fairly easily. I suggest using the "testOnDemandRTSPServer" demo application as a model; note, in particular, the code for streaming Transport Stream data (lines 215 through 218 of "testProgs/testOnDemandRTSPServer.cpp"). The one change that you'll need to make to this code is that rather than adding a "MPEG2TransportFileServerMediaSubsession" to the "ServerMediaSession" object, you'll be adding an object of a different "OnDemandServerMediaSubsession" - one that you will write yourself. In fact, I suggest that you subclass "MPEG2TransportFileServerMediaSubsession". If you do that, then you will need only to redefine the "createNewStreamSource()" virtual function.
In your subclass's constructor, when it calls the parent class ("MPEG2TransportFileServerMediaSubsession")'s constructor, you should set the "fileName" and "indexFile" parameters to NULL, and set "reuseFirstSource" to True. (This tells the server to use the same input source object, even if more than one client is streaming from the server concurrently.) Your subclass's "createNewStreamSource()" virtual function can be quite simple - basically just creating a "groupsock" for your IP multicast address, and then creating a "BasicUDPSource" using that "groupsock" object. I suggest looking at the "testRelay" demo application code for a hint about how to do this. (Because your input is Transport Stream data packed into UDP packets, I don't think that you'll need a separate 'framer' object in front of the "BasicUDPSource" object. Instead, you'll probably be able to transfer the contents of each incoming UDP multicast packet directly into output UDP unicast packets. The method that I've outlined above should do that.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From ben at europa-network.com Fri Dec 16 00:28:15 2011 From: ben at europa-network.com (Ben Wheway) Date: Fri, 16 Dec 2011 09:28:15 +0100 Subject: [Live-devel] Multicast to rtsp with Amino A125 In-Reply-To: <478B2874-502A-4003-BC00-56AF5FC95F8E@live555.com> References: <989C7F23-38B8-4F9E-BC41-1F0EE7DCA9FC@europa-network.com> <478B2874-502A-4003-BC00-56AF5FC95F8E@live555.com> Message-ID: <064301ccbbcc$a9043710$fb0ca530$@com> Hi Ross Many Thanks for your reply. This is the first time I have seen Live555 and the below doesn't make any sense when I'm looking at that file. Could you possibly send me the file with the updated changes? We will pay for your time. If you can, could you create 1 of the channels and I can duplicate the rest.
Channel 1: Input: 234.5.90.131:4900 Output: rtsp://xxx.xxx.xxx.xxx/channel1.xx Thanks Ross again for your help! Regards Ben From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: 16 December 2011 02:57 To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Multicast to rtsp with Amino A125 We have many Amino A125 STB's. The server we currently use to rtsp to them is outdated and we need a new system to stream to them. Your software appears to be able to do this but I'm struggling to find guides to achieve this. Here is a background of what we have as in streams etc: We multicast from our encoders to our current streaming server. This server then RTSP's out to the Amino STB. The multicast input stream is MPEG4/h264 TS UDP. We then RTSP over UDP unicast out to the STB. So we need to input UDP multicast to live555 server and then RTSP UDP unicast to the Amino STB. Yes, you should be able to do this fairly easily. I suggest using the "testOnDemandRTSPServer" demo application as a model; note, in particular, the code for streaming Transport Stream data (lines 215 through 218 of "testProgs/testOnDemandRTSPServer.cpp"). The one change that you'll need to make to this code is that rather than adding a "MPEG2TransportFileServerMediaSubsession" to the "ServerMediaSession" object, you'll be adding an object of a different "OnDemandServerMediaSubsession" - one that you will write yourself. In fact, I suggest that you subclass "MPEG2TransportFileServerMediaSubsession". If you do that, then you will need only to redefine the "createNewStreamSource()" virtual function. In your subclass's constructor, when it calls the parent class ("MPEG2TransportFileServerMediaSubsession")'s constructor, you should set the "fileName" and "indexFile" parameters to NULL, and set "reuseFirstSource" to True. 
(This tells the server to use the same input source object, even if more than one client is streaming from the server concurrently.) Your subclass's "createNewStreamSource()" virtual function can be quite simple - basically just creating a "groupsock" for your IP multicast address, and then creating a "BasicUDPSource" using that "groupsock" object. I suggest looking at the "testRelay" demo application code for a hint about how to do this. (Because your input is Transport Stream data packed into UDP packets, I don't think that you'll need a separate 'framer' object in front of the "BasicUDPSource" object. Instead, you'll probably be able to transfer the contents of each incoming UDP multicast packet directly into output UDP unicast packets. The method that I've outlined above should do that.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From david.myers at panogenics.com Fri Dec 16 02:15:49 2011 From: david.myers at panogenics.com (David J Myers) Date: Fri, 16 Dec 2011 10:15:49 -0000 Subject: [Live-devel] Live555 EventLoop Crash Message-ID: <004601ccbbdb$af466580$0dd33080$@myers@panogenics.com> Hi Ross, >(Unfortunately, you didn't say what your downstream object is, so until you do, I can't really tell you how to increase the buffer size.) OK, in H.264 mode, I'm directly using H264VideoStreamFramer, and in MPEG4 mode, I'm using MPEG4VideoStreamDiscreteFramer. I hope this is what you mean by the downstream object. How do I increase the buffer size? Thanks and regards - David -------------- next part -------------- An HTML attachment was scrubbed... 
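[Editor's note] The packet-level idea behind the relay suggestion in the thread above - no 'framer' object is needed, because each incoming multicast UDP payload can be forwarded verbatim as a unicast UDP packet - can be sketched outside live555 with plain POSIX sockets. This is only an illustration of the data path; in a real live555 server the "Groupsock"/"BasicUDPSource" classes do this work, and the function and variable names below are hypothetical:

```cpp
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cassert>
#include <cstring>

// Bind a UDP socket on the loopback address (port 0 = OS-assigned),
// reporting the chosen address back through 'addr'.
static int bindUdp(sockaddr_in& addr) {
    int fd = socket(AF_INET, SOCK_DGRAM, 0);
    std::memset(&addr, 0, sizeof addr);
    addr.sin_family = AF_INET;
    addr.sin_addr.s_addr = htonl(INADDR_LOOPBACK);
    addr.sin_port = 0;
    bind(fd, (sockaddr*)&addr, sizeof addr);
    socklen_t len = sizeof addr;
    getsockname(fd, (sockaddr*)&addr, &len);   // learn the assigned port
    return fd;
}

// Forward one datagram from inFd to dest via outFd. A TS-over-UDP relay
// does exactly this per packet: the Transport Stream payload is opaque,
// so no reframing is required between input and output.
static ssize_t relayOne(int inFd, int outFd, const sockaddr_in& dest) {
    char buf[65536];                           // maximum UDP payload
    ssize_t n = recv(inFd, buf, sizeof buf, 0);
    if (n < 0) return n;
    return sendto(outFd, buf, (size_t)n, 0, (const sockaddr*)&dest, sizeof dest);
}
```

A real relay would call relayOne() from an event loop whenever the input socket becomes readable; live555's event-driven sources provide the same behavior without blocking reads.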
URL: From jmm55 at psu.edu Fri Dec 16 06:16:15 2011 From: jmm55 at psu.edu (Justin Miller) Date: Fri, 16 Dec 2011 09:16:15 -0500 Subject: [Live-devel] Audio Sync Issue from Axis Camera In-Reply-To: References: Message-ID: Apologies, yes we are using OpenRTSP, we can get synced audio and video from VLC but the image quality is not very good. If we open the stream directly in QuickTime the audio and video are synced as well. The problem is OpenRTSP is giving inconsistent results, sometimes it's 5 frames out of sync, other times 2 or 3, and sometimes not at all. We found using the -y modifier produces an issue with drifting audio, where it might be dead on at the beginning but after 20 minutes is a full second out of sync. We also found that if a file is 5 frames out of sync and we re-encode it with ffmpeg it then produces a file that is less out of sync, with perhaps 2 or 3 frames. It is a very strange issue. The file seems to play better in VLC and Windows Media Player, meaning that only the videos that are up to 5 frames out of sync are noticeable. However, QuickTime or Final Cut Pro (basically anything based on QuickTime) shows the issue to a much greater degree. The stream is automatically encoded h.264 and aac audio. I do get better results when I change the GOV settings to 8 and the bit rate to 6400 than any other setting on the camera. I am not sure why this is; I also get better results when the camera is set to 24 fps and I set OpenRTSP to 23.95 fps, also not sure why that is, again it is inconsistent though. Thanks, Justin M. Miller Multimedia Specialist Media Commons Project Manager 212A Rider bld. University Park, PA 16802 814-863-7764 http://mediacommons.psu.edu On Thu, Dec 15, 2011 at 8:31 PM, Ross Finlayson wrote: > We are trying to use an Axis Q6034 Camera that sends out a video and audio > stream in h.264 and aac. The problem we are having is that the audio and > video do not sync no matter what we try. 
> > > Is this a general problem with audio and video from this camera being out > of sync, or just a problem with audio/video sync if you try to record it to > a file using "openRTSP"? (You didn't mention "openRTSP" in your message!) > > I.e., if you view the stream directly using VLC (i.e., not from a file), > then is the audio and video in sync? If not, then there's probably nothing > we can do. > > If, however, you're seeing a/v sync problems only when you try to record > the stream to a '.mov' or '.mp4' file using "openRTSP", then the problem is > basically that the '.mov' and '.mp4' formats are ill-suited for recording > incoming audio/video streams, and "openRTSP" is trying to do the best that > it can. To have any hope for this to work, however, you must give > "openRTSP" an accurate 'frame rate' parameter (using "-f"). > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Dec 16 07:04:16 2011 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 16 Dec 2011 07:04:16 -0800 Subject: [Live-devel] Live555 EventLoop Crash In-Reply-To: <004601ccbbdb$af466580$0dd33080$@myers@panogenics.com> References: <004601ccbbdb$af466580$0dd33080$@myers@panogenics.com> Message-ID: > >(Unfortunately, you didn't say what your downstream object is, so until you do, I can't really tell you how to increase the buffer size.) > OK, in H.264 mode, I'm directly using H264VideoStreamFramer, and in MPEG4 mode, I'm using MPEG4VideoStreamDiscreteFramer. I hope this is what you mean by the downstream object. No, I meant the 'sink' object that those feed into. Is this a "RTPSink" (subclass)? 
If so, then you should be seeing an error message like "MultiFramedRTPSink::afterGettingFrame1(): The input frame data was too large for our buffer size" This error message will also tell you what to do: Increase "OutPacketBuffer::maxSize" to at least *before* creating this 'RTPSink'. You can do this in your main program, before you create any LIVE555 objects: OutPacketBuffer::maxSize = YOUR_NEW_BUFFER_SIZE; Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Dec 16 07:42:30 2011 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 16 Dec 2011 07:42:30 -0800 Subject: [Live-devel] Multicast to rtsp with Amino A125 In-Reply-To: <064301ccbbcc$a9043710$fb0ca530$@com> References: <989C7F23-38B8-4F9E-BC41-1F0EE7DCA9FC@europa-network.com> <478B2874-502A-4003-BC00-56AF5FC95F8E@live555.com> <064301ccbbcc$a9043710$fb0ca530$@com> Message-ID: > Many Thanks for your reply. This is the first time I have seen Live555 and the below doesn?t make any sense when im looking at that file. Use of the "LIVE555 Streaming Media" software requires a good working knowledge of C++, and systems programming. You need to have this, plus take the time to understand (using the demo applications) how LIVE555-based applications work. Once you've done that, my previous message will make sense. > Could you possibly send me the file with the updated changes? We will pay for your time. > > If you can, could you create 1 of the channels and I can duplicate the rest. > > > Channel 1: > Input: 234.5.90.131:4900 > Output: rtsp://xxx.xxx.xxx.xxx/channel1.xx Unfortunately I don't have the time to engage in custom programming for individual projects like this. But perhaps someone else on this mailing list might be interested in helping you? (In that case, they can contact you off the list.) Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From ben at europa-network.com Fri Dec 16 08:13:52 2011 From: ben at europa-network.com (Ben Wheway) Date: Fri, 16 Dec 2011 17:13:52 +0100 Subject: [Live-devel] Multicast to rtsp with Amino A125 In-Reply-To: References: <989C7F23-38B8-4F9E-BC41-1F0EE7DCA9FC@europa-network.com> <478B2874-502A-4003-BC00-56AF5FC95F8E@live555.com> <064301ccbbcc$a9043710$fb0ca530$@com> Message-ID: <06e701ccbc0d$b497b660$1dc72320$@com> Thanks for your reply Ross, I'm assuming people will see this email and put a request in if they are interested? Thanks Ben From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: 16 December 2011 16:43 To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Multicast to rtsp with Amino A125 Many Thanks for your reply. This is the first time I have seen Live555 and the below doesn't make any sense when I'm looking at that file. Use of the "LIVE555 Streaming Media" software requires a good working knowledge of C++, and systems programming. You need to have this, plus take the time to understand (using the demo applications) how LIVE555-based applications work. Once you've done that, my previous message will make sense. Could you possibly send me the file with the updated changes? We will pay for your time. If you can, could you create 1 of the channels and I can duplicate the rest. Channel 1: Input: 234.5.90.131:4900 Output: rtsp://xxx.xxx.xxx.xxx/channel1.xx Unfortunately I don't have the time to engage in custom programming for individual projects like this. But perhaps someone else on this mailing list might be interested in helping you? (In that case, they can contact you off the list.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From jshanab at smartwire.com Fri Dec 16 09:01:04 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Fri, 16 Dec 2011 17:01:04 +0000 Subject: [Live-devel] mpegVersion in addNewAudioSource Message-ID: <615FD77639372542BF647F5EBAA2DBC20B1A6C12@IL-BOL-EXCH01.smartwire.com> In the MPEG2TransportStreamFromESSource I am adding a video and audio source. For the addNewVideoSource the comments say mpegversion = 4 for mpeg4 and 5 for h264 There is no such comment for the addNewAudioSource but examples seem to use 1. Anyway, what is the meaning and setting for this argument? With audio I stop about 1 second in, without audio it continues until I stop it. I am trying to go from H264 received from RTSP earlier and the test.aac file into a transport stream. -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Dec 16 11:55:56 2011 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 16 Dec 2011 11:55:56 -0800 Subject: [Live-devel] mpegVersion in addNewAudioSource In-Reply-To: <615FD77639372542BF647F5EBAA2DBC20B1A6C12@IL-BOL-EXCH01.smartwire.com> References: <615FD77639372542BF647F5EBAA2DBC20B1A6C12@IL-BOL-EXCH01.smartwire.com> Message-ID: <819C4093-0AF3-4F25-86EF-D487CC30DA34@live555.com> > In the MPEG2TransportStreamFromESSource I am adding a video and audio source. > > For the addNewVideoSource the comments say mpegversion = 4 for mpeg4 and 5 for h264 > > There is no such comment for the addNewAudioSource but examples seem to use 1. Anyway, what is the meaning and setting for this argument? It is used to compute the value of the "stream_type" field that gets put in the PMT (Program Map Table) in the resulting Transport Stream. This value depends upon the specific type of video or audio (i.e., upon the specific video or audio codec) that's used in the file. Because you're trying to store AAC audio in your Transport Stream, the "mpegVersion" field should definitely *not* be 1 or 2. 
You should, instead, use a "mpegVersion" of 4 (because AAC is "MPEG-4 audio"). Our code will use this to compute a "stream_type" of 0xF, which may or may not be correct for AAC audio. (If it's not correct, then someone please let me know, and I'll update our code accordingly.) > I am trying to go from H264 received from RTSP earlier and the test.aac file into a transport stream. If, instead, you were to use an MP3 file for your audio (rather than an AAC file), then you would use a "mpegVersion" of 1 or (more likely) 2, and that would probably work OK. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From warren at etr-usa.com Fri Dec 16 12:18:54 2011 From: warren at etr-usa.com (Warren Young) Date: Fri, 16 Dec 2011 13:18:54 -0700 Subject: [Live-devel] Multicast to rtsp with Amino A125 In-Reply-To: <064301ccbbcc$a9043710$fb0ca530$@com> References: <989C7F23-38B8-4F9E-BC41-1F0EE7DCA9FC@europa-network.com> <478B2874-502A-4003-BC00-56AF5FC95F8E@live555.com> <064301ccbbcc$a9043710$fb0ca530$@com> Message-ID: <4EEBA7AE.8010902@etr-usa.com> On 12/16/2011 1:28 AM, Ben Wheway wrote: > > Channel 1: > Input: 234.5.90.131:4900 > Output: rtsp://xxx.xxx.xxx.xxx/channel1.xx For such a simple application, you might find VLM sufficient: http://www.videolan.org/doc/streaming-howto/en/ch05.html From jshanab at smartwire.com Fri Dec 16 12:24:46 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Fri, 16 Dec 2011 20:24:46 +0000 Subject: [Live-devel] mpegVersion in addNewAudioSource In-Reply-To: <819C4093-0AF3-4F25-86EF-D487CC30DA34@live555.com> References: <615FD77639372542BF647F5EBAA2DBC20B1A6C12@IL-BOL-EXCH01.smartwire.com> <819C4093-0AF3-4F25-86EF-D487CC30DA34@live555.com> Message-ID: <615FD77639372542BF647F5EBAA2DBC20B1A6D30@IL-BOL-EXCH01.smartwire.com> Thanks. 
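[Editor's note] The mpegVersion-to-stream_type relationship discussed in this thread can be tabulated as a small helper. This is a sketch: only "audio mpegVersion 4 -> stream_type 0x0F (AAC)" and the video convention "4 = MPEG-4, 5 = H.264" are stated in the thread; the remaining values are the standard ISO/IEC 13818-1 PMT stream_type assignments, assumed here for context, and may differ from what live555 actually emits:

```cpp
#include <cassert>
#include <cstdint>

// Map a live555-style "mpegVersion" argument to an MPEG-2 TS PMT stream_type.
uint8_t audioStreamType(int mpegVersion) {
    switch (mpegVersion) {
        case 1:  return 0x03;  // MPEG-1 audio
        case 2:  return 0x04;  // MPEG-2 audio (the usual choice for MP3)
        case 4:  return 0x0F;  // MPEG-4 audio: AAC with ADTS framing
        default: return 0x00;  // unknown/reserved
    }
}

uint8_t videoStreamType(int mpegVersion) {
    switch (mpegVersion) {
        case 1:  return 0x01;  // MPEG-1 video
        case 2:  return 0x02;  // MPEG-2 video
        case 4:  return 0x10;  // MPEG-4 Part 2 video
        case 5:  return 0x1B;  // H.264/AVC (live555's convention: 5 means H.264)
        default: return 0x00;
    }
}
```

Note that the stream_type lands in the PMT (Program Map Table), not the PAT; the PAT only points at each program's PMT.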
I have been experimenting all day and comparing to reference streams and 4 seems to be the correct number in the PAT (my ts analyzer seems to call it the program_association_table). What I am seeing in the playing back of the file is that there is no audio in the stream. The table entry got there, so VLC shows in the codec info the correct information, but there is no sound and, on inspection with a hex editor or ts stream analyzer, no audio packets. It also flies through the audio file source, so I must have something set up wrong or the headers do not parse correctly. I looped it for now, but I think I will try MP3 this next attempt. For now I need to know how to get the audio muxed in; I must have missed a step. What I really need, in the long run, will be a "NULLAudioSource" created like the device source so I can put in an audio channel of low bitrate total silence. I think the iPad will not play a stream unless it has sound and I must have video only. I just do not yet know what a minimalist stream of silence looks like. From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Friday, December 16, 2011 1:56 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] mpegVersion in addNewAudioSource In the MPEG2TransportStreamFromESSource I am adding a video and audio source. For the addNewVideoSource the comments say mpegversion = 4 for mpeg4 and 5 for h264 There is no such comment for the addNewAudioSource but examples seem to use 1. Anyway, what is the meaning and setting for this argument? It is used to compute the value of the "stream_type" field that gets put in the PMT (Program Map Table) in the resulting Transport Stream. This value depends upon the specific type of video or audio (i.e., upon the specific video or audio codec) that's used in the file. Because you're trying to store AAC audio in your Transport Stream, the "mpegVersion" field should definitely *not* be 1 or 2. 
You should, instead, use a "mpegVersion" of 4 (because AAC is "MPEG-4 audio"). Our code will use this to compute a "stream_type" of 0xF, which may or may not be correct for AAC audio. (If it's not correct, then someone please let me know, and I'll update our code accordingly.) I am trying to go from H264 received from RTSP earlier and the test.aac file into a transport stream. If, instead, you were to use an MP3 file for your audio (rather than an AAC file), then you would use a "mpegVersion" of 1 or (more likely) 2, and that would probably work OK. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jshanab at smartwire.com Fri Dec 16 12:55:06 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Fri, 16 Dec 2011 20:55:06 +0000 Subject: [Live-devel] stackoverflow Message-ID: <615FD77639372542BF647F5EBAA2DBC20B1A6D77@IL-BOL-EXCH01.smartwire.com> I am having a stack overflow whenever anything takes a bit of time. I read in the archives that this should be a very rare occurrence, but it happens if I debug, if I put in any std::cout, and on many startups before things get a chance to settle. I can see this is a mutually recursive path between the continuePlaying of one source and the afterGettingFrame of the sink or filter to the left. I suspect it is showing up now more than in the past because of the tiny 188-byte packets that transport streams use. I increased my stack size but that did not help. Is there a non-recursive equivalent that can avoid this so I can debug and remain running when things get behind? -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Fri Dec 16 14:13:13 2011 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 16 Dec 2011 14:13:13 -0800 Subject: [Live-devel] stackoverflow In-Reply-To: <615FD77639372542BF647F5EBAA2DBC20B1A6D77@IL-BOL-EXCH01.smartwire.com> References: <615FD77639372542BF647F5EBAA2DBC20B1A6D77@IL-BOL-EXCH01.smartwire.com> Message-ID: This can happen, though only if both your input source and your output sink (and any filters in-between) are synchronous. E.g., if your input source is a file on Windoze (where file reading has to be synchronous), and your output sink is also a file. (It should *not* happen if your output sink is a "RTPSink".) The way to overcome this is to find one place in your code (your own code, not the supplied source code!) that calls afterGetting(this); and replace this with: envir().taskScheduler().scheduleDelayedTask(0, (TaskFunc*)afterGetting, this); so that you'll return to the event loop rather than getting into a recursive call. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jshanab at smartwire.com Sat Dec 17 06:43:20 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Sat, 17 Dec 2011 14:43:20 +0000 Subject: [Live-devel] stackoverflow In-Reply-To: References: <615FD77639372542BF647F5EBAA2DBC20B1A6D77@IL-BOL-EXCH01.smartwire.com> Message-ID: <615FD77639372542BF647F5EBAA2DBC20B1A9764@IL-BOL-EXCH01.smartwire.com> My ESSource, a class based on the device source example, is the only place where I have a similar line, FramedSource::afterGetting(this). Replacing that with envir().taskScheduler().scheduleDelayedTask(0, (TaskFunc*)FramedSource::afterGetting, this) does not help. However, just commenting out the one in MP3FileSource and running that as if I am not in Windows works for stopping the stack overflow. But there is no MP3 in the output. I suspect that the MP3 is not in the expected file format. 
I am stepping through it now. My sink has the continuePlaying architecture. My sink's afterGettingFrame1 ends by calling continuePlaying, which is: Boolean MYsink::continuePlaying() { if (fSource == NULL) return False; fSource->getNextFrame(fBuffer, fBufferSize, afterGettingFrame, this, onSourceClosure, this); return True; } If I were to make the signature compatible with a TaskFunc, would scheduling it for the event loop work? From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Friday, December 16, 2011 4:13 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] stackoverflow This can happen, though only if both your input source and your output sink (and any filters in-between) are synchronous. E.g., if your input source is a file on Windoze (where file reading has to be synchronous), and your output sink is also a file. (It should *not* happen if your output sink is a "RTPSink".) The way to overcome this is to find one place in your code (your own code, not the supplied source code!) that calls afterGetting(this); and replace this with: envir().taskScheduler().scheduleDelayedTask(0, (TaskFunc*)afterGetting, this); so that you'll return to the event loop rather than getting into a recursive call. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
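[Editor's note] Jeff's question - whether scheduling the completion through the event loop actually bounds the stack - can be demonstrated with a toy model of the source/sink handshake. This is self-contained and uses no live555 types; all names below are hypothetical:

```cpp
#include <cassert>
#include <deque>
#include <functional>

// Toy model of the source/sink handshake. If a synchronous source calls the
// sink's completion function directly, each frame deepens the stack
// (continuePlaying -> afterGetting -> continuePlaying -> ...). Scheduling the
// completion as a delay-0 task instead returns control to the event loop
// after every frame, so the stack depth stays constant.
struct Loop {
    std::deque<std::function<void()>> tasks;
    void schedule(std::function<void()> t) { tasks.push_back(std::move(t)); }
    void run() {
        while (!tasks.empty()) {
            auto t = std::move(tasks.front());
            tasks.pop_front();
            t();                           // each task runs with a fresh, shallow stack
        }
    }
};

struct Pipeline {
    Loop& loop;
    int framesLeft;
    int delivered = 0;
    void continuePlaying() {               // the sink asks for the next frame
        if (framesLeft-- <= 0) return;
        // The equivalent of scheduleDelayedTask(0, afterGetting, this):
        loop.schedule([this] { afterGetting(); });
    }
    void afterGetting() {                  // 'frame has arrived' completion
        ++delivered;
        continuePlaying();                 // request the next one
    }
};
```

Hundreds of thousands of frames flow through this without the recursion depth ever exceeding a handful of frames, which is exactly why the delay-0 scheduling fix stops the overflow.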
URL: From jshanab at smartwire.com Sat Dec 17 10:51:32 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Sat, 17 Dec 2011 18:51:32 +0000 Subject: [Live-devel] stackoverflow In-Reply-To: <615FD77639372542BF647F5EBAA2DBC20B1A9764@IL-BOL-EXCH01.smartwire.com> References: <615FD77639372542BF647F5EBAA2DBC20B1A6D77@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B1A9764@IL-BOL-EXCH01.smartwire.com> Message-ID: <615FD77639372542BF647F5EBAA2DBC20B1A9804@IL-BOL-EXCH01.smartwire.com> I am missing something fundamental about muxing the audio in with the video. Something to do with timing. All my attempts to use a file burned through the audio and exited even though it was an hour's worth of audio and I am making 5-second .ts clips. So I created a "SilentAacSource" class from the DeviceSource example and put in it one frame of audio that I return whenever getNextFrame is called. This is the code that caused infinite recursion before changing the FramedSource::afterGetting(this) to the recommended task scheduling. However, I ended up with a very rapid burst of video then 5 seconds of video. So I put 30,000 microseconds as the first argument to scheduleDelayedTask and the file size dropped from 19MB to 200K and was playable with repeated samples of sound. I am setting the PTS to current machine time, but the video does not have the same pts. Do the two streams have to have synced PTS's? Is getNextFrame called really fast (every 188 bytes) and it is up to the device to say it doesn't have anything? 
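[Editor's note] On the PTS question above: a common approach for a fixed-rate audio source is to capture one wall-clock epoch, share it with the video source so both streams are on the same clock, and then advance each audio frame's presentation time by an exact per-frame duration instead of re-reading the clock. A sketch - the 1024-samples-per-frame / 48 kHz numbers are assumptions for illustration, not values from the thread:

```cpp
#include <cassert>
#include <cstdint>
#include <sys/time.h>

// Fixed-rate presentation times for an audio source. Deriving each frame's
// time from a shared epoch plus frameCount * frameDuration keeps the audio
// from racing ahead of real time and keeps it on the same clock as video.
struct AudioClock {
    timeval epoch;                         // set once, e.g. via gettimeofday()
    uint64_t frameCount = 0;
    static constexpr unsigned kSamplesPerFrame = 1024;  // one AAC frame
    static constexpr unsigned kSampleRate = 48000;

    timeval nextPresentationTime() {
        // Microseconds elapsed since the epoch for this frame index.
        uint64_t usecs = frameCount++ * 1000000ULL * kSamplesPerFrame / kSampleRate;
        timeval pt = epoch;
        pt.tv_usec += usecs % 1000000;
        pt.tv_sec  += usecs / 1000000 + pt.tv_usec / 1000000;  // carry overflow
        pt.tv_usec %= 1000000;
        return pt;
    }
};
```

In a live555-style source this value would be assigned to the frame's presentation time in doGetNextFrame(); pacing delivery with a matching scheduleDelayedTask() interval (about 21,333 us per frame under these assumptions) prevents the "rapid burst" effect described above.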
The encoded file now plays with sound in VLC but it is still downloaded and ignored by the Mac/iPad From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Jeff Shanab Sent: Saturday, December 17, 2011 8:43 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] stackoverflow My ESSource, a class based on the device source example, is the only place where I have a similar line, FramedSource::afterGetting(this). Replacing that with envir().taskScheduler().scheduleDelayedTask(0, (TaskFunc*)FramedSource::afterGetting, this) does not help. However, just commenting out the one in MP3FileSource and running that as if I am not in Windows works for stopping the stack overflow. But there is no MP3 in the output. I suspect that the MP3 is not in the expected file format. I am stepping through it now. My sink has the continuePlaying architecture. My sink's afterGettingFrame1 ends by calling continuePlaying, which is: Boolean MYsink::continuePlaying() { if (fSource == NULL) return False; fSource->getNextFrame(fBuffer, fBufferSize, afterGettingFrame, this, onSourceClosure, this); return True; } If I were to make the signature compatible with a TaskFunc, would scheduling it for the event loop work? From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Friday, December 16, 2011 4:13 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] stackoverflow This can happen, though only if both your input source and your output sink (and any filters in-between) are synchronous. E.g., if your input source is a file on Windoze (where file reading has to be synchronous), and your output sink is also a file. (It should *not* happen if your output sink is a "RTPSink".) The way to overcome this is to find one place in your code (your own code, not the supplied source code!) 
that calls afterGetting(this); and replace this with: envir().taskScheduler().scheduleDelayedTask(0, (TaskFunc*)afterGetting, this); so that you'll return to the event loop rather than getting into a recursive call. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From ftriboix at falcon-one.com Sat Dec 17 11:32:29 2011 From: ftriboix at falcon-one.com (Fabrice Triboix) Date: Sat, 17 Dec 2011 19:32:29 +0000 Subject: [Live-devel] Add an audio sub-session makes the video stop In-Reply-To: References: Message-ID: <4EECEE4D.4070607@falcon-one.com> Dear Ross, We implemented a class based on OnDemandServerMediaSubsession, and it uses an apparently widely used trick in live555 to get the "SDP lines". It plays the stream into a "dummy" RTP sink to get those lines and then stops the stream. That's where things were going wrong, because the RTP sink I was using for the MP3 audio sub-stream (MPEG1or2AudioRTPSink, if it is the one used for this trick) was not returning any SDP lines, and the system never gets out of this loop. I fiddled a bit with that to try to get things working. I changed our subsession class so its "getAuxSDPLine()" method returns "rtpsink->auxSDPLine()", like it's done in the OnDemandServerMediaSubsession. It actually returns NULL, but it allowed me to move forward, as the DESCRIBE response is now coming with the SDP. However, I noticed that the SDP part corresponding to the audio stream contains just one line "control=track2". This looks quite limited and one would expect to get at least the rtp profile... 
So my question is: what exactly is "OnDemandServerMediaSubsession::getAuxSDPLine()"? When is it called? What should it return? I have not been able to find any documentation about that... Anyway, VLC can now play the audio and the video, but for just half a second. Then the audio plays again after a glitch, but the video remains still. I can see in Wireshark that both audio and video data are sent properly on their respective RTP ports... I probably need to investigate further, but I just wanted to know if you would have any clue about what could cause that. Thank you so much for your help! Best regards, Fabrice From finlayson at live555.com Sat Dec 17 21:24:11 2011 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 17 Dec 2011 21:24:11 -0800 Subject: [Live-devel] The "getAuxSDPLine()" virtual function In-Reply-To: <4EECEE4D.4070607@falcon-one.com> References: <4EECEE4D.4070607@falcon-one.com> Message-ID: <430B0C98-F08D-4101-A980-96F7E1C40E51@live555.com> (I've corrected the "Subject:" line of this email thread to make it more accurate.) The "auxSDPLine()" virtual function is used - by the "OnDemandServerMediaSubsession" implementation - when setting up the SDP lines that describe the substream. In particular, it is used to set up a special 'extra' SDP line (usually beginning with "a=fmtp:") that *some*, but not all, media codecs need to describe the stream. Its default implementation simply queries the "RTPSink" object for this value (and the "RTPSink", by default, will just return NULL). The bottom line is that you - as someone who defines and implements an "OnDemandServerMediaSubsession" subclass - will *not* need to reimplement this virtual function, unless: 1/ The codec that you are using is one that needs a special extra "a=fmtp:" SDP line, *and* 2/ This special extra SDP line can't be obtained simply by querying the "RTPSink" object. 
Conditions 1/ and 2/ are usually true only when you're streaming codecs like H.264 or MPEG-4 video that require special 'configuration' parameters, *and* those special 'configuration' parameters are available only 'in band'. For example, if you're streaming a H.264 or MPEG-4 Elementary Stream video file. Because your codec is MP3, you definitely *do not* need to reimplement the "getAuxSDPLine()" virtual function. I.e., if you have such a reimplementation, then you should remove it. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From ben at europa-network.com Sun Dec 18 03:04:44 2011 From: ben at europa-network.com (Ben Wheway) Date: Sun, 18 Dec 2011 12:04:44 +0100 Subject: [Live-devel] Multicast to rtsp with Amino A125 In-Reply-To: <4EEBA7AE.8010902@etr-usa.com> References: <989C7F23-38B8-4F9E-BC41-1F0EE7DCA9FC@europa-network.com> <478B2874-502A-4003-BC00-56AF5FC95F8E@live555.com> <064301ccbbcc$a9043710$fb0ca530$@com> <4EEBA7AE.8010902@etr-usa.com> Message-ID: Hi Warren Thanks for your reply. We currently use that for our other type of STB but VLC/VLM doesn't work with Aminos. The rtsp protocol isn't supported. I have tested live555 with a mpeg4 ts capture from our multicast and it works perfectly. I just don't know how to configure live streaming. Ross has sent instructions but to me it doesn't make sense. Is this something you can do for us? Of course we will pay for your time. 
Thanks Ben On 16 Dec 2011, at 21:18, Warren Young wrote: > On 12/16/2011 1:28 AM, Ben Wheway wrote: >> >> Channel 1: >> Input: 234.5.90.131:4900 >> Output: rtsp://xxx.xxx.xxx.xxx/channel1.xx > > For such a simple application, you might find VLM sufficient: > > http://www.videolan.org/doc/streaming-howto/en/ch05.html > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel From ftriboix at falcon-one.com Sun Dec 18 06:40:43 2011 From: ftriboix at falcon-one.com (Fabrice Triboix) Date: Sun, 18 Dec 2011 14:40:43 +0000 Subject: [Live-devel] Add an audio sub-session makes the video stop In-Reply-To: References: Message-ID: <4EEDFB6B.8000806@falcon-one.com> Dear Ross, I have another question about FramedSource::doGetNextFrame(). Does this method require the actual MP3 frame to be returned, or the MP3 frame encapsulated as required by the chosen RTP profile (14 in my case)? Thanks a lot for your help! Best regards, Fabrice From finlayson at live555.com Sun Dec 18 07:19:59 2011 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 18 Dec 2011 07:19:59 -0800 Subject: [Live-devel] Add an audio sub-session makes the video stop In-Reply-To: <4EEDFB6B.8000806@falcon-one.com> References: <4EEDFB6B.8000806@falcon-one.com> Message-ID: <874C6E28-089D-4199-A0F0-E6F54023B829@live555.com> > I have another question about FramedSource::doGetNextFrame(). Does this method require the actual MP3 frame to be returned, or the MP3 frame encapsulated as required by the chosen RTP profile (14 in my case)? Our RTP output code - in this case, the "MPEG1or2AudioRTPSink" class - automatically takes care of packing an appropriate number of MP3 frames (along with required headers) into each outgoing RTP packet. If you have a "FramedSource" subclass that feeds into this, then all you need to do is feed it individual MP3 frames. 
(If your input source is a MP3 file, then you can just use a "MP3FileSource"; you don't need to write your own "FramedSource" subclass. If, however, your input source is a live source - e.g., from a MP3 encoder - then you will need to write your own "FramedSource" subclass that delivers one MP3 frame at a time.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jshanab at smartwire.com Sun Dec 18 08:01:06 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Sun, 18 Dec 2011 16:01:06 +0000 Subject: [Live-devel] Http Live Streaming Message-ID: <615FD77639372542BF647F5EBAA2DBC20B1A9C97@IL-BOL-EXCH01.smartwire.com> I am trying to stream to Mac, iPhone and iPad using HTTP Live Streaming from my application that uses the live555 libs. Data is flowing, but I do not get video. When I extract one of the segment files, it will play in VLC. On further investigation, I see that when the .ts file is run in QuickTime (Apple's toolkit for video), it does not even detect that there is H.264 content. The same H.264 stream, restreamed into HTTP Live Streaming by some off-the-shelf software, is fine. I found a tool to analyze the headers in the transport stream, and was wondering if someone can look at the two sets of analysis and tell me what I am forgetting.
Working:
Adaptation fields
  Adaptation_field_length: 7
  discontinuity_indicator: False
  random_access_indicator: False
  ES_priority_indicator: False
  PCR_flag: True
  OPCR_flag: False
  splicing_point_flag: False
  transport_private_data_flag: False
  adaptation_field_extension_flag: False
  PCR: 269280000
PES header
  stream_id: E0 (video stream 224)
  PES_packet_length: 0 (undefined)
  PES_scrambling: 0
  PES_priority: False
  data_alignment: True
  copyright: False
  original_or_copy: False
  PTS_flag: True
  DTS_flag: True
  ESCR_flag: False
  ES_rate_flag: False
  DSM_trick_mode_flag: False
  additional_copy_info_flag: False
  PES_CRC_flag: False
  PES_extension_flag: False
  PES_header_data_length: 10
  PTS: 900000
  DTS: 900000
Video sequence
  Sequence header code not found in this packet
  AFD not found in this packet

Not working:
==========================================================
Adaptation fields
  Adaptation_field_length: 7
  discontinuity_indicator: False
  random_access_indicator: False
  ES_priority_indicator: False
  PCR_flag: True
  OPCR_flag: False
  splicing_point_flag: False
  transport_private_data_flag: False
  adaptation_field_extension_flag: False
  PCR: 6922408255
PES header
  stream_id: E0 (video stream 224)
  PES_packet_length: 13861
  PES_scrambling: 0
  PES_priority: False
  data_alignment: False
  copyright: False
  original_or_copy: False
  PTS_flag: True
  DTS_flag: False
  ESCR_flag: False
  ES_rate_flag: False
  DSM_trick_mode_flag: False
  additional_copy_info_flag: False
  PES_CRC_flag: False
  PES_extension_flag: False
  PES_header_data_length: 5
  PTS: 23074694
Video sequence
  Sequence header code not found in this packet
  AFD not found in this packet
-------------- next part -------------- An HTML attachment was scrubbed...
URL: From ftriboix at falcon-one.com Sun Dec 18 13:13:07 2011 From: ftriboix at falcon-one.com (Fabrice Triboix) Date: Sun, 18 Dec 2011 21:13:07 +0000 Subject: [Live-devel] Add an audio sub-session makes the video stop In-Reply-To: References: Message-ID: <4EEE5763.1060205@falcon-one.com> Dear Ross, > Because your codec is MP3, you definitely *do not* need to reimplement the "getAuxSDPLine()" virtual function. I.e., if you have such a reimplementation, then you should remove it. Many thanks for these details, that's enlightening! > Our RTP output code - in this case, the "MPEG1or2AudioRTPSink" class - automatically takes care of packing an appropriate number of MP3 frames (along with required headers) into each outgoing RTP packet. If you have a "FramedSource" subclass that feeds into this, then all you need to do is feed it individual MP3 frames. > > (If your input source is a MP3 file, then you can just use a "MP3FileSource"; you don't need to write your own "FramedSource" subclass. If, however, your input source is a live source - e.g., from a MP3 encoder - then you will need to write your own "FramedSource" subclass that delivers one MP3 frame at a time.) I am currently using a file, but eventually it will be a live source. I got the audio streaming, which is good, but it stutters. I looked at the packet format with Wireshark, but found no issues. However, I found that RTP packets are apparently sent too quickly with my class. But if I use MP3AudioFileServerMediaSubsession, the audio is fine and RTP packets are sent at a normal rate. If I use my class with openRTSP for a few seconds, the audio file is 3x larger than the video file (with your class it's 10x smaller). Do you have any idea what is happening here? Many thanks for your support!
Fabrice From finlayson at live555.com Sun Dec 18 13:24:13 2011 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 18 Dec 2011 13:24:13 -0800 Subject: [Live-devel] Add an audio sub-session makes the video stop In-Reply-To: <4EEE5763.1060205@falcon-one.com> References: <4EEE5763.1060205@falcon-one.com> Message-ID: <83C09D2A-1372-4145-A730-7287599F20C4@live555.com> > I am currently using a file, but eventually it will be live source. > I got the audio streaming, which is good, but it stutters. I looked at the packet format with wireshark, but found no issues. > However I found out that apparently RTP packets are sent too quickly with my class. But if you use MP3AudioFileServerMediaSubsession, the audio is fine and RTP packets are sent at a normal rate. OK, so you have code that works: "MP3AudioFileServerMediaSubsession", and code that does not work: Your subclass of "OnDemandServerMediaSubsession". By looking at the differences between them, it should be relatively easy, then, for you to figure out what's wrong with your code. > If I use my class with openRTSP for a few seconds, the audio file is 3x larger that the video file (with your class it's 10x smaller). > Do you have any idea what is happening here? What happens when you try playing this "3x larger" audio file (i.e., after renaming it to have a ".mp3" filename suffix)? I suspect that you are delivering the same MP3 frame into the downstream "MPEG1or2AudioRTPSink" more than once (so you end up with duplicate MP3 frames being sent). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
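Ross's point - feed the downstream sink exactly one MP3 frame per request, and deliver each frame only once - can be sketched as a "FramedSource" subclass along these lines. This is a non-authoritative sketch: the class name and the getNextMP3FrameFromEncoder() helper are hypothetical stand-ins for a real encoder API, while fTo, fMaxSize, fFrameSize, fNumTruncatedBytes, fPresentationTime and afterGetting() are the standard "FramedSource" members.

```cpp
#include "FramedSource.hh"
#include <cstring>

// Hypothetical live MP3 source: delivers exactly one complete MP3 frame
// per doGetNextFrame() call, and calls afterGetting() exactly once per frame.
class MyMP3LiveSource: public FramedSource {
public:
  static MyMP3LiveSource* createNew(UsageEnvironment& env) {
    return new MyMP3LiveSource(env);
  }

protected:
  MyMP3LiveSource(UsageEnvironment& env): FramedSource(env) {}

private:
  virtual void doGetNextFrame() {
    unsigned frameSize; struct timeval pts;
    // Placeholder for your encoder's API; it must return one whole MP3 frame:
    u_int8_t const* frame = getNextMP3FrameFromEncoder(frameSize, pts);

    if (frameSize > fMaxSize) { // the downstream object's buffer is too small
      fNumTruncatedBytes = frameSize - fMaxSize;
      fFrameSize = fMaxSize;
    } else {
      fNumTruncatedBytes = 0;
      fFrameSize = frameSize;
    }
    memcpy(fTo, frame, fFrameSize);
    fPresentationTime = pts;
    // Calling afterGetting() more than once for the same frame would cause
    // exactly the duplicated-frame symptom Ross describes:
    FramedSource::afterGetting(this);
  }

  // Hypothetical encoder hook, to be implemented against your capture code:
  u_int8_t const* getNextMP3FrameFromEncoder(unsigned& size, struct timeval& pts);
};
```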
URL: From finlayson at live555.com Sun Dec 18 13:51:19 2011 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 18 Dec 2011 13:51:19 -0800 Subject: [Live-devel] How to create a custom sink that streams to both the network and a local file In-Reply-To: References: <900D5E18-98A6-446B-B025-8A078FB91691@live555.com> Message-ID: <11D15901-ABBD-47D3-B5C0-7B3191E10CD2@live555.com> > I have been playing with the FramedFilter class for a while, and created a simple filter that will just deliver its input to its output. However, it is not working for some reason. Can you point me to the point that I'm missing? [...]

> void HistoryFilter::doGetNextFrame()
> {
>   fFrameSize = 0;

You don't need to do this here (although it does no harm), because you are (properly) setting "fFrameSize" later, in your 'after getting' function.

> void HistoryFilter::afterGettingFrame(void* clientData, unsigned frameSize, unsigned /*numTruncatedBytes*/, struct timeval presentationTime, unsigned /*durationInMicroseconds*/)
> {
>   HistoryFilter* filter = (HistoryFilter*)clientData;
>   filter->afterGettingFrame1(frameSize, presentationTime);
> }
>
> void HistoryFilter::afterGettingFrame1(unsigned frameSize, struct timeval presentationTime)
> {
>   fFrameSize = frameSize;
>   fPresentationTime = presentationTime;
>   afterGetting(this);
> }

Your filter also needs to set "fNumTruncatedBytes" and "fDurationInMicroseconds" (in your case, because you're making a direct copy, these will be the values of the "numTruncatedBytes" and "durationInMicroseconds" parameters, respectively).
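Applying that comment to the code quoted above, the two 'after getting' functions would pass all four callback parameters through. A sketch (only the two extra assignments differ from the original code):

```cpp
// Sketch: the filter's 'after getting' handlers, extended to copy the
// truncation and duration values through as well.
void HistoryFilter::afterGettingFrame(void* clientData, unsigned frameSize,
                                      unsigned numTruncatedBytes,
                                      struct timeval presentationTime,
                                      unsigned durationInMicroseconds) {
  HistoryFilter* filter = (HistoryFilter*)clientData;
  filter->afterGettingFrame1(frameSize, numTruncatedBytes,
                             presentationTime, durationInMicroseconds);
}

void HistoryFilter::afterGettingFrame1(unsigned frameSize,
                                       unsigned numTruncatedBytes,
                                       struct timeval presentationTime,
                                       unsigned durationInMicroseconds) {
  fFrameSize = frameSize;
  fNumTruncatedBytes = numTruncatedBytes;           // direct copy from the input
  fPresentationTime = presentationTime;
  fDurationInMicroseconds = durationInMicroseconds; // direct copy from the input
  afterGetting(this);
}
```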
> In my main server play() function, I put in this code:
>
> FramedSource* videoES = fileSource;
> // Create a framer for the Video Elementary Stream:
> videoSource = MPEG4VideoStreamFramer::createNew(*env, videoES);
> historyFilter = HistoryFilter::createNew(*env, videoSource, "history.mp4");
> // Finally, start playing:
> *env << "Beginning to read from file...\n";
> videoSink->startPlaying(*historyFilter, afterPlaying, videoSink);

That looks fine - but don't forget to enter the event loop, by calling

  env->taskScheduler().doEventLoop();

at the end of this code; otherwise nothing will happen. (In a LIVE555-based application, almost everything gets done within the event loop.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From kidjan at gmail.com Wed Dec 14 14:44:51 2011 From: kidjan at gmail.com (Jeremy Noring) Date: Wed, 14 Dec 2011 14:44:51 -0800 Subject: [Live-devel] Live555 EventLoop crash In-Reply-To: References: Message-ID: On Wed, Dec 14, 2011 at 1:27 PM, Ross Finlayson wrote: > No, you must send only one frame at a time, because downstream objects > expect 'frames', not 'portions of frames'. "fMaxSize" is the size of the > buffer that the downstream object specified when it called "getNextFrame()" > on your input source object. If your frames are larger than "fMaxSize" > (which is what you are seeing), then that simply means that your downstream > object's buffer is too small. You need to increase it. > > (Unfortunately, you didn't say what your downstream object is, so until > you do, I can't really tell you how to increase the buffer size.) > Two quick questions:
1. One thing another open source project I maintain does is automatically adjust the buffer size, using realloc, when a too-large frame comes along. It seems like you field this question often enough that it might be worth considering something similar?
2.
One of the biggest performance hits in my profiling is memcpy (I use an embedded platform, so memcpy gets pricey fast), much of it due to copying media buffers. Would you ever consider adding (or consider accepting ;) code that allows live555 to work in the calling library's buffers instead of its own? (In other words, I give Live555 a pointer to a buffer to send and its size, rather than memcpy'ing the buffer into live555's space.) Thanks! -------------- next part -------------- An HTML attachment was scrubbed... URL: From david.myers at panogenics.com Mon Dec 19 02:32:07 2011 From: david.myers at panogenics.com (David J Myers) Date: Mon, 19 Dec 2011 10:32:07 -0000 Subject: [Live-devel] Live555 EventLoop Crash Message-ID: <004b01ccbe39$75e8b4e0$61ba1ea0$@myers@panogenics.com> Hi Ross,

>No, I meant the 'sink' object that those feed into. Is this a "RTPSink" (subclass)? If so, then you should be seeing
>an error message like
> "MultiFramedRTPSink::afterGettingFrame1(): The input frame data was too large for our buffer size"
>This error message will also tell you what to do:
> Increase "OutPacketBuffer::maxSize" to at least *before* creating this 'RTPSink'.
>You can do this in your main program, before you create any LIVE555 objects:
> OutPacketBuffer::maxSize = YOUR_NEW_BUFFER_SIZE;

No, I don't see this error message; however, I am seeing continuous truncations - almost every frame is truncated. My own debug output looks like this, at a rate of around 4 frames per second:

deliverFrame(): newFrameSize:216054, fNumTruncatedBytes:66055
deliverFrame(): newFrameSize:108994, fNumTruncatedBytes:98217
deliverFrame(): newFrameSize:96844, fNumTruncatedBytes:45293

I've tried increasing OutPacketBuffer::maxSize from 1000000 to 5000000 (5 million), but this has no effect. Am I just trying to stream too much data, too quickly? Is it that however large you make the output buffer, it's going to fill up if it can't stream out fast enough?
How do I get it to drop frames or GOPs sensibly without truncation? Remember, this is an in-camera live video server - David -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Dec 19 02:51:55 2011 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 19 Dec 2011 02:51:55 -0800 Subject: [Live-devel] Live555 EventLoop Crash In-Reply-To: <004b01ccbe39$75e8b4e0$61ba1ea0$@myers@panogenics.com> References: <004b01ccbe39$75e8b4e0$61ba1ea0$@myers@panogenics.com> Message-ID: <02EA1F2B-C632-4833-8D86-C801A0EBFFEC@live555.com> > No I don?t see this error message, however I am seeing continuous truncations, almost every frame is truncated. My own debug output looks like this, at a rate of around 4 frames per second:- > deliverFrame(): newFrameSize:216054, fNumTruncatedBytes:66055 > deliverFrame(): newFrameSize:108994, fNumTruncatedBytes:98217 > deliverFrame():newFrameSize:96844, fNumTruncatedBytes :45293 > > I?ve tried increasing OutPacketBuffer::maxSize from 1000000 to 5000000 (5 million) but this has no effect. Am I just trying to stream too much data, too quickly? No, the problem has nothing to do with 'speed'. There's a bug in your code somewhere. Sorry. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From Jeremiah.Morrill at econnect.tv Mon Dec 19 15:08:13 2011 From: Jeremiah.Morrill at econnect.tv (Jer Morrill) Date: Mon, 19 Dec 2011 23:08:13 +0000 Subject: [Live-devel] SET_PARAMETER doesn't get handled in the RTSPServer over TCP Message-ID: <80C795F72B3CB241A9256DABF0A04EC5F96DDA@SN2PRD0702MB101.namprd07.prod.outlook.com> Many apologies if this is a dupe, I don't think I properly signed up for the list before sending this earlier so I'm not sure if the email was "lost" or not. First I want to thank everyone involved in this project for such a high quality library. This is surely open-source done right! 
I am running the 12-2-2011 build of Live555. When I run RTSPClient::sendSetParameter(...) with a session that is running UDP to a live555 server implementation, the server parses and successfully runs RTSPClientSession::handleCmd_SET_PARAMETER (overridden in a subclass). If I run RTSPClient::sendSetParameter(...) with a session that is running via TCP, it never gets to RTSPClientSession::handleCmd_SET_PARAMETER. Sending trick-play commands seems to work fine in both TCP and UDP. I've searched the email list and found a few issues similar to this that were fixed, but couldn't find anyone reporting this afterwards. I can provide more debug details if needed, but it seems RTSPServer::RTSPClientSession::handleRequestBytes(...) doesn't successfully parse the full SET_PARAMETER command when using TCP (or this is a symptom of something else). Thanks for any help and again, for the wonderful code! -Jer -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Dec 19 15:44:37 2011 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 19 Dec 2011 15:44:37 -0800 Subject: [Live-devel] SET_PARAMETER doesn't get handled in the RTSPServer over TCP In-Reply-To: <80C795F72B3CB241A9256DABF0A04EC5F96DDA@SN2PRD0702MB101.namprd07.prod.outlook.com> References: <80C795F72B3CB241A9256DABF0A04EC5F96DDA@SN2PRD0702MB101.namprd07.prod.outlook.com> Message-ID: <1EB78808-10ED-490E-BCA1-6A614FE88CEB@live555.com> > I am running the 12-2-2011 build of Live555. When I run RTSPClient::sendSetParameter(...) with a session that is running UDP to a live555 server implementation, the server parses and successfully runs RTSPClientSession::handleCmd_SET_PARAMETER (overridden in a subclass). If I run RTSPClient::sendSetParameter(...) with a session that is running via TCP, it never gets to RTSPClientSession::handleCmd_SET_PARAMETER. Sending trick-play commands seems to work fine in both TCP and UDP. That's strange.
(It's especially strange that other commands - e.g., "PLAY" - work OK for you in RTP-over-TCP mode, but that "SET_PARAMETER" does not.) > I can provide more debug details if needed Yes, please add

#define DEBUG 1

to the start of "liveMedia/RTSPServer.cpp", recompile and rerun your server, and access it both with a RTP-over-UDP client and with a RTP-over-TCP client. Please send us the debugging output in each case. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From ken at starseedsoft.com Mon Dec 19 17:10:40 2011 From: ken at starseedsoft.com (Ken Dunne) Date: Mon, 19 Dec 2011 17:10:40 -0800 (PST) Subject: [Live-devel] question regarding receiving & unpacking TS over RTP, generated from testH264VideoToTransportStream Message-ID: <1324343440.41379.YahooMailNeo@web1204.biz.mail.gq1.yahoo.com> I've used the example "testH264VideoToTransportStream" as a basis for generating a H264-TS-over-RTP stream of packets. I have used Wireshark to view the stream of RTP packets, and they appear to be correctly filled with an integer number of 188-byte Transport-Stream packets. I am trying to further validate this stream, and have used "testMPEG1or2VideoReceiver.cpp", and modified it to use the MP2T format (33) in the call to:

      sessionState.source = MPEG1or2VideoRTPSource::createNew( *env, &rtpGroupsock, 33, 90000 );

This saves a file, but I am unable to play it with any player that I know of. Does anyone have any suggestions? Ken -------------- next part -------------- An HTML attachment was scrubbed...
URL: From Jeremiah.Morrill at econnect.tv Mon Dec 19 17:25:29 2011 From: Jeremiah.Morrill at econnect.tv (Jer Morrill) Date: Tue, 20 Dec 2011 01:25:29 +0000 Subject: [Live-devel] SET_PARAMETER doesn't get handled in the RTSPServer over TCP In-Reply-To: <1EB78808-10ED-490E-BCA1-6A614FE88CEB@live555.com> References: <80C795F72B3CB241A9256DABF0A04EC5F96DDA@SN2PRD0702MB101.namprd07.prod.outlook.com> <1EB78808-10ED-490E-BCA1-6A614FE88CEB@live555.com> Message-ID: <80C795F72B3CB241A9256DABF0A04EC5F96E11@SN2PRD0702MB101.namprd07.prod.outlook.com> Thanks for the quick response! I have attached client debug output and server debug output text files (hope that's OK for this list). These are from the same RTSP session. Though RTSPServer::RTSPClientSession::handleRequestBytes(...) does log "parseRTSPRequestString() succeed...", it does not pass the next check:

if (ptr + newBytesRead < tmpPtr + 2 + contentLength) break; // we still need more data; subsequent reads will give it to us

At the point of the content-length check, the buffer looks like this:

SET_PARAMETER rtsp://127.0.0.1/media?dev=1&source=archive&startTime=129661319116046065/ RTSP/1.0
CSeq: 6
User-Agent: HJT RTSP Client (LIVE555 Streaming Media v2011.12.02)
Session: CA195293
Content-Length: 30

The complete command should also have (\r's and \n's excluded):

GotoTime: 1234567890

It does get these "GotoTime..." bytes in subsequent calls to "handleRequestBytes", but the server never successfully parses the SET_PARAMETER command in RTP over TCP. Totally weird - other messages get through just fine. Thanks again for all your help! Hope this is reproducible for you, as I hate to send anyone on a wild goose chase!
-Jer From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Monday, December 19, 2011 3:45 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] SET_PARAMETER doesn't get handled in the RTSPServer over TCP I am running the 12-2-2011 build of Live555. When I run RTSPClient::sendSetParameter(...) with a session that is running UDP to a live555 server implementation, the server parses and successfully runs the RTSPClientSession:: handleCmd_SET_PARAMETER (overridden in a subclass). If I run RTSPClient:sendSetParameter(...) with a session that is running via TCP, it does not ever get to the RTSPClientSession::handleCmd_SET_PARAMETER. Seems sending trickplay commands work fine in both TCP and UDP. That's strange. (It's especially strange that other commands - e.g., "PLAY" - work OK for you in RTP-over-TCP mode, but that "SET_PARAMETER" does not.) I can provide more debug details if needed Yes, please add #define DEBUG 1 to the start of "liveMedia/RTSPServer.cpp", recompile and rerun your server, and access it both with a RTP-over-UDP client, and with a RTP-over-TCP client. Please send us the debugging output in each case. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: clientdebugout.txt URL: -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... 
Name: serverdebugout.txt URL: From finlayson at live555.com Mon Dec 19 19:35:59 2011 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 19 Dec 2011 19:35:59 -0800 Subject: [Live-devel] question regarding receiving & unpacking TS over RTP, generated from testH264VideoToTransportStream In-Reply-To: <1324343440.41379.YahooMailNeo@web1204.biz.mail.gq1.yahoo.com> References: <1324343440.41379.YahooMailNeo@web1204.biz.mail.gq1.yahoo.com> Message-ID: > I've used the example "testH264VideoToTransportStream" as a basis for generating a H264-TS-over-RTP stream of packets. > I have used Wireshark to view the stream of RTP packets, and they appear to be correctly filled with an integer number of 188-byte Transport-Stream packets. > > I am trying to further validate this stream, and have used "testMPEG1or2VideoReceiver.cpp", and modified it to use the MP2T format (33) in the call to: > > sessionState.source = MPEG1or2VideoRTPSource::createNew( *env, &rtpGroupsock, 33, 90000 ); That's incorrect, because "MPEG1or2VideoRTPSource" is used for receiving MPEG *Elementary Stream* video over RTP. To receive a MPEG *Transport Stream* over RTP, you should instead be calling:

SimpleRTPSource::createNew(*env, &rtpGroupsock, 33, 90000, "video/MP2T", 0, False);

Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed...
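A minimal receiver along the lines Ross describes might look like this. It is a sketch modeled on the sessionState/afterPlaying structure of the testMPEG1or2VideoReceiver test program; the "received.ts" filename is an arbitrary choice.

```cpp
// Receive a MPEG Transport Stream over RTP (static payload format 33) and
// write it to a file. SimpleRTPSource, not MPEG1or2VideoRTPSource, is the
// appropriate source class for Transport Stream data.
sessionState.source =
    SimpleRTPSource::createNew(*env, &rtpGroupsock,
                               33 /*payload format*/, 90000 /*timestamp freq*/,
                               "video/MP2T", 0, False);

// The output is a raw Transport Stream; a ".ts" suffix helps players
// (e.g. VLC) recognize it:
sessionState.sink = FileSink::createNew(*env, "received.ts");
sessionState.sink->startPlaying(*sessionState.source, afterPlaying, NULL);
```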
URL: From finlayson at live555.com Mon Dec 19 23:54:05 2011 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 19 Dec 2011 23:54:05 -0800 Subject: [Live-devel] SET_PARAMETER doesn't get handled in the RTSPServer over TCP In-Reply-To: <80C795F72B3CB241A9256DABF0A04EC5F96E11@SN2PRD0702MB101.namprd07.prod.outlook.com> References: <80C795F72B3CB241A9256DABF0A04EC5F96DDA@SN2PRD0702MB101.namprd07.prod.outlook.com> <1EB78808-10ED-490E-BCA1-6A614FE88CEB@live555.com> <80C795F72B3CB241A9256DABF0A04EC5F96E11@SN2PRD0702MB101.namprd07.prod.outlook.com> Message-ID: OK, I've now installed a new version (2011.12.20) of the code that fixes this problem. (The problem occurred only with commands - like "SET_PARAMETER" and "GET_PARAMETER" - that have a "Content-Length:" header, and only when the entire command is not read at once - which is the case for RTP-over-TCP streams.) Thanks for the bug report. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsaintmartin at mediabroadcast-t.com Tue Dec 20 00:53:06 2011 From: bsaintmartin at mediabroadcast-t.com (Boris Saint-Martin) Date: Tue, 20 Dec 2011 09:53:06 +0100 Subject: [Live-devel] Write the stream in a file with openRTSP Message-ID: Hi, I would like to use openRTSP to record an IP camera stream on my PC. I managed to connect using this kind of command: openrtsp.exe -4 rtsp://192.168.0.200/rtsph264 but I don't know how to convert the received stream into a video file. Can someone help me? Cheers Boris Saint-Martin -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Dec 20 00:58:46 2011 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 20 Dec 2011 00:58:46 -0800 Subject: [Live-devel] Write the stream in a file with openRTSP In-Reply-To: References: Message-ID: > I would like to use openRTSP to record an IP camera stream on my PC.
> I managed to connect using this kind of command: openrtsp.exe -4 rtsp://192.168.0.200/rtsph264 > but I don't know how to convert the received stream into a video file. The "-4" (and "-q") options output to 'stdout'. So you will need to redirect 'stdout' to a file. And, as explained in the documentation http://www.live555.com/openRTSP/#quicktime you should also use the "-w <width>", "-h <height>" and "-f <frame-rate>" options. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From yangzx at acejet.com.cn Tue Dec 20 01:22:01 2011 From: yangzx at acejet.com.cn (=?gb2312?B?0e7Wvs/p?=) Date: Tue, 20 Dec 2011 17:22:01 +0800 Subject: [Live-devel] how to use openrtsp get nal unit and video config info Message-ID: <41FC1A94FE13684D8249B089C38119700F68C7@mailserver-nj1.njacejet.com> Dear all: I want to use live555 openRTSP to get NAL data from an H.264 encoding device. I have looked at the live555 testProgs playCommon.cpp code; it uses H264VideoFileSink. My questions: 1) I hope to use playCommon to get the NAL data rather than write a file, so must I write my own H264 video sink - yes or no? 2) If I use my own H264 video sink to get the NAL data, how do I get the video config info, i.e. the video size and bitrate? Thank you! From finlayson at live555.com Tue Dec 20 01:25:39 2011 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 20 Dec 2011 01:25:39 -0800 Subject: [Live-devel] how to use openrtsp get nal unit and video config info In-Reply-To: <41FC1A94FE13684D8249B089C38119700F68C7@mailserver-nj1.njacejet.com> References: <41FC1A94FE13684D8249B089C38119700F68C7@mailserver-nj1.njacejet.com> Message-ID: <43EF052A-649B-4A73-B049-0934064ECCF7@live555.com> > I want to use live555 openRTSP to get NAL data from an H.264 encoding device. I have looked at the live555 testProgs playCommon.cpp code; it uses H264VideoFileSink. My questions: > 1) I hope to use playCommon to get the NAL data rather than write a file, so must I write my own H264 video sink - yes or no?
Yes, if you want to do something with the incoming NAL units - other than write them to a file - then you will need to write your own "MediaSink" subclass to do this (instead of using "H264VideoFileSink"). > 2) If I use my own H264 video sink to get the NAL data, how do I get the video config info, i.e. the video size and bitrate? First, you get the 'sprop-parameter-sets' configuration string from the stream's SDP description, by calling

subsession->fmtp_spropparametersets()

This returns an ASCII string that you can parse with the "parseSPropParameterSets()" function, to get a set of (binary) NAL units (usually, SPS and PPS). (Note, for example, the implementation of "H264VideoFileSink", which calls "parseSPropParameterSets()".) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From david.myers at panogenics.com Tue Dec 20 02:32:05 2011 From: david.myers at panogenics.com (David J Myers) Date: Tue, 20 Dec 2011 10:32:05 -0000 Subject: [Live-devel] Live555 EventLoop Crash Message-ID: <005b01ccbf02$9f092c00$dd1b8400$@myers@panogenics.com> Hi Ross, >> I've tried increasing OutPacketBuffer::maxSize from 1000000 to 5000000 (5 million) but this has no effect. Am I just trying to stream too much data, too quickly? >No, the problem has nothing to do with 'speed'. There's a bug in your code somewhere. Sorry. Would you please take another look at this? My debug below shows that something (not OutPacketBuffer::maxSize) is limiting the output buffer to 150k. Interestingly, this is the BANK_SIZE in StreamParser.cpp, and I know there have been changes around this in the latest release to avoid blowing up on exceeding this value. Whatever I set OutPacketBuffer::maxSize to, I only ever seem to get 150k of frame buffer.
deliverFrame(): Stream:0, fFrameSize:194476, fNumTruncatedBytes:44478
deliverFrame(): Stream:0, fFrameSize:221966, fNumTruncatedBytes:71967
deliverFrame(): Stream:0, fFrameSize:186214, fNumTruncatedBytes:36216
deliverFrame(): Stream:0, fFrameSize:193011, fNumTruncatedBytes:43012
deliverFrame(): Stream:0, fFrameSize:197736, fNumTruncatedBytes:47738
deliverFrame(): Stream:0, fFrameSize:210813, fNumTruncatedBytes:60816
deliverFrame(): Stream:0, fFrameSize:204507, fNumTruncatedBytes:54509

Thanks and regards - David -------------- next part -------------- An HTML attachment was scrubbed... URL: From volker.marks at outstanding-solutions.de Tue Dec 20 04:31:47 2011 From: volker.marks at outstanding-solutions.de (Volker Marks) Date: Tue, 20 Dec 2011 13:31:47 +0100 Subject: [Live-devel] Problem with RTPTransmissionStats Message-ID: <000901ccbf13$58bee0d0$0a3ca270$@outstanding-solutions.de> Hi Ross, first of all, thanks for your great library. I'm using it for an application running under Windows XP. (I know it's not your favorite OS.) On one machine ("A") my application multicasts two video streams and one audio stream. On another machine ("B") the application connects to "A", receives and renders the three streams, and also multicasts two video streams and one audio stream, which are being received and rendered by "A". Everything works fine so far, except the RTPTransmissionStats: if both machines use the same RTCP/RTP ports, the TransmissionStats are incorrect or empty. If both instances use different ports, the stats work really fine. Unfortunately, I need to have the same ports on both machines. I guess that the problem is inside "RTCPInstance::incomingReportHandler1", but I was not able to find the issue. Can you give me a hint on how to get the TransmissionStats in my scenario? Thanks in advance Volker Marks -------------- next part -------------- An HTML attachment was scrubbed...
URL: From yangzx at acejet.com.cn Tue Dec 20 06:34:19 2011 From: yangzx at acejet.com.cn (=?gb2312?B?0e7Wvs/p?=) Date: Tue, 20 Dec 2011 22:34:19 +0800 Subject: [Live-devel] how to use openrtsp get nal unit and video configinfo References: <41FC1A94FE13684D8249B089C38119700F68C7@mailserver-nj1.njacejet.com> <43EF052A-649B-4A73-B049-0934064ECCF7@live555.com> Message-ID: <41FC1A94FE13684D8249B089C38119700F68CA@mailserver-nj1.njacejet.com> ________________________________ Thank you for answering my question. Another question: how do I get each NAL unit's timestamp? Thank you! I want to use live555 openRTSP to get NAL data from an H.264 encoding device. I have looked at the live555 testProgs playCommon.cpp code; it uses H264VideoFileSink. My questions: 1) I hope to use playCommon to get the NAL data rather than write a file, so must I write my own H264 video sink - yes or no? Yes, if you want to do something with the incoming NAL units - other than write them to a file - then you will need to write your own "MediaSink" subclass to do this (instead of using "H264VideoFileSink"). 2) If I use my own H264 video sink to get the NAL data, how do I get the video config info, i.e. the video size and bitrate? First, you get the 'sprop-parameter-sets' configuration string from the stream's SDP description, by calling subsession->fmtp_spropparametersets() This returns an ASCII string that you can parse with the "parseSPropParameterSets()" function, to get a set of (binary) NAL units (usually, SPS and PPS). (Note, for example, the implementation of "H264VideoFileSink", which calls "parseSPropParameterSets()".) Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Tue Dec 20 07:58:29 2011 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 20 Dec 2011 07:58:29 -0800 Subject: [Live-devel] Live555 EventLoop Crash In-Reply-To: <005b01ccbf02$9f092c00$dd1b8400$@myers@panogenics.com> References: <005b01ccbf02$9f092c00$dd1b8400$@myers@panogenics.com> Message-ID: > >No, the problem has nothing to do with 'speed'.
There's a bug in your code somewhere. Sorry. > > Would you please take another look at this? Well, I'm not sure what the "this" is that I can take a look at, because the problem is with your custom code, which I know nothing about. But in your first message, you talked about "5MP" images. Did you mean "5 MByte" images, or "5 Megapixel" images? In any case, what codec is this? I.e., what kind of data are you trying to stream? I suspect that there's something fundamentally wrong in the way that you're trying to stream whatever kind of data you're trying to stream. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Dec 20 08:01:00 2011 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 20 Dec 2011 08:01:00 -0800 Subject: [Live-devel] how to use openrtsp get nal unit and video configinfo In-Reply-To: <41FC1A94FE13684D8249B089C38119700F68CA@mailserver-nj1.njacejet.com> References: <41FC1A94FE13684D8249B089C38119700F68C7@mailserver-nj1.njacejet.com> <43EF052A-649B-4A73-B049-0934064ECCF7@live555.com> <41FC1A94FE13684D8249B089C38119700F68CA@mailserver-nj1.njacejet.com> Message-ID: > Another question: how do I get each NAL unit's timestamp? It will be the "presentationTime" parameter to your "afterGettingFrame()" function - i.e., the 'after getting' function that you passed in your sink's call to "itsUpstreamSource->getNextFrame()". Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed...
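The presentationTime delivery Ross describes can be sketched as follows. This is a non-authoritative sketch of a custom sink's 'after getting' callback; the class name MyH264VideoSink, the fBuffer member, and the handleNALUnit() helper are hypothetical stand-ins, while the callback signature itself is the standard one used throughout liveMedia.

```cpp
// Sketch: each invocation of the 'after getting' callback delivers one
// H.264 NAL unit plus its timestamp (the "presentationTime" parameter).
void MyH264VideoSink::afterGettingFrame(void* clientData, unsigned frameSize,
                                        unsigned numTruncatedBytes,
                                        struct timeval presentationTime,
                                        unsigned /*durationInMicroseconds*/) {
  MyH264VideoSink* sink = (MyH264VideoSink*)clientData;
  // fBuffer (hypothetical) is the buffer the sink passed to getNextFrame();
  // presentationTime is the timestamp of this NAL unit:
  sink->handleNALUnit(sink->fBuffer, frameSize, presentationTime);
  sink->continuePlaying(); // request the next NAL unit
}
```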
URL: From jshanab at smartwire.com Tue Dec 20 09:48:45 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Tue, 20 Dec 2011 17:48:45 +0000 Subject: [Live-devel] DeliverPMT Message-ID: <615FD77639372542BF647F5EBAA2DBC20B1AA894@IL-BOL-EXCH01.smartwire.com> I am trying to force the MPEG2TransportStreamMultiplexor class to send a PAT, then a PMT, then a keyframe, then the diff frames, exactly in that order - in contrast to the default of once per 100 and once per 500 frames. I think this is what I will need to simplify my HLS segmenter filter that follows. I have made a copy of this class and am modifying it for use. But I keep getting the frames out of order and I don't understand why. Inside MPEG2TransportStreamMultiplexor::doGetNextFrame, when I detect the first keyframe, I call deliverPATPacket(), then deliverPMTPacket(), then deliverDataToClient for the frames. I have had to put afterGetting(this) after each call to keep it progressing, but when I look at the output I end up with the PAT packet, then a whole bunch of video packets, then the PMT packet - the PMT gets delayed. It seems like I need to put all 3 in the buffer and then call afterGetting(this), but it just stops on the PMT packet. The muxer is slightly different from the other filters because of the one-to-many frame flow and the buffer recycling, and I am a bit lost. (I need to draw a sequence diagram :( ) Any ideas what I may be missing here? How can the frames seem to get out of order? Is it per programID because of the sourceRec, and that makes them independent? -------------- next part -------------- An HTML attachment was scrubbed... URL: From lanlamer at gmail.com Fri Dec 16 19:13:51 2011 From: lanlamer at gmail.com (Johnnie Au-Yeung) Date: Sat, 17 Dec 2011 11:13:51 +0800 Subject: [Live-devel] How to safely close the openRtsp? Message-ID: Hi, ALL: openRTSP has been designed to be closed abruptly by CTRL+C or the console window's close button. 
But when developing a GUI streaming client based on openRTSP, we may need to close openRTSP safely and do some related cleanup work. Can you please give me some suggestions on what I should or should not do? In my opinion, if the client wants to close on its own initiative, the following may be necessary: 1) write any remaining received video data to the file. 2) send TEARDOWN to the server to notify it that the client is going to close. 3) release related resources such as allocated memory and objects. 4) set the watchVariable to a non-zero value to exit doEventLoop. If the server has finished streaming, a BYE may be sent to the client, and we can be notified via the onSourceClosure handler that was set by getNextFrame() in continuePlaying() in the MediaSink, and then do the following: 1) write the file-format trailer to the storage file. 2) notify the GUI that the server asked openRTSP to close. Any suggestions would be appreciated! Thanks very much! -------------- next part -------------- An HTML attachment was scrubbed... URL: From aduran at tredess.com Wed Dec 21 01:29:49 2011 From: aduran at tredess.com (=?ISO-8859-1?Q?=C1ngel_Dur=E1n_Alcaide?=) Date: Wed, 21 Dec 2011 10:29:49 +0100 Subject: [Live-devel] openRTSP + Apache + mediaServer Message-ID: <4EF1A70D.1090006@tredess.com> Hi, I am trying to receive an RTSP stream over HTTP with openRTSP, using Apache as a "proxy" (using mod_proxy and mod_proxy_http). 
My Apache configuration is the following:

    ProxyRequests Off
    Order deny,allow
    Allow from all
    ProxyPass /media/ http://127.0.0.1:8080/
    ProxyPassReverse /media/ http://127.0.0.1:8080/

When I send a request to the server, for example: ./openRTSP -T 80 rtsp://127.0.0.1/media/17569370928202, communication is not correctly established, and Apache reports "(70007)The timeout specified has expired: proxy: prefetch request body failed to 127.0.0.1:8080 (127.0.0.1) from", which could indicate some kind of problem with the live555 media server that is running on port 8080 to support RTSP over HTTP. I am using the 2011.11.02 version of live555. Is there any misconfiguration? Is there any kind of problem between live555 and the Apache server? Is there any other possibility to support RTSP over HTTP and a web service on port 80 on the same machine? Thanks in advance Ángel -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Dec 21 04:21:18 2011 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 21 Dec 2011 04:21:18 -0800 Subject: [Live-devel] openRTSP + Apache + mediaServer In-Reply-To: <4EF1A70D.1090006@tredess.com> References: <4EF1A70D.1090006@tredess.com> Message-ID: In general, I see no reason why RTSP-over-HTTP would not work through a HTTP proxy. However, I'm confused by your last statement: > Is there any other possibility to support RTSP over HTTP and a web service on port 80 on the same machine? By this, do you mean a RTSP-over-HTTP client (e.g., "openRTSP"), or a RTSP server (such as ours) that supports RTSP-over-HTTP? Obviously, a RTSP server cannot use port 80 for RTSP-over-HTTP if there is already a HTTP server on the same machine that also uses port 80. No 'proxying' can make that possible. 
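[Editor's note] Ross's point can be seen concretely: a TCP port can have only one listener, and (as he notes later in this thread) live555MediaServer therefore tries port 80, then 8000, then 8080. A self-contained POSIX sketch of that probing - illustrative only, the function name is ours, and the real server of course keeps the socket it successfully binds:

```cpp
#include <cstdint>
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>

// Try to bind a listening TCP socket to port 80, then 8000, then 8080
// (the same fallback order live555MediaServer uses); return the first
// port that is free, or -1 if all three are taken.
static int bindFirstFreePort() {
    const int candidates[] = {80, 8000, 8080};
    for (int port : candidates) {
        int fd = socket(AF_INET, SOCK_STREAM, 0);
        if (fd < 0) continue;
        sockaddr_in addr = {};
        addr.sin_family = AF_INET;
        addr.sin_addr.s_addr = htonl(INADDR_LOOPBACK);
        addr.sin_port = htons((uint16_t)port);
        if (bind(fd, (struct sockaddr*)&addr, sizeof addr) == 0) {
            close(fd);  // real code would keep fd and listen() on it
            return port;
        }
        close(fd);      // port in use (or, for 80, no privilege)
    }
    return -1;
}
```

With Apache already bound to port 80, this returns 8000 - which is why the two servers can coexist only on different ports.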
Note, though, that our server code (as illustrated by the "live555MediaServer" and "testOnDemandRTSPServer" applications) will not use a port number for RTSP-over-HTTP (or for just plain RTSP) if there is already another server on the same machine using that port. However, there's no reason why a RTSP-over-HTTP client (such as "openRTSP") can't access a *remote* RTSP server over port 80, even if there is already a HTTP server running on the client's machine. (Just as long as there isn't a HTTP server (using port 80) already running on the *remote* machine.) You should not need a proxy to do this. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From david.myers at panogenics.com Wed Dec 21 04:31:46 2011 From: david.myers at panogenics.com (David J Myers) Date: Wed, 21 Dec 2011 12:31:46 -0000 Subject: [Live-devel] Live555 EventLoop Crash Message-ID: <009701ccbfdc$818921a0$849b64e0$@myers@panogenics.com> Hi Ross, You wrote: >But in your first message, you talked about "5MP" images. Did you mean "5 MByte images", or "5 Megapixel images"? In any case, what codec is this? I.e., what kind of data are you trying to stream? Yes, I am trying to stream H.264 compressed live images from my camera sensor which are 2144x1944 pixels, so just under 5 megapixels. I'm using H264VideoStreamFramer directly and H264VideoRTPSink downstream. My deliver frame function looks like this, virtually copied from your DeviceSource sample, except for where we get the frame from our inter-process buffers. 
void StreamSource::deliverFrame()
{
    streamSubStream_t *substream = m_substream;
    FUNCTION_TRACE;
    if (!isCurrentlyAwaitingData() || (substream == NULL)) {
        TRACE("NoData, substream=0x%p\r\n", substream);
        return;
    }
    TRACE("Max Frame size = %u\r\n", fMaxSize);
    fNumTruncatedBytes = 0;
    //fDurationInMicroseconds = ; // not set
    // get the next frame buffer for streaming
    CStreamBuffer *pBuffer = substream->pStreamManager->GetNewestBufferForStreaming(substream->iSeqNumber);
    if (pBuffer == NULL) {
        TRACE("deliverFrame had no data\r\n");
        return; // no data, have to wait for main thread to signal Live555 to call us back later
    }
    substream->iSeqNumber = pBuffer->GetSequenceNumber(); // we have the frame so save sequence number ready for next one
    TRACE("deliverFrame(): Stream:%d, Sequence Number:%lu\r\n", substream->iStreamNo, substream->iSeqNumber);
    unsigned newFrameSize = pBuffer->GetDataSize();
    if (newFrameSize > fMaxSize) {
        fFrameSize = fMaxSize;
        fNumTruncatedBytes = newFrameSize - fMaxSize;
        INFO("deliverFrame(): Stream:%d, Frame size:%u, %u bytes truncated\r\n", substream->iStreamNo, newFrameSize, fNumTruncatedBytes);
    } else {
        fFrameSize = newFrameSize;
    }
    gettimeofday(&fPresentationTime, NULL);
    memmove(fTo, pBuffer->GetDataPtr(), fFrameSize);
    TRACE("deliverFrame %d bytes delivered\r\n", fFrameSize);
    // return buffer back to Stream Manager
    substream->pStreamManager->ReturnBufferAfterStreaming(pBuffer);
    FramedSource::afterGetting(this);
}

In my corresponding "pushFrame" function, I save the current frame from the encoder and then call signalNewFrame to signal the event to the scheduler, also as per the DeviceSource sample. However, whatever I set the OutPacketBuffer::maxSize to, I end up truncating frames in the above function. BANK_SIZE in StreamParser.cpp also seems to be critical. If I increase the BANK_SIZE value I get less frequent truncations but still can't stop them completely. If the problem is in my code, please could you suggest where else I should look? 
Best regards - David -------------- next part -------------- An HTML attachment was scrubbed... URL: From developpeur02 at kafemeeting.com Wed Dec 21 04:21:47 2011 From: developpeur02 at kafemeeting.com (Rodolophe Fouquet) Date: Wed, 21 Dec 2011 13:21:47 +0100 Subject: [Live-devel] SimpleRTPSource and SimpleRTPSink constructor parameter meaning Message-ID: Hi, We are developing an application that streams an MPEG-2 TS (H.264/AAC). We are wondering about the meaning of the boolean parameters of the SimpleRTPSink constructor and the SimpleRTPSource constructor. We are having some problems synchronizing audio and video, and demuxing the output stream (the demuxer just blocks after a random time).

SimpleRTPSource::createNew(*env, &(*rtpGroupSock), 33, 90000, "video/MP2T", 0, false);
SimpleRTPSink::createNew(*env, &(*rtpGroupSock), 33, 90000, "video", "mp2t", 1, true, false/*no 'M' bit*/);

We are wondering if changing those parameters could improve our situation, or if those defaults shouldn't be changed. Regards, Rodolphe Fouquet. -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Dec 21 04:48:11 2011 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 21 Dec 2011 04:48:11 -0800 Subject: [Live-devel] Live555 EventLoop Crash In-Reply-To: <009701ccbfdc$818921a0$849b64e0$@myers@panogenics.com> References: <009701ccbfdc$818921a0$849b64e0$@myers@panogenics.com> Message-ID: <5769B2E5-6338-447C-B296-F3C33AF83FBD@live555.com> > Yes, I am trying to stream H.264 compressed live images from my camera sensor which are 2144x1944 pixels, so just under 5 megapixels. I'm not sure what that translates into in terms of "bytes", but (as noted several times already on this mailing list) you really shouldn't be streaming a very large frame as a single NAL unit. Instead, if your encoder supports this, you should encode large frames into multiple NAL units ('slices'), and deliver/stream each NAL unit individually. 
> I'm using H264VideoStreamFramer That's your problem. Because your "StreamSource" object is delivering discrete NAL units (in this case, discrete frames, where each frame is a single NAL unit) - i.e., delivering one NAL unit at a time - then you should be using "H264VideoStreamDiscreteFramer". Just make sure that your encoded NAL units *do not* begin with a 'start code' (0x00000001 or 0x000001). ("H264VideoStreamFramer" is used when your input source is a *byte stream* - e.g., from a H.264 Elementary Stream video *file*.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Dec 21 04:55:05 2011 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 21 Dec 2011 04:55:05 -0800 Subject: [Live-devel] SimpleRTPSource and SimpleRTPSink constructor parameter meaning In-Reply-To: References: Message-ID: <0AB44E68-2046-4F6E-AFC7-EA58325CD546@live555.com> > We are developing an application that streams an MPEG-2 TS (H.264/AAC). We are wondering about the meaning of the boolean parameters of the SimpleRTPSink constructor and the SimpleRTPSource constructor. We are having some problems synchronizing audio and video, and demuxing the output stream (the demuxer just blocks after a random time). > > SimpleRTPSource::createNew(*env, &(*rtpGroupSock), 33, 90000, "video/MP2T", 0, false); > > SimpleRTPSink::createNew(*env, &(*rtpGroupSock), 33, 90000, "video", "mp2t", 1, true, false/*no 'M' bit*/); > > We are wondering if changing those parameters could improve our situation, or if those defaults shouldn't be changed. No, for receiving/sending MPEG Transport Stream data over RTP, those parameters are correct, and should not be changed. However, because your data is Transport Stream data (in which audio and video are multiplexed together into a single stream), your audio/video synchronization problem has absolutely nothing to do with our streaming code. 
You can verify this by taking your input Transport Stream (i.e. directly from your encoder), saving it to a file, and playing this file using a media player such as VLC. Any audio/video synchronization problems within a Transport Stream are caused by the *encoding* of the Transport Stream, not by the way in which it is streamed. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From aduran at tredess.com Wed Dec 21 05:48:54 2011 From: aduran at tredess.com (=?ISO-8859-1?Q?=C1ngel_Dur=E1n_Alcaide?=) Date: Wed, 21 Dec 2011 14:48:54 +0100 Subject: [Live-devel] openRTSP + Apache + mediaServer In-Reply-To: References: <4EF1A70D.1090006@tredess.com> Message-ID: <4EF1E3C6.9020408@tredess.com> Thanks, Ross, for your rapid answer. Basically, I want to have, on the same computer, a web server (in my example, Apache) and the live555 media server as the RTSP server; therefore I need Apache working as an HTTP proxy for RTSP requests (?)... but in my tests it did not work. (I don't know if I need to set up anything more in any part of the system.) My last statement was about whether another possibility exists to do that; for example: live555 having its own HTTP server, able to answer HTTP requests for serving web pages as well as RTSP requests, and thereby avoiding Apache. On 12/21/2011 01:21 PM, Ross Finlayson wrote: > In general, I see no reason why RTSP-over-HTTP would not work through > a HTTP proxy. > > However, I'm confused by your last statement: >> Is there any other possibility to support RTSP over HTTP and >> a web service on port 80 on the same machine? > > By this, do you mean a RTSP-over-HTTP client (e.g., "openRTSP"), or a > RTSP server (such as ours) that supports RTSP-over-HTTP? > > Obviously, a RTSP server cannot use port 80 for RTSP-over-HTTP if > there is already a HTTP server on the same machine that also uses port > 80. No 'proxying' can make that possible. 
> > Note, though, that our server code (as illustrated by the > "live555MediaServer" and "testOnDemandRTSPServer" applications) will > not use a port number for RTSP-over-HTTP (or for just plain RTSP) if > there is already another server on the same machine using that port. > > > However, there's no reason why a RTSP-over-HTTP client (such as > "openRTSP") can't access a *remote* RTSP server over port 80, even if > there is already a HTTP server running on the client's machine. (Just > as long as there isn't a HTTP server (using port 80) already running > on the *remote* machine.) You should not need a proxy to do this. > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From david.myers at panogenics.com Wed Dec 21 05:51:35 2011 From: david.myers at panogenics.com (David J Myers) Date: Wed, 21 Dec 2011 13:51:35 -0000 Subject: [Live-devel] Live555 EventLoop Crash Message-ID: <009f01ccbfe7$a7d27540$f7775fc0$@myers@panogenics.com> Hi Ross, >That's your problem. Because your "StreamSource" object is delivering discrete NAL units (in this case, discrete frames, where each frame is a single NAL unit) - i.e., delivering one NAL unit at a time - then you should be using "H264VideoStreamDiscreteFramer". >Just make sure that your encoded NAL units *do not* begin with a 'start code' (0x00000001 or 0x000001). >("H264VideoStreamFramer" is used when your input source is a *byte stream* - e.g., from a H.264 Elementary Stream video *file*.) So I've tried using H264VideoStreamDiscreteFramer and removing the first 4 bytes (which is always 00 00 00 01) from the encoded frame data, but this fails with the output:- Warning: Invalid 'nal_unit_type': 0. Does the NAL unit begin with a MPEG 'start code' by mistake? 
I'm guessing that my encoded output frame data is not just one NAL unit but a bunch of them. H264VideoStreamFramer copes with this (apart from the truncations). Can I go through the frame extracting the NAL units and sending them one by one? - David -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Dec 21 06:03:07 2011 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 21 Dec 2011 06:03:07 -0800 Subject: [Live-devel] openRTSP + Apache + mediaServer In-Reply-To: <4EF1E3C6.9020408@tredess.com> References: <4EF1A70D.1090006@tredess.com> <4EF1E3C6.9020408@tredess.com> Message-ID: > Basically, I want to have in the same computer a web server (in my example apache) and the live555 media server as RTSP server You CANNOT do this if these two servers use the same port number. I.e., if you already have an Apache web server running using port 80, then "live555MediaServer" cannot also use port 80. As I said before, no 'proxying' can make this possible. HOWEVER, "live555MediaServer" will automatically try to use other port numbers if its first choice (port 80) is already in use. It will first try port 80, then port 8000, then port 8080. Note the message that "live555MediaServer" prints out at the end, when it starts out: (We use port <port> for optional RTSP-over-HTTP tunneling, or for HTTP live streaming (for indexed Transport Stream files only).) In this case, <port> is the port number that your client (e.g., "openRTSP") will use to access the RTSP server using RTSP-over-HTTP streaming. If, however, "live555MediaServer" prints out the following message: (RTSP-over-HTTP tunneling is not available.) then that means that other server(s) (probably your Apache server) are already using all three port numbers: 80, 8000, and 8080. In that case, you can modify the "live555MediaServer.cpp" code to use a different port number instead. Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Dec 21 06:05:53 2011 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 21 Dec 2011 06:05:53 -0800 Subject: [Live-devel] Live555 EventLoop Crash In-Reply-To: <009f01ccbfe7$a7d27540$f7775fc0$@myers@panogenics.com> References: <009f01ccbfe7$a7d27540$f7775fc0$@myers@panogenics.com> Message-ID: <1B46EDD7-F240-4547-8D64-A81750027CBA@live555.com> > So I've tried using H264VideoStreamDiscreteFramer and removing the first 4 bytes (which is always 00 00 00 01) from the encoded frame data, but this fails with the output:- > Warning: Invalid 'nal_unit_type': 0. OK, that means that your encoder is producing invalid H.264 output. > I'm guessing that my encoded output frame data is not just one NAL unit but a bunch of them. That would not explain the "Invalid 'nal_unit_type': 0" error. There is something else wrong with your encoded data. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From david.myers at panogenics.com Wed Dec 21 06:22:02 2011 From: david.myers at panogenics.com (David J Myers) Date: Wed, 21 Dec 2011 14:22:02 -0000 Subject: [Live-devel] Live555 EventLoop Crash Message-ID: <00a401ccbfeb$e9411c80$bbc35580$@myers@panogenics.com> >> So I've tried using H264VideoStreamDiscreteFramer and removing the first 4 bytes (which is always 00 00 00 01) from the encoded frame data, but this fails with the output:- >> Warning: Invalid 'nal_unit_type': 0. >OK, that means that your encoder is producing invalid H.264 output. >> I'm guessing that my encoded output frame data is not just one NAL unit but a bunch of them. >That would not explain the "Invalid 'nal_unit_type': 0" error. There is something else wrong with your encoded data. 
There can't be much wrong with my encoded data because it displays fine in VLC (apart from some frames being truncated) when I use H264VideoStreamFramer instead of H264VideoStreamDiscreteFramer. -David -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Dec 21 06:33:45 2011 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 21 Dec 2011 06:33:45 -0800 Subject: [Live-devel] Live555 EventLoop Crash In-Reply-To: <00a401ccbfeb$e9411c80$bbc35580$@myers@panogenics.com> References: <00a401ccbfeb$e9411c80$bbc35580$@myers@panogenics.com> Message-ID: > There can't be much wrong with my encoded data because it displays fine in VLC (apart from some frames being truncated) when I use H264VideoStreamFramer instead of H264VideoStreamDiscreteFramer. Well, a "nal_unit_type" of 0 is definitely wrong. But if only a few of the encoded NAL units have this (incorrect) value, then that should not prevent your streaming from working, if you use "H264VideoStreamDiscreteFramer". The message "Warning: Invalid 'nal_unit_type': 0" that "H264VideoStreamDiscreteFramer" prints is merely that - a warning. It will not cause your streaming to 'fail'. But to clear this up, please create an example file (including start codes) that "displays fine in VLC (apart from some frames being truncated)", put it online, and send us the URL, so we can download and test it for ourselves. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
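[Editor's note] For anyone following this thread: the 'nal_unit_type' Ross mentions is carried in the low five bits of the first byte after each Annex-B start code, so a frame buffer can be inspected - and split into the discrete NAL units that "H264VideoStreamDiscreteFramer" expects (without their start codes) - with a few lines of code. This is a self-contained sketch, not LIVE555 code; all names are ours.

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

struct NalUnit { size_t offset; size_t size; };

// Split an Annex-B byte stream on 00 00 01 / 00 00 00 01 start codes.
// The returned units do NOT include the start code bytes.
static std::vector<NalUnit> splitAnnexB(const uint8_t* data, size_t len) {
    std::vector<NalUnit> units;
    size_t i = 0, unitStart = SIZE_MAX;
    while (i + 2 < len) {
        if (data[i] == 0 && data[i + 1] == 0 && data[i + 2] == 1) {
            // Fold a preceding zero into the start code (4-byte form).
            size_t scStart = (i > 0 && data[i - 1] == 0) ? i - 1 : i;
            if (unitStart != SIZE_MAX)
                units.push_back({unitStart, scStart - unitStart});
            unitStart = i + 3;
            i += 3;
        } else {
            ++i;
        }
    }
    if (unitStart != SIZE_MAX && unitStart <= len)
        units.push_back({unitStart, len - unitStart});
    return units;
}

// nal_unit_type: low 5 bits of the first byte of the NAL unit.
// 7 = SPS, 8 = PPS, 5 = IDR slice, 1 = non-IDR slice, 0 = invalid.
static int nalUnitType(const uint8_t* nal) { return nal[0] & 0x1F; }
```

A buffer laid out like the I-frames described in this thread (SPS, PPS, IDR slice) splits into three units of types 7, 8, and 5.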
URL: From david.myers at panogenics.com Wed Dec 21 06:49:19 2011 From: david.myers at panogenics.com (David J Myers) Date: Wed, 21 Dec 2011 14:49:19 -0000 Subject: [Live-devel] Live555 EventLoop Crash Message-ID: <00a901ccbfef$b915e2d0$2b41a870$@myers@panogenics.com> On further examination of my encoded frame data, it looks like an I-frame consists of 3 NAL units, each preceded by 00 00 00 01; the first NALU is 17 bytes long, the second NALU is 9 bytes long, and the 3rd NALU is the rest of the frame size. Each P-frame is just one NALU. My code is now using H264VideoStreamDiscreteFramer and sending each NAL unit one by one. So each I-Frame is sent with 3 calls to afterGetting(this). Each P-frame takes only one call to afterGetting(this). However, this does not seem to be acceptable to a VLC client as a decent RTSP stream. Is this not an acceptable way to drive the stream? -David -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Dec 21 06:59:02 2011 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 21 Dec 2011 06:59:02 -0800 Subject: [Live-devel] Live555 EventLoop Crash In-Reply-To: <00a901ccbfef$b915e2d0$2b41a870$@myers@panogenics.com> References: <00a901ccbfef$b915e2d0$2b41a870$@myers@panogenics.com> Message-ID: <1B774A12-014D-4058-BF69-08C33F4FAE13@live555.com> On Dec 21, 2011, at 6:49 AM, David J Myers wrote: > On further examination of my encoded frame data, it looks like an I-frame consists of 3 NAL units, each preceded by 00 00 00 01, the first NALU is 17 bytes long, the second NALU is 9 bytes long, and the 3rd NALU is the rest of the frame size. Each P-frame is just one NALU. > My code is now using H264VideoStreamDiscreteFramer and sending each NAL unit one by one. So each I-Frame is sent with 3 calls to afterGetting(this). You can do this only if each call to "afterGetting(this)" corresponds to exactly one earlier call to "doGetNextFrame()". 
I.e., you *must not* call "afterGetting(this)" more than once for each call to "doGetNextFrame()". I.e., if you want to deliver these three NAL units - one at a time - to the downstream object (in this case, a "H264VideoStreamDiscreteFramer"), then you must really do so 'one at a time'. Each call to "doGetNextFrame()" must be followed by one (and only one) data delivery, followed by a call to "FramedSource::afterGetting()". If you do this properly, though, then this should produce a proper RTSP/RTP stream. (However, I suggest that you test this first with "openRTSP", rather than VLC.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jshanab at smartwire.com Wed Dec 21 07:07:01 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Wed, 21 Dec 2011 15:07:01 +0000 Subject: [Live-devel] invalid ts stream Message-ID: <615FD77639372542BF647F5EBAA2DBC20B1AACCB@IL-BOL-EXCH01.smartwire.com> I am trying to create a ts file from h264 video from a live source. 
Everything looks identical to streams that work from a different muxer, but when these streams are segmented and served to the Apple mediastreamvalidator, it complains with "error: (-12976) unable to parse segment as either MPEG-2 ts stream ..." You need to ask about that on an Apple mailing list (not this one). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From david.myers at panogenics.com Wed Dec 21 07:22:14 2011 From: david.myers at panogenics.com (David J Myers) Date: Wed, 21 Dec 2011 15:22:14 -0000 Subject: [Live-devel] Live555 EventLoop Crash Message-ID: <00b101ccbff4$51e52940$f5af7bc0$@myers@panogenics.com> Hi Ross, >> On further examination of my encoded frame data, it looks like an I-frame consists of 3 NAL units, each preceded by 00 00 00 01, the first NALU is 17 bytes long, the second NALU is 9 bytes long, and the 3rd NALU is the rest of the frame size. Each P-frame is just one NALU. >> My code is now using H264VideoStreamDiscreteFramer and sending each NAL unit one by one. So each I-Frame is sent with 3 calls to afterGetting(this). >You can do this only if each call to "afterGetting(this)" corresponds to exactly one earlier call to "doGetNextFrame()". I.e., you *must not* call "afterGetting(this)" more than once for each call to "doGetNextFrame()". >I.e., if you want to deliver these three NAL units - one at a time - to the downstream object (in this case, a "H264VideoStreamDiscreteFramer"), then you must really do so 'one at a time'. Each call to "doGetNextFrame()" must be followed by one (and only one) data delivery, followed by a call to "FramedSource::afterGetting()". >If you do this properly, though, then this should produce a proper RTSP/RTP stream. (However, I suggest that you test this first with "openRTSP", rather than VLC.) Ok, understood. How do I get these extra calls to doGetNextFrame()? 
Currently I'm getting one call to doGetNextFrame(), but I'm then sending all three NAL units for the frame which, as you say, doesn't produce a decent stream. - David -------------- next part -------------- An HTML attachment was scrubbed... URL: From jshanab at smartwire.com Wed Dec 21 07:30:41 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Wed, 21 Dec 2011 15:30:41 +0000 Subject: [Live-devel] invalid ts stream In-Reply-To: <0A6525D1-4E5C-476F-8238-61DF9811B147@live555.com> References: <615FD77639372542BF647F5EBAA2DBC20B1AACCB@IL-BOL-EXCH01.smartwire.com> <0A6525D1-4E5C-476F-8238-61DF9811B147@live555.com> Message-ID: <615FD77639372542BF647F5EBAA2DBC20B1AAD2A@IL-BOL-EXCH01.smartwire.com> Trust me I have done my due diligence on this before asking here. After a whole week, it appears to be down to the headers in the ts stream created by the MPEG2TransportStreamMultiplexor. The deliverPAT and the deliverPMT functions. Aplle is very very picky and vauge :( From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Wednesday, December 21, 2011 9:17 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] invalid ts stream I am trying to create a ts file from h264 video from a live source. Everything looks identical to streams that work from a different muxer but when these streams are segmented and served to the Apple mediastreamvalidator, it complains with "error: (-12976) unable to parse segment as either MPEG-2 ts stream ...) You need to ask about that on an Apple mailing list (not this one). Ross Finlayson Live Networks, Inc. http://www.live555.com/ ________________________________ No virus found in this message. Checked by AVG - www.avg.com Version: 2012.0.1890 / Virus Database: 2109/4694 - Release Date: 12/21/11 -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From jshanab at smartwire.com Wed Dec 21 07:52:29 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Wed, 21 Dec 2011 15:52:29 +0000 Subject: [Live-devel] Live555 EventLoop Crash In-Reply-To: <00b101ccbff4$51e52940$f5af7bc0$@myers@panogenics.com> References: <00b101ccbff4$51e52940$f5af7bc0$@myers@panogenics.com> Message-ID: <615FD77639372542BF647F5EBAA2DBC20B1AAD7C@IL-BOL-EXCH01.smartwire.com> This is the exact stream format I get from my encoders. Given NAL types 7 = SPS, 8 = PPS, 5 = IDR slices, and 1 = diff frames: I see [7][8][5][1][1][1][1][1] on 6 fps streams of less than a megapixel; I see [7][8][5][5][5][1][1][1][1][1] on 6 fps multi-megapixel streams. Some streams do NOT have the SPS and PPS in the stream, so I data-mine them from the SDP packet and remember them for insertion into the stream. This makes libavcodec happy. I always send the decoder an SPS + PPS + IDR slice as an aggregated frame (00 00 01 in front of each; it needs that delimiter). I have a filter that does this aggregation for me. From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of David J Myers Sent: Wednesday, December 21, 2011 9:22 AM To: live-devel at ns.live555.com Subject: Re: [Live-devel] Live555 EventLoop Crash Hi Ross, >> On further examination of my encoded frame data, it looks like an I-frame consists of 3 NAL units, each preceded by 00 00 00 01, the first NALU is 17 bytes long, the second NALU is 9 bytes long, and the 3rd NALU is the rest of the frame size. Each P-frame is just one NALU. >> My code is now using H264VideoStreamDiscreteFramer and sending each NAL unit one by one. So each I-Frame is sent with 3 calls to afterGetting(this). >You can do this only if each call to "afterGetting(this)" corresponds to exactly one earlier call to "doGetNextFrame()". I.e., you *must not* call "afterGetting(this)" more than once for each call to "doGetNextFrame()". 
>I.e., if you want to deliver these three NAL units - one at a time - to the downstream object (in this case, a "H264VideoStreamDiscreteFramer"), then you must really do so 'one at a time'. Each call to "doGetNextFrame()" must be followed by one (and only one) data delivery, followed by a call to "FramedSource::afterGetting()". >If you do this properly, though, then this should produce a proper RTSP/RTP stream. (However, I suggest that you test this first with "openRTSP", rather than VLC.) Ok, understood. How do I get these extra calls to doGetNextFrame()? Currently I'm getting one call to doGetNextFrame(), but I'm then sending all three NAL units for the frame which, as you say, doesn't produce a decent stream. - David -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Dec 21 13:43:57 2011 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 21 Dec 2011 13:43:57 -0800 Subject: [Live-devel] invalid ts stream In-Reply-To: <615FD77639372542BF647F5EBAA2DBC20B1AACCB@IL-BOL-EXCH01.smartwire.com> References: <615FD77639372542BF647F5EBAA2DBC20B1AACCB@IL-BOL-EXCH01.smartwire.com> Message-ID: <940D1AE2-5B21-4C3F-AC6B-66EA3156B38D@live555.com> > I am trying to create a ts file from h264 video from a live source. Everything looks identical to streams that work from a different muxer but when these streams are segmented and served to the Apple mediastreamvalidator, it complains with "error: (-12976) unable to parse segment as either MPEG-2 ts stream ..." > > I am quite stumped. Any ideas of what could be wrong with the .ts files generated? They play fine in VLC. Please put an example of one of these .ts files on a web server, and send us the URL, so we can download and take a look at it ourselves. Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From 6.45.vapuru at gmail.com Fri Dec 16 01:45:14 2011 From: 6.45.vapuru at gmail.com (6.45 6.45.Vapuru) Date: Fri, 16 Dec 2011 11:45:14 +0200 Subject: [Live-devel] How to disable RTCP timestamp calculations on single-stream sessions? [OpenRtspClient] Message-ID: Hi, My RTSP source's RTCP SRs are not reliable... So the calculated timestamps frequently result in large negative jumps. How can I tell Live555 [ using OpenRtspClient ] NOT to use RTCP timestamps, or simply ignore the RTCP info? Best Wishes From 6.45.vapuru at gmail.com Sat Dec 17 05:19:14 2011 From: 6.45.vapuru at gmail.com (6.45 6.45.Vapuru) Date: Sat, 17 Dec 2011 15:19:14 +0200 Subject: [Live-devel] Presentaition time problem at H264 streams Message-ID: Hi I modified the OpenRtspClient so that -- Now, instead of writing frames to file, I collect them in a queue with their incoming presentation times -- Then give the h264 frames to an MP4 muxer [ Geraint Davies MP4 mux filter ] -- Finally write the muxed data to file... So I am able to save the h264 stream into an MP4 container... But the problem is that some of the recorded data [NOT all of it] has wrong values for the time duration: for example, a 10-minute recording appears to be a 12 h stream... VLC plays the 10 minutes, then shows the last frame for the remaining time. It seems that I set the sample times into the muxer wrongly... Then I debugged and saw that there are dramatic positive and negative jumps in the timestamps... 
Here is how I set the time stamps: -- First I take presentationTime from the H264VideoFileSink::afterGettingFrame1 function -- Then calculate firstPresentationTime [ at the beginning ] -- Then collect the other timestamps #define TIMEVAL_TO_REFERENCE_TIME(x) ((__int64)(x.tv_sec * 1000000) + x.tv_usec) * 10 void H264VideoFileSink::afterGettingFrame1(unsigned frameSize, struct timeval presentationTime) { // At the beginning [ just once ] calculate firstPresentationTime firstPresentationTime = TIMEVAL_TO_REFERENCE_TIME(presentationTime); // for the other frames collect the frames' timestamps frameTimeStamp = TIMEVAL_TO_REFERENCE_TIME(presentationTime) - firstPresentationTime; } The frameTimeStamp values show dramatic jumps to negative or positive values... [ I keep those values as int64 ] What may cause this? -- Or is it a good idea to use this "presentationTime" for an MP4 muxer? -- Where is the "presentationTime" calculated in the library? -- Is it possible that the H264VideoFileSink::afterGettingFrame1 method's "presentationTime" values may be wrong? -- Has anybody recorded an h264 stream in an MP4 container and wants to share his/her experience? Best Wishes Novalis Vapuru PS: By the way, I was able to re-write OpenRtspClient in a more OO way. Each client has its own environment and scheduler... So I can start many of them programmatically in a multithreaded way... Just as simple as: string rtspUrl = "xxxxx"; New_OO_OpenRtspClient* client = new New_OO_OpenRtspClient(rtspUrl); client->Start(); client->Shutdown(); But I think there are still some memory leaks in my Shutdown method... Some sources are not cleaned up... I hope that in a future release we will see a more OO-style OpenRtspClient example in Live555... From finlayson at live555.com Wed Dec 21 20:53:31 2011 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 21 Dec 2011 20:53:31 -0800 Subject: [Live-devel] Live555 EventLoop crash In-Reply-To: References: Message-ID: <0376AECF-6591-4B08-B6E6-E524BEADAE51@live555.com> > 2. 
One of the biggest performance hits in my profiling is memcpy (I use an embedded platform, so memcpy gets pricy fast), much of it due to copying media buffers. Would you ever consider adding (or consider accepting ;) code that allows live555 to work in the calling library's buffers instead of its own? If I were to go back in time and design the LIVE555 code all over again, then this is something that I might well have done differently. In any case, sometime in the future, when I replace/reimplement the crusty old "groupsock" library (so that network interfaces become proper "FramedSource" and "MediaSink" objects, and we can also support IPv6), I will likely need to rethink the buffering mechanism, otherwise we'll be introducing an extra memcpy when transmitting data over a network socket. But that's for sometime in the future (because any buffering changes will be a *major* change to the code, involving a *lot* more than a simple patch). > (in other words, I give Live555 a pointer to a buffer to send and the size, rather than memcpy'ing the buffer into live555's space) Unfortunately it wouldn't be quite that simple. Suppose, for example, we're outputting data into a "RTPSink". If the upstream object is providing it a buffer, it would also need to know that it needs to leave sufficient space at the front of the buffer, so that the "RTPSink" can add the 12-byte RTP header (plus any required RTP header extension bytes). (This wouldn't be necessary if the network interface supports 'scatter gather I/O', but we can't assume this in general.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From buivuhoang15 at gmail.com Mon Dec 19 02:46:38 2011 From: buivuhoang15 at gmail.com (Hoang Bui Vu) Date: Mon, 19 Dec 2011 17:46:38 +0700 Subject: [Live-devel] How to create a custom sink that streams to both the network and a local file In-Reply-To: <11D15901-ABBD-47D3-B5C0-7B3191E10CD2@live555.com> References: <900D5E18-98A6-446B-B025-8A078FB91691@live555.com> <11D15901-ABBD-47D3-B5C0-7B3191E10CD2@live555.com> Message-ID: Hi Ross, Thank you very much for your attention. I am facing some difficulties, because I don't know what is missing; the data just does not seem to be fed to the output of the filter. That looks fine - but don't forget to enter the event loop, by calling > env->taskScheduler().doEventLoop(); > at the end of this code, otherwise nothing will happen. Actually I am using the code of testMPEG4VideoStreamer, so that function is called after play(), in the main function. *env << "Beginning streaming...\n"; play(); env->taskScheduler().doEventLoop(); // does not return What confuses me is that when I replace historyFilter with videoSource in videoSink->startPlaying, it works perfectly. videoSink->startPlaying(*historyFilter, afterPlaying, videoSink); This indicates that the problem is somewhere within the filter class itself; however, as I said before, I couldn't find anything out of order. On Mon, Dec 19, 2011 at 4:51 AM, Ross Finlayson wrote: > I have been playing with the FramedFilter class for a while, and created a > simple filter that will just deliver its input to its output. However, it > is not working for some reason. Can you point me to the point that I'm > missing? > > [...] > > void HistoryFilter::doGetNextFrame() > { > fFrameSize=0; > > You don't need to do this here (although it does no harm), because you are > (properly) setting "fFrameSize" later, in your 'after getting' function. 
> > void HistoryFilter::afterGettingFrame(void* clientData, unsigned > frameSize, unsigned /*numTruncatedBytes*/, struct timeval > presentationTime, unsigned /*durationInMicroseconds*/) > { > HistoryFilter* filter = (HistoryFilter*)clientData; > filter->afterGettingFrame1(frameSize, presentationTime); > } > > void HistoryFilter::afterGettingFrame1(unsigned frameSize, struct timeval > presentationTime) > { > fFrameSize = frameSize; > fPresentationTime = presentationTime; > afterGetting(this); > } > > > Your filter also needs to be setting "fNumTruncatedBytes" and > "fDurationInMicroseconds" (in your case, because you're making a direct > copy, these will be the values of the "numTruncatedBytes" and > "durationInMicroseconds" parameters, respectively). > > > In my main server play() function, I put in this code: > > FramedSource* videoES = fileSource; > // Create a framer for the Video Elementary Stream: > videoSource = MPEG4VideoStreamFramer::createNew(*env, videoES); > historyFilter = HistoryFilter::createNew(*env, videoSource, "history.mp4"); > // Finally, start playing: > *env << "Beginning to read from file...\n"; > videoSink->startPlaying(*historyFilter, afterPlaying, videoSink); > > > That looks fine - but don't forget to enter the event loop, by calling > env->taskScheduler().doEventLoop(); > at the end of this code, otherwise nothing will happen. (In a > LIVE555-based application, almost everything gets done within the event > loop). > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From kidjan at gmail.com Thu Dec 22 00:36:45 2011 From: kidjan at gmail.com (Jeremy Noring) Date: Thu, 22 Dec 2011 00:36:45 -0800 Subject: [Live-devel] Presentaition time problem at H264 streams In-Reply-To: References: Message-ID: On Sat, Dec 17, 2011 at 5:19 AM, 6.45 6.45.Vapuru <6.45.vapuru at gmail.com> wrote: > Hi > > I modify the OpenRtspClient so that > > -- Now instead of writing frames to file I collect them in a queue > with incoming presenttaion times > -- Then give the h264 frames to MP4 muxer [ Geraint Davies MP4 mux filter] > -- Finally write muxed data to file... > > So I can able to save h264 stream into MP4 container... > But the problem is that, some of the recorded data [NOT all of them] > has wrong values for time duration: > Suppose that a 10 minute record seems that it was 12 h stream... > VLC play the 10 minute that play last frame for the remaing time. > First thing that comes to mind is you have an obvious overflow issue: #define TIMEVAL_TO_REFERENCE_TIME(x) ((__int64)(x.tv_sec * 1000000) + x.tv_usec) * 10 ...x.tv_sec and x.tv_usec are 32-bit signed (long). You multiply x.tv_sec by 1,000,000, and *then* cast to __int64. So you'll overflow that in about ~2147 seconds. You need to cast to __int64 *first* and then do your multiplication. This probably explains the weird jumping. -------------- next part -------------- An HTML attachment was scrubbed... URL: From 6.45.vapuru at gmail.com Thu Dec 22 01:39:29 2011 From: 6.45.vapuru at gmail.com (Novalis Vapuru) Date: Thu, 22 Dec 2011 11:39:29 +0200 Subject: [Live-devel] Presentaition time problem at H264 streams In-Reply-To: References: Message-ID: Thanks Jeremy... You are right... Silly me, I made the classic mistake... And this may be the cause of my radical timestamp jumps... I will test it... 
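[Editor's note: the overflow Jeremy identified can be demonstrated in isolation. Below is a minimal sketch of the corrected macro, using a portable int64_t and a stand-in TimeVal struct in place of the Windows-specific __int64 and struct timeval; the names are illustrative, not from the thread's actual codebase.]

```cpp
#include <cassert>
#include <cstdint>

// Stand-in for struct timeval with 32-bit fields, as on the poster's platform.
struct TimeVal { int32_t tv_sec; int32_t tv_usec; };

// Broken original (kept as a comment): the product tv_sec * 1000000 is
// computed in 32 bits and wraps after ~2147 seconds, BEFORE the cast:
//   #define TIMEVAL_TO_REFERENCE_TIME(x) ((__int64)(x.tv_sec * 1000000) + x.tv_usec) * 10

// Fixed: widen tv_sec to 64 bits FIRST, then multiply.
#define TIMEVAL_TO_REFERENCE_TIME(x) \
    ((((int64_t)(x).tv_sec) * 1000000 + (int64_t)(x).tv_usec) * 10)
```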
Best Wishes 2011/12/22 Jeremy Noring : > On Sat, Dec 17, 2011 at 5:19 AM, 6.45 6.45.Vapuru <6.45.vapuru at gmail.com> > wrote: >> >> Hi >> >> I modify the OpenRtspClient so that >> >> -- Now instead of writing frames to file I collect them in a queue >> with incoming presenttaion times >> -- Then give the h264 frames to MP4 muxer [ Geraint Davies MP4 mux filter] >> -- Finally write muxed data to file... >> >> So I can able to save h264 stream into MP4 container... >> But the problem is that, some of the recorded data [NOT all of them] >> has wrong values for time duration: >> Suppose that a 10 minute record seems that it was 12 h stream... >> VLC play the 10 minute that play last frame for the remaing time. > > > First thing that comes to mind is you have an obvious overflow issue: > > > #define TIMEVAL_TO_REFERENCE_TIME(x) ((__int64)(x.tv_sec * 1000000) + > x.tv_usec) * 10 > > ...x.tv_sec and x.tv_usec are 32-bit signed (long). You multiply x.tv_sec > by 1,000,000, and *then* cast to __int64. So you'll overflow that in about > ~2147 seconds. You need to cast to __int64 *first* and then do your > multiplication. This probably explains the weird jumping. > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > From jshanab at smartwire.com Thu Dec 22 06:12:59 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Thu, 22 Dec 2011 14:12:59 +0000 Subject: [Live-devel] Live555 EventLoop crash In-Reply-To: <0376AECF-6591-4B08-B6E6-E524BEADAE51@live555.com> References: <0376AECF-6591-4B08-B6E6-E524BEADAE51@live555.com> Message-ID: <615FD77639372542BF647F5EBAA2DBC20B1AB2A6@IL-BOL-EXCH01.smartwire.com> Above the live555 libs I have my own frame class. It is a simple RAII data class with payload, a bit of byte alignment, and some metadata like size and type. I use a reference counted pointer to this. 
This allows my multiple subscribers to keep lists of pointers to frames; they each have their own list, need not worry about memory management, and there is minimal memcpying around. Another important detail is that this makes sure he who created the memory deletes the memory. This is to keep Windows happy: if memory is allocated in one heap and code running in a different heap attempts to delete it, Windows throws an access violation, especially in debug mode! From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Wednesday, December 21, 2011 10:54 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Live555 EventLoop crash 2. One of the biggest performance hits in my profiling is memcpy (I use an embedded platform, so memcpy gets pricy fast), much of it due to copying media buffers. Would you ever consider adding (or consider accepting ;) code that allows live555 to work in the calling library's buffers instead of its own? If I were to go back in time and design the LIVE555 code all over again, then this is something that I might well have done differently. In any case, sometime in the future, when I replace/reimplement the crusty old "groupsock" library (so that network interfaces become proper "FramedSource" and "MediaSink" objects, and we can also support IPv6), I will likely need to rethink the buffering mechanism, otherwise we'll be introducing an extra memcpy when transmitting data over a network socket. But that's for sometime in the future (because any buffering changes will be a *major* change to the code, involving a *lot* more than a simple patch). (in other words, I give Live555 a pointer to a buffer to send and the size, rather than memcpy'ing the buffer into live555's space) Unfortunately it wouldn't be quite that simple. Suppose, for example, we're outputting data into a "RTPSink". 
If the upstream object is providing it a buffer, it would also need to know that it needs to leave sufficient space at the front of the buffer, so that the "RTPSink" can add the 12-byte RTP header (plus any required RTP header extension bytes). (This wouldn't be necessary if the network interface supports 'scatter gather I/O', but we can't assume this in general.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Dec 22 06:49:00 2011 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 22 Dec 2011 06:49:00 -0800 Subject: [Live-devel] How to disable RTCP timestamp calculations on single-stream sessions? [OpenRtspClient] In-Reply-To: References: Message-ID: <8468D767-0500-43F2-950E-23EE0F198A5C@live555.com> > My RTSP Source's RTCP SR are not reliable... You are probably mistaken. > So calculated timestamps frequently resulting in large negative jumps. Let me guess: You're receiving H.264 (or MPEG) video that uses "P" (prediction) frames. For video streams like this, frames are sent (and thus received) in 'decoding order' - i.e., the order that they will be fed to a decoder, not 'presentation order' - the order that they will be displayed on the screen. In this case, it is normal (and expected) that the received frames' presentation times will not all be monotonically increasing. The incoming frames should be fed into your decoder in the order that they arrive, but their presentation times will tell you when they should be displayed. But in any case, RTCP is an important part of the RTP/RTCP protocol, and should not be disabled. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From jshanab at smartwire.com Thu Dec 22 10:05:18 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Thu, 22 Dec 2011 18:05:18 +0000 Subject: [Live-devel] invalid ts stream In-Reply-To: <940D1AE2-5B21-4C3F-AC6B-66EA3156B38D@live555.com> References: <615FD77639372542BF647F5EBAA2DBC20B1AACCB@IL-BOL-EXCH01.smartwire.com> <940D1AE2-5B21-4C3F-AC6B-66EA3156B38D@live555.com> Message-ID: <615FD77639372542BF647F5EBAA2DBC20B1AB418@IL-BOL-EXCH01.smartwire.com> http://appleservice.my-mvs.com:88/SampleSegment/segment1(7).zip From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Wednesday, December 21, 2011 3:44 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] invalid ts stream I am trying to create a ts file from h264 video from a live source. Everything looks identical to streams that work from a different muxer but when these streams are segmented and served to the Apple mediastreamvalidator, it complains with "error: (-12976) unable to parse segment as either MPEG-2 ts stream ...) I am quite stumped. Any ideas of what could be wrong with the .ts files generated? They play fine in VLC. Please put an example of one of these .ts files on a web server, and send us the URL, so we can download and take a look at it ourself. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Thu Dec 22 11:51:45 2011 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 22 Dec 2011 11:51:45 -0800 Subject: [Live-devel] invalid ts stream In-Reply-To: <615FD77639372542BF647F5EBAA2DBC20B1AB418@IL-BOL-EXCH01.smartwire.com> References: <615FD77639372542BF647F5EBAA2DBC20B1AACCB@IL-BOL-EXCH01.smartwire.com> <940D1AE2-5B21-4C3F-AC6B-66EA3156B38D@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1AB418@IL-BOL-EXCH01.smartwire.com> Message-ID: > http://appleservice.my-mvs.com:88/SampleSegment/segment1(7).zip Sorry, but this site (appleservice.my-mvs.com:88) is inaccessible. (I also tried port 80.) And why on Earth are you "zip"ing a Transport Stream file? It's already compressed. It can't be hard to find a publicly visible web server that you can use. Does your company (SmartWire?) not have one? Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jshanab at smartwire.com Thu Dec 22 12:14:45 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Thu, 22 Dec 2011 20:14:45 +0000 Subject: [Live-devel] invalid ts stream In-Reply-To: References: <615FD77639372542BF647F5EBAA2DBC20B1AACCB@IL-BOL-EXCH01.smartwire.com> <940D1AE2-5B21-4C3F-AC6B-66EA3156B38D@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1AB418@IL-BOL-EXCH01.smartwire.com> Message-ID: <615FD77639372542BF647F5EBAA2DBC20B1AB507@IL-BOL-EXCH01.smartwire.com> OK, I posted it on MediaFire. (The IT dept decided to redo the routers today! Zip was their requirement also :) they only accept recognized file extensions.) 
http://www.mediafire.com/?3nu8da7gt14vkfh From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Thursday, December 22, 2011 1:52 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] invalid ts stream http://appleservice.my-mvs.com:88/SampleSegment/segment1(7).zip Sorry, but this site (appleservice.my-mvs.com:88) is inaccessible. (I also tried port 80.) And why on Earth are you "zip"ing a Transport Stream file? It's already compressed. It can't be hard to find a publicly visible web server that you can use. Does your company (SmartWire?) not have one? Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From Beth.Turk at drs-c3a.com Thu Dec 22 12:22:37 2011 From: Beth.Turk at drs-c3a.com (Turk, Beth (SA-1)) Date: Thu, 22 Dec 2011 15:22:37 -0500 Subject: [Live-devel] question Message-ID: Does this code allow tagging of the incoming video data? Beth Turk DRS C3A Engineering 246 Airport Road Johnstown, PA 15904 beth.turk at drs-c3.com Office: 814-534-8705 Cell: 814-242-4943 -------------- next part -------------- An HTML attachment was scrubbed... URL: From nmenne at dcscorp.com Thu Dec 22 14:08:10 2011 From: nmenne at dcscorp.com (Menne, Neil) Date: Thu, 22 Dec 2011 17:08:10 -0500 Subject: [Live-devel] Buffer Question Message-ID: I'm working on updating a couple of video widgets for my project, and it seems like Live555 is much more robust for all things RTP/RTSP, which is how I get most of my video. I'm trying to forward this on to FFmpeg for decoding. I just finished updating the FFmpeg layer to permit this behavior, and now I'm onto the Live555 layer. 
I've been following along with the openRTSP example as it has most of what I need (probably all of it, and I'm just not seeing it). Here's where I'm stuck: After I get the subsession relating to video started (via: "mSubsession->sink->startPlaying(*(mSubsession->readSource()),handlePlay,mSubsession);"), how do I get a populated frame/buffer to pass along to FFmpeg? Every other layer of this software so far has exceeded my expectations; this is the final piece of the proverbial puzzle. Thanks in advance, Neil -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Dec 22 18:52:35 2011 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 22 Dec 2011 18:52:35 -0800 Subject: [Live-devel] question In-Reply-To: References: Message-ID: > Does this code allow tagging of the incoming video data? > Can you explain more what you mean by "tagging"? Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Dec 22 19:00:14 2011 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 22 Dec 2011 19:00:14 -0800 Subject: [Live-devel] Buffer Question In-Reply-To: References: Message-ID: <3860EBFE-DB0F-4BC8-8A99-9EBB6CBAE9E6@live555.com> > Here's where I'm stuck: > After I get the subsession relating to video started (via: "mSubsession->sink->startPlaying(*(mSubsession->readSource()),handlePlay,mSubsession);"), how do I get a populated frame/buffer to pass along to FFmpeg? The incoming data (one frame at a time) is fed into your data sink object (i.e., some subclass of "MediaSink"). In the "openRTSP", the data sink is a "FileSink" (because the purpose of the "openRTSP" application is to write incoming data into a file). In your case, however, you will need to write your own subclass of "MediaSink" that passes incoming frames to FFmpeg for decoding. Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Dec 23 02:03:52 2011 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 23 Dec 2011 02:03:52 -0800 Subject: [Live-devel] New demo application "testRTSPClient" (in "testProgs"); a better RTSP client model than "openRTSP" Message-ID: <6F013F18-E4D6-437B-A277-E903349174CB@live555.com> Several people have had trouble developing their own RTSP client applications using the LIVE555 code, and this is partly my fault, because "openRTSP" - the only RTSP client application code that we've made available - is not a very good model for how to write a RTSP client application, for several reasons: 1/ "openRTSP" is quite complex, with several command-line options for a lot of rather obscure functionality. 2/ "openRTSP" shares a lot of code with the "playSIP" application - and this makes the code hard to understand. 3/ The "openRTSP" code was specifically designed to be a complete, standalone application - not something that would be embedded into another application. 4/ Because "openRTSP" was designed to be a standalone application for streaming one "rtsp://" URL, it does not illustrate how you could stream *multiple* "rtsp://" URLs concurrently from within the same (single-threaded) application. To fix this, I've given you all a Christmas present: A new demo application (in the "testProgs" directory) called "testRTSPClient". "testRTSPClient" is simple: It takes one or more "rtsp://" URLs as command-line arguments, and streams each URL, concurrently. Unlike "openRTSP", it has no command-line options. Also, unlike "openRTSP", it doesn't output (or do anything else with) the audio/video data that it receives; it just receives the data into a buffer (and outputs - to the console - a message describing what was received). I.e., unlike "openRTSP", "testRTSPClient" is not very useful as an application by itself. 
Its main purpose is to illustrate how you, as an application developer, can use the "RTSPClient" interface within your own application. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From jshanab at smartwire.com Fri Dec 23 05:33:36 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Fri, 23 Dec 2011 13:33:36 +0000 Subject: [Live-devel] invalid ts stream In-Reply-To: <32D9901D-D6BA-4A7A-9CE6-632E618A8D27@live555.com> References: <615FD77639372542BF647F5EBAA2DBC20B1AACCB@IL-BOL-EXCH01.smartwire.com> <940D1AE2-5B21-4C3F-AC6B-66EA3156B38D@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1AB418@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B1AB507@IL-BOL-EXCH01.smartwire.com> <32D9901D-D6BA-4A7A-9CE6-632E618A8D27@live555.com> Message-ID: <615FD77639372542BF647F5EBAA2DBC20B1AB8E7@IL-BOL-EXCH01.smartwire.com> I wanna thank you for looking; I thought I was going nuts. I did get it to stream! It turned out to be 1 bit. Like I mentioned before, I had posted on that list a week before posting here! I got a reply when I posted the ts file. The developer who wrote the mediastreamvalidator took one look and pointed to the timestamps! In trying to make it work, one of the things I did was enable DTS in the PES headers and put the DTS in there. What I missed was that the first 4 bits change from 2 (for a PTS alone) to 3 for the PTS and 1 for the DTS when both are present. 
This: // Fill in the PES PTS (from our SCR): fInputBuffer[9] = 0x20|(fSCR.highBit<<3)|(fSCR.remainingBits>>29)|0x01; fInputBuffer[10] = fSCR.remainingBits>>22; fInputBuffer[11] = (fSCR.remainingBits>>14)|0x01; fInputBuffer[12] = fSCR.remainingBits>>7; fInputBuffer[13] = (fSCR.remainingBits<<1)|0x01; Had to change to this: // Fill in the PES PTS (from our SCR): fInputBuffer[9] = 0x30|(fSCR.highBit<<3)|(fSCR.remainingBits>>29)|0x01; fInputBuffer[10] = fSCR.remainingBits>>22; fInputBuffer[11] = (fSCR.remainingBits>>14)|0x01; fInputBuffer[12] = fSCR.remainingBits>>7; fInputBuffer[13] = (fSCR.remainingBits<<1)|0x01; // Fill in the PES DTS (from our SCR): fInputBuffer[14] = 0x10|(fSCR.highBit<<3)|(fSCR.remainingBits>>29)|0x01; fInputBuffer[15] = fSCR.remainingBits>>22; fInputBuffer[16] = (fSCR.remainingBits>>14)|0x01; fInputBuffer[17] = fSCR.remainingBits>>7; fInputBuffer[18] = (fSCR.remainingBits<<1)|0x01; Understanding that I have guerrilla-subclassed MPEG2TransportStreamFromESSource and MPEG2TransportStreamMultiplexor into MPEG2TransportStreamFromESSource4iOS and MPEG2TransportStreamMultiplexor4iOS, that makes my segmenter trivial. I now have RTSP and my own PUSH RTSP along with HTTP multiple in, out, record and HLS streaming built on mongoose and the live555 libs. The Apple validator still throws a warning about timestamps, which I think is why it does not play back smoothly after the first 3 segments, but I am working on it. PS, I will look at that Xmas present, sounds great! I was thinking of writing one in my spare time (*cough*) From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Friday, December 23, 2011 4:52 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] invalid ts stream OK, I couldn't see anything obviously wrong with this TS file. It played OK in VLC, and I was able to stream it (out and in) OK via RTSP/RTP, using our software. 
But when I tried streaming it to an iPhone via HTTP Live Streaming from "live555MediaServer" (after indexing the file), it didn't work at all. Unfortunately I don't know whether this is because there's something about your H.264 profile that's making the iPhone unhappy, or whether this is because there's something about the TS file that's making the iPhone (using HTTP Live Streaming) unhappy. So, your next task is to find out - via appropriate Apple mailing list(s) and/or forum(s) - what is going wrong. (You're not allowed to post on *this* mailing list any more about this issue until you've done that :-) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Dec 23 06:45:45 2011 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 23 Dec 2011 06:45:45 -0800 Subject: [Live-devel] invalid ts stream In-Reply-To: <615FD77639372542BF647F5EBAA2DBC20B1AB8E7@IL-BOL-EXCH01.smartwire.com> References: <615FD77639372542BF647F5EBAA2DBC20B1AACCB@IL-BOL-EXCH01.smartwire.com> <940D1AE2-5B21-4C3F-AC6B-66EA3156B38D@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1AB418@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B1AB507@IL-BOL-EXCH01.smartwire.com> <32D9901D-D6BA-4A7A-9CE6-632E618A8D27@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1AB8E7@IL-BOL-EXCH01.smartwire.com> Message-ID: <47CC6ABB-D76C-4819-9779-031948DC5FD5@live555.com> > This : > // Fill in the PES PTS (from our SCR): > fInputBuffer[9] = 0x20|(fSCR.highBit<<3)|(fSCR.remainingBits>>29)|0x01; > fInputBuffer[10] = fSCR.remainingBits>>22; > fInputBuffer[11] = (fSCR.remainingBits>>14)|0x01; > fInputBuffer[12] = fSCR.remainingBits>>7; > fInputBuffer[13] = (fSCR.remainingBits<<1)|0x01; > > Had to change to this: > > // 
Fill in the PES PTS (from our SCR):
> fInputBuffer[9] = 0x30|(fSCR.highBit<<3)|(fSCR.remainingBits>>29)|0x01;
> fInputBuffer[10] = fSCR.remainingBits>>22;
> fInputBuffer[11] = (fSCR.remainingBits>>14)|0x01;
> fInputBuffer[12] = fSCR.remainingBits>>7;
> fInputBuffer[13] = (fSCR.remainingBits<<1)|0x01;
>
> // Fill in the PES DTS (from our SCR):
> fInputBuffer[14] = 0x10|(fSCR.highBit<<3)|(fSCR.remainingBits>>29)|0x01;
> fInputBuffer[15] = fSCR.remainingBits>>22;
> fInputBuffer[16] = (fSCR.remainingBits>>14)|0x01;
> fInputBuffer[17] = fSCR.remainingBits>>7;
> fInputBuffer[18] = (fSCR.remainingBits<<1)|0x01;

Was that the only change that you needed to make to our code in order to get your TS file to work?

Ross Finlayson Live Networks, Inc. http://www.live555.com/

-------------- next part -------------- An HTML attachment was scrubbed... URL: From bsaintmartin at mediabroadcast-t.com Fri Dec 23 06:54:32 2011 From: bsaintmartin at mediabroadcast-t.com (Boris Saint-Martin) Date: Fri, 23 Dec 2011 15:54:32 +0100 Subject: [Live-devel] Video file duration problem Message-ID: <8ABEF13CFF6649DF8AB18D39EABEBB5C@Boris>

Hi, I have a problem with the recorded video file's duration. I use openRTSP with the "-d" parameter. For example, when I set "-d 30" my video file's duration is around 53 seconds. Can the stream quality influence my record duration? Because when I look at the recorded video it seems to lag. Any idea? Thanks in advance. Boris

-------------- next part -------------- An HTML attachment was scrubbed...
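For readers following the bit-twiddling above: in the MPEG-2 PES header, each timestamp occupies 5 bytes, and the 4-bit prefix of the first byte signals what follows - 0x2 ('0010') for a lone PTS, 0x3 ('0011') for a PTS that is followed by a DTS, and 0x1 ('0001') for the DTS itself. That is why the first byte changed from 0x20 to 0x30 once a DTS was added. Below is a minimal standalone sketch of the packing, with helper names of our own, mirroring the shifts in the quoted code:

```cpp
#include <cstdint>

// Pack a 33-bit PES timestamp into 5 bytes, mirroring the shifts in the
// quoted live555 code (where the value is split into highBit + 32 low bits).
// 'prefix' is the 4-bit field: 0x2 = PTS only, 0x3 = PTS followed by a DTS,
// 0x1 = DTS. The |0x01 terms are the mandatory marker bits.
void encodePesTimestamp(unsigned prefix, uint64_t ts33, uint8_t out[5]) {
  out[0] = (uint8_t)((prefix << 4) | (((ts33 >> 30) & 0x07) << 1) | 0x01);
  out[1] = (uint8_t)(ts33 >> 22);          // bits 29..22 (truncation keeps them)
  out[2] = (uint8_t)((ts33 >> 14) | 0x01); // bits 21..15 + marker bit
  out[3] = (uint8_t)(ts33 >> 7);           // bits 14..7
  out[4] = (uint8_t)((ts33 << 1) | 0x01);  // bits 6..0 + marker bit
}

// Inverse, useful for checking a generated PES header by hand.
uint64_t decodePesTimestamp(const uint8_t in[5]) {
  return ((uint64_t)((in[0] >> 1) & 0x07) << 30)
       | ((uint64_t)in[1] << 22)
       | ((uint64_t)(in[2] >> 1) << 15)
       | ((uint64_t)in[3] << 7)
       | (uint64_t)(in[4] >> 1);
}
```

Note how the |0x01 markers land on bit positions that the shifted timestamp would otherwise leak into; ORing them to 1 is exactly what the standard requires there, so the truncating shifts are safe.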
URL: From jshanab at smartwire.com Fri Dec 23 07:01:48 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Fri, 23 Dec 2011 15:01:48 +0000 Subject: [Live-devel] invalid ts stream In-Reply-To: <47CC6ABB-D76C-4819-9779-031948DC5FD5@live555.com> References: <615FD77639372542BF647F5EBAA2DBC20B1AACCB@IL-BOL-EXCH01.smartwire.com> <940D1AE2-5B21-4C3F-AC6B-66EA3156B38D@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1AB418@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B1AB507@IL-BOL-EXCH01.smartwire.com> <32D9901D-D6BA-4A7A-9CE6-632E618A8D27@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1AB8E7@IL-BOL-EXCH01.smartwire.com> <47CC6ABB-D76C-4819-9779-031948DC5FD5@live555.com> Message-ID: <615FD77639372542BF647F5EBAA2DBC20B1AB98A@IL-BOL-EXCH01.smartwire.com>

No, I made a lot of changes back and forth trying to figure it out, so I am not sure what is actually required and what is not. I was looking at streams from two other sources that worked with the iPad, and they, for example, always had both a PTS and a DTS. They also had PIDs > 255 for the PMT, so I changed that capability in the bit logic. Also, the standard says each segment "should" start with a PAT then a PMT, so I changed the code to do that. I am currently back to the darn stack overflow as soon as I make it faster by removing extra copying and diagnostic printf's.
but once I have these two classes working well... more to come.

From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Friday, December 23, 2011 8:46 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] invalid ts stream

This :

// Fill in the PES PTS (from our SCR):
fInputBuffer[9] = 0x20|(fSCR.highBit<<3)|(fSCR.remainingBits>>29)|0x01;
fInputBuffer[10] = fSCR.remainingBits>>22;
fInputBuffer[11] = (fSCR.remainingBits>>14)|0x01;
fInputBuffer[12] = fSCR.remainingBits>>7;
fInputBuffer[13] = (fSCR.remainingBits<<1)|0x01;

Had to change to this:

// Fill in the PES PTS (from our SCR):
fInputBuffer[9] = 0x30|(fSCR.highBit<<3)|(fSCR.remainingBits>>29)|0x01;
fInputBuffer[10] = fSCR.remainingBits>>22;
fInputBuffer[11] = (fSCR.remainingBits>>14)|0x01;
fInputBuffer[12] = fSCR.remainingBits>>7;
fInputBuffer[13] = (fSCR.remainingBits<<1)|0x01;

// Fill in the PES DTS (from our SCR):
fInputBuffer[14] = 0x10|(fSCR.highBit<<3)|(fSCR.remainingBits>>29)|0x01;
fInputBuffer[15] = fSCR.remainingBits>>22;
fInputBuffer[16] = (fSCR.remainingBits>>14)|0x01;
fInputBuffer[17] = fSCR.remainingBits>>7;
fInputBuffer[18] = (fSCR.remainingBits<<1)|0x01;

Was that the only change that you needed to make to our code in order to get your TS file to work?

Ross Finlayson Live Networks, Inc. http://www.live555.com/

-------------- next part -------------- An HTML attachment was scrubbed...
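On the "PIDs > 255" point above: in a Transport Stream packet header, the PID is a 13-bit field that straddles two bytes (the low 5 bits of byte 1 plus all of byte 2), so any code that stores PIDs in a single byte silently breaks as soon as a PMT PID exceeds 255. A small sketch of the two-byte handling (helper names are ours, not live555's):

```cpp
#include <cstdint>

// A TS packet header is 4 bytes: sync byte (0x47), then flags and a 13-bit
// PID split across bytes 1-2 (top 5 bits in byte 1, low 8 bits in byte 2).
void setTsPid(uint8_t hdr[4], uint16_t pid) {
  hdr[1] = (uint8_t)((hdr[1] & 0xE0) | ((pid >> 8) & 0x1F)); // keep flag bits
  hdr[2] = (uint8_t)(pid & 0xFF);
}

uint16_t getTsPid(const uint8_t hdr[4]) {
  return (uint16_t)(((hdr[1] & 0x1F) << 8) | hdr[2]);
}
```

PID 0x0000 is reserved for the PAT and 0x1FFF for null packets; everything in between, including values well above 255, is fair game for a PMT, which is why the 8-bit assumption had to go.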
URL: From finlayson at live555.com Fri Dec 23 07:24:24 2011 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 23 Dec 2011 07:24:24 -0800 Subject: [Live-devel] invalid ts stream In-Reply-To: <615FD77639372542BF647F5EBAA2DBC20B1AB98A@IL-BOL-EXCH01.smartwire.com> References: <615FD77639372542BF647F5EBAA2DBC20B1AACCB@IL-BOL-EXCH01.smartwire.com> <940D1AE2-5B21-4C3F-AC6B-66EA3156B38D@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1AB418@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B1AB507@IL-BOL-EXCH01.smartwire.com> <32D9901D-D6BA-4A7A-9CE6-632E618A8D27@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1AB8E7@IL-BOL-EXCH01.smartwire.com> <47CC6ABB-D76C-4819-9779-031948DC5FD5@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1AB98A@IL-BOL-EXCH01.smartwire.com> Message-ID: <3A0522EB-BEE7-4BEE-A5DE-21ECFEDFCDE1@live555.com> > No, Imade a lot of changes back and forth trying to figure it out so I am not sure what is actually required and what is not. Could you please try to figure out the smallest set of changes that you needed to make to our code in order to get your TS file to work - and then send us a "diff" file, so I can make equivalent changes in our released code. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Dec 23 07:31:27 2011 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 23 Dec 2011 07:31:27 -0800 Subject: [Live-devel] Video file duration problem In-Reply-To: <8ABEF13CFF6649DF8AB18D39EABEBB5C@Boris> References: <8ABEF13CFF6649DF8AB18D39EABEBB5C@Boris> Message-ID: <7912F753-190A-4F4A-9F59-9E457AAA5CF2@live555.com> > I have a problem with the recorded video file duration. > I use openRTSP with the "-d" parameter. > For exemple when I set "-d 30" my video file duration is around 53 seconds. > Does the stream quality can influence my record duration ? 
> Because when I look at the recorded video it seems to lag.

You didn't say what "openRTSP" options you used to record your file, but I'm assuming that you used the "-4" or "-q" options to record to a ".mp4" or ".mov"-format file. Note that if you use these options, then the -f option is very important. Otherwise the file will be written assuming a frame rate of 15 frames-per-second, which is probably not what you want in your case.

Ross Finlayson Live Networks, Inc. http://www.live555.com/

-------------- next part -------------- An HTML attachment was scrubbed... URL: From jshanab at smartwire.com Fri Dec 23 13:08:55 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Fri, 23 Dec 2011 21:08:55 +0000 Subject: [Live-devel] invalid ts stream In-Reply-To: <3A0522EB-BEE7-4BEE-A5DE-21ECFEDFCDE1@live555.com> References: <615FD77639372542BF647F5EBAA2DBC20B1AACCB@IL-BOL-EXCH01.smartwire.com> <940D1AE2-5B21-4C3F-AC6B-66EA3156B38D@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1AB418@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B1AB507@IL-BOL-EXCH01.smartwire.com> <32D9901D-D6BA-4A7A-9CE6-632E618A8D27@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1AB8E7@IL-BOL-EXCH01.smartwire.com> <47CC6ABB-D76C-4819-9779-031948DC5FD5@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1AB98A@IL-BOL-EXCH01.smartwire.com> <3A0522EB-BEE7-4BEE-A5DE-21ECFEDFCDE1@live555.com> Message-ID: <615FD77639372542BF647F5EBAA2DBC20B1ABBD7@IL-BOL-EXCH01.smartwire.com>

I don't have time to do the diff, and I think there may be enough differences that a refactor into a base and derived class is probably warranted. The philosophy is a bit strange. I need to continue for a January release but wanted to send you this before the holidays (including the warts).

1) It is written only for H264, and my [7][8][5] (SPS/PPS/IDR) keyframes are aggregated.
2) Both the [7][8][5] keyframe and the [1] diff frames are prefixed with the access_unit_delimiter NAL [9].

3) It is more frame-by-frame than by low-water mark, as small-resolution video was coming in under the 1000 setting and causing me some trouble. Keeping it frame-aligned worked for me because it made it easy to detect the keyframe. If we had a pointer to a struct that contained the buffer, then we could have metadata like the sourceRec does; with a little tweaking that may work using the sourceRec. Speaking of which, I temporarily made that array a LOT bigger because I needed to implement the whole standard and have 13-bit PIDs. I would like to try a map as a sparse array.

4) The segmentDurationSecs_ is a static int at the moment; this obviously needs to be part of the CTOR.

5) What I am doing is detecting the keyframe and, when I have enough keyframes to fill a segment, inserting a PAT then a PMT. I use mod so it always starts with a PAT and PMT. My code assumes one keyframe per second, so this is really a keyframe count, not seconds; it will be changed so that the segment size is enforced to be an even multiple of the keyframe interval. Then my segmenter is trivial: all I have to do is detect the PAT, start a new segment file, and add it to my index. Obviously this is a much higher frequency than the original class design, but Apple likes it that way: every segment starts with it.

http://pastebin.com/PZvXpE7R
http://pastebin.com/VHdMP8PQ
http://pastebin.com/C7ufnTd8
http://pastebin.com/jE1KiYsa

I didn't change any headers, but I should have added a "modified for iOS by ..." note.
If you do, my name is spelled Jeff Shanab :)

From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Friday, December 23, 2011 9:24 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] invalid ts stream

No, I made a lot of changes back and forth trying to figure it out so I am not sure what is actually required and what is not.

Could you please try to figure out the smallest set of changes that you needed to make to our code in order to get your TS file to work - and then send us a "diff" file, so I can make equivalent changes in our released code.

Ross Finlayson Live Networks, Inc. http://www.live555.com/

-------------- next part -------------- An HTML attachment was scrubbed...
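To make the segmenting rule in the numbered list above concrete: the bracketed numbers are H.264 NAL unit types (7 = SPS, 8 = PPS, 5 = IDR slice, 1 = non-IDR slice, 9 = access unit delimiter), and the type sits in the low 5 bits of the first byte after the Annex-B start code. A hedged sketch of "count IDR frames, cut a segment every N of them" - the names and structure here are ours, not the pastebin code:

```cpp
#include <cstdint>

// The NAL unit type is the low 5 bits of the first byte after the
// Annex-B start code (00 00 00 01 / 00 00 01).
unsigned nalType(uint8_t firstByteAfterStartCode) {
  return firstByteAfterStartCode & 0x1F;
}

bool isKeyframeNal(uint8_t b) { return nalType(b) == 5; } // 5 = IDR slice

// Sketch of the segmenting rule described above: start a new segment
// (PAT, then PMT, then the frame) every 'keyframesPerSegment' IDR frames.
// This mirrors the post's assumption of roughly one keyframe per second.
struct Segmenter {
  explicit Segmenter(unsigned n) : keyframesPerSegment(n) {}
  unsigned keyframesPerSegment;
  unsigned keyframeCount = 0;

  // Returns true when this frame should open a new segment file.
  bool onFrame(uint8_t nalHeaderByte) {
    if (!isKeyframeNal(nalHeaderByte)) return false;
    return keyframeCount++ % keyframesPerSegment == 0;
  }
};
```

Emitting a PAT immediately followed by a PMT whenever onFrame() returns true gives the "every segment starts with PAT/PMT" behaviour described above, and the downstream segmenter then only has to watch for PID 0.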
To make my job a bit easier, though, could you send me (via a web site) two more files:

1/ An example of a Transport Stream file - created by your code - that works OK (i.e., plays on an iPhone) with HTTP Live Streaming, and

2/ The raw H.264 Video Elementary Stream file (i.e., ".264"-format) that you used to produce the Transport Stream file in 1.

Ross Finlayson Live Networks, Inc. http://www.live555.com/

-------------- next part -------------- An HTML attachment was scrubbed... URL: From yanfei_1 at yeah.net Mon Dec 19 22:52:39 2011 From: yanfei_1 at yeah.net (=?GBK?B?48a3yQ==?=) Date: Tue, 20 Dec 2011 14:52:39 +0800 (CST) Subject: [Live-devel] live-devel Digest, Vol 98, Issue 25 In-Reply-To: References: Message-ID: <2739dcca.7118.1345a3e4c43.Coremail.yanfei_1@yeah.net>

I use one thread to handle 16 data requests, meaning that one doEventLoop accepts the data for 16 connections. But in the experiment I found very high CPU usage, and memory usage keeps growing. I do not know why. I think the problem is the buffer memory settings, but I have tried several methods and nothing changed, and I do not know what else to change. Can you give me a hand? Thank you

-- At 2011-12-20 09:46:00,live-devel-request at ns.live555.com wrote:
>Send live-devel mailing list submissions to
> live-devel at lists.live555.com
>
>To subscribe or unsubscribe via the World Wide Web, visit
> http://lists.live555.com/mailman/listinfo/live-devel
>or, via email, send a message with subject or body 'help' to
> live-devel-request at lists.live555.com
>
>You can reach the person managing the list at
> live-devel-owner at lists.live555.com
>
>When replying, please edit your Subject line so it is more specific
>than "Re: Contents of live-devel digest..."
>
>Today's Topics:
>
> 1. Re: Live555 EventLoop Crash (Ross Finlayson)
> 2. SET_PARAMETER doesn't get handled in the RTSPServer over TCP (Jer Morrill)
> 3. Re: SET_PARAMETER doesn't get handled in the RTSPServer over TCP (Ross Finlayson)
> 4.
question regarding receiving & unpacking TS over RTP, generated from testH264VideoToTransportStream (Ken Dunne)
> 5. Re: SET_PARAMETER doesn't get handled in the RTSPServer over TCP (Jer Morrill)
>
>----------------------------------------------------------------------
>
>Message: 1
>Date: Mon, 19 Dec 2011 02:51:55 -0800
>From: Ross Finlayson
>To: LIVE555 Streaming Media - development & use
>Subject: Re: [Live-devel] Live555 EventLoop Crash
>Message-ID: <02EA1F2B-C632-4833-8D86-C801A0EBFFEC at live555.com>
>Content-Type: text/plain; charset="windows-1252"
>
>> No, I don't see this error message; however, I am seeing continuous truncations - almost every frame is truncated. My own debug output looks like this, at a rate of around 4 frames per second:
>> deliverFrame(): newFrameSize:216054, fNumTruncatedBytes:66055
>> deliverFrame(): newFrameSize:108994, fNumTruncatedBytes:98217
>> deliverFrame(): newFrameSize:96844, fNumTruncatedBytes:45293
>>
>> I've tried increasing OutPacketBuffer::maxSize from 1000000 to 5000000 (5 million) but this has no effect. Am I just trying to stream too much data, too quickly?
>
>No, the problem has nothing to do with 'speed'. There's a bug in your code somewhere. Sorry.
>
>Ross Finlayson
>Live Networks, Inc.
>http://www.live555.com/
>
>-------------- next part --------------
>An HTML attachment was scrubbed...
>URL:
>
>------------------------------
>
>Message: 2
>Date: Mon, 19 Dec 2011 23:08:13 +0000
>From: Jer Morrill
>To: "live-devel at lists.live555.com"
>Subject: [Live-devel] SET_PARAMETER doesn't get handled in the RTSPServer over TCP
>Message-ID: <80C795F72B3CB241A9256DABF0A04EC5F96DDA at SN2PRD0702MB101.namprd07.prod.outlook.com>
>Content-Type: text/plain; charset="us-ascii"
>
>Many apologies if this is a dupe; I don't think I properly signed up for the list before sending this earlier, so I'm not sure if the email was "lost" or not.
>
>First I want to thank everyone involved in this project for such a high quality library. This is surely open-source done right!
>
>I am running the 12-2-2011 build of Live555. When I run RTSPClient::sendSetParameter(...) with a session that is running UDP to a live555 server implementation, the server parses and successfully runs RTSPClientSession::handleCmd_SET_PARAMETER (overridden in a subclass). If I run RTSPClient::sendSetParameter(...) with a session that is running via TCP, it never gets to RTSPClientSession::handleCmd_SET_PARAMETER. Sending trick-play commands seems to work fine in both TCP and UDP.
>
>I've searched the email list and found a few issues similar to this that were fixed, but couldn't find anyone reporting this afterwards. I can provide more debug details if needed, but it seems RTSPServer::RTSPClientSession::handleRequestBytes(...) doesn't successfully parse the full SET_PARAMETER command when using TCP (or this is a symptom of something else).
>
>Thanks for any help and again, for the wonderful code!
>
>-Jer
>
>-------------- next part --------------
>An HTML attachment was scrubbed...
>URL:
>
>------------------------------
>
>Message: 3
>Date: Mon, 19 Dec 2011 15:44:37 -0800
>From: Ross Finlayson
>To: LIVE555 Streaming Media - development & use
>Subject: Re: [Live-devel] SET_PARAMETER doesn't get handled in the RTSPServer over TCP
>Message-ID: <1EB78808-10ED-490E-BCA1-6A614FE88CEB at live555.com>
>Content-Type: text/plain; charset="windows-1252"
>
>> I am running the 12-2-2011 build of Live555. When I run RTSPClient::sendSetParameter(...) with a session that is running UDP to a live555 server implementation, the server parses and successfully runs RTSPClientSession::handleCmd_SET_PARAMETER (overridden in a subclass). If I run RTSPClient::sendSetParameter(...) with a session that is running via TCP, it never gets to RTSPClientSession::handleCmd_SET_PARAMETER.
Seems sending trick-play commands work fine in both TCP and UDP.
>
>That's strange. (It's especially strange that other commands - e.g., "PLAY" - work OK for you in RTP-over-TCP mode, but that "SET_PARAMETER" does not.)
>
>> I can provide more debug details if needed
>
>Yes, please add
>#define DEBUG 1
>to the start of "liveMedia/RTSPServer.cpp", recompile and rerun your server, and access it both with a RTP-over-UDP client, and with a RTP-over-TCP client. Please send us the debugging output in each case.
>
>Ross Finlayson
>Live Networks, Inc.
>http://www.live555.com/
>
>-------------- next part --------------
>An HTML attachment was scrubbed...
>URL:
>
>------------------------------
>
>Message: 4
>Date: Mon, 19 Dec 2011 17:10:40 -0800 (PST)
>From: Ken Dunne
>To: "live-devel at lists.live555.com"
>Cc: "guru at starseedsoft.com"
>Subject: [Live-devel] question regarding receiving & unpacking TS over RTP, generated from testH264VideoToTransportStream
>Message-ID: <1324343440.41379.YahooMailNeo at web1204.biz.mail.gq1.yahoo.com>
>Content-Type: text/plain; charset="iso-8859-1"
>
>I've used the example "testH264VideoToTransportStream" as a basis for generating a H264-TS-over-RTP stream of packets.
>
>I have used Wireshark to view the stream of RTP packets, and they appear to be correctly filled with an integer number of 188-byte Transport-Stream packets.
>
>I am trying to further validate this stream, and have used "testMPEG1or2VideoReceiver.cpp", modified to use the MP2T format (33) in the call to:
>
>      sessionState.source = MPEG1or2VideoRTPSource::createNew( *env, &rtpGroupsock, 33, 90000 );
>
>This saves a file, but I am unable to play it with any player that I know of.
>
>Does anyone have any suggestions?
>
>Ken
>-------------- next part --------------
>An HTML attachment was scrubbed...
>URL: > >------------------------------ > >Message: 5 >Date: Tue, 20 Dec 2011 01:25:29 +0000 >From: Jer Morrill >To: LIVE555 Streaming Media - development & use > >Subject: Re: [Live-devel] SET_PARAMETER doesn't get handled in the > RTSPServer over TCP >Message-ID: > <80C795F72B3CB241A9256DABF0A04EC5F96E11 at SN2PRD0702MB101.namprd07.prod.outlook.com> > >Content-Type: text/plain; charset="us-ascii" > >Thanks for the quick response! > >I have attached a client debug output and server debug output text files (hope that's ok for this list). These are from the same rtsp session. > >Though the log does say RTSPServer::RTSPClientSession::handleRequestBytes(...) does say "parseRTSPRequestString() succeed...", it does not pass the next check: > >if (ptr + newBytesRead < tmpPtr + 2 + contentLength) break; // we still need more data; subsequent reads will give it to us > >At the point of the content length check the buffer looks like this: > >SET_PARAMETER rtsp://127.0.0.1/media?dev=1&source=archive&startTime=129661319116046065/ RTSP/1.0 >CSeq: 6 >User-Agent: HJT RTSP Client (LIVE555 Streaming Media v2011.12.02) >Session: CA195293 >Content-Length: 30 > >The complete command should also have (\r's and \n's excluded): >GotoTime: 1234567890 > >It does get these "GotoTime..." bytes afterwards the next calls to "handleRequestBytes", but the server never successfully parses the SET_PARAMETER command in RTP over TCP. Totally weird other messages get through just fine. > >Thanks again for all your help! Hope this is reproducible for you as I hate to send anyone on a wild goose chase! > >-Jer > > > >From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson >Sent: Monday, December 19, 2011 3:45 PM >To: LIVE555 Streaming Media - development & use >Subject: Re: [Live-devel] SET_PARAMETER doesn't get handled in the RTSPServer over TCP > >I am running the 12-2-2011 build of Live555. When I run RTSPClient::sendSetParameter(...) 
with a session that is running UDP to a live555 server implementation, the server parses and successfully runs the RTSPClientSession:: handleCmd_SET_PARAMETER (overridden in a subclass). If I run RTSPClient:sendSetParameter(...) with a session that is running via TCP, it does not ever get to the RTSPClientSession::handleCmd_SET_PARAMETER. Seems sending trickplay commands work fine in both TCP and UDP. > >That's strange. (It's especially strange that other commands - e.g., "PLAY" - work OK for you in RTP-over-TCP mode, but that "SET_PARAMETER" does not.) > > >I can provide more debug details if needed > >Yes, please add >#define DEBUG 1 >to the start of "liveMedia/RTSPServer.cpp", recompile and rerun your server, and access it both with a RTP-over-UDP client, and with a RTP-over-TCP client. Please send us the debugging output in each case. > >Ross Finlayson >Live Networks, Inc. >http://www.live555.com/ > >-------------- next part -------------- >An HTML attachment was scrubbed... >URL: >-------------- next part -------------- >An embedded and charset-unspecified text was scrubbed... >Name: clientdebugout.txt >URL: >-------------- next part -------------- >An embedded and charset-unspecified text was scrubbed... >Name: serverdebugout.txt >URL: > >------------------------------ > >_______________________________________________ >live-devel mailing list >live-devel at lists.live555.com >http://lists.live555.com/mailman/listinfo/live-devel > > >End of live-devel Digest, Vol 98, Issue 25 >****************************************** -------------- next part -------------- An HTML attachment was scrubbed... 
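Regarding Ken's TS-over-RTP question in the digest above: one sanity check that can be automated, matching what he eyeballed in Wireshark, is that every RTP payload of payload type 33 (MP2T) must hold a whole number of 188-byte TS packets, each beginning with the 0x47 sync byte. The sketch below is ours and independent of live555. (As a hedged aside, for receiving raw MP2T the live555 demos use a generic RTP source tied to the "video/MP2T" type rather than the MPEG1or2VideoRTPSource class quoted above, which may be worth checking as a cause of the unplayable file.)

```cpp
#include <cstddef>
#include <cstdint>

// Returns true if 'p'/'len' looks like a valid MP2T RTP payload: a nonzero
// whole number of 188-byte Transport Stream packets, each starting with the
// 0x47 sync byte.
bool looksLikeTsPayload(const uint8_t* p, size_t len) {
  if (len == 0 || len % 188 != 0) return false;
  for (size_t off = 0; off < len; off += 188) {
    if (p[off] != 0x47) return false;
  }
  return true;
}
```

Running every received payload through a check like this quickly separates "the packetization is wrong" from "the packetization is fine but the elementary stream inside is unplayable".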
URL: From jshanab at smartwire.com Fri Dec 23 15:48:23 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Fri, 23 Dec 2011 23:48:23 +0000 Subject: [Live-devel] invalid ts stream In-Reply-To: References: <615FD77639372542BF647F5EBAA2DBC20B1AACCB@IL-BOL-EXCH01.smartwire.com> <940D1AE2-5B21-4C3F-AC6B-66EA3156B38D@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1AB418@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B1AB507@IL-BOL-EXCH01.smartwire.com> <32D9901D-D6BA-4A7A-9CE6-632E618A8D27@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1AB8E7@IL-BOL-EXCH01.smartwire.com> <47CC6ABB-D76C-4819-9779-031948DC5FD5@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1AB98A@IL-BOL-EXCH01.smartwire.com> <3A0522EB-BEE7-4BEE-A5DE-21ECFEDFCDE1@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1ABBD7@IL-BOL-EXCH01.smartwire.com> Message-ID: <615FD77639372542BF647F5EBAA2DBC20B1ABCE8@IL-BOL-EXCH01.smartwire.com> Well the segment is easy http://www.mediafire.com/?3nu8da7gt14vkfh I use "MPEG-2 TS packet analyser 2.3.0.1 on windows to inspect the segments. But there is no .h264 source file. This is pulled from an rtsp source(IP camera) or comes from a push RTSP source. I made an ESSource based on the DeviceSource that I call an addFrame member function I created. It was done this way because I "subscribe" multiple consumers to the one stream (Pull,push, archive, or transcode) and they all get a pointer to the reference counted frame. Our archiver records into our own file format, rather raw and generic. From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Friday, December 23, 2011 5:02 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] invalid ts stream OK, thanks - this should give me enough to work with. 
To make my job a bit easier, though, could you send me (via a web site) two more files:

1/ An example of a Transport Stream file - created by your code - that works OK (i.e., plays on an iPhone) with HTTP Live Streaming, and

2/ The raw H.264 Video Elementary Stream file (i.e., ".264"-format) that you used to produce the Transport Stream file in 1.

Ross Finlayson Live Networks, Inc. http://www.live555.com/

-------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Dec 23 16:09:47 2011 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 23 Dec 2011 16:09:47 -0800 Subject: [Live-devel] invalid ts stream In-Reply-To: <615FD77639372542BF647F5EBAA2DBC20B1ABCE8@IL-BOL-EXCH01.smartwire.com> References: <615FD77639372542BF647F5EBAA2DBC20B1AACCB@IL-BOL-EXCH01.smartwire.com> <940D1AE2-5B21-4C3F-AC6B-66EA3156B38D@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1AB418@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B1AB507@IL-BOL-EXCH01.smartwire.com> <32D9901D-D6BA-4A7A-9CE6-632E618A8D27@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1AB8E7@IL-BOL-EXCH01.smartwire.com> <47CC6ABB-D76C-4819-9779-031948DC5FD5@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1AB98A@IL-BOL-EXCH01.smartwire.com> <3A0522EB-BEE7-4BEE-A5DE-21ECFEDFCDE1@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1ABBD7@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B1ABCE8@IL-BOL-EXCH01.smartwire.com> Message-ID:

On Dec 23, 2011, at 3:48 PM, Jeff Shanab wrote:

> Well the segment is easy
>
> http://www.mediafire.com/?3nu8da7gt14vkfh

No, that's the Transport Stream file that you sent me yesterday - the one that *doesn't* work with HTTP Live Streaming.
I'd like an example of a Transport Stream file (generated by your software) that *does* work with HTTP Live Streaming.

> But there is no .h264 source file. This is pulled from an RTSP source (IP camera)

OK. So, instead, could you please run the following (using our (unmodified) "openRTSP" tool):

openRTSP -d 30

After 35 seconds or so, "openRTSP" will exit, leaving you a file named (something like) "video-H264-1". Could you please send me that file (via a web site). (My goal here is to have a H.264 file that I know - if processed with our "testH264VideoToTransportStream" demo application, after I've modified our Transport Stream multiplexing code based on your changes - will produce a Transport Stream file that works with HTTP Live Streaming.)

Ross Finlayson Live Networks, Inc. http://www.live555.com/

-------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Dec 23 23:39:09 2011 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 23 Dec 2011 23:39:09 -0800 Subject: [Live-devel] How to safely close the openRtsp? In-Reply-To: References: Message-ID: <4F089224-8957-492C-BC7E-A1A1B7213225@live555.com>

> openRTSP is designed to be closed rudely, by CTRL+C or the console window's close button.

There's nothing "rude" about terminating a process; a process doesn't have feelings. The only problem with simply terminating a process that's acting as a RTSP client is that it will not stop the server from continuing to transmit the stream's data. (Eventually, the server will usually 'time out' the stream, because of lack of 'liveness' by the client, but it will continue to transmit data in the meantime.) So ideally you should also send a RTSP "TEARDOWN" command to the server, telling it to stop streaming. If your RTSP client is a standalone process (like "openRTSP"), then all you need to do is close the output 'sink' objects (e.g., output files), and then send a RTSP "TEARDOWN" command.
Then you can just call "exit()" to terminate the process. If, however, your RTSP client is part of an application that you (for whatever reason) do not want to terminate, then you have to do more. For guidance, I suggest that you look at the new "testRTSPClient" application (*not* the "openRTSP" code) - in particular, the implementation of the "shutdownStream()" function. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jshanab at smartwire.com Sat Dec 24 08:20:39 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Sat, 24 Dec 2011 16:20:39 +0000 Subject: [Live-devel] invalid ts stream In-Reply-To: References: <615FD77639372542BF647F5EBAA2DBC20B1AACCB@IL-BOL-EXCH01.smartwire.com> <940D1AE2-5B21-4C3F-AC6B-66EA3156B38D@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1AB418@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B1AB507@IL-BOL-EXCH01.smartwire.com> <32D9901D-D6BA-4A7A-9CE6-632E618A8D27@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1AB8E7@IL-BOL-EXCH01.smartwire.com> <47CC6ABB-D76C-4819-9779-031948DC5FD5@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1AB98A@IL-BOL-EXCH01.smartwire.com> <3A0522EB-BEE7-4BEE-A5DE-21ECFEDFCDE1@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1ABBD7@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B1ABCE8@IL-BOL-EXCH01.smartwire.com> Message-ID: <615FD77639372542BF647F5EBAA2DBC20B1ABFAF@IL-BOL-EXCH01.smartwire.com> WTH? http://www.mediafire.com/?kyccwby0kxy70q5 I have a bunch of files on mediafire and I guess the link in my clipboard was not updated when I cut and pasted and I got the old one. PFFFT openRTSP won't build on my win7 dev machine, something wrong with newer command line build tools on windoze. I built and ran it on the mac but it ends up with a zero length file showing a teardown immediately after sending the play with this encoder. Very similar code in my app works great. 
Embedded devices are sometimes latency challenged. So I used VLC with the -sout file/es:CaptureH264.es option: http://www.mediafire.com/?ss2imm337b8y53i

It is Xmas eve morning, so not much traffic on the road outside work. This capture is raw, so it doesn't necessarily start at a keyframe. I hope that suffices. Counting, there are 36 keyframes and 998 diff frames. This is a 30fps, GOP=30 stream of 704x480 H264 baseline, so there should be one keyframe per second. The diff frames are not exactly equal to 1044 for two reasons: the capture is probably cut off at the end, and NTSC timing correction means some seconds have only 28 diff frames.

From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Friday, December 23, 2011 6:10 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] invalid ts stream

On Dec 23, 2011, at 3:48 PM, Jeff Shanab wrote: Well the segment is easy http://www.mediafire.com/?3nu8da7gt14vkfh

No, that's the Transport Stream file that you sent me yesterday - the one that *doesn't* work with HTTP Live Streaming. I'd like an example of a Transport Stream file (generated by your software) that *does* work with HTTP Live Streaming.

But there is no .h264 source file. This is pulled from an rtsp source(IP camera)

OK. So, instead, could you please run the following (using our (unmodified) "openRTSP" tool): openRTSP -d 30 After 35 seconds or so, "openRTSP" will exit, leaving you a file named (something like) "video-H264-1". Could you please send me that file (via a web site). (My goal here is to have a H.264 file that I know that - if processed with our "testH264VideoToTransportStream" demo application, after I've modified our Transport Stream multiplexing code based on your changes - will produce a Transport Stream file that works with HTTP Live Streaming.) Ross Finlayson Live Networks, Inc. http://www.live555.com/

-------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sat Dec 24 12:50:49 2011 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 24 Dec 2011 12:50:49 -0800 Subject: [Live-devel] invalid ts stream In-Reply-To: <615FD77639372542BF647F5EBAA2DBC20B1ABFAF@IL-BOL-EXCH01.smartwire.com> References: <615FD77639372542BF647F5EBAA2DBC20B1AACCB@IL-BOL-EXCH01.smartwire.com> <940D1AE2-5B21-4C3F-AC6B-66EA3156B38D@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1AB418@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B1AB507@IL-BOL-EXCH01.smartwire.com> <32D9901D-D6BA-4A7A-9CE6-632E618A8D27@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1AB8E7@IL-BOL-EXCH01.smartwire.com> <47CC6ABB-D76C-4819-9779-031948DC5FD5@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1AB98A@IL-BOL-EXCH01.smartwire.com> <3A0522EB-BEE7-4BEE-A5DE-21ECFEDFCDE1@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1ABBD7@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B1ABCE8@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B1ABFAF@IL-BOL-EXCH01.smartwire.com> Message-ID:

> openRTSP won't build on my win7 dev machine, something wrong with newer command line build tools on windoze. I built and ran it on the mac but it ends up with a zero length file showing a teardown immediately after sending the play with this encoder.

That's strange. Could you send us the "openRTSP" debugging output (from running "openRTSP -d 30 "). I'm curious what it is about your camera that might cause this to happen. (And thanks for sending the TS and raw H.264 files.)

Ross Finlayson Live Networks, Inc. http://www.live555.com/

-------------- next part -------------- An HTML attachment was scrubbed...
URL: From finlayson at live555.com Sun Dec 25 02:00:33 2011 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 25 Dec 2011 02:00:33 -0800 Subject: [Live-devel] invalid ts stream In-Reply-To: References: <615FD77639372542BF647F5EBAA2DBC20B1AACCB@IL-BOL-EXCH01.smartwire.com> <940D1AE2-5B21-4C3F-AC6B-66EA3156B38D@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1AB418@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B1AB507@IL-BOL-EXCH01.smartwire.com> <32D9901D-D6BA-4A7A-9CE6-632E618A8D27@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1AB8E7@IL-BOL-EXCH01.smartwire.com> <47CC6ABB-D76C-4819-9779-031948DC5FD5@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1AB98A@IL-BOL-EXCH01.smartwire.com> <3A0522EB-BEE7-4BEE-A5DE-21ECFEDFCDE1@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1ABBD7@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B1ABCE8@IL-BOL-EXCH01.smartwire.com> Message-ID: <21E7ED8E-8550-4FC1-980F-6BEF9C640473@live555.com> > (My goal here is to have a H.264 file that I know that - if processed with our "testH264VideoToTransportStream" demo application, after I've modified our Transport Stream multiplexing code based on your changes - will produce a Transport Stream file that works with HTTP Live Streaming.) Unfortunately I wasn't able to succeed in doing this. As an experiment, I modified our "MPEG2TransportStreamMultiplexor" and "MPEG2TransportStreamFromESSource" implementations to exactly match what you did in your code. But even when I did this, I wasn't able to produce a Transport Stream file (from your "CaptureH264.es" file, using our "testH264VideoToTransportStream" demo application) that works with HTTP Live Streaming. So, until I know for sure what changes, if any, are necessary to the "MPEG2TransportStreamMultiplexor" and "MPEG2TransportStreamFromESSource" code in order for HTTP Live Streaming to work, I'm going to hold off - at least for now - on making any changes to this code. Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From zhanm at join.net.cn Sun Dec 25 19:27:58 2011 From: zhanm at join.net.cn (=?gb2312?B?1bLD9w==?=) Date: Mon, 26 Dec 2011 11:27:58 +0800 Subject: [Live-devel] How to create multiple RTSP clients using threads for live video source playing at same time? Message-ID: <003c01ccc37e$5d88be00$189a3a00$@net.cn> Hi, I want to create an RTSP server that supports multiple live video source playing using threads. I have created my custom live source, sink and session object and modified MediaSubsession::initiate to handle it. I created multiple RTSP clients using threads to use my own session. The problem is: I can play only one session. Do I need to create separate subsessions for each live channel playing, using a different URL suffix? How can I synchronize between these clients? Best Regards -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sun Dec 25 21:35:41 2011 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 25 Dec 2011 21:35:41 -0800 Subject: [Live-devel] How to create multiple RTSP clients using threads for live video source playing at same time? In-Reply-To: <003c01ccc37e$5d88be00$189a3a00$@net.cn> References: <003c01ccc37e$5d88be00$189a3a00$@net.cn> Message-ID: > I want to create an RTSP server that supports multiple live video source playing using threads. I have created my custom live source, sink and session object and modified MediaSubsession::initiate to handle it. I created multiple RTSP clients using threads to use my own session. The problem is: I can play only one session. Do I need to create separate subsessions for each live channel playing, using a different URL suffix? How can I synchronize between these clients? Unfortunately I found your question a little confusing.
It wasn't clear to me whether (1) your question was about a RTSP server (How to set up a RTSP server that supports multiple streams?), or whether (2) your question was about RTSP clients (How to create multiple, concurrent RTSP clients?). If your question is (1) (How to set up a RTSP server that supports multiple streams?), then I suggest that you look at the code for the "testOnDemandRTSPServer" for guidance, and also read the FAQ entries that discuss how to have a server that streams from live source(s). Note, in particular, that each stream (e.g., coming from a specific video+audio source) has its own "ServerMediaSession", with its own name (which gets used in the "rtsp://" URL that will be used to access the stream). Each particular track within the stream (e.g., audio, video, text) will have its own "ServerMediaSubsession". Once again, the "testOnDemandRTSPServer" code should make this clearer. If your question is (2) (How to create multiple, concurrent RTSP clients?), then there are several possible ways to do this: - Create multiple processes, each receiving its own "rtsp://" URL. For example, you can run multiple copies of the "openRTSP" application - which will require no extra programming at all. - Write a single application (i.e., process) that opens/receives multiple concurrent "rtsp://" URLs, using a single thread of control. See the code for the "testRTSPClient" demo application for guidance on how to do this. - Write a single application (i.e., process) that opens/receives multiple concurrent "rtsp://" URLs, using multiple threads (one thread for each "rtsp://" URL). This approach is possible, although not recommended. If you really want to do this, then you need to read and understand the FAQ entry about threads. (Note, in particular, that each thread *must* use a separate "TaskScheduler" and "UsageEnvironment".) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
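Ross's third option (one thread per RTSP client) can be sketched as follows. This is an untested, hypothetical illustration based on the rule he states above that each thread must have its own "TaskScheduler" and "UsageEnvironment"; the camera URLs are placeholders, and the RTSP command sequence is elided (it would follow the "testRTSPClient" demo):

```cpp
#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"
#include <pthread.h>

// One event loop per thread; nothing below is shared between threads.
void* clientThread(void* arg) {
  char const* rtspURL = (char const*)arg;

  // Each thread creates its OWN scheduler and environment:
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  RTSPClient* client = RTSPClient::createNew(*env, rtspURL);
  // ... issue the asynchronous DESCRIBE/SETUP/PLAY sequence here,
  //     exactly as in the "testRTSPClient" demo ...
  (void)client;

  env->taskScheduler().doEventLoop(); // runs this thread's event loop
  return NULL;
}

int main() {
  pthread_t t1, t2; // hypothetical camera URLs:
  pthread_create(&t1, NULL, clientThread, (void*)"rtsp://cam1/stream");
  pthread_create(&t2, NULL, clientThread, (void*)"rtsp://cam2/stream");
  pthread_join(t1, NULL);
  pthread_join(t2, NULL);
  return 0;
}
```

The first two options need no new code at all: run several copies of "openRTSP", or open several URLs from one thread as "testRTSPClient" does.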
URL: From 6.45.vapuru at gmail.com Thu Dec 22 07:15:22 2011 From: 6.45.vapuru at gmail.com (Novalis Vapuru) Date: Thu, 22 Dec 2011 17:15:22 +0200 Subject: [Live-devel] How to disable RTCP timestamp calculations on single-stream sessions? [OpenRtspClient] In-Reply-To: <8468D767-0500-43F2-950E-23EE0F198A5C@live555.com> References: <8468D767-0500-43F2-950E-23EE0F198A5C@live555.com> Message-ID: Thanks... You are right, it is not a good idea to ignore RTCP corrections... I have no problem with viewing... I just push the incoming streams [since they are from a real-time IP camera] to the decoder, then the viewer, and I can see the video. Since my RTSP source is real-time, I do not insert any timestamp or presentation time before sending frames to the decoder... My timestamp problems start when I try to write the frames to an MP4 [container] file using an MP4 muxer... Because of the MP4 container format I have to set their presentation times for the muxer; otherwise I get a broken MP4 file... So first I calculate firstPresentationTime = TIMEVAL_TO_REFERENCE_TIME(presentationTime); // at the first frame [key frame] then push the other frames' timestamps to the muxer as follows: TIMEVAL_TO_REFERENCE_TIME(presentationTime) - firstPresentationTime; Jeremy Noring warned me about my TIMEVAL_TO_REFERENCE_TIME, so I fixed it (there was an overflow) as: #define TIMEVAL_TO_REFERENCE_TIME(x) (((__int64)x.tv_sec * 1000000) + x.tv_usec) * 10 For now it seems to work correctly... But anyway, if anybody writes an H.264 stream to an MP4 container using an MP4 muxer, I would like to hear their advice about timestamp calculations. Best Wishes 2011/12/22 Ross Finlayson : > My RTSP Source's RTCP SR are not reliable... > > > You are probably mistaken. > > So calculated timestamps frequently resulting in large negative jumps. > > > Let me guess: You're receiving H.264 (or MPEG) video that uses "P" > (prediction) frames.
For video streams like this, frames are sent (and thus > received) in 'decoding order' - i.e., the order that they will be fed to a > decoder, not 'presentation order' - the order that they will be displayed on > the screen. In this case, it is normal (and expected) that the received > frames' presentation times will not all be monotonically increasing. The > incoming frames should be fed into your decoder in the order that they > arrive, but their presentation times will tell you when they should be > displayed. > > But in any case, RTCP is an important part of the RTP/RTCP protocol, and > should not be disabled. > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > From kidjan at gmail.com Thu Dec 22 09:45:00 2011 From: kidjan at gmail.com (Jeremy Noring) Date: Thu, 22 Dec 2011 09:45:00 -0800 Subject: [Live-devel] Live555 EventLoop crash In-Reply-To: <615FD77639372542BF647F5EBAA2DBC20B1AB2A6@IL-BOL-EXCH01.smartwire.com> References: <0376AECF-6591-4B08-B6E6-E524BEADAE51@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1AB2A6@IL-BOL-EXCH01.smartwire.com> Message-ID: On Thu, Dec 22, 2011 at 6:12 AM, Jeff Shanab wrote: > Above the live 555 libs I have my own frame class. It is a simple RAII > data class with payload, a bit of byte alignment, and some metadata like > size and type. I use a reference counted pointer to this. This allows my > multiple subscribers to keep lists of pointers to frames, they each have > their own list and not worry about memory management and there is minimal > memcopying around. > I do this as well--all of our media data is reference counted to prevent multiple copies of the same sample. That said, my embedded processor is slow enough that even a final memcpy into Live555's buffers shows up in profiling.
> Another important detail is this makes sure he who created the memory > deletes the memory. This is to keep Windows happy: if memory is allocated > in one heap and you attempt to delete it from code running in a different > heap, Windows throws an access violation, especially in debug mode! > I had considered this. For Live555's event-driven model, you'd need a function that gets called when Live555 is done with a sample so the calling application can appropriately free the memory. In any event, this isn't sounding like it'd be a simple or straight-forward thing to add to Live555, so....not much to discuss. -------------- next part -------------- An HTML attachment was scrubbed... URL: From achraf.gazdar at gmail.com Thu Dec 22 17:12:00 2011 From: achraf.gazdar at gmail.com (Achraf Gazdar) Date: Fri, 23 Dec 2011 02:12:00 +0100 Subject: [Live-devel] Relaying an MP2T fed into RTP source through an RTSP server Message-ID: Hi Ross, I want to serve a live RTP source carrying an MP2T TV channel through an RTSP server. To do it I have subclassed OnDemandServerMediaSubsession through my MPEG2TransportUDPServerMediaSubsession class, where I have redefined the createNewStreamSource function as follows:

FramedSource* MPEG2TransportUDPServerMediaSubsession
::createNewStreamSource(unsigned /*clientSessionId*/, unsigned& estBitrate) {
  estBitrate = 5000;
  return MPEG2TransportStreamFramer::createNew(fEnv, fSource);
}

And I have redefined createNewRTPSink exactly as done in the MPEG2TransportFileServerMediaSubsession class.
To test this implementation I have added a new block to testOnDemandRTSPServer to serve this live stream, like this:

{
  char const* tvServiceName = "tv";
  char const* inputAddressStr = "127.0.0.1";
  struct in_addr inputAddress;
  inputAddress.s_addr = our_inet_addr(inputAddressStr);
  Port const inputPort(5004);
  unsigned char const inputTTL = 0;
  Groupsock inputRTPsock(*env, inputAddress, inputPort, inputTTL);
  SimpleRTPSource* RTPSource = SimpleRTPSource::createNew(*env, &inputRTPsock, 33, 90000, "video/MP2T", 0, False);
  ServerMediaSession* sms = ServerMediaSession::createNew(*env, tvServiceName, tvServiceName, descriptionString);
  sms->addSubsession(MPEG2TransportUDPServerMediaSubsession::createNew(*env, RTPSource, True)); // All clients share the same live source
  rtspServer->addServerMediaSession(sms);
  announceStream(rtspServer, sms, tvServiceName, tvServiceName);
}

The RTP source is obtained like this: STB --http raw MP2T--> VLC --MP2T on RTP--> testOnDemandRTSPServer. VLC and testOnDemandRTSPServer are running on the same machine. The problem is that when I request the live stream from any client player I get a SEGMENTATION FAULT in testOnDemandRTSPServer. When I request the RTP stream with VLC without passing through the server (STB --http raw MP2T--> VLC --MP2T on RTP--> VLC client) I get the display without any problem. Am I missing something? Thanks in advance -- Achraf Gazdar Associate Professor on Computer Science Hana Research Unit, CRISTAL Lab. http://www.hana-uma.org Tunisia -------------- next part -------------- An HTML attachment was scrubbed...
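An editorial aside, not from the thread: one plausible cause of such a crash is an object-lifetime bug in the setup block above. `inputRTPsock` is a stack variable declared inside the `{ ... }` block, but `SimpleRTPSource` keeps a pointer to it; once the block exits, the Groupsock is destroyed, and later packet handling dereferences a dangling pointer. A hedged, untested sketch of a fix (using the same hypothetical class names as the post above):

```cpp
// Allocate the Groupsock on the heap so it outlives the setup block.
// It is intentionally never deleted here, since the shared live source
// exists for the lifetime of the server.
Groupsock* inputRTPsock = new Groupsock(*env, inputAddress, inputPort, inputTTL);
SimpleRTPSource* rtpSource = SimpleRTPSource::createNew(
    *env, inputRTPsock, 33, 90000, "video/MP2T", 0, False);
```

Whether this is the actual fault would need confirming with a debugger, which is what Ross asks for in his reply.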
URL: From finlayson at live555.com Mon Dec 26 08:36:23 2011 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 26 Dec 2011 08:36:23 -0800 Subject: [Live-devel] Relaying an MP2T fed into RTP source through an RTSP server In-Reply-To: References: Message-ID: <981166BD-D038-4FDD-9D43-996FE59A3217@live555.com> > The problem is that when I request the live stream from any client player I get a SEGMENTATION FAULT in testOnDemandRTSPServer. OK, so where specifically in the code is this segmentation fault happening, and why? Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From wjwu611 at gmail.com Fri Dec 23 05:25:33 2011 From: wjwu611 at gmail.com (wj wu) Date: Fri, 23 Dec 2011 21:25:33 +0800 Subject: [Live-devel] testMPEG4VideoToDarwin.exe Message-ID: Hi everyone. I ran into a problem when trying the testMPEG4VideoToDarwin program. First, I fed a file stream to the live555 server via testMPEG4VideoToDarwin, having modified testMPEG4VideoToDarwin.cpp as follows: if (!injector->setDestination(dssNameOrAddress, remoteStreamName, programName, "LIVE555 Streaming Media",8554,"username1","password1")) . The resulting error is "Injector->setDestination<> failed:405 Method Not Allowed". But when I feed the file stream to a Darwin server via testMPEG4VideoToDarwin, having modified testMPEG4VideoToDarwin.cpp as follows: if (!injector->setDestination(dssNameOrAddress, remoteStreamName, programName, "LIVE555 Streaming Media")). this succeeds, and I can play it with VLC using the URL "rtsp://127.0.0.1:554/test.sdp". The live555 server portNum is 8554; the Darwin server portNum is 554. I don't know where the first case goes wrong; please help me. Thank you. -------------- next part -------------- An HTML attachment was scrubbed...
URL: From finlayson at live555.com Mon Dec 26 19:29:55 2011 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 26 Dec 2011 19:29:55 -0800 Subject: [Live-devel] testMPEG4VideoToDarwin.exe In-Reply-To: References: Message-ID: <922A282A-8012-4FC0-8D27-42E45453134E@live555.com> > the hint of the error is "Injector->setDestination<> failed:405 Method Not Allowed". That sounds like a permission problem - i.e., you don't have 'write' permission set on your 'Darwin' server. Note the comments at the top of "liveMedia/include/DarwinInjector.hh". Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jayeshpk1 at gmail.com Mon Dec 26 17:24:07 2011 From: jayeshpk1 at gmail.com (Jayesh Parayali) Date: Mon, 26 Dec 2011 20:24:07 -0500 Subject: [Live-devel] Save RTSP/H264 to MPEG4, WebM and JPEG files Message-ID: How can I read an RTSP stream (H264) to MPEG4, WebM and JPEG files at the same time? Any pointers are appreciated. Also, is there any documentation on how media sources and sinks work in live555? Thanks, Jay -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Dec 26 20:10:23 2011 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 26 Dec 2011 20:10:23 -0800 Subject: [Live-devel] Save RTSP/H264 to MPEG4, WebM and JPEG files In-Reply-To: References: Message-ID: > How can I read RTSP stream (H264) to MPEG4, WebM and JPEG files at the same time. Right now you can't, for several reasons: - We don't (yet) provide any mechanism for 'replicating' an input stream into more than one output stream. - We don't (yet) provide any mechanism for writing WebM-format files. - We don't provide any codecs (video or audio decoding/encoding software). Sorry. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed...
URL: From jshanab at smartwire.com Tue Dec 27 05:49:53 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Tue, 27 Dec 2011 13:49:53 +0000 Subject: [Live-devel] invalid ts stream In-Reply-To: <21E7ED8E-8550-4FC1-980F-6BEF9C640473@live555.com> References: <615FD77639372542BF647F5EBAA2DBC20B1AACCB@IL-BOL-EXCH01.smartwire.com> <940D1AE2-5B21-4C3F-AC6B-66EA3156B38D@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1AB418@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B1AB507@IL-BOL-EXCH01.smartwire.com> <32D9901D-D6BA-4A7A-9CE6-632E618A8D27@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1AB8E7@IL-BOL-EXCH01.smartwire.com> <47CC6ABB-D76C-4819-9779-031948DC5FD5@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1AB98A@IL-BOL-EXCH01.smartwire.com> <3A0522EB-BEE7-4BEE-A5DE-21ECFEDFCDE1@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1ABBD7@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B1ABCE8@IL-BOL-EXCH01.smartwire.com> <21E7ED8E-8550-4FC1-980F-6BEF9C640473@live555.com> Message-ID: <615FD77639372542BF647F5EBAA2DBC20B1ACB23@IL-BOL-EXCH01.smartwire.com> Sorry, I was away from the computer that receives this list for the holidays. Maybe I should try to explain in greater detail what Apple needs. They need short segments that start with a PAT packet, then, in order, a PMT and a keyframe and so on. So we have to have a TS stream containing PAT, PMT, keyframe, and diff frames that can be split into these segments. What I did was change the periodic inserting of PAT and PMT to trigger on detection of the nth keyframe. That way I could, at 10 FPS and a GOP size of 10, have my segmenter code trigger off the PAT packet, split these into separate, consecutively numbered files, and create an index for them. This is part of our overall application's internal webserver and is all in memory, so it is not posted.
For live555 to make full use of this there is going to need to be a "segmenter" filter that takes the ts frames in, detects the PAT packet and starts a new file and adds an entry to the index file(.m3u8). Once these are available on a web server, iPads and iPhones (and safari on mac, and the upcoming android version) can play it. Apple has a good explanation and demo files on their site. http://developer.apple.com/library/ios/#documentation/networkinginternet/conceptual/streamingmediaguide/Introduction/Introduction.html demo stream http://devimages.apple.com/iphone/samples/bipbopgear1.html The last bit is the "adaptive part" That requires transcoding or otherwise substitutable streams. The Index can take some parameters that indicate the bandwidth so the client can measure the bandwidth and switch to a different URI if the bandwidth is not cutting it. From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Sunday, December 25, 2011 4:01 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] invalid ts stream (My goal here is to have a H.264 file that I know that - if processed with our "testH264VideoToTransportStream" demo application, after I've modified our Transport Stream multiplexing code based on your changes - will produce a Transport Stream file that works with HTTP Live Streaming.) Unfortunately I wasn't able to succeed in doing this. As an experiment, I modified our "MPEG2TransportStreamMultiplexor" and "MPEG2TransportStreamFromESSource" implementations to exactly match what you did in your code. But even when I did this, I wasn't able to produce a Transport Stream file (from your "CaptureH264.es" file, using our "testH264VideoToTransportStream" demo application) that works with HTTP Live Streaming. 
So, until I know for sure what changes, if any, are necessary to the "MPEG2TransportStreamMultiplexor" and "MPEG2TransportStreamFromESSource" code in order for HTTP Live Streaming to work, I'm going to hold off - at least for now - on making any changes to this code. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jshanab at smartwire.com Tue Dec 27 06:03:28 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Tue, 27 Dec 2011 14:03:28 +0000 Subject: [Live-devel] Save RTSP/H264 to MPEG4, WebM and JPEG files In-Reply-To: References: Message-ID: <615FD77639372542BF647F5EBAA2DBC20B1ACB5E@IL-BOL-EXCH01.smartwire.com> I think I am working towards this myself, whether I want to or not. Ross, please correct me if I screw something up here.... If you are planning on doing some programming, here is one way. First, you must use another library to do the transcoding and then write each one out to a file. (I recommend libavcodec from ffmpeg.) Each of these conversions is a "sink" in live555 parlance. Most of live555 is a linear connection from sources through filters to sinks, a filter just being something that is both a source and a sink. http://www.live555.com/liveMedia/faq.html#control-flow So first we write what looks like a memory sink to live555 and connect it as the sink to the RTSPClient. This sink has a list of each of your converters. When afterGettingFrame is called, you need to call each of your converters. This is where the earlier discussion on reference counting comes in handy.
From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Jayesh Parayali Sent: Monday, December 26, 2011 7:24 PM To: LIVE555 Streaming Media - development & use Subject: [Live-devel] Save RTSP/H264 to MPEG4, WebM and JPEG files How can I read RTSP stream (H264) to MPEG4, WebM and JPEG files at the same time. Any pointers is appreciated. Also is there any documentation on how media source and sinks work in live555? Thanks, Jay -------------- next part -------------- An HTML attachment was scrubbed... URL: From bdiwi.rawya at gmail.com Tue Dec 27 06:30:11 2011 From: bdiwi.rawya at gmail.com (bdiwi rawia) Date: Tue, 27 Dec 2011 15:30:11 +0100 Subject: [Live-devel] set top box problem Message-ID: Hi all, I use the LIVE555 Media Server to read a Transport Stream file. The file plays without any problem with the VideoLAN client. The problem is that my set-top-box client (which receives, decodes and displays the stream) doesn't display the video. Analyzing a Wireshark capture, I found that the SETUP method succeeds, but the GET_PARAMETER method used by the STB is not supported ("405 method not allowed"). I use a private network for testing. Has anyone seen this problem? Is there a change that I have made in the source code of the RTSP server, or is there a configuration that I have to make to solve this problem? Please help. Thanks !!! -------------- next part -------------- An HTML attachment was scrubbed...
URL: From finlayson at live555.com Tue Dec 27 06:49:06 2011 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 27 Dec 2011 06:49:06 -0800 Subject: [Live-devel] invalid ts stream In-Reply-To: <615FD77639372542BF647F5EBAA2DBC20B1ACB23@IL-BOL-EXCH01.smartwire.com> References: <615FD77639372542BF647F5EBAA2DBC20B1AACCB@IL-BOL-EXCH01.smartwire.com> <940D1AE2-5B21-4C3F-AC6B-66EA3156B38D@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1AB418@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B1AB507@IL-BOL-EXCH01.smartwire.com> <32D9901D-D6BA-4A7A-9CE6-632E618A8D27@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1AB8E7@IL-BOL-EXCH01.smartwire.com> <47CC6ABB-D76C-4819-9779-031948DC5FD5@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1AB98A@IL-BOL-EXCH01.smartwire.com> <3A0522EB-BEE7-4BEE-A5DE-21ECFEDFCDE1@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1ABBD7@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B1ABCE8@IL-BOL-EXCH01.smartwire.com> <21E7ED8E-8550-4FC1-980F-6BEF9C640473@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1ACB23@IL-BOL-EXCH01.smartwire.com> Message-ID: <23B06372-A1BE-4FEC-BBED-BEFCF34C3235@live555.com> Sigh... I think you might not be familiar with the HTTP Live Streaming server implementation that we *already* have (and have had for almost 6 months now). See: http://www.live555.com/mediaServer/#http-live-streaming Our server can *already* stream a H.264-encoded Transport Stream file via HTTP Live Streaming. We explicitly *do not* segment the file in any way. Instead, we create (automatically, in our server) a playlist to serve to clients. Each entry in the playlist refers (using time-based parameters in the URL) to a specific region of the file, but we do not actually segment the file. Instead, our server delivers the appropriate portion of the file automatically. Our only requirement is that the file be indexed (using our normal 'trick mode' indexing mechanism) beforehand. 
And this works just fine - e.g., with a file made up from Apple's "bipbopgear1" Transport Stream example. However, it *did not* work with a Transport Stream file that I created - using our "testH264VideoToTransportStream" demo application - from the "CaptureH264.es" file that you provided - ***even after*** I modified our "MPEG2TransportStreamMultiplexor" and "MPEG2TransportStreamFromESSource" implementations to exactly match what you did in your code. So, as I said before, until I know for sure what changes, if any, are necessary to the "MPEG2TransportStreamMultiplexor" and "MPEG2TransportStreamFromESSource" code in order for HTTP Live Streaming to work, I'm going to hold off - at least for now - on making any changes to this code. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Dec 27 06:56:08 2011 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 27 Dec 2011 06:56:08 -0800 Subject: [Live-devel] set top box problem In-Reply-To: References: Message-ID: <034D0616-567D-4E54-8161-018D3DF1E409@live555.com> > I use live555 Media server to read Transport Stream file. The file is launched without any problem with the Videolan client. > > The problem is that my set top box client (which receives, decodes and displays stream) doesn't display the video. > Analyzing wireshark capture i have found that SETUP method server is successful but the method used by the STB GET_PARAMETER is not supported ("405 method not allowed"). > Please post (on this mailing list) a specific example of the RTSP protocol exchange (between your set top box and our server) that fails for you. > is there a change that i have made in the source code of the rtsp server ? > Well, *have you* made a change to the source code of the RTSP server? (If you have, then you can expect no support on this mailing list.
You should have first done your testing without having made any changes at all to the supplied source code.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jshanab at smartwire.com Tue Dec 27 07:22:53 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Tue, 27 Dec 2011 15:22:53 +0000 Subject: [Live-devel] invalid ts stream In-Reply-To: <23B06372-A1BE-4FEC-BBED-BEFCF34C3235@live555.com> References: <615FD77639372542BF647F5EBAA2DBC20B1AACCB@IL-BOL-EXCH01.smartwire.com> <940D1AE2-5B21-4C3F-AC6B-66EA3156B38D@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1AB418@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B1AB507@IL-BOL-EXCH01.smartwire.com> <32D9901D-D6BA-4A7A-9CE6-632E618A8D27@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1AB8E7@IL-BOL-EXCH01.smartwire.com> <47CC6ABB-D76C-4819-9779-031948DC5FD5@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1AB98A@IL-BOL-EXCH01.smartwire.com> <3A0522EB-BEE7-4BEE-A5DE-21ECFEDFCDE1@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1ABBD7@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B1ABCE8@IL-BOL-EXCH01.smartwire.com> <21E7ED8E-8550-4FC1-980F-6BEF9C640473@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1ACB23@IL-BOL-EXCH01.smartwire.com> <23B06372-A1BE-4FEC-BBED-BEFCF34C3235@live555.com> Message-ID: <615FD77639372542BF647F5EBAA2DBC20B1ACBE4@IL-BOL-EXCH01.smartwire.com> I was aware of it by description only. The "explicitly do not segment" and having to create the index after the file is completed were game killers for me :( . I needed to segment and build the index in real time to restream a live source and have pieces time out and go away. Maybe I should have looked closer at it instead. But, perhaps misguidedly, when I dug into the multiplexor code I found that the PAT and PMT were being added on an interval and had nothing to do with segment size or relative keyframe position.
Something that I thought was essential for iOS (it is "strongly suggested", LOL). The cleanest method for me was to have a muxer for iOS that made a TS stream that makes segmenting and indexing almost trivial. I am afraid I do not know what the trick play index looks like, and thought the index mentioned was the .m3u8 being done after the fact. I believe it is an external index, and I unfortunately already have an external index and my own file format, which is now our "legacy" for our archives. Can the trick play index be created and maintained on the fly? Or does it need, and need to be, a completed file? For our live stream, there are no files on disk; it is all rolling virtual files in memory. What I gathered from the documentation was that the existing implementation was 1) serving on its own port and 2) file based. In my server, if a user gets the .m3u8 index and it has 3-5 second entries in it, i.e. segment1.ts, segment2.ts, segment3.ts, and they come back 5 seconds later for the same index, they get an updated one that has segment2.ts, segment3.ts, segment4.ts. It implements a "sliding window", to quote the Apple docs. From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Tuesday, December 27, 2011 8:49 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] invalid ts stream Sigh... I think you might not be familiar with the HTTP Live Streaming server implementation that we *already* have (and have had for almost 6 months now). See: http://www.live555.com/mediaServer/#http-live-streaming Our server can *already* stream a H.264-encoded Transport Stream file via HTTP Live Streaming. We explicitly *do not* segment the file in any way. Instead, we create (automatically, in our server) a playlist to serve to clients. Each entry in the playlist refers (using time-based parameters in the URL) to a specific region of the file, but we do not actually segment the file.
Instead, our server delivers the appropriate portion of the file automatically. Our only requirement is that the file be indexed (using our normal 'trick mode' indexing mechanism) beforehand. And this works just fine - e.g., with a file made up from Apple's "bipbopgear1" Transport Stream example. However, it *did not* work with a Transport Stream file that I created - using our "testH264VideoToTransportStream" demo application - from the "CaptureH264.es" file that you provided - ***even after*** I modified our "MPEG2TransportStreamMultiplexor" and "MPEG2TransportStreamFromESSource" implementations to exactly match what you did in your code. So, as I said before, until I know for sure what changes, if any, are necessary to the "MPEG2TransportStreamMultiplexor" and "MPEG2TransportStreamFromESSource" code in order for HTTP Live Streaming to work, I'm going to hold off - at least for now - on making any changes to this code. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From bdiwi.rawya at gmail.com Tue Dec 27 08:23:10 2011 From: bdiwi.rawya at gmail.com (bdiwi rawia) Date: Tue, 27 Dec 2011 17:23:10 +0100 Subject: [Live-devel] set top box problem In-Reply-To: <034D0616-567D-4E54-8161-018D3DF1E409@live555.com> References: <034D0616-567D-4E54-8161-018D3DF1E409@live555.com> Message-ID: Thank you for the response. I haven't modified anything in the source code; I just asked if I have to modify the code to solve this problem.
So, i analyzed the traffic between the decoder and the RTSP server with a trace module and i got this message: 00:00:00.000 BBClientSDK Version: 2.1.53 00:00:00.001 set_param: STBID=002691D28012 00:00:00.001 set_param: BufferSizeSec=1 00:00:00.001 set_param: MaxBufferSizeKByte=256 00:00:00.001 set_param: DefBufferSizeKByte=128 00:00:00.001 ------------------------------------------------------------ ---------------------------- 00:00:00.001 | Id | Name | Minimum | Maximum | Value | 00:00:00.001 ------------------------------------------------------------ ---------------------------- 00:00:00.001 | 1| DebugLevel | 0| 255| 15 | 00:00:00.001 | 2| RemoteDebug | 0| 80| | 00:00:00.001 | 3| UserAgent | 0| 256| BBClientSDK Ver:2.1 | 00:00:00.001 | 4| STBID | 0| 256| 002691D28012 | 00:00:00.001 | 5| LoopEnabled | 0| 1| 0 | 00:00:00.001 | 6| LoadConfigTimeOut | 200| 60000| 5000 | 00:00:00.001 | 7| LoadPlayListTimeout | 200| 60000| 5000 | 00:00:00.001 | 8| ServerNotFoundDelay | 200| 30000| 2000 | 00:00:00.001 | 9| FindServerRetrNumber | 1| 3000| 3 | 00:00:00.001 | 10| MaxBitrate | 1000000| 100000000| 10000000 | 00:00:00.002 | 11| LocalIP | 0| 64| | 00:00:00.002 | 12| LocalIfName | 0| 64| eth0 | 00:00:00.002 | 13| SGatewayIP | 0| 74| | 00:00:00.002 | 14| DecoderBufSize | 0| 8388608| 0 | 00:00:00.002 | 15| UpdateTimeFromSG | 0| 1| 1 | 00:00:00.002 | 16| BufferSizeSec | 1| 20| 1 | 00:00:00.002 | 17| MaxBufferSizeKByte | 128| 131072| 256 | 00:00:00.002 | 18| DefBufferSizeKByte | 128| 131072| 128 | 00:00:00.002 | 19| RtThreadPriority | 0| 1024| 0 | 00:00:00.002 | 20| NRtThreadPriority | 0| 1024| 0 | 00:00:00.002 | 21| HTTPFindServerTimeOut | 1000| 60000| 5000 | 00:00:00.002 | 22| HTTPFindGoodLoading | 5| 95| 20 | 00:00:00.002 | 23| HTTPConnectTimeOut | 200| 60000| 2000 | 00:00:00.002 | 24| HTTPResponseTimeOut | 200| 60000| 2000 | 00:00:00.002 | 25| HTTPDataReceiveTimeOut | 50| 30000| 2000 | 00:00:00.002 | 26| HTTPFseekFrameMSec | 500| 10000| 800 | 00:00:00.002 | 27| 
HTTPFilterEnable | 0| 1| 0 | 00:00:00.002 | 28| RTSPFindServerByDescribe | 0| 1| 0 | 00:00:00.002 | 29| RTSPFindServerTimeOut | 1000| 60000| 5000 | 00:00:00.003 | 30| RTSPFindGoodLoading | 5| 95| 20 | 00:00:00.003 | 31| RTSPConnectTimeout | 500| 30000| 2000 | 00:00:00.003 | 32| RTSPResponseTimeOut | 200| 60000| 2000 | 00:00:00.003 | 33| RTSPTcpConnectTimeOut | 500| 10000| 2000 | 00:00:00.003 | 34| RTSPDataReceiveTimeOut | 200| 60000| 2000 | 00:00:00.003 | 35| RTSPReceiveErrMaxRetries | 1| 100000000| 5 | 00:00:01.002 | 36| RTSPQoSMaxRec | 0| 100| 10 | 00:00:01.002 | 37| RTSPProtocol | 0| 2| 1 | 00:00:01.002 | 38| PLTVDelay | 0| 60| 5 | 00:00:01.002 | 39| RTSPMaxSpeed_X100 | 100| 400| 120 | 00:00:01.002 | 40| RTSPBufTopMSec | 200| 4000| 1000 | 00:00:01.002 | 41| RTSPBufLowMSec | 200| 4000| 500 | 00:00:01.002 | 42| RTSPContinueByPTS | 0| 1| 0 | 00:00:01.002 | 43| RTSPTcpBufLimitMSec | 50| 2000| 100 | 00:00:01.002 | 44| RTSPUdpPortStart | 1500| 30000| 1500 | 00:00:01.002 | 45| RTSPUdpPortNumb | 2| 2000| 1500 | 00:00:01.002 | 46| RTSPUdpMapTimeoutSec | 0| 30000| 30 | 00:00:01.003 | 47| RTSPKkeepAliveNum | 1| 100| 1 | 00:00:01.003 | 48| UDPReconnectDelay | 100| 60000| 1000 | 00:00:01.003 | 49| UDPMaxRetries | 1| 100000000| 10 | 00:00:01.003 | 50| UDPDataReceiveTimeOut | 100| 60000| 2000 | 00:00:01.003 | 51| UDPFECDisable | 0| 1| 0 | 00:00:01.003 | 52| UDPPrebufKB | 0| 2048| 100 | 00:00:01.003 | 63| PLSTParamEnable | 0| 1| 1 | 00:00:01.003 | 64| DBGParam1 | 0| 2147483647| 0 | 00:00:01.003 | 65| DBGParam2 | 0| 2147483647| 0 | 00:00:01.003 ------------------------------------------------------------ ---------------------------- 00:00:01.003 Entry number 0 00:00:01.003 Insert 0 host Protocol:2 IP:190.99.21.31 Port:8554 PR:0 00:00:01.039 bbsdk_play_strm: entry:0 pos:0, scale:1. 00:00:01.039 bbrb_set_prebuffering: 0. 00:00:01.047 bbrtspdstrm_pause. 00:00:01.047 bbrtspdstrm_pause. pipe found 0x289a2e8 (nil) 00:00:01.099 close_sesion: sesion closed. 
00:00:01.099 bbtcp_find_server: Check connection with:190.99.21.31:8554 00:00:01.140 bbtcp_send: GET_PARAMETER rtsp:// 190.99.21.31/mpeg2TransportStreamTest RTSP/1.0 CSeq: 1 Content-Length: 84 Content-Type: text/parameters Version Build Type ServerLoad QoSRecNum x-setup-timeout-msec server-time-utc 00:00:01.181 bbtcp_find_server: RTSP/1.0 405 Method Not Allowed CSeq: 1 Date: Tue, Dec 20 2011 11:00:47 GMT Allow: OPTIONS, DESCRIBE, SETUP, TEARDOWN, PLAY, PAUSE, GET_PARAMETER, SET_PARAMETER . 00:00:01.222 bbtcp_find_server: server not found. 00:00:01.222 close_sesion: sesion closed. 00:00:01.222 wait_for_cmd: 2000 msec. 00:00:03.262 bbtcp_find_server: Check connection with:190.99.21.31:8554 00:00:03.303 bbtcp_send: GET_PARAMETER rtsp:// 190.99.21.31/mpeg2TransportStreamTest RTSP/1.0 CSeq: 1 Content-Length: 84 Content-Type: text/parameters Version Build Type ServerLoad QoSRecNum x-setup-timeout-msec server-time-utc 00:00:03.344 bbtcp_find_server: RTSP/1.0 405 Method Not Allowed CSeq: 1 Date: Tue, Dec 20 2011 11:00:49 GMT Allow: OPTIONS, DESCRIBE, SETUP, TEARDOWN, PLAY, PAUSE, GET_PARAMETER, SET_PARAMETER . 00:00:03.385 bbtcp_find_server: server not found. 00:00:03.385 close_sesion: sesion closed. 00:00:03.385 wait_for_cmd: 2000 msec. 00:00:05.425 bbtcp_find_server: Check connection with:190.99.21.31:8554 00:00:05.467 bbtcp_send: GET_PARAMETER rtsp:// 190.99.21.31/mpeg2TransportStreamTest RTSP/1.0 CSeq: 1 Content-Length: 84 Content-Type: text/parameters Version Build Type ServerLoad QoSRecNum x-setup-timeout-msec server-time-utc 00:00:05.508 bbtcp_find_server: RTSP/1.0 405 Method Not Allowed CSeq: 1 Date: Tue, Dec 20 2011 11:00:51 GMT Allow: OPTIONS, DESCRIBE, SETUP, TEARDOWN, PLAY, PAUSE, GET_PARAMETER, SET_PARAMETER . 00:00:05.549 bbtcp_find_server: server not found. 00:00:05.549 close_sesion: sesion closed. 00:00:05.549 wait_for_cmd: 2000 msec. 
00:00:07.589 bbtcp_find_server: Check connection with:190.99.21.31:8554 00:00:07.631 bbtcp_send: GET_PARAMETER rtsp:// 190.99.21.31/mpeg2TransportStreamTest RTSP/1.0 CSeq: 1 Content-Length: 84 Content-Type: text/parameters Version Build Type ServerLoad QoSRecNum x-setup-timeout-msec server-time-utc 00:00:07.672 bbtcp_find_server: RTSP/1.0 405 Method Not Allowed CSeq: 1 Date: Tue, Dec 20 2011 11:00:53 GMT Allow: OPTIONS, DESCRIBE, SETUP, TEARDOWN, PLAY, PAUSE, GET_PARAMETER, SET_PARAMETER . 00:00:07.713 bbtcp_find_server: server not found. 00:00:07.713 close_sesion: sesion closed. 00:00:07.713 wait_for_cmd: 2000 msec. 00:00:09.755 bbtcp_find_server: Check connection with:190.99.21.31:8554 00:00:09.796 bbtcp_send: GET_PARAMETER rtsp:// 190.99.21.31/mpeg2TransportStreamTest RTSP/1.0 CSeq: 1 Content-Length: 84 Content-Type: text/parameters Version Build Type ServerLoad QoSRecNum x-setup-timeout-msec server-time-utc 00:00:09.838 bbtcp_find_server: RTSP/1.0 405 Method Not Allowed CSeq: 1 Date: Tue, Dec 20 2011 11:00:56 GMT Allow: OPTIONS, DESCRIBE, SETUP, TEARDOWN, PLAY, PAUSE, GET_PARAMETER, SET_PARAMETER . 00:00:09.879 bbtcp_find_server: server not found. 00:00:09.880 close_sesion: sesion closed. 00:00:09.880 wait_for_cmd: 2000 msec. 00:00:11.921 bbtcp_find_server: Check connection with:190.99.21.31:8554 00:00:11.962 bbtcp_send: GET_PARAMETER rtsp:// 190.99.21.31/mpeg2TransportStreamTest RTSP/1.0 CSeq: 1 Content-Length: 84 Content-Type: text/parameters Version Build Type ServerLoad QoSRecNum x-setup-timeout-msec server-time-utc 00:00:12.004 bbtcp_find_server: RTSP/1.0 405 Method Not Allowed CSeq: 1 Date: Tue, Dec 20 2011 11:00:58 GMT Allow: OPTIONS, DESCRIBE, SETUP, TEARDOWN, PLAY, PAUSE, GET_PARAMETER, SET_PARAMETER . 00:00:12.046 bbtcp_find_server: server not found. 00:00:12.047 close_sesion: sesion closed. 00:00:12.047 wait_for_cmd: 2000 msec. 
00:00:14.088 bbtcp_find_server: Check connection with:190.99.21.31:8554 00:00:14.129 bbtcp_send: GET_PARAMETER rtsp:// 190.99.21.31/mpeg2TransportStreamTest RTSP/1.0 CSeq: 1 Content-Length: 84 Content-Type: text/parameters Version Build Type ServerLoad QoSRecNum x-setup-timeout-msec server-time-utc 00:00:14.171 bbtcp_find_server: RTSP/1.0 405 Method Not Allowed CSeq: 1 Date: Tue, Dec 20 2011 11:01:00 GMT Allow: OPTIONS, DESCRIBE, SETUP, TEARDOWN, PLAY, PAUSE, GET_PARAMETER, SET_PARAMETER . 00:00:14.213 bbtcp_find_server: server not found. 00:00:14.214 close_sesion: sesion closed. 00:00:14.214 wait_for_cmd: 2000 msec. 00:00:16.049 bbrtspdstrm_pause. 00:00:16.051 bbrtsp_command_loop: thread closed. On 27 December 2011 at 15:56, Ross Finlayson wrote: > I use the live555 Media Server to read a Transport Stream file. The file is > launched without any problem with the VideoLAN client. > > The problem is that my set top box client (which receives, decodes and > displays the stream) doesn't display the video. > Analyzing a Wireshark capture, I found that the SETUP method on the server is > successful, but the method used by the STB, GET_PARAMETER, is not supported > ("405 Method Not Allowed"). > > > Please post (on this mailing list) a specific example of the RTSP protocol > exchange (between your set top box and our server) that fails for you. > > is there a change that I have made in the source code of the RTSP server? > > Well, *have you* made a change to the source code of the RTSP server? (If > you have, then you can expect no support on this mailing list. You should > have first done your testing without having made any changes at all to the > supplied source code.) > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -------------- next part -------------- An HTML attachment was scrubbed...
URL: From finlayson at live555.com Tue Dec 27 15:17:29 2011 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 27 Dec 2011 15:17:29 -0800 Subject: [Live-devel] set top box problem In-Reply-To: References: <034D0616-567D-4E54-8161-018D3DF1E409@live555.com> Message-ID: <561751ED-C55D-4631-9716-8F8A2E44BF89@live555.com> The problem is that your client (STB) is trying to send an RTSP "GET_PARAMETER" command with a "rtsp://" URL, without having first done an RTSP "DESCRIBE" and then an RTSP "SETUP" on this URL. Our server doesn't support this, nor does any other server that I'm aware of. (I'm not sure that what the client's doing here is even legal RTSP.) If your client had instead sent the "GET_PARAMETER" command with "*" as the URL, then our server would have responded without an error. However, by default, our server just responds to "GET_PARAMETER" with an empty response. Because your client is asking for a specific set of parameters, it's unlikely that it would be happy with our server's response in any case. So, unless you can get your STB's manufacturer to change its RTSP client implementation (to be more standard), you're probably out of luck with this client. Sorry. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed...
URL: From finlayson at live555.com Tue Dec 27 15:37:53 2011 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 27 Dec 2011 15:37:53 -0800 Subject: [Live-devel] invalid ts stream In-Reply-To: <615FD77639372542BF647F5EBAA2DBC20B1ACBE4@IL-BOL-EXCH01.smartwire.com> References: <615FD77639372542BF647F5EBAA2DBC20B1AACCB@IL-BOL-EXCH01.smartwire.com> <940D1AE2-5B21-4C3F-AC6B-66EA3156B38D@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1AB418@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B1AB507@IL-BOL-EXCH01.smartwire.com> <32D9901D-D6BA-4A7A-9CE6-632E618A8D27@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1AB8E7@IL-BOL-EXCH01.smartwire.com> <47CC6ABB-D76C-4819-9779-031948DC5FD5@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1AB98A@IL-BOL-EXCH01.smartwire.com> <3A0522EB-BEE7-4BEE-A5DE-21ECFEDFCDE1@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1ABBD7@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC20B1ABCE8@IL-BOL-EXCH01.smartwire.com> <21E7ED8E-8550-4FC1-980F-6BEF9C640473@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1ACB23@IL-BOL-EXCH01.smartwire.com> <23B06372-A1BE-4FEC-BBED-BEFCF34C3235@live555.com> <615FD77639372542BF647F5EBAA2DBC20B1ACBE4@IL-BOL-EXCH01.smartwire.com> Message-ID: > Can the trick play index be created and maintained on the fly? No. > Or does it need , and need to be, a completed file? Yes. Our server implementation supports HTTP Live Streaming only for pre-recorded (and pre-indexed) files. It cannot work for live streams. So it would not be suitable for your application. But I don't care about that. Once again (for the third, and final time!) my point is that the raw H.264 file ("CaptureH264.es") that you provided does not produce a Transport Stream file that works with HTTP Live Streaming, ***even if*** I modify our "MPEG2TransportStreamMultiplexor" and "MPEG2TransportStreamFromESSource" implementations to exactly match what you did in your code. 
Unless and until I see such a raw H.264 file, I'm not going to be making any such modifications to "MPEG2TransportStreamMultiplexor" and/or "MPEG2TransportStreamFromESSource" in our released code. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From achraf.gazdar at gmail.com Tue Dec 27 16:43:14 2011 From: achraf.gazdar at gmail.com (Achraf Gazdar) Date: Wed, 28 Dec 2011 01:43:14 +0100 Subject: [Live-devel] relaying an MPEG2 Transport Stream RTP live source through an RTSPServer Message-ID: Dear Ross, Is it possible to relay a live RTP source serving an MPEG2 TS stream through an RTSP server using live555? As I said in my previous email, I have subclassed OnDemandServerMediaSubsession and redefined createNewStreamSource() to return a SimpleRTPSource fed into an MPEG2TransportStreamFramer object. The call to getNextFrame() on this source finishes with a segmentation fault; that's why I asked whether this kind of relaying is possible before debugging the source code deeply. Thanks in advance. -- Achraf Gazdar Associate Professor of Computer Science Hana Research Unit, CRISTAL Lab. http://www.hana-uma.org Tunisia -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Dec 27 20:25:33 2011 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 27 Dec 2011 20:25:33 -0800 Subject: [Live-devel] relaying an MPEG2 Transport Stream RTP live source through an RTSPServer In-Reply-To: References: Message-ID: <6C1BC2F3-24FA-4D5B-A6C2-6E732A62192D@live555.com> > Is it possible to relay a live RTP source serving an MPEG2 TS stream through an RTSP server using live555? Yes! > As I said in my previous email And as I said in my previous response: >> The problem is that when I request the live stream from any client player I get a SEGMENTATION FAULT on the testOnDemandRTSPServer.
> > OK, so where specifically in the code is this segmentation fault happening, and why? You have written an application, but you get a segmentation fault when you run it. So your job now is to find out exactly where this segmentation fault is happening, and why. (Remember, You Have Complete Source Code.) What you're doing looks reasonable, and I didn't see any obvious problem with the code that you posted (though I didn't spend a lot of time looking at it). But I can't debug your code for you. You need to figure out where/why you're getting the segmentation fault. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From zhanm at join.net.cn Thu Dec 29 05:49:44 2011 From: zhanm at join.net.cn (Zhan Ming) Date: Thu, 29 Dec 2011 21:49:44 +0800 Subject: [Live-devel] Re: How to create multiple RTSP clients using threads for live video source playing at same time? In-Reply-To: References: <003c01ccc37e$5d88be00$189a3a00$@net.cn> Message-ID: <001801ccc630$b895d9e0$29c18da0$@net.cn> Thanks for your reply. Now, I can play multiple streams concurrently after taking the testRTSPClient program as an example. But this works OK only when I provide all the URLs at one time and then call doEventLoop(). How can I make it work so that I can play another stream after other streams have already started but haven't stopped playing (with one thread of control, not using threads)? From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: 26 December 2011 13:36 To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] How to create multiple RTSP clients using threads for live video source playing at same time? I want to create an RTSP server that supports multiple live video source playing using threads.
I have created my custom live source, sink and session objects and modified MediaSubsession::initiate to handle them. I created multiple RTSP clients using threads to use my own session. The problem is: I can play only one session. Do I need to create separate subsessions for each live channel playing, using a different URL suffix? How can I synchronize between these clients? Unfortunately I found your question a little confusing. It wasn't clear to me whether (1) your question was about an RTSP server (How to set up an RTSP server that supports multiple streams?), or whether (2) your question was about RTSP clients (How to create multiple, concurrent RTSP clients?). If your question is (1) (How to set up an RTSP server that supports multiple streams?), then I suggest that you look at the code for the "testOnDemandRTSPServer" for guidance, and also read the FAQ entries that discuss how to have a server that streams from live source(s). Note, in particular, that each stream (e.g., coming from a specific video+audio source) has its own "ServerMediaSession", with its own name (which gets used in the "rtsp://" URL that will be used to access the stream). Each particular track within the stream (e.g., audio, video, text) will have its own "ServerMediaSubsession". Once again, the "testOnDemandRTSPServer" code should make this clearer. If your question is (2) (How to create multiple, concurrent RTSP clients?), then there are several possible ways to do this: - Create multiple processes, each receiving its own "rtsp://" URL. For example, you can run multiple copies of the "openRTSP" application - which will require no extra programming at all. - Write a single application (i.e., process) that opens/receives multiple concurrent "rtsp://" URLs, using a single thread of control. See the code for the "testRTSPClient" demo application for guidance on how to do this.
- Write a single application (i.e., process) that opens/receives multiple concurrent "rtsp://" URLs, using multiple threads (one thread for each "rtsp://" URL). This approach is possible, although not recommended. If you really want to do this, then you need to read and understand the FAQ entry about threads. (Note, in particular, that each thread *must* use a separate "TaskScheduler" and "UsageEnvironment".) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Dec 29 06:28:25 2011 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 29 Dec 2011 06:28:25 -0800 Subject: [Live-devel] How to create multiple RTSP clients using threads for live video source playing at same time? In-Reply-To: <001801ccc630$b895d9e0$29c18da0$@net.cn> References: <003c01ccc37e$5d88be00$189a3a00$@net.cn> <001801ccc630$b895d9e0$29c18da0$@net.cn> Message-ID: > Thanks for your reply. Now, I can play multiple streams concurrently after taking the testRTSPClient program as an example. But this works OK only when I provide all the URLs at one time and then call doEventLoop(). How can I make it work so that I can play another stream after other streams have already started but haven't stopped playing (with one thread of control, not using threads)? First, the easiest way to do this is to run each RTSP client in a separate *process* (i.e., in a separate application, *not* as a separate thread within a single application). If you do not need to share memory between the stream receivers, then this is the simplest solution. The rest of this message, however, assumes that - for whatever reason - you want all of your stream receivers to run in the same application (i.e., process). How to start playing another stream? This depends - what decides when a new stream should be played? Whatever this is, it will need to be handled as an "event", within the event loop.
Perhaps, for example, you have a GUI - running in a different thread - where the user indicates (perhaps by filling in a form and clicking a button) that a new stream should be started. In this case, you could use the "EventTrigger" mechanism that's defined for the "TaskScheduler" class: See http://www.live555.com/liveMedia/faq.html#other-kinds-of-event Or perhaps you have a TCP socket connected to a console that the user is using to type commands (e.g. "start-playing <url>"). In this case, this 'command-line user interface' TCP socket could be read and parsed from within the event loop - using a handler that's set up using the "TaskScheduler::turnOnBackgroundReadHandling()" routine. But whatever you do, the 'event' that determines that a new stream needs to be played is an 'event' that will need to be handled within the event loop. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From achraf.gazdar at gmail.com Thu Dec 29 06:33:57 2011 From: achraf.gazdar at gmail.com (Achraf Gazdar) Date: Thu, 29 Dec 2011 15:33:57 +0100 Subject: [Live-devel] Configuring a BasicUDPSource with a unicast stream Message-ID: Hi Ross, How do I configure the Groupsock object fed into a BasicUDPSource to read an incoming unicast UDP stream? Thanks in advance -- Achraf Gazdar Associate Professor of Computer Science Hana Research Unit, CRISTAL Lab. http://www.hana-uma.org Tunisia -------------- next part -------------- An HTML attachment was scrubbed... URL: From georgehafiz at gmail.com Thu Dec 29 09:00:27 2011 From: georgehafiz at gmail.com (George Hafiz) Date: Thu, 29 Dec 2011 17:00:27 +0000 Subject: [Live-devel] Changing buffer size on Win32 binary Message-ID: I am trying to stream a 720p x264 encoded video in the Matroska container; however, I get an "input frame size too large for buffer" error. I tried using the -b switch when loading the binary, but the error indicated the same buffer size as before.
How do you adjust server parameters with the win32 binary, or is it not possible? -- From George Hafiz From finlayson at live555.com Thu Dec 29 11:40:33 2011 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 29 Dec 2011 11:40:33 -0800 Subject: [Live-devel] Configuring a BasicUDPSource with a unicast stream In-Reply-To: References: Message-ID: <2DB6F2AF-326F-4492-BF09-7D527989B051@live555.com> > How do I configure the Groupsock object fed into a BasicUDPSource to read an incoming unicast UDP stream? Using 0 as the 'group address' should work - i.e.,

struct in_addr inputAddress;
inputAddress.s_addr = 0;
Port const inputPort(8888); // replace this with the actual port number that you want to use
Groupsock* inputGroupsock = new Groupsock(*env, inputAddress, inputPort, 0);

Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Dec 29 11:42:55 2011 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 29 Dec 2011 11:42:55 -0800 Subject: [Live-devel] Changing buffer size on Win32 binary In-Reply-To: References: Message-ID:
> I am trying to stream a 720p x264 encoded video in the Matroska
> container; however, I get an "input frame size too large for buffer" error.
>
> I tried using the -b switch when loading the binary but the error
> indicated the same buffer size as before. How do you adjust server
> parameters with the win32 binary, or is it not possible?
You didn't say so explicitly, but I assume that you are using the "live555MediaServer" application as your server, and "openRTSP" as your client. Please put on a web server an example of a Matroska (.mkv or .webm) file that illustrates your problem, and send us the URL, so we can download and test it ourselves. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed...
URL: From jshanab at smartwire.com Fri Dec 30 08:55:15 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Fri, 30 Dec 2011 16:55:15 +0000 Subject: [Live-devel] New performance issue in my app Message-ID: <615FD77639372542BF647F5EBAA2DBC20B1AD9E8@IL-BOL-EXCH01.smartwire.com> I added a new section and I am having a performance issue. Something with the way I use the scheduler. http://www.mediafire.com/i/?cdg5t964h8q8xlg Here is a hot-spot trace that shows where the time appears to be spent. Does this give any hints to what I have done wrong? -------------- next part -------------- An HTML attachment was scrubbed... URL: From drollinson at logostech.net Fri Dec 30 09:23:13 2011 From: drollinson at logostech.net (Rollinson, Derek) Date: Fri, 30 Dec 2011 09:23:13 -0800 Subject: [Live-devel] New performance issue in my app In-Reply-To: <615FD77639372542BF647F5EBAA2DBC20B1AD9E8@IL-BOL-EXCH01.smartwire.com> References: <615FD77639372542BF647F5EBAA2DBC20B1AD9E8@IL-BOL-EXCH01.smartwire.com> Message-ID: <8CD7A9204779214D9FDC255DE48B9521B6C48767@EXPMBX105-1.exch.logostech.net> It was the one in the latest folder on GPU4. From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Jeff Shanab Sent: Friday, December 30, 2011 11:55 AM To: LIVE555 Streaming Media - development & use (live-devel at ns.live555.com) Subject: [Live-devel] New performance issue in my app I added a new section and I am having a performance issue. Something with the way I use the scheduler. http://www.mediafire.com/i/?cdg5t964h8q8xlg Here is a hot-spot trace that shows where the time appears to be spent. Does this give any hints to what I have done wrong? -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From drollinson at logostech.net Fri Dec 30 09:25:25 2011 From: drollinson at logostech.net (Rollinson, Derek) Date: Fri, 30 Dec 2011 09:25:25 -0800 Subject: [Live-devel] New performance issue in my app References: <615FD77639372542BF647F5EBAA2DBC20B1AD9E8@IL-BOL-EXCH01.smartwire.com> Message-ID: <8CD7A9204779214D9FDC255DE48B9521B6C4876A@EXPMBX105-1.exch.logostech.net> My apologies to all; that last email was a mistake. I replied to the wrong message. From: Rollinson, Derek Sent: Friday, December 30, 2011 12:23 PM To: LIVE555 Streaming Media - development & use (live-devel at ns.live555.com) Subject: RE: New performance issue in my app It was the one in the latest folder on GPU4. From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Jeff Shanab Sent: Friday, December 30, 2011 11:55 AM To: LIVE555 Streaming Media - development & use (live-devel at ns.live555.com) Subject: [Live-devel] New performance issue in my app I added a new section and I am having a performance issue. Something with the way I use the scheduler. http://www.mediafire.com/i/?cdg5t964h8q8xlg Here is a hot-spot trace that shows where the time appears to be spent. Does this give any hints to what I have done wrong? -------------- next part -------------- An HTML attachment was scrubbed... URL: From jshanab at smartwire.com Fri Dec 30 12:44:00 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Fri, 30 Dec 2011 20:44:00 +0000 Subject: [Live-devel] A question about DeviceSource.cpp and eventTriggers Message-ID: <615FD77639372542BF647F5EBAA2DBC20B1ADAEF@IL-BOL-EXCH01.smartwire.com> I modeled my ESSource.cpp off of DeviceSource.cpp and got everything working. Because of the application's many-to-many dynamic stream model, for each source I have a unique environment, as they each run in their own thread.
After I got it working, for some reason only the first instance would pass data all the way through; the 2nd, 3rd, and 4th that connected received all their data but neglected to signal its arrival. The problem was simple, but I am wondering why DeviceSource was written that way, in case I am missing something important. The eventTriggerId was static. This caused it to be shared across all instances and therefore get updated, and only trigger one environment's stream to call the afterGettingFrame of the next guy. So my question is: why is this static? Should it be part of a custom env? Just trying to understand this. For the life of me I don't see why it didn't work on only the last stream rather than the first stream. :) I just made it a non-static class member, initialized it in the ctor, and all instances started to stream. -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Dec 30 13:13:16 2011 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 30 Dec 2011 13:13:16 -0800 Subject: [Live-devel] A question about DeviceSource.cpp and eventTriggers In-Reply-To: <615FD77639372542BF647F5EBAA2DBC20B1ADAEF@IL-BOL-EXCH01.smartwire.com> References: <615FD77639372542BF647F5EBAA2DBC20B1ADAEF@IL-BOL-EXCH01.smartwire.com> Message-ID: > So my question is why is this static? "eventTriggerId" was made a static member variable of the "DeviceSource" class, because this code was intended as an illustration of how to encapsulate a *single* device - not a set of devices. But yes, it could have been a non-static member variable instead. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed...
URL: From finlayson at live555.com Fri Dec 30 13:32:01 2011 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 30 Dec 2011 13:32:01 -0800 Subject: [Live-devel] A question about DeviceSource.cpp and eventTriggers In-Reply-To: References: <615FD77639372542BF647F5EBAA2DBC20B1ADAEF@IL-BOL-EXCH01.smartwire.com> Message-ID: <7377165F-8445-42B5-A3BD-3B0B7874BB2A@live555.com> >> So my question is why is this static? > > "eventTriggerId" was made a static member variable of the "DeviceSource" class, because this code was intended as an illustration of how to encapsulate a *single* device - not a set of devices. But note also, BTW, that even if you have a set of related devices, then a single event trigger id is enough. You can trigger the event on any one of these devices - using the single event trigger id - by passing a pointer to the specific device as the "clientData" parameter to "triggerEvent()". A single "event trigger id" is intended to represent the 'type of event'; it's the "clientData" parameter that's intended to be used to represent the specific target for an event. But yes - if you are accessing the code using multiple threads (with a separate TaskScheduler/UsageEnvironment for each), then the event trigger id in your device class can't be static. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: