From stas at tech-mer.com  Wed Sep 2 07:04:56 2009
From: stas at tech-mer.com (Stas Desyatnlkov)
Date: Wed, 2 Sep 2009 17:04:56 +0300
Subject: [Live-devel] H.264 ES streaming
In-Reply-To: <21E398286732DC49AD45BE8C7BE96C0795DDBCBDB6@fs11.mertree.mer.co.il>
References: <21E398286732DC49AD45BE8C7BE96C0795D6DD88B1@fs11.mertree.mer.co.il>
	<21E398286732DC49AD45BE8C7BE96C0795DDBCBDB6@fs11.mertree.mer.co.il>
Message-ID: <21E398286732DC49AD45BE8C7BE96C0795E0B6809A@fs11.mertree.mer.co.il>

Answering my own question: the spec is Amendment 3: Transport of AVC video data over ITU-T Rec. H.222.0 | ISO/IEC 13818-1 streams. As usual, it's full of semantic changes to the original spec, and it is not a streamlined spec ready to be implemented, but at least it's a direction.

AFAIK it says to grab the NAL units with their start codes (the so-called byte stream) and push them into TS PES packets, along with service packets such as SPS, PPS, and SEI. The problem is timestamp conversion, which looks like a project of its own.

I guess there is no derived FramedSource class that can take NAL packets and feed the streamer correctly - or is there? The question is whether VLC and MPlayer will be able to play this correctly once such a FramedSource sibling is created.

From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Stas Desyatnlkov
Sent: Monday, August 31, 2009 2:49 PM
To: LIVE555 Streaming Media - development & use
Subject: Re: [Live-devel] H.264 ES streaming

The problem is the lack of info; I still don't know how to send NAL packets in a TS correctly. Is there a spec or RFC that can help?

Can I create my own FramedFileSource that will read the NAL packets correctly and feed the data to the MPEG2TransportStreamFromESSource? Or should I strip the NAL headers and then feed it to MPEG2TransportStreamXXX?

From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson
Sent: Thursday, August 27, 2009 10:39 PM
To: LIVE555 Streaming Media - development & use
Subject: Re: [Live-devel] H.264 ES streaming

>What am I missing here?

What you're missing is that you're trying to use code that reads and streams an MPEG-2 Transport Stream file to instead read and stream an H.264 Elementary Stream video file. There's no way this can possibly work.
--
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From braitmaier at hlrs.de  Wed Sep 2 08:25:54 2009
From: braitmaier at hlrs.de (Michael Braitmaier)
Date: Wed, 02 Sep 2009 17:25:54 +0200
Subject: [Live-devel] UDP framing in RTP
Message-ID: <4A9E8E82.1060206@hlrs.de>

Hello to everyone!

I have been struggling for a while now with the question of what happens when your source receives a complete frame that is larger than the maximum that can be passed in a UDP packet. Intuitively I would say it gets split up into multiple packets and merged again on the receiver side. From my understanding of liveMedia, I thought that this framing of a video frame across packets is then done by liveMedia. I was encouraged in my assumption when I saw that OutPacketBuffer has a mechanism for handling overflow data, which made me assume that this is the way large frames are handled.

From one of my last messages here, I learned that making the buffer fTo in a source large enough to hold my largest discrete frames is the way to avoid frame truncation and dropping of partial frame data. I currently do this by calling "setPacketSizes(preferredSize, maxPacketSize)" just after the instantiation of the MultiFramedRTPSink.
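In code, that currently looks roughly like this (a simplified sketch; "MyVideoRTPSink" stands in for whatever MultiFramedRTPSink subclass I actually instantiate, and the sizes are just what I picked for my largest frames):

  // Sketch of my current approach - names and values are placeholders:
  MultiFramedRTPSink* videoSink = MyVideoRTPSink::createNew(env, rtpGroupsock, 96);
  videoSink->setPacketSizes(145000, 150000); // sized for my largest frames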
But with this buffer increase I also seem to increase the buffer size (the buffer in OutPacketBuffer) that is directly handed down to the UDP socket, which in unfortunate situations causes a "message too big" error.

So I am now confused about whether I have to take action in my derived source class, or whether this should be handled by liveMedia natively. If it is handled, then I am unsure how to pass a rather large frame (e.g. 140K) from my source to a MultiFramedRTPSink, given that I cannot increase the size of the fTo buffer in my source (which directly relates to the size of the buffer in OutPacketBuffer and the size of a UDP message) without causing the "message too big" errors on the UDP socket.

I guess I am missing something obvious, but can't seem to find it. Any help would be greatly appreciated.

Michael

From finlayson at live555.com  Wed Sep 2 15:27:23 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 2 Sep 2009 15:27:23 -0700
Subject: [Live-devel] live555 support for plain MPEG2-TS without RTP
In-Reply-To: <4f706c210908290239m59d98dbr2856d24f0d8ce8f3@mail.gmail.com>
References: <4f706c210908290239m59d98dbr2856d24f0d8ce8f3@mail.gmail.com>
Message-ID:

>Does live555 support streaming content in which the UDP payload
>contains just plain MPEG2-TS, without any RTP wrapper?

Yes.

>Looking into the testing programs that come with live555, most of them
>expect MPEG2-TS over RTP. I guess this is the reason why I'm not able to
>play these streams in mplayer [linked with the live555 library], which
>also tries to guess RTP from the stream, and eventually crashes in its
>demux module.

Try VLC instead. It might do a better job of recognizing such streams.
--
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From finlayson at live555.com  Wed Sep 2 15:51:38 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 2 Sep 2009 15:51:38 -0700
Subject: [Live-devel] multiple instances of live555 library
In-Reply-To: <274F7B50569A1B4C9D7BCAB17A9C7BE101B525E4@mailguy3.skynet.nuvation.com>
References: <4A727B5E.5080608@gmail.com>
	<77938bc20908241213l6a2dc7dfl42bdcfd929aa1de9@mail.gmail.com>
	<274F7B50569A1B4C9D7BCAB17A9C7BE101B525E4@mailguy3.skynet.nuvation.com>
Message-ID:

>The LiveMedia library uses the magical world of select(), which
>allows for monitoring multiple file handle descriptors (or in this
>case, sockets) to monitor for incoming/outgoing data.

More precisely: the "LIVE555 Streaming Media" libraries use the (not so magical) world of events, whereby events (such as the arrival of incoming data) are handled within a single-threaded event loop. The default event loop implementation ("BasicTaskScheduler", provided with the code) uses "select()", but other implementations (i.e., other subclasses of "TaskScheduler") could use other mechanisms.
--
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From SRawling at pelco.com  Wed Sep 2 17:33:26 2009
From: SRawling at pelco.com (Rawling, Stuart)
Date: Wed, 02 Sep 2009 17:33:26 -0700
Subject: [Live-devel] RTSPClientSession and doEventLoop in a dynamic server
Message-ID:

Hi,

My implementation of a dynamic RTSPServer works as follows: when a request is received, the subclassed RTSPServer instance has an overridden lookupServerMediaSession() function. If the stream is not already present on the server, it will bootstrap the stream and add it to the server before returning the correct ServerMediaSession object.
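In outline, the override looks something like this (a simplified sketch; "MyRTSPServer" and "bootstrapStream()" are placeholder names, not my exact code):

  // Sketch only - names are illustrative:
  class MyRTSPServer: public RTSPServer {
  protected:
    virtual ServerMediaSession* lookupServerMediaSession(char const* streamName) {
      ServerMediaSession* sms = RTSPServer::lookupServerMediaSession(streamName);
      if (sms == NULL) {
        sms = bootstrapStream(streamName); // potentially slow (see below)
        if (sms != NULL) addServerMediaSession(sms);
      }
      return sms;
    }
  };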
The bootstrap process can sometimes take a long time, so in order not to interrupt the operation of the other streams, I spawn a NON-LIVE555 thread to perform the actual bootstrap whilst the live555 thread runs an additional doEventLoop(watchVar). When the thread completes, it signals the watchVar, and the stream is then added to the server if the bootstrapping was successful. Then the main live555 thread continues. The consequence is that within a doEventLoop() call there is an additional doEventLoop() being run.

One of the problems I have seen is that sometimes, whilst the bootstrapping is happening, the client closes the connection. In the second doEventLoop() the server recognizes this and deletes the RTSPClientSession object. Unfortunately, it is this very object that is performing the second doEventLoop() (in the handleCmd_DESCRIBE function), and things obviously start to go awry.

One solution would be to reference-count the incoming connections in my custom RTSPClientSession. The only place to handle this, however, would be incomingRequestHandler(), and this function is not virtual in RTSPClientSession.

Another solution would be to turn off backgroundReadHandling whilst performing a DESCRIBE, then turn it on again after the DESCRIBE is complete. This does not seem like a general-purpose solution, though: what if I have another function where doEventLoop() is called?

At this point I would welcome feedback. Am I handling this case all wrong?

Regards,
Stuart

From finlayson at live555.com  Wed Sep 2 19:07:51 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 2 Sep 2009 19:07:51 -0700
Subject: [Live-devel] Once again, SDP support for Live555 & interaction with FFMpeg ...
In-Reply-To: <4A968FBB.7090800@impathnetworks.com>
References: <4A968FBB.7090800@impathnetworks.com>
Message-ID:

>We have a multicast stream, initiated from another process, which
>generates an .SDP file for the stream:
>
>v=0
>o=MangoDSP 126 14843616424497153183 IN IP4 192.168.0.62
>s=Mango DSP Audio/Video
>m=video 7170 RTP/AVP 96
>a=rtpmap:96 H264/90000
>c=IN IP4 232.0.0.11
>a=fmtp:96 packetization-mode=1;profile-level-id=42001E;sprop-parameter-sets=Z0KAHkLaAtD0QA==,aEjgGody
>a=control:__StreamID=270385256
>
>We've searched through the Live555 archives and there is mention of
>passing this to MediaSession::createNew(), but we're sort of stumped
>on making this a reality.
First, create a "MediaSession" object by calling "MediaSession::createNew()", with the SDP description (string) as parameter. Then, go through each of this object's 'subsessions' (in this case, there'll be just one, for "video"), and call "MediaSubsession::initiate()" on it. Then, you can create an appropriate 'sink' object (e.g., encapsulating your decoder), and call "startPlaying()" on it, passing the subsession's "readSource()" as parameter. See the "openRTSP" code (specifically, "testProgs/playCommon.cpp") for an example of how this is done.

>Ultimately our plan is to pass the stream onto FFMpeg for
>decoding/storage. When we try to open this stream directly with
>FFMpeg it takes forever (and often never) for the program to lock
>onto the stream, so perhaps there is a problem with ffmpeg and this
>h264 stream. But, when we open a unicast stream with openRTSP and
>pipe the output to ffmpeg it works fine.

The problem here is probably that - when you tune into an ongoing multicast stream - you're missing the special PPS and SPS NAL units that normally appear at the start of the stream. (In contrast, when you stream a unicast stream, you get the whole stream, starting from the beginning, and so will get the special PPS and SPS NAL units.)

To overcome this, you will need to decode the PPS and SPS NAL unit data from the SDP description, and insert these at the front of the stream that you pass to your decoder. Specifically, you call "MediaSubsession::fmtp_spropparametersets()" on your 'subsession' object to get the appropriate configuration string, and then decode this string by calling "parseSPropParameterSets()". (See "liveMedia/include/H264VideoRTPSource.hh".)
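Putting those steps together, the client-side setup looks roughly like this (a minimal sketch only - error handling and the event loop are omitted, "env" is your UsageEnvironment&, and "MyDecoderSink"/"afterPlaying" are placeholders for your own code):

  MediaSession* session = MediaSession::createNew(env, sdpDescription);
  MediaSubsessionIterator iter(*session);
  MediaSubsession* subsession;
  while ((subsession = iter.next()) != NULL) {
    if (!subsession->initiate()) continue; // creates the RTP/RTCP sockets

    // Recover the SPS/PPS NAL units from the SDP description:
    unsigned numSPropRecords;
    SPropRecord* sPropRecords =
      parseSPropParameterSets(subsession->fmtp_spropparametersets(),
                              numSPropRecords);
    // Feed each sPropRecords[i].sPropBytes (of length sPropRecords[i].sPropLength)
    // to your decoder, before any data from the stream itself.

    MediaSink* sink = MyDecoderSink::createNew(env, *subsession);
    sink->startPlaying(*subsession->readSource(), afterPlaying, NULL);
  }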
--
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From finlayson at live555.com  Wed Sep 2 21:13:27 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 2 Sep 2009 21:13:27 -0700
Subject: [Live-devel] openRTSP with no backchannel
In-Reply-To: <1251345337.6149.1049.camel@dhenning-ubuntu>
References: <1251345337.6149.1049.camel@dhenning-ubuntu>
Message-ID:

>I am working with a system that multicasts multiple streams over DVB-H.
>Normally, I receive an SDP file for each stream from the electronic
>service guide and I view the streams by passing the SDP file into my
>libav-based application. However, this did not work for any of the
>streams being generated by a Mango DSP system.
>
>Enter live555. By piping the output of openRTSP into my libav-based
>application, the video from the Mango box now pops up right away and I
>see no errors from libav.
>
>First, thanks so much for this library. It saved the day.
>
>Second, please explain conceptually what live555 is doing that makes it
>possible to view the video and why I don't need it for all of our other
>streams.

Since I don't know anything about "libav" or "Mango DSP", I can't answer this question. However, if you tell us the SDP description that "openRTSP" retrieves from the server (this is displayed as part of the diagnostic output), then that might give us a hint as to what's special about these streams.

>During this debug process, my video receiver has an ethernet
>back-channel to the server so openRTSP can negotiate the stream.
>However, in the final application, the receiver will be a mobile unit
>with no back channel.
>
>Assuming the ESG provides the same SDP file that openRTSP receives
>during its negotiation, should I still be able to write an application
>using live555 with no backchannel?

Yes, in principle. You won't be able to use RTSP (of course), and this will likely work only for multicast streams, but it should work.
--
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From SRawling at pelco.com  Thu Sep 3 17:27:14 2009
From: SRawling at pelco.com (Rawling, Stuart)
Date: Thu, 03 Sep 2009 17:27:14 -0700
Subject: [Live-devel] Patch : clock changing issue
In-Reply-To: <1250493214.4a89031ee17e7@imp.celeos.eu>
Message-ID:

>> The issue is: if you change your system time while you receive a stream,
>> then the library may make your cpu very busy.
>>
>> So I patched the library myself: I replaced gettimeofday by a monotonic
>> clock. This has only been tested with Linux. In case someone else is
>> looking for a solution to this bug, I attached my patch.

I just ran into this issue and will be using the patch. Is there any reason not to include this in the main source repository?

Stuart

From finlayson at live555.com  Thu Sep 3 19:42:11 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Thu, 3 Sep 2009 19:42:11 -0700
Subject: [Live-devel] Patch : clock changing issue
In-Reply-To:
References:
Message-ID:

>I just ran into this issue and will be using the patch. Is there
>any reason not to include this in the main source repository?

"clock_gettime()" does not seem to be portable. (In particular, Mac OS X doesn't seem to have it, and (I suspect) Windows doesn't either.)
--
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From SRawling at pelco.com  Thu Sep 3 20:18:31 2009
From: SRawling at pelco.com (Rawling, Stuart)
Date: Thu, 03 Sep 2009 20:18:31 -0700
Subject: [Live-devel] Patch : clock changing issue
In-Reply-To:
Message-ID:

>> "clock_gettime()" does not seem to be portable. (In particular, Mac OS X
>> doesn't seem to have it, and (I suspect) Windows doesn't either.)

The patch does not change the Windows implementation, other than renaming the function to "mgettimeofday". It adds a new mgettimeofday() in an #else branch (to cover the non-Windows implementations), and all the calls to gettimeofday() are redirected to mgettimeofday().
Perhaps the best procedure would be the following (pseudo code):

#ifdef WIN32
int mgettimeofday(struct timeval* tp) {
  // Normal Windows implementation
}
#else
int mgettimeofday(struct timeval* tp) {
#ifdef MONOTONIC_CLOCK
  // Call the clock_gettime(CLOCK_MONOTONIC, ...) variant
#else
  // Call the regular gettimeofday():
  return gettimeofday(tp, NULL);
#endif
}
#endif

This could be handled with a compile flag that is defined in the config.linux and config.linux-gdb configurations.

I see 100% CPU usage whenever the time jumps back; the CPU stays maxed out until the time catches up. The patch definitely seems to fix it.

Regards,
Stuart

From finlayson at live555.com  Fri Sep 4 01:50:16 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Fri, 4 Sep 2009 01:50:16 -0700
Subject: [Live-devel] Patch : clock changing issue
In-Reply-To:
References:
Message-ID:

Rather than changing "gettimeofday()", or requiring the library code to use a differently-named function, a better solution was to fix the "DelayQueue" implementation (in the "BasicUsageEnvironment" library) so that it checks for (and properly handles) the special case of "gettimeofday()" going backwards. I have now installed a new version (2009.09.04) of the "LIVE555 Streaming Media" software that includes this fix.
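(Conceptually, the check amounts to something like the following - this is an illustrative sketch only, not the actual "DelayQueue" code, and "fLastSyncTime" is an assumed member variable:)

  #include <sys/time.h>
  struct timeval timeNow;
  gettimeofday(&timeNow, NULL);
  if (timercmp(&timeNow, &fLastSyncTime, <)) {
    // The system clock has jumped backwards: just re-sync, treating the
    // elapsed time as zero, instead of computing a huge bogus interval.
    fLastSyncTime = timeNow;
  }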
--
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From finlayson at live555.com  Fri Sep 4 10:08:23 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Fri, 4 Sep 2009 10:08:23 -0700
Subject: [Live-devel] RTSPClientSession and doEventLoop in a dynamic server
In-Reply-To:
References:
Message-ID:

>Another solution would be to turn off backgroundReadHandling whilst
>performing a DESCRIBE, then turn it on again after the DESCRIBE is
>complete.

That seems like a reasonable solution to your problem. Because you are breaking up your server implementation of "DESCRIBE" into two separate events, it makes sense to not handle any new commands from the client until the second, 'bootstrapping' event completes.

>This does not seem like a general-purpose solution, though: what if I
>have another function where doEventLoop is called?

I don't see the point here. The solution you've proposed sounds good to me.
--
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From finlayson at live555.com  Fri Sep 4 17:25:36 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Fri, 4 Sep 2009 17:25:36 -0700
Subject: [Live-devel] UDP framing in RTP
In-Reply-To: <4A9E8E82.1060206@hlrs.de>
References: <4A9E8E82.1060206@hlrs.de>
Message-ID:

>I have been struggling for a while now with the question of what happens
>when your source receives a complete frame that is larger than the
>maximum that can be passed in a UDP packet.

That's not a problem. The RTP payload format (and our implementation of the RTP payload format) for each codec takes care of fragmenting frames across RTP packets, if necessary. You don't need to worry about this; just pass the complete frame into the appropriate "RTPSink" subclass.

>Intuitively I would say it gets split up into multiple packets and
>merged again on the receiver side.

Yes.

>From one of my last messages here, I learned that making the buffer
>fTo in a source large enough to hold my largest discrete frames is
>the way to avoid frame truncation and dropping of partial frame data.
>I currently do this by calling "setPacketSizes(preferredSize,
>maxPacketSize)" just after the instantiation of the MultiFramedRTPSink.

No, that's not the right way to do this. Calling "setPacketSizes()" changes the sizes of the outgoing RTP/UDP packets. You usually don't need to do this; the default packet sizes work for most networks. Instead, you want to set the size of the *buffer* that the RTPSink uses to receive incoming frames. To do this, add (e.g., to set a buffer size of 150000 bytes)

  OutPacketBuffer::maxSize = 150000;

*before* creating your RTPSink.

>I am unsure about how to pass a rather large frame (e.g. 140K) from
>my source to a MultiFramedRTPSink

See above.
--
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From devaureshy at gmail.com  Sat Sep 5 14:29:59 2009
From: devaureshy at gmail.com (Steve Jiekak)
Date: Sat, 5 Sep 2009 23:29:59 +0200
Subject: [Live-devel] FileSink for ADTS
Message-ID: <614230390909051429g609e4eb5hc57a2525a492e34e@mail.gmail.com>

Hello Ross,

I've written a file sink that saves AAC with ADTS headers for my own program, and I was wondering whether it could be incorporated into live555, in case other people need something like this.

Regards,
Steve Jiekak

From finlayson at live555.com  Sat Sep 5 14:40:03 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Sat, 5 Sep 2009 14:40:03 -0700
Subject: [Live-devel] FileSink for ADTS
In-Reply-To: <614230390909051429g609e4eb5hc57a2525a492e34e@mail.gmail.com>
References: <614230390909051429g609e4eb5hc57a2525a492e34e@mail.gmail.com>
Message-ID:

>I've written a file sink that saves AAC with ADTS headers for my own
>program, and I was wondering whether it could be incorporated into
>live555, in case other people need something like this.

Sure. Send the code to the mailing list, and I'll take a look at it.
--
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From devaureshy at gmail.com  Sat Sep 5 15:25:49 2009
From: devaureshy at gmail.com (Steve Jiekak)
Date: Sun, 6 Sep 2009 00:25:49 +0200
Subject: [Live-devel] PATCH: FileSink for ADTS
Message-ID: <614230390909051525m69e90ccevdd852cbe44d18568@mail.gmail.com>

These are all the files you need. Enjoy.

On Sat, Sep 5, 2009 at 11:40 PM, Ross Finlayson wrote:
>> I've written a file sink that saves AAC with ADTS headers for my own
>> program, and I was wondering whether it could be incorporated into
>> live555, in case other people need something like this.
>
> Sure.
> Send the code to the mailing list, and I'll take a look at it.
>
> --
> Ross Finlayson
> Live Networks, Inc.
> http://www.live555.com/

-------------- next part --------------
A non-text attachment was scrubbed...
Name: ADTSAudioFileSink.cpp
Type: text/x-c++src
Size: 4119 bytes
Desc: not available
-------------- next part --------------
A non-text attachment was scrubbed...
Name: ADTSAudioFileSink.hh
Type: text/x-c++hdr
Size: 2054 bytes
Desc: not available

From ansary858 at hotmail.com  Mon Sep 7 19:06:33 2009
From: ansary858 at hotmail.com (ansary mohamed)
Date: Tue, 8 Sep 2009 10:06:33 +0800
Subject: [Live-devel] (no subject)
Message-ID:

Hi experts,

I am doing H.264 streaming using the live555 libraries, and using VLC to play the RTP stream. I understand from the mailing list that this has been done before. I have read through the code and the RFC documents, but am unable to understand how to do it. There was a tutorial based on this, posted at http://www.white.ca/patrick/tutorial.tar.gz, but the tutorial is no longer available. I would really appreciate it if anyone has a new link to it, or can shed some light on how to go about doing it. Some sample code would serve me as a starting point.

Thanks

From devaureshy at gmail.com  Wed Sep 9 04:38:16 2009
From: devaureshy at gmail.com (Steve Jiekak)
Date: Wed, 9 Sep 2009 13:38:16 +0200
Subject: [Live-devel] (no subject)
In-Reply-To:
References:
Message-ID: <614230390909090438k14551f09j6209640edf6b68ae@mail.gmail.com>

I think I got the files from that tutorial, and I will try to find it. But it really disappointed me, and I found it a bit useless.

I think you should read the FAQ, which says you need a framer subclass for H.264. For some sources (it works with files from FFmpeg with default parameters), your framer just needs to recognize and remove the NALU start code (0x00 00 01) before sending the frame to the sink.
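Something along these lines (a rough sketch only; a real framer also has to handle start codes that straddle read boundaries, 4-byte 0x00000001 start codes, truncation via fNumTruncatedBytes, and so on - and findNextNALUnit() here is a hypothetical helper of yours):

  // Rough sketch of the framer's doGetNextFrame():
  void MyH264Framer::doGetNextFrame() {
    // ...read bytes from the upstream source into an internal buffer...
    unsigned char* nal; unsigned nalSize;
    if (findNextNALUnit(nal, nalSize)) { // hypothetical parsing helper
      if (nalSize > fMaxSize) nalSize = fMaxSize; // (truncation case)
      memmove(fTo, nal, nalSize); // copy WITHOUT the 00 00 01 prefix
      fFrameSize = nalSize;
      gettimeofday(&fPresentationTime, NULL);
      FramedSource::afterGetting(this); // tell the sink a frame is ready
    }
  }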
Another thing you have to do (IF YOU'RE USING AN RTSP SERVER, YOU DON'T NEED THIS) to play the stream using VLC is to be able to provide an SDP description. This part is tough, and you should consult the library you use to produce this description. When you have it, you should transmit it to the receiver (there is a formal way of doing this; I can't give details). If you're using FFmpeg I can help you a little more, but if you want to read files from different kinds of sources, I can't help you.

Regards,
Steve Jiekak

On Tue, Sep 8, 2009 at 4:06 AM, ansary mohamed wrote:
> Hi experts,
>
> I am doing H.264 streaming using the live555 libraries, and using VLC to
> play the RTP stream. I understand from the mailing list that this has been
> done before. I have read through the code and the RFC documents, but am
> unable to understand how to do it. There was a tutorial based on this,
> posted at http://www.white.ca/patrick/tutorial.tar.gz, but the tutorial is
> no longer available. I would really appreciate it if anyone has a new link
> to it, or can shed some light on how to go about doing it. Some sample
> code would serve me as a starting point.
>
> Thanks

From kidjan at gmail.com  Wed Sep 9 10:54:53 2009
From: kidjan at gmail.com (Jeremy Noring)
Date: Wed, 9 Sep 2009 10:54:53 -0700
Subject: [Live-devel] Live G711 audio source
Message-ID:

I want to expose a live G711 stream (specifically, PCMU) through an on-demand RTSP server; my application receives a pre-encoded stream (single channel, 8000 samples/sec, etc.). I reviewed a bunch of the samples that come with Live555 (and wis-streamer) and they were good references, but I want to make sure my approach to solving this problem isn't wild and crazy.

testOnDemandRTSPServer uses WAVAudioFileServerMediaSubsession, which has an option to convert to ulaw - this class is sort of like what I want (?), although it reads from a file instead of a live source (hence the "seek" and "scale" methods that are irrelevant to a live source), and my ulaw source is already encoded, so clearly there are some differences. Would a good approach be to create my own class that inherits from OnDemandServerMediaSubsession, and then override the appropriate methods? And if yes, what methods would you recommend overriding?

Any suggestions are welcome; in the meantime, I'm going to give it a try and see what happens. Thanks!

From finlayson at live555.com  Wed Sep 9 15:17:54 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 9 Sep 2009 15:17:54 -0700
Subject: [Live-devel] Live G711 audio source
In-Reply-To:
References:
Message-ID:

>testOnDemandRTSPServer uses WAVAudioFileServerMediaSubsession, which
>has an option to convert to ulaw - this class is sort of like what I
>want (?), although it reads from a file instead of a live source, and my
>ulaw source is already encoded, so clearly there are some differences.
>Would a good approach be to create my own class that inherits from
>OnDemandServerMediaSubsession, and then override the appropriate methods?

Yes, exactly. Remember, when you instantiate "OnDemandServerMediaSubsession", to set the "reuseFirstSource" parameter to True, because you're reading from a live source rather than a file.

>And if yes, what methods would you recommend overriding?

You must write your own implementation of "createNewStreamSource()" and "createNewRTPSink()". As you noted, you can use the existing implementation of these functions in "WAVAudioFileServerMediaSubsession" as a model.
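In skeleton form (a sketch only; the class name is illustrative, and the two create* functions still have to be filled in, along the lines of what "WAVAudioFileServerMediaSubsession" does):

  class MyG711MediaSubsession: public OnDemandServerMediaSubsession {
  public:
    static MyG711MediaSubsession* createNew(UsageEnvironment& env) {
      // "reuseFirstSource" is True, because the input is live, not a file:
      return new MyG711MediaSubsession(env, True);
    }
  protected:
    MyG711MediaSubsession(UsageEnvironment& env, Boolean reuseFirstSource)
      : OnDemandServerMediaSubsession(env, reuseFirstSource) {}
    virtual FramedSource* createNewStreamSource(unsigned clientSessionId,
                                                unsigned& estBitrate);
    virtual RTPSink* createNewRTPSink(Groupsock* rtpGroupsock,
                                      unsigned char rtpPayloadTypeIfDynamic,
                                      FramedSource* inputSource);
  };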
--
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From kidjan at gmail.com  Wed Sep 9 16:06:42 2009
From: kidjan at gmail.com (Jeremy Noring)
Date: Wed, 9 Sep 2009 16:06:42 -0700
Subject: [Live-devel] Live G711 audio source
In-Reply-To:
References:
Message-ID:

On Wed, Sep 9, 2009 at 3:17 PM, Ross Finlayson wrote:
> Yes, exactly. Remember, when you instantiate
> "OnDemandServerMediaSubsession", to set the "reuseFirstSource" parameter
> to True, because you're reading from a live source rather than a file.
>
> You must write your own implementation of "createNewStreamSource()" and
> "createNewRTPSink()". As you noted, you can use the existing
> implementation of these functions in "WAVAudioFileServerMediaSubsession"
> as a model.

Thanks, that helps. I have stuff sort of working, but I think I have a problem with my framer. I'm starting to think I need to have my framer inherit from AudioInputDevice and _not_ FramedSource; I believe this because VLC is incorrectly reporting the # of bits per sample as 16, and that's the only inheritance hierarchy I can see that clearly communicates such a value. Is that correct?

From finlayson at live555.com  Wed Sep 9 17:50:23 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 9 Sep 2009 17:50:23 -0700
Subject: [Live-devel] Live G711 audio source
In-Reply-To:
References:
Message-ID:

>I believe this because VLC is incorrectly reporting the # of bits
>per sample as 16, and that's the only inheritance hierarchy I can
>see that clearly communicates such a value. Is that correct?

What RTP payload format are you delivering to the client? u-law ("PCMU")? "L8"? Or "L16"? Only the last is 16-bits-per-sample. Once again, look closely at the "WAVAudioFileServerMediaSubsession" code for a model.
--
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From Alastair.Bain at Cubic.com  Wed Sep 9 20:44:58 2009
From: Alastair.Bain at Cubic.com (Alastair Bain)
Date: Thu, 10 Sep 2009 15:44:58 +1200
Subject: [Live-devel] Streaming from a device
Message-ID:

I'm trying to use mediaServer to stream from an encoder card, but am not sure how to handle synchronization between the device source and the live555 event loop. The device I'm using presents transport stream packets in its own thread. I've created a subclass of DeviceSource, but I don't believe I can just call the deliverFrame() function from a non-event-loop thread?

Currently I'm just placing the frames into a temporary buffer, and then using the live555 event loop to pull frames from this buffer. The problem with this approach is that I can't seem to get the timing right, so I end up having to use a large buffer, which adds unacceptable delay.

Any ideas for how to solve this issue?
Regards,

Alastair Bain

From finlayson at live555.com  Thu Sep 10 01:37:54 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Thu, 10 Sep 2009 01:37:54 -0700
Subject: [Live-devel] Streaming from a device
In-Reply-To:
References:
Message-ID:

>I'm trying to use mediaServer to stream from an encoder card, but am not
>sure how to handle synchronization between the device source and the
>live555 event loop.

First, is the encoder card accessible (readable) as a socket? Those are the easiest kinds of input devices to handle.
--
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From spopo316 at yahoo.com.tw  Thu Sep 10 01:47:48 2009
From: spopo316 at yahoo.com.tw (spopo316 at yahoo.com.tw)
Date: Thu, 10 Sep 2009 16:47:48 +0800 (CST)
Subject: [Live-devel] H264 multicast streaming question
Message-ID: <747910.13340.qm@web72302.mail.tp2.yahoo.com>

Hi,

I am working on streaming an H.264 video file to a remote PC. The H.264 file can be streamed by the unicast method and displayed in the VLC player on the remote PC. But when I adopted the multicast method, VLC could not show any image when it opened the RTSP stream. I have used Wireshark to compare the RTP packets streamed by the unicast and multicast methods, separately, and I found they are equivalent. Then I checked the SDP when VLC accessed the RTSP server; it seems to be right. Could someone help me?

I have read the discussion "[Live-devel] H264 video frame problem", which mentioned: [The "sprop-parameter-sets" string in your SDP description is wrong. You must pass a proper "sprop_parameter_sets_str" parameter to "H264VideoRTPSink::createNew()". This "sprop_parameter_sets_str" should be the Base64-encoded strings for the PPS and SPS NAL units, separated by a comma (',') character.]

But when I streamed the H.264 file by the unicast method successfully, sprop-parameter-sets had been set to "h264". Therefore I think that sprop-parameter-sets=h264 doesn't influence the stream when using the multicast method. Is that right?
Thanks.

Best Regards

--------------------------------------------------------------------------------
multicast:

parseRTSPRequestString() returned cmdName "OPTIONS", urlPreSuffix "", urlSuffix "h264"
sending response:
RTSP/1.0 200 OK
CSeq: 20
Date: Thu, Jan 01 1970 15:05:41 GMT
Public: OPTIONS, DESCRIBE, SETUP, TEARDOWN, PLAY, PAUSE

RTSPClientSession[0x1dda00]::incomingRequestHandler1() read 155 bytes:
DESCRIBE rtsp://192.168.0.196:8554/h264 RTSP/1.0
CSeq: 21
Accept: application/sdp
User-Agent: VLC media player (LIVE555 Streaming Media v2008.07.24)

parseRTSPRequestString() returned cmdName "DESCRIBE", urlPreSuffix "", urlSuffix "h264"
sending response:
RTSP/1.0 200 OK
CSeq: 21
Date: Thu, Jan 01 1970 15:05:41 GMT
Content-Base: rtsp://192.168.0.196:8554/h264/
Content-Type: application/sdp
Content-Length: 513

v=0
o=- 54340067602 1 IN IP4 192.168.0.196
s=Session streamed by "test h.264"
i=h264
t=0 0
a=tool:LIVE555 Streaming Media v2008.04.02
a=type:broadcast
a=control:*
a=source-filter: incl IN IP4 * 192.168.0.196
a=rtcp-unicast: reflection
a=range:npt=0-
a=x-qt-text-nam:Session streamed by "test h.264"
a=x-qt-text-inf:h264
m=video 18888 RTP/AVP 96
c=IN IP4 232.245.108.238/255
a=rtpmap:96 H264/90000
a=fmtp:96 packetization-mode=1;profile-level-id=000042;sprop-parameter-sets=h264
a=control:track1

RTSPClientSession[0x1dda00]::incomingRequestHandler1() read 188 bytes:
SETUP rtsp://192.168.0.196:8554/h264/track1 RTSP/1.0
CSeq: 22
Transport: RTP/AVP;multicast;client_port=18888-18889
User-Agent: VLC media player (LIVE555 Streaming Media v2008.07.24)

parseRTSPRequestString() returned cmdName "SETUP", urlPreSuffix "h264", urlSuffix "track1"
fIsMulticast RTP_UDP
sending response:
RTSP/1.0 200 OK
CSeq: 22
Date: Thu, Jan 01 1970 15:05:41 GMT
Transport: RTP/AVP;multicast;destination=232.245.108.238;source=192.168.0.196;port=18888-18889;ttl=255
Session: 1

RTSPClientSession[0x1dda00]::incomingRequestHandler1() read 158 bytes:
PLAY rtsp://192.168.0.196:8554/h264/ RTSP/1.0
CSeq: 23
Session: 1
Range: npt=0.000-
User-Agent: VLC media player (LIVE555 Streaming Media v2008.07.24)

parseRTSPRequestString() returned cmdName "PLAY", urlPreSuffix "h264", urlSuffix ""
sending response:
RTSP/1.0 200 OK
CSeq: 23
Date: Thu, Jan 01 1970 15:05:41 GMT
Range: npt=0.000-
Session: 1
RTP-Info: url=rtsp://192.168.0.196:8554/h264/track1;seq=14331;rtptime=236795800

--------------------------------------------------------------------------------
unicast:

parseRTSPRequestString() returned cmdName "OPTIONS", urlPreSuffix "", urlSuffix "h264"
sending response:
RTSP/1.0 200 OK
CSeq: 25
Date: Thu, Jan 01 1970 17:37:10 GMT
Public: OPTIONS, DESCRIBE, SETUP, TEARDOWN, PLAY, PAUSE

RTSPClientSession[0x19f9b0]::incomingRequestHandler1() read 155 bytes:
DESCRIBE rtsp://192.168.0.196:8557/h264 RTSP/1.0
CSeq: 26
Accept: application/sdp
User-Agent: VLC media player (LIVE555 Streaming Media v2008.07.24)

parseRTSPRequestString() returned cmdName "DESCRIBE", urlPreSuffix "", urlSuffix "h264"
sending response:
RTSP/1.0 200 OK
CSeq: 26
Date: Thu, Jan 01 1970 17:37:10 GMT
Content-Base: rtsp://192.168.0.196:8557/h264/
Content-Type: application/sdp
Content-Length: 466

v=0
o=- 63404909982 1 IN IP4 192.168.0.196
s=RTSP/RTP stream from IPNC
i=h264
t=0 0
a=tool:LIVE555 Streaming Media v2008.04.02
a=type:broadcast
a=control:*
a=range:npt=0-
a=x-qt-text-nam:RTSP/RTP stream from IPNC
a=x-qt-text-inf:h264
m=video 0 RTP/AVP 96
c=IN IP4 0.0.0.0
a=rtpmap:96 H264/90000
a=fmtp:96 packetization-mode=1;profile-level-id=000042;sprop-parameter-sets=h264
a=control:track1
m=audio 0 RTP/AVP 0
c=IN IP4 0.0.0.0
a=control:track2

RTSPClientSession[0x19f9b0]::incomingRequestHandler1() read 184 bytes:
SETUP rtsp://192.168.0.196:8557/h264/track1 RTSP/1.0
CSeq: 27
Transport: RTP/AVP;unicast;client_port=1644-1645
User-Agent: VLC media player (LIVE555 Streaming Media v2008.07.24)

parseRTSPRequestString() returned cmdName "SETUP", urlPreSuffix "h264", urlSuffix "track1"
sending response:
RTSP/1.0 200 OK
CSeq: 27
Date: Thu, Jan 01 1970 17:37:10 GMT
Transport: RTP/AVP;unicast;destination=192.168.0.198;source=192.168.0.196;client_port=1644-1645;server_port=6970-6971
Session: 1

RTSPClientSession[0x19f9b0]::incomingRequestHandler1() read 196 bytes:
SETUP rtsp://192.168.0.196:8557/h264/track2 RTSP/1.0
CSeq: 28
Transport: RTP/AVP;unicast;client_port=1646-1647
Session: 1
User-Agent: VLC media player (LIVE555 Streaming Media v2008.07.24)

parseRTSPRequestString() returned cmdName "SETUP", urlPreSuffix "h264", urlSuffix "track2"
sending response:
RTSP/1.0 200 OK
CSeq: 28
Date: Thu, Jan 01 1970 17:37:10 GMT
Transport: RTP/AVP;unicast;destination=192.168.0.198;source=192.168.0.196;client_port=1646-1647;server_port=6972-6973
Session: 1

RTSPClientSession[0x19f9b0]::incomingRequestHandler1() read 158 bytes:
PLAY rtsp://192.168.0.196:8557/h264/ RTSP/1.0
CSeq: 29
Session: 1
Range: npt=0.000-
User-Agent: VLC media player (LIVE555 Streaming Media v2008.07.24)

parseRTSPRequestString() returned cmdName "PLAY", urlPreSuffix "h264", urlSuffix ""
startPlaying
MediaSink::startPlaying
continuePlaying()
buildAndSendPacket:
startPlaying
MediaSink::startPlaying
continuePlaying()
buildAndSendPacket:
sending response:
RTSP/1.0 200 OK
CSeq: 29
Date: Thu, Jan 01 1970 17:37:10 GMT
Range: npt=0.000-
Session: 1
RTP-Info: url=rtsp://192.168.0.196:8557/h264/track1;seq=18214;rtptime=4159189830,url=rtsp://192.168.0.196:8557/h264/track2;seq=8502;rtptime=89305239
--------------------------------------------------------------------------------

From kidjan at gmail.com  Thu Sep 10 09:41:18 2009
From: kidjan at gmail.com (Jeremy Noring)
Date: Thu, 10 Sep 2009 09:41:18 -0700
Subject: [Live-devel] Live G711 audio source
In-Reply-To:
References:
Message-ID:

On Wed, Sep 9, 2009 at 5:50 PM, Ross Finlayson wrote:
>> I believe this because VLC is incorrectly reporting the # of bits per
>> sample as 16, and that's the only inheritance hierarchy I can see that
>> clearly communicates such a value. Is that correct?
>
> What RTP payload format are you delivering to the client? u-law ("PCMU")?
> "L8"? Or "L16"? Only the last is 16-bits-per-sample. Once again, look
> closely at the "WAVAudioFileServerMediaSubsession" code for a model.
Here's how I implemented the subsession methods:

RTPSink* LiveG711MediaSubsession::createNewRTPSink(Groupsock* rtpGroupsock,
    unsigned char rtpPayloadTypeIfDynamic, FramedSource* inputSource) {
  char const* mimeType = "PCMU";
  unsigned char payloadFormatCode = 0; // static RTP payload type 0 == PCMU
  int sampleFrequency = 8000;
  unsigned numChannels = 1;

  return SimpleRTPSink::createNew(envir(), rtpGroupsock,
                                  payloadFormatCode, sampleFrequency,
                                  "audio", mimeType, numChannels);
}

FramedSource* LiveG711MediaSubsession::createNewStreamSource(
    unsigned clientSessionId, unsigned& estBitrate) {
  // For G711, our bitrate is always going to be 64 kbps:
  estBitrate = 64;
  return LiveG711AudioStreamFramer::createNew(envir(), NULL, m_mediaSource);
}

The "LiveG711AudioStreamFramer" is also something I wrote, and "m_mediaSource" is the thing that supplies my audio. For the values I supply to SimpleRTPSink, I basically copied what WAVAudioFileServerMediaSubsession was delivering for PCMU audio (8000 Hz, 1 channel, etc.).

Still not sure what problem I'm having - I see VLC is receiving the audio stream, but it's dropping all of the data, claiming the "PTS is out of range". Going to check and see what the SDP exchange looks like with openRTSP.

Thanks again for all your help.

From kidjan at gmail.com  Thu Sep 10 10:50:19 2009
From: kidjan at gmail.com (Jeremy Noring)
Date: Thu, 10 Sep 2009 10:50:19 -0700
Subject: [Live-devel] RTP over TCP, pushing
Message-ID:

Is it possible for Live555 to push a unicast stream to a server online, and have all the traffic go over TCP? I know Live555 can interleave AV data over the RTSP TCP connection, but I've only seen that for the on-demand configurations; is it possible to "push" video to a server over that TCP connection? The reason I ask is that we want to relay video through a proxy we control, to circumvent NAT.

From kidjan at gmail.com  Thu Sep 10 16:14:34 2009
From: kidjan at gmail.com (Jeremy Noring)
Date: Thu, 10 Sep 2009 16:14:34 -0700
Subject: [Live-devel] H264 multicast streaming question
In-Reply-To: <747910.13340.qm@web72302.mail.tp2.yahoo.com>
References: <747910.13340.qm@web72302.mail.tp2.yahoo.com>
Message-ID:

2009/9/10
> But when I streamed the H.264 file by the unicast method successfully,
> sprop-parameter-sets had been set to "h264". Therefore I think that
> sprop-parameter-sets=h264 doesn't influence the stream when using the
> multicast method. Is that right?

No, that's not right. For the decoder to understand your H.264 stream, it is crucial that the SPS/PPS info is communicated over a reliable protocol. If you are not sending it in the sprop-parameter-sets argument, how are you conveying it? Note that sending it in the actual stream is not recommended (see section 8.4 of RFC 3984). Some people mistakenly think their Live555/H264 implementation "works", but really they're just conveying SPS/PPS info through the lossy RTP channel, which is strongly discouraged by the RFC:

   The picture and sequence parameter set NALUs SHOULD NOT be transmitted
   in the RTP payload unless reliable transport is provided for RTP, as
   a loss of a parameter set of either type will likely prevent decoding
   of a considerable portion of the corresponding RTP stream. Thus, the
   transmission of parameter sets using a reliable session control
   protocol (i.e., usage of principle A or B above) is RECOMMENDED.

Bottom line: correctly populate sprop-parameter-sets; you are wasting your time by not doing this.
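For reference, building that string from the encoder's raw SPS/PPS NAL units (payloads only, without the 00 00 01 start codes) looks roughly like this - a sketch; I'm using the base64Encode() helper from live555's "Base64.hh", and "sps"/"pps" stand for whatever buffers your encoder hands you:

  // Sketch: build "sprop-parameter-sets" for the SDP description:
  char* spsB64 = base64Encode((char const*)sps, spsSize);
  char* ppsB64 = base64Encode((char const*)pps, ppsSize);
  char sprop[512];
  snprintf(sprop, sizeof sprop, "%s,%s", spsB64, ppsB64);
  // e.g. "Z0KAHkLaAtD0QA==,aEjgGody", as in the SDP earlier in this thread
  delete[] spsB64; delete[] ppsB64;
  // Then pass "sprop" to "H264VideoRTPSink::createNew()", which emits it
  // in the SDP description for you.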
From Alastair.Bain at Cubic.com  Thu Sep 10 16:58:14 2009
From: Alastair.Bain at Cubic.com (Alastair Bain)
Date: Fri, 11 Sep 2009 11:58:14 +1200
Subject: [Live-devel] Streaming from a device
In-Reply-To:
References:
Message-ID:

>>I'm trying to use mediaServer to stream from an encoder card, but am not
>>sure how to handle synchronization between the device source and the
>>live555 event loop.
>
>First, is the encoder card accessible (readable) as a socket? Those
>are the easiest kinds of input devices to handle.

The card does not have a sockets interface. I had thought about simply writing the frames to a socket and reading from that socket in the DeviceSource; I had hoped there was a nicer solution, but I guess it's not that bad. If anyone has any examples of this, it would be most appreciated.

Regards,

Alastair Bain

From finlayson at live555.com  Thu Sep 10 21:48:28 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Thu, 10 Sep 2009 21:48:28 -0700
Subject: [Live-devel] Live G711 audio source
In-Reply-To:
References:
Message-ID:

>Here's how I implemented the subsession methods:
>
> RTPSink* LiveG711MediaSubsession::createNewRTPSink(Groupsock* rtpGroupsock,
>     unsigned char rtpPayloadTypeIfDynamic, FramedSource* inputSource) {
>   char const* mimeType = "PCMU";
>   unsigned char payloadFormatCode = 0;
>   int sampleFrequency = 8000;
>   unsigned numChannels = 1;
>
>   return SimpleRTPSink::createNew(envir(), rtpGroupsock,
>                                   payloadFormatCode, sampleFrequency,
>                                   "audio", mimeType, numChannels);
> }

If your input data is *already* u-law audio (i.e., 8-bits-per-sample), then this should work. (If, instead, it's 16-bit-per-sample PCM audio, then you need to insert a filter to convert it to u-law.)

>The "LiveG711AudioStreamFramer" is also something I wrote, and
>"m_mediaSource" is the thing that supplies my audio. For the values
>I supply to SimpleRTPSink, I basically copied what
>WAVAudioFileServerMediaSubsession was delivering for PCMU audio
>(8000 Hz, 1 channel, etc.).
>
>Still not sure what problem I'm having - I see VLC is receiving the
>audio stream, but it's dropping all of the data, claiming the "PTS is
>out of range".

Ahh! The problem here is probably that you're not setting "fPresentationTime" properly in your 'framer' class.
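(For a live source, that normally just means something like the following in the framer's doGetNextFrame(), right after a frame has been copied into "fTo" and "fFrameSize" has been set - a sketch:)

  gettimeofday(&fPresentationTime, NULL); // wall-clock time, for a live source
  FramedSource::afterGetting(this);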
--
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From ansary858 at hotmail.com  Thu Sep 10 22:29:03 2009
From: ansary858 at hotmail.com (ansary mohamed)
Date: Fri, 11 Sep 2009 13:29:03 +0800
Subject: Re: [Live-devel] (no subject)
In-Reply-To: <614230390909090438k14551f09j6209640edf6b68ae@mail.gmail.com>
References: <614230390909090438k14551f09j6209640edf6b68ae@mail.gmail.com>
Message-ID:

Hi guys,

Thanks for all your advice on how to implement H.264 streaming; I have done so by using unicast. I have another question for you: when live555 is streaming an H.264 file, does it read the whole file at once, or just several bytes at a time? If it does not read the whole file initially, how can I implement it so that it does?

Thanks in advance.

Date: Wed, 9 Sep 2009 13:38:16 +0200
From: devaureshy at gmail.com
To: live-devel at ns.live555.com
Subject: Re: [Live-devel] (no subject)

> I think I got the files from that tutorial, and I will try to find it.
> But it really disappointed me, and I found it a bit useless.
>
> I think you should read the FAQ, which says you need a framer subclass
> for H.264. For some sources (it works with files from FFmpeg with default
> parameters), your framer just needs to recognize and remove the NALU
> start code (0x00 00 01) before sending the frame to the sink.
>
> Another thing you have to do (IF YOU'RE USING AN RTSP SERVER, YOU DON'T
> NEED THIS) to play the stream using VLC is to be able to provide an SDP
> description. This part is tough, and you should consult the library you
> use to produce this description. When you have it, you should transmit
> it to the receiver (there is a formal way of doing this; I can't give
> details). If you're using FFmpeg I can help you a little more, but if
> you want to read files from different kinds of sources, I can't help you.
>
> Regards,
> Steve Jiekak

From finlayson at live555.com  Thu Sep 10 22:39:31 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Thu, 10 Sep 2009 22:39:31 -0700
Subject: [Live-devel] Streaming from a device
In-Reply-To:
References:
Message-ID:

>The card does not have a sockets interface.

OK, in this case I suggest using the "watchVariable" parameter to "doEventLoop()". Have a separate thread that reads data from your interface, and then - when sufficient data is available for a frame - sets the "watchVariable" to some non-zero value. (This should be this second thread's *only* interaction with the "LIVE555 Streaming Media" library.)

In your code, call "doEventLoop()" in a loop - e.g.,

  char watchVariable;
  while (1) {
    watchVariable = 0;
    scheduler->doEventLoop(&watchVariable);
    // If we get to this point, then we know that your separate thread has
    // set "watchVariable" to a non-zero value.
    // Call the appropriate function to handle the arrival of new data
    // (e.g., "deliverFrame()")
  }
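(For completeness, the producer side of this arrangement might look something like the following sketch - "Frame", "frameQueue", and "readFrameFromDevice()" are your own code, not part of the library:)

  void deviceReaderThread(char* watchVariable) {
    while (1) {
      Frame f = readFrameFromDevice(); // blocking read from the card
      frameQueue.put(f);               // your own thread-safe queue
      *watchVariable = ~0;             // wake up the LIVE555 event loop
    }
  }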
--
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From garesuru at gmail.com  Fri Sep 11 01:48:42 2009
From: garesuru at gmail.com (gerardo ares)
Date: Fri, 11 Sep 2009 05:48:42 -0300
Subject: [Live-devel] Audio problem with openRTSP command
Message-ID: <38f061a30909110148t59eca802rf9de0c218107ab7b@mail.gmail.com>

Hi,

I'm trying to get video+audio from an Axis M1031 device (an RTSP stream) with the openRTSP command, using the options -4 (to create an MP4 file) and -y (to get video and audio in sync).

The video is fine, but I have a problem with the audio: when I play the MP4 file (generated by the openRTSP command) in VLC, the video is fine but I do not hear any audio. Looking at the metadata of the MP4 file, it seems that openRTSP is writing corrupt information in the esds atom:

$ MP4Box -info test4.mp4
[iso file] Box "stbl" has 1752 extra bytes
[iso file] Box "moov" has 4228 extra bytes
Error opening file test4.mp4: IsoMedia File is truncated

and the mp4dump command shows an error in the esds atom:

$ mp4dump test4.mp4
mp4dump version 1.6
ReadAtom: invalid atom size, extends outside parent atom - skipping to end of "mp4a" "^D<80><80><80>" 705752561 vs 1109523
Dumping test4.mp4 meta-information...
type mdat
...
type mp4a
  reserved1 = <6 bytes> 00 00 00 00 00 00
  dataReferenceIndex = 1 (0x0001)
  soundVersion = 1 (0x0001)
  reserved2 = <6 bytes> 00 00 00 00 00 00
  channels = 1 (0x0001)
  sampleSize = 16 (0x0010)
  packetSize = 65534 (0xfffe)
  timeScale = 8000 (0x00001f40)
  reserved3 = <2 bytes> 00 00
  samplesPerPacket = 50 (0x00000032)
  bytesPerPacket = 1702061171 (0x65736473)
  bytesPerFrame = 0 (0x00000000)
  bytesPerSample = 58753152 (0x03808080)
  type ^D<80><80><80>
   data = <26 bytes> 1c 40 15 00 18 00 00 00 6d 60 00 00 6d 60 05 80 80 80 02 15 88 06 80 80 80 01
type stts
...

It should be "type esds", not "type ^D<80><80><80>". I looked at the definition of the esds atom in the QuickTimeFileSink.cpp file, and I think I need some help understanding what could be wrong, or what I need to check.

I configured the Axis device with the following options:

Encoding: AAC
Sample rate: 8 kHz
Bit rate: 16 kbit/s

Thanks,
Gerardo.

From matt at schuckmannacres.com  Fri Sep 11 08:36:41 2009
From: matt at schuckmannacres.com (Matt Schuckmann)
Date: Fri, 11 Sep 2009 08:36:41 -0700
Subject: [Live-devel] Streaming from a device
In-Reply-To:
References:
Message-ID: <4AAA6E89.4080205@schuckmannacres.com>

Doesn't the watchVariable solution generally require you to set up some sort of scheduled periodic event (via scheduleDelayedTask()) so that the event loop is guaranteed to wake up and check it every so often?

For consuming data from a separate thread, I've always preferred skipping the watch variable and just scheduling a periodic task (every 10 msec or so) that checks a thread-safe queue for data from the producer thread.
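Something like this, for example (a sketch; "MyDeviceSource" and its queue accessors are illustrative names for your own code):

  // Periodic polling task, re-armed every ~10 ms:
  void checkForFrames(void* clientData) {
    MyDeviceSource* source = (MyDeviceSource*)clientData;
    while (source->haveQueuedFrames()) {
      source->deliverNextFrame(); // hand a queued frame on to the sink
    }
    source->envir().taskScheduler().scheduleDelayedTask(
        10000 /*microseconds*/, (TaskFunc*)checkForFrames, clientData);
  }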
Matt S.

Ross Finlayson wrote:
>> The card does not have a sockets interface.
>
> OK, in this case I suggest using the "watchVariable" parameter to
> "doEventLoop()". Have a separate thread that reads data from your
> interface, and then - when sufficient data is available for a frame -
> sets the "watchVariable" to some non-zero value. (This should be this
> second thread's *only* interaction with the "LIVE555 Streaming Media"
> library.)
>
> In your code, call "doEventLoop()" in a loop - e.g.,
>
>   char watchVariable;
>   while (1) {
>     watchVariable = 0;
>     scheduler->doEventLoop(&watchVariable);
>     // If we get to this point, then we know that your separate thread
>     // has set "watchVariable" to a non-zero value.
>     // Call the appropriate function to handle the arrival of new data
>     // (e.g., "deliverFrame()")
>   }

From framefritti at gmail.com  Fri Sep 11 14:14:08 2009
From: framefritti at gmail.com (Finta Tartaruga)
Date: Fri, 11 Sep 2009 23:14:08 +0200
Subject: [Live-devel] Questions about Groupsock
Message-ID: <80617a0a0909111414p2c9bf0beg2047ad4bc8e6aa63@mail.gmail.com>

Dear all,

For my final project I need to do a few tests with an experimental transport protocol. We decided to use the live555 library, suitably modified to include our experimental protocol. I studied the source code for a while, especially the Groupsock part. I think I understood it quite well, but I still feel a little unsure, and I was wondering if someone could confirm/refine/deny my "theories". This is what I understood:

* Groupsock is a "socket-like" object for multicasting. Its constructors receive the pair (multicast IP address, port) associated with the multicast group; the constructor for the SSM case also expects the IP address of the source.

* The constructors create a socket which is bind()-ed to the multicast port and joined to the multicast group. I would say that you can also give a non-multicast address, with the only consequence being that the join will fail.

* You can send packets to the address passed to the constructor by using the method output().

* So far, so good... Now comes the part that is a little more unclear: the destination list. I understand that it allows you to have a "multi-unicast" (that is, you can send data to many hosts over several unicast connections) and that the first element of the list is the address given to the constructor. However, there is something that leaves me "uneasy", since I cannot understand the motivations behind some of the code. For example:

+ The method changeDestinationParameters() acts only on the first destination of the list (which typically coincides with the address given to the constructor, unless you remove it via removeDestination()) and leaves the others unchanged. Am I perhaps supposed to use changeDestinationParameters() only with a single-destination Groupsock?

+ The method groupAddress() returns the IP address stored in fIncomingGroupEId, which coincides with the address given to the constructor. I would guess that it has the meaning of the IP address of the multicast group we belong to, since it is used, for example, to create the SDP description. However, if I change group with changeDestinationParameters(), I go to another multicast group, while the value of fIncomingGroupEId remains the old one. If a PassiveServerMediaSubsession now calls (inside sdpLines) the method groupAddress() [to get the value to be used in the "c=" field of the SDP description], it will get the old group address and not the new one. Is this an unlikely-to-be-triggered bug, or is this correct? (And why?)

Thank you in advance for your help.

From kidjan at gmail.com  Fri Sep 11 11:27:59 2009
From: kidjan at gmail.com (Jeremy Noring)
Date: Fri, 11 Sep 2009 11:27:59 -0700
Subject: [Live-devel] Live G711 audio source
In-Reply-To:
References:
Message-ID:

Sorry for the constant posts - I solved it. Again, a timestamping bug.
I changed my code to use the gettimeofday() defined in GroupsockHelper.hh, and everything works, so...pretty clear the problem is in our timestamping class (portable timestamping code == royal PITA). Thanks for all your help, Ross! -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sat Sep 12 00:07:58 2009 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 12 Sep 2009 00:07:58 -0700 Subject: [Live-devel] Questions about Groupsock In-Reply-To: <80617a0a0909111414p2c9bf0beg2047ad4bc8e6aa63@mail.gmail.com> References: <80617a0a0909111414p2c9bf0beg2047ad4bc8e6aa63@mail.gmail.com> Message-ID: >for my final project I need Does your school not have its own domain name? Serious professionals do not use "@gmail.com" email addresses. >* So far, so good... Now comes the part that is a little less clear: >the Destination list. I understand that it allows you to have a >"multi-unicast" (that is, you can send data to many hosts over several >unicast connections) and that the first element of the list is the >address given to the constructor. Yes. Note, though, that the ability of a "Groupsock" object to have 'multiple destinations' is basically a hack that was added just to support unicast RTSP sessions that take a single input source. It's a feature that you're unlikely to need or care about. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From framefritti at gmail.com Sat Sep 12 01:13:03 2009 From: framefritti at gmail.com (Finta Tartaruga) Date: Sat, 12 Sep 2009 10:13:03 +0200 Subject: [Live-devel] Questions about Groupsock In-Reply-To: References: <80617a0a0909111414p2c9bf0beg2047ad4bc8e6aa63@mail.gmail.com> Message-ID: <80617a0a0909120113y67f75eacjf34c131617009f92@mail.gmail.com> 2009/9/12 Ross Finlayson > for my final project I need >> > > Does your school not have its own domain name? Serious professionals do > not use "@gmail.com" email addresses. > Yes, they do, but I joined the mailing list using my home computer and I spontaneously used my private address. Hmmm... maybe you are right and I should re-join with my school address (or change the registered e-mail, if possible, I'll check). > * So far, so good... Now comes the part that is a little less clear: >> the Destination list. I understand that it allows you to have a >> "multi-unicast" (that is, you can send data to many hosts over several >> unicast connections) and that the first element of the list is the >> address given to the constructor. >> > > Yes. Note, though, that the ability of a "Groupsock" object to have > 'multiple destinations' is basically a hack that was added just to support > unicast RTSP sessions that take a single input source. It's a feature > that you're unlikely to need or care about. > -- OK, the "inconsistencies" that I found made me guess so, but I wanted to be sure. Thank you for your answer. > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > -------------- next part -------------- An HTML attachment was scrubbed...
URL: From stas at tech-mer.com Sat Sep 12 23:17:14 2009 From: stas at tech-mer.com (Stas Desyatnlkov) Date: Sun, 13 Sep 2009 09:17:14 +0300 Subject: [Live-devel] NAL file reader Message-ID: <21E398286732DC49AD45BE8C7BE96C079668EA47A1@fs11.mertree.mer.co.il> Hi all, I have a file reader for H.264 raw NAL samples saved as received from the encoder. I'd like to stream it (with the correct timestamps) inside an MPEG-2 TS over RTP. What is the best way to do it and how do I set the correct timestamps of the TS packets? -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sat Sep 12 23:47:24 2009 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 12 Sep 2009 23:47:24 -0700 Subject: [Live-devel] NAL file reader In-Reply-To: <21E398286732DC49AD45BE8C7BE96C079668EA47A1@fs11.mertree.mer.co.il> References: <21E398286732DC49AD45BE8C7BE96C079668EA47A1@fs11.mertree.mer.co.il> Message-ID: >I have a file reader for H.264 raw NAL samples saved as received from >the encoder. I'd like to stream it (with the correct timestamps) >inside an MPEG-2 TS over RTP. >What is the best way to do it I don't think our MPEG Transport Stream multiplexor currently supports H.264 video data. In any case, if you already have H.264 Video Elementary Stream data, it seems silly to package it up into a MPEG Transport Stream and stream it that way. Instead, just send the H.264 video data directly (using "H264VideoRTPSink"). -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed...
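To illustrate the 'send the NAL units directly' route suggested for Stas's file of raw NAL samples: the units in such a file are normally delimited by Annex-B start codes, so the first step of any reader is a start-code scanner. A minimal sketch (not library code; "H264VideoRTPSink" and RFC 3984 packetization operate on the NAL unit bytes between the start codes):

// Find the next Annex-B start code (00 00 01 or 00 00 00 01) in buf[0..len).
// Returns its offset and sets *codeLen to 3 or 4, or returns -1 if none found.
static int findStartCode(unsigned char const* buf, int len, int* codeLen) {
  for (int i = 0; i + 2 < len; ++i) {
    if (buf[i] == 0 && buf[i+1] == 0) {
      if (buf[i+2] == 1) { *codeLen = 3; return i; }
      if (buf[i+2] == 0 && i + 3 < len && buf[i+3] == 1) { *codeLen = 4; return i; }
    }
  }
  return -1; // no start code in this buffer
}

Each NAL unit then runs from the end of one start code to the start of the next; the unit's type is its first byte & 0x1F, so SPS (type 7) and PPS (type 8) - the parameter sets a receiver needs up front - are easy to pick out while scanning.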
URL: From woods.biz at gmail.com Sun Sep 13 23:54:00 2009 From: woods.biz at gmail.com (Woods) Date: Mon, 14 Sep 2009 14:54:00 +0800 Subject: [Live-devel] Bookmark TS file Message-ID: <3ba8ebc0909132354s1699dc70s29f7cfa003a36c69@mail.gmail.com> Hi, I am trying to bookmark a playing TS file for later re-play. What I am retrieving is the fNPT value from ClientTrickPlayState. My function is like: (step 1) updateTSRecordNum(); (step 2) fIndexFile->lookupPCRFromTSPacketNum(fTSRecordNum, False, fNPT, fIxRecordNum); (step 3) return fNPT But from the result, I saw the fNPT increases exponentially after above function is called repeatedly. I suspect it is due to the repeated call of updateTSRecordNum(). Could anyone tell me how to retrieve the correct NPT time value? Thank you! -- Woods -------------- next part -------------- An HTML attachment was scrubbed... URL: From gosha212 at gmail.com Mon Sep 14 14:16:32 2009 From: gosha212 at gmail.com (Georgy Steshin) Date: Tue, 15 Sep 2009 00:16:32 +0300 Subject: [Live-devel] Can't stream Mpeg4 Message-ID: <000701ca3580$a39d4700$ead7d500$@com> Hi everybody, I have a problem with streaming mpeg4. My program capture the frames from the camera encodes it to xvid and streams it. For some reason I don't see outgoing packet in wireshark and if I put a breakpoint in OutputSocket::write it never reaches it. Hope for your help. Sorry for the poor english. Here is the source code: ========Start DeviceSourceCamera.h ======= #pragma once #include #include "liveMedia.hh" #include "BasicUsageEnvironment.hh" #include "cv.h" #include "highgui.h" #include class DeviceSourceCammera: public DeviceSource { public: static DeviceSourceCammera* createNew(UsageEnvironment& env, DeviceParameters params); CvCapture* capture; protected: DeviceSourceCammera(UsageEnvironment& env, DeviceParameters params); virtual ~DeviceSourceCammera(); private: virtual void doGetNextFrame(); private: void deliverFrame(); private: DeviceParameters fParams; }; ========End DeviceSourceCamera.h ======= ========Start DeviceSourceCamera.cpp ======= #include "DeviceSourceCamera.h" DeviceSourceCammera::DeviceSourceCammera(UsageEnvironment& env, DeviceParameters params) : DeviceSource(env, params) { //initiate the device //Create a window to display capture = cvCreateCameraCapture(0); cvNamedWindow("playBack", 1); } DeviceSourceCammera* DeviceSourceCammera::createNew(UsageEnvironment& env, DeviceParameters params) { return new DeviceSourceCammera(env, params); } void DeviceSourceCammera::doGetNextFrame() { //Grab the image from the camera if (0 /* the source stops being readable */) { handleClosure(this); fprintf(stderr,"In Grab Image V4l, the source stops being readable!!!\n"); return; } deliverFrame(); } void DeviceSourceCammera::deliverFrame() { //Encode here the frame IplImage *img = 0; if (!isCurrentlyAwaitingData()) return; int width = 640; int height = 480; int frameSize; const char *filename = "test.m4v"; Revel_Error revError; if (REVEL_API_VERSION != Revel_GetApiVersion()) { printf("ERROR: Revel version mismatch!\n"); printf("Headers: version %06x, API version %d\n", REVEL_VERSION, REVEL_API_VERSION); printf("Library: version %06x, API version %d\n", Revel_GetVersion(), Revel_GetApiVersion()); exit(1); } int encoderHandle; revError = Revel_CreateEncoder(&encoderHandle); if (revError != REVEL_ERR_NONE) { printf("Revel Error while creating encoder: %d\n", revError); exit(1); } Revel_Params revParams; Revel_InitializeParams(&revParams); revParams.width = width; revParams.height = height; 
revParams.frameRate = 25.0f; revParams.quality = 1.0f; revParams.codec = REVEL_CD_XVID; revError = Revel_EncodeStart(encoderHandle, filename, &revParams); if (revError != REVEL_ERR_NONE) { printf("Revel Error while starting encoding: %d\n", revError); exit(1); } //Grab image do { cvGrabFrame(capture); img=cvRetrieveFrame(capture); cvShowImage("playBack", img); cvWaitKey(20); }while(img->imageData == 0); Revel_VideoFrame frame; frame.width = width; frame.height = height; frame.bytesPerPixel = 3; frame.pixelFormat = REVEL_PF_BGR; frame.pixels = new int[img->imageSize]; memset(frame.pixels, 0, img->imageSize); memcpy(frame.pixels, img->imageData, img->imageSize); revError = Revel_EncodeFrame(encoderHandle, &frame, &frameSize); if (revError != REVEL_ERR_NONE) { printf("Revel Error while writing frame: %d\n", revError); exit(1); } if (img->imageSize > fMaxSize) { frameSize = fMaxSize; } else frameSize = img->imageSize; fFrameSize = frameSize; // delete fTo; fTo = new unsigned char[img->imageSize]; memcpy(fTo, frame.pixels, frameSize); Revel_DestroyEncoder(encoderHandle); nextTask() = envir().taskScheduler().scheduleDelayedTask(0, (TaskFunc*) afterGetting,this); } DeviceSourceCammera::~DeviceSourceCammera() { //printf("DeviceSourceCammera Deconstructor..."); //cvReleaseCapture(&capture); //cvDestroyWindow("playBack"); } ========End DeviceSourceCamera.cpp ======= ======== (Main) Start WebCam.cpp ============ #include "liveMedia.hh" #include "BasicUsageEnvironment.hh" #include "GroupsockHelper.hh" #include "DeviceSourceCamera.h" UsageEnvironment* env; MPEG4VideoStreamFramer* videoSource; RTPSink* videoSink; DeviceSourceCammera* fileSource; void afterPlaying(void* /*clientData*/); void play(); int main() { TaskScheduler* scheduler = BasicTaskScheduler::createNew(); env = BasicUsageEnvironment::createNew(*scheduler); char const* destinationAddressStr = "192.168.0.7"; struct in_addr destinationAddress; destinationAddress.s_addr = our_inet_addr(destinationAddressStr); const unsigned short rtpPortNum = 8888; const unsigned short rtcpPortNum = rtpPortNum+1; const unsigned char ttl = 7; const Port rtpPort(rtpPortNum); const Port rtcpPort(rtcpPortNum); Groupsock rtpGroupsock(*env, destinationAddress, rtpPort, ttl); Groupsock rtcpGroupsock(*env, destinationAddress, rtcpPort, ttl); videoSink = MPEG4ESVideoRTPSink::createNew(*env, &rtpGroupsock, 96); const unsigned estimatedSessionBandwidth = 4500; const unsigned maxCNAMElen = 100; unsigned char CNAME[maxCNAMElen+1]; gethostname((char*)CNAME, maxCNAMElen); CNAME[maxCNAMElen] = '\0'; RTCPInstance* rtcp = RTCPInstance::createNew(*env, &rtcpGroupsock, estimatedSessionBandwidth, CNAME, videoSink, NULL, False); DeviceParameters params; fileSource = DeviceSourceCammera::createNew(*env, params); if (fileSource == NULL) { *env << "Unable to open source\n"; exit(1); } play(); env->taskScheduler().doEventLoop(); return 0; } void play() { FramedSource* videoES = fileSource; // Create a framer for the Video Elementary Stream: videoSource = MPEG4VideoStreamFramer::createNew(*env, videoES); // Finally, start playing: *env << "Beginning to read from file...\n"; videoSink->startPlaying(*videoSource, afterPlaying, videoSink); } void afterPlaying(void* /*clientData*/) { *env << "...done reading from file\n"; Medium::close(videoSource); // Start playing once again: play(); } Georgy Steshin, image001 Hutsot Shefayim P.O.B 348 Shefayim Israel 60990 Mobile: +972-52-6222528 Office: +972-9-788 7342 Fax: +972-9-9576021 -------------- next part -------------- An HTML attachment was 
scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: image/jpeg Size: 3016 bytes Desc: not available URL: From woods.biz at gmail.com Mon Sep 14 19:12:41 2009 From: woods.biz at gmail.com (Woods) Date: Tue, 15 Sep 2009 10:12:41 +0800 Subject: [Live-devel] Fwd: Bookmark TS file In-Reply-To: <3ba8ebc0909132354s1699dc70s29f7cfa003a36c69@mail.gmail.com> References: <3ba8ebc0909132354s1699dc70s29f7cfa003a36c69@mail.gmail.com> Message-ID: <3ba8ebc0909141912o115e5d5bvf3fc3f9fe143792b@mail.gmail.com> Hi, I think I found the answer. I should not call updateTSRecordNum(), which should be called only when the playing or scale status changes. I just add fRSRecordNum to fFramer's packet counter or Trick filter's index/packet counter. Regards, Woods ---------- Forwarded message ---------- From: Woods Date: Mon, Sep 14, 2009 at 2:54 PM Subject: Bookmark TS file To: LIVE555 Streaming Media - development & use Hi, I am trying to bookmark a playing TS file for later re-play. What I am retrieving is the fNPT value from ClientTrickPlayState. My function is like: (step 1) updateTSRecordNum(); (step 2) fIndexFile->lookupPCRFromTSPacketNum(fTSRecordNum, False, fNPT, fIxRecordNum); (step 3) return fNPT But from the result, I saw the fNPT increases exponentially after the above function is called repeatedly. I suspect it is due to the repeated call of updateTSRecordNum(). Could anyone tell me how to retrieve the correct NPT time value? Thank you! -- Woods -- Woods -------------- next part -------------- An HTML attachment was scrubbed... URL: From kidjan at gmail.com Tue Sep 15 10:21:32 2009 From: kidjan at gmail.com (Jeremy Noring) Date: Tue, 15 Sep 2009 10:21:32 -0700 Subject: [Live-devel] strstream.h Message-ID: We ran into some compile errors for "strstrea.h", did a bit of research and ended up making this change in both Groupsock.cpp and NetInterface.cpp: #ifndef NO_STRSTREAM #if (defined(__WIN32__) || defined(_WIN32)) && !defined(__MINGW32__) #include #else #if defined(__GNUC__) && (__GNUC__ > 3 || __GNUC__ == 3 && __GNUC_MINOR__ > 0) #include #else #include #endif #endif #endif sstream is the preferred choice. strstrea.h and strstream.h are both deprecated (the former is no longer present in VS 2008 distributions, for example). We did some testing using just sstream and found that it worked for us on both our Windows builds and Linux builds. I see there was some talk in the mailing list about changing this ~5 years ago, but I think it might be safe to finally kick strstrea.h in the face. We tested this change on our Windows, Linux and embedded Linux builds, and it seems to work well, but as always I'm not sure how much variety there is in people's build environments. -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Sep 15 13:59:45 2009 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 15 Sep 2009 13:59:45 -0700 Subject: [Live-devel] strstream.h In-Reply-To: References: Message-ID: >sstream is the preferred choice. strstrea.h and strstream.h are >both deprecated (the former is no longer present in VS 2008 >distributions, for example). We did some testing using just sstream >and found that it worked for us on both our Windows builds and Linux >builds.
please confirm that <sstream> (and *not* <strstream>, <strstrea.h>, or <strstream.h>) is now the preferred thing to #include for Windows compilers ('Visual Studio'), *and also* MINGW, *and also* GNU GCC. If someone can confirm this, I'll change the code in the next release to include <sstream> by default. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From ddennedy at gmail.com Tue Sep 15 14:41:38 2009 From: ddennedy at gmail.com (Dan Dennedy) Date: Tue, 15 Sep 2009 14:41:38 -0700 Subject: [Live-devel] Once again, SDP support for Live555 & interaction with FFMpeg ... In-Reply-To: <4A968FBB.7090800@impathnetworks.com> References: <4A968FBB.7090800@impathnetworks.com> Message-ID: <27dd34e60909151441v29c1ad3anfdf394d90dbe39eb@mail.gmail.com> On Thu, Aug 27, 2009 at 6:52 AM, Sandy Walsh wrote: > Hi, > > We have a multicast stream, initiated from another process, which generates > an .SDP file for the stream > > v=0 > o=MangoDSP 126 14843616424497153183 IN IP4 192.168.0.62 > s=Mango DSP Audio/Video > m=video 7170 RTP/AVP 96 > a=rtpmap:96 H264/90000 > c=IN IP4 232.0.0.11 > a=fmtp:96 > packetization-mode=1;profile-level-id=42001E;sprop-parameter-sets=Z0KAHkLaAtD0QA==,aEjgGody > a=control:__StreamID=270385256 > > We've searched through the Live555 archives and there is mention of passing > this to MediaSession::createNew(), but we're sort of stumped on making this > a reality. > > Instead let me explain what we hope to achieve and perhaps someone can give > some guidance? > > Ultimately our plan is to pass the stream onto FFMpeg for decoding/storage. > When we try to open this stream directly with FFMpeg it takes forever (and > often never) for the program to lock onto the stream, so perhaps there is a > problem with ffmpeg and this h264 stream. But, when we open a unicast stream > with openRTSP and pipe the output to ffmpeg it works fine. > > My question is, what processing beyond the RTSP negotiation does openRTSP do > on the actual video stream? Does it strip off some network wrapper from the > stream and pass a raw h.264 stream along for processing? yes, it does not process the video stream other than re-framing from network packetization. > We're not really clear what value openRTSP brings to us solving our problem > if we aren't using the RTSP handshake? If it helps, I have made a testProgs/openSDP.cpp. You will have to adjust the Makefiles to build it (or compile manually), and it supports most of the options that openRTSP does since it just uses playCommon. Um, I suppose it could be slightly out-of-date - I think the last snapshot I compiled it against was from February of this year. Ross can consider including it if he thinks it will help reduce his support burden (I surrender all rights to the following code).
#include "playCommon.hh" Medium* createClient(UsageEnvironment& env, int verbosityLevel, char const* applicationName) { return ServerMediaSession::createNew(env); } char* getOptionsResponse(Medium* client, char const* url) { return NULL; } char* getSDPDescriptionFromURL(Medium* client, char const* url, char const* username, char const* password, char const* /*proxyServerName*/, unsigned short /*proxyServerPortNum*/, unsigned short /*clientStartPort*/) { FILE *f = fopen(url, "r"); char data[4096]; data[0] = 0; if (f) { fread(data, sizeof(data), 1, f); fclose(f); } return strdup(data); } Boolean clientSetupSubsession(Medium* client, MediaSubsession* subsession, Boolean streamUsingTCP) { subsession->sessionId = "mumble"; // anything that's non-NULL will work return True; } Boolean clientStartPlayingSession(Medium* client, MediaSession* session) { return True; } Boolean clientTearDownSession(Medium* client, MediaSession* session) { return True; } Boolean allowProxyServers = False; Boolean controlConnectionUsesTCP = False; Boolean supportCodecSelection = False; char const* clientProtocolName = "SDP"; From ben at decadent.org.uk Tue Sep 15 18:20:53 2009 From: ben at decadent.org.uk (Ben Hutchings) Date: Wed, 16 Sep 2009 02:20:53 +0100 Subject: [Live-devel] strstream.h In-Reply-To: References: Message-ID: <1253064053.4989.2.camel@localhost> On Tue, 2009-09-15 at 13:59 -0700, Ross Finlayson wrote: > >sstream is the preferred choice. strstrea.h and strstream.h are > >both deprecated (the former is no longer present in VS 2008 > >distributions, for example). We did some testing using just sstream > >and found that it worked for us on both our windows builds and linux > >builds. > > Can someone with a credible, professional email address (i.e., not > "@gmail", "@yahoo" "@hotmail" etc.) please confirm that > (and *not* , , or ) is now the > preferred thing to #include for Windows compilers ('Visual Studio'), > *and also* MINGW, *and also* GNU GCC. > > If someone can confirm this, I'll change the code in the next release > to include by default. and are both part of the standard C++ library and have been included in the Visual C++ and GCC libraries for many years (since Visual C++ 6 and GCC 3.0, if I remember correctly). Ben. -- Ben Hutchings This sentence contradicts itself - no actually it doesn't. -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 828 bytes Desc: This is a digitally signed message part URL: From belloni at imavis.com Wed Sep 16 09:11:40 2009 From: belloni at imavis.com (Cristiano Belloni) Date: Wed, 16 Sep 2009 18:11:40 +0200 Subject: [Live-devel] Set a limit for RTSP Client connections. Message-ID: <4AB10E3C.90503@imavis.com> Hello Ross and everyone else, I'm using live555 RTSP server (unicast) in a system with limited memory and CPU resources. When too many clients connect to the server, the CPU can't cope with the load, and I start losing frames or get read errors from the pipe I read the frames from (it's a realtime application, and I think the kernel starts to be unstable when the system CPU time is very high). So, I'd like to set a static limit to the number of users who can get the live stream at the same time. Is there a way? If there isn't, couldn't I just create a derived class of the RTPsinks I'm using, increase a shared counter everytime their CreateNew() method is called and decrease it in the destructor of the sink? 
Then, if the counter is over the limit, I can refuse to create a new sink in the createNewRTPSink method of my MediaSubsession (i.e., return a NULL pointer). Would it work? Additionally, the increase and decrease counter operations wouldn't have to be mutually exclusive because the library works in a non-threaded fashion, is that right? Thanks in advance, Cristiano. -- Belloni Cristiano Imavis Srl. www.imavis.com belloni at imavis.com From ankur129 at gmail.com Wed Sep 16 11:48:14 2009 From: ankur129 at gmail.com (Ankur Bhattacharjee) Date: Wed, 16 Sep 2009 11:48:14 -0700 Subject: [Live-devel] Callback architecture Message-ID: <1a302d0e0909161148h4b9d2364x706f68828766ee65@mail.gmail.com> Hi all, I am new to this library, but one of my projects needs to use it. I am having a hard time understanding the flow of the library, especially the callbacks. I went through the FAQ, but it didn't help me much. It would be great if someone can help me out in understanding the program flow or send me some pointers, links, or tutorials for the library. Thanks in advance. Ankur -- -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Sep 16 16:50:35 2009 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 16 Sep 2009 16:50:35 -0700 Subject: [Live-devel] Set a limit for RTSP Client connections. In-Reply-To: <4AB10E3C.90503@imavis.com> References: <4AB10E3C.90503@imavis.com> Message-ID: >When too many clients connect to the server, the CPU can't cope with >the load, and I start losing frames or get read errors from the pipe >I read the frames from (it's a realtime application, and I think the >kernel starts to be unstable when the system CPU time is very high). More likely: You don't have enough OS buffering on your pipe. You can probably increase this. >So, I'd like to set a static limit to the number of users who can >get the live stream at the same time. Is there a way? No, not without writing new code. >If there isn't, couldn't I just create a derived class of the >RTPsinks I'm using, increase a shared counter every time their >CreateNew() method is called and decrease it in the destructor of >the sink? A better solution would be to make changes just to the RTSP server, rather than to the (more general) RTP code. I.e., subclass "RTSPServer" and "RTSPClientSession". Specifically: - Write a subclass of "RTSPServer" that - adds a 'client session counter' member variable - redefines the "createNewClientSession()" member function to - just return if the counter is already at its limit - otherwise increment the counter and create a new object of your "RTSPClientSession" subclass (instead of the original "RTSPClientSession") - Write a subclass of "RTSPClientSession" that: - has its own destructor that decrements the parent server's counter (A sketch of these two subclasses follows below.) >Additionally, the increase and decrease counter operations wouldn't >have to be mutually exclusive because the library works in a >non-threaded fashion, is that right? Yes. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From Alastair.Bain at Cubic.com Wed Sep 16 20:32:55 2009 From: Alastair.Bain at Cubic.com (Alastair Bain) Date: Thu, 17 Sep 2009 15:32:55 +1200 Subject: [Live-devel] non monotonic timestamps in mpeg transport stream Message-ID: I've got a source (hardware encoder) which delivers MPEG2 transport stream frames with PCRs that are non-monotonic. This completely breaks the MPEG2TransportStreamFramer as it uses these to determine the packet duration.
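The sketch promised above. This is schematic only: the "createNewClientSession()" parameter list and the constructor plumbing vary between versions of the library, so the commented-out arguments are placeholders - copy the real signatures from your "RTSPServer.hh":

class LimitedRTSPServer; // forward declaration

class LimitedClientSession: public RTSPServer::RTSPClientSession {
public:
  // Base-class constructor arguments omitted; pass them through unchanged.
  LimitedClientSession(LimitedRTSPServer& server /*, ... */);
  virtual ~LimitedClientSession();
private:
  LimitedRTSPServer& fLimitedServer;
};

class LimitedRTSPServer: public RTSPServer {
public:
  unsigned fNumClientSessions; // the 'client session counter'
  unsigned fMaxClientSessions; // the static limit
protected:
  // ASSUMED signature - check your version of "RTSPServer.hh":
  virtual RTSPClientSession* createNewClientSession(/* base-class params */) {
    if (fNumClientSessions >= fMaxClientSessions) return NULL; // refuse
    ++fNumClientSessions;
    return new LimitedClientSession(*this /*, base-class params */);
  }
};

LimitedClientSession::~LimitedClientSession() {
  --fLimitedServer.fNumClientSessions; // runs when the session is reclaimed
}

No locking is needed around the counter: as the "Yes" above confirms, everything runs in the single-threaded event loop.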
Anyone have any idea whether (a) it's even valid for the hardware encoder to do this and (b) how to calculate a sensible duration when this is the case? The current approach I'm taking is to append a timecode to the start of the packet (to effectively go from TS to M2TS) and use this to calculate the duration instead. I should mention that playback of the data is fine if it is saved directly to disk and then played using VLC (for example); I guess VLC is able to read ahead and deliver the frames at the correct times. Alastair Bain From kidjan at gmail.com Wed Sep 16 19:02:50 2009 From: kidjan at gmail.com (Jeremy Noring) Date: Wed, 16 Sep 2009 19:02:50 -0700 Subject: [Live-devel] Question about a FAQ answer Message-ID: The Live555 FAQ has this question and answer: Q: For many of the "test*Streamer" test programs, the built-in RTSP server is optional (and disabled by default). For "testAMRAudioStreamer", "testMPEG4VideoStreamer" and "testWAVAudioStreamer", however, the built-in RTSP server is mandatory. Why? A: For those media types (AMR audio, MPEG-4 video, and PCM audio, respectively), the stream includes some codec-specific parameters that are communicated to clients out-of-band, in an SDP description. Because these parameters - and thus the SDP description - can vary from stream to stream, the only effective way to communicate this SDP description to clients is using the standard RTSP protocol. Therefore, the RTSP server is a mandatory part of these test programs. My question is: does this same principle apply to H.264 as well (i.e., the sprop-parameter-sets and profile-level-id fields in the SDP exchange)? Can't the sprop-parameter-sets/profile-level-id be sent in the stream itself, or is a secondary communication line essential to make H.264 (and AMR/MPEG-4/PCM) understandable by a receiving application? Thanks. From sharmi.chatterjee at gmail.com Wed Sep 16 22:00:25 2009 From: sharmi.chatterjee at gmail.com (Sharmistha Chatterjee) Date: Thu, 17 Sep 2009 08:00:25 +0300 Subject: [Live-devel] Debugging live 555 on vlc Message-ID: <806945de0909162200j1a5ba8f5y25a76d102c99bb3c@mail.gmail.com> Hi, I am new to live 555. I want to get the debug messages of live 555 while I am getting the debug messages of VLC. I am using the linux/gdb config file of live 555 to generate make files. Also I am running VLC on maemo. Can anyone give me some pointers on how I can get the debug messages of live 555 on the console or into a file. Sharmistha -------------- next part -------------- An HTML attachment was scrubbed... URL: From sebastien-devel at celeos.eu Wed Sep 16 23:56:28 2009 From: sebastien-devel at celeos.eu (Sébastien Escudier) Date: Thu, 17 Sep 2009 08:56:28 +0200 Subject: [Live-devel] Debugging live 555 on vlc In-Reply-To: <806945de0909162200j1a5ba8f5y25a76d102c99bb3c@mail.gmail.com> References: <806945de0909162200j1a5ba8f5y25a76d102c99bb3c@mail.gmail.com> Message-ID: <1253170588.4ab1dd9c1d4dc@imp.celeos.eu> Hi, Quoting Sharmistha Chatterjee : > Hi, > > I am new to live 555. I want to get the debug messages of live 555 while I > am getting the debug messages of VLC. I am using the linux/gdb config file > of live 555 to generate make files. Also I am running VLC on maemo. > > Can anyone give me some pointers on how I can get the debug messages of live > 555 on the console or into a file. If you start vlc with -vvv from the command line, you will see all RTSP commands in the console.
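On Alastair's non-monotonic PCR question above: a small self-contained helper (not library code) for pulling the PCR out of each 188-byte packet, following the standard ISO/IEC 13818-1 layout, which makes it easy to dump a file's PCRs and check them for monotonicity before indexing or streaming:

#include <stdint.h>

// Returns the PCR of one 188-byte TS packet in 27 MHz units, or -1 if the
// packet carries no PCR (adaptation field absent, too short, or flag unset).
int64_t extractPCR(unsigned char const pkt[188]) {
  if (pkt[0] != 0x47) return -1;       // bad sync byte
  if ((pkt[3] & 0x20) == 0) return -1; // no adaptation field
  if (pkt[4] < 7) return -1;           // adaptation field too short for a PCR
  if ((pkt[5] & 0x10) == 0) return -1; // PCR flag not set
  int64_t base = ((int64_t)pkt[6] << 25) | (pkt[7] << 17)
               | (pkt[8] << 9) | (pkt[9] << 1) | (pkt[10] >> 7);
  int ext = ((pkt[10] & 0x01) << 8) | pkt[11];
  return base * 300 + ext;             // 90 kHz base * 300 + 27 MHz extension
}

Printing successive values from a capture shows immediately whether the encoder's PCRs really run backwards, or whether the jumps are at least flagged as discontinuities (the discontinuity_indicator is bit 0x80 of pkt[5]).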
From finlayson at live555.com Wed Sep 16 23:46:38 2009 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 16 Sep 2009 23:46:38 -0700 Subject: [Live-devel] non monotonic timestamps in mpeg transport stream In-Reply-To: References: Message-ID: >I've got a source (hardware encoder) which delivers MPEG2 transport >stream frames with PCRs that are non-monotonic. This completely breaks >the MPEG2TransportStreamFramer as it uses these to determine the packet >duration. Anyone have any idea whether (a) it's even valid for the >hardware encoder to do this I don't believe it is, no. > and (b) how to calculate a sensible duration >when this is the case? I suggest reprocessing your Transport Stream files (before you index and/or stream them) so that their PCRs make sense. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Wed Sep 16 23:48:16 2009 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 16 Sep 2009 23:48:16 -0700 Subject: [Live-devel] Question about a FAQ answer In-Reply-To: References: Message-ID: >The Live555 FAQ has this question and answer: > >Q: For many of the "test*Streamer" test programs, the built-in RTSP >server is optional (and disabled by default). For >"testAMRAudioStreamer", "testMPEG4VideoStreamer" and >"testWAVAudioStreamer", however, the built-in RTSP server is >mandatory. Why? > >A: For those media types (AMR audio, MPEG-4 video, and PCM audio, >respectively), the stream includes some codec-specific parameters that >are communicated to clients out-of-band, in an SDP description. Because >these parameters - and thus the SDP description - can vary from stream >to stream, the only effective way to communicate this SDP description >to clients is using the standard RTSP protocol. Therefore, the RTSP >server is a mandatory part of these test programs. > >My question is: does this same principle apply to H.264 as well Yes. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From amitye at gmail.com Wed Sep 16 23:43:12 2009 From: amitye at gmail.com (Amit Yedidia) Date: Thu, 17 Sep 2009 09:43:12 +0300 Subject: [Live-devel] source contribution Message-ID: <8c7051d40909162343y47e56f4aw179659d8df150908@mail.gmail.com> Hi Ross, I am attaching modified code for Live555. It addresses two issues: 1. Registering RTSPClientSessions in the RTSP server, so that when the server is deleted while sessions are still active, the server can delete them. 2. Access to the RTP header extension. I hope you will merge those changes in a future release. Amit Yedidia. -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: unLive.rar Type: application/rar Size: 37926 bytes Desc: not available URL: From 12091637 at qq.com Thu Sep 17 00:13:52 2009 From: 12091637 at qq.com (12091637) Date: Thu, 17 Sep 2009 15:13:52 +0800 Subject: [Live-devel] Question about streaming a H.264 video file via RTP Message-ID: Dear all, I am working with the live555 library for H.264 video file streaming via RTP. When I create a "ByteStreamFileSource" instance to stream a *.h264 video file, how can I set the "PlayTimePerFrame" and "preferredFrameSize" values? How do I get the "preferredFrameSize" and "playTimePerFrame" parameters based on an existing *.h264 file? How do I calculate them from the information in the *.h264 file? If I set the two parameters to zero, my stream will be incompatible with existing, standards-compliant clients (e.g., VLC, MPlayer).
Sometimes VLC will crash when playing my stream (rtsp://192.168.10.62/test.h264). Maybe it's a dumb question; I am new to this industry. Any advice would be a great help. Thanks -------------- next part -------------- An HTML attachment was scrubbed... URL: From sharmi.chatterjee at gmail.com Thu Sep 17 01:37:52 2009 From: sharmi.chatterjee at gmail.com (Sharmistha Chatterjee) Date: Thu, 17 Sep 2009 11:37:52 +0300 Subject: [Live-devel] Debugging live 555 on vlc In-Reply-To: <1253170588.4ab1dd9c1d4dc@imp.celeos.eu> References: <806945de0909162200j1a5ba8f5y25a76d102c99bb3c@mail.gmail.com> <1253170588.4ab1dd9c1d4dc@imp.celeos.eu> Message-ID: <806945de0909170137g14f9870bg8ad430144b85a677@mail.gmail.com> Hi, I can see all the RTSP commands (the OPTIONS, DESCRIBE, and SETUP requests and their parameters), but I have also added debug messages between function calls, in the same style as the ones already inside the live555 library, and I cannot see those messages, whether they are written to stdout or stderr. So where does the output of stderr and stdout go when the library's debug flag is enabled? One such example is /live/liveMedia/RTCP.cpp. The configure target used is linux-gdb and I am building inside a scratchbox environment, running on maemo. It would be nice to know what changes need to be done to print the output of the debug flag to the console or to a file while running VLC. Sharmistha On Thu, Sep 17, 2009 at 9:56 AM, Sébastien Escudier wrote: > Hi, > > > Quoting Sharmistha Chatterjee : > > > Hi, > > > > I am new to live 555. I want to get the debug messages of live 555 while > I > > am getting the debug messages of VLC. I am using the linux/gdb config > file > > of live 555 to generate make files. Also I am running VLC on maemo. > > > > Can anyone give me some pointers on how I can get the debug messages of > live > > 555 on the console or into a file. > > If you start vlc with -vvv from the command line, you will see all RTSP > commands in > the console. > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > -- Sharmistha Chatterjee. BTech I.T -------------- next part -------------- An HTML attachment was scrubbed... URL: From SRawling at pelco.com Thu Sep 17 16:56:16 2009 From: SRawling at pelco.com (Rawling, Stuart) Date: Thu, 17 Sep 2009 16:56:16 -0700 Subject: [Live-devel] IncreaseSendBufferTo / RTSP_PARAM_STRING_MAX / dynamic rtp payload types Message-ID: I was investigating an issue where I was capturing an outgoing stream from my RtspServer, and I noticed there were missing packets when I captured using tcpdump. I increased the send buffer size using increaseSendBufferTo, and this helped the issue. However, I was curious as to what is the difference between increaseSendBufferTo and setSendBufferSize? (And the appropriate ReceiveBuffer calls) (setSendBufferTo does not seem to be called). On a separate point: can we set the RTSP_PARAM_STRING_MAX to something larger than 100? I use 256, but is there any reason it cannot be PATH_MAX (which is OS independent, though I am not sure if this is defined on all systems). I use a larger size to allow more URL parameters on the streams on my server. Another thing I have noticed is that there are certain non-conforming devices that can change their RTP payload midstream. To handle this on my client end I added the ability to ignore the RTP payload type being dynamic.
I did this by not checking the payload type if the expected type was set to zero: --- vendor/liveMedia/MultiFramedRTPSource.cpp +++ custom/liveMedia/MultiFramedRTPSource.cpp @@ -263,7 +263,10 @@ bPacket->removePadding(numPaddingBytes); } // Check the Payload Type. - if ((unsigned char)((rtpHdr&0x007F0000)>>16) + // Handle a payload type of 0 + // Which means do not worry about the payload type + if (0 != source->rtpPayloadFormat() && + (unsigned char)((rtpHdr&0x007F0000)>>16) != source->rtpPayloadFormat()) { break; } Regards, Stuart -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Sep 17 23:09:28 2009 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 17 Sep 2009 23:09:28 -0700 Subject: [Live-devel] IncreaseSendBufferTo / RTSP_PARAM_STRING_MAX / dynamic rtp payload types In-Reply-To: References: Message-ID: >However, I was curious as to what is the difference between >increaseSendBufferTo and setSendBufferSize? Remember, You Have Complete Source Code. >On a separate point: can we set the RTSP_PARAM_STRING_MAX to >something larger than 100? Sure, if you wish. > is there any reason it cannot be PATH_MAX Well, PATH_MAX means something different (I think) >(which is OS independent - though I am not sure if this is defined on >all systems). I use a larger size to allow more URL parameters on >the streams on my server. OK, would 200 be large enough for you? (I generally don't like using a power-of-two (like 256) for these constants if there's not a real reason for it to be a power-of-two.) If so, I can change it in the next released version of the code. >Another thing I have noticed is that there are certain >non-conforming devices that can change their RTP payload midstream. To >handle this on my client end I added the ability to ignore the RTP >payload type being dynamic. Unfortunately, RTP payload code 0 is a legal code - for u-law PCM audio. The ability to properly handle (and demultiplex, as appropriate) multiple RTP payload codes in a single stream is a feature that will be added sometime in the future. -- Ross Finlayson Live Networks, Inc.
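A side note on Stuart's first question, since the answer is indeed in the source: the difference the names suggest - and, modulo details, what "GroupsockHelper.cpp" implements - is that a 'set' style call hands the requested size straight to setsockopt(), while an 'increase' style call never shrinks a buffer that is already larger. A schematic sketch, not the library's actual code (on Windows the header would be winsock2.h):

#include <sys/socket.h>

static unsigned getBufferSize(int sock, int optname) {
  unsigned cur = 0; socklen_t len = sizeof cur;
  getsockopt(sock, SOL_SOCKET, optname, (char*)&cur, &len);
  return cur;
}

// 'set' semantics: request exactly 'size'; the OS may still clamp it.
unsigned setBufferTo(int sock, int optname, unsigned size) {
  setsockopt(sock, SOL_SOCKET, optname, (char*)&size, sizeof size);
  return getBufferSize(sock, optname); // what the OS actually granted
}

// 'increase' semantics: never shrink an existing, larger buffer.
unsigned increaseBufferTo(int sock, int optname, unsigned size) {
  unsigned cur = getBufferSize(sock, optname);
  if (size <= cur) return cur;
  return setBufferTo(sock, optname, size);
}

// optname is SO_SNDBUF for send buffers and SO_RCVBUF for receive buffers.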
http://www.live555.com/ From kidjan at gmail.com Thu Sep 17 09:44:15 2009 From: kidjan at gmail.com (Jeremy Noring) Date: Thu, 17 Sep 2009 09:44:15 -0700 Subject: [Live-devel] Question about streaming a H.264 video file via RTP In-Reply-To: References: Message-ID: On Thu, Sep 17, 2009 at 12:13 AM, 12091637 <12091637 at qq.com> wrote: > Dear all, > I am working with live555 library for H.264 video file streaming via RTP. > When I creat "ByteStreamFileSource" instance to stream *.h264 video file, > how can I set > "PlayTimePerFrame" and "preferredFrameSize" value?How to get the > "preferredFrameSize" and > "playTimePerFrame" parameters base on existing *.h264 file? How to calculate > them from the information of *.h264 file. > If I set the two?parameters to zero,?my stream will be incompatible with > existing, standards-compliant clients (e.g.,?VLC,MPlayer). > Sometimes VLC will crash when play my stream( > rtsp://192.168.10.62/test.h264) To get the preferredFrameSize value, you'll have to parse the h264 bytestream; read the H.264 standard to figure out how to do this (you'll likely need to read the standard to get your streaming working anyway). You cannot get "playTimePerFrame" from a ".h264" file because raw h264 byte streams contain no timing information (H.264 encoders themselves have no notion of time, only successive frames). From sharda.murthi at gmail.com Thu Sep 17 21:14:03 2009 From: sharda.murthi at gmail.com (Sharda Murthi) Date: Fri, 18 Sep 2009 00:14:03 -0400 Subject: [Live-devel] Problem faced while using openRTSP and testOnDemandRTSPServer Message-ID: Hi, I have the latest version of live555 mediaServer installed on Ubuntu on my system. On running the live555mediaserver, I am able to stream a file present in that directory using rtsp:// However, while trying to stream using testOnDemandRTSPServer and openRTSP, I get the following error. Command: openRTSP -n rtsp://192.168.1.102:8554/toycom13.mpg Sending request: OPTIONS rtsp://192.168.1.102:8554/toycom13.mpg RTSP/1.0 CSeq: 1 User-Agent: openRTSP (LIVE555 Streaming Media v2008.07.24) Received OPTIONS response: RTSP/1.0 200 OK CSeq: 1 Date: Fri, Sep 18 2009 03:53:40 GMT Public: OPTIONS, DESCRIBE, SETUP, TEARDOWN, PLAY, PAUSE, GET_PARAMETER, SET_PARAMETER Sending request: DESCRIBE rtsp://192.168.1.102:8554/toycom13.mpg RTSP/1.0 CSeq: 2 Accept: application/sdp User-Agent: openRTSP (LIVE555 Streaming Media v2008.07.24) Received DESCRIBE response: RTSP/1.0 404 Stream Not Found CSeq: 2 Date: Fri, Sep 18 2009 03:53:40 GMT Failed to get a SDP description from URL "rtsp:// 192.168.1.102:8554/toycom13.mpg": cannot handle DESCRIBE response: RTSP/1.0 404 Stream Not Found What does this error mean and how do I rectify it? Why do I see this error only with testOnDemandRTSPServer and openRTSP? Appreciate any help. Thanks. -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Sep 18 01:20:50 2009 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 18 Sep 2009 01:20:50 -0700 Subject: [Live-devel] Problem faced while using openRTSP and testOnDemandRTSPServer In-Reply-To: References: Message-ID: >I have the latest version of live555 mediaServer installed on Ubuntu >on my system. On running the live555mediaserver, I am able to stream >a file present in that directory using rtsp:// > >However, while trying to stream using testOnDemandRTSPServer and >openRTSP, I get the following error. [...] 
>Received DESCRIBE response: RTSP/1.0 404 Stream Not Found That's because "testOnDemandRTSPServer" is just a demonstration application that uses a few, 'hard-wired' file names. Unless you want to change the file name(s) in the "testOnDemandRTSPServer" code, just continue to use "live555MediaServer" instead. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From SRawling at pelco.com Fri Sep 18 05:43:08 2009 From: SRawling at pelco.com (Rawling, Stuart) Date: Fri, 18 Sep 2009 05:43:08 -0700 Subject: [Live-devel] IncreaseSendBufferTo / RTSP_PARAM_STRING_MAX / dynamic rtp payload types In-Reply-To: Message-ID: >OK, would 200 be large enough for you? (I generally don't like using >a power-of-two (like 256) for these constants if there's not a real >reason for it to be a power-of-two.) If so, I can change it in the >next released version of the code. I believe 200 should be fine. Thanks. >The ability to properly handle (and demultiplex, as appropriate) >multiple RTP payload codes in a single stream is a feature that will >be added sometime in the future. Ok. Thanks. This is a feature I look forward to. Will this also include better access to any RTP header extensions? -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sat Sep 19 06:52:09 2009 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 19 Sep 2009 06:52:09 -0700 Subject: [Live-devel] Sorry about that last spam Message-ID: Argh! Damn spammers have started impersonating me. I accidentally let one get through.... -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From ben at decadent.org.uk Sun Sep 20 09:08:14 2009 From: ben at decadent.org.uk (Ben Hutchings) Date: Sun, 20 Sep 2009 17:08:14 +0100 Subject: [Live-devel] [PATCH] Do not shortcut recursive make Message-ID: <1253462894.3415.81.camel@localhost> Currently, after a successful (or even partially successful) build, a subsequent 'make' at the top level will do nothing even if source files have changed. This does not seem to be useful behaviour. Therefore, this patch changes the top-level Makefile to run make in each subdirectory unconditionally. Ben.
diff --git a/Makefile.tail b/Makefile.tail index 6a8f549..6838011 100644 --- a/Makefile.tail +++ b/Makefile.tail @@ -1,40 +1,20 @@ ##### End of variables to change LIVEMEDIA_DIR = liveMedia -LIVEMEDIA_LIB = $(LIVEMEDIA_DIR)/libliveMedia.$(LIB_SUFFIX) GROUPSOCK_DIR = groupsock -GROUPSOCK_LIB = $(GROUPSOCK_DIR)/libgroupsock.$(LIB_SUFFIX) USAGE_ENVIRONMENT_DIR = UsageEnvironment -USAGE_ENVIRONMENT_LIB = $(USAGE_ENVIRONMENT_DIR)/libUsageEnvironment.$(LIB_SUFFIX) BASIC_USAGE_ENVIRONMENT_DIR = BasicUsageEnvironment -BASIC_USAGE_ENVIRONMENT_LIB = $(BASIC_USAGE_ENVIRONMENT_DIR)/libBasicUsageEnvironment.$(LIB_SUFFIX) TESTPROGS_DIR = testProgs -TESTPROGS_APP = $(TESTPROGS_DIR)/testMP3Streamer$(EXE) MEDIA_SERVER_DIR = mediaServer -MEDIA_SERVER_APP = $(MEDIA_SERVER_DIR)/mediaServer$(EXE) -ALL = $(LIVEMEDIA_LIB) \ - $(GROUPSOCK_LIB) \ - $(USAGE_ENVIRONMENT_LIB) \ - $(BASIC_USAGE_ENVIRONMENT_LIB) \ - $(TESTPROGS_APP) \ - $(MEDIA_SERVER_APP) -all: $(ALL) - - -$(LIVEMEDIA_LIB): +all: cd $(LIVEMEDIA_DIR) ; $(MAKE) -$(GROUPSOCK_LIB): cd $(GROUPSOCK_DIR) ; $(MAKE) -$(USAGE_ENVIRONMENT_LIB): cd $(USAGE_ENVIRONMENT_DIR) ; $(MAKE) -$(BASIC_USAGE_ENVIRONMENT_LIB): cd $(BASIC_USAGE_ENVIRONMENT_DIR) ; $(MAKE) -$(TESTPROGS_APP): $(LIVEMEDIA_LIB) $(GROUPSOCK_LIB) $(USAGE_ENVIRONMENT_LIB) $(BASIC_USAGE_ENVIRONMENT_LIB) cd $(TESTPROGS_DIR) ; $(MAKE) -$(MEDIA_SERVER_APP): $(LIVEMEDIA_LIB) $(GROUPSOCK_LIB) $(USAGE_ENVIRONMENT_LIB) $(BASIC_USAGE_ENVIRONMENT_LIB) cd $(MEDIA_SERVER_DIR) ; $(MAKE) clean: -- 1.6.4.3 -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 828 bytes Desc: This is a digitally signed message part URL: From ben at decadent.org.uk Sun Sep 20 09:11:25 2009 From: ben at decadent.org.uk (Ben Hutchings) Date: Sun, 20 Sep 2009 17:11:25 +0100 Subject: [Live-devel] [PATCH] Build real static libraries under Linux Message-ID: <1253463085.3415.84.camel@localhost> This patch changes the Linux configs to build static libraries (archives with symbol tables) using ar rather than building combined object files. A real static library allows the final link to omit objects that are not referenced, and can be referenced multiple times on the command line without leading to multiple-definition errors. It also fixes the library order in linker command lines for executables to work with real static libraries. This is only applied to the Linux configs because it relies on the fact that GNU ar will automatically generate a symbol table. On other platforms a separate command 'ranlib' does that. Ben. diff --git a/config.armeb-uclibc b/config.armeb-uclibc index 0fa5eca..2457f7b 100644 --- a/config.armeb-uclibc +++ b/config.armeb-uclibc @@ -11,8 +11,8 @@ OBJ = o LINK = $(CROSS_COMPILE)gcc -o LINK_OPTS = -L. CONSOLE_LINK_OPTS = $(LINK_OPTS) -LIBRARY_LINK = $(CROSS_COMPILE)ld -o -LIBRARY_LINK_OPTS = $(LINK_OPTS) -r -Bstatic +LIBRARY_LINK = $(CROSS_COMPILE)ar cr +LIBRARY_LINK_OPTS = LIB_SUFFIX = a LIBS_FOR_CONSOLE_APPLICATION = LIBS_FOR_GUI_APPLICATION = diff --git a/config.armlinux b/config.armlinux index 62d44ee..ba4f761 100644 --- a/config.armlinux +++ b/config.armlinux @@ -10,8 +10,8 @@ OBJ = o LINK = $(CROSS_COMPILE)gcc -o LINK_OPTS = -L. 
CONSOLE_LINK_OPTS = $(LINK_OPTS) -LIBRARY_LINK = $(CROSS_COMPILE)ld -o -LIBRARY_LINK_OPTS = $(LINK_OPTS) -r -Bstatic +LIBRARY_LINK = $(CROSS_COMPILE)ar cr +LIBRARY_LINK_OPTS = LIB_SUFFIX = a LIBS_FOR_CONSOLE_APPLICATION = LIBS_FOR_GUI_APPLICATION = diff --git a/config.bfin-linux-uclibc b/config.bfin-linux-uclibc index 5dad85a..3342b7a 100644 --- a/config.bfin-linux-uclibc +++ b/config.bfin-linux-uclibc @@ -10,8 +10,8 @@ OBJ = o LINK = $(CROSS_COMPILER)g++ -o LINK_OPTS = -L. CONSOLE_LINK_OPTS = $(LINK_OPTS) -LIBRARY_LINK = $(CROSS_COMPILER)ld -o -LIBRARY_LINK_OPTS = $(LINK_OPTS) -r -Bstatic -m elf32bfinfd +LIBRARY_LINK = $(CROSS_COMPILER)ar cr +LIBRARY_LINK_OPTS = LIB_SUFFIX = a LIBS_FOR_CONSOLE_APPLICATION = LIBS_FOR_GUI_APPLICATION = diff --git a/config.bfin-uclinux b/config.bfin-uclinux index c36135c..157572e 100644 --- a/config.bfin-uclinux +++ b/config.bfin-uclinux @@ -10,8 +10,8 @@ OBJ = o LINK = $(CROSS_COMPILER)g++ -Wl,-elf2flt -o LINK_OPTS = -L. CONSOLE_LINK_OPTS = $(LINK_OPTS) -LIBRARY_LINK = $(CROSS_COMPILER)ld -o -LIBRARY_LINK_OPTS = $(LINK_OPTS) -r -Bstatic +LIBRARY_LINK = $(CROSS_COMPILER)ar cr +LIBRARY_LINK_OPTS = LIB_SUFFIX = a LIBS_FOR_CONSOLE_APPLICATION = LIBS_FOR_GUI_APPLICATION = diff --git a/config.linux b/config.linux index e111780..5c7bc8f 100644 --- a/config.linux +++ b/config.linux @@ -9,8 +9,8 @@ OBJ = o LINK = c++ -o LINK_OPTS = -L. CONSOLE_LINK_OPTS = $(LINK_OPTS) -LIBRARY_LINK = ld -o -LIBRARY_LINK_OPTS = $(LINK_OPTS) -r -Bstatic +LIBRARY_LINK = ar cr +LIBRARY_LINK_OPTS = LIB_SUFFIX = a LIBS_FOR_CONSOLE_APPLICATION = LIBS_FOR_GUI_APPLICATION = diff --git a/config.linux-gdb b/config.linux-gdb index 4c64030..87387e5 100644 --- a/config.linux-gdb +++ b/config.linux-gdb @@ -9,8 +9,8 @@ OBJ = o LINK = c++ -o LINK_OPTS = -L. CONSOLE_LINK_OPTS = $(LINK_OPTS) -LIBRARY_LINK = ld -o -LIBRARY_LINK_OPTS = $(LINK_OPTS) -r -Bstatic +LIBRARY_LINK = ar cr +LIBRARY_LINK_OPTS = LIB_SUFFIX = a LIBS_FOR_CONSOLE_APPLICATION = LIBS_FOR_GUI_APPLICATION = diff --git a/config.uClinux b/config.uClinux index cd20d2e..1e0696b 100644 --- a/config.uClinux +++ b/config.uClinux @@ -12,8 +12,8 @@ OBJ = o LINK = $(CROSS_COMPILE)g++ -o LINK_OPTS = -L. $(LDFLAGS) CONSOLE_LINK_OPTS = $(LINK_OPTS) -LIBRARY_LINK = $(CROSS_COMPILE)ld -o -LIBRARY_LINK_OPTS = -L. 
-r -Bstatic +LIBRARY_LINK = $(CROSS_COMPILE)ar cr +LIBRARY_LINK_OPTS = LIB_SUFFIX = a LIBS_FOR_CONSOLE_APPLICATION = $(CXXLIBS) LIBS_FOR_GUI_APPLICATION = $(LIBS_FOR_CONSOLE_APPLICATION) diff --git a/mediaServer/Makefile.tail b/mediaServer/Makefile.tail index 51b5b66..606b8ae 100644 --- a/mediaServer/Makefile.tail +++ b/mediaServer/Makefile.tail @@ -25,7 +25,7 @@ LIVEMEDIA_LIB = $(LIVEMEDIA_DIR)/libliveMedia.$(LIB_SUFFIX) GROUPSOCK_DIR = ../groupsock GROUPSOCK_LIB = $(GROUPSOCK_DIR)/libgroupsock.$(LIB_SUFFIX) LOCAL_LIBS = $(LIVEMEDIA_LIB) $(GROUPSOCK_LIB) \ - $(USAGE_ENVIRONMENT_LIB) $(BASIC_USAGE_ENVIRONMENT_LIB) + $(BASIC_USAGE_ENVIRONMENT_LIB) $(USAGE_ENVIRONMENT_LIB) LIBS = $(LOCAL_LIBS) $(LIBS_FOR_CONSOLE_APPLICATION) live555MediaServer$(EXE): $(MEDIA_SERVER_OBJS) $(LOCAL_LIBS) diff --git a/testProgs/Makefile.tail b/testProgs/Makefile.tail index e0028b9..589866f 100644 --- a/testProgs/Makefile.tail +++ b/testProgs/Makefile.tail @@ -60,7 +60,7 @@ LIVEMEDIA_LIB = $(LIVEMEDIA_DIR)/libliveMedia.$(LIB_SUFFIX) GROUPSOCK_DIR = ../groupsock GROUPSOCK_LIB = $(GROUPSOCK_DIR)/libgroupsock.$(LIB_SUFFIX) LOCAL_LIBS = $(LIVEMEDIA_LIB) $(GROUPSOCK_LIB) \ - $(USAGE_ENVIRONMENT_LIB) $(BASIC_USAGE_ENVIRONMENT_LIB) + $(BASIC_USAGE_ENVIRONMENT_LIB) $(USAGE_ENVIRONMENT_LIB) LIBS = $(LOCAL_LIBS) $(LIBS_FOR_CONSOLE_APPLICATION) testMP3Streamer$(EXE): $(MP3_STREAMER_OBJS) $(LOCAL_LIBS) -- 1.6.4.3 -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 828 bytes Desc: This is a digitally signed message part URL: From ben at decadent.org.uk Sun Sep 20 09:12:32 2009 From: ben at decadent.org.uk (Ben Hutchings) Date: Sun, 20 Sep 2009 17:12:32 +0100 Subject: [Live-devel] [PATCH] Parse SDP bandwidth line and use to set the receive socket buffer size Message-ID: <1253463152.3415.86.camel@localhost> This is required to avoid packet loss for high-bandwidth codecs such as DV video. Ben. 
diff --git a/liveMedia/MediaSession.cpp b/liveMedia/MediaSession.cpp index 2768157..1a09f2c 100644 --- a/liveMedia/MediaSession.cpp +++ b/liveMedia/MediaSession.cpp @@ -211,6 +211,7 @@ Boolean MediaSession::initializeWithSDP(char const* sdpDescription) { // Check for various special SDP lines that we understand: if (subsession->parseSDPLine_c(sdpLine)) continue; + if (subsession->parseSDPLine_b(sdpLine)) continue; if (subsession->parseSDPAttribute_rtpmap(sdpLine)) continue; if (subsession->parseSDPAttribute_control(sdpLine)) continue; if (subsession->parseSDPAttribute_range(sdpLine)) continue; @@ -540,7 +541,7 @@ MediaSubsession::MediaSubsession(MediaSession& parent) fClientPortNum(0), fRTPPayloadFormat(0xFF), fSavedSDPLines(NULL), fMediumName(NULL), fCodecName(NULL), fProtocolName(NULL), fRTPTimestampFrequency(0), fControlPath(NULL), - fSourceFilterAddr(parent.sourceFilterAddr()), + fSourceFilterAddr(parent.sourceFilterAddr()), fBandwidth(0), fAuxiliarydatasizelength(0), fConstantduration(0), fConstantsize(0), fCRC(0), fCtsdeltalength(0), fDe_interleavebuffersize(0), fDtsdeltalength(0), fIndexdeltalength(0), fIndexlength(0), fInterleaving(0), fMaxdisplacement(0), @@ -690,6 +691,13 @@ Boolean MediaSubsession::initiate(int useSpecialRTPoffset) { if (!success) break; // a fatal error occurred trying to create the RTP and RTCP sockets; we can't continue } + // Try to use a big receive buffer for RTP - at least 0.1 second of + // specified bandwidth and at least 50 KB + unsigned rtpBufSize = fBandwidth * 25 / 2; // 1 kbps * 0.1 s = 12.5 bytes + if (rtpBufSize < 50 * 1024) + rtpBufSize = 50 * 1024; + increaseReceiveBufferTo(env(), fRTPSocket->socketNum(), rtpBufSize); + // ASSERT: fRTPSocket != NULL && fRTCPSocket != NULL if (isSSM()) { // Special case for RTCP SSM: Send RTCP packets back to the source via unicast: @@ -892,7 +900,10 @@ Boolean MediaSubsession::initiate(int useSpecialRTPoffset) { // Finally, create our RTCP instance. (It starts running automatically) if (fRTPSource != NULL) { - unsigned totSessionBandwidth = 500; // HACK - later get from SDP##### + // If bandwidth is specified, use it and add 5% for RTCP overhead. + // Otherwise make a guess at 500 kbps. + unsigned totSessionBandwidth + = fBandwidth ? 
fBandwidth + fBandwidth / 20 : 500; fRTCPInstance = RTCPInstance::createNew(env(), fRTCPSocket, totSessionBandwidth, (unsigned char const*) @@ -1028,6 +1039,12 @@ Boolean MediaSubsession::parseSDPLine_c(char const* sdpLine) { return False; } +Boolean MediaSubsession::parseSDPLine_b(char const* sdpLine) { + // Check for "b=:" line + // RTP applications are expected to use bwtype="AS" + return sscanf(sdpLine, "b=AS:%u", &fBandwidth) == 1; +} + Boolean MediaSubsession::parseSDPAttribute_rtpmap(char const* sdpLine) { // Check for a "a=rtpmap: /" line: // (Also check without the "/"; RealNetworks omits this) diff --git a/liveMedia/MultiFramedRTPSource.cpp b/liveMedia/MultiFramedRTPSource.cpp index dee9a53..1e4bdc4 100644 --- a/liveMedia/MultiFramedRTPSource.cpp +++ b/liveMedia/MultiFramedRTPSource.cpp @@ -68,9 +68,6 @@ MultiFramedRTPSource : RTPSource(env, RTPgs, rtpPayloadFormat, rtpTimestampFrequency) { reset(); fReorderingBuffer = new ReorderingPacketBuffer(packetFactory); - - // Try to use a big receive buffer for RTP: - increaseReceiveBufferTo(env, RTPgs->socketNum(), 50*1024); } void MultiFramedRTPSource::reset() { diff --git a/liveMedia/include/MediaSession.hh b/liveMedia/include/MediaSession.hh index 5b9ad06..a73bb46 100644 --- a/liveMedia/include/MediaSession.hh +++ b/liveMedia/include/MediaSession.hh @@ -249,6 +249,7 @@ protected: void setNext(MediaSubsession* next) { fNext = next; } Boolean parseSDPLine_c(char const* sdpLine); + Boolean parseSDPLine_b(char const* sdpLine); Boolean parseSDPAttribute_rtpmap(char const* sdpLine); Boolean parseSDPAttribute_control(char const* sdpLine); Boolean parseSDPAttribute_range(char const* sdpLine); @@ -274,6 +275,7 @@ protected: unsigned fRTPTimestampFrequency; char* fControlPath; // holds optional a=control: string struct in_addr fSourceFilterAddr; // used for SSM + unsigned fBandwidth; // in kilobits, from b= line // Parameters set by "a=fmtp:" SDP lines: unsigned fAuxiliarydatasizelength, fConstantduration, fConstantsize; -- 1.6.4.3 -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 828 bytes Desc: This is a digitally signed message part URL: From ben at decadent.org.uk Sun Sep 20 09:13:09 2009 From: ben at decadent.org.uk (Ben Hutchings) Date: Sun, 20 Sep 2009 17:13:09 +0100 Subject: [Live-devel] [PATCH] Use estimated bitrate to set SDP bandwidth line and socket send buffer size Message-ID: <1253463189.3415.87.camel@localhost> This is required to avoid packet loss for high-bandwidth codecs such as DV video. Ben. diff --git a/liveMedia/OnDemandServerMediaSubsession.cpp b/liveMedia/OnDemandServerMediaSubsession.cpp index 235e60a..786c6a1 100644 --- a/liveMedia/OnDemandServerMediaSubsession.cpp +++ b/liveMedia/OnDemandServerMediaSubsession.cpp @@ -77,7 +77,7 @@ OnDemandServerMediaSubsession::sdpLines() { // subsession (as a unicast stream). 
To do so, we first create // dummy (unused) source and "RTPSink" objects, // whose parameters we use for the SDP lines: - unsigned estBitrate; // unused + unsigned estBitrate; FramedSource* inputSource = createNewStreamSource(0, estBitrate); if (inputSource == NULL) return NULL; // file not found @@ -88,7 +88,7 @@ OnDemandServerMediaSubsession::sdpLines() { RTPSink* dummyRTPSink = createNewRTPSink(&dummyGroupsock, rtpPayloadType, inputSource); - setSDPLinesFromRTPSink(dummyRTPSink, inputSource); + setSDPLinesFromRTPSink(dummyRTPSink, inputSource, estBitrate); Medium::close(dummyRTPSink); closeStreamSource(inputSource); } @@ -228,6 +228,15 @@ void OnDemandServerMediaSubsession if (rtpGroupsock != NULL) rtpGroupsock->removeAllDestinations(); if (rtcpGroupsock != NULL) rtcpGroupsock->removeAllDestinations(); + if (rtpGroupsock != NULL) { + // Try to use a big send buffer for RTP - at least 0.1 second of + // specified bandwidth and at least 50 KB + unsigned rtpBufSize = streamBitrate * 25 / 2; // 1 kbps * 0.1 s = 12.5 bytes + if (rtpBufSize < 50 * 1024) + rtpBufSize = 50 * 1024; + increaseSendBufferTo(envir(), rtpGroupsock->socketNum(), rtpBufSize); + } + // Set up the state of the stream. The stream will get started later: streamToken = fLastStreamToken = new StreamState(*this, serverRTPPort, serverRTCPPort, rtpSink, udpSink, @@ -347,7 +356,8 @@ void OnDemandServerMediaSubsession::closeStreamSource(FramedSource *inputSource) } void OnDemandServerMediaSubsession -::setSDPLinesFromRTPSink(RTPSink* rtpSink, FramedSource* inputSource) { +::setSDPLinesFromRTPSink(RTPSink* rtpSink, FramedSource* inputSource, + unsigned estBitrate) { if (rtpSink == NULL) return; char const* mediaType = rtpSink->sdpMediaType(); @@ -362,6 +372,7 @@ void OnDemandServerMediaSubsession char const* const sdpFmt = "m=%s %u RTP/AVP %d\r\n" "c=IN IP4 %s\r\n" + "b=AS:%u\r\n" "%s" "%s" "%s" @@ -369,6 +380,7 @@ void OnDemandServerMediaSubsession unsigned sdpFmtSize = strlen(sdpFmt) + strlen(mediaType) + 5 /* max short len */ + 3 /* max char len */ + strlen(ipAddressStr) + + 20 + strlen(rtpmapLine) + strlen(rangeLine) + strlen(auxSDPLine) @@ -379,6 +391,7 @@ void OnDemandServerMediaSubsession fPortNumForSDP, // m= rtpPayloadType, // m= ipAddressStr, // c= address + estBitrate, // b=AS: rtpmapLine, // a=rtpmap:... (if present) rangeLine, // a=range:... (if present) auxSDPLine, // optional extra SDP line diff --git a/liveMedia/include/OnDemandServerMediaSubsession.hh b/liveMedia/include/OnDemandServerMediaSubsession.hh index 95d2266..70de0c7 100644 --- a/liveMedia/include/OnDemandServerMediaSubsession.hh +++ b/liveMedia/include/OnDemandServerMediaSubsession.hh @@ -76,7 +76,8 @@ protected: // new virtual functions, defined by all subclasses FramedSource* inputSource) = 0; private: - void setSDPLinesFromRTPSink(RTPSink* rtpSink, FramedSource* inputSource); + void setSDPLinesFromRTPSink(RTPSink* rtpSink, FramedSource* inputSource, + unsigned estBitrate); // used to implement "sdpLines()" private: -- 1.6.4.3 -------------- next part -------------- A non-text attachment was scrubbed... 
Name: not available Type: application/pgp-signature Size: 828 bytes Desc: This is a digitally signed message part URL: From finlayson at live555.com Sun Sep 20 18:43:44 2009 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 20 Sep 2009 18:43:44 -0700 Subject: [Live-devel] strstream.h In-Reply-To: <1253064053.4989.2.camel@localhost> References: <1253064053.4989.2.camel@localhost> Message-ID: FYI, I have now installed a new version (2009.09.21) of the code that uses exclusively. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Sun Sep 20 18:46:08 2009 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 20 Sep 2009 18:46:08 -0700 Subject: [Live-devel] [PATCH] Use estimated bitrate to set SDP bandwidth line and socket send buffer size In-Reply-To: <1253463189.3415.87.camel@localhost> References: <1253463189.3415.87.camel@localhost> Message-ID: Ben, Thanks for the contribution. I have included this (and the other three patches you sent) in the latest release (2009.09.21) of the "LIVE555 Streaming Media" code. (I also updated "PassiveServerMediaSubsession" so that the same thing is done for multicast RTSP sessions.) (BTW, I haven't forgotten about including your support for DV video; I hope to complete that soon.) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From ssviridenko at iptv.by Sun Sep 20 22:56:19 2009 From: ssviridenko at iptv.by (=?UTF-8?B?ItCh0LLQuNGA0LjQtNC10L3QutC+INChLtCQLiI=?=) Date: Mon, 21 Sep 2009 08:56:19 +0300 Subject: [Live-devel] How to make session count restriction? Message-ID: <4AB71583.8000501@iptv.by> What i need to modify in "RTSPServer" to make session count restriction? From finlayson at live555.com Sun Sep 20 23:50:39 2009 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 20 Sep 2009 23:50:39 -0700 Subject: [Live-devel] How to make session count restriction? In-Reply-To: <4AB71583.8000501@iptv.by> References: <4AB71583.8000501@iptv.by> Message-ID: >What i need to modify in "RTSPServer" to make session count restriction? Someone else asked the same question recently. Here was my answer: http://lists.live555.com/pipermail/live-devel/2009-September/011183.html -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From stas at tech-mer.com Tue Sep 22 01:58:22 2009 From: stas at tech-mer.com (Stas Desyatnlkov) Date: Tue, 22 Sep 2009 11:58:22 +0300 Subject: [Live-devel] Question about streaming a H.264 video file via RTP In-Reply-To: References: Message-ID: <21E398286732DC49AD45BE8C7BE96C079CB3DCD4A2@fs11.mertree.mer.co.il> Hi, I'm having the same problem (trying to stream NAL packets in TS). So, it looks like there is no way to calculate the timestamps for the MPEG TS once the AVC NALs are saved into a file. Its probably possible to read SEI, SPS NALs and get the idea of the approximate bitrate of the AVC stream and thus set the frame time. This might not work with variable bitrate though. -----Original Message----- From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Jeremy Noring Sent: Thursday, September 17, 2009 7:44 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Question about streaming a H.264 video file via RTP On Thu, Sep 17, 2009 at 12:13 AM, 12091637 <12091637 at qq.com> wrote: > Dear all, > I am working with live555 library for H.264 video file streaming via RTP. 
> When I create a "ByteStreamFileSource" instance to stream a *.h264 video file, how can I set the "PlayTimePerFrame" and "preferredFrameSize" values? How do I get the "preferredFrameSize" and "playTimePerFrame" parameters based on an existing *.h264 file? How do I calculate them from the information in the *.h264 file?
> If I set the two parameters to zero, my stream will be incompatible with existing, standards-compliant clients (e.g., VLC, MPlayer).
> Sometimes VLC will crash when playing my stream (rtsp://192.168.10.62/test.h264)

To get the preferredFrameSize value, you'll have to parse the h264 bytestream; read the H.264 standard to figure out how to do this (you'll likely need to read the standard to get your streaming working anyway). You cannot get "playTimePerFrame" from a ".h264" file because raw h264 byte streams contain no timing information (H.264 encoders themselves have no notion of time, only successive frames). _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel

From mjarvis at draper.com Tue Sep 22 08:53:11 2009 From: mjarvis at draper.com (Jarvis, M. Simon) Date: Tue, 22 Sep 2009 11:53:11 -0400 Subject: [Live-devel] H264FUAFragmenter glibc error Message-ID:

I have been using the live555 media library for streaming an H.264 encoded data stream on an ARM-based embedded Linux platform. It is working for the most part, except for stopping the stream, at which point I get the following glibc error:

~H264FUAFragmenter()
fInputBuffer = 0xfcbb0
*** glibc detected *** ./cap_encode_omap3530.x470MV: double free or corruption ( !prev): 0x000fcbb0 ***

I have traced this down to the H264FUAFragmenter destruction. The two lines just prior to the glibc error are my debug printf's. If I comment out the delete[] fInputBuffer; it works better the first time, but then eventually crashes subsequently. I'm guessing that by removing the delete I end up with a memory leak and then eventually run out of memory. I also put a check for NULL on the fInputBuffer pointer, but as you can see from the printf the pointer is not NULL, yet it still throws the memory error. Has anyone seen this kind of error? Does anyone have any suggestions for troubleshooting this further? Thanks for your help. Simon
-------------- next part -------------- An HTML attachment was scrubbed... URL:

From kidjan at gmail.com Tue Sep 22 14:18:36 2009 From: kidjan at gmail.com (Jeremy Noring) Date: Tue, 22 Sep 2009 14:18:36 -0700 Subject: [Live-devel] Question about a FAQ answer In-Reply-To: References: Message-ID:

On Wed, Sep 16, 2009 at 11:48 PM, Ross Finlayson wrote:
>> The Live555 FAQ has this question and answer:
>>
>> Q: For many of the "test*Streamer" test programs, the built-in RTSP
>> server is optional (and disabled by default). For
>> "testAMRAudioStreamer", "testMPEG4VideoStreamer" and
>> "testWAVAudioStreamer", however, the built-in RTSP server is
>> mandatory. Why?
>>
>> A: For those media types (AMR audio, MPEG-4 video, and PCM audio,
>> respectively), the stream includes some codec-specific parameters that
>> are communicated to clients out-of-band, in a SDP description. Because
>> these parameters - and thus the SDP description - can vary from stream
>> to stream, the only effective way to communicate this SDP description
>> to clients is using the standard RTSP protocol. Therefore, the RTSP
>> server is a mandatory part of these test programs.
>>
>> My question is: does this same principle apply to H.264 as well
>
> Yes.

Thanks.
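For H.264, the out-of-band data in question is the SPS/PPS parameter sets, carried in the SDP's "sprop-parameter-sets" attribute rather than in the RTP packets. On the receiving side they can be unpacked with the library's helper - a minimal sketch, assuming the attribute string has already been pulled out of the SDP; "dumpParameterSets" is a hypothetical name:

#include "H264VideoRTPSource.hh"

// Sketch: unpack the Base64 records of "sprop-parameter-sets".
void dumpParameterSets(char const* sPropStr) {
  unsigned numRecords;
  SPropRecord* records = parseSPropParameterSets(sPropStr, numRecords);
  for (unsigned i = 0; i < numRecords; ++i) {
    // records[i].sPropBytes / records[i].sPropLength hold one NAL unit
    // (an SPS or PPS) that the decoder needs before the first frame.
  }
  delete[] records;
}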
My application consists of an embedded processor running Live555. To allow people to view video remotely, we need to push video to a central server where it is then re-distributed; this is to circumvent NAT/firewalls/etc. For this model, the embedded server has to push video to some central server. Is there any way to do this with Live555 without making some rather radical modifications? And if yes, any suggestions as to a good starting place in the code base to make those modifications? (on a side note, I did tear apart quicktime broadcaster with wireshark, and it seems they're using some weird combination of RTSP ANNOUNCE/RECORD and TCP interleaving to "push" video to a server over a single port. Is this a common capability of RTSP implementations? Seems like it isn't) As always, Ross, your advice is greatly appreciated. From gares at eyeseesecurity.com Tue Sep 22 14:27:55 2009 From: gares at eyeseesecurity.com (Gerardo Ares Meneces) Date: Tue, 22 Sep 2009 18:27:55 -0300 Subject: [Live-devel] MP4 Audio stream problem Message-ID: <38f061a30909221427y5bc697dcnd2e98d362c812f7f@mail.gmail.com> Hi, I'm trying to get video+audio from an Axis m1031 device (RTSP stream) with the openRTSP command. I'm using options -4 (to create a mp4 file) and -y (to get video and audio sync). The video is fine, but the audio is wrong. Looking at the mp4 metadata, it seems that the openRTSP is generating wrong information at the esds atom: $ MP4Box -info test4.mp4 [iso file] Box "stbl" has 1752 extra bytes [iso file] Box "moov" has 4228 extra bytes Error opening file test4.mp4: IsoMedia File is truncated and the mp4dump command shows an error in the esds atom: $ mp4dump test4.mp4 mp4dump version 1.6 ReadAtom: invalid atom size, extends outside parent atom - skipping to end of "mp4a" "^D<80><80><80>" 705752561 vs 1109523 Dumping test4.mp4 meta-information... type mdat ... type mp4a reserved1 = <6 bytes> 00 00 00 00 00 00 dataReferenceIndex = 1 (0x0001) soundVersion = 1 (0x0001) reserved2 = <6 bytes> 00 00 00 00 00 00 channels = 1 (0x0001) sampleSize = 16 (0x0010) packetSize = 65534 (0xfffe) timeScale = 8000 (0x00001f40) reserved3 = <2 bytes> 00 00 samplesPerPacket = 50 (0x00000032) bytesPerPacket = 1702061171 (0x65736473) bytesPerFrame = 0 (0x00000000) bytesPerSample = 58753152 (0x03808080) type ^D<80><80><80> data = <26 bytes> 1c 40 15 00 18 00 00 00 6d 60 00 00 6d 60 05 80 80 80 02 15 88 06 80 80 80 01 type stts .... It should be "type esds" but not "type ^D<80><80><80>". I looked at the function definition of the esds atom in the QuickTimeFileSink.cpp file, but I think i need some help to understand what could be wrong or something that i must check. I configured the axis device with the following options: Encoding: AAC Sample rate: 8 kHz Bit rate: 16 kbits/s Thanks in advance. Regards, Gerardo. -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Sep 22 15:35:51 2009 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 22 Sep 2009 15:35:51 -0700 Subject: [Live-devel] H264FUAFragmenter glibc error In-Reply-To: References: Message-ID: >~H264FUAFragmenter() >fInputBuffer = 0xfcbb0 >*** glibc detected *** ./cap_encode_omap3530.x470MV: double free or corruption I suspect that "double free" is really the problem. You should check to make sure that "~H264VideoRTPSink()" isn't being called more than once on the same object. 
You should also check that "OutPacketBuffer::maxSize" is a reasonable value (in particular, non-zero), because that's what gets used for the "fInputBuffer" size. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/

From finlayson at live555.com Tue Sep 22 15:46:00 2009 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 22 Sep 2009 15:46:00 -0700 Subject: [Live-devel] MP4 Audio stream problem In-Reply-To: <38f061a30909221427y5bc697dcnd2e98d362c812f7f@mail.gmail.com> References: <38f061a30909221427y5bc697dcnd2e98d362c812f7f@mail.gmail.com> Message-ID:

>$ MP4Box -info test4.mp4
>[iso file] Box "stbl" has 1752 extra bytes
>[iso file] Box "moov" has 4228 extra bytes
>Error opening file test4.mp4: IsoMedia File is truncated

Perhaps the output file is not being written completely, because you are not terminating "openRTSP" correctly? See the "Important note:" mentioned here: http://www.live555.com/openRTSP/#quicktime For the output file to be written properly, you can't just <control>-C "openRTSP". Instead, give it a duration in advance, using the "-d <duration>" option, or signal it with SIGHUP or SIGUSR1. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/

From gares at eyeseesecurity.com Tue Sep 22 16:05:02 2009 From: gares at eyeseesecurity.com (Gerardo Ares Meneces) Date: Tue, 22 Sep 2009 20:05:02 -0300 Subject: [Live-devel] MP4 Audio stream problem In-Reply-To: References: <38f061a30909221427y5bc697dcnd2e98d362c812f7f@mail.gmail.com> Message-ID: <38f061a30909221605p7a33c286r8f11ee6cc874f15f@mail.gmail.com>

No, I'm using the -d option. Gerardo.

On Tue, Sep 22, 2009 at 7:46 PM, Ross Finlayson wrote:
> $ MP4Box -info test4.mp4
>> [iso file] Box "stbl" has 1752 extra bytes
>> [iso file] Box "moov" has 4228 extra bytes
>> Error opening file test4.mp4: IsoMedia File is truncated
>>
>
> Perhaps the output file is not being written completely, because you are
> not terminating "openRTSP" correctly? See the "Important note:" mentioned
> here: http://www.live555.com/openRTSP/#quicktime
>
> For the output file to be written properly, you can't just <control>-C
> "openRTSP". Instead, give it a duration in advance, using the "-d
> <duration>" option, or signal it with SIGHUP or SIGUSR1.
> --
>
> Ross Finlayson
> Live Networks, Inc.
> http://www.live555.com/
> _______________________________________________
> live-devel mailing list
> live-devel at lists.live555.com
> http://lists.live555.com/mailman/listinfo/live-devel
>
-------------- next part -------------- An HTML attachment was scrubbed... URL:

From mjarvis at draper.com Wed Sep 23 06:18:45 2009 From: mjarvis at draper.com (Jarvis, M. Simon) Date: Wed, 23 Sep 2009 09:18:45 -0400 Subject: [Live-devel] H264FUAFragmenter glibc error In-Reply-To: References: Message-ID:

Hi Ross, Thanks for your input. I also thought it was probably the "double free", but it turned out I was writing beyond the allocated input buffer (fInputBuffer) size. I thought I was limiting the memcpy() to the fInputBufferSize, but in fact was not, and occasionally my H.264 encoder produces a buffer larger than the size the H264FUAFragmenter allocated. So now, having tracked down the problem, the question is what to do about it? I think I have two choices, and would value your opinion here. 1. Increase the H264FUAFragmenter fInputBufferSize to accommodate my largest encoder output, or 2. put in some encoder output buffer overflow management code. I think the easier of the two is 1, as long as I don't run into memory limitations in my embedded environment.
Are there any issues that you can think of with going this way? Was there any reason the fInputBufferSize is set to OutPacketBuffer::maxSize, which is hard coded to 60,000? Implications of increasing this? I could hard code a new fInputBufferSize just for the H264FUAFragmenter object instead of changing/using OutPacketBuffer::maxSize? Thanks again for your input. simon -----Original Message----- From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Tuesday, September 22, 2009 6:36 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] H264FUAFragmenter glibc error >~H264FUAFragmenter() >fInputBuffer = 0xfcbb0 >*** glibc detected *** ./cap_encode_omap3530.x470MV: double free or corruption I suspect that "double free" is really the problem. You should check to make sure that "~H264VideoRTPSink()" isn't being called more than once on the same object. You should also check that "OutPacketBuffer::maxSize" is a reasonable value (in particular, non-zero), because that's what gets used for the "fInputBuffer" size. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel From finlayson at live555.com Wed Sep 23 08:07:16 2009 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 23 Sep 2009 08:07:16 -0700 Subject: [Live-devel] H264FUAFragmenter glibc error In-Reply-To: References: Message-ID: >Was there any reason the >fInputBufferSize is set to OutPacketBuffer::maxSize, which is hard coded >to 60,000? It's not 'hard coded'; "OutPacketBuffer::maxSize" is a *variable*. You may set it yourself (e.g., to some larger value), before you create your "H264VideoRTPSink" object. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From mjarvis at draper.com Wed Sep 23 11:07:19 2009 From: mjarvis at draper.com (Jarvis, M. Simon) Date: Wed, 23 Sep 2009 14:07:19 -0400 Subject: [Live-devel] H264FUAFragmenter glibc error In-Reply-To: References: Message-ID: Hi Ross, Yes, it is not 'Hard Coded', sorry poor choice of words ;) At a quick glace it seems to really only be set to 60000, in the source code. And temporarily set while an OutputPacket is created, but then reset back to the 60000 default value. Thanks again for your input, I will modify the value used to create the H264FUAFragmenter() object. Simon -----Original Message----- From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Wednesday, September 23, 2009 11:07 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] H264FUAFragmenter glibc error >Was there any reason the >fInputBufferSize is set to OutPacketBuffer::maxSize, which is hard coded >to 60,000? It's not 'hard coded'; "OutPacketBuffer::maxSize" is a *variable*. You may set it yourself (e.g., to some larger value), before you create your "H264VideoRTPSink" object. -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel

From mjarvis at draper.com Wed Sep 23 11:07:19 2009 From: mjarvis at draper.com (Jarvis, M. Simon) Date: Wed, 23 Sep 2009 14:07:19 -0400 Subject: [Live-devel] H264FUAFragmenter glibc error In-Reply-To: References: Message-ID:

Hi Ross, Yes, it is not 'Hard Coded', sorry poor choice of words ;) At a quick glance it seems to really only be set to 60000 in the source code, and temporarily set while an OutputPacket is created, but then reset back to the 60000 default value. Thanks again for your input, I will modify the value used to create the H264FUAFragmenter() object. Simon

-----Original Message----- From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Wednesday, September 23, 2009 11:07 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] H264FUAFragmenter glibc error

>Was there any reason the
>fInputBufferSize is set to OutPacketBuffer::maxSize, which is hard coded
>to 60,000?

It's not 'hard coded'; "OutPacketBuffer::maxSize" is a *variable*. You may set it yourself (e.g., to some larger value), before you create your "H264VideoRTPSink" object. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel

From lanlantao at gmail.com Wed Sep 23 09:38:28 2009 From: lanlantao at gmail.com (Tao Wu) Date: Wed, 23 Sep 2009 11:38:28 -0500 Subject: [Live-devel] question about live streaming H264 Message-ID: <712d99b30909230938l584d5171n2ed16930fb3e331e@mail.gmail.com>

Hi, I plan to live-stream H264; the source is not a file, but encoded video frames from hardware. Can someone suggest what example code I can look at as a reference, like H264VideoStreamFramer or VideoFileServerMediaSubsession (their source is a file, not encoded frames)? Thanks a lot. Best Regards, Tao

From finlayson at live555.com Wed Sep 23 14:35:11 2009 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 23 Sep 2009 14:35:11 -0700 Subject: [Live-devel] H264FUAFragmenter glibc error In-Reply-To: References: Message-ID:

>At a quick glance it seems to really only be set to 60000, in the source
>code.

Yes, but note that it's a *variable*. You can change it yourself, in your own code.

>Thanks again for your input, I will modify the value used to create the
>H264FUAFragmenter() object.

No! You don't need to change the released source code *at all*. That's my point. Instead, add the following line to your "main()" program (before you create any "RTPSink" objects): OutPacketBuffer::maxSize = 100000; // or whatever buffer size you want. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/

From spopo316 at yahoo.com.tw Wed Sep 23 18:32:07 2009 From: spopo316 at yahoo.com.tw (spopo316 at yahoo.com.tw) Date: Thu, 24 Sep 2009 09:32:07 +0800 (CST) Subject: [Live-devel] H264 multicast streaming question In-Reply-To: Message-ID: <761260.88942.qm@web72303.mail.tp2.yahoo.com>

Hi Jeremy: Thanks for your advice. Once I set the correct SPS/PPS, I can stream the H264 file now. Thanks Best Regards

--- On 09/9/11, Jeremy Noring wrote: From: Jeremy Noring Subject: Re: [Live-devel] H264 multicast streaming question To: "LIVE555 Streaming Media - development & use" Date: 2009/9/11, 7:14 PM

2009/9/10 But when I streamed the H.264 file by the unicast method successfully, the sprop-parameter-sets has been set "h264". Therefore I think sprop-parameter-sets=h264 doesn't influence the stream when using the multicast method. Is it right?

No, that's not right. For the decoder to understand your H.264 stream, it is crucial that the SPS/PPS info is communicated over a reliable protocol. If you are not sending it in the sprop-parameter-sets argument, how are you conveying it? Note that sending it in the actual stream is not recommended (see section 8.4 of RFC3984). Some people mistakenly think their Live555/H264 implementation "works," but really they're just conveying SPS/PPS info through the lossy RTP channel, which is strongly discouraged by the RFC: The picture and sequence parameter set NALUs SHOULD NOT be transmitted in the RTP payload unless reliable transport is provided for RTP, as a loss of a parameter set of either type will likely prevent decoding of a considerable portion of the corresponding RTP stream. Thus, the transmission of parameter sets using a reliable session control protocol (i.e., usage of principle A or B above) is RECOMMENDED. Bottom line: correctly populate sprop-parameter-sets; you are wasting your time by not doing this.
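On the sending side, the same information is supplied when the "H264VideoRTPSink" is created - a minimal sketch, assuming the createNew() signature of the current library; the Base64 string and profile_level_id here are placeholders from a hypothetical encoder, not values to copy:

#include "H264VideoRTPSink.hh"

// Sketch: pass your encoder's SPS/PPS (Base64, comma-separated) to the sink.
RTPSink* makeH264Sink(UsageEnvironment& env, Groupsock* rtpGroupsock) {
  char const* spropStr = "Z0IACpZTBYmI,aMljiA=="; // placeholder "SPS,PPS"
  return H264VideoRTPSink::createNew(env, rtpGroupsock,
                                     96 /*dynamic RTP payload type*/,
                                     0x42000A /*placeholder profile_level_id*/,
                                     spropStr);
}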
_______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL:

From silence.vincent at msa.hinet.net Wed Sep 23 19:52:01 2009 From: silence.vincent at msa.hinet.net (Vincent Chen) Date: Thu, 24 Sep 2009 10:52:01 +0800 Subject: [Live-devel] question for RTSP/RTP sequence number Message-ID:

My client receives an RTP stream from the server, and the stream sometimes gets "shattered". I found that my client has packet loss: I printed out the RTP sequence numbers in MultiFramedRTPSource.cpp and the numbers are not continuous. So I have the following questions about RTP sequence numbers: 1. does liveMedia have a mechanism to reorder RTP packets by sequence number if the client doesn't receive them in order? 2. if 1 is yes, after how long will liveMedia drop a late RTP packet? thanks!!
-------------- next part -------------- An HTML attachment was scrubbed... URL:

From finlayson at live555.com Wed Sep 23 22:32:54 2009 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 23 Sep 2009 22:32:54 -0700 Subject: [Live-devel] question for RTSP/RTP sequence number In-Reply-To: References: Message-ID:

>1. does liveMedia have a mechanism to reorder RTP packets by
>sequence number if the client doesn't receive them in order?

Yes, and this is done automatically. (I.e., you don't have to do anything special to get this.) RTP receivers (that use our code) will *never* get out-of-order data.

>2. if 1 is yes, after how long will liveMedia drop a late RTP packet?

By default, if a sequence number packet gap is seen, the code will wait 100000 microseconds (i.e., 100 ms) for the late packet to arrive, before it is dropped. However, you can change this threshold using the function void MultiFramedRTPSource::setPacketReorderingThresholdTime(unsigned uSeconds) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/

From stas at tech-mer.com Thu Sep 24 00:46:37 2009 From: stas at tech-mer.com (Stas Desyatnlkov) Date: Thu, 24 Sep 2009 10:46:37 +0300 Subject: [Live-devel] H264 multicast streaming question In-Reply-To: <761260.88942.qm@web72303.mail.tp2.yahoo.com> References: <761260.88942.qm@web72303.mail.tp2.yahoo.com> Message-ID: <21E398286732DC49AD45BE8C7BE96C079CB3DCD62D@fs11.mertree.mer.co.il>

Hi, It's obvious that loss of the SPS or PPS results in a lot of grief in H264 land. The question is what choice do we have in the case of no other means of communication besides RTP? What if the H264 stream is packed inside TS and the receiver is not aware of anything else? I guess in the case of LAN streaming, where packet loss is rare, sending SPS/PPS in-band is not that bad of an option - would you agree?

From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of spopo316 at yahoo.com.tw Sent: Thursday, September 24, 2009 4:32 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] H264 multicast streaming question

Hi Jeremy: Thanks for your advice. Once I set the correct SPS/PPS, I can stream the H264 file now. Thanks Best Regards

--- On 09/9/11, Jeremy Noring wrote: From: Jeremy Noring Subject: Re: [Live-devel] H264 multicast streaming question To: "LIVE555 Streaming Media - development & use" Date: 2009/9/11, 7:14 PM

2009/9/10 > But when I streamed the H.264 file by the unicast method successfully, the sprop-parameter-sets has been set "h264". Therefore I think sprop-parameter-sets=h264 doesn't influence the stream when using the multicast method. Is it right?

No, that's not right. For the decoder to understand your H.264 stream, it is crucial that the SPS/PPS info is communicated over a reliable protocol. If you are not sending it in the sprop-parameter-sets argument, how are you conveying it? Note that sending it in the actual stream is not recommended (see section 8.4 of RFC3984). Some people mistakenly think their Live555/H264 implementation "works," but really they're just conveying SPS/PPS info through the lossy RTP channel, which is strongly discouraged by the RFC: The picture and sequence parameter set NALUs SHOULD NOT be transmitted in the RTP payload unless reliable transport is provided for RTP, as a loss of a parameter set of either type will likely prevent decoding of a considerable portion of the corresponding RTP stream. Thus, the transmission of parameter sets using a reliable session control protocol (i.e., usage of principle A or B above) is RECOMMENDED. Bottom line: correctly populate sprop-parameter-sets; you are wasting your time by not doing this. _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL:

From stas at tech-mer.com Thu Sep 24 01:25:01 2009 From: stas at tech-mer.com (Stas Desyatnlkov) Date: Thu, 24 Sep 2009 11:25:01 +0300 Subject: [Live-devel] TS user data Message-ID: <21E398286732DC49AD45BE8C7BE96C079CB3DCD634@fs11.mertree.mer.co.il>

Hi, I'd like to add telemetry data along with the A/V streams to my TS. I planned to subclass MPEG2TransportStreamFromESSource and add a method like addNewTelemetrySource. The telemetry data arrives via a serial port on my device, and my source will feed the data to the TS as soon as it's available. Does this sound OK?
-------------- next part -------------- An HTML attachment was scrubbed... URL:

From cybercockroach at gmail.com Thu Sep 24 01:49:10 2009 From: cybercockroach at gmail.com (cybercockroach) Date: Thu, 24 Sep 2009 16:49:10 +0800 Subject: [Live-devel] RTSP over Http Message-ID: <4abb32fd.27045a0a.42ef.6f1b@mx.google.com>

Dear experts: Does live555 support RTSP over HTTP? If yes, how can I do it? Any hints? If not, is there any workaround to accomplish this? Thanks for your reply
-------------- next part -------------- An HTML attachment was scrubbed... URL:

From finlayson at live555.com Thu Sep 24 02:05:32 2009 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 24 Sep 2009 02:05:32 -0700 Subject: [Live-devel] RTSP over Http In-Reply-To: <4abb32fd.27045a0a.42ef.6f1b@mx.google.com> References: <4abb32fd.27045a0a.42ef.6f1b@mx.google.com> Message-ID:

> Does live555 support RTSP over HTTP?

Yes. For RTSP clients, it is fully implemented; note the "tunnelOverHTTPPortNum" parameter to "RTSPClient::createNew()". For RTSP servers it is not yet fully implemented, but will be in the future. -- Ross Finlayson Live Networks, Inc.
http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From mjarvis at draper.com Thu Sep 24 05:28:05 2009 From: mjarvis at draper.com (Jarvis, M. Simon) Date: Thu, 24 Sep 2009 08:28:05 -0400 Subject: [Live-devel] H264FUAFragmenter glibc error In-Reply-To: References: Message-ID: Hi Ross, Sorry being a bit thick ;) Too far down in the details. I completely understand what you mean.( now!) Thanks again for your help. Simon -----Original Message----- From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Wednesday, September 23, 2009 5:35 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] H264FUAFragmenter glibc error >At a quick glace it seems to really only be set to 60000, in the source >code. Yes, but note that it's a *variable*. You can change it yourself, in your own code. >Thanks again for your input, I will modify the value used to create the >H264FUAFragmenter() object. No! You don't need to change the released source code *at all*. That's my point. Instead, add the following line to your "main()" program (before you create any "RTPSink" objects): OutPacketBuffer::maxSize = 100000; // or whatever buffer size you want. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel From kidjan at gmail.com Thu Sep 24 10:16:32 2009 From: kidjan at gmail.com (Jeremy Noring) Date: Thu, 24 Sep 2009 10:16:32 -0700 Subject: [Live-devel] H264 multicast streaming question In-Reply-To: <21E398286732DC49AD45BE8C7BE96C079CB3DCD62D@fs11.mertree.mer.co.il> References: <761260.88942.qm@web72303.mail.tp2.yahoo.com> <21E398286732DC49AD45BE8C7BE96C079CB3DCD62D@fs11.mertree.mer.co.il> Message-ID: On Thu, Sep 24, 2009 at 12:46 AM, Stas Desyatnlkov wrote: > Hi, > > Its obvious that loss of the SPS or PPS results in a lot of grief in the > h264 land. The question is what choice do we have in case of no other means > of communication besides RTP? > > What if the h264 stream is packed inside TS and receiver is not aware of > anything else? > > I guess in case of LAN streaming where packet loss is rare sending SPS/PPS > inband is not that bad of an option would you agree? > > It depends. If you're doing multicast, then sending it in-stream is a bad idea; clients may miss the first sending, which means you'd have to do something weird like insert it when a client connects, or periodically insert it. If you're doing unicast, it would probably work to send it once in-stream. In my experience, some clients expect correct sprop-parameter-set and profile-level-id fields (e.g. quicktime), so not populating them is possibly a deal-breaker. In any event, if you're on a LAN, why would you _ever_ consider not using the associated RTSP session to communicate this information? And what sort of scenario do you have where there is "no other means of communication besides RTP?" Quite frankly, I can't possibly think of a situation where either of these statements would be true. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From jnoring at logitech.com Thu Sep 24 18:32:35 2009 From: jnoring at logitech.com (Jeremy Noring) Date: Thu, 24 Sep 2009 18:32:35 -0700 Subject: [Live-devel] Proposed change to DarwinInjector.cpp/.hh Message-ID: <988ed6930909241832n5f2ee777hf03bd818be206a8a@mail.gmail.com> I've been testing with the DarwinInjector to push to a server online for redistribution, and one thing I noticed is the timeout value for the initial connection to Darwin isn't exposed, so if a push attempt is made to a server that's unavailable for whatever reason, DarwinInjector->setDestination() does not return. So I exposed the timeout value in the DarwinInjector->setDestination() method, with a default value of -1 so it's compatible with existing code. Here's the diff for DarwinInjector.hh: 76c76,77 < char const* sessionCopyright = ""); --- > char const* sessionCopyright = "", > int timeout = -1); Diff for DarwinInjector.cpp: 112c112,113 < char const* sessionCopyright) { --- > char const* sessionCopyright, > int timeout) { 189c190 < = fRTSPClient->announceWithPassword(url, sdp, remoteUserName, remotePassword); --- > = fRTSPClient->announceWithPassword(url, sdp, remoteUserName, remotePassword, timeout); 191c192 < announceSuccess = fRTSPClient->announceSDPDescription(url, sdp); --- > announceSuccess = fRTSPClient->announceSDPDescription(url, sdp, NULL, timeout); I didn't see another way to configure the timeout, but if I missed something, let me know. -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Sep 24 23:05:02 2009 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 24 Sep 2009 23:05:02 -0700 Subject: [Live-devel] Proposed change to DarwinInjector.cpp/.hh In-Reply-To: <988ed6930909241832n5f2ee777hf03bd818be206a8a@mail.gmail.com> References: <988ed6930909241832n5f2ee777hf03bd818be206a8a@mail.gmail.com> Message-ID: OK, this will be included in the next release of the software. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From linkfanel at yahoo.fr Fri Sep 25 05:17:51 2009 From: linkfanel at yahoo.fr (Pierre Ynard) Date: Fri, 25 Sep 2009 14:17:51 +0200 Subject: [Live-devel] [PATCH] Support T.140 text subsessions Message-ID: <20090925121751.GA9122@via.ecp.fr> This patch adds supports for T.140 text subsessions. This was tested and works nicely with VLC media player. --- live/liveMedia/MediaSession.cpp 2009-09-25 14:05:13.000000000 +0200 +++ live/liveMedia/MediaSession.cpp 2009-09-25 14:06:20.000000000 +0200 @@ -863,6 +863,7 @@ || strcmp(fCodecName, "G726-32") == 0 // G.726, 32 kbps || strcmp(fCodecName, "G726-40") == 0 // G.726, 40 kbps || strcmp(fCodecName, "SPEEX") == 0 // SPEEX audio + || strcmp(fCodecName, "T140") == 0 // T.140 text ) { createSimpleRTPSource = True; useSpecialRTPoffset = 0; Regards, -- Pierre Ynard "Une ?me dans un corps, c'est comme un dessin sur une feuille de papier." From finlayson at live555.com Fri Sep 25 23:43:17 2009 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 25 Sep 2009 23:43:17 -0700 Subject: [Live-devel] [PATCH] Support T.140 text subsessions In-Reply-To: <20090925121751.GA9122@via.ecp.fr> References: <20090925121751.GA9122@via.ecp.fr> Message-ID: >This patch adds supports for [receiving] T.140 text subsessions. OK, thanks - this will be included in the next release of the software. Ross. 
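What the patch enables, roughly: a "text" subsession whose codec is "T140" now goes through the same generic setup path as the audio codecs listed next to it, so a receiving application can initiate it and read raw text frames. A minimal sketch - the helper function is mine, not library code:

#include "MediaSession.hh"
#include <string.h>

// Sketch: after the patch, MediaSubsession::initiate() creates a
// SimpleRTPSource for a T.140 text subsession.
Boolean initiateTextSubsession(MediaSubsession& sub) {
  if (strcmp(sub.mediumName(), "text") != 0 ||
      strcmp(sub.codecName(), "T140") != 0) return False;
  if (!sub.initiate()) return False;
  // sub.readSource() then delivers raw T.140 (UTF-8) text chunks.
  return True;
}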
From Alexandre at Maumene.Org Mon Sep 28 04:32:04 2009 From: Alexandre at Maumene.Org (=?ISO-8859-1?Q?Alexandre_Maumen=E9?=) Date: Mon, 28 Sep 2009 13:32:04 +0200 Subject: [Live-devel] Streaming in H264 from DV cam (live source) Message-ID: <341bb8d10909280432p265fea74o868fb7b61eb14c9@mail.gmail.com>

Hi everybody, I'm trying to stream in H264 from a DV camera. I've written some modifications for H264 streaming and they work well for a plain file. But now I want to do it live. I'm running FreeBSD; my DV cam is plugged in over FireWire and I can get the DV stream with fwcontrol. I've modified testOnDemandRTSPServer to use my H264 implementation and to make it read from stdin. So I try something like this: *fwcontrol -M dv -R - | ffmpeg -i - -vcodec libx264 -r 12 -s cif -flags2 aud -an -f h264 - | ./testOnDemandRTSPServer* When VLC starts to play, ffmpeg starts to encode, but even after a while no video appears. ffmpeg starts encoding with a bunch of these errors:

*AC EOB marker is absent pos=89
AC EOB marker is absent pos=68
AC EOB marker is absent pos=65
AC EOB marker is absent pos=64
AC EOB marker is absent pos=107
AC EOB marker is absent pos=64
AC EOB marker is absent pos=65
AC EOB marker is absent pos=64
Last message repeated 2 times*

and I don't know the meaning of these messages. So my question is: how can I feed my RTSP server to stream live from my DV cam? Thanks in advance.
-------------- next part -------------- An HTML attachment was scrubbed... URL:

From ben at decadent.org.uk Mon Sep 28 19:03:08 2009 From: ben at decadent.org.uk (Ben Hutchings) Date: Tue, 29 Sep 2009 03:03:08 +0100 Subject: [Live-devel] Streaming in H264 from DV cam (live source) In-Reply-To: <341bb8d10909280432p265fea74o868fb7b61eb14c9@mail.gmail.com> References: <341bb8d10909280432p265fea74o868fb7b61eb14c9@mail.gmail.com> Message-ID: <1254189788.27790.11.camel@localhost>

On Mon, 2009-09-28 at 13:32 +0200, Alexandre Maumené wrote:
> Hi everybody,
>
> I'm trying to stream in H264 from a DV camera.
> I've written some modifications for H264 streaming and they work well for
> a plain file.
>
> But now I want to do it live. I'm running FreeBSD; my DV cam is
> plugged in over FireWire and I can get the DV stream with fwcontrol.
> I've modified testOnDemandRTSPServer to use my H264 implementation
> and to make it read from stdin.
> So I try something like this: fwcontrol -M dv -R - | ffmpeg -i -
> -vcodec libx264 -r 12 -s cif -flags2 aud -an -f h264 -
> | ./testOnDemandRTSPServer
> When VLC starts to play, ffmpeg starts to encode, but even after a while
> no video appears.
>
> ffmpeg starts encoding with a bunch of these errors:
> AC EOB marker is absent pos=89
> AC EOB marker is absent pos=68
> AC EOB marker is absent pos=65
> AC EOB marker is absent pos=64
> AC EOB marker is absent pos=107
> AC EOB marker is absent pos=64
> AC EOB marker is absent pos=65
> AC EOB marker is absent pos=64
> Last message repeated 2 times
> and I don't know the meaning of these messages.

I believe these indicate errors in the DV stream. The DV format is very robust so these should not prevent decoding, but they can result in visible glitches. You could try changing the cable. If you tested by playing a tape, maybe it's time to clean the tape heads.

> So my question is: how can I feed my RTSP server to stream live from
> my DV cam?

testOnDemandRTSPServer requires static files with specific names.
It can't be used to stream the standard input. Ben. -- Ben Hutchings Who are all these weirdos? - David Bowie, about L-Space IRC channel #afp -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 828 bytes Desc: This is a digitally signed message part URL: From finlayson at live555.com Mon Sep 28 22:37:10 2009 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 28 Sep 2009 22:37:10 -0700 Subject: [Live-devel] Streaming in H264 from DV cam (live source) In-Reply-To: <1254189788.27790.11.camel@localhost> References: <341bb8d10909280432p265fea74o868fb7b61eb14c9@mail.gmail.com> <1254189788.27790.11.camel@localhost> Message-ID: >testOnDemandRTSPServer requires static files with specific names. It >can't be used to stream the standard input. Not without changing the code, that's correct. However, you can do this by changing the code to use a special file name of "stdin". -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From mohitagandotra at gmail.com Tue Sep 29 05:19:44 2009 From: mohitagandotra at gmail.com (mohita gandotra) Date: Tue, 29 Sep 2009 17:49:44 +0530 Subject: [Live-devel] pause function returning wrong value Message-ID: <6d1553a50909290519q6ab46ac3g56186c8eecd957e3@mail.gmail.com> Hi Ross I am having some issues regarding pause command. I am using live555 library for my client code. For server, i am using testondemandrtpserver. when i call pauseMediaSession function of RTSPClient class, the pause function returns false most of the times even if the media is paused. And the getResultMsg function returns error message - "no response code in line" or "cannot handle response". Can you tell me why the function returns wrong value? -- Mohita Gandotra ------------------------------ -------------- next part -------------- An HTML attachment was scrubbed... URL: From stas at tech-mer.com Tue Sep 29 10:43:21 2009 From: stas at tech-mer.com (Stas Desyatnlkov) Date: Tue, 29 Sep 2009 19:43:21 +0200 Subject: [Live-devel] H264 multicast streaming question In-Reply-To: References: <761260.88942.qm@web72303.mail.tp2.yahoo.com> <21E398286732DC49AD45BE8C7BE96C079CB3DCD62D@fs11.mertree.mer.co.il> Message-ID: <21E398286732DC49AD45BE8C7BE96C079CB3DCD780@fs11.mertree.mer.co.il> Here is a situation: The streamer is a black box that's connected to audio and video source. The streamer sends TS with AVC and AAC because the client can only accept TS. The client knows the exact format of the video and audio streams inside TS and wants to connect to the live transmission at any time. The client knows nothing about session establishment protocols (e.g. RTSP). From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Jeremy Noring Sent: Thursday, September 24, 2009 8:17 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] H264 multicast streaming question On Thu, Sep 24, 2009 at 12:46 AM, Stas Desyatnlkov > wrote: Hi, Its obvious that loss of the SPS or PPS results in a lot of grief in the h264 land. The question is what choice do we have in case of no other means of communication besides RTP? What if the h264 stream is packed inside TS and receiver is not aware of anything else? I guess in case of LAN streaming where packet loss is rare sending SPS/PPS inband is not that bad of an option would you agree? It depends. 
If you're doing multicast, then sending it in-stream is a bad idea; clients may miss the first sending, which means you'd have to do something weird like insert it when a client connects, or periodically insert it. If you're doing unicast, it would probably work to send it once in-stream. In my experience, some clients expect correct sprop-parameter-set and profile-level-id fields (e.g. quicktime), so not populating them is possibly a deal-breaker. In any event, if you're on a LAN, why would you _ever_ consider not using the associated RTSP session to communicate this information? And what sort of scenario do you have where there is "no other means of communication besides RTP?" Quite frankly, I can't possibly think of a situation where either of these statements would be true. -------------- next part -------------- An HTML attachment was scrubbed... URL: From jnoring at logitech.com Tue Sep 29 13:23:45 2009 From: jnoring at logitech.com (Jeremy Noring) Date: Tue, 29 Sep 2009 13:23:45 -0700 Subject: [Live-devel] H264 multicast streaming question In-Reply-To: <21E398286732DC49AD45BE8C7BE96C079CB3DCD780@fs11.mertree.mer.co.il> References: <761260.88942.qm@web72303.mail.tp2.yahoo.com> <21E398286732DC49AD45BE8C7BE96C079CB3DCD62D@fs11.mertree.mer.co.il> <21E398286732DC49AD45BE8C7BE96C079CB3DCD780@fs11.mertree.mer.co.il> Message-ID: <988ed6930909291323v2be63ae4ya7c4892aee252614@mail.gmail.com> On Tue, Sep 29, 2009 at 10:43 AM, Stas Desyatnlkov wrote: > Here is a situation: > > The streamer is a black box that?s connected to audio and video source. The > streamer sends TS with AVC and AAC because the client can only accept TS. > The client knows the exact format of the video and audio streams inside TS > and wants to connect to the live transmission at any time. > > The client knows nothing about session establishment protocols (e.g. RTSP). > > Where exactly in this situation are you using Live555? -------------- next part -------------- An HTML attachment was scrubbed... URL: From stas at tech-mer.com Wed Sep 30 02:26:37 2009 From: stas at tech-mer.com (Stas Desyatnlkov) Date: Wed, 30 Sep 2009 11:26:37 +0200 Subject: [Live-devel] H264 multicast streaming question In-Reply-To: <988ed6930909291323v2be63ae4ya7c4892aee252614@mail.gmail.com> References: <761260.88942.qm@web72303.mail.tp2.yahoo.com> <21E398286732DC49AD45BE8C7BE96C079CB3DCD62D@fs11.mertree.mer.co.il> <21E398286732DC49AD45BE8C7BE96C079CB3DCD780@fs11.mertree.mer.co.il> <988ed6930909291323v2be63ae4ya7c4892aee252614@mail.gmail.com> Message-ID: <21E398286732DC49AD45BE8C7BE96C079CB3DCD7A0@fs11.mertree.mer.co.il> Sender only. I see some video providers that mix AVC into TS. NAL, SPS and PPS are all part of the elementary stream packet. How they ensure that client receives SPS and PPS in time is not clear. Probably the receiving side waits for the next SPS, PPS and start showing video after that. From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Jeremy Noring Sent: Tuesday, September 29, 2009 11:24 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] H264 multicast streaming question On Tue, Sep 29, 2009 at 10:43 AM, Stas Desyatnlkov > wrote: Here is a situation: The streamer is a black box that's connected to audio and video source. The streamer sends TS with AVC and AAC because the client can only accept TS. The client knows the exact format of the video and audio streams inside TS and wants to connect to the live transmission at any time. 
The client knows nothing about session establishment protocols (e.g. RTSP). Where exactly in this situation are you using Live555? -------------- next part -------------- An HTML attachment was scrubbed... URL: From stas at tech-mer.com Wed Sep 30 07:15:36 2009 From: stas at tech-mer.com (Stas Desyatnlkov) Date: Wed, 30 Sep 2009 16:15:36 +0200 Subject: [Live-devel] AAC in TS streaming Message-ID: <21E398286732DC49AD45BE8C7BE96C079CB3DCD7E0@fs11.mertree.mer.co.il> Hi, I'd like to use the sample wis-streamer to encode the wav file into AAC and stream in TS over RTP. Does the following sequence makes sense: WavFileSrc* pcm = new WavFileSrc(*env, filename); //Create AAC encoder filter audioSource = AACAudioEncoder::createNew(*env, pcm, 1, 8000, 0); tsSource = MPEG2TransportStreamFromESSource::createNew(*env); tsSource->addNewAudioSource(audioSource, 2); MPEG2TransportStreamAccumulator* acc = MPEG2TransportStreamAccumulator::createNew(*env, tsSource); //Create sink char const* encoderConfigStr = /*audioNumChannels == 2 ? "1210": */ "1208"; audioSink = MPEG4GenericRTPSink::createNew(*env, rtpGroupsock, 96, 8000, "audio", "AAC-hbr", encoderConfigStr, 1); Boolean b = audioSink->startPlaying(*acc, afterPlaying, audioSink); env->taskScheduler().doEventLoop(&watch); -------------- next part -------------- An HTML attachment was scrubbed... URL: From jnoring at logitech.com Wed Sep 30 08:27:26 2009 From: jnoring at logitech.com (Jeremy Noring) Date: Wed, 30 Sep 2009 08:27:26 -0700 Subject: [Live-devel] H264 multicast streaming question In-Reply-To: <21E398286732DC49AD45BE8C7BE96C079CB3DCD7A0@fs11.mertree.mer.co.il> References: <761260.88942.qm@web72303.mail.tp2.yahoo.com> <21E398286732DC49AD45BE8C7BE96C079CB3DCD62D@fs11.mertree.mer.co.il> <21E398286732DC49AD45BE8C7BE96C079CB3DCD780@fs11.mertree.mer.co.il> <988ed6930909291323v2be63ae4ya7c4892aee252614@mail.gmail.com> <21E398286732DC49AD45BE8C7BE96C079CB3DCD7A0@fs11.mertree.mer.co.il> Message-ID: <988ed6930909300827o797764a7h7cf68d218dee93be@mail.gmail.com> On Wed, Sep 30, 2009 at 2:26 AM, Stas Desyatnlkov wrote: > Sender only. > > I see some video providers that mix AVC into TS. > > NAL, SPS and PPS are all part of the elementary stream packet. How they > ensure that client receives SPS and PPS in time is not clear. Probably the > receiving side waits for the next SPS, PPS and start showing video after > that. > OK, so the sender is not a "black box," like you describe, but something to which you could easily add the SPS/PPS info to the SDP exchange? You *can* send it in stream, but the risks you accept are A) the client may not be listening when they're sent, and/or B) those packets are lost/corrupted en-route (assuming you're using UDP as a transport). It's just that simple. -------------- next part -------------- An HTML attachment was scrubbed... URL: From joeflin at 126.com Tue Sep 29 15:30:32 2009 From: joeflin at 126.com (joeflin) Date: Wed, 30 Sep 2009 06:30:32 +0800 (CST) Subject: [Live-devel] questions about quicktime as a rtsp client Message-ID: <29292100.546421254263432646.JavaMail.coremail@bj126app82.126.com> Hi, My streaming box has IP 192.168.0.48, my PC has IP 192.168.0.100, I am using live555 to stream h.264 video to the network. VLC and mplayer work fine. But Quicktime gives troubles. 1. If I use rtsp://192.168.0.48, I got "Connection failed", after negotiating and waiting for 10 sec from QuickTime. 
On the box, I turned on the DEBUG in RTSPServer.cpp and got "RTSPClientSession[0x14f28]::incomingRequestHandler1() read 0 bytes (of 10000); terminating connection!" 2. If I use rtsp://192.168.0.48:554, I got "Bad request", after negotiatiaing and waiting for 10 sec from QuickTime. On the box, I turned on the DEBUG in RTSServer.cpp and got "RTSPClientSession[0x14f50]::incomingRequestHandler1() read 0 bytes (of 10000); terminating connection! destructing an H264 StreamFramer accept()ed connection from 192.168.0.10010RTSPClientSession[0x14f28]::incomingRequestHandler1() read 211 bytes:G0 User-Agent: QuickTime/7.6 (qtver=7.6;os=Windows NT 5.1Service Pack 2) x-sessioncookie: 6wiJt4ihAAA7/wEAOYAAAA Accept: application/x-rtsp-tunnelled Pragma: no-cache Cache-Control: no-cache parseRTSPRequestString() failed! sending response: RTSP/1.0 400 Bad Request Date: Tue, Sep 29 2009 22:21:33 GMT Allow: OPTIONS, DESCRIBE, SETUP, TEARDOWN, PLAY, PAUSE, GET_PARAMETER, SET_PARAMETER RTSPClientSession[0x14f28]::incomingRequestHandler1() read 0 bytes (of 10000); terminating connection!" Could someone share a light here? Thanks! Joe -------------- next part -------------- An HTML attachment was scrubbed... URL: From sharda.murthi at gmail.com Tue Sep 29 21:13:48 2009 From: sharda.murthi at gmail.com (Sharda Murthi) Date: Wed, 30 Sep 2009 00:13:48 -0400 Subject: [Live-devel] Comparison of streamed data for TCP and UDP Message-ID: Hi, I am new to using live555 media server. I am trying to compare the quality of media streaming over TCP and UDP. I understand that this can be done using the openRTSP client. However, this client generates 2 separate audio and video files. Is there any way I can play them both together while they are being streamed? The -Q option with openRTSP indicates that there is no packet-loss (since I am performing local streaming), however, the files sizes of the audio and video files do not add up to the original file size. Is there something I am missing out on? What is the best way for me to compare streaming quality for TCP and UDP on the live555 media server? -------------- next part -------------- An HTML attachment was scrubbed... URL: From mliggett at btisystems.com Wed Sep 30 09:47:32 2009 From: mliggett at btisystems.com (Mark Liggett) Date: Wed, 30 Sep 2009 12:47:32 -0400 Subject: [Live-devel] [PATCH] Added command line option to live555MediaServer to specify port Message-ID: Hi everyone, I've added some code into live555MediaServer.cpp to allow the user to specify a command line argument stating which port to use for the server. This change allows multiple (more than 2) live555MediaServers to be started on a given host and means they can also be started using a script. If no port is specified I've set the server to start on port 8554. This is perhaps not everybody's default but suits our environment. 
Kind regards, Mark Liggett BTI Systems 24d23 < #include 40,78d38 < /* < * Added command line option -p to specify a tcp port on < * which to start live555MediaServer otherwise default to port 8554 < */ < unsigned short desiredPortNum = 8554; < int opt; < < if(argc == 1){ < *env << "RTSP server starting on default port: " << desiredPortNum << "\n"; < } < < while ((opt = getopt (argc, argv, "p:")) != -1) { < switch (opt) < { < case 'p': { // specify port number < int portArg; < if(sscanf(optarg, "%d", &portArg) != -1){ < if (portArg <= 0 || portArg >= 65536) { < *env << "RTSP server failed to start: " << "Invalid port number \n"; < exit(1); < } < else{ < desiredPortNum = (unsigned short)portArg; < *env << "RTSP server starting on port: " << desiredPortNum << "\n"; < break; < } < } < else{ < *env << "RTSP server failed to start: " << "Invalid port number \n"; < exit(1); < } < } < default : { < *env << "RTSP server failed to start: " << "Missing port number \n"; < exit(1); < } < } < } < 82,86c42 < < // Changed default behavior to set port to desiredPortNum rather than 554 < portNumBits rtspServerPortNum = desiredPortNum; < < --- > portNumBits rtspServerPortNum = 554; 88,91d43 < < /* < * Removed to stop default port behavior < * 96,97d47 < */ < -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Sep 30 12:10:19 2009 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 30 Sep 2009 12:10:19 -0700 Subject: [Live-devel] AAC in TS streaming In-Reply-To: <21E398286732DC49AD45BE8C7BE96C079CB3DCD7E0@fs11.mertree.mer.co.il> References: <21E398286732DC49AD45BE8C7BE96C079CB3DCD7E0@fs11.mertree.mer.co.il> Message-ID: >Does the following sequence makes sense No, because a "MPEG4GenericRTPSink" is used to stream MPEG-4 Elementary Stream data (e.g., AAC audio), *not* MPEG Transport Stream data. I feeling deja vu here.... -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Sep 30 12:17:53 2009 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 30 Sep 2009 12:17:53 -0700 Subject: [Live-devel] [PATCH] Added command line option to live555MediaServer to specify port In-Reply-To: References: Message-ID: Thanks, but I won't be adding this to the released code. The "LIVE555 Media Server" is intended to be a simple product with no 'bells and whistles' - aimed at unsophisticated users (e.g., people who use "@gmail.com" addresses :-). Anyone who wants to run multiple copies of the server on a single host (for whatever reason) should be sophisticated enough to modifiy their own copy of the code, as you have done. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From shahsachin at gmail.com Wed Sep 30 15:11:50 2009 From: shahsachin at gmail.com (Sachin) Date: Wed, 30 Sep 2009 15:11:50 -0700 Subject: [Live-devel] Server to client ANNOUNCE method implementation in live media library Message-ID: <5535bcc50909301511n11c5b4dfr6c49dd5ae1407bce@mail.gmail.com> I have to implement ANNOUNCE method (from server to client) for the RTSP server I am implementing using live media library. I was looking into implementing it without modifying existing RTSPServer code in live media library and doing it completely in my own extension of RTSPServer. But seems like that would not be possible as current RTSPServer doesn't keep the references to the client sessions. Is there any way to query existing client sessions? 
From finlayson at live555.com Wed Sep 30 12:10:19 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 30 Sep 2009 12:10:19 -0700
Subject: [Live-devel] AAC in TS streaming
In-Reply-To: <21E398286732DC49AD45BE8C7BE96C079CB3DCD7E0@fs11.mertree.mer.co.il>
References: <21E398286732DC49AD45BE8C7BE96C079CB3DCD7E0@fs11.mertree.mer.co.il>
Message-ID: 

>Does the following sequence make sense

No, because a "MPEG4GenericRTPSink" is used to stream MPEG-4 Elementary Stream data (e.g., AAC audio), *not* MPEG Transport Stream data.

I'm feeling deja vu here....
-- 
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
From finlayson at live555.com Wed Sep 30 12:17:53 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 30 Sep 2009 12:17:53 -0700
Subject: [Live-devel] [PATCH] Added command line option to live555MediaServer to specify port
In-Reply-To: 
References: 
Message-ID: 

Thanks, but I won't be adding this to the released code. The "LIVE555 Media Server" is intended to be a simple product with no 'bells and whistles' - aimed at unsophisticated users (e.g., people who use "@gmail.com" addresses :-). Anyone who wants to run multiple copies of the server on a single host (for whatever reason) should be sophisticated enough to modify their own copy of the code, as you have done.
-- 
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From shahsachin at gmail.com Wed Sep 30 15:11:50 2009
From: shahsachin at gmail.com (Sachin)
Date: Wed, 30 Sep 2009 15:11:50 -0700
Subject: [Live-devel] Server to client ANNOUNCE method implementation in live media library
Message-ID: <5535bcc50909301511n11c5b4dfr6c49dd5ae1407bce@mail.gmail.com>

I have to implement the ANNOUNCE method (from server to client) for an RTSP server I am implementing using the live media library. I was looking into doing it without modifying the existing RTSPServer code, entirely in my own extension of RTSPServer, but it seems that would not be possible, as the current RTSPServer doesn't keep references to the client sessions.

Is there any way to query the existing client sessions? Or is there any other way to send ANNOUNCE to the clients connected to the server?

I would appreciate your response.

Thanks,
Sachin
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
From Jonathan.Pannaman at espn.com Wed Sep 30 15:30:15 2009
From: Jonathan.Pannaman at espn.com (Pannaman, Jonathan)
Date: Wed, 30 Sep 2009 15:30:15 -0700
Subject: [Live-devel] TS over UDP (or RTP)
Message-ID: 

I am interested in using live for an application which receives TS data from an encoder, in blocks of approximately 16K bytes, via a DirectShow filter graph. I have successfully built a filter which receives the data and, very simply, breaks it into blocks of 7 TS packets and sends them as UDP packets over the network. This works fine for a number of devices, but others cannot seem to keep their buffers properly primed or synced. If I send the packets to localhost, receive them with VLC, and have VLC (re)stream them, all devices are content and everything works properly. I am using both RTP and plain UDP for this.

I would like to use the live library to integrate the reliable sending function into my DS filter, but can't work out the right way to do this. Can anyone give me an example or direction on the right way to do this? The main thing I struggle with is how to determine the timing of sending the data. Do I need to look at PTS and DTS timestamps? Or can I do something simpler?

Thoughts?
JonP
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
From finlayson at live555.com Wed Sep 30 21:47:49 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 30 Sep 2009 21:47:49 -0700
Subject: [Live-devel] TS over UDP (or RTP)
In-Reply-To: 
References: 
Message-ID: 

>I would like to use the live library to integrate the reliable
>sending function into my DS filter, but can't work out the right way
>to do this. Can anyone give me an example or direction on the right
>way to do this? The main thing I struggle with is how to determine
>the timing of sending the data. Do I need to look at PTS and DTS
>timestamps? Or can I do something simpler?

Yes, you can feed your Transport Stream data (after you've accumulated it into groups of 7 Transport Packets) into a "MPEG2TransportStreamFramer" object. You can then feed this into your "RTPSink" (a "SimpleRTPSink"). The "MPEG2TransportStreamFramer" will figure out a proper duration for each chunk of data, so that it gets streamed at an appropriate rate.
-- 
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
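To make that chain concrete, here is a rough sketch modeled on the "testMPEG2TransportStreamer" demo program that ships with the library. "MyTSBlockSource" is a hypothetical stand-in for a FramedSource subclass fed by the DirectShow filter graph (a possible skeleton for it appears at the end of this digest), and the destination address and port are placeholders:

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"
#include "GroupsockHelper.hh"

void afterPlaying(void* /*clientData*/) { /* tear down the chain, or just exit */ }

int main() {
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  // Destination (unicast or multicast) for the RTP/UDP packets; placeholders:
  struct in_addr destinationAddress;
  destinationAddress.s_addr = our_inet_addr("192.168.0.10");
  const Port rtpPort(1234);
  const unsigned char ttl = 7;
  Groupsock rtpGroupsock(*env, destinationAddress, rtpPort, ttl);

  // Hypothetical source delivering the 7*188-byte blocks from the DS filter graph:
  FramedSource* tsBlocks = MyTSBlockSource::createNew(*env);

  // The framer computes a duration for each chunk (using the PCR values in the
  // Transport Stream), so that the sink sends at an appropriate rate:
  MPEG2TransportStreamFramer* framer =
      MPEG2TransportStreamFramer::createNew(*env, tsBlocks);

  // 33 is the standard RTP payload type, and "MP2T" the payload format name,
  // for MPEG-2 Transport Streams; the RTP timestamp frequency is 90000 Hz:
  RTPSink* videoSink =
      SimpleRTPSink::createNew(*env, &rtpGroupsock, 33, 90000, "video", "MP2T",
                               1, True, False /*no 'M' bit*/);

  videoSink->startPlaying(*framer, afterPlaying, videoSink);
  env->taskScheduler().doEventLoop(); // does not return
  return 0;
}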
From Jonathan.Pannaman at espn.com Wed Sep 30 22:29:38 2009
From: Jonathan.Pannaman at espn.com (Pannaman, Jonathan)
Date: Wed, 30 Sep 2009 22:29:38 -0700
Subject: [Live-devel] TS over UDP (or RTP)
Message-ID: 

Can I do this as soon as I get a block, and as fast as a for loop will pass off the 7xTS packets? In other words, will the Framer buffer until it needs to go?

Thanks
Jon

________________________________
From: live-devel-bounces at ns.live555.com
To: LIVE555 Streaming Media - development & use
Sent: Thu Oct 01 00:47:49 2009
Subject: Re: [Live-devel] TS over UDP (or RTP)

I would like to use the live library to integrate the reliable sending function into my DS filter, but can't work out the right way to do this. Can anyone give me an example or direction on the right way to do this? The main thing I struggle with is how to determine the timing of sending the data. Do I need to look at PTS and DTS timestamps? Or can I do something simpler?

Yes, you can feed your Transport Stream data (after you've accumulated it into groups of 7 Transport Packets) into a "MPEG2TransportStreamFramer" object. You can then feed this into your "RTPSink" (a "SimpleRTPSink"). The "MPEG2TransportStreamFramer" will figure out a proper duration for each chunk of data, so that it gets streamed at an appropriate rate.
-- 
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
From finlayson at live555.com Wed Sep 30 22:35:25 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 30 Sep 2009 22:35:25 -0700
Subject: [Live-devel] TS over UDP (or RTP)
In-Reply-To: 
References: 
Message-ID: 

>Can I do this as soon as I get a block, and as fast as a for loop
>will pass off the 7xTS packets? In other words, will the Framer
>buffer until it needs to go?

I don't really understand your question. Just set up a standard LIVE555 data chain:

	your-7-transport-packet-source -> MPEG2TransportStreamFramer -> SimpleRTPSink

Because of the "MPEG2TransportStreamFramer" object, data will end up getting streamed at an appropriate rate.
-- 
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
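For the "your-7-transport-packet-source" half of that chain, a hypothetical skeleton of such a FramedSource subclass is sketched below. The class name and the getNextBlockFromFilterGraph() helper are illustrative assumptions, not part of liveMedia, and a real implementation would deliver data asynchronously (via an event trigger or scheduled task) rather than inline:

#include "FramedSource.hh"
#include <string.h>

class MyTSBlockSource: public FramedSource {
public:
  static MyTSBlockSource* createNew(UsageEnvironment& env) {
    return new MyTSBlockSource(env);
  }

protected:
  MyTSBlockSource(UsageEnvironment& env) : FramedSource(env) {}

private:
  virtual void doGetNextFrame() {
    // Hypothetical helper that hands us the next 7*188-byte block produced by
    // the DirectShow filter graph. Blocking is not allowed here; in real code,
    // arrange for a callback when data arrives, then complete delivery there:
    unsigned blockSize;
    unsigned char const* block = getNextBlockFromFilterGraph(blockSize);

    // Truncate if the downstream object's buffer is too small:
    if (blockSize > fMaxSize) {
      fNumTruncatedBytes = blockSize - fMaxSize;
      blockSize = fMaxSize;
    }
    memmove(fTo, block, blockSize);
    fFrameSize = blockSize;
    // (A real source would also set fPresentationTime here.)

    // Tell the downstream object (the framer) that the data is ready. Because
    // delivery completed synchronously, real code should schedule this call
    // through the task scheduler instead, to avoid deep recursion:
    FramedSource::afterGetting(this);
  }

  unsigned char const* getNextBlockFromFilterGraph(unsigned& size); // hypothetical
};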