From sebastien-devel at celeos.eu Wed Apr 1 05:20:03 2009 From: sebastien-devel at celeos.eu (Sébastien Escudier) Date: Wed, 01 Apr 2009 14:20:03 +0200 Subject: [Live-devel] teardownMediaSession Message-ID: <1238588403.49d35bf3b2621@imp.celeos.eu> Hi, I would like to know why we need to wait for an answer in teardownMediaSession. My issue is: when a server stops sending me data, I close the client. But before that I send a teardownMediaSession, just in case the server or the connection is not really dead. But if the server or the connection IS dead, I have to wait 30 seconds for the function teardownMediaSession to return. And I don't need to know whether the server received the teardown or not. Would you have any suggestions? Thanks. From bitter at vtilt.com Wed Apr 1 10:04:33 2009 From: bitter at vtilt.com (Brad Bitterman) Date: Wed, 1 Apr 2009 13:04:33 -0400 Subject: [Live-devel] stream m4e In-Reply-To: <1238533927.23779.13.camel@oslab-l1> References: <1238450328.5328.178.camel@oslab-l1> <1238531094.23779.8.camel@oslab-l1> <64DCF11D-D0F4-4078-B4F6-F7876FED062A@vtilt.com> <1238533927.23779.13.camel@oslab-l1> Message-ID: <1FAE409C-0CB9-46A9-984E-6B5E5A5A3B11@vtilt.com> It's hard to tell without looking at the file you captured. It seems that either your i-frames are not being sent correctly or your timing is off for some reason. If the RTP timestamp is not correct then VLC will not be happy and will give you warnings about late and dropped frames. - Brad On Mar 31, 2009, at 5:12 PM, York Sun wrote: > Brad, > > My camera is an AXIS M1011 Network Camera. I have no idea about the key > frame setting on the camera. > > My recording is 100 seconds long. However, with testMPEG4VideoStreamer > and vlc as a client, I can only see the green picture for about 5 > seconds, then the picture freezes and the vlc message says > > ffmpeg decoder error: more than 5 seconds of late video -> dropping > frame (computer too slow ?)
> > On the other hand, vlc can play the recorded file if I rename it > to .m4v. > > York > > > On Tue, 2009-03-31 at 16:55 -0400, Brad Bitterman wrote: >> Oh! I think I know exactly what you're seeing. I have seen this before >> many times with VLC. What happens for me is that the time between i- >> frames is very long. When I connect with VLC I only get p-frames >> for a >> while. The green is because the initial decoded picture buffer in VLC >> is filled with all zeros, which in YUV land is green. The little >> shapes >> you see are the motion vectors that are being decoded against the >> green buffer. >> >> How long is the clip you captured? What is the key frame interval set >> to on the Axis camera? >> >> - Brad >> >> On Mar 31, 2009, at 4:24 PM, York Sun wrote: >> >>> Brad, >>> >>> I should be clear about the color. It is not random color. It is all >>> green, with a little bit of shape here and there under vlc. mplayer >>> shows >>> mostly black. >>> >>> I think it is something related to multicast. I just found an old post >>> in the >>> archive: >>> http://lists.live555.com/pipermail/live-devel/2008-August/009323.html >>> . >>> Ross seems to believe the missing key frame causes the green. >>> >>> I tried live555MediaServer and it streams OK. >>> >>> Here is what I am trying to do. I want to capture rtsp streams from >>> multiple IP cameras (MPEG4 or H.264), save the streams to the hard >>> drive, >>> and also re-stream the video to unicast and/or multicast addresses. >>> I am >>> hoping I can use openRTSP and testMPEG4VideoStreamer as templates >>> and >>> put them together. Now it seems to be a little bit difficult for >>> multicasting. Please advise whether it is even possible, or whether someone >>> already >>> did it. >>> >>> Thanks, >>> >>> York >>> >>> >>> On Tue, 2009-03-31 at 15:49 -0400, Brad Bitterman wrote: >>>> Not sure what would cause a color issue like what you're >>>> describing....
Live555 doesn't modify the frame data in any way so >>>> chroma info should not be modified. >>>> >>>> >>>> - Brad >>>> >>>> On Mar 30, 2009, at 10:34 PM, sun york-R58495 wrote: >>>> >>>>> Thanks for the hint. vlc can play the file with .m4v extension. >>>>> That >>>>> at least proves the openRTSP works OK. I still don't know why >>>>> testMPEG4VideoStreamer sends the vague picture with wrong color. >>>>> Is >>>>> it resolution issue? The sample para.m4e file streams OK. It has >>>>> low >>>>> resolution, though. >>>>> >>>>> York >>>>> >>>>> >>>>> >>>>> -----Original Message----- >>>>> From: live-devel-bounces at ns.live555.com on behalf of Brad >>>>> Bitterman >>>>> Sent: Mon 3/30/2009 19:51 >>>>> To: LIVE555 Streaming Media - development & use >>>>> Subject: Re: [Live-devel] stream m4e >>>>> >>>>> You can use ffmpeg to wrap the file into a mp4 container. I think >>>>> the >>>>> command is just the following: >>>>> >>>>> ./ffmpeg -i video-MP4V-ES-1 out.mp4 >>>>> >>>>> You should then be able to open the file with VLC. Also, ffmpeg >>>>> will >>>>> usually tell you if there are problems with the bitstream. I have >>>>> also >>>>> just renamed the file with a .m4v extension and VLC can play it >>>>> directly without any container. >>>>> >>>>> I have used openRTSP in the past to capture a stream from an Axis >>>>> camera and it worked fine. You might also want to turn the debug >>>>> log >>>>> level up in vlc and see if it reports any errors. >>>>> >>>>> - Brad >>>>> >>>>> On Mar 30, 2009, at 5:58 PM, York Sun wrote: >>>>> >>>>>> I am new to this list. Tried to search the archive but didn't >>>>>> find >>>>> a >>>>>> good way. >>>>>> >>>>>> In short, I have difficulty to stream m4e by >>>>> testMPEG4VideoStreamer. >>>>>> The >>>>>> m4e comes from openRTSP (renamed from video-MP4V-ES-1), captured >>>>>> from my >>>>>> Axis camera, 640x480. I can see the picture with wrong color. vlc >>>>> and >>>>>> mplayer display different wrong color. 
I don't know which part is >>>>>> wrong. >>>>>> >>>>>> More info, if adding -i or -4 to openRTSP, the avi or mp4 file >>>>>> can >>>>> be >>>>>> correctly played by vlc and mplayer. I don't know how to verify >>>>> the >>>>>> m4e >>>>>> file. I can stream the sample para.m4e file. Only the very >>>>> beginning >>>>>> shows wrong color. >>>>>> >>>>>> Any help is appreciated. >>>>>> >>>>>> York >>>>>> >>>>>> _______________________________________________ >>>>>> live-devel mailing list >>>>>> live-devel at lists.live555.com >>>>>> http://lists.live555.com/mailman/listinfo/live-devel >>>>> >>>>> _______________________________________________ >>>>> live-devel mailing list >>>>> live-devel at lists.live555.com >>>>> http://lists.live555.com/mailman/listinfo/live-devel >>>>> >>>>> >>>>> >>>>> >>>>> >>>>> _______________________________________________ >>>>> live-devel mailing list >>>>> live-devel at lists.live555.com >>>>> http://lists.live555.com/mailman/listinfo/live-devel >>>> >>>> >>>> _______________________________________________ >>>> live-devel mailing list >>>> live-devel at lists.live555.com >>>> http://lists.live555.com/mailman/listinfo/live-devel >>> _______________________________________________ >>> live-devel mailing list >>> live-devel at lists.live555.com >>> http://lists.live555.com/mailman/listinfo/live-devel >> >> _______________________________________________ >> live-devel mailing list >> live-devel at lists.live555.com >> http://lists.live555.com/mailman/listinfo/live-devel >> > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel From bitter at vtilt.com Wed Apr 1 10:06:37 2009 From: bitter at vtilt.com (Brad Bitterman) Date: Wed, 1 Apr 2009 13:06:37 -0400 Subject: [Live-devel] Access to RTP Extension Header In-Reply-To: References: Message-ID: <3509FC8D-B688-4585-AD1F-0983297628AB@vtilt.com> Not sure if this made it through to the list 
so I'm resending..... - Brad ----------------------------------------------------------------- Ross, I modified MultiFramedRTPSource to add the ability to set and call a callback function to gain access to the RTP extended header. This works for my test scenario. If there is no RTP extended header (probably 90% of the time this is the case) then the added code has no effect on the system. I have attached a patch/diff of the changes I made. I hope you'll add them into the official tree.... Thanks for your help, Brad ----------------------------------------------------------------- diff -rupN --exclude=Makefile live/liveMedia/include/RTPSource.hh live.2009.02.23/liveMedia/include/RTPSource.hh --- live/liveMedia/include/RTPSource.hh 2009-03-25 16:20:34.000000000 -0400 +++ live.2009.02.23/liveMedia/include/RTPSource.hh 2009-02-23 05:03:54.000000000 -0500 @@ -76,10 +76,6 @@ public: u_int16_t curPacketRTPSeqNum() const { return fCurPacketRTPSeqNum; } u_int32_t curPacketRTPTimestamp() const { return fCurPacketRTPTimestamp; } - // This is used to set a callback to retrieve the RTP Header Extension data - typedef void (*RtpExtHdrCallback_t)(uint16_t profile, uint16_t len, uint8_t* pHdrData, void* pPriv); - void setRtpExtHdrCallback( RtpExtHdrCallback_t callback, void* pPriv ) { fRtpExtHdrCallback = callback; fRtpExtHdrCallbackPrivData = pPriv;} - protected: RTPSource(UsageEnvironment& env, Groupsock* RTPgs, unsigned char rtpPayloadFormat, u_int32_t rtpTimestampFrequency); @@ -93,8 +89,6 @@ protected: Boolean fCurPacketMarkerBit; Boolean fCurPacketHasBeenSynchronizedUsingRTCP; u_int32_t fLastReceivedSSRC; - RtpExtHdrCallback_t fRtpExtHdrCallback; - void* fRtpExtHdrCallbackPrivData; private: // redefined virtual functions: diff -rupN --exclude=Makefile live/liveMedia/MultiFramedRTPSource.cpp live.2009.02.23/liveMedia/MultiFramedRTPSource.cpp --- live/liveMedia/MultiFramedRTPSource.cpp 2009-03-30 08:49:05.000000000 -0400 +++
live.2009.02.23/liveMedia/MultiFramedRTPSource.cpp 2009-02-23 05:03:54.000000000 -0500 @@ -71,8 +71,6 @@ MultiFramedRTPSource // Try to use a big receive buffer for RTP: increaseReceiveBufferTo(env, RTPgs->socketNum(), 50*1024); - - fRtpExtHdrCallback = NULL; } void MultiFramedRTPSource::reset() { @@ -248,16 +246,11 @@ void MultiFramedRTPSource::networkReadHa ADVANCE(cc*4); // Check for (& ignore) any RTP header extension - // If a callback is set we pass any header extension info to it if (rtpHdr&0x10000000) { if (bPacket->dataSize() < 4) break; unsigned extHdr = ntohl(*(unsigned*)(bPacket->data())); ADVANCE(4); unsigned remExtSize = 4*(extHdr&0xFFFF); if (bPacket->dataSize() < remExtSize) break; - if( source->fRtpExtHdrCallback ) - { - source->fRtpExtHdrCallback(extHdr>>16, remExtSize, bPacket->data(), source->fRtpExtHdrCallbackPrivData); - } ADVANCE(remExtSize); } From yorksun at freescale.com Wed Apr 1 11:10:07 2009 From: yorksun at freescale.com (York Sun) Date: Wed, 01 Apr 2009 13:10:07 -0500 Subject: [Live-devel] stream m4e In-Reply-To: <1FAE409C-0CB9-46A9-984E-6B5E5A5A3B11@vtilt.com> References: <1238450328.5328.178.camel@oslab-l1> <1238531094.23779.8.camel@oslab-l1> <64DCF11D-D0F4-4078-B4F6-F7876FED062A@vtilt.com> <1238533927.23779.13.camel@oslab-l1> <1FAE409C-0CB9-46A9-984E-6B5E5A5A3B11@vtilt.com> Message-ID: <1238609407.8995.4.camel@oslab-l1> Brad, Thanks for the advice. I will drop it for now and move on to unicast and relay. Maybe I will have a better idea when I learn more about it. York On Wed, 2009-04-01 at 13:04 -0400, Brad Bitterman wrote: > It's hard to tell without looking at the file you captured. It seems > that either your i-frames are not being sent correctly or your timing > is off for some reason. If the RTP timestamp is not correct then VLC > will not be happy and give you warnings about late and dropped frames. > > - Brad > > On Mar 31, 2009, at 5:12 PM, York Sun wrote: > > > Brad, > > > > My camera is an AXIS M1011 Network Camera.
I have no idea about the key > > frame on camera. > > > > My recording is 100 seconds lone. However, with testMPEG4VideoStreamer > > and vlc as a client, I can only see the green picture for about 5 > > seconds, then the picture freezes and vlc message says > > > > ffmpeg decoder error: more than 5 seconds of late video -> dropping > > frame (computer too slow ?) > > > > On the other hand, vlc can play the recorded file if rename it > > to .m4v. > > > > York > > > > > > On Tue, 2009-03-31 at 16:55 -0400, Brad Bitterman wrote: > >> Oh! I think I know exactly what your seeing. I have seen this before > >> many times with VLC. What happens for me is that the time between i- > >> frames is very long. When I connect with VLC i only get p-frames > >> for a > >> while. The green is because the initial decoded picture buffer in VLC > >> is filled with all zeros which in YUV land is green. The little > >> shapes > >> you see are the motion vectors that are being decoded against the > >> green buffer. > >> > >> How long is the clip you captured? What is the key frame interval set > >> to on the Axis camera? > >> > >> - Brad > >> > >> On Mar 31, 2009, at 4:24 PM, York Sun wrote: > >> > >>> Brad, > >>> > >>> I should be clear about the color. It is not random color. It is all > >>> green, with a little bit shape here and there under vlc. mplayer > >>> shows > >>> most black. > >>> > >>> I think it is something related to multicast. Just found an old post > >>> in > >>> archive > >>> http://lists.live555.com/pipermail/live-devel/2008-August/009323.html > >>> . > >>> Ross seems to believe the missing key frame causes the green. > >>> > >>> I tried live555MediaServer and it streams OK. > >>> > >>> Here is what I am trying to do. I want to capture rtsp steams from > >>> multiple IP cameras (MPEG4 or H.264), save the stream to the hard > >>> drive, > >>> and also re-stream the video to unicast and/or multicast addresses. 
> >>> I am > >>> hoping I can use the openRTSP and testMPEG4VideoStreamer as template > >>> and > >>> put them together. Now it seems to be a little bit difficult for > >>> multicasting. Please advise if it is even possible, or someone > >>> already > >>> did it. > >>> > >>> Thanks, > >>> > >>> York > >>> > >>> > >>> On Tue, 2009-03-31 at 15:49 -0400, Brad Bitterman wrote: > >>>> Not sure what would cause a color issue like what you're > >>>> describing.... Live555 doesn't modify the frame data in any way so > >>>> chroma info should not be modified. > >>>> > >>>> > >>>> - Brad > >>>> > >>>> On Mar 30, 2009, at 10:34 PM, sun york-R58495 wrote: > >>>> > >>>>> Thanks for the hint. vlc can play the file with .m4v extension. > >>>>> That > >>>>> at least proves the openRTSP works OK. I still don't know why > >>>>> testMPEG4VideoStreamer sends the vague picture with wrong color. > >>>>> Is > >>>>> it resolution issue? The sample para.m4e file streams OK. It has > >>>>> low > >>>>> resolution, though. > >>>>> > >>>>> York > >>>>> > >>>>> > >>>>> > >>>>> -----Original Message----- > >>>>> From: live-devel-bounces at ns.live555.com on behalf of Brad > >>>>> Bitterman > >>>>> Sent: Mon 3/30/2009 19:51 > >>>>> To: LIVE555 Streaming Media - development & use > >>>>> Subject: Re: [Live-devel] stream m4e > >>>>> > >>>>> You can use ffmpeg to wrap the file into a mp4 container. I think > >>>>> the > >>>>> command is just the following: > >>>>> > >>>>> ./ffmpeg -i video-MP4V-ES-1 out.mp4 > >>>>> > >>>>> You should then be able to open the file with VLC. Also, ffmpeg > >>>>> will > >>>>> usually tell you if there are problems with the bitstream. I have > >>>>> also > >>>>> just renamed the file with a .m4v extension and VLC can play it > >>>>> directly without any container. > >>>>> > >>>>> I have used openRTSP in the past to capture a stream from an Axis > >>>>> camera and it worked fine. 
You might also want to turn the debug > >>>>> log > >>>>> level up in vlc and see if it reports any errors. > >>>>> > >>>>> - Brad > >>>>> > >>>>> On Mar 30, 2009, at 5:58 PM, York Sun wrote: > >>>>> > >>>>>> I am new to this list. Tried to search the archive but didn't > >>>>>> find > >>>>> a > >>>>>> good way. > >>>>>> > >>>>>> In short, I have difficulty to stream m4e by > >>>>> testMPEG4VideoStreamer. > >>>>>> The > >>>>>> m4e comes from openRTSP (renamed from video-MP4V-ES-1), captured > >>>>>> from my > >>>>>> Axis camera, 640x480. I can see the picture with wrong color. vlc > >>>>> and > >>>>>> mplayer display different wrong color. I don't know which part is > >>>>>> wrong. > >>>>>> > >>>>>> More info, if adding -i or -4 to openRTSP, the avi or mp4 file > >>>>>> can > >>>>> be > >>>>>> correctly played by vlc and mplayer. I don't know how to verify > >>>>> the > >>>>>> m4e > >>>>>> file. I can stream the sample para.m4e file. Only the very > >>>>> beginning > >>>>>> shows wrong color. > >>>>>> > >>>>>> Any help is appreciated. 
> >>>>>> > >>>>>> York > >>>>>> > >>>>>> _______________________________________________ > >>>>>> live-devel mailing list > >>>>>> live-devel at lists.live555.com > >>>>>> http://lists.live555.com/mailman/listinfo/live-devel > >>>>> > >>>>> _______________________________________________ > >>>>> live-devel mailing list > >>>>> live-devel at lists.live555.com > >>>>> http://lists.live555.com/mailman/listinfo/live-devel > >>>>> > >>>>> > >>>>> > >>>>> > >>>>> > >>>>> _______________________________________________ > >>>>> live-devel mailing list > >>>>> live-devel at lists.live555.com > >>>>> http://lists.live555.com/mailman/listinfo/live-devel > >>>> > >>>> > >>>> _______________________________________________ > >>>> live-devel mailing list > >>>> live-devel at lists.live555.com > >>>> http://lists.live555.com/mailman/listinfo/live-devel > >>> _______________________________________________ > >>> live-devel mailing list > >>> live-devel at lists.live555.com > >>> http://lists.live555.com/mailman/listinfo/live-devel > >> > >> _______________________________________________ > >> live-devel mailing list > >> live-devel at lists.live555.com > >> http://lists.live555.com/mailman/listinfo/live-devel > >> > > _______________________________________________ > > live-devel mailing list > > live-devel at lists.live555.com > > http://lists.live555.com/mailman/listinfo/live-devel > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > From patbob at imoveinc.com Wed Apr 1 11:16:59 2009 From: patbob at imoveinc.com (Patrick White) Date: Wed, 1 Apr 2009 11:16:59 -0700 Subject: [Live-devel] minor bug on windows implementation of gettimeofday() In-Reply-To: References: <1238450328.5328.178.camel@oslab-l1> Message-ID: <200904011116.59137.patbob@imoveinc.com> Over in GroupSockHelper.cpp is a windows implementation of gettimeofday(). 
There are three implementations for it -- one for CE (?), one that uses ftime() and another that uses QueryPerformanceCounter(). The version that uses QueryPerformanceCounter() is broken. With that implementation, the epoch of the returned times is variant -- it seems to be last boot. gettimeofday() is supposed to return wall clock times with an invariant epoch. The ftime() implementation should be used on Windows instead, which one gets by compiling the library with the USE_OLD_GETTIMEOFDAY_FOR_WINDOWS_CODE symbol defined. I think I found the only place where it matters -- gettimeofday() is used to record the time of the last received RTCP RR report. In all other places, it seems to be used for delta times against other calls to gettimeofday(), so either implementation is OK for them. You should probably remove or fix that QueryPerformanceCounter() variant so gettimeofday() does what it is supposed to do -- we all have the library source so you never know when one of us is going to try to print out the time of the last RR report or something :) From smalenfant at gmail.com Wed Apr 1 13:10:34 2009 From: smalenfant at gmail.com (Steve Malenfant) Date: Wed, 1 Apr 2009 16:10:34 -0400 Subject: [Live-devel] Implement RTSP skip forward in mediaServer Message-ID: <9a33fb600904011310y12d9eeecgb4025b4522bd44a7@mail.gmail.com> It seems there is an option in RTSP to specify a time range at which you want to start your stream. I did a web page for an STB so that when a certain button is pressed (for example), I take the current position, add 30 to it, and set the position. The new npt is sent to the server, but mediaServer doesn't seem to do anything with this parameter. Without an index, it would also seem feasible to skip forward, but then I don't know about rewind. http://www.myiptv.org/Articles/RTSP/tabid/72/Default.aspx "The important bits of this command are Range and Scale. See I said you would want to know the range. Range specifies from where and how much of the content to play.
0- tells the server to start at the beginning and play to the end but you could also start anywhere in the file as we'll see in a minute or only play the first 5 minutes of the content. It's up to you." Steve M. -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Apr 1 13:49:08 2009 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 1 Apr 2009 13:49:08 -0700 Subject: [Live-devel] Access to RTP Extension Header In-Reply-To: <3509FC8D-B688-4585-AD1F-0983297628AB@vtilt.com> References: <3509FC8D-B688-4585-AD1F-0983297628AB@vtilt.com> Message-ID: >Not sure if this made it through to the list You can easily check this by looking at the online archives: http://lists.live555.com/pipermail/live-devel/ > so I'm resending..... *Please* don't send the same message to the mailing list more than once! -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Wed Apr 1 14:21:34 2009 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 1 Apr 2009 14:21:34 -0700 Subject: [Live-devel] Implement RTSP skip forward in mediaServer In-Reply-To: <9a33fb600904011310y12d9eeecgb4025b4522bd44a7@mail.gmail.com> References: <9a33fb600904011310y12d9eeecgb4025b4522bd44a7@mail.gmail.com> Message-ID: >Seems like there is an option in RTSP to specify a range in time >that you want to start your stream. Yes, and we support it, for both RTSP clients and RTSP servers. For RTSP clients: Note the "start" and "end" parameters to "RTSPClient::playMediaSession()" and "RTSPClient::playMediaSubsession()". These parameters (if set to non-default values) tell the RTSP client to request a specific time range. For RTSP servers: Our RTSP server implementation (including its use in the "LIVE555 Media Server" product) supports these requests, ***provided that*** the underlying file type can handle them.
In our current implementation, the following file types support this: - MP3 audio files - MPEG-1 or 2 audio/video Program Stream files (but not reliably) - MPEG Transport Stream files (provided that they each have an 'index file'; see the documentation) - WAV audio files For other file types (including MPEG-4 video files), our implementation currently does *not* support seeking. >http://www.myiptv.org/Articles/RTSP/tabid/72/Default.aspx >"The important bits of this command are Range and Scale. See I said >you would want to know the range. Range specifies from where and how >much of the content to play. 0- tells the server to start at the >beginning and play to the end but you could also start anywhere in >the file as we'll see in a minute or only play the first 5 minutes >of the content. It's up to you." You don't have to worry about the details of the RTSP protocol; we implement all of this for you. Just use our "RTSPClient" class, and pass the appropriate parameters (as noted above). -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Apr 1 17:08:23 2009 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 1 Apr 2009 17:08:23 -0700 Subject: [Live-devel] minor bug on windows implementation of gettimeofday() In-Reply-To: <200904011116.59137.patbob@imoveinc.com> References: <1238450328.5328.178.camel@oslab-l1> <200904011116.59137.patbob@imoveinc.com> Message-ID: >Over in GroupSockHelper.cpp is a windows implementation of gettimeofday(). >There are three implementations for it -- one for CE (?), one that uses >ftime() and another that uses QueryPerformanceCounter(). > >The version that uses QueryPerformanceCounter() is broken. With that >implementation, the epoch of the returned times is variant -- seems to be >last boot. gettimeofday() is supposed to return wall clock times with an >invariant epoch.
The ftime() implementation should be used on windows >instead, which one gets by compiling the library with the >USE_OLD_GETTIMEOFDAY_FOR_WINDOWS_CODE symbol defined. The "QueryPerformanceCounter()" code was introduced - in October 2006 - in response to an email by Dave Arnold, complaining that the old, "ftime()"-based implementation was insufficiently precise. See . For this reason, I don't want to go back to making the old implementation the default. Instead, I'd rather fix the current implementation, if we can. Unfortunately I'm not an expert on Windoze-specific API stuff. (Unfortunately Dave Arnold no longer appears to be on this mailing list.) Is there any other Windows function - other than "ftime()" - that returns a correct time, based on a fixed epoch? (If not, then the only possible fix I can think of would be to call "ftime()" just once - for the first call to "gettimeofday()" - to get a fixed 'epoch adjustment' that would be added to the subsequent times returned by "QueryPerformanceCounter()".) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From kidjan at gmail.com Wed Apr 1 18:52:58 2009 From: kidjan at gmail.com (Jeremy Noring) Date: Wed, 1 Apr 2009 18:52:58 -0700 Subject: [Live-devel] H.264 NAL confusion In-Reply-To: References: Message-ID: I'm still struggling to get this code to work correctly, and I'm sure I'm just doing something dumb (sorry, new to Live555 and H.264, and the available samples leave much to be desired). I've attached the code I've written/used (some of it is stuff I found previous posters had used). Once again, the video source is just a webcam. The encoder is a DirectShow implementation of x264, and the framer is something I put together. I think it's most likely that I'm doing something wrong in H264RTPSender.cpp, as this is the code I feel like I understand the least. Any advice is hugely appreciated at this point; it's been two weeks and I'm still not able to get any video in VLC, QuickTime, etc.
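For anyone debugging a framer like this: before blaming the RTP sender, it is worth checking that the framer actually delivers discrete NAL units rather than whole Annex B buffers with start codes still attached (x264 normally emits Annex B output with 3- and 4-byte start codes). Below is a minimal, self-contained sketch of start-code splitting; this is illustrative code, not taken from the attached files, and the byte values in the comment are invented:

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>
#include <utility>
#include <vector>

// Split an H.264 Annex B byte stream into NAL units by scanning for
// 3- or 4-byte start codes (00 00 01 / 00 00 00 01). Returns the
// (offset, length) of each NAL unit payload, with the start code excluded,
// which is what an RTP packetizer wants to see.
std::vector<std::pair<size_t, size_t>> splitAnnexB(const uint8_t* buf, size_t len) {
  std::vector<std::pair<size_t, size_t>> nals;
  size_t i = 0;
  size_t start = SIZE_MAX;  // offset of the current NAL payload, if any
  while (i + 2 < len) {
    if (buf[i] == 0 && buf[i + 1] == 0 && buf[i + 2] == 1) {
      // A preceding zero byte means this was a 4-byte start code.
      size_t scLen = (i > 0 && buf[i - 1] == 0) ? 4 : 3;
      size_t scStart = i + 3 - scLen;  // first byte of the start code
      if (start != SIZE_MAX) nals.emplace_back(start, scStart - start);
      start = i + 3;  // NAL payload begins right after 00 00 01
      i += 3;
    } else {
      ++i;
    }
  }
  if (start != SIZE_MAX && start < len) nals.emplace_back(start, len - start);
  return nals;
}
```

For example, a buffer holding `00 00 00 01 67 AA 00 00 01 68 BB` splits into two NAL units (payloads starting at offsets 4 and 9). Each of those payloads, not the whole buffer, is what gets handed to the RTP sender.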
-------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: H264RTPSender.cpp Type: application/octet-stream Size: 4066 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: x264EncodingThread.cpp Type: application/octet-stream Size: 7727 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: x264VideoStreamFramer.cpp Type: application/octet-stream Size: 3687 bytes Desc: not available URL: From sebastien-devel at celeos.eu Wed Apr 1 23:57:35 2009 From: sebastien-devel at celeos.eu (Sébastien Escudier) Date: Thu, 02 Apr 2009 08:57:35 +0200 Subject: [Live-devel] minor bug on windows implementation of gettimeofday() In-Reply-To: References: <1238450328.5328.178.camel@oslab-l1> <200904011116.59137.patbob@imoveinc.com> Message-ID: <1238655455.49d461df53066@imp.celeos.eu> Quoting Ross Finlayson : > Unfortunately I'm not an expert on Windoze-specific API stuff. You may have a look at vlc times functions : http://git.videolan.org/gitweb.cgi?p=vlc.git;a=blob;f=src/misc/mtime.c;h=0dbb4df578308b38e6e3ff9487b0e9143f11853b;hb=HEAD or for a direct access to the file : http://git.videolan.org/gitweb.cgi?p=vlc.git;a=blob_plain;f=src/misc/mtime.c;hb=HEAD If you need a high precision clock, with a random epoch, look at mdate.
If you need a constant epoch look at gettimeofday. From Andrew.P.McMahon at boeing.com Thu Apr 2 01:51:41 2009 From: Andrew.P.McMahon at boeing.com (I-McMahon, Andrew P) Date: Thu, 2 Apr 2009 10:51:41 +0200 Subject: [Live-devel] minor bug on windows implementation of gettimeofday() In-Reply-To: <1238655455.49d461df53066@imp.celeos.eu> References: <1238450328.5328.178.camel@oslab-l1><200904011116.59137.patbob@imoveinc.com> <1238655455.49d461df53066@imp.celeos.eu> Message-ID: <24F6CF60E435AD4F88E444DBCC8261E818874B@XCH-EU-1V1.eu.nos.boeing.com> A suggestion for this could be to initialise using time() to get UTC time, then use QueryPerformanceFrequency/QueryPerformanceCounter from then on, like this: int gettimeofday(struct timeval* tp, int* /*tz*/) { static bool tickFrequencySet = false; static unsigned __int64 tickFrequency = 0; static unsigned __int64 timeOffset = 0; unsigned __int64 tickNow; if (tickFrequencySet == false) { QueryPerformanceFrequency(reinterpret_cast<LARGE_INTEGER*>(&tickFrequency)); time_t t; time(&t); QueryPerformanceCounter(reinterpret_cast<LARGE_INTEGER*>(&tickNow)); tickNow /= tickFrequency; timeOffset = t - tickNow; tickFrequencySet = true; } QueryPerformanceCounter(reinterpret_cast<LARGE_INTEGER*>(&tickNow)); tp->tv_sec = static_cast<long>((tickNow / tickFrequency) + timeOffset); tp->tv_usec = static_cast<long>(((tickNow % tickFrequency) * 1000000L) / tickFrequency); return 0; } (better implementations of this are probably available ;)) Andy M -----Original Message----- From: Sébastien Escudier [mailto:sebastien-devel at celeos.eu] Sent: 2009-04-02 07:58 To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] minor bug on windows implementation of gettimeofday() Quoting Ross Finlayson : > Unfortunately I'm not an expert on Windoze-specific API stuff.
You may have a look at vlc's time functions: http://git.videolan.org/gitweb.cgi?p=vlc.git;a=blob;f=src/misc/mtime.c;h=0dbb4df578308b38e6e3ff9487b0e9143f11853b;hb=HEAD or, for direct access to the file: http://git.videolan.org/gitweb.cgi?p=vlc.git;a=blob_plain;f=src/misc/mtime.c;hb=HEAD If you need a high precision clock, with a random epoch, look at mdate. If you need a constant epoch, look at gettimeofday. _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel From finlayson at live555.com Thu Apr 2 01:47:57 2009 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 2 Apr 2009 01:47:57 -0700 Subject: [Live-devel] teardownMediaSession In-Reply-To: <1238588403.49d35bf3b2621@imp.celeos.eu> References: <1238588403.49d35bf3b2621@imp.celeos.eu> Message-ID: >I would like to know why we need to wait for an answer in >teardownMediaSession? In principle the RTSP "TEARDOWN" command might not succeed - e.g., because of permission problems. In practice, though, this is unlikely, I agree. The next (completely revamped) version of "RTSPClient" will send each command asynchronously (i.e., without blocking), and use the event loop to handle the reply, if desired. For "TEARDOWN", the programmer probably won't bother handling the reply. >Would you have any suggestions? Wait for my upcoming update to "RTSPClient". -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From Alan.Roberts at e2v.com Thu Apr 2 01:59:57 2009 From: Alan.Roberts at e2v.com (Roberts, Alan) Date: Thu, 2 Apr 2009 09:59:57 +0100 Subject: [Live-devel] teardownMediaSession Message-ID: <8821BCD7410B064BA4414E4F8080E5EB043A2D51@whl46.e2v.com> I have the same problem, Sébastien. During this 30 seconds I can't write to or unmount the SD card that I'm recording video onto. If I power down during this 30 seconds the current clip that I'm recording becomes unplayable...
If, however, I wait for 30 seconds all is fine. The problem is I can't wait for more than a couple of seconds in my application. -----Original Message----- From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Sébastien Escudier Sent: 01 April 2009 13:20 To: live-devel at ns.live555.com Subject: [Live-devel] teardownMediaSession Hi, I would like to know why we need to wait for an answer in teardownMediaSession. My issue is: when a server stops sending me data, I close the client. But before that I send a teardownMediaSession, just in case the server or the connection is not really dead. But if the server or the connection IS dead, I have to wait 30 seconds for the function teardownMediaSession to return. And I don't need to know whether the server received the teardown or not. Would you have any suggestions? Thanks. _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel ______________________________________________________________________ This email has been scanned by the MessageLabs Email Security System. For more information please visit http://www.messagelabs.com/email ______________________________________________________________________ Sent by a member of the e2v group of companies. The parent company, e2v technologies plc, is registered in England and Wales. Company number; 04439718. Registered address; 106 Waterhouse Lane, Chelmsford, Essex, CM1 2QU, UK. This email and any attachments are confidential and meant solely for the use of the intended recipient. If you are not the intended recipient and have received this email in error, please notify us immediately by replying to the sender and then deleting this copy and the reply from your system without further disclosing, copying, distributing or using the e-mail or any attachment. Thank you for your cooperation.
From stas at tech-mer.com Thu Apr 2 06:41:43 2009 From: stas at tech-mer.com (Stas Desyatnlkov) Date: Thu, 2 Apr 2009 16:41:43 +0300 Subject: [Live-devel] H.264 + AAC + data in TS Message-ID: <21E398286732DC49AD45BE8C7BE96C07357A72D09C@fs11.mertree.mer.co.il> Hi All, I need to create a streamer for my embedded solution running Linux. The streamer receives H.264 video and AAC audio packets from hardware encoders and also a stream of some telemetry data. It should pack all of it into an MPEG2 TS and send it over the LAN (no QoS needed). Questions: 1) Is there a good example for muxing elementary A/V streams and sending it as a single TS? 2) Does the live library support muxing of custom data streams? Regards, Stas -------------- next part -------------- An HTML attachment was scrubbed... URL: From patbob at imoveinc.com Thu Apr 2 10:32:04 2009 From: patbob at imoveinc.com (Patrick White) Date: Thu, 2 Apr 2009 10:32:04 -0700 Subject: [Live-devel] minor bug on windows implementation of gettimeofday() In-Reply-To: <24F6CF60E435AD4F88E444DBCC8261E818874B@XCH-EU-1V1.eu.nos.boeing.com> References: <1238450328.5328.178.camel@oslab-l1> <1238655455.49d461df53066@imp.celeos.eu> <24F6CF60E435AD4F88E444DBCC8261E818874B@XCH-EU-1V1.eu.nos.boeing.com> Message-ID: <200904021032.04817.patbob@imoveinc.com> This will work, but time() only has a resolution of 1 second, so the times reported by gettimeofday() will be consistently off by up to 1 full second from what they should really be. If you do the same thing with ftime(), you'll be consistently off by somewhere in the 10s of milliseconds.
It doesn't really matter, but if we go with the time() fix, someone down the road will probably notice the RR packet last-received times are off by a second and we'll be fixing it all over again :) I've got other bugs to finish swatting today, but I'll post a proposed fix that uses ftime when I can... if it's still relevant and needed by then. patbob On Thursday 02 April 2009 1:51 am, I-McMahon, Andrew P wrote:
> A suggestion for this could be to initialise using time() to get UTC time,
> then use QueryPerformanceFrequency/QueryPerformanceCounter from then on,
> like this:
>
> int gettimeofday(struct timeval* tp, int* /*tz*/) {
>   static bool tickFrequencySet = false;
>   static unsigned __int64 tickFrequency = 0;
>   static unsigned __int64 timeOffset = 0;
>
>   unsigned __int64 tickNow;
>
>   if (tickFrequencySet == false) {
>     QueryPerformanceFrequency(reinterpret_cast<LARGE_INTEGER*>(&tickFrequency));
>
>     time_t t;
>     time(&t);
>
>     QueryPerformanceCounter(reinterpret_cast<LARGE_INTEGER*>(&tickNow));
>     tickNow /= tickFrequency;
>     timeOffset = t - tickNow;
>
>     tickFrequencySet = true;
>   }
>
>   QueryPerformanceCounter(reinterpret_cast<LARGE_INTEGER*>(&tickNow));
>   tp->tv_sec = static_cast<long>((tickNow / tickFrequency) + timeOffset);
>   tp->tv_usec = static_cast<long>(((tickNow % tickFrequency) * 1000000L) / tickFrequency);
>
>   return 0;
> }
>
> (better implementations of this are probably available ;))
>
> Andy M
>
> -----Original Message----- > From: Sébastien Escudier [mailto:sebastien-devel at celeos.eu] > Sent: 2009-04-02 07:58 > To: LIVE555 Streaming Media - development & use > Subject: Re: [Live-devel] minor bug on windows implementation of gettimeofday() > > Quoting Ross Finlayson : > > Unfortunately I'm not an expert on Windoze-specific API stuff.
> > You may have a look at vlc times functions : > > http://git.videolan.org/gitweb.cgi?p=vlc.git;a=blob;f=src/misc/mtime.c;h=0dbb4df578308b38e6e3ff9487b0e9143f11853b;hb=HEAD > > or for a direct access to the file : > > http://git.videolan.org/gitweb.cgi?p=vlc.git;a=blob_plain;f=src/misc/mtime.c;hb=HEAD > > If you need a high precision clock, with a random epoch, look at mdate. > If you need a constant epoch look at gettimeofday > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel From alberto.pretto at dei.unipd.it Thu Apr 2 07:23:54 2009 From: alberto.pretto at dei.unipd.it (Alberto Pretto) Date: Thu, 2 Apr 2009 16:23:54 +0200 Subject: [Live-devel] Suggestions for uncompressed Video streaming and multiple payload format Message-ID: <200904021623.54139.alberto.pretto@dei.unipd.it> Hi, I'm using the liveMedia library for streaming from a Flir thermocamera (A320) that can stream MPEG4 format, uncompressed video and a proprietary format (FCAM). For MPEG4 streaming, I used liveMedia with the ffmpeg libraries, without any problems. I also needed to stream the uncompressed video (Y160, i.e. 1-channel images with 16 bits of pixel depth). The liveMedia library doesn't support this type of streaming, but by initiating the subsession with an offset of 14 ( subsession->initiate(14) ), which is the length of the extended RFC 4175 payload header, it is possible to stream this type of payload by setting the bool useSpecialRTPoffset to True inside the MediaSession.cpp source, before the SimpleRTPSource object is created. I suggest that the liveMedia developers modify the function Boolean MediaSubsession::initiate(int useSpecialRTPoffset) to let the caller set the useSpecialRTPoffset boolean.
Another small issue: the Flir A320 can stream various payload formats; here is the SDP:

v=0
o=- 0 0 IN IP4 192.168.200.199
s=IR stream
i=Live infrared
t=now-
c=IN IP4 192.168.200.199
m=video 10100 RTP/AVP 96 97 98 100 101 103 104
a=control:rtsp://192.168.200.199/sid=96
a=framerate:30
a=rtpmap:96 MP4V-ES/90000
a=framesize:96 640-480
a=fmtp:96 profile-level-id=5;config=000001B005000001B509000001010000012002045D4C28A021E0A4C7
a=rtpmap:97 MP4V-ES/90000
a=framesize:97 320-240
a=fmtp:97 profile-level-id=5;config=000001B005000001B509000001010000012002045D4C285020F0A4C7
a=rtpmap:98 MP4V-ES/90000
a=framesize:98 160-128
a=fmtp:98 profile-level-id=5;config=000001B005000001B509000001010000012002045D4C28282080A4C7
a=rtpmap:100 FCAM/90000
a=framesize:100 320-240
a=fmtp:100 sampling=mono; width=320; height=240; depth=16
a=rtpmap:101 FCAM/90000
a=framesize:101 160-120
a=fmtp:101 sampling=mono; width=160; height=120; depth=16
a=rtpmap:103 raw/90000
a=framesize:103 320-240
a=fmtp:103 sampling=mono; width=320; height=240; depth=16
a=rtpmap:104 raw/90000
a=framesize:104 160-120
a=fmtp:104 sampling=mono; width=160; height=120; depth=16

liveMedia picks up only the first payload format, but it is possible to work around this limitation by calling the initializeWithSDP function with a "modified" SDP; e.g., to select the raw 320x240 format I call initializeWithSDP with this hard-coded SDP:

v=0
o=- 0 0 IN IP4 192.168.200.199
s=IR stream
i=Live infrared
t=now-
c=IN IP4 192.168.200.199
m=video 10040 RTP/AVP 103
a=control:rtsp://192.168.200.199/sid=103
a=framerate:30
a=rtpmap:103 raw/90000
a=framesize:103 320-240
a=fmtp:103 sampling=mono; width=320; height=240; depth=16

Thanks a lot!
Alberto From smalenfant at gmail.com Thu Apr 2 08:18:43 2009 From: smalenfant at gmail.com (Steve Malenfant) Date: Thu, 2 Apr 2009 11:18:43 -0400 Subject: [Live-devel] Implement RTSP skip forward in mediaServer In-Reply-To: References: <9a33fb600904011310y12d9eeecgb4025b4522bd44a7@mail.gmail.com> Message-ID: <9a33fb600904020818m22f04977j8f2997b89c70a7d9@mail.gmail.com> I can't use the live555 client, I'm using an Amino STB. My first test failed (maybe I used the wrong stream with no trick play), but my second test just worked. Thanks. On Wed, Apr 1, 2009 at 5:21 PM, Ross Finlayson wrote: > Seems like there is an option in RTSP to specify a range in time that you > want to start your stream. > > > Yes, and we support it, for both RTSP clients and RTSP servers. > > For RTSP clients: Note the "start" and "end" parameters to "RTSPClient:: > playMediaSession()" and "RTSPClient:: playMediaSubsession()". These > parameters (if set to non-default values) tell the RTSP client to request a > specific time range. > > For RTSP servers: Our RTSP server implementation (including its use in the > "LIVE555 Media Server" product) supports these requests, ***provided that*** > the underlying file type can handle them. In our current implementation, > the following file types support this: > - MP3 audio files > - MPEG-1or 2 audio/video Program Stream files (but not reliably) > - MPEG Transport Stream files (provided that they each have an 'index > file'; see the documentation) > - WAV audio files > > For other file types (including MPEG-4 video files), our implementation > currently does *not* support seeking. > > http://www.myiptv.org/Articles/RTSP/tabid/72/Default.aspx > "The important bits of this command are Range and Scale. See I said you > would want to know the range. Range specifies from where and how much of the > content to play. 
0- tells the server to start at the beginning and play to > the end but you could also start anywhere in the file as we'll see in a > minute or only play the first 5 minutes of the content. It's up to you." > > > You don't have to worry about the details of the RTSP protocol; we > implement all of this for you. Just use our "RTSPClient" class, and pass > the appropriate parameters (as noted above). > > -- > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From debargha.mukherjee at hp.com Thu Apr 2 10:01:50 2009 From: debargha.mukherjee at hp.com (Mukherjee, Debargha) Date: Thu, 2 Apr 2009 17:01:50 +0000 Subject: [Live-devel] H.264 NAL confusion In-Reply-To: References: Message-ID: <73833378E80044458EC175FF8C1E63D57246FD3060@GVW0433EXB.americas.hpqcorp.net> Hi Jeremy, I am currently trying to attack a similar problem with the ffmpeg encoder (which I believe also uses X264). While I am not an expert on live either, from reading the archives this is what I have understood so far, and this is what I would try next. You need to send NAL units to your H264VideoRTPSink. Is the X264 encoder you are using creating a single NAL unit per video frame encoded? If not, you need to parse the bit-stream that you get from your encoder, extract NAL units and send them one by one to your sink. You may check for NAL units by checking for a 3 byte start code 0x000001, and then your NAL unit would begin from the next byte - until the next 0x000001 is encountered. Am not sure if this mechanism would work, but is worth a try. 
If there are multiple NAL units in a frame, you should probably set an internal parameter, say currentNALUnitEndsAccessUnit_var, to be TRUE only for the last NAL unit in the frame and FALSE for the others. Your currentNALUnitEndsAccessUnit() function should just return this variable currentNALUnitEndsAccessUnit_var. I would also set the fDurationInMicroseconds parameter to 0 for all NAL units sent to the sink except the last. For the last, just use the actual duration of the frame. Also, I would make the presentation time the same for all NAL units in the frame. In order to write an x264VideoStreamFramer class, I would also model it on the MPEG4VideoStreamDiscreteFramer class, and then add an appropriate parsing mechanism to extract NAL units from the encoded frame bit-stream received. Best Regards, Debargha. ________________________________ From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Jeremy Noring Sent: Wednesday, April 01, 2009 6:53 PM To: live-devel at ns.live555.com Subject: Re: [Live-devel] H.264 NAL confusion I'm still struggling to get this code to work correctly, and I'm sure I'm just doing something dumb (sorry, new to Live555 and H.264, and the available samples leave much to be desired). I've attached the code I've written/used (some of it is stuff I found previous posters had used). Once again, the video source is just a webcam. The encoder is a DirectShow implementation of x264, and the framer is something I put together. I think it's most likely that I'm doing something wrong in H264RTPSender.cpp, as this is the code I feel like I understand the least. Any advice is hugely appreciated at this point; it's been two weeks and I'm still not able to get any video in VLC, QuickTime, etc. -------------- next part -------------- An HTML attachment was scrubbed...
URL: From spider.karma+live555.com at gmail.com Thu Apr 2 10:06:52 2009 From: spider.karma+live555.com at gmail.com (Gordon Smith) Date: Thu, 2 Apr 2009 11:06:52 -0600 Subject: [Live-devel] Read from device file problem In-Reply-To: <2df568dc0903311310h1aa1c3bcnba70f1a03d0417a9@mail.gmail.com> References: <2df568dc0903311310h1aa1c3bcnba70f1a03d0417a9@mail.gmail.com> Message-ID: <2df568dc0904021006k864a52drd1c068a0dca27dc0@mail.gmail.com> On Tue, Mar 31, 2009 at 2:10 PM, Gordon Smith wrote: > > Hello - > > After modification for input of filename, testMPEG2TransportStreamer > can read from pipe, but not read directly from device file. > The fread() in ByteStreamFileSource returns EIO and reads zero bytes > for direct read. > > Device file /dev/video2 is saa7134-empress device on Linux debian > 2.6.26-1-686, live-2009-03-22, latest v4l-dvb modules. > > Reading /dev/video2 directly results in EIO and zero bytes read. > > $ sudo ./testMPEG2TransportStreamer /dev/video2 FYI, I modified testMPEG2TransportStreamer, ByteStreamFileSource, FramedFileSource and InputFile and was able to read a v4l2 device file. The modifications did the following:
- Use low-level I/O: open, read, etc. in place of fread, etc.
- Use select (with timeout) before the call to read
- Ignore EIO (2 or 3 occur at startup but shouldn't)
- Remove the file seek
> FYI, using cat piped to testMPEG2TransportStreamer starts well, but > begins to continuously lose data after about 2 to 4 minutes. > Using setvbuf to increase buffer to 32k or 64k did not help. > > $ cat /dev/video2 | sudo ./testMPEG2TransportStreamer stdin > I still get data loss after a few minutes with direct read, but that may be a problem with the device module.
- Gordon From smalenfant at gmail.com Thu Apr 2 12:52:10 2009 From: smalenfant at gmail.com (Steve Malenfant) Date: Thu, 2 Apr 2009 15:52:10 -0400 Subject: [Live-devel] Implement RTSP skip forward in mediaServer In-Reply-To: <9a33fb600904020818m22f04977j8f2997b89c70a7d9@mail.gmail.com> References: <9a33fb600904011310y12d9eeecgb4025b4522bd44a7@mail.gmail.com> <9a33fb600904020818m22f04977j8f2997b89c70a7d9@mail.gmail.com> Message-ID: <9a33fb600904021252s2900e832xdf338175b501d270@mail.gmail.com> Is there a possible bug where, after you request a start time higher than 3570 seconds, it does not work on transport stream files? It always brings me back to the same scene. Transport file length : 20177187044 (20GB) Index file length : 1151033004 (1.1 GB) Thanks. On Thu, Apr 2, 2009 at 11:18 AM, Steve Malenfant wrote: > I can't use the live555 client, I'm using an Amino STB. My first test > failed (maybe I used the wrong stream with no trick play), but my second > test just worked. > > Thanks. > > > On Wed, Apr 1, 2009 at 5:21 PM, Ross Finlayson wrote: > >> Seems like there is an option in RTSP to specify a range in time that >> you want to start your stream. >> >> >> Yes, and we support it, for both RTSP clients and RTSP servers. >> >> For RTSP clients: Note the "start" and "end" parameters to "RTSPClient:: >> playMediaSession()" and "RTSPClient:: playMediaSubsession()". These >> parameters (if set to non-default values) tell the RTSP client to request a >> specific time range. >> >> For RTSP servers: Our RTSP server implementation (including its use in the >> "LIVE555 Media Server" product) supports these requests, ***provided that*** >> the underlying file type can handle them.
In our current implementation, >> the following file types support this: >> - MP3 audio files >> - MPEG-1or 2 audio/video Program Stream files (but not reliably) >> - MPEG Transport Stream files (provided that they each have an 'index >> file'; see the documentation) >> - WAV audio files >> >> For other file types (including MPEG-4 video files), our implementation >> currently does *not* support seeking. >> >> http://www.myiptv.org/Articles/RTSP/tabid/72/Default.aspx >> "The important bits of this command are Range and Scale. See I said you >> would want to know the range. Range specifies from where and how much of the >> content to play. 0- tells the server to start at the beginning and play to >> the end but you could also start anywhere in the file as we'll see in a >> minute or only play the first 5 minutes of the content. It's up to you." >> >> >> You don't have to worry about the details of the RTSP protocol; we >> implement all of this for you. Just use our "RTSPClient" class, and pass >> the appropriate parameters (as noted above). >> >> -- >> >> >> Ross Finlayson >> Live Networks, Inc. >> http://www.live555.com/ >> >> _______________________________________________ >> live-devel mailing list >> live-devel at lists.live555.com >> http://lists.live555.com/mailman/listinfo/live-devel >> >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From yorksun at freescale.com Thu Apr 2 14:53:11 2009 From: yorksun at freescale.com (York Sun) Date: Thu, 02 Apr 2009 16:53:11 -0500 Subject: [Live-devel] FramedSource and MediaSink Message-ID: <1238709191.27465.27.camel@oslab-l1> Live experts, I am still learning... I have read http://lists.live555.com/pipermail/live-devel/2008-October/009620.html and http://www.live555.com/liveMedia/faq.html#liveInput-unicast Can I hook up any FramedSource to any MediaSink? Specifically, can I use the first part of openRTSP to get subsession->readSource() and put it to a BasicUDPSink? 
Can I put it into more than one sink, such as >FileSink? I am trying to get multiple rtsp video streams (from cameras) to record on files, and host an on-demand rtsp server as well as a live rtsp server. Any advice is appreciated. York From neville.bradbury at opensoftaustralia.com.au Thu Apr 2 15:53:27 2009 From: neville.bradbury at opensoftaustralia.com.au (neville bradbury) Date: Fri, 3 Apr 2009 09:53:27 +1100 Subject: [Live-devel] capturing a live555 stream to a usb hard disk for playing later Message-ID: <200904022253.n32MrXg5063314@ns.live555.com> Hi, I was wondering what is the easiest way to capture a stream to a USB hard drive that would simulate a PVR device. I would assume we could then use VLC on an STB to point to this file and run it locally. Regards, Neville Bradbury OpenSoft Australia Office +61 2 99395260 Mob +61 447405948 Address: Suite 403 97 Pacific Hwy North Sydney Web http://www.opensoftaustralia.com.au "Integrating the Future" -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Apr 2 21:39:39 2009 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 2 Apr 2009 21:39:39 -0700 Subject: [Live-devel] FramedSource and MediaSink In-Reply-To: <1238709191.27465.27.camel@oslab-l1> References: <1238709191.27465.27.camel@oslab-l1> Message-ID: >Can I hook up any FramedSource to any MediaSink? In most cases, yes. (However, there are exceptions. For example, only a "H264VideoStreamFramer" (or a subclass) can be fed into a "H264VideoRTPSink".) > Specifically, can I use >the first part of openRTSP to get subsession->readSource() and put it to >a BasicUDPSink? Yes. > Can I put it into more than one sink, such as >FileSink? No. A "FramedSource" can be fed into only one "MediaSink" at a time. -- Ross Finlayson Live Networks, Inc.
http://www.live555.com/ From finlayson at live555.com Thu Apr 2 21:50:50 2009 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 2 Apr 2009 21:50:50 -0700 Subject: [Live-devel] H.264 + AAC + data in TS In-Reply-To: <21E398286732DC49AD45BE8C7BE96C07357A72D09C@fs11.mertree.mer.co.il> References: <21E398286732DC49AD45BE8C7BE96C07357A72D09C@fs11.mertree.mer.co.il> Message-ID: >1. Is there a good example for muxing elementary A/V streams and >sending it as a single TS? Yes, see the "wis-streamer" source code, and note how we implement the "-mpegtransport" option. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From smalenfant at gmail.com Thu Apr 2 14:51:55 2009 From: smalenfant at gmail.com (Steve Malenfant) Date: Thu, 2 Apr 2009 17:51:55 -0400 Subject: [Live-devel] Implement RTSP skip forward in mediaServer In-Reply-To: <9a33fb600904021442x3a3dc2d6k5773f47f6fa29954@mail.gmail.com> References: <9a33fb600904011310y12d9eeecgb4025b4522bd44a7@mail.gmail.com> <9a33fb600904020818m22f04977j8f2997b89c70a7d9@mail.gmail.com> <9a33fb600904021252s2900e832xdf338175b501d270@mail.gmail.com> <9a33fb600904021442x3a3dc2d6k5773f47f6fa29954@mail.gmail.com> Message-ID: <9a33fb600904021451r62287a11rfe554b8d9fc1655a@mail.gmail.com> I guess it's related to PCR rolling over within that file....
[steve at CentosP4 readable]$ head -10000 1431_20090215193000.ts | dvbsnoop -if - -s ts -tssubdecode | grep clock
program_clock_reference: ==> program_clock_reference: 2480569434945 (0x2418d75c341) [= PCR-Timestamp: 25:31:12.942035]
program_clock_reference: ==> program_clock_reference: 2480570988683 (0x2418d8d788b) [= PCR-Timestamp: 25:31:12.999580]
program_clock_reference: ==> program_clock_reference: 2480571369790 (0x2418d93493e) [= PCR-Timestamp: 25:31:13.013695]
[steve at CentosP4 readable]$ tail -10000 1431_20090215193000.ts | dvbsnoop -if - -s ts -tssubdecode | grep clock
program_clock_reference: ==> program_clock_reference: 146458110601 (0x2219956a89) [= PCR-Timestamp: 1:30:24.374466]
program_clock_reference: ==> program_clock_reference: 146458843495 (0x2219a09967) [= PCR-Timestamp: 1:30:24.401610]
program_clock_reference: ==> program_clock_reference: 146459538699 (0x2219ab350b) [= PCR-Timestamp: 1:30:24.427359]
On Thu, Apr 2, 2009 at 5:42 PM, Steve Malenfant wrote: > Here are some more details to make it obvious: > > [steve at CentosP4 readable]$ > /home/steve/Desktop/testMPEG2TransportStreamTrickPlay 1431_20090215193000.ts > 3600 1 test2.ts > Writing output file "test2.ts" (start time 3570.539062, scale 1)... > [steve at CentosP4 readable]$ > /home/steve/Desktop/testMPEG2TransportStreamTrickPlay 1431_20090215193000.ts > 4600 1 test2.ts > Writing output file "test2.ts" (start time 3570.539062, scale 1)... > > They both start at 3570.539062 start time. > > On Thu, Apr 2, 2009 at 3:52 PM, Steve Malenfant wrote: >> Is there a possible bug that after you request something higher than 3570 >> seconds it is not working on transport stream files? It always brings me >> back to the same scene. >> >> Transport file length : 20177187044 (20GB) >> Index file length : 1151033004 (1.1 GB) >> >> Thanks. >> >> >> On Thu, Apr 2, 2009 at 11:18 AM, Steve Malenfant wrote: >>> I can't use the live555 client, I'm using an Amino STB.
My first test >>> failed (maybe I used the wrong stream with no trick play), but my second >>> test just worked. >>> >>> Thanks. >>> >>> >>> On Wed, Apr 1, 2009 at 5:21 PM, Ross Finlayson wrote: >>> >>>> Seems like there is an option in RTSP to specify a range in time that >>>> you want to start your stream. >>>> >>>> >>>> Yes, and we support it, for both RTSP clients and RTSP servers. >>>> >>>> For RTSP clients: Note the "start" and "end" parameters to "RTSPClient:: >>>> playMediaSession()" and "RTSPClient:: playMediaSubsession()". These >>>> parameters (if set to non-default values) tell the RTSP client to request a >>>> specific time range. >>>> >>>> For RTSP servers: Our RTSP server implementation (including its use in >>>> the "LIVE555 Media Server" product) supports these requests, ***provided >>>> that*** the underlying file type can handle them. In our current >>>> implementation, the following file types support this: >>>> - MP3 audio files >>>> - MPEG-1or 2 audio/video Program Stream files (but not reliably) >>>> - MPEG Transport Stream files (provided that they each have an 'index >>>> file'; see the documentation) >>>> - WAV audio files >>>> >>>> For other file types (including MPEG-4 video files), our implementation >>>> currently does *not* support seeking. >>>> >>>> http://www.myiptv.org/Articles/RTSP/tabid/72/Default.aspx >>>> "The important bits of this command are Range and Scale. See I said you >>>> would want to know the range. Range specifies from where and how much of the >>>> content to play. 0- tells the server to start at the beginning and play to >>>> the end but you could also start anywhere in the file as we'll see in a >>>> minute or only play the first 5 minutes of the content. It's up to you." >>>> >>>> >>>> You don't have to worry about the details of the RTSP protocol; we >>>> implement all of this for you. Just use our "RTSPClient" class, and pass >>>> the appropriate parameters (as noted above). 
>>>> >>>> -- >>>> >>>> >>>> Ross Finlayson >>>> Live Networks, Inc. >>>> http://www.live555.com/ >>>> >>>> _______________________________________________ >>>> live-devel mailing list >>>> live-devel at lists.live555.com >>>> http://lists.live555.com/mailman/listinfo/live-devel >>>> >>>> >>> >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From spider.karma+live555.com at gmail.com Thu Apr 2 16:42:01 2009 From: spider.karma+live555.com at gmail.com (Gordon Smith) Date: Thu, 2 Apr 2009 17:42:01 -0600 Subject: [Live-devel] Read from device file problem In-Reply-To: <2df568dc0904021006k864a52drd1c068a0dca27dc0@mail.gmail.com> References: <2df568dc0903311310h1aa1c3bcnba70f1a03d0417a9@mail.gmail.com> <2df568dc0904021006k864a52drd1c068a0dca27dc0@mail.gmail.com> Message-ID: <2df568dc0904021642l2130973bp75cb7e06ef840e5b@mail.gmail.com> > $ sudo ./testMPEG2TransportStreamer /dev/video2 > > I still get data loss after a few minutes with direct read, but that > may be a problem with the device module. I now see that overflow occurs in the encoder device module (zero free buffers). I see in ByteStreamFileSource.cpp that "fPreferredFrameSize" fits an integral number of MPEG-2 TS packets into the MTU. If I remove the following in ByteStreamFileSource.cpp:

// Try to read as many bytes as will fit in the buffer provided
// (or "fPreferredFrameSize" if less)
if (fPreferredFrameSize > 0 && fPreferredFrameSize < fMaxSize) {
  fMaxSize = fPreferredFrameSize;
}

and read as much data as is available, 12032 bytes (64 * 188) are read at a time instead of the usual 1316 (7 * 188). Would a larger MTU likely help? Would that be moving into new territory codewise? Thanks for any pointers, Gordon From workzh at hotmail.com Thu Apr 2 20:23:31 2009 From: workzh at hotmail.com (zhanglu) Date: Fri, 3 Apr 2009 03:23:31 +0000 Subject: [Live-devel] About "Live555/DirectShow Source Filter for PCM audio: source code" Message-ID: Mr.
Ralf Globisch: I'm new to live555, and I'm now in a project where I encountered some problems relating to live555 and DShow integration. Fortunately I read your message at http://lists.live555.com/pipermail/live-devel/2008-December/009914.html , but I cannot find the source to download. It would be kind of you to send me a copy of the code, the "simple directshow rtsp source filter using live555 library". Thanks very much! Bruce 09.04 -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Apr 2 22:20:04 2009 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 2 Apr 2009 22:20:04 -0700 Subject: [Live-devel] Implement RTSP skip forward in mediaServer In-Reply-To: <9a33fb600904021451r62287a11rfe554b8d9fc1655a@mail.gmail.com> References: <9a33fb600904011310y12d9eeecgb4025b4522bd44a7@mail.gmail.com> <9a33fb600904020818m22f04977j8f2997b89c70a7d9@mail.gmail.com> <9a33fb600904021252s2900e832xdf338175b501d270@mail.gmail.com> <9a33fb600904021442x3a3dc2d6k5773f47f6fa29954@mail.gmail.com> <9a33fb600904021451r62287a11rfe554b8d9fc1655a@mail.gmail.com> Message-ID: >I guess it's related to PCR rolling over within that file.... Yes. The MPEG Transport Stream file indexing mechanism requires that the PCRs be increasing throughout the file. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Thu Apr 2 22:23:53 2009 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 2 Apr 2009 22:23:53 -0700 Subject: [Live-devel] Suggestions for uncompressed Video streaming and multiple payload format In-Reply-To: <200904021623.54139.alberto.pretto@dei.unipd.it> References: <200904021623.54139.alberto.pretto@dei.unipd.it> Message-ID: No, the right solution is to properly implement the RTP payload format described in RFC 4175.
This will require implementing: 1/ a new subclass of "MultiFramedRTPSink" - e.g., called "UncompressedVideoRTPSink" - for transmitting RTP packets of this payload format, and 2/ a new subclass of "MultiFramedRTPSource" - e.g., called "UncompressedVideoRTPSource" - for receiving RTP packets of this payload format. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Thu Apr 2 22:30:59 2009 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 2 Apr 2009 22:30:59 -0700 Subject: [Live-devel] Read from device file problem In-Reply-To: <2df568dc0904021642l2130973bp75cb7e06ef840e5b@mail.gmail.com> References: <2df568dc0903311310h1aa1c3bcnba70f1a03d0417a9@mail.gmail.com> <2df568dc0904021006k864a52drd1c068a0dca27dc0@mail.gmail.com> <2df568dc0904021642l2130973bp75cb7e06ef840e5b@mail.gmail.com> Message-ID: >Would a larger MTU likely help? No, this is a bad idea (and won't solve your problem anyway, if your input source is really generating data too fast). If the MTU is too large, you'll get IP-level fragmentation, increasing the chances of unrecoverable packet loss. Our "MPEG2TransportStreamFramer" class automatically paces the outgoing Transport Stream data properly, based on its PCR timestamps. If your input source is generating data too fast for this, then perhaps its PCR timestamps are wrong? -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From slaine at slaine.org Fri Apr 3 02:13:02 2009 From: slaine at slaine.org (Glen Gray) Date: Fri, 3 Apr 2009 10:13:02 +0100 Subject: [Live-devel] Compiler warnings Message-ID: <1E8CA7AD-3108-4240-AE14-7FC3D6B90C68@slaine.org> Hey Guys, Working on an idea for a project, I compiled up live on my Fedora 10 box, with gcc version 4.3.2. Noticed a ton of warnings such as ... "warning: deprecated conversion from string constant to 'char*'" This is from passing a "string" to a function that's expecting a char*.
Can be cleaned up in 2 ways 1) "Fix" function prototypes from "char *" to "const char *" as appropriate 2) Add -Wno-write-strings to the CFLAGS in the Makefiles to suppress the warnings. Which would be deemed suitable, Ross? -- Glen Gray slaine at slaine.org From slaine at slaine.org Fri Apr 3 06:46:01 2009 From: slaine at slaine.org (Glen Gray) Date: Fri, 3 Apr 2009 14:46:01 +0100 Subject: [Live-devel] Compiler warnings In-Reply-To: <1E8CA7AD-3108-4240-AE14-7FC3D6B90C68@slaine.org> References: <1E8CA7AD-3108-4240-AE14-7FC3D6B90C68@slaine.org> Message-ID: Looking back at the archives, I see that Erik Hovland submitted a big patch for something similar. On 3 Apr 2009, at 10:13, Glen Gray wrote: > Hey Guys, > > Working on an idea for a project and compiled up live on my Fedora 10 > box, with gcc version 4.3.2 > > Noticed a ton of warnings such as ... > "warning: deprecated conversion from string constant to 'char*'" > > This is from passing a "string" to a function that's expecting a > char*. Can be cleaned up in 2 ways > > 1) "Fix" function prototypes from "char *" to "const char *" as > appropriate > 2) Add -Wno-write-strings to the CFLAGS in the Makefiles to suppress > the warnings. > > Which would be deemed suitable, Ross? > -- > Glen Gray > slaine at slaine.org > > > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -- Glen Gray slaine at slaine.org From finlayson at live555.com Fri Apr 3 08:06:42 2009 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 3 Apr 2009 08:06:42 -0700 Subject: [Live-devel] Compiler warnings In-Reply-To: References: <1E8CA7AD-3108-4240-AE14-7FC3D6B90C68@slaine.org> Message-ID: >>Working on an idea for a project and compiled up live on my Fedora >>10 box, with gcc version 4.3.2 >> >>Noticed a ton of warnings such as ...
>>"warning: deprecated conversion from string constant to 'char*'" >> >>This is from passing a "string" to a function that's expecting a >>char*. Can be cleaned up in 2 ways >> >>1) "Fix" function prototypes from "char *" to "const char *" as appropriate >>2) Add -Wno-write-strings to the CFLAGS in the Makefiles to >>suppress the warnings. >> >>Which would be deemed suitable, Ross? Definitely 1). If a string constant is being passed to a function, then that function's parameter should be declared "char const*". >Looking back at the archives, I see that Erik Hovland submitted a >big patch for something similar. I remember *someone* submitting a patch for this, but it was bogus, because it replaced the string constants with heap-allocated copies of the strings (instead of just changing the function parameter declarations). Just let me know the functions (and parameters) for which this is an issue (there probably aren't many of them), and I'll fix them. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From kidjan at gmail.com Thu Apr 2 22:06:32 2009 From: kidjan at gmail.com (Jeremy Noring) Date: Thu, 2 Apr 2009 22:06:32 -0700 Subject: [Live-devel] H.264 NAL confusion In-Reply-To: References: Message-ID: Thanks for the response, Debargha. << You need to send NAL units to your H264VideoRTPSink. Is the X264 encoder you are using creating a single NAL unit per video frame encoded? >> My encoder is creating a single NAL unit per video frame encoded. I verified that very thing this evening just to double check. I do see that the encoder is capable of generating multiple NAL units per frame, but when I checked the size of the incoming buffers, it was clear that I was getting a single frame per NAL unit. Because of this, I have currentNALUnitEndsAccessUnit always return true.
<< In order to write an x264VideoStreamFramer class, I would also model it based on the MPEG4VideoStreamDiscreteFramer class, and then add an appropriate parsing mechanism to extract NAL units from the encoded frame bit-stream received >> I haven't looked at that framer, but I will. After tonight, I'm convinced something is wrong with my SDP stuff. I fired up Wireshark to see the traffic between my transmitting application and VLC, and I don't see the SDP stuff anywhere in the bitstream. Would it be contained in the RTP packets, or the RTCP packets? I feel like I'm pretty close to having this work, but every time I fire up VLC it basically does absolutely nothing with the stream, so...not sure what I'm missing here. The other thing I don't understand is I see other people using the string "h264" for the sprop string in the framer--why is it that I need to include my SPS/PPS parameters, but other people have reported implementations working just fine with "h264" inputted for that entire string? I'm definitely confused about how RTSP, RTP and SDP all operate together, and what Live555 provides w/r/t all of these technologies. -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Apr 3 08:38:40 2009 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 3 Apr 2009 08:38:40 -0700 Subject: [Live-devel] H.264 NAL confusion In-Reply-To: References: Message-ID: >After tonight, I'm convinced something is wrong with my SDP stuff. >I fired up Wireshark to see the traffic between my transmitting >application and VLC, and I don't see the SDP stuff anywhere in the >bitstream. Would it be contained in the RTP packets, or the RTCP >packets? No, the SDP description is returned in the response to RTSP's "DESCRIBE" command. (You can see this by running "openRTSP" on your server; you don't need to run 'wireshark' to see this.)
>The other thing I don't understand is I see other people using the >string "h264" for the sprop string in the framer Why are people doing this? This is completely bogus. The "sprop_parameter_sets_str" parameter to "H264VideoRTPSink::createNew()" *must* be a string formed by Base-64-encoding the SPS and PPS NAL units, and concatenating them with ",". > I'm definitely confused about how RTSP, RTP and SDP all operate together To use this software, you need to be familiar with these protocols. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From stas at tech-mer.com Sun Apr 5 07:01:50 2009 From: stas at tech-mer.com (Stas Desyatnlkov) Date: Sun, 5 Apr 2009 17:01:50 +0300 Subject: [Live-devel] H.264 + AAC + data in TS In-Reply-To: References: <21E398286732DC49AD45BE8C7BE96C07357A72D09C@fs11.mertree.mer.co.il> Message-ID: <21E398286732DC49AD45BE8C7BE96C07357A72D171@fs11.mertree.mer.co.il> Hi Ross, Thanx for the quick response. Before I start reading code can someone point me to an RFC or other documentation describing how the NAL units should be packed into the TS? I understand that the better way is to just send the NAL units without TS but my client has an application that expects reception of TS, so there is no way to implement it differently. From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Friday, April 03, 2009 7:51 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] H.264 + AAC + data in TS 1. Is there a good example for muxing elementary A/V streams and sending it as a single TS? Yes, see the "wis-streamer" source code: , and note how we implement the "-mpegtransport" option. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Sun Apr 5 15:31:34 2009 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 5 Apr 2009 15:31:34 -0700 Subject: [Live-devel] H.264 + AAC + data in TS In-Reply-To: <21E398286732DC49AD45BE8C7BE96C07357A72D171@fs11.mertree.mer.co.il> References: <21E398286732DC49AD45BE8C7BE96C07357A72D09C@fs11.mertree.mer.co.il> <21E398286732DC49AD45BE8C7BE96C07357A72D171@fs11.mertree.mer.co.il> Message-ID: >Thanx for the quick response. Before I start reading code can >someone point me to an RFC or other documentation describing how the >NAL units should be packed into the TS? FYI, this will be an ISO document, not an IETF RFC (because it's the ISO, not the IETF, that standardizes MPEG). -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From kwizart at gmail.com Sun Apr 5 14:43:14 2009 From: kwizart at gmail.com (Nicolas Chauvet (kwizart)) Date: Sun, 05 Apr 2009 23:43:14 +0200 Subject: [Live-devel] [Patches] Fedora package maintainer Message-ID: <49D925F2.6060607@gmail.com> Hi ! As Fedora packager at rpmfusion.org, we collected various patches for live555. Here is a unified version of them with some explanations. - build live555 shared instead of statically (might be better to switch the name to linux.shared so the static build stays the default). - pick up the env CFLAGS instead of hardcoded -O2 (and propagate to subdirs) - conditionalize code with __FreeBSD_kernel__ && __GLIBC__ (originally from Debian) - avoid unneeded casts. - choose $(CC) -o instead of ld -o - re-order directories (avoid symbols not present while linking). Thx Nicolas (kwizart) -------------- next part -------------- An embedded and charset-unspecified text was scrubbed...
Name: live.2009.03.22-unified.patch URL: From finlayson at live555.com Sun Apr 5 18:47:26 2009 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 5 Apr 2009 18:47:26 -0700 Subject: [Live-devel] [Patches] Fedora package maintainer In-Reply-To: <49D925F2.6060607@gmail.com> References: <49D925F2.6060607@gmail.com> Message-ID: Thanks, but I don't think I will be applying these patches, at least not right now. First, I don't plan on releasing any 'config.' files that do dynamic linking, because these days - for most environments, with RAM being so plentiful - dynamic linking often causes more trouble than it's worth. (If people want to write their own 'config.' files that do dynamic linking, then that's fine; it's just that I don't plan on distributing any such 'config.' files.) Also: >diff -uNr live/groupsock/GroupsockHelper.cpp >live-patches/groupsock/GroupsockHelper.cpp >--- live/groupsock/GroupsockHelper.cpp 2009-03-22 23:26:16.000000000 +0100 >+++ live-patches/groupsock/GroupsockHelper.cpp 2009-03-24 >14:10:56.000000000 +0100 >@@ -474,12 +474,14 @@ > // let us know, by sending email to the "live-devel" mailing list. > // (See >to subscribe to that mailing list.) > // END NOTE TO CYGWIN DEVELOPERS >+#if !(defined(__FreeBSD_kernel__) && defined(__GLIBC__)) > struct ip_mreq_source { > struct in_addr imr_multiaddr; /* IP multicast address of group */ > struct in_addr imr_sourceaddr; /* IP address of source */ > struct in_addr imr_interface; /* local IP address of interface */ > }; > #endif >+#endif I don't understand this. Does Fedora have the same bug as Cygwin - i.e., does it define IP_ADD_SOURCE_MEMBERSHIP, but not define ip_mreq_source?? If that's really the case, then you should fix Fedora. 
>diff -uNr live/liveMedia/H263plusVideoRTPSink.cpp >live-patches/liveMedia/H263plusVideoRTPSink.cpp >--- live/liveMedia/H263plusVideoRTPSink.cpp 2009-03-22 >23:26:16.000000000 +0100 >+++ live-patches/liveMedia/H263plusVideoRTPSink.cpp 2009-03-24 >14:10:56.000000000 +0100 >@@ -64,7 +64,7 @@ > } > if (frameStart[0] != 0 || frameStart[1] != 0) { > envir() << "H263plusVideoRTPSink::doSpecialFrameHandling(): >unexpected non-zero first two bytes: " >- << (void*)(frameStart[0]) << "," << >(void*)(frameStart[1]) << "\n"; >+ << (frameStart[0]) << "," << (frameStart[1]) << "\n"; I needed these (void*) casts to ensure that 1-byte values get printed properly, at least on my FreeBSD system. If they don't cause any problems in Fedora, then I'll just leave them as is. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From stas at tech-mer.com Sun Apr 5 22:39:11 2009 From: stas at tech-mer.com (Stas Desyatnlkov) Date: Mon, 6 Apr 2009 08:39:11 +0300 Subject: [Live-devel] H.264 + AAC + data in TS In-Reply-To: References: <21E398286732DC49AD45BE8C7BE96C07357A72D09C@fs11.mertree.mer.co.il> <21E398286732DC49AD45BE8C7BE96C07357A72D171@fs11.mertree.mer.co.il> Message-ID: <21E398286732DC49AD45BE8C7BE96C07357A72D18A@fs11.mertree.mer.co.il> Whatever, as long as it's standard. So, any links? -----Original Message----- From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Monday, April 06, 2009 1:32 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] H.264 + AAC + data in TS >Thanx for the quick response. Before I start reading code can >someone point me to an RFC or other documentation describing how the >NAL units should be packed into the TS? FYI, this will be an ISO document, not an IETF RFC (because it's the ISO, not the IETF, that standardizes MPEG). -- Ross Finlayson Live Networks, Inc.
http://www.live555.com/ _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel From finlayson at live555.com Mon Apr 6 01:25:49 2009 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 6 Apr 2009 01:25:49 -0700 Subject: [Live-devel] minor bug on windows implementation of gettimeofday() In-Reply-To: <200904011116.59137.patbob@imoveinc.com> References: <1238450328.5328.178.camel@oslab-l1> <200904011116.59137.patbob@imoveinc.com> Message-ID: OK, I have now installed a new version (2009.04.06) of the "LIVE555 Streaming Media" code that - I think - now fixes this problem. Please download and check the new code, to make sure it's OK. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From sebastien-devel at celeos.eu Mon Apr 6 01:49:21 2009 From: sebastien-devel at celeos.eu (=?iso-8859-1?b?U+liYXN0aWVu?= Escudier) Date: Mon, 06 Apr 2009 10:49:21 +0200 Subject: [Live-devel] Compiler warnings In-Reply-To: References: <1E8CA7AD-3108-4240-AE14-7FC3D6B90C68@slaine.org> Message-ID: <1239007761.49d9c21191efa@imp.celeos.eu> Quoting Ross Finlayson : > Just let me know the functions (and parameters) for which this is an > issue (there probably aren't many of them), and I'll fix them. Hi, There are functions, but also "char *" variables like that : char * example; example = "const string"; please find a list of the warnings attached (using today release). -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... 
Name: live555_warnings.txt URL: From sebastien-devel at celeos.eu Mon Apr 6 02:12:53 2009 From: sebastien-devel at celeos.eu (=?iso-8859-1?b?U+liYXN0aWVu?= Escudier) Date: Mon, 06 Apr 2009 11:12:53 +0200 Subject: [Live-devel] 2009.04.06 version Message-ID: <1239009173.49d9c7958817e@imp.celeos.eu> Hi Ross, looking at the 2009.04.06 version, I see you applied this patch : http://mailman.videolan.org/pipermail/vlc-devel/2009-April/058263.html As you can see here : http://mailman.videolan.org/pipermail/vlc-devel/2009-April/058284.html it does not really solve the issue. It seems to be because of my patch, which you applied in the 2008.11.13 version, for timeout values. I am not using Windows at all, so I can't test this, and I am not able to fix this. If you don't know what's wrong, maybe you can revert this patch? Then we would wait for your RTSPClient rework. Regards. From sebastien-devel at celeos.eu Mon Apr 6 02:16:59 2009 From: sebastien-devel at celeos.eu (=?iso-8859-1?b?U+liYXN0aWVu?= Escudier) Date: Mon, 06 Apr 2009 11:16:59 +0200 Subject: [Live-devel] 2009.04.06 version In-Reply-To: <1239009173.49d9c7958817e@imp.celeos.eu> References: <1239009173.49d9c7958817e@imp.celeos.eu> Message-ID: <1239009419.49d9c88b9d223@imp.celeos.eu> Quoting Sébastien Escudier : > As you can see here : > http://mailman.videolan.org/pipermail/vlc-devel/2009-April/058284.html it > does > not really solve the issue.
I meant http://mailman.videolan.org/pipermail/vlc-devel/2009-April/058279.html sorry From finlayson at live555.com Mon Apr 6 02:26:29 2009 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 6 Apr 2009 02:26:29 -0700 Subject: [Live-devel] 2009.04.06 version In-Reply-To: <1239009173.49d9c7958817e@imp.celeos.eu> References: <1239009173.49d9c7958817e@imp.celeos.eu> Message-ID: >looking at 2009.04.06 version, I see you applied this patch : >http://mailman.videolan.org/pipermail/vlc-devel/2009-April/058263.html > >As you can see here : >http://mailman.videolan.org/pipermail/vlc-devel/2009-April/058284.html it does >not really solve the issue. > >It seems to be because of my patch you applied in 2008.11.13 version, for >timeout values. > >I am not using windows at all, so I can't test this, and I am not able to fix >this. >If you don't know what's wrong, maybe you can revert this patch ? I don't want to make more major changes to "RTSPClient.cpp" at this time (until the completely reworked version is done). If - for whatever reason - the timeout stuff doesn't work, then for now I suggest not using timeout parameters at all. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From stas at tech-mer.com Mon Apr 6 04:02:49 2009 From: stas at tech-mer.com (Stas Desyatnlkov) Date: Mon, 6 Apr 2009 14:02:49 +0300 Subject: [Live-devel] H.264 + AAC + data in TS In-Reply-To: <21E398286732DC49AD45BE8C7BE96C07357A72D18A@fs11.mertree.mer.co.il> References: <21E398286732DC49AD45BE8C7BE96C07357A72D09C@fs11.mertree.mer.co.il> <21E398286732DC49AD45BE8C7BE96C07357A72D171@fs11.mertree.mer.co.il> <21E398286732DC49AD45BE8C7BE96C07357A72D18A@fs11.mertree.mer.co.il> Message-ID: <21E398286732DC49AD45BE8C7BE96C07357A72D1FD@fs11.mertree.mer.co.il> I found a document describing how the NAL packets should be packaged in the transport stream. The document name is JVT-C087.doc and can be downloaded from the net. 
The question is - does live555 implement the streaming and packetization of the AVC NAL packets according to this document? Also I don't see how all the program information tables (PMT, PAT, NIT, CAT...) fit into this scheme. After this is solved I'd need to figure out how to add voice and data streams to the TS and how to tell the demuxer on the other side what is what. -----Original Message----- From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Stas Desyatnlkov Sent: Monday, April 06, 2009 8:39 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] H.264 + AAC + data in TS Whatever, as long as it's standard. So, any links? -----Original Message----- From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Monday, April 06, 2009 1:32 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] H.264 + AAC + data in TS >Thanx for the quick response. Before I start reading code can >someone point me to an RFC or other documentation describing how the >NAL units should be packed into the TS? FYI, this will be an ISO document, not an IETF RFC (because it's the ISO, not the IETF, that standardizes MPEG). -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel From slaine at slaine.org Mon Apr 6 08:15:58 2009 From: slaine at slaine.org (Glen Gray) Date: Mon, 6 Apr 2009 16:15:58 +0100 Subject: [Live-devel] testProgs tested ? Message-ID: Hey guys, I'm looking around at the samples in testProgs and interested in using some techniques on a small project.
I've had some problems in getting the testProgs/testRelay app working sufficiently. Basically, I've got an off-the-shelf TV encoder that broadcasts MPEG2-TS onto the network. I used testRelay to read the source address and relay it to another address. I've got VLC running on another box on the same LAN with two playlist items, one pointing to the original broadcast address and the other to the relayed address. Original plays fine, relayed one is corrupted badly with lots of "libdvbpsi error (PSI decoder): TS discontinuity (received 14, expected 12) for PID 17" type messages appearing in the logs. I'm looking for some tips on how to test this testProgs/testRelay so that I can get a working stream relayed to another address. Thanks in advance, -- Glen Gray slaine at slaine.org From slaine at slaine.org Mon Apr 6 11:20:41 2009 From: slaine at slaine.org (Glen Gray) Date: Mon, 6 Apr 2009 19:20:41 +0100 Subject: [Live-devel] testProgs tested ? In-Reply-To: References: Message-ID: Looking back through the archives I see this was an issue for someone back in 2005. I've added the suggested multicastSendOnly() call to the outbound groupsock, but I can't test it 'til tomorrow. Interestingly, I tested on Mac OSX and Linux and both had the same issues as regards playback on the new multicast address. On 6 Apr 2009, at 16:15, Glen Gray wrote: > Hey guys, > > I'm looking around at the samples in testProgs and interested in > using some techniques on a small project. I've had some problems in > getting the testProgs/testRelay app working sufficiently. > > Basically, I've got an off the shelf TV encoder that broadcasts > MPEG2-TS onto the network. I used testRelay to read the source > address and relay it to another address. I've got VLC running on > another box on the same LAN with two play list items, one pointing > to the original broadcast address and the other to the relayed > address.
Original plays fine, relayed one is corrupted badly with > lots of "libdvbpsi error (PSI decoder): TS discontinuity (received > 14, expected 12) for PID 17" type messages appearing in the logs. > > I'm looking for some tips on how to test this testProgs/testRelay so > that I can get a working stream relayed to another address. > > Thanks in advance, > -- > Glen Gray > slaine at slaine.org > > > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -- Glen Gray slaine at slaine.org From slaine at slaine.org Mon Apr 6 11:51:54 2009 From: slaine at slaine.org (Glen Gray) Date: Mon, 6 Apr 2009 19:51:54 +0100 Subject: [Live-devel] testProgs tested ? In-Reply-To: References: Message-ID: <5334AE92-938D-4507-8C5D-E6C81753CEF7@slaine.org> Further digging into the archives reveals that using the same port number on different multicast addresses probably isn't the best thing to do on linux. I've tried bumping the port number up by one and the vlc logs show no TS discontinuity errors now. I'll know for sure if it's working when I get back to my desk tomorrow On 6 Apr 2009, at 16:15, Glen Gray wrote: > Hey guys, > > I'm looking around at the samples in testProgs and interested in > using some techniques on a small project. I've had some problems in > getting the testProgs/testRelay app working sufficently. > > Basically, I've got an off the shelf TV encoder that broadcasts > MPEG2-TS onto the network. I used testRelay to read the source > address and relay it to another address. I've got VLC running on > another box on the same LAN with two play list items, one pointing > to the original broadcast address and the other to the relayed > address. Original plays fine, relayed one is corrupted badly with > lots of "libdvbpsi error (PSI decoder): TS discontinuity (received > 14, expected 12) for PID 17" type messages appearing in the logs. 
> > I'm looking for some tips on how to test this testProgs/testRelay so > that I can get a working stream relayed to another address. > > Thanks in advance, > -- > Glen Gray > slaine at slaine.org > > > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -- Glen Gray slaine at slaine.org From finlayson at live555.com Mon Apr 6 19:21:53 2009 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 6 Apr 2009 19:21:53 -0700 Subject: [Live-devel] Compiler warnings In-Reply-To: <1239007761.49d9c21191efa@imp.celeos.eu> References: <1E8CA7AD-3108-4240-AE14-7FC3D6B90C68@slaine.org> <1239007761.49d9c21191efa@imp.celeos.eu> Message-ID: >There are functions, but also "char *" variables like that : >char * example; >example = "const string"; > >please find a list of the warnings attached (using today release). OK, I've now installed a new version (2009.04.07) of the code that fixes most of these. Thanks for the help. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Mon Apr 6 20:55:24 2009 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 6 Apr 2009 20:55:24 -0700 Subject: [Live-devel] H.264 + AAC + data in TS In-Reply-To: <21E398286732DC49AD45BE8C7BE96C07357A72D1FD@fs11.mertree.mer.co.il> References: <21E398286732DC49AD45BE8C7BE96C07357A72D09C@fs11.mertree.mer.co.il> <21E398286732DC49AD45BE8C7BE96C07357A72D171@fs11.mertree.mer.co.il> <21E398286732DC49AD45BE8C7BE96C07357A72D18A@fs11.mertree.mer.co.il> <21E398286732DC49AD45BE8C7BE96C07357A72D1FD@fs11.mertree.mer.co.il> Message-ID: >I found a document describing how the NAL packets should be packaged >in the transport stream. >The document name is JVT-C087.doc and can be downloaded from the net. >The question is - does live555 implements the streaming and >packetization of the AVC NAL packets according to this document? 
Our software can stream, or receive, *any* MPEG Transport Stream data, regardless of its contents, as long as it has proper PCR timestamps. The contents of Transport Stream data become relevant only when we are trying to create a new Transport Stream from some other type of data - using our "MPEG2TransportStreamMultiplexor" class (or its subclasses: "MPEG2TransportStreamFromESSource" or "MPEGTransportStreamFromPESSource"). Note, in particular, the code starting at line 100 of "MPEG2TransportStreamMultiplexor.cpp". To support adding a new kind of data to a Transport Stream, you may need to modify this code in some way. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Mon Apr 6 22:34:56 2009 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 6 Apr 2009 22:34:56 -0700 Subject: [Live-devel] testProgs tested ? In-Reply-To: References: Message-ID: >Basically, I've got an off the shelf TV encoder that broadcasts >MPEG2-TS onto the network. I used testRelay to read the source >address and relay it to another address. I've got VLC running on >another box on the same LAN with two play list items, one pointing >to the original broadcast address and the other to the relayed >address. Original plays fine, relayed one is corrupted badly with >lots of "libdvbpsi error (PSI decoder): TS discontinuity (received >14, expected 12) for PID 17" type messages appearing in the logs. I suspect you're seeing network packet loss. See http://www.live555.com/liveMedia/faq.html#packet-loss -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ From stas at tech-mer.com Tue Apr 7 00:20:43 2009 From: stas at tech-mer.com (Stas Desyatnlkov) Date: Tue, 7 Apr 2009 10:20:43 +0300 Subject: [Live-devel] H.264 + AAC + data in TS In-Reply-To: References: <21E398286732DC49AD45BE8C7BE96C07357A72D09C@fs11.mertree.mer.co.il> <21E398286732DC49AD45BE8C7BE96C07357A72D171@fs11.mertree.mer.co.il> <21E398286732DC49AD45BE8C7BE96C07357A72D18A@fs11.mertree.mer.co.il> <21E398286732DC49AD45BE8C7BE96C07357A72D1FD@fs11.mertree.mer.co.il> Message-ID: <21E398286732DC49AD45BE8C7BE96C07357A72D27A@fs11.mertree.mer.co.il> That is exactly the question - how should I modify the MPEG2TransportStreamMultiplexor code to put the NAL packets correctly into the resulting TS? -----Original Message----- From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Tuesday, April 07, 2009 6:55 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] H.264 + AAC + data in TS >I found a document describing how the NAL packets should be packaged >in the transport stream. >The document name is JVT-C087.doc and can be downloaded from the net. >The question is - does live555 implements the streaming and >packetization of the AVC NAL packets according to this document? Our software can stream, or receive, *any* MPEG Transport Stream data, regardless of its contents, as long as it has proper PCR timestamps. The contents of Transport Stream data become relevant only when we are trying to create a new Transport Stream from some other type of data - using our "MPEG2TransportStreamMultiplexor" class (or its subclasses: "MPEG2TransportStreamFromESSource" or "MPEGTransportStreamFromPESSource"). Note, in particular, the code starting at line 100 of "MPEG2TransportStreamMultiplexor.cpp". To support adding a new kind of data to a Transport Stream, you may need to modify this code in some way. -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel From finlayson at live555.com Tue Apr 7 00:41:19 2009 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 7 Apr 2009 00:41:19 -0700 Subject: [Live-devel] H.264 + AAC + data in TS In-Reply-To: <21E398286732DC49AD45BE8C7BE96C07357A72D27A@fs11.mertree.mer.co.il> References: <21E398286732DC49AD45BE8C7BE96C07357A72D09C@fs11.mertree.mer.co.il> <21E398286732DC49AD45BE8C7BE96C07357A72D171@fs11.mertree.mer.co.il> <21E398286732DC49AD45BE8C7BE96C07357A72D18A@fs11.mertree.mer.co.il> <21E398286732DC49AD45BE8C7BE96C07357A72D1FD@fs11.mertree.mer.co.il> <21E398286732DC49AD45BE8C7BE96C07357A72D27A@fs11.mertree.mer.co.il> Message-ID: >That is exactly the question - how should I modify the >MPEG2TransportStreamMultiplexor code to put the NAL packets >correctly into the resulting TS? I don't know, but please let us know when you've figured it out :-) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From stas at tech-mer.com Tue Apr 7 04:43:22 2009 From: stas at tech-mer.com (Stas Desyatnlkov) Date: Tue, 7 Apr 2009 14:43:22 +0300 Subject: [Live-devel] H.264 + AAC + data in TS In-Reply-To: References: <21E398286732DC49AD45BE8C7BE96C07357A72D09C@fs11.mertree.mer.co.il> <21E398286732DC49AD45BE8C7BE96C07357A72D171@fs11.mertree.mer.co.il> <21E398286732DC49AD45BE8C7BE96C07357A72D18A@fs11.mertree.mer.co.il> <21E398286732DC49AD45BE8C7BE96C07357A72D1FD@fs11.mertree.mer.co.il> <21E398286732DC49AD45BE8C7BE96C07357A72D27A@fs11.mertree.mer.co.il> Message-ID: <21E398286732DC49AD45BE8C7BE96C07357A72D2DC@fs11.mertree.mer.co.il> I will. It's just that there is no clear indication as to how this should be done. On the other hand there are commercially available solutions (e.g. Elecard demuxer) that receive and handle the AVC + AAC TS.
These solutions are available on Windows, and there is no way of telling them that my implementation is right: I have to provide the TS these components expect to receive, otherwise my A/V won't play correctly in the media player. -----Original Message----- From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Tuesday, April 07, 2009 10:41 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] H.264 + AAC + data in TS >That is exactly the question - how should I modify the >MPEG2TransportStreamMultiplexor code to put the NAL packets >correctly into the resulting TS? I don't know, but please let us know when you've figured it out :-) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel From slaine at slaine.org Tue Apr 7 06:49:34 2009 From: slaine at slaine.org (Glen Gray) Date: Tue, 7 Apr 2009 14:49:34 +0100 Subject: [Live-devel] testProgs tested ? In-Reply-To: References: Message-ID: <67DB2CF0-E6B8-4A2A-9F4E-7EC1D7693072@slaine.org> On 7 Apr 2009, at 06:34, Ross Finlayson wrote: >> Basically, I've got an off the shelf TV encoder that broadcasts >> MPEG2-TS onto the network. I used testRelay to read the source >> address and relay it to another address. I've got VLC running on >> another box on the same LAN with two play list items, one pointing >> to the original broadcast address and the other to the relayed >> address. Original plays fine, relayed one is corrupted badly with >> lots of "libdvbpsi error (PSI decoder): TS discontinuity (received >> 14, expected 12) for PID 17" type messages appearing in the logs. > > I suspect you're seeing network packet loss. See http://www.live555.com/liveMedia/faq.html#packet-loss This was the port problem, Ross.
I changed the code to receive on port 5000 and rebroadcast on 5001 and it's working perfectly. Going forward with this little test app of mine, I need to be able to receive different UDP streams on the same machine. Unfortunately, all these streams will be on the same port. The encoder boxes in question will let you choose different multicast base addresses and ports, but the generated multicast addresses for each stream, while different, will always be on the same port. As has been mentioned a few times in the archives, by default the GroupsockHelper code binds to the INADDR_ANY setting (ReceivingInterfaceAddr is defaulting to this). What I need to happen is that it should bind to the multicast address but join the group using the wildcard interface, INADDR_ANY. This will allow the UDP layer to deliver to the socket only the datagrams for that addr/port combination, as opposed to the app getting all datagrams for the port. I've changed the code in GroupsockHelper::socketJoinGroup so that when setting up the socket to join with IP_ADD_MEMBERSHIP we use INADDR_ANY for the imr.imr_interface.s_addr value as opposed to using the value in the ReceivingInterfaceAddr variable. This allows me to set ReceivingInterfaceAddr in the testRelay.cpp code to the multicast addr for correct binding while allowing the system to join the multicast group on any interface. What's your take on how this should be achieved?
-- Glen Gray slaine at slaine.org From slaine at slaine.org Tue Apr 7 07:23:44 2009 From: slaine at slaine.org (Glen Gray) Date: Tue, 7 Apr 2009 15:23:44 +0100 Subject: [Live-devel] Compiler warnings In-Reply-To: References: <1E8CA7AD-3108-4240-AE14-7FC3D6B90C68@slaine.org> <1239007761.49d9c21191efa@imp.celeos.eu> Message-ID: <1FAB6F94-99C1-47F7-967C-CBCF226F8037@slaine.org> Excellent, thanks guys, On 7 Apr 2009, at 03:21, Ross Finlayson wrote: >> There are functions, but also "char *" variables like that : >> char * example; >> example = "const string"; >> >> please find a list of the warnings attached (using today release). > > OK, I've now installed a new version (2009.04.07) of the code that > fixes most of these. > > Thanks for the help. > -- > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -- Glen Gray slaine at slaine.org From bmoore at airgain.com Wed Apr 8 18:09:53 2009 From: bmoore at airgain.com (Bryan Moore) Date: Wed, 8 Apr 2009 18:09:53 -0700 Subject: [Live-devel] Correct use of 'select' to avoid packet loss in Linux+BSD; correct use of WSAGetLastError and codes Message-ID: These patches are for: 1. Linux+BSD: Corrected usage of 'select ' to avoid packet loss 2. Windows: Better use of WSAGetLastError (instead of errno) Note: review to add #ifndef _WIN32_WCE if needed Explanation: 'select ': On Linux and FreeBSD and similar operating systems, the buffer would fill up and lose packets: The 'select' function resets an internal "I/O available" flag after it returns for each of the sockets that are indicated as having available I/O. These internal "I/O available" flags are not set again until _NEW_ I/O is available (specifically, for incoming I/O, another packet arrives). 
In other words, if you call 'select' and 2 new packets are available for the socket you are testing, select will return immediately. But if 'recvfrom' only reads a single packet, the next call to 'select' will _BLOCK_ until another (3rd) new packet arrives for this socket. What is _supposed_ to happen is repeated calls to 'recvfrom' until there is no more data available. The best way to test for available data is via the 'FIONREAD' ioctl, which is what the patch is all about. This is a relatively fast call, and any performance hits caused by the ioctl are easily made up for by the greater efficiency of the I/O overall. 'WSAGetLastError': We observed that live555's 'getErrno' in BasicUsageEnvironment attempts to report the correct Windows socket error code. However, if errno is already 0, live555 may fail to report the correct error code. Windows always reports the error via WSAGetLastError, but will not set errno unless the compatibility library is also engaged. Therefore we patched this in several places. Regards, Bryan and Bob http://www.airgain.com -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: patch-BasicUsageEnvironment_BasicTaskScheduler_cpp Type: application/octet-stream Size: 2998 bytes Desc: patch-BasicUsageEnvironment_BasicTaskScheduler_cpp URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: patch-groupsock_GroupsockHelper_cpp Type: application/octet-stream Size: 5647 bytes Desc: patch-groupsock_GroupsockHelper_cpp URL: From chengjianweiok at gmail.com Wed Apr 8 19:51:40 2009 From: chengjianweiok at gmail.com (Danny Cheng) Date: Thu, 9 Apr 2009 10:51:40 +0800 Subject: [Live-devel] questiones about the MultiFramedRTPSource::doGetNextFrame1() Message-ID: <8cd741830904081951l720a7eb5m2cb2b36f499dfcb5@mail.gmail.com> Hi, I have questions about MultiFramedRTPSource::doGetNextFrame1() in MultiFramedRTPSource.cpp. 
In the code below, what is the role of fNeedDelivery? Does it mean that the RTP packets should be delivered to fReorderingBuffer? What does "nextPacket->useCount() == 0" mean? Or what is the role of fUseCount? In addition, if the code between "unsigned specialHeaderSize;" and "nextPacket->skip(specialHeaderSize);" is not executed, then "specialHeaderSize" is left uninitialized, so how does "nextPacket->skip(specialHeaderSize);" work? Thank you! -- After all, tomorrow is another day? -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Apr 9 00:52:28 2009 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 9 Apr 2009 00:52:28 -0700 Subject: [Live-devel] Correct use of 'select' to avoid packet loss in Linux+BSD; correct use of WSAGetLastError and codes In-Reply-To: References: Message-ID: Thanks for the suggestion; I like this. Unless anyone sees a problem with this, I will include it in the next release of the software. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From ivo.vachkov at gmail.com Thu Apr 9 08:04:54 2009 From: ivo.vachkov at gmail.com (Ivo Vachkov) Date: Thu, 9 Apr 2009 18:04:54 +0300 Subject: [Live-devel] Multicast to RTSP Message-ID: Hello all, I've been trying to set up a multicast-to-RTSP server for some time now. However, I am running into difficulties. My idea is to modify testOnDemandRTSPServer to read from stdin (which I did successfully) and use something like this: mcast2stdout $mcast_group $port | testOnDemandRTSPServer. I use VLC to stream to the multicast group. When I try "cat file.ts | testOnDemandRTSPServer" all works like a charm. When I start streaming file.ts with VLC (prefer UDP over RTP) and "mcast2stdout | testOnDemandRTSPServer" I see nothing. My multicast reading program works. My concern is that I probably need to "unpacket" whatever is coming from the multicast group and then send it to stdout/testOnDemandRTSPServer. 
Am I correct, and how should I proceed? Thank you very much in advance. /ipv From patrice.peyrano at live.fr Thu Apr 9 14:22:24 2009 From: patrice.peyrano at live.fr (patrice peyrano) Date: Thu, 9 Apr 2009 23:22:24 +0200 Subject: [Live-devel] Link error with LiveMedia in a VS2005 project. Message-ID: Hello to all, I am trying to incorporate LiveMedia into a VS2005 project that only displays an empty dialog box. I compiled all the libraries successfully. But with only the following code: #include "liveMedia.hh" #include "BasicUsageEnvironment.hh" #include "GroupsockHelper.hh" #pragma comment(lib, "libliveMedia.lib") #pragma comment(lib, "libgroupsock.lib") #pragma comment(lib, "libBasicUsageEnvironment.lib") #pragma comment(lib, "libUsageEnvironment.lib") TaskScheduler *scheduler = BasicTaskScheduler::createNew(); UsageEnvironment *env = BasicUsageEnvironment::createNew(*scheduler); I get the following link errors: 1>libBasicUsageEnvironment.lib(BasicTaskScheduler.obj) : warning LNK4217: locally defined symbol _exit imported in function "protected: virtual void __thiscall BasicTaskScheduler::SingleStep(unsigned int)" (?SingleStep at BasicTaskScheduler@@MAEXI at Z) 1>libBasicUsageEnvironment.lib(BasicUsageEnvironment.obj) : warning LNK4049: locally defined symbol _exit imported 1>libBasicUsageEnvironment.lib(BasicUsageEnvironment.obj) : warning LNK4217: locally defined symbol __errno imported in function "public: virtual int __thiscall BasicUsageEnvironment::getErrno(void)const " (?getErrno at BasicUsageEnvironment@@UBEHXZ) 1>libBasicUsageEnvironment.lib(BasicUsageEnvironment.obj) : warning LNK4217: locally defined symbol ___iob_func imported in function "public: virtual class UsageEnvironment & __thiscall BasicUsageEnvironment::operatorlibBasicUsageEnvironment.lib(BasicUsageEnvironment0.obj) : warning LNK4217: locally defined symbol _memmove imported in function "public: virtual void __thiscall BasicUsageEnvironment0::appendToResultMsg(char const *)" (?appendToResultMsg at BasicUsageEnvironment0@@UAEXPBD at Z) 1>libgroupsock.lib(GroupsockHelper.obj) : warning LNK4217: locally defined symbol __ctime64 imported in function _ctime 1>libgroupsock.lib(GroupsockHelper.obj) : warning LNK4217: locally defined symbol _sprintf imported in function "int __cdecl setupDatagramSocket(class UsageEnvironment &,class Port,unsigned int)" (?setupDatagramSocket@@YAHAAVUsageEnvironment@@VPort@@I at Z) 1>libgroupsock.lib(GroupsockHelper.obj) : warning LNK4217: locally defined symbol _strncmp imported in function "unsigned int __cdecl ourIPAddress(class UsageEnvironment &)" (?ourIPAddress@@YAIAAVUsageEnvironment@@@Z) 1>libBasicUsageEnvironment.lib(BasicTaskScheduler.obj) : error LNK2019: unresolved external symbol __imp__perror referenced in function "protected: virtual void __thiscall BasicTaskScheduler::SingleStep(unsigned int)" (?SingleStep at BasicTaskScheduler@@MAEXI at Z) 1>libBasicUsageEnvironment.lib(BasicUsageEnvironment.obj) : error LNK2019: unresolved external symbol __imp__fprintf referenced in function "public: virtual class UsageEnvironment & __thiscall BasicUsageEnvironment::operatorlibBasicUsageEnvironment.lib(BasicUsageEnvironment0.obj) : error LNK2019: unresolved external symbol __imp__fputs referenced in function "public: virtual void __thiscall BasicUsageEnvironment0::reportBackgroundError(void)" (?reportBackgroundError at BasicUsageEnvironment0@@UAEXXZ) 1>libgroupsock.lib(GroupsockHelper.obj) : error LNK2019: unresolved external symbol __imp___ftime64 referenced in function _ftime 1>libgroupsock.lib(inet.obj) : error LNK2019: unresolved external symbol __imp__rand referenced in function _our_random 1>libgroupsock.lib(inet.obj) : error LNK2019: unresolved external symbol __imp__srand referenced in function _our_srandom I have been looking for a solution for a long time, but have not found one. Could someone tell me how to fix this problem? Thank you in advance. 
Patrice From kidjan at gmail.com Thu Apr 9 15:25:31 2009 From: kidjan at gmail.com (Jeremy Noring) Date: Thu, 9 Apr 2009 15:25:31 -0700 Subject: [Live-devel] Best sample to start with for H.264 receiver? Message-ID: First, thanks to everyone who helped me with H.264 streaming; I have my framer and media subsession classes implemented, and an RTSP on-demand server that seems to work pretty well with VLC. It still doesn't work too well with QuickTime, but I'll iron that out later. Anyway, my question is: what is the best sample application to start with if I want to write a receiver application? I want to receive this H.264 video stream, and I'm not sure which receiver application is best suited to working with an on-demand RTSP server, or if any of the test*Receiver apps (or openRTSP?) would be suitable starting points. Thanks! Jeremy -- Where are we going? And why am I in this hand-basket? -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Apr 9 18:06:12 2009 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 9 Apr 2009 18:06:12 -0700 Subject: [Live-devel] Multicast to RTSP In-Reply-To: References: Message-ID: >I've been trying to setup multicast to rtsp server for some time now. >However, I find difficulties. My idea is to modify >testOnDemandRTSPServer to read from stdin (which i did successfully) Don't forget to also set the "reuseFirstSource" variable to "True". >and use something like this mcas2stdout $mcast_group $port | >testOnDemandRTSPServer. I use VLC to stream to the multicast group. >When I try "cat file.ts | testOnDemandRTSPServer" Note, you don't need to use "cat" here. Instead, use testOnDemandRTSPServer < file.ts >all works like a >charm. 
When I start streaming file.ts with VLC (prefer UDP over RTP) >and "mcast2stdout | testOnDemandRTSPServer" i see nothing. My >mutlicast reading program works. Does the following work:

mcast2stdout > test.ts
testOnDemandRTSPServer (using the original, unmodified code)

If not, then there's a problem with your "mcast2stdout" application. If this does work, then try the following:

mcast2stdout > file.ts
testOnDemandRTSPServer < file.ts (using your new, modified code)

>My concern is that i probably need to "unpacket" whatever is comming >from the multicast group and then send it to >stdout/testOnDemandRTSPServer. No. If the incoming data really is MPEG Transport Stream data, then you should be able to pipe it directly into your modified "testOnDemandRTSPServer". -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Thu Apr 9 18:26:44 2009 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 9 Apr 2009 18:26:44 -0700 Subject: [Live-devel] Best sample to start with for H.264 receiver? In-Reply-To: References: Message-ID: >Anyway, my question is: what is the best sample application to start >with if I want to write a receiver application? "openRTSP" -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From debargha.mukherjee at hp.com Fri Apr 10 03:09:44 2009 From: debargha.mukherjee at hp.com (Debargha Mukherjee) Date: Fri, 10 Apr 2009 03:09:44 -0700 Subject: [Live-devel] Best sample to start with for H.264 receiver? In-Reply-To: References: Message-ID: <5F18AC5B-9C92-4CB1-8491-73733D8A4F38@hp.com> Would you mind sharing how you fixed your H.264 streaming problems? __________________________ Debargha Mukherjee On Apr 9, 2009, at 3:25 PM, Jeremy Noring wrote: > First, thanks to everyone who helped me with H.264 streaming; I have > my framer and media subsession classes implemented, and an RTSP on- > demand server that seems to work pretty well with VLC. 
I still > don't work too well with Quicktime, but I'll iron that out later. > > Anyway, my question is: what is the best sample application to start > with if I want to write a receiver application? I want to receive > this H.264 video stream, and I'm not sure which receiver application > is most well suited towards working with an on-demand RTPS server, > or if any of the test*Receiver apps (or openRTSP?) would be suitable > starting points. > > Thanks! > > Jeremy > > -- > Where are we going? > And why am I in this hand-basket? > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel From kidjan at gmail.com Fri Apr 10 12:25:37 2009 From: kidjan at gmail.com (Jeremy Noring) Date: Fri, 10 Apr 2009 12:25:37 -0700 Subject: [Live-devel] Best sample to start with for H.264 receiver? In-Reply-To: <5F18AC5B-9C92-4CB1-8491-73733D8A4F38@hp.com> References: <5F18AC5B-9C92-4CB1-8491-73733D8A4F38@hp.com> Message-ID: On Fri, Apr 10, 2009 at 3:09 AM, Debargha Mukherjee < debargha.mukherjee at hp.com> wrote: > Would you mind sharing how you fixed your H.264 streaming problems? Yeah, it was pretty basic: I was using the Live555 RTSP classes incorrectly. I needed to start with one of the sample applications, instead of the incorrect H.264 tutorial code I started with (it's floating about somewhere on the list; I don't recommend it as a starting point as it has some rather glaring inaccuracies). My H.264 code was fine the way it was. I used x264 (a DShow filter implementation, that is) to perform the encode, DirectShow to process media samples, and some Qt components to handle threading. So my application is unfortunately Windows-only, but someone could probably write a gstreamer variant if they were familiar with that API (unfortunately, I'm not). -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From RGlobisch at csir.co.za Sat Apr 11 06:37:18 2009 From: RGlobisch at csir.co.za (Ralf Globisch) Date: Sat, 11 Apr 2009 15:37:18 +0200 Subject: [Live-devel] Link error with LiveMedia in a VS2005 project. Message-ID: <49E0B92F0200004D0002F4D4@pta-emo.csir.co.za> Looks like you're missing the ms runtimes: msvcrt.lib or libcmt.lib, etc. depending on your project configuration: single-threaded/multi-threaded, statically/dynamically, debug/release build, etc. Read up on them at http://support.microsoft.com/kb/154753 Generally these get added to the linker inputs of a project automatically so I'm not sure what's wrong on your side. Google "unresolved external symbol __imp__fprintf" and you should find out pretty quickly which libs are missing. -- This message is subject to the CSIR's copyright terms and conditions, e-mail legal notice, and implemented Open Document Format (ODF) standard. The full disclaimer details can be found at http://www.csir.co.za/disclaimer.html. This message has been scanned for viruses and dangerous content by MailScanner, and is believed to be clean. MailScanner thanks Transtec Computers for their support. From hennelly.mark at gmail.com Sun Apr 12 05:16:14 2009 From: hennelly.mark at gmail.com (Mark Hennelly) Date: Sun, 12 Apr 2009 13:16:14 +0100 Subject: [Live-devel] Switching between multiple streams of same file encoded at different bitrates Message-ID: Hi, I am trying to build an RTSP client based on the live555.com libraries that has the ability to switch between multiple streams of the same file encoded at different bitrates. For example, using RTSP STOP and PLAY requests to stop one stream and play the other lower bitrate version at the same point using random access. I was wondering if anyone could tell me if this is possible and if so, what files in the live555.com libraries I would need to change to make it happen. Thanks in advance. Best regards, Mark -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Sun Apr 12 22:17:23 2009 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 12 Apr 2009 22:17:23 -0700 Subject: [Live-devel] Switching between multiple streams of same file encoded at different bitrates In-Reply-To: References: Message-ID: >I am trying to build an RTSP client based on the >live555.com libraries that has the ability to >switch between multiple streams of the same file encoded at >different bitrates. For example, using RTSP STOP and PLAY requests >to stop one stream and play the other lower bitrate version at the >same point using random access. I was wondering if anyone could tell >me if this is possible Certainly. Have several files; one for each bitrate that you want (with the bitrate included in the file's name somehow). Then the client can switch to a different bitrate by (i) doing a "TEARDOWN" on the original stream, then (ii) doing a "DESCRIBE", "SETUP", and "PLAY" on the new stream (with the new file name). The "PLAY" request can include (as the start time) the play time that you had reached with the previous stream. > and if so, what files in the live555.com >libraries I would need to change to make it happen. You don't need to change anything (provided, of course, that the file type already supports seeking). -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From frank.su.cn at gmail.com Wed Apr 15 02:58:12 2009 From: frank.su.cn at gmail.com (=?GB2312?B?y9W3sg==?=) Date: Wed, 15 Apr 2009 17:58:12 +0800 Subject: [Live-devel] Can live555 client reconnect the server automatically after several minutes? Message-ID: Hi all: I am writing a media player client with live555. I want to implement it somewhat like a long-term monitoring system. 
That is, the client should connect to the server, receive the stream, and play the program; when a connection loss happens, the client should keep trying to reconnect until the connection has been re-established. In reality, I have made some experiments: while the player was playing the program, I pulled the RJ45 plug out and plugged it back in several seconds or minutes later. As a result, I found that the client can reconnect to the server after a delay of just several seconds, but fails to reconnect if I plug the RJ45 back in several minutes later. My question is: can I make the client reconnect to the server automatically after several minutes, or even longer, using live555? best regards!! frank -------------- next part -------------- An HTML attachment was scrubbed... URL: From aurelien at sitadelle.com Wed Apr 15 03:03:41 2009 From: aurelien at sitadelle.com (=?ISO-8859-1?Q?Aur=E9lien_Nephtali?=) Date: Wed, 15 Apr 2009 12:03:41 +0200 Subject: [Live-devel] Correct use of 'select' to avoid packet loss in Linux+BSD; correct use of WSAGetLastError and codes In-Reply-To: References: Message-ID: <5334c8b0904150303i1f4b9709o83dfe2fe3575d8dd@mail.gmail.com> On Thu, Apr 9, 2009 at 9:52 AM, Ross Finlayson wrote: > Thanks for the suggestion; I like this. > > Unless anyone sees a problem with this, I will include it in the next > release of the software. Hello, On Linux, select() is implemented above poll(). In net/core/datagram.c (linux 2.6.26 sources), in datagram_poll() you can see the lines that check for readable event on the socket : /* readable? */ if (!skb_queue_empty(&sk->sk_receive_queue) || (sk->sk_shutdown & RCV_SHUTDOWN)) mask |= POLLIN | POLLRDNORM; The event is set if the socket input buffer is not empty. 
On FreeBSD it is the same thing : sys/kern/uipc_socket.c in sopoll_generic() : if (events & (POLLIN | POLLRDNORM)) if (soreadable(so)) revents |= events & (POLLIN | POLLRDNORM); sys/sys/socketvar.h : #define soreadable(so) \ ((so)->so_rcv.sb_cc >= (so)->so_rcv.sb_lowat || \ ((so)->so_rcv.sb_state & SBS_CANTRCVMORE) || \ !TAILQ_EMPTY(&(so)->so_comp) || (so)->so_error) -- Aurélien Nephtali From v_carvalho_7 at hotmail.com Wed Apr 15 03:03:40 2009 From: v_carvalho_7 at hotmail.com (Vitor Carvalho) Date: Wed, 15 Apr 2009 10:03:40 +0000 Subject: [Live-devel] live streaming,DM355 Message-ID: Hi, I'm using a DM355 Davinci with Live555MediaServer to do live streaming. I would like to capture video from a camera connected to the Davinci and stream it to a PC with VLC. I need to do this in real time, without creating a video file. Can anyone help me here? Maybe some code will help. Thanks -------------- next part -------------- An HTML attachment was scrubbed... URL: From Alan.Roberts at e2v.com Wed Apr 15 03:07:59 2009 From: Alan.Roberts at e2v.com (Roberts, Alan) Date: Wed, 15 Apr 2009 11:07:59 +0100 Subject: [Live-devel] Can live555 client reconnect the server automatically after several minutes? Message-ID: <8821BCD7410B064BA4414E4F8080E5EB043A2D63@whl46.e2v.com> I bet 30 seconds is the threshold at the moment. I'm interested in finding out more about this bit of code too - if anyone can point us in the right direction it would be much appreciated. best regards Alan -----Original Message----- From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com]On Behalf Of frank.su.cn at gmail.com Sent: 15 April 2009 10:58 To: live-devel at ns.live555.com Subject: [Live-devel] Can live555 client reconnect the server automatically after several minutes? 
Hi all: I am writing a media player client with live555. I want to implement it somewhat like a long-term monitoring system. That is, the client should connect to the server, receive the stream, and play the program; when a connection loss happens, the client should keep trying to reconnect until the connection has been re-established. In reality, I have made some experiments: while the player was playing the program, I pulled the RJ45 plug out and plugged it back in several seconds or minutes later. As a result, I found that the client can reconnect to the server after a delay of just several seconds, but fails to reconnect if I plug the RJ45 back in several minutes later. My question is: can I make the client reconnect to the server automatically after several minutes, or even longer, using live555? best regards!! frank ______________________________________________________________________ This email has been scanned by the MessageLabs Email Security System. For more information please visit http://www.messagelabs.com/email ______________________________________________________________________ Sent by a member of the e2v group of companies. The parent company, e2v technologies plc, is registered in England and Wales. Company number; 04439718. Registered address; 106 Waterhouse Lane, Chelmsford, Essex, CM1 2QU, UK. This email and any attachments are confidential and meant solely for the use of the intended recipient. If you are not the intended recipient and have received this email in error, please notify us immediately by replying to the sender and then deleting this copy and the reply from your system without further disclosing, copying, distributing or using the e-mail or any attachment. Thank you for your cooperation. ______________________________________________________ ________________ This email has been scanned by the MessageLabs Email Security System. 
For more information please visit http://www.messagelabs.com/email ______________________________________________________________________ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Apr 15 07:34:40 2009 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 15 Apr 2009 07:34:40 -0700 Subject: [Live-devel] Can live555 client reconnect the server automatically after several minutes? In-Reply-To: References: Message-ID: >i am writing a media player client with live555. i want to >implement it just like a long term monitor system somwhow. that is, >the client should connect the server , receive the stream, play the >program, when the connection loss happen, the client can reconnect >the server until the connection has been re-established. > >in reality, i have made some experiments: when the player is playing >the program, i pulled the RJ45 modular plug out , and plug it in >several seconds or minutes later. as a result, i found that the >client can reconnect the server if just several seconds delay, but >failed to reconnect the server if i plug the RJ45 in several minutes >later. Each "RTSPClient" object encapsulates a *single* TCP connection only. Therefore, to ensure that you can reconnect to the server (even if the previous TCP connection has closed), you should use a *different* "RTSPClient" object each time you connect to the server. -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ From finlayson at live555.com Wed Apr 15 07:45:35 2009 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 15 Apr 2009 07:45:35 -0700 Subject: [Live-devel] Correct use of 'select' to avoid packet loss in Linux+BSD; correct use of WSAGetLastError and codes In-Reply-To: <5334c8b0904150303i1f4b9709o83dfe2fe3575d8dd@mail.gmail.com> References: <5334c8b0904150303i1f4b9709o83dfe2fe3575d8dd@mail.gmail.com> Message-ID: >On Thu, Apr 9, 2009 at 9:52 AM, Ross Finlayson wrote: >> Thanks for the suggestion; I like this. >> >> Unless anyone sees a problem with this, I will include it in the next >> release of the software. > >Hello, > >On Linux, select() is implemented above poll(). What do you mean by this? > In net/core/datagram.c >(linux 2.6.26 sources), in datagram_poll() you can see the lines that >check for readable event on the socket : > > /* readable? */ > if (!skb_queue_empty(&sk->sk_receive_queue) || > (sk->sk_shutdown & RCV_SHUTDOWN)) > mask |= POLLIN | POLLRDNORM; > >The event is set if the socket input buffer is not empty. So what is your conclusion? Are you implying that we should (on Unix systems) be using "poll()" instead of "select()", and that if we use "poll()", we won't need the optimization that Bryan Moore proposed? Or are you implying that we don't need this optimization even if we use "select()"?? -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ From aurelien at sitadelle.com Wed Apr 15 09:12:13 2009 From: aurelien at sitadelle.com (=?ISO-8859-1?Q?Aur=E9lien_Nephtali?=) Date: Wed, 15 Apr 2009 18:12:13 +0200 Subject: Re: [Live-devel] Correct use of 'select' to avoid packet loss in Linux+BSD; correct use of WSAGetLastError and codes In-Reply-To: References: <5334c8b0904150303i1f4b9709o83dfe2fe3575d8dd@mail.gmail.com> Message-ID: <5334c8b0904150912l6a2e0c0aj61fe45c20f636a52@mail.gmail.com> On Wed, Apr 15, 2009 at 4:45 PM, Ross Finlayson wrote: >> On Thu, Apr 9, 2009 at 9:52 AM, Ross Finlayson >> wrote: >>> >>> Thanks for the suggestion; I like this. >>> >>> Unless anyone sees a problem with this, I will include it in the next >>> release of the software. >> >> Hello, >> >> On Linux, select() is implemented above poll(). > > What do you mean by this? > I mean that on Linux (and at least FreeBSD), select() is just a wrapper around poll(). > >> In net/core/datagram.c >> (linux 2.6.26 sources), in datagram_poll() you can see the lines that >> check for readable event on the socket : >> >> /* readable? */ >> if (!skb_queue_empty(&sk->sk_receive_queue) || >> (sk->sk_shutdown & RCV_SHUTDOWN)) >> mask |= POLLIN | POLLRDNORM; >> >> The event is set if the socket input buffer is not empty. > > So what is your conclusion? Are you implying that we should (on Unix > systems) be using "poll()" instead of "select()", and that if we use > "poll()", we won't need the optimization that Bryan Moore proposed? Or are > you implying that we don't need this optimization even if we use > "select()"?? I always use poll() on Linux because select() is just a wrapper around it and because poll() does not have a limit on the number of sockets to watch. The only problem is that poll() is not available on Windows. In conclusion you should use poll() for Unix and select() for Windows. 
The FIONREAD method is useless with poll()/select() under Unix because there is no internal flag that says the socket has events; the socket status is re-checked at each call. I don't know how select() is implemented on Windows. -- Aurélien Nephtali From Alan.Roberts at e2v.com Wed Apr 15 07:43:27 2009 From: Alan.Roberts at e2v.com (Roberts, Alan) Date: Wed, 15 Apr 2009 15:43:27 +0100 Subject: [Live-devel] openRTSP, 30 seconds, Kill, teardown, network, interPacketGapMaxTime Message-ID: <8821BCD7410B064BA4414E4F8080E5EB043A2D69@whl46.e2v.com> Hi Ross I appreciate that you'll probably just want to copy this onto the forum but... I'm up against it. I'm struggling and just need to know if there's any light at the end of the tunnel in trying to get my system to work! I'm so close but there's a black hole between where I'm "at" and "finishing" my project! I'm trying to connect my new Linux "streaming video recorder" embedded processor card's ethernet port to our existing camera (not allowed to change software on the camera) which has streaming video over ethernet coming out of it. My problem is that the ethernet port gets "strangled" inside the camera at the same time as the one-and-only control signal (record start/stop) gets de-asserted (record stop). The Linux card detects when this signal gets de-asserted and I issue a kill -HUP to kill the current openRTSP session. This works brilliantly... but only provided I leave the system for more than 30 seconds before removing the power! There is obviously some sort of "network outage handler thing" going on... I need to reduce the 30 seconds down to say 1 second... this will be fine as it's literally an ad-hoc point-to-point link that's about 12cm long. It takes 3 seconds of holding down the power button before power is actually removed so I've got a bit of slack time. 
If I remove the power before 30 seconds (after having tried to kill the openRTSP session) then the resulting file has duration 0 when I come to play it back with VLC. There's probably something that's not getting written to the mp4 file - it has roughly the correct physical size (Mbytes vs record time) but it just won't play. I suspect that the openRTSP hasn't actually been killed successfully. I'm going to write some test code to look into this but my gut feeling is that "something" is keeping openRTSP "on ice" (can't kill it) until either the network has "come back" or a 30 second timeout has "timed out". I've seen a few postings recently that "sort of" relate to this... pulling out RJ45s, timeouts, re-connecting, blocking/non-blocking etc etc. The one that interests me is when you replied saying that you plan to revamp rtspClient to make it "non-blocking" some time in the future. I've no idea how to do it myself so am really interested in finding out when this may become available? It's my only hope at the moment unless you can point me in another direction? I'm just using openRTSP and haven't tweaked any of your code... I effectively just call it from the command line. "-D 2" doesn't seem to answer my problem even though, in principle, it sounded like "the" answer. Is it implemented? I hope you can help me. Apart from this last hitch, I've really enjoyed this project. It would be great to finish it though! Kind regards Alan ____________________________________________________ Alan Roberts Senior Engineer ISIS Business Unit Tel: 01245 493493 x3876 Email: alan.roberts at e2v.com e2v, 106 Waterhouse Lane, Chelmsford, Essex. CM1 2QU ____________________________________________________ Sent by a member of the e2v group of companies. The parent company, e2v technologies plc, is registered in England and Wales. Company number; 04439718. Registered address; 106 Waterhouse Lane, Chelmsford, Essex, CM1 2QU, UK. 
This email and any attachments are confidential and meant solely for the use of the intended recipient. If you are not the intended recipient and have received this email in error, please notify us immediately by replying to the sender and then deleting this copy and the reply from your system without further disclosing, copying, distributing or using the e-mail or any attachment. Thank you for your cooperation. ______________________________________________________________________ This email has been scanned by the MessageLabs Email Security System. For more information please visit http://www.messagelabs.com/email ______________________________________________________________________ -------------- next part -------------- An HTML attachment was scrubbed... URL: From kidjan at gmail.com Wed Apr 15 09:51:50 2009 From: kidjan at gmail.com (Jeremy Noring) Date: Wed, 15 Apr 2009 09:51:50 -0700 Subject: [Live-devel] live streaming,DM355 In-Reply-To: References: Message-ID: On Wed, Apr 15, 2009 at 3:03 AM, Vitor Carvalho wrote: > I'm using a DM355 Davinci with > Live555MediaServer to do live streaming. I would like to capture video from > a > camera connected to the Davinci and stream it to a PC with VLC. I need to do > this > in real time, without the video file creation. Can anyone help me here? > Maybe > some code will help. Thanks > What is your video format? (MPEG4, H.264, MJPEG, etc.) And I'd suggest reading the FAQ carefully, and in particular, http://www.live555.com/liveMedia/faq.html#liveInput -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From stas at tech-mer.com Thu Apr 16 00:41:53 2009 From: stas at tech-mer.com (Stas Desyatnlkov) Date: Thu, 16 Apr 2009 10:41:53 +0300 Subject: [Live-devel] Correct use of 'select' to avoid packet loss in Linux+BSD; correct use of WSAGetLastError and codes In-Reply-To: <5334c8b0904150912l6a2e0c0aj61fe45c20f636a52@mail.gmail.com> References: <5334c8b0904150303i1f4b9709o83dfe2fe3575d8dd@mail.gmail.com> <5334c8b0904150912l6a2e0c0aj61fe45c20f636a52@mail.gmail.com> Message-ID: <21E398286732DC49AD45BE8C7BE96C07357C055AE4@fs11.mertree.mer.co.il> Select is implemented differently on Windows. But on both systems it consumes too much of the CPU. I tried using live555 for the reception of 100 audio streams (g.711). It consumed about 10% CPU on my 3 GHz machine. I ended up rewriting it with a simple set of asynchronous sockets, and the CPU consumption fell to 1% at peak levels on both Linux and Windows. -----Original Message----- From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Aurélien Nephtali Sent: Wednesday, April 15, 2009 7:12 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Correct use of 'select' to avoid packet loss in Linux+BSD; correct use of WSAGetLastError and codes On Wed, Apr 15, 2009 at 4:45 PM, Ross Finlayson wrote:
>> On Thu, Apr 9, 2009 at 9:52 AM, Ross Finlayson wrote:
>>>
>>> Thanks for the suggestion; I like this.
>>>
>>> Unless anyone sees a problem with this, I will include it in the next
>>> release of the software.
>>
>> Hello,
>>
>> On Linux, select() is implemented above poll().
>
> What do you mean by this?
>
I mean that on Linux (and at least FreeBSD), select() is just a wrapper around poll().
>
>> In net/core/datagram.c
>> (linux 2.6.26 sources), in datagram_poll() you can see the lines that
>> check for readable events on the socket:
>>
>>        /* readable? */
>>        if (!skb_queue_empty(&sk->sk_receive_queue) ||
>>            (sk->sk_shutdown & RCV_SHUTDOWN))
>>                mask |= POLLIN | POLLRDNORM;
>>
>> The event is set if the socket input buffer is not empty.
>
> So what is your conclusion? Are you implying that we should (on Unix
> systems) be using "poll()" instead of "select()", and that if we use
> "poll()", we won't need the optimization that Bryan Moore proposed? Or are
> you implying that we don't need this optimization even if we use
> "select()"?
I always use poll() on Linux because select() is just a wrapper around it and because poll() does not have a limit on the number of sockets to watch. The only problem is that it is not available on Windows. In conclusion you should use poll() for Unix and select() for Windows. The FIONREAD method is useless with poll()/select() under Unix because there is no internal flag indicating that the socket has events; the socket status is re-checked at each call. I don't know how select() is implemented on Windows. -- Aurélien Nephtali _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel From v_carvalho_7 at hotmail.com Thu Apr 16 09:21:43 2009 From: v_carvalho_7 at hotmail.com (Vitor Carvalho) Date: Thu, 16 Apr 2009 16:21:43 +0000 Subject: [Live-devel] live streaming,DM355 In-Reply-To: References: Message-ID: The video format is MPEG4. And live555 seems not to support this format, only m4e. If I try to change the file extension from mpeg4 to m4e, it doesn't play. Any idea how to fix this? Thanks Date: Wed, 15 Apr 2009 09:51:50 -0700 From: kidjan at gmail.com To: live-devel at ns.live555.com Subject: Re: [Live-devel] live streaming,DM355 On Wed, Apr 15, 2009 at 3:03 AM, Vitor Carvalho wrote: I'm using a DM355 Davinci with Live555MediaServer to do live streaming. I would like to capture video from a camera connected to the Davinci and stream it to a PC with VLC. I need to do this in real time, without the video file creation. Can anyone help me here? 
Maybe some code will help. Thanks What is your video format? (MPEG4, H.264, MJPEG, etc.) And I'd suggest reading the FAQ carefully, and in particular, http://www.live555.com/liveMedia/faq.html#liveInput -------------- next part -------------- An HTML attachment was scrubbed... URL: From kidjan at gmail.com Thu Apr 16 13:02:23 2009 From: kidjan at gmail.com (Jeremy Noring) Date: Thu, 16 Apr 2009 13:02:23 -0700 Subject: [Live-devel] live streaming,DM355 In-Reply-To: References: Message-ID: On Thu, Apr 16, 2009 at 9:21 AM, Vitor Carvalho wrote: > The video format is MPEG4. And live555 seems not to support this format, only > m4e. If I try to change the file extension from mpeg4 to m4e, it doesn't > play. Any idea how to fix this? Thanks > > I'd look at the wis-streamer application. And you're going to need to implement some of the Live555 classes to wrap your MPEG4 video source, however that's exposed. -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Apr 16 14:55:28 2009 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 16 Apr 2009 14:55:28 -0700 Subject: [Live-devel] live streaming,DM355 In-Reply-To: References: Message-ID: >On Thu, Apr 16, 2009 at 9:21 AM, Vitor Carvalho ><v_carvalho_7 at hotmail.com> wrote: > >The video format is MPEG4. And live555 seems not to support this >format, only m4e. If I try to change the file extension from mpeg4 to >m4e, it doesn't play. Any idea how to fix this? Thanks > > > >I'd look at the wis-streamer application. No, that's not going to help. If his file format really is "MPEG-4" - i.e., ".mp4", then sorry, we don't support that. If, however, his file format is MPEG-4 Video *Elementary Stream* data, then we do support it (if the file is given the filename extension ".m4e"). -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From kidjan at gmail.com Thu Apr 16 15:48:06 2009 From: kidjan at gmail.com (Jeremy Noring) Date: Thu, 16 Apr 2009 15:48:06 -0700 Subject: [Live-devel] live streaming,DM355 In-Reply-To: References: Message-ID: On Thu, Apr 16, 2009 at 2:55 PM, Ross Finlayson wrote: > On Thu, Apr 16, 2009 at 9:21 AM, Vitor Carvalho > wrote: > > The video format is MPEG4. And live555 seems not support this format,only > m4e. If a try to change the file extension from mpeg4 to m4e,it doesn't > play.Any ideia to fix this? Thanks > > > > I'd look at the wis-streamer application. > > > No, that's not going to help. If his file format really is "MPEG-4" - > i.e., ".mp4", then sorry, we don't support that. If, however, his file > format is MPEG-4 Video *Elementary Stream* data, then we do support it (if > the file is given the filename extension ".m4e") > Yes, I think it will help, because I've done this exact thing before. If you look at his first post, he's interested in streaming live video (not video from a file, although admittedly it isn't totally clear from the OP's description, and even less clear why he's now attempting to stream from a file) from a TI DM355. Having worked with the DM355 before, wis-streamer is what we used, and it worked great. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From amadorim at vdavda.com Fri Apr 17 00:04:13 2009 From: amadorim at vdavda.com (Marco Amadori) Date: Fri, 17 Apr 2009 09:04:13 +0200 Subject: [Live-devel] Correct use of 'select' to avoid packet loss in Linux+BSD; correct use of WSAGetLastError and codes In-Reply-To: <21E398286732DC49AD45BE8C7BE96C07357C055AE4@fs11.mertree.mer.co.il> References: <5334c8b0904150912l6a2e0c0aj61fe45c20f636a52@mail.gmail.com> <21E398286732DC49AD45BE8C7BE96C07357C055AE4@fs11.mertree.mer.co.il> Message-ID: <200904170904.13908.amadorim@vdavda.com> On Thursday 16 April 2009, 09:41:53, Stas Desyatnlkov wrote: > [...] But on both systems it > consumes too much of the CPU. I tried using live555 for the reception of > 100 audio streams (g.711). It consumed about 10% CPU on my 3 GHz machine. I > ended up rewriting it with a simple set of asynchronous sockets, and the CPU > consumption fell to 1% at peak levels on both Linux and Windows. Would you please also share some of your code? If the performance improvements are so radical, why not try to have those improvements merged back into the official live555 releases? -- ESC:wq -- This message has been scanned for viruses and dangerous content by MailScanner, and is believed to be clean. From peter.gejgus at gmail.com Thu Apr 16 23:12:08 2009 From: peter.gejgus at gmail.com (Peter Gejgus) Date: Fri, 17 Apr 2009 08:12:08 +0200 Subject: [Live-devel] problem with megapixel images in RTSPClient Message-ID: Hello, I'm new to live555. We made an MJPEG streaming server. I tried to receive megapixel (1280x1024) images from the server using RTSPClient from the live555 library. It seems that RTSPClient has a problem processing such big images, because the media sink was not called. With VGA format (e.g. 640x480) there was no problem. The server was able to stream megapixel images, so the problem is on the client's side. I tried to increase the socket input buffer size, but the problem still remains. 
I read on the mailing list that RTSPClient is under reconstruction now. Is this problem related to this reconstruction? Peter -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Apr 17 00:52:37 2009 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 17 Apr 2009 00:52:37 -0700 Subject: [Live-devel] problem with megapixel images in RTSPClient In-Reply-To: References: Message-ID: >I'm new to live555. We made an MJPEG streaming server. I tried to >receive megapixel (1280x1024) images from the server using RTSPClient >from the live555 library. It seems that RTSPClient has a problem processing >such big images, because the media sink was not called. With VGA >format (e.g. 640x480) there was no problem. Your problem might be packet loss. JPEG is a very poor codec for video streaming, because each frame is so large (and so takes up many network packets). If *any* of these network packets gets lost, then the whole frame must be discarded. >I read on the mailing list that RTSPClient is under >reconstruction now. Is this problem related to this reconstruction? No. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From v_carvalho_7 at hotmail.com Fri Apr 17 02:38:23 2009 From: v_carvalho_7 at hotmail.com (Vitor Carvalho) Date: Fri, 17 Apr 2009 09:38:23 +0000 Subject: [Live-devel] live streaming,DM355 In-Reply-To: References: Message-ID: I have already tried wis-streamer, but when I try to install the wis-go7007 driver I get an error and then wis-streamer doesn't work. Did you use the DM355 with MontaVista Pro 4.0.1? Date: Thu, 16 Apr 2009 15:48:06 -0700 From: kidjan at gmail.com To: live-devel at ns.live555.com Subject: Re: [Live-devel] live streaming,DM355 On Thu, Apr 16, 2009 at 2:55 PM, Ross Finlayson wrote: On Thu, Apr 16, 2009 at 9:21 AM, Vitor Carvalho wrote: The video format is MPEG4. And live555 seems not to support this format, only m4e. 
If I try to change the file extension from mpeg4 to m4e, it doesn't play. Any idea how to fix this? Thanks I'd look at the wis-streamer application. No, that's not going to help. If his file format really is "MPEG-4" - i.e., ".mp4", then sorry, we don't support that. If, however, his file format is MPEG-4 Video *Elementary Stream* data, then we do support it (if the file is given the filename extension ".m4e") Yes, I think it will help, because I've done this exact thing before. If you look at his first post, he's interested in streaming live video (not video from a file, although admittedly it isn't totally clear from the OP's description, and even less clear why he's now attempting to stream from a file) from a TI DM355. Having worked with the DM355 before, wis-streamer is what we used, and it worked great. -------------- next part -------------- An HTML attachment was scrubbed... URL: From kidjan at gmail.com Fri Apr 17 11:52:23 2009 From: kidjan at gmail.com (Jeremy Noring) Date: Fri, 17 Apr 2009 11:52:23 -0700 Subject: [Live-devel] live streaming,DM355 In-Reply-To: References: Message-ID: On Fri, Apr 17, 2009 at 2:38 AM, Vitor Carvalho wrote: > I have already tried wis-streamer, but when I try to install the wis-go7007 > driver I get an error and then wis-streamer doesn't work. Did you use > the DM355 with MontaVista Pro 4.0.1? > We didn't use the driver. We wrote our own MPEG4 framer and subsession, and modified wis-streamer to get video using that. -------------- next part -------------- An HTML attachment was scrubbed... URL: From phmichels at gmail.com Sun Apr 19 10:32:35 2009 From: phmichels at gmail.com (Paulo Michels) Date: Sun, 19 Apr 2009 14:32:35 -0300 Subject: [Live-devel] openRTSP and H.264 Message-ID: Hi guys, I've been trying to record an H.264 stream using openRTSP. 
To be more specific, I'm trying to record the AmericaFree.TV stream (http://www.americafree.tv/VLC/ ). I'm just running: openRTSP rtsp://video3.americafree.tv/AFTVAdventureH264250.sdp . Looking at the log, everything seems to be fine: the DESCRIBE, SETUP and PLAY commands are sent and replied to by the server. The dump audio and video files are created, but both are empty. I can successfully see the stream using VLC and QuickTime though. Looking at the mailing list history I can see people having problems streaming (server-side) H.264 video, but it's not clear to me whether openRTSP supports receiving this type of stream. Thanks in advance! Paulo -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sun Apr 19 10:45:14 2009 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 19 Apr 2009 10:45:14 -0700 Subject: [Live-devel] openRTSP and H.264 In-Reply-To: References: Message-ID: >I'm just running: >openRTSP rtsp://video3.americafree.tv/AFTVAdventureH264250.sdp. >Looking at the log, everything seems to be fine: the DESCRIBE, SETUP and >PLAY commands are sent and replied to by the server. The dump audio and >video files are created, but both are empty. Read the FAQ! -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From phmichels at gmail.com Sun Apr 19 13:43:02 2009 From: phmichels at gmail.com (Paulo Michels) Date: Sun, 19 Apr 2009 17:43:02 -0300 Subject: [Live-devel] openRTSP and H.264 In-Reply-To: References: Message-ID: <7444A334-60BA-40EC-8503-AEDE605C9AE4@gmail.com> Sorry, I didn't see the "-t" option before. Now I was able to get stream data packets, but I still have problems playing the video file. I'm running this: openRTSP -t -4 -d 20 -w 320 -h 240 rtsp://video3.americafree.tv/AFTVAdventureH264250.sdp > video.mp4 As you can see, I'm defining a duration limit (20 seconds) and explicitly defining the image size. 
The output video has two problems: 1. The first video frame is a green frame. I would like to get the actual video in the very first frame, is it possible? 2. The video seems to be in slow motion and the total video duration is not 20 seconds as defined, but more than 40 seconds. The audio is perfect and 20 seconds long, though. Then I tried to use the -y option, to synchronize video and audio, but the video gets really bad and the audio is just noise. I also tried to capture the stream from QT Broadcaster, but still get the same problems. Any idea about how to get better video? Thanks! Paulo On Apr 19, 2009, at 2:45 PM, Ross Finlayson wrote: >> I'm just running: openRTSP rtsp://video3.americafree.tv/AFTVAdventureH264250.sdp >> . Looking at the log, everything seems to be fine: the DESCRIBE, SETUP >> and PLAY commands are sent and replied to by the server. The dump >> audio and video files are created, but both are empty. > > Read the FAQ! > -- > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sun Apr 19 22:58:39 2009 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 19 Apr 2009 22:58:39 -0700 Subject: [Live-devel] openRTSP and H.264 In-Reply-To: <7444A334-60BA-40EC-8503-AEDE605C9AE4@gmail.com> References: <7444A334-60BA-40EC-8503-AEDE605C9AE4@gmail.com> Message-ID: >Sorry, I didn't see the "-t" option before. Now I was able to get >stream data packets, but I still have problems playing the video >file. I'm running this: > >openRTSP -t -4 -d 20 -w 320 -h 240 >rtsp://video3.americafree.tv/AFTVAdventureH264250.sdp > >video.mp4 > >As you can see, I'm defining a duration limit (20 seconds) >and explicitly defining the image size. 
The first video frame is a green frame. I would like to get the >actual video in the very first frame, is it possible? Sorry, but I don't know what's causing this. >2. The video seems to be in slow motion and the total video duration >is not 20 seconds as defined, but more than 40 seconds. The audio is >perfect and 20 seconds long though The problem is that you're not using the "-f <frame-rate>" option. You *must* specify the correct video frame rate if you want to have any chance of your recorded video playing properly. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From stas at tech-mer.com Mon Apr 20 00:21:13 2009 From: stas at tech-mer.com (Stas Desyatnlkov) Date: Mon, 20 Apr 2009 10:21:13 +0300 Subject: [Live-devel] Correct use of 'select' to avoid packet loss in Linux+BSD; correct use of WSAGetLastError and codes In-Reply-To: <200904170904.13908.amadorim@vdavda.com> References: <5334c8b0904150912l6a2e0c0aj61fe45c20f636a52@mail.gmail.com> <21E398286732DC49AD45BE8C7BE96C07357C055AE4@fs11.mertree.mer.co.il> <200904170904.13908.amadorim@vdavda.com> Message-ID: <21E398286732DC49AD45BE8C7BE96C07357D8F7EEE@fs11.mertree.mer.co.il> I can't share the code as it is part of a classified project. But it's all really simple: 1) Create N async sockets. 2) Run a thread that polls the sockets once every, say, 50 ms. 3) Read all data available from all the sockets. The live555 lib is designed to be OS-independent, so running a thread is not something it should do. -----Original Message----- From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Marco Amadori Sent: Friday, April 17, 2009 10:04 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Correct use of 'select' to avoid packet loss in Linux+BSD; correct use of WSAGetLastError and codes On Thursday 16 April 2009, 09:41:53, Stas Desyatnlkov wrote: > [...] 
But on both systems it > consumes too much of the CPU. I tried using live555 for the reception of > 100 audio streams (g.711). It consumed about 10% CPU on my 3 GHz machine. I > ended up rewriting it with a simple set of asynchronous sockets, and the CPU > consumption fell to 1% at peak levels on both Linux and Windows. Would you please also share some of your code? If the performance improvements are so radical, why not try to have those improvements merged back into the official live555 releases? -- ESC:wq -- This message has been scanned for viruses and dangerous content by MailScanner, and is believed to be clean. _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel From tadmorm1 at post.tau.ac.il Mon Apr 20 02:13:37 2009 From: tadmorm1 at post.tau.ac.il (tadmorm1 at post.tau.ac.il) Date: Mon, 20 Apr 2009 12:13:37 +0300 Subject: [Live-devel] Receiving an H.264 Stream Message-ID: <20090420121337.140118zixu1kgm9t@webmail.tau.ac.il> Hi, I'm new to the live555 media server and from what I understand it supports receiving an H.264 file using RTP/UDP. I saw that the media server has several receiver testProgs, like "testMPEG1or2VideoReceiver". My question is: what do I need to do in order to write an H.264 receiver? Do I need to implement several classes, or only make a few changes to an existing test file like the MPEG video receiver? Thanks, Michael. 
From finlayson at live555.com Mon Apr 20 02:28:57 2009 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 20 Apr 2009 02:28:57 -0700 Subject: [Live-devel] Receiving an H.264 Stream In-Reply-To: <20090420121337.140118zixu1kgm9t@webmail.tau.ac.il> References: <20090420121337.140118zixu1kgm9t@webmail.tau.ac.il> Message-ID: >I'm new to the live555 media server and from what I understand it >supports receiving an H.264 file using RTP/UDP. I saw that the media >server has several receiver testProgs, like >"testMPEG1or2VideoReceiver". My question is: what do I need to do in >order to write an H.264 receiver? Do I need to implement several >classes, or only make a few changes to an existing test file like >the MPEG video receiver? You don't need to write a new application to receive an H.264 stream. Just use "openRTSP": http://www.live555.com/openRTSP -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Mon Apr 20 15:54:59 2009 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 20 Apr 2009 15:54:59 -0700 Subject: [Live-devel] New version (2009.04.20) released - fixes Windows "errno" problem Message-ID: Recently some people have been having problems with the use of "errno" in the code, when building for/running on Windows. The real problem here was that 1/ The code - in a few places - was incorrectly using "errno" instead of calling "UsageEnvironment::getErrno()", and 2/ The Windows implementation of "BasicUsageEnvironment::getErrno()" was testing the "errno" variable, instead of always just calling "WSAGetLastError()". I have installed a new version (2009.04.20) of the "LIVE555 Streaming Media" code that fixes this problem. (If anyone finds any problems with this release, please let us know (on this mailing list) ASAP.) -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ From vpirson at tcomlife.be Mon Apr 20 03:57:27 2009 From: vpirson at tcomlife.be (Vincent Pirson) Date: Mon, 20 Apr 2009 12:57:27 +0200 Subject: [Live-devel] Trick-play on MPEG4 streams Message-ID: Dear Sir, 1/can you please let me know when you plan to support trick play on MPEG4 and possibly WM9 streams ? 2/Do you plan some pre-encryptor in future to deliver VOD assets ? http://www.live555.com/mediaServer/#trick-play 'Trick play' functionality The server supports RTSP 'trick play' operations for some, but not all, media types: * Pausing: All media types * Seeking: MPEG Transport Streams; WAV (PCM) audio; MPEG-1 or 2 audio; MPEG-1 or 2 Program Streams (partially working) * Fast forward: MPEG Transport Streams; WAV (PCM) audio; MPEG-1 or 2 audio * Reverse play: MPEG Transport Streams; WAV (PCM) audio Trick play support for additional media types will be added in the future. ________________________________________ Vincent Pirson Cell: +32.478.24.03.69 T'com-Life SPRL 38A rue de win?e Skype Pseudo: vpirson B-5310 Leuze/Eghez?e Belgium P avant d'imprimer, pensez ? l'environnement -------------- next part -------------- An HTML attachment was scrubbed... URL: From phmichels at gmail.com Mon Apr 20 07:54:46 2009 From: phmichels at gmail.com (Paulo Michels) Date: Mon, 20 Apr 2009 11:54:46 -0300 Subject: [Live-devel] openRTSP and H.264 In-Reply-To: References: <7444A334-60BA-40EC-8503-AEDE605C9AE4@gmail.com> Message-ID: <0DB0BC42-5DAB-44C4-A2CE-5900186E69DB@gmail.com> Ross, How can I use the "-f" option and specify the frame rate if I don't know the server settings? Is there any way to get this info from the server? Besides that, when streaming camera feeds, I believe the frame rate is not constant. Paulo On Apr 20, 2009, at 2:58 AM, Ross Finlayson wrote: >> Sorry, I didn't saw the "-t" option before. Now I was able to get >> stream data packets, but I still have problems to play the video >> file. 
I'm running this: >> >> openRTSP -t -4 -d 20 -w 320 -h 240 rtsp://video3.americafree.tv/AFTVAdventureH264250.sdp >> > video.mp4 >> >> As you can see, I'm defining a duration limit (20 seconds) and >> explicitly defining the image size. The output video has two >> problems: >> >> 1. The first video frame is a green frame. I would like to get the >> actual video in the very first frame, is it possible? > > Sorry, but I don't know what's causing this. >> 2. The video seems to be in slow motion and the total video >> duration is not 20 seconds as defined, but more than 40 seconds. The >> audio is perfect and 20 seconds long though > > The problem is that you're not using the "-f <frame-rate>" option. > You *must* specify the correct video frame rate if you want to have > any chance of your recorded video playing properly. > > -- > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From gbonneau at miranda.com Mon Apr 20 10:02:36 2009 From: gbonneau at miranda.com (BONNEAU Guy) Date: Mon, 20 Apr 2009 13:02:36 -0400 Subject: [Live-devel] computing uSecondsToGo Message-ID: <6353CA579307224BAFDE9495906E69160251A733@ca-ops-mail> I have had to add some fprintf messages in MultiFramedRTPSink::sendPacketIfNecessary() to follow the packet scheduling of the live555 library. I have found that the computed uSecondsToGo can become negative when a frame to be sent is segmented across many RTP packets. This happens in a call to send a follow-up packet of a frame when fNextSendTime.tv_sec == timeNow.tv_sec but fNextSendTime.tv_usec < timeNow.tv_usec. Fortunately the envir().taskScheduler().scheduleDelayedTask function tests for a negative delay and sets it to 0. 
However it is somewhat confusing when you try to debug and monitor the delay. I suggest a cosmetic change to the function void MultiFramedRTPSink::sendPacketIfNecessary() from:

if (fNextSendTime.tv_sec < timeNow.tv_sec) {
  uSecondsToGo = 0; // prevents integer underflow if too far behind
} else {
  uSecondsToGo = (fNextSendTime.tv_sec - timeNow.tv_sec)*1000000 + (fNextSendTime.tv_usec - timeNow.tv_usec);
}

to:

if ((fNextSendTime.tv_sec < timeNow.tv_sec) ||
    ((fNextSendTime.tv_sec == timeNow.tv_sec) && (fNextSendTime.tv_usec < timeNow.tv_usec))) {
  uSecondsToGo = 0; // prevents integer underflow if too far behind
} else {
  uSecondsToGo = (fNextSendTime.tv_sec - timeNow.tv_sec)*1000000 + (fNextSendTime.tv_usec - timeNow.tv_usec);
}

Regards Guy Bonneau -------------- next part -------------- An HTML attachment was scrubbed... URL: From jordillamaspons at gmail.com Mon Apr 20 10:37:07 2009 From: jordillamaspons at gmail.com (Jordi Llamas Pons) Date: Mon, 20 Apr 2009 19:37:07 +0200 Subject: [Live-devel] ServerMediaSession with multiple files Message-ID: Hi, I've been trying for days to make RTSPServer stream a video that is divided into 5 parts, in such a way that the client acts as if it were receiving the stream from a single video file. I've looked at ServerMediaSession.cpp/.h but I am pretty much lost. Thanks -------------- next part -------------- An HTML attachment was scrubbed... URL: From mrteddy at citromail.hu Mon Apr 20 19:00:41 2009 From: mrteddy at citromail.hu (Mr. Teddy) Date: Tue, 21 Apr 2009 04:00:41 +0200 Subject: [Live-devel] openRTSP TS receive as another program input Message-ID: <20090421020042.19462.qmail@server16.citromail.hu> Hello everybody, I have some questions. First of all I failed to use openRTSP. I used "testMPEG2TransportStreamer" to stream the "test.ts" file and I can receive it with VLC, but I can't with openRTSP. 
I started openRTSP from the command line with this: openRTSP "rtp://239.255.42.42/test.ts" or openRTSP "rtp://239.255.42.42:1234" but I get this error: "Failed to get a SDP description from URL..." Secondly, I have a project. In this project I have to write an application that receives an MPEG-2 TS over RTP and uses it as input. The second application (the one that wants the TS input) by default opens a TS file with fopen and stores the handle in "FILE* fp", then regularly fills a buffer from "fp" with the "fread" command (NumBytes = fread(Buf, 1, BUFSIZE, fp)). So basically I need to make a FIFO where openRTSP puts the data and fread reads from. Thanks a lot, Peter -------------- next part -------------- An HTML attachment was scrubbed... URL: From mowtschan at gmail.com Mon Apr 20 23:51:02 2009 From: mowtschan at gmail.com (Mowtschan) Date: Tue, 21 Apr 2009 08:51:02 +0200 Subject: [Live-devel] sdp support for live555 media server Message-ID: Can I use the live555 media server as a reflector (using sdp files)? For example: UDP Stream (h264) ---> Live555 Media Server ---> RTSP Stream From finlayson at live555.com Tue Apr 21 00:55:58 2009 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 21 Apr 2009 00:55:58 -0700 Subject: [Live-devel] computing uSecondsToGo In-Reply-To: <6353CA579307224BAFDE9495906E69160251A733@ca-ops-mail> References: <6353CA579307224BAFDE9495906E69160251A733@ca-ops-mail> Message-ID: >I suggest a cosmetic change to the function void >MultiFramedRTPSink::sendPacketIfNecessary() OK, this will be included in the next release of the software. -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ From finlayson at live555.com Tue Apr 21 01:47:19 2009 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 21 Apr 2009 01:47:19 -0700 Subject: [Live-devel] openRTSP and H.264 In-Reply-To: <0DB0BC42-5DAB-44C4-A2CE-5900186E69DB@gmail.com> References: <7444A334-60BA-40EC-8503-AEDE605C9AE4@gmail.com> <0DB0BC42-5DAB-44C4-A2CE-5900186E69DB@gmail.com> Message-ID: >How can I use the "-f" option and specify the frame rate if I don't >know the server settings? Is there any way to get this info from the >server? The only way I know to find the stream's frame rate is to play it using a media player application, and try to figure out the frame rate from the media player's 'get stream information' feature, or something... Sorry, but the ".mov" and ".mp4" file formats were poorly designed for this purpose (recording incoming streams), so this is about the best we can do if we want to record those file formats. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Tue Apr 21 01:49:08 2009 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 21 Apr 2009 01:49:08 -0700 Subject: [Live-devel] openRTSP TS receive as another program input In-Reply-To: <20090421020042.19462.qmail@server16.citromail.hu> References: <20090421020042.19462.qmail@server16.citromail.hu> Message-ID: >First of all I failed to use openRTSP. I used >"testMPEG2TransportStreamer" to stream the "test.ts" file and I can >receive it with VLC, but I can't with openRTSP. Did you modify the "testMPEG2TransportStreamer" code to enable its built-in RTSP server? You need to do this if you want to receive the stream using "openRTSP". A better solution, though, is to use "testOnDemandRTSPServer" as your server instead. -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ From finlayson at live555.com Tue Apr 21 01:54:39 2009 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 21 Apr 2009 01:54:39 -0700 Subject: [Live-devel] sdp support for live555 media server In-Reply-To: References: Message-ID: >can i use the live555 media server as a reflector (with using sdp files) ? >for example: UDP Stream (h264) ---> Live555 Media Server ---> RTSP Stream No, not at present. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Tue Apr 21 01:56:46 2009 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 21 Apr 2009 01:56:46 -0700 Subject: [Live-devel] ServerMediaSession with multiple files In-Reply-To: References: Message-ID: >I've been trying for days to make RTSPServer to stream a video that >is divided in 5 parts, >in a way that the client acts as if it was receiving the stream from >a single video file. >I've looked at ServerMediaSession.cpp/.h but I am pretty much lost. You can do this by writing a new "ServerMediaSubsession" (subclass) - similar to the original one - that uses the "ByteStreamMultiFileSource" class instead of "ByteStreamFileSource". -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Tue Apr 21 02:41:51 2009 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 21 Apr 2009 02:41:51 -0700 Subject: [Live-devel] Trick-play on MPEG4 streams In-Reply-To: References: Message-ID: >1/can you please let me know when you plan to support trick play on >MPEG4 and possibly WM9 streams ? No. > >2/Do you plan some pre-encryptor in future to deliver VOD assets ? No. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From mrteddy at citromail.hu Tue Apr 21 03:21:32 2009 From: mrteddy at citromail.hu (Mr. 
Teddy) Date: Tue, 21 Apr 2009 12:21:32 +0200 Subject: [Live-devel] openRTSP TS receive as an another program input In-Reply-To: Message-ID: <20090421102132.19473.qmail@server16.citromail.hu> Right, with "testOnDemandRTSPServer" works fine. So my project, openRTSP is a complex application, and I need only a code that receive .TS file over rtp. Can I modify example "testMPEG1or2VideoReceiver" to receive TS file, or I should Integrate the openRTSP application to my code?And Can I link openRTSP received data buffer to my application input? -- Eredeti ?zenet -- Felad?: Ross Finlayson <finlayson at live555.com> C?mzett: LIVE555 Streaming Media - development & use <live-devel at ns.live555.com> M?solat: Elk?ldve: 11:10 T?ma: Re: [Live-devel] openRTSP TS receive as an another program input>First of all I failed to use openRTSP. I used the >"testMPEG2TransportStreamer" to stream "test.ts" file and I can >receive it with VLC, but a I can't with openRTSP.Did you modifiy the "testMPEG2TranspoerStreamer'" code to enable it's built-in RTSP server? You need to do this if you want to receive the stream using "openRTSP".A better solution, though, is to use "testOnDemandRTSPServer" as your server instead.-- Ross FinlaysonLive Networks, Inc.http://www.live555.com/_______________________________________________live-devel mailing listlive-devel at lists.live555.comhttp://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From jordillamaspons at gmail.com Tue Apr 21 05:29:38 2009 From: jordillamaspons at gmail.com (Jordi Llamas Pons) Date: Tue, 21 Apr 2009 14:29:38 +0200 Subject: [Live-devel] ServerMediaSession with multiple files In-Reply-To: References: Message-ID: And how can I make the client to divide the file in multiple parts? I modified FileSink to make the parts based on a constant instead of the video size but I suppose it can be done much better. Thanks! 
2009/4/21 Ross Finlayson > I've been trying for days to make RTSPServer to stream a video that is >> divided in 5 parts, >> in a way that the client acts as if it was receiving the stream from a >> single video file. >> I've looked at ServerMediaSession.cpp/.h but I am pretty much lost. >> > > You can do this by writing a new "ServerMediaSubsession" (subclass) - > similar to the original one - that uses the "ByteStreamMultiFileSource" > class instead of "ByteStreamFileSource". > -- > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Apr 21 07:11:15 2009 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 21 Apr 2009 07:11:15 -0700 Subject: [Live-devel] openRTSP TS receive as an another program input In-Reply-To: <20090421102132.19473.qmail@server16.citromail.hu> References: <20090421102132.19473.qmail@server16.citromail.hu> Message-ID: >And Can I link openRTSP received data buffer to my application input? Yes. Use the "-v" option to cause "openRTSP" to write its output to 'stdout', and then pipe this to your application. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Tue Apr 21 07:17:03 2009 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 21 Apr 2009 07:17:03 -0700 Subject: [Live-devel] ServerMediaSession with multiple files In-Reply-To: References: Message-ID: >And how can I make the client to divide the file in multiple parts? Now this is just being silly. If you want to record the incoming stream back into separate files, then why combine the files (on the server) in the first place? 
Instead, leave the server the way it was, and just use "openRTSP" to request each stream separately, recording each one into a separate file. I.e., you could call "openRTSP" multiple times, from a script. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From stas at tech-mer.com Tue Apr 21 08:15:24 2009 From: stas at tech-mer.com (Stas Desyatnlkov) Date: Tue, 21 Apr 2009 18:15:24 +0300 Subject: [Live-devel] h.264 NAL frame emulation Message-ID: <21E398286732DC49AD45BE8C7BE96C07357D8F8016@fs11.mertree.mer.co.il> Hi All, I need to find a way to emulate NAL packets in order to send it in TS. My hardware encoder will produce NAL but as it won't be ready for a while, I'd like to write my streamer with some file as a source. I have a TS file with h.264 video and AAC audio. How do I get the NAL packets from it? Does anyone know of such a sample? I understand that there is always a case of reverse engineering the VLC but this would be my last resort. Regards, Stas -------------- next part -------------- An HTML attachment was scrubbed... URL: From kidjan at gmail.com Tue Apr 21 09:12:17 2009 From: kidjan at gmail.com (Jeremy Noring) Date: Tue, 21 Apr 2009 09:12:17 -0700 Subject: [Live-devel] Sink for live H264/openRTSP questions Message-ID: I see for H.264 streams, openRTSP defaults to the H264VideoFileSink, which is based on FileSink, which is based on MediaSink. I don't want to write the video out to a file; I want the video exposed as a live stream to the rest of my application. To me, it seems like I need to write my own "sink," but I'm not sure what class would be best to inherit from (MediaSink? Or all the way down to Medium?). My other question is more general; I see that a single RTSP server can have multiple sessions, and each session can be composed of multiple subsessions. So I'm wondering what the best (easiest?) way to structure my media streams would be. 
I'm going to have several H.264 streams, audio, MJPEG, and possibly MPEG4, and I'm wondering if each should get its own session, or if I should combine audio and video into the same session. Will I have AV sync issues if each stream is in its own session? Thanks in advance. -------------- next part -------------- An HTML attachment was scrubbed... URL: From jordillamaspons at gmail.com Tue Apr 21 10:26:13 2009 From: jordillamaspons at gmail.com (Jordi Llamas Pons) Date: Tue, 21 Apr 2009 19:26:13 +0200 Subject: [Live-devel] ServerMediaSession with multiple files In-Reply-To: References: Message-ID: Well, I did not explain myself before. I'm doing a client/server application. The client part will connect to a normal RTSP server and divide the video in parts. The server part will stream those parts with different priorities to various clients, but those clients don't need to know that the video is divided. Thanks again, and sorry for my english 2009/4/21 Ross Finlayson > And how can I make the client to divide the file in multiple parts? >> > > Now this is just being silly. > > If you want to record the incoming stream back into separate files, then > why combine the files (on the server) in the first place? > > Instead, leave the server the way it was, and just use "openRTSP" to > request each stream separately, recording each one into a separate file. > I.e., you could call "openRTSP" multiple times, from a script. > > -- > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Tue Apr 21 14:01:08 2009 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 21 Apr 2009 14:01:08 -0700 Subject: [Live-devel] Sink for live H264/openRTSP questions In-Reply-To: References: Message-ID: >I see for H.264 streams, openRTSP defaults to the H264VideoFileSink, >which is based on FileSink, which is based on MediaSink. > >I don't want to write the video out to a file; I want the video >exposed as a live stream to the rest of my application. To me, it >seems like I need to write my own "sink," but I'm not sure what >class would be best to inherit from (MediaSink? Yes. However, a simpler solution is to not modify "openRTSP" at all. Instead, use the "-v" option to cause "openRTSP" to write its output to 'stdout', and then pipe this to your application. >My other question is more general; I see that a single RTSP server >can have multiple sessions, and each session can be composed of >multiple subsessions. So I'm wondering what the best (easiest?) way >to structure my media streams would be. I'm going to have several >H.264 streams, audio, MJPEG, and possibly MPEG4, and I'm wondering >if each should get its own session, or if I should combine audio and >video into the same session. Will I have AV sync issues if each >stream is in its own session? If you want to stream audio and video together, then both "ServerMediaSubsession"s should be in a single "ServerMediaSession". A RTSP client then requests a single stream, which will contain both audio and video. However, if you have several video streams, then they should usually be in separate "ServerMediaSession"s (because a RTSP client will rarely want to receive more than one video stream at the same time). -- Ross Finlayson Live Networks, Inc. 
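Concretely, the single-session arrangement Ross describes looks roughly like this on the server side. This is a sketch only, following the pattern of testOnDemandRTSPServer; it assumes the liveMedia headers, and the exact createNew() argument lists should be checked against that program:

```cpp
// Sketch (not a complete program): one ServerMediaSession carrying both
// an audio and a video ServerMediaSubsession, so a client requesting the
// single "avTest" URL receives both, with RTCP-based A/V sync.
ServerMediaSession* sms = ServerMediaSession::createNew(
    *env, "avTest", "avTest", "MPEG-4 video with MP3 audio");
sms->addSubsession(MPEG4VideoFileServerMediaSubsession::createNew(
    *env, "test.m4e", reuseFirstSource));
sms->addSubsession(MP3AudioFileServerMediaSubsession::createNew(
    *env, "test.mp3", reuseFirstSource, False, NULL));
rtspServer->addServerMediaSession(sms);
// Separate video programs would instead each get their own
// ServerMediaSession (and therefore their own rtsp:// URL).
```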
http://www.live555.com/ From rpeck at rpeck.com Wed Apr 22 14:25:51 2009 From: rpeck at rpeck.com (Raymond Peck) Date: Wed, 22 Apr 2009 14:25:51 -0700 Subject: [Live-devel] h.264 NAL frame emulation In-Reply-To: <21E398286732DC49AD45BE8C7BE96C07357D8F8016@fs11.mertree.mer.co.il> References: <21E398286732DC49AD45BE8C7BE96C07357D8F8016@fs11.mertree.mer.co.il> Message-ID: <163a3a3b0904221425q58b0e689l664b577629cd2fc@mail.gmail.com> FLV files containing h.264/aac might be a very simple way for you to grab raw NALUs. The format is very simple, well documented, and has very little wrapper around the raw NALUs. Another alternative is to lightly hack ffmpeg, to write out the raw NAL data it gets from x264 as it is encoding. On Tue, Apr 21, 2009 at 8:15 AM, Stas Desyatnlkov wrote: > Hi All, > > I need to find a way to emulate NAL packets in order to send it in TS.? My > hardware encoder will produce NAL but as it won?t be ready for a while, I?d > like to write my streamer with some file as a source. > > I have a TS file with h.264 video and AAC audio. > How do I get the NAL packets from it? > Does anyone know of such a sample? I understand that there is always a case > of reverse engineering the VLC but this would be my last resort. > > Regards, > Stas > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > From finlayson at live555.com Wed Apr 22 16:15:16 2009 From: finlayson at live555.com (Corina Dubois) Date: Wed, 22 Apr 2009 18:15:16 -0500 Subject: [Live-devel] Classy, new and inexpensive watches Message-ID: <1941iib671617VKIRfinlayson@live555.com> Loving yourself is the first step in loving life. And what better way to do it, than by getting yourself a fine designer watch? http://www.vexuqoneq.cn Take advantage of Diam0nd Reps tremendous specials, and get yourself a superb designer watch imitation for just a couple of hundred bucks. 
Plus an extra 15 percent discount when you get two time pieces in the same purchase! http://www.vexuqoneq.cn So, what are you waiting for? Get that unique timepiece today at Diam0nd Reps! From v_carvalho_7 at hotmail.com Wed Apr 22 06:15:09 2009 From: v_carvalho_7 at hotmail.com (Vitor Carvalho) Date: Wed, 22 Apr 2009 14:15:09 +0100 Subject: [Live-devel] MPEG4 Encoder Headers Message-ID: Hello...i need some help.. That?s what i found: MPEG4 Encoder Headers Some streaming applications, such as live555 ( http://www.live555.com) require the following additional headers in the MPEG4 stream: Visual Object Sequence Start (VOSS) and the Visual Object Start (VOS) headers. VOSS = 000001B0 00000000; VOS = 000001B5 00000005 These headers are not present in the stream generated by the DM355 encoder because they are optional for elementary streams. (Note: these headers are present in a stream generated by the DM6446 or DM6437 MPEG4 Encoder) If an application requires these headers, the application must add them to the encoded stream in the following order: VOSS+VOS+EncodedStream. I?m using DM355 and trying to do video streaming to a Pc client with VLC. As you know, live555 doesn?t support MPEG4 streaming, just m4e. So, i would like to know how to implement this solution (add the VOSS and VOS headers).More specific, what i have to add and where?! Hope that somebody can help me..Thanks -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Apr 22 15:41:42 2009 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 22 Apr 2009 15:41:42 -0700 Subject: [Live-devel] Apologies for the spam that just got posted to the list In-Reply-To: <1941iib671617VKIRfinlayson@live555.com> References: <1941iib671617VKIRfinlayson@live555.com> Message-ID: Unfortunately a spammer posted a message to the list by impersonating my "From:" address. I've now turned on moderation for my own posts, to prevent this from happening again. 
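On Vitor's DM355 MPEG4 question above: the headers just need to be written in front of the encoded elementary stream, once, before the data is handed to the streaming code, in the order VOSS+VOS+EncodedStream. A sketch, with the byte values copied verbatim from the note quoted in that message (the helper names are made up):

```cpp
#include <cstdint>
#include <vector>

// The two optional headers, byte-for-byte as given in the quoted note:
// VOSS = 000001B0 00000000, VOS = 000001B5 00000005.
std::vector<uint8_t> mpeg4VossVosHeader() {
  return {0x00, 0x00, 0x01, 0xB0, 0x00, 0x00, 0x00, 0x00,   // VOSS
          0x00, 0x00, 0x01, 0xB5, 0x00, 0x00, 0x00, 0x05};  // VOS
}

// Prepend them to the encoder's output, giving VOSS+VOS+EncodedStream -
// the ordering the note requires before the stream reaches the streamer.
std::vector<uint8_t> addMpeg4Headers(const std::vector<uint8_t>& encoded) {
  std::vector<uint8_t> out = mpeg4VossVosHeader();
  out.insert(out.end(), encoded.begin(), encoded.end());
  return out;
}
```

Whether these particular byte values suit a given profile should be verified against the DM355 encoder documentation; the essential point is only the VOSS+VOS+EncodedStream ordering.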
-- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From belloni at imavis.com Thu Apr 23 01:36:28 2009 From: belloni at imavis.com (Cristiano Belloni) Date: Thu, 23 Apr 2009 10:36:28 +0200 Subject: [Live-devel] RR packets not sent by openRTSP in TCP mode Message-ID: <49F0288C.7080503@imavis.com> Hi All, Seems that openRTSP doesn't send Receiver Report RTCP packets when in TCP mode. I just successfully compiled the last liveMedia tarball (currently live.2009.04.20.tar.gz) and tested openRTSP with a bunch of RTSP servers (notably testOnDemandRTSPServer, just to be on the safe side), and started openRTSP with this command line: ./openRTSP -b 64000 -t -v -u [username] [password] rtsp://[address of rtsp resource] > mpeg4.dump Then, I monitored the connection with wireshark, and noticed that, while the server *does* send correctly its Sender Report rtcp packets, openRTSP *never* sends a single Receiver Report packet, causing the server to send a rtcp "bye" message after a while. Conversely, if I switch to udp mode (same command line, without the -t option), receiver reports are sent correctly throught udp. Could this be a bug? I used the last liveMedia package (though I can observe this behaviour in older builds, too) without modifications. Thanks and best regards, Cristiano Belloni. -- Belloni Cristiano Imavis Srl. www.imavis.com belloni at imavis.com From finlayson at live555.com Thu Apr 23 01:58:07 2009 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 23 Apr 2009 01:58:07 -0700 Subject: [Live-devel] RR packets not sent by openRTSP in TCP mode In-Reply-To: <49F0288C.7080503@imavis.com> References: <49F0288C.7080503@imavis.com> Message-ID: This is very strange - I can't reproduce this at all. When I run a server ("live555MediaServer") on a FreeBSD computer, and "openRTSP -t" on a Mac OS X computer, I see that "openRTSP" is sending - and the the server is receiving - the RTCP "SR" reports just fine. 
What OS are you using to run "openRTSP -t" (not that it should matter...)? -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From belloni at imavis.com Thu Apr 23 03:24:13 2009 From: belloni at imavis.com (Cristiano Belloni) Date: Thu, 23 Apr 2009 12:24:13 +0200 Subject: [Live-devel] RR packets not sent by openRTSP in TCP mode In-Reply-To: References: <49F0288C.7080503@imavis.com> Message-ID: <49F041CD.5060102@imavis.com> Ross Finlayson wrote: > This is very strange - I can't reproduce this at all. When I run a > server ("live555MediaServer") on a FreeBSD computer, and "openRTSP -t" > on a Mac OS X computer, I see that "openRTSP" is sending - and the > server is receiving - the RTCP "SR" reports just fine. > Why SR? openRTSP should send RRs (which it doesn't, at least not in TCP), and the server should send SRs (which it does). > What OS are you using to run "openRTSP -t" (not that it should > matter...)? Ubuntu Linux Intrepid, with 2.6.27-14 kernel. But I noticed this same behaviour also on a mingw-compiled Windows version of openRTSP. Here's a wireshark screenshot of a session. Packets are grouped by protocol, so all the RTCP packets that are sent are in the screenshot. You can notice that the server (located at .58) sends correct sender reports, while the client (located at .84) never gives back any RR. http://img216.imageshack.us/img216/9380/screenshotws.jpg [screenshot] -- Belloni Cristiano Imavis Srl. www.imavis.com belloni at imavis.com From rippeltippel at gmail.com Thu Apr 23 05:58:30 2009 From: rippeltippel at gmail.com (rippel tippel) Date: Thu, 23 Apr 2009 13:58:30 +0100 Subject: [Live-devel] RTCP timeout Message-ID: Hi All, I'm streaming a video using RTP over a wireless channel. If the server disappears, the client keeps sending RTCP report packets instead of checking if the server is still alive. 
Such behaviour is required by RFC 3550 (6.3.5 - Timing Out an SSRC) which states: "At occasional intervals, the participant MUST check to see if any of the other participants time out [...] A similar check is performed on the sender list". Is there any way to detect a RTCP timeout, in both the sender and the receiver? Thanks, R. From tadmorm1 at post.tau.ac.il Thu Apr 23 06:31:07 2009 From: tadmorm1 at post.tau.ac.il (tadmorm1 at post.tau.ac.il) Date: Thu, 23 Apr 2009 16:31:07 +0300 Subject: [Live-devel] retreiving an MPEG2-TS without rtsp Message-ID: <20090423163107.79364vsmjjccmqvv@webmail.tau.ac.il> Hi, I need to retrieve an MPEG2-TS using RTP/UDP. The testProgs doesn't seem to have this kind of receiver, i tried the openRTSP but I understand that it requieres the .sdp file in order to start streaming, while my transmitter, do not support RTSP. how can I use the live555 for this purpose? Thanks Michael From finlayson at live555.com Thu Apr 23 07:16:15 2009 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 23 Apr 2009 07:16:15 -0700 Subject: [Live-devel] RR packets not sent by openRTSP in TCP mode In-Reply-To: <49F041CD.5060102@imavis.com> References: <49F0288C.7080503@imavis.com> <49F041CD.5060102@imavis.com> Message-ID: >Ross Finlayson wrote: >>This is very strange - I can't reproduce this at all. When I run a >>server ("live555MediaServer") on a FreeBSD computer, and "openRTSP >>-t" on a Mac OS X computer, I see that "openRTSP" is sending - and >>the the server is receiving - the RTCP "SR" reports just fine. >> >Why SR? Sorry, I mispoke. I meant "RR". > openRTSP should send RRs (which it doesnt't, at least not in TCP) I've tested this again - this time with Fedora as a client - and I still see the client sending RTCP "RR" packets correctly. I haven't been able to reproduce your problem. Are you *sure* you haven't modified the supplied code at all?? -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ From tadmorm1 at post.tau.ac.il Thu Apr 23 08:38:35 2009 From: tadmorm1 at post.tau.ac.il (tadmorm1 at post.tau.ac.il) Date: Thu, 23 Apr 2009 18:38:35 +0300 Subject: [Live-devel] retreiving an MPEG2-TS without rtsp Message-ID: <20090423183835.86923fdcudir7bzv@webmail.tau.ac.il> Sorry for bothering, I just ran into one the archives of this mailing list and From what i understand I can use the "testMPEG1or2VideoReceiver" and just replace the session source with the next line : sessionState.source = SimpleRTPSource::createNew(*env, &rtpGroupsock,33 /*indicates mpeg2ts*/,90000,"video" /*hack*/); where the 90000 value of the "rtpTimestampFrequency" is taken from the MPEG2transportStreamer program I streamed from vlc, but got nothing (the target file is empty). What am I doing wrong ? How can I know the rtpTimestampFrequency of the transmitter? Thanks Michael From belloni at imavis.com Thu Apr 23 09:56:40 2009 From: belloni at imavis.com (Cristiano Belloni) Date: Thu, 23 Apr 2009 18:56:40 +0200 Subject: [Live-devel] RR packets not sent by openRTSP in TCP mode In-Reply-To: References: <49F0288C.7080503@imavis.com> <49F041CD.5060102@imavis.com> Message-ID: <49F09DC8.8040803@imavis.com> Ross Finlayson wrote: >> Ross Finlayson wrote: >>> This is very strange - I can't reproduce this at all. When I run a >>> server ("live555MediaServer") on a FreeBSD computer, and "openRTSP >>> -t" on a Mac OS X computer, I see that "openRTSP" is sending - and >>> the the server is receiving - the RTCP "SR" reports just fine. >>> >> Why SR? > > Sorry, I mispoke. I meant "RR". >> openRTSP should send RRs (which it doesnt't, at least not in TCP) > > I've tested this again - this time with Fedora as a client - and I > still see the client sending RTCP "RR" packets correctly. I haven't > been able to reproduce your problem. > > Are you *sure* you haven't modified the supplied code at all?? 
Absolutely sure, it's testOnDemandRTSPServer (streaming mpeg4 from a raw m4e file) on the server and openRTSP, both compiled from the last package just this morning. I see only SRs from the server, every 4 sec, and no RR from the client. -- Belloni Cristiano Imavis Srl. www.imavis.com belloni at imavis.com From patbob at imoveinc.com Thu Apr 23 13:59:10 2009 From: patbob at imoveinc.com (Patrick White) Date: Thu, 23 Apr 2009 13:59:10 -0700 Subject: [Live-devel] RTCP timeout In-Reply-To: References: Message-ID: <200904231359.10745.patbob@imoveinc.com> RTSPClient doesn't support the client timeout directly, so you have to wrap it (eg. subclass it) and schedule/reschedule your own task to do that. I can't remember what we use as the stimulus to reschedule, and don't have time to look right now. later, patbob On Thursday 23 April 2009 5:58 am, rippel tippel wrote: > Hi All, > > ?? I'm streaming a video using RTP over a wireless channel. > If the server disappears, the client keeps sending RTCP report packets > instead of checking if the server is still alive. Such behaviour is > required by RFC 3550 (6.3.5 - Timing Out an SSRC) which states: "At > occasional intervals, the participant MUST check to see if any of the > other participants time out [...] A similar check is performed on the > sender list". > > Is there any way to detect a RTCP timeout, in both the sender and the > receiver? > > Thanks, > R. > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel From mrteddy at citromail.hu Thu Apr 23 07:24:47 2009 From: mrteddy at citromail.hu (Mr. Teddy) Date: Thu, 23 Apr 2009 16:24:47 +0200 Subject: [Live-devel] openRTSP TS receive as an another program input In-Reply-To: Message-ID: <20090423142447.3577.qmail@server16.citromail.hu> But in the website of openRTSP, there are the following: -v play only the video stream. 
Not the "-r" option cause to write to stdout? -- Eredeti ?zenet -- Felad?: Ross Finlayson <finlayson at live555.com> C?mzett: LIVE555 Streaming Media - development & use <live-devel at ns.live555.com> M?solat: Elk?ldve: 2009.04.21 16:20 T?ma: Re: [Live-devel] openRTSP TS receive as an another program input>And Can I link openRTSP received data buffer to my application input?Yes. Use the "-v" option to cause "openRTSP" to write its output to 'stdout', and then pipe this to your application.-- Ross FinlaysonLive Networks, Inc.http://www.live555.com/_______________________________________________live-devel mailing listlive-devel at lists.live555.comhttp://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From tamira at optibase.com Thu Apr 23 07:42:13 2009 From: tamira at optibase.com (Tamir Adari) Date: Thu, 23 Apr 2009 17:42:13 +0300 Subject: [Live-devel] H.264 support trick-mode Message-ID: Hi all, It seems that the live555 media server supports H.264 multiplexed as transport (.ts). Please update it in: > Is there a plan to support it in the near future for trick-mode as well? Any scheduled date for this? Thanks, Tamir. -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Apr 23 18:50:08 2009 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 23 Apr 2009 18:50:08 -0700 Subject: [Live-devel] H.264 support trick-mode In-Reply-To: References: Message-ID: >It seems that the live555 media server supports H.264 multiplexed as >transport (.ts). >Please update it in: ><http://www.live555.com/mediaServer/#about> There's nothing to update on that page. We support the streaming of MPEG Transport Stream files, regardless of what they happen to contain. 
However, I have now updated the '"Trick Play' support" web page - http://www.live555.com/liveMedia/transport-stream-trick-play.html - to make clear that we currently support these 'trick play' operations only on Transport Stream files that contain MPEG-1 or MPEG-2 video - not MPEG-4 (including H.264) video. >Is there a plan to support it in the near future for trick-mode as well? Yes. > Any scheduled date for this? No. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Apr 23 19:05:55 2009 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 23 Apr 2009 19:05:55 -0700 Subject: [Live-devel] openRTSP TS receive as an another program input In-Reply-To: <20090423142447.3577.qmail@server16.citromail.hu> References: <20090423142447.3577.qmail@server16.citromail.hu> Message-ID: >But in the website of openRTSP, there are the following: > >-v play only the video stream. > >Not the "-r" option cause to write to stdout? "-v" means: Play (and receive) only the video stream, outputting the received data to 'stdout'. "-r" means: Play the entire stream (i.e., video and audio, if present), but don't actually receive the data ourself. (The data will still arrive, but we don't actually receive it ourself.) -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ From finlayson at live555.com Thu Apr 23 19:16:08 2009 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 23 Apr 2009 19:16:08 -0700 Subject: [Live-devel] retreiving an MPEG2-TS without rtsp In-Reply-To: <20090423183835.86923fdcudir7bzv@webmail.tau.ac.il> References: <20090423183835.86923fdcudir7bzv@webmail.tau.ac.il> Message-ID: >Sorry for bothering, I just ran into one the archives of this mailing list and >From what i understand I can use the "testMPEG1or2VideoReceiver" and just >replace the session source with the next line : > >sessionState.source = SimpleRTPSource::createNew(*env, >&rtpGroupsock,33 /*indicates mpeg2ts*/,90000,"video" /*hack*/); > >where the 90000 value of the "rtpTimestampFrequency" is taken from >the MPEG2transportStreamer program > >I streamed from vlc, but got nothing (the target file is empty). >What am I doing wrong ? I don't know. If your stream really is a MPEG Transport Stream sent over RTP *multicast*, using the IETF standard RTP payload format, and you have specified the port number correctly, then this should work. However, if your stream is unicast rather than multicast, then see . >How can I know the rtpTimestampFrequency of the transmitter? 90000 Hz is standard for most MPEG media, including MPEG Transport Streams. If however, your stream is MPEG Transport Stream data over *raw UDP*, rather than over RTP, then you should use a "BasicUDPSource" rather than a "SimpleRTPSource". Once again, though, the preferred way to receive streams (especially unicast streams) is to use RTSP. Our RTSP client implementation figures out everything for you. -- Ross Finlayson Live Networks, Inc. 
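The two cases Ross distinguishes above differ only in which source object is created. A sketch against the testMPEG1or2VideoReceiver skeleton mentioned earlier in the thread (liveMedia API; not a standalone program):

```cpp
// Sketch: receiving an MPEG Transport Stream without RTSP.

// Case 1 - the TS is carried over RTP, using the IETF standard payload
// format (payload type 33, 90000 Hz timestamp clock):
sessionState.source = SimpleRTPSource::createNew(
    *env, &rtpGroupsock, 33, 90000, "video/MP2T");

// Case 2 - the TS is sent over raw UDP, with no RTP header at all:
sessionState.source = BasicUDPSource::createNew(*env, &rtpGroupsock);
```

As Ross notes, when the transmitter does support RTSP, the RTSP client code works out the payload type, clock frequency, and port automatically, so neither of these manual setups is needed.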
http://www.live555.com/ From finlayson at live555.com Thu Apr 23 19:24:39 2009 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 23 Apr 2009 19:24:39 -0700 Subject: [Live-devel] RR packets not sent by openRTSP in TCP mode In-Reply-To: <49F09DC8.8040803@imavis.com> References: <49F0288C.7080503@imavis.com> <49F041CD.5060102@imavis.com> <49F09DC8.8040803@imavis.com> Message-ID: >>I've tested this again - this time with Fedora as a client - and I >>still see the client sending RTCP "RR" packets correctly. I >>haven't been able to reproduce your problem. >> >>Are you *sure* you haven't modified the supplied code at all?? >Absolutely sure, it's testOnDemandRTSPServer (streaming mpeg4 from a >raw m4e file) on the server and openRTSP, both compiled from the >last package just this morning. I see only SRs from the server, >every 4 sec, and no RR from the client. Sorry, but I can't reproduce this at all. *Something* must be unusual about your system. You're going to have to track this down yourself. Sorry. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Thu Apr 23 19:28:47 2009 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 23 Apr 2009 19:28:47 -0700 Subject: [Live-devel] RTCP timeout In-Reply-To: <200904231359.10745.patbob@imoveinc.com> References: <200904231359.10745.patbob@imoveinc.com> Message-ID: >RTSPClient doesn't support the client timeout directly, so you have to wrap it >(eg. subclass it) and schedule/reschedule your own task to do that. No, you don't need to do that. Instead, you can arrange for a 'liveness indicator' function (that you would write) to be called each time your client receives a RTCP "SR" packet from the server. Call: RTCPInstance:: setSRHandler() to schedule this. If your function stops getting called after 30 seconds or so, then you know that the stream has died. -- Ross Finlayson Live Networks, Inc. 
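The liveness check Ross outlines might look like this — a sketch against the liveMedia API; the variable and function names are made up, and the periodic re-check would be scheduled with the usual TaskScheduler mechanism:

```cpp
// Sketch (liveMedia API; not standalone). Record the arrival time of
// each RTCP "SR" from the server, and treat the stream as dead if none
// arrive for ~30 seconds.
time_t lastSRTime = 0;  // hypothetical global

void srReportHandler(void* /*clientData*/) {
  lastSRTime = time(NULL);  // invoked on every incoming "SR" packet
}

// After session setup, on each subsession's RTCPInstance:
subsession->rtcpInstance()->setSRHandler(srReportHandler, NULL);

// Elsewhere, a periodic task (e.g. scheduled via
// TaskScheduler::scheduleDelayedTask()) checks for a timeout:
if (lastSRTime != 0 && time(NULL) - lastSRTime > 30) {
  // server presumed dead: tear down the session and close the client
}
```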
http://www.live555.com/ From mrteddy at citromail.hu Fri Apr 24 03:40:25 2009 From: mrteddy at citromail.hu (Mr. Teddy) Date: Fri, 24 Apr 2009 12:40:25 +0200 Subject: [Live-devel] openRTSP TS receive as an another program input In-Reply-To: Message-ID: <20090424104025.9625.qmail@server16.citromail.hu> And when I receiving a TS with -v option, this will put to stdout only the video? Because I need the video and audio with TS container to feed my applicaton, so I need the untouchet TS data. How can I use the -v option, because I've tried to start openRTSP with this command: "openRTSP "rtsp://<ip>:<port>/testtransportstream" -v" But I get error.When I run like this: "openRTSP "rtsp://<ip>:<port>/testtransportstream"" It receive the ts fine, so I don't know where must be put the addictional command line options. -- Eredeti ?zenet -- Felad?: Ross Finlayson <finlayson at live555.com> C?mzett: LIVE555 Streaming Media - development & use <live-devel at ns.live555.com> M?solat: Elk?ldve: 04:53 T?ma: Re: [Live-devel] openRTSP TS receive as an another program input>But in the website of openRTSP, there are the following:>>-v play only the video stream.>>Not the "-r" option cause to write to stdout?"-v" means: Play (and receive) only the video stream, outputting the received data to 'stdout'."-r" means: Play the entire stream (i.e., video and audio, if present), but don't actually receive the data ourself. (The data will still arrive, but we don't actually receive it ourself.)-- Ross FinlaysonLive Networks, Inc.http://www.live555.com/_______________________________________________live-devel mailing listlive-devel at lists.live555.comhttp://lists .live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Fri Apr 24 05:05:04 2009 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 24 Apr 2009 05:05:04 -0700 Subject: [Live-devel] openRTSP TS receive as an another program input In-Reply-To: <20090424104025.9625.qmail@server16.citromail.hu> References: <20090424104025.9625.qmail@server16.citromail.hu> Message-ID: >And when I'm receiving a TS with the -v option, will this put only the >video to stdout? No, not in this case - because Transport Stream data is streamed 'as is'. Unlike MPEG Program Stream data, it is *not* split into separate audio and video streams. > Because I need the video and audio in the TS container to feed my >application, so I need the untouched TS data. You'll get that if you use the "-v" option. > >How can I use the -v option? I've tried to start openRTSP >with this command: >"openRTSP "rtsp://:/testtransportstream" -v" >But I get an error. The "-v" option causes the data to be written to 'stdout', so you must pipe it to your application, which should read from 'stdin'. (If you don't understand what "stdout", "stdin" and "pipe" mean, then this software is not for you - sorry.) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From ben at decadent.org.uk Sat Apr 25 20:58:28 2009 From: ben at decadent.org.uk (Ben Hutchings) Date: Sun, 26 Apr 2009 04:58:28 +0100 Subject: [Live-devel] Adding support for DV video Message-ID: <1240718308.4886.46.camel@deadeye.i.decadent.org.uk> I started attempting to add support to liveMedia for DV over RTSP, based on RFC 3189, some time last year. I recently picked up that work and finished it off (more or less). There are probably bugs in this code as it's my first attempt at extending liveMedia, but it seems to work. I rebuilt VLC using the new library and was able to play a DV stream received over RTSP, though part of the video is missing, which suggests that it is not allocating a large enough frame buffer.
If I save the stream using openRTSP with an appropriately large buffer size, I get a valid DV file. The changes are divided into: - Add BufferedPacket::rtpTimestamp() method required for DV support - Add DV file source, RTP sink and source - Add test program for DV video According to the RFC, each packet must contain a whole number of DV blocks (i.e. the RTP payload size must be a multiple of 80) but it doesn't appear to be possible to control fragment sizes in this way in liveMedia. I am currently using the kluge of setting the maximum packet size to 1372 = 12 + 17 * 80, but it would be preferable to have a virtual function in MultiFramedRTPSink that could be used to override the default fragmentation behaviour. Patches are attached. Ben. -------------- next part -------------- A non-text attachment was scrubbed... Name: 0001-Add-BufferedPacket-rtpTimestamp-method-required-f.patch Type: application/mbox Size: 872 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: 0002-Add-support-for-DV-IEC-61384.patch Type: application/mbox Size: 40308 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: 0003-Add-test-program-for-DV-video.patch Type: application/mbox Size: 6946 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 189 bytes Desc: This is a digitally signed message part URL: From finlayson at live555.com Sat Apr 25 22:21:40 2009 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 25 Apr 2009 22:21:40 -0700 Subject: [Live-devel] Adding support for DV video In-Reply-To: <1240718308.4886.46.camel@deadeye.i.decadent.org.uk> References: <1240718308.4886.46.camel@deadeye.i.decadent.org.uk> Message-ID: >I started attempting to add support to liveMedia for DV over RTSP, based >on RFC 3189, some time last year. 
I recently picked up that work and >finished it off (more or less). Ben, Many thanks for contributing this. I'll review it, and will likely add it to a future release of the code. >According to the RFC, each packet must contain a whole number of DV >blocks (i.e. the RTP payload size must be a multiple of 80) but it >doesn't appear to be possible to control fragment sizes in this way in >liveMedia. In fact, this is determined (in "MultiFramedRTPSink.cpp") by the result of the "allowFragmentationAfterStart()" virtual function. By default, this function returns False, meaning that no 'frames' can be fragmented across more than one RTP packet, unless the RTP packet contains only a single 'frame' that is too large for a single RTP packet. A 'frame' in this context is a single discrete chunk of data that is input (one at a time) to the "MultiFramedRTPSink" (subclass). If you are assuming that just one "DV block" is input at a time (I haven't yet looked at your code closely enough to see if that is, in fact, the case), then you should get the desired behavior automatically. > I am currently using the kluge of setting the maximum packet >size to 1372 = 12 + 17 * 80, but it would be preferable to have a >virtual function in MultiFramedRTPSink that could be used to override >the default fragmentation behaviour. As I noted above, the default fragmentation behavior *is* to do what you want, provided that you are feeding discrete 80-byte blocks to the "MultiFramedRTPSink" (subclass). -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From tadmorm1 at post.tau.ac.il Sun Apr 26 00:36:16 2009 From: tadmorm1 at post.tau.ac.il (tadmorm1 at post.tau.ac.il) Date: Sun, 26 Apr 2009 10:36:16 +0300 Subject: [Live-devel] retrieving an MPEG2-TS without rtsp Message-ID: <20090426103616.111153qs5tty63u8@webmail.tau.ac.il> Hi, although I am using unicast, I made the proper changes to the receiver file as instructed in the comments. I made a "cout << ..."
statement in the "GroupsockHelper" class and it is clear that the data is received in the buffer; every time I stream from VLC, the port reads 1316 bytes. However, the target file is empty. I assume VLC is using the IETF standard RTP payload format, does it not? Thanks, Michael From ben at decadent.org.uk Sun Apr 26 07:24:39 2009 From: ben at decadent.org.uk (Ben Hutchings) Date: Sun, 26 Apr 2009 15:24:39 +0100 Subject: [Live-devel] Adding support for DV video In-Reply-To: References: <1240718308.4886.46.camel@deadeye.i.decadent.org.uk> Message-ID: <1240755879.4886.70.camel@deadeye.i.decadent.org.uk> On Sat, 2009-04-25 at 22:21 -0700, Ross Finlayson wrote: [...] > >According to the RFC, each packet must contain a whole number of DV > >blocks (i.e. the RTP payload size must be a multiple of 80) but it > >doesn't appear to be possible to control fragment sizes in this way in > >liveMedia. > > In fact, this is determined (in "MultiFramedRTPSink.cpp") by the > result of the "allowFragmentationAfterStart()" virtual function. By > default, this function returns False, meaning that no 'frames' can be > fragmented across more than one RTP packet, unless the RTP packet > contains only a single 'frame' that is too large for a single RTP > packet. > > A 'frame' in this context is a single discrete chunk of data that is > input (one at a time) to the "MultiFramedRTPSink" (subclass). If you > are assuming that just one "DV block" is input at a time (I haven't yet > looked at your code closely enough to see if that is, in fact, the > case), then you should get the desired behavior automatically. [...] In fact I use a whole DV frame (120000-576000 bytes), which will then be split across a large number of packets. I take it this is not the correct way to use frames in liveMedia? Given that a DV stream has at least 45000 blocks per second, I suspect that it would be sensible to pack blocks together.
We need to see at least 6 blocks to identify the DV profile in use, and 480 bytes plus headers should still be less than IPv4's minimum MTU of 576 bytes, so I would go with that. That would also match FireWire. How should the receiving application cope with packet loss? My application, DVswitch, currently receives DV over TCP and does not have any logic to deal with incomplete DV frames. If I convert it to use liveMedia, as I intend to, then it will have to do so in some way. It seems to me that it would need to have its own reassembly logic. Ben. From finlayson at live555.com Sun Apr 26 15:50:04 2009 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 26 Apr 2009 15:50:04 -0700 Subject: [Live-devel] Adding support for DV video In-Reply-To: <1240755879.4886.70.camel@deadeye.i.decadent.org.uk> References: <1240718308.4886.46.camel@deadeye.i.decadent.org.uk> <1240755879.4886.70.camel@deadeye.i.decadent.org.uk> Message-ID: >In fact I use a whole DV frame (120000-576000 bytes), which will then be >split across a large number of packets. I take it this is not the >correct way to use frames in liveMedia? No, actually that is correct. Those are very large frame sizes, though - yow! >I am currently using the kluge of setting the maximum packet >size to 1372 = 12 + 17 * 80, but it would be preferable to have a >virtual function in MultiFramedRTPSink that could be used to override >the default fragmentation behaviour. Yes. Your '1372 hack' will work for now, but a better solution (which I'll likely add myself) will be to have a 'fragmentation granularity' virtual function (default value: 1), which will give us more control over fragmentation.
Then the "MultiFramedRTPSink" code will work even if the programmer decides to call "setPacketSizes()" himself to specify very large MTUs (e.g., for a LAN) after he's created the "MultiFramedRTPSink" (subclass), but before he starts using it. >How should the receiving application cope with packet loss? Our RTP receiving code ("MultiFramedRTPSource") automatically detects, and discards, frames that contain lost packets, so the receiving application will get either complete frames, or no frame data at all. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From ben at decadent.org.uk Sun Apr 26 16:39:32 2009 From: ben at decadent.org.uk (Ben Hutchings) Date: Mon, 27 Apr 2009 00:39:32 +0100 Subject: [Live-devel] Adding support for DV video In-Reply-To: References: <1240718308.4886.46.camel@deadeye.i.decadent.org.uk> <1240755879.4886.70.camel@deadeye.i.decadent.org.uk> Message-ID: <1240789172.4886.82.camel@deadeye.i.decadent.org.uk> On Sun, 2009-04-26 at 15:50 -0700, Ross Finlayson wrote: [...] > >In fact I use a whole DV frame (120000-576000 bytes), which will then be > >split across a large number of packets. I take it this is not the > >correct way to use frames in liveMedia? > > No, actually that is correct. Those are very large frame sizes, though - yow! Just to be clear, that is the encoded size in DV interchange format (DIF) of a complete video frame. This size is constant for any given profile so it doesn't change in a valid DIF file or stream. The 80-byte blocks each have a header identifying their type and position in the frame. I have compiled a more detailed description at . [...] > >How should the receiving application cope with packet loss? > > Our RTP receiving code ("MultiFramedRTPSource") automatically > detects, and discards, frames that contain lost packets, so the > receiving application will get either complete frames, or no frame > data at all. Nice and simple. 
The regular structure of DIF does mean that it should be possible for a more sophisticated application to make use of an incomplete frame, though. Ben. From bartha_adam at yahoo.com Mon Apr 27 00:14:55 2009 From: bartha_adam at yahoo.com (Bartha Adam) Date: Mon, 27 Apr 2009 00:14:55 -0700 (PDT) Subject: [Live-devel] DeviceSource question Message-ID: <948858.66005.qm@web110311.mail.gq1.yahoo.com> Hello All, I am trying to implement a DeviceSource, based on: http://www.live555.com/liveMedia/doxygen/html/DeviceSource_8cpp-source.html My doGetNextFrame() method is called once, at the beginning, when no data is available. In this case, based on the comments in the sample file, doGetNextFrame has to return immediately, right (without waiting for a frame)?

// This must be done in a non-blocking fashion - i.e., so that we
// return immediately from this function even if no data is
// currently available.

This is my code:

img = imgBuffer->getNextImage();
if (img.size == 0) {
  fFrameSize = 0;
  fNumTruncatedBytes = 0;
  return;
}

After this first call, doGetNextFrame is never called again. Am I missing something? Regards, Gr3go -------------- next part -------------- An HTML attachment was scrubbed... URL: From rippeltippel at gmail.com Mon Apr 27 06:18:23 2009 From: rippeltippel at gmail.com (rippel tippel) Date: Mon, 27 Apr 2009 14:18:23 +0100 Subject: [Live-devel] RTSP "Require" header Message-ID: Hi All, does live555 support the "Require" RTSP header, as specified in RFC 2326 (12.32) - http://tools.ietf.org/html/rfc2326#page-54 ? I had a look in RTSPServer but couldn't find anything about that, and I guess I have to add it myself... right? Thanks, R.
From finlayson at live555.com Mon Apr 27 07:12:35 2009 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 27 Apr 2009 07:12:35 -0700 Subject: [Live-devel] RTSP "Require" header In-Reply-To: References: Message-ID: > does live555 support the "Require" RTSP header, as specified in >RFC2326 (12.32) - http://tools.ietf.org/html/rfc2326#page-54 ? No, sorry - we don't do anything with that at all. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From amadorim at vdavda.com Mon Apr 27 07:48:08 2009 From: amadorim at vdavda.com (Marco Amadori) Date: Mon, 27 Apr 2009 16:48:08 +0200 Subject: [Live-devel] RTSP "Require" header In-Reply-To: References: Message-ID: <200904271648.09165.amadorim@vdavda.com> On Monday 27 April 2009, 16:12:35, Ross Finlayson wrote: > > does live555 support the "Require" RTSP header, as specified in > >RFC2326 (12.32) - http://tools.ietf.org/html/rfc2326#page-54 ? > No, sorry - we don't do anything with that at all. Seems like a nice feature to have. Related to that, is there a project TODO list anywhere that shows missing pieces which some of the developers also want? -- ESC:wq -- This message has been scanned for viruses and dangerous content by MailScanner, and is believed to be clean. From jason88wang at gmail.com Mon Apr 27 08:23:35 2009 From: jason88wang at gmail.com (wang wuying) Date: Mon, 27 Apr 2009 23:23:35 +0800 Subject: [Live-devel] Why H264 annexB bytestream transported incompletely Message-ID: <90a5da2f0904270823s434cebefx7d47a2491b9ea110@mail.gmail.com> Hi all, I have been trying to stream an H264 Annex B byte stream with liveMedia. My H264 file is composed of many NAL units; each NAL unit starts with a four-byte start code (0x00 0x00 0x00 0x01), each NAL unit contains exactly one frame, and the first frame is an I-frame while the others are P-frames.
I have implemented my H264VideoStreamer class; my currentNALUnitEndsAccessUnit() function always returns true, the fDurationInMicroseconds value is 33 ms, and fPresentationTime is advanced by fDurationInMicroseconds each time a NAL unit is fetched - except the first time, when it is set from the gettimeofday() function's return value. I used the openRTSP program as the client to receive the H264 stream and write it to a file. When the openRTSP program and the server are on the same PC, it receives the H264 data completely and writes it to the file correctly. I compared the file with the original and they are completely consistent, and VLC can play it. But when openRTSP and the server are on different PCs, with a 1000M switch between them, openRTSP writes the H264 data to the file incompletely: the file is missing the I-frame. Yet I can see the I-frame being received by the PC using the network protocol analyzer Wireshark. I tested several H.264 files, and the I-frame is always lost. My openRTSP command is as follows: openRTSP -B 1024000 -b 1024000 -Q rtsp://192.168.1.8:8554/h264Test and the QOS statistics result is as follows:

Started playing session
Receiving streamed data...
Received RTCP "BYE" on "video/H264" subsession (after 6 seconds)
begin_QOS_statistics
server_availability 100
stream_availability 100
subsession video/H264
num_packets_received 3949
num_packets_lost 15
elapsed_measurement_time 6.001293
kBytes_received_total 5468.173000
measurement_sampling_interval_ms 1000
kbits_per_second_min 7016.428477
kbits_per_second_ave 7289.326483
kbits_per_second_max 7547.909051
packet_loss_percentage_min 0.000000
packet_loss_percentage_ave 0.378406
packet_loss_percentage_max 2.293578
inter_packet_gap_ms_min 0.011000
inter_packet_gap_ms_ave 1.516025
inter_packet_gap_ms_max 24.364000
end_QOS_statistics

Can anyone help me? -------------- next part -------------- An HTML attachment was scrubbed...
URL: From ivo.vachkov at gmail.com Mon Apr 27 08:48:40 2009 From: ivo.vachkov at gmail.com (Ivo Vachkov) Date: Mon, 27 Apr 2009 18:48:40 +0300 Subject: [Live-devel] Multiple sources question Message-ID: Hello all, By reading testOnDemandRTSPServer.cpp I figured out that I can have different ServerMediaSessions and add subsessions to each one. However, it proves to be a bit more complex than that when it comes to network sources. I have the following code:

const int channels = 128;
struct Stream {
  string protocol;
  string address;
  string port;
  string stream;
  RTPSink *videoSink;
  struct in_addr inputAddress;
  Port *inputPort;
  Groupsock *inputGroupsock;
  ServerMediaSession *sms;
};
int num_streams = 0;
Stream streams[channels];
...

to represent a number of multicast streams I would like to restream over RTSP. And later on in my code I initialise those like this:

for (int i = 0; i < num_streams; i++) {
  cout << streams[i].stream << streams[i].address << streams[i].port << endl;
  streams[i].inputAddress.s_addr = our_inet_addr(streams[i].address.c_str());
  streams[i].inputPort = new Port(atoi(streams[i].port.c_str()));
  unsigned char const inputTTL = 0;
  streams[i].inputGroupsock = new Groupsock(*env, streams[i].inputAddress, *streams[i].inputPort, inputTTL);
  streams[i].videoSink = SimpleRTPSink::createNew(*env, streams[i].inputGroupsock, 33, 90000, "video", "mp2t", 1, True, True /*no 'M' bit*/);
  streams[i].sms = ServerMediaSession::createNew(*env, streams[i].stream.c_str(), streams[i].stream.c_str(), descriptionString);
  streams[i].sms->addSubsession(PassiveServerMediaSubsession::createNew(*streams[i].videoSink, NULL));
  rtspServer->addServerMediaSession(streams[i].sms);
  announceStream(rtspServer, streams[i].sms, streams[i].stream.c_str(), streams[i].stream.c_str());
}

With one source everything is working fine. However, when I have several multicast sources I get them overlapping on every RTSP URL.
For some reason (I guess in my code :) ) different multicast streams don't go to different sessions. All packets go to all sessions and it becomes a complete mess. Any help is greatly appreciated. Thanks in advance. P.S. I know the source is messy. I'm trying to prove a concept. From finlayson at live555.com Tue Apr 28 01:01:04 2009 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 28 Apr 2009 01:01:04 -0700 Subject: [Live-devel] Multiple sources question In-Reply-To: References: Message-ID: >With one source everything is working fine. However when I have >several multicast sources I get them overlapping on every rtsp url. >For some reason (I guess in my code :) ) different multicast streams >don't go to different sessions. All packets go to all sessions and it >becomes a complete mess. Perhaps you're using the same port number for different multicast streams? You shouldn't do this. Some OSs (e.g., some versions of Linux) have a bug that causes them to deliver packets intended for different multicast addresses to the same socket if they use the same port number. To be on the safe side, when you have multiple multicast streams, you should use different multicast addresses, *and* different port numbers. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From ivo.vachkov at gmail.com Tue Apr 28 06:25:18 2009 From: ivo.vachkov at gmail.com (Ivo Vachkov) Date: Tue, 28 Apr 2009 16:25:18 +0300 Subject: [Live-devel] Multiple sources question In-Reply-To: References: Message-ID: I'm using Ubuntu Linux with 2.6.27-11-generic (Ubuntu Desktop 8.10) for my tests, and the multicast addresses of the streams are different (239.255.255.10 and 239.255.255.20). However, the port is the same. Looks like a reasonable explanation for the problem. I will confirm it later today. Thanks. I also found another problem with the source code I posted in the original message: the RTSP server just provides information about the multicast streams, which is not my intention.
Can you provide me with pointers or additional information on how to construct a multicast-to-RTSP gateway, i.e. I have multicast streams that I want to restream using unicast RTSP towards another network. Thanks in advance. On Tue, Apr 28, 2009 at 11:01 AM, Ross Finlayson wrote: >> With one source everything is working fine. However when I have >> several multicast sources I get them overlapping on every rtsp url. >> For some reason (I guess in my code :) ) different multicast streams >> don't go to different sessions. All packets go to all sessions and it >> becomes a complete mess. > > Perhaps you're using the same port number for different multicast streams? > You shouldn't do this. Some OSs (e.g., some versions of Linux) have a bug > that causes them to deliver packets intended for different multicast > addresses to the same socket if they use the same port number. > > To be on the safe side, when you have multiple multicast streams, you should > use different multicast addresses, *and* different port numbers. > -- > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > -- "UNIX is basically a simple operating system, but you have to be a genius to understand the simplicity." Dennis Ritchie From santosh.k at knoahsoft.com Wed Apr 29 09:59:27 2009 From: santosh.k at knoahsoft.com (Santosh Karankoti) Date: Wed, 29 Apr 2009 22:29:27 +0530 Subject: [Live-devel] Kindly provide info on any changes to be made to some make files or the folder structure Message-ID: <001f01c9c8eb$dce1b8e0$6356020a@knoahsoft.com> Hi all, This is Santosh. I have downloaded the latest live555 tar.gz and extracted it using ZipGenious. I am facing issues with compiling, generating the makefiles, and creating the workspace in VS 6 on a Windows 2000 machine.
Firstly, it says to make changes to the TOOLS32 path; I have modified it to the VC98 folder as C:\program files\microsoft visual studio\VC98. In this path there is no folder called "tools". I found it in the C:\program files\microsoft visual studio\common folder. I tried both paths; it gives me the error shown in the attached .jpg file. Moreover, on some site I found that in the LINK_OPTS_0 value the name of the lib file is wrong. I corrected it; still the same. I am not getting much help from other sites, so I am finally posting it here. I have done the things the live555 media site says, but I am unable to generate the makefiles and proceed further. Kindly let me know whether I have to make some modifications to the \live folder by creating multiple folders and copying the files and makefiles there before running genWindowsMakefiles? Or do I need to make modifications in any of the makefiles like makefile.head / makefile.tail / w32config / genWindowsMakefiles or any other? A quick response would be appreciated. Regards, - Santosh KnoahSoft -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: Errorneous_genWindowsMakefiles_result.jpg Type: image/jpeg Size: 102520 bytes Desc: not available URL: From anujs at wientech.com Thu Apr 30 00:05:42 2009 From: anujs at wientech.com (anuj) Date: Thu, 30 Apr 2009 12:35:42 +0530 Subject: [Live-devel] live555 RTSP structure Message-ID: <1241075142.5282.5.camel@anuj-desktop> Hi all, I am new to live555. I need to understand the live555 library structure and, with the help of the existing library, create my own RTSP server (live555MediaServer) and client (openRTSP) programs. Please help me understand the library; if you have any documentation or references, please send them to me. Thanks & Regards Anuj Shukla anujs at wientech.com