From vim.garg at gmail.com Tue Dec 1 02:23:21 2009 From: vim.garg at gmail.com (Vimal Garg) Date: Tue, 1 Dec 2009 15:53:21 +0530 Subject: [Live-devel] setAuxilliaryReadHandler: Core dumps In-Reply-To: References: <2faaac140911300151x2da5f9beg622c25cb17764c8c@mail.gmail.com> <2faaac140911302056p28ede19w7df81edeaefd1b0d@mail.gmail.com> Message-ID: <2faaac140912010223o463bec8ej95da53f14372305@mail.gmail.com> I have tried the second approach. I ran openRTSP with -b 40000 -r -v rtsp:/media but it doesn't seem to be working. Anyway, I will keep this as a fallback mechanism. I would like to know more about creating a subclass of MediaSink; could you please elaborate on this? On Tue, Dec 1, 2009 at 12:16 PM, Ross Finlayson wrote: > I believe it is possible to have the streams in buffer but not clear how? I >> went through RTPSource class which has a function pointer >> setAuxilliaryReadHandler(). As I understand this is for registering a >> handler for processing RTP read. >> > > "setAuxilliaryReadHandler()" is a hack that should not be used. I don't > think anyone uses it anymore, and it will likely be removed from a future > release of the code. In any case, it provides access to raw RTP packets > (i.e., including RTP headers) - and that's *not* what you want. > > > > After long discussion and research we had decided to make use of live555 >> stack as it is widely used (and tested). I have gone through the >> openRTSP.cpp which actually saves all the streams to a file. I do not want >> to record streams in a file and read from the file as this will have >> latency. >> > > OK, then your solution is straightforward. You need to write your own > subclass of "MediaSink" that does whatever you want to do with the incoming > data, and use that *instead of* "FileSink".
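[Editor's note] Both suggestions in this thread end the same way: some application code holds raw frame data and must process it. For the piped variant ("the second approach" - openRTSP writing frames to stdout), the consumer is just a program that reads bytes from stdin. A minimal, hypothetical sketch follows; the function name and the start-code counting are mine, stand-ins for whatever real processing the downstream application would do:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Stand-in "processing" for a piped consumer: count MPEG-4 Elementary
// Stream start codes (0x00 0x00 0x01) in a buffer read from stdin.
static std::size_t countStartCodes(const std::vector<unsigned char>& buf) {
    std::size_t n = 0;
    for (std::size_t i = 0; i + 2 < buf.size(); ++i) {
        if (buf[i] == 0x00 && buf[i + 1] == 0x00 && buf[i + 2] == 0x01) ++n;
    }
    return n;
}
```

A main() that fills the buffer via std::cin.read() and reports the count would then be run as, e.g., `openRTSP -v rtsp://url | ./consumer`.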
> > (Or, if your incoming stream is video-only (or audio-only), then you could > use the "-v" (or "-a") option to "openRTSP", and pipe it into a separate > application (that reads from stdin) that processes it.) > > -- > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > From guiluge at gmail.com Tue Dec 1 02:26:02 2009 From: guiluge at gmail.com (Guillaume Ferry) Date: Tue, 1 Dec 2009 11:26:02 +0100 Subject: [Live-devel] [MediaServer] Streaming mp3 file issues Message-ID: <425f13530912010226u13d605b4yea53022855d388f8@mail.gmail.com> Hi there, I'm currently using mediaServer to stream various media on a local network. Next, I implemented a small RTSP client to receive & play that stream. It works really great with WAV/PCM files, but I experience some issues with MP3 files. After some investigation, I found out what is going on: I systematically use the provided sample rate (rtpTimeStampFrequency), which is set to 90 kHz for MP3 files (RTP payload 14)... An incorrect value, of course, and it messes up all my playback routines afterwards... Is it possible to obtain an accurate sample rate value through the liveMedia classes, or do I need to do some computing? Thanks in advance, Kind regards, Guillaume. From jaguilera at a5security.com Tue Dec 1 05:43:48 2009 From: jaguilera at a5security.com (Josep Aguilera) Date: Tue, 1 Dec 2009 14:43:48 +0100 Subject: [Live-devel] Retrieving raw data from IP cameras via RTSP Message-ID: <005801ca728c$4fca3400$ef5e9c00$@com> Dear all, We are very interested in retrieving raw data from IP cameras (MJPEG, MPEG4, H.264) via RTSP, using your LIVE555 software.
In order to do it we implemented a Demux class which inherits from the class FramedSource. Inside the Demux class we have AfterReadingFrame, which is called each time a frame is available. Definition of AfterReadingFrame: static void AfterReadingFrame(void* clientData, unsigned frameSize, unsigned /*numTruncatedBytes*/, struct timeval presentationTime, unsigned /*durationInMicroseconds*/); The function AfterReadingFrame is passed to getNextFrame as seen below: subsession->readSource()->getNextFrame(m_FrameBuffer,MAX_RTP_FRAME_SIZE,RTSP.AfterReadingFrame,subsession,onSourceClosure,subsession); Inside the AfterReadingFrame function we are accessing the frame buffer (variable fTo) and looking for the frame size in the variable fFrameSize: MediaSubsession* bufferQueue = (MediaSubsession*)clientData; Demux* sFrame = (Demux*)bufferQueue->readSource(); unsigned char* pBuffer = sFrame->fTo; unsigned int FrameSize = sFrame->fFrameSize; We think that the buffer and size obtained are correct. However, when trying to write the frame directly to a file (MJPEG video stream), or to create a jpg header and append the pBuffer, we cannot see the saved frame. What does pBuffer contain? Raw data, or the frame with its header? Is this the best way to retrieve raw data from an IP camera? Thank you in advance! Best Regards, Josep Aguilera. From finlayson at live555.com Tue Dec 1 05:54:50 2009 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 1 Dec 2009 05:54:50 -0800 Subject: [Live-devel] setAuxilliaryReadHandler: Core dumps In-Reply-To: <2faaac140912010223o463bec8ej95da53f14372305@mail.gmail.com> References: <2faaac140911300151x2da5f9beg622c25cb17764c8c@mail.gmail.com> <2faaac140911302056p28ede19w7df81edeaefd1b0d@mail.gmail.com> <2faaac140912010223o463bec8ej95da53f14372305@mail.gmail.com> Message-ID: >I have tried second approach.
I ran openRTSP with -b 40000 -r -v >rtsp:/media but it doesnt seems to be working. No, you should use the "-v" option, but *not* the "-r" option - because you *want* "openRTSP" to receive the RTP stream, but then write the resulting video data to stdout. You should then pipe this command ("openRTSP -b 40000 -v rtsp://url") to a second application that reads from 'stdin'. (If you don't understand what I mean by "pipe" and "stdin", then this software is not for you. Sorry.) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Tue Dec 1 06:02:55 2009 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 1 Dec 2009 06:02:55 -0800 Subject: [Live-devel] [MediaServer] Streaming mp3 file issues In-Reply-To: <425f13530912010226u13d605b4yea53022855d388f8@mail.gmail.com> References: <425f13530912010226u13d605b4yea53022855d388f8@mail.gmail.com> Message-ID: >I'm currently using mediaServer to stream various media on a local network. >Next, I implemented a small RTSP client to receive & play that stream. > >It works really great with WAV/PCM files, but I experience some >issues with MP3 files. >After some investigation, I found out what is going on : I use >systematically provided samplerate (rtpTimeStampFrequency), which is >set to 90khz for mP3 files (RTP payload :14)... >Incorrect value of course, and it messes up all my playback routines >afterwards... > >Is it possible to obtain an accurate sample rate value through >liveMedia classes, or do I need to do some computing ? The latter. The sampling frequency for MPEG-1 or 2 audio (which includes 'MP3') is encoded in the data's MPEG audio header. It has nothing to do with the RTP timestamp frequency. -- Ross Finlayson Live Networks, Inc.
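[Editor's note] To make "do some computing" concrete: the sample rate sits in two bits of the 4-byte frame header that begins every MPEG audio (MP3) frame. A sketch of that lookup, assuming the standard MPEG-1/2/2.5 bit layout; the function name is mine:

```cpp
#include <cassert>

// Return the sampling rate (Hz) encoded in an MPEG audio frame header,
// or 0 if the bytes do not look like a valid header. h0..h2 are the
// first three header bytes: h0 is the 0xFF sync byte, h1 carries the
// remaining sync bits plus version/layer IDs, h2 carries the
// bitrate and sample-rate indices.
static unsigned mpegAudioSampleRate(unsigned char h0, unsigned char h1,
                                    unsigned char h2) {
    if (h0 != 0xFF || (h1 & 0xE0) != 0xE0) return 0;  // 11 sync bits
    unsigned versionId = (h1 >> 3) & 0x03;  // 00=MPEG-2.5, 10=MPEG-2, 11=MPEG-1
    unsigned srIndex   = (h2 >> 2) & 0x03;  // sample-rate index
    if (versionId == 1 || srIndex == 3) return 0;     // reserved values
    static const unsigned mpeg1[3]  = {44100, 48000, 32000};
    static const unsigned mpeg2[3]  = {22050, 24000, 16000};
    static const unsigned mpeg25[3] = {11025, 12000, 8000};
    if (versionId == 3) return mpeg1[srIndex];
    if (versionId == 2) return mpeg2[srIndex];
    return mpeg25[srIndex];
}
```

For example, a typical 44.1 kHz MPEG-1 Layer III frame begins 0xFF 0xFB 0x90, for which this returns 44100.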
http://www.live555.com/ From finlayson at live555.com Tue Dec 1 06:27:18 2009 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 1 Dec 2009 06:27:18 -0800 Subject: [Live-devel] Retrieving raw data from IP cameras via RTSP In-Reply-To: <005801ca728c$4fca3400$ef5e9c00$@com> References: <005801ca728c$4fca3400$ef5e9c00$@com> Message-ID: >We are very interested to retrieve raw data from IP cameras (MJPEG, >MPEG4, H.264) via RTSP and using your Live 555. >In order to do it we implemented the Demux class which inherit from >the class FramedSource. >Inside de Demux class we have the AfterReadingFrame which is call >each time a frame is available. > >Definition of AfterReadingFrame. >static void AfterReadingFrame(void* clientData, unsigned >frameSize,unsigned /*numTruncatedBytes*/,struct timeval >presentationTime,unsigned /*durationInMicroseconds*/); > >The function AfterReadingFrame is passed to the getNextFrame as seen below. >subsession->readSource()->getNextFrame(m_FrameBuffer,MAX_RTP_FRAME_SIZE,RTSP.AfterReadingFrame,subsession,onSourceClosure,subsession); > >Inside of the AfterReadingFrame function we are accessing to the >frame buffer, variable fTo and looking for the frame size from the >variable fFrameSize No, this is wrong. The class variables "fTo" and "fFrameSize" are used only if you are implementing the "doGetNextFrame()" virtual function - i.e., only if you are implementing a media source (that delivers data to some downstream object). (If you don't plan to implement a media source, then you should be inheriting from "MediaSink", not "FramedSource".)
Instead, things are quite simple - the actual parameters to your "AfterReadingFrame()" function contain all of the information that you need: - "clientData" will be "subsession" (i.e., the "afterGettingClientData" parameter that you passed to the call to "getNextFrame()") - "frameSize" will be the size of the delivered frame, which will have *already* been delivered into "m_FrameBuffer" (the "to" parameter that you passed to the call to "getNextFrame()") - "presentationTime" will be the presentation time of the delivered frame of data. In other words, in your "AfterReadingFrame()" function, you already have a delivered frame of data. It's in "m_FrameBuffer", and is of length "frameSize", and has presentation time "presentationTime". -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From guiluge at gmail.com Tue Dec 1 06:46:52 2009 From: guiluge at gmail.com (Guillaume Ferry) Date: Tue, 1 Dec 2009 15:46:52 +0100 Subject: [Live-devel] [MediaServer] Streaming mp3 file issues In-Reply-To: References: <425f13530912010226u13d605b4yea53022855d388f8@mail.gmail.com> Message-ID: <425f13530912010646n29e743eeg80ef634f8cde97e7@mail.gmail.com> Thanks Ross. By the way, I assume the 'numChannels' parameter is unrelated too? I mean, a stereo mp3 file wouldn't have 'numChannels=2' set. I'll dive further into libavformat to probe those values (if you have an idea... you're welcome ;-) ) Thanks again, Best regards, Guillaume. 2009/12/1 Ross Finlayson > I'm currently using mediaServer to stream various media on a local network. >> Next, I implemented a small RTSP client to receive & play that stream. >> >> It works really great with WAV/PCM files, but I experience some issues >> with MP3 files. >> After some investigation, I found out what is going on : I use >> systematically provided samplerate (rtpTimeStampFrequency), which is set to >> 90khz for mP3 files (RTP payload :14)... >> Incorrect value of course, and it messes up all my playback routines >> afterwards...
>> >> Is it possible to obtain an accurate sample rate value through liveMedia >> classes, or do I need to do some computing ? >> > > The latter. The sampling frequency for MPEG-1 or 2 audio (which includes > 'MP3') is encoded in the data's MPEG audio header. It has nothing to do > with the RTP timestamp frequency. > -- > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > From vim.garg at gmail.com Wed Dec 2 05:29:16 2009 From: vim.garg at gmail.com (Vimal Garg) Date: Wed, 2 Dec 2009 18:59:16 +0530 Subject: [Live-devel] setAuxilliaryReadHandler: Core dumps In-Reply-To: References: <2faaac140911300151x2da5f9beg622c25cb17764c8c@mail.gmail.com> <2faaac140911302056p28ede19w7df81edeaefd1b0d@mail.gmail.com> Message-ID: <2faaac140912020529n3c033487w1ae13c583cf388f8@mail.gmail.com> Hi, I have created a subclass of MediaSink which processes the received data. I am using it instead of FileSink. I can see MultiFramedRTPSource::doGetNextFrame1() getting called before I start playing the streams. It also calls my afterGettingFrame() and continuePlaying() functions. All this works fine. The problem starts when it gets into doEventLoop(). I see MultiFramedRTPSource::doGetNextFrame1() getting called, but it does not call afterGetting(this); (at line number 195). I investigated and found that packet->rtpMarkerBit(); in MPEG4ESVideoRTPSource::processSpecialHeader() always returns 0, because of which it does not get into the loop. Could you please help me understand what should be done to make this work? On Tue, Dec 1, 2009 at 12:16 PM, Ross Finlayson wrote: > I believe it is possible to have the streams in buffer but not clear how? I >> went through RTPSource class which has a function pointer >> setAuxilliaryReadHandler().
As I understand this is for registering a >> handler for processing RTP read. >> > > "setAuxilliaryReadHandler()" is a hack that should not be used. I don't > think anyone uses it anymore, and it will likely be removed from a future > release of the code. In any case, it provides access to raw RTP packets > (i.e., including RTP headers) - and that's *not* what you want. > > > > After long discussion and research we had decided to make use of live555 >> stack as it is widely used (and tested). I have gone through the >> openRTSP.cpp which actually saves all the streams to a file. I do not want >> to record streams in a file and read from the file as this will have >> latency. >> > > OK, then your solution is straightforward. You need to write your own > subclass of "MediaSink" that does whatever you want to do with the incoming > data, and use that *instead of* "FileSink". > > (Or, if your incoming stream is video-only (or audio-only), then you could > use the "-v" (or "-a") option to "openRTSP", and pipe it into a separate > application (that reads from stdin) that processes it.) > -- > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > From yooval.from.elta at gmail.com Wed Dec 2 06:01:13 2009 From: yooval.from.elta at gmail.com (Yooval Shabi) Date: Wed, 2 Dec 2009 16:01:13 +0200 Subject: [Live-devel] Problem connecting openRTSP to VLC Message-ID: <91a92a830912020601q59370f6fj4546617ad30c27ea@mail.gmail.com> Thanks for the help. I'm using Windows OS and VLC 1.0.0. I was able to establish communication between openRTSP and the testOnDemandRTSPServer application. openRTSP saved the incoming data to a file, and after renaming it to *.mp3 I was able to play it.
What I'm aware of now is that I need to provide openRTSP with a port number and a stream identifier (e.g. 8554 and 'mp3AudioTest' in testOnDemandRTSPServer), but I do not know what the correct values for connecting to VLC are. >My RTP server is VLC player which streams an mp3 file. >When I try to connect with the openRTSP I get an error as follows: >Failed to get a SDP description from URL >"rtsp://128.129.83.38:6666": connect() >failed: Unknown error >Do you get the same error if you use "testOnDemandRTSPServer" (with a >file named "test.mp3") or "live555MediaServer" as your server? >Also, what OS are you running? >-- >Ross Finlayson >Live Networks, Inc. >http://www.live555.com/ From Sathish.Manohar at aricent.com Thu Dec 3 01:48:11 2009 From: Sathish.Manohar at aricent.com (Sathish Kumar Manohar) Date: Thu, 3 Dec 2009 15:18:11 +0530 Subject: [Live-devel] A General Doubt from 3GPP Spec Message-ID: Hi Ross, I have a general question about the 3GPP Spec V9.0.0. I am currently implementing "alt", "alt-default-id" and "alt-group", which fall under sections 5.3.3.3 and 5.3.3.4. There, the server sends some alternate attributes (say bandwidth, maxprate, etc.) and allows the client to choose one among them during the initial negotiation. My question: From the server's perspective, is there any possibility that the server can send rtpmap information as alternatives with different codecs for the same file? Example, Regular: a=rtpmap:97 MP4A-LATM/22050/2 Alternate: a=rtpmap:97 AMR/8000 Is the above alternate possible for any clip? Regards, Sathish Kumar M ________________________________ "DISCLAIMER: This message is proprietary to Aricent and is intended solely for the use of the individual to whom it is addressed. It may contain privileged or confidential information and should not be circulated or used for any purpose other than for what it is intended.
If you have received this message in error, please notify the originator immediately. If you are not the intended recipient, you are notified that you are strictly prohibited from using, copying, altering, or disclosing the contents of this message. Aricent accepts no responsibility for loss or damage arising from the use of the information transmitted by this email including damage from virus." From finlayson at live555.com Thu Dec 3 05:36:34 2009 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 3 Dec 2009 05:36:34 -0800 Subject: [Live-devel] A General Doubt from 3GPP Spec Message-ID: This has nothing to do with '3GPP'. Your question is about SDP - an IETF standard (which 3GPP - and many other organizations - have adopted). >Example, > >Regular: >a=rtpmap:97 MP4A-LATM/22050/2 > >Alternate: >a=rtpmap:97 AMR/8000 > >Is the above alternate possible for any clip?? No, this is not legal SDP. The dynamic RTP payload format code (97 in this case) can be defined by only one "a=rtpmap:" line. What is legal, however, is to use two separate dynamic RTP payload format codes within the same stream, and use one for MPEG-4 LATM audio, and the other for AMR audio - e.g., m=audio 0 RTP/AVP 96 97 ... a=rtpmap:96 MP4A-LATM/22050/2 a=rtpmap:97 AMR/8000 However, our software currently does not support this (at either the server or the client end). (Note that this mailing list is for discussion of the "LIVE555 Streaming Media" software, not for general discussion of SDP etc.) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From dinho.victor at gmail.com Mon Dec 7 00:58:02 2009 From: dinho.victor at gmail.com (Lee Victor) Date: Mon, 7 Dec 2009 16:58:02 +0800 Subject: [Live-devel] I want to add pause and replay function in openRTSP, and I have some questions.
Message-ID: <22f61b560912070058x76a8aeadre3336b73639c6f8b@mail.gmail.com> I want to add pause and replay functions in openRTSP. I think I can use RTSPClient::pauseMediaSession(MediaSession& session) & Boolean RTSPClient::playMediaSession(MediaSession& session, double start, double end, float scale) to realize this. But I have to get the time when it pauses, so that I can use it as the parameter "double start" in playMediaSession. Now my question is: how can I get the time when it pauses? Thank you. From finlayson at live555.com Mon Dec 7 01:17:29 2009 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 7 Dec 2009 01:17:29 -0800 Subject: [Live-devel] I want to add pause and replay function in openRTSP, and I have some questions. In-Reply-To: <22f61b560912070058x76a8aeadre3336b73639c6f8b@mail.gmail.com> References: <22f61b560912070058x76a8aeadre3336b73639c6f8b@mail.gmail.com> Message-ID: >I want to add pause and replay function in openRTSP. I think I can user > >RTSPClient::pauseMediaSession(MediaSession& session) >& >Boolean RTSPClient::playMediaSession(MediaSession& session, double >start, double end, float scale) > >to realize this function. But I have to get the time when it pause >,so that I can use it as the parameter "double start" in >playMediaSession. Just use "-1.0" as the "start" parameter. This will tell the code not to include a "Range:" header in the "PLAY" request, so the server will resume the stream from the pause point. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From dinho.victor at gmail.com Mon Dec 7 01:33:10 2009 From: dinho.victor at gmail.com (Lee Victor) Date: Mon, 7 Dec 2009 17:33:10 +0800 Subject: [Live-devel] I want to add pause and replay function in openRTSP, and I have some questions.
In-Reply-To: References: <22f61b560912070058x76a8aeadre3336b73639c6f8b@mail.gmail.com> Message-ID: <22f61b560912070133l19e9d1b7g9b81e1261ffe9ae8@mail.gmail.com> If I do need to get the time, how can I? 2009/12/7 Ross Finlayson > I want to add pause and replay function in openRTSP. I think I can user >> >> RTSPClient::pauseMediaSession(MediaSession& session) >> & >> Boolean RTSPClient::playMediaSession(MediaSession& session, double start, >> double end, float scale) >> >> to realize this function. But I have to get the time when it pause ,so >> that I can use it as the parameter "double start" in playMediaSession. >> > > Just use "-1.0" as the "start" parameter. This will tell the code not to > include a "Range:" header in the "PLAY" request, so the server will resume > the stream from the pause point. > -- > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > From anders_chen at huntelec.com.tw Mon Dec 7 02:30:13 2009 From: anders_chen at huntelec.com.tw (HUNT_RD3_Anders Chen) Date: Mon, 7 Dec 2009 18:30:13 +0800 Subject: [Live-devel] How to build a RTSP server supports TCP. UDP & Multicast in the same time? Message-ID: <723E16C8E3C043C8831485A8588DA815@huntelec.com.tw> Dear all, I built an RTSP server. It works fine in TCP & UDP modes, but it doesn't support multicast... In TCP/UDP mode, the subsession needs to inherit from the class OnDemandServerMediaSubsession. In multicast mode, it needs to inherit from PassiveServerMediaSubsession. How can I make a path support both modes at the same time? Anders Chen
From lvshengye at gmail.com Mon Dec 7 11:33:55 2009 From: lvshengye at gmail.com (Shengye Lu) Date: Mon, 7 Dec 2009 21:33:55 +0200 Subject: [Live-devel] Does Live555 RTSP server fully support RTSP/RTP TCP-interleave? Message-ID: Hi, I am using the latest version of the Live555 RTSP server, with RealPlayer as the client. If RealPlayer is behind a NAT, RTSP/RTP streaming should go via TCP interleaving. When I test with this setting, a problem occurs: after the RTSP server starts sending interleaved RTP packets to RealPlayer, it does not send RTSP messages any more. The result is that RealPlayer sends RTSP requests at some point while playing the interleaved RTP packets (e.g. OPTIONS, SET_PARAMETER, PAUSE), but never gets RTSP responses from the server. So after a while (in my test, after 1m:27s), RealPlayer reports errors and stops streaming. RTP/UDP does not have this problem. Thanks in advance, -- Shengye From finlayson at live555.com Mon Dec 7 16:36:14 2009 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 7 Dec 2009 16:36:14 -0800 Subject: [Live-devel] Does Live555 RTSP server fully support RTSP/RTP TCP-interleave? In-Reply-To: References: Message-ID: This is a known problem (bug) in the way that we currently handle RTP/RTCP-over-TCP. (Once RTP/RTCP-over-TCP starts, RTSP requests or responses no longer get sent or handled.) Because this is tripping up so many people, it has become a high-priority bug, and a fix will be coming soon. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Mon Dec 7 17:11:12 2009 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 7 Dec 2009 17:11:12 -0800 Subject: [Live-devel] How to build a RTSP server supports TCP. UDP & Multicast in the same time? In-Reply-To: <723E16C8E3C043C8831485A8588DA815@huntelec.com.tw> References: <723E16C8E3C043C8831485A8588DA815@huntelec.com.tw> Message-ID: > When TCP/UDP mode, the subsession needs to inherit from >class OnDemandServerMediaSubsession.
> When multicast mode, it needs to inherit from >PassiveServerMediaSubsession. > How can I make a path support both mode in the same time?? It is the server - not the client - that decides whether a stream is sent via unicast or multicast. Therefore it makes no sense to use the same stream (path) name for both kinds of streaming. Instead, the only way that a client can request unicast or multicast streaming is to use a different stream name for each, and have the server use the stream name to determine which type of streaming is desired. This is what we support. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Mon Dec 7 17:14:37 2009 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 7 Dec 2009 17:14:37 -0800 Subject: [Live-devel] I want to add pause and replay function in openRTSP, and I have some questions. In-Reply-To: <22f61b560912070133l19e9d1b7g9b81e1261ffe9ae8@mail.gmail.com> References: <22f61b560912070058x76a8aeadre3336b73639c6f8b@mail.gmail.com> <22f61b560912070133l19e9d1b7g9b81e1261ffe9ae8@mail.gmail.com> Message-ID: >If I do need to get the time, how can I? Either keep track of the time yourself, using "gettimeofday()", or else - for a more accurate time - call "MediaSubsession::getNormalPlayTime()" (using a stream's most recently received presentation time as parameter). -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From dinho.victor at gmail.com Mon Dec 7 23:03:15 2009 From: dinho.victor at gmail.com (Lee Victor) Date: Tue, 8 Dec 2009 15:03:15 +0800 Subject: [Live-devel] What is the timestamp unit in openRTSP? Message-ID: <22f61b560912072303t3918475er66589d92d7bbd029@mail.gmail.com> What is the timestamp unit in openRTSP? second or microsecond? Thank you.
From amit.yedidia at elbitsystems.com Mon Dec 7 23:38:17 2009 From: amit.yedidia at elbitsystems.com (Yedidia Amit) Date: Tue, 8 Dec 2009 09:38:17 +0200 Subject: [Live-devel] What is the timestamp unit in openRTSP? In-Reply-To: <22f61b560912072303t3918475er66589d92d7bbd029@mail.gmail.com> Message-ID: <49B2D30CB72A5A44BCCAFB026702837805C532@mailesl5.esl.corp.elbit.co.il> The timestamp for video is in most cases (depending on the SDP) 90000 ticks per second; for audio it is according to the sample rate: 8000/16000/44100, etc. ________________________________ From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Lee Victor Sent: Tuesday, December 08, 2009 9:03 AM To: live-devel at ns.live555.com Subject: [Live-devel] What is the timestamp unit in openRTSP? What is the timestamp unit in openRTSP? second or microsecond? Thank you. The information in this e-mail transmission contains proprietary and business sensitive information. Unauthorized interception of this e-mail may constitute a violation of law. If you are not the intended recipient, you are hereby notified that any review, dissemination, distribution or duplication of this communication is strictly prohibited. You are also asked to contact the sender by reply email and immediately destroy all copies of the original message. From finlayson at live555.com Mon Dec 7 23:28:41 2009 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 7 Dec 2009 23:28:41 -0800 Subject: [Live-devel] What is the timestamp unit in openRTSP? In-Reply-To: <22f61b560912072303t3918475er66589d92d7bbd029@mail.gmail.com> References: <22f61b560912072303t3918475er66589d92d7bbd029@mail.gmail.com> Message-ID: >What is the timestamp unit in openRTSP? second or microsecond? I'm not sure what you mean by "timestamp unit", but I'm assuming that you mean "data type of a presentation time".
(The RTP streaming protocol does use timestamps internally, but those usually do not get exposed outside our libraries, and you don't need to concern yourself with them.) The data type for a "presentation time" is "struct timeval" (i.e., seconds and microseconds). -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From vijay.rathi at gmail.com Tue Dec 8 07:41:52 2009 From: vijay.rathi at gmail.com (Vijay Rathi) Date: Tue, 8 Dec 2009 21:11:52 +0530 Subject: [Live-devel] Issue with streaming with Live rtsp stack Message-ID: Hi, I am facing an issue with streaming an MPEG4-only stream using the live stack. On the server side: an MPEG4 video-only session is being streamed from a hardware encoder using the live stack. On the client side: QuickTime Player is able to receive and play it properly without any problem, but openRTSP (the test app. from the live repo.) and VLC are not able to receive any data from the server. The strange thing is that the problem appears when the client and server are both using the live stack, yet QuickTime works without any problem with the live stack.
Thanks in advance, Best Regards, Vijay The log from openRTSP is as below: (command line : ./openRTSP -b 81920 rtsp://192.168.0.29/LiveStreaming) ------------------------------------------------------------------------------------------- Sending request: OPTIONS rtsp://192.168.0.29/LiveStreaming RTSP/1.0 CSeq: 1 User-Agent: ./openRTSP (LIVE555 Streaming Media v2009.06.02) Received OPTIONS response: RTSP/1.0 200 OK CSeq: 1 Date: Thu, Jan 01 1970 00:59:17 GMT Public: OPTIONS, DESCRIBE, SETUP, TEARDOWN, PLAY, PAUSE, GET_PARAMETER, SET_PARAMETER Sending request: DESCRIBE rtsp://192.168.0.29/LiveStreaming RTSP/1.0 CSeq: 2 Accept: application/sdp User-Agent: ./openRTSP (LIVE555 Streaming Media v2009.06.02) Received DESCRIBE response: RTSP/1.0 200 OK CSeq: 2 Date: Thu, Jan 01 1970 00:59:18 GMT Content-Base: rtsp://192.168.0.29/LiveStreaming/ Content-Type: application/sdp Content-Length: 496 Need to read 496 extra bytes Read 496 extra bytes: v=0 o=- 12557483 1 IN IP4 192.168.0.29 s=MyCam i=LiveStreaming t=0 0 a=tool:LIVE555 Streaming Media v2009.06.02 a=type:broadcast a=control:* a=source-filter: incl IN IP4 * 192.168.0.29 a=rtcp-unicast: reflection a=range:npt=0- a=x-qt-text-nam:MyCam a=x-qt-text-inf:LiveStreaming m=video 554 RTP/AVP 96 c=IN IP4 232.228.16.207/255 a=rtpmap:96 MP4V-ES/90000 a=fmtp:96 profile-level-id=5;config=000001B005000001B509000001000000012000847A9828B421E0A31F a=control:track1 Opened URL "rtsp://192.168.0.29/LiveStreaming", returning a SDP description: v=0 o=- 12557483 1 IN IP4 192.168.0.29 s=MyCam i=LiveStreaming t=0 0 a=tool:LIVE555 Streaming Media v2009.06.02 a=type:broadcast a=control:* a=source-filter: incl IN IP4 * 192.168.0.29 a=rtcp-unicast: reflection a=range:npt=0- a=x-qt-text-nam:MyCam a=x-qt-text-inf:LiveStreaming m=video 554 RTP/AVP 96 c=IN IP4 232.228.16.207/255 a=rtpmap:96 MP4V-ES/90000 a=fmtp:96 profile-level-id=5;config=000001B005000001B509000001000000012000847A9828B421E0A31F a=control:track1 Created receiver for "video/MP4V-ES" 
subsession (client ports 554-555) Sending request: SETUP rtsp://192.168.0.29/LiveStreaming/track1 RTSP/1.0 CSeq: 3 Transport: RTP/AVP;multicast;client_port=554-555 User-Agent: ./openRTSP (LIVE555 Streaming Media v2009.06.02) Received SETUP response: RTSP/1.0 200 OK CSeq: 3 Date: Thu, Jan 01 1970 00:59:18 GMT Transport: RTP/AVP;multicast;destination=232.228.16.207;source=192.168.0.29;port=554-555;ttl=255 Session: 55 Setup "video/MP4V-ES" subsession (client ports 554-555) Created output file: "video-MP4V-ES-1" Sending request: PLAY rtsp://192.168.0.29/LiveStreaming/ RTSP/1.0 CSeq: 4 Session: 55 Range: npt=0.000- User-Agent: ./openRTSP (LIVE555 Streaming Media v2009.06.02) Received PLAY response: RTSP/1.0 200 OK CSeq: 4 Date: Thu, Jan 01 1970 00:59:18 GMT Range: npt=0.000- Session: 55 RTP-Info: url=rtsp://192.168.0.29/LiveStreaming/track1;seq=63376;rtptime=1354683609 Started playing session Receiving streamed data (signal with "kill -HUP 3644" or "kill -USR1 3644" to terminate)... From guiluge at gmail.com Tue Dec 8 08:35:52 2009 From: guiluge at gmail.com (Guillaume Ferry) Date: Tue, 8 Dec 2009 17:35:52 +0100 Subject: [Live-devel] [MediaServer] Streaming mp3 file issues In-Reply-To: <425f13530912010646n29e743eeg80ef634f8cde97e7@mail.gmail.com> References: <425f13530912010226u13d605b4yea53022855d388f8@mail.gmail.com> <425f13530912010646n29e743eeg80ef634f8cde97e7@mail.gmail.com> Message-ID: <425f13530912080835t42410244p329bca9eca97a5d0@mail.gmail.com> Hi Ross, I'm still stuck on this mp3 streaming case. I must be missing something, but here are some details. 1) A standard mp3 file is being streamed by the liveMedia server; it works just fine (checked with VLC). 2) I have a sink client (a really simple MediaSink subclass), in which I grab & store frames through the callback 'afterGettingFrame', with a specific frame size.
3) I'm using FFmpeg API (libavformat / libavcodec) to decode this frame, using provided framesize, and I inject it into a sample player, for instance PortAudio. FFmpeg decoding seems ok (no errors), but when I feed PortAudio with those samples, I can hear nothing but noise, and after a few seconds, an app crash. This client / audio player works really fine with remote PCM files, but MP3 files won't play. Could you give me some insights about received RTP frames ? Do I need some specific processing before decoding ? Thanks in advance, Guillaume. 2009/12/1 Guillaume Ferry > Thanks Ross. > By the way, I assume the 'numChannels' parameter is unrelated too ? > I mean, a stereo mp3 file wouldn't have 'numChannels=2' set. > > I'll dive further into libavformat to probe those values (if you have an > idea...you're welcome ;-) ) > > Thanks again, > > Best regards, > Guillaume. > > 2009/12/1 Ross Finlayson > > I'm currently using mediaServer to stream various media on a local >>> network. >>> Next, I implemented a small RTSP client to receive & play that stream. >>> >>> It works really great with WAV/PCM files, but I experience some issues >>> with MP3 files. >>> After some investigation, I found out what is going on : I use >>> systematically provided samplerate (rtpTimeStampFrequency), which is set to >>> 90khz for mP3 files (RTP payload :14)... >>> Incorrect value of course, and it messes up all my playback routines >>> afterwards... >>> >>> Is it possible to obtain an accurate sample rate value through liveMedia >>> classes, or do I need to do some computing ? >>> >> >> The latter. The sampling frequency for MPEG-1 or 2 audio (which includes >> 'MP3') is encoded in the data's MPEG audio header. It has nothing to do >> with the RTP timestamp frequency. >> -- >> >> Ross Finlayson >> Live Networks, Inc. 
>> http://www.live555.com/ >> _______________________________________________ >> live-devel mailing list >> live-devel at lists.live555.com >> http://lists.live555.com/mailman/listinfo/live-devel >> > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Dec 8 16:52:15 2009 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 8 Dec 2009 16:52:15 -0800 Subject: [Live-devel] [MediaServer] Streaming mp3 file issues In-Reply-To: <425f13530912080835t42410244p329bca9eca97a5d0@mail.gmail.com> References: <425f13530912010226u13d605b4yea53022855d388f8@mail.gmail.com> <425f13530912010646n29e743eeg80ef634f8cde97e7@mail.gmail.com> <425f13530912080835t42410244p329bca9eca97a5d0@mail.gmail.com> Message-ID: >Could you give me some insights about received RTP frames? Do I >need some specific processing before decoding? No. The data that you receive after calling "getNextFrame()" on an RTP source should be in a form that you can feed directly to an MP3 (in this case) decoder. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From grzegorz.gwardys at gmail.com Tue Dec 8 17:44:01 2009 From: grzegorz.gwardys at gmail.com (grzegorz gwardys) Date: Wed, 9 Dec 2009 02:44:01 +0100 Subject: [Live-devel] RTSP Server Division Message-ID: <6603af060912081744lec25p232deb6670680bb1@mail.gmail.com> Dear Mr Finlayson, Sorry for posting in the wrong place. The rest of my email: My student task is to implement an RTSP server which wouldn't stream media itself, but would parse RTSP commands (play, stop and so on) and communicate with a Streaming Server, which would be responsible for the streaming. I also have to write a communication protocol between them (the RTSP server and the streaming server). To do this I plan to start from 3 projects - RTSP Client (open RTSP), RTSP Server (test on demand), Streaming Server (also test on demand). I would modify the last 2 to achieve this task. Do you think that's a good idea? 
Or would you recommend some other solution? I look forward to hearing from you. Yours sincerely, -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Dec 8 17:53:43 2009 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 8 Dec 2009 17:53:43 -0800 Subject: [Live-devel] RTSP Server Division In-Reply-To: <6603af060912081744lec25p232deb6670680bb1@mail.gmail.com> References: <6603af060912081744lec25p232deb6670680bb1@mail.gmail.com> Message-ID: At 2:44 AM +0100 12/9/09, grzegorz.gwardys at gmail.com wrote: >My student task Does your school not have its own domain name? :-) > is to implement an RTSP server, which wouldn't stream media, but >parse RTSP commands (play, stop and so on) and communicate with a >Streaming Server, which would be responsible for streaming. Also I >have to write a communication protocol between them (RTSP serv. and >streaming serv.). And to make it I plan to open 3 projects - RTSP >Client (open RTSP), RTSP Server (test on demand), Streaming Server >(also test on demand). The last 2 I would modify to achieve this >task. Do you think that's a good idea? You could do this, I think, although you would need to implement your own "ServerMediaSession" subclasses to get it to do what you want (i.e., communicate with your second streaming server, rather than start streaming itself). From kevin.liu at innosofts.com Tue Dec 8 18:53:01 2009 From: kevin.liu at innosofts.com (Kevin.Liu) Date: Wed, 9 Dec 2009 10:53:01 +0800 Subject: [Live-devel] live555 for wm6.1 loss udp packet Message-ID: <001201ca787a$be988410$3bc98c30$@liu@innosofts.com> Hi to all, I have migrated the live555 library to Windows Mobile 6.1, but I found an issue: the live555 client app cannot receive all the media data sent by the live555 server (UDP). 
I thought the live555 client may not be quick enough to handle the UDP packets, which means the server sends too fast (WLAN), so some UDP packets were dropped by the client. Is there any solution for this, or should I modify some part of live555 for WM? Thanks in advance. Best regards, Kevin Liu -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Dec 9 00:28:55 2009 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 9 Dec 2009 00:28:55 -0800 Subject: [Live-devel] live555 for wm6.1 loss udp packet In-Reply-To: <001201ca787a$be988410$3bc98c30$@liu@innosofts.com> References: <001201ca787a$be988410$3bc98c30$@liu@innosofts.com> Message-ID: > I have migrated the live555 library to window mobile 6.1 ,but I >found a question that live555 client app can not receive all the >media data sent by the live555 server (udp). > I thought the live555 client may be not quick enough to handle the >udp packet which means the server send too fast ( wlan ) so some udp >packets were dropped by the client. > Is there any solution for this question Either: 1/ Your server is sending data faster than it should (e.g., because you're not setting "fDurationInMicroseconds" properly in the data that's fed into your "RTPSink" subclass), or 2/ Your client has insufficient socket buffering. See -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From B23625 at freescale.com Wed Dec 9 01:09:05 2009 From: B23625 at freescale.com (Fan Rong-B23625) Date: Wed, 9 Dec 2009 17:09:05 +0800 Subject: [Live-devel] The implementation of H264VideoStreamFramer stream from .h264 file Message-ID: Hi guys, I am studying H264VideoStreamFramer now. I want to stream from a .h264 file and have VLC play this stream. I know somebody has implemented this feature, but I can't find the source code via Google. Can anyone share this source code with me? Thanks a lot. Fan Rong ________________________________ From: Fan Rong-B23625 Sent: December 9, 2009 
15:22 To: 'live-devel at lists.live555.com' Subject: [Live-devel] ByteStreamFileSource afterGetting function problem Hi Engin, Could you please send me the source code of the h264videostreamframer you implemented? I cannot find your implementation of h264videostreamframer.cpp via Google, and I want to study it. Thanks a lot. Fan Rong -------------- next part -------------- An HTML attachment was scrubbed... URL: From a.nemec at atlas.cz Wed Dec 9 05:42:12 2009 From: a.nemec at atlas.cz (Němec Alexandr) Date: Wed, 09 Dec 2009 13:42:12 GMT Subject: [Live-devel] 64 bit portability Message-ID: <9c2491c0bea64a59be151aec36bc5db5@c6b2383d419d46e38e9f46a0c06fc835> An HTML attachment was scrubbed... URL: -------------- next part -------------- Hi all, I compiled the live555 media streaming library using the Visual Studio 2005 compiler, and the compiler reports 176 warnings when compiled with the 64 bit portability issues detection feature. The warnings are not reported when this compiler feature is turned off. Yes, most of the warnings can be safely ignored (in our opinion). But there are also some lines with type casts of ints or longs to pointers, or vice versa, which might be very dangerous on a 64 bit platform. I attached a text file to this e-mail with all the warnings, including the file, exact line number, and warning description. - I think that some of the lines can potentially cause problems on 64 bit platforms, but without better code knowledge we are not sure. - Are there any plans to get rid of these warnings to make Visual Studio compilers happy? Best regards Alex -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... 
Name: live555.txt URL: From finlayson at live555.com Wed Dec 9 14:36:01 2009 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 9 Dec 2009 14:36:01 -0800 Subject: [Live-devel] 64 bit portability In-Reply-To: <9c2491c0bea64a59be151aec36bc5db5@c6b2383d419d46e38e9f46a0c06fc835> References: <9c2491c0bea64a59be151aec36bc5db5@c6b2383d419d46e38e9f46a0c06fc835> Message-ID: >- Are there any plans to get rid of there warnings to make Visual >Studio compilers happy? No, because most of those warnings are silly and/or bogus, and the few warnings that *are* legitimate turn out not to be a problem in our code. Specifically: - The "conversion from 'SOCKET' to int" warnings are bogus, because the functions "socket()", "fcntl()", and "accept()" have always returned "int" - so if Windoze now thinks that they should return some other data type, then it's just plain broken, IMHO. - The "conversion from 'size_t'" warnings are a bit silly, because functions like "strnlen()" etc. can never reasonably be expected to return sizes >= 2^32 (our buffers will never be that large). - The "conversion to pointer types of 'greater size'" warnings are silly, because these can never cause errors. In our code, these warnings happen only because we use "char*" (i.e., a pointer type) as the type of keys in our hash tables, but the actual keys that we use in some hash table instances will often be other data types (e.g., unsigned) that might (on a 64-bit machine) be smaller (but never larger). - The "pointer truncation" warnings are legitimate, but in our code they're OK, because (in all but one[*] case) they occur only when we retrieve a key from a hash table, and we know for sure that the actual key type is some type other than a pointer. [*] The one other place where we do a 'pointer truncation' is when we use a pointer value to generate a random hash table index. In this case it's OK to use the lower 32 bits only. -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ From kevin.liu at innosofts.com Wed Dec 9 20:04:04 2009 From: kevin.liu at innosofts.com (Kevin.Liu) Date: Thu, 10 Dec 2009 12:04:04 +0800 Subject: [Live-devel] live555 for wm6.1 loss udp packet In-Reply-To: References: <001201ca787a$be988410$3bc98c30$@liu@innosofts.com> Message-ID: <000c01ca794d$d1b54140$751fc3c0$@liu@innosofts.com> Hi Ross: Thank you very much for your reply. I have changed the buffer of the socket to 4,000,000 and the audio stream is much better, but the video stream is still not good. I wrote my own "BufferSink" inheriting from "MediaSink" directly, so I did not use "RTPSink"; must I use "RTPSink" for my RTP packet transport? Best regards, Kevin Liu Senior Software Engineer Innosoft Solutions & Services -----Original Message----- From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: December 9, 2009 16:29 To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] live555 for wm6.1 loss udp packet > I have migrated the live555 library to window mobile 6.1 ,but I >found a question that live555 client app can not receive all the >media data sent by the live555 server (udp). > I thought the live555 client may be not quick enough to handle the >udp packet which means the server send too fast ( wlan ) so some udp >packets were dropped by the client. > Is there any solution for this question Either: 1/ Your server is sending data faster than it should (e.g., because you're not setting "fDurationInMicroseconds" properly in the data that's fed into your "RTPSink" subclass), or 2/ Your client has insufficient socket buffering. See -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel From finlayson at live555.com Thu Dec 10 03:22:13 2009 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 10 Dec 2009 03:22:13 -0800 Subject: [Live-devel] live555 for wm6.1 loss udp packet In-Reply-To: <000c01ca794d$d1b54140$751fc3c0$@liu@innosofts.com> References: <001201ca787a$be988410$3bc98c30$@liu@innosofts.com> <000c01ca794d$d1b54140$751fc3c0$@liu@innosofts.com> Message-ID: > I have changed the buffer of the socket to 4,000,000 and the audio stream >has much better. But the video stream also not good. I wrote my own >"BufferSink" inherit form "MediaSink" directly ,so I did not use the >"RTPSink" ,must I use the "RTPSink" in my rtp packets transport ? No. "RTPSink"s are used only for *sending* RTP packets - i.e., they're used by servers, not clients. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From galaxy at novosun.com Thu Dec 10 03:57:37 2009 From: galaxy at novosun.com (Kam Wai-ip) Date: Thu, 10 Dec 2009 19:57:37 +0800 Subject: [Live-devel] Is this a bug?? Message-ID: <3e8c80350912100357r66017cccx48c746c1b9cae5d1@mail.gmail.com> I just found that when I stream H.264 video, any data that contains 0x0d 0x0a gets replaced by 0x0d 0x0d 0x0a. Is this a bug, or just my imagination? It can be fixed easily, but I wanted to let you guys know. -------------- next part -------------- An HTML attachment was scrubbed... URL: From jaguilera at a5security.com Thu Dec 10 04:04:34 2009 From: jaguilera at a5security.com (Josep Aguilera) Date: Thu, 10 Dec 2009 13:04:34 +0100 Subject: [Live-devel] Crash on TaskScheduler DoEventLoop Message-ID: <008b01ca7990$f20b5180$d621f480$@com> Hi Ross, We implemented code to obtain raw data from IP cameras via RTSP using Live 555. 
In order to obtain it, we create a class which each time a new camera it is connected a new thread of the class is started. You can see the code below which is working properly where each time the function AfterReadingFrame is called a frame in the variable m_FrameBuffer is available. The problem comes when we want to stop and restart the class. Then we assign to the variable blockingValue a value different to 0 and it crash the function env->taskScheduler().doEventLoop(&blockingValue); with the following debug info as output memmove(unsigned char * 0x00000000, unsigned char * 0x07e6db04, unsigned long 1460) line 171 GRABADORA5! BufferedPacket::use(unsigned char *,unsigned int,unsigned int &,unsigned int &,unsigned short &,unsigned int &,struct timeval &,unsigned int &,unsigned int &) + 110 bytes GRABADORA5! MultiFramedRTPSource::doGetNextFrame1(void) + 232 bytes GRABADORA5! MultiFramedRTPSource::networkReadHandler(class MultiFramedRTPSource *,int) + 577 bytes GRABADORA5! BasicTaskScheduler::SingleStep(unsigned int) + 441 bytes GRABADORA5! BasicTaskScheduler0::doEventLoop(char *) + 26 bytes CRTSPCamera::ThreadFunc(void * 0x07e53840) line 376 + 39 bytes GRABADORA5! const DelayQueue::`vftable' address 0x0069ed28 GRABADORA5! BasicTaskScheduler0::~BasicTaskScheduler0(void) + 47 bytes I checked in the mailing lists and it seems that it is working well to everybody! I also called a dummy function before the doEventLoop obtaining the same crash. Thank you in advance! 
---- Source Code -------------------------------------------- void CRTSP_PlayerDlg::dummyTask(void* pData) { // Call this again, after a brief delay: CRTSP_PlayerDlg *pDlg = (CRTSP_PlayerDlg*) pData; if(pDlg) { int uSecsToDelay = 100000; // 100 ms pDlg->env->taskScheduler().scheduleDelayedTask(uSecsToDelay,(TaskFunc*)dummy Task, NULL); } } char blockingValue = 0; UINT CRTSP_PlayerDlg::ThreadFunc(LPVOID pParam) { CRTSP_PlayerDlg *pCaptureFrame = (CRTSP_PlayerDlg *)pParam; MediaSubsessionIterator iter(*(pCaptureFrame->mediaSession)); iter.reset(); while ((subsession = iter.next()) != NULL) { if( subsession->readSource() == NULL ) continue; else { unsigned flags = 0; if (strcmp(subsession->mediumName(), "audio") == 0) { } else if (strcmp(subsession->mediumName(), "video") == 0) { subsession->readSource()->getNextFrame(m_FrameBuffer,MAX_RTP_FRAME_SIZE,Afte rReadingFrame,subsession,onSourceClosure,subsession); } } } //dummyTask(this); env->taskScheduler().doEventLoop(&blockingValue); return 1; } void CRTSP_PlayerDlg::AfterReadingFrame(void* clientData, unsigned frameSize, unsigned /*numTruncatedBytes*/, struct timeval presentationTime, unsigned /*durationInMicroseconds*/) { MediaSubsession* bufferQueue = (MediaSubsession*)clientData; ZeroMemory(m_FrameBuffer,frameSize); bufferQueue->readSource()->getNextFrame(m_FrameBuffer,MAX_RTP_FRAME_SIZE,Aft erReadingFrame,bufferQueue,onSourceClosure,bufferQueue); } Best Regards, Josep Aguilera -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Dec 10 08:59:29 2009 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 10 Dec 2009 08:59:29 -0800 Subject: [Live-devel] Is this a bug?? 
In-Reply-To: <3e8c80350912100357r66017cccx48c746c1b9cae5d1@mail.gmail.com> References: <3e8c80350912100357r66017cccx48c746c1b9cae5d1@mail.gmail.com> Message-ID: >I just found that, when I stream h264 video, every data that contain >0x0d 0x0a will be replaced by 0x0d 0x0d 0x0a, is this a bug, or just >my imagination?? Remember, You Have Complete Source Code. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Thu Dec 10 09:13:49 2009 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 10 Dec 2009 09:13:49 -0800 Subject: [Live-devel] Crash on TaskScheduler DoEventLoop In-Reply-To: <008b01ca7990$f20b5180$d621f480$@com> References: <008b01ca7990$f20b5180$d621f480$@com> Message-ID: >We implemented a code to obtain raw data from cameras IP via RTSP >using Live 555. >In order to obtain it, we create a class which each time a new >camera it is connected >a new thread of the class is started. Have you read the FAQ entry about threads? -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From paulo at voicetechnology.com.br Thu Dec 10 11:56:19 2009 From: paulo at voicetechnology.com.br (=?ISO-8859-1?Q?Paulo_Rog=E9rio_Panhoto?=) Date: Thu, 10 Dec 2009 17:56:19 -0200 Subject: [Live-devel] Integration with Third party RTP library Message-ID: <4B215263.5070609@voicetechnology.com.br> Hello everyone, This is my first post to this list. I was checking out the openRTSP source and one question popped to my mind. How can I use only the RTSP client (to do the session stablishment and control, like PLAY, RECORD) without using the RTP library? Regards, Paulo From galaxy at novosun.com Thu Dec 10 18:56:18 2009 From: galaxy at novosun.com (Kam Wai-ip) Date: Fri, 11 Dec 2009 10:56:18 +0800 Subject: [Live-devel] Is this a bug?? 
In-Reply-To: References: <3e8c80350912100357r66017cccx48c746c1b9cae5d1@mail.gmail.com> Message-ID: <3e8c80350912101856i6bbb1943r50e5ca2a2c98791c@mail.gmail.com> OK, can you tell me where the code that causes this problem is? Then I will try to fix it. Thanks. On Fri, Dec 11, 2009 at 12:59 AM, Ross Finlayson wrote: > I just found that, when I stream h264 video, every data that contain 0x0d >> 0x0a will be replaced by 0x0d 0x0d 0x0a, is this a bug, or just my >> imagination?? >> > > Remember, You Have Complete Source Code. > > -- > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Dec 11 01:16:20 2009 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 11 Dec 2009 01:16:20 -0800 Subject: [Live-devel] Integration with Third party RTP library In-Reply-To: <4B215263.5070609@voicetechnology.com.br> References: <4B215263.5070609@voicetechnology.com.br> Message-ID: > This is my first post to this list. I was checking out the >openRTSP source and one question popped to my mind. How can I use >only the RTSP client (to do the session establishment and control, >like PLAY, RECORD) without using the RTP library? Yes, it's easy to do this - just don't create the "groupsock" and "RTPSink" objects, nor any data sinks. For example, see how we implement the "-r" option (which does exactly this) in "openRTSP": in the file "testProgs/playCommon.cpp". (Note what the code does (or, more precisely, doesn't do) if "createReceivers" is False.) -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ From orbit.huang at icatchinc.com Fri Dec 11 03:59:37 2009 From: orbit.huang at icatchinc.com (orbit) Date: Fri, 11 Dec 2009 19:59:37 +0800 Subject: [Live-devel] streaming H.264 with changing resolution Message-ID: Hi Ross, I use Live555 to stream H.264 from an embedded board and found a bug: on my board, the user can change the video's resolution. At the beginning the user receives H.264 from the board and it plays well, but if the user disconnects and then reconnects with a different resolution, the client player does not decode frames correctly, because the SPS and PPS are not the same as before. This is caused by fSDPLines not being NULL (see the code below), so the media field of the SDP isn't set again. It's not exactly a bug, but if you removed the if condition, it would be more flexible. orbit ---- OnDemandServerMediaSubsession::sdpLines() { if (fSDPLines == NULL) { ....... } return fSDPLines; } -------------- next part -------------- An HTML attachment was scrubbed... URL: From nyckopro at gmail.com Fri Dec 11 04:13:01 2009 From: nyckopro at gmail.com (Nycko) Date: Fri, 11 Dec 2009 09:13:01 -0300 Subject: [Live-devel] Question Message-ID: <5f1066820912110413x2381b3aer36cb800f08058b95@mail.gmail.com> Hi, I have openRTSP running 24h against a couple of IP cameras of different brands. The problem I have is that the recordings are accelerated at night and almost normal during the day (I suppose it is the light). I wonder if it is a matter of parameters or a problem with the cameras. I don't know if this is the right list for this; if not, I apologize. Thanks -- Escobar, Nicolas Dto. Tecnologia Informatica y Telecomunicaciones S.A. 
From jaguilera at a5security.com Fri Dec 11 04:49:08 2009 From: jaguilera at a5security.com (Josep Aguilera) Date: Fri, 11 Dec 2009 13:49:08 +0100 Subject: [Live-devel] Multi-threading Message-ID: <00dc01ca7a60$54f1c3e0$fed54ba0$@com> Hi Ross, Following the FAQ list where is explained that the library assumes only a thread of execution we created a class where the UsageEnvironment variable as well as MediaSession, MediaSubsession are defined. Each camera creates an instance of the class and launch a thread passing the parameters of the class instance. Therefore as you say in the FAQ, if each thread has its own UsageEnvironment and event loop, it is possible to access simultaneously different threads to the library. However still when we are trying to close the event loop (doEventLoop) and we have more than one camera (with one works fine) the library crash. See the crash code below. memmove(unsigned char * 0x00000000, unsigned char * 0x07e6db04, unsigned long 1460) line 171 GRABADORA5! BufferedPacket::use(unsigned char *,unsigned int,unsigned int &,unsigned int &,unsigned short &,unsigned int &,struct timeval &,unsigned int &,unsigned int &) + 110 bytes GRABADORA5! MultiFramedRTPSource::doGetNextFrame1(void) + 232 bytes GRABADORA5! MultiFramedRTPSource::networkReadHandler(class MultiFramedRTPSource *,int) + 577 bytes GRABADORA5! BasicTaskScheduler::SingleStep(unsigned int) + 441 bytes GRABADORA5! BasicTaskScheduler0::doEventLoop(char *) + 26 bytes CRTSPCamera::ThreadFunc(void * 0x07e53840) line 376 + 39 bytes GRABADORA5! const DelayQueue::`vftable' address 0x0069ed28 GRABADORA5! BasicTaskScheduler0::~BasicTaskScheduler0(void) + 47 bytes Thank you in advance. Josep Aguilera. 
-----Mensaje original----- De: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] En nombre de Ross Finlayson Enviado el: jueves, 10 de diciembre de 2009 18:14 Para: LIVE555 Streaming Media - development & use Asunto: Re: [Live-devel] Crash on TaskScheduler DoEventLoop >We implemented a code to obtain raw data from cameras IP via RTSP >using Live 555. >In order to obtain it, we create a class which each time a new >camera it is connected >a new thread of the class is started. Have you read the FAQ entry about threads? -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel From finlayson at live555.com Fri Dec 11 05:31:25 2009 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 11 Dec 2009 05:31:25 -0800 Subject: [Live-devel] streaming H.264 with changing resolution In-Reply-To: References: Message-ID: >I use Live555 to stream H.264 from the embedded borad and find a bug: > >for my borad, user could change video's resolution, at beginning >user recive H.264 from the borad , it plays well, but if user >disconnect then reconnect with different resolution, client player >does not decode frames correctly because SPS and PPS are not the >same as previous caused by fSPLines is not NULL (see below code) and >media field of SDP doesn't be set again. Unfortunately, for most servers, the current behavior - setting up "fSDPLines" only once, when the first client connects - will be the desired behavior, because (for most servers) the stream's SDP parameters will not change from client to client. Also, in some cases, it might be costly to figure out the SDP parameters (e.g., it might involve doing some reading/parsing of the input source), so we would not want to generate these more than once. Therefore the current default behavior should remain. 
What you can do, however, is reimplement the "sdpLines()" virtual function in your "OnDemandServerMediaSubsession" subclass, as follows: char const* yourSubclass::sdpLines() { delete[] fSDPLines; fSDPLines = NULL; return OnDemandServerMediaSubsession::sdpLines(); } Alternatively, you could just add delete[] fSDPLines; fSDPLines = NULL; to your implementation of the "createNewRTPSink()" virtual function. For either of these solutions to work, the definition of "fSDPLines" in "liveMedia/include/OnDemandServerMediaSubsession.hh" will need to be changed from "private" to protected". (I'll make this change in the next release of the software.) I'm curious, though. You talk about the user "reconnect(ing) with a different resolution". How are you doing this in the RTSP protocol? Because you must be creating your "H264VideoRTPSink" with a different "sprop_parameter_sets_str" parameter each time (in order for its SDP description to change), you must be making the "sprop_parameter_sets_str" parameter be somehow dependent upon the client-desired resolution. How are you doing this? -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Fri Dec 11 05:49:28 2009 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 11 Dec 2009 05:49:28 -0800 Subject: [Live-devel] Multi-threading In-Reply-To: <00dc01ca7a60$54f1c3e0$fed54ba0$@com> References: <00dc01ca7a60$54f1c3e0$fed54ba0$@com> Message-ID: >Following the FAQ list where is explained that the library assumes only a >thread of execution we created a >class where the UsageEnvironment variable as well as MediaSession, >MediaSubsession are defined. You will also need a separate "TaskScheduler" object for each. And *no* "liveMedia" objects at all can be shared between different threads; each thread must create and use completely separate objects. >Each camera creates an instance of the class and launch a thread passing the >parameters of the class instance. 
You probably don't need multiple threads for this. Instead, your application should still be able to use a single event loop, even with multiple data sources. >Therefore as you say in the FAQ, if each thread has its own UsageEnvironment >and event loop, it is possible to access simultaneously >different threads to the library. Yes, although this has not been well tested, and is still not recommended. >However still when we are trying to close the event loop (doEventLoop) and >we have more than one camera (with one works fine) the >library crash. Unfortunately you're going to have to figure this out yourself. It's possible that there is still some shared memory somewhere that is getting messed up when accessed by more than one thread. If you find this, please let us know. Alternatively, as I noted above, you could just have your multiple cameras use a single event loop. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From a.nemec at atlas.cz Fri Dec 11 05:53:33 2009 From: a.nemec at atlas.cz (=?utf-8?B?TsSbbWVjIEFsZXhhbmRy?=) Date: Fri, 11 Dec 2009 13:53:33 GMT Subject: [Live-devel] Multi-threading Message-ID: <088218cb30684dc7a3cdac3189369e77@9244525931044f47ad715156ce3ca771> Hi, I would suggest to review your code, we use it the same way in our testing environment for many IP cameras using one thread for each camera and we can safely exit each doEventLoop and the corresponding thread. You must ensure that you start to clean up all the objects _after_ the event loop has finished which is NOT directly after setting the watch variable, you must do a correct synchronization in your code to use live555 safely in a multithreaded environment. Hope this helps. 
Regards Alex >--------------------------------------------------------- >Od: Josep Aguilera >P?ijato: 11.12.2009 14:41:31 >P?edm?t: [Live-devel] Multi-threading > >Hi Ross, > > > >Following the FAQ list where is explained that the library assumes only a > >thread of execution we created a > >class where the UsageEnvironment variable as well as MediaSession, > >MediaSubsession are defined. > >Each camera creates an instance of the class and launch a thread passing the > >parameters of the class instance. > > > >Therefore as you say in the FAQ, if each thread has its own UsageEnvironment > >and event loop, it is possible to access simultaneously > >different threads to the library. > > > >However still when we are trying to close the event loop (doEventLoop) and > >we have more than one camera (with one works fine) the > >library crash. See the crash code below. > > > > > >memmove(unsigned char * 0x00000000, unsigned char * 0x07e6db04, unsigned > >long 1460) line 171 > >GRABADORA5! BufferedPacket::use(unsigned char *,unsigned int,unsigned int > >&,unsigned int &,unsigned short &,unsigned int &,struct timeval &,unsigned > >int &,unsigned int &) + 110 bytes > >GRABADORA5! MultiFramedRTPSource::doGetNextFrame1(void) + 232 bytes > >GRABADORA5! MultiFramedRTPSource::networkReadHandler(class > >MultiFramedRTPSource *,int) + 577 bytes > >GRABADORA5! BasicTaskScheduler::SingleStep(unsigned int) + 441 bytes > >GRABADORA5! BasicTaskScheduler0::doEventLoop(char *) + 26 bytes > >CRTSPCamera::ThreadFunc(void * 0x07e53840) line 376 + 39 bytes > >GRABADORA5! const DelayQueue::`vftable' address 0x0069ed28 > >GRABADORA5! BasicTaskScheduler0::~BasicTaskScheduler0(void) + 47 bytes > > > > > >Thank you in advance. > > > >Josep Aguilera. 
> > > >-----Mensaje original----- > >De: live-devel-bounces at ns.live555.com > >[mailto:live-devel-bounces at ns.live555.com] En nombre de Ross Finlayson > >Enviado el: jueves, 10 de diciembre de 2009 18:14 > >Para: LIVE555 Streaming Media - development & use > >Asunto: Re: [Live-devel] Crash on TaskScheduler DoEventLoop > > > >>We implemented a code to obtain raw data from cameras IP via RTSP > >>using Live 555. > >>In order to obtain it, we create a class which each time a new > >>camera it is connected > >>a new thread of the class is started. > > > >Have you read the FAQ entry about threads? > >-- > > > >Ross Finlayson > >Live Networks, Inc. > >http://www.live555.com/ > >_______________________________________________ > >live-devel mailing list > >live-devel at lists.live555.com > >http://lists.live555.com/mailman/listinfo/live-devel > > > >_______________________________________________ > >live-devel mailing list > >live-devel at lists.live555.com > >http://lists.live555.com/mailman/listinfo/live-devel > > > > From rmcouat at smartt.com Fri Dec 11 09:24:40 2009 From: rmcouat at smartt.com (Ron McOuat) Date: Fri, 11 Dec 2009 09:24:40 -0800 Subject: [Live-devel] Question In-Reply-To: <5f1066820912110413x2381b3aer36cb800f08058b95@mail.gmail.com> References: <5f1066820912110413x2381b3aer36cb800f08058b95@mail.gmail.com> Message-ID: <4B228058.4050805@smartt.com> IP cameras don't always deliver the frame rate requested because under low light conditions the "shutter speed" is dropped to a slow enough value the frames can't be generated fast enough. Nycko wrote: > Hi, I openRTSP the 24h running against a couple of IP cameras of > different brands, The problem I have is that recordings are more > accelerated in the night and almost normally on the day (I suppose it > is light). I wonder if it is a matter of parameters or a problem with > the cameras. > Nose if this is the list referred to, if not apologize. 
> > Thanks > From pfilists at gmail.com Sun Dec 13 12:58:34 2009 From: pfilists at gmail.com (Peter Lists) Date: Sun, 13 Dec 2009 21:58:34 +0100 Subject: [Live-devel] VoD rtp multicast transcoding Message-ID: <722bbdb30912131258v360ad0c8udb5fd4de0caf9760@mail.gmail.com> Hello, I'd like to ask whether it is possible to set up this type of configuration with the LIVE555 Media Server. I have clients behind "stupid" NAT machines that aren't under my administration. The problem is that I need to serve these clients unicast VoD versions of transcoded multicast streams from our network. With VLC this can be set up without problems, but the VLC server side doesn't support RTSP over TCP or RTSP-over-HTTP tunneling; only UDP transport is supported, so NATed clients can't receive the streams (RTSP over UDP is the problem). So the question is whether it is possible with the LIVE555 Media Server to set up VoD reception of an rtp://@stream and on-the-fly transcoding to H.264, serving the transcoded stream to the client via RTSP (RTSP over TCP) or RTSP-HTTP-tunneled. That is, when a client connects to the media server via RTSP, the VoD function starts receiving the rtp://@ multicast stream, transcodes it on the fly to H.264, and serves it to the client. Can this configuration be realized? Thank you for ideas & help. Peter -------------- next part -------------- An HTML attachment was scrubbed...
URL: From orbit.huang at icatchinc.com Mon Dec 14 02:09:41 2009 From: orbit.huang at icatchinc.com (orbit) Date: Mon, 14 Dec 2009 18:09:41 +0800 Subject: [Live-devel] streaming H.264 with changing resolution Message-ID: > >I use Live555 to stream H.264 from the embedded borad and find a bug: > > > >for my borad, user could change video's resolution, at beginning > >user recive H.264 from the borad , it plays well, but if user > >disconnect then reconnect with different resolution, client player > >does not decode frames correctly because SPS and PPS are not the > >same as previous caused by fSPLines is not NULL (see below code) and > >media field of SDP doesn't be set again. > > Unfortunately, for most servers, the current behavior - setting up > "fSDPLines" only once, when the first client connects - will be the > desired behavior, because (for most servers) the stream's SDP > parameters will not change from client to client. Also, in some > cases, it might be costly to figure out the SDP parameters (e.g., it > might involve doing some reading/parsing of the input source), so we > would not want to generate these more than once. Therefore the > current default behavior should remain. > > What you can do, however, is reimplement the "sdpLines()" virtual > function in your "OnDemandServerMediaSubsession" subclass, as follows: > > char const* yourSubclass::sdpLines() { > delete[] fSDPLines; fSDPLines = NULL; > > return OnDemandServerMediaSubsession::sdpLines(); > } > Alternatively, you could just add > delete[] fSDPLines; fSDPLines = NULL; > to your implementation of the "createNewRTPSink()" virtual function. > > For my need, I add a virtual const char* sdpLines(FramedSource* inputSource) = 0 to SeverMediaSession.cpp and reimplament myServerMediaSubsession const char* myServerMediaSubsession::sdpLines(FramedSource* inputSource) { if (fSDPLines != NULL) { delete[] fSDPLines; fSDPLines = NULL; } if (fSDPLines == NULL) { ...... ...... 
RTPSink* dummyRTPSink = createNewRTPSink(&dummyGroupsock, rtpPayloadType, inputSource);
    setSDPLinesFromRTPSink(dummyRTPSink, inputSource);
    Medium::close(dummyRTPSink);
  }
  return fSDPLines;
}
> For either of these solutions to work, the definition of "fSDPLines" > in "liveMedia/include/OnDemandServerMediaSubsession.hh" will need to > be changed from "private" to "protected". (I'll make this change in > the next release of the software.) > I'm curious, though. You talk about the user "reconnect(ing) with a > different resolution". How are you doing this in the RTSP protocol? > Because you must be creating your "H264VideoRTPSink" with a different > "sprop_parameter_sets_str" parameter each time (in order for its SDP > description to change), you must be making the > "sprop_parameter_sets_str" parameter be somehow dependent upon the > client-desired resolution. How are you doing this? > I added an H264VideoStreamFramer member (fStreamSource) to RTSPClientSession; my H264VideoStreamFramer creates a thread to read the stream from the board and write it to a FIFO. When the server accepts the client's first request (DESCRIBE or OPTIONS), RTSPClientSession checks whether the FIFO has data and builds the "sprop_parameter_sets_str":
envir().taskScheduler().turnOnBackgroundReadHandling(fStreamSource->getClientReadPipe(), (TaskScheduler::BackgroundHandlerProc*)&incomingSPSAndPPSHandler, this);
If it gets "sprop_parameter_sets_str", it processes commands continuously; otherwise RTSPClientSession terminates because of a timeout. orbit. -------------- next part -------------- An HTML attachment was scrubbed... URL: From kevin.liu at innosofts.com Mon Dec 14 19:30:33 2009 From: kevin.liu at innosofts.com (Kevin.Liu) Date: Tue, 15 Dec 2009 11:30:33 +0800 Subject: [Live-devel] increase socket receive buffer to handle udp packet loss ? Message-ID: <000001ca7d36$fd1331c0$f7399540$@liu@innosofts.com> Hi , My live555 for wm6.1 loses about 20%-40% of UDP packets.
And I found that if I receive the data with another program written by myself, which only receives UDP packets on a certain port, the packet loss was only 10%. I have enlarged the socket recv buffer to 4000000. I found that when live555 receives data from the internet, it deposits the data into the ReorderingPacketBuffer, then the *source uses "getNextCompletedPacket" to take the packet from the ReorderingPacketBuffer, and finally moves the data into the buffer of the *sink. This process consumes some time, and if a UDP packet arrives before the process has finished, the packet is held in the socket recv buffer until that buffer is full. Once the socket recv buffer is full, further UDP packets are discarded; this causes the packet loss. Is that right? The socket receive buffer size must have a maximum limit; I set it to 4000000, but that did not seem to work well. Are there any suggestions for live555 on Windows Mobile to handle the packet loss? Thanks in advance. Best regards, Kevin Liu -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Dec 14 23:23:24 2009 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 14 Dec 2009 23:23:24 -0800 Subject: [Live-devel] increase socket receive buffer to handle udp packet loss ? In-Reply-To: <000001ca7d36$fd1331c0$f7399540$@liu@innosofts.com> References: <000001ca7d36$fd1331c0$f7399540$@liu@innosofts.com> Message-ID: I've just added the following paragraph to the FAQ entry on packet loss: "It's important to understand that because a LIVE555 Streaming Media application runs as a single thread - never writing to, or reading from, sockets concurrently - then if packet loss occurs, then it *must* be happening either (i) on the network, or (ii) in the operating system of the sender or receiver. There's nothing in our code that can be 'losing' packets." -- Ross Finlayson Live Networks, Inc.
http://www.live555.com/ From jaguilera at a5security.com Tue Dec 15 01:29:57 2009 From: jaguilera at a5security.com (Josep Aguilera) Date: Tue, 15 Dec 2009 10:29:57 +0100 Subject: [Live-devel] Restart Capture Frame Message-ID: <003901ca7d69$2bb60e30$83222a90$@com> Dear Ross, we are trying now change the treatment of the EventLoop because all our problems is the multithreading using the schedule. We changed our code using now only one thread to control that Eventloop. We do this steps: 1.- Creation of the BasicUsageEnvironment with one schedule. 2.- Initialization of all threads using that Environment. 3.- Start the EventLoop. 4.- Our application processes. 5.- To destroy, we have the problems in that point, we start doing the next code to destroy each thread (no EventLoop thread): ---------------------------------- blockingValue = ~0; // Value to Stop the EventLoop while() //Wait until the EventLoop is stopped ---------------------------------- Medium::close(ourClient); // RTSPClient* ourClient g_pCaptureCtrl->m_bEnv->reclaim(); // UsageEnvironment* m_bEnv; StopCaptureFrame(); //Stop and destroy one thread ---------------------------------- 6.- After that, we try to restart again all the threads (EventLoop thread too) repeating point 1. Then the program crashes and we can look in the call stack that information: ---------------------------------- memmove(unsigned char * 0x00000000, unsigned char * 0x02a74f74, unsigned long 1460) line 171 GRABADORA5! BufferedPacket::use(unsigned char *,unsigned int,unsigned int &,unsigned int &,unsigned short &,unsigned int &,struct timeval &,unsigned int &,unsigned int &) + 110 bytes GRABADORA5! MultiFramedRTPSource::doGetNextFrame1(void) + 232 bytes GRABADORA5! MultiFramedRTPSource::networkReadHandler(class MultiFramedRTPSource *,int) + 577 bytes GRABADORA5! BasicTaskScheduler::SingleStep(unsigned int) + 441 bytes GRABADORA5! 
BasicTaskScheduler0::doEventLoop(char *) + 26 bytes CCaptureCtrl::ThreadFunc(void * 0x0283de80) line 1922 + 34 bytes GRABADORA5! const DelayQueue::`vftable' address 0x006a4fd8 GRABADORA5! BasicTaskScheduler0::~BasicTaskScheduler0(void) + 47 bytes fffffcd8() ---------------------------------- The error is produced when we try to call the first time again the EventLoop: --------------------------- pCapCtrl->m_bEnv->taskScheduler().doEventLoop(&pCapCtrl->m_cBlockingValue/*0 */); --------------------------- Can do you help us to solve that? Best Regards and Thank you in advance, Josep Aguilera -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Dec 15 02:09:24 2009 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 15 Dec 2009 02:09:24 -0800 Subject: [Live-devel] Restart Capture Frame In-Reply-To: <003901ca7d69$2bb60e30$83222a90$@com> References: <003901ca7d69$2bb60e30$83222a90$@com> Message-ID: >we are trying now change the treatment of the EventLoop because all >our problems is the multithreading using the schedule. We changed >our code using now only one thread to control that Eventloop. >We do this steps: >1.- Creation of the BasicUsageEnvironment with one schedule. >2.- Initialization of all threads using that Environment. WTF do you mean by this?? If you have a "UsageEnvironment" and "TaskScheduler" object, then ***exactly one thread of control, and no more*** can access them. If you have any other threads, then they can't access these objects at all. The only thing the other threads can do is set a "watchVariable" that's used as a parameter to "TaskScheduler::doEventLoop()". Reread the FAQ! >Can do you help us to solve that? If you don't understand (or try to ignore) how this code works - i.e, by using a single thread of execution using an event loop - rather than multiple threads - for concurrency, then you will get no support from me. -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From kevin.liu at innosofts.com Thu Dec 17 02:39:23 2009 From: kevin.liu at innosofts.com (Kevin.Liu) Date: Thu, 17 Dec 2009 18:39:23 +0800 Subject: [Live-devel] openRTSP test on windows mobile problem ? Message-ID: <000001ca7f05$39b13800$ad13a800$@liu@innosofts.com> Hi Ross : I used the openRTSP to test the live555 libs for windows mobile and I found if I open the mp3 or wav audio file on server ,the openRTSP worked good .But if I open mp4 or mpg which contains both audio and video streams ,the openRTSP would not work good. I know that the live555 is single thread structure ,the video stream should not affect the audio stream . so I was really confused by the problem ,can you give me some advice ,thank you in advance . Best regards, Kevin Liu -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Dec 17 04:55:21 2009 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 17 Dec 2009 04:55:21 -0800 Subject: [Live-devel] openRTSP test on windows mobile problem ? In-Reply-To: <000001ca7f05$39b13800$ad13a800$@liu@innosofts.com> References: <000001ca7f05$39b13800$ad13a800$@liu@innosofts.com> Message-ID: > I used the openRTSP to test the live555 libs for windows mobile >and I found if I open the mp3 or wav audio file on server ,the >openRTSP worked good .But if I open mp4 or mpg which contains both >audio and video streams ,the openRTSP would not work good. http://www.live555.com/liveMedia/faq.html#my-file-doesnt-work -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ From debargha.mukherjee at hp.com Thu Dec 17 11:37:08 2009 From: debargha.mukherjee at hp.com (Mukherjee, Debargha) Date: Thu, 17 Dec 2009 19:37:08 +0000 Subject: [Live-devel] Audio drift with live source Message-ID: <73833378E80044458EC175FF8C1E63D5824575750A@GVW0433EXB.americas.hpqcorp.net> HI, I am receiving uncompressed audio and video from a live Directshow source, encoding them with ffmpeg and streaming them out using a RTSP server based on classes derived from Live OnDemandServerMediaSubsession. Then I play the feed on a remote machine using VLC player. The problem is that the audio and video start playing fine and well synchronized, but then the audio starts drifting slowly. Specifically, it gets faster. Within 5-10 minutes it is noticeably faster than the video and within 15 minutes, the audio stops playing altogether. I get error messages on VLC player saying that PTS is out of range. I am not sure if this is a VLC issue, but I suspect there is something about timestamps I may be doing wrong. In my implementation of deliverFrame() function in the classes for the sources, I read uncompressed audio and video from ring buffers (which are filled by another thread), compress them, and then fill the buffers accordingly before calling FramedSource::afterGetting(this). I also set fPresentationTime using gettimeofday(&fPresentationTime, NULL); and set fDurationInMicroseconds to 1000000/30 for video and the audio frame duration for audio. Occasionally, when the deliverFrame() function tries to read from the ring buffers, it does not find data available. Then I call envir().taskScheduler().scheduleDelayedTask(...) with a small delay interval and return. Any help or clues would be appreciated. Debargha. 
From finlayson at live555.com Thu Dec 17 17:18:33 2009 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 17 Dec 2009 17:18:33 -0800 Subject: [Live-devel] Audio drift with live source In-Reply-To: <73833378E80044458EC175FF8C1E63D5824575750A@GVW0433EXB.americas.hpqcorp.ne t> References: <73833378E80044458EC175FF8C1E63D5824575750A@GVW0433EXB.americas.hpqcorp.ne t> Message-ID: >In my implementation of deliverFrame() function in the classes for >the sources, I read uncompressed audio and video from ring buffers >(which are filled by another thread), compress them, and then fill >the buffers accordingly before calling >FramedSource::afterGetting(this). I also set fPresentationTime using >gettimeofday(&fPresentationTime, NULL); and set >fDurationInMicroseconds to 1000000/30 for video and the audio frame >duration for audio. If "fPresentationTime" is set properly (to accurate wall-clock-synchronized presentation times) for both the audio and video frames, then VLC should be synchronizing them properly at the receiving end. (This is assuming, of course, that RTCP "SR" packets from the server are also reaching the client - which they should.) So, I suggest taking a closer look at the setting of "fPresentationTime" - for both audio and video frames - and making sure that they are accurate. Also, in principle, because you are reading from a live source (rather than from a prerecorded file), you need not set "fDurationInMicroseconds" (and so it will get set to its default value of 0). However, this would mean that the situation that you describe below will become the norm: >Occasionally, when the deliverFrame() function tries to read from >the ring buffers, it does not find data available. Then I call >envir().taskScheduler().scheduleDelayedTask(...) with a small delay >interval and return. This is OK, provided that (once again) you are setting the presentation time properly. 
Ideally, you should be recording the presentation time (obtained by calling "gettimeofday()") at the time that the data is encoded, not at the time that it gets read from your ring buffer by your "FramedSource" subclass. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From gabrieledeluca at libero.it Fri Dec 18 03:36:59 2009 From: gabrieledeluca at libero.it (gabrieledeluca at libero.it) Date: Fri, 18 Dec 2009 12:36:59 +0100 (CET) Subject: [Live-devel] Shared memory Message-ID: <30917646.34231261136219808.JavaMail.defaultUser@defaultHost> Hi, I have implemented a client RTSP that record the stream in 2 buffer files. So I have implemented a server RTSP that read from the buffer files. I want know if it's possible create a wrapper that writes and reads on a shared memory (and not on buffer files). Thanks in advance From jnoring at logitech.com Fri Dec 18 07:57:11 2009 From: jnoring at logitech.com (Jeremy Noring) Date: Fri, 18 Dec 2009 08:57:11 -0700 Subject: [Live-devel] Shared memory In-Reply-To: <30917646.34231261136219808.JavaMail.defaultUser@defaultHost> References: <30917646.34231261136219808.JavaMail.defaultUser@defaultHost> Message-ID: <988ed6930912180757m14155b52te4e4141faf02d186@mail.gmail.com> On Fri, Dec 18, 2009 at 4:36 AM, gabrieledeluca at libero.it < gabrieledeluca at libero.it> wrote: > Hi, > I have implemented a client RTSP that record the stream in 2 buffer files. > > So I have implemented a server RTSP that read from the buffer files. > > I want > know if it's possible create a wrapper that writes and reads on a shared > memory > (and not on buffer files). > Of course it's possible, although I don't think that has anything to do with this mailing list. Depending on what OS you're using, there are several possibilities: http://en.wikipedia.org/wiki/Shared_memory So long as you follow the threading guidelines for Live555 (see the FAQ), you can also put client/server in the same process. 
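For the shared-memory question above, one concrete in-process mechanism is an OS pipe: writer and reader share a kernel buffer and nothing is ever written to disk. A minimal Python sketch (illustrative only, not live555 code):

```python
import os

# Create an in-memory channel: data written to write_fd can be read
# from read_fd without ever touching the filesystem.
read_fd, write_fd = os.pipe()

os.write(write_fd, b"fake RTP payload")
os.close(write_fd)  # writer done; reader sees EOF after the buffered data

data = os.read(read_fd, 4096)
os.close(read_fd)
```

Each end of the pipe behaves like an open file descriptor, which is why the "recorder writes, server reads" design above can keep its file-like code while dropping the intermediate buffer files.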
-------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Dec 18 14:15:59 2009 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 18 Dec 2009 14:15:59 -0800 Subject: [Live-devel] Shared memory In-Reply-To: <30917646.34231261136219808.JavaMail.defaultUser@defaultHost> References: <30917646.34231261136219808.JavaMail.defaultUser@defaultHost> Message-ID: >I have implemented a client RTSP that record the stream in 2 buffer files. > >So I have implemented a server RTSP that read from the buffer files. > >I want >know if it's possible create a wrapper that writes and reads on a >shared memory >(and not on buffer files). You can just use pipes, if your operating system supports them. They provide the abstraction (at each end) of an open file, but implemented by shared memory. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From smurthi3 at gatech.edu Sat Dec 19 10:57:34 2009 From: smurthi3 at gatech.edu (smurthi3 at gatech.edu) Date: Sat, 19 Dec 2009 13:57:34 -0500 (EST) Subject: [Live-devel] Packet Loss simulation In-Reply-To: <1280303284.256981261248932003.JavaMail.root@mail4.gatech.edu> Message-ID: <1186217986.257111261249054792.JavaMail.root@mail4.gatech.edu> Hi All, I am trying to simulate various levels of packet loss in the live555 streaming server ranging from 5-50% in steps of 5. The change has to be done in MultiFramedRTPSource.cpp/MultiFramedRTPSink.cpp but it uses (our_random() % 10) to generate 10% packet loss. This method does not work for values other than 10%. The other possibility is to use fSeqNo instead of our_random. But this does not create a truly random scenario. Any ideas on this? 
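The 10% test can be generalized by drawing a value in [0, 100) and dropping the packet when the draw falls below N. A quick empirical check in plain Python (seeded for reproducibility; this uses Python's random, not live555's our_random()):

```python
import random

def simulated_loss_percent(n_percent, packets=100_000, seed=1):
    """Drop a packet when a uniform draw in [0, 100) is below n_percent."""
    rng = random.Random(seed)
    dropped = sum(1 for _ in range(packets) if rng.randrange(100) < n_percent)
    return 100.0 * dropped / packets

for n in range(5, 55, 5):  # 5% .. 50% in steps of 5
    measured = simulated_loss_percent(n)
    assert abs(measured - n) < 1.0  # empirical rate tracks the target
```

Using a fresh random draw per packet (rather than fSeqNo) keeps the drops independent and uncorrelated with packet position, which is usually what a loss simulation needs.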
From aniworld1248 at gmail.com Sat Dec 19 11:34:11 2009 From: aniworld1248 at gmail.com (ani ani) Date: Sun, 20 Dec 2009 01:04:11 +0530 Subject: [Live-devel] Fwd: Problem in recieving AMR, AAC and WAV In-Reply-To: <147cd0cc0912191126y26be69b5nc7338d2ee1810137@mail.gmail.com> References: <147cd0cc0912191126y26be69b5nc7338d2ee1810137@mail.gmail.com> Message-ID: <147cd0cc0912191134g65ef7d88kc86615239edf702f@mail.gmail.com> ---------- Forwarded message ---------- From: ani ani Date: Sun, Dec 20, 2009 at 12:56 AM Subject: Problem in recieving AMR, AAC and WAV To: support at live555.com, live-devel at lists.live555.com Hello, I am using testOnDemandRTSPServer as the server application and openRTSP as the client application. When I try to receive AMR, AAC and WAV streams using the URL specified by the server, I am not able to play them back using players like VLC, QT, etc. However, when I receive an MP3 stream using the URL specified by the server, I am able to play it back using VLC, Winamp, etc. I used WAV (PCM-L16) and AMR (Wide Band) streams for the streaming server. All the files play correctly before being streamed, but when I receive the same files through the server, I am not able to play them back. Please suggest what I should do so that I can play back the received streams. Regards Ani -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sat Dec 19 18:44:49 2009 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 19 Dec 2009 19:44:49 -0700 Subject: [Live-devel] Packet Loss simulation In-Reply-To: <1186217986.257111261249054792.JavaMail.root@mail4.gatech.edu> References: <1186217986.257111261249054792.JavaMail.root@mail4.gatech.edu> Message-ID: >I am trying to simulate various levels of packet loss in the live555 >streaming server ranging from 5-50% in steps of 5.
The change has to >be done in MultiFramedRTPSource.cpp/MultiFramedRTPSink.cpp but it >uses (our_random() % 10) to generate 10% packet loss. > >This method does not work for values other than 10%. The other >possibility is to use fSeqNo instead of our_random. Oh please. To simulate N% packet loss (where N is an integer), just test for
(our_random()%100) < N
If N = 10, this is
(our_random()%100) < 10
which is equivalent to
(our_random()%10) == 0
-- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From kevin.liu at innosofts.com Sun Dec 20 02:01:49 2009 From: kevin.liu at innosofts.com (Kevin.Liu) Date: Sun, 20 Dec 2009 18:01:49 +0800 Subject: [Live-devel] QOS message of live555 ? Message-ID: <000601ca815b$7475ba80$5d612f80$@liu@innosofts.com> Hi all: I have made a test on Windows Mobile for openRTSP and I also added the "-q" option, which prints the QOS report. The report showed that I have lost about 30% of the packets. I have extended the socket receive buffer to 2,000,000 and I also used the LIVE555 Media Server included in the source code. The strange thing is that if I transport only an audio stream, like an MP3 file, the packet loss is very small, but if I transport both audio and video streams, like MP4 or MPG files, the packet loss is really big. And I found that a 6 MB MPG file lasts only 37 seconds (SDP message "a=range:npt=0-37.160") while a 5 MB MP3 file lasts 301 seconds in the same network environment. My live555 version is 2009.09.28, and I have tested the library on a PC, where it worked really well. Any advice on this question? Please help me out of this strange problem! Thanks in advance.
Following are the SDP messages I got:

1. MPG file (size 6 MB):

v=0
o=- 924739507 1 IN IP4 192.168.1.100
s=MPEG-1 or 2 Program Stream, streamed by the LIVE555 Media Server
i=pict.mpg
t=0 0
a=tool:LIVE555 Streaming Media v2009.09.28
a=type:broadcast
a=control:*
a=range:npt=0-37.160
a=x-qt-text-nam:MPEG-1 or 2 Program Stream, streamed by the LIVE555 Media Server
a=x-qt-text-inf:pict.mpg
m=video 0 RTP/AVP 32
c=IN IP4 0.0.0.0
a=control:track1
m=audio 0 RTP/AVP 14
c=IN IP4 0.0.0.0
a=control:track2

2. MP3 file (size 5 MB):

v=0
o=- 5850143118 1 IN IP4 192.168.1.100
s=MPEG-1 or 2 Audio, streamed by the LIVE555 Media Server
i=test3.mp3
t=0 0
a=tool:LIVE555 Streaming Media v2009.09.28
a=type:broadcast
a=control:*
a=range:npt=0-301.840
a=x-qt-text-nam:MPEG-1 or 2 Audio, streamed by the LIVE555 Media Server
a=x-qt-text-inf:test3.mp3
m=audio 0 RTP/AVP 14
c=IN IP4 0.0.0.0
a=control:track1

Best regards, Kevin Liu -------------- next part -------------- An HTML attachment was scrubbed... URL: From kevin.liu at innosofts.com Sun Dec 20 05:40:42 2009 From: kevin.liu at innosofts.com (Kevin.Liu) Date: Sun, 20 Dec 2009 21:40:42 +0800 Subject: [Live-devel] QOS message of live555 ? In-Reply-To: <000601ca815b$7475ba80$5d612f80$@liu@innosofts.com> References: <000601ca815b$7475ba80$5d612f80$@liu@innosofts.com> Message-ID: <001101ca817a$0870e4d0$1952ae70$@liu@innosofts.com> Hi all, I also found that if I change the transport protocol to "tcp" with the -t option of openRTSP, its performance was worse than "udp". It is really strange: TCP should perform much better than UDP, but the result was the other way around. Best regards, Kevin Liu From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Kevin.Liu Sent: 2009-12-20 18:02 To: live-devel at ns.live555.com Subject: [Live-devel] QOS message of live555 ? Hi all: I have made a test on Windows Mobile for openRTSP and I also added the "-q" option, which prints the QOS report.
The report showed that I have lost about 30% of the packets. I have extended the socket receive buffer to 2,000,000 and I also used the LIVE555 Media Server included in the source code. The strange thing is that if I transport only an audio stream, like an MP3 file, the packet loss is very small, but if I transport both audio and video streams, like MP4 or MPG files, the packet loss is really big. And I found that a 6 MB MPG file lasts only 37 seconds (SDP message "a=range:npt=0-37.160") while a 5 MB MP3 file lasts 301 seconds in the same network environment. My live555 version is 2009.09.28, and I have tested the library on a PC, where it worked really well. Any advice on this question? Please help me out of this strange problem! Thanks in advance. Following are the SDP messages I got:

1. MPG file (size 6 MB):

v=0
o=- 924739507 1 IN IP4 192.168.1.100
s=MPEG-1 or 2 Program Stream, streamed by the LIVE555 Media Server
i=pict.mpg
t=0 0
a=tool:LIVE555 Streaming Media v2009.09.28
a=type:broadcast
a=control:*
a=range:npt=0-37.160
a=x-qt-text-nam:MPEG-1 or 2 Program Stream, streamed by the LIVE555 Media Server
a=x-qt-text-inf:pict.mpg
m=video 0 RTP/AVP 32
c=IN IP4 0.0.0.0
a=control:track1
m=audio 0 RTP/AVP 14
c=IN IP4 0.0.0.0
a=control:track2

2. MP3 file (size 5 MB):

v=0
o=- 5850143118 1 IN IP4 192.168.1.100
s=MPEG-1 or 2 Audio, streamed by the LIVE555 Media Server
i=test3.mp3
t=0 0
a=tool:LIVE555 Streaming Media v2009.09.28
a=type:broadcast
a=control:*
a=range:npt=0-301.840
a=x-qt-text-nam:MPEG-1 or 2 Audio, streamed by the LIVE555 Media Server
a=x-qt-text-inf:test3.mp3
m=audio 0 RTP/AVP 14
c=IN IP4 0.0.0.0
a=control:track1

Best regards, Kevin Liu -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sun Dec 20 05:49:10 2009 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 20 Dec 2009 07:49:10 -0600 Subject: [Live-devel] QOS message of live555 ?
In-Reply-To: <001101ca817a$0870e4d0$1952ae70$@liu@innosofts.com> References: <000601ca815b$7475ba80$5d612f80$@liu@innosofts.com> <001101ca817a$0870e4d0$1952ae70$@liu@innosofts.com> Message-ID: > And also I found if I change the transport protocol to "tcp" >with -t option of openRTSP , its performance was worse than "udp". >It is really strange. TCP should perform much better than udp ,but >the result was on the other hand . OK, this tells me that you're seeing severe network congestion and packet loss, which tells me that your network simply does not have the capacity for the stream(s) that you are trying to send. There is nothing that can be done to overcome this (other than getting a higher-capacity network). -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From jnoring at logitech.com Sun Dec 20 14:41:25 2009 From: jnoring at logitech.com (Jeremy Noring) Date: Sun, 20 Dec 2009 15:41:25 -0700 Subject: [Live-devel] QOS message of live555 ? In-Reply-To: <661482154562359693@unknownmsgid> References: <661482154562359693@unknownmsgid> Message-ID: <988ed6930912201441w72691756j50a3b7afc70d65cc@mail.gmail.com> On Sun, Dec 20, 2009 at 3:01 AM, Kevin.Liu wrote: > The strange thing is that if I transport only audio stream like ?mp3 > file? the packet lost was very small ,but if I transport both audio and > video stream like ?mp4,mpg files? the packet loss was really big. > > And I found that a 6M mpg file only transport 37 seconds (sdp message ? > a=range:npt=0-37.160?) but a 5M mp3 file transport 301 seconds in the > same network environment. > This is not strange, it's perfectly normal. Doing the math, your video file is 1.3 mbit/sec (48 mbit / 37 seconds), while your mp3 file is 132 kbit/sec (40 mbit / 301 seconds), or roughly one-tenth the bandwidth of your video file. 
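The arithmetic can be reproduced directly from the sizes and a=range durations quoted earlier in the thread:

```python
def avg_bitrate_mbps(size_mbytes, duration_s):
    """Average bitrate in megabits per second (1 byte = 8 bits)."""
    return size_mbytes * 8 / duration_s

video = avg_bitrate_mbps(6, 37.160)   # 6 MB MPEG file, a=range:npt=0-37.160
audio = avg_bitrate_mbps(5, 301.840)  # 5 MB MP3 file,  a=range:npt=0-301.840
# video is roughly 1.3 Mbit/s, audio roughly 0.13 Mbit/s: about a 10x gap
```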
Your problem is obvious: your network does not have enough bandwidth to transmit the video in real time, and you're experiencing packet loss (from the sound of it, you have about 1 mbit/sec of bandwidth). You're experiencing packet loss because your network is insufficient to transmit the video stream, which is _much_ larger than the audio stream. Like Ross said, get more bandwidth or smaller media. -------------- next part -------------- An HTML attachment was scrubbed... URL: From a.nemec at atlas.cz Sun Dec 20 23:27:58 2009 From: a.nemec at atlas.cz (=?utf-8?B?TsSbbWVjIEFsZXhhbmRy?=) Date: Mon, 21 Dec 2009 07:27:58 GMT Subject: [Live-devel] openRTSP 2x Message-ID: An HTML attachment was scrubbed... URL: -------------- next part -------------- Dear all, I have two questions regarding openRTSP. 1. I have an H264 camera with a built-in RTSP server (an IP camera) and I am using openRTSP to receive the video stream. It works nicely: I can receive frames of type IDR slice and other coded slices. However, a standard H264 stream also contains an SPS (sequence parameter set) NAL unit, followed by a PPS (picture parameter set) NAL unit, followed by the first IDR frame. The IP camera manufacturer confirmed that they include the SPS+PPS "header bytes" before the very first IDR frame and also before some later IDR frames at regular intervals. As I receive clean frames from the live555 library without SPS+PPS, I just want to ask if there is a possibility to somehow get these bytes, because I want to save them before the start of the stream. I tested the incoming frames, and the first byte tells me that the frames are clean IDR slices and other coded slices (NAL unit types 5 and 1). The SPS and PPS "header bytes" are missing. 2. Is there a way to set a custom timeout value for the RTSP client that the client would use for each command like PLAY, TEARDOWN or OPTIONS?
I would like to be able to change the timeout and to force the RTSP client to complete a method faster when the RTSP server is unavailable. I searched the code and found some timeout parameters, but I am not sure that I am doing the right thing. Thanks very much in advance Alex From guillaume.grimaldi at c-s.fr Mon Dec 21 02:33:39 2009 From: guillaume.grimaldi at c-s.fr (Guillaume Grimaldi) Date: Mon, 21 Dec 2009 11:33:39 +0100 Subject: [Live-devel] playing stream h264 with vlc Message-ID: <4B2F4F03.7050700@c-s.fr> Hi, I have implemented an RTP server that streams H264 video. I can read this stream with VLC on Linux (version 0.9.9), but I cannot read it with VLC on Windows (version 1.0.3). Nevertheless, if I read it with an implementation of liveMedia, it works on Linux. Do you know why it does not work with VLC on Windows? Thanks. From jnoring at logitech.com Mon Dec 21 08:28:43 2009 From: jnoring at logitech.com (Jeremy Noring) Date: Mon, 21 Dec 2009 08:28:43 -0800 Subject: [Live-devel] openRTSP 2x In-Reply-To: References: Message-ID: <988ed6930912210828of5bde94saa403ff46e51ddaf@mail.gmail.com> 2009/12/20 Němec Alexandr > Dear all, > I have two questions regarding openRTSP. > 1. I have an H264 camera with a built-in RTSP server (an IP camera) and I am > using openRTSP to receive the video stream. It works nicely; I can receive > frames of type IDR slices and other coded slices. However, a standard H264 > stream also contains an SPS (sequence parameter set) NAL unit followed by a PPS > (picture parameter set) NAL unit followed by the first IDR frame. The IP > camera manufacturer confirmed that they include the SPS+PPS "header bytes" > before the very first IDR frame and also before some later IDR frames at > regular intervals. As I receive clean frames from the live555 library without > SPS+PPS, I just want to ask if there is a possibility to somehow get these > bytes, because I want to save them before the start of the stream.
I tested > the incoming frames and the first byte tells me that the frames are clean > IDR slices and other coded slices (NAL unit types 5 and 1). The SPS and PPS > "header bytes" are missing. They should be populating the sprop-parameter-sets parameter in the SDP exchange with the SPS/PPS info (see RFC 3984). Sending the info in-stream is not recommended and makes writing a decent client more difficult. Check the SDP exchange to see if they are correctly setting this; you should see something like: sprop-parameter-sets=Z0IACpZTBYmI,aMljiA== ...the SPS and PPS info are base-64 encoded, and delimited with a comma. > 2. Is there a way to set a custom timeout value for the RTSP client that > the client would use for each command like PLAY, TEARDOWN or OPTIONS? I > would like to be able to change the timeout and to force the RTSP client to > complete a method faster when the RTSP server is unavailable. I searched the > code and found some timeout parameters, but I am not sure that I am doing > the right thing. > Thanks very much in advance > Alex See the source code to openRTSP (and specifically, the method calls on RTSPClient); there are several timeouts you can specify. -------------- next part -------------- An HTML attachment was scrubbed... URL: From jens.binder at hhi.fraunhofer.de Mon Dec 21 01:15:27 2009 From: jens.binder at hhi.fraunhofer.de (Jens Binder) Date: Mon, 21 Dec 2009 10:15:27 +0100 Subject: [Live-devel] openRTSP 2x In-Reply-To: References: Message-ID: Hello, Most probably your SPS and PPS headers are in the SDP information of your RTSP DESCRIBE response. See RFC 3984, section 8.2, for more information. Normally both headers are Base64-encoded in the "sprop-parameter-sets". jens. Němec Alexandr wrote: > Dear all, > I have two questions regarding openRTSP. > 1. I have an H264 camera with a built-in RTSP server (an IP camera) and I am using openRTSP to receive the video stream.
It works nicely; I can receive frames of type IDR slices and other coded slices. However, a standard H264 stream also contains an SPS (sequence parameter set) NAL unit followed by a PPS (picture parameter set) NAL unit followed by the first IDR frame. The IP camera manufacturer confirmed that they include the SPS+PPS "header bytes" before the very first IDR frame and also before some later IDR frames at regular intervals. As I receive clean frames from the live555 library without SPS+PPS, I just want to ask if there is a possibility to somehow get these bytes, because I want to save them before the start of the stream. I tested the incoming frames and the first byte tells me that the frames are clean IDR slices and other coded slices (NAL unit types 5 and 1). The SPS and PPS "header bytes" are missing. > 2. Is there a way to set a custom timeout value for the RTSP client that the client would use for each command like PLAY, TEARDOWN or OPTIONS? I would like to be able to change the timeout and to force the RTSP client to complete a method faster when the RTSP server is unavailable. I searched the code and found some timeout parameters, but I am not sure that I am doing the right thing. > Thanks very much in advance > Alex > > > > ------------------------------------------------------------------------ > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel From ansary858 at hotmail.com Mon Dec 21 19:25:20 2009 From: ansary858 at hotmail.com (ansary mohamed) Date: Tue, 22 Dec 2009 11:25:20 +0800 Subject: [Live-devel] Live streaming of h.264 using encoder Message-ID: hi there, I am currently working on live streaming of h.264 from my TI DM6446. My camera captures the frames and my board encodes them into h.264 video format.
Since my board is running a program which is coded in C, I was wondering whether it is possible to implement deviceSource.cpp to read from a socket and do the streaming? If not, what other ways could achieve this? Your input is much appreciated. Thanks. Regards Ansary -------------- next part -------------- An HTML attachment was scrubbed... URL: From marcos at a5security.com Tue Dec 22 03:58:31 2009 From: marcos at a5security.com (Miguel Angel Arcos) Date: Tue, 22 Dec 2009 12:58:31 +0100 Subject: [Live-devel] Intra Frame Identification (I-Frame) Message-ID: <39b9a2b30912220358h6f9eb5afg64c3eb7c9a3a0a1e@mail.gmail.com> Dear Ross, we have been reading all the documentation (the FAQ and the source code examples) trying to find some information about I-Frame identification in MPEG4 video streaming, with no results. In our application (single thread) we can't find that information in the headers of each packet. Is there any way to know if the packet is an I-Frame or a B-Frame? Thank you in advance. -- Miguel Angel Arcos -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Dec 22 04:14:11 2009 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 22 Dec 2009 06:14:11 -0600 Subject: [Live-devel] Intra Frame Identification (I-Frame) In-Reply-To: <39b9a2b30912220358h6f9eb5afg64c3eb7c9a3a0a1e@mail.gmail.com> References: <39b9a2b30912220358h6f9eb5afg64c3eb7c9a3a0a1e@mail.gmail.com> Message-ID: >we have been reading all the documentation (the FAQ and the source >code examples) trying to find some information about I-Frame >identification in MPEG4 video streaming, with no results. > >In our application (single thread) we can't find that information in >the headers of each packet. Is there any way to know if the packet >is an I-Frame or a B-Frame?
Look at the code for "MPEG4VideoStreamDiscreteFramer" for an example of how to do this. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From rose.roo at sidc.com.tw Fri Dec 25 00:10:28 2009 From: rose.roo at sidc.com.tw (=?UTF-8?B?Iue+hee/jOS+gShSUiki?=) Date: Fri, 25 Dec 2009 16:10:28 +0800 Subject: [Live-devel] RTSP trick play, extra PES packet issue... Message-ID: <4B347374.4050906@sidc.com.tw> An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Dec 25 02:58:48 2009 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 25 Dec 2009 03:58:48 -0700 Subject: [Live-devel] RTSP trick play, extra PES packet issue... In-Reply-To: <4B347374.4050906@sidc.com.tw> References: <4B347374.4050906@sidc.com.tw> Message-ID: >For now, our RTSP client is working well >on LIVE555 Streaming Server, with normal play & pause functions. > >But we're facing an issue with trick play; the story is: > >Start the RTSP client to play a content (MPEG2 TS, indexed by your >MPEG2TransportStreamIndexer) >Start to trick play (FF 2x). >For example, start trick play from 00:01:20 of the movie. >The output picture was frozen, but the player was still working. >Back to normal play. >The picture resumes playback, but from 00:01:42 of the movie. Try using "openRTSP" as your client, requesting trick play with the -s 80 -z 2 -d 11 options (see ) I.e., request a 2x play stream, starting at time 1:20, lasting for 11 seconds. Do you get an output Transport Stream file that plays the way you expect?
You can also test 'trick play' operation on your original Transport Stream file (i.e., the one that you indexed with "MPEG2TransportStreamIndexer") by running the "testMPEG2TransportStreamTrickPlay" application - see http://www.live555.com/liveMedia/transport-stream-trick-play.html If you don't get the results you expect, then please post a link to your original Transport Stream file (i.e., the one that you indexed with "MPEG2TransportStreamIndexer"), so we can download it and check it for ourselves. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From cjc3 at verizon.net Sun Dec 27 23:00:59 2009 From: cjc3 at verizon.net (Chris) Date: Mon, 28 Dec 2009 02:00:59 -0500 Subject: [Live-devel] Need Help with test*Streamer From a Live Source Message-ID: <4B3857AB.6040206@verizon.net> Greetings, I am following the FAQ example and have modified 'testMPEG4VideoStreamer' and 'DeviceSource.cpp' to take framed input from my MPEG4 video encoder, but I am having trouble getting it working properly. I first verified my video data by writing it out to a file, then streaming it from that file to VLC via the unmodified testMPEG4VideoStreamer. This works just fine. When I modify the program to take my live framed source, the issue is that doGetNextFrame() does not get called fast enough to keep up with my video data. Some possible issues: - I run liveMedia in one thread, and the video source FIFO in another. When I call deliverFrame() it is from the FIFO thread, not the liveMedia thread. Is this a no-no? - My card doesn't support more than 22 frame buffers. Once these fill up, dropped frames are to be expected. Will I have to add intermediate buffering? Does liveMedia take some time to digest the initial start of the stream, or can I expect deliverFrame() to be called with low latency at most times?
The cadence of video frame enqueues (denoted by +) to liveMedia doGetNextFrames (denoted by -) looks something like this: -+-+++++++++++++++++++++++-+-+-+-+-+-+-+-+-----------------------+-+++++++++++++++++++++++-----------------------+++++++++++++++++++++++ liveMedia accepts the first two frames right away, then my FIFO fills up, then in about 1 second intervals doGetNextFrame() is called 8 times, then a burst of doGetNextFrame() as the FIFO is emptied at once, followed by the FIFO filling up again. Some info: - In my deliverFrame() I don't set fDurationInMicroseconds, as this is a live feed. - I set the presentation time like so: gettimeofday(&fPresentationTime, NULL); - I connect my videoSource via the MPEG4VideoStreamDiscreteFramer to the videoSink Any ideas? Thanks, Chris From finlayson at live555.com Mon Dec 28 04:15:26 2009 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 28 Dec 2009 04:15:26 -0800 Subject: [Live-devel] Need Help with test*Streamer From a Live Source In-Reply-To: <4B3857AB.6040206@verizon.net> References: <4B3857AB.6040206@verizon.net> Message-ID: >- I run liveMedia in one thread, and the video source FIFO in >another. When I call deliverFrame() it is from the FIFO thread, not >the liveMedia thread. Is this a no-no? Yes. 'liveMedia' code *must* be called from within one thread only. The only liveMedia-related thing that you can do with the other thread is set a global variable that can (for example) be seen by the other (i.e., liveMedia event loop) thread to indicate an event. For example, if you are using the "doEventLoop()" 'watchVariable' feature, you can set the 'watchVariable' from within the second thread. >- My card doesn't support more than 22 frame buffers. Once these >fill up dropped frames are to be expected. Will I have to add >intermediate buffering? Does liveMedia take some time to digest the >initial start of the stream, or can I expect deliverFrame() to be >called with low latency at most times? 
If you are using "MPEG4VideoStreamDiscreteFramer" (instead of "MPEG4VideoStreamFramer") between your input source object and your "MPEG4ESVideoRTPSink" object, then the right thing should happen. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From amit.yedidia at elbitsystems.com Wed Dec 30 00:57:17 2009 From: amit.yedidia at elbitsystems.com (Yedidia Amit) Date: Wed, 30 Dec 2009 10:57:17 +0200 Subject: [Live-devel] Sending lots of NAL's problem Message-ID: <49B2D30CB72A5A44BCCAFB02670283787EBBE6@mailesl5.esl.corp.elbit.co.il> Hi All, I am using LIVE555 to stream an h.264 bitstream. My source is making NALs with a limited size, so each frame may contain more than one NAL (I received all of them together from the HW). After I receive them, I create small descriptors describing each NAL (pointer, size) and send ALL of them to a socket that was assigned to turnOnBackgroundReadHandling. This should wake up the scheduler to call deliverFrame. In mysource::deliverFrame I read only one descriptor each time from the socket and pass it forward up to the discreteFramer. In the discreteFramer I check if this is the last NAL in the frame, and if NOT, I set fDurationInMicroseconds to 0 so that doGetNextFrame will be called immediately again (there are more descriptors in the socket) and fIsCurrentlyAwaitingData will be set to TRUE. I see that the scheduler is launching my deliverFrame about every 1-2 msec when I have multiple descriptors in the socket. The problem is that when I have lots of descriptors (~80) this can take up to 160 msec, which is way beyond my frame duration, which is 40 msec (PAL). 1. Am I doing this wrong? 2. Is there a way to make the scheduler launch deliverFrame faster? 3. Is the scheduler launching the task again when data is left in the socket? 4. What is the tick resolution of the scheduler? Thanks.
Amit Yedidia The information in this e-mail transmission contains proprietary and business sensitive information. Unauthorized interception of this e-mail may constitute a violation of law. If you are not the intended recipient, you are hereby notified that any review, dissemination, distribution or duplication of this communication is strictly prohibited. You are also asked to contact the sender by reply email and immediately destroy all copies of the original message. -------------- next part -------------- An HTML attachment was scrubbed... URL: From a.nemec at atlas.cz Wed Dec 30 00:28:35 2009 From: a.nemec at atlas.cz (=?utf-8?B?TsSbbWVjIEFsZXhhbmRy?=) Date: Wed, 30 Dec 2009 08:28:35 GMT Subject: [Live-devel] Retrieving media with RTSP/RTP Message-ID: <242ebc8c35f44a8ba17529385d0008cb@0703d9fa4ccc45c7925454a6588976e4> An HTML attachment was scrubbed... URL: -------------- next part -------------- Dear all, we have compiled the live media source code on Windows and are using it to retrieve media with RTSP/RTP. It runs very well and is stable; however, we have some questions, as follows. 1. The first one is easy. We have compiled the openRTSP functionality into a library to be more flexible. Since a WSAStartup call is necessary on Windows, this is correctly done in the initializeWinsockIfNecessary call - the "f...ing lame comment" in the source code is so true :). Well, the obvious question is: where is the WSACleanup call? There is a WSACleanup in initializeWinsockIfNecessary, but it is only called when the winsock version is not found. So using live media on Windows should also require WSACleanup, which must be used as a pair with WSAStartup. We solved this by removing the entire initializeWinsockIfNecessary and using WSAStartup and WSACleanup calls outside the live media source code in our library. 2. The second one is more difficult.
To be able to retrieve RTSP-controlled streams from more than one RTSP server in one process, we are using worker threads with separate usage of the live media code. This works fine as we respect the threading guidelines described in the FAQ. Now - the live media source code contains some exit commands that we are afraid of. What I mean is: if there is a general problem in one of the threads and the live media source code calls exit() as a reaction to this occurrence, the entire process exits, including all other threads that did not cause the error. Since the process using the live media code can be responsible for more than just retrieving media, if live media calls exit, this can be a problem for the entire process. We would like to be able to continue the process and recover from such errors in another way than exiting the process, to be more stable. Is it possible? Without good code knowledge we are not able to decide how dangerous the exit calls are for our process and what the circumstances leading to exiting the process are. Especially (there is only one exit command in each of the files below):
- the exit commands in AMRAudioRTPSource.cpp and QCELPAudioRTPSource.cpp - under which circumstances can this happen? There are exit calls after the sanity check on the parameters fails.
- the exit command in BasicTaskScheduler.cpp - if select fails, is it possible to "return" from SingleStep instead of "exit" the process?
- the exit commands in FramedSource.cpp and MPEG1or2Demux.cpp - exit is called when the stream is read twice. Can this happen when using the live media code as is?
- the exit command in StreamParser.cpp - again, can this happen when using the source code as is? Is it possible to recover from this error in another way?
There are some other exit commands in MP3* files, but these are not as important for us to understand as the exit commands mentioned above. Thanks for any comments and sorry for the long post.
Alex From pdherbemont at free.fr Wed Dec 30 04:57:38 2009 From: pdherbemont at free.fr (Pierre d'Herbemont) Date: Wed, 30 Dec 2009 13:57:38 +0100 Subject: [Live-devel] Git/SVN repo? Message-ID: Hi there, I failed to find a git/svn repo for live555. I think it would ease development if there was one. Any reason why there is none? Pierre. From pdherbemont at free.fr Wed Dec 30 04:59:59 2009 From: pdherbemont at free.fr (Pierre d'Herbemont) Date: Wed, 30 Dec 2009 13:59:59 +0100 Subject: [Live-devel] [Patch] Don't exit(0) but abort() in case of unhandled error. Message-ID: Hi, Without this patch my VLC exits, pretending everything went fine. Pierre. -------------- next part -------------- A non-text attachment was scrubbed... Name: exit_to_err.diff Type: application/octet-stream Size: 326 bytes Desc: not available URL: From finlayson at live555.com Wed Dec 30 05:07:36 2009 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 30 Dec 2009 05:07:36 -0800 Subject: [Live-devel] Sending lots of NAL's problem In-Reply-To: <49B2D30CB72A5A44BCCAFB02670283787EBBE6@mailesl5.esl.corp.elbit.co.il> References: <49B2D30CB72A5A44BCCAFB02670283787EBBE6@mailesl5.esl.corp.elbit.co.il> Message-ID: >I see that the scheduler is launching my deliverFrame about every >1-2 msec, when I have multiple descriptors in the socket. > >The problem is that when I have lots of descriptors (~80) this can >take up to 160 msec which is way beyond my frame duration which is 40 >msec (PAL) > >1. Am I doing this wrong? Because you are reading your input data from a socket, you should not assume that the input socket data consists of discrete NAL units. Instead, each time you should read as much data as you can from the socket (remember to make it non-blocking), and treat the data as a set of (>=1) NAL units. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed...
URL: From finlayson at live555.com Wed Dec 30 05:18:33 2009 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 30 Dec 2009 05:18:33 -0800 Subject: [Live-devel] Retrieving media with RTSP/RTP In-Reply-To: <242ebc8c35f44a8ba17529385d0008cb@0703d9fa4ccc45c7925454a6588976e4> References: <242ebc8c35f44a8ba17529385d0008cb@0703d9fa4ccc45c7925454a6588976e4> Message-ID: >1. The first one is easy. We have compiled the openRTSP >functionality into a library to be more flexible. Since a WSAStartup >call is necessary on Windows, this is correctly done in the >initializeWinsockIfNecessary call - the "f...ing lame comment" in >the source code is so true :). Well, the obvious question is: where >is the WSACleanup call? There is a WSACleanup >in initializeWinsockIfNecessary, but it is only called when the winsock >version is not found. So using live media on Windows should also >require WSACleanup, which must be used as a pair with WSAStartup. We >solved this by removing the entire initializeWinsockIfNecessary and >using WSAStartup and WSACleanup calls outside the live >media source code in our library. I really don't care about "WSACleanup()"; that's just Windows-specific crap, as far as I'm concerned. Presumably the right thing will happen when the process eventually exits, and that's all I care about. >Now - the live media source code contains some exit commands that we >are afraid of. In each case, the calls to "exit()" should not be happening unless there is a serious error in your programming - i.e., they should never end up being called under normal circumstances. If you want, you can change them to something else, but (once again) they should not ever get called in properly-written code. You can think of them as a debugging aid. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed...
URL: From finlayson at live555.com Wed Dec 30 05:19:51 2009 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 30 Dec 2009 05:19:51 -0800 Subject: [Live-devel] Git/SVN repo? In-Reply-To: References: Message-ID: >I failed to find a git/svn repo for live555. I think it would ease >development if there was one. Any reason why there is none? Because only one person (I) ever adds code directly to the code base. (Also, we provide support for the latest version of the code only.) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Wed Dec 30 05:40:09 2009 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 30 Dec 2009 05:40:09 -0800 Subject: [Live-devel] [Patch] Don't exit(0) but abort() in case of unhandled error. In-Reply-To: References: Message-ID: >Without this patch my VLC exits, pretending everything went fine. As I noted in my earlier response (to someone else's question), this call (and other calls) to "exit()" should be getting called only if there's a serious problem in your code. Are you actually seeing this? If so, you should figure out why. And why do you feel that "abort()" is better than "exit()"? (Because you can catch it in a signal handler?) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From ben at decadent.org.uk Wed Dec 30 08:47:39 2009 From: ben at decadent.org.uk (Ben Hutchings) Date: Wed, 30 Dec 2009 16:47:39 +0000 Subject: [Live-devel] [Patch] Don't exit(0) but abort() in case of unhandled error. In-Reply-To: References: Message-ID: <20091230164739.GO20379@decadent.org.uk> On Wed, 2009-12-30 at 05:40 -0800, Ross Finlayson wrote: > >Without this patch my VLC exits, pretending everything went fine. > > As I noted in my earlier response (to someone else's question), this > call (and other calls) to "exit()" should be getting called only if > there's a serious problem in your code. Are you actually seeing > this? If so, you should figure out why.
And why do you feel that > "abort()" is better than "exit()"? (Because you can catch it in a > signal handler?) exit() will run functions installed with atexit(), and destructors for C++ static objects, and so on. abort() will kill the program quickly rather than calling these functions while the program is already in a bad state (just like a failing assert()). Also, gdb (and presumably other debuggers) will break on unhandled signals by default, but not on exit(). Ben. -- Ben Hutchings The obvious mathematical breakthrough [to break modern encryption] would be development of an easy way to factor large prime numbers. - Bill Gates From loemaster at freenet.de Wed Dec 30 05:59:27 2009 From: loemaster at freenet.de (Jan Szczepanski) Date: Wed, 30 Dec 2009 14:59:27 +0100 Subject: [Live-devel] FW: H264VideoStreamFramer and currentNALUnitEndsAccessUnit() Message-ID: Hi everyone, some months ago I started with h264 video encoding and decoding (Intel Performance Primitives) and now I'm really interested in transporting the frames via the liveMedia module. But I have some problems with the classes (H264VideoStre... and currentNALUnitE...); is there someone who can give me good advice, documentation or, if you are willing to do this, some source code? With best regards J. Szczepanski -------------- next part -------------- An HTML attachment was scrubbed... URL: From pdherbemont at free.fr Wed Dec 30 07:01:51 2009 From: pdherbemont at free.fr (Pierre d'Herbemont) Date: Wed, 30 Dec 2009 16:01:51 +0100 Subject: [Live-devel] Git/SVN repo? In-Reply-To: References: Message-ID: On Wed, Dec 30, 2009 at 2:19 PM, Ross Finlayson wrote: >> I failed to find a git/svn repo for live555. I think it would ease >> development if there was one. Any reason why there is none? > > Because only one person (I) ever adds code directly to the code base. That's not a valid point. Wine has only one committer but an extensive contributor base.
And I extensively use revision control for my personal projects as well... Is that the real reason? If so, you should really try... I think it would also simplify your life as a maintainer :-) There are plenty of *simple* hosting services for that! > (Also, we provide support for the latest version of the code only.) To me that's unrelated... Pierre. From pdherbemont at free.fr Wed Dec 30 07:08:51 2009 From: pdherbemont at free.fr (Pierre d'Herbemont) Date: Wed, 30 Dec 2009 16:08:51 +0100 Subject: [Live-devel] [Patch] Don't exit(0) but abort() in case of unhandled error. In-Reply-To: References: Message-ID: On Wed, Dec 30, 2009 at 2:40 PM, Ross Finlayson wrote: >> Without this patch my VLC exits, pretending everything went fine. > > And why do you feel that "abort()" is better than > "exit()"? (Because you can catch it in a signal handler?) Because exit(0) means the program exited without issue. You are pretending to do something you don't do :-) Also, exit() may trigger atexit() and such, which can result in an even less explicit crash. Try abort() on your system; you'll see that it logs that it did not exit cleanly, and/or even automatically creates a crash report and backtrace on some systems. (See 'man abort'.) On my build system, abort() launches the debugger as well. I think it's pretty obvious now :-) Feel free to change this to an assert(), which would make sense as well. > As I noted in my earlier response (to someone else's question), this call > (and other calls) to "exit()" should be getting called only if there's a > serious problem in your code. Are you actually seeing this? If so, you > should figure out why Sure, I still don't know the real reason why I get this; I have added more debug code, and hopefully the backtrace will help me. But for now I wasn't able to reproduce it. Pierre.
From finlayson at live555.com Wed Dec 30 10:28:19 2009 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 30 Dec 2009 10:28:19 -0800 Subject: [Live-devel] [Patch] Don't exit(0) but abort() in case of unhandled error. In-Reply-To: <20091230164739.GO20379@decadent.org.uk> References: <20091230164739.GO20379@decadent.org.uk> Message-ID: >exit() will run functions installed with atexit(), and destructors for >C++ static objects, and so on. abort() will kill the program quickly >rather than calling these functions while the program is already in a >bad state (just like a failing assert()). OK, FWIW (not much :-), I'll change the "exit()" calls to "abort()" in the next release of the code. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From amit.yedidia at elbitsystems.com Wed Dec 30 21:11:40 2009 From: amit.yedidia at elbitsystems.com (Yedidia Amit) Date: Thu, 31 Dec 2009 07:11:40 +0200 Subject: [Live-devel] Sending lots of NAL's problem In-Reply-To: Message-ID: <49B2D30CB72A5A44BCCAFB02670283787EBBE8@mailesl5.esl.corp.elbit.co.il> Thanks Ross, what do you mean by "treat the data as a set of (>=1) NAL units."? Do you mean that I should read it and use fNumOfTruncatedBytes if needed? My hardware is already generating NALs with the correct size. What is the correct way if I want to work with discrete NALs? How can I force the live555 fragmenter to use my NAL sizes? Amit. ________________________________ From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Wednesday, December 30, 2009 3:08 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Sending lots of NAL's problem I see that the scheduler is launching my deliverFrame about every 1-2 msec, when I have multiple descriptors in the socket. The problem is that when I have lots of descriptors (~80) this can take up to 160 msec which is way beyond my frame duration which is 40 msec (PAL) 1.
Am I doing this wrong? Because you are reading your input data from a socket, you should not assume that the input socket data consists of discrete NAL units. Instead, each time you should read as much data as you can from the socket (remember to make it non-blocking), and treat the data as a set of (>=1) NAL units. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Dec 30 21:48:55 2009 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 30 Dec 2009 21:48:55 -0800 Subject: [Live-devel] Sending lots of NAL's problem In-Reply-To: <49B2D30CB72A5A44BCCAFB02670283787EBBE8@mailesl5.esl.corp.elbit.co.il> References: <49B2D30CB72A5A44BCCAFB02670283787EBBE8@mailesl5.esl.corp.elbit.co.il> Message-ID: >what do you mean by "treat the data as a set of (>=1) NAL units."? >do you mean that I should read it and use the fNumOfTruncatedBytes >if needed? No, I just meant that you should - in your socket read handler - read *all* of the available data from the socket, rather than just one NAL unit's worth. That way, you'll reduce the number of times that you handle socket read events, reducing your overhead. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed...
From amit.yedidia at elbitsystems.com Wed Dec 30 22:14:51 2009
From: amit.yedidia at elbitsystems.com (Yedidia Amit)
Date: Thu, 31 Dec 2009 08:14:51 +0200
Subject: [Live-devel] Sending lots of NAL's problem
In-Reply-To: 
Message-ID: <49B2D30CB72A5A44BCCAFB02670283787EBBEB@mailesl5.esl.corp.elbit.co.il>

Let's say I do read all of the available data from the socket on the first call to doDeliverFrame, triggered by the data arriving on the socket. But if I feed the framer only one NAL unit on that first call, what will trigger the next call to doDeliverFrame (since the socket is now empty)?

________________________________
From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson
Sent: Thursday, December 31, 2009 7:49 AM
To: LIVE555 Streaming Media - development & use
Subject: Re: [Live-devel] Sending lots of NAL's problem

What do you mean by "treat the data as a set of (>=1) NAL units"? Do you mean that I should read it and use fNumOfTruncatedBytes if needed?

No, I just meant that you should - in your socket read handler - read *all* of the available data from the socket, rather than just one NAL unit's worth. That way, you'll reduce the number of times that you handle socket read events, reducing your overhead.
-- 
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
From finlayson at live555.com Wed Dec 30 22:58:18 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 30 Dec 2009 22:58:18 -0800
Subject: [Live-devel] Sending lots of NAL's problem
In-Reply-To: <49B2D30CB72A5A44BCCAFB02670283787EBBEB@mailesl5.esl.corp.elbit.co.il>
References: <49B2D30CB72A5A44BCCAFB02670283787EBBEB@mailesl5.esl.corp.elbit.co.il>
Message-ID: 

>Let's say I do read all of the available data from the socket on the
>first call to doDeliverFrame, triggered by the data arriving on the socket.
>But if I feed the framer only one NAL unit on that first call

No, you wouldn't do that. You would need to feed *all* of the data that you received into the downstream 'framer' object.
-- 
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From amit.yedidia at elbitsystems.com Wed Dec 30 23:30:17 2009
From: amit.yedidia at elbitsystems.com (Yedidia Amit)
Date: Thu, 31 Dec 2009 09:30:17 +0200
Subject: [Live-devel] Sending lots of NAL's problem
In-Reply-To: 
Message-ID: <49B2D30CB72A5A44BCCAFB02670283787EBBEC@mailesl5.esl.corp.elbit.co.il>

But then, in the case of multiple NAL units, the framer or the fragmenter will try to fragment the data by themselves, which I don't want to happen. I want to use the NAL sizes as received from my hardware encoder. Is there a way, from within a single call to doDeliverFrame, to send all the NAL units (i.e., send more than one packet)?

________________________________
From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson
Sent: Thursday, December 31, 2009 8:58 AM
To: LIVE555 Streaming Media - development & use
Subject: Re: [Live-devel] Sending lots of NAL's problem

Let's say I do read all of the available data from the socket on the first call to doDeliverFrame, which was triggered by the data arriving on the socket.
But if I feed the framer only one NAL unit on that first call to doDeliverFrame...

No, you wouldn't do that. You would need to feed *all* of the data that you received into the downstream 'framer' object.
-- 
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From finlayson at live555.com Thu Dec 31 03:06:31 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Thu, 31 Dec 2009 03:06:31 -0800
Subject: [Live-devel] Sending lots of NAL's problem
In-Reply-To: <49B2D30CB72A5A44BCCAFB02670283787EBBEC@mailesl5.esl.corp.elbit.co.il>
References: <49B2D30CB72A5A44BCCAFB02670283787EBBEC@mailesl5.esl.corp.elbit.co.il>
Message-ID: 

>But then, in the case of multiple NAL units, the framer or the fragmenter
>will try to fragment the data by themselves,

No, I'm suggesting that your 'framer' object read *and buffer* multiple NAL units from your input socket, but still deliver NAL units to the downstream "H264VideoRTPSink" object one at a time.

(This will be my last posting on this topic.)
-- 
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From amit.yedidia at elbitsystems.com Thu Dec 31 04:39:48 2009
From: amit.yedidia at elbitsystems.com (Yedidia Amit)
Date: Thu, 31 Dec 2009 14:39:48 +0200
Subject: [Live-devel] Sending lots of NAL's problem
In-Reply-To: 
Message-ID: <49B2D30CB72A5A44BCCAFB02670283787EBBEE@mailesl5.esl.corp.elbit.co.il>

Thanks. Happy New Year!
________________________________
From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson
Sent: Thursday, December 31, 2009 1:07 PM
To: LIVE555 Streaming Media - development & use
Subject: Re: [Live-devel] Sending lots of NAL's problem

But then, in the case of multiple NAL units, the framer or the fragmenter will try to fragment the data by themselves,

No, I'm suggesting that your 'framer' object read *and buffer* multiple NAL units from your input socket, but still deliver NAL units to the downstream "H264VideoRTPSink" object one at a time. (This will be my last posting on this topic.)
-- 
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/