From finlayson at live555.com Wed Dec 1 00:17:06 2010 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 1 Dec 2010 00:17:06 -0800 Subject: [Live-devel] SDP problem in openRTSP In-Reply-To: References: Message-ID: >Though now I have another problem. The client tells me that it is >'Unable to create receiver for "video/VP8" subsession: RTP payload >format unknown or not supported'. So I am guessing that the live555 >libraries don't support VP8. Is this correct? Yes. Read the FAQ. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From davidcailliere at voila.fr Wed Dec 1 04:51:48 2010 From: davidcailliere at voila.fr (david cailliere) Date: Wed, 1 Dec 2010 13:51:48 +0100 (CET) Subject: [Live-devel] Handling RTCP Goodbye packet with OpenRTSP Message-ID: <28710931.4668981291207908467.JavaMail.www@wwinf4613> Dear Ross, > Are you sure that the crash can still occur, even if you omit the "-Q" option? Unfortunately, yes it does. Please find below the end of the log generated from the following command "OpenRTSP.exe -d 10 when the crash occurs.
The call stack is the same as the one I provided you before: Received RTCP "BYE" on "audio/AMR" subsession (after 22 seconds) validated RTCP subpacket (type 3): 1, 203, 0, 0x0146b2f8 validated entire RTCP packet schedule(0.267904->1291201051.167388) AMRDeinterleavingBuffer::deliverIncomingFrame(): new interleave group AMRDeinterleavingBuffer::deliverIncomingFrame(): frameIndex 2 (2,0) put in bank 0, bin 0 (1): size 17, header 0x1c, presentationTime 1291201048.132577 AMRDeinterleavingBuffer::retrieveFrame(): from bank 1, bin 0: size 17, header 0x1c, presentationTime 1291201048.112577 Hoping it can be helpful Regards, David From edi87 at fibertel.com.ar Wed Dec 1 05:47:34 2010 From: edi87 at fibertel.com.ar (edi87 at fibertel.com.ar) Date: Wed, 01 Dec 2010 10:47:34 -0300 Subject: [Live-devel] Question about trick play, server side Message-ID: <2c2fad7a7e80.4cf627c6@fibertel.com.ar> Ross, Thanks for your reply. I checked the code mentioned, and it works as expected. But now I have a question... is it possible to do the same (stream only N minutes of a file, or make an infinite loop) from the server side? Suppose I have no control over the clients and I want them to see only the first 10 minutes of a 30-minute video, without needing to cut the file. About the loop: again, is there a way to make the server stream a video file in a loop, without needing to change anything on the client side? Thanks! ----- Original Message ----- From: Ross Finlayson Date: Wednesday, December 1, 2010 1:32 am Subject: Re: [Live-devel] Question about trick play, server side > >I'm new at live555, I read the doxygen references and made some tests > to > >start understanding how it works. > >Now I plan to do something, I think simple: I want to stream a > video > >file (.mpg) on demand, but with one feature...
I want to specify > the > >duration to stream in seconds, or make it an infinite loop. > > This is something that the *client* requests (using the RTSP > protocol). It does not require any special support from the > server. > I.e., our existing, unmodified server code can do this. > > > >So, suppose I have a video of 10 mins, I want to stream on demand > >only 1 min... > > This is easy to do from your RTSP client (and, as I noted above, > without requiring any modification to server code). Note, for > example, how the "openRTSP" demo application (a RTSP client) > implements the "-s " and "-d " options. > > > > or make it an infinite loop, so when 10 mins are streamed to a > >client, it must restart from 0 again. > > Again, this is easy to do - from the client. Note, for example, > how > "openRTSP" implements the "-c" option. > > See , and the "openRTSP" source > code (in "testProgs/playCommon.cpp"). > -- > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > From finlayson at live555.com Wed Dec 1 06:23:59 2010 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 1 Dec 2010 06:23:59 -0800 Subject: [Live-devel] Question about trick play, server side In-Reply-To: <2c2fad7a7e80.4cf627c6@fibertel.com.ar> References: <2c2fad7a7e80.4cf627c6@fibertel.com.ar> Message-ID: >I checked the code mentioned, and it works as expected. >But now I have a question... is possible to do the same (stream only >N mins of file, or make an infinite loop) from the server side? Well, if you want to only stream N minutes of a file - without the client asking you to do this - then the best/right way to do this would simply be to create a new file in advance, by editing the original file, and then asking your clients to stream from the new file instead. 
As for streaming a single file over and over again, in an infinite loop (again, without the client asking you to do this): You could do this, but you'd need to write a new "FramedFilter" subclass that sits in front of your "ByteStreamFileSource" class (and presents the illusion of delivering a single, unbroken stream to the downstream object (a "MPEG2TransportStreamFramer")). You'd also need to write a new "OnDemandServerMediaSubsession" class (replacing the existing "MPEG2TransportFileServerMediaSubsession" class) that uses your new "FramedFilter" subclass. Also, because you're streaming a Transport Stream file, you'd need to make sure that the 'discontinuity flag' is set at the start of the file, so that PCR values (and thus presentation times) don't get messed up when the server loops back to the start of the file. None of this really has anything to do with 'trick play' (in particular, you won't be creating index files at all), because - from the client's point of view - you're not doing anything other than playing a simple stream. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From demask at mail.ru Wed Dec 1 11:18:10 2010 From: demask at mail.ru (Dmitry Petrenko) Date: Thu, 2 Dec 2010 01:18:10 +0600 Subject: [Live-devel] RTSPClient::fBaseURL and RTSPClient::sessionURL() Message-ID: <482E72172B95496783B63DACD866E9E0@castle> Hello Ross, I'm trying to write an RTSP client that will be able to read from multiple URLs simultaneously using a single event loop. Is there any way to identify the session URL in the RTSPClient response handlers? These handlers receive a pointer to the RTSPClient object. Unfortunately, the related class member variable RTSPClient::fBaseURL and member function RTSPClient::sessionURL() are defined as private, meaning there is no way to access them in the response handlers at the moment. The same question applies to the retrieval of RTSPClient::fUserAgentHeaderStr. Why do I need this?
I need a way to retrieve from the RTSPClient object some key (a string or a number) that would be unique. Then I'm going to use this key in the response handlers to access all other session-specific variables (like MediaSession objects, file sinks, duration, etc.) via a hash map. At the moment I have no other idea how to distinguish RTSP sessions in the response handlers. I would highly appreciate any advice on this matter. Kind regards, Dmitriy -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Dec 1 11:38:15 2010 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 1 Dec 2010 11:38:15 -0800 Subject: [Live-devel] RTSPClient::fBaseURL and RTSPClient::sessionURL() In-Reply-To: <482E72172B95496783B63DACD866E9E0@castle> References: <482E72172B95496783B63DACD866E9E0@castle> Message-ID: >I'm trying to write an RTSP client that will be able to read from >multiple URLs simultaneously using a single event loop. Note that a single "RTSPClient" object is used for controlling *one* stream (i.e., one "rtsp://" URL) only. Your RTSP client *application* can, of course, open and play multiple RTSP streams concurrently (using a single event loop), but to do so, you will need to create a separate "RTSPClient" object for each. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Dec 1 18:20:04 2010 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 1 Dec 2010 18:20:04 -0800 Subject: [Live-devel] Handling RTCP Goodbye packet with OpenRTSP In-Reply-To: <28710931.4668981291207908467.JavaMail.www@wwinf4613> References: <28710931.4668981291207908467.JavaMail.www@wwinf4613> Message-ID: >Hoping it can be helpful I see what is happening to cause the crash, but unfortunately I don't understand how it can be happening.
The call to Medium::close(subsession->sink); in "subsessionAfterPlaying()" should be causing AMRDeinterleaver::doStopGettingFrames() to get called, and that should in turn be stopping the reception (and thus handling) of any more AMR/RTP packets. Once again, it would be nice to be able to experiment with a stream that actually illustrates the problem... -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From yuri.timenkov at itv.ru Wed Dec 1 22:03:01 2010 From: yuri.timenkov at itv.ru (Yuri Timenkov) Date: Thu, 02 Dec 2010 09:03:01 +0300 Subject: [Live-devel] LiveMedia in a Directshow Filter In-Reply-To: References: <10C6D4821E0BAA44A73B987733947B16C91431@OREV.ads.local> Message-ID: <4CF73695.10006@itv.ru> BTW, On 01.12.2010 7:18, Ross Finlayson wrote: >> >while(true) >> >{ >> > fWatchVariable = 0; >> > dummyTask(NULL); // 100ms >> > env->taskScheduler().doEventLoop(&fWatchVariable); >> > if (fWatchVariable = 1) This statement looks very strange: is it still assignment instead of comparison in your code? What is it supposed to be? >> > { >> > printf("\nIn While fWatchVariable=%d,",fWatchVariable); >> > if (!mP4LiveSms->mpeg4LiveSource != NULL) >> > >> mP4LiveSms->mpeg4LiveSource->deliverFrame(); >> > } >> >} > > [...] > >> 1) How should I call deliverFrame() from the Main thread(DS Graph >> in a Filter) in order to signal my FramedSource that a new frame is >> arrived. > > You don't! "deliverFrame()" - being a LIVE555 library operation - > should be called only from the LIVE555 event loop thread - not from > some other thread. > > However, what you *can* do in the non-LIVE555 thread (i.e., your > 'DirectShow' thread) is set "fWatchVariable" to 1. If you do this, > then the LIVE555 event loop (your code above) will notice this, and do > the right thing, calling "deliverFrame()". > > >> 3) Question about the parameters : fPresentationTime, >> fDurationInMicroseconds what exactly they mean. 
>> fPresentationTime Is the difference between sample(frame) start and >Stop - in my received filter -> pSample->GetTime(&tStart, &tStop);? > > In most situations, you can just set "fPresentationTime" by calling > "gettimeofday()". You should do this inside your "deliverFrame()" > function (which is called from the LIVE555 event loop thread, of course). > > >> fDurationInMicroseconds - How do I set this parameter? > > In principle, it should be set to the duration between successive > frames (that you deliver using "deliverFrame()"). > > However, because you are streaming from a live source, rather than > from a file, it's probably OK to leave this variable unset (in which > case, it'll get a default value of 0). From finlayson at live555.com Wed Dec 1 22:15:36 2010 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 1 Dec 2010 22:15:36 -0800 Subject: [Live-devel] LiveMedia in a Directshow Filter In-Reply-To: <4CF73695.10006@itv.ru> References: <10C6D4821E0BAA44A73B987733947B16C91431@OREV.ads.local> <4CF73695.10006@itv.ru> Message-ID: >>> >while(true) >>>>{ >>>> fWatchVariable = 0; >>>> dummyTask(NULL); // 100ms >>>> env->taskScheduler().doEventLoop(&fWatchVariable); >>>> if (fWatchVariable = 1) >This statement looks very strange: is it still assignment instead of >comparison in your code? What is it supposed to be? Oh wow - I completely missed that. You're right. Also, later... >> > if (!mP4LiveSms->mpeg4LiveSource != NULL) The initial "!" should not be there. Your code should be:

while (1) {
  fWatchVariable = 0;
  dummyTask(NULL); // 100ms
  env->taskScheduler().doEventLoop(&fWatchVariable);
  if (mP4LiveSms->mpeg4LiveSource != NULL) {
    mP4LiveSms->mpeg4LiveSource->deliverFrame();
  }
}

-- Ross Finlayson Live Networks, Inc.
http://www.live555.com/ From eyals at aeronautics-sys.com Thu Dec 2 18:40:08 2010 From: eyals at aeronautics-sys.com (Eyal Shveky) Date: Fri, 3 Dec 2010 04:40:08 +0200 Subject: [Live-devel] LiveMedia in a Directshow Filter - Encapsulating References: Message-ID: <10C6D4821E0BAA44A73B987733947B164A172F@OREV.ads.local> Hi, I almost started to give up on live555; then I decided to go over the FAQ and start by streaming MPEG-1/2 on demand instead of MPEG-4. Following testMPEG1or2VideoStreamer and the advice of encapsulating my DS source with a FramedSource, I now see a gray static picture in my VLC player. I didn't give up and kept trying to make it work. I think my main problem is synchronizing both threads (actually, with a DirectShow filter there is more than one). Anyway, I took the thread that receives the MediaSample (i.e., frame) in CRTSPInputPin and subclassed it from FramedSource. Then I realized that when I create it I need to pass the BasicUsageEnvironment as a parameter, so the usage environment has to be global for use by both threads. Only after creating my CRTSPInputPin-type FramedSource during filter creation do I assign the pointer to a global CRTSPInputPin-type FramedSource. That is used in the Live555 thread at the point of creating the MPEG1or2VideoStreamFramer framer, later used by videoSink->startPlaying as the video source. The flow: on the CRTSPInputPin, when a new MediaSample (frame) is received, I start the live555 thread (only the first time). That creates the server, socket, port, etc. -> startPlaying, and at the end doEventLoop with the fWatchVariable - Live555 is running. The DS thread continues, getting the buffer and length of the frame from the MediaSample, and calls doGetNextFrame() to process that frame. From that point both threads run asynchronously, and I need to sync them somehow.
Sometimes I receive frames from my source before Live555 gets to the point of startPlaying; sometimes Live555 processes the same frame in a loop (the gray VLC effect). Most of the time I'm getting "ignoring non video sequence header" between one receive and another. Questions: 1) How do I sync the LiveMedia sequence? It's running on one thread, moving the next frame from source to sink in an endless loop. It seems that when breaking that loop I have to deal with the consequences. My source is a FramedSource and I'm calling doGetNextFrame in order to set fTo, fFrameSize from a global buffer; do I need the fWatchVariable? Do I need to use the deliverFrame method and do the frame setup over there? 2) Should I add another task to the scheduler, or subclass it and use it from the DS thread, in order to sync to the Live555 flow? Source code attached; I would be more than happy to see it work. Thanks a lot Eyal > >while(true) > >{ > > fWatchVariable = 0; > > dummyTask(NULL); // 100ms > > env->taskScheduler().doEventLoop(&fWatchVariable); > > if (fWatchVariable = 1) > > { > > printf("\nIn While fWatchVariable=%d,",fWatchVariable); > > if (!mP4LiveSms->mpeg4LiveSource != NULL) > > mP4LiveSms->mpeg4LiveSource->deliverFrame(); > > } > >} [...] > 1) How should I call deliverFrame() from the Main thread (DS Graph >in a Filter) in order to signal my FramedSource that a new frame has >arrived. You don't! "deliverFrame()" - being a LIVE555 library operation - should be called only from the LIVE555 event loop thread - not from some other thread. However, what you *can* do in the non-LIVE555 thread (i.e., your 'DirectShow' thread) is set "fWatchVariable" to 1. If you do this, then the LIVE555 event loop (your code above) will notice this, and do the right thing, calling "deliverFrame()". >3) Question about the parameters : fPresentationTime, >fDurationInMicroseconds what exactly they mean.
> fPresentationTime Is the different between sample(frame) start and >Stop - in my received filter -> pSample->GetTime(&tStart, &tStop);? In most situations, you can just set "fPresentationTime" by calling "gettimeofday()". You should do this inside your "deliverFrame()" function (which is called from the LIVE555 event loop thread, of course). > fDurationInMicroseconds - How do I set this parameter? In principle, it should be set to the duration between successive frames (that you deliver using "deliverFrame()"). However, because you are streaming from a live source, rather than from a file, it's probably OK to leave this variable unset (in which case, it'll get a default value of 0). -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ ********************************************************************************************** LEGAL NOTICE - Unless expressly stated otherwise, this message, its annexes, attachments, appendixes, any subsequent correspondence, and any document, data, sketches, plans and/or other material that is hereby attached, are proprietary. confidential and may be legally privileged. Nothing in this e-mail is intended to conclude a contract on behalf of Aeronautics or make it subject to any other legally binding commitments, unless the e-mail contains an express statement to the contrary or incorporates a formal Purchase Order. This transmission is intended for the named addressee only. Unless you are the named addressee (or authorised to receive it for the addressee) you may not copy or use it, or disclose it to anyone else, any disclosure or copying of the contents of this e-mail or any action taken (or not taken) in reliance on it is unauthorised and may be unlawful. If you are not an addressee, please inform the sender immediately. IMPORTANT: The contents of this email and any attachments are confidential. They are intended for the named recipient(s) only. 
If you have received this email in error, please notify the system manager or the sender immediately and do not disclose the contents to anyone or make copies thereof. *** eSafe scanned this email for viruses, vandals, and malicious content. *** ********************************************************************************************** -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: MyRTSPFilter_send.zip Type: application/zip Size: 11934 bytes Desc: MyRTSPFilter_send.zip URL: From demask at mail.ru Thu Dec 2 11:58:17 2010 From: demask at mail.ru (Dmitry Petrenko) Date: Fri, 3 Dec 2010 01:58:17 +0600 Subject: [Live-devel] RTSPClient::fBaseURL and RTSPClient::sessionURL() References: <482E72172B95496783B63DACD866E9E0@castle> Message-ID: <74A82D0C72AE420E8204D9B2D972E519@castle> Hi Ross, Using separate RTSPClient objects - that's exactly what I'm going to do. OK, it seems I have found a solution: the RTSPClient class is derived from the Medium class, and there is Medium::name() available, which returns a unique value. I have another question. Do I need to delete RTSPClient objects myself (as well as UsageEnvironment and TaskScheduler)? The openRTSP application does not show how to do this properly. The best way would be deleting them in the after-"TEARDOWN" handler.
But the problem is that this handler might be called from the RTSPClient object, thus this is not a good option. Kind regards, Dmitriy ----- Original Message ----- From: Ross Finlayson To: LIVE555 Streaming Media - development & use Sent: Thursday, December 02, 2010 1:38 AM Subject: Re: [Live-devel] RTSPClient::fBaseURL and RTSPClient::sessionURL() I'm trying to write an RTSP client that will be able to read from multiple URLs simultaneously using a single event loop. Note that a single "RTSPClient" object is used for controlling *one* stream (i.e., one "rtsp://" URL) only. Your RTSP client *application* can, of course, open and play multiple RTSP streams concurrently (using a single event loop), but to do so, you will need to create a separate "RTSPClient" object for each. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ ------------------------------------------------------------------------------ _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Dec 3 02:24:12 2010 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 3 Dec 2010 02:24:12 -0800 Subject: [Live-devel] RTSPClient::fBaseURL and RTSPClient::sessionURL() In-Reply-To: <74A82D0C72AE420E8204D9B2D972E519@castle> References: <482E72172B95496783B63DACD866E9E0@castle> <74A82D0C72AE420E8204D9B2D972E519@castle> Message-ID: >Using separate RTSPClient objects - that's exactly what I'm going to do. >OK, it seems I have found a solution: the RTSPClient class is derived from >the Medium class, and there is Medium::name() available, which returns a unique value. Alternatively, you could subclass "RTSPClient", and store your information in fields in the subclass. > I have another question. Do I need to delete RTSPClient objects >myself (as well as UsageEnvironment and TaskScheduler)?
Yes, unless, of course, you will be exiting the application anyway (as "openRTSP" does when it finishes). > The openRTSP application does not show how to do this properly. In general, you should reclaim objects in the reverse order that they were created. So, to reclaim your "RTSPClient", "UsageEnvironment" and "TaskScheduler" objects, do the following (after reclaiming other objects): Medium::close(rtspClient); env->reclaim(); delete scheduler; (Yes, this is all really ugly and inconsistent. Someday it might get improved...) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From demask at mail.ru Fri Dec 3 03:53:29 2010 From: demask at mail.ru (Dmitry Petrenko) Date: Fri, 3 Dec 2010 17:53:29 +0600 Subject: [Live-devel] RTSPClient::fBaseURL and RTSPClient::sessionURL() References: <482E72172B95496783B63DACD866E9E0@castle><74A82D0C72AE420E8204D9B2D972E519@castle> Message-ID: Hi Ross, Many thanks for getting back. A short word about the application I'm writing: it is intended to run for an indefinitely long time, until the end user shuts it down manually. Most of the time it is going to be waiting for specific events from the real world (such as motion detectors and radionuclide detectors). Setting up and recording a new RTSP session should be triggered only for a short time, right after receiving such events. The rest of the time the application should be idle. Thus, it will not exit immediately after ending a single RTSP session, of course. With this in mind, I was wondering if Medium::close(rtspClient) is enough to reclaim RTSPClient objects? It seems this function doesn't release the object's memory; it only removes the object from the MediaLookupTable - at least at first glance. If so, doing this indefinitely many times during the application's run time might lead to significant memory leaks... Is there any workaround?
Kind regards, Dmitriy ----- Original Message ----- From: Ross Finlayson To: LIVE555 Streaming Media - development & use Sent: Friday, December 03, 2010 4:24 PM Subject: Re: [Live-devel] RTSPClient::fBaseURL and RTSPClient::sessionURL() Using separate RTSPClient objects - that's exactly what I'm going to do. OK, it seems I have found a solution: the RTSPClient class is derived from the Medium class, and there is Medium::name() available, which returns a unique value. Alternatively, you could subclass "RTSPClient", and store your information in fields in the subclass. I have another question. Do I need to delete RTSPClient objects myself (as well as UsageEnvironment and TaskScheduler)? Yes, unless, of course, you will be exiting the application anyway (as "openRTSP" does when it finishes). The openRTSP application does not show how to do this properly. In general, you should reclaim objects in the reverse order that they were created. So, to reclaim your "RTSPClient", "UsageEnvironment" and "TaskScheduler" objects, do the following (after reclaiming other objects): Medium::close(rtspClient); env->reclaim(); delete scheduler; (Yes, this is all really ugly and inconsistent. Someday it might get improved...) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ ------------------------------------------------------------------------------ _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From joao_dealmeida at hotmail.com Fri Dec 3 04:41:54 2010 From: joao_dealmeida at hotmail.com (Joao Almeida) Date: Fri, 3 Dec 2010 12:41:54 +0000 (UTC) Subject: [Live-devel] H264 SVC References: Message-ID: Hi, I'm very interested :) For my PhD thesis I need to do streaming of SVC RTP multi-session; for that I need the LIVE555 library to do SVC RTP (MST).
best regards Joao From davidcailliere at voila.fr Fri Dec 3 10:27:38 2010 From: davidcailliere at voila.fr (david cailliere) Date: Fri, 3 Dec 2010 19:27:38 +0100 (CET) Subject: [Live-devel] Handling RTCP Goodbye packet with OpenRTSP Message-ID: <3381786.1962311291400858880.JavaMail.www@wwinf4617> Dear Ross, > The call to Medium::close(subsession->sink); > in "subsessionAfterPlaying()" should be causing > AMRDeinterleaver::doStopGettingFrames() > to get called, and that should in turn be stopping the reception (and > thus handling) of any more AMR/RTP packets. Actually, the call to AMRDeinterleaver::doStopGettingFrames() does not ensure that the reception is stopped in every case. Sometimes the fNeedAFrame attribute can be set to true when the input source is still waiting for some RTP packet. Rewriting the function AMRDeinterleaver::doStopGettingFrames as below should fix the issue. void AMRDeinterleaver::doStopGettingFrames() { fNeedAFrame = False; fInputSource->stopGettingFrames(); } Since that modification, I have not managed to crash OpenRTSP anymore :-) I don't know whether the issue only concerns the AMRDeinterleaver class. Regards, David From finlayson at live555.com Sat Dec 4 01:37:56 2010 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 4 Dec 2010 01:37:56 -0800 Subject: [Live-devel] RTSPClient::fBaseURL and RTSPClient::sessionURL() In-Reply-To: References: <482E72172B95496783B63DACD866E9E0@castle><74A82D0C72AE420E8204D9B2D972E519@castle> Message-ID: >With this in mind I was wondering if Medium::close(rtspClient) is >enough to reclaim RTSPClient objects? It seems this function doesn't >release the object's memory. Yes it does (see "Media.cpp", line 175). -- Ross Finlayson Live Networks, Inc.
http://www.live555.com/

From finlayson at live555.com Sat Dec 4 23:34:29 2010 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 4 Dec 2010 23:34:29 -0800 Subject: [Live-devel] H264 SVC In-Reply-To: References: Message-ID:

>I'm very interested :) For my PhD thesis I need to do streaming of an SVC RTP
>multisession; for that I need the LIVE555 library to do SVC RTP (MST).

Although we don't yet support the RTP payload format for SVC (as currently described in ), it is basically an extension of the RTP payload format for H.264 - which we *do* support. Therefore, support for SVC could probably be added quite easily. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/

From finlayson at live555.com Sat Dec 4 23:36:35 2010 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 4 Dec 2010 23:36:35 -0800 Subject: [Live-devel] Handling RTCP Goodbye packet with OpenRTSP In-Reply-To: <3381786.1962311291400858880.JavaMail.www@wwinf4617> References: <3381786.1962311291400858880.JavaMail.www@wwinf4617> Message-ID:

>Actually, the call to AMRDeinterleaver::doStopGettingFrames() does
>not ensure that reception is stopped in every case. Sometimes
>the fNeedAFrame attribute can be set to true while the input source
>is still waiting for some RTP packet.
>
>Rewriting the function AMRDeinterleaver::doStopGettingFrames()
>as below should fix the issue.
>
>void AMRDeinterleaver::doStopGettingFrames() {
>  fNeedAFrame = False;
>  fInputSource->stopGettingFrames();
>}

Yes, that will likely overcome the problem. Thanks for tracking this down.

>I don't know whether the issue concerns only the AMRDeinterleaver class.

The same problem existed with "QCELPAudioRTPSource". Fixes for both will be included in the next release of the code. Thanks again. -- Ross Finlayson Live Networks, Inc.
http://www.live555.com/

From finlayson at live555.com Sun Dec 5 13:03:49 2010 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 5 Dec 2010 13:03:49 -0800 Subject: [Live-devel] New LIVE555 version available - improves support for streaming H.264 Video Message-ID:

A new version (2010.12.05) of the "LIVE555 Streaming Media" software has now been installed that significantly improves support for streaming H.264 video. (This had been a weak point of the software for some time.)

In particular, "H264VideoStreamFramer" and "H264VideoStreamDiscreteFramer" (a new class) act like their corresponding MPEG4 versions: "H264VideoStreamFramer" reads an H.264 Video Elementary Stream byte stream (e.g., from a file), and "H264VideoStreamDiscreteFramer" reads discrete H.264 video NAL units (i.e., one at a time), e.g., from an H.264 video encoder. NOTE: Developers no longer need to write their own subclasses of "H264VideoStreamFramer" (or "H264VideoStreamDiscreteFramer").

We also added a new demo application - "testH264VideoStreamer" - for streaming from an H.264 Elementary Stream video file (named "test.264") via multicast. "testOnDemandRTSPServer" and "live555MediaServer" were also updated to stream H.264 Video Elementary Stream files. (Some examples of H.264 Video Elementary Stream files - that can now be streamed by our software - are available online at http://www.live555.com/liveMedia/public/264/ ) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/

From joao_dealmeida at hotmail.com Sun Dec 5 03:29:33 2010 From: joao_dealmeida at hotmail.com (Joao Almeida) Date: Sun, 5 Dec 2010 11:29:33 +0000 (UTC) Subject: [Live-devel] H264 SVC References: Message-ID:

Hi, thanks for your answer. This is also my idea; could you indicate which parts of the code are responsible for the H.264 AVC support, so I can look into them and try to add support for SVC? Later I will also try to make it compatible with SVC multi-RTP session (MST).
Best regards Joao

Ross Finlayson writes:

> I'm very interested :) For my PhD thesis I need to do streaming of an SVC RTP multisession; for that I need the LIVE555 library to do SVC RTP (MST).

Although we don't yet support the RTP payload format for SVC (as currently described in ), it is basically an extension of the RTP payload format for H.264 - which we *do* support. Therefore, support for SVC could probably be added quite easily.

From finlayson at live555.com Sun Dec 5 14:53:41 2010 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 5 Dec 2010 14:53:41 -0800 Subject: [Live-devel] H264 SVC In-Reply-To: References: Message-ID:

>This is also my idea; could you indicate which parts of the code
>are responsible for the H.264 AVC support?

Those files in "liveMedia" (and "liveMedia/include") that have "H264" in their name. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/

From pj81102 at gmail.com Mon Dec 6 01:15:37 2010 From: pj81102 at gmail.com (P.J.) Date: Mon, 06 Dec 2010 17:15:37 +0800 Subject: [Live-devel] a question about putting FramedSource::afterGetting in delay queue Message-ID: <4CFCA9B9.5040102@gmail.com>

Dear Ross, I have a question. Suppose the RTSPServer uses the following code to put FramedSource::afterGetting in the DelayQueue:

envir().taskScheduler().scheduleDelayedTask(0, (TaskFunc*)FramedSource::afterGetting, this);

and the openRTSP client then immediately sends a 'TEARDOWN' command to the server. As we know, in the SingleStep() function any delayed event is handled after the socket handlers, so I think the RTSPServer will probably handle the 'TEARDOWN' first. However, handleCmd_TEARDOWN deletes the RTSPClientSession object, and also the related ***Source and ***RTPSink. In this situation, can FramedSource::afterGetting still be called normally? Won't it result in a SIGSEGV? I just give an example above. Best regards, PJ -- *P.J.*
From finlayson at live555.com Mon Dec 6 01:51:09 2010 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 6 Dec 2010 01:51:09 -0800 Subject: [Live-devel] a question about putting FramedSource::afterGetting in delay queue In-Reply-To: <4CFCA9B9.5040102@gmail.com> References: <4CFCA9B9.5040102@gmail.com> Message-ID:

> Suppose the RTSPServer uses the following code to put FramedSource::afterGetting in the DelayQueue:
> envir().taskScheduler().scheduleDelayedTask(0, (TaskFunc*)FramedSource::afterGetting, this);
> and the openRTSP client then immediately sends a 'TEARDOWN' command to the server. As we know,
> in the SingleStep() function any delayed event is handled after the socket handlers, so I think
> the RTSPServer will probably handle the 'TEARDOWN' first. However, handleCmd_TEARDOWN deletes
> the RTSPClientSession object, and also the related ***Source and ***RTPSink. In this situation,
> can FramedSource::afterGetting still be called normally?

No, because that 'delayed task' will have been removed from the queue when the object was deleted (see Medium::~Medium()). -- Ross Finlayson Live Networks, Inc. http://www.live555.com/

From marco-oweber at gmx.de Mon Dec 6 08:02:49 2010 From: marco-oweber at gmx.de (Marc Weber) Date: Mon, 06 Dec 2010 17:02:49 +0100 Subject: [Live-devel] installation - why install source and .o files ? Message-ID: <1291651236-sup-4501@nixos>

The manual (homepage) says:

There's no official 'install' procedure; you can put the "live/" directory wherever you wish - but you must leave it intact. You may wish to do the following: rm -rf /usr/lib/live ; cp -r live /usr/lib

However, after building, "live" still contains the object and source files. So why do I have to install them as well? Is this intentional?
Marc Weber

From justin.huff at qq.com Mon Dec 6 08:30:01 2010 From: justin.huff at qq.com (justin) Date: Tue, 7 Dec 2010 00:30:01 +0800 Subject: [Live-devel] New LIVE555 version available - improves support for streaming H.264 Video Message-ID:

This is really great. What about MPEG2-TS encapsulation of the H.264? It seems that we now have an "official" H.264 framer. Could you please provide a test app that utilizes all the elements already in place?

------------------ Original ------------------ From: "Ross Finlayson"; Date: Mon, Dec 6, 2010 05:03 AM To: "live-devel"; Subject: [Live-devel] New LIVE555 version available - improves support for streaming H.264 Video

A new version (2010.12.05) of the "LIVE555 Streaming Media" software has now been installed that significantly improves support for streaming H.264 video. (This had been a weak point of the software for some time.) In particular, "H264VideoStreamFramer" and "H264VideoStreamDiscreteFramer" (a new class) act like their corresponding MPEG4 versions: "H264VideoStreamFramer" reads an H.264 Video Elementary Stream byte stream (e.g., from a file), and "H264VideoStreamDiscreteFramer" reads discrete H.264 video NAL units (i.e., one at a time), e.g., from an H.264 video encoder. NOTE: Developers no longer need to write their own subclasses of "H264VideoStreamFramer" (or "H264VideoStreamDiscreteFramer"). We also added a new demo application - "testH264VideoStreamer" - for streaming from an H.264 Elementary Stream video file (named "test.264") via multicast. "testOnDemandRTSPServer" and "live555MediaServer" were also updated to stream H.264 Video Elementary Stream files. (Some examples of H.264 Video Elementary Stream files - that can now be streamed by our software - are available online at http://www.live555.com/liveMedia/public/264/ ) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/

From finlayson at live555.com Mon Dec 6 13:26:34 2010 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 6 Dec 2010 13:26:34 -0800 Subject: [Live-devel] installation - why install source and .o files ? In-Reply-To: <1291651236-sup-4501@nixos> References: <1291651236-sup-4501@nixos> Message-ID:

>The manual (homepage) says:
>There's no official 'install' procedure; you can put the "live/"
>directory wherever you wish - but you must leave it intact. You may
>wish to do the following:
> rm -rf /usr/lib/live ; cp -r live /usr/lib
>However, after building, "live" still contains the object and source files.
>So why do I have to install them as well?

You don't "have to". That's why I said "You *may* wish to do the following". -- Ross Finlayson Live Networks, Inc. http://www.live555.com/

From finlayson at live555.com Mon Dec 6 14:06:43 2010 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 6 Dec 2010 14:06:43 -0800 Subject: [Live-devel] New LIVE555 version available - improves support for streaming H.264 Video In-Reply-To: References: Message-ID:

>This is really great. What about MPEG2-TS encapsulation of the H.264?

We have always supported this - for both sending and receiving. Transport Stream data is sent/received the same way, regardless of what kind of video or audio is inside it. (However, we currently support server 'trick play' operations only on Transport Streams that contain MPEG-1 or 2 video.)

> It seems that we now have an "official" H.264 framer. Could you
>please provide a test app that utilizes all the elements already in place?
Note the following from my earlier message: >We also added a new demo application - "testH264VideoStreamer" - for >streaming from a H.264 Elementary Stream Video file (named >"test.264") via multicast. > >"testOnDemandRTSPServer" and "live555MediaServer" were also updated >to stream H.264 Video Elementary Stream files. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From marco-oweber at gmx.de Mon Dec 6 14:20:14 2010 From: marco-oweber at gmx.de (Marc Weber) Date: Mon, 06 Dec 2010 23:20:14 +0100 Subject: [Live-devel] installation - why install source and .o files ? In-Reply-To: References: <1291651236-sup-4501@nixos> Message-ID: <1291673574-sup-1286@nixos> Excerpts from Ross Finlayson's message of Mon Dec 06 22:26:34 +0100 2010: > You don't "have to". That's why I said "You *may* wish to do the following". "but you must leave it intact" - That sentence is irritating. Marc Weber From finlayson at live555.com Mon Dec 6 16:04:14 2010 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 6 Dec 2010 16:04:14 -0800 Subject: [Live-devel] installation - why install source and .o files ? In-Reply-To: <1291673574-sup-1286@nixos> References: <1291651236-sup-4501@nixos> <1291673574-sup-1286@nixos> Message-ID: >Excerpts from Ross Finlayson's message of Mon Dec 06 22:26:34 +0100 2010: >> You don't "have to". That's why I said "You *may* wish to do the >>following". > >"but you must leave it intact" - That sentence is irritating. When I said "you must leave it intact", I was referring to the source code directory structure. Of course, you can - if you wish - copy the binary files (especially the library files) anywhere else. I agree, though, that the wording was a bit misleading. I've changed it now. -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/

From demask at mail.ru Tue Dec 7 11:07:25 2010 From: demask at mail.ru (Dmitry Petrenko) Date: Wed, 8 Dec 2010 01:07:25 +0600 Subject: [Live-devel] key frames (I-frames) in the recorded AVI file Message-ID: <73A2DF189EE046A7B7D4F26CB44E93B4@castle>

Hi Ross,

I have noticed that when I'm recording a video stream to an AVI file using LIVE555 (even if I'm using the openRTSP sample app), the output file does not contain any key frames except for the very first one. E.g., when I open the recorded file in VirtualDub and go to File->File Information, the "Number of key frames" is always 1, whatever the length of the recorded video. Or, if I search for a key frame using the AVIStreamFindSample() VFW API function, it always returns 0 (the very first frame), whatever start search position I specify in the call.

As a video source I'm using Axis 241S/241Q video servers, and they are configured to stream an I-frame followed by 7 P-frames (GOV Settings Structure: IP, Length: 8). Could you please advise what could be the reason for such strange behaviour?

Kind regards, Dmitriy Petrenko

From kanth.war at gmail.com Tue Dec 7 20:59:56 2010 From: kanth.war at gmail.com (shashi kanth) Date: Tue, 7 Dec 2010 20:59:56 -0800 Subject: [Live-devel] Regarding configuration Message-ID:

Hi... I am new to Live555; could you please tell me the steps to configure the media server on Windows. It is a bit urgent. Thanking you, Kind Regards, Shashi
From finlayson at live555.com Wed Dec 8 00:05:30 2010 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 8 Dec 2010 00:05:30 -0800 Subject: [Live-devel] Regarding configuration In-Reply-To: References: Message-ID:

>I am new to Live555; could you please tell me the steps to configure the
>media server on Windows

There's no 'configuration' (other than putting the files that you want to stream into the same folder as the "live555MediaServer.exe" binary). Just run the application. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/

From edi87 at fibertel.com.ar Wed Dec 8 15:19:31 2010 From: edi87 at fibertel.com.ar (Jonathan Granade) Date: Wed, 08 Dec 2010 20:19:31 -0300 Subject: [Live-devel] Question about trick play, server side In-Reply-To: References: <2c2fad7a7e80.4cf627c6@fibertel.com.ar> Message-ID: <4D001283.4060904@fibertel.com.ar>

Ross,

Sorry for being late in replying, but I was pretty busy. Thanks for your reply; it sounds very good and I'm starting to test it.

About streaming N minutes of a file: I don't want to "cut" the original video; I think that I can modify the headers or something to make the client think that the video duration is N and not the total.

Sorry about the "trick play" confusion; I misunderstood the concept.

PS: I need to stream an MPEG file, not a Transport Stream file; I just said a TS file because of the example. Is it the same to stream a TS file or an MPG - I mean, does what you said apply to either of those formats?

Thanks in advance, Jonathan

On 12/01/2010 11:23 AM, Ross Finlayson wrote:
>> I checked the code mentioned, and it works as expected.
>> But now I have a question... is it possible to do the same (stream only N
>> mins of file, or make an infinite loop) from the server side?
> > Well, if you want to only stream N minutes of a file - without the > client asking you to do this - then the best/right way to do this would > simply be to create a new file in advance, by editing the original file, > and then asking your clients to stream from the new file instead. > > As for streaming a single file over and over again, in an infinite loop > (again, without the client asking you to do this): You could do this, > but you'd need to write a new "FramedFilter" subclass that sits in front > of your "ByteStreamFileSource" class (and presents the illusion of > delivering a single, unbroken stream to the downstream object (a > "MPEG2TransportStreamFramer")). You'd also need to write a new > "OnDemandServerMediaSubsession" class (replacing the existing > "MPEG2TransportFileServerMediaSubsession" class) that uses your new > "FramedFilter" subclass. > > Also, because you're streaming a Transport Stream file, you'd need to > make sure that the 'discontinuity flag' is set at the start of the file, > so that PCR values (and thus presentation times) don't get messed up > when the server loops back to the start of the file. > > None of this really has anything to do with 'trick play' (in particular, > you won't be creating index files at all), because - from the client's > point of view - you're not doing anything other than playing a simple > stream. From Utkarsh.Nimesh at maxim-ic.com Wed Dec 8 18:59:35 2010 From: Utkarsh.Nimesh at maxim-ic.com (Utkarsh Nimesh) Date: Wed, 8 Dec 2010 20:59:35 -0600 Subject: [Live-devel] New LIVE555 version available - improves support for streaming H.264 Video In-Reply-To: References: Message-ID: Superb, this is exactly what I was looking for! Thanks Ross. 
-----Original Message----- From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Monday, December 06, 2010 2:34 AM To: live-devel at ns.live555.com Subject: [Live-devel] New LIVE555 version available - improves support for streaming H.264 Video A new version (2010.12.05) of the "LIVE555 Streaming Media" software has now been installed that significantly improves support for streaming H.264 video. (This had been a weak point of the software for some time.) In particular, "H264VideoStreamFramer" and "H264VideoStreamDiscreteFramer" (a new class) act like their corresponding MPEG4 versions: "H264VideoStreamFramer" reads a H.264 Video Elementary Stream byte stream (e.g., from a file), and "H264VideoStreamDiscreteFramer" reads discrete H.264 video NAL units (i.e., one-at-a-time), e.g., from a H.264 video encoder NOTE: Developers no longer need to write their own subclasses of "H264VideoStreamFramer" (or "H264VideoStreamDiscreteFramer"). We also added a new demo application - "testH264VideoStreamer" - for streaming from a H.264 Elementary Stream Video file (named "test.264") via multicast. "testOnDemandRTSPServer" and "live555MediaServer" were also updated to stream H.264 Video Elementary Stream files. (Some examples of H.264 Video Elementary Stream files - that can now be streamed by our software - are available online at http://www.live555.com/liveMedia/public/264/ ) -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/

From samparknisha at rediffmail.com Wed Dec 8 04:04:34 2010 From: samparknisha at rediffmail.com (Nisha Singh) Date: 8 Dec 2010 12:04:34 -0000 Subject: [Live-devel] Problem in H264 streaming Message-ID: <20101208120434.11543.qmail@f6mail-145-225.rediffmail.com>

Hi,

I am encoding YUV video data into H.264 and then I need to stream it to the VLC player. Each video frame is encoded into H.264 and then streamed using live555. I have created H264MediaSubsession and H264Framer classes, and have updated the H264Sink to perform the desired task. I checked the RTP packets flowing over the network; they all appear fine to me and conform to RFC 3984. However, the data is not played properly by the VLC player. The time is incrementing on the VLC side, but only the LAST FEW FRAMES are shown. I am not able to identify the problem. If anyone has ever encountered this kind of issue, please share your experience and the possible solution.

Thanks and Regards, Nisha

From demask at mail.ru Wed Dec 8 10:39:39 2010 From: demask at mail.ru (Dmitry Petrenko) Date: Thu, 9 Dec 2010 00:39:39 +0600 Subject: [Live-devel] MPEG-4 Visual configuration bits in SDP Message-ID:

Hi,

I just came across the thought that it would be useful to have an easy-to-use decoder for the MPEG-4 Visual configuration bits that are encoded in the "config" parameter of the SDP, something like the mpeg4vol command-line tool from mpeg4ip does. Are there any plans to implement this in the future?

Kind regards, Dmitriy Petrenko
From Ravi_G at mindtree.com Wed Dec 8 21:04:38 2010 From: Ravi_G at mindtree.com (Ravi Kumar G) Date: Thu, 9 Dec 2010 05:04:38 +0000 Subject: [Live-devel] Audio Formats And HTTPS Message-ID: <09CAD26126EC6C429EB81B5E61E54FC1BF62@MTW02MBX03.mindtree.com>

Hi,

I would like to know a couple of things. Does LIVE555 support the G.711 and G.726 audio formats? Does it support HTTPS?

Regards, Ravi

From edi87 at fibertel.com.ar Thu Dec 9 11:10:50 2010 From: edi87 at fibertel.com.ar (edi87 at fibertel.com.ar) Date: Thu, 09 Dec 2010 16:10:50 -0300 Subject: [Live-devel] Question about trick play, server side Message-ID: <4800cef41d80.4d00ff8a@fibertel.com.ar>

Ross,

I just found a "bit" of a change... I just realized that the live555 version I can use here is 2009.02.13.

I checked FramedFilter and I'm a bit lost about how I should do it... I saw that ByteStreamFileSource could be the way to make the stream go back to the start when it reaches the end, but I'm not sure... I thought of doing something like subclassing FramedSource - let's say "FramedLoopFilter" - and then subclassing OnDemandServerMediaSubsession to use FramedLoopFilter instead of FramedSource*.

The code that I'm using to make tests (run the server) is:

[....]
ServerMediaSession* sms = ServerMediaSession::createNew(*env, streamName, streamName, descriptionString);
MPEG1or2FileServerDemux* demux = MPEG1or2FileServerDemux::createNew(*env, inputFileName, reuseFirstSource);
ServerMediaSubsession* video_ssession = demux->newVideoServerMediaSubsession(iFramesOnly);
ServerMediaSubsession* audio_ssession = demux->newAudioServerMediaSubsession();
sms->addSubsession(video_ssession);
sms->addSubsession(audio_ssession);
rtspServer->addServerMediaSession(sms);
[....]

So I would need to subclass MPEG1or2FileServerDemux to use my OnDemandServerMediaSubsession subclass.
Please, could you guide me as to whether I'm correct? I know that there are a lot of changes between my live555 version and the current one.

PS: I'm not able to update the live555 version, so I need to do this with this old version.

Thanks in advance, Jonathan

----- Original Message ----- From: Jonathan Granade Date: Wednesday, December 8, 2010 8:19 pm Subject: Re: [Live-devel] Question about trick play, server side

> Ross,
> Sorry for being late in replying, but I was pretty busy. Thanks for your reply; it sounds very good and I'm starting to test it.
> About streaming N minutes of a file: I don't want to "cut" the original video; I think that I can modify the headers or something to make the client think that the video duration is N and not the total.
> Sorry about the "trick play" confusion; I misunderstood the concept.
> PS: I need to stream an MPEG file, not a Transport Stream file; I just said a TS file because of the example. Is it the same to stream a TS file or an MPG - I mean, does what you said apply to either of those formats?
> Thanks in advance, Jonathan
> On 12/01/2010 11:23 AM, Ross Finlayson wrote:
> >> I checked the code mentioned, and it works as expected.
> >> But now I have a question... is it possible to do the same (stream only N mins of file, or make an infinite loop) from the server side?
> > Well, if you want to only stream N minutes of a file - without the client asking you to do this - then the best/right way to do this would simply be to create a new file in advance, by editing the original file, and then asking your clients to stream from the new file instead.
> > As for streaming a single file over and over again, in an infinite loop (again, without the client asking you to do this): You could do this, but you'd need to write a new "FramedFilter" subclass that sits in front of your "ByteStreamFileSource" class (and presents the illusion of delivering a single, unbroken stream to the downstream object (a "MPEG2TransportStreamFramer")). You'd also need to write a new "OnDemandServerMediaSubsession" class (replacing the existing "MPEG2TransportFileServerMediaSubsession" class) that uses your new "FramedFilter" subclass.
> > Also, because you're streaming a Transport Stream file, you'd need to make sure that the 'discontinuity flag' is set at the start of the file, so that PCR values (and thus presentation times) don't get messed up when the server loops back to the start of the file.
> > None of this really has anything to do with 'trick play' (in particular, you won't be creating index files at all), because - from the client's point of view - you're not doing anything other than playing a simple stream.

From finlayson at live555.com Thu Dec 9 18:11:04 2010 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 9 Dec 2010 18:11:04 -0800 Subject: [Live-devel] Audio Formats And HTTPS In-Reply-To: <09CAD26126EC6C429EB81B5E61E54FC1BF62@MTW02MBX03.mindtree.com> References: <09CAD26126EC6C429EB81B5E61E54FC1BF62@MTW02MBX03.mindtree.com> Message-ID:

> Does LIVE555 support the G.711 and G.726 audio formats?

Yes, we support sending and/or receiving both of those audio formats (over RTP).

> Does it support HTTPS?

I'm not sure what you're asking here. Note that our software does *not* do HTTP streaming. Instead, it streams via RTSP/RTP/RTCP.
(We *do* support RTSP/RTP tunneling over HTTP, but I suspect that's not what you're asking about here.) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/

From demask at mail.ru Thu Dec 9 08:27:20 2010 From: demask at mail.ru (Dmitry Petrenko) Date: Thu, 9 Dec 2010 22:27:20 +0600 Subject: [Live-devel] key frames (I-frames) in the recorded AVI file Message-ID: <99071844EFC64A5FA9CDC1E9E466668F@castle>

Hi Ross,

I was just wondering if it's possible to calculate and write keyframe index information to the recorded AVI files using AVIFileSink?

----- Original Message ----- From: Dmitry Petrenko To: LIVE555 Streaming Media - development & use Sent: Wednesday, December 08, 2010 1:07 AM Subject: key frames (I-frames) in the recorded AVI file

Hi Ross, I have noticed that when I'm recording a video stream to an AVI file using LIVE555 (even if I'm using the openRTSP sample app), the output file does not contain any key frames except for the very first one. E.g., when I open the recorded file in VirtualDub and go to File->File Information, the "Number of key frames" is always 1, whatever the length of the recorded video. Or, if I search for a key frame using the AVIStreamFindSample() VFW API function, it always returns 0 (the very first frame), whatever start search position I specify in the call. As a video source I'm using Axis 241S/241Q video servers, and they are configured to stream an I-frame followed by 7 P-frames (GOV Settings Structure: IP, Length: 8). Could you please advise what could be the reason for such strange behaviour? Kind regards, Dmitriy Petrenko
URL: From TWiser at logostech.net Thu Dec 9 10:03:47 2010 From: TWiser at logostech.net (Wiser, Tyson) Date: Thu, 9 Dec 2010 10:03:47 -0800 Subject: [Live-devel] OnDemandServerMediaSubsession and FramedSource subclassing Message-ID: <8CD7A9204779214D9FDC255DE48B95211DE2C9D9@EXPMBX105-1.exch.logostech.net> I am trying to use the live555 library in a project of mine, but I am running into some issues I haven't been able to figure out despite reading the FAQ and searching the mailing list archives. Basically, I need a server that streams data on demand from a live source to a client. The stream needs to be able to be sent using either unicast or multicast to an address and port combination dictated by the client in the SETUP message. The format of the stream data is non-standard, but can be thought of as a binary blob for each frame. Based on what I have gathered from looking at the FAQ, the source code, and the mailing list archives, I have subclassed FramedSource to be able to get data from my source. Unfortunately, my source cannot be treated as a file, so I have used the watch variable method as suggested in the FAQ and the messages it references. I then tried using PassiveServerMediaSubsession but realized that it only created multicast streams and the destination address and port needed to be set up before the client ever sent a SETUP message. My next thought was OnDemandServerMediaSubsession. With the exception of two things, it seems to do almost exactly what I need. First, if I wanted to subclass OnDemandServerMediaSubsession to handle multicast as well as unicast, is it sufficient to change getStreamParameters() so that it doesn't automatically set isMulticast to False, but instead sets it based on the destination address? 
Second, in using the watch-variable method to indicate to my source when a new frame is available, I have basically done the following:

while (running) {
  env->taskScheduler().doEventLoop(&frameAvailable);
  if (frameAvailable != 0) {
    mySource->doGetNextFrame();
    frameAvailable = 0;
  }
}

The problem, when using OnDemandServerMediaSubsession, is that I don't know how to get a reference to my source object. That object, as the name of the class implies, is created on demand, and I have not found any functions that return to me a FramedSource* or an RTPSink*. I have made sure that I set reuseFirstSource to True when creating the OnDemandServerMediaSubsession subclass, which I understand to mean that there will only ever be one source. Is there a way to get a reference to that source?

I appreciate your help and think that this is a great library. Thanks, Tyson Wiser Logos Technologies, Inc.

From mmorogan at cisco.com Thu Dec 9 10:39:09 2010 From: mmorogan at cisco.com (Monica Morogan (mmorogan)) Date: Thu, 9 Dec 2010 10:39:09 -0800 Subject: [Live-devel] RTCP RR Message-ID:

Hello, I ran into an issue where RTCP SRs are sent to my VLC client (playing JPEG and H.264 streams, UDP transport, frequency 5 seconds, but I experimented with higher numbers as well), but no RTCP RRs are sent back. Could you please let me know in which situations this occurs? Thank you for your time, Monica

From finlayson at live555.com Thu Dec 9 19:54:12 2010 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 9 Dec 2010 19:54:12 -0800 Subject: [Live-devel] key frames (I-frames) in the recorded AVI file In-Reply-To: <99071844EFC64A5FA9CDC1E9E466668F@castle> References: <99071844EFC64A5FA9CDC1E9E466668F@castle> Message-ID:

>I was just wondering if it's possible to calculate and write
>keyframe index information to the recorded AVI files using
>AVIFileSink?
Implementation of this class has some hints (like >setting AVIF_HASINDEX and AVIF_TRUSTCKTYPE flags in AVI header) but >where is the actual implementation? The file "liveMedia/AVIFileSink.cpp" contains our complete implementation of outputting to an AVI file. It could undoubtedly be improved, however. As always, feel free to propose any specific improvements or bugfixes that you might have. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Dec 9 20:07:28 2010 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 9 Dec 2010 20:07:28 -0800 Subject: [Live-devel] MPEG-4 Visual configuration bits in SDP In-Reply-To: References: Message-ID: >Just came across that it would be useful to have an easy-to-use >decoder for MPEG-4 Visual configuration bits that are encoded in >"config" parameter of SDP. Something like mpeg4vol command-line tool >from mpeg4ip does. Are there any plans to implement this in the >future? The 'config' parameter string is simply a string of bytes, encoded as hexadecimal digits (therefore, two hexadecimal digits per byte). We do have a function: parseGeneralConfigStr() (defined in "liveMedia/include/MPEG4LATMAudioRTPSource.hh") that will convert a 'config' string to binary. E.g., the 'config' string deadbeef would be converted to the following sequence of 4 bytes: 0xDE 0xAD 0xBE 0xEF However, we don't implement any additional 'decoding' of this data (or any other MPEG-4 data), because it's not needed at all for streaming. (We don't include any audio/video decoding or encoding software.) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed...
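The conversion just described (two hexadecimal digits per byte) can be sketched as a short standalone helper. This is an illustration only; decodeConfigStr() is a hypothetical name, not the library's parseGeneralConfigStr():

```cpp
#include <cassert>
#include <cctype>
#include <cstdint>
#include <string>
#include <vector>

// Convert a hex-digit 'config' string (two hex digits per byte) to binary,
// mirroring what the description above says parseGeneralConfigStr() does.
// Returns an empty vector on malformed input (odd length or non-hex digit).
std::vector<uint8_t> decodeConfigStr(const std::string& config) {
    std::vector<uint8_t> out;
    if (config.size() % 2 != 0) return out;
    auto nibble = [](char c) -> int {
        if (c >= '0' && c <= '9') return c - '0';
        c = static_cast<char>(std::tolower(static_cast<unsigned char>(c)));
        if (c >= 'a' && c <= 'f') return c - 'a' + 10;
        return -1;
    };
    for (std::size_t i = 0; i < config.size(); i += 2) {
        int hi = nibble(config[i]), lo = nibble(config[i + 1]);
        if (hi < 0 || lo < 0) return {};
        out.push_back(static_cast<uint8_t>((hi << 4) | lo));
    }
    return out;
}
```

So "deadbeef" decodes to the four bytes 0xDE 0xAD 0xBE 0xEF, as in the example above.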
URL: From finlayson at live555.com Fri Dec 10 00:10:24 2010 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 10 Dec 2010 00:10:24 -0800 Subject: [Live-devel] OnDemandServerMediaSubsession and FramedSource subclassing In-Reply-To: <8CD7A9204779214D9FDC255DE48B95211DE2C9D9@EXPMBX105-1.exch.logostech.net> References: <8CD7A9204779214D9FDC255DE48B95211DE2C9D9@EXPMBX105-1.exch.logostech.net> Message-ID: >Basically, I need a server that streams data on demand from a live >source to a client. The stream needs to be able to be sent using >either unicast or multicast to an address and port combination >dictated by the client in the SETUP message. It's usually the server, not the client, that decides whether or not the stream is unicast or multicast. The usual way for a server to support both kinds of stream is to have both an "OnDemandServerMediaSubsession" and a "PassiveServerMediaSubsession" - with different names, of course. The client could use the stream name to choose between unicast and multicast. In principle this would work, but in your case you'd have a problem: The 'reuseFirstSource' mechanism has been implemented only for "OnDemandServerMediaSubsession"s; not for "PassiveServerMediaSubsession". Furthermore, there's no way for those two separate classes to share the same input source. So, to have both the unicast and multicast streams reading from the same source, you'd need to modify the code. A simpler alternative is to do what you seem to be doing: Allow the client to specify the destination address (i.e., unicast or multicast), using a "destination=" parameter in the RTSP "SETUP" message. To support this, you'll need to define RTSP_ALLOW_CLIENT_DESTINATION_SETTING before you compile "RTSPServer.cpp"; I presume that you've done this.
>First, if I wanted to subclass OnDemandServerMediaSubsession to >handle multicast as well as unicast, is it sufficient to change >getStreamParameters() so that it doesn't automatically set >isMulticast to False, but instead sets it based on the destination >address? I haven't tested this, but yes, I believe so. It should be easy: Your subclass could redefine the "getStreamParameters()" virtual function to do this:

void mySubclass::getStreamParameters( ...parameters... ) {
  OnDemandServerMediaSubsession::getStreamParameters( ...parameters... );
  if (IsMulticastAddress(destinationAddress)) isMulticast = True;
}

>The problem, when using OnDemandServerMediaSubsession, is that I >don't know how to get a reference to my source object. That object, >as the name of the class implies, is created on demand and I have >not found any functions that return to me a FramedSource* or a >RTPSink*. I have made sure that I set reuseFirstSource to True when >creating the OnDemandServerMediaSubsession subclass, which I >understand to mean that there will only ever be one source. Is >there a way to get a reference to that source? Yes, when your "OnDemandServerMediaSubsession" subclass creates a source object - in your implementation of the "createNewStreamSource()" virtual function - you can store a pointer to the source object (e.g., in a global variable). (Because "reuseFirstSource" is true, "createNewStreamSource()" should be called just once, so there will be only one such source object.) -- Ross Finlayson Live Networks, Inc.
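For IPv4, the IsMulticastAddress() test used in the snippet above reduces to checking for the class-D address range (224.0.0.0 through 239.255.255.255). A standalone sketch - isMulticastIPv4() is a hypothetical helper for illustration, not the live555 function:

```cpp
#include <cassert>
#include <cstdint>

// IPv4 multicast addresses are the class-D range 224.0.0.0 - 239.255.255.255,
// i.e. the top four bits of the address are 1110.
// `addr` is assumed to be in host byte order (e.g. 224.1.2.3 == 0xE0010203).
bool isMulticastIPv4(uint32_t addr) {
    return (addr & 0xF0000000u) == 0xE0000000u;
}
```

A real implementation must also be careful about network vs. host byte order; this sketch assumes host order.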
>Each time a video frame is encoded into h264 and then streamed using >live555. I have created H264MediaSubsession, H264Framer classes Note that you no longer need to write your own subclass of "H264VideoStreamFramer". For encoder sources - like yours - we now provide our own subclass "H264VideoStreamDiscreteFramer" that you should use instead. > and have updated the H264Sink You don't need to 'update' the "H264VideoRTPSink" class. The existing code works just fine. >However the data is not getting played properly by the vlc player. >The time is getting incremented at the vlc side but only the LAST >FEW FRAMES are shown by the vlc player. I am not able to identify >the problem. Make sure that you are setting "fPresentationTime" correctly in your encoder class (for each delivered NAL unit). You might also want to experiment using "openRTSP" as an RTSP client, before you start using VLC. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From justin.huff at qq.com Fri Dec 10 00:17:36 2010 From: justin.huff at qq.com (=?ISO-8859-1?B?anVzdGlu?=) Date: Fri, 10 Dec 2010 16:17:36 +0800 Subject: [Live-devel] How to test/use RTSP-over-HTTP Message-ID: Do we need special client program? What is the procedure to test/use the RTSP-over-HTTP? Thanks! -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Dec 10 00:30:34 2010 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 10 Dec 2010 00:30:34 -0800 Subject: [Live-devel] Question about trick play, server side In-Reply-To: <4800cef41d80.4d00ff8a@fibertel.com.ar> References: <4800cef41d80.4d00ff8a@fibertel.com.ar> Message-ID: >I just found a "bit" change... I just realized that the live555 >version I can use here is 2009.02.13. Sorry, but support is given only for the latest version of the code. (The version that you're using has many, many bugs that were fixed in subsequent versions.)
>PS: I'm not able to update the live version, so I need to do this >with this old version. Then you will get absolutely no help from me. Sorry. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Fri Dec 10 00:47:14 2010 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 10 Dec 2010 00:47:14 -0800 Subject: [Live-devel] RTCP RR In-Reply-To: References: Message-ID: >I ran into an issue where RTCP SRs are sent to my VLC client (playing >jpeg, h264 streams, UDP transport, frequency 5 seconds but experimented >with higher numbers as well), but no RTCP RRs are sent back. >Could you please let me know when such situations occur? If your version of VLC is using the "LIVE555 Streaming Media" software for its RTSP client implementation, then it will definitely send RTCP "RR" packets. If your server is not seeing these packets, then perhaps there is a firewall somewhere that is blocking them? -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Fri Dec 10 00:57:46 2010 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 10 Dec 2010 00:57:46 -0800 Subject: [Live-devel] How to test/use RTSP-over-HTTP In-Reply-To: References: Message-ID: >Do we need special client program? It depends on whether your existing client program supports this as an option. Note that our "openRTSP" client supports this (using the "-T " option). VLC also supports it, I believe. > What is the procedure to test/use the RTSP-over-HTTP? Thanks! I suggest using "openRTSP", with the "-T " option, as noted above. -- Ross Finlayson Live Networks, Inc.
http://www.live555.com/ From TWiser at logostech.net Fri Dec 10 05:23:27 2010 From: TWiser at logostech.net (Wiser, Tyson) Date: Fri, 10 Dec 2010 05:23:27 -0800 Subject: [Live-devel] OnDemandServerMediaSubsession and FramedSource subclassing In-Reply-To: References: <8CD7A9204779214D9FDC255DE48B95211DE2C9D9@EXPMBX105-1.exch.logostech.net> Message-ID: <8CD7A9204779214D9FDC255DE48B95211DE2CCC2@EXPMBX105-1.exch.logostech.net> Ross, Thanks for the help. After doing a quick test it appears that making the suggested modifications to my subclass of "OnDemandServerMediaSubsession" does work as expected. I appreciate your great work on this library. Tyson -----Original Message----- From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Friday, December 10, 2010 3:10 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] OnDemandServerMediaSubsession and FramedSource subclassing >Basically, I need a server that streams data on demand from a live >source to a client. The stream needs to be able to be sent using >either unicast or multicast to an address and port combination >dictated by the client in the SETUP message. It's usually the server, not the client, that decides whether or not the stream is unicast or multicast. The usual way for a server to support both kinds of stream is to have both an "OnDemandServerMediaSubsession" and a "PassiveServerMediaSubsession" - with different names, of course. The client could use the stream name to choose between unicast and multicast. In principle this would work, but in your case you'd have a problem: The 'reuseFirstSource' mechanism has been implemented only for "OnDemandServerMediaSubsession"s; not for "PassiveServerMediaSubsession". Furthermore, there's no way for those two separate classes to share the same input source. So, to have both the unicast and multicast streams reading from the same source, you'd need to modify the code.
A simpler alternative is to do what you seem to be doing: Allow the client to specify the destination address (i.e., unicast or multicast), using a "destination=" parameter in the RTSP "SETUP" message. To support this, you'll need to define RTSP_ALLOW_CLIENT_DESTINATION_SETTING before you compile "RTSPServer.cpp"; I presume that you've done this. >First, if I wanted to subclass OnDemandServerMediaSubsession to >handle multicast as well as unicast, is it sufficient to change >getStreamParameters() so that it doesn't automatically set >isMulticast to False, but instead sets it based on the destination >address? I haven't tested this, but yes, I believe so. It should be easy: Your subclass could redefine the "getStreamParameters()" virtual function to do this:

void mySubclass::getStreamParameters( ...parameters... ) {
  OnDemandServerMediaSubsession::getStreamParameters( ...parameters... );
  if (IsMulticastAddress(destinationAddress)) isMulticast = True;
}

>The problem, when using OnDemandServerMediaSubsession, is that I >don't know how to get a reference to my source object. That object, >as the name of the class implies, is created on demand and I have >not found any functions that return to me a FramedSource* or a >RTPSink*. I have made sure that I set reuseFirstSource to True when >creating the OnDemandServerMediaSubsession subclass, which I >understand to mean that there will only ever be one source. Is >there a way to get a reference to that source? Yes, when your "OnDemandServerMediaSubsession" subclass creates a source object - in your implementation of the "createNewStreamSource()" virtual function - you can store a pointer to the source object (e.g., in a global variable). (Because "reuseFirstSource" is true, "createNewStreamSource()" should be called just once, so there will be only one such source object.) -- Ross Finlayson Live Networks, Inc.
http://www.live555.com/ _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel From edi87 at fibertel.com.ar Fri Dec 10 13:24:58 2010 From: edi87 at fibertel.com.ar (edi87 at fibertel.com.ar) Date: Fri, 10 Dec 2010 18:24:58 -0300 Subject: [Live-devel] Question about trick play, server side Message-ID: <6edb3b012f56.4d02707a@fibertel.com.ar> Ross, After your refusal, I asked here and requested a library update, and got the OK, so now I'm playing with the latest version of live555. I started working on what you said; basically I have:

Loop_FramedFilter (inherits FramedFilter)
Loop_MPEG2TransportStreamFramer (inherits Loop_FramedFilter)
Loop_MPEG2TransportFileServerMediaSubsession (inherits OnDemandServerMediaSubsession)

Now the problem is how to make the stream loop back to the start of the file... you said: "You could do this, but you'd need to write a new "FramedFilter" subclass that sits in front of your "ByteStreamFileSource" class (and presents the illusion of delivering a single, unbroken stream to the downstream object (a "MPEG2TransportStreamFramer"))" I can't understand why a FramedFilter subclass should be used, as I don't see any interface for operating on the stream. I think the best option could be a subclass of ByteStreamFileSource, which handles the file, or of MPEG2TransportFileServerMediaSubsession, which could seek back to the start of the file. Could you clarify this for me? Thanks in advance, Jonathan ----- Original Message ----- From: Ross Finlayson Date: Friday, December 10, 2010 5:30 am Subject: Re: [Live-devel] Question about trick play, server side > >I just found a "bit" change... I just realized that the live555 > >version I can use here is 2009.02.13. > > Sorry, but support is given only for the latest version of the code. > (The version that you're using has many, many bugs that were > fixed in subsequent versions.)
> > > >PS: I'm not able to update the live version, so I need to do this > >with this old version. > > Then you will get absolutely no help from me. Sorry. > -- > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > From mmorogan at cisco.com Fri Dec 10 18:03:53 2010 From: mmorogan at cisco.com (Monica Morogan (mmorogan)) Date: Fri, 10 Dec 2010 18:03:53 -0800 Subject: [Live-devel] RTCP RR In-Reply-To: References: Message-ID: Thanks for your answer and time. There is no firewall involved here. I was hoping for more insight into what, related to the RTCP SRs, could cause VLC not to send RTCP RRs... Sometimes I see RTCP RRs, sometimes not. Regards, Monica -----Original Message----- From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Friday, December 10, 2010 12:47 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] RTCP RR >I ran into an issue where RTCP SRs are sent to my VLC client (playing >jpeg, h264 streams, UDP transport, frequency 5 seconds but experimented >with higher numbers as well), but no RTCP RRs are sent back. >Could you please let me know when such situations occur? If your version of VLC is using the "LIVE555 Streaming Media" software for its RTSP client implementation, then it will definitely send RTCP "RR" packets. If your server is not seeing these packets, then perhaps there is a firewall somewhere that is blocking them? -- Ross Finlayson Live Networks, Inc.
http://www.live555.com/ _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel From finlayson at live555.com Fri Dec 10 21:51:25 2010 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 10 Dec 2010 21:51:25 -0800 Subject: [Live-devel] RTCP RR In-Reply-To: References: Message-ID: I suggest that you begin by running "openRTSP" instead of VLC as your RTSP client. "openRTSP" should definitely be sending RTCP "RR" packets. If you see "openRTSP" sending "RR" packets, but VLC does not (and you're using the latest version of VLC), then let us know, and we'll see if we can figure out something else... -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Fri Dec 10 22:19:24 2010 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 10 Dec 2010 22:19:24 -0800 Subject: [Live-devel] New TaskScheduler 'event trigger' mechanism, and (much) improved "DeviceSource" Message-ID: One of the biggest problems that developers have had with this library is that it has been difficult to define and handle new kinds of event - beyond the file/socket I/O and delayed task events that we support by default. In particular, it has been difficult to implement input devices (such as encoders) when we want to signal a new event (such as the availability of new frame data) from an external thread. The 'watch variable' mechanism - although it can be used - is not particularly well-suited for this purpose. Furthermore, the model code in "DeviceSource.cpp" has not been particularly helpful, because it doesn't really describe what to do to handle events that are signaled from an external thread. To overcome this, I have now installed a new version (2010.12.11) of the "LIVE555 Streaming Media" library that now includes a new 'event trigger' mechanism for "TaskScheduler" (and its subclass, "BasicTaskScheduler").
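A simplified, standalone model may help picture what such an 'event trigger' does. This is an illustration only - a busy-polling toy with a single trigger, not the live555 implementation or its API - showing a handler registered with an event loop and a trigger that is safe to fire from another thread:

```cpp
#include <atomic>
#include <cassert>
#include <functional>
#include <thread>

// Minimal model of an 'event trigger': the event loop owns the handler and
// polls an atomic flag; triggerEvent() may be called from any thread, because
// the only cross-thread operation is setting that flag -- the handler itself
// always runs inside the loop thread.
class TinyScheduler {
public:
    using Handler = std::function<void()>;
    explicit TinyScheduler(Handler h) : handler_(std::move(h)) {}

    // Safe to call from a separate thread.
    void triggerEvent() { triggered_.store(true); }

    // Run the event loop until the event has been handled once.
    void doEventLoopUntilTriggered() {
        for (;;) {
            if (triggered_.exchange(false)) { handler_(); return; }
            std::this_thread::yield();
        }
    }

private:
    Handler handler_;
    std::atomic<bool> triggered_{false};
};
```

In live555 itself the handler is registered with createEventTrigger() and fired with triggerEvent(), as the announcement goes on to describe; the real scheduler multiplexes many triggers alongside socket and timer events rather than busy-polling one flag.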
Specifically, you can now - using the new function TaskScheduler::createEventTrigger() - register an event handler function, to be associated with a particular 'event trigger id'. At some later time, you can call TaskScheduler::triggerEvent() with this 'event trigger id' as parameter, and this will cause the event handler function to get called (from within the event loop). For more details, see "UsageEnvironment/include/UsageEnvironment.hh". One nice feature of this mechanism is that - unlike other library routines - the "triggerEvent()" function can be called from a separate thread. This makes it easier to implement input device classes. I have also updated (and, I hope, significantly improved) the model code in "liveMedia/DeviceSource.cpp". This code describes how to implement an input device class that uses the new 'event trigger' mechanism (with the event possibly being signaled from an external thread). If you're implementing (or have already implemented) an input device class, then I encourage you to take a look at the new "DeviceSource" code - and perhaps use this as a model for your code. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Fri Dec 10 23:32:20 2010 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 10 Dec 2010 23:32:20 -0800 Subject: [Live-devel] Question about trick play, server side In-Reply-To: <6edb3b012f56.4d02707a@fibertel.com.ar> References: <6edb3b012f56.4d02707a@fibertel.com.ar> Message-ID: >Now the problem is where to make the stream to loop to the start of >file... you said: > >"You could do this, but you'd need to write a new "FramedFilter" >subclass that sits >in front of your "ByteStreamFileSource" class (and presents the >illusion of delivering a single, unbroken stream to the downstream >object (a "MPEG2TransportStreamFramer"))" > >I can't understand why FramedFilter subclass should be used An alternative - which might be simpler for you - would be to write a new class (e.g.
called "ContinuousByteStreamFileSource") that duplicates much of the functionality (and code) of the existing "ByteStreamFileSource" - except that it reads from its file continuously - and just use this instead of "ByteStreamFileSource". I.e., you would define your own subclass of "OnDemandServerMediaSubsession" that would be identical to the existing "MPEG2TransportFileServerMediaSubsession", except that it uses a "ContinuousByteStreamFileSource" as input instead of a "ByteStreamFileSource". As always, you should *not* need to modify any of the existing code. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From yangzx at acejet.com.cn Mon Dec 13 01:15:53 2010 From: yangzx at acejet.com.cn (=?gb2312?B?0e7Wvs/p?=) Date: Mon, 13 Dec 2010 17:15:53 +0800 Subject: [Live-devel] live555 stream aac raw problem Message-ID: <41FC1A94FE13684D8249B089C38119700F674F@mailserver-nj1.njacejet.com> Hi Ross: I have live raw AAC data. I implemented my own audio server media subsession, overloading the virtual createNewRTPSink() and createNewStreamSource() functions. Code:

myownserversession::createNewRTPSink(....) {
  return MPEG4GenericRTPSink::createNew(envir(), rtpGroupsock,
                                        rtpPayloadTypeIfDynamic, 44100,
                                        "audio", "AAC-hbr", "1", // aac raw
                                        2);
}

myownserversession::createNewStreamSource(...) {
  return myowndevicesource(...);
}

In my myowndevicesource class I use the same code as my video device source, and the video device source works OK. But when I play it with VLC, VLC does not decode the audio. Why? Does live555 not support raw AAC data - only AAC ADTS? From vincenzo.terracciano at its.na.it Mon Dec 13 05:48:25 2010 From: vincenzo.terracciano at its.na.it (Vincenzo Terracciano) Date: Mon, 13 Dec 2010 14:48:25 +0100 Subject: [Live-devel] Trick play H264 Message-ID: Hi Ross, I'm working on the trick play support for a ts file with h264 video. I changed the method parseFrame of the class MPEG2IFrameIndexFromTransportStream.
Do you think this is enough, or do I need to change the server-side implementation of the seeking and fast-forwarding capabilities? I have the tsx related to the ts / h264 video file. If I invoke the rtsp url in VLC the timeline flows, but the seeking and fast-forwarding functionalities don't work. ------------------------------------------------------------------ ITS S.p.A. Vincenzo Terracciano, Via Terragneta 90, 80058 Torre Annunziata(NA) +39 081 5353392 -------------- next part -------------- An HTML attachment was scrubbed... URL: From vincenzo.terracciano at its.na.it Mon Dec 13 06:43:39 2010 From: vincenzo.terracciano at its.na.it (Vincenzo Terracciano) Date: Mon, 13 Dec 2010 15:43:39 +0100 Subject: [Live-devel] R: Trick play H264 In-Reply-To: References: Message-ID: <0E8D625B026B413DA78F4C25C062B017@its.na.it> Now the seeking functionality works well. I changed the index record type from RECORD_PIC_IFRAME to RECORD_GOP for each IDR H264 frame. I still have the problem with the fast-forwarding functionality. Do you have any suggestions? ------------------------------------------------------------------ ITS S.p.A. Vincenzo Terracciano, Via Terragneta 90, 80058 Torre Annunziata(NA) +39 081 5353392 _____ From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Vincenzo Terracciano Sent: Monday, December 13, 2010 2:48 PM To: live-devel at ns.live555.com Subject: [Live-devel] Trick play H264 Hi Ross, I'm working on the trick play support for a ts file with h264 video. I changed the method parseFrame of the class MPEG2IFrameIndexFromTransportStream. Do you think this is enough, or do I need to change the server-side implementation of the seeking and fast-forwarding capabilities? I have the tsx related to the ts / h264 video file. If I invoke the rtsp url in VLC the timeline flows, but the seeking and fast-forwarding functionalities don't work. ------------------------------------------------------------------ ITS S.p.A.
Vincenzo Terracciano, Via Terragneta 90, 80058 Torre Annunziata(NA) +39 081 5353392 -------------- next part -------------- An HTML attachment was scrubbed... URL: From amadorim at vdavda.com Mon Dec 13 08:10:41 2010 From: amadorim at vdavda.com (Marco Amadori) Date: Mon, 13 Dec 2010 17:10:41 +0100 Subject: [Live-devel] R: Trick play H264 In-Reply-To: <0E8D625B026B413DA78F4C25C062B017@its.na.it> References: <0E8D625B026B413DA78F4C25C062B017@its.na.it> Message-ID: <201012131710.42103.amadorim@vdavda.com> On Monday 13 December 2010 15:43:39, Vincenzo Terracciano wrote: > Now seeking functionality works well. I changed the index record type from > RECORD_PIC_IFRAME to RECORD_GOP for each IDR H264 frame. I have still the > problem about fast forwarding functionality. Do you have any suggestion? There is some code where you could find some ideas: http://www.coexsi.fr/publications/live555-universal-indexer/ -- ESC:wq -- This message has been scanned for viruses and dangerous content by MailScanner, and is believed to be clean. From Bruno.Basilio at brisa.pt Mon Dec 13 09:19:18 2010 From: Bruno.Basilio at brisa.pt (Bruno Filipe Basilio) Date: Mon, 13 Dec 2010 17:19:18 +0000 Subject: [Live-devel] Problem streaming continuous H.264 video over Transport Stream Message-ID: <3EC6EEDBCBD4AC4A9FBE5F1F5EEA8619019BFD@SRVBRIEXC012.brisa.pt> Hi, I'm using Live555 to send H.264 ES video over Transport Stream using MPEG2TransportStreamFromESSource, but the video stream blocks for >20 minutes in every hour or so. The goal is to deliver a live video stream using an H.264 video encoder as the source; initial tests showed that Transport Stream offered the most fluid stream. This problem doesn't happen when sending raw H.264 ES using H264VideoRTPSink directly.
After some analysis it's my understanding that MyDeviceSource::doGetNextFrame() stops being called for a long time, and MyDeviceSource::deliverFrame() isn't called at all, because FramedSource::isCurrentlyAwaitingData() equals false the whole time the video is blocked. Could you give me some pointers where to look, or what to do, in order to understand what seems to be the problem using MPEG2TransportStreamFromESSource? The latest version of live555 isn't as yet integrated. Your feedback would be very appreciated. Below are the testMPEG2TransportStreamer.cpp source code changes that detail the object chain used:

void play() {
  // Create the video source:
  MyDeviceSource* deviceSource = MyDeviceSource::createNew(*env, path, channel);
  if (deviceSource == NULL) {
    *env << "Unable to open \"" << inputFileName << "\" as a device source\n";
    exit(1);
  }
  *env << "DEBUG: device source successfully opened\n";

  // Create a framer for the Video Elementary Stream:
  FramedFilter* h264framer = H264VideoDiscreteFramer::createNew(*env, deviceSource);

  const int VIDEO_H264_VERSION = 5;
  // And generate a Transport Stream from this:
  MPEG2TransportStreamFromESSource* mpeg2tsSource = MPEG2TransportStreamFromESSource::createNew(*env);
  mpeg2tsSource->addNewVideoSource(h264framer, VIDEO_H264_VERSION);
  videoSource = mpeg2tsSource;

  // Finally, start playing:
  *env << "Beginning to read from Picolo H264 encoder...\n";
  videoSink->startPlaying(*videoSource, afterPlaying, videoSink);
}

Also, a part of the DeviceSource extension source code:

void MyDeviceSource::doGetNextFrame() {
  if (!fHaveStartedReading) {
    envir().taskScheduler().turnOnBackgroundReadHandling(fd,
        (TaskScheduler::BackgroundHandlerProc*)&fileReadableHandler, this);
    fHaveStartedReading = True;
    FILE_LOG(logDEBUG) << __PRETTY_FUNCTION__ << ": Frame request by the 'downstream' element, the file read is scheduled.";
  }
}

void MyDeviceSource::doStopGettingFrames() {
  envir().taskScheduler().turnOffBackgroundReadHandling(fd);
  fHaveStartedReading = False;
  FILE_LOG(logDEBUG) << __PRETTY_FUNCTION__ << ": The file read is unscheduled.";
}

void MyDeviceSource::fileReadableHandler(PicoloH264DeviceSource* source, int /*mask*/) {
  if (!source->isCurrentlyAwaitingData()) {
    source->doStopGettingFrames(); // we're not ready for the data yet
    return;
  }
  source->deliverFrame();
}

void MyDeviceSource::deliverFrame() {
  // Deliver the data here:
  ENCODED_DATA data = liveSource->GetData();
  if (data.size <= fMaxSize) {
    memcpy(fTo, data.data, data.size);
    fFrameSize = data.size;
  } else {
    memcpy(fTo, data.data, fMaxSize);
    fNumTruncatedBytes = data.size - fMaxSize;
    fFrameSize = fMaxSize;
  }
  // compute fPresentationTime
  timeradd(&presentationTimeBase, &(data.timestamp), &fPresentationTime);
  liveSource->ReleaseData();
  fDurationInMicroseconds = 0;
  // After delivering the data, inform the reader that it is now available:
  FramedSource::afterGetting(this);
}

Best Regards, Bruno Basilio --------------------------------------------------------------------------------
Disclaimer: The information contained in this message, and any files attached, is privileged and confidential, and intended exclusively for its addressees. If you are not the intended recipient (or the person responsible for delivering to the intended recipient) and received this message by mistake, be aware that copying, storage, distribution or any other use of all or part of this message and the files attached is strictly prohibited. Please notify the sender by reply e-mail or contact us by telephone and delete this message and the files attached, without retaining a copy. -------------------------------------------------------------------------------- From gbonneau at miranda.com Mon Dec 13 13:56:51 2010 From: gbonneau at miranda.com (BONNEAU Guy) Date: Mon, 13 Dec 2010 21:56:51 +0000 Subject: [Live-devel] New TaskScheduler 'event trigger' mechanism, and (much) improved "DeviceSource" In-Reply-To: References: Message-ID: <8691AB8A396ED446A4D01C69CE2F5D3F23C699AE@CA-OPS-MAILBOX.miranda.com> If the scheduler is waiting on select() (in BasicTaskScheduler.cpp), and no network activity or alarm happens to wake the select(), will the new mechanism provide a means to exit the event loop? Thanks GB |-----Original Message----- |From: live-devel-bounces at ns.live555.com [mailto:live-devel- |bounces at ns.live555.com] On Behalf Of Ross Finlayson |Sent: Saturday, December 11, 2010 1:19 |To: live-devel at ns.live555.com |Subject: [Live-devel] New TaskScheduler 'event trigger' mechanism, and |(much) improved "DeviceSource" | |One of the biggest problems that developers have had with this library is that |it has been difficult to define and handle new kinds of event - beyond the |file/socket I/O and delayed task events that we support by default. | |In particular, it has been difficult to implement input devices (such as |encoders) when we want to signal a new event (such as the availability of new |frame data) from an external thread.
The 'watch variable' mechanism - |although it can be used - is not particularly well-suited for this purpose. |Furthermore, the model code in "DeviceSource.cpp" has not been particularly |helpful, because it doesn't really describe what to do to handle events that |are signaled from an external thread. | |To overcome this, I have now installed a new version (2010.12.11) of the |"LIVE555 Streaming Media" library that now includes a new 'event trigger' |mechanism for "TaskScheduler" (and its subclass, "BasicTaskScheduler"). | |Specifically, you can now - using the new function | TaskScheduler::createEventTrigger() |- register an event handler function, to be associated with a particular 'event |trigger id'. At some later time, you can call | TaskScheduler::triggerEvent() |with this 'event trigger id' as parameter, and this will cause the event handler |function to get called (from within the event loop). | |For more details, see "UsageEnvironment/include/UsageEnvironment.hh" | |One nice feature of this mechanism is that - unlike other library routines - the |"triggerEvent()" function can be called from a separate thread. This makes it |easier to implement input device classes. | | |I have also updated (and, I hope, significantly improved) the model code in |"liveMedia/DeviceSource.cpp". This code describes how to implement an |input device class that uses the new 'event trigger' |mechanism (with the event possibly being signaled from an external thread). | |If you're implementing (or have already implemented) an input device class, |then I encourage you to take a look at the new "DeviceSource" |code - and perhaps use this as a model for your code. |-- | |Ross Finlayson |Live Networks, Inc.
|http://www.live555.com/ |_______________________________________________ |live-devel mailing list |live-devel at lists.live555.com |http://lists.live555.com/mailman/listinfo/live-devel From jingke2000 at gmail.com Mon Dec 13 13:07:38 2010 From: jingke2000 at gmail.com (Ke Yu) Date: Mon, 13 Dec 2010 16:07:38 -0500 Subject: [Live-devel] cross compiled Live555 libs Message-ID: I had built the Live555 libs with mingw tools in Ubuntu successfully following these instructions: 1). genMakefiles config.mingw 2). make The reason for cross compiling above is that I have other libraries built this way (namely, FFMpeg). I've built an app with those FFMpeg libs in Visual C++ 2008. The app works fine. Now, I need to also use the live555 libraries built above in this same project. However, when I tried to link the live555 libs, the Visual Studio compiler gave me a fatal link error: "libliveMedia.a: invalid or corrupt file". The host OS is Windows 7. How should I fix this problem? Thanks! From eyals at aeronautics-sys.com Mon Dec 13 14:20:59 2010 From: eyals at aeronautics-sys.com (Eyal Shveky) Date: Tue, 14 Dec 2010 00:20:59 +0200 Subject: [Live-devel] New TaskScheduler 'event trigger - Usinf Message-ID: <10C6D4821E0BAA44A73B987733947B164A1736@OREV.ads.local> Hi, Thank you very much for this feature; I guess that I'm not the only one that was waiting for it. :) When I call triggerEvent() from my separate thread (a DirectShow input pin thread), I see that it's being caught by BasicTaskScheduler::SingleStep. Am I right in thinking that the way to work with the UsageEnvironment is still the old-fashioned way - env->taskScheduler().doEventLoop() - without using the watchVariable? ********************************************************************************************** LEGAL NOTICE - Unless expressly stated otherwise, this message, its annexes, attachments, appendixes, any subsequent correspondence, and any document, data, sketches, plans and/or other material that is hereby attached, are proprietary.
confidential and may be legally privileged. Nothing in this e-mail is intended to conclude a contract on behalf of Aeronautics or make it subject to any other legally binding commitments, unless the e-mail contains an express statement to the contrary or incorporates a formal Purchase Order. This transmission is intended for the named addressee only. Unless you are the named addressee (or authorised to receive it for the addressee) you may not copy or use it, or disclose it to anyone else, any disclosure or copying of the contents of this e-mail or any action taken (or not taken) in reliance on it is unauthorised and may be unlawful. If you are not an addressee, please inform the sender immediately. IMPORTANT: The contents of this email and any attachments are confidential. They are intended for the named recipient(s) only. If you have received this email in error, please notify the system manager or the sender immediately and do not disclose the contents to anyone or make copies thereof. *** eSafe scanned this email for viruses, vandals, and malicious content. *** ********************************************************************************************** -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Dec 13 16:20:47 2010 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 13 Dec 2010 16:20:47 -0800 Subject: [Live-devel] New TaskScheduler 'event trigger' mechanism, and (much) improved "DeviceSource" In-Reply-To: <8691AB8A396ED446A4D01C69CE2F5D3F23C699AE@CA-OPS-MAILBOX.miranda.com> References: <8691AB8A396ED446A4D01C69CE2F5D3F23C699AE@CA-OPS-MAILBOX.miranda.com> Message-ID: >If the scheduler is waiting on select() (in BasicTaskScheduler.cpp) >to kick-in and no network activity or alarm happen to awake the >select() will the new mechanism provide a mean to exit the threading >loop ? Yes - I added a periodic 'dummy' task to "BasicTaskScheduler" that wakes up every 10ms. 
This ensures that we'll always return from the "select()" at least that often. (Therefore, there is no longer any need for the programmer to add such a hack themselves.) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From mmorogan at cisco.com Mon Dec 13 16:57:10 2010 From: mmorogan at cisco.com (Monica Morogan (mmorogan)) Date: Mon, 13 Dec 2010 16:57:10 -0800 Subject: [Live-devel] RTCP RR In-Reply-To: References: Message-ID: Hi Ross, I followed your suggestions (thank you) and indeed with openRTCP I receive without problems the RTCP RRs. However, VLC doesn't send any RTCP RR (using exactly the same setup, stream). I made also sure that I am using the latest VLC 1.1.5. Could you please let me know how shall we proceed further? Thank you for your time, Monica -----Original Message----- From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Friday, December 10, 2010 9:51 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] RTCP RR I suggest that you begin by running "openRTSP" instead of VLC as your RTSP client. "openRTSP" should definitely be sending RTCP "RR" packets. If you see "openRTSP" sending "RR" packets, but VLC does not (and you're using the latest version of VLC), then let us know, and we'll see if we can figure out something else... -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel From finlayson at live555.com Mon Dec 13 17:03:57 2010 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 13 Dec 2010 17:03:57 -0800 Subject: [Live-devel] RTCP RR In-Reply-To: References: Message-ID: >I followed your suggestions (thank you) and indeed with openRTCP I >receive without problems the RTCP RRs. However, >VLC doesn't send any RTCP RR (using exactly the same setup, stream). 
>I made also sure that I am using the latest VLC 1.1.5. > >Could you please let me know how shall we proceed further? Have you set any special options for RTSP/RTP in VLC's "Preferences" panel? If not, then you should report your problem to the "vlc at videolan.org" mailing list. The LIVE555 library definitely sends RTCP "RR" packets (as shown by "openRTSP" (not "openRTCP", BTW)), but - for some unknown reason - VLC is apparently not using it properly when it's playing your stream. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From mmorogan at cisco.com Mon Dec 13 17:28:43 2010 From: mmorogan at cisco.com (Monica Morogan (mmorogan)) Date: Mon, 13 Dec 2010 17:28:43 -0800 Subject: [Live-devel] RTCP RR In-Reply-To: References: Message-ID: Hi Ross, Thank you for your suggestion. I will do that. Hopefully, we will get to the bottom of this issue. (BTW, it was a typo... I meant openRTSP). Regards, Monica -----Original Message----- From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Monday, December 13, 2010 5:04 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] RTCP RR >I followed your suggestions (thank you) and indeed with openRTCP I >receive without problems the RTCP RRs. However, >VLC doesn't send any RTCP RR (using exactly the same setup, stream). >I made also sure that I am using the latest VLC 1.1.5. > >Could you please let me know how shall we proceed further? Have you set any special options for RTSP/RTP in VLC's "Preferences" panel? If not, then you should report your problem to the "vlc at videolan.org" mailing list. The LIVE555 library definitely sends RTCP "RR" packets (as shown by "openRTSP" (not "openRTCP", BTW)), but - for some unknown reason - VLC is apparently not using it properly when it's playing your stream. -- Ross Finlayson Live Networks, Inc.
http://www.live555.com/ _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel From pj81102 at gmail.com Mon Dec 13 19:21:13 2010 From: pj81102 at gmail.com (P.J.) Date: Tue, 14 Dec 2010 11:21:13 +0800 Subject: [Live-devel] New TaskScheduler 'event trigger' mechanism, and (much) improved "DeviceSource" In-Reply-To: References: Message-ID: <4D06E2A9.8010505@gmail.com> Hi, Ross, I doubt that this piece of code works correctly:

void BasicTaskScheduler0::triggerEvent(EventTriggerId eventTriggerId, void* clientData) {
  // First, record the "clientData":
  if (eventTriggerId == fLastUsedTriggerMask) { // common-case optimization:
    fTriggeredEventClientDatas[eventTriggerId] = clientData; // shouldn't this use fLastUsedTriggerNum???
  }
  ...
}

On 2010-12-11 14:19, Ross Finlayson wrote: > One of the biggest problems that developers have had with this library > is that it has been difficult to define and handle new kinds of event > - beyond the file/socket I/O and delayed task events that we support > by default. > > In particular, it has been difficult to implement input devices (such > as encoders) when we want to signal a new event (such as the > availability of new frame data) from an external thread. The 'watch > variable' mechanism - although it can be used - is not particularly > well-suited for this purpose. Furthermore, the model code in > "DeviceSource.cpp" has not been particularly helpful, because it doesn't > really describe what to do to handle events that are signaled from an > external thread. > > To overcome this, I have now installed a new version (2010.12.11) of > the "LIVE555 Streaming Media" library that now includes a new 'event > trigger' mechanism for "TaskScheduler" (and its subclass, > "BasicTaskScheduler").
> > Specifically, you can now - using the new function > TaskScheduler::createEventTrigger() > - register an event handler function, to be associated with a > particular 'event trigger id'. At some later time, you can call > TaskScheduler:: triggerEvent() > with this 'event trigger id' as parameter, and this will cause the > event handler function to get called (from within the event loop). > > For more details, see "UsageEnvironment/include/UsageEnvironment.hh" > > One nice feature of this mechanism is that - unlike other library > routines - the "triggerEvent()" function can be called from a separate > thread. This makes it easier to implement input device classes. > > > I have also updated (and, I hope, significantly improved) the model > code in "liveMedia/DeviceSource.cpp". This code describes how to > implement an input device class that uses the new 'event trigger' > mechanism (with the event possibly being signaled from an external > thread). > > If you're implementing (or have already implemented) an input device > class, then I encourage you to take a look at the new "DeviceSource" > code - and perhaps use this as a model for your code. -- *P.J.* -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Dec 13 20:41:31 2010 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 13 Dec 2010 20:41:31 -0800 Subject: [Live-devel] New TaskScheduler 'event trigger' mechanism, and (much) improved "DeviceSource" In-Reply-To: <4D06E2A9.8010505@gmail.com> References: <4D06E2A9.8010505@gmail.com> Message-ID: >Hi,Ross, > I'm doubting this piece of code works correctly. >void BasicTaskScheduler0::triggerEvent(EventTriggerId >eventTriggerId, void* clientData) { > // First, record the "clientData": > if (eventTriggerId == fLastUsedTriggerMask) { // common-case optimization: > fTriggeredEventClientDatas[eventTriggerId] = clientData; // here >not use fLastUsedTriggerNum??? Oops - you're right! 
Thanks for noticing this. I have installed a new version (2010.12.14) that fixes this. If you plan to use the new 'event trigger' mechanism, then you must use this new version. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Dec 13 23:37:01 2010 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 13 Dec 2010 23:37:01 -0800 Subject: [Live-devel] live555 stream aac raw problem In-Reply-To: <41FC1A94FE13684D8249B089C38119700F674F@mailserver-nj1.njacejet.com> References: <41FC1A94FE13684D8249B089C38119700F674F@mailserver-nj1.njacejet.com> Message-ID: > I have live raw AAC data; I implement my own >audio ServerMediaSession, overloading the virtual createNewRTPSink and >createNewStreamSource functions. > code: > > myownserversession::createNewRTPSink(....) >{ > return MPEG4GenericRTPSink::createNew(envir(), rtpGroupsock, > rtpPayloadTypeIfDynamic, > 44100, > "audio", "AAC-hbr", "1", //aac raw > 2); > >} > >myownserverseesion:createNewStreamSource(...) >{ > return myowndevicesource(...); > >} > >In my myowndevicesource class I use the same code as videodevicesource; videodevicesource works OK. > >But when I use VLC to play, VLC does not decode the audio. Why? >Does live555 not support raw AAC data - only AAC ADTS? No, our code is able to stream AAC audio, no matter how it's delivered to the "MPEG4GenericRTPSink". An "ADTS file" is just one possible way to deliver AAC audio frames. One problem that I see with your code is the "configString" parameter to "MPEG4GenericRTPSink::createNew()". The string "1" is wrong; instead, the string should be 4 hexadecimal digits (i.e., representing 2 bytes). To see how these two bytes are constructed, note how we do it in "ADTSAudioFileSource.cpp", line 108. You should also make sure that your "myowndevicesource()" class delivers one AAC frame (and no more) each time that its "doGetNextFrame()" is called. -- Ross Finlayson Live Networks, Inc.
http://www.live555.com/ From finlayson at live555.com Mon Dec 13 23:46:51 2010 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 13 Dec 2010 23:46:51 -0800 Subject: [Live-devel] New TaskScheduler 'event trigger - Usinf In-Reply-To: <10C6D4821E0BAA44A73B987733947B164A1736@OREV.ads.local> References: <10C6D4821E0BAA44A73B987733947B164A1736@OREV.ads.local> Message-ID: >When I trigger the triggerEvent() from my separate thread >(DirectShow Input pi Thread), I see that it's being catch by the >BasicTaskScheduler::SingleStep, Thinking that the way to work with >the UsageEnvironment is in the old fusion way - >env->taskScheduler().doEventLoop(). > >Whiteout using the watchVariable? The 'event trigger' mechanism is similar to the existing 'watch variable' mechanism (which remains available) - but 'event triggers' are probably easier to use if you want to do what you are doing: Signalling an event from an external thread. (If you decide to use the 'event trigger' mechanism, then be sure to use the latest version of the code: 2010.12.14) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Dec 14 00:06:29 2010 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 14 Dec 2010 00:06:29 -0800 Subject: [Live-devel] Trick play H264 In-Reply-To: References: Message-ID: I may be taking a look at this over the Christmas/New Year break. I'm not sure yet, but there might end up having to be a non-backwards-compatible change to the index file format in order to accommodate both kinds of Transport Stream file (MPEG-2 and H.264). -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ From finlayson at live555.com Tue Dec 14 00:07:49 2010 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 14 Dec 2010 00:07:49 -0800 Subject: [Live-devel] Problem streaming continuous H.264 video over Transport Stream In-Reply-To: <3EC6EEDBCBD4AC4A9FBE5F1F5EEA8619019BFD@SRVBRIEXC012.brisa.pt> References: <3EC6EEDBCBD4AC4A9FBE5F1F5EEA8619019BFD@SRVBRIEXC012.brisa.pt> Message-ID: >I'm using Live555 to send H.264 ES video over Transport Stream using >MPEG2TransportStreamFromESSource but the video stream blocks for >20 >minutes in every hour or so. >The goal is to deliver a live video stream using an H.264 video >encoder as the source, initial tests showed that Transport Stream >offered the most fluid stream. >This problem doesn't happen when sending raw H.264 ES using >H264VideoRTPSink directly. IMHO, if you're able to send H.264 video directly over RTP (i.e., without encapsulating it in a Transport Stream first), then you should continue to do this, because there will be less network overhead - and better resiliency to packet loss - by doing this, compared to streaming Transport Streams. I don't know what you mean by Transport Streams being 'more fluid', but that should not be the case, especially if the presentation times on your H.264 NAL units (delivered by your encoder) are correct. If, however, you insist on transmitting Transport Stream data (rather than raw H.264) over RTP, then I suggest that you first create a Transport Stream *file*, and then try to stream that (e.g., using our existing, unmodified "testOnDemandRTSPServer" or "live555MediaServer" applications). That might give you some idea about what's going wrong. -- Ross Finlayson Live Networks, Inc.
http://www.live555.com/ From yuri.timenkov at itv.ru Tue Dec 14 02:00:49 2010 From: yuri.timenkov at itv.ru (Yuri Timenkov) Date: Tue, 14 Dec 2010 13:00:49 +0300 Subject: [Live-devel] cross compiled Live555 libs In-Reply-To: References: Message-ID: <4D074051.8030402@itv.ru> Hi, You should know that VC and MinGW use incompatible library formats. FFMpeg is a C library, which allows interoperability between VC and MinGW (as it is with kernel32.dll and ntdll.dll). Moreover, AFAIK FFMpeg doesn't support VC (nor has plans to). With C++, things are different. As I mentioned, there are a lot of things to take care of (name mangling, vtables, templates, inlines, exceptions, etc.), and there is a big chance that C++ objects built with different compilers will crash if used in the same binary. To prevent this (along with other reasons) MinGW and VC use different mangling schemes. Even if they had the same library format (or if you built a DLL and generated exports from it), you couldn't link a VC app with a MinGW liveMedia. So if you want to build your app in VC and link it with liveMedia, you should build liveMedia with VC as well. Or you may just add all the liveMedia files into your app's project or solution. Recently there was a discussion about moving liveMedia to CMake, to allow generating native build systems on different platforms, but this initiative is now suspended (as I suppose there are not so many people interested in it). Regards, Yuri On 14.12.2010 0:07, Ke Yu wrote: > I had built the Live555 libs with mingw tools in UBuntu successfully > following instructions: > 1). genMakefiles config.mingw > 2). make > The reason for cross compiling above is I have other libraries built > this way (namely, FFMpeg). I've built an app with those FFMpeg libs in > Visual C++ 2008. The app works fine. Now, I need to also use the > live555 libraries built above in this same project.
However, when I > tried to link the live555 libs, the Visual Studio compiler gave me > fatal link error: "libliveMedia.a: invalid or corrupt file". The host > OS is Windows 7. > > How should I fix this problem? > > Thanks! > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > From Bruno.Basilio at brisa.pt Tue Dec 14 02:30:26 2010 From: Bruno.Basilio at brisa.pt (Bruno Filipe Basilio) Date: Tue, 14 Dec 2010 10:30:26 +0000 Subject: [Live-devel] Problem streaming continuous H.264 video over Transport Stream In-Reply-To: <3EC6EEDBCBD4AC4A9FBE5F1F5EEA8619019BFD@SRVBRIEXC012.brisa.pt> References: <3EC6EEDBCBD4AC4A9FBE5F1F5EEA8619019BFD@SRVBRIEXC012.brisa.pt> Message-ID: <3EC6EEDBCBD4AC4A9FBE5F1F5EEA861901A0D6@SRVBRIEXC012.brisa.pt> Ross, Thank you for the fast feedback. After doing some more tests it seems the video stream recovers from the blocking after a call to AlarmHandler::handleTimeout(). 
Here's the stack trace:

H264VideoDiscreteFramer::doGetNextFrame() at H264VideoDiscreteFramer.cpp:35 0x804fbba
InputESSourceRecord::askForNewData() at MPEG2TransportStreamFromESSource.cpp:191 0x8050e7c
MPEG2TransportStreamFromESSource::awaitNewBuffer() at MPEG2TransportStreamFromESSource.cpp:138 0x8050ef8
MPEG2TransportStreamMultiplexor::doGetNextFrame() at MPEG2TransportStreamMultiplexor.cpp:53 0x805e9ba
MultiFramedRTPSink::packFrame() at MultiFramedRTPSink.cpp:215 0x80530bd
MultiFramedRTPSink::sendNext() at MultiFramedRTPSink.cpp:402 0x8053599
AlarmHandler::handleTimeout() at BasicTaskScheduler0.cpp:34 0x8067c33
BasicTaskScheduler::SingleStep() at BasicTaskScheduler.cpp:150 0x8065a31
BasicTaskScheduler0::doEventLoop() at BasicTaskScheduler0.cpp:76 0x8067320

> IMHO, if you're able to send H.264 video directly over RTP (i.e., > without encapsulating it in a Transport Stream first), then you > should continue to do this, because there will be less network > overhead - and better resiliency to packet loss - by doing this, > compared to streaming Transport Streams. I don't know what you mean > by Transport Streams being 'more fluid', but that should not be the > case, especially if the presentation times on your H.264 NAL units > (delivered by your encoder) are correct. The goal of using Transport Stream is to use trick play in the future. H.264 encapsulated in TS is definitely 'more fluid' with our encoder's video stream; I'll try to obtain more information to verify whether the presentation times in the H.264 NAL units are correct. At this moment the presentation times are exposed through the encoder API and not taken from the NAL unit. > If, however, you insist on transmitting Transport Stream data (rather > than raw H.264) over RTP, then I suggest that you first create a > Transport Stream *file*, and then try to stream that (e.g., using > our existing, unmodified "testOnDemandRTSPServer" or > "live555MediaServer" applications).
That might give you some idea about what's going wrong. Thank you for your suggestion, I'll try to simulate the problem by creating a big TS file with the estimated time to failure in mind. Best Regards, Bruno Basilio
-------------------------------------------------------------------------------- From finlayson at live555.com Tue Dec 14 07:13:41 2010 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 14 Dec 2010 07:13:41 -0800 Subject: [Live-devel] Problem streaming continuous H.264 video over Transport Stream In-Reply-To: <3EC6EEDBCBD4AC4A9FBE5F1F5EEA861901A0D6@SRVBRIEXC012.brisa.pt> References: <3EC6EEDBCBD4AC4A9FBE5F1F5EEA8619019BFD@SRVBRIEXC012.brisa.pt> <3EC6EEDBCBD4AC4A9FBE5F1F5EEA861901A0D6@SRVBRIEXC012.brisa.pt> Message-ID: >Here's the stack trace: > >H264VideoDiscreteFramer::doGetNextFrame() at >H264VideoDiscreteFramer.cpp:35 0x804fbba >InputESSourceRecord::askForNewData() at >MPEG2TransportStreamFromESSource.cpp:191 0x8050e7c >MPEG2TransportStreamFromESSource::awaitNewBuffer() at >MPEG2TransportStreamFromESSource.cpp:138 0x8050ef8 >MPEG2TransportStreamMultiplexor::doGetNextFrame() at >MPEG2TransportStreamMultiplexor.cpp:53 0x805e9ba >MultiFramedRTPSink::packFrame() at MultiFramedRTPSink.cpp:215 0x80530bd >MultiFramedRTPSink::sendNext() at MultiFramedRTPSink.cpp:402 0x8053599 >AlarmHandler::handleTimeout() at BasicTaskScheduler0.cpp:34 0x8067c33 >BasicTaskScheduler::SingleStep() at BasicTaskScheduler.cpp:150 0x8065a31 >BasicTaskScheduler0::doEventLoop() at BasicTaskScheduler0.cpp:76 0x8067320 I notice that you're feeding your TransportStream data (from the "MPEG2TransportStreamMultiplexor") directly into a RTPSink. Although that *might* work in this case (because you're reading from a live source rather than from a file), it would be safer to have a "MPEG2TransportStreamFramer" object sitting between your "MPEG2TransportStreamMultiplexor" and your RTPSink. This may cause RTP packets to get sent out at a more even rate. You also didn't say anything about your H.264 input source implementation - i.e., the class that feeds into the "H264VideoStreamDiscreteFramer" - but it's conceivable that the blockage is happening there. 
I.e., you should make sure that the call to "doGetNextFrame()" on your input source object is always causing data to be delivered to the downstream object (the "H264VideoStreamDiscreteFramer") without excessive delay. > > If, however, you insist on transmitting Transport Stream data (rather >> than raw H.264) over RTP, then I suggest that you first create a >> Transport Stream *file*, and then trying to stream that (e.g., using >> our existing, unmodified "testOnDemandRTSPServer" or >> "live555MediaServer" applications). That might give you some idea >> about what's going wrong. > >Thank you for you suggestion, I'll try to simulate the problem >creating a big TS file with the estimated time to failure in mind. It sounds like you're going to want to do this (write incoming data into a Transport Stream File) in the future anyway, because you mentioned that you eventually want to support 'trick play'. ('Trick play' operations are only for pre-recorded data, not for live streams.) So you might as well get used to recording files. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From matteo.pampolini at elsagdatamat.com Tue Dec 14 08:52:35 2010 From: matteo.pampolini at elsagdatamat.com (Pampolini Matteo) Date: Tue, 14 Dec 2010 17:52:35 +0100 Subject: [Live-devel] Support for Kasenna Mediabase server (long story...) Message-ID: <4D07A0D3.8010906@elsagdatamat.com> Hi all, I'm aware that many discussions have already been done around this subject, but unfortunately I still have many doubts, that's why I'm asking for help here. I'm developing an application, based on Live555, of course, that should be able to receive and play multicast streams from a Kasenna Mediabase server, currently it works fine with an RTSP server such as VideoLAN. As far as I know Live555 doesn't natively support it, am I right? 
Looking at VideoLAN source code it seems to me that the parsing of Mediabase SDP-like info is done in a different file, say sgimb.c that apparently has nothing to do with Live555. Unfortunately I'm not able to understand the calls trace because the VideoLAN binary I installed as an RPM on my PC works fine, while a version I compiled from source adding Live555 does not work, though all the modules seem to be OK. Then I have two questions: 1) Is there any VideoLAN independent patch that I can apply to Live555 source tree in order to connect to a Kasenna Mediabase server with openRTSP? 2) If not, may you please help me understand VideoLAN behavior? Many thanks in advance, Matteo From Bruno.Basilio at brisa.pt Tue Dec 14 10:04:13 2010 From: Bruno.Basilio at brisa.pt (Bruno Filipe Basilio) Date: Tue, 14 Dec 2010 18:04:13 +0000 Subject: [Live-devel] Problem streaming continuous H.264 video over Transport Stream In-Reply-To: <3EC6EEDBCBD4AC4A9FBE5F1F5EEA861901A0D6@SRVBRIEXC012.brisa.pt> References: <3EC6EEDBCBD4AC4A9FBE5F1F5EEA8619019BFD@SRVBRIEXC012.brisa.pt>, <3EC6EEDBCBD4AC4A9FBE5F1F5EEA861901A0D6@SRVBRIEXC012.brisa.pt> Message-ID: <3EC6EEDBCBD4AC4A9FBE5F1F5EEA861901A67B@SRVBRIEXC012.brisa.pt> Ross, The time period between the stream stopping and consequent recover is exactly 35:47.50 with a variance less than 10 msecs and this goes forever. Could you know any action that could be causing this behavior (so deterministic) in my linux system? In my previous post I think didn't understand your question and poorly explain myself about the meaning of "more fluid video stream" when preferring H.264 over TS to only H.264. I came to this conclusion comparing video streams visualized in VLC. But I agree with you that TS is more resource intensive in terms of network and processing. A subject to another thread is the compatibility on the client, i.e. Set-Top-Boxes. 
> I notice that you're feeding your TransportStream data (from the > "MPEG2TransportStreamMultiplexor") directly into a RTPSink. Although > that *might* work in this case (because you're reading from a live > source rather than from a file), it would be safer to have a > "MPEG2TransportStreamFramer" object sitting between your > "MPEG2TransportStreamMultiplexor" and your RTPSink. This may cause > RTP packets to get sent out at a more even rate. My first test wasn't successful because no frames have been sent, but I will try this in more detail later. Thank you for pointing this out. > You also didn't say anything about your H.264 input source > implementation - i.e., the class that feeds into the > "H264VideoStreamDiscreteFramer" - but it's conceivable that the > blockage is happening there. I.e., you should make sure that the > call to "doGetNextFrame()" on your input source object is always > causing data to be delivered to the downstream object (the > "H264VideoStreamDiscreteFramer") without excessive delay. The class that feeds into the "H264VideoStreamDiscreteFramer" is MyDeviceSource, as I mentioned in my first post on this subject, and in the normal situation, after "doGetNextFrame()" is called, the data is successfully delivered. The abnormal situation is the one being discussed here. > It sounds like you're going to want to do this (write incoming data > into a Transport Stream File) in the future anyway, because you > mentioned that you eventually want to support 'trick play'. ('Trick > play' operations are only for pre-recorded data, not for live > streams.) So you might as well get used to recording files. Doing the first tests as we speak. ;) Best Regards, Bruno Basilio -------------------------------------------------------------------------------- From finlayson at live555.com Tue Dec 14 12:24:50 2010 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 14 Dec 2010 12:24:50 -0800 Subject: [Live-devel] Problem streaming continuous H.264 video over Transport Stream In-Reply-To: <3EC6EEDBCBD4AC4A9FBE5F1F5EEA861901A67B@SRVBRIEXC012.brisa.pt> References: <3EC6EEDBCBD4AC4A9FBE5F1F5EEA8619019BFD@SRVBRIEXC012.brisa.pt>, <3EC6EEDBCBD4AC4A9FBE5F1F5EEA861901A0D6@SRVBRIEXC012.brisa.pt> <3EC6EEDBCBD4AC4A9FBE5F1F5EEA861901A67B@SRVBRIEXC012.brisa.pt> Message-ID: >The time period between the stream stopping and consequent recover >is exactly 35:47.50 with a variance less than 10 msecs and this goes >forever. >Could you know any action that could be causing this behavior (so >deterministic) in my linux system? This time is almost exactly 0x80000000, when expressed in microseconds.
So the problem seems to be that (somehow) a 32-bit duration value is overflowing from positive to negative (or vice versa). I'm a bit puzzled by this, though, for two reasons: 1/ You are not using a "MPEG2TransportStreamFramer" in front of your "MultiFramedRTPSink", so the latter should not be getting a "durationInMicroseconds" value other than 0. 2/ You are using a recent version of the code (because you know that we do not support old versions of the code :-), and a fix to check for negative durations was added to version 2010.12.05. But hopefully you now have enough information to work from... -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Tue Dec 14 12:36:07 2010 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 14 Dec 2010 12:36:07 -0800 Subject: [Live-devel] Support for Kasenna Mediabase server (long story...) In-Reply-To: <4D07A0D3.8010906@elsagdatamat.com> References: <4D07A0D3.8010906@elsagdatamat.com> Message-ID: >1) Is there any VideoLAN independent patch that I can apply to Live555 source >tree in order to connect to a Kasenna Mediabase server with openRTSP? No. For many years (at least 6), our RTSP client implementation included a gross hack to accommodate Kasenna's non-standard bastardization of the RTSP protocol. But not anymore. When our RTSP client implementation was overhauled earlier this year (to support asynchronous, non-blocking I/O), the 'Kasenna' hack (which was a *lot* of ugly code) was ripped out. We are not going to support this non-standard crap again (search for the string "Kasenna" in "liveMedia/RTSPClient.cpp" for a further explanation). We are far bigger than 'Kasenna' (in terms of installed base throughout the Internet), and are not going to bastardize our code to accommodate them. *They* are the ones that need to fix their systems to make them standards-compliant. (Or alternatively, you should just stop using their servers.) -- Ross Finlayson Live Networks, Inc.
http://www.live555.com/ From demask at mail.ru Tue Dec 14 13:30:09 2010 From: demask at mail.ru (Dmitry Petrenko) Date: Wed, 15 Dec 2010 03:30:09 +0600 Subject: [Live-devel] key frames (I-frames) in the recorded AVI file References: <99071844EFC64A5FA9CDC1E9E466668F@castle> Message-ID: <3F79A9BE5D304E6FA418C1CB93CF6529@castle> Re: [Live-devel] key frames (I-frames) in the recorded AVI Hello Ross, I have implemented some functionality related to AVI indices in the AVIFileSink class. It is still pretty basic, but it fits my needs for video recording. If you (or somebody else) would like, I can share the source code here. Features: - reuses some code from mpeg4ip (not the library itself, but *some code* to parse frames) - supports AVI 1.0 index (advantage is the simplicity and compatibility with most AVI players, disadvantage is the support limited to files < 2Gb only) - supports MP4V-ES streams TODOs: - implement AVI OpenDML index to support files > 2Gb - add support for other stream types (e.g. H.264 and so on) Kind regards, Dmitriy Petrenko ----- Original Message ----- From: Ross Finlayson To: LIVE555 Streaming Media - development & use Sent: Friday, December 10, 2010 9:54 AM Subject: Re: [Live-devel] key frames (I-frames) in the recorded AVI file I just was wondering if it's possible to calculate and write keyframe index information to the recorded AVI files using AVIFileSink? Implementation of this class has some hints (like setting AVIF_HASINDEX and AVIF_TRUSTCKTYPE flags in the AVI header) but where is the actual implementation? The file "liveMedia/AVIFileSink.cpp" contains our complete implementation of outputting to an AVI file. It could undoubtedly be improved, however. As always, feel free to propose any specific improvements or bugfixes that you might have. -- Ross Finlayson Live Networks, Inc.
http://www.live555.com/ ------------------------------------------------------------------------------ _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel From finlayson at live555.com Tue Dec 14 14:04:59 2010 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 14 Dec 2010 14:04:59 -0800 Subject: [Live-devel] key frames (I-frames) in the recorded AVI file In-Reply-To: <3F79A9BE5D304E6FA418C1CB93CF6529@castle> References: <99071844EFC64A5FA9CDC1E9E466668F@castle> <3F79A9BE5D304E6FA418C1CB93CF6529@castle> Message-ID: >I have implemented some functionality related to AVI indices in >the AVIFileSink class. It is still pretty basic, but it fits >my needs for video recording. If you (or somebody else) would like, I >can share the source code here. Thanks. Yes, please send us your new "AVIFileSink.cpp" and "AVIFileSink.hh" classes, and I'll look into including these changes in the next release of the software. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From demask at mail.ru Tue Dec 14 13:58:29 2010 From: demask at mail.ru (Dmitry Petrenko) Date: Wed, 15 Dec 2010 03:58:29 +0600 Subject: [Live-devel] MPEG-4 Visual configuration bits in SDP References: Message-ID: <200C03BCA04F4F37BA0B6DC435FA5279@castle> Re: [Live-devel] MPEG-4 Visual configuration bits in SDP Hello Ross, I would disagree that decoding SDP config bits is not needed for streaming at all. For example, the Axis 240S and 240Q video servers *do not* provide the "x-dimensions" parameter in their SDP (or other similar parameters like "cliprect" or "framesize"). Thus, LIVE555 is not aware of the video resolution when recording RTSP from these video servers.
The width and height values must be supplied manually, which is not feasible in some cases. E.g. when recording an MPEG-4 stream to an AVI file, wrong width and height values in the AVI header might result in an unplayable file. On the other hand, the Axis 240S and 240Q video servers provide the required video resolution info in the bits "video_object_layer_width" and "video_object_layer_height" from the config parameter. Thus, config bits parsing/decoding would be a useful feature in LIVE555. The decoding algorithm is pretty simple and can be found in the ISO/IEC 14496 Part 2 documentation, in the "Visual bitstream syntax" chapter. After all, LIVE555 currently has a static function samplingFrequencyFromAudioSpecificConfig() implemented in MPEG4GenericRTPSource.cpp. That is exactly what it does: parsing (i.e. decoding) the SDP config bits to extract the sampling frequency. This value is used in the SubsessionIOState class when recording QT movies. Kind regards, Dmitriy Petrenko ----- Original Message ----- From: Ross Finlayson To: LIVE555 Streaming Media - development & use Sent: Friday, December 10, 2010 10:07 AM Subject: Re: [Live-devel] MPEG-4 Visual configuration bits in SDP Just came across that it would be useful to have an easy-to-use decoder for MPEG-4 Visual configuration bits that are encoded in the "config" parameter of SDP. Something like the mpeg4vol command-line tool from mpeg4ip does. Are there any plans to implement this in the future? The 'config' parameter string is simply a string of bytes, encoded as hexadecimal digits (therefore, two hexadecimal digits per byte). We do have a function: parseGeneralConfigStr() (defined in "liveMedia/include/MPEG4LATMAudioRTPSource.hh") that will convert a 'config' string to binary. E.g., the 'config' string deadbeef would be converted to the following sequence of 4 bytes 0xDE 0xAD 0xBE 0xEF However, we don't implement any additional 'decoding' of this data (or any other MPEG-4 data), because it's not needed at all for streaming.
(We don't include any audio/video decoding or encoding software.) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Tue Dec 14 14:59:26 2010 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 14 Dec 2010 14:59:26 -0800 Subject: [Live-devel] MPEG-4 Visual configuration bits in SDP In-Reply-To: <200C03BCA04F4F37BA0B6DC435FA5279@castle> References: <200C03BCA04F4F37BA0B6DC435FA5279@castle> Message-ID: >I would disagree that decoding SDP config bits is not needed for >streaming at all. > >For example, the Axis 240S and 240Q video servers *do not* provide >the "x-dimensions" parameter in their SDP (or other similar parameters >like "cliprect" or "framesize"). Thus, LIVE555 is not aware of >the video resolution when recording RTSP from these video servers. The >width and height values must be supplied manually, which is not >feasible in some cases. E.g. when recording an MPEG-4 stream to an AVI >file, wrong width and height values in the AVI header might result >in an unplayable file. > >On the other hand, the Axis 240S and 240Q video servers provide the >required video resolution info in the bits "video_object_layer_width" >and "video_object_layer_height" from the config parameter. Thus, config >bits parsing/decoding would be a useful feature in LIVE555. The >decoding algorithm is pretty simple and can be found in the ISO/IEC >14496 Part 2 documentation, in the "Visual bitstream syntax" chapter. Yes, for that specific purpose - extracting width and height information - I can see how it could be useful to have code that analyzes the MPEG-4 Video config data.
(It turns out that - for transmitting MPEG-4 Elementary Stream files - we already do a bit of parsing of the "VOL header" structure in "MPEG4VideoStreamFramer", in order to extract the "vop_time_increment_resolution".) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From demask at mail.ru Tue Dec 14 17:23:02 2010 From: demask at mail.ru (Dmitry Petrenko) Date: Wed, 15 Dec 2010 07:23:02 +0600 Subject: [Live-devel] key frames (I-frames) in the recorded AVI file References: <99071844EFC64A5FA9CDC1E9E466668F@castle><3F79A9BE5D304E6FA418C1CB93CF6529@castle> Message-ID: <2D2C55D185FE4B1789BD2D01DD7C3B89@castle> Re: [Live-devel] key frames (I-frames) in the recorded AVI Ross, Please find the files attached. The new class AVIFileIndex is responsible for maintaining a copy of the AVI index in memory. One thing should be taken into account: the memory allocated by AVIFileIndex for the index grows as video is being received and recorded, because the index section must be added at the very end of the AVI file, just before closure. Although the size of the index data is very small compared to the video data, it could still potentially exhaust all available memory after a very long recording (many hours). I decided not to set any checks or limits within the AVIFileIndex class methods, because the AVI 1.0 index will not work for files greater than 2 GB anyway. The checks for the recording time (or, better, for the recorded size) should be done on the end-user side. When needed, the old AVIFileSink object can be closed and a new AVIFileSink object created to continue recording to another file name. This will free up all the memory allocated for the index.
Kind regards, Dmitriy Petrenko ----- Original Message ----- From: Ross Finlayson To: LIVE555 Streaming Media - development & use Sent: Wednesday, December 15, 2010 4:04 AM Subject: Re: [Live-devel] key frames (I-frames) in the recorded AVI file I have implemented some functionality related to AVI indices in the AVIFileSink class. It is still pretty basic, but it fits my needs for video recording. If you (or somebody else) would like, I can share the source code here. Thanks. Yes, please send us your new "AVIFileSink.cpp" and "AVIFileSink.hh" classes, and I'll look into including these changes in the next release of the software. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- A non-text attachment was scrubbed... Name: AVIFileSink.cpp Type: application/octet-stream Size: 29428 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: AVIFileSink.hh Type: application/octet-stream Size: 8661 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: MPEG4Tools.cpp Type: application/octet-stream Size: 15651 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: MPEG4Tools.h Type: application/octet-stream Size: 7688 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: MPEG4Types.cpp Type: application/octet-stream Size: 2310 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed...
Name: MPEG4Types.h Type: application/octet-stream Size: 8291 bytes Desc: not available URL: From finlayson at live555.com Tue Dec 14 20:13:27 2010 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 14 Dec 2010 20:13:27 -0800 Subject: [Live-devel] key frames (I-frames) in the recorded AVI file In-Reply-To: <2D2C55D185FE4B1789BD2D01DD7C3B89@castle> References: <99071844EFC64A5FA9CDC1E9E466668F@castle><3F79A9BE5D304E6FA418C1CB93CF6529@castle> <2D2C55D185FE4B1789BD2D01DD7C3B89@castle> Message-ID: >One thing should be taken into account: the memory allocated by >AVIFileIndex for the index grows as video is being received and >recorded, because the index section must be added at the very end of >the AVI file, just before closure. Although the size of the index data is very >small compared to the video data, it could still potentially exhaust >all available memory after a very long recording (many hours). Yes, it turns out that we have the same issue for "QuickTimeFileSink". Unfortunately there isn't anything we can do about this; these file formats are not well-designed for the purpose of recording incoming streams. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From rmcouat at smartt.com Tue Dec 14 22:18:45 2010 From: rmcouat at smartt.com (Ron McOuat) Date: Tue, 14 Dec 2010 22:18:45 -0800 Subject: [Live-devel] key frames (I-frames) in the recorded AVI file In-Reply-To: References: <99071844EFC64A5FA9CDC1E9E466668F@castle><3F79A9BE5D304E6FA418C1CB93CF6529@castle> <2D2C55D185FE4B1789BD2D01DD7C3B89@castle> Message-ID: <4D085DC5.6080405@smartt.com> The bulk of the index could be written to a temp file as the video portion of the file is filled and then attached to the end of the AVI / QT file along with any of the other index information that may be required to wrap the raw index.
I know: more to manage, and potential problems with partial completion, but I think the memory problem is solved by this strategy. On 10-12-14 8:13 PM, Ross Finlayson wrote: >> One thing should be taken into account: the memory allocated by >> AVIFileIndex for the index grows as video is being received and recorded, >> because the index section must be added at the very end of the AVI file, >> just before closure. Although the size of the index data is very small >> compared to the video data, it could still potentially exhaust all >> available memory after a very long recording (many hours). > > Yes, it turns out that we have the same issue for > "QuickTimeFileSink". Unfortunately there isn't anything we can do > about this; these file formats are not well-designed for the purpose > of recording incoming streams. > -- > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ From demask at mail.ru Wed Dec 15 00:00:35 2010 From: demask at mail.ru (Dmitry Petrenko) Date: Wed, 15 Dec 2010 14:00:35 +0600 Subject: [Live-devel] key frames (I-frames) in the recorded AVI file References: <99071844EFC64A5FA9CDC1E9E466668F@castle><3F79A9BE5D304E6FA418C1CB93CF6529@castle> <2D2C55D185FE4B1789BD2D01DD7C3B89@castle> <4D085DC5.6080405@smartt.com> Message-ID: Re: [Live-devel] key frames (I-frames) in the recorded AVI Hello Ron, I too was thinking about the use of a temporary swap file. :-) But as mentioned below, there is no sense in doing this in AVIFileSink as long as we use the AVI 1.0 index and the user checks the output file size from the parent code. On the other hand, a proper implementation of the OpenDML AVI index will not require the use of temporary files either.
Multiple index sections (index chunks) can be created within a single OpenDML AVI file, and one small superindex section (an index of indexes) is needed at the very beginning of the file. The RIFF section in the new format is still limited to 1 Gb, but it is now legal to have more than one RIFF section. Therefore, the best strategy would be: 1) writing the bulk of index data to a separate section (and filling the appropriate superindex entry to point to this index section) every ~1 Gb or less of video data, 2) finalizing the current RIFF section, 3) starting a new RIFF section and calculating the new bulk of index data. And so on... The current AVIFileSink implementation creates a strange empty JUNK section at the beginning of AVI files. I think that was an attempt by the previous author to allow for an OpenDML superindex implementation in further releases. Kind regards, Dmitriy Petrenko ----- Original Message ----- From: Ron McOuat To: LIVE555 Streaming Media - development & use Sent: Wednesday, December 15, 2010 12:18 PM Subject: Re: [Live-devel] key frames (I-frames) in the recorded AVI file The bulk of the index could be written to a temp file as the video portion of the file is filled and then attached to the end of the AVI / QT file along with any of the other index information that may be required to wrap the raw index. I know: more to manage, and potential problems with partial completion, but I think the memory problem is solved by this strategy. On 10-12-14 8:13 PM, Ross Finlayson wrote: One thing should be taken into account: the memory allocated by AVIFileIndex for the index grows as video is being received and recorded, because the index section must be added at the very end of the AVI file, just before closure. Although the size of the index data is very small compared to the video data, it could still potentially exhaust all available memory after a very long recording (many hours). Yes, it turns out that we have the same issue for "QuickTimeFileSink".
Unfortunately there isn't anything we can do about this; these file formats are not well-designed for the purpose of recording incoming streams. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From Bruno.Basilio at brisa.pt Thu Dec 16 02:45:36 2010 From: Bruno.Basilio at brisa.pt (Bruno Filipe Basilio) Date: Thu, 16 Dec 2010 10:45:36 +0000 Subject: [Live-devel] Problem streaming continuous H.264 video over Transport Stream In-Reply-To: <3EC6EEDBCBD4AC4A9FBE5F1F5EEA861901A67B@SRVBRIEXC012.brisa.pt> References: <3EC6EEDBCBD4AC4A9FBE5F1F5EEA8619019BFD@SRVBRIEXC012.brisa.pt>, <3EC6EEDBCBD4AC4A9FBE5F1F5EEA861901A0D6@SRVBRIEXC012.brisa.pt> <3EC6EEDBCBD4AC4A9FBE5F1F5EEA861901A67B@SRVBRIEXC012.brisa.pt> Message-ID: <3EC6EEDBCBD4AC4A9FBE5F1F5EEA861901A9E9@SRVBRIEXC012.brisa.pt> Ross, > I'm a bit puzzled by this, though, for two reasons: > 1/ You are not using a "MPEG2TransportStreamFramer" in front of your > "MultiFramedRTPSink", so the latter should not be getting a > "durationInMicroseconds" value other than 0. > 2/ You are using a recent version of the code (because you know that > we do not support old versions of the code :-), and a fix to check > for negative durations was added to version 2010.12.05. I successfully upgraded to the latest version, and the test app has been streaming raw H.264 video over TS for several hours without interruptions. Thank you for your support, even though a simple update on my part did the trick.
Regarding issue 1/: I couldn't see the video stream in VLC when using a "MPEG2TransportStreamFramer" in front of "MultiFramedRTPSink"; I don't know where to go from here with this. Since the upgrade I have also run into another issue: the stream from the encoder shows "Warning: Invalid 'nal_unit_type': 0. Does the NAL unit begin with a MPEG 'start code' by mistake?" from H264VideoStreamDiscreteFramer::afterGettingFrame1(). The video stops working in VLC if I try to remove the mistaken "'nal_unit_type': 0". I tried to solve this by removing the fixed number of "0" bytes at the beginning of the NAL unit, simply advancing the pointer by that fixed size, but was not successful. Thank you for this good piece of software. Best regards, Bruno Basilio
-------------------------------------------------------------------------------- From finlayson at live555.com Thu Dec 16 03:41:58 2010 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 16 Dec 2010 01:41:58 -1000 Subject: [Live-devel] Problem streaming continuous H.264 video over Transport Stream In-Reply-To: <3EC6EEDBCBD4AC4A9FBE5F1F5EEA861901A9E9@SRVBRIEXC012.brisa.pt> References: <3EC6EEDBCBD4AC4A9FBE5F1F5EEA8619019BFD@SRVBRIEXC012.brisa.pt>, <3EC6EEDBCBD4AC4A9FBE5F1F5EEA861901A0D6@SRVBRIEXC012.brisa.pt> <3EC6EEDBCBD4AC4A9FBE5F1F5EEA861901A67B@SRVBRIEXC012.brisa.pt> <3EC6EEDBCBD4AC4A9FBE5F1F5EEA861901A9E9@SRVBRIEXC012.brisa.pt> Message-ID: >Since the upgrade I had also come up with another issue: the stream >from the encoder shows "Warning: Invalid 'nal_unit_type': 0. Does >the NAL unit begin with a MPEG 'start code' by mistake?" from >H264VideoStreamDiscreteFramer::afterGettingFrame1(). I don't understand how you could have been using the code before, without having upgraded to at least the "2010.12.05" version, because the "H264VideoStreamDiscreteFramer" class did not even exist prior to that version of the code! By (apparently) using a bastardized, partially out-of-date version of the code, you wasted lots of time (mostly your own). But anyway, now that you're using a fully up-to-date version of the code... Does the data from your H.264 encoder start with a 0x00000001 (a 'start code') - for each NAL unit? If so, your H.264 data is a *byte stream*, not a sequence of discrete NAL units. In this case: 1/ If your H.264 input stream comes from a file (either a named file, or an open file), then you can use a "ByteStreamFileSource" to encapsulate it; you do not need to write your own 'device source' class. 2/ Because your input data is apparently a byte stream, you should definitely *not* be using a "H264VideoStreamDiscreteFramer". 
But now that I think about this some more: Because you are feeding your H.264 data into a Transport Stream - via a "MPEG2TransportStreamFromESSource" - I don't think you even need a H.264 'framer' class at all (because you will not be packing NAL units directly into outgoing RTP packets). So, right now, I think you can probably just feed your input device object directly into a "MPEG2TransportStreamFromESSource", without any intervening 'framer' object. Furthermore, you might be able to use a "ByteStreamFileSource" to encapsulate your input device. (But anyway, I've now spent far too much time dealing with your project. Don't expect much more help in the future...) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From matteo.pampolini at elsagdatamat.com Thu Dec 16 04:44:56 2010 From: matteo.pampolini at elsagdatamat.com (Pampolini Matteo) Date: Thu, 16 Dec 2010 13:44:56 +0100 Subject: [Live-devel] Support for Kasenna Mediabase server (long story...) In-Reply-To: References: <4D07A0D3.8010906@elsagdatamat.com> Message-ID: <4D0A09C8.7040509@elsagdatamat.com> Hi Ross, first of all, many thanks for your feedback, and sorry for this late reply. In the meantime I made further investigations, as reported below. Moreover, please let me tell you that I perfectly agree with what you wrote and with the comment inside the code, but unfortunately I have a customer that is still using Kasenna MediaBase and doesn't seem interested in changing it. Let's now come to my results: the Fedora precompiled VLC 1.1.4 binaries work fine with MediaBase; looking at the log I see: Kasenna flag enabled: User-Agent: VLC_MEDIA_PLAYER_KA Kasenna flag disabled: User-Agent: LibVLC/1.1.4 (LIVE555 Streaming Media v2010.04.09) From my understanding, this should mean that version 2010.04.09 still had support for Kasenna, am I right? If so, is it possible to download it from some repository?
Many thanks for your kind support, Matteo On 12/14/2010 09:36 PM, Ross Finlayson wrote: >> 1) Is there any VideoLAN independent patch that I can apply to Live555 >> source >> tree in order to connect to a Kasenna Mediabase server with openRTSP? > > No. > > For many years (at least 6), our RTSP client implementation included a > gross hack to accommodate Kasenna's non-standard bastardization of the > RTSP protocol. But not anymore. When our RTSP client implementation > was overhauled earlier this year (to support asynchronous, non-blocking > I/O), the 'Kasenna' hack (which was a *lot* of ugly code) was ripped out. > > We are not going to support this non-standard crap again (search for the > string "Kasenna" in "liveMedia/RTSPClient.cpp" for a further explanation). > > We are far bigger than 'Kasenna' (in terms of installed base throughout > the Internet), and are not going to bastardize our code to accommodate > them. *They* are the ones that need to fix their systems to make them > standards-compliant. (Or alternatively, you should just stop using > their servers.) From matteo.lisi at engicam.com Thu Dec 16 06:37:05 2010 From: matteo.lisi at engicam.com (Matteo Lisi) Date: Thu, 16 Dec 2010 15:37:05 +0100 Subject: [Live-devel] Open RTSP and video file splitting... Message-ID: <4D0A2411.1070206@engicam.com> Hi to all, I'm working on Linux, trying to modify the test program "openRTSP" in the live555 package to take the stream from an IP cam and put it into a different file every 60 seconds. As the final result I want to have a collection of video files, each 60 seconds long, named stdout_xxx.mp4.
In the file "playCommon.cpp" I created a task that performs the following action every 60 seconds:

void sessionTimerFileHandler(void* /*clientData*/) {
  static int file_counter = 1;
  char outFileName[30];

  fileTimerTask = env->taskScheduler().scheduleDelayedTask(60000000,
      (TaskFunc*)sessionTimerFileHandler, (void*)NULL);
  Medium::close(qtOut);
  sprintf(outFileName, "stdout_%03d.mp4", file_counter);
  file_counter++;
  *env << outFileName << "\n";
  qtOut = QuickTimeFileSink::createNew(*env, *session, outFileName,
      fileSinkBufferSize, movieWidth, movieHeight, movieFPS,
      packetLossCompensate, syncStreams, generateHintTracks,
      generateMP4Format);
  *env << "exit function\n";
}

But often the program exits with a "Segmentation Fault", and I don't know why... I launch my application with this command line: ./myapp -v -4 -w 640 -h 480 -f 25 rtsp://admin:12345 at 192.168.2.62 Is it possible that the program crashes during the closing and reopening of the "qtOut" object, while the streaming code still tries to write to the output file? Thanks Matteo -- ------------------------------------------------------------------------ http://www.engicam.com *ENGICAM s.r.l.* Progettazione di sistemi elettronici 50018 Scandicci - FIRENZE Via dei Pratoni, 16 int. 13 Tel. +39 055 7311387 Fax. +39 055 720608 Web www.engicam.com C.F./P.I. 05389070482 Registro Imprese di FIRENZE 542918 Capitale sociale versato 20.000,00 € ------------------------------------------------------------------------ NOTICE: This message and attachments are intended only for the use of their addresses and may contain confidential information belonging to Engicam. If you are not the intended recipient, you are hereby notified that any reading, dissemination, distribution, or copying of this message, or any attachment, is strictly prohibited. If you have received this message in error, please notify the original sender immediately and delete this message, along with any attachments.
From finlayson at live555.com Thu Dec 16 09:05:03 2010 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 16 Dec 2010 07:05:03 -1000 Subject: [Live-devel] Support for Kasenna Mediabase server (long story...) In-Reply-To: <4D0A09C8.7040509@elsagdatamat.com> References: <4D07A0D3.8010906@elsagdatamat.com> <4D0A09C8.7040509@elsagdatamat.com> Message-ID: > >From my understanding, this should mean that version 2010.04.09 still had >support for Kasenna, am I right? Yes. > If so, is it possible to download it from some repository? I'm sure it's possible to download this from somewhere - but not from us. We do not support old versions of the code. Many bugs have been fixed since that version. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Thu Dec 16 09:36:12 2010 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 16 Dec 2010 07:36:12 -1000 Subject: [Live-devel] Open RTSP and video file splitting... In-Reply-To: <4D0A2411.1070206@engicam.com> References: <4D0A2411.1070206@engicam.com> Message-ID: >But often the program exits in "Segmentation Fault", and I don't know why... The reason is that you are closing (and deleting) your 'data sink' object - "qtOut" - but you are still 'playing' that object. I.e., the code thinks that it should still be receiving incoming network data into the original "qtOut" object. The right way to fix this - with minimal disruption to the existing code - is not to make any changes to the network reading/recording mechanism, but instead to use a different 'data sink' class - one that knows how to periodically close/reopen its output file, but without having to delete/recreate the 'data sink' object itself. So, I suggest that you subclass the "QuickTimeFileSink" class to do this. 
Unfortunately, to do this, you're going to have to make some modifications to "liveMedia/include/QuickTimeFileSink.hh" - to change some fields and member functions from "private:" to "protected:" - so you can subclass "QuickTimeFileSink". (In particular, you will definitely need to change the constructor and destructor, and "completeOutputFile()" and "fOutFid" to "protected:") Once you've done this, please let us know which fields/functions you needed to make "protected:", and I'll change the next released version of the code to make at least those fields/functions "protected:". That way, you won't need to keep changing the released code each time. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Fri Dec 17 05:30:56 2010 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 17 Dec 2010 03:30:56 -1000 Subject: [Live-devel] openRTSP writing all zeros In-Reply-To: <8CD7A9204779214D9FDC255DE48B95211DF5515C@EXPMBX105-1.exch.logostech.net> References: <8CD7A9204779214D9FDC255DE48B95211DF5515C@EXPMBX105-1.exch.logostech.net> Message-ID: >However, now the entire file, with the exception of a very few bytes >at the beginning and scattered seemingly randomly throughout, >contains all zeros. Using wireshark I can verify that the actual >non-zero data is being sent. The QoS statistics also show that no >packets were dropped. That's very strange. I can't explain this. Do you also get a strange-looking file if you omit the "-m" option (so that you write just a single output file)? -- Ross Finlayson Live Networks, Inc. 
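The close-and-reopen pattern described above can be illustrated with a small, self-contained sketch. This is not the real live555 API; the class and member names below ("RotatingFileSink", "rotateOutputFile") are hypothetical. The point is that the sink object itself stays alive while only its output file is rotated:

```cpp
#include <cstdio>
#include <string>

// Minimal, self-contained sketch (NOT the live555 API): a sink object
// that rotates its own output file periodically, instead of being
// deleted and recreated while network data is still being delivered
// into it.  A real solution would subclass QuickTimeFileSink and call
// its (then "protected:") completeOutputFile() before each close, so
// that every finished file gets valid atoms and remains playable.
class RotatingFileSink {
public:
  explicit RotatingFileSink(std::string const& prefix)
      : fPrefix(prefix), fCounter(0), fOutFid(nullptr) {
    openNextFile();
  }
  ~RotatingFileSink() {
    if (fOutFid != nullptr) std::fclose(fOutFid);
  }

  // Called from a delayed task (e.g. every 60 seconds): finish the
  // current file and open the next one; the sink itself stays alive,
  // so the recording code keeps a valid object to deliver data into.
  void rotateOutputFile() {
    if (fOutFid != nullptr) std::fclose(fOutFid);
    openNextFile();
  }

  void addData(unsigned char const* data, unsigned size) {
    if (fOutFid != nullptr) std::fwrite(data, 1, size, fOutFid);
  }

private:
  void openNextFile() {
    char name[64];
    std::snprintf(name, sizeof name, "%s_%03d.mp4",
                  fPrefix.c_str(), fCounter++);
    fOutFid = std::fopen(name, "wb");
  }

  std::string fPrefix;
  int fCounter;
  std::FILE* fOutFid;
};
```

The rotation call would be scheduled the same way as in the original code (via scheduleDelayedTask()), but without ever calling Medium::close() on the sink while it is still playing.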
http://www.live555.com/ From TWiser at logostech.net Fri Dec 17 06:18:05 2010 From: TWiser at logostech.net (Wiser, Tyson) Date: Fri, 17 Dec 2010 06:18:05 -0800 Subject: [Live-devel] openRTSP writing all zeros In-Reply-To: References: <8CD7A9204779214D9FDC255DE48B95211DF5515C@EXPMBX105-1.exch.logostech.net> Message-ID: <8CD7A9204779214D9FDC255DE48B95211DF55426@EXPMBX105-1.exch.logostech.net> Hi Ross, Thanks for the reply. Yes, if I omit the "-m" option the single large file looks the same. I've tried using a debugger to trace it back. I'm not at my work computer right now, so this is all from memory and may not be entirely correct. As I recall, however, the FileSink had a SimpleRTPSource as its source. The SimpleRTPSource received its data from a Groupsock. Both when copying the data from the SimpleRTPSource to the FileSink and from the Groupsock to the SimpleRTPSource, the buffers that the memory was being copied from contained all zeros (with very few exceptions). They were always of the correct size, but just didn't seem to contain the correct data. Thanks, Tyson -----Original Message----- From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Friday, December 17, 2010 8:31 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] openRTSP writing all zeros >However, now the entire file, with the exception of a very few bytes >at the beginning and scattered seemingly randomly throughout, >contains all zeros. Using Wireshark I can verify that the actual >non-zero data is being sent. The QoS statistics also show that no >packets were dropped. That's very strange. I can't explain this. Do you also get a strange-looking file if you omit the "-m" option (so that you write just a single output file)? -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel From finlayson at live555.com Fri Dec 17 10:03:29 2010 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 17 Dec 2010 08:03:29 -1000 Subject: [Live-devel] openRTSP writing all zeros In-Reply-To: <8CD7A9204779214D9FDC255DE48B95211DF55426@EXPMBX105-1.exch.logostech.net> References: <8CD7A9204779214D9FDC255DE48B95211DF5515C@EXPMBX105-1.exch.logostech.net> <8CD7A9204779214D9FDC255DE48B95211DF55426@EXPMBX105-1.exch.logostech.net> Message-ID: >Yes, if I omit the "-m" option the single large file looks the same. >I've tried using a debugger to trace it back. I'm not at my work >computer right now, so this is all from memory and may not be >entirely correct. As I recall, however, the FileSink had a >SimpleRTPSource as its source. The SimpleRTPSource received its >data from a Groupsock. Both when copying the data from the >SimpleRTPSource to the FileSink and from the Groupsock to the >SimpleRTPSource the buffers that the memory was being copied from >contained all zeros (with very few exceptions). They were always of >the correct size, but just didn't seem to contain the correct data. This suggests that there may be a problem with either your operating system, or your network. You mentioned that - using Wireshark - the correct data was being sent. But is the receiving network the same as the sending network? I.e., do you have a router or firewall between the sender and the receiver? If so, you should check that the correct data is also being received - i.e., arriving on the receiver's network. -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ From edi87 at fibertel.com.ar Sun Dec 19 16:54:53 2010 From: edi87 at fibertel.com.ar (Jonathan Granade) Date: Sun, 19 Dec 2010 21:54:53 -0300 Subject: [Live-devel] Question about trick play, server side In-Reply-To: References: <6edb3b012f56.4d02707a@fibertel.com.ar> Message-ID: <4D0EA95D.2080705@fibertel.com.ar> Ross, I finally made it work. I have a weird behavior with VLC, but I'm trying to fix it. I did it by adding to the afterPlayingStreamState function (in OnDemandServerMediaSubsession) a call to seekStream to position 0, and a call to RTPSink->startPlaying(). The problem that I had with VLC was that the stream (a 10-second mpg video+audio) starts correctly; then, when it ends, the audio stream loops and starts again, but the video hangs on the last frame... Later, when the audio stream ends again, both streams (audio+video) loop and everything starts again. It only happens with VLC; I tried with openRTSP and MPlayer and it works fine. I debugged a bit and found that sometimes the audio stream ends first (afterPlayingStreamState of the audio StreamState object is called first), so it loops, while the video stream (video StreamState object) appears to never call its afterPlayingStreamState function the first time; it gets called the second time, then again it's not called on the third loop and is called on the fourth. If you can give me any pointer about what could be happening, I will appreciate it. I'm working on it, and hopefully I will send you a patch with the new objects to make a looping stream (if you're interested). Thanks in advance, Jonathan On 12/11/2010 04:32 AM, Ross Finlayson wrote: >> Now the problem is where to make the stream to loop to the start of >> file... 
you said: >> >> "You could do this, but you'd need to write a new "FramedFilter" >> subclass that sits >> in front of your "ByteStreamFileSource" class (and presents the >> illusion of delivering a single, unbroken stream to the downstream >> object (a "MPEG2TransportStreamFramer"))" >> >> I can't understand why a FramedFilter subclass should be used > An alternative - which might be simpler for you - would be to write a > new class (e.g. called "ContinuousByteStreamFileSource") that duplicates > much of the functionality (and code) of the existing > "ByteStreamFileSource" - except that it reads from its file continuously > - and just use this instead of "ByteStreamFileSource". > > I.e., you would define your own subclass of > "OnDemandServerMediaSubsession" that would be identical to the existing > "MPEG2TransportFileServerMediaSubsession", except that it uses a > "ContinuousByteStreamFileSource" as input instead of a > "ByteStreamFileSource". > > As always, you should *not* need to modify any of the existing code. From matteo.lisi at engicam.com Mon Dec 20 01:38:18 2010 From: matteo.lisi at engicam.com (Matteo Lisi) Date: Mon, 20 Dec 2010 10:38:18 +0100 Subject: [Live-devel] Open RTSP and video file splitting... In-Reply-To: References: <4D0A2411.1070206@engicam.com> Message-ID: <4D0F240A.1060201@engicam.com> Hi, I made a quick change to the QuickTimeFileSink class for runtime output file name changing. You can see the code below:

void QuickTimeFileSink::ChangeOutputFile(UsageEnvironment& env, char* outputFileName) {
  printf("ChangeOutputFile\n");
  completeOutputFile();
  CloseOutputFile(fOutFid);
  fOutFid = OpenOutputFile(env, outputFileName);
  if (fOutFid == NULL) return; // check the new file *before* using it
  // Begin by writing a "mdat" atom at the start of the file.
  // (Later, when we've finished copying data to the file, we'll come
  // back and fill in its size.)
  fMDATposition = TellFile64(fOutFid);
  addAtomHeader64("mdat"); // add 64-bit offset
  fMDATposition += 8;
}

The result is that only the first video file is visible with a video player, while the other files are not valid. I suppose that I forgot to update something in the file's header, but what? Also, if I use the following command: ./vigilartsp -v -w 640 -h 480 -f 25 rtsp://admin:12345 at 192.168.2.62 (without the -4 flag), then when I call ChangeOutputFile it gives me a "Segmentation fault". Why? Many thanks. On 16/12/2010 18:36, Ross Finlayson wrote: >> But often the program exits in "Segmentation Fault", and I don't know >> why... > The reason is that you are closing (and deleting) your 'data sink' > object - "qtOut" - but you are still 'playing' that object. I.e., the > code thinks that it should still be receiving incoming network data > into the original "qtOut" object. > > The right way to fix this - with minimal disruption to the existing > code - is not to make any changes to the network reading/recording > mechanism, but instead to use a different 'data sink' class - one that > knows how to periodically close/reopen its output file, but without > having to delete/recreate the 'data sink' object itself. > > So, I suggest that you subclass the "QuickTimeFileSink" class to do this. > > Unfortunately, to do this, you're going to have to make some > modifications to "liveMedia/include/QuickTimeFileSink.hh" - to change > some fields and member functions from "private:" to "protected:" - so > you can subclass "QuickTimeFileSink". (In particular, you will > definitely need to change the constructor and destructor, and > "completeOutputFile()" and "fOutFid" to "protected:") Once you've done > this, please let us know which fields/functions you needed to make > "protected:", and I'll change the next released version of the code to > make at least those fields/functions "protected:". 
That way, you won't > need to keep changing the released code each time. From finlayson at live555.com Mon Dec 20 10:45:24 2010 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 20 Dec 2010 08:45:24 -1000 Subject: [Live-devel] latest live555, kernel and dev-toolkit version requirements? In-Reply-To: References: Message-ID: >Currently I work with MontaVista Linux 4.0.1, kernel 2.6.10 for the >DaVinci platform, and gcc version 3.4.3. Is this good enough to build the >latest version of live555? Almost certainly, yes. -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ From TWiser at logostech.net Mon Dec 20 13:27:30 2010 From: TWiser at logostech.net (Wiser, Tyson) Date: Mon, 20 Dec 2010 13:27:30 -0800 Subject: [Live-devel] openRTSP writing all zeros In-Reply-To: References: <8CD7A9204779214D9FDC255DE48B95211DF5515C@EXPMBX105-1.exch.logostech.net> <8CD7A9204779214D9FDC255DE48B95211DF55426@EXPMBX105-1.exch.logostech.net> Message-ID: <8CD7A9204779214D9FDC255DE48B95211DFE7831@EXPMBX105-1.exch.logostech.net> After a little more investigation, it turned out to be a stupid error on my part. Sorry to have wasted your time. -----Original Message----- From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Friday, December 17, 2010 1:03 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] openRTSP writing all zeros >Yes, if I omit the "-m" option the single large file looks the same. >I've tried using a debugger to trace it back. I'm not at my work >computer right now, so this is all from memory and may not be >entirely correct. As I recall, however, the FileSink had a >SimpleRTPSource as its source. The SimpleRTPSource received its >data from a Groupsock. Both when copying the data from the >SimpleRTPSource to the FileSink and from the Groupsock to the >SimpleRTPSource the buffers that the memory was being copied from >contained all zeros (with very few exceptions). They were always of >the correct size, but just didn't seem to contain the correct data. This suggests that there may be a problem with either your operating system, or your network. You mentioned that - using Wireshark - the correct data was being sent. But is the receiving network the same as the sending network? I.e., do you have a router or firewall between the sender and the receiver? If so, you should check that the correct data is also being received - i.e., arriving on the receiver's network. -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ From finlayson at live555.com Thu Dec 23 09:19:44 2010 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 23 Dec 2010 07:19:44 -1000 Subject: [Live-devel] LGPL In-Reply-To: References: Message-ID: I have no problem at all with statically linking with the LIVE555 library code. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Thu Dec 23 09:27:05 2010 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 23 Dec 2010 07:27:05 -1000 Subject: [Live-devel] Open RTSP and video file splitting... In-Reply-To: <4D0F240A.1060201@engicam.com> References: <4D0A2411.1070206@engicam.com> <4D0F240A.1060201@engicam.com> Message-ID: >Hi I did a quick change on the QuickTimeFileSink class , for runtime >output file name changing. Ideally, you should *not* be changing the supplied library code. Instead, you should subclass "QuickTimeFileSink", and add your new method - for changing the output file name - to your subclass. If, however, you were unable to subclass "QuickTimeFileSink" - e.g., because some member functions and/or variables were defined as "private:" instead of "protected:" - then let us know which functions/variables you'd like to see "protected:", and I'll make this change in the next release of the software. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Thu Dec 23 09:32:00 2010 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 23 Dec 2010 07:32:00 -1000 Subject: [Live-devel] Question about trick play, server side In-Reply-To: <4D0EA95D.2080705@fibertel.com.ar> References: <6edb3b012f56.4d02707a@fibertel.com.ar> <4D0EA95D.2080705@fibertel.com.ar> Message-ID: Because you have modified the supplied source code, you can expect no further help from me. Sorry. 
-- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Thu Dec 23 09:40:03 2010 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 23 Dec 2010 07:40:03 -1000 Subject: [Live-devel] RTSP over HTTP using VLC In-Reply-To: References: Message-ID: >I am using the live.2010.12.14.tar.gz release, which supports RTSP over >HTTP. I am able to make use of RTSP over HTTP using the test >application - openRTSP. Please tell me how to do the same using VLC? VLC is not our application (although it does use our libraries), so your question is better suited for a VLC mailing list. However, you can do this via Preferences -> Input & Codecs -> All (button) -> Demuxers -> RTP/RTSP, where you'll see a check box for "Tunnel RTSP and RTP over HTTP". -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From edi87 at fibertel.com.ar Sun Dec 26 18:13:14 2010 From: edi87 at fibertel.com.ar (Jonathan Granade) Date: Sun, 26 Dec 2010 23:13:14 -0300 Subject: [Live-devel] Question about trick play, server side In-Reply-To: References: <6edb3b012f56.4d02707a@fibertel.com.ar> <4D0EA95D.2080705@fibertel.com.ar> Message-ID: <4D17F63A.2060100@fibertel.com.ar> Ross, Actually, I did not modify the source code. I created subclasses of the specified objects and created an OnDemandContinuousMediaSubsession object. Regards, Jonathan On 12/23/2010 02:32 PM, Ross Finlayson wrote: > Because you have modified the supplied source code, you can expect no > further help from me. Sorry. From matteo.lisi at engicam.com Mon Dec 27 00:30:45 2010 From: matteo.lisi at engicam.com (Matteo Lisi) Date: Mon, 27 Dec 2010 09:30:45 +0100 Subject: [Live-devel] Open RTSP and video file splitting... In-Reply-To: References: <4D0A2411.1070206@engicam.com> <4D0F240A.1060201@engicam.com> Message-ID: <4D184EB5.2040301@engicam.com> Sorry for the misunderstanding; I'll subclass "QuickTimeFileSink" as soon as possible... But I have another question.. 
Finally, I divided the video streaming into 60-second mp4 video files... but now I have another problem: except for the first one, the files have about 1 second of "gray" before starting to play normally; during this second it seems that you can see only the parts in motion... Why? And how can I remove these "gray" seconds? Thanks On 23/12/2010 18:27, Ross Finlayson wrote: >> Hi I did a quick change on the QuickTimeFileSink class , for runtime >> output file name changing. > Ideally, you should *not* be changing the supplied library code. > Instead, you should subclass "QuickTimeFileSink", and add your new > method - for changing the output file name - to your subclass. > > If, however, you were unable to subclass "QuickTimeFileSink" - e.g., > because some member functions and/or variables were defined as > "private:" instead of "protected:" - then let us know which > functions/variables you'd like to see "protected:", and I'll make this > change in the next release of the software. 
From finlayson at live555.com Tue Dec 28 06:36:09 2010 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 28 Dec 2010 06:36:09 -0800 Subject: [Live-devel] H264 problem with StreamParser leak In-Reply-To: References: Message-ID: >I'm puzzled as to why I get this problem, as this should be common to >both MPEG4 and H264. Any ideas? I notice that for MPEG-4 you are using a "MPEG4VideoStreamDiscreteFramer", but for H.264 you are using a "H264VideoStreamFramer". If your H.264 encoder delivers discrete NAL units (i.e., one NAL unit at a time), then it should be fed into a "H264VideoStreamDiscreteFramer" instead. (In this case, though, there should *not* be a start code (0x00000001) at the start of each NAL unit; that code is used only if the H.264 data is in a byte stream.) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From sebastien-devel at celeos.eu Thu Dec 30 07:37:20 2010 From: sebastien-devel at celeos.eu (Sébastien Escudier) Date: Thu, 30 Dec 2010 16:37:20 +0100 Subject: [Live-devel] Range: clock= Message-ID: <1293723441.10469.4.camel@stim-desktop> Hi, I have a camera which is not working anymore with recent versions of live555. The error is: "Bad "Range:" header". The "clock" range seems to be defined in the RTSP specs. Here is the play request: Sending request: PLAY rtsp://192.43.185.31:554/MediaInput/mpeg4/ RTSP/1.0 CSeq: 5 User-Agent: LibVLC/1.2.0-git (LIVE555 Streaming Media v2010.12.14) Session: 12 Range: npt=0.000- Received 127 new bytes of response data. 
Received a complete PLAY response: RTSP/1.0 200 OK CSeq: 5 Session: 12 Range: clock=20101230T162452Z- RTP-Info: url=trackID=1;seq=53547;rtptime=1293726292 From warren at etr-usa.com Thu Dec 30 14:27:55 2010 From: warren at etr-usa.com (Warren Young) Date: Thu, 30 Dec 2010 15:27:55 -0700 Subject: [Live-devel] Sysclock sanity check causes slow-motion MPEG-2 TS playback from live555MediaServer Message-ID: <4D1D076B.1010000@etr-usa.com> We have discovered a problem with the following patch, introduced in live.2010.12.05: - Added a sanity check to "MultiFramedRTPSink" and "BasicUDPSink" to allow for the possibility of the system clock jumping ahead in time, and thereby messing up the calculation of how long to wait before sending the next packet. (Thanks to Anders Chen for noting this issue.) This causes MPEG-2 TS videos to play in slow motion on CentOS 3.x boxes, but not on our CentOS 5.x boxes. I'm guessing that it is due to some difference in the implementation of gettimeofday() in kernel 2.6. The client doesn't seem to matter. We've seen it with VLC on a variety of PCs, but also with hardware RTSP MPEG-2 TS stream decoders. We've ruled out differences in compilers and system libraries. We initially tested with binaries built on the test systems, but later tried copying the binary built on our CentOS 3.x test box to our CentOS 5.x test box, and the problem persisted. That test also rules out CPU type differences, since our CentOS 3.x box is a 32-bit Intel and the CentOS 5.x is a x86_64 box. Running the 32-bit binary built on the older box to the new 64-bit box -- where it runs asymptomatically -- rules out things like int64_t being typedef'd incorrectly. 
Here are the uname strings for our test boxes: CentOS 3.x, slo-mo: 2.4.21-63.EL i686 GNU/Linux CentOS 5.x, normal: 2.6.18-164.15.1.el5 SMP x86_64 GNU/Linux Maybe this patch also caused this guy's symptom: http://lists.live555.com/pipermail/live-devel/2010-December/012969.html Was this patch just "looks like a good idea" sort of patch, or does it fix something else, and reverting it breaks that? From finlayson at live555.com Thu Dec 30 17:46:06 2010 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 30 Dec 2010 17:46:06 -0800 Subject: [Live-devel] Fix for H264 profile-level-id In-Reply-To: References: Message-ID: Thanks. This will be fixed in the next release of the software. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Thu Dec 30 18:01:34 2010 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 30 Dec 2010 18:01:34 -0800 Subject: [Live-devel] H264 problem with StreamParser leak In-Reply-To: References: Message-ID: >We have things working now by avoiding the issue by creating our own >customized DiscreteFramer. It avoids the problem by directly using >the output buffer and does not touch the Bank buffers. I suspect >there is a problem with the H264VideoStreamFramer. No, I think it's OK. However, it expects to be able to read a large (and arbitrary-sized) chunk of data from its input source. Usually the input source is a file, in which case that's not a problem. However, in your case, the input source was delivering just a short chunk of data (an individual NAL unit) each time. That was apparently (somehow) upsetting the stream buffering/parsing code. That's why "H264VideoStreamDiscreteFramer" is the appropriate 'framer' class for you. >BTW, does it conflict with the H264 standard if the data being >passed through the discrete framer *does* contain the startcodes? 
The start codes are not supposed to be present in outgoing RTP packets (according to the H.264 RTP payload format specification), so our "H264VideoRTPSink" class (which the 'framer' class feeds into) can't accept NAL units that begin with the start code. (In principle, we *could* strip these out - in either "H264VideoRTPSink" or "H264VideoStreamDiscreteFramer" - if we see them, but that would require doing an extra memcpy(), which would be inefficient. So it's better not to allow the start codes at all.) Your best solution, IMHO, would be to fix your encoder source to *not* include the start code at the beginning of each NAL unit that it delivers. Then you could just use our (unmodified) "H264VideoStreamDiscreteFramer". -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Fri Dec 31 01:45:01 2010 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 31 Dec 2010 01:45:01 -0800 Subject: [Live-devel] Range: clock= In-Reply-To: <1293723441.10469.4.camel@stim-desktop> References: <1293723441.10469.4.camel@stim-desktop> Message-ID: >I have a camera which is not working anymore with recent versions of >live555. Error is : "Bad "Range:" header" >"clock" range seems to be defined in rtsp specs. OK, the next version of the code will allow this type of parameter in "Range:" headers (but it won't actually interpret such parameters; instead, it'll just ignore them). -- Ross Finlayson Live Networks, Inc. 
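As a rough illustration of the start-code discussion above: an application feeding discrete NAL units could skip any leading Annex B start code itself before delivery. This is a hypothetical helper, not part of live555; adjusting the pointer and length by the returned count avoids the extra memcpy() mentioned above.

```cpp
#include <cstddef>

// Hypothetical helper (NOT part of live555): returns how many leading
// Annex B start-code bytes to skip before handing a NAL unit to a
// discrete framer, which expects NAL units *without* start codes.
// Recognizes both the 4-byte (0x00000001) and 3-byte (0x000001) forms.
inline std::size_t startCodeLength(unsigned char const* nal, std::size_t size) {
  if (size >= 4 && nal[0] == 0 && nal[1] == 0 && nal[2] == 0 && nal[3] == 1)
    return 4;
  if (size >= 3 && nal[0] == 0 && nal[1] == 0 && nal[2] == 1)
    return 3;
  return 0;  // no start code: the NAL unit can be delivered as-is
}
```

The caller would advance its buffer pointer by the returned length and shrink the size accordingly, delivering the bare NAL unit in place.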
http://www.live555.com/ From finlayson at live555.com Fri Dec 31 02:17:40 2010 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 31 Dec 2010 02:17:40 -0800 Subject: [Live-devel] Sysclock sanity check causes slow-motion MPEG-2 TS playback from live555MediaServer In-Reply-To: <4D1D076B.1010000@etr-usa.com> References: <4D1D076B.1010000@etr-usa.com> Message-ID: >We have discovered a problem with the following patch, introduced in >live.2010.12.05: > >- Added a sanity check to "MultiFramedRTPSink" and "BasicUDPSink" to > allow for the possibility of the system clock jumping ahead in time, > and thereby messing up the calculation of how long to wait before > sending the next packet. (Thanks to Anders Chen for noting this > issue.) Sorry, but that patch definitely averted a real problem, and we won't be backing it out. If you're really seeing a problem with this new version, then you're going to have to identify specifically what is causing it. >Maybe this patch also caused this guy's symptom: > >http://lists.live555.com/pipermail/live-devel/2010-December/012969.html No, that's something completely unrelated. >Was this patch just "looks like a good idea" sort of patch, or does >it fix something else, and reverting it breaks that? The latter. See http://lists.live555.com/pipermail/live-devel/2010-November/012840.html -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Fri Dec 31 06:53:16 2010 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 31 Dec 2010 06:53:16 -0800 Subject: [Live-devel] Sysclock sanity check causes slow-motion MPEG-2 TS playback from live555MediaServer In-Reply-To: <4D1D076B.1010000@etr-usa.com> References: <4D1D076B.1010000@etr-usa.com> Message-ID: >This causes MPEG-2 TS videos to play in slow motion on CentOS 3.x >boxes, but not on our CentOS 5.x boxes. Does this happen with all of your MPEG-2 TS video files, or only with some of them? 
My current suspicion is that something strange about your TS file(s) is (somehow) causing the duration estimate computation in "MPEG2TransportStreamFramer" to go negative. The new 'sanity check' in "MultiFramedRTPSink" will overcome this, but perhaps this is also (somehow) exposing some problem with your TS file(s) that happened to get hidden in the old code? It might be useful to see an example of a TS file that illustrates the problem. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Fri Dec 31 07:13:36 2010 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 31 Dec 2010 07:13:36 -0800 Subject: [Live-devel] New LIVE555 version, now supports 'trick play' on H.264 Transport Stream files Message-ID: Happy New Year (for some of you). I have now installed a new version (2010.12.31) of the "LIVE555 Streaming Media" code that supports 'trick play' operations (seeking, fast-forward, reverse play) on MPEG Transport Stream files that contain H.264 video (rather than just MPEG-2 video, as in previous versions). To support this, the index file format has been extended in a backwards-compatible way, so that your existing index files - for MPEG-2 video Transport Streams - will continue to work as before. To create an index file for H.264 Transport Streams - and to act as a server for such streams - you will, of course, need to download the new code. I have also released new binary versions of the "MPEG2TransportStreamIndexer" and "testMPEG2TransportStreamTrickPlay" utilities, and the "live555MediaServer" - for Windows, MacOS/Intel, FreeBSD, and Linux/Intel. The new code has been tested for only a few H.264 Transport Stream files, and might not work properly for all such files. (In particular, it relies upon a SPS (Sequence Parameter Set) NAL unit appearing shortly before each 'I-frame'.) 
As always, if you have an example of a Transport Stream file for which the indexing (or subsequent trick play) does *not* work, then please put the file on a web server, and send a link to the file (*not* the file itself) to the mailing list, and we'll examine it to see if we can figure out the problem. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/
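As background for the SPS note above: in an Annex B H.264 elementary stream, the nal_unit_type is the low 5 bits of the first byte after each start code, and an SPS has type 7. The following self-contained sketch is hypothetical, not live555 code (the actual indexer operates on Transport Stream packets, not a raw byte stream), but it shows how NAL unit types can be listed as a rough diagnostic.

```cpp
#include <cstddef>
#include <vector>

// Hedged sketch (NOT live555 code): list the nal_unit_type of each NAL
// unit in an Annex B H.264 elementary stream.  The type is the low 5
// bits of the byte following each start code; an SPS (Sequence
// Parameter Set) is type 7, an IDR slice is type 5.
std::vector<int> nalUnitTypes(unsigned char const* data, std::size_t size) {
  std::vector<int> types;
  for (std::size_t i = 0; i + 3 < size; ++i) {
    // Match either a 3-byte (00 00 01) or 4-byte (00 00 00 01) start code:
    if (data[i] == 0 && data[i + 1] == 0 &&
        (data[i + 2] == 1 ||
         (data[i + 2] == 0 && i + 4 < size && data[i + 3] == 1))) {
      std::size_t hdr = (data[i + 2] == 1) ? i + 3 : i + 4;
      types.push_back(data[hdr] & 0x1F);  // low 5 bits = nal_unit_type
      i = hdr;  // resume scanning after this NAL header byte
    }
  }
  return types;
}
```

Checking that a type-7 entry appears shortly before each type-5 entry gives a quick hint as to whether a stream's video meets the SPS-before-I-frame expectation.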