From zhangmin_84 at 163.com Mon Dec 1 04:38:57 2008 From: zhangmin_84 at 163.com (zhangmin_84) Date: Mon, 1 Dec 2008 20:38:57 +0800 (CST) Subject: [Live-devel] Multicast session question In-Reply-To: References: Message-ID: <26413642.417561228135137567.JavaMail.coremail@bj163app93.163.com> Hi, Does anybody know how to play the RTP packets? I used live555 to receive the RTP packets. Thanks in advance. On 2008-12-01, "Yedidia Amit" wrote: Hi, Following my previous questions I want to create a situation where one source is sent to two different multicast address. For that I am using passiveMediaSubsession where I add both destination to the sink. I want to generate 2 sessions that whoever asks "describe" for the first will receive in the sdp the first multicast address, and whoever asks "describe" for the second session will receive in the sdp the second multicast address. 1. Is it possible? My problem is that passiveMediaSubsession->sdpLines() doesn't know who called it (which sessionID) 2. Which address does live return for a simple describe request? (as I can see it returns only one address). Regards, Amit Yedidia The information in this e-mail transmission contains proprietary and business sensitive information. Unauthorized interception of this e-mail may constitute a violation of law. If you are not the intended recipient, you are hereby notified that any review, dissemination, distribution or duplication of this communication is strictly prohibited. You are also asked to contact the sender by reply email and immediately destroy all copies of the original message. -------------- next part -------------- An HTML attachment was scrubbed... URL:
From amit.yedidia at elbitsystems.com Mon Dec 1 05:15:11 2008 From: amit.yedidia at elbitsystems.com (Yedidia Amit) Date: Mon, 1 Dec 2008 15:15:11 +0200 Subject: [Live-devel] Multicast session question In-Reply-To: <26413642.417561228135137567.JavaMail.coremail@bj163app93.163.com> Message-ID: You are on my thread - get one of your own :) For your question, you can check one of the included testXXXStreamer test programs. Regards, Amit Yedidia Elbit System Ltd. Email: amit.yedidia at elbitsystems.com Tel: 972-4-8318905 ---------------------------------------------------------- ________________________________ From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of zhangmin_84 Sent: Monday, December 01, 2008 2:39 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Multicast session question Hi, Does anybody know how to play the RTP packets? I used live555 to receive the RTP packets. Thanks in advance. On 2008-12-01, "Yedidia Amit" wrote: Hi, Following my previous questions I want to create a situation where one source is sent to two different multicast address. For that I am using passiveMediaSubsession where I add both destination to the sink. I want to generate 2 sessions that whoever asks "describe" for the first will receive in the sdp the first multicast address, and whoever asks "describe" for the second session will receive in the sdp the second multicast address. 1. Is it possible? My problem is that passiveMediaSubsession->sdpLines() doesn't know who called it (which sessionID) 2. Which address does live return for a simple describe request? (as I can see it returns only one address). Regards, Amit Yedidia The information in this e-mail transmission contains proprietary and business sensitive information. Unauthorized interception of this e-mail may constitute a violation of law.
If you are not the intended recipient, you are hereby notified that any review, dissemination, distribution or duplication of this communication is strictly prohibited. You are also asked to contact the sender by reply email and immediately destroy all copies of the original message. ________________________________ -------------- next part -------------- An HTML attachment was scrubbed... URL:
From mrcravens at verizon.net Mon Dec 1 10:33:52 2008 From: mrcravens at verizon.net (Mike Cravens) Date: Mon, 01 Dec 2008 12:33:52 -0600 Subject: [Live-devel] Recording an mp2 rtp stream for playback by Live555 media server Message-ID: <49342E10.6080205@verizon.net> Given video encoded as mpeg2 transported over RTP from a Vbrick appliance (model 9110-4200), we need to record the data as a series of mpeg2 TS chapter files of fixed duration that may subsequently be played back using the Live555 media server. We hope to support trick mode, as well. Before realizing that the Vbrick mp2 appliance does not apparently support RTSP (although their mpeg4 device does) we had thought to use openRTSP as a (nearly complete) starting point for recording. We did find a note on playing back H.263 directly from an RTP transported stream using vlc and a hand-generated sdp file:
v=0
o=- 7776 3 IN IP4 10.215.130.112
s=Test H263 stream
i=Parameters for the session streamed by "ChipsAhoyH263"
t=0 0
m=video 7776 RTP/AVP 98
a=rtpmap:98 H263-1998/90000
a=fmtp:98 profile=0; level=40
b=TIAS:2048000
Where 7776 is the port you are streaming out to from the above steps. Save as, say, c:\testH263.sdp. Start vlc like this: vlc -vvv file://c:\testH263.sdp Could we transcode the mp2 RTP stream using vlc or another program into a format openRTSP can record on the fly, or directly use vlc to do the recording? If so, what would be a good guess at parameters for the required sdp file? Or is there a more direct path that escapes us? Regards, Mike Cravens mrcravens at verizon.net -------------- next part -------------- An HTML attachment was scrubbed... URL:
From anto.rizzo at caramail.com Mon Dec 1 13:42:33 2008 From: anto.rizzo at caramail.com (Antonella Rizzo) Date: Mon, 1 Dec 2008 22:42:33 +0100 Subject: [Live-devel] Relayer Message-ID: <170219427399803@lycos-europe.com> Hi Ross, I am a relayer. I store RTP packets in two files (audio and video), so I relay on demand! If I work with QuickTime Broadcaster (h264 and mp4) it's OK! But if I generate the stream with testOnDemandRTSPServer (test.mpg), my application doesn't work! Audio is blocked after a few seconds! Why? Packets arrive regularly, but when relayed it is as if synchronization is lost! Regards
From finlayson at live555.com Mon Dec 1 16:40:02 2008 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 1 Dec 2008 16:40:02 -0800 Subject: [Live-devel] Multicast session question In-Reply-To: References: Message-ID: >Following my previous questions I want to create a situation where >one source is sent to two different multicast address.
> >For that I am using passiveMediaSubsession where I add both >destination to the sink. > >I want to generate 2 sessions that whoever asks "describe" for the >first will receive in the sdp the first multicast address, > >and whoever asks "describe" for the second session will receive in >the sdp the second multicast address. It sounds like you need your own "ServerMediaSubsession" subclass, similar to (but different from) "PassiveServerMediaSubsession". In particular, you will need to write your own implementation of the "getStreamParameters()" virtual function. Each call to this function can result in a different multicast address being used (if that's what you want). You can also use the "clientSessionId" parameter to distinguish between clients, so that your other virtual functions can know which client was given which multicast address. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Dec 1 17:05:49 2008 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 1 Dec 2008 17:05:49 -0800 Subject: [Live-devel] Recording an mp2 rtp stream for playback by Live555 media server In-Reply-To: <49342E10.6080205@verizon.net> References: <49342E10.6080205@verizon.net> Message-ID: >Given video encoded as mpeg2 transported over RTP from a Vbrick >appliance ( model 9110-4200), we need to record data as a series of >mpege2 TS chapter files of fixed duration that may subsequently be >played back using Live555 media server. We hope to support trick >mode, as well. > >Before realizing that the Vbrick mp2 appliance does not apparently >support RTSP ( although their mpeg4 device does) we had thought to >use OpenRTSP as a ( nearly complete) starting point for recording. If your server doesn't support RTSP, then there seems to be little point in using "openRTSP" as a model for your client, because RTSP is most of what "openRTSP" does. Instead, you can just use "testMPEG1or2VideoReceiver" as a model. Just use a "SimpleRTPSource" instead of a "MPEG1or2VideoRTPSource". See line 751 of "liveMedia/MediaSession.cpp" for an example of how to use this. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From amit.yedidia at elbitsystems.com Mon Dec 1 21:06:09 2008 From: amit.yedidia at elbitsystems.com (Yedidia Amit) Date: Tue, 2 Dec 2008 07:06:09 +0200 Subject: [Live-devel] Multicast session question In-Reply-To: Message-ID: Thank Ross. I was thinking the same way but got problem with the "describe". When calling sdpLine() no parameter is being passed ,which make it very difficult for distiguishing between different clients. I'm using multicast sessions which may cause some clients not to send "setup". any idea? Regards, Amit Yedidia Elbit System Ltd. Email: amit.yedidia at elbitsystems.com Tel: 972-4-8318905 ---------------------------------------------------------- ________________________________ From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Tuesday, December 02, 2008 2:40 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Multicast session question Following my previous questions I want to create a situation where one source is sent to two different multicast address. For that I am using passiveMediaSubsession where I add both destination to the sink. 
I want to generate 2 sessions that whoever asks "describe" for the first will receive in the sdp the first multicast address, and whoever asks "describe" for the second session will receive in the sdp the second multicast address. It sounds like you need your own "ServerMediaSubsession" subclass, similar to (but different from) "PassiveServerMediaSubsession". In particular, you will need to write your own implementation of the "getStreamParameters()" virtual function. Each call to this function can result in a different multicast address being used (if that's what you want). You can also use the "clientSessionId" parameter to distinguish between clients, so that your other virtual functions can know which client was given which multicast address. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ The information in this e-mail transmission contains proprietary and business sensitive information. Unauthorized interception of this e-mail may constitute a violation of law. If you are not the intended recipient, you are hereby notified that any review, dissemination, distribution or duplication of this communication is strictly prohibited. You are also asked to contact the sender by reply email and immediately destroy all copies of the original message. -------------- next part -------------- An HTML attachment was scrubbed... URL: From anitarajan at tataelxsi.co.in Mon Dec 1 18:48:28 2008 From: anitarajan at tataelxsi.co.in (Anita Rajan) Date: Tue, 2 Dec 2008 08:18:28 +0530 Subject: [Live-devel] HTTP Tunnelling Message-ID: <013a01c95428$75476cd0$6e033c0a@telxsi.com> Hi All, Its a generic question i have and i feel i will get a response here as most of u deal with streaming. I have extended the live555 RTSP server code to support HTTP Tunnelling. My test setup is as follows: I have setup the server in the WAN and the requesting client PC is in LAN. So the RTSP request is tunnelled through the firewall. I am streaming MPEG4 currently,but the fps i receive is just around 2fps. The resolution of the video is 1280X720. Is this the rate at which we usual receive a tunnelled request?.. Is there something i need to do to improve the performance? Is there anything i can do in the setup to increase the bitrate to around 10 Mbps(or framerate of 25-30 fps)?? Looking forward for a response.. Thanks and Regards, A.Rajan The information contained in this electronic message and any attachments to this message are intended for the exclusive use of the addressee(s) and may contain proprietary, confidential or privileged information. If you are not the intended recipient, you should not disseminate, distribute or copy this e-mail. Please notify the sender immediately and destroy all copies of this message and any attachments contained in it. From kollurisrinu at gmail.com Tue Dec 2 00:33:41 2008 From: kollurisrinu at gmail.com (srinivasa rao) Date: Tue, 2 Dec 2008 08:33:41 +0000 Subject: [Live-devel] How can I stream g711(ulaw PCM) audio via RTP? There is no demo application for this. Message-ID: <4da7f3da0812020033o732705a0i31d121303a6d637c@mail.gmail.com> hi, I am new to this Live 555.I have checked the test programs for to stream the g711 via RTP but there is no demo application for this.Can any body has developed the application for this,please send me the source code. Thanks in advance, srinu. 
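[A minimal illustrative sketch, not taken from the thread: the reply that follows suggests basing G.711 (u-law PCM) streaming on "testWAVAudioStreamer", using a "SimpleRTPSink". The multicast address, port, and file name below are placeholders, RTCP/RTSP setup and error handling are omitted, and the input file is assumed to already contain 8 kHz mono u-law samples.]

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"
#include "GroupsockHelper.hh"

int main(int /*argc*/, char** /*argv*/) {
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  // Placeholder destination; substitute your own multicast (or unicast) address and port:
  struct in_addr destAddr;
  destAddr.s_addr = our_inet_addr("239.255.42.42");
  Groupsock rtpGroupsock(*env, destAddr, Port(18888), 255/*TTL*/);

  // 8 kHz mono u-law PCM maps onto the static RTP payload type 0 ("PCMU"):
  RTPSink* audioSink = SimpleRTPSink::createNew(*env, &rtpGroupsock,
                                                0/*payload type*/, 8000/*timestamp frequency*/,
                                                "audio", "PCMU", 1/*channel*/);

  // Any FramedSource that delivers raw u-law samples will do; a u-law WAV file is one option:
  FramedSource* pcmSource = WAVAudioFileSource::createNew(*env, "ulaw8k.wav");
  if (pcmSource == NULL) { *env << "Unable to open the audio source\n"; return 1; }

  audioSink->startPlaying(*pcmSource, NULL, NULL);
  env->taskScheduler().doEventLoop(); // does not return
  return 0;
}

[A production version would, as in testWAVAudioStreamer, also create an RTCPInstance for the session and, if on-demand access is wanted, an RTSPServer.]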
From finlayson at live555.com Tue Dec 2 01:40:48 2008 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 2 Dec 2008 01:40:48 -0800 Subject: [Live-devel] How can I stream g711(ulaw PCM) audio via RTP? There is no demo application for this. In-Reply-To: <4da7f3da0812020033o732705a0i31d121303a6d637c@mail.gmail.com> References: <4da7f3da0812020033o732705a0i31d121303a6d637c@mail.gmail.com> Message-ID: > I am new to this Live 555.I have checked the test programs for to >stream the g711 via RTP but there is no demo application for this.Can >any body has developed the application for this,please send me the >source code. There's no demo application specifically for streaming uLaw PCM, but you can get hints from the "testWAVAudioStreamer" application. In particular, use a "SimpleRTPSink" to send RTP packets. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From kvmsanand at gmail.com Wed Dec 3 05:46:30 2008 From: kvmsanand at gmail.com (anand meher) Date: Wed, 3 Dec 2008 14:46:30 +0100 Subject: [Live-devel] problems with New FramedSource creation Message-ID: Hi everyone, I am developing software which is a multithreaded program and one thread requires the transport stream packets to be streamed from my data structures. I should also take care of mutual exclusion to the access of the data structure .I have created a new Device Source and implemented the doGetNextFrame() and deliverFrame() functions. the doGetNextFrame() function reads from the data strcutres and calls the deliver frame function. the deliverFrame function copies the data to fTo and then deletes the respective bytes from the data strucutre. //I have modelled the actual thread which streams along the lines of testMPEG2TransportStreamer.cpp in the play() function i do the following steps vectorSource = VectorDeviceSource::createNew(*env, g_packets_outputvector, g_packets_outputvectorlock,signal_vector); //where vectorSource is the new Device Source i have created. videoSource = MPEG2TransportStreamFramer::createNew(*env, vectorSource); //then i create a videoSource for Framer and i pass theVectorSource object. then i call videoSink->startPlaying(*videoSource, afterPlaying, videoSink); BUT my program some how does not work properly... firslty it gets stuck in the deliverFrame() function always ....and uses lot of CPU and causes the other threads running to slow down as well..... I read some of the posts of live555 and feared that may be the program is going to a infinite loop..so i tried using the watchVariable in the event loop , but even that did not work ..... 1) it would be great if some can help me in what i am missing ..... i can claify if something is missing ......... Regards Anand. -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Dec 3 05:57:21 2008 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 3 Dec 2008 05:57:21 -0800 Subject: [Live-devel] problems with New FramedSource creation In-Reply-To: References: Message-ID: > I am developing software which is a multithreaded >program and one thread requires the transport stream packets to be >streamed from my data structures. I should also take care of mutual >exclusion to the access of the data structure Read the FAQ!!!!! -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ From kvmsanand at gmail.com Wed Dec 3 07:35:23 2008 From: kvmsanand at gmail.com (anand meher) Date: Wed, 3 Dec 2008 16:35:23 +0100 Subject: [Live-devel] problems with New FramedSource creation In-Reply-To: References: Message-ID: Dear Ross, I am relatively new using the live555 libraries and sorry if my questions are too trivial. I read the FAQ especially the question which says how to modify test*Streamer application to read from a live audio or video. I wanted to modify testMPEG2TransportStreamer so that it can read from my buffers instead of a file. I have tried piping and using files as mentioned in the FAQ but i dont have cotrol over the syncronization. I intended to implement something like below.... NewDeviceSource (which reads from the buffer containing *TS ) --> TransportStreamFramer --> sink( a simple RTP sink) *TS = (MPEG2 Transport Stream 188 byte packets) One question which i have is whether i have to also write a new Filter for TransportStreamFramer . I already have the MPEG2 transport stream packets in my buffer so i thought i need not write a new filter but instead use the exising TransportStreamFramer. Also i have a doubt in the way i need to signal the arrival of new data to the event loop and i explain it below. I used the watchVariable of the doEventloop and when ever its set i call videoSink->startPlaying ( ).and i am not calling newDeviceSource->doGetNextFrame() because it needs to be framed by the TransportStreamFramer. I dont know wether calling videoSink->startPlaying() is the correct way of signalling the new data. i can clarify some of the points if its not clear. best regards Anand Meher. On Wed, Dec 3, 2008 at 2:46 PM, anand meher wrote: > Hi everyone, > I am developing software which is a multithreaded > program and one thread requires the transport stream packets to be streamed > from my data structures. I should also take care of mutual exclusion to the > access of the data structure .I have created a new Device Source and > implemented the doGetNextFrame() and deliverFrame() functions. > > the doGetNextFrame() function reads from the data strcutres and calls the > deliver frame function. > the deliverFrame function copies the data to fTo and then deletes the > respective bytes from the data strucutre. > > //I have modelled the actual thread which streams along the lines of > testMPEG2TransportStreamer.cpp > > > > > in the play() function i do the following steps > > vectorSource = VectorDeviceSource::createNew(*env, g_packets_outputvector, > g_packets_outputvectorlock,signal_vector); > > //where vectorSource is the new Device Source i have created. > > videoSource = MPEG2TransportStreamFramer::createNew(*env, vectorSource); > //then i create a videoSource for Framer and i pass theVectorSource > object. > > then i call > > videoSink->startPlaying(*videoSource, afterPlaying, videoSink); > > > > > BUT my program some how does not work properly... > > firslty it gets stuck in the deliverFrame() function always ....and uses > lot of CPU and causes the other threads running to slow down as well..... > > I read some of the posts of live555 and feared that may be the program is > going to a infinite loop..so i tried using the watchVariable in the event > loop , but even that did not work ..... > > > 1) it would be great if some can help me in what i am missing ..... > > > i can claify if something is missing ......... > > Regards > Anand. > > > > > > > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From amit.yedidia at elbitsystems.com Wed Dec 3 07:56:07 2008 From: amit.yedidia at elbitsystems.com (Yedidia Amit) Date: Wed, 3 Dec 2008 17:56:07 +0200 Subject: [Live-devel] problems with New FramedSource creation In-Reply-To: Message-ID: few things you must notice: 1.doGetNextFrame should NOT call deliver frame. it should only scheduled deliverFrame in the taskSchedulaer waiting on some socket or the watchVariable. 2. you call startPlaying only once!!! (when you want to start playing - it should not be used to notify new data. Regards, Amit Yedidia Elbit System Ltd. Email: amit.yedidia at elbitsystems.com Tel: 972-4-8318905 ---------------------------------------------------------- ________________________________ From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of anand meher Sent: Wednesday, December 03, 2008 5:35 PM To: live-devel at ns.live555.com Subject: Re: [Live-devel] problems with New FramedSource creation Dear Ross, I am relatively new using the live555 libraries and sorry if my questions are too trivial. I read the FAQ especially the question which says how to modify test*Streamer application to read from a live audio or video. I wanted to modify testMPEG2TransportStreamer so that it can read from my buffers instead of a file. I have tried piping and using files as mentioned in the FAQ but i dont have cotrol over the syncronization. I intended to implement something like below.... NewDeviceSource (which reads from the buffer containing *TS ) --> TransportStreamFramer --> sink( a simple RTP sink) *TS = (MPEG2 Transport Stream 188 byte packets) One question which i have is whether i have to also write a new Filter for TransportStreamFramer . I already have the MPEG2 transport stream packets in my buffer so i thought i need not write a new filter but instead use the exising TransportStreamFramer. Also i have a doubt in the way i need to signal the arrival of new data to the event loop and i explain it below. I used the watchVariable of the doEventloop and when ever its set i call videoSink->startPlaying ( ).and i am not calling newDeviceSource->doGetNextFrame() because it needs to be framed by the TransportStreamFramer. I dont know wether calling videoSink->startPlaying() is the correct way of signalling the new data. i can clarify some of the points if its not clear. best regards Anand Meher. On Wed, Dec 3, 2008 at 2:46 PM, anand meher wrote: Hi everyone, I am developing software which is a multithreaded program and one thread requires the transport stream packets to be streamed from my data structures. I should also take care of mutual exclusion to the access of the data structure .I have created a new Device Source and implemented the doGetNextFrame() and deliverFrame() functions. the doGetNextFrame() function reads from the data strcutres and calls the deliver frame function. the deliverFrame function copies the data to fTo and then deletes the respective bytes from the data strucutre. //I have modelled the actual thread which streams along the lines of testMPEG2TransportStreamer.cpp in the play() function i do the following steps vectorSource = VectorDeviceSource::createNew(*env, g_packets_outputvector, g_packets_outputvectorlock,signal_vector); //where vectorSource is the new Device Source i have created. videoSource = MPEG2TransportStreamFramer::createNew(*env, vectorSource); //then i create a videoSource for Framer and i pass theVectorSource object. 
then i call videoSink->startPlaying(*videoSource, afterPlaying, videoSink); BUT my program some how does not work properly... firslty it gets stuck in the deliverFrame() function always ....and uses lot of CPU and causes the other threads running to slow down as well..... I read some of the posts of live555 and feared that may be the program is going to a infinite loop..so i tried using the watchVariable in the event loop , but even that did not work ..... 1) it would be great if some can help me in what i am missing ..... i can claify if something is missing ......... Regards Anand. The information in this e-mail transmission contains proprietary and business sensitive information. Unauthorized interception of this e-mail may constitute a violation of law. If you are not the intended recipient, you are hereby notified that any review, dissemination, distribution or duplication of this communication is strictly prohibited. You are also asked to contact the sender by reply email and immediately destroy all copies of the original message. -------------- next part -------------- An HTML attachment was scrubbed... URL: From mrcravens at verizon.net Wed Dec 3 08:53:29 2008 From: mrcravens at verizon.net (Mike Cravens) Date: Wed, 03 Dec 2008 10:53:29 -0600 Subject: [Live-devel] Recording an mp2 rtp stream for playback by Live555 media In-Reply-To: References: Message-ID: <4936B989.1000806@verizon.net> Thanks for the specific suggestions, Ross. One question, though: Why SimpleRTPSource vs. MPEG1or2VideoRTPSource? We don't yet have the VBRICK mpeg2 appliance in hand, but I believe that it does put out mpeg2 encoded data, and may offer a multicast or unicast stream option. We should be able to find out more once we can command it using their web or SNMP interfaces, and and can look on the wire. It seems to have an option to do SAP, but i have heard that the VBRICK SAP is somewhat proprietary and is only intended for other vbrick devices. Does this mean we will need to hand code a local sdp file for each video source? BTW, OpenRTSP does support a rich set of options which just about meet our other requirements. Thanks for the example programs and the tips on which ones apply to our situation. Regards, Mike live-devel-request at ns.live555.com wrote: > Send live-devel mailing list submissions to > live-devel at lists.live555.com > > To subscribe or unsubscribe via the World Wide Web, visit > http://lists.live555.com/mailman/listinfo/live-devel > or, via email, send a message with subject or body 'help' to > live-devel-request at lists.live555.com > > You can reach the person managing the list at > live-devel-owner at lists.live555.com > > When replying, please edit your Subject line so it is more specific > than "Re: Contents of live-devel digest..." > > > Today's Topics: > > 1. Re: Recording an mp2 rtp stream for playback by Live555 media > server (Ross Finlayson) > 2. Re: Multicast session question (Yedidia Amit) > 3. HTTP Tunnelling (Anita Rajan) > 4. How can I stream g711(ulaw PCM) audio via RTP? There is no > demo application for this. (srinivasa rao) > 5. Re: How can I stream g711(ulaw PCM) audio via RTP? There is > no demo application for this. (Ross Finlayson) > 6. problems with New FramedSource creation (anand meher) > 7. Re: problems with New FramedSource creation (Ross Finlayson) > 8. 
Re: problems with New FramedSource creation (anand meher) > > > ---------------------------------------------------------------------- > > Message: 1 > Date: Mon, 1 Dec 2008 17:05:49 -0800 > From: Ross Finlayson > Subject: Re: [Live-devel] Recording an mp2 rtp stream for playback by > Live555 media server > To: LIVE555 Streaming Media - development & use > > Message-ID: > Content-Type: text/plain; charset="us-ascii" ; format="flowed" > > >> Given video encoded as mpeg2 transported over RTP from a Vbrick >> appliance ( model 9110-4200), we need to record data as a series of >> mpege2 TS chapter files of fixed duration that may subsequently be >> played back using Live555 media server. We hope to support trick >> mode, as well. >> >> Before realizing that the Vbrick mp2 appliance does not apparently >> support RTSP ( although their mpeg4 device does) we had thought to >> use OpenRTSP as a ( nearly complete) starting point for recording. >> > > If your server doesn't support RTSP, then there seems to be little > point in using "openRTSP" as a model for your client, because RTSP is > most of what "openRTSP" does. > > Instead, you can just use "testMPEG1or2VideoReceiver" as a model. > Just use a "SimpleRTPSource" instead of a "MPEG1or2VideoRTPSource". > See line 751 of "liveMedia/MediaSession.cpp" for an example of how to > use this. > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ggomand at info.fundp.ac.be Wed Dec 3 12:58:33 2008 From: ggomand at info.fundp.ac.be (ggomand) Date: Wed, 03 Dec 2008 21:58:33 +0100 Subject: [Live-devel] Jitter value with openRTSP In-Reply-To: References: <492BE066.7090703@info.fundp.ac.be> <492C22DB.2030703@info.fundp.ac.be> Message-ID: <4936F2F9.9020705@info.fundp.ac.be> Hello Ross, Thank you for your reply. I would like to stream another mp4 video file (which has a timestamp frequency of 90000Hz) to compare my results but I didn't manage to find that kind of files which can be handled by openRTSP (because of video codec). Would you have that kind of file or would you know where I could find one? Thank you, Gille Ross Finlayson wrote: >> I've seen (thanks to wireshark) in the SDP : "Media Attribute (a): >> rtpmap:96 MP4V-ES/5544" > > You don't need to use wireshark to see this - you will also see it in > the diagnostic output from "openRTSP". > > Therefore, the server is telling you that the RTP timestamp frequency is > 5544. This is very unusual, because a timestamp frequency of 90000 Hz > is recommended, and used by everyone else (for MPEG-4 Video), as far as > I know. > >> And my last question: >> "the value jitter that is given in the stats (QoS stats) is the mean >> jitter?" > > It's a weighted average, as described in RFC 3550, section A.8. From finlayson at live555.com Wed Dec 3 17:38:47 2008 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 3 Dec 2008 17:38:47 -0800 Subject: [Live-devel] Jitter value with openRTSP In-Reply-To: <4936F2F9.9020705@info.fundp.ac.be> References: <492BE066.7090703@info.fundp.ac.be> <492C22DB.2030703@info.fundp.ac.be> <4936F2F9.9020705@info.fundp.ac.be> Message-ID: >Hello Ross, > >Thank you for your reply. > >I would like to stream another mp4 video file (which has a timestamp >frequency of 90000Hz) to compare my results but I didn't manage to >find that kind of files which can be handled by openRTSP (because of >video codec). > >Would you have that kind of file or would you know where I could find one? 
http://www.live555.com/liveMedia/faq.html#m4e-file
From anto.rizzo at caramail.com Wed Dec 3 07:36:38 2008 From: anto.rizzo at caramail.com (Antonella Rizzo) Date: Wed, 3 Dec 2008 16:36:38 +0100 Subject: [Live-devel] Relayer and QoS Message-ID: <183510174317948@lycos-europe.com> Hi Ross, I want to do QoS on my relayer, as in playCommon.cpp; are there code examples? I want to print QoS output as openRTSP does. Regards
From finlayson at live555.com Wed Dec 3 18:19:33 2008 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 3 Dec 2008 18:19:33 -0800 Subject: [Live-devel] problems with New FramedSource creation In-Reply-To: References: Message-ID: > I am relatively new using the live555 libraries and sorry if >my questions are too trivial. I read the FAQ especially the question >which says how to modify test*Streamer application to read from a >live audio or video. No, I meant: Read the FAQ entry about threads. If you want to use multiple threads in an application that uses our software, then you really need to know what you're doing. (The fact that you're using a "@gmail.com" email address is strong evidence that you don't.) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/
From finlayson at live555.com Wed Dec 3 18:24:11 2008 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 3 Dec 2008 18:24:11 -0800 Subject: [Live-devel] Recording an mp2 rtp stream for playback by Live555 media In-Reply-To: <4936B989.1000806@verizon.net> References: <4936B989.1000806@verizon.net> Message-ID: >One question, though: Why SimpleRTPSource vs. MPEG1or2VideoRTPSource? Because "MPEG1or2VideoRTPSource" is specifically for receiving a MPEG-1 or 2 Video Elementary Stream RTP source. This uses a different RTP payload format than MPEG Transport Stream RTP sources. >It seems to have an option to do SAP, but i have heard that the >VBRICK SAP is somewhat proprietary and is only intended for other >vbrick devices. Does this mean we will need to hand code a local >sdp file for each video source? If you are receiving a MPEG Transport Stream *multicast* RTP stream, then you won't need a SDP description, because you should already know the RTP payload format, the IP multicast address, and the RTP port numbers (even for RTP; odd for RTCP). If, however, you want to receive a *unicast* stream, then you probably need RTSP. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/
From finlayson at live555.com Wed Dec 3 18:24:47 2008 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 3 Dec 2008 18:24:47 -0800 Subject: [Live-devel] Relayer and QoS In-Reply-To: <183510174317948@lycos-europe.com> References: <183510174317948@lycos-europe.com> Message-ID: >I want to do QoS on my relayer, as in playCommon.cpp; >are there code examples? I want to print QoS output as openRTSP does. Yes - just look at how "openRTSP" implements the "-Q" option (in "playCommon.cpp"). -- Ross Finlayson Live Networks, Inc. http://www.live555.com/
From feiliu at shdv.com Wed Dec 3 21:31:36 2008 From: feiliu at shdv.com (feiliu at shdv.com) Date: Thu, 4 Dec 2008 13:31:36 +0800 Subject: [Live-devel] Help me to solve a trouble! Message-ID: <97B3021581F4EE48BB721532D21D71E4E7C148@dvsvr06.dvtech.com> When I built the code on LINUX, I got the Makefile by running ./genMakefiles solaris, but when make was run an error came forth when the compiler went into /live/testProgs and compiled testMP3Streamer. The error is described as follows: /usr/bin/ld: cannot find -lsocket. In fact, there is no directory named ld in /usr/bin on my computer. What's wrong?
Really appreciate if someone will write back to me as quickly as possible! -------------- next part -------------- An HTML attachment was scrubbed... URL:
From guiluge at gmail.com Wed Dec 3 22:07:42 2008 From: guiluge at gmail.com (Guillaume Ferry) Date: Thu, 4 Dec 2008 07:07:42 +0100 Subject: [Live-devel] Help me to solve a trouble! In-Reply-To: <97B3021581F4EE48BB721532D21D71E4E7C148@dvsvr06.dvtech.com> References: <97B3021581F4EE48BB721532D21D71E4E7C148@dvsvr06.dvtech.com> Message-ID: <425f13530812032207k1f1a5f6du9414c0387d8953ea@mail.gmail.com> Why don't you use ./genMakefiles linux ? It may well solve your dependency issue. Guillaume. By the way, ld is not a directory, it's an executable (man ld sure helps :) ) 2008/12/4 > When I built the code on LINUX, I got the Makefile by running > ./genMakefiles solaris, but when make was run an error came forth when the > compiler went into /live/testProgs and compiled testMP3Streamer. The error > is described as follows: > > /usr/bin/ld: cannot find -lsocket. > > In fact, there is no directory named ld in /usr/bin on my computer. What's > wrong? Really appreciate if someone will write back to me as quickly as > possible! > > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -------------- next part -------------- An HTML attachment was scrubbed... URL:
From feiliu at shdv.com Wed Dec 3 22:38:57 2008 From: feiliu at shdv.com (feiliu at shdv.com) Date: Thu, 4 Dec 2008 14:38:57 +0800 Subject: [Live-devel] Help me to solve a trouble! Message-ID: <97B3021581F4EE48BB721532D21D71E4E7C158@dvsvr06.dvtech.com> Oh, great! It's OK! But when I ran ./genMakefiles linux in the morning it didn't work. By the way, I am a rookie in this area, please give me more suggestions in the future! Thank you very much! ________________________________ From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Guillaume Ferry Sent: Thursday, December 04, 2008 14:08 To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Help me to solve a trouble! Why don't you use ./genMakefiles linux ? It may well solve your dependency issue. Guillaume. By the way, ld is not a directory, it's an executable (man ld sure helps :) ) 2008/12/4 When I built the code on LINUX, I got the Makefile by running ./genMakefiles solaris, but when make was run an error came forth when the compiler went into /live/testProgs and compiled testMP3Streamer. The error is described as follows: /usr/bin/ld: cannot find -lsocket. In fact, there is no directory named ld in /usr/bin on my computer. What's wrong? Really appreciate if someone will write back to me as quickly as possible! _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL:
From domentarion at live.nl Wed Dec 3 23:19:12 2008 From: domentarion at live.nl (Sander den Breejen) Date: Thu, 4 Dec 2008 07:19:12 +0000 Subject: [Live-devel] Running Live555 Media Server Fedora 10 Message-ID: Hello, Is there a way to use the live555 media server on a Fedora 10 server? How can I do that? Sander _________________________________________________________________ The best online videos are on MSN Video! http://video.msn.com/video.aspx?mkt=nl-nl -------------- next part -------------- An HTML attachment was scrubbed...
URL:
From feiliu at shdv.com Thu Dec 4 00:28:04 2008 From: feiliu at shdv.com (feiliu at shdv.com) Date: Thu, 4 Dec 2008 16:28:04 +0800 Subject: [Live-devel] ask for help Message-ID: <97B3021581F4EE48BB721532D21D71E4E7C1EA@dvsvr06.dvtech.com> Hi all, I'm trying to use testMPEG1or2VideoStreamer to stream a MPEG Program Stream and I have changed testMPEG1or2VideoStreamer.cpp correspondingly, but I got the error: "Saw unexpected code 0x1c0". Anyone who knows this please help me! Regards, Faye -------------- next part -------------- An HTML attachment was scrubbed... URL:
From finlayson at live555.com Thu Dec 4 00:34:13 2008 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 4 Dec 2008 00:34:13 -0800 Subject: [Live-devel] ask for help In-Reply-To: <97B3021581F4EE48BB721532D21D71E4E7C1EA@dvsvr06.dvtech.com> References: <97B3021581F4EE48BB721532D21D71E4E7C1EA@dvsvr06.dvtech.com> Message-ID: >I'm trying to use testMPEG1or2VideoStreamer to stream a MPEG Program >Stream and I have changed testMPEG1or2VideoStreamer.cpp >correspondingly What happens if you just use "testMPEG1or2AudioVideoStreamer" (unmodified), which is intended specifically for streaming a MPEG Program Stream? Ross. -------------- next part -------------- An HTML attachment was scrubbed... URL:
From feiliu at shdv.com Thu Dec 4 01:17:16 2008 From: feiliu at shdv.com (feiliu at shdv.com) Date: Thu, 4 Dec 2008 17:17:16 +0800 Subject: [Live-devel] Re: ask for help Message-ID: <97B3021581F4EE48BB721532D21D71E4E7C205@dvsvr06.dvtech.com> Hi Ross, Sorry, I made a mistake just now. The video I want to stream is an elementary stream, but when I use the unmodified testMPEG1or2VideoStreamer or testMPEG1or2AudioVideoStreamer the same error still exists. ________________________________ From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Thursday, December 04, 2008 16:34 To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] ask for help I'm trying to use testMPEG1or2VideoStreamer to stream a MPEG Program Stream and I have changed testMPEG1or2VideoStreamer.cpp correspondingly What happens if you just use "testMPEG1or2AudioVideoStreamer" (unmodified), which is intended specifically for streaming a MPEG Program Stream? Ross. -------------- next part -------------- An HTML attachment was scrubbed... URL:
From finlayson at live555.com Thu Dec 4 01:33:00 2008 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 4 Dec 2008 01:33:00 -0800 Subject: [Live-devel] Re: ask for help In-Reply-To: <97B3021581F4EE48BB721532D21D71E4E7C205@dvsvr06.dvtech.com> References: <97B3021581F4EE48BB721532D21D71E4E7C205@dvsvr06.dvtech.com> Message-ID: >Sorry, I made a mistake just now. The video I want to stream is an >elementary stream, but when I use the unmodified >testMPEG1or2VideoStreamer or testMPEG1or2AudioVideoStreamer the >same error still exists. See "If you're sure that your file is of the correct type, then please put it on a publically-accessible web (or FTP) server, and post the URL (not the file itself) to the "live-devel" mailing list, and we'll take a look at it, to see if we can figure out what's wrong. " -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed...
URL: From kris.delbarba at lextech.com Thu Dec 4 16:28:19 2008 From: kris.delbarba at lextech.com (Kris DelBarba) Date: Thu, 4 Dec 2008 18:28:19 -0600 Subject: [Live-devel] openRTSP Message-ID: <0E675338-EF67-4CBC-A56F-3AB028A3C77D@lextech.com> Is there anyone on here who's used the openRTSP libraries in a project to pull down a motion JPEG stream? I'm looking to use the libraries in my project and looking for suggestions on where to start. Currently my project does motion JPEG over HTTP and I'm looking to implement RTP/RTSP. Thanks, Kris DelBarba From venugopalpaikr at tataelxsi.co.in Fri Dec 5 01:01:26 2008 From: venugopalpaikr at tataelxsi.co.in (venugopalpaikr) Date: Fri, 5 Dec 2008 14:31:26 +0530 Subject: [Live-devel] timeout after 1 min Message-ID: <005201c956b8$0e7ca230$3c033c0a@telxsi.com> Hi everybody, We have extended the Live555 code to support HTTP tunneling and we have been able to succesfully stream data. But exactly after a minute the server terminates the connection. We have been trying to analyze the prob but in vain. -> We have set the reclamation time to 0 seconds to avoid timeout. Yet this happens. We increased it to 30 sec nd noticed that the server times out after 30 sec but if we increase it greater than 60 the server still times out at 60 sec. Is there any solution to avoid this? As tunneling uses HTTP, is the timeout happening bcoz the default timeout in tcp stack is 60 sec? If so where in the code do we change to avoid this. I have to set the socket to nonblocking mode. Please help. Thanks. The information contained in this electronic message and any attachments to this message are intended for the exclusive use of the addressee(s) and may contain proprietary, confidential or privileged information. If you are not the intended recipient, you should not disseminate, distribute or copy this e-mail. Please notify the sender immediately and destroy all copies of this message and any attachments contained in it. From RGlobisch at csir.co.za Fri Dec 5 03:48:12 2008 From: RGlobisch at csir.co.za (Ralf Globisch) Date: Fri, 05 Dec 2008 13:48:12 +0200 Subject: [Live-devel] Live555/DirectShow Source Filter for PCM audio: source code Message-ID: <4939311B.5DA9.004D.0@csir.co.za> I've just implemented a very simple DirectShow RTSP Source filter using the live555 library which can receive 8/16 bit PCM from an RTSP server at various sampling rates. I'm hoping to add other media types such as mp3 at a later stage if time allows it. I've seen a couple of queries relating to the live555 and DirectShow integration on this mailing list and the source code could prove to be helpful to someone that is new to either or both frameworks. You can download the source code at: http://wirelessafrica.meraka.org.za/wiki/index.php/Real-Time_Video_Coding I've tested it using Windows Media Player with content (a wav file ) streamed from the liveMedia RTSP server. Hope it helps someone.... Ross, if you read this: thanks for an awesome library! Ralf -- This message is subject to the CSIR's copyright terms and conditions, e-mail legal notice, and implemented Open Document Format (ODF) standard. The full disclaimer details can be found at http://www.csir.co.za/disclaimer.html. This message has been scanned for viruses and dangerous content by MailScanner, and is believed to be clean. MailScanner thanks Transtec Computers for their support. From Stich_Jacob_G at cat.com Fri Dec 5 07:18:14 2008 From: Stich_Jacob_G at cat.com (Jacob G. 
Stich) Date: Fri, 5 Dec 2008 09:18:14 -0600 Subject: [Live-devel] Fw: Trouble with openRTSP Message-ID: Hello, I've compiled and installed the live555 library and openRTSP client test program. I am trying to get the client program to save a real-time RTSP stream to a data file. I have tried all the options I could think of for running the application, but I just can't figure out why the resultant file won't play on my PC. I'm running it on OpenSUSE 10.3. I have passed the following options to the program ./openRTSP -w 704 -h 576 -f 25 rtsp://192.168.0.62 When I run it,....a file appears in the same directory called video-MP4V-ES-1 and there aren't any errors returned on the console. I have attached the console output up to the point where its reading the stream. After I kill the program by going to another terminal and running kill -HUP 8472 (or whatever the process id is), the program terminates successfully and the teardown completes,...but I can't figure out how to replay the video file that was created. The RTSP stream is a MPEG4 stream, so I tried playing it as is and after adding a .mp4 file extension, but it won't play in VLC media player or Kaffeine. Can anyone help me out? I have also attaced the video file in case there's some information you can gleam from it? Thanks, Jacob Stich Caterpillar Electronics and Controls - T&SD Tech Center - E 14009 Old Galena Rd Mossville, IL 61552 USA Office: 309-675-1282 Email: stich_jacob_g at cat.com -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: video-MP4V-ES-1 Type: application/octet-stream Size: 47 bytes Desc: not available URL: -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: openRTSP_output.txt URL: From sebastien-devel at celeos.eu Fri Dec 5 07:41:07 2008 From: sebastien-devel at celeos.eu (=?iso-8859-1?b?U+liYXN0aWVu?= Escudier) Date: Fri, 05 Dec 2008 16:41:07 +0100 Subject: [Live-devel] Fw: Trouble with openRTSP In-Reply-To: References: Message-ID: <1228491667.49394b9314af0@imp.celeos.eu> Quoting "Jacob G. Stich" : > but I can't figure out how to > replay the video file that was created. The RTSP stream is a MPEG4 stream, > so I tried playing it as is and after adding a .mp4 file extension, but it > won't play in VLC media player or Kaffeine. Can anyone help me out? I have > also attaced the video file in case there's some information you can gleam > from it? Your file is 47 bytes, how big is your file in your disk ? You should try ffplay to play your file. From renatomauro at libero.it Fri Dec 5 10:48:37 2008 From: renatomauro at libero.it (Renato MAURO (Libero)) Date: Fri, 5 Dec 2008 19:48:37 +0100 Subject: [Live-devel] reclamationTestSeconds in TCP References: <1228491667.49394b9314af0@imp.celeos.eu> Message-ID: <65C81D6CF17E41D59C97280D010F6C1F@CSystemDev> Hi Ross, hi all. I have an RTSPServer dealing with RTP_UDP clients and reclamationTestSeconds set to 45s. RTSPClient-s say they are alive by sending a GET_PARAMETER command every 20s. Everything works fine. Now I'd like to use the same server with RTP_TCP clients. I see that they send the GET_PARAMETER command, nevertheless the server deletes the stream after reclamationTestSeconds seconds. 1) Is right using reclamationTestSeconds with RTP_TCP clients? If yes, where is my mistake? 2) If not, how can I manage reclamationTestSeconds since I have RTP_TCP and RTP_UDP clients on the same server at the same time? 
Thank you very much, Renato MAURO From peeyushduttamishra at gmail.com Sat Dec 6 00:27:22 2008 From: peeyushduttamishra at gmail.com (Peeyush Mishra) Date: Sat, 6 Dec 2008 13:57:22 +0530 Subject: [Live-devel] Media Server behind Web server implementation. Message-ID: Hi I have successfully create a video streamer with live555 , it gives RTSP URL to play the data over network . I want to access this stream through web client . Can any one tell me how can I connect my media server with web server , so that It can handle web request and play media on web browser. If any one has any link or any idea : How to put my media server behind web server . -- Thanks Peeyush Mishra -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sat Dec 6 00:46:36 2008 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 6 Dec 2008 00:46:36 -0800 Subject: [Live-devel] reclamationTestSeconds in TCP In-Reply-To: <65C81D6CF17E41D59C97280D010F6C1F@CSystemDev> References: <1228491667.49394b9314af0@imp.celeos.eu> <65C81D6CF17E41D59C97280D010F6C1F@CSystemDev> Message-ID: > I have an RTSPServer dealing with RTP_UDP clients and >reclamationTestSeconds set to 45s. > RTSPClient-s say they are alive by sending a GET_PARAMETER >command every 20s. > Everything works fine. > > Now I'd like to use the same server with RTP_TCP clients. I see >that they send the GET_PARAMETER command, nevertheless the server >deletes the stream after reclamationTestSeconds seconds. > > 1) Is right using reclamationTestSeconds with RTP_TCP clients? If >yes, where is my mistake? It's not your mistake. There is a known problem with the current "LIVE555 Streaming Media" code: If RTP-over-TCP streaming is used, then incoming RTSP commands (after the initial "PLAY") will not be recognized by the server. In the meantime, if you want to use RTP-over-TCP streaming with our server, then you should *not* send RTSP commands as 'keep alive' indicators. Instead, you *must* send RTCP "RR" packets. Note that RTCP is a *mandatory* part of the RTP specification, so if your client does not send RTCP "RR" packets, then it is not in compliance with the standard, and you cannot expect a standards-compliant server to work with it. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Sat Dec 6 01:56:13 2008 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 6 Dec 2008 01:56:13 -0800 Subject: [Live-devel] Fw: Trouble with openRTSP In-Reply-To: <1228491667.49394b9314af0@imp.celeos.eu> References: <1228491667.49394b9314af0@imp.celeos.eu> Message-ID: >Your file is 47 bytes Yes, that's very strange. If his file had been zero-length, then I would have known what to say. If his file had been a proper size, then I would also have known what to say. What happens when you (attempt to) play the RTSP/RTP stream (from the "rtsp://" URL), using VLC or QuickTime Player? Does this work? -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From venugopalpaikr at tataelxsi.co.in Sun Dec 7 21:51:20 2008 From: venugopalpaikr at tataelxsi.co.in (venugopalpaikr) Date: Mon, 8 Dec 2008 11:21:20 +0530 Subject: [Live-devel] not able to respond to SET_PARAMETER Message-ID: <003e01c958f8$ff3002c0$3c033c0a@telxsi.com> Hi, I am modifying Live555 to respond to Windows Media Player. While trying to establish the session WMP sends SET_PARAMETER. 
The actual response to SET_PARAMETER should be RTSP/1.0 200 OK Content-Type: application/x-rtsp-udp-packetpair;charset=UTF-8 Content-Length: 43 Date: Mon, 01 Dec 2008 05:28:27 GMT CSeq: 3 Session: 13114845032829158031;timeout=60 Server: WMServer/9.5.6001.18000 type: high-entropy-packetpair variable-size* The end of message should be at d asterisk sign as shown above. I have written the response for SET_PARAMETER as shown : void RTSPServer::RTSPClientSession::Call_SET_PARAMETER(char const* cseq) { unsigned char ResponseBuffer[10000]; snprintf((char*)ResponseBuffer,sizeof ResponseBuffer,"RTSP/1.0 200 OK\r\nContent-Type: application/x-rtsp-udp-packetpair;charset=UTF-8\r\n%sCSeq: %s\r\nSession: %d;timeout=60\r\nServer: WMServer/9.5.6001.18000\r\n\r\n",dateHeader(),cseq, fOurSessionId); memset(fResponseBuffer,0,RTSP_BUFFER_SIZE); send(fClientSocket,(char const*)ResponseBuffer, strlen((char*)ResponseBuffer), 0); snprintf((char*)ResponseBuffer,sizeof ResponseBuffer,"type: high-entropy-packetpair variable-size\r\n"); send(fClientSocket, (char const*)ResponseBuffer, strlen((char*)ResponseBuffer), 0); memset(fResponseBuffer,0,RTSP_BUFFER_SIZE); } the response is as shown below RTSP/1.0 200 OK Content-Type: application/x-rtsp-udp-packetpair;charset=UTF-8 Date: Fri, Jan 02 1970 22:53:22 GMT CSeq: 3 Session: 1;timeout=60 Server: WMServer/9.5.6001.18000 type: high-entropy-packetpair variable-size * the message ends at the asterisk sign which i don't want. Am not able to replicate the response properly. How should i modify the code? if i don't use "\r\n\r\n" the response gets appended to the Teardown response and is sent along with it when a Teardown request is received. REGARDS, Venu The information contained in this electronic message and any attachments to this message are intended for the exclusive use of the addressee(s) and may contain proprietary, confidential or privileged information. If you are not the intended recipient, you should not disseminate, distribute or copy this e-mail. Please notify the sender immediately and destroy all copies of this message and any attachments contained in it. From cbitsunil at gmail.com Mon Dec 8 01:11:40 2008 From: cbitsunil at gmail.com (sunil sunil) Date: Mon, 8 Dec 2008 14:41:40 +0530 Subject: [Live-devel] MPEG2 TS file with multiple streams Message-ID: Hi, If a single MPEG2 TS file contains multiple streams in it, then how live media streamer behaves? I think it will try to play the file as it is. But If I play the same TS file via VLC, it will stream only the first stream in the TS file. Can I do the same with Live media server?? Please clarify. Thanks and Regards, Sunil. -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Dec 8 01:30:30 2008 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 8 Dec 2008 01:30:30 -0800 Subject: [Live-devel] MPEG2 TS file with multiple streams In-Reply-To: References: Message-ID: > If a single MPEG2 TS file contains multiple streams in it, then how >live media streamer behaves? >I think it will try to play the file as it is. Yes. The MPEG Transport Stream file will be streamed in its entirety. >But If I play the same TS file via VLC, it will stream only the >first stream in the TS file. >Can I do the same with Live media server?? No. You would first need to (somehow) create a new Transport Stream file that contains just the stream that you want to send, and then stream that file. -- Ross Finlayson Live Networks, Inc. 
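[An illustrative sketch, not taken from the thread: converting a Unix-epoch struct timeval, as delivered to afterGettingFrame(), into .NET DateTime ticks is plain arithmetic. .NET ticks are 100-nanosecond units counted from 0001-01-01 00:00:00 UTC, and the span from 0001-01-01 to 1970-01-01 is 62,135,596,800 seconds. Note also that live555 presentation times are only guaranteed to be synchronised, absolute times once RTCP report synchronisation has occurred, which is one common cause of a sudden jump in their values.]

// Hypothetical helper: convert a Unix-epoch struct timeval to .NET DateTime ticks.
// On POSIX systems struct timeval comes from <sys/time.h>; on Windows it is provided by winsock2.h.
#include <stdint.h>
#include <sys/time.h>

int64_t timevalToDotNetTicks(struct timeval const& tv) {
  const int64_t secondsFromYear1To1970 = 62135596800LL; // 0001-01-01 .. 1970-01-01
  const int64_t ticksPerSecond = 10000000LL;            // one tick = 100 ns
  return (secondsFromYear1To1970 + (int64_t)tv.tv_sec) * ticksPerSecond
         + (int64_t)tv.tv_usec * 10;
}
// On the .NET side, the resulting value can be passed to the DateTime(long ticks) constructor.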
http://www.live555.com/ From paul at packetship.com Mon Dec 8 02:18:06 2008 From: paul at packetship.com (Paul Clark) Date: Mon, 08 Dec 2008 10:18:06 +0000 Subject: [Live-devel] not able to respond to SET_PARAMETER In-Reply-To: <003e01c958f8$ff3002c0$3c033c0a@telxsi.com> References: <003e01c958f8$ff3002c0$3c033c0a@telxsi.com> Message-ID: <493CF45E.3090208@packetship.com> venugopalpaikr wrote: > Hi, > I am modifying Live555 to respond to Windows Media Player. While trying > to establish the session WMP sends SET_PARAMETER. The actual response to > SET_PARAMETER should be > > RTSP/1.0 200 OK > Content-Type: application/x-rtsp-udp-packetpair;charset=UTF-8 > Content-Length: 43 > Date: Mon, 01 Dec 2008 05:28:27 GMT > CSeq: 3 > Session: 13114845032829158031;timeout=60 > Server: WMServer/9.5.6001.18000 > > type: high-entropy-packetpair variable-size* > > The end of message should be at d asterisk sign as shown above. > Do you mean it shouldn't have a CRLF on the end? That seems mighty fussy of the client! > I have written the response for SET_PARAMETER as shown : > > void RTSPServer::RTSPClientSession::Call_SET_PARAMETER(char const* cseq) { > > unsigned char ResponseBuffer[10000]; > > snprintf((char*)ResponseBuffer,sizeof ResponseBuffer,"RTSP/1.0 200 > OK\r\nContent-Type: > application/x-rtsp-udp-packetpair;charset=UTF-8\r\n%sCSeq: %s\r\nSession: > %d;timeout=60\r\nServer: WMServer/9.5.6001.18000\r\n\r\n",dateHeader(),cseq, > fOurSessionId); > This "\r\n\r\n" is correct - the blank line splits the header section from the response body. > memset(fResponseBuffer,0,RTSP_BUFFER_SIZE); > > send(fClientSocket,(char const*)ResponseBuffer, > strlen((char*)ResponseBuffer), 0); > > snprintf((char*)ResponseBuffer,sizeof ResponseBuffer,"type: > high-entropy-packetpair variable-size\r\n"); > (*) You're adding a CRLF here. > send(fClientSocket, (char const*)ResponseBuffer, > strlen((char*)ResponseBuffer), 0); > > memset(fResponseBuffer,0,RTSP_BUFFER_SIZE); > Not sure why you are resetting this buffer twice (or at all, actually!). > } > > the response is as shown below > RTSP/1.0 200 OK > Content-Type: application/x-rtsp-udp-packetpair;charset=UTF-8 > Date: Fri, Jan 02 1970 22:53:22 GMT > CSeq: 3 > Session: 1;timeout=60 > Server: WMServer/9.5.6001.18000 > > type: high-entropy-packetpair variable-size > > * > the message ends at the asterisk sign which i don't want. Am not able to > replicate the response properly. How should i modify the code? if i don't > use "\r\n\r\n" the response gets appended to the Teardown response and is > sent along with it when a Teardown request is received. > You're explicitly sending the final CRLF at (*) above; do you think you're getting another one as well? I can't see how not sending the double CRLF earlier could affect what the server sends back, but I guess WMP's response parser might get confused if it doesn't see the end of headers - where are you tracing this output? Regards Paul -- Paul Clark Packet Ship Technologies Limited http://www.packetship.com From mrcravens at verizon.net Mon Dec 8 14:20:07 2008 From: mrcravens at verizon.net (Mike Cravens) Date: Mon, 08 Dec 2008 16:20:07 -0600 Subject: [Live-devel] How to invoke second ( and subsequent) streams on live555 media server? Message-ID: <493D9D97.9020707@verizon.net> So, when live555 Media Server starts up, it prints out a url of the form *rtsp://serverip:serverport.* Running a media client, I can start up a unicast stream by providing a url of the form *rtsp:/serverip:serverport/file_to_be streamed. 
* Now, when I want to run a second stream, say of the same file delayed by 30 seconds, is that allowed? *( actually, vlc provides a second panel option for the port number, which does not appear selectable/changeable)* What port(s) will the server use for control and data, relative to the first port? My current vm simulation has limited file and network bandwidth, so I can't be sure, but it looks like things wedge pretty tightly when I attempt to start up a second stream from a second instance of vlc on a second (virtual) machine. Should this work ( with the same or different files)? The final client will be a custom one, but I'm testing with vlc. Regards, Mike Cravens mrcravens at gte.net (972) 693-7799 From finlayson at live555.com Mon Dec 8 17:20:25 2008 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 8 Dec 2008 17:20:25 -0800 Subject: [Live-devel] How to invoke second ( and subsequent) streams on live555 media server? In-Reply-To: <493D9D97.9020707@verizon.net> References: <493D9D97.9020707@verizon.net> Message-ID: >So, when live555 Media Server starts up, it prints out a url of the form >*rtsp://serverip:serverport.* > >Running a media client, I can start up a unicast stream by providing a >url of the form *rtsp:/serverip:serverport/file_to_be streamed. >* >Now, when I want to run a second stream, say of the same file delayed by >30 seconds, is that allowed? From the server's point of view, yes. Our server can handle multiple requests for the same, or different, streams, at arbitrary times. Most RTSP *client* applications, however, do not support playing more than one stream. Although VLC is not our software, I'm fairly sure that that is true for VLC (when used as a RTSP client). Therefore, if you want to use VLC to play more than one "rtsp://" URL (or the same "rtsp://" URL at different times), then you will need to run more than one copy of the VLC application. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From anto.rizzo at caramail.com Tue Dec 9 08:01:54 2008 From: anto.rizzo at caramail.com (Antonella Rizzo) Date: Tue, 9 Dec 2008 17:01:54 +0100 Subject: [Live-devel] Simulate Packet loss Message-ID: <191918192116235@lycos-europe.com> Hi Ross, in MultiFramedRTPSource.cpp there is: #ifdef TEST_LOSS ??? source->setPacketReorderingThresholdTime(0); ?????? // don't wait for 'lost' packets to arrive out-of-order later ??? if ((our_random()%10) == 0) break; // simulate 10% packet loss #endif I do not understand what it means simulate 10% packet loss, because analyzing the QoS I see that the loss of packages below the value that you set here. Why? Regards From finlayson at live555.com Tue Dec 9 18:17:43 2008 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 9 Dec 2008 18:17:43 -0800 Subject: [Live-devel] Simulate Packet loss In-Reply-To: <191918192116235@lycos-europe.com> References: <191918192116235@lycos-europe.com> Message-ID: >Hi Ross, >in MultiFramedRTPSource.cpp there is: >#ifdef TEST_LOSS > source->setPacketReorderingThresholdTime(0); > // don't wait for 'lost' packets to arrive out-of-order later > if ((our_random()%10) == 0) break; // simulate 10% packet loss >#endif > >I do not understand what it means simulate 10% packet loss What part of this code don't you understand? This code, when enabled (note that by default, it's *not* enabled) randomly drops each packet, with probabilty 1/10. -- Ross Finlayson Live Networks, Inc. 
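(A small addition here, since the original question was about measured loss coming out below 10%: the test drops each arriving packet independently with probability 1/10, so the rate observed over any finite window only fluctuates around 10% rather than equalling it, and if TEST_LOSS was never defined when liveMedia was compiled the block above is not built in at all, in which case the QoS figures reflect nothing but real network loss. A quick stand-alone check of the drop rule, using rand() in place of the library's our_random() purely for illustration:)

#include <cstdio>
#include <cstdlib>

// The TEST_LOSS rule "(random % 10) == 0" drops about 10% of packets on average,
// but any finite window fluctuates around that value.
int main() {
  int const kWindow = 1000;
  for (int trial = 0; trial < 5; ++trial) {
    int dropped = 0;
    for (int i = 0; i < kWindow; ++i)
      if ((std::rand() % 10) == 0) ++dropped;  // same test as in the TEST_LOSS code
    std::printf("window %d: %.1f%% dropped\n", trial, 100.0 * dropped / kWindow);
  }
  return 0;
}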
http://www.live555.com/ From microqq001 at gmail.com Thu Dec 11 00:05:28 2008 From: microqq001 at gmail.com (qiqi z) Date: Thu, 11 Dec 2008 16:05:28 +0800 Subject: [Live-devel] H264 RTP Streaming: A Tutorial Message-ID: <78b5e57e0812110005y3031c31dp2560721fc43a2f17@mail.gmail.com> HI, The H264 RTP Streaming Tutorial comes back, thanks to Ron ^ He drove a time machine back to the past and obtained the file. Here is the link: http://www.fileden.com/files/2008/12/4/2210768/live555_H.264_tutorial.tar.gz q From servent at apollo.lv Thu Dec 11 03:34:41 2008 From: servent at apollo.lv (Alexey Vasilyev) Date: Thu, 11 Dec 2008 14:34:41 +0300 Subject: [Live-devel] Converting timeval to DateTime .NET Message-ID: Hi, I'm using Windows. Windows version uses QueryPerformanceCounter() in GroupsockHelper.cpp. I would like to know how can I convert a timeval presentationTime value in function afterGettingFrame(..., struct timeval presentationTime) to .NET DateTime ticks (number of ticks passed since 0001-01-01)? In my afterGettingFrame(frameSize, presentationTime) function I made a log call LOG("Video frame received: %d bytes (%d:%d)\n"), frameSize, presentationTime.tv_sec, presentationTime.tv_usec); And the logs is 2008-12-11 14:30:09 Video frame received: 14796 bytes (8970:592115) 2008-12-11 14:30:09 Video frame received: 1050 bytes (8971:592259) 2008-12-11 14:30:10 Video frame received: 1031 bytes (8972:592403) 2008-12-11 14:30:12 Video frame received: 1074 bytes (9745106:531920) <------ gap 2008-12-11 14:30:12 Video frame received: 1045 bytes (9745107:532064) 2008-12-11 14:30:13 Video frame received: 1040 bytes (9745108:532197) 2008-12-11 14:30:14 Video frame received: 1121 bytes (9745109:532341) 2008-12-11 14:30:15 Video frame received: 1121 bytes (9745110:532485) 2008-12-11 14:30:17 Video frame received: 14666 bytes (9745111:532618) 2008-12-11 14:30:17 Video frame received: 1132 bytes (9745112:532762) 2008-12-11 14:30:18 Video frame received: 1103 bytes (9745113:532895) 2008-12-11 14:30:19 Video frame received: 1100 bytes (9745114:533039) It's not clear for me why there is a gap between 3rd and 4th record? Is timeval an absolute value? -- Best regards, Alexey. From finlayson at live555.com Thu Dec 11 04:54:31 2008 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 11 Dec 2008 04:54:31 -0800 Subject: [Live-devel] Converting timeval to DateTime .NET In-Reply-To: References: Message-ID: > I would like to know how can I convert a timeval presentationTime value in > function afterGettingFrame(..., struct timeval presentationTime) to .NET > DateTime ticks (number of ticks passed since 0001-01-01)? 
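(On the conversion itself: .NET DateTime ticks are 100-nanosecond units counted from 0001-01-01, and the Unix epoch that a gettimeofday()-style timeval is based on corresponds to 621355968000000000 ticks, so a conversion sketch looks like this. It assumes the presentation time really is Unix wall-clock time, which, as explained in the reply below, only holds once RTCP synchronization has occurred.)

#include <stdint.h>
// struct timeval comes from <sys/time.h> on Unix, or from the winsock headers on Windows.

// Convert a struct timeval (seconds + microseconds since 1970-01-01 UTC)
// to .NET DateTime ticks (100-ns units since 0001-01-01).
int64_t timevalToDotNetTicks(struct timeval const& tv) {
  int64_t const kTicksPerSecond   = 10000000LL;            // 100-ns units per second
  int64_t const kUnixEpochAsTicks = 621355968000000000LL;  // ticks from 0001-01-01 to 1970-01-01
  return kUnixEpochAsTicks
       + (int64_t)tv.tv_sec * kTicksPerSecond
       + (int64_t)tv.tv_usec * 10;                         // 1 microsecond = 10 ticks
}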
I don't know, but this seems completely unrelated to your next question: > In my afterGettingFrame(frameSize, presentationTime) function I made >a log call > LOG("Video frame received: %d bytes (%d:%d)\n"), frameSize, >presentationTime.tv_sec, presentationTime.tv_usec); > > And the logs is >2008-12-11 14:30:09 Video frame received: 14796 bytes (8970:592115) >2008-12-11 14:30:09 Video frame received: 1050 bytes (8971:592259) >2008-12-11 14:30:10 Video frame received: 1031 bytes (8972:592403) >2008-12-11 14:30:12 Video frame received: 1074 bytes >(9745106:531920) <------ gap >2008-12-11 14:30:12 Video frame received: 1045 bytes (9745107:532064) >2008-12-11 14:30:13 Video frame received: 1040 bytes (9745108:532197) >2008-12-11 14:30:14 Video frame received: 1121 bytes (9745109:532341) >2008-12-11 14:30:15 Video frame received: 1121 bytes (9745110:532485) >2008-12-11 14:30:17 Video frame received: 14666 bytes (9745111:532618) >2008-12-11 14:30:17 Video frame received: 1132 bytes (9745112:532762) >2008-12-11 14:30:18 Video frame received: 1103 bytes (9745113:532895) >2008-12-11 14:30:19 Video frame received: 1100 bytes (9745114:533039) > > It's not clear for me why there is a gap between 3rd and 4th record? It's because the first few presentation times - before RTCP synchronization occurs - are just 'guesses' made by the receiving code (based on the receiver's 'wall clock' and the RTP timestamp). However, once RTCP synchronization occurs, all subsequent presentation times will be accurate. This means is that a receiver should be prepared for the fact that the first few presentation times (until RTCP synchronization starts) will not be accurate. The code, however, can check this by calling "RTPSource:: hasBeenSynchronizedUsingRTCP()". If this returns False, then the presentation times are not accurate, and should be treated with 'a grain of salt'. However, once the call to returns True, then the presentation times (from then on) will be accurate. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From wouter.dhondt at vsk.be Thu Dec 11 07:59:44 2008 From: wouter.dhondt at vsk.be (Wouter Dhondt) Date: Thu, 11 Dec 2008 16:59:44 +0100 Subject: [Live-devel] RTPInterface blocking readsocket() Message-ID: <494138F0.7010402@vsk.be> Hi. I'm using livemedia to stream over TCP. In the RTPInterface file there is a function tcpReadHandler() which handles all the data received from the client (reports, commands, ...). This function tries to read a '$' through a non-blocking call to readSocket(). Afterwards several more bytes are read, but this time no timeout is supplied making these calls blocking. Is there any reason that these are blocking? Why not use a timeout? I seem to notice a "hang" (could be a long block) when these functions are called, meaning a complete stop of livemedia during that time. There are other calls to readSocket() without timeout (so blocking). Can these have the same effect? Any advise? From morais.levi at gmail.com Thu Dec 11 08:27:20 2008 From: morais.levi at gmail.com (Andre Morais) Date: Thu, 11 Dec 2008 16:27:20 +0000 Subject: [Live-devel] Live 555 and Darwin Streaming Server 5.5.5 Message-ID: <273b5af00812110827t718775c0qabaa2d53b6ac5c98@mail.gmail.com> Hello to you all, I'm recently starting to use Live555.Before that I use Darwin Streaming Server but it's limited in functionality that I can do over the streaming media while hosting.Does anybody already uses this servers at the same time to give more functionalitys to DSS? 
Thanks Regards -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Dec 11 22:52:08 2008 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 11 Dec 2008 22:52:08 -0800 Subject: [Live-devel] RTPInterface blocking readsocket() In-Reply-To: <494138F0.7010402@vsk.be> References: <494138F0.7010402@vsk.be> Message-ID: >I'm using livemedia to stream over TCP. In the RTPInterface file >there is a function tcpReadHandler() which handles all the data >received from the client (reports, commands, ...). This function >tries to read a '$' through a non-blocking call to readSocket(). >Afterwards several more bytes are read, but this time no timeout is >supplied making these calls blocking. If the RTP-over-TCP data is well-formed, then there shouldn't be any blocking, because the data should be present in the correct format: '$';1-byte stream channel id; 2-byte packet size; packet data. Perhaps the data that you are sending is not well-formed (though it you're using our server, it should be)? You're correct, though, that the code should probably be made more tolerant of malformed data (e.g., by including the (otherwise optional) "timeout" parameter to "readSocket()" and "readSocketExact()"). -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From wouter.dhondt at vsk.be Fri Dec 12 00:18:45 2008 From: wouter.dhondt at vsk.be (Wouter Dhondt) Date: Fri, 12 Dec 2008 09:18:45 +0100 Subject: [Live-devel] RTPInterface blocking readsocket() In-Reply-To: <494138F0.7010402@vsk.be> References: <494138F0.7010402@vsk.be> Message-ID: <49421E65.5020507@vsk.be> Thanks for the quick answer. > If the RTP-over-TCP data is well-formed, then there shouldn't be any > blocking, because the data should be present in the correct format: > '$';1-byte stream channel id; 2-byte packet size; packet data. Perhaps > the data that you are sending is not well-formed (though it you're > using our server, it should be)? We are using the livemedia client as well. The message is probably split when this happens: first a command that will be ignored followed by a report. But only the $ of the report is in the packet and the rest in a second part. Not sure if this is possible. It might be possible that we are doing something wrong, but the server should be robust enough to handle any data. > You're correct, though, that the code should probably be made more > tolerant of malformed data (e.g., by including the (otherwise > optional) "timeout" parameter to "readSocket()" and "readSocketExact()"). Ok, so I'll keep the timeouts. There are other calls to readSocket() without a timeout as well. Not sure if these have to have the timeout? Maybe not cause it looks like there won't be any other data before those. Thanks again. From domentarion at live.nl Fri Dec 12 01:14:28 2008 From: domentarion at live.nl (Sander den Breejen) Date: Fri, 12 Dec 2008 09:14:28 +0000 Subject: [Live-devel] Streaming Directoy LIVE555 Media Server Message-ID: Is it posible to stream a hole directory ? Because i have a genre system i wanne stream sub durectoy's can somebody help me with that ? Chears , Sander"It Can't Rain All The Time" _________________________________________________________________ Download de nieuwste emoticons voor in je Messenger http://www.msnmessengerexperience.nl/chuck/ -------------- next part -------------- An HTML attachment was scrubbed... 
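(Going back to the RTP-over-TCP framing described in the reply above: each interleaved frame on the TCP connection is '$', a one-byte channel id, a two-byte big-endian length, and then that many bytes of RTP or RTCP data, as defined in RFC 2326. A minimal stand-alone parser for that framing is sketched below, purely to illustrate the format; the library's RTPInterface already does this internally.)

#include <stdint.h>

// Parse one interleaved frame from a buffered TCP byte stream.
// Returns the number of bytes consumed, or 0 if a complete frame
// is not yet available in 'buf'.
unsigned parseInterleavedFrame(uint8_t const* buf, unsigned len,
                               uint8_t& channelId,
                               uint8_t const*& packet, uint16_t& packetLen) {
  if (len < 4 || buf[0] != '$') return 0;
  channelId = buf[1];
  packetLen = (uint16_t)((buf[2] << 8) | buf[3]);  // big-endian length
  if (len < 4u + packetLen) return 0;              // wait for more data
  packet = buf + 4;
  return 4u + packetLen;
}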
URL: From mrcravens at verizon.net Fri Dec 12 09:42:09 2008 From: mrcravens at verizon.net (mike cravens) Date: Fri, 12 Dec 2008 11:42:09 -0600 Subject: [Live-devel] Modify OpenRTSP to record RTP stream from VLC (simulating MPEG2 VBRICK)? Message-ID: <4942A271.2030406@verizon.net> Pardon me if I'm repeating a topic. To record ( to a file ) an RTP stream sent by VLC into a mpeg2 TS file, should I start with OpenRTSP and change out some classes? ( Which ones?) Or can it play RTP only sessions as is? I'm looking for a very quick success in reading and saving to an mpg2.ts file an RTP stream (unicast) by VLC. Following this success, we can repent at leisure and follow all the pretty doxygen graphics to a more optimal approach and more complete understanding of this rich featured ecosystem. We're also curious about what we will need to do about session descriptors. I do notice that vlc will send an SAP on command. We also have captured some sample descriptions off the wire when streaming from live555 RTSP server that might be a starting place, but there are sure to be fields that change. We hope to come up the curve and be genuine contributors to the project if we can survive a quick paced demonstration of feasibility in a few days. Regards, Mike Cravens From alex889 at gmail.com Fri Dec 12 10:07:08 2008 From: alex889 at gmail.com (alex) Date: Fri, 12 Dec 2008 20:07:08 +0200 Subject: [Live-devel] Live555 on TI's DSP Message-ID: Hi, Is there anyone who managed to run it on a non daVinci processor? I'm having many problems compiling the package and use it with the NDK. I'll Be glad to get some help/info. Thanks, Alex -------------- next part -------------- An HTML attachment was scrubbed... URL: From morais.levi at gmail.com Fri Dec 12 10:54:57 2008 From: morais.levi at gmail.com (Andre Morais) Date: Fri, 12 Dec 2008 18:54:57 +0000 Subject: [Live-devel] problems downloading "testMPEG2TransportStreamTrickPlay" Message-ID: <273b5af00812121054m1f9543efj8e83164518717472@mail.gmail.com> Hello to you all, I'm trying to download "testMPEG2TransportStreamTrickPlay" but when I click on the link appears the message "object not found". Does anybody know how I can download the file? Thanks! Regards -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Dec 12 14:16:03 2008 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 12 Dec 2008 14:16:03 -0800 Subject: [Live-devel] Modify OpenRTSP to record RTP stream from VLC (simulating MPEG2 VBRICK)? In-Reply-To: <4942A271.2030406@verizon.net> References: <4942A271.2030406@verizon.net> Message-ID: >To record ( to a file ) an RTP stream sent by VLC into a mpeg2 TS file, >should I start with OpenRTSP and change out some classes? ( Which ones?) You should not need to make any changes to "openRTSP", because - when streaming a unicast stream from VLC - you should be using RTSP. (i.e., VLC is a RTSP server.) Just run "openRTSP" with the appropriate "rtsp://" URL. In general, unicast RTP streams should be sent/received using RTSP. -- Ross Finlayson Live Networks, Inc. 
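(A concrete example of that; the host, port and stream name below are made up and depend entirely on how VLC was configured to serve RTSP:

openRTSP -d 60 -t rtsp://192.168.1.10:8554/mystream

Here "-d 60" receives the stream for 60 seconds and "-t" asks for the RTP/RTCP data to be carried over the RTSP TCP connection; leave it off for plain UDP. openRTSP then writes each received subsession to its own output file in the current directory.)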
http://www.live555.com/ From morais.levi at gmail.com Fri Dec 12 14:29:16 2008 From: morais.levi at gmail.com (Andre Morais) Date: Fri, 12 Dec 2008 22:29:16 +0000 Subject: [Live-devel] Transport Stream 'index files' Message-ID: <273b5af00812121429o72b21cfo282272af5d9c09b5@mail.gmail.com> Hello to you all, I'm trying to create a tsx file with MPEG2TransportStreamTrickPlay.exe...My file names "sample.ts" After I run the following code in the comand line window "MPEG2TransportStreamIndexer sample.ts" appear this error message writing index file "sample.tsx" ...bad TS sync byte :0x0...done and when I go to the folder where is the file the sample.tsx is empty Anybody already have this problem?I can't solve it yet Thanks! regards -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Dec 12 16:29:56 2008 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 12 Dec 2008 16:29:56 -0800 Subject: [Live-devel] problems downloading "testMPEG2TransportStreamTrickPlay" In-Reply-To: <273b5af00812121054m1f9543efj8e83164518717472@mail.gmail.com> References: <273b5af00812121054m1f9543efj8e83164518717472@mail.gmail.com> Message-ID: >I'm trying to download "testMPEG2TransportStreamTrickPlay" but when >I click on the link appears the message "object not found". >Does anybody know how I can download the file? Oops, there was some bad HTML on that page. I've fixed it now, so please try again. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Fri Dec 12 17:27:35 2008 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 12 Dec 2008 17:27:35 -0800 Subject: [Live-devel] Transport Stream 'index files' In-Reply-To: <273b5af00812121429o72b21cfo282272af5d9c09b5@mail.gmail.com> References: <273b5af00812121429o72b21cfo282272af5d9c09b5@mail.gmail.com> Message-ID: >I'm trying to create a tsx file with >MPEG2TransportStreamTrickPlay.exe...My file names "sample.ts" >After I run the following code in the comand line window >"MPEG2TransportStreamIndexer sample.ts" appear this error message >writing index file "sample.tsx" ...bad TS sync byte :0x0...done and >when I go to the folder where is the file the sample.tsx is empty >Anybody already have this problem?I can't solve it yet See -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From zhuyunbin at shdv.com Fri Dec 12 18:33:29 2008 From: zhuyunbin at shdv.com (zhuyunbin) Date: Sat, 13 Dec 2008 10:33:29 +0800 Subject: [Live-devel] How to use PCIH.264 encoding card in streaming server Message-ID: <200812131033287656533@shdv.com> Dear all I have a fujisu h.264 encoding card ( with PCI interface). I want to use live stream server to receivce the h.264 file from card and put stream to network. How can I do if I want to do this function? I have already had fujisu's api in Linux and windows platform. Thanks Zhu yunbin zhuyunbin 2008-12-13 -------------- next part -------------- An HTML attachment was scrubbed... URL: From morais.levi at gmail.com Fri Dec 12 19:05:11 2008 From: morais.levi at gmail.com (Andre Morais) Date: Sat, 13 Dec 2008 03:05:11 +0000 Subject: [Live-devel] problems downloading "testMPEG2TransportStreamTrickPlay" In-Reply-To: References: <273b5af00812121054m1f9543efj8e83164518717472@mail.gmail.com> Message-ID: <273b5af00812121905q1a8f518bj47044a582a330c30@mail.gmail.com> Thanks... 
now I can download " MPEG2TransportStreamIndexer" but when I tried the "testMPEG2TransportStreamTrickPlay " appears the message "object not found".I will apreciate if you can solve this problem too. thanks regards On Sat, Dec 13, 2008 at 12:29 AM, Ross Finlayson wrote: > I'm trying to download "testMPEG2TransportStreamTrickPlay" but when I click >> on the link appears the message "object not found". >> Does anybody know how I can download the file? >> > > Oops, there was some bad HTML on that page. I've fixed it now, so please > try again. > -- > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Dec 12 21:55:07 2008 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 12 Dec 2008 21:55:07 -0800 Subject: [Live-devel] problems downloading "testMPEG2TransportStreamTrickPlay" In-Reply-To: <273b5af00812121905q1a8f518bj47044a582a330c30@mail.gmail.com> References: <273b5af00812121054m1f9543efj8e83164518717472@mail.gmail.com> <273b5af00812121905q1a8f518bj47044a582a330c30@mail.gmail.com> Message-ID: >Thanks... >now I can download " >MPEG2TransportStreamIndexer" >but when I tried the >"testMPEG2TransportStreamTrickPlay" >appears the message "object not found" You may still have the old version of the web page cached. Refresh your browser window before trying again. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From mrcravens at verizon.net Sat Dec 13 06:35:34 2008 From: mrcravens at verizon.net (Mike Cravens) Date: Sat, 13 Dec 2008 08:35:34 -0600 Subject: [Live-devel] Modify OpenRTSP to record RTP stream from VLC? (simulating VBRICK) In-Reply-To: References: Message-ID: <4943C836.3050505@verizon.net> We understand how things should be done. Really, we at least have an inkling of the difference between running an application on top of a session layer and running one on top of a transport (mostly) layer. We also have a clue as to some advantages (and disadvantages) that multicasting can offer over unicasting. The issue at hand is to simulate communication with an mpeg2 VBRICK, and it talks RTP only, so far as we can determine without one yet in hand. We are required to demonstrate communication with an rtp source within the next few days. VBRICK, set up to stream in RTP mode, is the test we have to pass to continue. There are a number of other issues we have overcome, but right now we are looking at modifying OpenRTSP, or mplayer ( if that is more productive), or building from live555 library elements ( seems risky given our limited familiarity at this point. We need to succeed literally overnight to continue th project. So it is not a philosophical question of what we should do, but what we must do to move forward and eventually have the conversation of what we should be doing. We must pick up an mp2 ts over RTP stream and record it to a file (or series of files) that live555 streaming server can later play back (RTSP is allowed in this case). 
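(For the "pick up an MP2T-over-RTP stream and record it to a file" requirement, the library can also do this without RTSP at all, along the lines of the test*Receiver demo programs. A minimal sketch follows; the multicast address, the port, and the assumption that the sender uses the standard MP2T payload type 33 with a 90 kHz clock all have to be matched to the actual VBrick/VLC configuration, and a unicast source would need the groupsock set up differently.)

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"
#include "GroupsockHelper.hh"

// Receive a raw MPEG-2 Transport Stream carried over RTP (no RTSP) and
// append it to a .ts file that the LIVE555 Media Server can later serve.
int main() {
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  struct in_addr sessionAddress;
  sessionAddress.s_addr = our_inet_addr("239.255.42.42");  // assumed group address
  Port const rtpPort(1234);                                // assumed RTP port
  Groupsock rtpGroupsock(*env, sessionAddress, rtpPort, 1 /*ttl*/);

  RTPSource* source =
    SimpleRTPSource::createNew(*env, &rtpGroupsock, 33, 90000, "video/MP2T");
  FileSink* sink = FileSink::createNew(*env, "recording.ts");

  sink->startPlaying(*source, NULL, NULL);  // record until the process is killed
  env->taskScheduler().doEventLoop();
  return 0;  // never reached
}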
-------------- next part -------------- An HTML attachment was scrubbed... URL: 

From morais.levi at gmail.com Sat Dec 13 16:12:39 2008 From: morais.levi at gmail.com (Andre Morais) Date: Sun, 14 Dec 2008 00:12:39 +0000 Subject: [Live-devel] problems downloading "testMPEG2TransportStreamTrickPlay" In-Reply-To: References: <273b5af00812121054m1f9543efj8e83164518717472@mail.gmail.com> <273b5af00812121905q1a8f518bj47044a582a330c30@mail.gmail.com> Message-ID: <273b5af00812131612q40788f2aoc4042600c9f39062@mail.gmail.com> http://www.live555.com/mediaServer/linux/testMPEG2TransportStreamTrickPlaye when I click in "testMPEG2TransportStreamTrickPlay"(linux) this url open and appears the message "object not found" 2008/12/13 Ross Finlayson > Thanks... > > now I can download " MPEG2TransportStreamIndexer" > but when I tried the "testMPEG2TransportStreamTrickPlay > " appears the message "object not found" > > > You may still have the old version of the web page cached. Refresh your browser window before trying again. > > -- > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > -------------- next part -------------- An HTML attachment was scrubbed...
URL: From finlayson at live555.com Sat Dec 13 17:42:41 2008 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 13 Dec 2008 17:42:41 -0800 Subject: [Live-devel] problems downloading "testMPEG2TransportStreamTrickPlay" In-Reply-To: <273b5af00812131612q40788f2aoc4042600c9f39062@mail.gmail.com> References: <273b5af00812121054m1f9543efj8e83164518717472@mail.gmail.com> <273b5af00812121905q1a8f518bj47044a582a330c30@mail.gmail.com> <273b5af00812131612q40788f2aoc4042600c9f39062@mail.gmail.com> Message-ID: >http://www.live555.com/mediaServer/linux/testMPEG2TransportStreamTrickPlaye > >when I click in >"testMPEG2TransportStreamTrickPlay"(linux) >this url open and appears the message "object not found" Once again - you have a stale version of the web page. The correct URL - which is not on the web page - is http://www.live555.com/mediaServer/linux/testMPEG2TransportStreamTrickPlay -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From amit.yedidia at elbitsystems.com Sun Dec 14 07:15:24 2008 From: amit.yedidia at elbitsystems.com (Yedidia Amit) Date: Sun, 14 Dec 2008 17:15:24 +0200 Subject: [Live-devel] How to use PCIH.264 encoding card in streaming server In-Reply-To: <200812131033287656533@shdv.com> Message-ID: first, you should implemet H.264 Framer (you can search for the tutorial republished few days ago) second, you shoud write your own DeviceSource which will make use of your proprietary API. Regards, Amit Yedidia Elbit System Ltd. Email: amit.yedidia at elbitsystems.com Tel: 972-4-8318905 ---------------------------------------------------------- ________________________________ From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of zhuyunbin Sent: Saturday, December 13, 2008 4:33 AM To: live-devel Subject: [Live-devel] How to use PCIH.264 encoding card in streaming server Dear all I have a fujisu h.264 encoding card ( with PCI interface). I want to use live stream server to receivce the h.264 file from card and put stream to network. How can I do if I want to do this function? I have already had fujisu's api in Linux and windows platform. Thanks Zhu yunbin ________________________________ zhuyunbin 2008-12-13 The information in this e-mail transmission contains proprietary and business sensitive information. Unauthorized interception of this e-mail may constitute a violation of law. If you are not the intended recipient, you are hereby notified that any review, dissemination, distribution or duplication of this communication is strictly prohibited. You are also asked to contact the sender by reply email and immediately destroy all copies of the original message. -------------- next part -------------- An HTML attachment was scrubbed... URL: From zhuyunbin at shdv.com Sun Dec 14 16:56:51 2008 From: zhuyunbin at shdv.com (zhuyunbin) Date: Mon, 15 Dec 2008 08:56:51 +0800 Subject: [Live-devel] How to use PCIH.264 encoding card in streamingserver References: Message-ID: <200812150856512039174@shdv.com> The Fujisu's api I have already get. My questions are below: (1) I don't know where and which part code can I write fujisu's api and let live555 call this card's api. (2) if I know and write the correct api in live555 , But whether can I change live555's upper app's code to let h.264 card work ? Thanks Zhu yunbin zhuyunbin 2008-12-15 ???? Yedidia Amit ????? 2008-12-14 23:26:13 ???? LIVE555 Streaming Media - development & use ??? ??? 
Re: [Live-devel] How to use PCIH.264 encoding card in streamingserver first, you should implemet H.264 Framer (you can search for the tutorial republished few days ago) second, you shoud write your own DeviceSource which will make use of your proprietary API. Regards, Amit Yedidia Elbit System Ltd. Email: amit.yedidia at elbitsystems.com Tel: 972-4-8318905 ---------------------------------------------------------- From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of zhuyunbin Sent: Saturday, December 13, 2008 4:33 AM To: live-devel Subject: [Live-devel] How to use PCIH.264 encoding card in streaming server Dear all I have a fujisu h.264 encoding card ( with PCI interface). I want to use live stream server to receivce the h.264 file from card and put stream to network. How can I do if I want to do this function? I have already had fujisu's api in Linux and windows platform. Thanks Zhu yunbin zhuyunbin 2008-12-13 The information in this e-mail transmission contains proprietary and business sensitive information. Unauthorized interception of this e-mail may constitute a violation of law. If you are not the intended recipient, you are hereby notified that any review, dissemination, distribution or duplication of this communication is strictly prohibited. You are also asked to contact the sender by reply email and immediately destroy all copies of the original message. -------------- next part -------------- An HTML attachment was scrubbed... URL: From amadorim at vdavda.com Mon Dec 15 00:14:33 2008 From: amadorim at vdavda.com (Marco Amadori) Date: Mon, 15 Dec 2008 09:14:33 +0100 Subject: [Live-devel] Live555 on TI's DSP In-Reply-To: References: Message-ID: <200812150914.33922.amadorim@vdavda.com> On Friday 12 December 2008, 19:07:08, alex wrote: > Hi, > Is there anyone who managed to run it on a non daVinci processor? > I'm having many problems compiling the package and use it with the NDK. I used it on success on another architecture (non davinci STB) too without problems... live555, e.g. works out of the box on all architectures for which there are a debian package. Try a debian binary before rebuilding it. -- ESC:wq -- This message has been scanned for viruses and dangerous content by MailScanner, and is believed to be clean. From stas at tech-mer.com Mon Dec 15 05:34:52 2008 From: stas at tech-mer.com (Stas Desyatnlkov) Date: Mon, 15 Dec 2008 15:34:52 +0200 Subject: [Live-devel] H264 streaming Message-ID: <21E398286732DC49AD45BE8C7BE96C073559E26D67@fs11.mertree.mer.co.il> Hi All, I need to stream H264 video from linux device to PC. I write both the sender and receiver part but would like to comply to the standard way of streaming it, so: How should I packetize the video? Should it be NAL packets or should I put it into transport stream? Regards, Stas -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Dec 15 09:38:16 2008 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 15 Dec 2008 09:38:16 -0800 Subject: [Live-devel] H264 streaming In-Reply-To: <21E398286732DC49AD45BE8C7BE96C073559E26D67@fs11.mertree.mer.co.il> References: <21E398286732DC49AD45BE8C7BE96C073559E26D67@fs11.mertree.mer.co.il> Message-ID: >I need to stream H264 video from linux device to PC. I write both >the sender and receiver part but would like to comply to the >standard way of streaming it, so: >How should I packetize the video? Should it be NAL packets or should >I put it into transport stream? 
For streaming, it is generally better to stream video and audio separately, using their own specialized RTP payload formats. To stream H.264 video, see http://www.live555.com/liveMedia/faq.html#h264-streaming You will then write a "ServerMediaSubsession" subclass that uses your H.264 video source, and your "H264VideoStreamFramer" subclass, to send RTP packets via a "H264VideoRTPSink". Then use this in a "RTSPServer". -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From amit.yedidia at elbitsystems.com Mon Dec 15 21:20:31 2008 From: amit.yedidia at elbitsystems.com (Yedidia Amit) Date: Tue, 16 Dec 2008 07:20:31 +0200 Subject: [Live-devel] How to use PCIH.264 encoding card instreamingserver In-Reply-To: <200812150856512039174@shdv.com> Message-ID: check the DeviceSource.h/cpp files for some initial instructions. - this is where you should put your API The you need to write your on subsession class that will create and connect the appropriate sink and source (in your case its the YourDeviceSouurce and H264VideoRTPSink (check other subsession for example) also check this tutorial http://www.fileden.com/files/2008/12/4/2210768/live555_H.264_tutorial.tar.gz Regards, Amit Yedidia Elbit System Ltd. Email: amit.yedidia at elbitsystems.com Tel: 972-4-8318905 ---------------------------------------------------------- ________________________________ From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of zhuyunbin Sent: Monday, December 15, 2008 2:57 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] How to use PCIH.264 encoding card instreamingserver The Fujisu's api I have already get. My questions are below: (1) I don't know where and which part code can I write fujisu's api and let live555 call this card's api. (2) if I know and write the correct api in live555 , But whether can I change live555's upper app's code to let h.264 card work ? Thanks Zhu yunbin ________________________________ zhuyunbin 2008-12-15 ________________________________ ???? Yedidia Amit ????? 2008-12-14 23:26:13 ???? LIVE555 Streaming Media - development & use ??? ??? Re: [Live-devel] How to use PCIH.264 encoding card in streamingserver first, you should implemet H.264 Framer (you can search for the tutorial republished few days ago) second, you shoud write your own DeviceSource which will make use of your proprietary API. Regards, Amit Yedidia Elbit System Ltd. Email: amit.yedidia at elbitsystems.com Tel: 972-4-8318905 ---------------------------------------------------------- ________________________________ From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of zhuyunbin Sent: Saturday, December 13, 2008 4:33 AM To: live-devel Subject: [Live-devel] How to use PCIH.264 encoding card in streaming server Dear all I have a fujisu h.264 encoding card ( with PCI interface). I want to use live stream server to receivce the h.264 file from card and put stream to network. How can I do if I want to do this function? I have already had fujisu's api in Linux and windows platform. Thanks Zhu yunbin ________________________________ zhuyunbin 2008-12-13 The information in this e-mail transmission contains proprietary and business sensitive information. Unauthorized interception of this e-mail may constitute a violation of law. 
If you are not the intended recipient, you are hereby notified that any review, dissemination, distribution or duplication of this communication is strictly prohibited. You are also asked to contact the sender by reply email and immediately destroy all copies of the original message. The information in this e-mail transmission contains proprietary and business sensitive information. Unauthorized interception of this e-mail may constitute a violation of law. If you are not the intended recipient, you are hereby notified that any review, dissemination, distribution or duplication of this communication is strictly prohibited. You are also asked to contact the sender by reply email and immediately destroy all copies of the original message. -------------- next part -------------- An HTML attachment was scrubbed... URL: From martinbonnin at gmail.com Tue Dec 16 01:02:17 2008 From: martinbonnin at gmail.com (Martin Bonnin) Date: Tue, 16 Dec 2008 10:02:17 +0100 Subject: [Live-devel] [PATCH] fix for asynchronous server requests Message-ID: <61ea209e0812160102y23685552h242db4e8d9813c3c@mail.gmail.com> Hello All, I have a server here that sends me SET_PARAMETER requests while the client is waiting for a DESCRIBE response. The RFC states in appendix D.1 that "A client implementation MUST be able to do the following : * Expect and respond to asynchronous requests from the server". And also that RTSP is a symmetric protocol (compared to asymmetric HTTP). So I guess we have to deal with these situations. Attached is a patch that tries to fix this. Best Regards, --- Martin -------------- next part -------------- A non-text attachment was scrubbed... Name: fix_asynchronous_requests.patch Type: text/x-patch Size: 1549 bytes Desc: not available URL: From zhuyunbin at shdv.com Tue Dec 16 16:34:32 2008 From: zhuyunbin at shdv.com (zhuyunbin) Date: Wed, 17 Dec 2008 08:34:32 +0800 Subject: [Live-devel] How to use PCIH.264 encoding card instreamingserver References: Message-ID: <200812170834320004490@shdv.com> Thanks! I will read this tutorial file carefully. I have another basic question beacasue I am new one to study live555. Where to get the live555 's api files and useful instructions? With best regards Zhu yunbin zhuyunbin 2008-12-17 ???? Yedidia Amit ????? 2008-12-16 13:31:37 ???? LIVE555 Streaming Media - development & use ??? ??? Re: [Live-devel] How to use PCIH.264 encoding card instreamingserver check the DeviceSource.h/cpp files for some initial instructions. - this is where you should put your API The you need to write your on subsession class that will create and connect the appropriate sink and source (in your case its the YourDeviceSouurce and H264VideoRTPSink (check other subsession for example) also check this tutorial http://www.fileden.com/files/2008/12/4/2210768/live555_H.264_tutorial.tar.gz Regards, Amit Yedidia Elbit System Ltd. Email: amit.yedidia at elbitsystems.com Tel: 972-4-8318905 ---------------------------------------------------------- From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of zhuyunbin Sent: Monday, December 15, 2008 2:57 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] How to use PCIH.264 encoding card instreamingserver The Fujisu's api I have already get. My questions are below: (1) I don't know where and which part code can I write fujisu's api and let live555 call this card's api. 
(2) if I know and write the correct api in live555 , But whether can I change live555's upper app's code to let h.264 card work ? Thanks Zhu yunbin zhuyunbin 2008-12-15 ???? Yedidia Amit ????? 2008-12-14 23:26:13 ???? LIVE555 Streaming Media - development & use ??? ??? Re: [Live-devel] How to use PCIH.264 encoding card in streamingserver first, you should implemet H.264 Framer (you can search for the tutorial republished few days ago) second, you shoud write your own DeviceSource which will make use of your proprietary API. Regards, Amit Yedidia Elbit System Ltd. Email: amit.yedidia at elbitsystems.com Tel: 972-4-8318905 ---------------------------------------------------------- From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of zhuyunbin Sent: Saturday, December 13, 2008 4:33 AM To: live-devel Subject: [Live-devel] How to use PCIH.264 encoding card in streaming server Dear all I have a fujisu h.264 encoding card ( with PCI interface). I want to use live stream server to receivce the h.264 file from card and put stream to network. How can I do if I want to do this function? I have already had fujisu's api in Linux and windows platform. Thanks Zhu yunbin zhuyunbin 2008-12-13 The information in this e-mail transmission contains proprietary and business sensitive information. Unauthorized interception of this e-mail may constitute a violation of law. If you are not the intended recipient, you are hereby notified that any review, dissemination, distribution or duplication of this communication is strictly prohibited. You are also asked to contact the sender by reply email and immediately destroy all copies of the original message. The information in this e-mail transmission contains proprietary and business sensitive information. Unauthorized interception of this e-mail may constitute a violation of law. If you are not the intended recipient, you are hereby notified that any review, dissemination, distribution or duplication of this communication is strictly prohibited. You are also asked to contact the sender by reply email and immediately destroy all copies of the original message. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gabriele.deluca at hotmail.com Wed Dec 17 01:03:48 2008 From: gabriele.deluca at hotmail.com (Gabriele De Luca) Date: Wed, 17 Dec 2008 10:03:48 +0100 Subject: [Live-devel] Nat and RTCP Message-ID: Hello, I would like to know how I could manage the problem of NAT, whereas for an audio video stream requires two ports (plus two more for RTCP). In particular, you can use the same port for RTP and RTCP to use all but two ports instead of four? The standard speaks of equal ports (RTP) and odd (RTCP) but it is a blasphemy to speak of RTCP on the same port RTP to have fewer problems with NAT? Thanks in advance _________________________________________________________________ Che tipo sei? Crea la tua animoticon invernale! http://www.messenger.it/test.html From finlayson at live555.com Wed Dec 17 05:14:02 2008 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 17 Dec 2008 05:14:02 -0800 Subject: [Live-devel] Nat and RTCP In-Reply-To: References: Message-ID: >it is a blasphemy to speak of RTCP on the same port RTP to have >fewer problems with NAT? Yes (it's not just 'blasphemy'; it won't work). One thing you can do, however, is use RTP/RTCP-over-TCP streaming (using the RTSP channel). That way, you need only relay one (TCP) port number - the one for RTSP (by default, that's 554). 
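(On the client side of this library, that mode is chosen at SETUP time. With the synchronous RTSPClient interface of this era it looks roughly like the sketch below, where the third argument is "streamUsingTCP"; check RTSPClient.hh for the exact signature. openRTSP exposes the same thing as its -t option.)

#include "liveMedia.hh"

// Sketch: set up every subsession of an already-described MediaSession so
// that RTP and RTCP are interleaved over the RTSP TCP connection.
Boolean setupOverTCP(RTSPClient* rtspClient, MediaSession* session) {
  MediaSubsessionIterator iter(*session);
  MediaSubsession* subsession;
  while ((subsession = iter.next()) != NULL) {
    if (!subsession->initiate()) return False;
    if (!rtspClient->setupMediaSubsession(*subsession,
                                          False /*streamOutgoing*/,
                                          True /*streamUsingTCP*/)) return False;
  }
  return True;
}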
-- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From gabriele.deluca at hotmail.com Wed Dec 17 08:20:01 2008 From: gabriele.deluca at hotmail.com (Gabriele De Luca) Date: Wed, 17 Dec 2008 17:20:01 +0100 Subject: [Live-devel] Nat and RTCP In-Reply-To: References: Message-ID: Thanks! I have another question. If I have another application that I will open the socket audio and video, how can I fix OnDemandServerMediaSubsession order to avoid reopening a socket already opened (so that he can use without opening it)? Thanks in advance ---------------------------------------- > Date: Wed, 17 Dec 2008 05:14:02 -0800 > To: live-devel at ns.live555.com > From: finlayson at live555.com > Subject: Re: [Live-devel] Nat and RTCP > >>it is a blasphemy to speak of RTCP on the same port RTP to have >>fewer problems with NAT? > > Yes (it's not just 'blasphemy'; it won't work). > > One thing you can do, however, is use RTP/RTCP-over-TCP streaming > (using the RTSP channel). That way, you need only relay one (TCP) > port number - the one for RTSP (by default, that's 554). > -- > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel _________________________________________________________________ Ci sai fare con l'italiano? Scoprilo con Typectionary! http://typectionary.it.msn.com/ From finlayson at live555.com Wed Dec 17 11:55:01 2008 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 17 Dec 2008 11:55:01 -0800 Subject: [Live-devel] Nat and RTCP In-Reply-To: References: Message-ID: >I have another question. >If I have another application that I will open the socket audio and >video, how can I fix OnDemandServerMediaSubsession order to avoid >reopening a socket already opened (so that he can use without >opening it)? There's nothing wrong with "OnDemandServerMediaSubsession", so you don't need to 'fix' it. Instead, you write your own subclass of the existing "*FileServerMediaSubsession" class that you plan to use, but redefine the "createNewStreamSource()" virtual function to use an existing open file rather than opening a new one. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From gabriele.deluca at hotmail.com Thu Dec 18 00:39:48 2008 From: gabriele.deluca at hotmail.com (Gabriele De Luca) Date: Thu, 18 Dec 2008 09:39:48 +0100 Subject: [Live-devel] Nat and RTCP In-Reply-To: References: Message-ID: Thanks for your answer, maybe I explained badly. I would like to know whether opening the socket for communication with another application (perhaps using the functions of the operating system) going into conflict with the rtpgroupsock and rtcpgroupsock. In other words, if I open a socket on port Y with another application will give me problems using rtpGroupsock = new Groupsock (envir (), dummyAddr, serverRTPPort, 255); where serverRTPPort is Y? Thanks in advance ---------------------------------------- > From: gabriele.deluca at hotmail.com > To: live-devel at ns.live555.com > Date: Wed, 17 Dec 2008 17:20:01 +0100 > Subject: Re: [Live-devel] Nat and RTCP > > > Thanks! > > I have another question. > If I have another application that I will open the socket audio and video, how can I fix OnDemandServerMediaSubsession order to avoid reopening a socket already opened (so that he can use without opening it)? 
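(To make the earlier suggestion in this thread concrete, subclassing and redefining createNewStreamSource() so that an already-open file is reused rather than reopened, here is a rough sketch for a Transport Stream source. The class name, the bitrate guess and the choice of ByteStreamFileSource / MPEG2TransportStreamFramer / SimpleRTPSink are illustrative, and the base-class constructor and virtual-function signatures should be checked against your copy of OnDemandServerMediaSubsession.hh.)

#include <stdio.h>
#include "liveMedia.hh"

class PreopenedTSServerMediaSubsession : public OnDemandServerMediaSubsession {
public:
  PreopenedTSServerMediaSubsession(UsageEnvironment& env, FILE* fid)
    : OnDemandServerMediaSubsession(env, True /*reuseFirstSource*/), fFid(fid) {}

protected:
  // Wrap the already-open FILE* instead of opening the file by name.
  virtual FramedSource* createNewStreamSource(unsigned /*clientSessionId*/,
                                              unsigned& estBitrate) {
    estBitrate = 5000;  // kbps; just a guess
    FramedSource* fileSource = ByteStreamFileSource::createNew(envir(), fFid);
    if (fileSource == NULL) return NULL;
    return MPEG2TransportStreamFramer::createNew(envir(), fileSource);
  }

  virtual RTPSink* createNewRTPSink(Groupsock* rtpGroupsock,
                                    unsigned char /*rtpPayloadTypeIfDynamic*/,
                                    FramedSource* /*inputSource*/) {
    return SimpleRTPSink::createNew(envir(), rtpGroupsock, 33, 90000,
                                    "video", "MP2T", 1, True, False);
  }

private:
  FILE* fFid;
};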
> > Thanks in advance > > ---------------------------------------- >> Date: Wed, 17 Dec 2008 05:14:02 -0800 >> To: live-devel at ns.live555.com >> From: finlayson at live555.com >> Subject: Re: [Live-devel] Nat and RTCP >> >>>it is a blasphemy to speak of RTCP on the same port RTP to have >>>fewer problems with NAT? >> >> Yes (it's not just 'blasphemy'; it won't work). >> >> One thing you can do, however, is use RTP/RTCP-over-TCP streaming >> (using the RTSP channel). That way, you need only relay one (TCP) >> port number - the one for RTSP (by default, that's 554). >> -- >> >> Ross Finlayson >> Live Networks, Inc. >> http://www.live555.com/ >> _______________________________________________ >> live-devel mailing list >> live-devel at lists.live555.com >> http://lists.live555.com/mailman/listinfo/live-devel > > _________________________________________________________________ > Ci sai fare con l'italiano? Scoprilo con Typectionary! > http://typectionary.it.msn.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel _________________________________________________________________ Fanne di tutti i colori, personalizza la tua Hotmail! http://imagine-windowslive.com/Hotmail/#0 From ljsthestar at gmail.com Thu Dec 18 03:54:02 2008 From: ljsthestar at gmail.com (sri kanth) Date: Thu, 18 Dec 2008 17:24:02 +0530 Subject: [Live-devel] Decrease of bitrate for mpeg2ts during trick play Message-ID: <4d859a5f0812180354y1cb7b107q13f7e3edcd562243@mail.gmail.com> hi , Can u suggest methods to decrease the bitrate during the trick play. The bitrate is increasing very high compared to normal playback. Thanks, Srijeet. -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Dec 18 07:11:45 2008 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 18 Dec 2008 07:11:45 -0800 Subject: [Live-devel] Decrease of bitrate for mpeg2ts during trick play In-Reply-To: <4d859a5f0812180354y1cb7b107q13f7e3edcd562243@mail.gmail.com> References: <4d859a5f0812180354y1cb7b107q13f7e3edcd562243@mail.gmail.com> Message-ID: > Can u suggest methods to decrease the bitrate during the trick play. Be patient. This is high on our 'to do' list. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From gabriele.deluca at hotmail.com Thu Dec 18 15:21:10 2008 From: gabriele.deluca at hotmail.com (Gabriele De Luca) Date: Fri, 19 Dec 2008 00:21:10 +0100 Subject: [Live-devel] Multiplexing RTP Session Message-ID: It's possible multiplexing audio and video using only a port (and then only a session)? Thanks in advance -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Dec 18 16:59:05 2008 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 18 Dec 2008 16:59:05 -0800 Subject: [Live-devel] Multiplexing RTP Session In-Reply-To: References: Message-ID: >It's possible multiplexing audio and video using only a port (and >then only a session)? Once again, no (unless you use RTP/RTCP-over-TCP). For streaming audio+video over RTP/UDP, you will need 4 ports: 2 for RTP; 2 for RTCP. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From linkfanel at yahoo.fr Fri Dec 19 06:26:11 2008 From: linkfanel at yahoo.fr (Pierre Ynard) Date: Fri, 19 Dec 2008 15:26:11 +0100 Subject: [Live-devel] [PATCH] Infinite loop in source port selection on WinCE Message-ID: <20081219142611.GA5746@via.ecp.fr> Hello, I am using the live555 library on a smartphone running under Windows CE, as a plugin for the VLC media player, used as part of a custom application. When opening an RTSP stream, live555 source port selection algorithm sometimes gets caught in an infinite loop, which freezes the process. (It even freezes the whole Windows CE system, since apparently preempting a process in an infinite loop is too much for Windows CE.) RTP needs an even source port number, so live555 requests a socket, checks the source port, and if it's not even, loops back until it is. According to my investigations, WinCE occasionally keeps returning the same source port again and again, effectively causing an endless loop. I found that this can be partially prevented by fiddling with the network stack, like, resetting the GPRS connection. I applied this patch, and it solves my problem: no more freezes, no errors. Please consider. --- live_orig/liveMedia/MediaSession.cpp 2008-11-13 10:30:10.000000000 +0100 +++ live/liveMedia/MediaSession.cpp 2008-12-19 11:57:33.000000000 +0100 @@ -599,7 +599,8 @@ struct in_addr tempAddr; tempAddr.s_addr = connectionEndpointAddress(); // This could get changed later, as a result of a RTSP "SETUP" - while (1) { + unsigned int numTries; + for (numTries = 0; numTries < 20; numTries++) { unsigned short rtpPortNum = fClientPortNum&~1; if (isSSM()) { fRTPSocket = new Groupsock(env(), tempAddr, fSourceFilterAddr, @@ -608,7 +609,10 @@ fRTPSocket = new Groupsock(env(), tempAddr, rtpPortNum, 255); } if (fRTPSocket == NULL) { - env().setResultMsg("Failed to create RTP socket"); + if (fClientPortNum && numTries > 0) { + fClientPortNum += 2; + continue; + } break; } @@ -627,8 +631,11 @@ // Try again: delete oldGroupsock; oldGroupsock = fRTPSocket; - fClientPortNum = 0; + if (numTries < 10) { + fClientPortNum = 0; + } } + env().setResultMsg("Failed to create RTP socket"); delete oldGroupsock; if (!success) break; ---------- Regards, -- Pierre Ynard "Une ?me dans un corps, c'est comme un dessin sur une feuille de papier." From kris.delbarba at lextech.com Fri Dec 19 16:40:52 2008 From: kris.delbarba at lextech.com (Kris DelBarba) Date: Fri, 19 Dec 2008 18:40:52 -0600 Subject: [Live-devel] Decoding a jpeg from RTP packet Message-ID: <0B5B9E5C-4982-4592-A8E0-03E40B5085F4@lextech.com> I'm currently trying to locate the portion of the code that reassembles the jpeg from the RTP packets. Where is this done in live555? I'm having difficulty finding all the pieces, like where the different headers are stripped off and the data is put together. Looking over the RFCs is a bit convoluted. Thanks, Kris DelBarba From finlayson at live555.com Fri Dec 19 17:35:49 2008 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 19 Dec 2008 17:35:49 -0800 Subject: [Live-devel] Decoding a jpeg from RTP packet In-Reply-To: <0B5B9E5C-4982-4592-A8E0-03E40B5085F4@lextech.com> References: <0B5B9E5C-4982-4592-A8E0-03E40B5085F4@lextech.com> Message-ID: >I'm currently trying to locate the portion of the code that >reassembles the jpeg from the RTP packets. Where is this done in >live555? The "JPEGVideoRTPSource" class. -- Ross Finlayson Live Networks, Inc. 
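(For orientation when reading that class: the packet layout it undoes is the one from RFC 2435. Every RTP/JPEG payload starts with a small main header, followed when indicated by a restart-marker header and, in the first fragment only, by quantization tables; the receiver strips these, concatenates the fragments in fragment-offset order, and reconstructs a complete JPEG header for the reassembled frame, with the RTP marker bit flagging the last fragment of each picture. A sketch of that main header, not the library's own declaration:)

#include <stdint.h>

// Main header at the start of every RTP/JPEG payload (RFC 2435, section 3.1).
struct JpegRtpMainHeader {
  uint8_t typeSpecific;    // usually 0
  uint8_t fragOffsetHigh;  // 24-bit byte offset of this fragment
  uint8_t fragOffsetMid;   //   within the complete JPEG frame
  uint8_t fragOffsetLow;
  uint8_t type;            // 0/1 = baseline JPEG; 64-127 = a restart-marker header follows
  uint8_t q;               // values >= 128 mean quantization tables follow (first fragment only)
  uint8_t width;           // frame width  in 8-pixel units
  uint8_t height;          // frame height in 8-pixel units
};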
http://www.live555.com/ From finlayson at live555.com Fri Dec 19 22:55:27 2008 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 19 Dec 2008 22:55:27 -0800 Subject: [Live-devel] [PATCH] Infinite loop in source port selection on WinCE In-Reply-To: <20081219142611.GA5746@via.ecp.fr> References: <20081219142611.GA5746@via.ecp.fr> Message-ID: Thanks for the suggestion. I have now installed a new version (2008.12.20) of the code that includes this change. The change was a bit different from the patch that you suggested, so please check to make sure that it works OK for you. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From tchristensen at nordija.com Sat Dec 20 23:16:15 2008 From: tchristensen at nordija.com (Thomas Christensen) Date: Sun, 21 Dec 2008 08:16:15 +0100 Subject: [Live-devel] H.264 and QuickTime client In-Reply-To: References: Message-ID: <86AF8870-9A8F-4126-AC18-F87E49EFD50D@nordija.com> Hi Maxim Did you find a solution to the problem you described below? And is it possible for you to share the implementation you have done? Cheers Thomas Den 25/07/2008 kl. 16.16 skrev Maxim Petrov: > Hi Gents. > > The topic of this message is little bit off-topic, but because it's > related to live555 software too I think I can describe my problem > here. > > We have implemented rtsp server using live555 framework. Also we are > using libx264 as encoder. Open source clients like VLC and mplayer can > get and play H.264 stream w/o any problem. So I think our > implementation > of H264VideoStreamFramer is correct. > > However QuickTime(7.4.5) is seems different story. RTSP negotiation > passed great, after PLAY command our server are sending stream to > QuickTime, and I even see control bar with progress of streaming. > But no > video displayed. Even if I click on "pause" button, PAUSE command is > sending to our server... As I understand there is no problem with > network level, but for some reason QuickTime cannot decode stream. > > Of course I do not ask how to solve my problem. I just wondering if > somehow had experience with QuickTime client and have problems like > we? > > I've just subscribed to quicktime-users mail list and will send > questions there too... > > Another question: anybody know about another (may be proprietary) > clients (except vlc, mplayer, quicktime) which can play H.264 over > RTSP/RTP. > > -- > Bye, Maxim. > > We can solve any problem by introducing an extra level of indirection. > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel From amit.yedidia at elbitsystems.com Sat Dec 20 23:33:56 2008 From: amit.yedidia at elbitsystems.com (Yedidia Amit) Date: Sun, 21 Dec 2008 09:33:56 +0200 Subject: [Live-devel] Multiplexing RTP Session In-Reply-To: Message-ID: yes. using Mpeg2TS (Transport stream) Regards, Amit Yedidia Elbit System Ltd. Email: amit.yedidia at elbitsystems.com Tel: 972-4-8318905 ---------------------------------------------------------- ________________________________ From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Gabriele De Luca Sent: Friday, December 19, 2008 1:21 AM To: live-devel Subject: [Live-devel] Multiplexing RTP Session Importance: High It's possible multiplexing audio and video using only a port (and then only a session)? Thanks in advance The information in this e-mail transmission contains proprietary and business sensitive information. 
Unauthorized interception of this e-mail may constitute a violation of law. If you are not the intended recipient, you are hereby notified that any review, dissemination, distribution or duplication of this communication is strictly prohibited. You are also asked to contact the sender by reply email and immediately destroy all copies of the original message. -------------- next part -------------- An HTML attachment was scrubbed... URL: From etienne.bomcke at uclouvain.be Sun Dec 21 04:12:32 2008 From: etienne.bomcke at uclouvain.be (=?ISO-8859-1?Q?Etienne_B=F6mcke?=) Date: Sun, 21 Dec 2008 13:12:32 +0100 Subject: [Live-devel] H.264 and QuickTime client In-Reply-To: <86AF8870-9A8F-4126-AC18-F87E49EFD50D@nordija.com> References: <86AF8870-9A8F-4126-AC18-F87E49EFD50D@nordija.com> Message-ID: <7B59BDDE-35D0-4124-9E2C-E7469B012811@uclouvain.be> Hi, When you stream video sequence to Quicktime, you have to make sure that they're properly hinted. The hint track is an additional information track in the file telling Quicktime how to decode it. You can add a hinting track to an H.264 encoded sequence with mp4box, which is part of the GPAC project (http://gpac.sourceforge.net/). I did some work with Live555, x264 and the Quicktime client, if you need additional info I'll be happy to help. Cheers, Etienne On 21 Dec 2008, at 08:16, Thomas Christensen wrote: > Hi Maxim > > Did you find a solution to the problem you described below? > > And is it possible for you to share the implementation you have done? > > Cheers > > Thomas > > Den 25/07/2008 kl. 16.16 skrev Maxim Petrov: > >> Hi Gents. >> >> The topic of this message is little bit off-topic, but because it's >> related to live555 software too I think I can describe my problem >> here. >> >> We have implemented rtsp server using live555 framework. Also we are >> using libx264 as encoder. Open source clients like VLC and mplayer >> can >> get and play H.264 stream w/o any problem. So I think our >> implementation >> of H264VideoStreamFramer is correct. >> >> However QuickTime(7.4.5) is seems different story. RTSP negotiation >> passed great, after PLAY command our server are sending stream to >> QuickTime, and I even see control bar with progress of streaming. >> But no >> video displayed. Even if I click on "pause" button, PAUSE command is >> sending to our server... As I understand there is no problem with >> network level, but for some reason QuickTime cannot decode stream. >> >> Of course I do not ask how to solve my problem. I just wondering if >> somehow had experience with QuickTime client and have problems like >> we? >> >> I've just subscribed to quicktime-users mail list and will send >> questions there too... >> >> Another question: anybody know about another (may be proprietary) >> clients (except vlc, mplayer, quicktime) which can play H.264 over >> RTSP/RTP. >> >> -- >> Bye, Maxim. >> >> We can solve any problem by introducing an extra level of >> indirection. >> _______________________________________________ >> live-devel mailing list >> live-devel at lists.live555.com >> http://lists.live555.com/mailman/listinfo/live-devel > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -- Etienne B?mcke Laboratoire de T?l?communications et T?l?d?tections Universit? 
Catholique de Louvain B?timent Stevin - Place du Levant, 2 B-1348 Louvain-la-Neuve e-mail : etienne.bomcke at uclouvain.be tel : +32 10 47 85 51 From finlayson at live555.com Sun Dec 21 05:44:43 2008 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 21 Dec 2008 05:44:43 -0800 Subject: [Live-devel] H.264 and QuickTime client In-Reply-To: <7B59BDDE-35D0-4124-9E2C-E7469B012811@uclouvain.be> References: <86AF8870-9A8F-4126-AC18-F87E49EFD50D@nordija.com> <7B59BDDE-35D0-4124-9E2C-E7469B012811@uclouvain.be> Message-ID: >When you stream video sequence to Quicktime, you have to make sure >that they're properly hinted. 'Hinting' is a hack that applies only to files streamed by Apple's 'Darwin Streaming Server', not by other RTSP servers (including ours). This discussion is therefore off-topic for this mailing list. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From asnow at pathfindertv.net Sun Dec 21 07:18:41 2008 From: asnow at pathfindertv.net (Austin Snow) Date: Sun, 21 Dec 2008 10:18:41 -0500 Subject: [Live-devel] QTKit QTCaptureSession as a live streaming source Message-ID: Hello, I?m trying to use (connect) Apples QTKit (QTCaptureSession) to the input of the testMPEG4VideoToDarwin sample program so I can stream live events from a video source. I have testMPEG4VideoToDarwin connecting to my darwin stream server but I cannot figure out how to get the sampleBuffer from QTKit into the file input of testMPEG4VideoToDarwin. Do I need to create a devicesource or can I do it through sdtout/sdtin? Also for non-live events, I?m recording my QTKit (QTCaptureSession) to a file but I cannot get testMPEG4VideoToDarwin to use my mov file as a source to ByteStreamFileSource I get ?Unable to open file "sample_iTunes.mov" as a byte-stream file source?, test.m4e file works fine. If anyone has information on this it would be great. Thanks Austin From finlayson at live555.com Sun Dec 21 10:36:34 2008 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 21 Dec 2008 10:36:34 -0800 Subject: [Live-devel] QTKit QTCaptureSession as a live streaming source In-Reply-To: References: Message-ID: >Also for non-live events, I'm recording my QTKit (QTCaptureSession) >to a file but I cannot get testMPEG4VideoToDarwin to use my mov file >as a source to ByteStreamFileSource We don't (currently) support the reading of ".mov" or ".mp4" format files. To be able to read MPEG-4 video data, it *must* be in raw video Elementary Stream form - i.e., a sequence of MPEG-4 video frames. If your input data is a discrete sequence of MPEG-4 video frames - rather than a byte stream - then you should feed it into a "MPEG4VideoStreamDiscreteFramer", rather than a "MPEG4VideoStreamFramer". Finally, a reminder, once again, that we have our own RTSP server implementation (as demonstrated by the "testMPEG4VideoStreamer" application (for multicast), or the "testOnDemandRTSPServer" appliication (for unicast). You do *not* need to use a separate 'Darwin Streaming Server', and we no longer provide support for the "test*toDarwin" demo applications. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From lennart.gilander at methafour.com Sun Dec 21 10:43:33 2008 From: lennart.gilander at methafour.com (Lennart Gilander) Date: Sun, 21 Dec 2008 19:43:33 +0100 Subject: [Live-devel] Unable to determine ip address Message-ID: Are trying to get live555 standard configurations and compiled from source code working on a FC6 machine with two network cards but the streamer seems confused about the ip address. 
Have done various tests, this is briefly what I get: -> The response when running testMPEG2TransportStreamer is: Unable to determine our source address: This computer has an invalid IP address: 0x0 Unable to determine our source address: This computer has an invalid IP address: 0x0 Beginning streaming... Beginning to read from file... Please note that the first line really comes twice. Is that because there are two network cards? -> Starting the live555MediaServer I get: LIVE555 Media Server version 0.20 (LIVE555 Streaming Media library version 2008.12.19). Play streams from this server using the URL rtsp://0.0.0.0/ where is a file present in the current directory. This seems to indicate that the streamer can't find an ip-address (a single ip address?) -> Executing the command 'ip addr' gives: 1: lo: mtu 16436 qdisc noqueue link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo inet6 ::1/128 scope host valid_lft forever preferred_lft forever 2: eth1: mtu 1500 qdisc pfifo_fast qlen 100 link/ether 00:30:05:5a:66:1c brd ff:ff:ff:ff:ff:ff inet 10.10.99.215/24 brd 10.10.99.255 scope global eth1 inet6 fe80::230:5ff:fe5a:661c/64 scope link valid_lft forever preferred_lft forever 3: eth0: mtu 1500 qdisc pfifo_fast qlen 1000 link/ether 00:40:05:a4:91:08 brd ff:ff:ff:ff:ff:ff inet 10.10.60.215/24 brd 10.10.60.255 scope global eth0 inet6 fe80::240:5ff:fea4:9108/64 scope link valid_lft forever preferred_lft forever -> Having read an earlier response I also checked the multicast setup, using 'netstat -nr' I get: Kernel IP routing table Destination Gateway Genmask Flags MSS Window irtt Iface 10.10.0.0 10.10.99.1 255.255.255.0 UG 0 0 0 eth1 10.10.99.0 0.0.0.0 255.255.255.0 U 0 0 0 eth1 10.10.60.0 0.0.0.0 255.255.255.0 U 0 0 0 eth0 169.254.0.0 0.0.0.0 255.255.0.0 U 0 0 0 eth0 224.0.0.0 0.0.0.0 255.0.0.0 U 0 0 0 eth0 232.0.0.0 0.0.0.0 255.0.0.0 U 0 0 0 eth0 0.0.0.0 10.10.99.1 0.0.0.0 UG 0 0 0 eth1 So the multicast route (224.0.0.0/255.0.0.0' is really set up for eth0. Would really need two (or perhaps even three) network cards in this machine if at all possible, so how do I get the streamer to select the preferred eth0 interface? Can't seem to find an answer anywhere in the FAQ. Regards / Lennart From finlayson at live555.com Sun Dec 21 14:02:34 2008 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 21 Dec 2008 14:02:34 -0800 Subject: [Live-devel] Unable to determine ip address In-Reply-To: References: Message-ID: From your description, I couldn't tell what's going wrong (it *should* be working OK), unfortunately. So you're going to have to work though the code for the "ourIPAddress()" function (in "groupsock/GroupsockHelper.cpp") to figure out why that code is not working for you. (Please let us know what you find.) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From lennart.gilander at methafour.com Mon Dec 22 06:23:00 2008 From: lennart.gilander at methafour.com (Lennart Gilander) Date: Mon, 22 Dec 2008 15:23:00 +0100 Subject: [Live-devel] Unable to determine ip address References: Message-ID: Ross, thanks for the prompt reply, got it to work doing a quick-and-dirty: The multicast address defined when testing for multicast reply was in the 228.x.x.x range while the one mentioned in an earlier reply was in the 224.x.x.x range. I simply changed the code to cover to my range. It is still unclear why it did not find the machine IP thru the lookup, using 'host ' gives the correct ip. Strange. 
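On the question above of making the streamer prefer the eth0 interface: a knob that is often used on multi-homed hosts (not something discussed in this thread, so treat it as a separate suggestion) is the pair of globals exported by the groupsock library. A minimal sketch, using the eth0 address from the 'ip addr' listing above purely as an example:

#include "GroupsockHelper.hh"
// Set these before any Groupsock (and hence any server or streamer object) is created:
ReceivingInterfaceAddr = our_inet_addr("10.10.60.215");  // bind receives to eth0
SendingInterfaceAddr   = our_inet_addr("10.10.60.215");  // send multicast/unicast via eth0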
Could possibly be because the test server also acts as a dhcp server. Got my system working so I'm happy. Best Regards Lennart From Michael.Mikhlin at elbitsystems.com Mon Dec 22 07:33:03 2008 From: Michael.Mikhlin at elbitsystems.com (Mikhlin Michael) Date: Mon, 22 Dec 2008 17:33:03 +0200 Subject: [Live-devel] Stopping RTSP session from server side Message-ID: Hi, Is there any standart way to stop a RTSP session from server side? In RFC 2326 (RTSP) is written that TEARDOWN message can only be sent from client to server. Is there any other way, or closing the TCP socket from server side is enough? Thanks in advance, Michael Mikhlin The information in this e-mail transmission contains proprietary and business sensitive information. Unauthorized interception of this e-mail may constitute a violation of law. If you are not the intended recipient, you are hereby notified that any review, dissemination, distribution or duplication of this communication is strictly prohibited. You are also asked to contact the sender by reply email and immediately destroy all copies of the original message. -------------- next part -------------- An HTML attachment was scrubbed... URL: From asnow at pathfindertv.net Mon Dec 22 08:20:10 2008 From: asnow at pathfindertv.net (Austin Snow) Date: Mon, 22 Dec 2008 11:20:10 -0500 Subject: [Live-devel] QTKit QTCaptureSession as a live streaming source In-Reply-To: References: Message-ID: Thanks Ross, Have you connected the QTCaptureSession output to the input of the testMPEG4VideoStreamer? If so can you provide some code snippet? Also, is the DarwinInjector still supported? Austin On Dec 21, 2008, at 1:36 PM, Ross Finlayson wrote: >> Also for non-live events, I'm recording my QTKit (QTCaptureSession) >> to a file but I cannot get testMPEG4VideoToDarwin to use my mov >> file as a source to ByteStreamFileSource > > We don't (currently) support the reading of ".mov" or ".mp4" format > files. To be able to read MPEG-4 video data, it *must* be in raw > video Elementary Stream form - i.e., a sequence of MPEG-4 video > frames. > > If your input data is a discrete sequence of MPEG-4 video frames - > rather than a byte stream - then you should feed it into a > "MPEG4VideoStreamDiscreteFramer", rather than a > "MPEG4VideoStreamFramer". > > Finally, a reminder, once again, that we have our own RTSP server > implementation (as demonstrated by the "testMPEG4VideoStreamer" > application (for multicast), or the "testOnDemandRTSPServer" > appliication (for unicast). You do *not* need to use a separate > 'Darwin Streaming Server', and we no longer provide support for the > "test*toDarwin" demo applications. > -- > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel From finlayson at live555.com Mon Dec 22 14:36:18 2008 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 22 Dec 2008 14:36:18 -0800 Subject: [Live-devel] Stopping RTSP session from server side In-Reply-To: References: Message-ID: >Is there any standart way to stop a RTSP session from server side? > >In RFC 2326 (RTSP) is written that TEARDOWN message can only be sent >from client to server. > >Is there any other way, or closing the TCP socket from server side is enough? You can also call "RTSPServer::removeServerMediaSession()". 
This won't stop any existing stream (it will run until completion), but it will prevent any other clients from requesting the same data. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Dec 22 14:42:24 2008 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 22 Dec 2008 14:42:24 -0800 Subject: [Live-devel] Unable to determine ip address In-Reply-To: References: Message-ID: >The multicast address defined when testing for multicast reply was in the >228.x.x.x range while the one mentioned in an earlier reply was in the >224.x.x.x range. I simply changed the code to cover to my range. No, don't change the supplied code; if you do, we won't be able to support you in the future. A 228.x.x.x IP multicast address should still work for you, because you have a route for 224.0.0.0/8. If, however, it's not working for you, you should figure out why (otherwise lots of other multicast stuff won't work for you either). -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Mon Dec 22 14:46:14 2008 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 22 Dec 2008 14:46:14 -0800 Subject: [Live-devel] QTKit QTCaptureSession as a live streaming source In-Reply-To: References: Message-ID: >Thanks Ross, >Have you connected the QTCaptureSession output to the input of the >testMPEG4VideoStreamer? If so can you provide some code snippet? See . This will work if and only if the output from your "QTCaptureSession" is a MPEG-4 Video Elementary Stream. > >Also, is the DarwinInjector still supported? We still include it in the supplied code, but we don't support it, because now that we have our own RTSP server implementation, a separate 'Darwin' server is no longer needed. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From Michael.Mikhlin at elbitsystems.com Mon Dec 22 21:53:16 2008 From: Michael.Mikhlin at elbitsystems.com (Mikhlin Michael) Date: Tue, 23 Dec 2008 07:53:16 +0200 Subject: [Live-devel] Stopping RTSP session from server side In-Reply-To: Message-ID: Thanks for quick reply. I'm looking for a gracefull shutdown, so that the client will know that the session is closed. ________________________________ From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Tuesday, December 23, 2008 12:36 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Stopping RTSP session from server side Is there any standart way to stop a RTSP session from server side? In RFC 2326 (RTSP) is written that TEARDOWN message can only be sent from client to server. Is there any other way, or closing the TCP socket from server side is enough? You can also call "RTSPServer::removeServerMediaSession()". This won't stop any existing stream (it will run until completion), but it will prevent any other clients from requesting the same data. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ The information in this e-mail transmission contains proprietary and business sensitive information. Unauthorized interception of this e-mail may constitute a violation of law. If you are not the intended recipient, you are hereby notified that any review, dissemination, distribution or duplication of this communication is strictly prohibited. You are also asked to contact the sender by reply email and immediately destroy all copies of the original message. 
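As a concrete illustration of the removeServerMediaSession() suggestion above, a server application might withdraw a stream roughly like this (rtspServer and the stream name "myStream" are placeholders, not from the original code):

ServerMediaSession* sms = rtspServer->lookupServerMediaSession("myStream");
if (sms != NULL) {
  rtspServer->removeServerMediaSession(sms);
  // New DESCRIBE/SETUP requests for "myStream" now fail; clients that are already
  // playing keep receiving data until they send TEARDOWN or the stream ends.
}

Note that this is only server-side bookkeeping; it does not notify connected clients, which is exactly the "graceful shutdown" gap being discussed in this thread.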
-------------- next part -------------- An HTML attachment was scrubbed... URL: From paul at packetship.com Tue Dec 23 03:13:08 2008 From: paul at packetship.com (Paul Clark) Date: Tue, 23 Dec 2008 11:13:08 +0000 Subject: [Live-devel] Stopping RTSP session from server side In-Reply-To: References: Message-ID: <4950C7C4.5000605@packetship.com> Mikhlin Michael wrote: > Thanks for quick reply. > > I'm looking for a gracefull shutdown, so that the client will know > that the session is closed. There is the server-to-client ANNOUNCE method, something like this: ANNOUNCE rtsp://server/asset RTSP/1.0 CSeq: 1 Session: 1234 Notice: 2101 End of Stream I think use of ANNOUNCE for this, and the 'Notice' header, come from a pre-RTSP-1.1 draft by Sheedy (nCube): http://www-rn.informatik.uni-bremen.de/ietf/mmusic/47/id/draft-sheedy-mmusic-rtsp-ext-00.txt (sec. 5.1) There are also some attempts to reintroduce it in RTSP/2.0 - e.g. http://tools.ietf.org/html/draft-stiemerling-rtsp-announce-01 Several set-top-boxes (e.g Amino) and commercial servers (including ours) support it. I don't know if Live555 supports ANNOUNCE, though - Ross? Alternatively, we've found that VLC will accept an empty RTP packet as indicating end-of-stream, so we do that too. Non-standard, but not entirely unreasonable, I guess. Hope this helps Paul -- Paul Clark Packet Ship Technologies Limited http://www.packetship.com From finlayson at live555.com Tue Dec 23 04:46:56 2008 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 23 Dec 2008 04:46:56 -0800 Subject: [Live-devel] Stopping RTSP session from server side In-Reply-To: <4950C7C4.5000605@packetship.com> References: <4950C7C4.5000605@packetship.com> Message-ID: > I don't know if Live555 supports ANNOUNCE, though ' No, not when sent server->client. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From asnow at pathfindertv.net Tue Dec 23 15:49:34 2008 From: asnow at pathfindertv.net (Austin Snow) Date: Tue, 23 Dec 2008 18:49:34 -0500 Subject: [Live-devel] QTKit QTCaptureSession as a live streaming source In-Reply-To: References: Message-ID: <8DE4A2F8-7D90-44D3-88EA-56283926C5FD@pathfindertv.net> Thanks again Ross. I believe we have the raw frames (data) but it can be audio and/or video. Can the elementary stream for the framedsource consist of audio and video information? Austin On Dec 22, 2008, at 5:46 PM, Ross Finlayson wrote: >> Thanks Ross, >> Have you connected the QTCaptureSession output to the input of the >> testMPEG4VideoStreamer? If so can you provide some code snippet? > > See . This > will work if and only if the output from your "QTCaptureSession" is > a MPEG-4 Video Elementary Stream. > >> >> Also, is the DarwinInjector still supported? > > We still include it in the supplied code, but we don't support it, > because now that we have our own RTSP server implementation, a > separate 'Darwin' server is no longer needed. > -- > > Ross Finlayson > Live Networks, Inc. 
> http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel From finlayson at live555.com Tue Dec 23 15:56:39 2008 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 23 Dec 2008 15:56:39 -0800 Subject: [Live-devel] QTKit QTCaptureSession as a live streaming source In-Reply-To: <8DE4A2F8-7D90-44D3-88EA-56283926C5FD@pathfindertv.net> References: <8DE4A2F8-7D90-44D3-88EA-56283926C5FD@pathfindertv.net> Message-ID: >I believe we have the raw frames (data) but it can be audio and/or video. > >Can the elementary stream for the framedsource consist of audio and >video information? No! The input to "testMPEG4VideoStreamer" must be raw MPEG-4 *video* only. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From asnow at pathfindertv.net Tue Dec 23 16:11:30 2008 From: asnow at pathfindertv.net (Austin Snow) Date: Tue, 23 Dec 2008 19:11:30 -0500 Subject: [Live-devel] QTKit QTCaptureSession as a live streaming source In-Reply-To: References: <8DE4A2F8-7D90-44D3-88EA-56283926C5FD@pathfindertv.net> Message-ID: <0B0E0EEC-7A18-43FE-BED8-5AF73D239E1D@pathfindertv.net> I was afraid of that. Is there a something we can use for both audio and video streaming? On Dec 23, 2008, at 6:56 PM, Ross Finlayson wrote: >> I believe we have the raw frames (data) but it can be audio and/or >> video. >> >> Can the elementary stream for the framedsource consist of audio and >> video information? > > No! The input to "testMPEG4VideoStreamer" must be raw MPEG-4 > *video* only. > -- > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel From finlayson at live555.com Tue Dec 23 16:20:07 2008 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 23 Dec 2008 16:20:07 -0800 Subject: [Live-devel] QTKit QTCaptureSession as a live streaming source In-Reply-To: <0B0E0EEC-7A18-43FE-BED8-5AF73D239E1D@pathfindertv.net> References: <8DE4A2F8-7D90-44D3-88EA-56283926C5FD@pathfindertv.net> <0B0E0EEC-7A18-43FE-BED8-5AF73D239E1D@pathfindertv.net> Message-ID: >I was afraid of that. Is there a something we can use for both >audio and video streaming? If your input data is (or can be multiplexed into) a MPEG Transport Stream, then you could stream that. Otherwise, you will need to demultiplex your input data into separate audio and video streams. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From asnow at pathfindertv.net Tue Dec 23 16:48:40 2008 From: asnow at pathfindertv.net (Austin Snow) Date: Tue, 23 Dec 2008 19:48:40 -0500 Subject: [Live-devel] QTKit QTCaptureSession as a live streaming source In-Reply-To: References: <8DE4A2F8-7D90-44D3-88EA-56283926C5FD@pathfindertv.net> <0B0E0EEC-7A18-43FE-BED8-5AF73D239E1D@pathfindertv.net> Message-ID: Thank you Ross for the quick response. If our data is separated into audio and video and each is streamed to our darwin stream server, is the presentation time passed along so there is not any lip-sync issues? Would it be two unicast stream at this point or can we put them into one stream? In this case I would think we would have to create our own sdp file for clients connecting to the DSS, correct? Austin On Dec 23, 2008, at 7:20 PM, Ross Finlayson wrote: >> I was afraid of that. Is there a something we can use for both >> audio and video streaming? 
> > If your input data is (or can be multiplexed into) a MPEG Transport > Stream, then you could stream that. Otherwise, you will need to > demultiplex your input data into separate audio and video streams. > -- > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel From zhisun.tech at gmail.com Fri Dec 26 00:11:04 2008 From: zhisun.tech at gmail.com (zhi sun) Date: Fri, 26 Dec 2008 16:11:04 +0800 Subject: [Live-devel] how to detect the access unalign issue when porting live555 to uCOS? Message-ID: <54ced3240812260011w209b6cbdnf45dc2b0e645f114@mail.gmail.com> Ross, I am tring to port the live555 to our embedded system with uCOS support. For now, the MPEG2TransportStream works fine (on PC and embedded device), except for the slow performance when using VLC. I have added our H264 support class, in addition to the H264RTPSink and FUAFragment classes, it works fine on PC. Then I try to run run it on our embeded device (there is no problem to compile it), an memory unalign exception always occurs when first time calling the AfterGettingFrame1 function of our H264StreamFrame class (which inherits from the H264StreamFrame class) ================================================= class H264VideoStreamFramer4LongCircle : public H264VideoStreamFramer ...... void H264VideoStreamFramer4LC::afterGettingFrame1( unsigned frameSize, unsigned numTruncatedBytes, struct timeval presentationTime, unsigned durationInMicroseconds) { //fFrameSize = frameSize; fNumTruncatedBytes = numTruncatedBytes; fPresentationTime = presentationTime; <--- data unalign exception occurs here fDurationInMicroseconds = durationInMicroseconds; ...... ================================================= I have no idea how to detect where the assess unalign coding exists, and how to troubleshooting this type of problem. I think the compiler should take care of the data alignment, and always allocate the the bytes a power of 2 (ie, word). Could you please point me a correct direction about this issue? Happy New Year! -kevin -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Dec 26 05:18:56 2008 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 26 Dec 2008 05:18:56 -0800 Subject: [Live-devel] how to detect the access unalign issue when porting live555 to uCOS? In-Reply-To: <54ced3240812260011w209b6cbdnf45dc2b0e645f114@mail.gmail.com> References: <54ced3240812260011w209b6cbdnf45dc2b0e645f114@mail.gmail.com> Message-ID: > fPresentationTime = presentationTime; <--- data >unalign exception occurs here It looks like you have a buggy compiler. If both "fPresentationTime" and "presentationTime" are declared as "struct timeval", you should not be getting a data alignment error when you try to assign one to the other. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From asnow at pathfindertv.net Fri Dec 26 11:46:35 2008 From: asnow at pathfindertv.net (Austin Snow) Date: Fri, 26 Dec 2008 14:46:35 -0500 Subject: [Live-devel] doEventLoop() Message-ID: <4C0E0E43-4AF2-47BC-BA6F-C1848F417029@pathfindertv.net> Hello All, I have created my own deviceSource.cpp for my live input from QTKit, I call doGetNextFrame with my data and than pass it to deliverFrame. 
I have based my streaming from MPEG4ToVideoStreamer, which blocks at env- >taskScheduler().doEventLoop(); // does not return and does not allow my live input code to continue. It there a way to allow my other code to continue. Thank you Austin -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Dec 26 16:59:13 2008 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 26 Dec 2008 16:59:13 -0800 Subject: [Live-devel] doEventLoop() In-Reply-To: <4C0E0E43-4AF2-47BC-BA6F-C1848F417029@pathfindertv.net> References: <4C0E0E43-4AF2-47BC-BA6F-C1848F417029@pathfindertv.net> Message-ID: >Hello All, >I have created my own deviceSource.cpp for my live input from QTKit, >I call doGetNextFrame with my data and than pass it to deliverFrame. > I have based my streaming from MPEG4ToVideoStreamer, which blocks >at env->taskScheduler().doEventLoop(); // does not return and does >not allow my live input code to continue. It there a way to allow >my other code to continue. "deliverFrame()" should be called *only when* data is available to be delivered to the downstream object. (Your "doGetNextFrame()" implementation should return immediately - without calling "deliverFrame()" - if no data is currently available to be delivered.) This means that the availabllty of new data needs to be recognized/handled as an 'event' by the event loop. Unfortunately there's no one standard way of handling the arrival of new data as an 'event', because it depends on your particular environment (in particular, the nature of your input device). If your input device can be treated as an open file, then it's easy, because you can use the event loop's existing "select()" mechanism, by calling "TaskScheduler:: turnOnBackgroundReadHandling()" (see the comments in "DeviceSource.cpp"). You should do it this way, if you can. If, however, your input device cannot be treated as an open file, then you have to do something else to handle the arrival of new data as an 'event'. Some people use the optional "watchVariable" parameter (to "TaskScheduler::doEventLoop()") for this purpose. (See, for example, . Note, however, that if you use the 'watchVariable' feature, you should also note this point: ) An alternative way to recognize the arrival of new data as an event would be to subclass "TaskScheduler" to implement your own event loop - but that is more difficult. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From asnow at pathfindertv.net Fri Dec 26 17:30:57 2008 From: asnow at pathfindertv.net (Austin Snow) Date: Fri, 26 Dec 2008 20:30:57 -0500 Subject: [Live-devel] doEventLoop() In-Reply-To: References: <4C0E0E43-4AF2-47BC-BA6F-C1848F417029@pathfindertv.net> Message-ID: The call to doGetNextFrame (doGetNextFrame(void *data, int len) in my code) only happens when data is present. Than in (doGetNextFrame(void *data, int len) in my code) function I call deliverFrame(data, len); function passing the data. But the function deliverFrame(data, len); the if statement if (! isCurrentlyAwaitingData()) return; (part of deviceSource.cpp) returns most of the time, so only 1 frame in ~5 make into fTo. In deliverFrame function, which of the following is correct? FramedSource::afterGetting(this); <-only 1 and 5 frames or nextTask() = envir().taskScheduler().scheduleDelayedTask(0, (TaskFunc*)afterGetting, this); <-this gives the error "getNextFrame(): attempting to read more than once at the same time!" 
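To make the intended pattern concrete, here is a minimal sketch of how the DeviceSource.cpp skeleton is meant to be filled in. MyDeviceSource, onEncodedFrame(), and the fBuffered* members are hypothetical names; the sketch also glosses over how the arrival of data is signalled to the event loop (Ross's point above about turnOnBackgroundReadHandling() or the doEventLoop() watch variable) and assumes everything runs in the library's single thread.

// Assumes DeviceSource.hh-style declarations plus <cstring> and <sys/time.h>.
void MyDeviceSource::doGetNextFrame() {
  // Called (indirectly) by the downstream object whenever it wants another frame.
  if (fHaveBufferedFrame) deliverFrame();  // a frame is already waiting: deliver it now
  // Otherwise just return; deliverFrame() will run later, once the device produces data.
}

void MyDeviceSource::onEncodedFrame(unsigned char* data, unsigned len) {
  // Hypothetical hook invoked when the capture code has a new encoded frame.
  fBufferedData = data; fBufferedLen = len; fHaveBufferedFrame = True;
  if (isCurrentlyAwaitingData()) deliverFrame();  // downstream already asked: deliver immediately
}

void MyDeviceSource::deliverFrame() {
  if (!isCurrentlyAwaitingData()) return;  // downstream hasn't asked yet; keep the frame buffered
  unsigned len = fBufferedLen;
  if (len > fMaxSize) {                    // fMaxSize is an *input* set by the caller; never overwrite it
    fNumTruncatedBytes = len - fMaxSize;
    len = fMaxSize;
  }
  memcpy(fTo, fBufferedData, len);
  fFrameSize = len;
  gettimeofday(&fPresentationTime, NULL);
  fHaveBufferedFrame = False;
  FramedSource::afterGetting(this);        // tell the downstream object the frame is ready
}

On the specific question above: calling FramedSource::afterGetting(this) directly from deliverFrame() is the usual choice; the scheduleDelayedTask() variant is generally only needed to avoid deep recursion when frames are delivered synchronously from within doGetNextFrame() itself.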
Thanks Austin On Dec 26, 2008, at 7:59 PM, Ross Finlayson wrote: >> Hello All, >> I have created my own deviceSource.cpp for my live input from >> QTKit, I call doGetNextFrame with my data and than pass it to >> deliverFrame. I have based my streaming from MPEG4ToVideoStreamer, >> which blocks at env->taskScheduler().doEventLoop(); // does not >> return and does not allow my live input code to continue. It there >> a way to allow my other code to continue. > > "deliverFrame()" should be called *only when* data is available to > be delivered to the downstream object. (Your "doGetNextFrame()" > implementation should return immediately - without calling > "deliverFrame()" - if no data is currently available to be delivered.) > > This means that the availabllty of new data needs to be recognized/ > handled as an 'event' by the event loop. > > Unfortunately there's no one standard way of handling the arrival of > new data as an 'event', because it depends on your particular > environment (in particular, the nature of your input device). If > your input device can be treated as an open file, then it's easy, > because you can use the event loop's existing "select()" mechanism, > by calling "TaskScheduler:: turnOnBackgroundReadHandling()" (see the > comments in "DeviceSource.cpp"). You should do it this way, if you > can. > > If, however, your input device cannot be treated as an open file, > then you have to do something else to handle the arrival of new data > as an 'event'. Some people use the optional "watchVariable" > parameter (to "TaskScheduler::doEventLoop()") for this purpose. > (See, for example, >. Note, however, that if you use the 'watchVariable' feature, you > should also note this point: >) > > An alternative way to recognize the arrival of new data as an event > would be to subclass "TaskScheduler" to implement your own event > loop - but that is more difficult. > -- > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Dec 26 17:38:32 2008 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 26 Dec 2008 17:38:32 -0800 Subject: [Live-devel] doEventLoop() In-Reply-To: References: <4C0E0E43-4AF2-47BC-BA6F-C1848F417029@pathfindertv.net> Message-ID: >The call to doGetNextFrame (doGetNextFrame(void *data, int len) in >my code) only happens when data is present. No, that's not true! The "doGetNextFrame()" (virtual) function is called by the "FramedSource::getNextFrame()" function, which is called by the downstream object whenever it wants new data. I.e., it's the downstream object that decides when that function gets called. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From asnow at pathfindertv.net Sun Dec 28 16:39:53 2008 From: asnow at pathfindertv.net (Austin Snow) Date: Sun, 28 Dec 2008 19:39:53 -0500 Subject: [Live-devel] doEventLoop() In-Reply-To: References: <4C0E0E43-4AF2-47BC-BA6F-C1848F417029@pathfindertv.net> Message-ID: <675AA6B0-78D1-4CA2-98DB-82962144979C@pathfindertv.net> Thanks Ross, So there is not any way to push data (frame data) into a deivceSource? I have modified DeviceSource.cpp deliverFrame to deliverFrame(void *data, int len) to pass data into it. 
And in the function follows; if (!isCurrentlyAwaitingData()) return; // we're not ready for the data yet // Deliver the data here: fMaxSize = 1048576; memcpy(fTo, data, len); printf("Frame sent, len=%d\n", len);\\lets me know when a frame is accepted // After delivering the data, inform the reader that it is now available: FramedSource::afterGetting(this); The isCurrentlyAwaitingData returns most of the time, meaning only about every 1.3 seconds a frame is accepted. My input is defined and used as below; MPEG4VideoStreamDiscreteFramer* videoSource; . . DeviceParameters params; fileSource = DeviceSource::createNew(*env, params); . . FramedSource* videoES = fileSource; videoSource = MPEG4VideoStreamDiscreteFramer::createNew(*env, videoES); // Finally, start playing: *env << "Beginning to read from file...\n"; videoSink->startPlaying(*videoSource, afterPlaying, videoSink); . . The rest of the code is based in testMPEG4VideoStreamer.cpp Thanks Austin On Dec 26, 2008, at 8:38 PM, Ross Finlayson wrote: >> The call to doGetNextFrame (doGetNextFrame(void *data, int len) in >> my code) only happens when data is present. > > No, that's not true! The "doGetNextFrame()" (virtual) function is > called by the "FramedSource::getNextFrame()" function, which is > called by the downstream object whenever it wants new data. I.e., > it's the downstream object that decides when that function gets > called. > > -- > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sun Dec 28 18:26:10 2008 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 28 Dec 2008 18:26:10 -0800 Subject: [Live-devel] doEventLoop() In-Reply-To: <675AA6B0-78D1-4CA2-98DB-82962144979C@pathfindertv.net> References: <4C0E0E43-4AF2-47BC-BA6F-C1848F417029@pathfindertv.net> <675AA6B0-78D1-4CA2-98DB-82962144979C@pathfindertv.net> Message-ID: >So there is not any way to push data (frame data) into a deivceSource? Once again (and for the last time): A "LIVE555 Streaming Media" application runs an event loop, so the only way to 'push' data to it is to signal the availablity of new data as an event, and thereby have a handler function (which will read the data) called from the event loop. > // Deliver the data here: > fMaxSize = 1048576; No! "fMaxSize" is an *in* parameter. Its value is passed to you by the downstream object. It's a value that you *check* not one that you *set*. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From asnow at pathfindertv.net Sun Dec 28 18:46:40 2008 From: asnow at pathfindertv.net (Austin Snow) Date: Sun, 28 Dec 2008 21:46:40 -0500 Subject: [Live-devel] doEventLoop() In-Reply-To: References: <4C0E0E43-4AF2-47BC-BA6F-C1848F417029@pathfindertv.net> <675AA6B0-78D1-4CA2-98DB-82962144979C@pathfindertv.net> Message-ID: Thanks Ross, got it. But I'm this only getting asked for new data ~ every 1.3. Any ideas? Thanks Austin On Dec 28, 2008, at 9:26 PM, Ross Finlayson wrote: >> So there is not any way to push data (frame data) into a >> deivceSource? 
> > Once again (and for the last time): A "LIVE555 Streaming Media" > application runs an event loop, so the only way to 'push' data to it > is to signal the availablity of new data as an event, and thereby > have a handler function (which will read the data) called from the > event loop. > >> // Deliver the data here: >> fMaxSize = 1048576; > > No! "fMaxSize" is an *in* parameter. Its value is passed to you by > the downstream object. It's a value that you *check* not one that > you *set*. > > -- > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel From zhisun.tech at gmail.com Sun Dec 28 19:49:31 2008 From: zhisun.tech at gmail.com (zhi sun) Date: Mon, 29 Dec 2008 11:49:31 +0800 Subject: [Live-devel] how to detect the access unalign issue when porting live555 to uCOS? In-Reply-To: References: <54ced3240812260011w209b6cbdnf45dc2b0e645f114@mail.gmail.com> Message-ID: <54ced3240812281949n788a84a7x1abdda5356230627@mail.gmail.com> Thank you for your input, Ross, I am checking this point .... I have post another issue about the select() question. -kevin 2008/12/26 Ross Finlayson > fPresentationTime = presentationTime; <--- data unalign >> exception occurs here >> > > It looks like you have a buggy compiler. If both "fPresentationTime" and > "presentationTime" are declared as "struct timeval", you should not be > getting a data alignment error when you try to assign one to the other. > -- > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > -------------- next part -------------- An HTML attachment was scrubbed... URL: From zhisun.tech at gmail.com Sun Dec 28 20:12:48 2008 From: zhisun.tech at gmail.com (zhi sun) Date: Mon, 29 Dec 2008 12:12:48 +0800 Subject: [Live-devel] About the tv_timeToDelay param in the select() function Message-ID: <54ced3240812282012v4fbddb10l42670389afc73f28@mail.gmail.com> Hi Ross, I have another question about the tv_timeToDelay param in the select() function, in BasicTaskScheduler class. By default, I notified the timeout value is 0, that means the select() never stops for waiting an available signals (connect/read socket, read file). I am wondering why not to set the timeout to NULL, to let the select() drivern by the signals. It takes less CPU cycle. On linux, I think it is OK to set the timeout to NULL. On windows, the select() cann't detect the read file signal, so can only read the file once, and be hold there until getting RTCP request from the client. Is it necessary to set the timeout value depends on different OS. On our embeded device, there is no both the build-in select() function support. So the performance is very poor. Ross, could you please point me an correct direction to write an custom select() function? The OS is uCOS, I cannot make change to it (duc to licence). But I may be able to make change the low BIOS level (on which, the uCOS runs). I want to see if it is possible to write an my own select() function based on the BIOS (which interacts with the hardware board directly), or is there a better way. I have notified that there are some similar posts, but the performance is criticle for our embeded device, an custom select() function based on the application level may not address my issue. 
Thanks for any suggestion! Thanks, -kevin -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sun Dec 28 20:33:22 2008 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 28 Dec 2008 20:33:22 -0800 Subject: [Live-devel] About the tv_timeToDelay param in the select() function In-Reply-To: <54ced3240812282012v4fbddb10l42670389afc73f28@mail.gmail.com> References: <54ced3240812282012v4fbddb10l42670389afc73f28@mail.gmail.com> Message-ID: >I have another question about the tv_timeToDelay param in the >select() function, in BasicTaskScheduler class. > >By default, I notified the timeout value is 0 No, that's not true. By default, the 'timeout' parameter to the "select()" call will be the time until the next 'delayed task' comes due. Note the statement: DelayInterval const& timeToDelay = fDelayQueue.timeToNextAlarm(); If your current "select()" function is broken, then you will either need to fix it (so that it has the standard semantics expected for the "select()" function), or else write a new event loop - i.e., your own subclass ot "TaskScheduler", and use that instead of "BasicTaskScheduler". -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From morais.levi at gmail.com Mon Dec 29 09:49:33 2008 From: morais.levi at gmail.com (Andre Morais) Date: Mon, 29 Dec 2008 17:49:33 +0000 Subject: [Live-devel] create .ts files Message-ID: <273b5af00812290949m3d57105k56c8f7f7e2d6dbe@mail.gmail.com> hi to you all, does anybody know the best way to convert a mpeg file to a .ts? Thanks! Regards -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Dec 29 19:35:57 2008 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 29 Dec 2008 19:35:57 -0800 Subject: [Live-devel] create .ts files In-Reply-To: <273b5af00812290949m3d57105k56c8f7f7e2d6dbe@mail.gmail.com> References: <273b5af00812290949m3d57105k56c8f7f7e2d6dbe@mail.gmail.com> Message-ID: >does anybody know the best way to convert a mpeg file to a .ts? Yes, run the "testMPEG1or2ProgramToTransportStream" demo application (in the "testProgs" directory). This converts a MPEG Program Stream file (named "in.mpg") to a MPEG Transport Stream file (named "out.ts"). -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From morais.levi at gmail.com Tue Dec 30 02:58:16 2008 From: morais.levi at gmail.com (Andre Morais) Date: Tue, 30 Dec 2008 10:58:16 +0000 Subject: [Live-devel] create .ts files In-Reply-To: References: <273b5af00812290949m3d57105k56c8f7f7e2d6dbe@mail.gmail.com> Message-ID: <273b5af00812300258q7900465bk4f5e65fd5fe18805@mail.gmail.com> Thanks! the problem is that I don't know how to do it on windows. On Tue, Dec 30, 2008 at 3:35 AM, Ross Finlayson wrote: > does anybody know the best way to convert a mpeg file to a .ts? >> > > Yes, run the "testMPEG1or2ProgramToTransportStream" demo application (in > the "testProgs" directory). This converts a MPEG Program Stream file (named > "in.mpg") to a MPEG Transport Stream file (named "out.ts"). > -- > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > -------------- next part -------------- An HTML attachment was scrubbed... 
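The demo Ross points to builds on Windows too (via the supplied win32 makefiles); its core is just a short LIVE555 pipeline. The sketch below reconstructs that pipeline from memory, so the shipped testMPEG1or2ProgramToTransportStream.cpp may differ in details:

// in.mpg (MPEG-1/2 Program Stream) -> out.ts (MPEG Transport Stream)
void afterPlaying(void* /*clientData*/) { /* close the sink/source and stop the event loop */ }

ByteStreamFileSource* inputSource = ByteStreamFileSource::createNew(*env, "in.mpg");
MPEG1or2Demux* demux = MPEG1or2Demux::createNew(*env, inputSource);
MPEG1or2DemuxedElementaryStream* pesSource = demux->newRawPESStream(); // all PES packets, undemultiplexed
FramedSource* tsSource = MPEG2TransportStreamFromPESSource::createNew(*env, pesSource);
FileSink* outSink = FileSink::createNew(*env, "out.ts");
outSink->startPlaying(*tsSource, afterPlaying, NULL);
env->taskScheduler().doEventLoop();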
URL: From asnow at pathfindertv.net Wed Dec 31 07:42:25 2008 From: asnow at pathfindertv.net (Austin Snow) Date: Wed, 31 Dec 2008 10:42:25 -0500 Subject: [Live-devel] QTKit QTCaptureSession as a DeviceSource Message-ID: Hello All, I having some difficulty on getting samples out of a QTCaptureSession into MPEG4VideoStreamDiscreteFramer. The DeviceSource::doNextFrame is only being called every 1 to 1.5 seconds, the delay is not in MPEG4VideoStreamDiscreteFramer::doGetNextFrame(). A binary file of the QTCaptureSession output can be seen at the following link; http://www.pathfindertv.net/555/stream-from-qt.bin I think there may be something missing in the bit stream from QT that is causing MPEG4VideoStreamDiscreteFramer not to be able to parse the incoming stream. modulo_time_base and vop_time_increment are always zero. Every call to DeviceSource::deliverFrame the data starts with 000001B6...... the vop start code. There is not any 000001B3xxx000001B6..... (group of vop start code) or 000001B0.... (visual object sequence start code) in the QTCaptureSession output. I tried to use MPEG4VideoStreamFramer but it needs the 000001B0.... (visual object sequence start code) in order to accepted the stream. My input is defined and used as below; MPEG4VideoStreamDiscreteFramer* videoSource; . . DeviceParameters params; fileSource = DeviceSource::createNew(*env, params); . . FramedSource* videoES = fileSource; videoSource = MPEG4VideoStreamDiscreteFramer::createNew(*env, videoES); // Finally, start playing: *env << "Beginning to read from file...\n"; videoSink->startPlaying(*videoSource, afterPlaying, videoSink); . . The rest of the code is based in testMPEG4VideoStreamer.cpp I have modified DeviceSource.cpp deliverFrame to deliverFrame(void *data, int len) to pass data into it. And in the deliverFrame function follows; if (!isCurrentlyAwaitingData()) return; // we're not ready for the data yet // Deliver the data here: if(fMaxSize < len){ fNumTruncatedBytes = len - fMaxSize; len = fMaxSize; printf("Frame size truncated\n"); } fDurationInMicroseconds = dura; fFrameSize = len; memcpy(fTo, data, len); // After delivering the data, inform the reader that it is now available: FramedSource::afterGetting(this); Any help would be great. Thanks Austin -------------- next part -------------- An HTML attachment was scrubbed... URL: From paul at packetship.com Wed Dec 31 09:10:18 2008 From: paul at packetship.com (Paul Clark) Date: Wed, 31 Dec 2008 17:10:18 +0000 Subject: [Live-devel] create .ts files In-Reply-To: <273b5af00812300258q7900465bk4f5e65fd5fe18805@mail.gmail.com> References: <273b5af00812290949m3d57105k56c8f7f7e2d6dbe@mail.gmail.com> <273b5af00812300258q7900465bk4f5e65fd5fe18805@mail.gmail.com> Message-ID: <495BA77A.1070505@packetship.com> Andre Morais wrote: > Thanks! > the problem is that I don't know how to do it on windows. > VLC (www.videolan.org) will do it for you - tick "Stream/Save" in the "Open" dialog, then open Settings and check "File" in the outputs. The default is encapsulation in MPEG-TS without transcoding. 
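Returning to the MPEG-4 start-code question earlier in this digest: a quick way to confirm what the capture session is actually emitting is to scan each delivered buffer for start codes. A small, purely illustrative helper:

#include <cstdio>

// Print every MPEG-4 start code (00 00 01 xx) found in a buffer.
// 0xB0 = visual object sequence start, 0xB3 = group-of-VOPs start, 0xB6 = VOP start.
static void dumpMPEG4StartCodes(const unsigned char* buf, unsigned len) {
  for (unsigned i = 0; i + 3 < len; ++i) {
    if (buf[i] == 0x00 && buf[i+1] == 0x00 && buf[i+2] == 0x01) {
      std::printf("  start code 0x000001%02X at offset %u\n", buf[i+3], i);
    }
  }
}

If only 0xB6 (VOP) codes ever appear, the encoder's configuration headers (VOS/VO/VOL) are presumably being carried out-of-band, which would be consistent with the parsing trouble described above.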
Paul -- Paul Clark Packet Ship Technologies Limited http://www.packetship.com From morais.levi at gmail.com Wed Dec 31 09:22:04 2008 From: morais.levi at gmail.com (Andre Morais) Date: Wed, 31 Dec 2008 17:22:04 +0000 Subject: [Live-devel] create .ts files In-Reply-To: <495BA77A.1070505@packetship.com> References: <273b5af00812290949m3d57105k56c8f7f7e2d6dbe@mail.gmail.com> <273b5af00812300258q7900465bk4f5e65fd5fe18805@mail.gmail.com> <495BA77A.1070505@packetship.com> Message-ID: <273b5af00812310922w1a0be4d2r4fced1a89a92a683@mail.gmail.com> thanks! Happy new year! Regards On Wed, Dec 31, 2008 at 5:10 PM, Paul Clark wrote: > Andre Morais wrote: > >> Thanks! >> the problem is that I don't know how to do it on windows. >> >> > VLC (www.videolan.org) will do it for you - tick "Stream/Save" in the > "Open" dialog, then open Settings and check "File" in the outputs. The > default is encapsulation in MPEG-TS without transcoding. > > Paul > -- > Paul Clark > Packet Ship Technologies Limited > http://www.packetship.com > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > -------------- next part -------------- An HTML attachment was scrubbed... URL: From matt at schuckmannacres.com Wed Dec 31 11:53:32 2008 From: matt at schuckmannacres.com (Matt Schuckmann) Date: Wed, 31 Dec 2008 11:53:32 -0800 Subject: [Live-devel] Status of Live555 RTPS Client and Server using Stream over TCP Message-ID: <495BCDBC.4010208@schuckmannacres.com> I know this is sort of a loaded question probably without a clear answer but here goes. I was wondering if there are any known issues with using the StreamOverTCP option of the RTSPClient::SetupMediaSession() when both the client application and the server application are based on the Dec 4 2008 release of Live555. I ask this because I'm seeing two problems when I try to use this feature. 1. I've created my own RTP profile for an experimental meta data stream and I've created the necessary classes to stream and receive this stream along with my H264 video stream and it works create when I do standard UDP RTP. However, as soon as I turn on the StreamOverTCP the client starts issuing all this "Discarding interleaved RTP or RTCP packet" messages and my application never gets any of the data. If I don't stream the experimental meta data stream the video streams appears to come through fine. Is there anything I need to do with my custom meta data source or sink. On the server side the MetaData stream source is derived from FramedSource, the Media subsession is derived from OnDemandServerMediaSubsession and it creates a SimpleRTPSink for the RTPSink. On the client the meta data stream sink is derived from MediaSink and I've allowed the MediaSession object to create a SimpleRTPSource for this stream. 2. I've modifed the RTSPServer to accept, handle, and reply to the SET_PARAMETER request (a patch is coming) and it works great when I do RTP over UDP, however when I turn on the StreamOverTCP option as soon as the client tries to send a SET_PARAMETER request the client appears to hang indefinitely. This is probably my problem and I've just started to work it out, but any help or suggestions anyone might have would be appreciated. Any suggestions you might have would be most appreciated, I was hoping that maybe there is a flag or something that I'm not respecting (particularly with problem #1). 
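For readers following the custom metadata stream described above, the server-side arrangement Matt outlines usually comes down to two overrides on OnDemandServerMediaSubsession. A rough sketch (MyMetaDataSource and the "X-META" payload name are hypothetical placeholders, not part of Matt's code):

class MetaDataSubsession: public OnDemandServerMediaSubsession {
protected:
  virtual FramedSource* createNewStreamSource(unsigned /*clientSessionId*/, unsigned& estBitrate) {
    estBitrate = 64; // kbps estimate, used for RTCP bandwidth
    return MyMetaDataSource::createNew(envir());
  }
  virtual RTPSink* createNewRTPSink(Groupsock* rtpGroupsock, unsigned char rtpPayloadTypeIfDynamic,
                                    FramedSource* /*inputSource*/) {
    // Dynamic payload type with an application-defined format name (ends up in the SDP "a=rtpmap:" line)
    return SimpleRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic,
                                    90000 /*timestamp frequency*/,
                                    "application" /*SDP media type*/, "X-META" /*format name*/);
  }
  // constructor and reuseFirstSource handling omitted
};

Nothing in such a subsession is specific to UDP versus TCP; whether packets are interleaved over the RTSP connection is decided by the client's SETUP request, which suggests the "Discarding interleaved RTP or RTCP packet" symptom lies at the transport/channel-id level rather than in the sink itself.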
I understand I'm sort of on the harry edge here so I understand if the only clue you can give is use the debugger. Thanks Matt Schuckmann From matt at schuckmannacres.com Wed Dec 31 12:28:54 2008 From: matt at schuckmannacres.com (Matt Schuckmann) Date: Wed, 31 Dec 2008 12:28:54 -0800 Subject: [Live-devel] Status of Live555 RTPS Client and Server using Stream over TCP In-Reply-To: <495BCDBC.4010208@schuckmannacres.com> References: <495BCDBC.4010208@schuckmannacres.com> Message-ID: <495BD606.1010804@schuckmannacres.com> Ok I've got a little more information. With respect to problem #2 I've discovered that the server is not receiving or responding to any RTSP commands once the play command has been issued when streaming over tcp. In particular I've tested the TEARDOWN and SET_PARAMETER commands both of which the server never receives when streaming over tcp. I haven't gotten far enough into the server code to figure out how this is supposed to work or even if it is supposed to work? One question I have is when streaming over TCP does the client use the same TCP socket connection it's streaming over to send and receive further RTSP commands and responses or does it open up new connections in the same way a it would if streaming over UDP? It seems like the later would work and be easer to implement but I'm getting the feeling it's the former. Thanks, Matt S. Matt Schuckmann wrote: > I know this is sort of a loaded question probably without a clear > answer but here goes. > > I was wondering if there are any known issues with using the > StreamOverTCP option of the RTSPClient::SetupMediaSession() when both > the client application and the server application are based on the Dec > 4 2008 release of Live555. > I ask this because I'm seeing two problems when I try to use this > feature. > > 1. I've created my own RTP profile for an experimental meta data > stream and I've created the necessary classes to stream and receive > this stream along with my H264 video stream and it works create when I > do standard UDP RTP. However, as soon as I turn on the StreamOverTCP > the client starts issuing all this "Discarding interleaved RTP or RTCP > packet" messages and my application never gets any of the data. If I > don't stream the experimental meta data stream the video streams > appears to come through fine. Is there anything I need to do with my > custom meta data source or sink. > On the server side the MetaData stream source is derived from > FramedSource, the Media subsession is derived from > OnDemandServerMediaSubsession and it creates a SimpleRTPSink for the > RTPSink. > On the client the meta data stream sink is derived from MediaSink and > I've allowed the MediaSession object to create a SimpleRTPSource for > this stream. > > 2. I've modifed the RTSPServer to accept, handle, and reply to the > SET_PARAMETER request (a patch is coming) and it works great when I do > RTP over UDP, however when I turn on the StreamOverTCP option as soon > as the client tries to send a SET_PARAMETER request the client appears > to hang indefinitely. > This is probably my problem and I've just started to work it out, but > any help or suggestions anyone might have would be appreciated. > > Any suggestions you might have would be most appreciated, I was hoping > that maybe there is a flag or something that I'm not respecting > (particularly with problem #1). I understand I'm sort of on the harry > edge here so I understand if the only clue you can give is use the > debugger. 
From matt at schuckmannacres.com Wed Dec 31 13:57:25 2008
From: matt at schuckmannacres.com (Matt Schuckmann)
Date: Wed, 31 Dec 2008 13:57:25 -0800
Subject: [Live-devel] Status of Live555 RTSP Client and Server using Stream over TCP
In-Reply-To: <495BD606.1010804@schuckmannacres.com>
References: <495BCDBC.4010208@schuckmannacres.com> <495BD606.1010804@schuckmannacres.com>
Message-ID: <495BEAC5.8000204@schuckmannacres.com>

After doing a bit more digging in the code and on the web, I've discovered that the problem of not receiving RTSP commands after the PLAY command when using RTP-over-TCP streaming is a known problem (although it is usually associated with the keep-alive feature (cough, hack) and not with usages like issuing PAUSE, SET_PARAMETER, and TEARDOWN commands after starting playing). I had intended to use RTP-over-TCP as a fallback in situations where UDP won't work (for the usual reasons).
Ross, are there any plans or thoughts on how to fix this? I may have time to work on making this work in the near future.

I also discovered that the RTSPClient does indeed establish one and only one TCP connection per session, regardless of whether it's TCP or UDP streaming. This is a bit surprising to me considering the connectionless aspects of RTSP. Furthermore, with respect to RTP-over-TCP, it seems it would virtually eliminate and greatly simplify the whole problem (I'm facing) if the client simply opened a new connection to the server for each RTSP command (or at least created a new one after the original had been subsumed for the purpose of streaming RTP over TCP).
Comments and discussion are welcome.

I'm still working on figuring out problem 1 from my original post.

Thanks
Matt S.
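
As background on the single-connection behaviour described above: with RTP-over-TCP, RFC 2326 (section 10.12) frames each RTP or RTCP packet on the RTSP connection itself, and RTSP requests and responses travel on the same connection in between those frames. A sketch of the framing:

  // Each interleaved block on the RTSP TCP connection looks like this:
  //   1 byte : '$' (0x24)
  //   1 byte : channel id, from the "interleaved=" parameter of SETUP
  //            (typically 0 = RTP, 1 = RTCP for the first subsession)
  //   2 bytes: payload length, network byte order
  //   then   : one RTP or RTCP packet of that length
  struct InterleavedFrameHeader {
    unsigned char dollar;     // always '$'
    unsigned char channelId;
    unsigned char lenHigh;
    unsigned char lenLow;     // length = (lenHigh << 8) | lenLow
  };
  // A reader that loses track of this framing will either discard packets
  // ("Discarding interleaved RTP or RTCP packet") or fail to notice RTSP
  // commands that arrive between the framed blocks.
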
From matt at schuckmannacres.com Wed Dec 31 14:23:41 2008
From: matt at schuckmannacres.com (Matt Schuckmann)
Date: Wed, 31 Dec 2008 14:23:41 -0800
Subject: [Live-devel] Status of Live555 RTSP Client and Server using Stream over TCP
In-Reply-To: <495BEAC5.8000204@schuckmannacres.com>
References: <495BCDBC.4010208@schuckmannacres.com> <495BD606.1010804@schuckmannacres.com> <495BEAC5.8000204@schuckmannacres.com>
Message-ID: <495BF0ED.6040300@schuckmannacres.com>

I found this very good discussion on persistent vs. non-persistent TCP RTSP clients:
http://www.ietf.org/mail-archive/web/mmusic/current/msg00655.html

From reading some of this thread and the live555 RTSPClient code, it appears that the RTSPClient is a persistent TCP client - is this true? How about the live555 RTSP server? I hope it works either way.

I understand at least some of the reasons for making RTSPClient a persistent TCP client; however, it seems that some things could be made simpler if you could make it non-persistent (in particular this RTP-over-TCP streaming problem). Has any work ever been done in this direction?

Thanks for listening to my rambling.
Matt S.
From finlayson at live555.com Wed Dec 31 16:52:31 2008
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 31 Dec 2008 16:52:31 -0800
Subject: [Live-devel] QTKit QTCaptureSession as a DeviceSource
In-Reply-To:
References:
Message-ID:

> Hello All,
> I'm having some difficulty getting samples out of a
> QTCaptureSession into MPEG4VideoStreamDiscreteFramer.
>
> DeviceSource::doGetNextFrame() is only being called every 1 to 1.5
> seconds; the delay is not in
> MPEG4VideoStreamDiscreteFramer::doGetNextFrame().

The duration between successive calls to "getNextFrame()" (and thus "doGetNextFrame()") by the downstream object depends upon the value of the "fDurationInMicroseconds" that you (should) set in your object's implementation, before calling "FramedSource::afterGetting()".

> // Deliver the data here:
> if (fMaxSize < len) {
>   fNumTruncatedBytes = len - fMaxSize;
>   len = fMaxSize;
>   printf("Frame size truncated\n");
> }
> fDurationInMicroseconds = dura;

What is "dura"? Are you sure that it is the proper duration of each frame, in microseconds?
--
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From asnow at pathfindertv.net Wed Dec 31 18:14:28 2008
From: asnow at pathfindertv.net (Austin Snow)
Date: Wed, 31 Dec 2008 21:14:28 -0500
Subject: [Live-devel] QTKit QTCaptureSession as a DeviceSource
In-Reply-To:
References:
Message-ID:

dura is set for each call: 33366 for 30 frames per sec, or 66666 for 15 frames per sec; still, the data is only being asked for about every 1 second.

modulo_time_base=0x00
fNumVTIRBits=0x00
vop_time_increment=0x00

The above are always zero - could this be the problem?

Also, I'm not setting the presentation time; what format should it be in, or is it not needed?

Thank you for the help!!!
Austin
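
For reference, for a constant-frame-rate source, fDurationInMicroseconds is simply 1,000,000 divided by the frame rate, so the numbers quoted above work out as follows (the variable names here are only for illustration):

  unsigned dur30 = 1000000 / 30; // 33333 us for 30 fps
  unsigned dur15 = 1000000 / 15; // 66666 us for 15 fps (66667 if rounded up)
  // 33366 us corresponds to roughly 1000000/33366 = 29.97 fps (NTSC timing)
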
From finlayson at live555.com Wed Dec 31 18:59:48 2008
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 31 Dec 2008 18:59:48 -0800
Subject: [Live-devel] QTKit QTCaptureSession as a DeviceSource
In-Reply-To:
References:
Message-ID:

> dura is set for each call: 33366 for 30 frames per sec, or 66666 for
> 15 frames per sec; still, the data is only being asked for about every 1 second.
>
> modulo_time_base=0x00
> fNumVTIRBits=0x00
> vop_time_increment=0x00
>
> The above are always zero - could this be the problem?

No, I don't think so.

> Also, I'm not setting the presentation time; what format should it be
> in, or is it not needed?

Yes, you need to set "fPresentationTime"; the best way to do this is to set it to the result of calling "gettimeofday()" each time. However, that's probably not the cause of your problem.

Sorry, but I don't know what's causing the delay that you're seeing. Are you *sure* that there is a ~1 second delay between each call to "FramedSource::afterGetting()" and the *next* call to "doGetNextFrame()"?
--
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From asnow at pathfindertv.net Wed Dec 31 19:53:42 2008
From: asnow at pathfindertv.net (Austin Snow)
Date: Wed, 31 Dec 2008 22:53:42 -0500
Subject: [Live-devel] QTKit QTCaptureSession as a DeviceSource
In-Reply-To:
References:
Message-ID: <892AC9B7-BAC4-4A92-B20C-AB60DAE3C74A@pathfindertv.net>

There is not any large delay (~5 to 10 ms) between the downstream call to doGetNextFrame() and the upstream call to FramedSource::afterGetting(this) in DeviceSource.cpp:

from downstream call to doGetNextFrame()
  fPresentationTime.tv_sec=1230780299 fPresentationTime.tv_usec=377538
from upstream call to FramedSource::afterGetting(this)
  fPresentationTime.tv_sec=1230780299 fPresentationTime.tv_usec=381672
next downstream call to doGetNextFrame()
  fPresentationTime.tv_sec=1230780301 fPresentationTime.tv_usec=46294

Sometimes it is less than 1 second, and sometimes it is more - as high as 3 sec. The delay is waiting for

  if (!isCurrentlyAwaitingData()) return; // we're not ready for the data yet

to NOT return. Also, I'm now setting the presentation time; no change.

Could it have something to do with the passing of the data to the encapsulation for the IP packets? There is not much multicast traffic being sent.

Thanks again.
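
Pulling together the advice in this thread, the delivery path of a live-source class typically sets both fields just before handing the frame back. A minimal sketch follows; "CaptureSource" and "getNextEncodedFrame()" are hypothetical names standing in for whatever the capture code provides, and the 30 fps figure is an assumption.

  #include "FramedSource.hh"
  #include <sys/time.h>
  #include <string.h>

  class CaptureSource: public FramedSource { // hypothetical class name
  public:
    CaptureSource(UsageEnvironment& env): FramedSource(env) {}
  protected:
    virtual void doGetNextFrame() { deliverFrame(); }
  private:
    void deliverFrame() {
      if (!isCurrentlyAwaitingData()) return; // downstream hasn't asked for data yet

      // Hypothetical capture-side call; copies one encoded frame into "buffer":
      unsigned char buffer[100000];
      unsigned frameLen = getNextEncodedFrame(buffer, sizeof buffer);

      if (frameLen > fMaxSize) {              // truncate rather than overflow fTo
        fNumTruncatedBytes = frameLen - fMaxSize;
        frameLen = fMaxSize;
      } else {
        fNumTruncatedBytes = 0;
      }
      memcpy(fTo, buffer, frameLen);
      fFrameSize = frameLen;

      gettimeofday(&fPresentationTime, NULL); // wall-clock presentation time
      fDurationInMicroseconds = 1000000/30;   // assumes a steady 30 fps source

      FramedSource::afterGetting(this);       // hand the frame downstream
    }

    // Declaration only; the real implementation belongs to the capture code:
    unsigned getNextEncodedFrame(unsigned char* to, unsigned maxLen);
  };
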
From finlayson at live555.com Wed Dec 31 21:47:51 2008
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 31 Dec 2008 21:47:51 -0800
Subject: [Live-devel] Status of Live555 RTSP Client and Server using Stream over TCP
In-Reply-To: <495BEAC5.8000204@schuckmannacres.com>
References: <495BCDBC.4010208@schuckmannacres.com> <495BD606.1010804@schuckmannacres.com> <495BEAC5.8000204@schuckmannacres.com>
Message-ID:

> After doing a bit more digging in the code and on the web I've
> discovered that the problem of not receiving RTSP commands after the
> play command when using RTP-over-TCP streaming is a known problem
> (although it is usually associated with the keep-alive feature
> (cough, hack)

The standard way for clients (using either RTP/UDP *or* RTP/TCP) to indicate their 'liveness' to the server is by sending back RTCP "RR" packets (which are a required part of the RTP/RTCP standard). Our code (server and client) supports this OK, even when doing RTP/RTCP-over-TCP.

I'm sorry, but once you've made modifications to the supplied code I won't (in general) be able to help you. However, if you want to experiment with RTP/RTCP-over-TCP streaming, I suggest that you start using the (original, unmodified) "live555MediaServer" (or "testOnDemandRTSPServer") as your server, and "openRTSP -t" as your client.
--
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
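
The suggested test setup can be exercised with a command along the following lines; the address and stream name are placeholders (testOnDemandRTSPServer listens on port 8554 by default), and "-t" asks openRTSP to request RTP/RTCP-over-TCP:

  openRTSP -t rtsp://192.168.1.10:8554/testStream
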