From ulrik.mikaelsson at gmail.com Thu Mar 1 01:29:08 2007 From: ulrik.mikaelsson at gmail.com (Ulrik Mikaelsson) Date: Thu, 1 Mar 2007 10:29:08 +0100 Subject: [Live-devel] Trick play support Message-ID: <15c1dfa0703010129v62e91dd4o8598ec5d8176ff69@mail.gmail.com> Hi there, I just tried out the liveMediaServer on a MPEG2 TS stream, with the Motorola/Kreatel STB 1510 as client. Definitely interesting results. Though the behavior was a bit flaky, the streams actually ran, and all trick-play operations worked, to some extent. However, there were some glitches, for instance video were quite a bit distorted on ffwd with high scales, and the client crashed at a number of times, during trick play operations. I've got two followup questions: * Is there a "reference" client to try with, supporting trick-play. Of course theres the openRTSP command-line tool, but since most issues I had were related to going back and forth, i can't really test that with the openRTSP utility, I think? * During the test, i ran a packet capture on the streaming server. The packet capture showed very clean nice network traffic out from the server during regular play, but during fast-forward, the bandwidths jumped sky high, and sometimes exceeded 20 mbit/second on a VBR stream with an average bitrate of 5mbit/s. Is this expected behavior? Regards / Ulrik From finlayson at live555.com Thu Mar 1 01:23:50 2007 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 1 Mar 2007 01:23:50 -0800 Subject: [Live-devel] Simulation and packet loss In-Reply-To: <45E60C79.3090606@libero.it> References: <45E60C79.3090606@libero.it> Message-ID: >For my thesis, i have to study the effect of packet loss on a MPEG video >recived during a streaming session. I thinked to execute locally both >Live555MediaServer and VLC client. >Is it possibile to edit the live555MediaServer, in order to implement a >packet loss model during streaming of a MPEG video? What file i have to >edit? Search for "TEST_LOSS" in the file "liveMedia/MultiFramedRTPSink.cpp". There you'll see code (#ifdef'd out, by default) that simulates 10% packet loss (upon transmission). I suggest using that code (changing it, as appropriate, to simulate a particular desired packet loss rate). -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Thu Mar 1 02:10:46 2007 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 1 Mar 2007 02:10:46 -0800 Subject: [Live-devel] Trick play support In-Reply-To: <15c1dfa0703010129v62e91dd4o8598ec5d8176ff69@mail.gmail.com> References: <15c1dfa0703010129v62e91dd4o8598ec5d8176ff69@mail.gmail.com> Message-ID: >I just tried out the liveMediaServer on a MPEG2 TS stream, with the >Motorola/Kreatel STB 1510 as client. Definitely interesting results. >Though the behavior was a bit flaky, the streams actually ran, and all >trick-play operations worked, to some extent. > >However, there were some glitches, for instance video were quite a bit >distorted on ffwd with high scales, and the client crashed at a number >of times, during trick play operations. Only Motorola can help you with that :-) > >I've got two followup questions: >* Is there a "reference" client to try with, supporting trick-play. No, not really. However, the Amino 110 set-top box is known to work. >* During the test, i ran a packet capture on the streaming server. 
The >packet capture showed very clean nice network traffic out from the >server during regular play, but during fast-forward, the bandwidths >jumped sky high, and sometimes exceeded 20 mbit/second on a VBR stream >with an average bitrate of 5mbit/s. Is this expected behavior? Yes, because the streams used for fast-forward and reverse play consist solely of I-frames. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Thu Mar 1 02:47:23 2007 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 1 Mar 2007 02:47:23 -0800 Subject: [Live-devel] Transport Stream In-Reply-To: <000001c75bb4$a8d1a160$0a2a320a@telxsi.com> References: <000001c75bb4$a8d1a160$0a2a320a@telxsi.com> Message-ID: >C:\Fresh Live555\live\testProgs>MPEG2TransportStreamIndexer FTN_Part1.ts >Writing index file "FTN_Part1.tsx"......done > >After that it creat FTN_Part1.txs file of size 0KB.I can not understand what >was happing. The problem with your Transport Stream file is that each Program Association Table (PAT) references more than one program (i.e. includes more than one "program_map_PID"), and, because of this, our indexing code does not know which program is the real one (i.e., the one that contains your actual audio and video data). For a Transport Stream file to be indexable, it should contain only one program. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From morgan.torvolt at gmail.com Thu Mar 1 04:12:19 2007 From: morgan.torvolt at gmail.com (=?ISO-8859-1?Q?Morgan_T=F8rvolt?=) Date: Thu, 1 Mar 2007 16:12:19 +0400 Subject: [Live-devel] Trick play support In-Reply-To: References: <15c1dfa0703010129v62e91dd4o8598ec5d8176ff69@mail.gmail.com> Message-ID: <3cc3561f0703010412l7fe85fdftb892533da7fe742@mail.gmail.com> > >* During the test, i ran a packet capture on the streaming server. The > >packet capture showed very clean nice network traffic out from the > >server during regular play, but during fast-forward, the bandwidths > >jumped sky high, and sometimes exceeded 20 mbit/second on a VBR stream > >with an average bitrate of 5mbit/s. Is this expected behavior? > > Yes, because the streams used for fast-forward and reverse play > consist solely of I-frames. I have heard that some kreatel boxes does not support high bitrates, even bitrates that are far within specs. I have also heard rumors that even some satellite channels need to be reencoded to lower bitrate for the STB to be able to play them. Given such a high bitrates as here, this does indeed sound like a kreatel issue. I have myself observed that some tv channels streamed to kreatel boxes has been reencoded, even on a fibre network. I cannot say the exact reason for this, but high bitrate troubles comes to mind... -Morgan- From ulrik.mikaelsson at gmail.com Thu Mar 1 05:34:28 2007 From: ulrik.mikaelsson at gmail.com (Ulrik Mikaelsson) Date: Thu, 1 Mar 2007 14:34:28 +0100 Subject: [Live-devel] Trick play support In-Reply-To: <3cc3561f0703010412l7fe85fdftb892533da7fe742@mail.gmail.com> References: <15c1dfa0703010129v62e91dd4o8598ec5d8176ff69@mail.gmail.com> <3cc3561f0703010412l7fe85fdftb892533da7fe742@mail.gmail.com> Message-ID: <15c1dfa0703010534l51b9110br1418461519789368@mail.gmail.com> > I have heard that some kreatel boxes does not support high bitrates, > even bitrates that are far within specs. I have also heard rumors that > even some satellite channels need to be reencoded to lower bitrate for > the STB to be able to play them. 
Given such a high bitrates as here, > this does indeed sound like a kreatel issue. > > I have myself observed that some tv channels streamed to kreatel boxes > has been reencoded, even on a fibre network. I cannot say the exact > reason for this, but high bitrate troubles comes to mind... Interesting point. I know the box works without issues on 12mbit streams, but I haven't tried any higher than that. (more than 12mbit for SDTV is quite silly anyways. ;) From ulrik.mikaelsson at gmail.com Thu Mar 1 05:37:18 2007 From: ulrik.mikaelsson at gmail.com (Ulrik Mikaelsson) Date: Thu, 1 Mar 2007 14:37:18 +0100 Subject: [Live-devel] Trick play support In-Reply-To: References: <15c1dfa0703010129v62e91dd4o8598ec5d8176ff69@mail.gmail.com> Message-ID: <15c1dfa0703010537u6d76d387r11afa4a78a0211d4@mail.gmail.com> > >However, there were some glitches, for instance video were quite a bit > >distorted on ffwd with high scales, and the client crashed at a number > >of times, during trick play operations. > Only Motorola can help you with that :-) Probably, but since I've been able to confirm trick-play on the client using other server solutions, I were looking for ways to reference-test the liveMediaServer using some other toolkit to better push Motorola. > >I've got two followup questions: > >* Is there a "reference" client to try with, supporting trick-play. > No, not really. However, the Amino 110 set-top box is known to work. Thank you for the tip. I think I have a 110 lying around somewhere at work. Will be interesting to watch. :) > >* During the test, i ran a packet capture on the streaming server. The > >packet capture showed very clean nice network traffic out from the > >server during regular play, but during fast-forward, the bandwidths > >jumped sky high, and sometimes exceeded 20 mbit/second on a VBR stream > >with an average bitrate of 5mbit/s. Is this expected behavior? > Yes, because the streams used for fast-forward and reverse play > consist solely of I-frames. Is pre-encoding material for ffwd/reverse in the roadmap? From morgan.torvolt at gmail.com Thu Mar 1 07:31:54 2007 From: morgan.torvolt at gmail.com (=?ISO-8859-1?Q?Morgan_T=F8rvolt?=) Date: Thu, 1 Mar 2007 19:31:54 +0400 Subject: [Live-devel] Trick play support In-Reply-To: <15c1dfa0703010534l51b9110br1418461519789368@mail.gmail.com> References: <15c1dfa0703010129v62e91dd4o8598ec5d8176ff69@mail.gmail.com> <3cc3561f0703010412l7fe85fdftb892533da7fe742@mail.gmail.com> <15c1dfa0703010534l51b9110br1418461519789368@mail.gmail.com> Message-ID: <3cc3561f0703010731q3efbf5sae93a0d3e303be9d@mail.gmail.com> > Interesting point. I know the box works without issues on 12mbit > streams, but I haven't tried any higher than that. (more than 12mbit > for SDTV is quite silly anyways. ;) I have to agree on the sillyness =) The channel I saw was average 8Mbit variable bitrate stream (2 streams shared 16Mbit, with minimum set to 4 if i remember correctly. I know this because I worked at the earth station transmitting the two channels, and knew the configuration of the equipment =) ). The problem could of course be caused by poor variable bitrate handling causing a need for reencoding to CBR. I never really considered that before just now. The stream was possibly high bitrate after encoding also, but reencoded streams not using 4:2:2 and I frame detection will give some pretty bad results most of the time. All I noticed was that it, as opposed to all the other services, this one was reencoded. 
Other services also had such high average bitrates, but none had VBR. I don't know if that information helps you any, as I realize now that I did not give you much to work with. A test of the VBR handling could possibly give you a hint, but I guess that these IPTV STBs usually have the same MPEG2 decoders as ordinary satellite STBs. -Morgan- From xcsmith at rockwellcollins.com Thu Mar 1 07:43:06 2007 From: xcsmith at rockwellcollins.com (xcsmith at rockwellcollins.com) Date: Thu, 1 Mar 2007 09:43:06 -0600 Subject: [Live-devel] Trick play support Message-ID: I have heard that some kreatel boxes does not support high bitrates, even bitrates that are far within specs. -Morgan- Morgan, where do you find specs for bitrates? Thx! xochitl From morgan.torvolt at gmail.com Thu Mar 1 08:33:59 2007 From: morgan.torvolt at gmail.com (=?ISO-8859-1?Q?Morgan_T=F8rvolt?=) Date: Thu, 1 Mar 2007 20:33:59 +0400 Subject: [Live-devel] Trick play support In-Reply-To: References: Message-ID: <3cc3561f0703010833q67434683ke37cf9573ff5a02b@mail.gmail.com> > Morgan, where do you find specs for bitrates? Thx! > > xochitl For the STB you need to go to the producer I guess. For bitrates on services you need to measure it. As to the services I saw, I worked at the satellite uplink station (earth station), so I have inside information. -Morgan- From zhouh31415 at 163.com Thu Mar 1 19:46:36 2007 From: zhouh31415 at 163.com (=?gbk?B?1ty66w==?=) Date: Fri, 2 Mar 2007 11:46:36 +0800 (CST) Subject: [Live-devel] Question abaut RTCP Message-ID: <2414394.4602801172807196022.JavaMail.root@bj163app15.163.com> In the testprog, when I add? RTCPInstance::createNew(*env, &rtcpGroupsock, estimatedSessionBandwidth, CNAME, NULL , source);/* we're a client */ ,the RTCP is OK.But I wonder how RTCP run in the program. Can I use the protocol caculating the package loss ratio? Or use it estimating the state of our network? How to use it? thanks. welltrans Zhouhong -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070301/5f688614/attachment.html From finlayson at live555.com Thu Mar 1 23:26:28 2007 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 1 Mar 2007 23:26:28 -0800 Subject: [Live-devel] Question abaut RTCP In-Reply-To: <2414394.4602801172807196022.JavaMail.root@bj163app15.163.com> References: <2414394.4602801172807196022.JavaMail.root@bj163app15.163.com> Message-ID: > ,the RTCP is OK.But I wonder how RTCP run in the program. Can I use >the protocol caculating the package loss ratio? Or use it estimating >the state of our network? How to use it? Look at how the "openRTSP" application implements the "-Q" option. I.e., if you are receiving RTP data (like "openRTSP" does), then use "RTPReceptionStats". If, however, you're sending RTP data, then use "RTPTransmissionStats". -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From ydq at nbicc.com Thu Mar 1 23:34:54 2007 From: ydq at nbicc.com (ydq) Date: Fri, 2 Mar 2007 15:34:54 +0800 Subject: [Live-devel] RTCP "BYE" command Message-ID: <20070302074257.1B93A24B5D8@slave.mail113.cn4e.com> _____ ???: ydq [mailto:ydq at nbicc.com] ????: 2007?3?2? 9:47 ???: 'finlayson at live555.com' ??: RTCP "BYE" command Hi , Thanks for your support ! In your letter you mentioned that Servers usually use the RTCP "BYE" command - which we support - to inform the client of the end of the stream. Dose Darwin Server support it ? 
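As background, here is a minimal sketch of the client-side mechanism under discussion - installing an RTCP "BYE" handler on each subsession. The handler body shown here (stopping the event loop via a "watch variable") is only illustrative and is not the actual "openRTSP" code; the handler name and the watch variable are placeholders:

static char eventLoopWatchVariable = 0;

// Called by the library when an RTCP "BYE" arrives for this subsession:
static void subsessionByeHandler(void* clientData) {
  // MediaSubsession* subsession = (MediaSubsession*)clientData;
  // Treat the "BYE" as end-of-stream:
  eventLoopWatchVariable = 1; // makes doEventLoop(&eventLoopWatchVariable) return
}

// ... after each subsession has been set up:
// if (subsession->rtcpInstance() != NULL) {
//   subsession->rtcpInstance()->setByeHandler(subsessionByeHandler, subsession);
// }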
In the openRTSP program, it set subsession->rtcpInstance()->setByeHandler(subsessionByeHandler , subsession) , so at the end of stream , program will close down session and exit immediately . But, in fact it does not do it ! Best regards ! -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070301/0127de3d/attachment.html From ydq at nbicc.com Thu Mar 1 23:36:51 2007 From: ydq at nbicc.com (ydq) Date: Fri, 2 Mar 2007 15:36:51 +0800 Subject: [Live-devel] RTCP "BYE" command Message-ID: <20070302074454.346DE24B34D@slave.mail113.cn4e.com> Hi , Thanks for your support ! In your letter you mentioned that Servers usually use the RTCP "BYE" command - which we support - to inform the client of the end of the stream. Dose Darwin Server support it ? In the openRTSP program, it set subsession->rtcpInstance()->setByeHandler(subsessionByeHandler , subsession) , so at the end of stream , program will close down session and exit immediately . But, in fact it does not do it ! Best regards ! -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070301/717d4e34/attachment.html From finlayson at live555.com Fri Mar 2 02:26:45 2007 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 2 Mar 2007 02:26:45 -0800 Subject: [Live-devel] RTCP "BYE" command In-Reply-To: <20070302074454.346DE24B34D@slave.mail113.cn4e.com> References: <20070302074454.346DE24B34D@slave.mail113.cn4e.com> Message-ID: >Hi , >Thanks for your support ! >In your letter you mentioned that Servers usually use the RTCP "BYE" >command - which we support - to inform the client of the end of the >stream. >Dose Darwin Server support it ? I don't know; you'll have to ask them. (This mailing list is not for the "Darwin Streaming Server" - that's Apple's software, not ours.) However, *our* RTSP server implementation (e.g, as used in the "LIVE555 Media Server") *does* send RTCP "BYE" packets when the stream ends, but only for media types for which 'seeking' is not supported. (For those media types, the server keeps the session open after the stream ends, in case the client wants to replay it from an earlier time.) > >In the openRTSP program, it set >subsession->rtcpInstance()->setByeHandler(subsessionByeHandler , >subsession) , so at the end of stream , program will close down > session and exit immediately . But, in fact it does not do it ! This code works correctly, *if* the server sends a RTCP "BYE" packet. You can test this by running "openRTSP" with the "-V" option (to specify verbose output). -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070302/1ff9586f/attachment-0001.html From tribest_tata at hotmail.com Fri Mar 2 22:30:14 2007 From: tribest_tata at hotmail.com (=?big5?B?qkwgqHysUA==?=) Date: Sat, 03 Mar 2007 06:30:14 +0000 Subject: [Live-devel] testMPEG1or2VideoStreamer Message-ID: i just began to use testpro . I use testMPEG1or2VideoStreamer to stream my ?test.mpg?. Then I use Quicktime player rtsp:// /testStream . But don?t see any Video . How do I do anything?? _________________________________________________________________ Windows Live Messenger ??????????????????????? 
http://get.live.com/messenger/overview From finlayson at live555.com Fri Mar 2 22:50:28 2007 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 2 Mar 2007 22:50:28 -0800 Subject: [Live-devel] testMPEG1or2VideoStreamer In-Reply-To: References: Message-ID: >i just began to use testpro . I use testMPEG1or2VideoStreamer to stream my >"test.mpg". > >Then I use Quicktime player rtsp:// /testStream . > >But don't see any Video . > >How do I do anything?? Start by reading the FAQ - in particular http://www.live555.com/liveMedia/faq.html#my-file-doesnt-work -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070302/37bd6552/attachment.html From tribest_tata at hotmail.com Sat Mar 3 00:17:17 2007 From: tribest_tata at hotmail.com (=?big5?B?qkwgqHysUA==?=) Date: Sat, 03 Mar 2007 08:17:17 +0000 Subject: [Live-devel] testMPEG1or2VideoStreamer In-Reply-To: Message-ID: >From: Ross Finlayson >Reply-To: LIVE555 Streaming Media - development & use >To: LIVE555 Streaming Media - development & use >Subject: Re: [Live-devel] testMPEG1or2VideoStreamer >Date: Fri, 2 Mar 2007 22:50:28 -0800 > >>i just began to use testpro . I use testMPEG1or2VideoStreamer to >>stream my >>"test.mpg". >> >>Then I use Quicktime player rtsp:// /testStream . >> >>But don't see any Video . >> >>How do I do anything?? > >Start by reading the FAQ - in particular > http://www.live555.com/liveMedia/faq.html#my-file-doesnt-work >-- > >Ross Finlayson >Live Networks, Inc. >http://www.live555.com/ >_______________________________________________ >live-devel mailing list >live-devel at lists.live555.com >http://lists.live555.com/mailman/listinfo/live-devel _________________________________________________________________ MSN ??? Match.com ??????????????? http://match.msn.com.tw/Registration/Registration.aspx?trackingid=281200 From stabrawa at stanford.edu Sat Mar 3 16:10:28 2007 From: stabrawa at stanford.edu (Tim Stabrawa) Date: Sat, 03 Mar 2007 18:10:28 -0600 Subject: [Live-devel] Layered video with Live555 In-Reply-To: References: <45DEA2A8.7060805@stanford.edu> Message-ID: <45EA0E74.109@stanford.edu> Ross Finlayson wrote: >> For an academic demonstration, I'm planning on extending Live555 to >> support RTP transport of scalable H.264 video and was hoping someone >> with a reasonable amount of experience with Live555 could help steer me >> in the direction of least pain ... >> >> Basically, I'll be using the reference codec for H.264 SVC (currently in >> development) to generate a file containing H.264 NAL units. The >> important difference between the output of this codec and a standard >> H.264 stream is the addition of two NAL unit types (20 & 21), which >> carry information about which layer of video is described in the >> preceding/current NAL unit. For now, assume I know how to parse this >> file and determine which NAL units belong to which layers. My intention >> is to send each layer out either multiplexed in the same RTP stream (the >> easy way) or in separate RTP streams (the hard / interesting way), >> according to this draft RFC: >> http://www.ietf.org/internet-drafts/draft-ietf-avt-rtp-svc-00.txt >> > > This is interesting. I suggest proceeding in three steps (with each > step requiring additional work building on the previous steps): > 1/ Stream regular (non-SVC) H.264 video from a file. You will be > able to test this using VLC. 
> 2/ Add additional SVC layers, multiplexed in the same RTP stream as > the base layer. > 3/ Use separate RTP streams for separate SVC layers. > > If you're streaming on a single RTP stream (steps 1/ or 2/), then > it's fairly straightforward: You'll need to write your own subclass > of "H264VideoStreamFramer"; that subclass will parse the input stream > (from a "ByteStreamFileSource"). You'll then 'play' this to a > "H264VideoRTPSink" object Ok, I think I have the StreamFramer class basically working except for one small problem. To parse the file, I created a subclass of MPEGVideoStreamParser (purely out of convenience), and defined a parse() routine that has two states: PARSING_START_SEQUENCE and PARSING_NAL_UNIT. The parser basically alternates between these two states either throwing data out (to find the first sequence), or saving it (until it finds the next). So naturally, there is no start sequence at the end of the file. It just sorta ends. So what I'm seeing when I play from my Framer source class to a H264VideoFileSink class is that all the NAL units are copied over to the output file except the last one. What I think is happening is, for the last NAL unit, my call to test4Bytes() is throwing an exception once it gets to the end of the file .. causing parse() to return 0. Meanwhile, the StreamParser class goes off and tries to read more from the file, sees that the file is at EOF, and closes the file, etc. I peeked around at some of the mechanisms for handling what to do when a stream gets closed, thinking this would afford me the opportunity to tell my stream parser to give me what it has left it its buffer, but I haven't been able to wrap my mind around it completely yet. Do you think this is the Right Way to do it? Any other suggestions? Below is the code for my parse routines .. there's not much to 'em. - Tim - snip - void H264JSVMVideoStreamParser :: parseStartSequence() { // Find start sequence (0001) u_int32_t test = test4Bytes(); while (test != 0x00000001) { skipBytes(1); test = test4Bytes(); } setParseState(PARSING_NAL_UNIT); skipBytes(4); } unsigned H264JSVMVideoStreamParser :: parseNALUnit() { // Find next start sequence (0001) or end of stream u_int32_t test = test4Bytes(); while (test != 0x00000001) { saveByte(get1Byte()); test = test4Bytes(); } setParseState(PARSING_START_SEQUENCE); return curFrameSize(); } From antoniotirri at libero.it Sat Mar 3 17:44:44 2007 From: antoniotirri at libero.it (Antonio Tirri) Date: Sun, 04 Mar 2007 02:44:44 +0100 Subject: [Live-devel] Simulation and packet loss, part two In-Reply-To: References: <45E60C79.3090606@libero.it> Message-ID: <45EA248C.6070402@libero.it> Hi, i tried to substitute the code: #ifdef TEST_LOSS if ((our_random()%10) != 0) // simulate 10% packet loss ##### #endif with the code: //#ifdef TEST_LOSS if (((double)our_random()%10000)/10000.0 <= 0.8) // simulate 10% packet loss ##### //#endif where 0.8 is (1-0.2), where 0.2 is the packet loss rate(in this case 20%) But my compiler (Visual C++ 6.0) gives me this error: MultiFramedRTPSink.cpp MultiFramedRTPSink.cpp(352) : error C2296: '%' : illegal, left operand has type 'double' NMAKE : fatal error U1077: '"C:\Programmi\Microsoft Visual Studio\VC98\bin\cl"' : return code '0x2' Stop. Error executing NMAKE. How can i edit the code in order to introduce the packet loss rate in correct way? Thanks a lot, Antonio Tirri Ross Finlayson ha scritto: >> For my thesis, i have to study the effect of packet loss on a MPEG video >> recived during a streaming session. 
I thinked to execute locally both >> Live555MediaServer and VLC client. >> Is it possibile to edit the live555MediaServer, in order to implement a >> packet loss model during streaming of a MPEG video? What file i have to >> edit? >> > > Search for "TEST_LOSS" in the file > "liveMedia/MultiFramedRTPSink.cpp". There you'll see code (#ifdef'd > out, by default) that simulates 10% packet loss (upon transmission). > I suggest using that code (changing it, as appropriate, to simulate a > particular desired packet loss rate). > From stabrawa at stanford.edu Sat Mar 3 17:56:24 2007 From: stabrawa at stanford.edu (Tim Stabrawa) Date: Sat, 03 Mar 2007 19:56:24 -0600 Subject: [Live-devel] Simulation and packet loss, part two In-Reply-To: <45EA248C.6070402@libero.it> References: <45E60C79.3090606@libero.it> <45EA248C.6070402@libero.it> Message-ID: <45EA2748.3040801@stanford.edu> Antonio Tirri wrote: > Hi, i tried to substitute the code: > > #ifdef TEST_LOSS > if ((our_random()%10) != 0) // simulate 10% packet loss ##### > #endif > > > with the code: > > //#ifdef TEST_LOSS > if (((double)our_random()%10000)/10000.0 <= 0.8) // simulate 10% packet loss ##### > //#endif > > How can i edit the code in order to introduce the packet loss rate in correct way? > This should do it: if ((our_random()%5) != 0) // simulate 20% packet loss ##### If you need something more flexible, read up on what range of numbers come out of our_random() and do something like this: if (our_random() > OUR_RAND_MAX * 0.2) // simulate 20% packet loss ##### Good luck ... - Tim From finlayson at live555.com Sun Mar 4 04:12:28 2007 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 4 Mar 2007 04:12:28 -0800 Subject: [Live-devel] Layered video with Live555 In-Reply-To: <45EA0E74.109@stanford.edu> References: <45DEA2A8.7060805@stanford.edu> <45EA0E74.109@stanford.edu> Message-ID: >Ok, I think I have the StreamFramer class basically working except for >one small problem. To parse the file, I created a subclass of >MPEGVideoStreamParser (purely out of convenience), and defined a parse() >routine that has two states: PARSING_START_SEQUENCE and >PARSING_NAL_UNIT. The parser basically alternates between these two >states either throwing data out (to find the first sequence), or saving >it (until it finds the next). > >So naturally, there is no start sequence at the end of the file. It >just sorta ends. So what I'm seeing when I play from my Framer source >class to a H264VideoFileSink class is that all the NAL units are copied >over to the output file except the last one. > What I think is happening is, for the last NAL unit, my call to >test4Bytes() is throwing an exception once it gets to the end of the >file .. causing parse() to return 0. Meanwhile, the StreamParser class >goes off and tries to read more from the file, sees that the file is at >EOF, and closes the file, etc. Yes. Unfortunately, the current stream parsing code discards already-read data once it encounters the end of the input file. There are probably ways to work around this (e.g., pass some function other than "FramedSource::handleClosure" to the parent constructor when constructing "MPEGVideoStreamParser"), but it would be messy. Instead, I suggest just ignoring the issue if you can. (Is the very last NAL unit in each file really important to get?) If you can't ignore it, then you could also work around the problem by appending a 'start code' to the end of each file, before you start reading it. 
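A minimal sketch of that last workaround - appending a dummy 4-byte start code (0x00000001) to the file so that the parser also flushes the final NAL unit. The function name is invented for the example, and error handling is kept to a minimum:

#include <cstdio>

static bool appendDummyStartCode(char const* fileName) {
  FILE* fid = fopen(fileName, "ab"); // append, binary
  if (fid == NULL) return false;
  unsigned char const startCode[4] = { 0x00, 0x00, 0x00, 0x01 };
  bool ok = fwrite(startCode, 1, sizeof startCode, fid) == sizeof startCode;
  fclose(fid);
  return ok;
}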
By the way, the *real* difficulty that you're going to face in writing your "H264VideoStreamFramer" subclass is computing proper presentation times and durations (the "fPresentationTime" and "fDurationInMicroseconds" member variables) for each NAL unit. (This has tripped up other people who have tried streaming H.264 video from a file.) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From stabrawa at stanford.edu Sun Mar 4 13:11:58 2007 From: stabrawa at stanford.edu (Tim Stabrawa) Date: Sun, 04 Mar 2007 15:11:58 -0600 Subject: [Live-devel] Layered video with Live555 In-Reply-To: References: <45DEA2A8.7060805@stanford.edu> <45EA0E74.109@stanford.edu> Message-ID: <45EB361E.4050901@stanford.edu> Ross Finlayson wrote: > Yes. Unfortunately, the current stream parsing code discards > already-read data once it encounters the end of the input file. > There are probably ways to work around this (e.g., pass some function > other than "FramedSource::handleClosure" to the parent constructor > when constructing "MPEGVideoStreamParser"), but it would be messy. > Instead, I suggest just ignoring the issue if you can. (Is the very > last NAL unit in each file really important to get?) If you can't > ignore it, then you could also work around the problem by appending a > 'start code' to the end of each file, before you start reading it. > Of the two sample files I'm working with right now, one ends with a suffix NAL unit, and the other actual slice data. So, the effect of dropping that would either be the preceeding frame gets sent on the base layer accidentally or the last chunk of slice data is never received. I suppose I could deal with either if they happened, but if it can be fixed, why not? I tried just putting a fake start code at the end of the file and that seems to do the trick (the received file is the same as the sent one, sans fake start code). That's good enough for now, I'd say. (It's for a demonstration, after all.) > By the way, the *real* difficulty that you're going to face in > writing your "H264VideoStreamFramer" subclass is computing proper > presentation times and durations (the "fPresentationTime" and > "fDurationInMicroseconds" member variables) for each NAL unit. (This > has tripped up other people who have tried streaming H.264 video from > a file. For my purposes, I plan on just computing these based on the average bitrate of the file. Not ideal, but should get the job done. That gets me to thinking though ... When it comes time to re-combine the streams on the client side, I assume the best (only?) way is to use the presentation times to put things back together correctly. Unfortunately, I don't see any "Mux" classes to base this off of. In fact, the only reordering code I see is the ReorderingPacketBuffer class, but this looks like it keys off of RTP sequence numbers, not presentation times. Have you ever heard of someone needing to do this? What would you suggest? 
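Regarding the bitrate-based timing mentioned above, here is a rough sketch of how the two fields could be filled in per NAL unit. "MyH264Framer", "fAverageBitrate" (in bits per second) and "fNextPresentationTime" are assumptions invented for the example; only fPresentationTime and fDurationInMicroseconds are real framer members, and averaging is of course only an approximation:

// Approximate per-NAL-unit timing derived from the file's average bitrate.
void MyH264Framer::setTimingForCurrentNALUnit(unsigned nalUnitSize) {
  fPresentationTime = fNextPresentationTime; // time for the unit just delivered

  // How long this unit 'lasts' if the file plays out at its average bitrate:
  double seconds = (nalUnitSize * 8.0) / fAverageBitrate;
  fDurationInMicroseconds = (unsigned)(seconds * 1000000.0);

  // Schedule the following unit accordingly:
  fNextPresentationTime.tv_usec += fDurationInMicroseconds;
  fNextPresentationTime.tv_sec += fNextPresentationTime.tv_usec / 1000000;
  fNextPresentationTime.tv_usec %= 1000000;
}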
- Tim From finlayson at live555.com Sun Mar 4 13:30:21 2007 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 4 Mar 2007 13:30:21 -0800 Subject: [Live-devel] Layered video with Live555 In-Reply-To: <45EB361E.4050901@stanford.edu> References: <45DEA2A8.7060805@stanford.edu> <45EA0E74.109@stanford.edu> <45EB361E.4050901@stanford.edu> Message-ID: > > By the way, the *real* difficulty that you're going to face in >> writing your "H264VideoStreamFramer" subclass is computing proper >> presentation times and durations (the "fPresentationTime" and >> "fDurationInMicroseconds" member variables) for each NAL unit. (This >> has tripped up other people who have tried streaming H.264 video from >> a file. >For my purposes, I plan on just computing these based on the average >bitrate of the file. Not ideal, but should get the job done. > >That gets me to thinking though ... When it comes time to re-combine the >streams on the client side, I assume the best (only?) way is to use the >presentation times to put things back together correctly. >Unfortunately, I don't see any "Mux" classes to base this off of. In >fact, the only reordering code I see is the ReorderingPacketBuffer >class, but this looks like it keys off of RTP sequence numbers, not >presentation times. Have you ever heard of someone needing to do this? >What would you suggest? What you need is a 'jitter buffer' implementation (not really a "mux" ("multiplexor"), because that implies that two or more incoming streams are being merged into one). I suggest just basing your client implementation around VLC , because it already receives/plays (regiular, non-structured) H.264/RTP streams (using the "LIVE555 Streaming Media" code). If you use VLC, you'll be saving yourself lots of work (and, perhaps, the additions that you make to support structured H.264 video will end up being of use to the VLC codebase). -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From stabrawa at stanford.edu Sun Mar 4 15:32:21 2007 From: stabrawa at stanford.edu (Tim Stabrawa) Date: Sun, 04 Mar 2007 17:32:21 -0600 Subject: [Live-devel] Layered video with Live555 In-Reply-To: References: <45DEA2A8.7060805@stanford.edu> <45EA0E74.109@stanford.edu> <45EB361E.4050901@stanford.edu> Message-ID: <45EB5705.3050001@stanford.edu> Ross Finlayson wrote: > What you need is a 'jitter buffer' implementation (not really a "mux" > ("multiplexor"), because that implies that two or more incoming > streams are being merged into one). > > I suggest just basing your client implementation around VLC > , because it already receives/plays (regiular, > non-structured) H.264/RTP streams (using the "LIVE555 Streaming > Media" code). If you use VLC, you'll be saving yourself lots of work > (and, perhaps, the additions that you make to support structured > H.264 video will end up being of use to the VLC codebase). > Well, actually, I think we will have multiple incoming streams to deal with. What we intend to demonstrate is basically separating the layers into multiple RTP streams on the server side, then letting the client choose which streams to subscribe to and recombine them to produce the desired video quality. We plan on feeding the recombined stream into the reference SVC decoder to let it handle decoding the actual video. We originally considered VLC, but eventually decided not to use it, figuring it'd be too difficult to shoehorn the reference codec into it. Does it still sound like a mux approach is not the way to go? 
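To make the 'recombining' step concrete, here is an illustrative offline sketch that merges already-received, already-timestamped NAL units from several layer streams purely by presentation time. The NalUnit type and the vector-of-vectors input are invented for the example - none of this is LIVE555 API, and a real client would feed units in as they arrive rather than merging complete lists:

#include <vector>
#include <sys/time.h>

struct NalUnit {
  struct timeval presentationTime;
  std::vector<unsigned char> data;
};

static bool earlier(struct timeval const& a, struct timeval const& b) {
  return a.tv_sec < b.tv_sec || (a.tv_sec == b.tv_sec && a.tv_usec < b.tv_usec);
}

// Each per-layer vector is assumed to already be in presentation-time order:
std::vector<NalUnit> mergeLayers(std::vector<std::vector<NalUnit> > const& layers) {
  std::vector<NalUnit> out;
  std::vector<size_t> pos(layers.size(), 0);
  for (;;) {
    int best = -1;
    for (size_t i = 0; i < layers.size(); ++i) {
      if (pos[i] >= layers[i].size()) continue; // this layer is exhausted
      if (best < 0 || earlier(layers[i][pos[i]].presentationTime,
                              layers[best][pos[best]].presentationTime)) {
        best = (int)i;
      }
    }
    if (best < 0) break; // all layers consumed
    out.push_back(layers[best][pos[best]++]);
  }
  return out;
}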
Thanks, - Tim From finlayson at live555.com Sun Mar 4 17:58:01 2007 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 4 Mar 2007 17:58:01 -0800 Subject: [Live-devel] Layered video with Live555 In-Reply-To: <45EB5705.3050001@stanford.edu> References: <45DEA2A8.7060805@stanford.edu> <45EA0E74.109@stanford.edu> <45EB361E.4050901@stanford.edu> <45EB5705.3050001@stanford.edu> Message-ID: >Well, actually, I think we will have multiple incoming streams to deal >with. What we intend to demonstrate is basically separating the layers >into multiple RTP streams on the server side, then letting the client >choose which streams to subscribe to and recombine them to produce the >desired video quality. We plan on feeding the recombined stream into >the reference SVC decoder to let it handle decoding the actual video. >We originally considered VLC, but eventually decided not to use it, >figuring it'd be too difficult to shoehorn the reference codec into it. > >Does it still sound like a mux approach is not the way to go? Yes, what you're describing is 'multiplexing'. (I had forgotten that you were planning to have multiple incoming layers. If you have just one layer of RTP video (the normal case), then there's no multiplexing - just time-based playout (based on presentation time).) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From bidibulle at operamail.com Mon Mar 5 06:49:47 2007 From: bidibulle at operamail.com (David Betrand) Date: Mon, 05 Mar 2007 15:49:47 +0100 Subject: [Live-devel] AMR timestamps Message-ID: <20070305144947.96B4E2474D@ws5-3.us4.outblaze.com> Hello Ross, I am experiencing a timestamp problem when I use AMRAudioRTPSource and when I receive several AMR frames in a sinle RTP packet. In RawAMRRTPSource:hasBeenSynchronizedUsingRTCP() : the method waits at least a complete interleave cycle of synchronized packets before returning true. So in this case some frames are not reported "synchronized" even if these frames are actually coming from a "synchronized" RTP packet. My problem is that I need exactly to know if a frame is coming from a "synchronized" or not "synchronized" RTP packet. It is crucial when there is a large gap between streamer and receiver clocks. In this case it could lead to huge discontinutity of timestamps, in the past or in the future. I solved this problem storing the "synchronization" info for each AMRDeinterleaverBuffer object. A boolean value is also stored in RawAMRRTPSource (actually fInputSource from AMRDeinterleaver point of view) and updated by reference at each time we call retreiveFrame method. By this mechanism the method hasBeenSynchronizedUsingRTCP() simply returns the new boolean. I attached my patch for consideration to this email. Are changes in the code suitable? Of course another suggestion to solve my problem will be welcome. Thanks a lot in advance for your help. David -- _______________________________________________ Surf the Web in a faster, safer and easier way: Download Opera 9 at http://www.opera.com Powered by Outblaze -------------- next part -------------- 47,48d46 < Boolean fIsSynchronized; < 101c99 < RawAMRRTPSource* fInputSource; --- > FramedSource* fInputSource; 104d101 < 220c217 < fNumSuccessiveSyncedPackets(0), fIsSynchronized(false) { --- > fNumSuccessiveSyncedPackets(0) { 318c315 < return fIsWideband ? "audio/AMR-WB" : "audio/AMR-NB"; --- > return fIsWideband ? 
"audio/AMR-WB" : "audio/AMR-WB"; 322,326d318 < < return fIsSynchronized; < < // LIVE MEDIA VERSION < /* 336d327 < */ 413,414c404 < struct timeval& resultPresentationTime, < Boolean& resultIsSynchronized); --- > struct timeval& resultPresentationTime); 431,432d420 < < Boolean fIsSynchronized; 454c442,443 < return new AMRDeinterleaver(env, isWideband, numChannels, maxInterleaveGroupSize, inputSource); --- > return new AMRDeinterleaver(env, isWideband, numChannels, maxInterleaveGroupSize, > inputSource); 475d463 < 479,481c467 < fLastFrameHeader, fPresentationTime, < fInputSource->fIsSynchronized)) { < --- > fLastFrameHeader, fPresentationTime)) { 519d504 < 560d544 < 616d599 < inBin.fIsSynchronized = (RTPSource*)source->RTPSource::hasBeenSynchronizedUsingRTCP(); 630,632c613 < struct timeval& resultPresentationTime, < Boolean& resultIsSynchronized) { < --- > struct timeval& resultPresentationTime) { 636d616 < 640d619 < resultIsSynchronized = outBin.fIsSynchronized; From finlayson at live555.com Mon Mar 5 07:22:24 2007 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 5 Mar 2007 07:22:24 -0800 Subject: [Live-devel] AMR timestamps In-Reply-To: <20070305144947.96B4E2474D@ws5-3.us4.outblaze.com> References: <20070305144947.96B4E2474D@ws5-3.us4.outblaze.com> Message-ID: Thanks, but please resend this as a proper patch file - e.g., generated using diff -c -B -b AMRAudioRTPSource.cpp.orig AMRAudioRTPSource.cpp -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From jlfurlong at hotmail.com Mon Mar 5 07:48:12 2007 From: jlfurlong at hotmail.com (Jeff Furlong) Date: Mon, 5 Mar 2007 11:48:12 -0400 Subject: [Live-devel] Index file for H264 over mpeg2ts Message-ID: Hi Ross, I'm not sure if you've had a chance to look at this file yet, but I did notice in the MPEG2IndexFromTransportStream.cpp file, that it is currently only looking for stream types 1 or 2. For H264, the stream type is 27. I did modify it and tried it, and I got a tsx output file of about 40KB. It obviously still didn't work, but at least it was able to find the video PID. Anyway, just a note so that when and if you get a chance to look at it, it may save you a few minutes. Jeff Date: Tue, 27 Feb 2007 13:18:00 -0800To: live-devel at ns.live555.comFrom: finlayson at live555.comSubject: Re: [Live-devel] Index file for H264 over mpeg2ts I've seen some recent posts regarding H.264 - specifically one regarding the creation of index files. I have placed an HD H.264 sample clip from a live encoder and was wondering if someone (Ross ?) could take a look to see what would be required to be able to support indexing? You can access the file here: http://www.geeknet.ca/~temp/tenten16-2.ts Thanks. (I'm downloading this now.)-- Ross FinlaysonLive Networks, Inc.http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070305/586af45e/attachment.html From bidibulle at operamail.com Mon Mar 5 07:59:11 2007 From: bidibulle at operamail.com (David Betrand) Date: Mon, 05 Mar 2007 16:59:11 +0100 Subject: [Live-devel] AMR timestamps Message-ID: <20070305155911.7084644199@ws5-1.us4.outblaze.com> OK sorry about that. I also fixed a small bug in method MIMEtype() that always returned the string "audio/AMR-WB" whatever the type of AMR. 
David > ----- Original Message ----- > From: "Ross Finlayson" > To: "LIVE555 Streaming Media - development & use" > Subject: Re: [Live-devel] AMR timestamps > Date: Mon, 5 Mar 2007 07:22:24 -0800 > > > Thanks, but please resend this as a proper patch file - e.g., generated using > diff -c -B -b AMRAudioRTPSource.cpp.orig AMRAudioRTPSource.cpp > -- > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > -- _______________________________________________ Surf the Web in a faster, safer and easier way: Download Opera 9 at http://www.opera.com Powered by Outblaze -------------- next part -------------- *** AMRAudioRTPSource.cpp.orig 2007-03-05 16:15:43.000000000 +0100 --- AMRAudioRTPSource.cpp 2007-03-05 16:30:31.000000000 +0100 *************** *** 44,49 **** --- 44,51 ---- unsigned char* TOC() const { return fTOC; } // FT+Q value for each TOC entry unsigned& frameIndex() { return fFrameIndex; } // index of frame-block within pkt + Boolean fIsSynchronized; + private: RawAMRRTPSource(UsageEnvironment& env, Groupsock* RTPgs, unsigned char rtpPayloadFormat, *************** *** 96,104 **** virtual void doStopGettingFrames(); private: ! FramedSource* fInputSource; class AMRDeinterleavingBuffer* fDeinterleavingBuffer; Boolean fNeedAFrame; }; --- 98,107 ---- virtual void doStopGettingFrames(); private: ! RawAMRRTPSource* fInputSource; class AMRDeinterleavingBuffer* fDeinterleavingBuffer; Boolean fNeedAFrame; + }; *************** *** 214,220 **** fIsWideband(isWideband), fIsOctetAligned(isOctetAligned), fIsInterleaved(isInterleaved), fCRCsArePresent(CRCsArePresent), fILL(0), fILP(0), fTOCSize(0), fTOC(NULL), fFrameIndex(0), ! fNumSuccessiveSyncedPackets(0) { } RawAMRRTPSource::~RawAMRRTPSource() { --- 217,223 ---- fIsWideband(isWideband), fIsOctetAligned(isOctetAligned), fIsInterleaved(isInterleaved), fCRCsArePresent(CRCsArePresent), fILL(0), fILP(0), fTOCSize(0), fTOC(NULL), fFrameIndex(0), ! fNumSuccessiveSyncedPackets(0), fIsSynchronized(false) { } RawAMRRTPSource::~RawAMRRTPSource() { *************** *** 312,330 **** } char const* RawAMRRTPSource::MIMEtype() const { ! return fIsWideband ? "audio/AMR-WB" : "audio/AMR-WB"; } Boolean RawAMRRTPSource::hasBeenSynchronizedUsingRTCP() { ! // Don't report ourselves as being synchronized until we've received ! // at least a complete interleave cycle of synchronized packets. ! // This ensures that the receiver is currently getting a frame from ! // a packet that was synchronized. ! if (fNumSuccessiveSyncedPackets > (unsigned)(fILL+1)) { ! fNumSuccessiveSyncedPackets = fILL + 2; // prevents overflow ! return True; ! } ! return False; } --- 315,325 ---- } char const* RawAMRRTPSource::MIMEtype() const { ! return fIsWideband ? "audio/AMR-WB" : "audio/AMR-NB"; } Boolean RawAMRRTPSource::hasBeenSynchronizedUsingRTCP() { ! return fIsSynchronized; } *************** *** 401,407 **** Boolean retrieveFrame(unsigned char* to, unsigned maxSize, unsigned& resultFrameSize, unsigned& resultNumTruncatedBytes, u_int8_t& resultFrameHeader, ! struct timeval& resultPresentationTime); unsigned char* inputBuffer() { return fInputBuffer; } unsigned inputBufferSize() const { return AMR_MAX_FRAME_SIZE; } --- 396,403 ---- Boolean retrieveFrame(unsigned char* to, unsigned maxSize, unsigned& resultFrameSize, unsigned& resultNumTruncatedBytes, u_int8_t& resultFrameHeader, ! 
struct timeval& resultPresentationTime, ! Boolean& resultIsSynchronized); unsigned char* inputBuffer() { return fInputBuffer; } unsigned inputBufferSize() const { return AMR_MAX_FRAME_SIZE; } *************** *** 418,423 **** --- 414,421 ---- unsigned char* frameData; u_int8_t frameHeader; struct timeval presentationTime; + + Boolean fIsSynchronized; }; unsigned fNumChannels, fMaxInterleaveGroupSize; *************** *** 439,446 **** ::createNew(UsageEnvironment& env, Boolean isWideband, unsigned numChannels, unsigned maxInterleaveGroupSize, RawAMRRTPSource* inputSource) { ! return new AMRDeinterleaver(env, isWideband, numChannels, maxInterleaveGroupSize, ! inputSource); } AMRDeinterleaver::AMRDeinterleaver(UsageEnvironment& env, --- 437,443 ---- ::createNew(UsageEnvironment& env, Boolean isWideband, unsigned numChannels, unsigned maxInterleaveGroupSize, RawAMRRTPSource* inputSource) { ! return new AMRDeinterleaver(env, isWideband, numChannels, maxInterleaveGroupSize, inputSource); } AMRDeinterleaver::AMRDeinterleaver(UsageEnvironment& env, *************** *** 464,470 **** // First, try getting a frame from the deinterleaving buffer: if (fDeinterleavingBuffer->retrieveFrame(fTo, fMaxSize, fFrameSize, fNumTruncatedBytes, ! fLastFrameHeader, fPresentationTime)) { // Success! fNeedAFrame = False; --- 462,470 ---- // First, try getting a frame from the deinterleaving buffer: if (fDeinterleavingBuffer->retrieveFrame(fTo, fMaxSize, fFrameSize, fNumTruncatedBytes, ! fLastFrameHeader, fPresentationTime, ! fInputSource->fIsSynchronized)) { ! // Success! fNeedAFrame = False; *************** *** 597,602 **** --- 599,605 ---- inBin.frameSize = frameSize; inBin.frameHeader = frameHeader; inBin.presentationTime = presentationTime; + inBin.fIsSynchronized = (RTPSource*)source->RTPSource::hasBeenSynchronizedUsingRTCP(); if (curBuffer == NULL) curBuffer = createNewBuffer(); fInputBuffer = curBuffer; *************** *** 610,616 **** ::retrieveFrame(unsigned char* to, unsigned maxSize, unsigned& resultFrameSize, unsigned& resultNumTruncatedBytes, u_int8_t& resultFrameHeader, ! struct timeval& resultPresentationTime) { if (fNextOutgoingBin >= fOutgoingBinMax) return False; // none left FrameDescriptor& outBin = fFrames[fIncomingBankId^1][fNextOutgoingBin]; --- 613,621 ---- ::retrieveFrame(unsigned char* to, unsigned maxSize, unsigned& resultFrameSize, unsigned& resultNumTruncatedBytes, u_int8_t& resultFrameHeader, ! struct timeval& resultPresentationTime, ! Boolean& resultIsSynchronized) { ! if (fNextOutgoingBin >= fOutgoingBinMax) return False; // none left FrameDescriptor& outBin = fFrames[fIncomingBankId^1][fNextOutgoingBin]; *************** *** 617,622 **** --- 623,629 ---- unsigned char* fromPtr = outBin.frameData; unsigned char fromSize = outBin.frameSize; outBin.frameSize = 0; // for the next time this bin is used + resultIsSynchronized = outBin.fIsSynchronized; // Check whether this frame is missing; if so, return a FT_NO_DATA frame: if (fromSize == 0) { From finlayson at live555.com Mon Mar 5 08:50:50 2007 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 5 Mar 2007 08:50:50 -0800 Subject: [Live-devel] AMR timestamps In-Reply-To: <20070305155911.7084644199@ws5-1.us4.outblaze.com> References: <20070305155911.7084644199@ws5-1.us4.outblaze.com> Message-ID: Thanks. This will be included in the next release of the software. >OK sorry about that. I also fixed a small bug in method MIMEtype() >that always returned the string "audio/AMR-WB" whatever the type of AMR. 
Actually, I also made a small change to this fix. The actual choice (according to RFC 3267) is between "audio/AMR" and "audio/AMR-WB". -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From TAYK0004 at ntu.edu.sg Mon Mar 5 09:39:36 2007 From: TAYK0004 at ntu.edu.sg (#TAY KOON HWEE#) Date: Tue, 6 Mar 2007 01:39:36 +0800 Subject: [Live-devel] PDA player Message-ID: <438567054C073949AEBE5A28B83E7DE133FCF5@MAIL21.student.main.ntu.edu.sg> Hi guys, I have completed a PC streamer (using Visual C++) which streams H.264/MP3 using MPEG2 TS. I have tested it with VLC on PC to receive the TS and it works perfectly. However, I am looking for a PDA player (Windows Mobile 5) to test the TS. I have tried using VLC (PDA version), however, it does not support multicast. Can anyone share his/her experience on this issue? I have tried to search for PDA softwares to no avail and have to resort to using livemedia mailing list for help. As part of sharing, I will be most willing to share my streamer source code with the rest after testing it on the pda. Thank you and regards. Zkunhui -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070305/67d5a40b/attachment.html From susovan at tataelxsi.co.in Tue Mar 6 01:38:42 2007 From: susovan at tataelxsi.co.in (Susovan Ghosh) Date: Tue, 6 Mar 2007 15:08:42 +0530 Subject: [Live-devel] Recv system call Message-ID: <002a01c75fd3$3af4d310$0a2a320a@telxsi.com> Hi all, i used live555 server to stream .ts file to STB.It is ok for some time ,but after that the STB can not receive data from socket and "recv system call" return zero. but recv system call can not return zero.I kniow it is the problem of SBV.If any one can identify this problem please help me. Thank You. SUSOVAN GHOSH PH No-9986667320 Engineer (D&D) PRDE TATAELXSI LIMITED From susovan at tataelxsi.co.in Tue Mar 6 02:51:03 2007 From: susovan at tataelxsi.co.in (Susovan Ghosh) Date: Tue, 6 Mar 2007 16:21:03 +0530 Subject: [Live-devel] Configuration Message-ID: <002f01c75fdd$5679ea30$0a2a320a@telxsi.com> Hi all, i want to configured the server, means i want to change the bit rate,Max no of client connection,Unicast IP address.how i can do this.What are those file where i have to change those parameter.If any one know those file plese help me to know those files. Thank You. SUSOVAN GHOSH PH No-9986667320 Engineer (D&D) PRDE TATAELXSI LIMITED From finlayson at live555.com Tue Mar 6 03:06:26 2007 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 6 Mar 2007 03:06:26 -0800 Subject: [Live-devel] Configuration In-Reply-To: <002f01c75fdd$5679ea30$0a2a320a@telxsi.com> References: <002f01c75fdd$5679ea30$0a2a320a@telxsi.com> Message-ID: > i want to configured the server, means i want to change the bit >rate,Max no of client connection,Unicast IP address. None of those three things makes any sense. 1/ The bit rate depends upon the file that you're streaming (the data streams at the same rate that data would be consumed if the file were played locally). 2/ There is no inherent limit on the number of client connections (any such limit would be a property of your OS and/or your network, not the application). And 3/ the IP address that the server sends to is that of the client. -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ From ruru605 at 163.com Tue Mar 6 06:58:51 2007 From: ruru605 at 163.com (ruru605) Date: Tue, 6 Mar 2007 23:58:51 +0900 Subject: [Live-devel] help_SDES Message-ID: <200703062358461255535@163.com> Hi, everyone I have a question as follows: I want to transmit "User name SDES item" using RTCP, so in the addSDES(), I enqueue the name after the SDES item in the form of NAME=2 length text, however, when I use Ethereal to grap packet, SDES items show that they only includes one Type(CNAME). What should I do if I want to add new item,please help me. Thanks a lot. Regards ru ruru605 2007-03-06 -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070306/d94903a2/attachment.html From martin.gutenbrunner at telekom.at Wed Mar 7 00:35:43 2007 From: martin.gutenbrunner at telekom.at (Gutenbrunner Martin) Date: Wed, 7 Mar 2007 09:35:43 +0100 Subject: [Live-devel] Trick play on Amino 110 Message-ID: hi! Although there are many posts about trick play, I haven't been able to find the solution for my specific problem: I am able to stream an indexed transport stream with my amino box. but when I push ffwd or rewind, the image freezes. after pressing play again it becomes obvious that the video really forwarded (or rewinded) faster, the forwarding just wasn't visible. how comes that forwarding and rewinding aren't 'visible' to the user? my /mnt/nv/config.txt has amino.rtsp.scale set to 3 (also already tried with 6) amino.rtsp.server wasn't set, so I added amino.rtsp.server=nCube what else can I try to solve this problem? thanks in advance for your help and keep up the great work. It's amazing stuff you're doing here :-) regards martin gutenbrunner -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070307/b9b1e127/attachment.html From finlayson at live555.com Wed Mar 7 04:27:51 2007 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 7 Mar 2007 04:27:51 -0800 Subject: [Live-devel] Trick play on Amino 110 In-Reply-To: References: Message-ID: >I am able to stream an indexed transport stream with my amino box. >but when I push ffwd or rewind, the image freezes. after pressing >play again it becomes obvious that the video really forwarded (or >rewinded) faster, the forwarding just wasn't visible. > >how comes that forwarding and rewinding aren't 'visible' to the user? I'm not sure. The only thing I can suggest is that perhaps the higher bit rate of the 'trick play' streams (because, in such streams, each frame is an I-frame) is overwhelming your set-top box or (less likely) your network. Assuming that you have sufficient network bandwidth (e.g., you're not going through a 10 Mbps Ethernet switch), then the only thing I can suggest is seeing if there is some way to increase your STB's input buffering. (Not being an expert on Amino STBs, however, I'm not sure how/if you can do this.) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://lists.live555.com/pipermail/live-devel/attachments/20070307/20f7723b/attachment.html From morgan.torvolt at gmail.com Wed Mar 7 05:18:45 2007 From: morgan.torvolt at gmail.com (=?ISO-8859-1?Q?Morgan_T=F8rvolt?=) Date: Wed, 7 Mar 2007 17:18:45 +0400 Subject: [Live-devel] Trick play on Amino 110 In-Reply-To: References: Message-ID: <3cc3561f0703070518g64259e6cmf133f099788059fc@mail.gmail.com> > I am able to stream an indexed transport stream with my amino box. but when > I push ffwd or rewind, the image freezes. after pressing play again it > becomes obvious that the video really forwarded (or rewinded) faster, the > forwarding just wasn't visible. > > how comes that forwarding and rewinding aren't 'visible' to the user? > > > I'm not sure. The only thing I can suggest is that perhaps the higher bit > rate of the 'trick play' streams (because, in such streams, each frame is an > I-frame) is overwhelming your set-top box or (less likely) your network. Quite possible. I cannot confirm this though, as I have not tested your trick play yet. I do know that the 110H box does handle immense ammounts of data in an expected manner though. We tried dumping 100Mbit to it, and it decoded some portions at least, but with alot of garbage. Here it sounds as if there is nothing on screen at all, which sounds weird to me. As for a "fix", would it be possible to add a feature that ensures that not more than 5 (or something) I frames are transmitted per second, and then just skip the rest? That would limit the bandwidth demand quite a bit I would think, and could possibly make the frames more watchable. Maybe even 3 is enough per second. > Assuming that you have sufficient network bandwidth (e.g., you're not going > through a 10 Mbps Ethernet switch), then the only thing I can suggest is > seeing if there is some way to increase your STB's input buffering. (Not > being an expert on Amino STBs, however, I'm not sure how/if you can do > this.) -- I believe this is NDA material unfortunately since all the documentation covering this is marked confidential. Asking Amino about it or reading some documentation could answer this. -Morgan- From martin.gutenbrunner at telekom.at Wed Mar 7 06:44:19 2007 From: martin.gutenbrunner at telekom.at (Gutenbrunner Martin) Date: Wed, 7 Mar 2007 15:44:19 +0100 Subject: [Live-devel] Trick play on Amino 110 In-Reply-To: <3cc3561f0703070518g64259e6cmf133f099788059fc@mail.gmail.com> References: <3cc3561f0703070518g64259e6cmf133f099788059fc@mail.gmail.com> Message-ID: >As for a "fix", would it be possible to add a feature that ensures that not more than 5 (or something) I frames are transmitted per second, and then just skip the rest? That would limit the bandwidth demand quite a bit I would think, and could possibly make the frames more watchable. Maybe even 3 is enough per second. trick play does work with oracle video servers. does this mean that ovs automatically reduces bandwith to make trick play possible? is there anything I can do to find out, how ovs works? (ie ethereal stream sniffing, etc) thanks From morgan.torvolt at gmail.com Wed Mar 7 07:06:24 2007 From: morgan.torvolt at gmail.com (=?ISO-8859-1?Q?Morgan_T=F8rvolt?=) Date: Wed, 7 Mar 2007 19:06:24 +0400 Subject: [Live-devel] Trick play on Amino 110 In-Reply-To: References: <3cc3561f0703070518g64259e6cmf133f099788059fc@mail.gmail.com> Message-ID: <3cc3561f0703070706md4aa19dja454ff51fbe57c33@mail.gmail.com> > trick play does work with oracle video servers. 
does this mean that ovs automatically reduces bandwidth to make trick play possible? Yes, it should. It is a commercial video server, and the STB producers and VOD server producers make it work somehow. > is there anything I can do to find out, how ovs works? (ie ethereal stream sniffing, etc) You could always take a look at the stream it produces. You could dump it to disk using VLC or live555's RTSPClient. To check the bandwidth, you could do that while dumping it to disk. I am unsure if VLC supports trick play yet though, and if OVS is standard compliant enough to work with the mentioned clients. -Morgan- From finlayson at live555.com Wed Mar 7 21:31:36 2007 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 7 Mar 2007 21:31:36 -0800 Subject: [Live-devel] Trick play on Amino 110 In-Reply-To: References: <3cc3561f0703070518g64259e6cmf133f099788059fc@mail.gmail.com> Message-ID: A reminder to people who may be having trouble getting Transport Stream trick play to work with some clients: We provide a utility "testMPEG2TransportStreamTrickPlay" which generates - from an original Transport Stream file plus index file - a new Transport Stream file that contains the effect of 'fast forward' or 'reverse play' applied to the original stream. This file will contain the exact same data that the server would send if you had requested 'fast forward' or 'reverse play' from the client. See for more details. Therefore, you can try streaming the resulting Transport Stream file (without indexing) to the client, to see how the client handles it. You may find this an easier way to debug your client. In general, though, note that this mailing list is *not* the right place to be complaining about problems with clients (unless, of course, those clients use the "LIVE555 Streaming Media" software). In particular, problems with the Amino set-top boxes are best discussed on an Amino-related mailing list. Live Networks, Inc. is not affiliated with Amino Technologies, and we don't know enough about their hardware to debug it. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070307/797d39c9/attachment.html From martin.gutenbrunner at telekom.at Thu Mar 8 01:50:12 2007 From: martin.gutenbrunner at telekom.at (Gutenbrunner Martin) Date: Thu, 8 Mar 2007 10:50:12 +0100 Subject: [Live-devel] Trick play on Amino 110 In-Reply-To: References: <3cc3561f0703070518g64259e6cmf133f099788059fc@mail.gmail.com> Message-ID: >A reminder to people who may be having trouble getting Transport Stream trick play to work with some clients: We provide a utility >"testMPEG2TransportStreamTrickPlay" which generates - from an original Transport Stream file plus index file - a new Transport Stream file that contains the >effect of 'fast forward' or 'reverse play' applied to the original stream. This file will contain the exact same data that the server would send if you had >requested 'fast forward' or 'reverse play' from the client. ok, tried that. The "accelerated" video stream (scale 3) plays correctly on the amino box. I have scale in /mnt/nv/config.txt also set to 3. That would mean that an unaccelerated stream should be fast-forwarded in exactly the same way as the output file from testMPEG2TransportStreamTrickPlay.exe, no?
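For readers following this thread, here is a minimal sketch of how an indexed Transport Stream file is normally served so that the server-side trick-play code is exercised at all. It is modelled on the testOnDemandRTSPServer demo; the exact MPEG2TransportFileServerMediaSubsession::createNew() argument list should be checked against the current headers, and the file names "test.ts" / "test.tsx" and the stream name are placeholders:

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"

int main() {
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  RTSPServer* rtspServer = RTSPServer::createNew(*env, 8554);
  if (rtspServer == NULL) {
    *env << "Failed to create RTSP server: " << env->getResultMsg() << "\n";
    return 1;
  }

  // The second file name is the ".tsx" index file produced by
  // MPEG2TransportStreamIndexer; without it, PLAY requests that carry a
  // "Scale:" header (fast forward / reverse play) cannot be honoured.
  ServerMediaSession* sms = ServerMediaSession::createNew(*env, "testStream");
  sms->addSubsession(MPEG2TransportFileServerMediaSubsession::createNew(
      *env, "test.ts", "test.tsx", False /*reuseFirstSource*/));
  rtspServer->addServerMediaSession(sms);

  env->taskScheduler().doEventLoop(); // does not return
  return 0;
}

As far as I understand, it is the client's "Scale: 3" (or negative scale) PLAY request against such a stream that triggers the index-driven trick-play path on the server.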
>In general, though, note that this mailing list is *not* the right place to be be complaining about problems with clients (unless, of course, those clients use the >"LIVE555 Streaming Media" software). In particular, problems with the Amino set-top boxes are best discussed on an Amino-related mailing list. Live Networks, >Inc. is not affiliated with Amino Technologies, and we don't know enough about their hardware to debug it. Sorry if you got the impression that I'm complaining about anything. The fact that the box is working with ovs but not with live555 made me believe that this would be the right place to ask questions. Actually, I think it's really great work you accomplish, especially when regarding the fact that your software is free. So, complaining about anything really was the last thing I intended and your help is very much appreciated. I just thought it would be also of interest to you if your server was able to handle a larger variety of clients. So if I can be of any help to you in extending your software to work with amino (or adb, I also work with ADB-3800TW but that box will be tested in a few days) clients, I'd be really glad. Thanks martin From finlayson at live555.com Thu Mar 8 02:52:38 2007 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 8 Mar 2007 02:52:38 -0800 Subject: [Live-devel] Trick play on Amino 110 In-Reply-To: References: <3cc3561f0703070518g64259e6cmf133f099788059fc@ma il.gmail.com> Message-ID: > >A reminder to people who may be having trouble getting Transport Stream >trick play to work with some clients: We provide a utility >>"testMPEG2TransportStreamTrickPlay" which generates - from an original >Transport Stream file plus index file - a new Transport Stream file that >contains the >>effect of 'fast forward' or 'reverse play' applied to the original >stream. This file will contain the exact same data that the server would >send if you had >>requested 'fast forward' or 'reverse play' from the client. > >ok, tried that. the "accelerated" video stream (scale 3) plays correctly >on the amino box. I have scale in /mnt/nv/config.txt also set to 3. > >that would mean that an unaccelerated stream should be fast forwarded >absolutely the same way as the output file from >testMPEG2TransportStreamTrickPlay.exe, no? That's correct - which surprises me even more that you are having problems when playing 'fast forward' using an Amino 110 STB. (I, and most others who have used that client, have not had any such problems.) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From rmpg2001 at gmail.com Thu Mar 8 03:52:21 2007 From: rmpg2001 at gmail.com (Ramon Martin de Pozuelo) Date: Thu, 8 Mar 2007 12:52:21 +0100 Subject: [Live-devel] frameDurationInMicroseconds Message-ID: <389189e20703080352g4922797cjbb48c9b1fa48c954@mail.gmail.com> Hi all, I am working on a H264 Streamer. It works very well with VLC and QuickTime but I need that the frames will be sent exactly (as much as possible) at 40 mseg, so I set frameDurationInMicroseconds to 40000. I used Etherreal to analize packet's time and I saw that packets are sent in intervals of 32-33 mseg. and 47-48 mseg. Main time is correct but is there any way to send packets regulary to 40 mseg?? Someone says to me that it is possible an error of using Windows and Visual C++ to build Live libraries. It can be solved building Live in UNIX/Linux? Ramon -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://lists.live555.com/pipermail/live-devel/attachments/20070308/da4942c9/attachment.html From finlayson at live555.com Thu Mar 8 04:07:34 2007 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 8 Mar 2007 04:07:34 -0800 Subject: [Live-devel] frameDurationInMicroseconds In-Reply-To: <389189e20703080352g4922797cjbb48c9b1fa48c954@mail.gmail.com> References: <389189e20703080352g4922797cjbb48c9b1fa48c954@mail.gmail.com> Message-ID: >Hi all, >I am working on a H264 Streamer. It works very well with VLC and >QuickTime but I need that the frames will be sent exactly (as much >as possible) at 40 mseg, so I set frameDurationInMicroseconds to >40000. I used Etherreal to analize packet's time and I saw that >packets are sent in intervals of 32-33 mseg. and 47-48 mseg. Main >time is correct but is there any way to send packets regulary to 40 >mseg?? > >Someone says to me that it is possible an error of using Windows and >Visual C++ to build Live libraries. It can be solved building Live >in UNIX/Linux? Perhaps - Windows' timers are notoriously inaccurate. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From alvaro.i at ikusi.es Thu Mar 8 05:48:10 2007 From: alvaro.i at ikusi.es (=?iso-8859-1?Q?=C1lvaro_Pajuelo=2C_I=F1aki?=) Date: Thu, 8 Mar 2007 14:48:10 +0100 Subject: [Live-devel] Raw UDP streaming Message-ID: <435A66F076E8344FA6D3917739BD99A4914AC7@srviku004.ikusi.net> Hello, Please could you help me to configure the wis-streamer to stream raw udp rather than standard RTP streaming. I'm using an Amino IP STB that only acepts raw udp. It's possible ? which command must I use ?. Another question what is the bandwitdh limit for the wis-streamer. I'm using a demo board from micronas and I see pixelation when I increase the bandwidth (upper 6Mbis/seg). Best Regards, I?aki Alvaro PajueloI?aki Alvaro Pajuelo R&D Dept. Project Manager IP Area Design & Manufacturing Division alvaro.i at ikusi.es www.ikusi.es --- IKUSI - ?ngel Iglesias, S.A. Paseo Miram?n, 170 20009 San Sebasti?n SPAIN Tel: +34 943 448800 Fax: +34 943 448811 From finlayson at live555.com Thu Mar 8 06:11:09 2007 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 8 Mar 2007 06:11:09 -0800 Subject: [Live-devel] Raw UDP streaming In-Reply-To: <435A66F076E8344FA6D3917739BD99A4914AC7@srviku004.ikusi.net> References: <435A66F076E8344FA6D3917739BD99A4914AC7@srviku004.ikusi.net> Message-ID: >Please could you help me to configure the wis-streamer to stream raw >udp rather than standard >RTP streaming. I'm using an Amino IP STB that only acepts raw udp. >It's possible ? Yes - this is already supported by "wis-streamer" (and our other RTSP server applications, such as the "LIVE555 Media Server" ). Note, however, that the stream must be unicast on demand - not multicast, because the Amino STBs (at least, the ones that I have tested) do not handle multicast streams. Note also that if you use an Amino STB client to play your stream, then *all* clients must be Amino STBs. Because of limitations of the software, you cannot mix Amino STBs with other clients (such as VLC) that request RTP/UDP streaming. >Another question what is the bandwitdh limit for the wis-streamer. The application itself has no inherent bandwidth limit. The only limits are those imposed (indirectly) by your hardware, OS, and network. -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ From alvaro.i at ikusi.es Thu Mar 8 06:34:48 2007 From: alvaro.i at ikusi.es (=?iso-8859-1?Q?=C1lvaro_Pajuelo=2C_I=F1aki?=) Date: Thu, 8 Mar 2007 15:34:48 +0100 Subject: [Live-devel] Raw UDP streaming In-Reply-To: Message-ID: <435A66F076E8344FA6D3917739BD99A4914AC8@srviku004.ikusi.net> Thank you for your fast response. Do you Know which command must I use ?. I?aki. -----Mensaje original----- De: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com]En nombre de Ross Finlayson Enviado el: 08/03/2007 03:11 Para: LIVE555 Streaming Media - development & use Asunto: Re: [Live-devel] Raw UDP streaming >Please could you help me to configure the wis-streamer to stream raw >udp rather than standard >RTP streaming. I'm using an Amino IP STB that only acepts raw udp. >It's possible ? Yes - this is already supported by "wis-streamer" (and our other RTSP server applications, such as the "LIVE555 Media Server" ). Note, however, that the stream must be unicast on demand - not multicast, because the Amino STBs (at least, the ones that I have tested) do not handle multicast streams. Note also that if you use an Amino STB client to play your stream, then *all* clients must be Amino STBs. Because of limitations of the software, you cannot mix Amino STBs with other clients (such as VLC) that request RTP/UDP streaming. >Another question what is the bandwitdh limit for the wis-streamer. The application itself has no inherent bandwidth limit. The only limits are those imposed (indirectly) by your hardware, OS, and network. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel From finlayson at live555.com Thu Mar 8 06:45:27 2007 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 8 Mar 2007 06:45:27 -0800 Subject: [Live-devel] Raw UDP streaming In-Reply-To: <435A66F076E8344FA6D3917739BD99A4914AC8@srviku004.ikusi.net> References: <435A66F076E8344FA6D3917739BD99A4914AC8@srviku004.ikusi.net> Message-ID: >Thank you for your fast response. Do you Know which command must I use ?. Please read the documentation: . (Search for "MPEG Transport".) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070308/8e7c688c/attachment-0001.html From alvaro.i at ikusi.es Thu Mar 8 07:50:58 2007 From: alvaro.i at ikusi.es (=?iso-8859-1?Q?=C1lvaro_Pajuelo=2C_I=F1aki?=) Date: Thu, 8 Mar 2007 16:50:58 +0100 Subject: [Live-devel] Raw UDP streaming In-Reply-To: Message-ID: <435A66F076E8344FA6D3917739BD99A4914AC9@srviku004.ikusi.net> Dear Ross, I tried with the parameter you said -mpegtransport but I hope the stream is sent using RTP yet, now I can see it with the amino box, but the image goes slowly. It seems that doesn't have timming information or the information is not OK. In this mode are you using RTP headers or only UDP ?. you include PCR information ?. Best Regards, I?aki. -----Mensaje original----- De: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com]En nombre de Ross Finlayson Enviado el: 08/03/2007 03:45 Para: LIVE555 Streaming Media - development & use Asunto: Re: [Live-devel] Raw UDP streaming Thank you for your fast response. Do you Know which command must I use ?. Please read the documentation: . (Search for "MPEG Transport".) 
-- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070308/51623ffe/attachment.html From finlayson at live555.com Thu Mar 8 09:20:57 2007 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 8 Mar 2007 09:20:57 -0800 Subject: [Live-devel] Raw UDP streaming In-Reply-To: <435A66F076E8344FA6D3917739BD99A4914AC9@srviku004.ikusi.net> References: <435A66F076E8344FA6D3917739BD99A4914AC9@srviku004.ikusi.net> Message-ID: >I tried with the parameter you said -mpegtransport but I hope the >stream is sent using RTP yet, now I can see it >with the amino box, but the image goes slowly. Try playing around with the encoder bit rate and frame rate parameters. That's all I can suggest right now... >In this mode are you using RTP headers or only UDP ? Yes, the Amino STB requests raw UDP streaming, so that's what the server delivers. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070308/442e210d/attachment.html From rmerca at adinet.com.uy Thu Mar 8 09:31:15 2007 From: rmerca at adinet.com.uy (rmerca at adinet.com.uy) Date: Thu, 8 Mar 2007 14:31:15 -0300 (UYT) Subject: [Live-devel] Need to stream wav files over RTP Message-ID: <4006584.1173375075654.JavaMail.tomcat@fe-ps01> Hi: I am developing a web application that has to be able to transmit wav files audio over RTP to a destination IP, where a Cisco IP Phone is ready to accept that RTP stream. I need to transmit 50 RTP packets per second, G711 u-law, and I have wav files format of 8 bit, 8000Hz, Mono. I wonder if you have any software solution that can be used in my ASP. NET application to accomplish this task. I would really appreciate any response Regards, Rodrigo Mercader -------------- next part -------------- An embedded message was scrubbed... From: "rmerca at adinet.com.uy" Subject: Need to stream wav files over RTP Date: Thu, 8 Mar 2007 11:56:20 -0300 (UYT) Size: 1011 Url: http://lists.live555.com/pipermail/live-devel/attachments/20070308/44d3514c/attachment.mht From finlayson at live555.com Thu Mar 8 11:32:17 2007 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 8 Mar 2007 11:32:17 -0800 Subject: [Live-devel] Need to stream wav files over RTP In-Reply-To: <4006584.1173375075654.JavaMail.tomcat@fe-ps01> References: <4006584.1173375075654.JavaMail.tomcat@fe-ps01> Message-ID: >Hi: >I am developing a web application that has to be able to transmit wav >files audio over RTP to a destination IP, where a Cisco IP Phone is >ready >to accept that RTP stream. We have code, and applications (e.g., "testWAVAudioStreamer" (multicast), "testOnDemandRTSPServer" (unicast) and "live555MediaServer" (unicast) that will stream WAV file data via RTP. However, we do not yet have a SIP server implementation, so it's not clear whether or not you will be able to use our code to stream to an IP phone. You will need to evaluate our code for yourself to figure this out. -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ From rmpg2001 at gmail.com Fri Mar 9 00:49:23 2007 From: rmpg2001 at gmail.com (Ramon Martin de Pozuelo) Date: Fri, 9 Mar 2007 09:49:23 +0100 Subject: [Live-devel] frameDurationInMicroseconds In-Reply-To: References: <389189e20703080352g4922797cjbb48c9b1fa48c954@mail.gmail.com> Message-ID: <389189e20703090049k3f2a2fadu62b90e8465ef7056@mail.gmail.com> Hi, I fix it. It's a Windows problem. I add to link options "winmm.lib" library. This library has some functions where you can change Windows timers' precision. Using timeBeginPeriod(1) you set the timer precision to 1 ms. (By default it seems that it's set to 16 ms). Though I solved it in Windows, I think I will start to move my project to UNIX, to avoid more future problems. Thank you very much, Ramon 2007/3/8, Ross Finlayson : > > >Hi all, > >I am working on a H264 Streamer. It works very well with VLC and > >QuickTime but I need that the frames will be sent exactly (as much > >as possible) at 40 mseg, so I set frameDurationInMicroseconds to > >40000. I used Etherreal to analize packet's time and I saw that > >packets are sent in intervals of 32-33 mseg. and 47-48 mseg. Main > >time is correct but is there any way to send packets regulary to 40 > >mseg?? > > > >Someone says to me that it is possible an error of using Windows and > >Visual C++ to build Live libraries. It can be solved building Live > >in UNIX/Linux? > > Perhaps - Windows' timers are notoriously inaccurate. > > -- > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070309/4dc0350b/attachment.html From yelite993 at 163.com Fri Mar 9 01:10:40 2007 From: yelite993 at 163.com (=?gbk?B?0ac=?=) Date: Fri, 9 Mar 2007 17:10:40 +0800 (CST) Subject: [Live-devel] HELP: About the LATM format in ISO/IEC 14496-3 Message-ID: <2886968.2928471173431440414.JavaMail.root@bj163app17.163.com> Dear All, I have a question about the AudioMuxElement() in LATM format in ISO/IEC 14496-3:2005. Following is the Syntax of AudioMuxElement(): AudioMuxElement(muxConfigPresent) { if (muxConfigPresent) { useSameStreamMux; if (!useSameStreamMux) StreamMuxConfig(); } if (audioMuxVersionA == 0) { for (i = 0; i <= numSubFrames; i++) { PayloadLengthInfo(); PayloadMux(); } . . . } } I think somebody should have konwn that the StreamMuxConfig() is not octet-aligned. So I have to shift all of the bytes of the AAC frame to generate the PayloadMux()? In this case it's very inefficient. Can anybody give me some suggestions? And tell me which palyers can play the LATM format audio data? Thank you very much, Best Wishes, yelite -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070309/919f1502/attachment.html From luc47654 at yahoo.it Fri Mar 9 13:25:18 2007 From: luc47654 at yahoo.it (luca norm) Date: Fri, 9 Mar 2007 22:25:18 +0100 (CET) Subject: [Live-devel] Frame Rate and (soft) real-time extensions Message-ID: <751997.57783.qm@web23413.mail.ird.yahoo.com> Hi, I have noticed that in some classes, there's a durationInMicroseconds var which appears as a parameter but it is never set with a value different than 0 (default). 
So, I ask: should I do my own implementation for the frame rate control (using it in doGetNextFrame())? In addition, the variable above is "marked" for microseconds: is this a tip for the implementation? If so, in order to have the control of a task with timing costraints in microseconds , should I add (soft) Real Time extensions to my tasks (for example RTAI or RTLinux)? For example: if I want to make a streamer which streams exactly at 25 fps, should i add these extensions? any help/info is greatly appreciated, Luca ___________________________________ L'email della prossima generazione? Puoi averla con la nuova Yahoo! Mail: http://it.docs.yahoo.com/nowyoucan.html From finlayson at live555.com Fri Mar 9 16:26:40 2007 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 9 Mar 2007 16:26:40 -0800 Subject: [Live-devel] Frame Rate and (soft) real-time extensions In-Reply-To: <751997.57783.qm@web23413.mail.ird.yahoo.com> References: <751997.57783.qm@web23413.mail.ird.yahoo.com> Message-ID: >I have noticed that in some classes, there's a >durationInMicroseconds var which appears as a >parameter but it is never set with a value different >than 0 (default). At present, the "fDurationInMicroseconds" field is currently used only when *sending* RTP packets, to figure out how long to wait until sending the next packet. This means that - in practice - the field need be set only when data is being streamed from a file. (If data is being received - or if data is being streamed from a live source - then it's not important that that variable be set.) >For example: if I want to make a streamer which >streams exactly at 25 fps, should i add these >extensions? Trying to stream packets at a perfectly even rate - without any jitter, is pointless. The network will always add some jitter anyway, and the receiving client(s) - if implemented correctly - will have enough buffer memory to absorb network jitter. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From brainlai at gmail.com Fri Mar 9 19:07:11 2007 From: brainlai at gmail.com (Brain Lai) Date: Sat, 10 Mar 2007 11:07:11 +0800 Subject: [Live-devel] liveness issue on the server side Message-ID: Dear Sir: The RTSPServer has taken care of client session liveness. In contrast, the RTSPClient seems not to note server liveness now. Is it unnecessary or something missed? Regards Brain Lai -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070309/35712abc/attachment-0001.html From finlayson at live555.com Fri Mar 9 21:30:14 2007 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 9 Mar 2007 21:30:14 -0800 Subject: [Live-devel] liveness issue on the server side In-Reply-To: References: Message-ID: >Dear Sir: > >The RTSPServer has taken care of client session liveness. >In contrast, the RTSPClient seems not to note server liveness now. Yes it does - by sending RTCP "Reception Report" (RR) packets. The server interprets incoming RTCP packets (or RTSP commands) from a client as indicating its 'liveness'. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From brainlai at gmail.com Sat Mar 10 00:37:12 2007 From: brainlai at gmail.com (Brain Lai) Date: Sat, 10 Mar 2007 16:37:12 +0800 Subject: [Live-devel] liveness issue on the server side In-Reply-To: References: Message-ID: Well, I mean if an RTSP server dies accidentally or closes the session without sending RTCP BYE, will an RTSPClient sense that with some kind of timeout mechanism? 
It seems that the RTSPClient has not implemented this feature so far ... Regards Brain Lai 2007/3/10, Ross Finlayson : > > >Dear Sir: > > > >The RTSPServer has taken care of client session liveness. > >In contrast, the RTSPClient seems not to note server liveness now. > > Yes it does - by sending RTCP "Reception Report" (RR) packets. The > server interprets incoming RTCP packets (or RTSP commands) from a > client as indicating its 'liveness'. > -- > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070310/6af7283e/attachment.html From info at dnastudios.it Sun Mar 11 14:16:21 2007 From: info at dnastudios.it (DNA STUDIOS s.r.l.) Date: Sun, 11 Mar 2007 23:16:21 +0100 Subject: [Live-devel] livemedia & mplayer Message-ID: <45F47FB5.9050902@dnastudios.it> I have find a bug (i think...) in the way that mplayer use livemedia. When i stream with mplayer from DSS, in "connected user" of Darwin i see 2 connection from the same host...this very strange and i think that this is a bug. I know that mplayer use live555 libraries to stream RTSP, i have tried VLC and openRTS but only with mplayer i see 2 connection in Darwin; i tried with and without -rtsp-stream-over-tcp but is the same thing.... Someone know this "bug" and someone knows in that way can be solved this problem? Thanks. ----------------------------- Nicola From dweber at robotics.net Sun Mar 11 15:44:39 2007 From: dweber at robotics.net (Dan Weber) Date: Sun, 11 Mar 2007 18:44:39 -0500 Subject: [Live-devel] Segfault in doEventLoop Message-ID: <20070311234439.GA2294@Barney.robotics.net> Hi there, I called doEventLoop and it's segfaulting on the fact that it appears SingleStep is inexistant i.e. pure virtual. It compiled fine, and I made sure I was using BasicTaskScheduler::createNew(). In the debugger, the task scheduler clearly was resolved but in BasicTaskScheduler0... it called SingleStep and segfaulted. Do you have any ideas? Dan From finlayson at live555.com Sun Mar 11 16:29:02 2007 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 11 Mar 2007 17:29:02 -0700 Subject: [Live-devel] livemedia & mplayer In-Reply-To: <45F47FB5.9050902@dnastudios.it> References: <45F47FB5.9050902@dnastudios.it> Message-ID: >I have find a bug (i think...) in the way that mplayer use livemedia. >When i stream with mplayer from DSS, in "connected user" of Darwin i see >2 connection from the same host... By "connection", do you mean TCP connection? If so, then yes, it probably is a bug, but Remember, You Have Complete Source Code, which will enable you to track it down. If by "connection", however, you mean UDP socket, then this is normal - there's one socket for RTP, and one for RTCP, for each media type. So, if the stream contains both audio and video, then there will be 4 UDP sockets. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Mon Mar 12 23:14:19 2007 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 12 Mar 2007 23:14:19 -0700 Subject: [Live-devel] Testing - please ignore Message-ID: Testing our new mail server - please ignore. -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ From jarod.dong at gmail.com Tue Mar 13 00:29:56 2007 From: jarod.dong at gmail.com (Jarod Dong) Date: Tue, 13 Mar 2007 15:29:56 +0800 Subject: [Live-devel] the frame gotten from MPEG4ESVideoRTPSource Message-ID: <3099c0f30703130029s443e5a1fn977d6f124054ff8a@mail.gmail.com> Hi everyone, I just want to know what I get when using getNextFrame() in class MPEG4ESVideoRTPSource. Is it a VOP? Thank you. -- Free trade reduces world suffering. From TAYK0004 at ntu.edu.sg Tue Mar 13 01:38:44 2007 From: TAYK0004 at ntu.edu.sg (#TAY KOON HWEE#) Date: Tue, 13 Mar 2007 16:38:44 +0800 Subject: [Live-devel] Multicast over Wireless LAN Message-ID: <438567054C073949AEBE5A28B83E7DE133FCF9@MAIL21.student.main.ntu.edu.sg> Hi guys, I have the following problem with multicasting. My program installed in my PC is able to unicast to laptops/PDA connected wireless to my router. However, for multicasting, it only works for devices connected by LAN cable to my router. Therefore PDA will not be able to receive the multicast while laptop if connected via LAN cable to the router will be able to receive the multicast. Weird thing is if my program installed on my laptop (wireless connected to my router) is about to stream (unicast and multcast) to devices connected by wires to the router. Does the RTCP Bandwidth got to do with it? I have set my estimatedSessionBandwidthVideo = 512. Can anyone advise on the above problem? Thank you and regards. zkunhui -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070313/6e8987d8/attachment.html From stabrawa at stanford.edu Tue Mar 13 02:18:00 2007 From: stabrawa at stanford.edu (Tim Stabrawa) Date: Tue, 13 Mar 2007 04:18:00 -0500 Subject: [Live-devel] Multicast over Wireless LAN In-Reply-To: <438567054C073949AEBE5A28B83E7DE133FCF9@MAIL21.student.main.ntu.edu.sg> References: <438567054C073949AEBE5A28B83E7DE133FCF9@MAIL21.student.main.ntu.edu.sg> Message-ID: <45F66C48.5000409@stanford.edu> #TAY KOON HWEE# wrote: > Hi guys, > > I have the following problem with multicasting. My program installed > in my PC is able to unicast to laptops/PDA connected wireless to my > router. However, for multicasting, it only works for devices connected > by LAN cable to my router. Therefore PDA will not be able to receive > the multicast while laptop if connected via LAN cable to the router > will be able to receive the multicast. > > Weird thing is if my program installed on my laptop (wireless > connected to my router) is about to stream (unicast and multcast) to > devices connected by wires to the router. > > Does the RTCP Bandwidth got to do with it? I have set my > estimatedSessionBandwidthVideo = 512. > > Can anyone advise on the above problem? > My best guess is that your router itself is at fault. The way most switches work with multicast is they'll treat it as broadcast and flood the network. (You'd have to shell out extra bucks to get one that handles it smarter - read up on GMRP (802.1p) if you care.) Now, wireless routers on the other hand, tend to do something funky with how they connect up the wireless and wired clients. For example, the Linksys WRT54G has physically separate interfaces to the main processor for wireless and wired clients. It provides connectivity between the two with a software bridge (which effectively combines them as if they were connected via a hardware switch). 
FWIW, it looks like multicast traffic is being forwarded to my wireless link on my WRT54G (only verified by looking at the LED's though). Chances are your particular router is trying to be smart and is filtering multicast traffic from appearing on the wireless interface (presumably so it doesn't bog down the wireless link with useless data). It's possible, although unlikely, that you can receive the data from your wireless device if you do a GMRP join procedure for the multicast session(s) you're interested in. I've never done this in practice though, since I've never actually seen a switch that supports GMRP. Anyways, hopefully some of this is useful or interesting. It kept me amused writing it at least. :-) Good luck, - Tim From mrnikhilagrawal at gmail.com Mon Mar 12 23:53:34 2007 From: mrnikhilagrawal at gmail.com (Nikhil Agrawal) Date: Tue, 13 Mar 2007 12:23:34 +0530 Subject: [Live-devel] Regarding streaming Jpeg images Message-ID: <733cde3e0703122353l106760cm4af263a53ae2ecad@mail.gmail.com> Hi Ross, I want to stream Jpeg images. I have still pictures in Jpeg format , what classes i need to derive and what needs to be implemented . What functions need to be implemented. Thanks and Regards, Nikhil Agrawal -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070312/289d5619/attachment.html From finlayson at live555.com Tue Mar 13 03:11:17 2007 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 13 Mar 2007 03:11:17 -0700 Subject: [Live-devel] Regarding streaming Jpeg images In-Reply-To: <733cde3e0703122353l106760cm4af263a53ae2ecad@mail.gmail.com> References: <733cde3e0703122353l106760cm4af263a53ae2ecad@mail.gmail.com> Message-ID: >Hi Ross, > >I want to stream Jpeg images. I have still pictures in Jpeg format , >what classes i need to derive and what needs to be implemented . >What functions need to be implemented. Please read the FAQ. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From opera at kth.se Tue Mar 13 03:45:10 2007 From: opera at kth.se (=?ISO-8859-1?Q?Gustaf_R=E4ntil=E4?=) Date: Tue, 13 Mar 2007 11:45:10 +0100 Subject: [Live-devel] Streaming MPEG2 TS with separate audio and video input Message-ID: <45F680B6.5010006@kth.se> Hi, I'd like to stream (with rtsp) a transport stream of MPEG2 to an amino. It works well for single files muxed with video and audio (using the technique as for the onDemand-test-program, with a test.ts file). But I'd like to use one video file and one audio file. When using two instances of : sms->addSubsession(MPEG2TransportFileServerMediaSubsession::createNew(...)); the players (amino, mplayer, vlc) dies or halts. Mplayer says it received two video streams. Is there a solution for how to accomplish this, or do I have to use pre-muxed single files? Note; The output stream needs to be bundled as a TS. Gustaf From rmerca at adinet.com.uy Tue Mar 13 05:33:58 2007 From: rmerca at adinet.com.uy (rmerca at adinet.com.uy) Date: Tue, 13 Mar 2007 09:33:58 -0300 (UYT) Subject: [Live-devel] Need to stream wav files over RTP Message-ID: <23695311.1173789238974.JavaMail.tomcat@fe-ps01> I only need a software capable of handle concurrent RTP request, though, the ability to send RTP from wav to many IP addresses and ports. 
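For reference, a rough sketch of what a unicast WAV-over-RTP sender built on these classes might look like, loosely based on the testWAVAudioStreamer demo. The destination address, port, and file name are placeholders; the sketch assumes the WAV payload is already G.711 u-law (otherwise a conversion filter such as uLawFromPCMAudioSource, or offline conversion, would be needed), the SimpleRTPSink::createNew() argument list should be checked against the current headers, and hitting an exact 50 packets/second pacing is a separate packetization question:

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"
#include "GroupsockHelper.hh"

void afterPlaying(void* /*clientData*/) { /* could exit, or restart the file */ }

int main() {
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  // Send to one phone; repeat this block (one Groupsock + one sink) per destination.
  struct in_addr destAddress;
  destAddress.s_addr = our_inet_addr("192.168.0.50"); // placeholder IP
  const Port rtpPort(16384);                          // placeholder port
  Groupsock rtpGroupsock(*env, destAddress, rtpPort, 255 /*ttl*/);

  FramedSource* source = WAVAudioFileSource::createNew(*env, "prompt.wav");

  // Static RTP payload type 0 == PCMU (G.711 u-law), 8000 Hz clock, mono.
  RTPSink* sink = SimpleRTPSink::createNew(*env, &rtpGroupsock,
                                           0, 8000, "audio", "PCMU", 1);

  sink->startPlaying(*source, afterPlaying, NULL);
  env->taskScheduler().doEventLoop(); // does not return
  return 0;
}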
From finlayson at live555.com Tue Mar 13 06:38:40 2007 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 13 Mar 2007 06:38:40 -0700 Subject: [Live-devel] Streaming MPEG2 TS with separate audio and video input In-Reply-To: <45F680B6.5010006@kth.se> References: <45F680B6.5010006@kth.se> Message-ID: >Hi, > >I'd like to stream (with rtsp) a transport stream of MPEG2 to an amino. >It works well for single files muxed with video and audio (using the >technique as for the onDemand-test-program, with a test.ts file). But >I'd like to use one video file and one audio file. When using two >instances of : >sms->addSubsession(MPEG2TransportFileServerMediaSubsession::createNew(...)); >the players (amino, mplayer, vlc) dies or halts. Mplayer says it >received two video streams. > >Is there a solution for how to accomplish this, or do I have to use >pre-muxed single files? > >Note; The output stream needs to be bundled as a TS. Yes, you can multiplex the input audio and video streams together into a single Transport Stream, and stream that. You would do this using a subclass of "MPEG2TransportStreamMultiplexor" - e.g., "MPEG2TransportStreamFromESSource.hh". For example, look at how the "wis-streamer" code combines separate audio and video sources into a single Transport Stream - see "WISMPEG2TransportStreamServerMediaSubsession.cpp". -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Tue Mar 13 06:52:42 2007 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 13 Mar 2007 06:52:42 -0700 Subject: [Live-devel] the frame gotten from MPEG4ESVideoRTPSource In-Reply-To: <3099c0f30703130029s443e5a1fn977d6f124054ff8a@mail.gmail.com> References: <3099c0f30703130029s443e5a1fn977d6f124054ff8a@mail.gmail.com> Message-ID: >Hi everyone, > >I just want to know what I get when using getNextFrame() in class >MPEG4ESVideoRTPSource.
See RFC 3016 - especially sections 3.2 and 3.3. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070313/a1cdf165/attachment.html From vinodjoshi at tataelxsi.co.in Tue Mar 13 07:24:23 2007 From: vinodjoshi at tataelxsi.co.in (Vinod Madhav Joshi) Date: Tue, 13 Mar 2007 19:54:23 +0530 Subject: [Live-devel] FW: Query for Live555 forum. Message-ID: <002a01c7657b$4ca87640$022a320a@telxsi.com> Hi all, We are using Live 555 Streaming Server to stream MPEG-2 TS to the Set Top Box as a client. The transactions "DESCRIBE", "SETUP"," PLAY" are correctly happening on both server and client side. But we are hit at a problem , the problem is that after a particular time period server stops streaming before "TEARDOWN" transaction to happen. At that instant recv( ) system call is failing. We want to know that whether the client needs to send any message during streaming. Is anybody facing the same problem? What the solution for this can be to prevent this ? Is anybody having more information about this? Thank You. From xcsmith at rockwellcollins.com Tue Mar 13 08:26:45 2007 From: xcsmith at rockwellcollins.com (xcsmith at rockwellcollins.com) Date: Tue, 13 Mar 2007 10:26:45 -0500 Subject: [Live-devel] Re: Query for Live555 forum. Message-ID: >> We want to know that whether the client needs to send any message during streaming. If your client does not implement RTCP, you need to send periodic RTSP messages. If you are not sending RTCP reports or RTSP messages, then the RTSP server will think the client has died, and the server will stop the stream. You can check the way you instantiate the RTSP server, but I think the default might be something like 45 seconds. Maybe this is not what is happening to you though, I'm not sure what you have for a client. How long is the period of time before the stream stops? Is the stream from a file or some other device? 
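For reference: with the LIVE555 RTSP server, that liveness timeout is an argument to the server's constructor rather than something the client controls. A minimal sketch (the parameter is called something like "reclamationTestSeconds"; check RTSPServer.hh for the exact name and its current default, which may not be 45):

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"

RTSPServer* createServerWithLongTimeout(UsageEnvironment& env) {
  // If the server sees neither RTCP reception reports nor RTSP commands from
  // a client for this many seconds, it assumes the client has died and tears
  // the session down, which is what "streaming stops before TEARDOWN" looks like.
  UserAuthenticationDatabase* authDB = NULL; // no access control in this sketch
  return RTSPServer::createNew(env, 8554 /*port*/, authDB,
                               120 /*liveness timeout, in seconds*/);
}

Raising the timeout only hides the problem, though; the cleaner fix is for the client to send RTCP reception reports (or periodic RTSP commands), as discussed above.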
From antoniotirri at libero.it Tue Mar 13 12:46:42 2007 From: antoniotirri at libero.it (Antonio Tirri) Date: Tue, 13 Mar 2007 20:46:42 +0100 Subject: [Live-devel] Simulation and packet loss, part three In-Reply-To: References: <45E60C79.3090606@libero.it> Message-ID: <45F6FFA2.9000902@libero.it> Hi, I edited the live555 media server in order to apply the Gilbert model when sending the video packets (see gilbert model.jpg). The code is:

void MultiFramedRTPSink::sendPacketIfNecessary() {
  // antonio: two-state Gilbert loss model; state 0 = "good" (send), state 1 = "bad" (drop)
  static int state = 0;
  double p = 0.70; // defining p (probability of moving from the good state to the bad state)
  double q = 0.20; // defining q (probability of moving from the bad state back to the good state)
  if (fNumFramesUsedSoFar > 0) {
    if ((state == 0) && ((double)(our_random()%10000))/10000.0 <= p) state = 1;
    else if ((state == 1) && ((double)(our_random()%10000))/10000.0 <= q) state = 0;
    if (state == 0) fRTPInterface.sendPacket(fOutBuf->packet(), fOutBuf->curPacketSize());
    ++fPacketCount;
    fTotalOctetCount += fOutBuf->curPacketSize();
    fOctetCount += fOutBuf->curPacketSize() - rtpHeaderSize - fSpecialHeaderSize - fTotalFrameSpecificHeaderSizes;
    ++fSeqNo; // for next time
  }
  if (fOutBuf->haveOverflowData() && fOutBuf->totalBytesAvailable() > fOutBuf->totalBufferSize()/2) {
    // Efficiency hack: Reset the packet start pointer to just in front of
    // the overflow data (allowing for the RTP header and special headers),
    // so that we probably don't have to "memmove()" the overflow data
    // into place when building the next packet:
    unsigned newPacketStart = fOutBuf->curPacketSize() - (rtpHeaderSize + fSpecialHeaderSize + frameSpecificHeaderSize());
    fOutBuf->adjustPacketStart(newPacketStart);
  } else {
    // Normal case: Reset the packet start pointer back to the start:
    fOutBuf->resetPacketStart();
  }
  fOutBuf->resetOffset();
  fNumFramesUsedSoFar = 0;
  if (fNoFramesLeft) {
    // We're done:
    onSourceClosure(this);
  } else {
    // We have more frames left to send. Figure out when the next frame
    // is due to start playing, then make sure that we wait this long before
    // sending the next packet.
    struct timeval timeNow;
    gettimeofday(&timeNow, NULL);
    int uSecondsToGo;
    if (fNextSendTime.tv_sec < timeNow.tv_sec) {
      uSecondsToGo = 0; // prevents integer underflow if too far behind
    } else {
      uSecondsToGo = (fNextSendTime.tv_sec - timeNow.tv_sec)*1000000 + (fNextSendTime.tv_usec - timeNow.tv_usec);
    }
    // Delay this amount of time:
    nextTask() = envir().taskScheduler().scheduleDelayedTask(uSecondsToGo, (TaskFunc*)sendNext, this);
  }
}

I need to implement a scrambler and a descrambler as shown in modello.gif, in order to implement this technique of scrambling (for more information: http://en.wikipedia.org/wiki/Scrambler_%28randomizer%29 ). How can I start this work? Thanks, Antonio Tirri -------------- next part -------------- A non-text attachment was scrubbed... Name: modello.GIF Type: image/gif Size: 2376 bytes Desc: not available Url : http://lists.live555.com/pipermail/live-devel/attachments/20070313/72f82c1f/attachment-0001.gif -------------- next part -------------- A non-text attachment was scrubbed... Name: gilbert model.jpg Type: image/jpeg Size: 11902 bytes Desc: not available Url : http://lists.live555.com/pipermail/live-devel/attachments/20070313/72f82c1f/attachment-0001.jpg From lkml.list at gmail.com Tue Mar 13 16:44:14 2007 From: lkml.list at gmail.com (Karthik) Date: Tue, 13 Mar 2007 19:44:14 -0400 Subject: [Live-devel] H263+ streamer Message-ID: <5718a99f0703131644q1ee8b6a1w17838f20e5012a89@mail.gmail.com> Hi, I am looking to implement a H263+ streamer. I tried to implement the streamer based on the testMPEG1or2VideoStreamer.cpp file.
I did this by replacing the MPEG1or2 classes as H263plus classes (esp the H263plusVideoRTPSink class). Is this right? Now, I provide a video file encoded using an H263+ encoder as an input the testH263plusVideoStreamer executable. I also built the receiver part of the program by using the H263plusVideoRTPSource class. I stream the video ( video.263) file to the receiver and storethe received video. When I check the file size, they are different. Also, the packet trace generated provides an invalid packet of 3 bytes size at the beginning of the stream. Is this a bug? I also find that some of the header fields of some frames do not match the original H263 stream. Is Live modifying the headers or omitting some (GOB headers)? If anyone else has come across this problem please let me know. TIA, Karthik -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070313/1c1c9fbf/attachment.html From finlayson at live555.com Tue Mar 13 18:25:03 2007 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 13 Mar 2007 18:25:03 -0700 Subject: [Live-devel] H263+ streamer In-Reply-To: <5718a99f0703131644q1ee8b6a1w17838f20e5012a89@mail.gmail.com> References: <5718a99f0703131644q1ee8b6a1w17838f20e5012a89@mail.gmail.com> Message-ID: >I am looking to implement a H263+ streamer. I tried to implement the >streamer based on the testMPEG1or2VideoStreamer.cpp file. > >I did this by replacing the MPEG1or2 classes as H263plus classes >(esp the H263plusVideoRTPSink class). Is this right? Yes. > >Now, I provide a video file encoded using an H263+ encoder as an >input the testH263plusVideoStreamer executable. I also built the >receiver part of the program by using the H263plusVideoRTPSource >class. I stream the video ( video.263) file to the receiver and >storethe received video. > >When I check the file size, they are different. Also, the packet >trace generated provides an invalid packet of 3 bytes size at the >beginning of the stream. Is this a bug? Perhaps. Some of the H.263 code - specifically, "H263plusVideoFileServerMediaSubsession.cpp", "H263plusVideoStreamFramer.cpp" and "H263plusVideoStreamParser.cpp" - was written by a 3rd party, and I have not had time to review it in detail myself. So it's conceivable that there might be a bug there (or perhaps in some of the other H.263-related code). >I also find that some of the header fields of some frames do not >match the original H263 stream. Is Live modifying the headers or >omitting some (GOB headers)? I don't know. Unfortunaetly you're going to have to track this down yourself, by reviewing the source code, and RFC 2429, which it implements. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From jarod.dong at gmail.com Tue Mar 13 20:50:36 2007 From: jarod.dong at gmail.com (Jarod Dong) Date: Wed, 14 Mar 2007 11:50:36 +0800 Subject: [Live-devel] the frame gotten from MPEG4ESVideoRTPSource In-Reply-To: References: <3099c0f30703130029s443e5a1fn977d6f124054ff8a@mail.gmail.com> Message-ID: <3099c0f30703132050l372326l7d9e55e677e87fc1@mail.gmail.com> Hi Ross, As my understanding of the RFC, there should be some frames containing GOV headers, but my test shows that all frames started with VOP headers. 2007/3/13, Ross Finlayson : > > > Hi everyone, > > I just want to know what I get when using getNextFrame() in class > MPEG4ESVideoRTPSource. > > > See RFC 3016 - > especially sections 3.2 and 3.3. -- > > > > Ross Finlayson > Live Networks, Inc. 
> http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -- Free trade reduces world suffering. From vinodjoshi at tataelxsi.co.in Tue Mar 13 23:09:44 2007 From: vinodjoshi at tataelxsi.co.in (Vinod Madhav Joshi) Date: Wed, 14 Mar 2007 11:39:44 +0530 Subject: [Live-devel] Query for Live555 forum. In-Reply-To: Message-ID: <002d01c765ff$5d611da0$022a320a@telxsi.com> Hi, Thanks for the reply. The video is stopping near about 45 seconds. I want to know at what time interval do the client needs to send RTCP packets or RTSP message. If RTSP message can be sent to the server, which specific RTSP message do we need to send? Thank You.. -----Original Message----- From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com]On Behalf Of xcsmith at rockwellcollins.com Sent: Tuesday, March 13, 2007 8:57 PM To: live-devel at ns.live555.com Subject: [Live-devel] Re: Query for Live555 forum. >> We want to know that whether the client needs to send any message during streaming. If your client does not implement RTCP, you need to send periodic RTSP messages. If you are not sending RTCP reports or RTSP messages, then the RTSP server will think the client has died, and the server will stop the stream. You can check the way you instantiate the RTSP server, but I think the default might be something like 45 seconds. Maybe this is not what is happening to you though, I'm not sure what you have for a client. How long is the period of time before the stream stops? Is the stream from a file or some other device? _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel From finlayson at live555.com Tue Mar 13 23:17:32 2007 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 13 Mar 2007 23:17:32 -0700 Subject: [Live-devel] Query for Live555 forum. In-Reply-To: <002d01c765ff$5d611da0$022a320a@telxsi.com> References: <002d01c765ff$5d611da0$022a320a@telxsi.com> Message-ID: > I want to know at what time interval do the client needs to send RTCP >packets or RTSP message. If RTCP is implemented properly, then RTCP packets will be sent much more frequently than 45s. Note that - according to the RTP standard - RTCP is *not* optional. You should implement it (provided of course, that you are streaming via RTP, and not raw-UDP). > If RTSP message can be sent to the server, which specific RTSP message > do we need to send? "GET_PARAMETER" will work. (Our RTSP server implementation will ignore the contents of that message, and just send back an empty response.) (Note that the "LIVE555 Streaming Media" software includes a RTSP/RTP/RTCP *client* implementation, as well as a server implementation.) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From mrnikhilagrawal at gmail.com Wed Mar 14 03:54:49 2007 From: mrnikhilagrawal at gmail.com (Nikhil Agrawal) Date: Wed, 14 Mar 2007 16:24:49 +0530 Subject: [Live-devel] Regarding streaming Jpeg images In-Reply-To: References: <733cde3e0703122353l106760cm4af263a53ae2ecad@mail.gmail.com> Message-ID: <733cde3e0703140354o42f741d0vf4794daad5a87426@mail.gmail.com> Hi, I have implemented all the classes ( subclass of JPEGVideoSource and JPEGVideoFileServerMediaSubsession). I have two queries 1. I am using a single image output.jpeg ( taken from live555 forum mail attachment). 
I dont know what is the Q factor of the image. How can i calculate Q factor.(I tried with some default values) (also provided in attachment) 2. What the type value i should use , i have tried both 0 and 1. 3. Looking into packets that are flowing across network , i found that first packet( 1428 bytes approx) is missing and remaining data is flowing.Also I dumped data using client OpenRTSP , is shows data with first few ( ~ 1428) bytes missing and remining present). What further steps I need to take? Regards, Nikhil Agrawal On 3/13/07, Ross Finlayson wrote: > > >Hi Ross, > > > >I want to stream Jpeg images. I have still pictures in Jpeg format , > >what classes i need to derive and what needs to be implemented . > >What functions need to be implemented. > > Please read the FAQ. > -- > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070314/98d9199f/attachment-0001.html -------------- next part -------------- A non-text attachment was scrubbed... Name: test.jpg Type: image/jpeg Size: 22060 bytes Desc: not available Url : http://lists.live555.com/pipermail/live-devel/attachments/20070314/98d9199f/attachment-0001.jpg From kurutepe at nue.tu-berlin.de Wed Mar 14 06:01:04 2007 From: kurutepe at nue.tu-berlin.de (Engin Kurutepe) Date: Wed, 14 Mar 2007 14:01:04 +0100 Subject: [Live-devel] doEventLoop watch variable Message-ID: <45F7F210.8020107@nue.tu-berlin.de> Dear Ross, I want to implement stream switching in my RTSP client. So after I start streaming by doEventLoop(watch), I understand I can interrupt the loop by setting *watch=1. My idea is to use a separate control thread to watch for keyboard inputs, which will set the watch variable. If the event loop is interrupted streams will be switched and another doEventLoop(watch) to keep streaming. My question is how are the scheduled tasks affected when the event loop is interrupted. Are they unscheduled or do they continue unaffected after the loop resumes? Thanks a lot, Engin. From finlayson at live555.com Wed Mar 14 06:14:53 2007 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 14 Mar 2007 06:14:53 -0700 Subject: [Live-devel] doEventLoop watch variable In-Reply-To: <45F7F210.8020107@nue.tu-berlin.de> References: <45F7F210.8020107@nue.tu-berlin.de> Message-ID: >I want to implement stream switching in my RTSP client. So after I start >streaming by doEventLoop(watch), I understand I can interrupt the loop >by setting *watch=1. > >My idea is to use a separate control thread to watch for keyboard >inputs, which will set the watch variable. If the event loop is >interrupted streams will be switched and another doEventLoop(watch) to >keep streaming. > >My question is how are the scheduled tasks affected when the event loop >is interrupted. Are they unscheduled or do they continue unaffected >after the loop resumes? The latter. Any pending tasks remain in the scheduler until they are handled (or removed). -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ From bidibulle at operamail.com Wed Mar 14 07:51:06 2007 From: bidibulle at operamail.com (David Betrand) Date: Wed, 14 Mar 2007 15:51:06 +0100 Subject: [Live-devel] track synchronization with live555 Message-ID: <20070314145107.1B87ECA0A7@ws5-11.us4.outblaze.com> Hello Ross, This mail will again talk about synchronization issues, sorry about that ;-) I know live555 library uses RTCP SR reports to synchronize audio and video streams, but doesn't use the rtp-info header present in the PLAY response. I will be quite direct : I think you shouldn't, and should follow section 12.33 from RFC2326 and section 14.38 from draft-ietf-mmusic-rfc2326bis-14.txt instead. Those standards clearly state that the only way to guarantee synchronization is to use 'rtptime' timestamps from the rtp-info header. I think that the misunderstanding here is that the library doesn't make the difference between NPT and NTP times. Yes, RTCP SRs can be used to map RTP timestamps into NTP wall clock, but only 'rtptime' can be used to map RTP timestamps into the NPT time. The main problems with relying on SR is that you need at least one SR from each track before you can do this, and not all implementations send SRs right away (note also that immediate SRs are effectively nothing much different than RTP-Info values, though the values actually used would likely be different). The other problem is that data can be delayed in the network at different rates meaning even if the source isnt doing this intentionally, it could happen anyway. Let's take the case for which I'm currently in trouble using live555 library: my application is a RTSP client connecting to a live encoder (Envivio M2 to be accurate) that can be accessed via RTSP. Here is the sequence of packets I receive from the encoder, for the audio track(AMR, 8000 Hz sampling) : t0 : reception of rtp-info : RTP timestamp = 11146 t0 + 3.7 sec : reception of first RTP packet : RTP timestamp = 32373 t0 + 3.9 sec : reception of second RTP packet : RTP timestamp = 33970 t0 + 4 sec : reception of first RTCP SR : RTP timestamp = 43150 t0 + 4.1 sec : reception of next RTP packet : RTP timestamp = 35570 So, -if you look at event #1 and #3 in this sequence, this seems fine : 4 seconds elapse and the timestamps increase with (4 * 8000) = 32000 -if you look at events #2 #3 and #5 in this sequence, this seems fine too because the RTP timstamp differences are typical for AMR (1600 timestamps, for 200 ms) But : - if you look at event #1 and #2, you see that almost 4 seconds elapse but the RTP timestamps only increase with 21200 while we could expect a value around 30000. --> look at the timestamp jump between event#3 and event #4 : very big ! --> look at the timstamp "jump back" between event #4 and event #5 : a negative value representing approximatively 1 second ! --> ALL those jumps will be reflected in RTPReceptionStats::fSyncTime and the resultPresentationTime will of course follow those jumps too ... The "uncommon" thing in those timestamps is that the synchronizations points given in the rtp-info' and in the SR are not multiplexed in time sequence with the surrounding RTP packets : it is like the audio packets have to be played in the past ! As unexpected it might be, I don't think it is actually illegal in any way. If you've seen something in the relevant specs that makes you disagree, of course let me know ... 
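To make that concrete, the kind of mapping argued for here could be sketched roughly as follows (the variable names are illustrative assumptions only, not existing live555 API):

// Sketch only: map an incoming RTP timestamp to NPT using the PLAY response.
// rtpInfoRtptime : the "rtptime" value for this track, from the RTP-Info header
// rangeNptStart  : the start value of the "Range: npt=..." header, in seconds
// clockRate      : the track's RTP timestamp frequency (e.g. 8000 for AMR)
#include <stdint.h>

double nptForPacket(uint32_t rtpTimestamp, uint32_t rtpInfoRtptime,
                    double rangeNptStart, unsigned clockRate) {
  uint32_t elapsedTicks = rtpTimestamp - rtpInfoRtptime; // unsigned arithmetic handles wrap-around
  return rangeNptStart + (double)elapsedTicks / clockRate;
}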
So, my opinion is that only the rtp-info header shouldbe used to map the RTP timestamps to a common point in time (corresponding to NPT in PLAY response). On the other hand, SRs MAY be used to furthermore compute the difference between the receiver wall clock and the sender wal clock in case of long presentations. I don't know why but I've the feeling that you won't agree easily ... Am I right ? Looking forward to sharing this with U David -- _______________________________________________ Surf the Web in a faster, safer and easier way: Download Opera 9 at http://www.opera.com Powered by Outblaze From finlayson at live555.com Wed Mar 14 08:30:30 2007 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 14 Mar 2007 08:30:30 -0700 Subject: [Live-devel] track synchronization with live555 In-Reply-To: <20070314145107.1B87ECA0A7@ws5-11.us4.outblaze.com> References: <20070314145107.1B87ECA0A7@ws5-11.us4.outblaze.com> Message-ID: >I know live555 library uses RTCP SR reports to synchronize audio and >video streams, but doesn't use the rtp-info header present in the >PLAY response. It does, however, make this header information available for clients to use - see "MediaSession.hh". You're correct, though, that the RTP-Info header information is the only way to generate a proper presentation time for RTP packets that arrive before the first RTCP SR packet. The VLC developers (in particular, Derk-Jan Hartman) tell me that they are currently modifying VLC to use the RTP-Info header information upon initialization. At some point I should probably modify the LIVE555 "RTSPClient" code to do this automatically, so that each client application developer doesn't have to deal with this himself. Note, though, that not all RTP receivers use RTSP (or SIP) for set-up, and so not all will have "RTP-Info" information available. These receivers must be prepared to deal with inaccurate presentation times prior to the arrival of the first RTCP SR. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From opera at kth.se Wed Mar 14 08:46:23 2007 From: opera at kth.se (=?ISO-8859-1?Q?Gustaf_R=E4ntil=E4?=) Date: Wed, 14 Mar 2007 16:46:23 +0100 Subject: [Live-devel] Streaming MPEG2 TS with separate audio and video input In-Reply-To: References: <45F680B6.5010006@kth.se> Message-ID: <45F818CF.7000908@kth.se> Ross Finlayson wrote: >> Hi, >> >> I'd like to stream (with rtsp) a transport stream of MPEG2 to an amino. >> It works well for single files muxed with video and audio (using the >> technique as for the onDemand-test-program, with a test.ts file). But >> I'd like to use one video file and one audio file. When using two >> instances of : >> sms->addSubsession(MPEG2TransportFileServerMediaSubsession::createNew(...)); >> the players (amino, mplayer, vlc) dies or halts. Mplayer says it >> received two video streams. >> >> Is there a solution for how to accomplish this, or do I have to use >> pre-muxed single files? >> >> Note; The output stream needs to be bundled as a TS. >> > > Yes, you can multiplex the input audio and video streams together > into a single Transport Stream, and stream that. You would do this > using a subclass of "MPEG2TransportStreamMultiplexor" - e.g., > "MPEG2TransportStreamFromESSource.hh". > > For example, look at how the "wis-streamer" code > combines separate audio and > video sources into a single Transport Stream - see > "WISMPEG2TransportStreamServerMediaSubsession.cpp". > These solutions are very complex and unfortunately not really doing what I need. 
To stream many simultaneous TS streams without using too much CPU, I would like to not have to transcode the sources. So what I want is to read a video TS file and an audio TS file (AC3 for instance) and stream as a muxed TS. Is this possible? I can't find any class for reading AC3 files for instance. Am I supposed to write my own AC3-reader and subclass some FrameSource-thing? As I wrote, I tried to add an MPEG2TransportFileServerMediaSubsession and added it to a ServerMediaSession which works (if I only add one video file). Tried by adding an audio file (as TS) but when I play, mplayer says it received two video streams. To me it seems that MPEG2TransportFileServerMediaSubsession only reads _video_ TS files, which the class name doesn't really state. The examples from the WiS contains a lot of reading V4L-files etc. I want to read video and audio TS files into ServerMediaSubsessions. Is there no support for this? Gustaf From bidibulle at operamail.com Wed Mar 14 09:04:58 2007 From: bidibulle at operamail.com (David Betrand) Date: Wed, 14 Mar 2007 17:04:58 +0100 Subject: [Live-devel] track synchronization with live555 Message-ID: <20070314160458.9A840CA0A4@ws5-11.us4.outblaze.com> Ross, > It does, however, make this header information available for clients > to use - see "MediaSession.hh". I know but the only place where this struct is filled in is within RTSPClient class. And having a look at parseRTPInfoHeader(), I see that only the first track is parsed in this header so I guess this is a bug in the library. > > You're correct, though, that the RTP-Info header information is the > only way to generate a proper presentation time for RTP packets that > arrive before the first RTCP SR packet. I think it is even more "incorrect" than that in the library now. Even after receiving the first SR, your presentation times are not coherent in some circumstances. This is because you compute fSyncTime first with timeNow, and then with the SR NTP, confusing NPT and NTP. You should use those SR another way, by comparing your wall clock with the wall clock of the sender. My example in my previous mail clearly illustrates this case. The VLC developers (in > particular, Derk-Jan Hartman) tell me that they are currently > modifying VLC to use the RTP-Info header information upon > initialization. I saw this thread and had a look at the VLC code. Apparently, VLC only needs the sequence numbers present in this rtp-info header. This is useful for seeking and pausing/resuming the stream At some point I should probably modify the LIVE555 > "RTSPClient" code to do this automatically, so that each client > application developer doesn't have to deal with this himself. As the way I see it is quite different from the current implementation, what I could suggest is to send a patch with a IFDEF clause, allowing to switch between a SR-based synchronization (current implementation), and a rtp-info-based synchronization ? Would it be a good start for you ? > > Note, though, that not all RTP receivers use RTSP (or SIP) for > set-up, and so not all will have "RTP-Info" information available. > These receivers must be prepared to deal with inaccurate presentation > times prior to the arrival of the first RTCP SR. Yes I know, and my application DOES care about such clients (in particular SIP clients) so we will try to handle those also in the patch Best Regards, David > -- > > Ross Finlayson > Live Networks, Inc. 
> http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > -- _______________________________________________ Surf the Web in a faster, safer and easier way: Download Opera 9 at http://www.opera.com Powered by Outblaze From xcsmith at rockwellcollins.com Wed Mar 14 14:32:48 2007 From: xcsmith at rockwellcollins.com (xcsmith at rockwellcollins.com) Date: Wed, 14 Mar 2007 16:32:48 -0500 Subject: [Live-devel] RAW UDP Mulicast Message-ID: Hello! I was updating to LIVE555 latest version, and I noticed in RTSPServer::handleCmd_SETUP, the Transport Parameters for Multicast RTP_UDP are different than the Transport Parameters for Multicast RAW_UDP. Shouldn't RAW_UDP streaming use "port" instead of "client_port" "server_port" because it is multicast? I did not want to change this myself because I was not sure what effect it would have on that specialized client which needs raw UDP multicast. xochitl From finlayson at live555.com Wed Mar 14 17:16:48 2007 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 14 Mar 2007 17:16:48 -0700 Subject: [Live-devel] RAW UDP Mulicast In-Reply-To: References: Message-ID: >I was updating to LIVE555 latest version, and I noticed in >RTSPServer::handleCmd_SETUP, the Transport Parameters for Multicast RTP_UDP >are different than the Transport Parameters for Multicast RAW_UDP. >Shouldn't RAW_UDP streaming use "port" instead of "client_port" >"server_port" because it is multicast? Yes. Thanks for noticing this; it will be fixed in the next release of the software. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Wed Mar 14 17:36:37 2007 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 14 Mar 2007 17:36:37 -0700 Subject: [Live-devel] Streaming MPEG2 TS with separate audio and video input In-Reply-To: <45F818CF.7000908@kth.se> References: <45F680B6.5010006@kth.se> <45F818CF.7000908@kth.se> Message-ID: >These solutions are very complex and unfortunately not really doing what >I need. >To stream many simultaneous TS streams without using too much CPU, I >would like to not have to transcode the sources. What I described in my earlier message was not "transcoding" - at least, not really. It was simply taking existing MPEG Elementary Stream data, and packaging it into an outgoing Transport Stream. >So what I want is to read a video TS file and an audio TS file (AC3 for >instance) and stream as a muxed TS. Is this possible? There's currently no support in the library for this - sorry. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Wed Mar 14 18:02:05 2007 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 14 Mar 2007 18:02:05 -0700 Subject: [Live-devel] track synchronization with live555 In-Reply-To: <20070314160458.9A840CA0A4@ws5-11.us4.outblaze.com> References: <20070314160458.9A840CA0A4@ws5-11.us4.outblaze.com> Message-ID: > > It does, however, make this header information available for clients >> to use - see "MediaSession.hh". > >I know but the only place where this struct is filled in is within >RTSPClient class. And having a look at parseRTPInfoHeader(), I see >that only the first track is parsed in this header Yes, that code currently works only when parsing the response to a *non-aggregate* PLAY command (i.e., a PLAY for a single track only). 
>As the way I see it is quite different from the current
>implementation, what I could suggest is to send a patch with a IFDEF
>clause, allowing to switch between a SR-based synchronization
>(current implementation), and a rtp-info-based synchronization ?

No, I think you're confused here. RTCP SR packets are *always* used to generate synchronized presentation times. This is not - and never will be - optional functionality.

Note that "presentation time" is not the same as NPT ("normal play time"). For example, if the user seeks backwards in the stream, then NPT will move backwards, but RTP timestamps - and thus presentation times - from the server will continue moving forward (except for things like video B-frames, for which the presentation times may move a short distance backwards).

I was incorrect in my earlier response when I said that the RTP-Info header information could be used to generate presentation times prior to the arrival of the first RTCP SR. That's incorrect because, as you noted, the RTP-Info header information maps RTP timestamps to NPT, not presentation time. Because NPT is an application-level concept, it's up to the application-level code that calls our library to keep track of NPT, and the offset between presentation time and NPT. (This is what I presume the ongoing revision to VLC will be doing better.)
-- 

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
From zhouh31415 at 163.com Wed Mar 14 18:59:22 2007
From: zhouh31415 at 163.com (=?gbk?B?1ty66w==?=)
Date: Thu, 15 Mar 2007 09:59:22 +0800 (CST)
Subject: [Live-devel] RTP but no RTCP VS UDP
Message-ID: <1205473188.2577791173923962760.JavaMail.root@bj163app125.163.com>

Hi,
 I think UDP is more efficient if I run RTP without RTCP, isn't it?
 Now I can stream a raw H.264 stream; the stream has no container like TS. I found VLC could play it online by using RTSP. But if I save the stream on my hard disk, VLC cannot play the file. Why?
 Thank you!
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://lists.live555.com/pipermail/live-devel/attachments/20070314/73fa202e/attachment.html
From finlayson at live555.com Wed Mar 14 20:05:50 2007
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 14 Mar 2007 20:05:50 -0700
Subject: [Live-devel] RTP but no RTCP VS UDP
In-Reply-To: <1205473188.2577791173923962760.JavaMail.root@bj163app125.163.com>
References: <1205473188.2577791173923962760.JavaMail.root@bj163app125.163.com>
Message-ID: 

>Hi,
> I think UDP is more efficient if I run RTP without RTCP, isn't it?

RTCP is quite efficient, and is *not* optional. If you implement RTP, you should also implement RTCP. (Fortunately, the "LIVE555 Streaming Media" software implements RTCP easily, using the "RTCPInstance" class.)

>
> Now I can stream a raw H.264 stream; the stream has no container
>like TS. I found VLC could play it online by using RTSP. But if I
>save the stream on my hard disk, VLC cannot play the file. Why?

I don't know. That is a question for a VLC mailing list.
-- 

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
From rupak.p at gmail.com Wed Mar 14 22:08:16 2007
From: rupak.p at gmail.com (Rupak Patel)
Date: Thu, 15 Mar 2007 10:38:16 +0530
Subject: [Live-devel] Regarding JPEG Streaming
Message-ID: 

Hi Nikhil,

I am also working on JPEG streaming using Live555. In my case all packets are going to the client. I have tried using OpenRTSP and dumped the received data to a file, and it correctly forms a JPEG image that got streamed using the -m option.
Ross, I am facing another problem, I read earlier mails and accordingly sent only data after header ( excluding complete header) , but I am able to see a blank (dark) screen on both Quicktime and VLC players. Also image is getting reconstructed properly using -m option with OpenRTSP client. I have given some arbitrary QFactor but atleast it must show some graphics (may be wild ones). Can you tell me where I am going wrong. Bye, Rupak. -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070314/f739699e/attachment.html From finlayson at live555.com Wed Mar 14 23:06:01 2007 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 14 Mar 2007 23:06:01 -0700 Subject: [Live-devel] Regarding JPEG Streaming Message-ID: >I am facing another problem, I read earlier mails and accordingly >sent only data after header ( excluding complete header) , but I am >able to see a blank (dark) screen on both Quicktime and VLC players. >Also image is getting reconstructed properly using -m option with >OpenRTSP client. That's good, because it suggests that your RTSP/RTP streaming is working OK. > I have given some arbitrary QFactor but atleast it must show some >graphics (may be wild ones). Can you tell me where I am going wrong. Unfortunately not, because it seems the problem is with the media player's decoder/renderer, rather than with the RTSP/RTP streaming. I suggest looking into why VLC is not playing your stream properly (e.g., by looking at the source code, and asking on VLC mailing lists). -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From mihaim at tfm.ro Thu Mar 15 03:37:10 2007 From: mihaim at tfm.ro (Mihai Moldovanu) Date: Thu, 15 Mar 2007 12:37:10 +0200 Subject: [Live-devel] Darwin Injector problems ("precondition failed") Message-ID: <45F921D6.1020706@tfm.ro> Hello , //I tried the MPEG4 darwin injector test program. And i get the infamous error: //injector->setDestination() failed: cannot handle ANNOUNCE response: RTSP/1.0 412 Precondition Failed My darwin version is 5.5.4. It does it every singe time no matter how much time i wait between runs. The only difference between the original injector and my version is that i used authentification: // Next, specify the destination Darwin Streaming Server: if (!injector->setDestination(dssNameOrAddress, remoteStreamName, programName, "LIVE555 Streaming Media",554,"my_user","my_pass","test","test")) { Any ideea what's wrong ? Regards, Mihai Moldovanu From kamildobk at poczta.onet.pl Thu Mar 15 05:27:29 2007 From: kamildobk at poczta.onet.pl (Kamil) Date: Thu, 15 Mar 2007 13:27:29 +0100 Subject: [Live-devel] Fw: Implementing RTP streaming from live source /multithreading issues/ Message-ID: <004001c766fd$6aeeea70$6703a8c0@KAMILNET> From: kamil To: live-devel at lists.live555.com Sent: Thursday, March 15, 2007 11:17 AM Subject: Implementing RTP streaming from live source /multithreading issues/ I have following problem. I have to implement RTP server, which streams the data from video capture board and on the other side I have to implement RTP client which receives and decodes that stream. The problem is how to notify server thread that a new video frame is available for streaming ? My capture board delivers frames in separate threads ( I have the same problem on the client side. 
LiveMedia runs in a separate thread, but I need to connect, disconnect and invoke PLAY/PAUSE from the main GUI thread.)
I searched through the list archives but found only some ugly hacks that set a bool variable and check it when select() times out. I need a more efficient solution, because I'm going to stream up to 400 fps from twenty or more cameras (it could also be a single camera at 1 fps).
Is there any other solution, e.g. a dummy socket, or changing all sockets to WSAEventSelect() (I'm running it on Windows)? Has anybody tried it?

Please help ...

Thank you in advance
Kamil
From luc47654 at yahoo.it Thu Mar 15 09:22:09 2007
From: luc47654 at yahoo.it (luca norm)
Date: Thu, 15 Mar 2007 17:22:09 +0100 (CET)
Subject: [Live-devel] Frame Rate and (soft) real-time extensions
In-Reply-To: 
Message-ID: <20070315162210.30339.qmail@web23410.mail.ird.yahoo.com>

>
> >For example: if I want to make a streamer which
> >streams exactly at 25 fps, should i add these
> >extensions?
>
> Trying to stream packets at a perfectly even rate -
> without any
> jitter, is pointless. The network will always add
> some jitter
> anyway, and the receiving client(s) - if implemented
> correctly - will
> have enough buffer memory to absorb network jitter.

Consider what follows:
A live source sends frames to me through a socket and I have to stream them: in this case there is jitter in each timestamp that I have to set on each frame, due to network latencies between the live source and the streamer.
Suppose now that the live source is not a socket but an acquisition card which acquires frames at a deterministic frequency and with no latency between samples. In this case, before streaming each frame I can set (with real-time extensions, like RTAI or RTLinux) an accurate timestamp, and if the sampling rate is around 50 fps the resulting absence of jitter is significant.
In addition: the real-time extensions mentioned above also support deterministic UDP networking (so I ask whether they could be added to RTP streaming over UDP).

What do you think about this?

regards,
Luca

___________________________________ 
The next-generation email? You can have it with the new Yahoo! Mail: http://it.docs.yahoo.com/nowyoucan.html
From finlayson at live555.com Thu Mar 15 09:35:34 2007
From: finlayson at live555.com (Ross Finlayson)
Date: Thu, 15 Mar 2007 09:35:34 -0700
Subject: [Live-devel] Frame Rate and (soft) real-time extensions
In-Reply-To: <20070315162210.30339.qmail@web23410.mail.ird.yahoo.com>
References: <20070315162210.30339.qmail@web23410.mail.ird.yahoo.com>
Message-ID: 

>Suppose now that the live source is not a socket but
>an acquisition card which acquires frames at a
>deterministic frequency and with no latency between
>samples. In this case, before streaming each frame I
>can set (with real-time extensions, like RTAI or
>RTLinux) an accurate timestamp, and if the sampling
>rate is around 50 fps the resulting absence of jitter
>is significant.
>In addition: the real-time extensions mentioned above
>also support deterministic UDP networking (so I ask
>whether they could be added to RTP streaming over
>UDP).

Again, there'd be little or no benefit to doing this. Because you have an accurate timestamp for your data when it is acquired, the RTP timestamps (i.e., presentation times) in each outgoing packet will also be accurate. Therefore the network jitter between the server and client(s) doesn't matter. That's the whole point of having RTP timestamps.
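In practice, an "accurate timestamp" here simply means recording the capture time into "fPresentationTime" at the moment the frame is delivered. Below is a minimal sketch of such a live source; it is an illustration only, and "readFrameFromCard()" is a hypothetical placeholder for whatever capture API the card provides, not an existing live555 call:

// Sketch of a live capture source that stamps each frame with its capture time.
// Illustrative only; error handling, buffering and the real capture API are omitted.
#include "FramedSource.hh"
#include "GroupsockHelper.hh" // for gettimeofday()

class CaptureCardSource: public FramedSource {
public:
  static CaptureCardSource* createNew(UsageEnvironment& env) {
    return new CaptureCardSource(env);
  }
protected:
  CaptureCardSource(UsageEnvironment& env) : FramedSource(env) {}
private:
  virtual void doGetNextFrame() {
    // "readFrameFromCard()" stands in for the card's own (blocking) capture call:
    fFrameSize = readFrameFromCard(fTo, fMaxSize);

    // The capture time *is* the presentation time:
    gettimeofday(&fPresentationTime, NULL);

    // Tell the downstream object (e.g. an RTP sink) that the frame is ready.
    // (A real source would usually deliver asynchronously, via the event loop,
    // rather than synchronously like this.)
    FramedSource::afterGetting(this);
  }

  unsigned readFrameFromCard(unsigned char* to, unsigned maxSize) {
    // Placeholder: a real implementation would copy one encoded frame into "to"
    // and return its size (<= maxSize).
    (void)to; (void)maxSize;
    return 0;
  }
};

The RTP sink then derives each packet's RTP timestamp from "fPresentationTime", so the receiver sees the capture clock regardless of how unevenly the packets travel.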
Remember, we're Internet people, not old world ATM-style 'telephants' :-) :-) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From marthi at graphics.cs.uni-sb.de Thu Mar 15 10:04:36 2007 From: marthi at graphics.cs.uni-sb.de (Martin) Date: Thu, 15 Mar 2007 18:04:36 +0100 Subject: [Live-devel] Information accessible in TransmissionStatsDB Message-ID: <45F97CA4.5070905@graphics.cs.uni-sb.de> Hi Ross, I have some questions regarding data accessible in TransmissionStatsDB. Can you please confirm the exact meaning of the return values of the function and its units. 1) I think that both lastSRTime() diffSR_RRTime() return the time in units of 1/65536 seconds. Is that right? 2) timeCreated(): I always receive a constant value. Is that the NTP time the session was created? 3) lastTimeReceived() Does it return the NTP time the last report was received (for calculation of round trip delay)? 4) lastPacketNumReceived() Is that the "extended highest sequence number received" as defined in RFC3550? Furthermore I have a question regaring the sending ofrtcp reports. It is possible to send RTCP reports in a defined time interval through the sendReport() function. But still the "normal" scheduled RTCP-Reports are beeing send. Is it possible to disable the sending of these "normal" RTCP reports? Thank you very much for your help! Martin From bidibulle at operamail.com Thu Mar 15 11:40:13 2007 From: bidibulle at operamail.com (David Betrand) Date: Thu, 15 Mar 2007 19:40:13 +0100 Subject: [Live-devel] track synchronization with live555 Message-ID: <20070315184013.9071044011@ws5-1.us4.outblaze.com> Ross, > No, I think you're confused here. RTCP SR packets are *always* used > to generate synchronized presentation times. This is not - and never > will be - optional functionality. I agree that's the only way to synchronize if rtp-info is not available. > Note that "presentation time" is > not the same as NPT ("normal play time"). I shouldn't have talk about NPT, it introduced confusion. My idea was to show you that there is probably something incorrect in the way you process timestamps in noteIncomingSR(). You can't simply say something like : fSyncTime = ntpTimestamp; //wall clock time received in SR fSyncTimestamp = rtpTimestamp; //timestamp received in the SR You should instead do something like : ts= timevalToTimestamp(ntpTimestamp)-timestampBase; // express in timestamps the synchronization time difference between us and the sender fSyncTimestamp = rtpTimestamp + ts; // update our reference timestamp, taking into account this difference Furtheremore, I really would like to use the rtp-info if it is available. This allows to have live555 library handle synchronization perfectly in case of RTSP (in this kind of applications -not real-time- where it is the most important) Anyway, let me a couple of days and I will send a complete proposal for this. I hope we will be able to continue this thread at that time ... Cheers, David -- _______________________________________________ Surf the Web in a faster, safer and easier way: Download Opera 9 at http://www.opera.com Powered by Outblaze From xcsmith at rockwellcollins.com Thu Mar 15 11:42:59 2007 From: xcsmith at rockwellcollins.com (xcsmith at rockwellcollins.com) Date: Thu, 15 Mar 2007 13:42:59 -0500 Subject: [Live-devel] how to patch LIVE555 Message-ID: I have created some patch files like this: diff -c -b -r live > patch1 When I unroll the latest version fresh and try to use the patch file, some items fail. 
Any things to watch out for when making patch files? Is it an issue if a few items fail? patch -p1 -d live < patch1 Thx! xochitl From finlayson at live555.com Thu Mar 15 14:21:33 2007 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 15 Mar 2007 14:21:33 -0700 Subject: [Live-devel] track synchronization with live555 In-Reply-To: <20070315184013.9071044011@ws5-1.us4.outblaze.com> References: <20070315184013.9071044011@ws5-1.us4.outblaze.com> Message-ID: >You can't simply say something like : >fSyncTime = ntpTimestamp; //wall clock time received in SR >fSyncTimestamp = rtpTimestamp; //timestamp received in the SR I can, and I do :-) This is perfectly correct. The NTP time is the presentation time - using the sender's 'wall clock' - that corresponds to the corresponding RTP timestamp. Presentation times derived from this are passed directly to the receiver, for its use in rendering the media (and synchronizing with other streams - i.e., 'lip sync'). Note that there is no notion here of "synchronization time difference between us and the sender" as you mentioned in your last message. (The receiver *might* choose to keep track of this - or, more precisely, the drift over time between the sender and receiver clocks - and make appropriate adjustments to its buffering/rendering.) But that is not what RTP gives you. The timestamps/presentation times that RTP (and thus the LIVE555 library) gives you are presentation times based on the sender's clock. >Furtheremore, I really would like to use the rtp-info if it is available. It *is* available - in "MediaSession.hh". This is what you would use to keep track of NPT, but *not* presentation time. >Anyway, let me a couple of days and I will send a complete proposal for this. I don't believe there is anything wrong with the current code. It implements the RTP/RTCP specification. Therefore it's unlikely that I will be making any changes to the existing code in this case. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From xngzhng at yahoo.com.cn Thu Mar 15 20:13:12 2007 From: xngzhng at yahoo.com.cn (xiang zhang) Date: Fri, 16 Mar 2007 11:13:12 +0800 (CST) Subject: [Live-devel] About the RTSP server Message-ID: <922214.4891.qm@web15613.mail.cnb.yahoo.com> Hello! I am Zhang Xiang, graduate student from Zhe Jiang University,China. I make a embeded system as RTSP server for real-time H.264 stream. And I manage to establish a connection between this server and a remote PC. At the very beginning, the remote PC could play this video stream 12 fps with Mplayer under linux environment and the play is real-time. But after about one minute, the transmission is slow down to 3 fps or 2 fps. I find that there is a delay before calling dogetnextfram() function. Why is there such delay? Thank you !! --------------------------------- ????????-3.5G???20M??? -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070315/8966a6e5/attachment.html From vinodjoshi at tataelxsi.co.in Thu Mar 15 21:47:36 2007 From: vinodjoshi at tataelxsi.co.in (Vinod Madhav Joshi) Date: Fri, 16 Mar 2007 10:17:36 +0530 Subject: [Live-devel] Streaming problem for set top box Message-ID: <000001c76786$38729360$022a320a@telxsi.com> Hi all, We are using Live 555 Streaming Media Server to stream MPEG2 TS to the set toop box as client. We have one Multiple Program Transport Stream which can be streamed without any problem to the set top box. 
Also it was not possible to generate a .tsx for such a file to support Trick Play. So we converted it to an SPTS and generated a .tsx for that, which is greater than 0 bytes.
With that stream we are able to stream to VLC as a client fine.
But when we stream the same stream to the set top box, it gives glitches and blurred video, and sometimes it will not start decoding even though the server is still streaming. We are sure that on the server side there is no problem.
Is there any way to control the rate to resolve it? Can we control the rate of streaming from the server side? Or is there any other solution for this?

Thank You.

From susovan at tataelxsi.co.in Thu Mar 15 22:38:59 2007
From: susovan at tataelxsi.co.in (Susovan Ghosh)
Date: Fri, 16 Mar 2007 11:08:59 +0530
Subject: [Live-devel] RTSP message format for FF & FR
Message-ID: <000601c7678d$65c35be0$0a2a320a@telxsi.com>

Hi all,
    we are able to stream an MPEG2 TS file to the STB with Live 555, and it is OK. Now we want to implement fast forward and fast rewind functionality. We know that we have to change the Scale value.
    We have a print statement to display the messages from the client, and tested with VLC and the QuickTime player. It displayed all DESCRIBE, SETUP, PLAY and PAUSE messages, but it is not displaying anything for FF & FR.
    So can anyone let me know what the format is, or whether we can do this with this message format --

C->S: PLAY rtsp://server IP/Movie name RTSP/1.0
CSeq: 835
Session: 12345678
Scale = 2.0 For Fast Forward
Range: npt=10-15

C->S: PLAY rtsp://server IP/Movie name RTSP/1.0
CSeq: 835
Session: 12345678
Scale = -2.0 For Fast Rewind
Range: npt=10-15

Thank You

SUSOVAN GHOSH
PH No-9986667320
Engineer (D&D)
PRDE
TATAELXSI LIMITED

From asmundg at snap.tv Fri Mar 16 02:36:40 2007
From: asmundg at snap.tv (=?utf-8?q?=C3=85smund_Grammeltvedt?=)
Date: Fri, 16 Mar 2007 10:36:40 +0100
Subject: [Live-devel] RAW UDP Mulicast
In-Reply-To: 
References: 
Message-ID: <200703161036.43231.asmundg@snap.tv>

On Thursday 15 March 2007 01:16, Ross Finlayson wrote:
> >I was updating to LIVE555 latest version, and I noticed in
> >RTSPServer::handleCmd_SETUP, the Transport Parameters for Multicast
> > RTP_UDP are different than the Transport Parameters for Multicast
> > RAW_UDP. Shouldn't RAW_UDP streaming use "port" instead of "client_port"
> >"server_port" because it is multicast?
>
> Yes. Thanks for noticing this; it will be fixed in the next release
> of the software.

Whoops, I guess that's partially my mistake. Anyway, rfc-2326 indicates that port/client_port/server_port are RTP-specific, so I'm not sure if they should be there at all.

-- 
Åsmund Grammeltvedt
Snap TV
-------------- next part --------------
A non-text attachment was scrubbed...
Name: not available
Type: application/pgp-signature
Size: 189 bytes
Desc: not available
Url : http://lists.live555.com/pipermail/live-devel/attachments/20070316/12854b89/attachment.bin
From susovan at tataelxsi.co.in Fri Mar 16 03:59:31 2007
From: susovan at tataelxsi.co.in (Susovan Ghosh)
Date: Fri, 16 Mar 2007 16:29:31 +0530
Subject: [Live-devel] Query for Live555 forum
Message-ID: <002301c767ba$2d0ca590$0a2a320a@telxsi.com>

Hi all,
    We are using Live 555 Streaming Media Server to stream MPEG2 TS to the set top box as client.
    We have one Multiple Program Transport Stream which can be streamed without any problem to the set top box.
With that stream we are able to stream VLC as a client fine. But for the same stream to stream to the set top box its giving glitches and giving blurred video and sometimes it will not start decoding, and still server is streaming.I can not under stand the problem. 1)Is it a problem with bit or fram rate miss match or buffer(our STB buffer size 7*188). 2) Is there any way to control bit or fram rate to resolve it? 3)Can we control rate of streaming from server side or configure the server? 4)any other solution can be for this? Thank You. SUSOVAN GHOSH PH No-9986667320 Engineer (D&D) PRDE TATAELXSI LIMITED From finlayson at live555.com Fri Mar 16 05:03:06 2007 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 16 Mar 2007 13:03:06 +0100 Subject: [Live-devel] RAW UDP Mulicast In-Reply-To: <200703161036.43231.asmundg@snap.tv> References: <200703161036.43231.asmundg@snap.tv> Message-ID: > > >I was updating to LIVE555 latest version, and I noticed in >> >RTSPServer::handleCmd_SETUP, the Transport Parameters for Multicast >> > RTP_UDP are different than the Transport Parameters for Multicast >> > RAW_UDP. Shouldn't RAW_UDP streaming use "port" instead of "client_port" >> >"server_port" because it is multicast? >> >> Yes. Thanks for noticing this; it will be fixed in the next release > > of the software. > >Whoops, I guess that's partially my mistake. Anyway, rfc-2326 indicates that >port/client_port/server_port are RTP-specific, so I'm not sure if they should >be there at all. Yes, there's really no standard at all for how raw-UDP RTSP streams are supposed to work, so right now we're pretty much 'winging it', based on the NCube & Amino 'de facto standard' (for raw-UDP Transport Streams via RTSP). -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From xcsmith at rockwellcollins.com Fri Mar 16 13:59:06 2007 From: xcsmith at rockwellcollins.com (xcsmith at rockwellcollins.com) Date: Fri, 16 Mar 2007 15:59:06 -0500 Subject: [Live-devel] RTP-Info: rtptime Message-ID: I noticed one of my non-LIVE555 RTSP servers report a negative rtptime in a response to a PLAY message. Can rtptime be negative? I thought it rolled over to 0, but would not be negative. Thx. xcsmith From bidibulle at operamail.com Sat Mar 17 00:56:51 2007 From: bidibulle at operamail.com (David Betrand) Date: Sat, 17 Mar 2007 08:56:51 +0100 Subject: [Live-devel] RTP-Info: rtptime Message-ID: <20070317075651.61E4CCA0A4@ws5-11.us4.outblaze.com> This must be a bug in your RTSP server. minimal value for 'rtptime' is zero. David > ----- Original Message ----- > From: xcsmith at rockwellcollins.com > To: "LIVE555 Streaming Media - development & use" > Subject: [Live-devel] RTP-Info: rtptime > Date: Fri, 16 Mar 2007 15:59:06 -0500 > > > > I noticed one of my non-LIVE555 RTSP servers report a negative rtptime in a > response to a PLAY message. Can rtptime be negative? I thought it rolled > over to 0, but would not be negative. Thx. 
> > xcsmith > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > -- _______________________________________________ Surf the Web in a faster, safer and easier way: Download Opera 9 at http://www.opera.com Powered by Outblaze From finlayson at live555.com Sun Mar 18 00:48:42 2007 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 18 Mar 2007 08:48:42 +0100 Subject: [Live-devel] Query for Live555 forum In-Reply-To: <002301c767ba$2d0ca590$0a2a320a@telxsi.com> References: <002301c767ba$2d0ca590$0a2a320a@telxsi.com> Message-ID: > With that stream we are able to stream VLC as a client fine. > But for the same stream to stream to the set top box its giving >glitches and giving blurred video and sometimes it will not start > decoding, and still server is streaming.I can not under stand the >problem. > 1)Is it a problem with bit or fram rate miss match or buffer(our STB >buffer size 7*188). 7*188 bytes (probably) the size of a network *packet*, not the amount of memory buffering available at your STB client - which should (I hope) be much larger than this > 2) Is there any way to control bit or fram rate to resolve it? > 3)Can we control rate of streaming from server side or configure the >server? No - the server streams the Transport Stream data at its natural rate. (How could it do anything different??) If you want to stream at a lower bit rate, you will need to reencode your Transport Stream to use a lower bit rate. Because VLC plays your stream OK, the problem is with your STB client, so you will need to ask the manufacturer of this client for help. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Sun Mar 18 06:44:31 2007 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 18 Mar 2007 14:44:31 +0100 Subject: [Live-devel] RTSP message format for FF & FR In-Reply-To: <000601c7678d$65c35be0$0a2a320a@telxsi.com> References: <000601c7678d$65c35be0$0a2a320a@telxsi.com> Message-ID: > we are able to stream MPEG2 ts file to the STB by Live 555 ,and it is >ok. Now we want fast forward and fast rewind functionality to implement.we >know that we have to > change scale value. > We have print state ment to display the message from client and >tested with VLC and Quick time player.It displayed all >DESCRIBE,SETUP,PLAY,PAUSE > messages.But it is not displaying anything for FF & FR. > So can any one help me to let me know waht is the format or can we do >this with this message format -- > > C->S: PLAY rtsp://server IP/Movie name RTSP/1.0 > CSeq: 835 > Session: 12345678 > Scale = 2.0 For Fast Forward Should be: Scale: 2.0 > Range: npt=10-15 > > > C->S: PLAY rtsp://server IP/Movie name RTSP/1.0 > CSeq: 835 > Session: 12345678 > Scale = -2.0 For Fast Rewind Should be: Scale: 2.0 > Range: npt=10-15 Should be: Range: npt=15-10 Please review RFC 2326 - the RTSP (1.0) specification. -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ From morgan.torvolt at gmail.com Sun Mar 18 08:55:37 2007 From: morgan.torvolt at gmail.com (=?ISO-8859-1?Q?Morgan_T=F8rvolt?=) Date: Sun, 18 Mar 2007 19:55:37 +0400 Subject: [Live-devel] Query for Live555 forum In-Reply-To: <002301c767ba$2d0ca590$0a2a320a@telxsi.com> References: <002301c767ba$2d0ca590$0a2a320a@telxsi.com> Message-ID: <3cc3561f0703180855u2833e7eal50f27052f19984a1@mail.gmail.com> > But for the same stream to stream to the set top box its giving > glitches and giving blurred video and sometimes it will not start > decoding, and still server is streaming.I can not under stand the > problem. Check if the stream has variable bitrate. Try without variable bitrate. Some of the stbs have as little as 64kB buffer, and some even less, which could cause problems with VBR streams. -Morgan- From nmsguru at yahoo.com Sun Mar 18 18:01:45 2007 From: nmsguru at yahoo.com (Cuong Nguyen) Date: Sun, 18 Mar 2007 18:01:45 -0700 (PDT) Subject: [Live-devel] Receiving & re-transmitting MPEG2TS stream? Message-ID: <315278.54966.qm@web83801.mail.sp1.yahoo.com> I'm working on an application where I need to receive an mpeg2-ts / RTP stream then strip off the RTP header to get back just the mpeg2-ts then I need to demux the TS file to get back the ES and manipulate the stream and finally serve the stream using normal RTSP (meaning the audio & video in separated RTP streams). I've been looking through the live555 code and I can see how easy it would be for me to create an MPEG2-TS from ES or PS. I can see how to stream out MPEG PS or ES via RTP. I don't see a way to convert from a TS to a PS, nor do I see a "source" for mpeg2-ts over RTP. I can see that I can receive a simple RTP source & I can see that I could receive MPEG1/2 video or audio via RTP but I don't see a source for MPEG2-TS. Am I missing something? Is there a way to do this within live555? Thanks for any help. Charlie. From jarod.dong at gmail.com Sun Mar 18 20:25:19 2007 From: jarod.dong at gmail.com (Jarod Dong) Date: Mon, 19 Mar 2007 11:25:19 +0800 Subject: [Live-devel] The RTP timestamp of MPEG-4 ES Message-ID: <3099c0f30703182025v44eb40f4ue545524a47b88c59@mail.gmail.com> Hi, I use live555 to get MPEG-4 RTP packets from a DSS server, I find out that several VOPs have the same timestamp. if one VOP is one frame, then the timestamps shouldn't be the same, how could this happen? Regards, Jarod -- Free trade reduces world suffering. From finlayson at live555.com Mon Mar 19 02:34:26 2007 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 19 Mar 2007 10:34:26 +0100 Subject: [Live-devel] The RTP timestamp of MPEG-4 ES In-Reply-To: <3099c0f30703182025v44eb40f4ue545524a47b88c59@mail.gmail.com> References: <3099c0f30703182025v44eb40f4ue545524a47b88c59@mail.gmail.com> Message-ID: >Hi, > >I use live555 to get MPEG-4 RTP packets from a DSS server, I find out >that several VOPs have the same timestamp. if one VOP is one frame, >then the timestamps shouldn't be the same, how could this happen? I don't know - that is a question for a DSS mailing list. (Perhaps the DSS's file was not 'hinted' properly?) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Mon Mar 19 02:41:55 2007 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 19 Mar 2007 10:41:55 +0100 Subject: [Live-devel] Receiving & re-transmitting MPEG2TS stream? 
In-Reply-To: <315278.54966.qm@web83801.mail.sp1.yahoo.com> References: <315278.54966.qm@web83801.mail.sp1.yahoo.com> Message-ID: >I'm working on an application where I need to receive an mpeg2-ts / >RTP stream then strip off the RTP header to get back just the >mpeg2-ts then I need to demux the TS file to get back the ES and >manipulate the stream and finally serve the stream using normal RTSP >(meaning the audio & video in separated RTP streams). > >I've been looking through the live555 code and I can see how easy it >would be for me to create an MPEG2-TS from ES or PS. I can see how >to stream out MPEG PS or ES via RTP. I don't see a way to convert >from a TS to a PS, nor do I see a "source" for mpeg2-ts over RTP. I >can see that I can receive a simple RTP source & I can see that I >could receive MPEG1/2 video or audio via RTP but I don't see a >source for MPEG2-TS. MPEG2-TS RTP streams are received using "SimpleRTPSource". (See "liveMedia/MediaSession.cpp", line 721) However, our libraries currently do not contain a mechanism for demultiplexing Elementary Streams from a Transport Stream. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From info at dnastudios.it Mon Mar 19 06:08:37 2007 From: info at dnastudios.it (DNA Studios s.r.l.) Date: Mon, 19 Mar 2007 14:08:37 +0100 Subject: [Live-devel] to hint a movie file Message-ID: <45FE8B55.20401@dnastudios.it> Hi to all, Now i hint my movies with mpeg4ip with this command: mp4creator -hint "track that i want to hint...audio or video". I can't hint with QT because my movies are encoded on h264 main profile. There is an other way/tool to hint for Darwin streaming server? Thanks. ------------------------ Nicola -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070319/54df3c0e/attachment.html From finlayson at live555.com Mon Mar 19 06:16:46 2007 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 19 Mar 2007 14:16:46 +0100 Subject: [Live-devel] to hint a movie file In-Reply-To: <45FE8B55.20401@dnastudios.it> References: <45FE8B55.20401@dnastudios.it> Message-ID: 'Hinting' is a trick (hack) that is used/needed *only* for users of the "Darwin Streaming Server". It has nothing to do with us. This is *not* the right mailing list for your question - you should ask a "mpeg4ip" or "DSS" mailing list. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From marthi at graphics.cs.uni-sb.de Mon Mar 19 07:47:54 2007 From: marthi at graphics.cs.uni-sb.de (Martin) Date: Mon, 19 Mar 2007 15:47:54 +0100 Subject: [Live-devel] Units and meanings of members/functions in TransmissionStats Message-ID: <45FEA29A.4030007@graphics.cs.uni-sb.de> Hello Ross, I looked into the source code to get the exact meaning of members and functions in TransmissionStats. Since I'm not sure about everything, I would be really glad if you could confirm the units and meanings below: 1) I think that both lastSRTime() diffSR_RRTime() return the time in units of 1/65536 seconds. Is that right? 2) timeCreated(): I always receive a constant value. Is that the NTP time the session was created? 3) lastTimeReceived() Does it return the NTP time the last report was received (for calculation of round trip delay)? 4) lastPacketNumReceived() Is that the "extended highest sequence number received" as defined in RFC3550? Thank you very much for your help! 
Martin From antoniotirri at libero.it Mon Mar 19 10:50:04 2007 From: antoniotirri at libero.it (Antonio Tirri) Date: Mon, 19 Mar 2007 17:50:04 -0000 Subject: [Live-devel] Please, help me with m4e format In-Reply-To: References: <45E60C79.3090606@libero.it> Message-ID: <46279D86.6060506@libero.it> Hello, I have got a file in .yuv in CIF format. From this file, (or from a .m4v file) i need to obtain an .m4e file. How can i do it? I read the FAQ but it only esplains how to obtain a .m4e file from an .mp4 file streamed by a streaming server via rtsp. Thank You From susovan at tataelxsi.co.in Mon Mar 19 23:52:17 2007 From: susovan at tataelxsi.co.in (Susovan Ghosh) Date: Tue, 20 Mar 2007 12:22:17 +0530 Subject: [Live-devel] Query for Live555 forum In-Reply-To: <3cc3561f0703180855u2833e7eal50f27052f19984a1@mail.gmail.com> Message-ID: <003701c76abc$4d563640$0a2a320a@telxsi.com> hi, Thank you very much for reply.I checked that audio video buffer size in the stb client is 64KB.I changed that valu upto three times,and got the same result.What should be the size of the audio video buffer? I made an experiment with VLS server instade of live 555 server, and STB client could play it.Then what will be the proper cause. Than k You SUSOVAN GHOSH PH No-9986667320 Engineer (D&D) PRDE TATAELXSI LIMITED -----Original Message----- From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com]On Behalf Of Morgan Torvolt Sent: Sunday, March 18, 2007 9:26 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Query for Live555 forum > But for the same stream to stream to the set top box its giving > glitches and giving blurred video and sometimes it will not start > decoding, and still server is streaming.I can not under stand the > problem. Check if the stream has variable bitrate. Try without variable bitrate. Some of the stbs have as little as 64kB buffer, and some even less, which could cause problems with VBR streams. -Morgan- _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel From finlayson at live555.com Tue Mar 20 02:17:16 2007 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 20 Mar 2007 10:17:16 +0100 Subject: [Live-devel] Query for Live555 forum In-Reply-To: <003701c76abc$4d563640$0a2a320a@telxsi.com> References: <003701c76abc$4d563640$0a2a320a@telxsi.com> Message-ID: > Thank you very much for reply.I checked that audio video buffer size in >the stb client is 64KB.I changed that valu upto > three times,and got the same result.What should be the size of the audio >video buffer? I don't know, but 64 kBytes is ridiculously small. Any STB manufacturer that builds such limited hardware in this day and age is probably not going remain in business long... One thing you could try is adjusting the timing parameters in the "MPEG2TransportStreamFramer" code to try to compensate for your brain-damaged client hardware. E.g., you could try reducing "MAX_PLAYOUT_BUFFER_DURATION" from 0.1 seconds to 0.05 seconds, or even lower. -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ From morgan.torvolt at gmail.com Tue Mar 20 02:54:31 2007 From: morgan.torvolt at gmail.com (=?ISO-8859-1?Q?Morgan_T=F8rvolt?=) Date: Tue, 20 Mar 2007 13:54:31 +0400 Subject: [Live-devel] Query for Live555 forum In-Reply-To: <003701c76abc$4d563640$0a2a320a@telxsi.com> References: <3cc3561f0703180855u2833e7eal50f27052f19984a1@mail.gmail.com> <003701c76abc$4d563640$0a2a320a@telxsi.com> Message-ID: <3cc3561f0703200254x50c91efdo45a062d7ce701598@mail.gmail.com> Hi 64kB or even 256kB could still be way to little if you have a highly variable bitrate stream. Is it buffer underrun you get or is it buffer overrun? You can sort of check this Try one without variable bitrate first to see if that works, and also with single service transportstream. I have never tested live with multiple service files. There could be something going wrong in the transportstreamframer if one of the services has wierd PCR timestamps or something like that. Ross: I don't think many STB producer has much more buffer than that. It is: 64kB / (8Mbit/s)/(8bit/byte) = 64ms buffer on a high bandwidth stream, and of course double that or more on a low bandwidth stream. I guess you have access to the same NDA documents from Amino that I do, so you can check what they use yourself. Amino is definately still going strong, so I do not agree with your assessment there. There is of course also a decoding buffer that keeps the I and P frames for decoding of the B frames, but it must still be fed data according to the DTS in the PES headers. Most local area lans and ADSL network connections have maximum 5ms delay from server, and usually the network jitter is much less than that. The reason for making the buffer like this is as noted earlier, the low delay trick play functionality. Having a two second buffer is not ideal when going ffwd at 32x and pressing play when you see where you want to watch from. It will take you a minute forward before you are back at 1x. That would give a bad user experience. There is some logic behind it all I believe. It puts alot of responsibility on the server though, but in a total system cost analysis you would also want a more expensive server rather than more expensive clients most of the time. Especially in big installations with multicast and thousands of clients. This is why I have been trying to push for better timing of the transportstream data in Live. -Morgan- On 20/03/07, Susovan Ghosh wrote: > hi, > Thank you very much for reply.I checked that audio video buffer size in > the stb client is 64KB.I changed that valu upto > three times,and got the same result.What should be the size of the audio > video buffer? > I made an experiment with VLS server instade of live 555 > server, and STB client could play it.Then what will > be the proper cause. > > Than > k You > > SUSOVAN GHOSH > PH No-9986667320 > Engineer (D&D) > PRDE > TATAELXSI LIMITED > > > -----Original Message----- > From: live-devel-bounces at ns.live555.com > [mailto:live-devel-bounces at ns.live555.com]On Behalf Of Morgan Torvolt > Sent: Sunday, March 18, 2007 9:26 PM > To: LIVE555 Streaming Media - development & use > Subject: Re: [Live-devel] Query for Live555 forum > > > > But for the same stream to stream to the set top box its giving > > glitches and giving blurred video and sometimes it will not start > > decoding, and still server is streaming.I can not under stand the > > problem. > > Check if the stream has variable bitrate. Try without variable > bitrate. 
Some of the stbs have as little as 64kB buffer, and some even > less, which could cause problems with VBR streams. > > -Morgan- > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > From susovan at tataelxsi.co.in Tue Mar 20 03:48:31 2007 From: susovan at tataelxsi.co.in (Susovan Ghosh) Date: Tue, 20 Mar 2007 16:18:31 +0530 Subject: [Live-devel] Query for Live555 forum In-Reply-To: Message-ID: <003c01c76add$4daae2f0$0a2a320a@telxsi.com> Hi, Thanx for help me.I tried it and changed the value of "MAX_PLAYOUT_BUFFER_DURATION" from 1.0 to .05 upto .01.But get the same result.Now i will try with without variable bit rate stream. What may be the diffrence that client can play multi service TS But can not play single ervice TS . Tha nk you SUSOVAN GHOSH PH No-9986667320 Engineer (D&D) PRDE TATAELXSI LIMITED -----Original Message----- From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com]On Behalf Of Ross Finlayson Sent: Tuesday, March 20, 2007 2:47 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Query for Live555 forum > Thank you very much for reply.I checked that audio video buffer size in >the stb client is 64KB.I changed that valu upto > three times,and got the same result.What should be the size of the audio >video buffer? I don't know, but 64 kBytes is ridiculously small. Any STB manufacturer that builds such limited hardware in this day and age is probably not going remain in business long... One thing you could try is adjusting the timing parameters in the "MPEG2TransportStreamFramer" code to try to compensate for your brain-damaged client hardware. E.g., you could try reducing "MAX_PLAYOUT_BUFFER_DURATION" from 0.1 seconds to 0.05 seconds, or even lower. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel From finlayson at live555.com Tue Mar 20 06:22:45 2007 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 20 Mar 2007 14:22:45 +0100 Subject: [Live-devel] Query for Live555 forum In-Reply-To: <3cc3561f0703200254x50c91efdo45a062d7ce701598@mail.gmail.com> References: <3cc3561f0703180855u2833e7eal50f27052f19984a1@mail.gmail.com> <003701c76abc$4d563640$0a2a320a@telxsi.com> <3cc3561f0703200254x50c91efdo45a062d7ce701598@mail.gmail.com> Message-ID: >I guess >you have access to the same NDA documents from Amino that I do No, I don't have *any* documentation about the Amino STBs. (We just happen to have an AmiNet 110 sitting around as a result of a project done for an earlier customer; that's what I've been using for testing.) Once again, Live Networks, Inc. has no affiliation with Amino Technologies, and we don't know enough about their hardware to debug it (even if we had the time/inclination to do so; remember that they're not the only STB hardware company in existence). If people are having problems with our server(s) streaming to Amino's hardware, then the ball is in Amino's court to help figure out exactly why. We have done our part by making our source code available, and by providing this public forum. We also have plans (discussed extensively on this list in early February; see the mail archives for details) to provide additional Transport Stream 'framer' classes that should produce smoother inter-packet delays when streaming. 
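For background, the 'smoothing' in question comes down to deriving a per-packet transmission duration from the PCR timestamps that the Transport Stream itself carries - which is essentially what "MPEG2TransportStreamFramer" already tries to estimate. A simplified, purely illustrative calculation (not the library's actual framer code) looks like this:

    #include <stdint.h>

    // Convert a 27 MHz PCR value to seconds.
    static double pcrToSeconds(uint64_t pcr27MHz) {
      return pcr27MHz / 27000000.0;
    }

    // Spread the wall-clock interval between two successive PCRs evenly over
    // the 188-byte TS packets that lie between them.
    static double perPacketDurationSeconds(uint64_t prevPCR, uint64_t curPCR,
                                           unsigned packetsBetween) {
      if (packetsBetween == 0 || curPCR <= prevPCR) return 0.0; // discontinuity; caller must cope
      return (pcrToSeconds(curPCR) - pcrToSeconds(prevPCR)) / packetsBetween;
    }

Spacing packets out at that per-packet duration, rather than sending a whole read buffer back-to-back, is what keeps the instantaneous bitrate close to something a small client buffer can absorb.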
(No, there is no ETA for this; please don't ask :-) >Most local area lans and ADSL network connections have maximum 5ms >delay from server, and usually the network jitter is much less than >that. Agreed. However, the point stands that if any such client wishes to work well over the general Internet - as opposed to a single LAN only - then it *must* provide sufficient jitter buffering. There is *nothing* that a server can ever do that can overcome a lack of sufficient buffering at the client. One of the basic principles of the Internet is that the network is assumed to be dumb, and therefore intelligence is migrated to the endpoints. In particular, this means that endpoints should provide sufficient buffer memory to compensate for network jitter. With memory being dirt cheap, there is really no excuse anymore not to do this. (If your hardware budget is so limited that you can't provide more than 64 kBytes of buffer memory, then perhaps you just can't expect your the general Internet to be part of your client's target application space.) > The reason for making the buffer like this is as noted earlier, >the low delay trick play functionality. Having a two second buffer is >not ideal when going ffwd at 32x and pressing play when you see where >you want to watch from. It will take you a minute forward before you >are back at 1x. I don't buy this. When the user presses 1x play again, the client (STB) can just flush any remaining data in the buffer, and request that the server resume 1x play from the point in time (NPT) that the user wishes to resume, not just the last place where it streamed data at 32x. The RTSP protocol supports this (using the "Range: npt=" header in the "PLAY" request). (Again: Make the endpoints sufficiently intelligent...) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From morgan.torvolt at gmail.com Tue Mar 20 07:24:13 2007 From: morgan.torvolt at gmail.com (=?ISO-8859-1?Q?Morgan_T=F8rvolt?=) Date: Tue, 20 Mar 2007 18:24:13 +0400 Subject: [Live-devel] Query for Live555 forum In-Reply-To: References: <3cc3561f0703180855u2833e7eal50f27052f19984a1@mail.gmail.com> <003701c76abc$4d563640$0a2a320a@telxsi.com> <3cc3561f0703200254x50c91efdo45a062d7ce701598@mail.gmail.com> Message-ID: <3cc3561f0703200724r78684cf5l9a9cb67d57ee1e72@mail.gmail.com> > We also have plans (discussed > extensively on this list in early February; see the mail archives for > details) to provide additional Transport Stream 'framer' classes that > should produce smoother inter-packet delays when streaming. (No, > there is no ETA for this; please don't ask :-) I know, It was part of discussion. Did not know the fact about you not having a deal with Amino though. I will advice them to get in touch with you. > Agreed. However, the point stands that if any such client wishes to > work well over the general Internet - as opposed to a single LAN only > - then it *must* provide sufficient jitter buffering. There is > *nothing* that a server can ever do that can overcome a lack of > sufficient buffering at the client. I believe that very few STBs are produced for use over general internet. It would be way to unreliable for high quality TV anyway as bit error rates at 10^-9 or better is neede to not degrade user experience notably, and packetloss is a common problem on internet. Boxes are supposed to be on a strictly controlled network with lots of datarate headroom, at least in the backbone. > this. 
(If your hardware budget is so limited that you can't provide > more than 64 kBytes of buffer memory, then perhaps you just can't > expect your the general Internet to be part of your client's target > application space.) Exactly. Nobody even wants other providers to provide triple play services trough their first mile equipment, so it will probably not happen in near future eighter. If you don't need it, why pay for it? Very small buffer also gives simpler software, possibly simpler circuits and board layout, less errors (as larger memory chips have a higher error rate), easier to design and so on. > I don't buy this. When the user presses 1x play again, the client > (STB) can just flush any remaining data in the buffer, and request > that the server resume 1x play from the point in time (NPT) that the > user wishes to resume, not just the last place where it streamed data > at 32x. The RTSP protocol supports this (using the "Range: npt=" > header in the "PLAY" request). (Again: Make the endpoints > sufficiently intelligent...) Agreed. That would be the smart thing to do. The problem could be syncing the decoder again. If the server and STB is sufficiently smart, it should work. I do not know of any standard on how to do this, so I suspect it would be implemented differently on different servers, giving much of the same problem RTSP has today, but probably even worse. -Morgan- From jlfurlong at hotmail.com Tue Mar 20 10:41:37 2007 From: jlfurlong at hotmail.com (Jeff Furlong) Date: Tue, 20 Mar 2007 14:41:37 -0300 Subject: [Live-devel] WinCE select issue Message-ID: Has anyone tested the live555 server on WinCE? I have a version built and running on a wince device which will accept rtsp requests from other devices. In this case I'm using raw udp as the video transport. The rtsp handshaking works fine, but as soon as the PLAY command is received, the server attempts to send out the video over UDP and the select() method in the BasicTaskScheduler exits with an error code WSAEINVAL. I've noticed the comment about windows which required a dummy readable socket, so I tried this for WinCE. Now, after it creates the dummy socket it never returns from the select() command. Thanks, Jeff -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070320/153f9e59/attachment.html From finlayson at live555.com Tue Mar 20 11:26:14 2007 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 20 Mar 2007 19:26:14 +0100 Subject: [Live-devel] WinCE select issue In-Reply-To: References: Message-ID: >I have a version built and running on a wince device which will >accept rtsp requests from other devices. In this case I'm using raw >udp as the video transport. The rtsp handshaking works fine, but as >soon as the PLAY command is received, the server attempts to send >out the video over UDP and the select() method in the >BasicTaskScheduler exits with an error code WSAEINVAL. I've noticed >the comment about windows which required a dummy readable socket, so >I tried this for WinCE. Now, after it creates the dummy socket it >never returns from the select() command. Until your server receives another RTSP command, the only way you'll return from select() is with a timeout, specifying some delayed task that needs to be performed. If you are streaming your UDP data properly, this delayed task will have been generated by the call to "scheduleDelayedTask\" - in "BasicUDPSink.cpp", line 94. 
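For reference, that call follows the delayed-task idiom used throughout the library; a minimal sketch of the idiom (illustrative, not a verbatim copy of that file - "sendNextPacket" and "scheduleNextSend" are placeholder names):

    #include "UsageEnvironment.hh"

    // A TaskFunc: a plain function taking a void* client pointer.
    void sendNextPacket(void* clientData);

    void scheduleNextSend(UsageEnvironment& env, void* sink, int64_t uSecondsToGo) {
      // Ask the event loop to call sendNextPacket(sink) once "uSecondsToGo"
      // microseconds have elapsed; select() then uses this as its timeout.
      env.taskScheduler().scheduleDelayedTask(uSecondsToGo, sendNextPacket, sink);
    }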
I suggest checking whether the call to "scheduleDelayedTask()" in in "BasicUDPSink.cpp", line 94 is actually occurring, and, if so, whether "uSecondsToGo" has a reasonable value. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From pcg at agathongroup.com Tue Mar 20 11:58:53 2007 From: pcg at agathongroup.com (peter green) Date: Tue, 20 Mar 2007 12:58:53 -0600 Subject: [Live-devel] openrtsp + H261? Message-ID: <20070320185853.GZ7012@agathongroup.com> All, I have a Polycom VSX7000e video teleconferencing thing that I'm trying to use to simultaneously broadcast and archive a copy of video. I send the video to a Darwin Streaming Server and broadcast from there; the streams work great. I tried using openRTSP to grab the stream and save to a file, but received an error: # /root/live/testProgs/openRTSP -e 10 -w 352 -h 288 -q rtsp://10.9.0.1/videostream.sdp > stream.mov Warning: The -q, -4 or -i option was used, but not -f. Assuming a video frame rate of 15 frames-per-second Opened URL "rtsp://10.9.0.1/videostream.sdp", returning a SDP description: v=0 o=- 100 1 IN IP4 10.9.0.100 s=GFA-US c=IN IP4 0.0.0.0 t=0 0 a=tool:Polycom Viewstation a=control:* m=video 0 RTP/AVP 31 a=control:trackID=1 m=audio 0 RTP/AVP 0 a=control:trackID=2 Created receiver for "video/H261" subsession (client ports 36346-36347) Created receiver for "audio/PCMU" subsession (client ports 36348-36349) Setup "video/H261" subsession (client ports 36346-36347) Setup "audio/PCMU" subsession (client ports 36348-36349) Warning: We don't implement a QuickTime Video Media Data Type for the "H261" track, so we'll insert a dummy "????" Media Data Atom instead. A separate, codec-specific editing pass will be needed before this track can be played. Started playing session Receiving streamed data (for up to 10.000000 seconds)... The resulting stream.mov file is garbage, apparently since video/H261 isn't supported by openRTSP. My question(s): is it possible to do what I'm trying to do with the live555 tools? If so, how? If not, what can I use to accomplish my goal? I've also looked at VLC (doesn't support DSS) and mp4live (GUI crashes, command-line mp4live simply refuses to save to file). Nothing out there seems to adequately describe how to do what I want, so I'm hoping someone here has a brilliant idea. :-) Thanks! /pg -- Peter Green : Agathon Group : pcg at agathongroup.com From finlayson at live555.com Tue Mar 20 13:49:15 2007 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 20 Mar 2007 21:49:15 +0100 Subject: [Live-devel] openrtsp + H261? In-Reply-To: <20070320185853.GZ7012@agathongroup.com> References: <20070320185853.GZ7012@agathongroup.com> Message-ID: >The resulting stream.mov file is garbage, apparently since video/H261 isn't >supported by openRTSP. Yes, "video/H261" is supported; if you run "openRTSP" without the "-q" option, you'll get a separate file containing raw H.261 video data. However, the "QuickTimeFileSink" class (which implements the "-q" and "-4" options in "openRTSP") does not currently support H.261 video. (However, this would probably not be difficult to add - e.g., you could use the H.263 implementation as a model.) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From pcg at agathongroup.com Tue Mar 20 16:53:19 2007 From: pcg at agathongroup.com (peter green) Date: Tue, 20 Mar 2007 17:53:19 -0600 Subject: [Live-devel] openrtsp + H261? 
In-Reply-To: References: <20070320185853.GZ7012@agathongroup.com> Message-ID: <20070320235319.GC7012@agathongroup.com> Thanks for the quick response! * Ross Finlayson [070320 14:51]: > >The resulting stream.mov file is garbage, apparently since video/H261 isn't > >supported by openRTSP. > > Yes, "video/H261" is supported; if you run "openRTSP" without the > "-q" option, you'll get a separate file containing raw H.261 video > data. Ah, right on. I can use ffmpeg to build that into a mov or mpg. However, I can't figure out what to use for the audio input format for ffmpeg. I've tried: alaw pcm A law format mulaw pcm mu law format u16be pcm unsigned 16 bit big endian format u16le pcm unsigned 16 bit little endian format u8 pcm unsigned 8 bit format And it always comes out sped up by about 3x. There are also a ton of h261 errors when I run ffmpeg, but I suspect those are a problem with the source. The first error associated with the audio stream is: Seems stream 0 codec frame rate differs from container frame rate: 29.97 (30000/1001) -> 25.00 (25/1) Now, I know this isn't an ffmpeg mailing list, and I'll happily head that way if that's what I need to do. But I thought I'd check. > However, the "QuickTimeFileSink" class (which implements the "-q" and > "-4" options in "openRTSP") does not currently support H.261 video. > (However, this would probably not be difficult to add - e.g., you > could use the H.263 implementation as a model.) "difficult" is, of course, relative. :-) /pg -- Peter Green : Agathon Group : pcg at agathongroup.com From max at code-it-now.com Wed Mar 21 05:28:43 2007 From: max at code-it-now.com (Maxim Petrov) Date: Wed, 21 Mar 2007 14:28:43 +0200 Subject: [Live-devel] Streaming non-standard JPEGs Message-ID: <460124FB.9020006@code-it-now.com> Hi All. I have few cameras which send MJPEG stream over HTTP. I would like to restream these frames over RTP. Unfortunately, JPEG images do not correspond to RFC2435 (they have non standard tables). Is there any way to stream such images over RTP wthout transcoing ones? Or I need to create my own payload format? From jlfurlong at hotmail.com Wed Mar 21 05:27:42 2007 From: jlfurlong at hotmail.com (Jeff Furlong) Date: Wed, 21 Mar 2007 09:27:42 -0300 Subject: [Live-devel] WinCE select issue Message-ID: Thanks Ross, you were right. For some reason the scheduleDelayTask is never getting called. I built a test app that just reads a file and multicasts the video over UDP, and I'm getting the same result - I'm in the process of trying to narrow down where the problem is occurring. Any thoughts/ideas on what you think is happening would be greatly appreciated. Thanks, Jeff > Date: Tue, 20 Mar 2007 19:26:14 +0100> To: live-devel at ns.live555.com> From: finlayson at live555.com> Subject: Re: [Live-devel] WinCE select issue> > >I have a version built and running on a wince device which will > >accept rtsp requests from other devices. In this case I'm using raw > >udp as the video transport. The rtsp handshaking works fine, but as > >soon as the PLAY command is received, the server attempts to send > >out the video over UDP and the select() method in the > >BasicTaskScheduler exits with an error code WSAEINVAL. I've noticed > >the comment about windows which required a dummy readable socket, so > >I tried this for WinCE. 
Now, after it creates the dummy socket it > >never returns from the select() command.> > Until your server receives another RTSP command, the only way you'll > return from select() is with a timeout, specifying some delayed task > that needs to be performed. If you are streaming your UDP data > properly, this delayed task will have been generated by the call to > "scheduleDelayedTask\" - in "BasicUDPSink.cpp", line 94.> > I suggest checking whether the call to "scheduleDelayedTask()" in in > "BasicUDPSink.cpp", line 94 is actually occurring, and, if so, > whether "uSecondsToGo" has a reasonable value.> -- > > Ross Finlayson> Live Networks, Inc.> http://www.live555.com/> _______________________________________________> live-devel mailing list> live-devel at lists.live555.com> http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070321/1718df73/attachment.html From finlayson at live555.com Wed Mar 21 06:19:32 2007 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 21 Mar 2007 14:19:32 +0100 Subject: [Live-devel] Streaming non-standard JPEGs In-Reply-To: <460124FB.9020006@code-it-now.com> References: <460124FB.9020006@code-it-now.com> Message-ID: >I have few cameras which send MJPEG stream over HTTP. I would like to >restream these frames over RTP. >Unfortunately, JPEG images do not correspond to RFC2435 (they have non >standard tables). RFC 2435 - and our sending/receiving implementation ("JPEGVideoRTPSink"/"JPEGVideoRTPSource") - *does* support non-standard quantization tables (they are included in JPEG header in the RTP packet). To stream such a source, you must implement a subclass of "JPEGVideoSource" class, and, in particular - because you have non-standard quantization tables - implement the "quantizationTables()" virtual function. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Wed Mar 21 06:32:48 2007 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 21 Mar 2007 14:32:48 +0100 Subject: [Live-devel] WinCE select issue In-Reply-To: References: Message-ID: >Thanks Ross, you were right. For some reason the scheduleDelayTask >is never getting called. Assuming that "startPlaying" is being called on your "BasicUDPSink" (which should be the case if you have not modified the "OnDemandServerMediaSubsession" code), then the problem must be that the data source object (that your "BasicUDPSink" object reads from) is not delivering frames. At this point - because your data source is your own code - you're on your own. Good luck... Ross. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From rupak.p at gmail.com Wed Mar 21 07:46:53 2007 From: rupak.p at gmail.com (Rupak Patel) Date: Wed, 21 Mar 2007 20:16:53 +0530 Subject: [Live-devel] Regarding the seeking slider on the web page playing streamed data. Message-ID: Hi, This is not directly related to live555, but I came across this issue while using the server. While embedding a plugin for slider on my webpage using activeX MSComctlLib.Slider.2 clsid:F08DF954-8592-11D1-B16A-00C0F0283628 on a machine with Windows server edition 2003 operating system, I failed to embed the slider while the same activeX code is working fine with machine with windows XP. It would be great if any one could suggest the possible cause/solution of the same. Thanks and Regards, Roopak. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://lists.live555.com/pipermail/live-devel/attachments/20070321/91c04b71/attachment.html From max at code-it-now.com Wed Mar 21 08:16:17 2007 From: max at code-it-now.com (Maxim Petrov) Date: Wed, 21 Mar 2007 17:16:17 +0200 Subject: [Live-devel] Streaming non-standard JPEGs In-Reply-To: References: <460124FB.9020006@code-it-now.com> Message-ID: <46014C41.1080307@code-it-now.com> Ross Finlayson wrote: >RFC 2435 - and our sending/receiving implementation >("JPEGVideoRTPSink"/"JPEGVideoRTPSource") - *does* support >non-standard quantization tables (they are included in JPEG header in >the RTP packet). > >To stream such a source, you must implement a subclass of >"JPEGVideoSource" class, and, in particular - because you have >non-standard quantization tables - implement the >"quantizationTables()" virtual function. > > Thanks, Ross. But unfortune, it does not solve problem. Because I must use same qtables on both server and client side. But as I said I have several cameras which send JPEGs with different qtables :( So as I understand: 1) I need some way to inform client what qtables need to use to generate frames; 2) I need to send to client JPEG frames as is. Then client will not add qtables to frame. From finlayson at live555.com Wed Mar 21 08:21:10 2007 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 21 Mar 2007 16:21:10 +0100 Subject: [Live-devel] Streaming non-standard JPEGs In-Reply-To: <46014C41.1080307@code-it-now.com> References: <460124FB.9020006@code-it-now.com> <46014C41.1080307@code-it-now.com> Message-ID: >Ross Finlayson wrote: > >>RFC 2435 - and our sending/receiving implementation >>("JPEGVideoRTPSink"/"JPEGVideoRTPSource") - *does* support >>non-standard quantization tables (they are included in JPEG header in >>the RTP packet). >> >>To stream such a source, you must implement a subclass of >>"JPEGVideoSource" class, and, in particular - because you have >>non-standard quantization tables - implement the >>"quantizationTables()" virtual function. >> >> >Thanks, Ross. But unfortune, it does not solve problem. Because I must >use same qtables on both server and client side. That's not a problem, because the quantization tables - because they are non-standard - are carried within the RTP packets. I.e., the server tells the client(s) which quantization tables are used for each image. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From igor at mir2.org Wed Mar 21 08:26:32 2007 From: igor at mir2.org (Igor Bukanov) Date: Wed, 21 Mar 2007 16:26:32 +0100 Subject: [Live-devel] OnDemandServerMediaSubsession with reusing issues Message-ID: <7dee4710703210826k391ee9d8i75c85c81d82ed0e7@mail.gmail.com> Hi, To stream on demand to multiple clients a live source I use a subclass of OnDemandServerMediaSubsession with reuseFirstSource set to true. Initially I thought that in a such case OnDemandServerMediaSubsession::createNewStreamSource will be called only once so I can perform on-demand initialization there. But it turned out that it is called twice with the first time just to get sdp lines. Moreover, this first call will then close the source. Since for the source I use isMPEG4VideoStreamDiscreteFramer, that would close the original source as well. And that is expensive to construct. I worked around that by overriding sdpLines in the subclass of OnDemandServerMediaSubsession but that forced to dupliacte all live555 code that automatically constructs the sdp lines based on the source parameters. 
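Roughly, that workaround has the following shape (a sketch only; "MyLiveSubsession", "fCachedSDPLines" and "buildSDPLinesFromMySource()" are hypothetical names, with the duplicated SDP-assembly code hidden behind the helper):

    #include "OnDemandServerMediaSubsession.hh"

    class MyLiveSubsession: public OnDemandServerMediaSubsession {
    public:
      MyLiveSubsession(UsageEnvironment& env, Boolean reuseFirstSource)
        : OnDemandServerMediaSubsession(env, reuseFirstSource),
          fCachedSDPLines(NULL) {}

      virtual char const* sdpLines() {
        if (fCachedSDPLines == NULL) {
          // Build the "m=", "c=", "a=rtpmap:" etc. lines once, from the
          // long-lived source, instead of letting the base class create and
          // then close a temporary source just to compute them.
          fCachedSDPLines = buildSDPLinesFromMySource();
        }
        return fCachedSDPLines;
      }

      // ... createNewStreamSource() / createNewRTPSink() as usual ...

    private:
      char* buildSDPLinesFromMySource(); // the duplicated assembly code lives here
      char* fCachedSDPLines;
    };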
Ideally it would be nice if OnDemandServerMediaSubsession calls createNewStreamSource only once, but a patch that I tried to create became rather messy due to the need to store estBitrate results of the first call to createNewStreamSource. That gives a suggestion: maybe it is better to remove estBitrate from createNewStreamSource replacing it by a method in FramedSource ? Regards, Igor From bidibulle at operamail.com Wed Mar 21 08:59:18 2007 From: bidibulle at operamail.com (David Betrand) Date: Wed, 21 Mar 2007 16:59:18 +0100 Subject: [Live-devel] track synchronization with live555 Message-ID: <20070321155918.5F7507AEA1@ws5-10.us4.outblaze.com> Ross, > I don't believe there is anything wrong with the current code. There is nothing wrong in your code as long as you DON'T handle rtp-info information (except of course the fact your streams are not synchronized until one SR report is received for each track). But if you use rtp-info as well (like I do in my app), for sure there will be a problem. This is because when receiving the rtp-info timestamps you don't get the sender wall clock, so you must initialize fSyncTime with YOUR wall clock (gettimeofday). Then, your track are perfectly synchronized. Then as soon as you receive the first SR, you desynchronize your streams because you will use sender's wall clock (by setting fSyncTime variable). Thus you have a jump in your presentation time. That's why I talked about a "synchronization time difference". This simply reflects the wall clock difference between the sender and the receiver. Please, if you disagree, tell me how you would use those timestamps sent in the rtp-info, without breaking something in the current code ? Personnaly, I see two solutions : - first one is the easy one : modify the synchronization mechanism so that it uses rtp-info time when it is available, and SR time when no rtp-info is available - second one is the complex one : modify SR report handling to cope ALSO with rtp-info synchronization. This means keeping the existing code in case there is no rtp-info available, else computing some wall clock time difference between local and sender, and updating synchronization accordingly (which is mainly usefull to compensate for drift for long, uninterrupted presentations, as described in RFC 2326) So, if you agree that there might be some interest for the library, I can propose the code for solution 1 or 2 (with a preference for solution 1 for simplicity). Otherwise, I will have to choose solution 1 and keep this patch for my own app (as well as the small modifications needed in some other places to make the rtp-info processing possible). This would mean I will not be able to upgrade liveMedia with the latest official build anymore :-( David -- _______________________________________________ Surf the Web in a faster, safer and easier way: Download Opera 9 at http://www.opera.com Powered by Outblaze From finlayson at live555.com Wed Mar 21 09:53:23 2007 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 21 Mar 2007 17:53:23 +0100 Subject: [Live-devel] OnDemandServerMediaSubsession with reusing issues In-Reply-To: <7dee4710703210826k391ee9d8i75c85c81d82ed0e7@mail.gmail.com> References: <7dee4710703210826k391ee9d8i75c85c81d82ed0e7@mail.gmail.com> Message-ID: >Initially I thought that in a such case >OnDemandServerMediaSubsession::createNewStreamSource will be called >only once so I can perform on-demand initialization there. But it >turned out that it is called twice with the first time just to get sdp >lines. 
This is unfortunate, but unavoidable. The SDP data has to be returned (in the RTSP "DESCRIBE" response) before streaming can begin, and unfortunately (for MPEG-4 video only) the SDP data includes configuration data that can only be obtained (in general) by reading part of the stream > Moreover, this first call will then close the source. Since for >the source I use isMPEG4VideoStreamDiscreteFramer, that would close >the original source as well. And that is expensive to construct. If this is a major problem for you, then you could overcome it by inserting a new 'filter' object (that you would write) between the MPEG-4 video source and the "MPEG4VideoStreamDiscreteFramer". This filter object would know that it is being open/read twice. The first time, it would cache (in allocated memory) each MPEG-4 frame that gets read (usually only a few). The second time, it would first deliver frames that had previously been stored in the cache. After that, it would read from the input source, as usual. In this way, you'd need only open/read the actual MPEG-4 video source once. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From igor at mir2.org Wed Mar 21 10:07:48 2007 From: igor at mir2.org (Igor Bukanov) Date: Wed, 21 Mar 2007 18:07:48 +0100 Subject: [Live-devel] OnDemandServerMediaSubsession with reusing issues In-Reply-To: References: <7dee4710703210826k391ee9d8i75c85c81d82ed0e7@mail.gmail.com> Message-ID: <7dee4710703211007m78ebc014s52ab85ab7dad832f@mail.gmail.com> On 21/03/07, Ross Finlayson wrote: > >Initially I thought that in a such case > >OnDemandServerMediaSubsession::createNewStreamSource will be called > >only once so I can perform on-demand initialization there. But it > >turned out that it is called twice with the first time just to get sdp > >lines. > > This is unfortunate, but unavoidable. The SDP data has to be > returned (in the RTSP "DESCRIBE" response) before streaming can > begin, and unfortunately (for MPEG-4 video only) the SDP data > includes configuration data that can only be obtained (in general) by > reading part of the stream I have no problems with that, for me the issue is the fact that OnDemandServerMediaSubsession::sdpLines closes the stream instead of keeping it opened to feed to clients later. As I wrote, I worked around it, but that is not elegant and kind of defeats the purpose of reuse flag. Regards, Igor From finlayson at live555.com Wed Mar 21 10:58:53 2007 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 21 Mar 2007 18:58:53 +0100 Subject: [Live-devel] track synchronization with live555 In-Reply-To: <20070321155918.5F7507AEA1@ws5-10.us4.outblaze.com> References: <20070321155918.5F7507AEA1@ws5-10.us4.outblaze.com> Message-ID: ***This will be my last posting on this thread. Please do not bother responding, unless you don't mind hearing nothing more back from me about this.*** I think you're misunderstanding what it is that the "LIVE555 Streaming Media" library is intended to deliver to higher-level code (such as media players). In this case, it intends to deliver exactly the information that is provided by the RTP/RTCP and RTSP protocols. In particular, it provides: 1/ A presentation time, in the sender's clock, which is accurate after the first RTCP "SR" packet is received. (Unfortunately, but unavoidably, it's not accurate before this time; however, the library also provides a Boolean function that tells you when the presentation times have become accurate.) 2/ The information that's delivered - by the server - in the "RTP-Info" header. 
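Regarding 1/: the Boolean function in question - "RTPSource::hasBeenSynchronizedUsingRTCP()", in current code - can be consulted from a receiving sink's per-frame callback, roughly as in this sketch ("MySink" and its "fSubsession" member are placeholders for the application's own receiving class):

    // Called after each frame has been delivered into the sink's buffer.
    void MySink::afterGettingFrame(unsigned frameSize, unsigned numTruncatedBytes,
                                   struct timeval presentationTime,
                                   unsigned /*durationInMicroseconds*/) {
      RTPSource* src = fSubsession.rtpSource(); // fSubsession: the MediaSubsession being read
      if (src != NULL && src->hasBeenSynchronizedUsingRTCP()) {
        // An RTCP "SR" has been seen: "presentationTime" is now on the
        // sender's clock and is comparable across the other subsessions.
      } else {
        // Not yet synchronized: the time is still only locally generated.
      }
      continuePlaying(); // ask for the next frame, as usual
    }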
Note that it provides the *exact* information that the server delivers. It doesn't do any translation on this information; that's not the job of the LIVE555 library. It's up to the higher-level code (such as yours) that uses the LIVE555 libraries to make use of this raw information, however it sees fit. Note, however, that these two pieces of information - the presentation time, and the RTP-Info header - are only loosely related (see section 12.33 of RFC 2326). In particular, the RTP-Info header information is certainly not intended to ever replace the synchronized presentation time that's provided by RTCP SR. If you wish, you can use the RTP-Info information to map the presentation time to NPT (using a fixed offset, which you can compute once the first RTCP SR arrives ). But you shouldn't just ignore the presentation time, and think that you can just use the RTP-Info information, plus the RTP timestamps in each packet, to generate your own substitute 'presentation times'. You can't do this, especially if you want to guarantee A/V synchronization over time. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From jlfurlong at hotmail.com Wed Mar 21 11:15:33 2007 From: jlfurlong at hotmail.com (Jeff Furlong) Date: Wed, 21 Mar 2007 15:15:33 -0300 Subject: [Live-devel] WinCE select issue Message-ID: Just an FYI... Apparently, WinCE has the same issues as Windows - I had to set READ_FROM_FILES_SYNCHRONOUSLY in order to get the streaming piece to work. Thanks, Jeff > Date: Wed, 21 Mar 2007 14:32:48 +0100> To: live-devel at ns.live555.com> From: finlayson at live555.com> Subject: Re: [Live-devel] WinCE select issue> > >Thanks Ross, you were right. For some reason the scheduleDelayTask > >is never getting called.> > Assuming that "startPlaying" is being called on your "BasicUDPSink" > (which should be the case if you have not modified the > "OnDemandServerMediaSubsession" code), then the problem must be that > the data source object (that your "BasicUDPSink" object reads from) > is not delivering frames.> > At this point - because your data source is your own code - you're on > your own. Good luck...> > Ross.> -- > > Ross Finlayson> Live Networks, Inc.> http://www.live555.com/> _______________________________________________> live-devel mailing list> live-devel at lists.live555.com> http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070321/2a8349ca/attachment.html From max at code-it-now.com Thu Mar 22 01:17:00 2007 From: max at code-it-now.com (Maxim Petrov) Date: Thu, 22 Mar 2007 10:17:00 +0200 Subject: [Live-devel] Streaming non-standard JPEGs In-Reply-To: References: <460124FB.9020006@code-it-now.com> <46014C41.1080307@code-it-now.com> Message-ID: <46023B7C.8000600@code-it-now.com> Ross Finlayson wrote: >>Thanks, Ross. But unfortune, it does not solve problem. Because I must >>use same qtables on both server and client side. >> >> > >That's not a problem, because the quantization tables - because they >are non-standard - are carried within the RTP packets. I.e., the >server tells the client(s) which quantization tables are used for >each image. > > Thanks Ross, it's what I need! One more question. If I would like to send additional data along with JPEG frame, for example camera name, which frame was captured or something else.. Can I use "RTP header extensions" (does liveMedia support this?) or is there other way? 
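For reference, the subclassing approach acknowledged above might look roughly like the following sketch; "MyCameraJPEGSource", its members, and the 640x480 / 4:2:0 choices are illustrative, not part of the library:

    #include "JPEGVideoSource.hh"

    class MyCameraJPEGSource: public JPEGVideoSource {
    protected:
      MyCameraJPEGSource(UsageEnvironment& env): JPEGVideoSource(env) {}

      virtual void doGetNextFrame() {
        // Copy one JPEG scan (JFIF headers stripped) into "fTo", set
        // "fFrameSize" and "fPresentationTime", parse the image's DQT
        // segments into "fQTables", then call FramedSource::afterGetting(this).
      }

      virtual u_int8_t type()    { return 1; }     // 4:2:0 in RFC 2435 terms; check the camera
      virtual u_int8_t qFactor() { return 255; }   // 128..255 => tables are carried in-band
      virtual u_int8_t width()   { return 640/8; } // frame width, in 8-pixel units
      virtual u_int8_t height()  { return 480/8; } // frame height, in 8-pixel units

      virtual u_int8_t const* quantizationTables(u_int8_t& precision, u_int16_t& length) {
        precision = 0;  // 8-bit table entries
        length = 128;   // two 64-byte tables: luminance, then chrominance
        return fQTables;
      }

    private:
      u_int8_t fQTables[128];
    };

The key point is the "quantizationTables()" override: with a Q value of 128 or above, the tables travel in the Quantization Table header of the outgoing RTP packets, so the receiver never needs to know them in advance.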
From finlayson at live555.com Thu Mar 22 02:02:29 2007 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 22 Mar 2007 10:02:29 +0100 Subject: [Live-devel] Streaming non-standard JPEGs In-Reply-To: <46023B7C.8000600@code-it-now.com> References: <460124FB.9020006@code-it-now.com> <46014C41.1080307@code-it-now.com> <46023B7C.8000600@code-it-now.com> Message-ID: > >>But unfortune, it does not solve problem. Because I must > >>use same qtables on both server and client side. >>> >>> >> >>That's not a problem, because the quantization tables - because they >>are non-standard - are carried within the RTP packets. I.e., the >>server tells the client(s) which quantization tables are used for >>each image. >> >> >Thanks Ross, it's what I need! One more question. If I would like to >send additional data along with JPEG frame, for example camera name, >which frame was captured or something else.. Can I use "RTP header >extensions" (does liveMedia support this?) or is there other way? I don't recommend using RTP header extensions (our library doesn't really support these). Instead, it would be better to include this information in the JPEG header itself. (This way, it will get included in the JPEG image if the receiver saves it to disk.) I believe that the JPEG standard allows for this sort of information to be included in the JPEG header (but I don't know the details). -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From max at code-it-now.com Thu Mar 22 05:35:44 2007 From: max at code-it-now.com (Maxim Petrov) Date: Thu, 22 Mar 2007 14:35:44 +0200 Subject: [Live-devel] Streaming non-standard JPEGs In-Reply-To: References: <460124FB.9020006@code-it-now.com> <46014C41.1080307@code-it-now.com> <46023B7C.8000600@code-it-now.com> Message-ID: <46027820.1030600@code-it-now.com> Ross Finlayson wrote: >I don't recommend using RTP header extensions (our library doesn't >really support these). > >Instead, it would be better to include this information in the JPEG >header itself. (This way, it will get included in the JPEG image if >the receiver saves it to disk.) I believe that the JPEG standard >allows for this sort of information to be included in the JPEG header >(but I don't know the details). > > Yes, sure, JPEG has "Comment marker" (0xFE), but I don't know how to transfer these data using RFC 2435. Idea what I have now is insert "comment header" after "qtables" data. Seems I need to try to check if it will work correct :) From marthi at graphics.cs.uni-sb.de Thu Mar 22 09:36:01 2007 From: marthi at graphics.cs.uni-sb.de (Martin) Date: Thu, 22 Mar 2007 17:36:01 +0100 Subject: [Live-devel] TransmissionStats meanings again Message-ID: <4602B071.6080706@graphics.cs.uni-sb.de> Hello, I'm using live555 for several month now and I always received excellent and quick help for the questions I asked. Lately, I asked the questions below two times but got no reply yet. Is it not the right place to ask this kind of questions? Is there another list I can use or is there another problem? I would really be happy if someone could help me! Thanks, Martin > I looked into the source code to get the exact meaning of members and > functions in TransmissionStats. Since I'm not sure about everything, I > would be really glad if you could confirm the units and meanings below: > > > 1) I think that both > lastSRTime() > diffSR_RRTime() > return the time in units of 1/65536 seconds. Is that right? > > > 2) timeCreated(): > I always receive a constant value. Is that the NTP time the session was > created? 
> > > 3) lastTimeReceived() > Does it return the NTP time the last report was received (for > calculation of round trip delay)? > > > 4) lastPacketNumReceived() > Is that the "extended highest sequence number received" as defined in > RFC3550? From finlayson at live555.com Thu Mar 22 11:08:54 2007 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 22 Mar 2007 19:08:54 +0100 Subject: [Live-devel] Streaming non-standard JPEGs In-Reply-To: <46027820.1030600@code-it-now.com> References: <460124FB.9020006@code-it-now.com> <46014C41.1080307@code-it-now.com> <46023B7C.8000600@code-it-now.com> <46027820.1030600@code-it-now.com> Message-ID: > >I don't recommend using RTP header extensions (our library doesn't >>really support these). >> >>Instead, it would be better to include this information in the JPEG >>header itself. (This way, it will get included in the JPEG image if >>the receiver saves it to disk.) I believe that the JPEG standard >>allows for this sort of information to be included in the JPEG header >>(but I don't know the details). >> >> >Yes, sure, JPEG has "Comment marker" (0xFE), but I don't know how to >transfer these data using RFC 2435. >Idea what I have now is insert "comment header" after "qtables" data. Yes, I think that will work (provided that you set the 'length' field in the special 'qtables' header appropriately. (RTP doesn't care what that data actually contains; it just gets passed from sender to receiver.) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Thu Mar 22 11:13:43 2007 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 22 Mar 2007 19:13:43 +0100 Subject: [Live-devel] TransmissionStats meanings again In-Reply-To: <4602B071.6080706@graphics.cs.uni-sb.de> References: <4602B071.6080706@graphics.cs.uni-sb.de> Message-ID: >I'm using live555 for several month now and I always received excellent >and quick help for the questions I asked. Lately, I asked the questions >below two times but got no reply yet. Is it not the right place to ask >this kind of questions? Yes, this is certainly the right mailing list; however, not every question posted to this list will get answered. Personally, in this case I didn't answer your questions about the "RTPTransmissionStats" fields because (i) I didn't have time, and (ii) you should be able to get your answers by reviewing the code, and the RTP/RTCP specification. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From binli1115 at 163.com Thu Mar 22 23:12:42 2007 From: binli1115 at 163.com (binli1115) Date: Fri, 23 Mar 2007 14:12:42 +0800 (CST) Subject: [Live-devel] what's mean of this sentence? Message-ID: <30228776.1579041174630362875.JavaMail.root@bj163app64.163.com> Hi everyone, I use mplayer which has compile the source of live as the client,and I use the helix as the serve ,but I receive the message : STREAM_LIVE555, URL: rtsp://202.194.20.86/mpg4video.mp4 Stream not seekable! Unable to determine our source address: This computer has an invalid IP address: 0x0 Unable to determine our source address: This computer has an invalid IP address: 0x0 Unable to determine our source address: This computer has an invalid IP address: 0x0 Initiated "video/MP4V-ES" RTP subsession on port 32822 Unable to determine our source address: This computer has an invalid IP address: 0x0 Unable to determine our source address: This computer has an invalid IP address: 0x0 Initiated "audio/MPEG4-GENERIC" RTP subsession on port 32824 what's wrong with it ? Best wishes! 
yours Lee -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070322/57ffcf73/attachment.html From finlayson at live555.com Fri Mar 23 00:27:32 2007 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 23 Mar 2007 08:27:32 +0100 Subject: [Live-devel] what's mean of this sentence? In-Reply-To: <30228776.1579041174630362875.JavaMail.root@bj163app64.163.com> References: <30228776.1579041174630362875.JavaMail.root@bj163app64.163.com> Message-ID: > >Hi everyone, > I use mplayer which has compile the source of live as the >client,and I use the helix as the serve ,but I receive the message : > > > >STREAM_LIVE555, URL: rtsp://202.194.20.86/mpg4video.mp4 >Stream not seekable! >Unable to determine our source address: This computer has an invalid >IP address: 0x0 >Unable to determine our source address: This computer has an invalid >IP address: 0x0 >Unable to determine our source address: This computer has an invalid >IP address: 0x0 >Initiated "video/MP4V-ES" RTP subsession on port 32822 >Unable to determine our source address: This computer has an invalid >IP address: 0x0 >Unable to determine our source address: This computer has an invalid >IP address: 0x0 >Initiated "audio/MPEG4-GENERIC" RTP subsession on port 32824 >what's wrong with it ? The code is having trouble figuring out your IP address, perhaps (in part) because you don't have multicast configured properly on your system. I suggest using VLC instead - it has pre-built binary versions that you can download, without having to compile. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From igor at mir2.org Fri Mar 23 03:33:28 2007 From: igor at mir2.org (Igor Bukanov) Date: Fri, 23 Mar 2007 11:33:28 +0100 Subject: [Live-devel] OnDemandServerMediaSubsession with reusing issues In-Reply-To: <7dee4710703211007m78ebc014s52ab85ab7dad832f@mail.gmail.com> References: <7dee4710703210826k391ee9d8i75c85c81d82ed0e7@mail.gmail.com> <7dee4710703211007m78ebc014s52ab85ab7dad832f@mail.gmail.com> Message-ID: <7dee4710703230333x165bf153kab07ad97e3a9fdc5@mail.gmail.com> I have found a simple way to solve my problems. It is enough just to extend OnDemandServerMediaSubsession with a new virtual function: virtual void closeStreamSource(FramedSource *inputSource) { Medium::close(inputSource); } and call this function in OnDemandServerMediaSubsession.cpp whenever the current code calls Medium::close on the result of createNewStreamSource. The attached patch does just that. In this way I can override createNewStreamSource in a subclass of OnDemandServerMediaSubsession to return a lazily initialized source stored in a field of the subclass and then override closeStreamSource to do nothing delegating the closing of the source to the destructor of my subclass. Regards, Igor On 21/03/07, Igor Bukanov wrote: > On 21/03/07, Ross Finlayson wrote: > > >Initially I thought that in a such case > > >OnDemandServerMediaSubsession::createNewStreamSource will be called > > >only once so I can perform on-demand initialization there. But it > > >turned out that it is called twice with the first time just to get sdp > > >lines. > > > > This is unfortunate, but unavoidable. 
The SDP data has to be > > returned (in the RTSP "DESCRIBE" response) before streaming can > > begin, and unfortunately (for MPEG-4 video only) the SDP data > > includes configuration data that can only be obtained (in general) by > > reading part of the stream > > I have no problems with that, for me the issue is the fact that > OnDemandServerMediaSubsession::sdpLines closes the stream instead of > keeping it opened to feed to clients later. As I wrote, I worked > around it, but that is not elegant and kind of defeats the purpose of > reuse flag. > > Regards, Igor > -------------- next part -------------- A non-text attachment was scrubbed... Name: close_control.patch Type: application/octet-stream Size: 6923 bytes Desc: not available Url : http://lists.live555.com/pipermail/live-devel/attachments/20070323/49a93ace/attachment.obj From antoniotirri at libero.it Fri Mar 23 04:34:02 2007 From: antoniotirri at libero.it (Antonio Tirri) Date: Fri, 23 Mar 2007 12:34:02 +0100 Subject: [Live-devel] RTP Payload Length (RPL) of a MPEG4 video In-Reply-To: References: <45E60C79.3090606@libero.it> Message-ID: <4603BB2A.3070804@libero.it> Hello, how can i evaluate the RTP Payload Length (RPL) of a MPEG4 video streamed with live555MediaServer? Is RPL related with the bitrate of the MPEG4 video and in which mode? Thanks, Antonio Tirri From max at code-it-now.com Fri Mar 23 07:55:02 2007 From: max at code-it-now.com (Maxim Petrov) Date: Fri, 23 Mar 2007 16:55:02 +0200 Subject: [Live-devel] Streaming non-standard JPEGs In-Reply-To: References: <460124FB.9020006@code-it-now.com> <46014C41.1080307@code-it-now.com> <46023B7C.8000600@code-it-now.com> <46027820.1030600@code-it-now.com> Message-ID: <4603EA46.7060200@code-it-now.com> Ross Finlayson wrote: >>Yes, sure, JPEG has "Comment marker" (0xFE), but I don't know how to >>transfer these data using RFC 2435. >>Idea what I have now is insert "comment header" after "qtables" data. >> >> > >Yes, I think that will work (provided that you set the 'length' field >in the special 'qtables' header appropriately. (RTP doesn't care >what that data actually contains; it just gets passed from sender to >receiver.) > > Today I noticed, that my JPEG files have different Huffman tables. RFC 2435, which says: "While parameters like Huffman tables and color space are likely to remain fixed for the lifetime of the video stream, other parameter should be allowed to vary...." This means I cannot stream these files using RFC 2435. Right? From finlayson at live555.com Fri Mar 23 07:59:52 2007 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 23 Mar 2007 15:59:52 +0100 Subject: [Live-devel] Streaming non-standard JPEGs In-Reply-To: <4603EA46.7060200@code-it-now.com> References: <460124FB.9020006@code-it-now.com> <46014C41.1080307@code-it-now.com> <46023B7C.8000600@code-it-now.com> <46027820.1030600@code-it-now.com> <4603EA46.7060200@code-it-now.com> Message-ID: >Today I noticed, that my JPEG files have different Huffman tables. >RFC 2435, which says: >"While parameters like Huffman tables and color space are likely to >remain fixed for the lifetime of the video stream, other parameter >should be allowed to vary...." > >This means I cannot stream these files using RFC 2435. Right? No - streaming your files using RFC 2435 is OK. "Likely to remain fixed" doesn't mean "must remain fixed". As long as you include the quantization tables in the RTP packets, it should be OK that they change. -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/

From finlayson at live555.com  Fri Mar 23 08:02:00 2007
From: finlayson at live555.com (Ross Finlayson)
Date: Fri, 23 Mar 2007 16:02:00 +0100
Subject: [Live-devel] RTP Payload Length (RPL) of a MPEG4 video
In-Reply-To: <4603BB2A.3070804@libero.it>
References: <45E60C79.3090606@libero.it> <4603BB2A.3070804@libero.it>
Message-ID: 

>Hello,
>
>how can I evaluate the RTP Payload Length (RPL) of a MPEG4 video
>streamed with live555MediaServer?

I don't know what you mean by "RTP payload length". If you just mean "packet size", then you can easily figure out in the code where to get this.
-- 

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From p4olo_prete at yahoo.it  Fri Mar 23 14:28:15 2007
From: p4olo_prete at yahoo.it (Paolo Prete)
Date: Fri, 23 Mar 2007 22:28:15 +0100 (CET)
Subject: [Live-devel] Streaming non-standard JPEGs
In-Reply-To: <4603EA46.7060200@code-it-now.com>
Message-ID: <20070323212815.94458.qmail@web28007.mail.ukl.yahoo.com>

> Today I noticed, that my JPEG files have different
> Huffman tables.
> RFC 2435, which says:
> "While parameters like Huffman tables and color
> space are likely to
> remain fixed for the lifetime of the video stream,
> other parameter
> should be allowed to vary...."
>
> This means I cannot stream these files using RFC
> 2435. Right?
>

You can also stream the video with non-standard Huffman tables. Notice that non-standard quantization tables have to be included in the Quantization Table header (see RFC 2435, section 3.1.8), and that they are associated with a Q factor between 128 and 255 (note too that the liveMedia (great) libs have support for this).

Hope this helps (I had the same "problem" and solved it in the way described above). Otherwise I'll try to give more detailed instructions.

Paolo

From finlayson at live555.com  Sat Mar 24 00:58:49 2007
From: finlayson at live555.com (Ross Finlayson)
Date: Sat, 24 Mar 2007 08:58:49 +0100
Subject: [Live-devel] OnDemandServerMediaSubsession with reusing issues
In-Reply-To: <7dee4710703230333x165bf153kab07ad97e3a9fdc5@mail.gmail.com>
References: <7dee4710703210826k391ee9d8i75c85c81d82ed0e7@mail.gmail.com> <7dee4710703211007m78ebc014s52ab85ab7dad832f@mail.gmail.com> <7dee4710703230333x165bf153kab07ad97e3a9fdc5@mail.gmail.com>
Message-ID: 

>I have found a simple way to solve my problems. It is enough just to
>extend OnDemandServerMediaSubsession with a new virtual function:
>
>virtual void closeStreamSource(FramedSource *inputSource) {
>  Medium::close(inputSource);
>}
>
>and call this function in OnDemandServerMediaSubsession.cpp whenever
>the current code calls Medium::close on the result of
>createNewStreamSource.
>
>The attached patch does just that.

Thanks. This will be included in the next release of the software.
-- 

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/ From marthi at graphics.cs.uni-sb.de Sun Mar 25 06:53:04 2007 From: marthi at graphics.cs.uni-sb.de (Martin) Date: Sun, 25 Mar 2007 15:53:04 +0200 Subject: [Live-devel] TransmissionStats meanings again In-Reply-To: References: <4602B071.6080706@graphics.cs.uni-sb.de> Message-ID: <46067EC0.50808@graphics.cs.uni-sb.de> Hi, > Personally, in this case I didn't answer your questions about the > "RTPTransmissionStats" fields because (i) I didn't have time, and > (ii) you should be able to get your answers by reviewing the code, > and the RTP/RTCP specification. well, I looked into the code and into the specification. Anyhow, I wanted to make sure that I understood everything correctly. Therefore, I asked for confirmation (yes/no) only. Of course I understand that you must priorize questions if you don't have enough time. Maybe you'll find some time for answering in the next weeks, even if it's not one of the most important questions. Thanks, Martin From xngzhng at yahoo.com.cn Mon Mar 26 02:00:22 2007 From: xngzhng at yahoo.com.cn (xiang zhang) Date: Mon, 26 Mar 2007 17:00:22 +0800 (CST) Subject: [Live-devel] How does the function SingleStep() work? Message-ID: <768797.45455.qm@web15614.mail.cnb.yahoo.com> Hello! I am Zhang Xiang, graduate student from Zhe Jiang University,China. I?m confused about how the function SingleStep() works. Specifically speaking, when there is a video connection request from remote PC, how does SingleStep() handle this request and call other functions such as DeliverFrame(), doGetNextframe(), to deliver video stream which is encoded in H.264 form? Thanks!! Best wishes! yours Xiang --------------------------------- Mp3???-??????? -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070326/a521f741/attachment.html From sademperor at gmail.com Mon Mar 26 02:38:55 2007 From: sademperor at gmail.com (LL Wang) Date: Mon, 26 Mar 2007 17:38:55 +0800 Subject: [Live-devel] About testOnDemandRTSPServer Message-ID: hi, I've run the testOnDemandRTSPServer.exe under Win XP sp2 (compiled by VC++ 6.0 sp6) to access the followed URL : rtsp ://kmdi.utoronto.ca:555/osconf/2004_may9.1.mp4 I wish to get a standard MPEG-4 stream file to test my own code, but when the programme finished successfuly, the result was abnormal. audio-MPEG4-GENERIC-1 is 0 bytes, and video-MP4V-ES-2 is 51 bytes I don't knwo the reason , could u tell me ? ps. I'm a user in CERNET (education net in CHINA mainland , and access the foreign Internet by a soft named NETPAS) -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070326/30cbbd03/attachment.html From info at dnastudios.it Mon Mar 26 06:55:03 2007 From: info at dnastudios.it (DNA Studios s.r.l.) Date: Mon, 26 Mar 2007 15:55:03 +0200 Subject: [Live-devel] media server mp4 trick play Message-ID: <4607D0B7.2040201@dnastudios.it> Hi Ross, I dont' succeeded to found a player that seek well from Darwin streaming server(only Quick Time)...In fact this player not exist i think ;) I would want, do you know when the Live555media server will support seeking and trick-play for h264 based stream? Thanks. ---------------------- Nicola From info at dnastudios.it Mon Mar 26 06:58:14 2007 From: info at dnastudios.it (DNA Studios s.r.l.) 
Date: Mon, 26 Mar 2007 15:58:14 +0200 Subject: [Live-devel] [Fwd: media server mp4 trick play] Message-ID: <4607D176.5080505@dnastudios.it> Sorry, i missed a piece.. Hi Ross, I dont' succeeded to found a player that seek well from Darwin streaming server(only Quick Time)...In fact this player not exist i think ;) I would want *"TO MIGRATE TO Live555media server"*; do you know when the Live555media server will support seeking and trick-play for h264 based stream? Thanks. ---------------------- Nicola -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070326/8765dd4d/attachment.html From gerardsweeney at ntlworld.com Mon Mar 26 09:40:35 2007 From: gerardsweeney at ntlworld.com (Gerard Sweeney) Date: Mon, 26 Mar 2007 17:40:35 +0100 Subject: [Live-devel] SAPWatch on Win32? Message-ID: <46080593.12367.EC2486A@localhost> Hello, all.. In my workplace, we use an IP-based TV system (www.exterity.co.uk), where each channel is a separate UDP stream, and the channels are advertised using SAP Announcements. Using VLC's SAP Discovery, I can view the SAP announcements. I'm looking for a small utility where I can export the SAP announcement details to a text file. Is there anyone who could advise how I would go about using Live555's SAPWatch as a basis? (I'd ideally be using VBasic, but I'm open to trying other languages if they're easy enough heheh).. Cheers, Gerard -- Gerard Sweeney Hackers Anonymous/Team Amiga http://fly.to/ha3 and http://www.the-tipshop.co.uk When you're living on your own, without a girlfriend or a home, people speculate. But I'm much stronger than them. (Dave. Something Special. Closer To Heaven musical soundtrack) From finlayson at live555.com Mon Mar 26 13:01:23 2007 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 26 Mar 2007 22:01:23 +0200 Subject: [Live-devel] About testOnDemandRTSPServer In-Reply-To: References: Message-ID: >hi, >I've run the testOnDemandRTSPServer.exe under Win XP sp2 (compiled >by VC++ 6.0 sp6) to access the followed URL : >rtsp://kmdi.utoronto.ca:555/osconf/2004_may9.1. mp4 > >I wish to get a standard MPEG-4 stream file to test my own code, but >when the programme finished successfuly, the result was abnormal. > > audio-MPEG4-GENERIC-1 is 0 bytes, and video-MP4V-ES-2 is 51 bytes > >I don't knwo the reason , could u tell me ? Most likely there is a firewall - between your client (openRTSP) and your server - that is blocking UDP traffic. To overcome this, try requesting RTP-over-TCP streaming, by adding the "-t" option to the "openRTSP" command line. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070326/7a255f55/attachment.html From finlayson at live555.com Mon Mar 26 14:44:42 2007 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 26 Mar 2007 23:44:42 +0200 Subject: [Live-devel] media server mp4 trick play In-Reply-To: <4607D0B7.2040201@dnastudios.it> References: <4607D0B7.2040201@dnastudios.it> Message-ID: >I would want, do you know when the Live555media server will support >seeking and trick-play for h264 based stream? No - but (obviously) it will not happen before we have the ability to demultiplex and stream from .mp4-format files. -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ From sademperor at gmail.com Mon Mar 26 18:11:44 2007 From: sademperor at gmail.com (LL Wang) Date: Tue, 27 Mar 2007 09:11:44 +0800 Subject: [Live-devel] About testOnDemandRTSPServer In-Reply-To: References: Message-ID: Thanks for your attention. but Tencent QQ (a popular Instant Message software in China) which is also based on UDP could run normally on my computer. 2007/3/27, Ross Finlayson : > > hi, > > I've run the testOnDemandRTSPServer.exe under Win XP sp2 (compiled by VC++ > 6.0 sp6) to access the followed URL : rtsp > ://kmdi.utoronto.ca:555/osconf/2004_may9.1. mp4 > > > > I wish to get a standard MPEG-4 stream file to test my own code, but when > the programme finished successfuly, the result was abnormal. > > > > audio-MPEG4-GENERIC-1 is 0 bytes, and video-MP4V-ES-2 is 51 bytes > > > > I don't knwo the reason , could u tell me ? > > > > Most likely there is a firewall - between your client (openRTSP) and your > server - that is blocking UDP traffic. > > > To overcome this, try requesting RTP-over-TCP streaming, by adding the > "-t" option to the "openRTSP" command line. > > -- > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070326/2f983182/attachment-0001.html From zhouh31415 at 163.com Mon Mar 26 18:59:04 2007 From: zhouh31415 at 163.com (=?gbk?B?1ty66w==?=) Date: Tue, 27 Mar 2007 09:59:04 +0800 (CST) Subject: [Live-devel] How does the function SingleStep() work? In-Reply-To: <768797.45455.qm@web15614.mail.cnb.yahoo.com> References: <768797.45455.qm@web15614.mail.cnb.yahoo.com> Message-ID: <194431362.4908631174960744535.JavaMail.root@bj163app126.163.com> Sorry, I think writing Chinese is more simple to me. To Zhang Xiang: ??testprog??????????,????????C++???. ???????????????,?????????????. ???????????,doeventloop()?play().doeventloop???????singlestep()?while()??????,???????schedule?????,????????????.???dogetnextframe()?????????????, play()-->afterplaying()-->play()???????,???????getnextframe()????????????. ????????????????????????,??????????testprog?????,?????????.??????,?????????stream???write_data()????,??????????read_data()?. ?2007-03-26?xiang zhang ??? Hello!I am Zhang Xiang, graduate student from Zhe Jiang University,China. I?m confused about how the function SingleStep() works. Specifically speaking, when there is a video connection request from remote PC, how does SingleStep() handle this request and call other functions such as DeliverFrame(), doGetNextframe(), to deliver video stream which is encoded in H.264 form? Thanks!!Best wishes! yours Xiang Mp3???-??????? -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070326/00a4edb3/attachment.html From vinodjoshi at tataelxsi.co.in Mon Mar 26 21:04:48 2007 From: vinodjoshi at tataelxsi.co.in (Vinod Madhav Joshi) Date: Tue, 27 Mar 2007 09:34:48 +0530 Subject: [Live-devel] Problem during pause and resume implementation Message-ID: <001a01c77025$10333240$022a320a@telxsi.com> Hi all, We are using Live 555 Streaming Server to stream MPEG2 TS to stream to set top box. For normal streaming, streaming is fine. Also PAUSE is successful. 
But when RESUME request is sent, we are getting error in sync byte.And application is getting crashed. What the problem is? If anybody knows please help me.. Thanks. From finlayson at live555.com Mon Mar 26 23:24:31 2007 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 27 Mar 2007 08:24:31 +0200 Subject: [Live-devel] Problem during pause and resume implementation In-Reply-To: <001a01c77025$10333240$022a320a@telxsi.com> References: <001a01c77025$10333240$022a320a@telxsi.com> Message-ID: > We are using Live 555 Streaming Server to stream MPEG2 TS to stream >to set top box. > For normal streaming, streaming is fine. > Also PAUSE is successful. But when RESUME request is sent, we are >getting error in sync byte.And application is > getting crashed. Are these "sync byte" errors (and crashes) happening at the server end, or at the client end? If they're happening at the server end, then I don't know what could be causing the problem. Are you streaming from MPEG-2 TS files, or from a live MPEG-2 TS source? -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From sademperor at gmail.com Tue Mar 27 01:24:21 2007 From: sademperor at gmail.com (LL Wang) Date: Tue, 27 Mar 2007 16:24:21 +0800 Subject: [Live-devel] about using live video source in testOnDemandRTSPServer Message-ID: Hi, now I've modified the testOnDemandRTSPServer.cpp (just simply change the test.m4e to my device name) to transmit video from my encoded chip(IME6410) , but I found the data packets from the chip is not exact MPEG-4 payload, the packets have a header added by the chip , so I need to delete the header in order to transmit and play normally. I wonder to know where should I implement the packets processing in testOnDemandRTSPServer.cpp ? Shall I rewrite the following function "announceStream(rtspServer, sms, streamName, inputFileName)" ?? thanks Frank -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070327/2cb14805/attachment.html From yelite993 at 163.com Tue Mar 27 03:41:50 2007 From: yelite993 at 163.com (=?gbk?B?0ac=?=) Date: Tue, 27 Mar 2007 18:41:50 +0800 (CST) Subject: [Live-devel] HELP:Which player can decode TS+LOAS+LATM format audio data? Message-ID: <16972774.5149221174992110048.JavaMail.root@bj163app32.163.com> Hello all, Does anybody could tell me which player can decode TS+LOAS+LATM format audio data?Thank you in advance. Yelite -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070327/e128646a/attachment.html From max at code-it-now.com Tue Mar 27 05:02:38 2007 From: max at code-it-now.com (Maxim Petrov) Date: Tue, 27 Mar 2007 15:02:38 +0300 Subject: [Live-devel] Streaming non-standard JPEGs In-Reply-To: References: <460124FB.9020006@code-it-now.com> <46014C41.1080307@code-it-now.com> <46023B7C.8000600@code-it-now.com> <46027820.1030600@code-it-now.com> <4603EA46.7060200@code-it-now.com> Message-ID: <460907DE.7080901@code-it-now.com> Ross Finlayson wrote: > >No - streaming your files using RFC 2435 is OK. "Likely to remain >fixed" doesn't mean "must remain fixed". As long as you include the >quantization tables in the RTP packets, it should be OK that they >change. > > Thanks a lot! I finally got it working! As to additional information what I wanted to send within JPEG frame. 
I noticed that if I send the comment after the Q tables it does not work (the decoder cannot parse the frame), but if I send the comment just before EOI (end of image) it works fine; the VLC and QuickTime players can play these frames. I'm not sure whether this solution is completely correct and does not break the JPEG bitstream format, but I did not find another way. From finlayson at live555.com Tue Mar 27 06:16:12 2007 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 27 Mar 2007 15:16:12 +0200 Subject: [Live-devel] about using live video source in testOnDemandRTSPServer In-Reply-To: References: Message-ID: >Hi, now I've modified the testOnDemandRTSPServer.cpp (just simply >change the test.m4e to my device name) to transmit video from my >encoded chip(IME6410) , but I found the data packets from the chip >is not exact MPEG-4 payload, the packets have a header added by the >chip , so I need to delete the header in order to transmit and play >normally. > >I wonder to know where should I implement the packets processing in >testOnDemandRTSPServer.cpp ? You could perhaps modify the code in "MPEG4VideoStreamFramer.cpp". -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Tue Mar 27 06:17:33 2007 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 27 Mar 2007 15:17:33 +0200 Subject: [Live-devel] about using live video source in testOnDemandRTSPServer Message-ID: >Hi, now I've modified the testOnDemandRTSPServer.cpp (just simply >change the test.m4e to my device name) to transmit video from my >encoded chip(IME6410) Also, don't forget to change "reuseFirstSource" to True. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From antoniotirri at libero.it Thu Mar 29 02:13:16 2007 From: antoniotirri at libero.it (Antonio Tirri) Date: Thu, 29 Mar 2007 11:13:16 +0200 Subject: [Live-devel] RTP Payload Length (RPL) of a MPEG4 video In-Reply-To: References: <45E60C79.3090606@libero.it> <4603BB2A.3070804@libero.it> Message-ID: <460B832C.1030903@libero.it> Ross Finlayson wrote: >> Hello, >> >> how can i evaluate the RTP Payload Length (RPL) of a MPEG4 video >> streamed with live555MediaServer? >> > > I don't know what you mean by "RTP payload length". If you just mean > "packet size", then you can easily figure out in the code where to > get this. > Yes, I'd like to know the RTP payload size; where can I find this information? Thanks Antonio Tirri -------------- next part -------------- An HTML attachment was scrubbed...
URL: http://lists.live555.com/pipermail/live-devel/attachments/20070329/c37d74a2/attachment-0001.html From max at code-it-now.com Thu Mar 29 03:25:02 2007 From: max at code-it-now.com (Maxim Petrov) Date: Thu, 29 Mar 2007 13:25:02 +0300 Subject: [Live-devel] possible candidate for patch (Groupsock.cpp) Message-ID: <460B93FE.4090902@code-it-now.com> Hi Ross. I noticed strange performance troubles on server side (CPU usage was more than usually) when streaming huge JPEG files. After few minutes of researching I noticed that for sending each RTP packet we call "outputToAllMemberExcept" function even if we don't have any member. So basicaly it's useless? Here's patch: @@ -269,11 +269,14 @@ statsGroupOutgoing.countPacket(bufferSize); // Then, forward to our members: - int numMembers = - outputToAllMembersExcept(interfaceNotToFwdBackTo, + int numMembers = 0; + if (!members().IsEmpty()) { + numMembers = + outputToAllMembersExcept(interfaceNotToFwdBackTo, ttlToSend, buffer, bufferSize, ourSourceAddressForMulticast(env)); if (numMembers < 0) break; + } if (DebugLevel >= 3) { env << *this << ": wrote " << bufferSize << " bytes, ttl " After applying this patch CPU usage normalized again. Can you please comment is patch correct or no? From finlayson at live555.com Thu Mar 29 14:31:36 2007 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 29 Mar 2007 14:31:36 -0700 Subject: [Live-devel] possible candidate for patch (Groupsock.cpp) In-Reply-To: <460B93FE.4090902@code-it-now.com> References: <460B93FE.4090902@code-it-now.com> Message-ID: >I noticed strange performance troubles on server side (CPU usage was >more than usually) when streaming huge JPEG files. After few minutes of >researching I noticed that for sending each RTP packet we call >"outputToAllMemberExcept" function even if we don't have any member. So >basicaly it's useless? Here's patch: > >@@ -269,11 +269,14 @@ > statsGroupOutgoing.countPacket(bufferSize); > > // Then, forward to our members: >- int numMembers = >- outputToAllMembersExcept(interfaceNotToFwdBackTo, >+ int numMembers = 0; >+ if (!members().IsEmpty()) { >+ numMembers = >+ outputToAllMembersExcept(interfaceNotToFwdBackTo, > ttlToSend, buffer, bufferSize, > ourSourceAddressForMulticast(env)); > if (numMembers < 0) break; >+ } > > if (DebugLevel >= 3) { > env << *this << ": wrote " << bufferSize << " bytes, ttl " > > >After applying this patch CPU usage normalized again. Can you please >comment is patch correct or no? Yes, it's OK - although I'm very sceptical about it having a significant effect on performance. Your change doesn't save much more than a function call. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From spectr at gmail.com Thu Mar 29 14:33:42 2007 From: spectr at gmail.com (Spectr) Date: Fri, 30 Mar 2007 00:33:42 +0300 Subject: [Live-devel] question about fmtp fields for MJPEG RTP Message-ID: <813e6e2a0703291433y7869eb7ao3c0611fcd43d087@mail.gmail.com> Hello. I work in Elphel, Inc (http://www3.elphel.com/index.php), we make high resolution network videocamera, what send videostream over RTP/RTSP in MJPEG and Theora format. I have question, about dynamic payload type for MJPEG in live555 library. RTP MJPEG RFC format with payload #26 have limitation for image size - each size of image must be less than 2040 pixels, and larger size not supported. In current live555 code exist hack for this limitation - if width or height in MJPEG header == 0 - used 2048 size (what in general not correct, but ok for 3MPx sensors with 2048x1536 image). 
We start use 5MPx sensors, and get again this problem. In SDP parser (of live555 library) exist parser of field "a=x-dimensions:WW,HH", but this data used only for container - not when we restore JPEG headers from MJPEG packets; and also, in code, in this time we can't get access to this properties (MediaSubsession::videoWidth(), MediaSubsession::videoHeight() ) of subsession from JPEG packet implementation class. I ask about add support for this in live555 - use width and height from subsession description, when we make JPEG headers from MJPEG (JPEG images have limit to 65526 pixels, not in 2040, so all must be ok with JPEG headers). And also, if this feature possible - may be, in live555 can be used for image size not only this extension ("a=x-dimensions:..."), but attributes from SDP "a=fmtp: width=WW; height=HH" attributes, like in drafts of RTP Theora, or RTP JPEG2000? In general, it's easy, if we can get access to subsession description from JPEG header class. -- With best regards, Spectr. -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070329/e4b62915/attachment.html From finlayson at live555.com Thu Mar 29 14:39:35 2007 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 29 Mar 2007 14:39:35 -0700 Subject: [Live-devel] RTP Payload Length (RPL) of a MPEG4 video In-Reply-To: <460B832C.1030903@libero.it> References: <45E60C79.3090606@libero.it> <4603BB2A.3070804@libero.it> <460B832C.1030903@libero.it> Message-ID: >Yes, i'd like to know the RTP Payload size, where can i find this information? To get the total size of each received RTP packet, call bPacket->dataSize() in "MultiFramedRTPSource::networkReadHandler()", after the call to "fillInData()". -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Thu Mar 29 15:12:29 2007 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 29 Mar 2007 15:12:29 -0700 Subject: [Live-devel] question about fmtp fields for MJPEG RTP In-Reply-To: <813e6e2a0703291433y7869eb7ao3c0611fcd43d087@mail.gmail.com> References: <813e6e2a0703291433y7869eb7ao3c0611fcd43d087@mail.gmail.com> Message-ID: >I work in Elphel, Inc >(http://www3.elphel.com/index.php), >we make high resolution network videocamera, what send videostream >over RTP/RTSP in MJPEG and Theora format. I assume you're familiar with our Elphel server application code: http://www.live555.com/Elphel/ >I have question, about dynamic payload type for MJPEG in live555 library. > >RTP MJPEG RFC format with payload #26 have limitation for image size >- each size of image must be less than 2040 pixels, and larger size >not supported. In current live555 code exist hack for this >limitation - if width or height in MJPEG header == 0 - used 2048 >size (what in general not correct, but ok for 3MPx sensors with >2048x1536 image). We start use 5MPx sensors, and get again this >problem. >In SDP parser (of live555 library) exist parser of field >"a=x-dimensions:WW,HH", but this data used only for container - not >when we restore JPEG headers from MJPEG packets; and also, in code, >in this time we can't get access to this properties >(MediaSubsession::videoWidth(), MediaSubsession::videoHeight() ) of >subsession from JPEG packet implementation class. > >I ask about add support for this in live555 - use width and height >from subsession description, when we make JPEG headers from MJPEG >(JPEG images have limit to 65526 pixels, not in 2040, so all must be >ok with JPEG headers). 
> >And also, if this feature possible - may be, in live555 can be used >for image size not only this extension ("a=x-dimensions:..."), but >attributes from SDP "a=fmtp: width=WW; height=HH" attributes, like >in drafts of RTP Theora, or RTP JPEG2000? In general, it's easy, if >we can get access to subsession description from JPEG header class. I'm generally reluctant to support attributes (like "x-dimensions") or payload formats that are not official IETF standards, or on the standards track. It wasn't clear (to me) from your message exactly what it is that you are asking for. Are you asking it/when we will support the JPEG 2000 RTP payload format ? Or are you asking that we support a non-standard SDP attribute with the RFC 2435 MJPEG RTP Payload Format? (If it's the latter, then I would be very reluctant to do this.) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070329/186869d8/attachment.html From max at code-it-now.com Fri Mar 30 01:02:40 2007 From: max at code-it-now.com (Maxim Petrov) Date: Fri, 30 Mar 2007 11:02:40 +0300 Subject: [Live-devel] possible candidate for patch (Groupsock.cpp) In-Reply-To: References: <460B93FE.4090902@code-it-now.com> Message-ID: <460CC420.4010209@code-it-now.com> Ross Finlayson wrote: >>I noticed strange performance troubles on server side (CPU usage was >>more than usually) when streaming huge JPEG files. After few minutes of >>researching I noticed that for sending each RTP packet we call >>"outputToAllMemberExcept" function even if we don't have any member. So >>basicaly it's useless? Here's patch: >> >>@@ -269,11 +269,14 @@ >> statsGroupOutgoing.countPacket(bufferSize); >> >> // Then, forward to our members: >>- int numMembers = >>- outputToAllMembersExcept(interfaceNotToFwdBackTo, >>+ int numMembers = 0; >>+ if (!members().IsEmpty()) { >>+ numMembers = >>+ outputToAllMembersExcept(interfaceNotToFwdBackTo, >> ttlToSend, buffer, bufferSize, >> ourSourceAddressForMulticast(env)); >> if (numMembers < 0) break; >>+ } >> >> if (DebugLevel >= 3) { >> env << *this << ": wrote " << bufferSize << " bytes, ttl " >> >> >>After applying this patch CPU usage normalized again. Can you please >>comment is patch correct or no? >> >> > >Yes, it's OK - although I'm very sceptical about it having a >significant effect on performance. Your change doesn't save much >more than a function call. > > Well, it's really more than a function call. Look at the attached picture (output from valgrind). Here's details of what I streamed: 1) JPEG file size ~ 106 kb; 2) Framerate ~30 fps As you see from picture "Groupsock::output" was invoked 23 242 times (in generally it's ~10 sec of streaming). Next 3 lines are our interest. Because "ourSourceAddressForMulticast" invoked every time we have 7 206 890 calls of "our_random" function. Here is performance trouble. Are you agree? -------------- next part -------------- A non-text attachment was scrubbed... Name: perf_issue.jpg Type: image/jpeg Size: 26805 bytes Desc: not available Url : http://lists.live555.com/pipermail/live-devel/attachments/20070330/f6487758/attachment-0001.jpg From max at code-it-now.com Fri Mar 30 01:26:51 2007 From: max at code-it-now.com (Maxim Petrov) Date: Fri, 30 Mar 2007 11:26:51 +0300 Subject: [Live-devel] how to define that client is died? Message-ID: <460CC9CB.6040405@code-it-now.com> Hi Ross. Here is situation. Client requested video on demand. 
After a few minutes the client's network was disconnected (and probably will never connect again), but the server continues to send UDP packets to the non-existent network and will never stop. How can we detect that the client no longer exists? I think RTCP can help in this case, but I'm not sure what the right solution would be. My one idea is to periodically check whether the client is sending RTCP reports; if there are no RTCP reports for 5-10 minutes, we can stop streaming. Does this make sense, or is there a more efficient/correct solution? From vinodjoshi at tataelxsi.co.in Fri Mar 30 07:00:19 2007 From: vinodjoshi at tataelxsi.co.in (Vinod Madhav Joshi) Date: Fri, 30 Mar 2007 19:30:19 +0530 Subject: [Live-devel] Problem for Trick Play mode streaming Message-ID: <006201c772d3$c0ede2f0$022a320a@telxsi.com> Hi all, We are using the Live555 streaming server to stream MPEG2 TS, with VLC as the client. Normal streaming is fine. We also successfully created the .tsx index files for the respective streams, and they are greater than 0 bytes. When we fast-forward the video, it works for a few seconds and the video is shown being fast-forwarded. But after 4-5 seconds the video gets stuck, which should not happen. What can the problem be here? Is anybody facing the same problem? If anyone has more information about this, please help us. Thank you. From luis.figueiredo at tagus.ist.utl.pt Fri Mar 30 08:26:26 2007 From: luis.figueiredo at tagus.ist.utl.pt (Luís Figueiredo) Date: Fri, 30 Mar 2007 16:26:26 +0100 Subject: [Live-devel] Problem for Trick Play mode streaming In-Reply-To: <006201c772d3$c0ede2f0$022a320a@telxsi.com> References: <006201c772d3$c0ede2f0$022a320a@telxsi.com> Message-ID: <460D2C22.4030801@tagus.ist.utl.pt> Hi. I have the same problem, but I don't know the solution. Thanks. Vinod Madhav Joshi wrote: > Hi all, > > We are using Live555 streaming server to stream MPEG2 TS and VLC as a > client. > > For normal streaming the streaming is fine. > > Also we successfully created .tsx of respective streams which are > greater than 0 bytes. > > When we fast forward the video, for few seconds it will start it will > display the video is being forwarded. > > But after 4-5 seconds the video will stuck up, which should not > happen. > > What the problem can be here? > Anybody is facing the same problem? > > If anyone have more information about this please help us.. > > Thank You. > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > From finlayson at live555.com Fri Mar 30 07:45:29 2007 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 30 Mar 2007 07:45:29 -0700 Subject: [Live-devel] how to define that client is died? In-Reply-To: <460CC9CB.6040405@code-it-now.com> References: <460CC9CB.6040405@code-it-now.com> Message-ID: >Client requested video on demand. After few minutes client's network was >disconnected (and probably will never connect again). But server >continues send UDP packets to non-existent network and never will stop. >How we can define that client does not exist anymore? I think RTCP can >help in this case. Yes. The "reclamationTestSeconds" parameter to "RTSPServer::createNew()" is used to set a timeout on client inactivity. The default value is 45 (seconds). Therefore, if an RTSP/RTP client dies, then the server will stop sending to it within 45 seconds. -- Ross Finlayson Live Networks, Inc.
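To make the parameter concrete, here is a minimal sketch of how it is passed in when the server is created. This is not from the thread itself: the port number (8554) and the 30-second value are arbitrary choices for illustration, and the surrounding setup simply mirrors the testProgs (e.g. testOnDemandRTSPServer).

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"

int main() {
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  UserAuthenticationDatabase* authDB = NULL; // no access control, as in the testProgs

  // The 4th argument is "reclamationTestSeconds": if the server sees no
  // activity from a client (e.g. RTCP reports) for this many seconds, that
  // client's session is reclaimed.  The default mentioned above is 45.
  RTSPServer* rtspServer = RTSPServer::createNew(*env, 8554, authDB, 30);
  if (rtspServer == NULL) {
    *env << "Failed to create RTSP server: " << env->getResultMsg() << "\n";
    return 1;
  }

  // ... create and add ServerMediaSessions here, as in testOnDemandRTSPServer ...

  env->taskScheduler().doEventLoop(); // does not return
  return 0;
}

Lowering the value makes stale sessions go away sooner, at the cost of being less forgiving of clients whose RTCP reports are delayed or lost.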
http://www.live555.com/ From finlayson at live555.com Fri Mar 30 08:01:03 2007 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 30 Mar 2007 08:01:03 -0700 Subject: [Live-devel] possible candidate for patch (Groupsock.cpp) In-Reply-To: <460CC420.4010209@code-it-now.com> References: <460B93FE.4090902@code-it-now.com> <460CC420.4010209@code-it-now.com> Message-ID: >>Yes, it's OK - although I'm very sceptical about it having a >>significant effect on performance. Your change doesn't save much >>more than a function call. >> >Well, it's really more than a function call. Look at the attached >picture (output from valgrind). Here's details of what I streamed: >1) JPEG file size ~ 106 kb; >2) Framerate ~30 fps >As you see from picture "Groupsock::output" was invoked 23 242 times >(in generally it's ~10 sec of streaming). Next 3 lines are our >interest. Because "ourSourceAddressForMulticast" invoked every time >we have 7 206 890 calls of "our_random" function. There's your real problem. Apparently "ourSourceAddressForMulticast()" is not working properly for you. (If it's working properly, then the static variable "ourAddress " will be set to a non-zero value the first time it's called, and then just used on each subsequent call.) So, your next job is to go through the code for "ourSourceAddressForMulticast", to figure out why it's not working properly for you. (It's usually because you don't have multicast configured properly on your network interface.) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Fri Mar 30 08:02:54 2007 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 30 Mar 2007 08:02:54 -0700 Subject: [Live-devel] Problem for Trick Play mode streaming In-Reply-To: <006201c772d3$c0ede2f0$022a320a@telxsi.com> References: <006201c772d3$c0ede2f0$022a320a@telxsi.com> Message-ID: >Hi all, > > We are using Live555 streaming server to stream MPEG2 TS and VLC as a >client. > > For normal streaming the streaming is fine. > > Also we successfully created .tsx of respective streams which are >greater than 0 bytes. > > When we fast forward the video, for few seconds it will start it will >display the video is being forwarded. > > But after 4-5 seconds the video will stuck up, which should not >happen. > > What the problem can be here? > Anybody is facing the same problem? > > If anyone have more information about this please help us.. The problem here is VLC; it currently does not perform/handle 'trick play' operations properly. Fortunately, this is a known problem, and the VLC developers are working on it. A future version of VLC - probably Real Soon Now - should fix this problem. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From adutta at research.telcordia.com Fri Mar 30 08:21:52 2007 From: adutta at research.telcordia.com (Ashutosh Dutta) Date: Fri, 30 Mar 2007 11:21:52 -0400 Subject: [Live-devel] how to define that client is died? In-Reply-To: References: <460CC9CB.6040405@code-it-now.com> Message-ID: <460D2B10.4020408@research.telcordia.com> I have a similar question in the case of server inactivity. What happens to the client if it does not receive any RTP stream for sometime from the server? Can RTCP info be used to signal the client to behave accordingly? Thanks Ashutosh Ross Finlayson wrote: >> Client requested video on demand. After few minutes client's network was >> disconnected (and probably will never connect again). But server >> continues send UDP packets to non-existent network and never will stop. 
>> How we can define that client does not exist anymore? I think RTCP can >> help in this case. > > Yes. The "reclamationTestSeconds" parameter to > "RTSPServer::createNew()" is used to set a timeout in client > inactivity. The default value is 45 (seconds). Therefore, if a > RTSP/RTP client dies, then the server will stop sending to it within > 45 seconds. From glen at lincor.com Fri Mar 30 08:58:39 2007 From: glen at lincor.com (Glen Gray) Date: Fri, 30 Mar 2007 16:58:39 +0100 Subject: [Live-devel] Problem for Trick Play mode streaming In-Reply-To: <006201c772d3$c0ede2f0$022a320a@telxsi.com> References: <006201c772d3$c0ede2f0$022a320a@telxsi.com> Message-ID: <460D33AF.6050004@lincor.com> Hey guys, This has been covered before on the list if you check the archives. In summary, VLC has no support for RTSP trickplay. However, it's coming, and hopefully soon. I know I keep saying that :) What your seeing when you fastforward is the rate of play on the ts buffer increase. But once the buffer gets emptied then the video stream breaks up, as it can't fill the buffer fast enough. What should happen is that VLC should tell the the RTSP server to increase the scale of play so that the stream feed to VLC contains the increased playback rate. I've patches for VLC that allow trickplay support but only against Kasenna servers. I've been in discussions with the VLC guys about how to make those patches more generic so that they'd be accepted upstream by the project. I've a working set diffs for this implementation for VLC 0.8.6a that just needs a small bit of work to fix up the KeepAlive mode on Kasenna servers (what we currently use). Then I'll be able to test against the live555 streaming server and make sure I've handled the non-kasenna way properly. As per usual I'm swamped with other projects at the moment, so much so that I've been doing the 0.8.6a patchset in my limited free time. However, this is now becoming more of a priority at work as we need to be flexible in our VOD server, some customers have existing VOD servers for example. I'll make an announcement to the list soon, with link to the patches and perhaps a source rpm or something. Kind Regards, Vinod Madhav Joshi wrote: > Hi all, > > We are using Live555 streaming server to stream MPEG2 TS and VLC as a > client. > > For normal streaming the streaming is fine. > > Also we successfully created .tsx of respective streams which are > greater than 0 bytes. > > When we fast forward the video, for few seconds it will start it will > display the video is being forwarded. > > But after 4-5 seconds the video will stuck up, which should not > happen. > > What the problem can be here? > Anybody is facing the same problem? > > If anyone have more information about this please help us.. > > Thank You. > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -- Glen Gray Digital Depot, Thomas Street Senior Software Engineer Dublin 8, Ireland Lincor Solutions Ltd. Ph: +353 (0) 1 4893682 From finlayson at live555.com Fri Mar 30 12:00:05 2007 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 30 Mar 2007 12:00:05 -0700 Subject: [Live-devel] Problem for Trick Play mode streaming In-Reply-To: <460D33AF.6050004@lincor.com> References: <006201c772d3$c0ede2f0$022a320a@telxsi.com> <460D33AF.6050004@lincor.com> Message-ID: >I'll make an announcement to the list soon, with link to the patches and >perhaps a source rpm or something. 
Glen, Any patches to VLC should be sent to a VLC-related mailing list, not this list. If you haven't already done so, you should coordinate with Derk-Jan Hartman and Jean-Paul Saman, who (I believe) are working on updating VLC to properly support RTSP 'trick play' operations - just so you don't end up duplicating their work. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Fri Mar 30 14:19:53 2007 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 30 Mar 2007 14:19:53 -0700 Subject: [Live-devel] how to define that client is died? In-Reply-To: <460D2B10.4020408@research.telcordia.com> References: <460CC9CB.6040405@code-it-now.com> <460D2B10.4020408@research.telcordia.com> Message-ID: >I have a similar question in the case of server inactivity. What happens >to the client if it does not receive any RTP stream for sometime from >the server? Can RTCP info be used to signal the client to behave >accordingly? Yes. Note that RTSP server implementation (e.g, as used in the "LIVE555 Media Server") sends send RTCP "BYE" packets when the stream ends, but only for media types for which 'seeking' is not supported. (For those media types, the server keeps the session open after the stream ends, in case the client wants to replay it from an earlier time.) RTSP clients already interpret incoming RTCP "BYE" packets as signalling the end of a stream, and handle this accordingly. However, if you want to watch for a server just 'dying', before signalling the end of a stream, then you could also watch for RTCP "SR" packets coming from the server (using the "RTCPInstance::setSRHandler()" function). In most cases, though, this shouldn't be necessary, because if a server dies, then the RTSP TCP connection will also (eventually) get closed, and the client will detect this. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From adutta at research.telcordia.com Fri Mar 30 14:49:22 2007 From: adutta at research.telcordia.com (Ashutosh Dutta) Date: Fri, 30 Mar 2007 17:49:22 -0400 Subject: [Live-devel] how to define that client is died? In-Reply-To: References: <460CC9CB.6040405@code-it-now.com> <460D2B10.4020408@research.telcordia.com> Message-ID: <460D85E2.4040408@research.telcordia.com> Ross, thanks for your reply. I think my scenario is little different. We are assuming a scenario where the server stops sending traffic for certain time, either controlled by a third party, or by its own, and then it is asked to send the traffic. Thus for a while, client does not get any RTP traffic from the server and after a while it starts getting traffic. I think it could very similar to network congestion in some sense. Thanks Ashutosh Ross Finlayson wrote: >> I have a similar question in the case of server inactivity. What happens >> to the client if it does not receive any RTP stream for sometime from >> the server? Can RTCP info be used to signal the client to behave >> accordingly? > > Yes. Note that RTSP server implementation (e.g, as used in the > "LIVE555 Media Server") sends send RTCP "BYE" packets when the stream > ends, but only for media types for which 'seeking' is not supported. > (For those media types, the server keeps the session open after the > stream ends, in case the client wants to replay it from an earlier > time.) > > RTSP clients already interpret incoming RTCP "BYE" packets as > signalling the end of a stream, and handle this accordingly. 
> > However, if you want to watch for a server just 'dying', before > signalling the end of a stream, then you could also watch for RTCP > "SR" packets coming from the server (using the > "RTCPInstance::setSRHandler()" function). In most cases, though, > this shouldn't be necessary, because if a server dies, then the RTSP > TCP connection will also (eventually) get closed, and the client will > detect this.