From shalom.shushan at gmail.com Wed Aug 2 00:55:00 2006 From: shalom.shushan at gmail.com (shalom shushan) Date: Wed, 2 Aug 2006 09:55:00 +0200 Subject: [Live-devel] RTSP Packet time Interval Message-ID: Hi All, Can anyone help me to determine the time interval between sending adjacent packets? Is there a parameter to control it? Where can I find the scheduler that sends the packets? Thank you, shalom -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20060802/a4ba3317/attachment.html From ymreddy at ssdi.sharp.co.in Wed Aug 2 01:44:50 2006 From: ymreddy at ssdi.sharp.co.in (Mallikharjuna Reddy (NAVT)) Date: Wed, 2 Aug 2006 14:14:50 +0530 Subject: [Live-devel] MPEG2 streaming Message-ID: <7FB4685EA93D014C8E30AA087B66E7520337CF01@ssdimailsrvnt01.ssdi.sharp.co.in> Hi Everybody, We are trying to stream MPEG2 files and receive them at the client using the test programs testMPEG1or2VideoStreamer.cpp and testMPEG1or2VideoReceiver.cpp. When we compare the original MPEG2 file with the file received at the client, there is a difference of 70 bytes. We tested this on an isolated network (only the server and client are connected, using a hub), with private IP addresses assigned to both. We also ran a network analyzer tool at the client side, and it showed no packet loss. When the received MPEG2 file was decoded with an MPEG2 decoder, two frames were missing compared with the original file. Decoding the received file to YUV format likewise produced a YUV file smaller than the original. Any clues on this? Thanks and Regards Y.
Mallikharjuna Reddy From weymar at ibr.cs.tu-bs.de Wed Aug 2 05:37:05 2006 From: weymar at ibr.cs.tu-bs.de (Rodrigo Weymar) Date: Wed, 2 Aug 2006 14:37:05 +0200 (CEST) Subject: [Live-devel] calling Live555 libraries from a C app under Embedded Visual C++ 4.0 In-Reply-To: <44CA0580.6030503@rbg.informatik.tu-darmstadt.de> References: <1153494350.44c0ed4e45ade@webmail.soton.ac.uk> <55786.84.133.237.208.1153663529.squirrel@webmail.ibr.cs.tu-bs.de> <44C47125.3080106@rbg.informatik.tu-darmstadt.de> <49884.134.169.35.238.1153908524.squirrel@webmail.ibr.cs.tu-bs.de> <44CA0580.6030503@rbg.informatik.tu-darmstadt.de> Message-ID: <2835.134.169.35.207.1154522225.squirrel@webmail.ibr.cs.tu-bs.de> Hi Bertram, thank you for your help. Basically I want to use the complete openRTSP program from inside a C app. To run the openRTSP.exe under WinCE is not a problem. Also to buid the Live555 libs (groupsock, liveMedia, UsageEnvironment and BasicUsageEnvironment) to WinCE is not a problem. The problem is to get the C++ classes working from inside the C app I am working on. I am able to compile and link the classes without errors, but not to get they working. Probably I am using the C++ classes in a wrong way. Anyway I will keep trying. Best regards, Rodrigo > Hi Rodrigo, > > my application is written in C++. 
I have renamed the main() function of > the Test-Application playSIP in playCommon.cpp from > main(int argc, char** argv) > to > mainplayCommon(int argc, char** argv) > > and then in my own application I declare > extern int mainplayCommon(int, char**); > > So I can call later in my application the complete testProgramm playSIP > like this: > > char *progName = "playSIP"; > const int argc = 12; > char *argOne = "-V"; > char *argTwo = "-M"; > char *argThree = "-A"; > char *argFour = "8"; > char *argFive = "-e"; > char *argSix = "10"; > char *argSeven = "-P"; > char *argEight = "49211"; > char *argNine = "-W"; > char *argTen = "0"; > char *argEleven = "172.16.1.100"; > > char * argv[argc]; > argv[ 0] = progName; > argv[ 1] = argOne; > argv[ 2] = argTwo; > argv[ 3] = argThree; > argv[ 4] = argFour; > argv[ 5] = argFive; > argv[ 6] = argSix; > argv[ 7] = argSeven; > argv[ 8] = argEight; > argv[ 9] = argNine; > argv[10] = argTen; > argv[11] = argEleven; > > mainplayCommon(argc,argv, this->m_frame); > > Did this help you? > > For an easier way to build my programm for Win32 and WinCE with one GUI > and one Code my programm is build-on the minimalSample from wxWidgets. I > don't know if this is realy intersting for your problem. But this sample > has one base code and you could build the same code with the same GUI > for Win32 and WinCE. In this sample if have include the code above. > > Best regard, > > Bertram > > > Rodrigo Weymar schrieb: > >> Hi Bertram, >> >> >> is your application written in C or C++ ? Were you able to mix C and C++ >> code and to declare C++ functions as extern "C" under EVC++ without >> using >> #ifdef __cplusplus ? 
>> >> If I try to use extern "C" under EVC++ without using #ifdef __cplusplus, >> I >> get a >> >> error C2059: syntax error : 'string' >> >> as explained in http://www.kbalertz.com/kb_133070.aspx >> >> >> regards, >> Rodrigo >> >> >> >> >>>Hi Rodrigo, >>> >>>i have written an SIP phone with RTP Stack from LIVE555 for Pocket PC / >>>Windows CE with EVC++ 4.0. Unfortunately I'm not able to solve your >>>problem but I can ensure you, that LIVE555 works fine under EVC++. >>>Further, I remember me fine, I have not set up any preprocessor >>> directive. >>> >>>Best regards, >>> >>> Bertram >>> >>> >>> >>>>Hi all, >>>> >>>>probably this is not the right mailing list to ask, since my question >>>> is >>>>not directly concerning the Live555 libraries. But maybe someone can >>>>help >>>>me or give me an advice. >>>> >>>>I am trying to integrate the Live555 libs (I am interested in the RTSP >>>>support provided by these libs) into an C app, more exactly a streaming >>>>player for Pocket PC/WinCE420, which is written in C. >>>> >>>>I use Embedded Visual C++ 4.0 as IDE. >>>> >>>>I don't get any compilation errors, since from the C source code I am >>>>using the following preprocessor directive: >>>> >>>>#ifdef __cplusplus >>>>extern "C" { >>>> >>>>here I put the Live555 C++ classes >>>> >>>>} >>>>#endif >>>> >>>> >>>>What happens is that, when I run the app, the Live555 classes don't >>>> take >>>>any effect. That is, the Embedded Visual C++ C/C++ compiler seems to >>>>just >>>>ignore what is in between the preprocessor directives. It seems to have >>>>not linked the C++ classes to the C objs. >>>> >>>>The document in [1] says that the C++ runtime libraries should be >>>>explicitly >>>>linked to the app. The problem is that I can not figure out how >>>> Embedded >>>>Visual C++ manages that. I was not able to find any option in "Project >>>>Settings" or in "Tools -> Options" concerning that. >>>> >>>>I also googled for solutions, but was not able to find one. 
>>>> >>>>Would someone have previous experience with that? Could someone give me >>>>help, please ? >>>> >>>> >>>>Thanks a lot! >>>> >>>>Rodrigo >>>> >>>> >>>>[1] http://developers.sun.com/prodtech/cc/articles/mixing.html#linking >>>> >>>>_______________________________________________ >>>>live-devel mailing list >>>>live-devel at lists.live555.com >>>>http://lists.live555.com/mailman/listinfo/live-devel >>>> >>> >>> >>>_______________________________________________ >>>live-devel mailing list >>>live-devel at lists.live555.com >>>http://lists.live555.com/mailman/listinfo/live-devel >>> >> >> >> >> _______________________________________________ >> live-devel mailing list >> live-devel at lists.live555.com >> http://lists.live555.com/mailman/listinfo/live-devel >> > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > From larissalucena at gmail.com Wed Aug 2 12:55:17 2006 From: larissalucena at gmail.com (Larissa Lucena) Date: Wed, 2 Aug 2006 16:55:17 -0300 Subject: [Live-devel] Link error Message-ID: <440165240608021255k7a1b3489xcdd0dd37a7d9b1cf@mail.gmail.com> Hi there, when I was trying to compile my code, received this error message ++ -o"HugeViewer" ./HVJPEGVideoFileServerMediaSubsession.o ./HVJPEGVideoSource.o ./onDemandRTPServer.o -l/usr/lib/live/BasicUsageEnvironment/libBasicUsageEnvironment.a /usr/bin/ld: cannot find -l/usr/lib/live/BasicUsageEnvironment/libBasicUsageEnvironment.a collect2: ld returned 1 exit status somebody knows how can I solve this problem? thanks in advance, Larissa -- "O maior prazer do inteligente ? bancar o idiota diante de um idiota que banca o inteligente". -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://lists.live555.com/pipermail/live-devel/attachments/20060802/121672b9/attachment-0001.html From Fabrice.Aeschbacher at siemens.com Thu Aug 3 00:34:41 2006 From: Fabrice.Aeschbacher at siemens.com (Aeschbacher, Fabrice) Date: Thu, 3 Aug 2006 09:34:41 +0200 Subject: [Live-devel] Link error In-Reply-To: <440165240608021255k7a1b3489xcdd0dd37a7d9b1cf@mail.gmail.com> Message-ID: Hi, Try: -o HugeViewer ./HVJPEGVideoFileServerMediaSubsession.o ./HVJPEGVideoSource.o ./onDemandRTPServer.o /usr/lib/live/BasicUsageEnvironment/libBasicUsageEnvironment.a Fabrice ________________________________ From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Larissa Lucena Sent: Mittwoch, 2. August 2006 21:55 To: live-devel at ns.live555.com Subject: [Live-devel] Link error Hi there, when I was trying to compile my code, received this error message ++ -o"HugeViewer" ./HVJPEGVideoFileServerMediaSubsession.o ./HVJPEGVideoSource.o ./onDemandRTPServer.o -l/usr/lib/live/BasicUsageEnvironment/libBasicUsageEnvironment.a /usr/bin/ld: cannot find -l/usr/lib/live/BasicUsageEnvironment/libBasicUsageEnvironment.a collect2: ld returned 1 exit status somebody knows how can I solve this problem? thanks in advance, Larissa -- "O maior prazer do inteligente ? bancar o idiota diante de um idiota que banca o inteligente". From huutribk2001 at yahoo.com Thu Aug 3 02:16:41 2006 From: huutribk2001 at yahoo.com (Tran Huu Tri) Date: Thu, 3 Aug 2006 02:16:41 -0700 (PDT) Subject: [Live-devel] step by step to use liveMedia library to do stream frame captured from webcam to RTP server Message-ID: <20060803091642.85379.qmail@web52304.mail.yahoo.com> Hi, I am Tri Tran, I have downloaded the live media library that use RTP to do a video stream,...from http://www.live555.com/liveMedia/ website, I buit this on Windown platform successfuly, I can run the test program to do stream a MPEG video. 
I see this very helpful for my project that do a streaming video over internet. My project is: capture image (frame buffer) from webcam or canvas windown, and do streaming as RTP package to RPT server(or client) as JMStudio that can listen and play RPT session as video. I don't know what the step by step to use the library to do encode as RTP package with some codec as MPEG1, H263,... from a captured buffer. Can you help me with this? I am very thank to you for replying. Thank in advance. __________________________________________________ Do You Yahoo!? Tired of spam? Yahoo! Mail has the best spam protection around http://mail.yahoo.com From weiyutao36 at 163.com Thu Aug 3 05:44:44 2006 From: weiyutao36 at 163.com (weiyutao36) Date: Thu, 3 Aug 2006 20:44:44 +0800 (CST) Subject: [Live-devel] A problem: receive data from server----co nnection down and reconnect Message-ID: <44D1EFBC.000028.27224@bj163app66.163.com> Hello everyone, Now I am planning to solve a problem: I am receiving a movie file from the server using the live library but suddenly, the network connection is down and the program calls shutdown() and exits, I only got a part of the movie file -as a file saved on my disc. Now,I want to change this---when the network connection is down,I DO NOT exit the program but wait for a specified time and when the network connection is OK,I re-connect to the server and CONTINUE writing the remained movie file to my already saved file segment. However,I do not know where to add some codes to the program and a basic solution is not formed. Does anyone have some advice? Thanks in advance. ______________________ FilexBlue,P.R.China -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://lists.live555.com/pipermail/live-devel/attachments/20060803/f6e29a8b/attachment.html From ymreddy at ssdi.sharp.co.in Thu Aug 3 07:20:15 2006 From: ymreddy at ssdi.sharp.co.in (Mallikharjuna Reddy (NAVT)) Date: Thu, 3 Aug 2006 19:50:15 +0530 Subject: [Live-devel] MPEG2 streaming Message-ID: <7FB4685EA93D014C8E30AA087B66E7520337CF10@ssdimailsrvnt01.ssdi.sharp.co.in> Hi Everybody, Further to my mail on MPEG2 streaming, we observed that server program (testMPEG1or2VideoStreamer.cpp) is sending the extra 70 bytes and the client program (testMPEG1or2VideoReceiver.cpp) is receiving these 70 bytes. Any idea why these extra bytes in the server side. When we tried to decode the original server MPEG2 stream and the received MPEG2 stream in the client using Mplayer into ppm files format, one last ppm file is missing on the client side. Any clues on this, please respond. Thanks and Regards Y. Mallikharjuna Reddy From finlayson at live555.com Thu Aug 3 07:59:28 2006 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 3 Aug 2006 07:59:28 -0700 Subject: [Live-devel] MPEG2 streaming In-Reply-To: <7FB4685EA93D014C8E30AA087B66E7520337CF10@ssdimailsrvnt01.ssdi.sharp.co.in > References: <7FB4685EA93D014C8E30AA087B66E7520337CF10@ssdimailsrvnt01.ssdi.sharp.co.in > Message-ID: >Further to my mail on MPEG2 streaming, we observed that server program >(testMPEG1or2VideoStreamer.cpp) is sending the extra 70 bytes and the client >program (testMPEG1or2VideoReceiver.cpp) is receiving these 70 bytes. Any >idea why these extra bytes in the server side. No, but Remember, You Have Complete Source Code. >When we tried to decode the original server MPEG2 stream and the received >MPEG2 stream in the client using Mplayer into ppm files format, one last ppm >file is missing on the client side. That is a MPlayer-specific issue; you'll need to ask about this on a MPlayer mailing list (or else just use VLC instead, like most other people). -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ From larissalucena at gmail.com Fri Aug 4 04:49:48 2006 From: larissalucena at gmail.com (Larissa Lucena) Date: Fri, 4 Aug 2006 08:49:48 -0300 Subject: [Live-devel] Link error In-Reply-To: References: <440165240608021255k7a1b3489xcdd0dd37a7d9b1cf@mail.gmail.com> Message-ID: <440165240608040449v6857b272ta301610116ced75d@mail.gmail.com> OK, thanks, I'll try it!
Larissa On 8/3/06, Aeschbacher, Fabrice wrote: > > Hi, > > Try: > > -o HugeViewer ./HVJPEGVideoFileServerMediaSubsession.o > ./HVJPEGVideoSource.o ./onDemandRTPServer.o > /usr/lib/live/BasicUsageEnvironment/libBasicUsageEnvironment.a > > Fabrice > > > > > ________________________________ > > From: live-devel-bounces at ns.live555.com [mailto: > live-devel-bounces at ns.live555.com] On Behalf Of Larissa Lucena > Sent: Mittwoch, 2. August 2006 21:55 > To: live-devel at ns.live555.com > Subject: [Live-devel] Link error > > > Hi there, > > when I was trying to compile my code, received this error message > > ++ -o"HugeViewer" ./HVJPEGVideoFileServerMediaSubsession.o > ./HVJPEGVideoSource.o ./onDemandRTPServer.o > -l/usr/lib/live/BasicUsageEnvironment/libBasicUsageEnvironment.a > /usr/bin/ld: cannot find > -l/usr/lib/live/BasicUsageEnvironment/libBasicUsageEnvironment.a > collect2: ld returned 1 exit status > > somebody knows how can I solve this problem? > > thanks in advance, > > Larissa > > -- > "O maior prazer do inteligente ? bancar o idiota > diante de um idiota que banca o inteligente". > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > -- "O maior prazer do inteligente ? bancar o idiota diante de um idiota que banca o inteligente". -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20060804/1b497cea/attachment.html From yossyd at nayos.com Sun Aug 6 02:21:07 2006 From: yossyd at nayos.com (Yossy Dreyfus) Date: Sun, 6 Aug 2006 11:21:07 +0200 Subject: [Live-devel] about testWAVAudioStreamer Message-ID: <000001c6b939$aba4da60$570a1f0a@nayos.local> Hello, The "testWAVAudioStreamer" program works only with PCM (audio format = 1). I need to send via rtsp a WAV file with audio format =6 (aLaw), which changes should I do? 
-------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20060806/2a39ddd9/attachment-0001.html From shalom.shushan at gmail.com Sun Aug 6 03:32:31 2006 From: shalom.shushan at gmail.com (shalom shushan) Date: Sun, 6 Aug 2006 13:32:31 +0300 Subject: [Live-devel] about testWAVAudioStreamer In-Reply-To: <000001c6b939$aba4da60$570a1f0a@nayos.local> References: <000001c6b939$aba4da60$570a1f0a@nayos.local> Message-ID: To stream G.711 a-law audio, you would need to write a new "aLawFromPCMAudio" filter, similar to the existing "uLawFromPCMAudio" filter class. On 8/6/06, Yossy Dreyfus wrote: > > Hello, > > > > The "testWAVAudioStreamer" program works only with PCM (audio format = 1). > > > > I need to send via rtsp a WAV file with audio format =6 (aLaw), which > changes should I do? > > > > > > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20060806/2c2e979a/attachment.html From atariq at gmail.com Sun Aug 6 05:42:11 2006 From: atariq at gmail.com (Adnan) Date: Sun, 6 Aug 2006 14:42:11 +0200 Subject: [Live-devel] Two Sink in same thread Message-ID: <9519eabd0608060542y39abc817pb6431f5eb6ac177b@mail.gmail.com> Hi, I am just new to Liblive555, so need some help from you ppl. I have to do something like below RTPSource --> FileSink ---- some processing ---- FileSource ---> RTPSink Is it possible that i call FileSink->startplaying() RTPSink->startplaying() in same thread and Will TaskScheduler::doEventLoop() be able to start generating tasks that need to be done for both sinks. Actually i have tried this and TaskScheduler::doEventLoop() never process any event. Is having two sinks in same thread not possible or i am doing somthing wrong? 
Thanks for your time best regards adnan -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20060806/d3d17709/attachment.html From atariq at gmail.com Sun Aug 6 05:44:27 2006 From: atariq at gmail.com (Adnan) Date: Sun, 6 Aug 2006 14:44:27 +0200 Subject: [Live-devel] Using own decoder within live 555 Message-ID: <9519eabd0608060544m4caf8d95q6ae9297a6a9f9370@mail.gmail.com> Hi, I have another question I have to develop an application which recieve RTP stream (Mpeg2 or Mpeg4) , decode it then encode it (with user defined parameters) and then send it again as RTP stream. RTPSource --> Decode ---> Encode ---> RTPSink According to FAQs ..... in order to send encoder output to RTPSink, one has to write FramedSource subclass than encapsulate the encoder and delievers audio or video frames directly to the appropriate RTPSink. But what about the Decoder. How to get frames from RTPSource ? regards adnan -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20060806/ecd12c91/attachment.html From finlayson at live555.com Sun Aug 6 07:12:33 2006 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 6 Aug 2006 07:12:33 -0700 Subject: [Live-devel] about testWAVAudioStreamer In-Reply-To: <000001c6b939$aba4da60$570a1f0a@nayos.local> References: <000001c6b939$aba4da60$570a1f0a@nayos.local> Message-ID: >Hello, > >The "testWAVAudioStreamer" program works only with PCM (audio format = 1). > >I need to send via rtsp a WAV file with audio format =6 (aLaw), >which changes should I do? As I said in an earlier response about a week ago (to someone else's question, I hope): Streaming a-law audio over RTP is done the same way as streaming u-law audio over RTP, except that the MIME subtype is "PCMA" (rather than "PCMU"), and - for 8 kHz mono - the static payload type 8 (rather than 0). (See RFC 3551.) 
You should be able to figure out from the existing code how to do this... -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20060806/d8dfe669/attachment.html From finlayson at live555.com Sun Aug 6 07:21:28 2006 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 6 Aug 2006 07:21:28 -0700 Subject: [Live-devel] Two Sink in same thread In-Reply-To: <9519eabd0608060542y39abc817pb6431f5eb6ac177b@mail.gmail.com> References: <9519eabd0608060542y39abc817pb6431f5eb6ac177b@mail.gmail.com> Message-ID: >Hi, > >I am just new to Liblive555, so need some help from you ppl. > >I have to do something like below > >RTPSource --> FileSink ---- some processing ---- FileSource ---> RTPSink No, there is only one sink ("MediaSink" subclass) object in each data chain: It's the object at the end of the chain. What you really want is to write your own "FramedFilter" subclass that acts like a "FileSink". I.e., your class's implementation of the "doGetNextFrame()" virtual function will be similar to "FileSink"s implementation of the "continuePlaying()" virtual function - except that it would also deliver data to the downstream object. (As always, look at "liveMedia/DeviceSource.cpp" for guidance.) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Sun Aug 6 07:23:49 2006 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 6 Aug 2006 07:23:49 -0700 Subject: [Live-devel] Using own decoder within live 555 In-Reply-To: <9519eabd0608060544m4caf8d95q6ae9297a6a9f9370@mail.gmail.com> References: <9519eabd0608060544m4caf8d95q6ae9297a6a9f9370@mail.gmail.com> Message-ID: >I have to develop an application which recieve RTP stream (Mpeg2 or >Mpeg4) , decode it then encode it (with user defined parameters) and >then send it again as RTP stream. 
> >RTPSource --> Decode ---> Encode ---> RTPSink > >According to FAQs ..... in order to send encoder output to RTPSink, >one has to write FramedSource subclass than encapsulate the encoder >and delievers audio or video frames directly to the appropriate >RTPSink. > > But what about the Decoder. How to get frames from RTPSource ? Call yourRTPSourceObject->getNextFrame( ... ) There are several examples in the code of this (search for "getNextFrame(") -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Mon Aug 7 00:38:29 2006 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 7 Aug 2006 00:38:29 -0700 Subject: [Live-devel] New "LIVE555 Streaming Media" code release - changes RTSP server port number selection Message-ID: A new version (2006.08.07) of the "LIVE555 Streaming Media" code has now been installed. A noteworthy change in this release: The way that "OnDemandServerMediaSubsession" creates server ports for unicast RTSP sessions has been changed. Rather than letting the OS choose an 'ephemeral' port number, we now choose server port numbers starting with a specific port number (which is now an optional parameter to the "OnDemandServerMediaSubsession" constructor). The default value of this initial port number parameter is 6970. This matches the port number range used by other common RTSP server implementations, including Darwin Streaming Server and Helix. From yunjnz at yahoo.com Mon Aug 7 01:42:58 2006 From: yunjnz at yahoo.com (yj) Date: Mon, 7 Aug 2006 01:42:58 -0700 (PDT) Subject: [Live-devel] New "LIVE555 Streaming Media" code release - changes RTSP server port number selection In-Reply-To: Message-ID: <20060807084258.47610.qmail@web35812.mail.mud.yahoo.com> Hi, did you plan for the trick mode support for MPEG2 and MPEG4 streaming? Thanks, sean. --- Ross Finlayson wrote: > A new version (2006.08.07) of the "LIVE555 Streaming > Media" code has > now been installed.
> > A noteworthy change in this release: > > The way that "OnDemandServerMediaSubsession" creates > server ports for > unicast RTSP sessions has been changed. Rather > than letting the OS > choose an 'ephemeral' portm number, we now choose > server port numbers > starting with a specific port number (which is now > an optional > parameter to the "OnDemandServerMediaSubsession" > constructor). The > default value of this initial port number parameter > is 6970. This > matches the port number range used by other common > RTSP server > implementations, > including Darwin Streaming Server and Helix. > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > __________________________________________________ Do You Yahoo!? Tired of spam? Yahoo! Mail has the best spam protection around http://mail.yahoo.com From xcsmith at rockwellcollins.com Mon Aug 7 11:25:24 2006 From: xcsmith at rockwellcollins.com (xcsmith at rockwellcollins.com) Date: Mon, 7 Aug 2006 13:25:24 -0500 Subject: [Live-devel] Q: No RTCP Sink Avail. / Run-time stream changes Message-ID: 2 Questions: 1. Will the Live555 RTSP server have problems sending or receiving RTP data if I cannot provide a source/sink for RTCP packets? 2. I want to add and remove Streams while my application is running to allow for selectable file names. I notice in the testOnDemandRTSPServer application, all the streams are added prior to starting the environment loop. Can you tell me if there would be a problem if another thread setup streams for use with future sessions and added those streams to the environment while the environment was running? Would it be instead necessary to "wakeup" the environment thread and have that thread add the streams? Or is there a better method for this altogether? (I am aware of potential security problem with allowing selectable files.) Thanks very much! 
~medra From finlayson at live555.com Mon Aug 7 16:32:31 2006 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 7 Aug 2006 16:32:31 -0700 Subject: [Live-devel] Q: No RTCP Sink Avail. / Run-time stream changes In-Reply-To: References: Message-ID: >1. Will the Live555 RTSP server have problems sending or receiving RTP data >if I cannot provide a source/sink for RTCP packets? I don't really understand this question, but if you're asking whether you can make RTCP optional, then the answer is no. RTCP is a mandatory part of the RTP/RTCP standard. >2. I want to add and remove Streams while my application is running to >allow for selectable file names. I notice in the testOnDemandRTSPServer >application, all the streams are added prior to starting the environment >loop. Can you tell me if there would be a problem if another thread setup >streams for use with future sessions and added those streams to the >environment while the environment was running? You can add new "ServerMediaSession" objects to (or remove existing "ServerMediaSession" objects from) a running RTSPServer at any time. But you do this within the event loop - not using threads. Please, everybody, read the FAQ! -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From tchristensen at nordija.com Tue Aug 8 05:40:40 2006 From: tchristensen at nordija.com (Thomas Christensen) Date: Tue, 8 Aug 2006 14:40:40 +0200 Subject: [Live-devel] Possible to join multicast and extract jpegs? Message-ID: <501AD2B0-9683-4152-833A-4F04BE342244@nordija.com> Hi, I want to extract images from a multicast MPEG2 transport stream (TS). The TS is raw UDP and does not use RTP. I figure the required steps are the following: 1. Receive MPEG2-TS and keep a buffer that only keeps the latest couple of screens 2. Extract a key-frame from that buffer and save it as JPEG Regarding step 1, I've looked at the various testMPEG*VideoReceiver.cpp programs. They seem to require RTP and RTCP streams.
Does anyone know of a better place to start? Step 2, extracting JPEG. Has anyone done that? Thanks for producing a great library. cheers Thomas From finlayson at live555.com Tue Aug 8 14:13:51 2006 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 8 Aug 2006 14:13:51 -0700 Subject: [Live-devel] Possible to join multicast and extract jpegs? In-Reply-To: <501AD2B0-9683-4152-833A-4F04BE342244@nordija.com> References: <501AD2B0-9683-4152-833A-4F04BE342244@nordija.com> Message-ID: >Hi > >I want to extract images from a multicasted mpeg2 transport stream >TS. The TS is raw udp and does not use RTP. I figure the required >steps are the following: > >1. Receive MPEG2-TS and keep a buffer that only keeps the latest >couple of screens >2. Extract a key-frame from that buffer and save as jpeg Well, you've missed two *very* important steps in the middle of 2: - decode the MPEG-2 frame - reencode the decoded frame to JPEG > >Regarding step 1, I've looked at the various >testMPEG*VideoReceiver.cpp programs. They seem to require RTP and >RTCP streams. Does anyone know of a better place to start? For receiving the MPEG-2 TS/UDP stream (step 1), you could use our "BasicUDPSource" class. But apart from that, there's little that the existing "LIVE555 Streaming Media" software gives you (because it doesn't contain any decoding or encoding software). -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From terry1 at beam.ltd.uk Tue Aug 8 23:06:27 2006 From: terry1 at beam.ltd.uk (Terry Barnaby) Date: Wed, 09 Aug 2006 07:06:27 +0100 Subject: [Live-devel] Streaming H264 raw files using RTSP Message-ID: <44D97B63.4030302@beam.ltd.uk> Hi, I am a bit new to the live555 system .... I would like to use the library to stream raw H264 files (as generated with "ffmpeg -f h264") using the RTSP protocol. I have managed to stream MPEG4 ES files OK. I assume that support for streaming H264 files over RTSP is not quite there yet ?
If this is the case would I just have to implement: "H264VideoFileServerMediaSubsession" as a subclass of "FileServerMediaSubsession" which would use "H264VideoRTPSink" and a new class "H264VideoStreamFramerRaw" derived from "H264VideoStreamFramer" with the "H264VideoStreamFramerRaw" class packetising the H264 stream into NAL units ? Any guidance would be gratefully received ... Cheers Terry From huutribk2001 at yahoo.com Wed Aug 9 00:16:37 2006 From: huutribk2001 at yahoo.com (Tran Huu Tri) Date: Wed, 9 Aug 2006 00:16:37 -0700 (PDT) Subject: [Live-devel] live-devel Digest, Vol 34, Issue 3 In-Reply-To: Message-ID: <20060809071637.83872.qmail@web52308.mail.yahoo.com> Hi, I have a question about how to stream from a frame buffer: my input is not a file, but frames captured from a webcam in RGB format. I wrote a class modeled on the DeviceSource class, and it worked when the input was an .mpg file. But I want to use it to stream a webcam to an RTP server. What steps do I need to take to stream from an RGB buffer using this library? And can we stream from JPEG file inputs - can I change test.mpg to test.jpg in the test*Streamer examples? Thanks in advance! From TonyBai at viatech.com.cn Wed Aug 9 00:48:25 2006 From: TonyBai at viatech.com.cn (TonyBai at viatech.com.cn) Date: Wed, 9 Aug 2006 15:48:25 +0800 Subject: [Live-devel] A question about RTP in livestream Message-ID: <00DA072AF14656448C31F5C0182E7D0A72CFD6@exchsh01.viatech.com.sh> Hi, all Is the RTP in livestream over TCP or UDP, or both? If both are supported, how do I choose RTP over UDP? Thank you very much TonyBai -------------- next part -------------- An HTML attachment was scrubbed...
From finlayson at live555.com Wed Aug 9 01:17:01 2006 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 9 Aug 2006 01:17:01 -0700 Subject: [Live-devel] Streaming H264 raw files using RTSP In-Reply-To: <44D97B63.4030302@beam.ltd.uk> References: <44D97B63.4030302@beam.ltd.uk> Message-ID: >I assume that support for streaming H264 files over RTSP is not quite >there yet ? No - in part because there's no clear definition of what a "H264 file" means. > >If this is the case would I just have to implement: >"H264VideoFileServerMediaSubsession" as a subclass of >"FileServerMediaSubsession" which would use "H264VideoRTPSink" and >a new class "H264VideoStreamFramerRaw" derived from >"H264VideoStreamFramer" with the "H264VideoStreamFramerRaw" class >packetising the H264 stream into NAL units ? That's exactly right! -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From TonyBai at viatech.com.cn Wed Aug 9 01:36:36 2006 From: TonyBai at viatech.com.cn (TonyBai at viatech.com.cn) Date: Wed, 9 Aug 2006 16:36:36 +0800 Subject: [Live-devel] Re: A question about RTP in livestream Message-ID: <00DA072AF14656448C31F5C0182E7D0A72D050@exchsh01.viatech.com.sh> I know the answer~ Thank you _____ From: TonyBai at viatech.com.cn [mailto:TonyBai at viatech.com.cn] Sent: Wednesday, 9 August 2006 15:48 To: live-devel at ns.live555.com Subject: [Live-devel] A question about RTP in livestream Hi, all Is the RTP in livestream over TCP or UDP, or both? If it can be over both, how do I choose RTP over UDP? Thank you very much TonyBai -------------- next part -------------- An HTML attachment was scrubbed...
From terry1 at beam.ltd.uk Wed Aug 9 02:37:17 2006 From: terry1 at beam.ltd.uk (Terry Barnaby) Date: Wed, 09 Aug 2006 10:37:17 +0100 Subject: [Live-devel] Streaming H264 raw files using RTSP In-Reply-To: References: <44D97B63.4030302@beam.ltd.uk> Message-ID: <44D9ACCD.1040604@beam.ltd.uk> Ross Finlayson wrote: >>I assume that support for streaming H264 files over RTSP is not quite >>there yet ? > > > No - in part because there's no clear definition of what a "H264 file" means. > > >>If this is the case would I just have to implement: >>"H264VideoFileServerMediaSubsession" as a subclass of >>"FileServerMediaSubsession" which would use "H264VideoRTPSink" and >>a new class "H264VideoStreamFramerRaw" derived from >>"H264VideoStreamFramer" with the "H264VideoStreamFramerRaw" class >>packetising the H264 stream into NAL units ? > > > That's exactly right! Thanks for the info, I will have a go at implementing it. What file format are others using to store H264 video data ? Cheers Terry From TonyBai at viatech.com.cn Wed Aug 9 02:54:46 2006 From: TonyBai at viatech.com.cn (TonyBai at viatech.com.cn) Date: Wed, 9 Aug 2006 17:54:46 +0800 Subject: [Live-devel] Re: Streaming H264 raw files using RTSP Message-ID: <00DA072AF14656448C31F5C0182E7D0A72D12A@exchsh01.viatech.com.sh> The AVC file format is based on the ISO Base Media File Format. -----Original Message----- From: Terry Barnaby [mailto:terry1 at beam.ltd.uk] Sent: Wednesday, 9 August 2006 17:37 To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Streaming H264 raw files using RTSP Ross Finlayson wrote: >>I assume that support for streaming H264 files over RTSP is not quite >>there yet ? > > > No - in part because there's no clear definition of what a "H264 file" means.
> > >>If this is the case would I just have to implement: >>"H264VideoFileServerMediaSubsession" as a subclass of >>"FileServerMediaSubsession" which would use "H264VideoRTPSink" and >>a new class "H264VideoStreamFramerRaw" derived from >>"H264VideoStreamFramer" with the "H264VideoStreamFramerRaw" class >>packetising the H264 stream into NAL units ? > > > That's exactly right! Thanks for the info, I will have a go at implementing it. What file format are others using to store H264 video data ? Cheers Terry _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel From spacelis at gmail.com Wed Aug 9 05:15:04 2006 From: spacelis at gmail.com (SpaceLi) Date: Wed, 9 Aug 2006 20:15:04 +0800 Subject: [Live-devel] Why the testOnDemandRTSPServer show 0.0.0.0:8554 Message-ID: <4f8ff5c10608090515v3a1dc933k14e116788f459c0d@mail.gmail.com> I tried testOnDemandRTSPServer in testProgs, but the message showed 0.0.0.0 even though I have configured the interface (with ifconfig) to 192.168.0.128, and RealPlayer cannot connect to the server at rtsp://192.168.0.128:8554/mp3AudioTest. I built the code for nios2 uClinux. Are there any suggestions? From terry1 at beam.ltd.uk Wed Aug 9 06:03:31 2006 From: terry1 at beam.ltd.uk (Terry Barnaby) Date: Wed, 09 Aug 2006 14:03:31 +0100 Subject: [Live-devel] MPEG/H264 Analysis tools Message-ID: <44D9DD23.7010605@beam.ltd.uk> Hi, Does anyone know of any open source MPEG/H264 analysis tools to help me debug H264/MPEG streams ?
Cheers Terry From nitin.e at gmail.com Wed Aug 9 07:12:54 2006 From: nitin.e at gmail.com (nitin jain) Date: Wed, 9 Aug 2006 19:42:54 +0530 Subject: [Live-devel] receiving MPEG2 TS stream Message-ID: Hello everybody, I am not able to correctly receive the MPEG2 TS stream transmitted by the testMPEG2TransportStreamer.cpp demo program. For receiving, I am using sessionState.source = MPEG1or2VideoRTPSource::createNew(*env, &rtpGroupsock, 32, 90000); and I have set the RTP payload type to 33 in the receiver program, as: sessionState.source = MPEG1or2VideoRTPSource::createNew(*env, &rtpGroupsock, 33, 90000); Is this correct, or do I have to make any other changes? Thanks and Regards, Ness From finlayson at live555.com Wed Aug 9 12:31:04 2006 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 9 Aug 2006 12:31:04 -0700 Subject: [Live-devel] Why the testOnDemandRTSPServer show 0.0.0.0:8554 In-Reply-To: <4f8ff5c10608090515v3a1dc933k14e116788f459c0d@mail.gmail.com> References: <4f8ff5c10608090515v3a1dc933k14e116788f459c0d@mail.gmail.com> Message-ID: >I tried testOnDemandRTSPServer in testProgs. >But the message showed 0.0.0.0 This is happening because - for some unknown reason - the function "ourSourceAddressForMulticast()" (in "groupsock/Groupsock.cpp") is failing to find the computer's own IP address. You'll need to trace through that code, to figure out why it is not working on your computer. -- Ross Finlayson Live Networks, Inc.
http://www.live555.com/ From finlayson at live555.com Wed Aug 9 12:44:29 2006 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 9 Aug 2006 12:44:29 -0700 Subject: [Live-devel] receiving MPEG2 TS stream In-Reply-To: References: Message-ID: >I am not able to correctly receive the MPEG2 TS stream transmitted >by the testMPEG2TransportStreamer.cpp demo program. For receiving I >am using sessionState.source = MPEG1or2VideoRTPSource::createNew That's the mistake. The "MPEG1or2VideoRTPSource" class is for receiving MPEG Video *Elementary Stream* data - not Transport Stream data. To receive a MPEG-2 Transport Stream over RTP, you must first create a "SimpleRTPSource", and then pass its output to a "MPEG2TransportStreamFramer". For an example of this, see lines 722 through 726 of "liveMedia/MediaSession.cpp". (If you intend only to store the received data in a file, then you do not need to create the "MPEG2TransportStreamFramer". If, however, you want to render the data (in a media player, e.g.), then you should create the "MPEG2TransportStreamFramer", so that correct timestamps are generated.) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From spacelis at gmail.com Wed Aug 9 20:00:29 2006 From: spacelis at gmail.com (SpaceLi) Date: Thu, 10 Aug 2006 11:00:29 +0800 Subject: [Live-devel] Why the testOnDemandRTSPServer show 0.0.0.0:8554 In-Reply-To: References: <4f8ff5c10608090515v3a1dc933k14e116788f459c0d@mail.gmail.com> Message-ID: <4f8ff5c10608092000n495b030es274f226f29212f1d@mail.gmail.com> I have read that part of the code; it seems to use a multicast IP to figure out ourSourceAddress. But the OS does not support multicast on my board - or at least I am not sure that multicast works properly on the system. And I am not very interested in multicast. Is there any way to make it work without multicast? Even a static IP specified in the code would be acceptable. Thanks for the reply. On 8/10/06, Ross Finlayson wrote: > >I tried testOnDemandRTSPServer in testProgs.
> >But the message showed 0.0.0.0 > > This is happening because - for some unknown reason - the function > "ourSourceAddressForMulticast()" (in "groupsock/Groupsock.cpp") is > failing to find the computer's own IP address. > > You'll need to trace through that code, to figure out why it is not > working on your computer. > -- > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > From finlayson at live555.com Wed Aug 9 21:23:06 2006 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 9 Aug 2006 21:23:06 -0700 Subject: [Live-devel] Why the testOnDemandRTSPServer show 0.0.0.0:8554 In-Reply-To: <4f8ff5c10608092000n495b030es274f226f29212f1d@mail.gmail.com> References: <4f8ff5c10608090515v3a1dc933k14e116788f459c0d@mail.gmail.com> <4f8ff5c10608092000n495b030es274f226f29212f1d@mail.gmail.com> Message-ID: >I have read that part of code, it seems to use multicast ip to figure >out ourSourceAddress. But the os does not support multicast on my >board or I am not sure if multicast could work properly on the system. For many systems, getting multicast to work requires only that you add a route for 224.0.0/4 (for your local network interface). >And I am not very interest in multicast. Is there any way to make it >without multicast. If multicast doesn't work, then the code uses "gethostname()" followed by "gethostbyname()" to try to figure out your IP address. If neither method works, then I suppose you could hard-wire your address into the implementation of "ourSourceAddressForMulticast()". -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From dbikash at gmail.com Wed Aug 9 23:43:10 2006 From: dbikash at gmail.com (Deeptendu Bikash) Date: Thu, 10 Aug 2006 12:13:10 +0530 Subject: [Live-devel] How to extract AudioMuxConfig from an ADTS AAC file? 
Message-ID: <29758ec70608092343x2e0e4d9fja8edf7220e7d3212@mail.gmail.com> Hi, How can I extract the AudioMuxConfig from an ADTS AAC file? The ISO 14496-3 document gives the bitstream syntax of AudioMuxConfig and also describes the ADTS file format, but I cannot derive the information from there. Thanks, Deeptendu -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20060809/d94d655d/attachment.html From dbikash at gmail.com Thu Aug 10 00:06:36 2006 From: dbikash at gmail.com (Deeptendu Bikash) Date: Thu, 10 Aug 2006 12:36:36 +0530 Subject: [Live-devel] How to extract AudioMuxConfig from an ADTS AAC file? In-Reply-To: <29758ec70608092343x2e0e4d9fja8edf7220e7d3212@mail.gmail.com> References: <29758ec70608092343x2e0e4d9fja8edf7220e7d3212@mail.gmail.com> Message-ID: <29758ec70608100006o43b8b034sa16c16b3f218e6b1@mail.gmail.com> Sorry, I meant to ask AudioSpecificConfig. On 8/10/06, Deeptendu Bikash wrote: > > Hi, > > How can I extract the AudioMuxConfig from an ADTS AAC file? The ISO > 14496-3 document gives the bitstream syntax of AudioMuxConfig and also > describes the ADTS file format, but I cannot derive the information from > there. > > Thanks, > Deeptendu > -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20060810/ca92da42/attachment.html From finlayson at live555.com Thu Aug 10 01:10:44 2006 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 10 Aug 2006 01:10:44 -0700 Subject: [Live-devel] How to extract [AudioSpecificConfig] from an ADTS AAC file? In-Reply-To: <29758ec70608092343x2e0e4d9fja8edf7220e7d3212@mail.gmail.com> References: <29758ec70608092343x2e0e4d9fja8edf7220e7d3212@mail.gmail.com> Message-ID: >How can I extract the [AudioSpecificConfig] from an ADTS AAC file? The "ADTSAudioFileSource" class does this automatically. You don't need to do anything more yourself. 
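For readers who want to see what "does this automatically" involves: for AAC the 2-byte AudioSpecificConfig can be assembled directly from three fields of the ADTS header. A minimal sketch follows - the function name adtsToAudioSpecificConfig is hypothetical, for illustration only; the real logic lives inside live555's "ADTSAudioFileSource" implementation:

```cpp
#include <cstdint>
#include <utility>

// Build the 2-byte AudioSpecificConfig from ADTS header fields.
// Note: the ADTS "profile" field is audioObjectType - 1
// (e.g. profile 1 == AAC-LC, audio object type 2).
std::pair<uint8_t, uint8_t> adtsToAudioSpecificConfig(unsigned profile,
                                                      unsigned samplingFrequencyIndex,
                                                      unsigned channelConfiguration) {
  unsigned audioObjectType = profile + 1;
  // Layout: 5 bits AOT | 4 bits sampling frequency index | 4 bits channels | 3 bits zero
  uint8_t byte1 = (audioObjectType << 3) | (samplingFrequencyIndex >> 1);
  uint8_t byte2 = ((samplingFrequencyIndex & 1) << 7) | (channelConfiguration << 3);
  return {byte1, byte2};
}
```

For example, an AAC-LC file (ADTS profile 1) at 44.1 kHz (sampling frequency index 4) in stereo (channel configuration 2) yields the familiar config bytes 0x12 0x10.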
-- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From TonyBai at viatech.com.cn Thu Aug 10 02:03:42 2006 From: TonyBai at viatech.com.cn (TonyBai at viatech.com.cn) Date: Thu, 10 Aug 2006 17:03:42 +0800 Subject: [Live-devel] Question about QoS Message-ID: <00DA072AF14656448C31F5C0182E7D0A72D72F@exchsh01.viatech.com.sh> Hi, all Is there QoS (Quality of Service) in live streaming media? From finlayson at live555.com Thu Aug 10 02:24:21 2006 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 10 Aug 2006 02:24:21 -0700 Subject: [Live-devel] Question about QoS In-Reply-To: <00DA072AF14656448C31F5C0182E7D0A72D72F@exchsh01.viatech.com.sh> References: <00DA072AF14656448C31F5C0182E7D0A72D72F@exchsh01.viatech.com.sh> Message-ID: >Is there QoS (Quality of Service) in live streaming media? This is a very vague question - but the answer is probably "no". -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From darnold at futurec.net Thu Aug 10 07:17:28 2006 From: darnold at futurec.net (David Arnold) Date: Thu, 10 Aug 2006 07:17:28 -0700 Subject: [Live-devel] Behavior of DeviceSource::doGetNextFrame() when no data available In-Reply-To: Message-ID: What should a DeviceSource, such as a frame grabber, do in doGetNextFrame() when it determines there is no frame available? I find no answer in the FAQ. The comment in DeviceSource.cpp reads: // This must be done in a non-blocking fashion - i.e., so that we // return immediately from this function even if no data is // currently available.
The problem is, if I just return, as in: doGetNextFrame() { if (grabNextFrame() == NO_FRAME_AVAILABLE) return; } the task scheduler seems to stop. It appears as though I must invoke afterGetting(this). If I must invoke afterGetting(), what do I set all the expected output parameters (such as fFrameSize, etc.) to? This also raises questions about timing. doGetNextFrame() runs because the interval fired. How do I keep presentation times for the next interval correct? In the past, my call to grabNextFrame() blocked in the case where no frame was available. When I added an audio stream to the video, the blocking call seemed to cause all kinds of timing problems and eventually VLC and QuickTime fail to play the stream. Thank you, Dave Arnold Future Concepts The information contained in this electronic mail transmission is intended only for the use of the individual or entity named above and is privileged and confidential. If you are not the intended recipient, please do not read, copy, use or disclose this communication to others. Any dissemination, distribution or copying of this communication other than to the person or entity named above is strictly prohibited. If you have received this communication in error, please immediately delete it from your system. From clem.taylor at gmail.com Thu Aug 10 08:53:44 2006 From: clem.taylor at gmail.com (Clem Taylor) Date: Thu, 10 Aug 2006 11:53:44 -0400 Subject: [Live-devel] Behavior of DeviceSource::doGetNextFrame() when no data available In-Reply-To: References: Message-ID: On 8/10/06, David Arnold wrote: > What should a DeviceSource, such as a frame grabber, do in doGetNextFrame() > when it determines there is no frame available? I find no answer in the > FAQ. The comment in DeviceSource.cpp reads: You need to reschedule a retry.
In my device source (which I copied from one of the examples), after reading and validating the frame from the device it uses scheduleDelayedTask with a 0 delay to schedule a call to 'afterGetting'. If a frame is not available it uses scheduleDelayedTask to retry reading a frame from the device at some point in the future. If you have a read selectable file descriptor for your device then there is a more efficient method to do this. For example: /* poll again in 5ms */ nextTask() = envir().taskScheduler().scheduleDelayedTask ( 5 * 1000, (TaskFunc*) dspRetryGet, this ); --Clem From finlayson at live555.com Thu Aug 10 10:57:49 2006 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 10 Aug 2006 10:57:49 -0700 Subject: [Live-devel] Behavior of DeviceSource::doGetNextFrame() when no data available In-Reply-To: References: Message-ID: >On 8/10/06, David Arnold wrote: >> What should a DeviceSource, such as a frame grabber, do in doGetNextFrame() >> when it determines there is no frame available? I find no answer in the >> FAQ. The comment in DeviceSource.cpp reads: > >You need to reschedule a retry. > >In my device source (which I copied from one of the examples), after >reading and validating the frame from the device it uses >scheduleDelayedTask with a 0 delay to schedule a call to >'afterGetting'. If a frame is not available it uses >scheduleDelayedTask to retry reading a frame from the device at some >point in the future. If you have a read selectable file descriptor for >your device then there is a more efficient method to do this. > >For example: > /* poll again in 5ms */ > nextTask() = envir().taskScheduler().scheduleDelayedTask ( > 5 * 1000, (TaskFunc*) dspRetryGet, this ); This works, but can be inefficient - because it polls frequently, and because it incurs a delay of at least 5 ms (in your example), even if new frame data becomes available almost immediately. 
A better way is to make the availability of new frame data an 'event', and just handle this in the event loop. If your frame grabber is implemented as a readable open file (i.e., socket), then you can just call "envir().taskScheduler().turnOnBackgroundReadHandling( ...)" on it. This will schedule a handler function that gets called (via "select()") within the event loop. If your frame grabber is *not* a readable socket, then you need to do something else to trigger the arrival of new frame data as being an 'event'. One simple way to do this is to use the "watchVariable" parameter to "doEventLoop()". (Look at the code, and the FAQ.) If your custom events are more complicated than can be handled using the "watchVariable", then you would need to write your own event loop - i.e., your own subclass of "TaskScheduler" - that handles them. >The problem is if I just return, as in: > >doGetNextFrame() >{ > if (grabNextFrame() == NO_FRAME_AVAILABLE) > return; >} > >The task scheduler seems to stop. The task scheduler doesn't 'stop', it is just waiting until events happen. That's why you need to make the availability of new frame data an 'event'. Apart from that, your code above is correct. > It appears as though I must invoke >afterGetting(this). If I must invoke afterGetting() what to I set all the >expected output parameters to (such as fFrameSize, etc) to. You must invoke "afterGetting()" *only* after you have new data that you have just delivered to the downstream reader - i.e., only after you have copied new data to "fTo", and set "fFrameSize", etc. -- Ross Finlayson Live Networks, Inc. 
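To make the delivery contract concrete, here is a sketch of just the deliver-one-frame step, with the live555 machinery stubbed out. FakeGrabber and deliverFrame are made-up names for illustration; fTo, fMaxSize, fFrameSize and fNumTruncatedBytes correspond to the real FramedSource members that must be filled in before calling afterGetting():

```cpp
#include <cstddef>
#include <cstdint>
#include <cstring>

// Hypothetical stand-in for a frame-grabber device (not live555 API).
struct FakeGrabber {
  const uint8_t* pending = nullptr;  // frame waiting to be delivered, if any
  size_t pendingSize = 0;
};

// Sketch of the delivery step: copy into the sink's buffer, set the output
// parameters, and only then would the caller invoke afterGetting().
// Returns false if no frame is available (the caller must then wait for an
// 'event' or reschedule a retry - it must NOT call afterGetting()).
bool deliverFrame(FakeGrabber& grabber, uint8_t* fTo, size_t fMaxSize,
                  size_t& fFrameSize, size_t& fNumTruncatedBytes) {
  if (grabber.pending == nullptr) return false;  // no data yet
  fFrameSize = grabber.pendingSize;
  fNumTruncatedBytes = 0;
  if (fFrameSize > fMaxSize) {          // never overrun the sink's buffer
    fNumTruncatedBytes = fFrameSize - fMaxSize;
    fFrameSize = fMaxSize;
  }
  memcpy(fTo, grabber.pending, fFrameSize);
  grabber.pending = nullptr;            // frame consumed
  return true;                          // caller now invokes afterGetting()
}
```

The key point this illustrates: afterGetting() is called exactly once per delivered frame, after the copy and the output parameters are done; when deliverFrame() returns false, nothing is signalled and the event loop simply waits.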
http://www.live555.com/ From hartman at videolan.org Thu Aug 10 17:04:17 2006 From: hartman at videolan.org (Derk-Jan Hartman) Date: Fri, 11 Aug 2006 02:04:17 +0200 Subject: [Live-devel] [vlc-devel] Re: [vlc] Illegal characters in RTSP stream user/password mechanism In-Reply-To: <6.2.3.4.1.20050805225003.02254210@localhost> References: <42f3cee8176a07.21356515@freemail.gr> <6.2.3.4.1.20050805225003.02254210@localhost> Message-ID: <6ABAC197-FE96-4976-A1A9-C5697A340523@videolan.org> On 6-aug-2005, at 8:09, Ross Finlayson wrote: >> It seems that when there is an illegal character in the user/password >> authentication mechanism of an RTSP URL, then VLC fails to deliver >> the content. >> >> For example, for user "user" and password "passw@rd" the RTSP URL >> is: rtsp://user:passw@rd@somesite/somecatalog/file.rm and that's >> something that causes VideoLAN not to show the stream. > > The problem here is caused by the RTSP client code's parsing of > optional "<username>:<password>@" fields in "rtsp://" URLs (in the > LIVE.COM libraries). It stops at the first '@' (or '/'). > > Unfortunately, there doesn't seem to be any good way to fix this, > because of the fact that the '@', '/' and '.' characters can also > appear in the stream name portion of the URL (i.e., after the host > name or address). I don't believe that this can be parsed > unambiguously. Basically, "<username>:<password>@" is a non-standard > hack, and one that we shouldn't really be using in any case. > > The real solution to this problem is to fix VLC so that - if a RTSP > (or HTTP!) authentication failure occurs - it will then pop up a > dialog box in which users can enter a <username>, <password> pair. > Then, retry using that <username> and <password>. > > This is what QuickTime Player and other media players do; VLC > should be doing it as well.
> > If the VLC developers would implement such a dialog box, and use it > for HTTP authentication, then I'll add the appropriate code to the > LIVE.COM libraries and "modules/demux/livedotcom.cpp" to also use > it for RTSP authentication. Just let me know... Since we now have the interaction framework this could be possible. https://trac.videolan.org/vlc/browser/trunk/modules/access/http.c#L279 So Ross :D how are we gonna do this :D DJ BTW I'm rewriting some parts of our live555 module to be a bit more readable at the moment; that made me remember this issue. From TonyBai at viatech.com.cn Thu Aug 10 19:23:57 2006 From: TonyBai at viatech.com.cn (TonyBai at viatech.com.cn) Date: Fri, 11 Aug 2006 10:23:57 +0800 Subject: [Live-devel] question about buffer Message-ID: <00DA072AF14656448C31F5C0182E7D0A72D9C1@exchsh01.viatech.com.sh> Hi, all Can I control the buffer of RTP data, and how? Thank you, Tony From spacelis at gmail.com Thu Aug 10 21:54:43 2006 From: spacelis at gmail.com (SpaceLi) Date: Fri, 11 Aug 2006 12:54:43 +0800 Subject: [Live-devel] Why the testOnDemandRTSPServer show 0.0.0.0:8554 In-Reply-To: References: <4f8ff5c10608090515v3a1dc933k14e116788f459c0d@mail.gmail.com> <4f8ff5c10608092000n495b030es274f226f29212f1d@mail.gmail.com> Message-ID: <4f8ff5c10608102154r7bbb5330g2597f9a9a4153437@mail.gmail.com> Thank you for your advice, I have successfully set up IP multicast on uClinux, and the address is now right. But I still couldn't connect to the server; is there any simple way to check whether the server is set up correctly? When I tried to capture the packets between server and client, I found that RealPlayer always sends "OPTION" packets with an incorrect checksum. So I don't know which is wrong, server or client.
thx On 8/10/06, Ross Finlayson wrote: > >I have read that part of code, it seems to use multicast ip to figure > >out ourSourceAddress. But the os does not support multicast on my > >board or I am not sure if multicast could work properly on the system. > > For many systems, getting multicast to work requires only that you > add a route for 224.0.0/4 (for your local network interface). > > >And I am not very interest in multicast. Is there any way to make it > >without multicast. > > If multicast doesn't work, then the code uses "gethostname()" > followed by "gethostbyname()" to try to figure out your IP address. > If neither method works, then I suppose you could hard-wire your > address into the implementation of "ourSourceAddressForMulticast()". > -- > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > From finlayson at live555.com Thu Aug 10 22:33:22 2006 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 10 Aug 2006 22:33:22 -0700 Subject: [Live-devel] Why the testOnDemandRTSPServer show 0.0.0.0:8554 In-Reply-To: <4f8ff5c10608102154r7bbb5330g2597f9a9a4153437@mail.gmail.com> References: <4f8ff5c10608090515v3a1dc933k14e116788f459c0d@mail.gmail.com> <4f8ff5c10608092000n495b030es274f226f29212f1d@mail.gmail.com> <4f8ff5c10608102154r7bbb5330g2597f9a9a4153437@mail.gmail.com> Message-ID: >Thank you for your advice, I have secessfully setup ip multicast on >uclinux, the address is now right. > But I still couldn't link to server, is there any simple way to >check whether the server setup correctly. > When I tried to capture the packets between server and client, I >found realplayer always send "OPTION" packets with incorrect checksum. >So I don't know who is wrong, server or client. Your client - realplayer - is wrong: It is badly non-standards compliant. 
Don't bother using it - instead, use VLC, or QuickTime Player. You can also use the "openRTSP" client - especially with the "-V" (verbose output) option. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From huutribk2001 at yahoo.com Fri Aug 11 03:25:17 2006 From: huutribk2001 at yahoo.com (Tran Huu Tri) Date: Fri, 11 Aug 2006 03:25:17 -0700 (PDT) Subject: [Live-devel] How to stream a jpeg buffer? Message-ID: <20060811102517.99401.qmail@web52310.mail.yahoo.com> Hi, I need to stream JPEG frames from a buffer. I implemented a frame source derived from JPEGVideoSource, as class MyJPEGFramedSource : public JPEGVideoSource, but in doGetNextFrame I don't know how to encode the buffer in the correct format to copy into the fTo variable. I already have the JPEG image data; what steps do I need to take before calling doGetNextFrame to stream it? I tried to follow the DeviceSource code to stream a JPEG image buffer (captured from a webcam and converted to JPEG format), but I don't know what the DeviceParameters type is. Please help me stream from a frame-buffer input instead of the file input used in the example programs. Thanks in advance! From tire.tw at gmail.com Fri Aug 11 17:05:55 2006 From: tire.tw at gmail.com (Tire) Date: Sat, 12 Aug 2006 08:05:55 +0800 Subject: [Live-devel] fix the RTP port and the RTCP port Message-ID: <7992557a0608111705g75a34922v958c1db261001439@mail.gmail.com> In the live.2006.08.07 version, you can choose server port numbers starting with a specific port number. Can we send and receive packets through a fixed RTP port and a fixed RTCP port? -------------- next part -------------- An HTML attachment was scrubbed...
From weiyutao36 at 163.com Sat Aug 12 00:08:03 2006 From: weiyutao36 at 163.com (weiyutao36) Date: Sat, 12 Aug 2006 15:08:03 +0800 (CST) Subject: [Live-devel] About detection of network connection Message-ID: <44DD7E53.000078.21745@bj163app20.163.com> Hi everyone, I am using the live library. When the network connection is down, I want to continuously send some command/request to the RTSP server in order to know when the network connection is OK again; if it is OK, I will continue receiving data from the server. Does anyone know which function I can use for this? Any advice will be appreciated. ================== FilexBlue From hartman at videolan.org Sun Aug 13 08:01:07 2006 From: hartman at videolan.org (Derk-Jan Hartman) Date: Sun, 13 Aug 2006 17:01:07 +0200 Subject: [Live-devel] H264 FU-A Message-ID: <2BD95C26-5839-4286-874D-ADF58FFA1116@videolan.org> Hey Ross, I'm looking into H264 a bit more. I was wondering: I cannot find what liveMedia does with "incomplete" FU-A or FU-B packets. As far as I know, if an FU-A or FU-B packet becomes "broken", the receiver should discard all data up to the next packet, and may pass along what it has already received, as long as it marks the NALU as invalid (or discards it completely, of course). Is this happening? I'm wondering why I need to feed the liveMedia stream through our packetizer. Either ffmpeg is just very intolerant of faults in H264, or something is wrong with the data fed from RTSP.
DJ From atariq at gmail.com Sun Aug 13 08:21:41 2006 From: atariq at gmail.com (Adnan) Date: Sun, 13 Aug 2006 17:21:41 +0200 Subject: [Live-devel] How to slow down the RTP packets sent by server Message-ID: <9519eabd0608130821h366f0f44u14b24287fd9a3fc6@mail.gmail.com> Hi, Is it possible to slow down the rate of RTP packets sent by the server (test*Streamer) by using the RTCP instance of the client (test*Receiver)? If so, could somebody please give me a starting point? regards adnan From finlayson at live555.com Sun Aug 13 08:40:56 2006 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 13 Aug 2006 08:40:56 -0700 Subject: [Live-devel] H264 FU-A In-Reply-To: <2BD95C26-5839-4286-874D-ADF58FFA1116@videolan.org> References: <2BD95C26-5839-4286-874D-ADF58FFA1116@videolan.org> Message-ID: >I'm looking into H264 a bit more. >I was wondering. I cannot find what liveMedia does with "incomplete" >FU-A or FU-B packets. (I assume you're referring to *receiving* H.264/RTP packets.) The receiving code ("H264VideoRTPSource", and its parent class "MultiFramedRTPSource") treats NAL units (including FU-A and FU-B) as being discrete 'frames'. Their data is not delivered to the downstream reader unless *all* fragments are received. (If any fragment is lost, then the entire NAL unit is discarded.) -- Ross Finlayson Live Networks, Inc.
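For anyone inspecting this behaviour on the wire: the FU-A headers involved are easy to decode by hand. A minimal sketch of FU-A parsing per RFC 3984 - FuaInfo and parseFuA are hypothetical names for illustration, not live555 API:

```cpp
#include <cstdint>

struct FuaInfo {
  bool start;                      // S bit: first fragment of the NAL unit
  bool end;                        // E bit: last fragment of the NAL unit
  uint8_t reconstructedNalHeader;  // first byte of the reassembled NAL unit
};

// Parse the two FU-A header bytes (RFC 3984). The FU indicator carries the
// F and NRI bits plus payload type 28; the FU header carries the S/E flags
// and the original NAL unit type. The reassembled unit's first byte combines
// the F+NRI bits of the indicator with the type from the FU header.
FuaInfo parseFuA(uint8_t fuIndicator, uint8_t fuHeader) {
  FuaInfo info;
  info.start = (fuHeader & 0x80) != 0;
  info.end   = (fuHeader & 0x40) != 0;
  info.reconstructedNalHeader = (fuIndicator & 0xE0) | (fuHeader & 0x1F);
  return info;
}
```

For example, a fragmented IDR slice (NAL type 5, NRI 3) arrives with FU indicator 0x7C; on the first fragment the FU header is 0x85 (S set), and the reconstructed NAL header byte is 0x65. A receiver following Ross's description buffers fragments from S to E and drops the whole unit if any fragment in between is missing.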
http://www.live555.com/ From finlayson at live555.com Sun Aug 13 08:43:13 2006 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 13 Aug 2006 08:43:13 -0700 Subject: [Live-devel] How to slow down the RTP packets sent by server In-Reply-To: <9519eabd0608130821h366f0f44u14b24287fd9a3fc6@mail.gmail.com> References: <9519eabd0608130821h366f0f44u14b24287fd9a3fc6@mail.gmail.com> Message-ID: >Is it possible to slow down the rate of RTP packets sent by server >(test*Streamer) by using RTCP Instance of client (test*Receiver) . RTCP is not used for this. Instead, the object that feeds into a "RTPSink" (subclass) should simply set "fDurationInMicroseconds" appropriately. (Of course, if you intend the data - when received - to be rendered in a media player, then you should also set "fPresentationTime" to match.) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From schramm at i4.informatik.rwth-aachen.de Mon Aug 14 06:05:55 2006 From: schramm at i4.informatik.rwth-aachen.de (Martin Schramm) Date: Mon, 14 Aug 2006 15:05:55 +0200 Subject: [Live-devel] Problem with MPEG2 video streaming and a long GoP Message-ID: <200608141505.55136.schramm@i4.informatik.rwth-aachen.de> Hello everybody, for testing purposes and for measurements I try to stream a MPEG video file consisting of MPEG2 video and MPEG2 audio using liveMedia included in the Network-Integrated Multimedia Middleware (NMM)(www.networkmultimedia.org). The framerate is 25 fps and the GoP length is 100 to get an I-Frame every 4 seconds. The problem is now that the streaming does not work properly and the video is not played. There seems to exist a problem computing the timestamps for the video part. The synchronization of NMM drops many packets, because there are too late. Using local playback with NMM the video works, hence it should be no problem with parts of NMM. I attach a text file with debug output I inserted into MPEG1or2VideoRTPSink and MPEG1or2VideoStreamFramer. 
The value of "fCurPicTemporalReference" must be in the range 0 to 99/100 if the length of a GoP is 100, I think. Nevertheless, the maximal value is below 50. With a mpeg file using a GoP length of 15 there is no problem and the value is in the range from 0 to 15. I'm using the version from 2006.08.07 of liveMedia. Has anyone an idea what the problem in this case is? Thanks in advance. Martin -------------- next part -------------- Framer: Parsing video sequence header. Framer: parsing VSH: framerate: 25 fPicturesSinceLastGop: 0 Framer: VSH computed timestamp:1.15556e+09 Sink: SequenceHeader found. Sink: setting timestamp: 1155557927 sec and 660669 usec. Framer: parsing GoP header. Framer: slice: fCurPicTemporalReference: 0 Framer: GoP computed timestamp:1.15556e+09 Sink: setting timestamp: 1155557927 sec and 660669 usec. Framer: parsing picture header. Framer: picture coding type: 1 Framer: picture: fCurPicTemporalReference: 0 Framer: picture computed timestamp:1.15556e+09 Sink: Picture start code found. Sink: setting timestamp: 1155557927 sec and 660669 usec. Framer: slice: fCurPicTemporalReference: 0 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice ... Framer: slice: fCurPicTemporalReference: 17 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557928 sec and 340669 usec. Framer: slice: fCurPicTemporalReference: 17 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557928 sec and 340669 usec. Framer: slice: fCurPicTemporalReference: 17 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557928 sec and 340669 usec. Framer: parsing picture header. Framer: picture coding type: 2 Framer: picture: fCurPicTemporalReference: 21 Framer: picture computed timestamp:1.15556e+09 Sink: Picture start code found. Sink: P-Frame of falltrough. Sink: setting timestamp: 1155557928 sec and 500669 usec. 
Framer: slice: fCurPicTemporalReference: 21 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557928 sec and 500669 usec. Framer: slice: fCurPicTemporalReference: 21 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557928 sec and 500669 usec. Framer: slice: fCurPicTemporalReference: 21 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557928 sec and 500669 usec. Framer: slice: fCurPicTemporalReference: 21 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557928 sec and 500669 usec. Framer: slice: fCurPicTemporalReference: 21 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557928 sec and 500669 usec. Framer: slice: fCurPicTemporalReference: 21 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557928 sec and 500669 usec. Framer: slice: fCurPicTemporalReference: 21 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557928 sec and 500669 usec. Framer: slice: fCurPicTemporalReference: 21 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557928 sec and 500669 usec. Framer: slice: fCurPicTemporalReference: 21 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557928 sec and 500669 usec. Framer: parsing picture header. Framer: picture coding type: 3 Framer: picture: fCurPicTemporalReference: 19 Framer: picture computed timestamp:1.15556e+09 Sink: Picture start code found. Sink: B-Frame found. Sink: P-Frame of falltrough. Sink: setting timestamp: 1155557928 sec and 420669 usec. Framer: slice: fCurPicTemporalReference: 19 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557928 sec and 420669 usec. 
Framer: slice: fCurPicTemporalReference: 19 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557928 sec and 420669 usec. Framer: slice: fCurPicTemporalReference: 19 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557928 sec and 420669 usec. Framer: slice: fCurPicTemporalReference: 19 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557928 sec and 420669 usec. Framer: slice: fCurPicTemporalReference: 19 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557928 sec and 420669 usec. Framer: slice: fCurPicTemporalReference: 19 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice ... Framer: parsing picture header. Framer: picture coding type: 3 Framer: picture: fCurPicTemporalReference: 47 Framer: picture computed timestamp:1.15556e+09 Sink: Picture start code found. Sink: B-Frame found. Sink: P-Frame of falltrough. Sink: setting timestamp: 1155557929 sec and 540669 usec. Framer: slice: fCurPicTemporalReference: 47 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 540669 usec. Framer: slice: fCurPicTemporalReference: 47 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 540669 usec. Framer: slice: fCurPicTemporalReference: 47 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 540669 usec. Framer: slice: fCurPicTemporalReference: 47 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 540669 usec. Framer: slice: fCurPicTemporalReference: 47 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 540669 usec. Framer: slice: fCurPicTemporalReference: 47 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 540669 usec. 
Framer: slice: fCurPicTemporalReference: 47 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 540669 usec. Framer: slice: fCurPicTemporalReference: 47 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 540669 usec. Framer: slice: fCurPicTemporalReference: 47 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 540669 usec. Framer: Parsing video sequence header. Framer: parsing VSH: framerate: 25 fPicturesSinceLastGop: 49 Framer: VSH computed timestamp:1.15556e+09 Sink: SequenceHeader found. Sink: setting timestamp: 1155557929 sec and 620669 usec. Framer: parsing GoP header. Framer: slice: fCurPicTemporalReference: 0 Framer: GoP computed timestamp:1.15556e+09 Sink: setting timestamp: 1155557929 sec and 620669 usec. Framer: parsing picture header. Framer: picture coding type: 1 Framer: picture: fCurPicTemporalReference: 2 Framer: picture computed timestamp:1.15556e+09 Sink: Picture start code found. Sink: setting timestamp: 1155557929 sec and 700668 usec. Framer: slice: fCurPicTemporalReference: 2 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 700668 usec. Framer: slice: fCurPicTemporalReference: 2 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 700668 usec. Framer: slice: fCurPicTemporalReference: 2 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 700668 usec. Framer: slice: fCurPicTemporalReference: 2 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 700668 usec. Framer: slice: fCurPicTemporalReference: 2 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 700668 usec. 
Framer: slice: fCurPicTemporalReference: 2 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 700668 usec. Framer: slice: fCurPicTemporalReference: 2 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 700668 usec. Framer: slice: fCurPicTemporalReference: 2 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 700668 usec. Framer: slice: fCurPicTemporalReference: 2 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 700668 usec. Framer: parsing picture header. Framer: picture coding type: 3 Framer: picture: fCurPicTemporalReference: 0 Framer: picture computed timestamp:1.15556e+09 Sink: Picture start code found. Sink: B-Frame found. Sink: P-Frame of falltrough. Sink: setting timestamp: 1155557929 sec and 620669 usec. Framer: slice: fCurPicTemporalReference: 0 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 620669 usec. Framer: slice: fCurPicTemporalReference: 0 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 620669 usec. Framer: slice: fCurPicTemporalReference: 0 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 620669 usec. Framer: slice: fCurPicTemporalReference: 0 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 620669 usec. Framer: slice: fCurPicTemporalReference: 0 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 620669 usec. Framer: slice: fCurPicTemporalReference: 0 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 620669 usec. 
Framer: slice: fCurPicTemporalReference: 0 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 620669 usec. Framer: slice: fCurPicTemporalReference: 0 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 620669 usec. Framer: slice: fCurPicTemporalReference: 0 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 620669 usec. Framer: parsing picture header. Framer: picture coding type: 3 Framer: picture: fCurPicTemporalReference: 1 Framer: picture computed timestamp:1.15556e+09 Sink: Picture start code found. Sink: B-Frame found. Sink: P-Frame of falltrough. Sink: setting timestamp: 1155557929 sec and 660669 usec. Framer: slice: fCurPicTemporalReference: 1 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 660669 usec. Framer: slice: fCurPicTemporalReference: 1 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 660669 usec. Framer: slice: fCurPicTemporalReference: 1 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 660669 usec. Framer: slice: fCurPicTemporalReference: 1 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 660669 usec. Framer: slice: fCurPicTemporalReference: 1 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 660669 usec. Framer: slice: fCurPicTemporalReference: 1 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 660669 usec. Framer: slice: fCurPicTemporalReference: 1 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 660669 usec. 
Framer: slice: fCurPicTemporalReference: 1 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 660669 usec. Framer: slice: fCurPicTemporalReference: 1 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 660669 usec. Framer: Parsing video sequence header. Framer: parsing VSH: framerate: 25 fPicturesSinceLastGop: 3 Framer: VSH computed timestamp:1.15556e+09 Sink: SequenceHeader found. Sink: setting timestamp: 1155557929 sec and 740669 usec. Framer: parsing GoP header. Framer: slice: fCurPicTemporalReference: 0 Framer: GoP computed timestamp:1.15556e+09 Sink: setting timestamp: 1155557929 sec and 740669 usec. Framer: parsing picture header. Framer: picture coding type: 1 Framer: picture: fCurPicTemporalReference: 2 Framer: picture computed timestamp:1.15556e+09 Sink: Picture start code found. Sink: setting timestamp: 1155557929 sec and 820669 usec. Framer: slice: fCurPicTemporalReference: 2 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 820669 usec. Framer: slice: fCurPicTemporalReference: 2 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 820669 usec. Framer: slice: fCurPicTemporalReference: 2 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 820669 usec. Framer: slice: fCurPicTemporalReference: 2 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 820669 usec. Framer: slice: fCurPicTemporalReference: 2 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 820669 usec. Framer: slice: fCurPicTemporalReference: 2 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 820669 usec. 
Framer: slice: fCurPicTemporalReference: 2 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 820669 usec. Framer: slice: fCurPicTemporalReference: 2 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 820669 usec. Framer: slice: fCurPicTemporalReference: 2 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 820669 usec. Framer: parsing picture header. Framer: picture coding type: 3 Framer: picture: fCurPicTemporalReference: 0 Framer: picture computed timestamp:1.15556e+09 Sink: Picture start code found. Sink: B-Frame found. Sink: P-Frame of falltrough. Sink: setting timestamp: 1155557929 sec and 740669 usec. Framer: slice: fCurPicTemporalReference: 0 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 740669 usec. Framer: slice: fCurPicTemporalReference: 0 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 740669 usec. Framer: slice: fCurPicTemporalReference: 0 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 740669 usec. Framer: slice: fCurPicTemporalReference: 0 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 740669 usec. Framer: slice: fCurPicTemporalReference: 0 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 740669 usec. Framer: slice: fCurPicTemporalReference: 0 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 740669 usec. Framer: slice: fCurPicTemporalReference: 0 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 740669 usec. 
Framer: slice: fCurPicTemporalReference: 0 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 740669 usec. Framer: slice: fCurPicTemporalReference: 0 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 740669 usec. Framer: parsing picture header. Framer: picture coding type: 3 Framer: picture: fCurPicTemporalReference: 1 Framer: picture computed timestamp:1.15556e+09 Sink: Picture start code found. Sink: B-Frame found. Sink: P-Frame of falltrough. Sink: setting timestamp: 1155557929 sec and 780669 usec. Framer: slice: fCurPicTemporalReference: 1 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 780669 usec. Framer: slice: fCurPicTemporalReference: 1 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 780669 usec. Framer: slice: fCurPicTemporalReference: 1 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 780669 usec. Framer: slice: fCurPicTemporalReference: 1 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 780669 usec. Framer: slice: fCurPicTemporalReference: 1 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 780669 usec. Framer: slice: fCurPicTemporalReference: 1 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 780669 usec. Framer: slice: fCurPicTemporalReference: 1 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 780669 usec. Framer: slice: fCurPicTemporalReference: 1 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 780669 usec. 
Framer: slice: fCurPicTemporalReference: 1 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 780669 usec. Framer: Parsing video sequence header. Framer: parsing VSH: framerate: 25 fPicturesSinceLastGop: 3 Framer: VSH computed timestamp:1.15556e+09 Sink: SequenceHeader found. Sink: setting timestamp: 1155557929 sec and 860669 usec. Framer: parsing GoP header. Framer: slice: fCurPicTemporalReference: 0 Framer: GoP computed timestamp:1.15556e+09 Sink: setting timestamp: 1155557929 sec and 860669 usec. Framer: parsing picture header. Framer: picture coding type: 1 Framer: picture: fCurPicTemporalReference: 2 Framer: picture computed timestamp:1.15556e+09 Sink: Picture start code found. Sink: setting timestamp: 1155557929 sec and 940669 usec. Framer: slice: fCurPicTemporalReference: 2 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 940669 usec. Framer: slice: fCurPicTemporalReference: 2 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 940669 usec. Framer: slice: fCurPicTemporalReference: 2 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 940669 usec. Framer: slice: fCurPicTemporalReference: 2 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 940669 usec. Framer: slice: fCurPicTemporalReference: 2 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 940669 usec. Framer: slice: fCurPicTemporalReference: 2 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 940669 usec. Framer: slice: fCurPicTemporalReference: 2 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 940669 usec. 
Framer: slice: fCurPicTemporalReference: 2 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 940669 usec. Framer: slice: fCurPicTemporalReference: 2 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 940669 usec. Framer: parsing picture header. Framer: picture coding type: 3 Framer: picture: fCurPicTemporalReference: 0 Framer: picture computed timestamp:1.15556e+09 Sink: Picture start code found. Sink: B-Frame found. Sink: P-Frame of falltrough. Sink: setting timestamp: 1155557929 sec and 860669 usec. Framer: slice: fCurPicTemporalReference: 0 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 860669 usec. Framer: slice: fCurPicTemporalReference: 0 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 860669 usec. Framer: slice: fCurPicTemporalReference: 0 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 860669 usec. Framer: slice: fCurPicTemporalReference: 0 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 860669 usec. Framer: slice: fCurPicTemporalReference: 0 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 860669 usec. Framer: slice: fCurPicTemporalReference: 0 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 860669 usec. Framer: slice: fCurPicTemporalReference: 0 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 860669 usec. Framer: slice: fCurPicTemporalReference: 0 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 860669 usec. 
Framer: slice: fCurPicTemporalReference: 0 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 860669 usec. Framer: parsing picture header. Framer: picture coding type: 3 Framer: picture: fCurPicTemporalReference: 1 Framer: picture computed timestamp:1.15556e+09 Sink: Picture start code found. Sink: B-Frame found. Sink: P-Frame of falltrough. Sink: setting timestamp: 1155557929 sec and 900669 usec. Framer: slice: fCurPicTemporalReference: 1 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 900669 usec. Framer: slice: fCurPicTemporalReference: 1 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 900669 usec. Framer: slice: fCurPicTemporalReference: 1 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 900669 usec. Framer: slice: fCurPicTemporalReference: 1 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 900669 usec. Framer: slice: fCurPicTemporalReference: 1 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 900669 usec. Framer: slice: fCurPicTemporalReference: 1 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 900669 usec. Framer: slice: fCurPicTemporalReference: 1 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 900669 usec. Framer: slice: fCurPicTemporalReference: 1 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 900669 usec. Framer: slice: fCurPicTemporalReference: 1 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 900669 usec. Framer: parsing picture header. 
Framer: picture coding type: 2 Framer: picture: fCurPicTemporalReference: 5 Framer: picture computed timestamp:1.15556e+09 Sink: Picture start code found. Sink: P-Frame of falltrough. Sink: setting timestamp: 1155557930 sec and 60669 usec. Framer: slice: fCurPicTemporalReference: 5 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557930 sec and 60669 usec. Framer: slice: fCurPicTemporalReference: 5 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557930 sec and 60669 usec. Framer: slice: fCurPicTemporalReference: 5 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557930 sec and 60669 usec. Framer: slice: fCurPicTemporalReference: 5 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557930 sec and 60669 usec. Framer: slice: fCurPicTemporalReference: 5 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557930 sec and 60669 usec. Framer: slice: fCurPicTemporalReference: 5 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557930 sec and 60669 usec. Framer: slice: fCurPicTemporalReference: 5 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557930 sec and 60669 usec. Framer: slice: fCurPicTemporalReference: 5 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557930 sec and 60669 usec. Framer: slice: fCurPicTemporalReference: 5 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557930 sec and 60669 usec. Framer: parsing picture header. Framer: picture coding type: 3 Framer: picture: fCurPicTemporalReference: 3 Framer: picture computed timestamp:1.15556e+09 Sink: Picture start code found. Sink: B-Frame found. Sink: P-Frame of falltrough. Sink: setting timestamp: 1155557929 sec and 980669 usec. 
Framer: slice: fCurPicTemporalReference: 3 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 980669 usec. Framer: slice: fCurPicTemporalReference: 3 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 980669 usec. Framer: slice: fCurPicTemporalReference: 3 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 980669 usec. Framer: slice: fCurPicTemporalReference: 3 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 980669 usec. Framer: slice: fCurPicTemporalReference: 3 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 980669 usec. Framer: slice: fCurPicTemporalReference: 3 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 980669 usec. Framer: slice: fCurPicTemporalReference: 3 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 980669 usec. Framer: slice: fCurPicTemporalReference: 3 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 980669 usec. Framer: slice: fCurPicTemporalReference: 3 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557929 sec and 980669 usec. Framer: parsing picture header. Framer: picture coding type: 3 Framer: picture: fCurPicTemporalReference: 4 Framer: picture computed timestamp:1.15556e+09 Sink: Picture start code found. Sink: B-Frame found. Sink: P-Frame of falltrough. Sink: setting timestamp: 1155557930 sec and 20669 usec. Framer: slice: fCurPicTemporalReference: 4 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557930 sec and 20669 usec. 
Framer: slice: fCurPicTemporalReference: 4 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557930 sec and 20669 usec. Framer: slice: fCurPicTemporalReference: 4 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557930 sec and 20669 usec. Framer: slice: fCurPicTemporalReference: 4 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557930 sec and 20669 usec. Framer: slice: fCurPicTemporalReference: 4 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557930 sec and 20669 usec. Framer: slice: fCurPicTemporalReference: 4 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557930 sec and 20669 usec. Framer: slice: fCurPicTemporalReference: 4 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557930 sec and 20669 usec. Framer: slice: fCurPicTemporalReference: 4 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557930 sec and 20669 usec. Framer: slice: fCurPicTemporalReference: 4 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155557930 sec and 20669 usec. Framer: parsing picture header. Framer: picture coding type: 2 Framer: picture: fCurPicTemporalReference: 8 Framer: picture computed timestamp:1.15556e+09 From schramm at i4.informatik.rwth-aachen.de Mon Aug 14 06:22:58 2006 From: schramm at i4.informatik.rwth-aachen.de (Martin Schramm) Date: Mon, 14 Aug 2006 15:22:58 +0200 Subject: [Live-devel] Problem with MPEG2 video streaming and a long GoP In-Reply-To: <200608141505.55136.schramm@i4.informatik.rwth-aachen.de> References: <200608141505.55136.schramm@i4.informatik.rwth-aachen.de> Message-ID: <200608141522.58974.schramm@i4.informatik.rwth-aachen.de> Sorry, I made a mistake. The used video had a GoP length of 50 (I've tested other GoP lengths). 
Here is the output of a video with GoP length 100. The values of fCurPicTemporalReference are now greater than 50 :-) I attach a new file with debug output as the problem remains. Am Montag, 14. August 2006 15:05 schrieb Martin Schramm: > Hello everybody, > > for testing purposes and for measurements I try to stream a MPEG video file > consisting of MPEG2 video and MPEG2 audio using liveMedia included in the > Network-Integrated Multimedia Middleware (NMM)(www.networkmultimedia.org). > The framerate is 25 fps and the GoP length is 100 to get an I-Frame every 4 > seconds. The problem is now that the streaming does not work properly and > the video is not played. > There seems to exist a problem computing the timestamps for the video part. > The synchronization of NMM drops many packets, because there are too late. > Using local playback with NMM the video works, hence it should be no > problem with parts of NMM. > I attach a text file with debug output I inserted into MPEG1or2VideoRTPSink > and MPEG1or2VideoStreamFramer. > The value of "fCurPicTemporalReference" must be in the range 0 to 99/100 if > the length of a GoP is 100, I think. Nevertheless, the maximal value is > below 50. > With a mpeg file using a GoP length of 15 there is no problem and the value > is in the range from 0 to 15. > I'm using the version from 2006.08.07 of liveMedia. > > Has anyone an idea what the problem in this case is? > > Thanks in advance. > Martin -------------- next part -------------- Framer: Parsing video sequence header. Framer: parsing VSH: framerate: 25 fPicturesSinceLastGop: 0 Framer: VSH computed timestamp:1.15556e+09 Sink: SequenceHeader found. Sink: setting timestamp: 1155561567 sec and 820707 usec. Framer: parsing GoP header. Framer: slice: fCurPicTemporalReference: 0 Framer: GoP computed timestamp:1.15556e+09 Sink: setting timestamp: 1155561567 sec and 820707 usec. Framer: parsing picture header. 
Framer: picture coding type: 1 Framer: picture: fCurPicTemporalReference: 0 Framer: picture computed timestamp:1.15556e+09 Sink: Picture start code found. Sink: setting timestamp: 1155561567 sec and 820707 usec. Framer: slice: fCurPicTemporalReference: 0 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561567 sec and 820707 usec. Framer: slice: fCurPicTemporalReference: 0 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561567 sec and 820707 usec. Framer: slice: fCurPicTemporalReference: 0 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561567 sec and 820707 usec. Framer: slice: fCurPicTemporalReference: 0 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561567 sec and 820707 usec. Framer: slice: fCurPicTemporalReference: 0 Framer: Slice computed timestamp:1.15556e+09 Sink: setting timestamp: 1155561573 sec and 540707 usec. Framer: slice: fCurPicTemporalReference: 88 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 540707 usec. Framer: slice: fCurPicTemporalReference: 88 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 540707 usec. Framer: slice: fCurPicTemporalReference: 88 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 540707 usec. Framer: slice: fCurPicTemporalReference: 88 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 540707 usec. Framer: slice: fCurPicTemporalReference: 88 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 540707 usec. Framer: slice: fCurPicTemporalReference: 88 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 540707 usec. 
Framer: slice: fCurPicTemporalReference: 88 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 540707 usec. Framer: slice: fCurPicTemporalReference: 88 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 540707 usec. Framer: parsing picture header. Framer: picture coding type: 2 Framer: picture: fCurPicTemporalReference: 92 Framer: picture computed timestamp:1.15556e+09 Sink: Picture start code found. Sink: P-Frame of falltrough. Sink: setting timestamp: 1155561573 sec and 700707 usec. Framer: slice: fCurPicTemporalReference: 92 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 700707 usec. Framer: slice: fCurPicTemporalReference: 92 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 700707 usec. Framer: slice: fCurPicTemporalReference: 92 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 700707 usec. Framer: slice: fCurPicTemporalReference: 92 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 700707 usec. Framer: slice: fCurPicTemporalReference: 92 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 700707 usec. Framer: slice: fCurPicTemporalReference: 92 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 700707 usec. Framer: slice: fCurPicTemporalReference: 92 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 700707 usec. Framer: slice: fCurPicTemporalReference: 92 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 700707 usec. 
Framer: slice: fCurPicTemporalReference: 92 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 700707 usec. Framer: parsing picture header. Framer: picture coding type: 3 Framer: picture: fCurPicTemporalReference: 90 Framer: picture computed timestamp:1.15556e+09 Sink: Picture start code found. Sink: B-Frame found. Sink: P-Frame of falltrough. Sink: setting timestamp: 1155561573 sec and 620706 usec. Framer: slice: fCurPicTemporalReference: 90 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 620706 usec. Framer: slice: fCurPicTemporalReference: 90 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 620706 usec. Framer: slice: fCurPicTemporalReference: 90 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 620706 usec. Framer: slice: fCurPicTemporalReference: 90 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 620706 usec. Framer: slice: fCurPicTemporalReference: 90 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 620706 usec. Framer: slice: fCurPicTemporalReference: 90 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 620706 usec. Framer: slice: fCurPicTemporalReference: 90 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 620706 usec. Framer: slice: fCurPicTemporalReference: 90 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 620706 usec. Framer: slice: fCurPicTemporalReference: 90 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 620706 usec. Framer: parsing picture header. 
Framer: picture coding type: 3 Framer: picture: fCurPicTemporalReference: 91 Framer: picture computed timestamp:1.15556e+09 Sink: Picture start code found. Sink: B-Frame found. Sink: P-Frame of falltrough. Sink: setting timestamp: 1155561573 sec and 660706 usec. Framer: slice: fCurPicTemporalReference: 91 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 660706 usec. Framer: slice: fCurPicTemporalReference: 91 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 660706 usec. Framer: slice: fCurPicTemporalReference: 91 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 660706 usec. Framer: slice: fCurPicTemporalReference: 91 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 660706 usec. Framer: slice: fCurPicTemporalReference: 91 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 660706 usec. Framer: slice: fCurPicTemporalReference: 91 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 660706 usec. Framer: slice: fCurPicTemporalReference: 91 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 660706 usec. Framer: slice: fCurPicTemporalReference: 91 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 660706 usec. Framer: slice: fCurPicTemporalReference: 91 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 660706 usec. Framer: parsing picture header. Framer: picture coding type: 2 Framer: picture: fCurPicTemporalReference: 95 Framer: picture computed timestamp:1.15556e+09 Sink: Picture start code found. Sink: P-Frame of falltrough. Sink: setting timestamp: 1155561573 sec and 820707 usec. 
Framer: slice: fCurPicTemporalReference: 95 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 820707 usec. Framer: slice: fCurPicTemporalReference: 95 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 820707 usec. Framer: slice: fCurPicTemporalReference: 95 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 820707 usec. Framer: slice: fCurPicTemporalReference: 95 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 820707 usec. Framer: slice: fCurPicTemporalReference: 95 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 820707 usec. Framer: slice: fCurPicTemporalReference: 95 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 820707 usec. Framer: slice: fCurPicTemporalReference: 95 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 820707 usec. Framer: slice: fCurPicTemporalReference: 95 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 820707 usec. Framer: slice: fCurPicTemporalReference: 95 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 820707 usec. Framer: parsing picture header. Framer: picture coding type: 3 Framer: picture: fCurPicTemporalReference: 93 Framer: picture computed timestamp:1.15556e+09 Sink: Picture start code found. Sink: B-Frame found. Sink: P-Frame of falltrough. Sink: setting timestamp: 1155561573 sec and 740707 usec. Framer: slice: fCurPicTemporalReference: 93 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 740707 usec. 
Framer: slice: fCurPicTemporalReference: 93 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 740707 usec. Framer: slice: fCurPicTemporalReference: 93 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 740707 usec. Framer: slice: fCurPicTemporalReference: 93 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 740707 usec. Framer: slice: fCurPicTemporalReference: 93 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 740707 usec. Framer: slice: fCurPicTemporalReference: 93 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 740707 usec. Framer: slice: fCurPicTemporalReference: 93 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 740707 usec. Framer: slice: fCurPicTemporalReference: 93 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 740707 usec. Framer: slice: fCurPicTemporalReference: 93 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 740707 usec. Framer: parsing picture header. Framer: picture coding type: 3 Framer: picture: fCurPicTemporalReference: 94 Framer: picture computed timestamp:1.15556e+09 Sink: Picture start code found. Sink: B-Frame found. Sink: P-Frame of falltrough. Sink: setting timestamp: 1155561573 sec and 780707 usec. Framer: slice: fCurPicTemporalReference: 94 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 780707 usec. Framer: slice: fCurPicTemporalReference: 94 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 780707 usec. 
Framer: slice: fCurPicTemporalReference: 94 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 780707 usec. Framer: slice: fCurPicTemporalReference: 94 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 780707 usec. Framer: slice: fCurPicTemporalReference: 94 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 780707 usec. Framer: slice: fCurPicTemporalReference: 94 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 780707 usec. Framer: slice: fCurPicTemporalReference: 94 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 780707 usec. Framer: slice: fCurPicTemporalReference: 94 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 780707 usec. Framer: slice: fCurPicTemporalReference: 94 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 780707 usec. Framer: parsing picture header. Framer: picture coding type: 2 Framer: picture: fCurPicTemporalReference: 98 Framer: picture computed timestamp:1.15556e+09 Sink: Picture start code found. Sink: P-Frame of falltrough. Sink: setting timestamp: 1155561573 sec and 940706 usec. Framer: slice: fCurPicTemporalReference: 98 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 940706 usec. Framer: slice: fCurPicTemporalReference: 98 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 940706 usec. Framer: slice: fCurPicTemporalReference: 98 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 940706 usec. 
Framer: slice: fCurPicTemporalReference: 98 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 940706 usec. Framer: slice: fCurPicTemporalReference: 98 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 940706 usec. Framer: slice: fCurPicTemporalReference: 98 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 940706 usec. Framer: slice: fCurPicTemporalReference: 98 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 940706 usec. Framer: slice: fCurPicTemporalReference: 98 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 940706 usec. Framer: slice: fCurPicTemporalReference: 98 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 940706 usec. Framer: parsing picture header. Framer: picture coding type: 3 Framer: picture: fCurPicTemporalReference: 96 Framer: picture computed timestamp:1.15556e+09 Sink: Picture start code found. Sink: B-Frame found. Sink: P-Frame of falltrough. Sink: setting timestamp: 1155561573 sec and 860706 usec. Framer: slice: fCurPicTemporalReference: 96 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 860706 usec. Framer: slice: fCurPicTemporalReference: 96 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 860706 usec. Framer: slice: fCurPicTemporalReference: 96 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 860706 usec. Framer: slice: fCurPicTemporalReference: 96 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 860706 usec. 
Framer: slice: fCurPicTemporalReference: 96 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 860706 usec. Framer: slice: fCurPicTemporalReference: 96 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 860706 usec. Framer: slice: fCurPicTemporalReference: 96 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 860706 usec. Framer: slice: fCurPicTemporalReference: 96 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 860706 usec. Framer: slice: fCurPicTemporalReference: 96 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 860706 usec. Framer: parsing picture header. Framer: picture coding type: 3 Framer: picture: fCurPicTemporalReference: 97 Framer: picture computed timestamp:1.15556e+09 Sink: Picture start code found. Sink: B-Frame found. Sink: P-Frame of falltrough. Sink: setting timestamp: 1155561573 sec and 900706 usec. Framer: slice: fCurPicTemporalReference: 97 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 900706 usec. Framer: slice: fCurPicTemporalReference: 97 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 900706 usec. Framer: slice: fCurPicTemporalReference: 97 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 900706 usec. Framer: slice: fCurPicTemporalReference: 97 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 900706 usec. Framer: slice: fCurPicTemporalReference: 97 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 900706 usec. 
Framer: slice: fCurPicTemporalReference: 97 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 900706 usec. Framer: slice: fCurPicTemporalReference: 97 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 900706 usec. Framer: slice: fCurPicTemporalReference: 97 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 900706 usec. Framer: slice: fCurPicTemporalReference: 97 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561573 sec and 900706 usec. Framer: Parsing video sequence header. Framer: parsing VSH: framerate: 25 fPicturesSinceLastGop: 99 Framer: VSH computed timestamp:1.15556e+09 Sink: SequenceHeader found. Sink: setting timestamp: 1155561573 sec and 980706 usec. Framer: parsing GoP header. Framer: slice: fCurPicTemporalReference: 0 Framer: GoP computed timestamp:1.15556e+09 Sink: setting timestamp: 1155561573 sec and 980707 usec. Framer: parsing picture header. Framer: picture coding type: 1 Framer: picture: fCurPicTemporalReference: 2 Framer: picture computed timestamp:1.15556e+09 Sink: Picture start code found. Sink: setting timestamp: 1155561574 sec and 60707 usec. Framer: slice: fCurPicTemporalReference: 2 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561574 sec and 60707 usec. Framer: slice: fCurPicTemporalReference: 2 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561574 sec and 60707 usec. Framer: slice: fCurPicTemporalReference: 2 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561574 sec and 60707 usec. Framer: slice: fCurPicTemporalReference: 2 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561574 sec and 60707 usec. 
Framer: slice: fCurPicTemporalReference: 2 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561574 sec and 60707 usec. Framer: slice: fCurPicTemporalReference: 2 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561574 sec and 60707 usec. Framer: slice: fCurPicTemporalReference: 2 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561574 sec and 60707 usec. Framer: slice: fCurPicTemporalReference: 2 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561574 sec and 60707 usec. Framer: slice: fCurPicTemporalReference: 2 Framer: Slice computed timestamp:1.15556e+09 Sink: Slice Sink: setting timestamp: 1155561574 sec and 60707 usec. Framer: parsing picture header. Framer: picture coding type: 3 Framer: picture: fCurPicTemporalReference: 0 Framer: picture computed timestamp:1.15556e+09 Sink: Picture start code found. Sink: B-Frame found. Sink: P-Frame of falltrough. From jlazar at xperts.hu Mon Aug 14 08:52:25 2006 From: jlazar at xperts.hu (Joseph Lazar) Date: Mon, 14 Aug 2006 17:52:25 +0200 Subject: [Live-devel] Finding out the hostname in GroupsockHelper... 
In-Reply-To: <7.0.0.16.1.20051214125115.01ef9e80@live555.com> References: <4395AD7C.2090407@xperts.hu> <6.2.3.4.1.20051206095257.03232c60@localhost> <4395E499.3020704@xperts.hu> <6.2.3.4.1.20051206170907.03237a20@localhost> <4396A6DB.5090105@xperts.hu> <6.2.3.4.1.20051207012433.03248080@localhost> <4396D579.6070904@xperts.hu> <6.2.3.4.1.20051207063922.03243060@localhost> <439E91EF.9090602@xperts.hu> <7.0.0.16.1.20051213014202.01ee0ac8@live555.com> <439E9F01.9050504@xperts.hu> <7.0.0.16.1.20051214003636.01f27578@live555.com> <43A00601.2040709@xperts.hu> <036E396D-CAC0-4533-9564-39B5EE16AEA1@videolan.org> <7.0.0.16.1.20051214125115.01ef9e80@live555.com> Message-ID: <44E09C39.4000309@xperts.hu> Hi Ross, In GroupsockHelper.cpp (ourSourceAddressForMulticast) you start finding out our own host IP via sending out a multicast packet with a 5 sec timeout. In most cases this will run to a timeout, while the gethostname() or gethostbyname() will always work and much faster than the previous. Couldn't it be a better solution to have those before the multicast stuff. Now we always start the stream 5 secs later (practically a 5 secs delay every time we open a session). -- joseph From finlayson at live555.com Mon Aug 14 07:41:05 2006 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 14 Aug 2006 07:41:05 -0700 Subject: [Live-devel] Problem with MPEG2 video streaming and a long GoP In-Reply-To: <200608141522.58974.schramm@i4.informatik.rwth-aachen.de> References: <200608141505.55136.schramm@i4.informatik.rwth-aachen.de> <200608141522.58974.schramm@i4.informatik.rwth-aachen.de> Message-ID: Can you reproduce the problem using "testOnDemandRTSPServer" as the server, with VLC as the client? If so, please put a copy of your "test.mpg" file online, and send us the URL, so we can take a look at it. -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ From finlayson at live555.com Mon Aug 14 09:45:15 2006 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 14 Aug 2006 09:45:15 -0700 Subject: [Live-devel] Finding out the hostname in GroupsockHelper... In-Reply-To: <44E09C39.4000309@xperts.hu> References: <4395AD7C.2090407@xperts.hu> <6.2.3.4.1.20051206095257.03232c60@localhost> <4395E499.3020704@xperts.hu> <6.2.3.4.1.20051206170907.03237a20@localhost> <4396A6DB.5090105@xperts.hu> <6.2.3.4.1.20051207012433.03248080@localhost> <4396D579.6070904@xperts.hu> <6.2.3.4.1.20051207063922.03243060@localhost> <439E91EF.9090602@xperts.hu> <7.0.0.16.1.20051213014202.01ee0ac8@live555.com> <439E9F01.9050504@xperts.hu> <7.0.0.16.1.20051214003636.01f27578@live555.com> <43A00601.2040709@xperts.hu> <036E396D-CAC0-4533-9564-39B5EE16AEA1@videolan.org> <7.0.0.16.1.20051214125115.01ef9e80@live555.com> <44E09C39.4000309@xperts.hu> Message-ID: >In GroupsockHelper.cpp (ourSourceAddressForMulticast) you start finding >out our own host IP via sending out a multicast packet with a 5 sec >timeout. In most cases this will run to a timeout That's not true at all. In most cases, the multicast packet will get received immediately. If that's not happening for you, then perhaps you don't have multicast configured properly on your system. >, while the >gethostname() or gethostbyname() will always work That's not true either. There are several systems (especially 'embedded' systems) for which the gethostname()/gethostbyname() combination will not work (because they don't have a 'host name'). > Couldn't it be a better solution to have those before the >multicast stuff. Feel free to change this in your copy of the source code, if you wish... -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ From darnold at futurec.net Mon Aug 14 16:50:37 2006 From: darnold at futurec.net (David Arnold) Date: Mon, 14 Aug 2006 16:50:37 -0700 Subject: [Live-devel] live-devel Digest, Vol 34, Issue 6 In-Reply-To: Message-ID: Ross, could you give us sample pseudo-code that demonstrates the event method you describe? How would this new EventLoop interact with FramedSource-derived classes and MultiFramedRTPSink-derived classes? The interface to the H.264 encoder I am using requires that I call DSP_GetFrame() at correct intervals, or else it will always return frames at 30 fps. Thank you, Dave Arnold Future Concepts, La Verne From: Ross Finlayson Subject: Re: [Live-devel] Behavior of DeviceSource::doGetNextFrame() when no data available To: LIVE555 Streaming Media - development & use Message-ID: Content-Type: text/plain; charset="us-ascii" ; format="flowed" >On 8/10/06, David Arnold wrote: >> What should a DeviceSource, such as a frame grabber, do in doGetNextFrame() >> when it determines there is no frame available? I find no answer in the >> FAQ. The comment in DeviceSource.cpp reads: > >You need to reschedule a retry. > >In my device source (which I copied from one of the examples), after >reading and validating the frame from the device it uses >scheduleDelayedTask with a 0 delay to schedule a call to >'afterGetting'. If a frame is not available it uses >scheduleDelayedTask to retry reading a frame from the device at some >point in the future. If you have a read-selectable file descriptor for >your device then there is a more efficient method to do this. > >For example: > /* poll again in 5ms */ > nextTask() = envir().taskScheduler().scheduleDelayedTask ( > 5 * 1000, (TaskFunc*) dspRetryGet, this ); This works, but can be inefficient - because it polls frequently, and because it incurs a delay of at least 5 ms (in your example), even if new frame data becomes available almost immediately. 
A better way is to make the availability of new frame data an 'event', and just handle this in the event loop. If your frame grabber is implemented as a readable open file (i.e., socket), then you can just call "envir().taskScheduler().turnOnBackgroundReadHandling( ...)" on it. This will schedule a handler function that gets called (via "select()") within the event loop. If your frame grabber is *not* a readable socket, then you need to do something else to treat the arrival of new frame data as an 'event'. One simple way to do this is to use the "watchVariable" parameter to "doEventLoop()". (Look at the code, and the FAQ.) If your custom events are more complicated than can be handled using the "watchVariable", then you would need to write your own event loop - i.e., your own subclass of "TaskScheduler" - that handles them. >The problem is if I just return, as in: > >doGetNextFrame() >{ > if (grabNextFrame() == NO_FRAME_AVAILABLE) > return; >} > >The task scheduler seems to stop. The task scheduler doesn't 'stop', it is just waiting until events happen. That's why you need to make the availability of new frame data an 'event'. Apart from that, your code above is correct. > It appears as though I must invoke >afterGetting(this). If I must invoke afterGetting(), what do I set all the >expected output parameters (such as fFrameSize, etc.) to? You must invoke "afterGetting()" *only* after you have new data that you have just delivered to the downstream reader - i.e., only after you have copied new data to "fTo", and set "fFrameSize", etc. -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ ------------------------------ Message: 3 Date: Fri, 11 Aug 2006 02:04:17 +0200 From: Derk-Jan Hartman Subject: Re: [Live-devel] [vlc-devel] Re: [vlc] Illegal characters in RTSP stream user/password mechanism To: vlc-devel at videolan.org Cc: LIVE555 Streaming Media - development & use Message-ID: <6ABAC197-FE96-4976-A1A9-C5697A340523 at videolan.org> Content-Type: text/plain; charset=US-ASCII; delsp=yes; format=flowed On 6-aug-2005, at 8:09, Ross Finlayson wrote: >> It seems that when there is an illegal character in the user/password >> authentication mechanism of an RTSP URL, then VLC fails to deliver >> the content. >> >> For example, for user "user" and password "passw@rd" the RTSP URL >> is: rtsp://user:passw@rd@somesite/somecatalog/file.rm and that's >> something that causes VideoLAN not to show the stream. > > The problem here is caused by the RTSP client code's parsing of > optional "<user>:<password>@" fields in "rtsp://" URLs (in the > LIVE.COM libraries). It stops at the first '@' (or '/'). > > Unfortunately, there doesn't seem to be any good way to fix this, > because of the fact that the '@', '/' and '.' characters can also > appear in the stream name portion of the URL (i.e., after the host > name or address). I don't believe that this can be parsed > unambiguously. Basically, "<user>:<password>@" is a non-standard > hack, and one that we shouldn't really be using in any case. > > The real solution to this problem is to fix VLC so that - if an RTSP > (or HTTP!) authentication failure occurs - it will then pop up a > dialog box in which users can enter a <username>, <password> pair. > Then, retry using that <username> and <password>. > > This is what QuickTime Player and other media players do; VLC > should be doing it as well. > > If the VLC developers would implement such a dialog box, and use it > for HTTP authentication, then I'll add the appropriate code to the > LIVE.COM libraries and "modules/demux/livedotcom.cpp" to also use > it for RTSP authentication. Just let me know... 
Since we now have the interaction framework this could be possible. https://trac.videolan.org/vlc/browser/trunk/modules/access/http.c#L279 So Ross :D how are we gonna do this :D DJ BTW I'm rewriting some parts of our live555 module to be a bit more readable atm; that made me remember this issue. ------------------------------ Message: 4 Date: Fri, 11 Aug 2006 10:23:57 +0800 From: Subject: [Live-devel] question about buffer To: Message-ID: <00DA072AF14656448C31F5C0182E7D0A72D9C1 at exchsh01.viatech.com.sh> Content-Type: text/plain; charset="us-ascii" Hi all, Can I control the buffer of RTP data, and how? Thank you, Tony -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20060810/c5ac9987/attachment-0001.html ------------------------------ Message: 5 Date: Fri, 11 Aug 2006 12:54:43 +0800 From: SpaceLi Subject: Re: [Live-devel] Why the testOnDemandRTSPServer show 0.0.0.0:8554 To: "LIVE555 Streaming Media - development & use" Message-ID: <4f8ff5c10608102154r7bbb5330g2597f9a9a4153437 at mail.gmail.com> Content-Type: text/plain; charset=ISO-8859-1; format=flowed Thank you for your advice, I have successfully set up IP multicast on uclinux, the address is now right. But I still couldn't connect to the server; is there any simple way to check whether the server is set up correctly? When I tried to capture the packets between server and client, I found realplayer always sends "OPTION" packets with incorrect checksum. So I don't know who is wrong, server or client. thx On 8/10/06, Ross Finlayson wrote: > >I have read that part of code, it seems to use multicast ip to figure > >out ourSourceAddress. But the os does not support multicast on my > >board or I am not sure if multicast could work properly on the system. > > For many systems, getting multicast to work requires only that you > add a route for 224.0.0/4 (for your local network interface). > > >And I am not very interested in multicast. 
Is there any way to make it > >without multicast. > > If multicast doesn't work, then the code uses "gethostname()" > followed by "gethostbyname()" to try to figure out your IP address. > If neither method works, then I suppose you could hard-wire your > address into the implementation of "ourSourceAddressForMulticast()". > -- > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > ------------------------------ Message: 6 Date: Thu, 10 Aug 2006 22:33:22 -0700 From: Ross Finlayson Subject: Re: [Live-devel] Why the testOnDemandRTSPServer show 0.0.0.0:8554 To: LIVE555 Streaming Media - development & use Message-ID: Content-Type: text/plain; charset="us-ascii" ; format="flowed" >Thank you for your advice, I have successfully set up IP multicast on >uclinux, the address is now right. > But I still couldn't connect to the server; is there any simple way to >check whether the server is set up correctly? > When I tried to capture the packets between server and client, I >found realplayer always sends "OPTION" packets with incorrect checksum. >So I don't know who is wrong, server or client. Your client - realplayer - is wrong: It is badly non-standards-compliant. Don't bother using it - instead, use VLC, or QuickTime Player. You can also use the "openRTSP" client - especially with the "-V" (verbose output) option. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ ------------------------------ Message: 7 Date: Fri, 11 Aug 2006 03:25:17 -0700 (PDT) From: Tran Huu Tri Subject: [Live-devel] How to stream jpeg buffer? 
To: live-devel at ns.live555.com Message-ID: <20060811102517.99401.qmail at web52310.mail.yahoo.com> Content-Type: text/plain; charset=iso-8859-1 Hi, I need to stream JPEG buffer frames. I implemented a frame source that derives from JPEGVideoSource, as class MyJPEGFramedSource : public JPEGVideoSource, but in doGetNextFrame I don't know how to encode the buffer with the correct format to copy to the fTo variable. I have the JPEG image format already; what steps do I need to do next before calling the doGetNextFrame function to stream? I tried to follow the DeviceSource code to stream the JPEG image buffer (that I capture from a webcam in JPG format), but I don't know what the DeviceParameters type is. Please help me with streaming using buffer frame input instead of file input as in the example programs. Thanks in advance! __________________________________________________ Do You Yahoo!? Tired of spam? Yahoo! Mail has the best spam protection around http://mail.yahoo.com ------------------------------ Message: 8 Date: Sat, 12 Aug 2006 08:05:55 +0800 From: Tire Subject: [Live-devel] fix the RTP port and the RTCP port To: live-devel at ns.live555.com Message-ID: <7992557a0608111705g75a34922v958c1db261001439 at mail.gmail.com> Content-Type: text/plain; charset="iso-8859-1" In the live.2006.08.07 version, you can choose server port numbers starting with a specific port number. Can we use a fixed RTP port and a fixed RTCP port to send and receive packets? -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20060811/0b3ffa9a/attachment-0001.html ------------------------------ Message: 9 Date: Sat, 12 Aug 2006 15:08:03 +0800 (CST) From: "weiyutao36" Subject: [Live-devel] About detection of network connection To: Message-ID: <44DD7E53.000078.21745 at bj163app20.163.com> Content-Type: text/plain; charset="gb2312" Hi everyone, I am using the live library. 
When the network connection is down, I want to continuously send some command/request to the RTSP server in order to know when the network connection is OK, and if it is OK, I can continue receiving data from the server. Does anyone know which function I can use for this? Any advice will be appreciated. ================== FilexBlue -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20060812/202ea14c/ attachment-0001.html ------------------------------ Message: 10 Date: Sun, 13 Aug 2006 17:01:07 +0200 From: Derk-Jan Hartman Subject: [Live-devel] H264 FU-A To: LIVE555 Streaming Media - development & use Message-ID: <2BD95C26-5839-4286-874D-ADF58FFA1116 at videolan.org> Content-Type: text/plain; charset=US-ASCII; delsp=yes; format=flowed Hey Ross, I'm looking into H264 a bit more. I was wondering: I cannot find what liveMedia does with "incomplete" FU-A or FU-B packets. As far as I know, if an FU-A or FU-B packet becomes "broken", the receiver should discard all data up to the next packet, and may pass along what it has already received, as long as it marks the NALU as invalid (or discards it completely, of course). Is this happening? I'm wondering why I need to feed the liveMedia stream through our packetizer. Either ffmpeg is just very fault-intolerant for H264, or something is wrong with the data fed from RTSP. DJ ------------------------------ _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel End of live-devel Digest, Vol 34, Issue 6 ***************************************** The information contained in this electronic mail transmission is intended only for the use of the individual or entity named above and is privileged and confidential. If you are not the intended recipient, please do not read, copy, use or disclose this communication to others.
Any dissemination, distribution or copying of this communication other than to the person or entity named above is strictly prohibited. If you have received this communication in error, please immediately delete it from your system. From darnold at futurec.net Mon Aug 14 17:06:33 2006 From: darnold at futurec.net (David Arnold) Date: Mon, 14 Aug 2006 17:06:33 -0700 Subject: [Live-devel] "Record" Implementation In-Reply-To: Message-ID: Ross, We need to implement server-side record functionality in our video server, which is based on the live555 library. What is your recommendation for this? 1) Implement RTSP record (described in section 10.11 of RFC2326) in RTSPServer.cpp ? 2) Use FileSink classes ? 3) Use OpenRTSP (or similar program) to capture RTP traffic in a local file ? Pros/cons? Personal experience? Thank you, Dave Arnold Future Concepts, La Verne CA From finlayson at live555.com Mon Aug 14 17:26:53 2006 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 14 Aug 2006 17:26:53 -0700 Subject: [Live-devel] "Record" Implementation In-Reply-To: References: Message-ID: >We need to implement server-side record functionality in our video server Can you say some more about what this means? In particular, do you want to make a recording of your (encoded) live video source, at the same time that you're streaming it? What about audio? Which codecs are being used for video (and audio, if appropriate)?
Also, do you plan on recording video even if no clients are currently receiving it? -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Mon Aug 14 17:52:21 2006 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 14 Aug 2006 17:52:21 -0700 Subject: [Live-devel] live-devel Digest, Vol 34, Issue 6 In-Reply-To: References: Message-ID: >Ross, could you give us sample pseudo-code that demonstrates the event >method you describe? Not really - at least, not without knowing the details of how your frame grabber works. > How would this new EventLoop interact with >FramedSource derived classes and MultiFramedRTPSink derived classes? It would make no difference. These other ("liveMedia") classes see only the base class "TaskScheduler". If (big if) you need to write your own "TaskScheduler" subclass, then the rest of the code would use it in exactly the same way. >The interface to the H.264 encoder I am using requires that I call >DSP_GetFrame() at correct intervals or else it will always return frames at >30 fps. Do you have any way to be 'notified' when a new frame is available? (E.g., is your frame grabber device a readable socket, or can it set a global variable when a new frame is available?) If not, then (from your description above), it looks like you'll have to schedule a periodic handler function to read frames - using "scheduleDelayedTask()". In general it's impossible to give a generic answer to the question "How do I interface to a frame grabber?", because different frame grabbers work in different ways. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From weiyutao36 at 163.com Mon Aug 14 19:50:50 2006 From: weiyutao36 at 163.com (weiyutao36) Date: Tue, 15 Aug 2006 10:50:50 +0800 (CST) Subject: [Live-devel] about re-connection to RTSP server Message-ID: <44E1368A.000005.31017@bj163app66.163.com> Hi everyone, I am using the live library to receive media data from HelixServer.
Consider the following scenario: the network connection is broken, and after some time (e.g., 60 sec) it is OK again. I want to know: 1. whether it is possible to re-connect to HelixServer and continue receiving data from where it was interrupted? I do not want to receive data from the beginning. 2. whether HelixServer has this function? Thanks in advance. tom -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20060814/f6a9acc6/attachment.html From finlayson at live555.com Mon Aug 14 20:14:13 2006 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 14 Aug 2006 20:14:13 -0700 Subject: [Live-devel] about re-connection to RTSP server In-Reply-To: <44E1368A.000005.31017@bj163app66.163.com> References: <44E1368A.000005.31017@bj163app66.163.com> Message-ID: >Hi everyone, >I am using the live library to receive media data from HelixServer. > >Consider the following scenario: >the network connection is broken, and after some time (e.g., 60 sec) it >is OK again. I want to know: >1. whether it is possible to re-connect to HelixServer and continue >receiving data from >where it was interrupted? I do not want to receive data from the beginning. In principle, you could create a new RTSP connection, and then start playing from the time where you stopped playing last time - using the "start" parameter to "playMediaSession()". > >2. whether HelixServer has this function? I don't know - you'll have to ask them. Helix is not our software. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20060814/5a45150a/attachment-0001.html From smartbrisk at 163.com Mon Aug 14 20:56:33 2006 From: smartbrisk at 163.com (Jinfeng Zhang) Date: Tue, 15 Aug 2006 11:56:33 +0800 (CST) Subject: [Live-devel] Why class H264VideoStreamFramer isn't derived from class FramedFilter?
Message-ID: <44E145F1.0000AA.14963@bj163app38.163.com> Hi, I find that most of the Video Stream Framer classes are derived from class FramedFilter, while H264VideoStreamFramer is derived from class FramedSource. As a result, this class is not used in the same manner. Would you mind telling me the reason for implementing this class in this way? Thank you! Best Regards, Jinfeng Zhang -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20060814/2d9662a0/attachment.html From finlayson at live555.com Mon Aug 14 21:04:42 2006 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 14 Aug 2006 21:04:42 -0700 Subject: [Live-devel] Why class H264VideoStreamFramer isn't derived from class FramedFilter? In-Reply-To: <44E145F1.0000AA.14963@bj163app38.163.com> References: <44E145F1.0000AA.14963@bj163app38.163.com> Message-ID: >Hi, > I find that most of the Video Stream Framer >classes are derived from class FramedFilter, >while H264VideoStreamFramer is derived from >class FramedSource. As a result, this >class is not used in the same manner. > Would you mind telling me the reason for implementing this class in this way? Thanks for pointing this out. This was a mistake. It will get fixed in the next release of the software. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From schramm at i4.informatik.rwth-aachen.de Tue Aug 15 01:50:55 2006 From: schramm at i4.informatik.rwth-aachen.de (Martin Schramm) Date: Tue, 15 Aug 2006 10:50:55 +0200 Subject: [Live-devel] Problem with MPEG2 video streaming and a long GoP In-Reply-To: References: <200608141505.55136.schramm@i4.informatik.rwth-aachen.de> <200608141522.58974.schramm@i4.informatik.rwth-aachen.de> Message-ID: <200608151050.55579.schramm@i4.informatik.rwth-aachen.de> Hello Ross, thanks for your answer. I tried the "testOnDemandRTSPServer" together with VLC, and the synchronization did not work either.
VLC plays the video, but without audio or with only distorted audio. The video file can be found at http://www-i4.informatik.rwth-aachen.de/~schramm/test.mpg Best regards Martin On Monday, 14 August 2006 16:41, Ross Finlayson wrote: > Can you reproduce the problem using "testOnDemandRTSPServer" as the > server, with VLC as the client? If so, please put a copy of your > "test.mpg" file online, and send us the URL, so we can take a look at > it. From TonyBai at viatech.com.cn Tue Aug 15 01:50:48 2006 From: TonyBai at viatech.com.cn (TonyBai at viatech.com.cn) Date: Tue, 15 Aug 2006 16:50:48 +0800 Subject: [Live-devel] how can I get the AvgBitRate in SDP Message-ID: <00DA072AF14656448C31F5C0182E7D0A7E79B1@exchsh01.viatech.com.sh> Hi, all How can I get the AvgBitRate in SDP? Is there an SDP parser in the live lib? Thanks Best Regards, Tony -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20060815/5fc303f0/attachment.html From ishan73 at yahoo.com Tue Aug 15 05:15:56 2006 From: ishan73 at yahoo.com (Ishan Vaishnavi) Date: Tue, 15 Aug 2006 13:15:56 +0100 (BST) Subject: [Live-devel] callback doesnot work for MP4 Message-ID: <20060815121556.70957.qmail@web35611.mail.mud.yahoo.com> Hi Guys, My team is trying to use live to get RTSP streams to play in our player. The following code . . MediaSubsession* subsession; . . . subsession->readSource()->getNextFrame(m_context->video_packet, MAX_RTP_FRAME_SIZE, after_reading_video, m_context, on_source_close,m_context); . . calls the getNextFrame routine of the FramedSource class. The callback function pointer is provided but does not get called back for MP4V-ES format video, whereas it works perfectly for MPEG 1 or 2. I can't seem to understand this code; can anyone explain what is happening in Live? And why, for an MP4V-ES stream, is the callback function not called? Any ideas? Ishan ___________________________________________________________ All new Yahoo!
Mail "The new Interface is stunning in its simplicity and ease of use." - PC Magazine http://uk.docs.yahoo.com/nowyoucan.html From spacelis at gmail.com Tue Aug 15 05:40:43 2006 From: spacelis at gmail.com (SpaceLi) Date: Tue, 15 Aug 2006 20:40:43 +0800 Subject: [Live-devel] Why the testOnDemandRTSPServer show 0.0.0.0:8554 In-Reply-To: References: <4f8ff5c10608090515v3a1dc933k14e116788f459c0d@mail.gmail.com> <4f8ff5c10608092000n495b030es274f226f29212f1d@mail.gmail.com> <4f8ff5c10608102154r7bbb5330g2597f9a9a4153437@mail.gmail.com> Message-ID: <4f8ff5c10608150540o2323679fu7745374e3a747dc6@mail.gmail.com> After a long period of debugging, I think I have found the reason why there is no response. In SingleStep in doEventLoop, the function select() always returned 0 immediately after being called. (By default, the timeout is set to 0 for select(), so it will block until something happens, am I right?) And it was trapped somewhere after that. I am using uclinux to run the code, and it seems to support select(). So is there any configuration I need to pay attention to when compiling the kernel of uclinux or the live library? thx From darnold at futurec.net Tue Aug 15 06:26:26 2006 From: darnold at futurec.net (David Arnold) Date: Tue, 15 Aug 2006 06:26:26 -0700 Subject: [Live-devel] Behavior of DeviceSource::doGetNextFrame()t, Vol 34, Issue 9 In-Reply-To: Message-ID: >From: Ross Finlayson >Subject: Re: [Live-devel] Behavior of DeviceSource::doGetNextFrame() > when no data available >To: LIVE555 Streaming Media - development & use > >Message-ID: >Content-Type: text/plain; charset="us-ascii" ; format="flowed" >[...] >Do you have any way to be 'notified' when a new frame is available? >(E.g., is your frame grabber device a readable socket, or can it set >global variable when a new frame is available?) If not, then (from >your description above), it looks like you'll have to schedule a >periodic handler function to read frames - using >"scheduleDelayedTask()".
No, the frame grabber device is not a readable socket. It's a windoze interface and only provides an unblocking "GetNextFrame" function. Since the interface to the frame grabber we are using provides no mechanism for being notified when a frame becomes available, I am going to assume it is not possible to use the event-driven approach you mentioned. Dave Arnold Future Concepts From darnold at futurec.net Tue Aug 15 06:47:01 2006 From: darnold at futurec.net (David Arnold) Date: Tue, 15 Aug 2006 06:47:01 -0700 Subject: [Live-devel] Subject: Re: "Record" Implementation In-Reply-To: Message-ID: Date: Mon, 14 Aug 2006 17:26:53 -0700 From: Ross Finlayson Subject: Re: [Live-devel] "Record" Implementation To: LIVE555 Streaming Media - development & use Message-ID: Content-Type: text/plain; charset="us-ascii" ; format="flowed" >>We need to implement server-side record functionality in our video server >Can you say some more about what this means? In particular, do you >want to make a recording of your (encoded) live video source, at the >same time that you're streaming it? What about audio? Which codecs >are being used for video (and audio, if appropriate)? >Also, do you plan on recording video even if no clients are currently >receiving it? >-- > >Ross Finlayson >Live Networks, Inc. Ross, We need to make a recording of encoded H.264 Video and (MPEG-2 or PCMU) Audio. What I mean by recording is to capture the audio/video stream to disk to be played back (streamed) by the video server at a later time. While a recording is in progress we need the ability to stream the live session at the same time.
Also, the recording or streaming could be initiated in any order (streaming starts first or recording starts first). Typically, recording will be run in the background at all times, but that isn't always the case. I hope that clarifies things. Thank you, Dave Arnold Future Concepts From hartman at videolan.org Tue Aug 15 06:56:08 2006 From: hartman at videolan.org (Derk-Jan Hartman) Date: Tue, 15 Aug 2006 15:56:08 +0200 Subject: [Live-devel] base64 encode and vlc streaming h264 to quicktime Message-ID: <187DD20E-6CF8-43FA-932F-399E638F6566@videolan.org> Hi all, I've been working feverishly this week to get VLC to be able to stream H264 to Quicktime Player, and I succeeded. The code will be in VLC shortly. Ross, while working on this, I had to Base64 encode the SPS and PPS. I couldn't get it to work at first, until I noticed that the logic of both the b64_encode in vlc and the one in liveMedia can only handle string data, and not binary data (which may contain \0). It might be wise to adapt your base64 encoder to take (data, i_len) as input instead, allow for internal 0's, and output a proper base64 char*, as I will be doing for VLC.
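[Editor's note: the binary-safe encoder DJ is describing can be sketched as follows. This is a stand-alone illustration only; the name base64Encode and its signature are made up here and are not the actual liveMedia or VLC implementation. The point is simply that taking an explicit (data, length) pair lets input containing internal '\0' bytes, such as H.264 SPS/PPS NAL units, be encoded correctly.]

```cpp
#include <string>

// Illustrative binary-safe Base64 encoder: it takes an explicit
// (data, length) pair rather than a NUL-terminated string, so input
// bytes equal to '\0' are encoded like any other byte.
// (Hypothetical sketch; not the liveMedia or VLC code.)
static std::string base64Encode(unsigned char const* data, unsigned length) {
  static char const table[] =
      "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";
  std::string out;
  out.reserve(((length + 2) / 3) * 4);
  for (unsigned i = 0; i < length; i += 3) {
    // Pack up to three input bytes into a 24-bit group.
    unsigned n = (unsigned)data[i] << 16;
    if (i + 1 < length) n |= (unsigned)data[i + 1] << 8;
    if (i + 2 < length) n |= (unsigned)data[i + 2];
    // Emit four 6-bit symbols, padding with '=' where the input ran out.
    out += table[(n >> 18) & 0x3F];
    out += table[(n >> 12) & 0x3F];
    out += (i + 1 < length) ? table[(n >> 6) & 0x3F] : '=';
    out += (i + 2 < length) ? table[n & 0x3F] : '=';
  }
  return out;
}
```

[Because the length is explicit, an SPS that happens to contain a zero byte is no longer truncated at that byte, which matches the (data, i_len) interface proposed above.]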
DJ From barounis at ceid.upatras.gr Tue Aug 15 07:08:48 2006 From: barounis at ceid.upatras.gr (barounis at ceid.upatras.gr) Date: Tue, 15 Aug 2006 17:08:48 +0300 Subject: [Live-devel] Elementary Stream Message-ID: <1155650928.44e1d5707784b@my.ceid.upatras.gr> Hello to the list and Ross, can you please tell me where I can find Elementary Stream videos for the RTSP Server, or how I can create one with a bitrate of my choice? Thank you very much Best regards ---------------------------------------------------- This mail was sent through http://my.ceid.upatras.gr From finlayson at live555.com Tue Aug 15 10:01:23 2006 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 15 Aug 2006 10:01:23 -0700 Subject: [Live-devel] base64 encode and vlc streaming h264 to quicktime In-Reply-To: <187DD20E-6CF8-43FA-932F-399E638F6566@videolan.org> References: <187DD20E-6CF8-43FA-932F-399E638F6566@videolan.org> Message-ID: >It might be wise to adapt your base64 encoder to take (data, i_len) >as input instead DJ, That's a good idea. I'll make this change in the next release of the software. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Tue Aug 15 10:04:23 2006 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 15 Aug 2006 10:04:23 -0700 Subject: [Live-devel] Behavior of DeviceSource::doGetNextFrame()t, Vol 34, Issue 9 In-Reply-To: References: Message-ID: >No, the frame grabber device is not a readable socket. It's a windoze >interface and only provides an unblocking "GetNextFrame" function. Since the >interface to the frame grabber we are using provides no mechanism for being >notified when a frame becomes available I am going to assume it is not >possible to use the event-driven approach you mentioned. No, it appears not. Instead, you will need to poll periodically (using "scheduleDelayedTask()"). -- Ross Finlayson Live Networks, Inc.
http://www.live555.com/ From finlayson at live555.com Tue Aug 15 10:10:52 2006 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 15 Aug 2006 10:10:52 -0700 Subject: [Live-devel] callback doesnot work for MP4 In-Reply-To: <20060815121556.70957.qmail@web35611.mail.mud.yahoo.com> References: <20060815121556.70957.qmail@web35611.mail.mud.yahoo.com> Message-ID: >subsession->readSource()->getNextFrame(m_context->video_packet, >MAX_RTP_FRAME_SIZE, after_reading_video, m_context, >on_source_close,m_context); >. >. > >calls the getNextFrame routine of the framedsource >class. The callback function pointer is provided but >doesnot get called back for MP4V-ES format video >whereas it works perfectly for MPEG 1 or 2. I can't >seem to understand this code can anyone explain what >is happening in Live ? And why for MP4V-ES stream the >callback function is not called. It may be because the input MP4V-ES/RTP stream is not valid. In particular, perhaps it does not use the RTP "M" bit to mark the last fragment of each video frame, so that the "fCurrentPacketCompletesFrame" variable in "MPEG4ESVideoRTPSource" gets set correctly. But without having access to your input stream, I can only speculate... -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Tue Aug 15 10:22:43 2006 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 15 Aug 2006 10:22:43 -0700 Subject: [Live-devel] how can I get the AvgBitRate in SDP In-Reply-To: <00DA072AF14656448C31F5C0182E7D0A7E79B1@exchsh01.viatech.com.sh> References: <00DA072AF14656448C31F5C0182E7D0A7E79B1@exchsh01.viatech.com.sh> Message-ID: >Content-Class: urn:content-classes:message >Content-Type: multipart/alternative; > boundary="----_=_NextPart_001_01C6C047.E790E2AC" > >Hi, all > >How can I get the AvgBitRate in SDP? Is there a SDP parser in live lib? Yes, of course - the "MediaSession" class. However, we currently don't parse SDP "b=" lines. You would need to add that code yourself. 
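[Editor's note: as a sketch of the kind of code one would add, the following stand-alone helper scans SDP text for a "b=AS:<kbps>" bandwidth line and returns its value in kilobits per second. The function name parseSdpBandwidthAS is hypothetical; this is not part of liveMedia's "MediaSession" and only illustrates the parsing step.]

```cpp
#include <sstream>
#include <string>

// Hypothetical helper: return the value of the first "b=AS:<kbps>"
// line in an SDP description, or -1 if no such line is present.
// (Illustration only; not part of liveMedia's MediaSession.)
static int parseSdpBandwidthAS(std::string const& sdp) {
  std::istringstream in(sdp);
  std::string line;
  while (std::getline(in, line)) {
    // SDP lines end in CRLF; std::getline leaves the '\r' behind.
    if (!line.empty() && line.back() == '\r') line.pop_back();
    if (line.compare(0, 5, "b=AS:") == 0) {
      return std::stoi(line.substr(5)); // bandwidth in kilobits per second
    }
  }
  return -1; // no "b=AS:" bandwidth line in this description
}
```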
-- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20060815/84ea5b6e/attachment.html From finlayson at live555.com Tue Aug 15 10:31:58 2006 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 15 Aug 2006 10:31:58 -0700 Subject: [Live-devel] Subject: Re: "Record" Implementation In-Reply-To: References: Message-ID: >We need to make a recording of encoded H.264 Video and (MPEG-2 or PCMU) >Audio. What I mean by recording is to capture the audio/video stream to >disk to be played back (streamed) by the video server at a later time. >While a recording is in progress we need the ability to stream the live >session at the same time. Also, the recording or streaming could be >initiated in any order (streaming starts first or recording starts first). >Typically, recording will be run in the background at all times, but that >isn't always the case. Probably the simplest way to do this - at least for now - would be to modify your input source object (that interfaces to your encoder) to do the recording directly into a file (in addition to delivering the data to the downstream liveMedia object). I.e., you wouldn't be using a liveMedia object chain to do the recording. The reason for doing things this way is that there's currently no generic mechanism in the library for duplicating a single input stream into multiple output streams. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From spacelis at gmail.com Wed Aug 16 03:52:48 2006 From: spacelis at gmail.com (SpaceLi) Date: Wed, 16 Aug 2006 18:52:48 +0800 Subject: [Live-devel] How to port the code to uclinux with no system time clock? Message-ID: <4f8ff5c10608160352s27607a75s4a08395a87e1842@mail.gmail.com> I want to port the code to uclinux, but there is no system time clock, so function ftime won't work. 
Is there any way, instead of using ftime, to make the code work? thx From schramm at i4.informatik.rwth-aachen.de Wed Aug 16 04:52:35 2006 From: schramm at i4.informatik.rwth-aachen.de (Martin Schramm) Date: Wed, 16 Aug 2006 13:52:35 +0200 Subject: [Live-devel] Problem with MPEG2 video streaming and a long GoP In-Reply-To: <200608151050.55579.schramm@i4.informatik.rwth-aachen.de> References: <200608141505.55136.schramm@i4.informatik.rwth-aachen.de> <200608151050.55579.schramm@i4.informatik.rwth-aachen.de> Message-ID: <200608161352.35551.schramm@i4.informatik.rwth-aachen.de> Hello, I've encoded another file (mpeg2, GoP length 100) with Transcode/mpeg2enc instead of ffmpeg. This file is now streamed normally using testOnDemandRTSPServer and VLC, but the problem remains when using NMM/liveMedia. So it seems to be a problem regarding NMM and ffmpeg. I will post a message on their mailing list. Best Regards Martin From weiyutao36 at 163.com Wed Aug 16 06:52:11 2006 From: weiyutao36 at 163.com (weiyutao36) Date: Wed, 16 Aug 2006 21:52:11 +0800 (CST) Subject: [Live-devel] About the NPT(Normal Play Time) of the media Message-ID: <44E3230B.00007A.09250@bj163app28.163.com> Hi everyone, I am using the live library to receive data from a HelixServer, and the network connection is suddenly broken. Now I want to get the media's NPT at the point where it was interrupted, so I can call playMediaSession() with the parameter "start", as follows: playMediaSession(session,start,-1.0f,1.0f) Is it feasible to get the NPT from the last received RTP packet's timestamp? If yes, how? If not, does anybody have another method? Any advice is appreciated. Thanks in advance. FilexBlue -------------- next part -------------- An HTML attachment was scrubbed...
URL: http://lists.live555.com/pipermail/live-devel/attachments/20060816/1719bb76/attachment.html From finlayson at live555.com Wed Aug 16 07:47:50 2006 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 16 Aug 2006 07:47:50 -0700 Subject: [Live-devel] About the NPT(Normal Play Time) of the media In-Reply-To: <44E3230B.00007A.09250@bj163app28.163.com> References: <44E3230B.00007A.09250@bj163app28.163.com> Message-ID: >Now I want to get the media's NPT at the point where it was interrupted, so I >can call playMediaSession() with the parameter "start" >as follows: > >playMediaSession(session,start,-1.0f,1.0f) > >Is it feasible to get the NPT from the last received RTP >packet's timestamp? Yes, but use the "presentationTime", because that is computed from the RTP timestamps. (Code that uses these libraries usually does not need to look at RTP timestamps.) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20060816/152d5138/attachment.html From finlayson at live555.com Wed Aug 16 12:51:48 2006 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 16 Aug 2006 12:51:48 -0700 Subject: [Live-devel] Problem with MPEG2 video streaming and a long GoP In-Reply-To: <200608161352.35551.schramm@i4.informatik.rwth-aachen.de> References: <200608141505.55136.schramm@i4.informatik.rwth-aachen.de> <200608151050.55579.schramm@i4.informatik.rwth-aachen.de> <200608161352.35551.schramm@i4.informatik.rwth-aachen.de> Message-ID: >I've encoded another file (mpeg2, GoP length 100) with Transcode/mpeg2enc >instead of ffmpeg. This file is now streamed normally using >testOnDemandRTSPServer and VLC, but the problem remains when using NMM/liveMedia. >So it seems to be a problem regarding NMM and ffmpeg. I will post a message >on their mailing list. Make sure that "NMM" is using a recent version of the "LIVE555 Streaming Media" code.
(For example, there was a bug fix in July 2005 that might be related to your problem.) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From mjn at oxysys.com Wed Aug 16 16:22:37 2006 From: mjn at oxysys.com (Marc Neuberger) Date: Wed, 16 Aug 2006 19:22:37 -0400 Subject: [Live-devel] Performance/Security concern Message-ID: <44E3A8BD.3080402@oxysys.com> I am new to live555. I am working on a product in which I need to serve RTSP/RTP and liveMedia looks to be well-suited to my needs. However, in studying the code to learn about the architecture, I encountered something odd: The RTSP connection is blocking. In RTSPServer::RTSPClientSession::incomingRequestHandler1(), the code essentially waits until a full RTSP request header has arrived. As a result, if I connect to the server and begin sending a header, but do not finish the header, the server is wedged until I either complete the header or disconnect. I have actually verified this using telnet and my fingers as a slow-moving client. I've tried both win32 and linux with the same results. This has the performance scaling issue that under heavy load the server could be blocking from time to time. It has the security issue that a malicious, or poorly-coded client (like telnet and my fingers) can cause the server to wedge. Am I missing something? Thanks, Marc Neuberger Oxy Systems From finlayson at live555.com Wed Aug 16 16:47:32 2006 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 16 Aug 2006 16:47:32 -0700 Subject: [Live-devel] Performance/Security concern In-Reply-To: <44E3A8BD.3080402@oxysys.com> References: <44E3A8BD.3080402@oxysys.com> Message-ID: >However, in studying the code to learn about the architecture, I >encountered something odd: The RTSP connection is blocking. In >RTSPServer::RTSPClientSession::incomingRequestHandler1(), the code >essentially waits until a full RTSP request header has arrived. 
As a >result, if I connect to the server and begin sending a header, but do >not finish the header, the server is wedged until I either complete the >header or disconnect. Yes - good call. You're correct - to fit the code's event-driven execution model, all of the server's request-handling reads (not just the first one) should be done asynchronously. I will shortly be releasing a new version of the code to fix this problem. Stay tuned... (It turns out that "RTSPClient" has the same problem, but that's less serious - for most applications, it doesn't matter so much if a RTSP client request blocks.) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From rob.casey at swishgroup.com.au Wed Aug 16 17:00:24 2006 From: rob.casey at swishgroup.com.au (Rob Casey) Date: Thu, 17 Aug 2006 10:00:24 +1000 Subject: [Live-devel] Trick play for MPEG Transport Stream files Message-ID: <675C5A5A309B1F4EB9EE1EC5FA94AE455EA8DE@MAILSERVER.swishgroup.local> Ross, The FAQ document for the liveMedia libraries highlights the possibility of the integration of trick play support for MPEG transport stream files into the public source code release in late 2006. Is this something which is still likely to occur? Or is there still merit in the private development of this functionality by external parties? Regards, Rob Rob Casey Chief Technology Officer Swish Interactive a division of The Swish Group Limited 170 Dorcas Street South Melbourne, Victoria 3205 Australia [P] +61 3 9686 6640 [F] +61 3 9686 6680 [M] +61 401 460 490 [E] rob.casey at swishgroup.com.au -------------- next part -------------- An HTML attachment was scrubbed...
URL: http://lists.live555.com/pipermail/live-devel/attachments/20060816/882d4186/attachment.html From finlayson at live555.com Wed Aug 16 17:09:13 2006 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 16 Aug 2006 17:09:13 -0700 Subject: [Live-devel] Trick play for MPEG Transport Stream files In-Reply-To: <675C5A5A309B1F4EB9EE1EC5FA94AE455EA8DE@MAILSERVER.swishgroup.local> References: <675C5A5A309B1F4EB9EE1EC5FA94AE455EA8DE@MAILSERVER.swishgroup.local> Message-ID: >The FAQ document for the liveMedia libraries highlights the >possibility of the integration of trick play support for MPEG >transport stream files into the public source code release in late >2006. Is this something which is still likely to occur? Yes. But the more people keep asking about it, the less likely it is to happen :-) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20060816/db09c4cc/attachment.html From rob.casey at swishgroup.com.au Wed Aug 16 17:21:57 2006 From: rob.casey at swishgroup.com.au (Rob Casey) Date: Thu, 17 Aug 2006 10:21:57 +1000 Subject: [Live-devel] Trick play for MPEG Transport Stream files Message-ID: <675C5A5A309B1F4EB9EE1EC5FA94AE455EA8E0@MAILSERVER.swishgroup.local> Well, that certainly makes sense from a commercial perspective - I have to give you that :-) Regards, Rob Rob Casey Chief Technology Officer Swish Interactive a division of The Swish Group Limited 170 Dorcas Street South Melbourne, Victoria 3205 Australia [P] +61 3 9686 6640 [F] +61 3 9686 6680 [M] +61 401 460 490 [E] rob.casey at swishgroup.com.au ________________________________ From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Thursday, 17 August 2006 10:09 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Trick play for MPEG Transport Stream files The 
FAQ document for the liveMedia libraries highlights the possibility of the integration of trick play support for MPEG transport stream files into the public source code release in late 2006. Is this something which is still likely to occur? Yes. But the more people keep asking about it, the less likely it is to happen :-) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20060816/fa47b877/attachment.html From tchristensen at nordija.com Thu Aug 17 00:58:13 2006 From: tchristensen at nordija.com (Thomas Christensen) Date: Thu, 17 Aug 2006 09:58:13 +0200 Subject: [Live-devel] Possible to join multicast and extract jpegs? In-Reply-To: References: <501AD2B0-9683-4152-833A-4F04BE342244@nordija.com> Message-ID: On 08/08/2006 at 23:13, Ross Finlayson wrote: >> I want to extract images from a multicasted mpeg2 transport stream >> TS. The TS is raw udp and does not use RTP. I figure the required >> steps are the following: >> >> 1. Receive MPEG2-TS and keep a buffer that only keeps the latest >> couple of screens >> 2. Extract a key-frame from that buffer and save as jpeg > > Well, you've missed two *very* important steps in the middle of 2: > - decode the MPEG-2 frame > - reencode the decoded frame to JPEG The two obvious MPEG decoders are VLC and MPlayer. I had hoped for other extensions of the Live library to perform the requested operation. However it turned out that MPlayer can do exactly what I was looking for: ./mplayer -ss 00:00:01 -frames 1 -nosound -vop scale=704:567 -vo jpeg:outdir=img udp://@239.1.1.91:5501 This drops a frame from the specified multicast as a jpeg in directory img. If someone on this list knows of other and lighter decoders (open source), let me know.
Thanks, Thomas From rob.casey at swishgroup.com.au Thu Aug 17 03:07:49 2006 From: rob.casey at swishgroup.com.au (Rob Casey) Date: Thu, 17 Aug 2006 20:07:49 +1000 Subject: [Live-devel] Range header parsing Message-ID: <675C5A5A309B1F4EB9EE1EC5FA94AE4562C541@MAILSERVER.swishgroup.local> Ross, I would note that within the parseRangeHeader function (in the file liveMedia/RTSPServer.cpp), the Range header is parsed expecting spaces to be present between the elements of the NPT specification. See below:

590     float start, end;
591     if (sscanf(fields, "npt = %f - %f", &start, &end) == 2) {
592       rangeStart = start;
593       rangeEnd = end;
594     } else if (sscanf(fields, "npt = %f -", &start) == 1) {
595       rangeStart = start;
596     } else {
597       return False; // The header is malformed
598     }

Is there a reason for the parsing of NPT information in this manner? The specification of NPT within RFC 2326 does not appear to include spaces between NPT elements. And I would additionally note that even when Range headers are returned by the liveMedia library, these do not include spaces between NPT elements. As such I would propose the following amendment of this code (which maintains operability with existing code which may depend upon this specification of NPT information):

590     float start, end;
591     if (sscanf(fields, "npt = %f - %f", &start, &end) == 2 ||
592         sscanf(fields, "npt=%f-%f", &start, &end) == 2) {
593       rangeStart = start;
594       rangeEnd = end;
595     } else if (sscanf(fields, "npt = %f -", &start) == 1 ||
596         sscanf(fields, "npt=%f-", &start) == 1) {
597       rangeStart = start;
598     } else {
599       return False; // The header is malformed
600     }

Regards, Rob -------------- next part -------------- An HTML attachment was scrubbed...
URL: http://lists.live555.com/pipermail/live-devel/attachments/20060817/7b9dd485/attachment.html From rob.casey at swishgroup.com.au Thu Aug 17 04:11:00 2006 From: rob.casey at swishgroup.com.au (Rob Casey) Date: Thu, 17 Aug 2006 21:11:00 +1000 Subject: [Live-devel] Parsing of x-playNow header Message-ID: <675C5A5A309B1F4EB9EE1EC5FA94AE4562C542@MAILSERVER.swishgroup.local> Attached please find a patch for the live.2006.08.07 code base to add support for the x-playNow header to initiate playback of a stream immediately following the submission of a SETUP request by a RTSP client (in the same manner by which support for the Range header within SETUP requests is implemented currently). Regards, Rob -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20060817/7ec78452/attachment-0001.html -------------- next part -------------- A non-text attachment was scrubbed... Name: live-parsePlayNow.patch Type: application/octet-stream Size: 1198 bytes Desc: live-parsePlayNow.patch Url : http://lists.live555.com/pipermail/live-devel/attachments/20060817/7ec78452/attachment-0001.obj From finlayson at live555.com Thu Aug 17 05:04:21 2006 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 17 Aug 2006 05:04:21 -0700 Subject: [Live-devel] Range header parsing In-Reply-To: <675C5A5A309B1F4EB9EE1EC5FA94AE4562C541@MAILSERVER.swishgroup.local> References: <675C5A5A309B1F4EB9EE1EC5FA94AE4562C541@MAILSERVER.swishgroup.local> Message-ID: >I would note that within the parseRangeHeader function (in the file >liveMedia/RTSPServer.cpp), the Range header is parsed expecting >spaces to be present between the elements of the NPT specification. 
>See below: > > 590 float start, end; > 591 if (sscanf(fields, "npt = %f - %f", &start, &end) == 2) { > 592 rangeStart = start; > 593 rangeEnd = end; > 594 } else if (sscanf(fields, "npt = %f -", &start) == 1) { > 595 rangeStart = start; > 596 } else { > 597 return False; // The header is malformed > 598 } > >Is there a reason for the parsing of NPT information in this manner? Yes, because it *allows*, but does not *require*, white space to be present in the input. Note the following, from "man sscanf()": "White space (such as blanks, tabs, or newlines) in the format string match any amount of white space, including none, in the input." > The specification of NPT within RFC2326 does not appear to include >spaces between NPT elements. And I would additionally note that >even when Range headers are returned by the liveMedia library, these >do not include spaces between NPT elements. However, because of the way that "scanf()" works, these should be parsed OK by the existing code. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20060817/9057fb9a/attachment.html From rob.casey at swishgroup.com.au Thu Aug 17 05:28:47 2006 From: rob.casey at swishgroup.com.au (Rob Casey) Date: Thu, 17 Aug 2006 22:28:47 +1000 Subject: [Live-devel] Range header parsing Message-ID: <675C5A5A309B1F4EB9EE1EC5FA94AE4562C544@MAILSERVER.swishgroup.local> Mea culpa - Thanks for the reply. Regards, Rob ________________________________ From: live-devel-bounces at ns.live555.com on behalf of Ross Finlayson Sent: Thu 17/08/2006 10:04 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Range header parsing I would note that within the parseRangeHeader function (in the file liveMedia/RTSPServer.cpp), the Range header is parsed expecting spaces to be present between the elements of the NPT specification. 
See below: 590 float start, end; 591 if (sscanf(fields, "npt = %f - %f", &start, &end) == 2) { 592 rangeStart = start; 593 rangeEnd = end; 594 } else if (sscanf(fields, "npt = %f -", &start) == 1) { 595 rangeStart = start; 596 } else { 597 return False; // The header is malformed 598 } Is there a reason for the parsing of NPT information in this manner? Yes, because it *allows*, but does not *require*, white space to be present in the input. Note the following, from "man sscanf()": "White space (such as blanks, tabs, or newlines) in the format string match any amount of white space, including none, in the input." The specification of NPT within RFC2326 does not appear to include spaces between NPT elements. And I would additionally note that even when Range headers are returned by the liveMedia library, these do not include spaces between NPT elements. However, because of the way that "scanf()" works, these should be parsed OK by the existing code. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/ms-tnef Size: 5322 bytes Desc: not available Url : http://lists.live555.com/pipermail/live-devel/attachments/20060817/69579e3b/attachment.bin From finlayson at live555.com Thu Aug 17 05:27:27 2006 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 17 Aug 2006 05:27:27 -0700 Subject: [Live-devel] Parsing of x-playNow header In-Reply-To: <675C5A5A309B1F4EB9EE1EC5FA94AE4562C542@MAILSERVER.swishgroup.local> References: <675C5A5A309B1F4EB9EE1EC5FA94AE4562C542@MAILSERVER.swishgroup.local> Message-ID: >Attached please find a patch for the live.2006.08.07 code base to >add support for the x-playNow header to initiate playback of a >stream immediately following the submission of a SETUP request by a >RTSP client (in the same manner by which support for the Range >header within SETUP requests is implemented currently). 
In general, I'm reluctant to support non-standard headers like this. However, because we're already supporting the 'stream after SETUP' hack (primarily for Amino STBs) using the presence of a "Range:" header as a test, I'll go ahead and extend this to also check for the "x-playNow" header. Your patch will be included in the next release of the software. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20060817/8ca09e70/attachment.html From xcsmith at rockwellcollins.com Thu Aug 17 10:55:07 2006 From: xcsmith at rockwellcollins.com (xcsmith at rockwellcollins.com) Date: Thu, 17 Aug 2006 12:55:07 -0500 Subject: [Live-devel] Media Subsessions Message-ID: What is the difference between PassiveServerMediaSubsession and OnDemandServerMediaSubsession? I've been looking at the test programs and I can't really figure out when to use PassiveServerMediaSubsession or a subclass of OnDemandServerMediaSubsession. Thanks very much! ~Medra From finlayson at live555.com Thu Aug 17 13:14:40 2006 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 17 Aug 2006 13:14:40 -0700 Subject: [Live-devel] Media Subsessions In-Reply-To: References: Message-ID: >What is the difference between PassiveServerMediaSubsession and >OnDemandServerMediaSubsession? Short answer: Use "PassiveServerMediaSubsession" for streaming via multicast. Use "OnDemandServerMediaSubsession" for streaming via unicast. Slightly longer answer: Use "PassiveServerMediaSubsession" if you want to stream to an existing, already-set-up destination (i.e., "RTPSink"), using the same destination each time a client connects. Use "OnDemandServerMediaSubsession" if you want to stream to a new destination ("RTPSink") that is created for each new client. -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ From spacelis at gmail.com Thu Aug 17 20:59:59 2006 From: spacelis at gmail.com (SpaceLi) Date: Fri, 18 Aug 2006 11:59:59 +0800 Subject: [Live-devel] Porting Live Media Library successfully to uClinux with NIOS2 Message-ID: <4f8ff5c10608172059m630f1211o58fb391e7cf139b6@mail.gmail.com> For some unknown reason, nios2-linux-uclibc-g++ cannot successfully construct the const DelayInterval objects, so const ETERNITY.seconds() and ETERNITY.useconds() always return 0. And of course fDelayQueue constructs incorrectly. So we have to explicitly construct DelayInterval(INT_MAX, MILLION-1) for the fDelayQueue constructor instead of using const ETERNITY. And then it worked. This problem must be due to the poor compiler we use. Fortunately live uses C-style system calls rather than C++ ones; if it didn't, we could not even have compiled it. Thanks EVERYONE here, especially Ross. From spacelis at gmail.com Fri Aug 18 01:05:08 2006 From: spacelis at gmail.com (SpaceLi) Date: Fri, 18 Aug 2006 16:05:08 +0800 Subject: [Live-devel] Which encoder can work with live media better? Message-ID: <4f8ff5c10608180105je39829fgeca9070d92e1eac1@mail.gmail.com> I could not find a proper encoder for m4v. I don't know which encoder (and which encoding parameters) can produce a standard m4v that will fit testOnDemandRTSPServer in testProgs. thx From ishan73 at yahoo.com Fri Aug 18 03:09:53 2006 From: ishan73 at yahoo.com (Ishan Vaishnavi) Date: Fri, 18 Aug 2006 11:09:53 +0100 (BST) Subject: [Live-devel] frame alignment In-Reply-To: Message-ID: <20060818100954.64442.qmail@web35606.mail.mud.yahoo.com> Hello people, I need some help with getNextFrame again. I fixed the last issue I had with the mpeg4 stream - some trivial internal coding issue. However I am having problems now with mpeg2 streams. I was under the impression from what little live code I read that the getNextFrame call returns a complete video frame which I can queue up in a buffer to pass to the decoder.
This seems true for MPEG4; however I seem to be getting much smaller packets than the frame size for MPEG2. My question is: do I have to assemble these together myself until I have a complete frame, or is live supposed to do it for me and I am doing something wrong? Cheers Ishan ___________________________________________________________ All New Yahoo! Mail - Tired of Vi at gr@! come-ons? Let our SpamGuard protect you. http://uk.docs.yahoo.com/nowyoucan.html From weiyutao36 at 163.com Fri Aug 18 06:15:14 2006 From: weiyutao36 at 163.com (weiyutao36) Date: Fri, 18 Aug 2006 21:15:14 +0800 (CST) Subject: [Live-devel] about: presentationTime--NPT--playMediaSession--re-connect Message-ID: <44E5BD62.000005.00373@bj163app23.163.com> Hi Ross, In MultiFramedRTPSource.cpp, in the function void MultiFramedRTPSource::networkReadHandler(MultiFramedRTPSource* source, int /*mask*/) there is a code segment:

===================================
    // Fill in the rest of the packet descriptor, and store it:
    struct timeval timeNow;
    gettimeofday(&timeNow, NULL);
    bPacket->assignMiscParams(rtpSeqNo, rtpTimestamp, presentationTime,
                              hasBeenSyncedUsingRTCP, rtpMarkerBit, timeNow);
    source->fReorderingBuffer->storePacket(bPacket);
    //========I added here=======//
    latestNPT = fReorderingBuffer->fHeadPacket->fPresentationTime;
    //========I added here=======//
    readSuccess = True;
  } while (0);
  if (!readSuccess) source->fReorderingBuffer->freePacket(bPacket);
  source->doGetNextFrame1();
===================================

I want to set up a global variable which always records the latest RTP packet's presentationTime, and it should be returned to playCommon.cpp to be used by other functions. In the above code segment, I added a statement between the two //===// lines, but I do not know whether this is the right position to do it. And how can I update it and return it to playCommon.cpp? Please give me some advice. Thanks. FilexBlue -------------- next part -------------- An HTML attachment was scrubbed...
URL: http://lists.live555.com/pipermail/live-devel/attachments/20060818/99a81e96/attachment.html From ymreddy at ssdi.sharp.co.in Fri Aug 18 07:12:57 2006 From: ymreddy at ssdi.sharp.co.in (Mallikharjuna Reddy (NAVT)) Date: Fri, 18 Aug 2006 19:42:57 +0530 Subject: [Live-devel] Closing the file handle on the client side Message-ID: <7FB4685EA93D014C8E30AA087B66E7520337CF2F@ssdimailsrvnt01.ssdi.sharp.co.in> Hi Ross, Thanks for your continuous support on the LIVE codebase. We have a small issue in the LIVE codebase. We are streaming mpeg2 files from server to client using the test programs (testMPEG1or2VideoStreamer.cpp and testMPEG1or2VideoReceiver.cpp). We commented out the play() function in the afterPlaying() function on the server side. We are writing into a file on the client side. We observed that the file pointer is not closed on the client side once the streaming is completed. On the client side, the function handleClosure() in FramedSource.cpp is not called. We believe this will call the afterPlaying() function on the client side, which will close the file handle. Please provide your inputs. Thanks and Regards Y. Mallikharjuna Reddy From brainlai1102 at gmail.com Fri Aug 18 22:43:13 2006 From: brainlai1102 at gmail.com (Brain Lai) Date: Sat, 19 Aug 2006 13:43:13 +0800 Subject: [Live-devel] SDP headers x-dimensions and x-framerate VS. a=framerate: 25 and a=framesize:96 720-480 Message-ID: <2dd7fdf10608182243q383a5177t502fd9fcb975d4cf@mail.gmail.com> Dear Sir: I note liveMedia parses x-dimensions and x-framerate in SDP, while some servers use a=framerate: 25 and a=framesize:96 720-480 to represent the video frame size as well as framerate. I'm confused and can find no clues as to which one is standard or most widely adopted. Could you please tell me? BR. Brain Lai -------------- next part -------------- An HTML attachment was scrubbed...
URL: http://lists.live555.com/pipermail/live-devel/attachments/20060818/9014be31/attachment.html From yossyd at nayos.com Sun Aug 20 03:15:11 2006 From: yossyd at nayos.com (Yossy Dreyfus) Date: Sun, 20 Aug 2006 12:15:11 +0200 Subject: [Live-devel] wav & mpeg4 streaming Message-ID: <000001c6c441$8b5b69a0$570a1f0a@nayos.local> Hello I wrote a program that sends mpeg4 video and WAV audio via rtsp. This program is based on the defaults from "testMPEG4VideoStreamer" and "testWAVAudioStreamer". When I play this stream in a player, I hear the audio approximately 1 second after the corresponding video picture. What should I do to fix this delay? Thanks. -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20060820/f1e1a559/attachment.html From ori_yq at yahoo.com Sun Aug 20 15:13:21 2006 From: ori_yq at yahoo.com (JC) Date: Sun, 20 Aug 2006 15:13:21 -0700 (PDT) Subject: [Live-devel] How to use genMakefiles to generate makefile for cross-compiler? In-Reply-To: <4f8ff5c10608172059m630f1211o58fb391e7cf139b6@mail.gmail.com> Message-ID: <20060820221321.67298.qmail@web56510.mail.re3.yahoo.com> Hello all. I am JC and new here. I am trying to port the liveMedia library to a Montavista Linux cross-compiler environment. Is it possible to run genMakefiles so that I can get a makefile that can be used for the cross-compiler? Or are there step-by-step instructions that I could follow for doing this? Any hint would be very much appreciated. Thanks in advance. --------------------------------- Do you Yahoo!? Everyone is raving about the all-new Yahoo! Mail Beta. -------------- next part -------------- An HTML attachment was scrubbed...
URL: http://lists.live555.com/pipermail/live-devel/attachments/20060820/8700b401/attachment.html From ymreddy at ssdi.sharp.co.in Sun Aug 20 20:50:24 2006 From: ymreddy at ssdi.sharp.co.in (Mallikharjuna Reddy (NAVT)) Date: Mon, 21 Aug 2006 09:20:24 +0530 Subject: [Live-devel] Closing the file handle on the client side Message-ID: <7FB4685EA93D014C8E30AA087B66E7520337CF31@ssdimailsrvnt01.ssdi.sharp.co.in> Hi Ross, Thanks for your continuous support on the LIVE codebase. We have a small issue in the LIVE codebase. We are streaming mpeg2 files from server to client using the test programs (testMPEG1or2VideoStreamer.cpp and testMPEG1or2VideoReceiver.cpp). We commented out the play() function in the afterPlaying() function on the server side. We are writing into a file on the client side. We observed that the file pointer is not closed on the client side once the streaming is completed. On the client side, the function handleClosure() in FramedSource.cpp is not called. We believe this will call the afterPlaying() function on the client side, which will close the file handle. Please provide your inputs. Thanks and Regards Y. Mallikharjuna Reddy From bidibulle at operamail.com Mon Aug 21 05:34:55 2006 From: bidibulle at operamail.com (David BERTRAND) Date: Mon, 21 Aug 2006 13:34:55 +0100 Subject: [Live-devel] memory leak in MultiFrameRTPSource Message-ID: <20060821123455.41BDD245E3@ws5-3.us4.outblaze.com> Hi Ross, It seems that the liveMedia library (version "2006.06.27") has a memory leak in the MultiFramedRTPSource class. In the method "ReorderingPacketBuffer::storePacket" there is one case where the function returns without adding the newly created packet to the queue and without freeing the packet. This happens if a sequence number has already been received (duplicated packet).
Here below is the incriminating code:

// Figure out where the new packet will be stored in the queue:
BufferedPacket* beforePtr = NULL;
BufferedPacket* afterPtr = fHeadPacket;
while (afterPtr != NULL) {
  if (seqNumLT(rtpSeqNo, afterPtr->rtpSeqNo())) break;
  // it comes here
  if (rtpSeqNo == afterPtr->rtpSeqNo()) {
    // This is a duplicate packet - ignore it
    freePacket(bPacket); // !!! fix for the memory leak
    return;
  }

David -- _______________________________________________ Surf the Web in a faster, safer and easier way: Download Opera 9 at http://www.opera.com Powered by Outblaze From lkovacs at xperts.hu Mon Aug 21 05:57:39 2006 From: lkovacs at xperts.hu (Levente Kovacs) Date: Mon, 21 Aug 2006 14:57:39 +0200 Subject: [Live-devel] MPEG4 in RTP help Message-ID: <20060821145739.ff7a2dc7.lkovacs@xperts.hu> Hello all, I am a new user of live, and I am currently hacking two IP cameras. Both cameras emit a raw MPEG4 stream in RTP. What I want is to receive the VOPs (I,P,B,S), or the entire MPEG4 stream, with Live555, and process it with my code. So far, my program uses the MPEG4GenericRTPSource class. The problem is that only the P-type VOPs are received. Could anyone explain to me what I am doing wrong? I set up the class like this: sessionState.source = MPEG4GenericRTPSource::createNew(*env, &rtpGroupsock, 96, 90000, "MPEG4", "generic", 2048000, 2048, 2048); I have to admit that I have no clue about the last 3 arguments of the call. Could anyone please help me with this topic? Thank you very much in advance. Levente From brainlai1102 at gmail.com Mon Aug 21 06:06:50 2006 From: brainlai1102 at gmail.com (Brain Lai) Date: Mon, 21 Aug 2006 21:06:50 +0800 Subject: [Live-devel] SDP headers x-dimensions and x-framerate VS.
a=framerate: 25 and a=framesize:96 720-480 In-Reply-To: <2dd7fdf10608182243q383a5177t502fd9fcb975d4cf@mail.gmail.com> References: <2dd7fdf10608182243q383a5177t502fd9fcb975d4cf@mail.gmail.com> Message-ID: <2dd7fdf10608210606q6d40afb4t3bdbdab9f2aa430c@mail.gmail.com> Dear Sir: I note liveMedia parses x-dimensions and x-framerate in SDP, while some servers use a=framerate: 25 and a=framesize:96 720-480 to represent the video frame size as well as framerate. I'm confused and can find no clues as to which one is standard or most widely adopted. Could you please tell me? BR. Brain Lai -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20060821/f6efe3e8/attachment.html From ori_yq at yahoo.com Mon Aug 21 07:10:13 2006 From: ori_yq at yahoo.com (JC) Date: Mon, 21 Aug 2006 07:10:13 -0700 (PDT) Subject: [Live-devel] cross compiler fail, please help In-Reply-To: Message-ID: <20060821141013.73754.qmail@web56505.mail.re3.yahoo.com> HI All/Ross, I am using an ARM cross compiler with the LiveMedia library. I have modified config.armlinux, in which I changed the first line: CROSS_COMPILE= arm_v5t_le-. Then I ran genMakefiles armlinux. But then, when I run make, I get lots of "undefined reference" errors, like below: ../BasicUsageEnvironment/libBasicUsageEnvironment.a(.rodata._ZTI14BasicHashTable[typeinfo for BasicHashTable]+0x0): undefined reference to `vtable for __cxxabiv1::__si_class_type_info' ../BasicUsageEnvironment/libBasicUsageEnvironment.a(.ARM.extab+0x0): undefined reference to `__gxx_personality_v0' There are lots more such errors. I think I have missed something. Maybe from your experience, you know. Please give me some of your suggestions. Any hint or directions would be very useful! Thanks a lot. JC --------------------------------- Talk is cheap. Use Yahoo! Messenger to make PC-to-Phone calls. Great rates starting at 1¢/min.
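An aside on the link errors quoted above: undefined references to `__gxx_personality_v0` and to the `__cxxabiv1` typeinfo vtables are the classic symptom of linking C++ object files without the C++ runtime, typically because the final link is driven by gcc rather than g++. A plausible config fragment - unverified against this exact release's config.armlinux, with the toolchain prefix taken from the message above - would be:

```makefile
# config.armlinux fragment (hypothetical; variable names per the config.* convention)
CROSS_COMPILE = arm_v5t_le-
LINK = $(CROSS_COMPILE)g++ -o
# Alternatively, keep gcc as the link driver and pull in the C++ runtime:
# LIBS_FOR_CONSOLE_APPLICATION = -lstdc++
```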
-------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20060821/a4510a9b/attachment.html From brainlai1102 at gmail.com Mon Aug 21 09:20:56 2006 From: brainlai1102 at gmail.com (Brain Lai) Date: Tue, 22 Aug 2006 00:20:56 +0800 Subject: [Live-devel] ~RTSPClient() may cause problems when doing some session management Message-ID: <2dd7fdf10608210920j4c9f69bbg19acf3a0a0c7bbca@mail.gmail.com> Dear Sir: In

00145 RTSPClient::~RTSPClient() {
00146   reset();
00147   envir().taskScheduler().turnOffBackgroundReadHandling(fInputSocketNum);
00148   delete[] fResponseBuffer;
00149   delete[] fUserAgentHeaderStr;
00150 }

I note reset() calls closeTCPSockets() to set fInputSocketNum = fOutputSocketNum = -1, which causes the following turnOffBackgroundReadHandling(fInputSocketNum) not to clear fInputSocketNum in the fReadSet of envir().taskScheduler(). As a result, after ~RTSPClient(), the select() in SingleStep() returns -1 with errno == EBADF and leads the program to exit! It seems that swapping their order avoids this problem, and people can then happily control the life cycle of RTSPClient instances from another select callback in SingleStep(). Hope this is helpful for you. I may not have considered everything, so please double-check. BR. Brain Lai -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20060821/0c67d072/attachment.html From rob.casey at swishgroup.com.au Mon Aug 21 15:03:58 2006 From: rob.casey at swishgroup.com.au (Rob Casey) Date: Tue, 22 Aug 2006 08:03:58 +1000 Subject: [Live-devel] frame alignment Message-ID: <675C5A5A309B1F4EB9EE1EC5FA94AE455EA8EC@MAILSERVER.swishgroup.local> While I haven't examined these frame sizes myself, is it not possible that you are simply getting B- and P-frames from the MPEG2 stream?
See http://wiki.multimedia.cx/index.php?title=Frame_Types for a brief overview of frame types employed within MPEG2 streams. Rob Casey Swish Interactive a division of The Swish Group Limited 170 Dorcas Street South Melbourne, Victoria 3205 Australia [P] +61 3 9686 6640 [F] +61 3 9686 6680 [M] +61 401 460 490 [E] rob.casey at swishgroup.com.au > -----Original Message----- > From: live-devel-bounces at ns.live555.com > [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ishan > Vaishnavi > Sent: Friday, 18 August 2006 8:10 PM > To: LIVE555 Streaming Media - development & use > Subject: [Live-devel] frame alignment > > Hello people, > > I need some help with getNextFrame again. I fixed the last > issue I had with mpeg4 stream, some trivial internal coding issue. > > However I am having problems now with mpeg2 streams. > I was under the impression from what little live code I read > that the getNextFrame call returns a complete video frame > which I can queue up in a buffer to pass to the decoder. This > seems true for MPEG4 however I seem to be getting much > smaller packets than the frame size for MPEG2. > > My question is do I have to clobber these together myself > till I make a complete frame or is live suppoesed to do it > for me and I am doing something wrong ? > > Cheers > Ishan > > > > > ___________________________________________________________ > All New Yahoo! Mail - Tired of Vi at gr@! come-ons? Let our > SpamGuard protect you. 
http://uk.docs.yahoo.com/nowyoucan.html > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > From finlayson at live555.com Mon Aug 21 17:26:22 2006 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 21 Aug 2006 17:26:22 -0700 Subject: [Live-devel] about: presentationTime--NPT--playMediaSe ssion--re-connect In-Reply-To: <44E5BD62.000005.00373@bj163app23.163.com> References: <44E5BD62.000005.00373@bj163app23.163.com> Message-ID: >I want to set up a global variable which can always records the latest >RTP packet's presentationTime and it should return to >playCommon.cpp to be used by other functions. >In the above code segment, I added a sentense between two //===// >lines, but I do not know whether it is the right position to do >this? And, how can I update it and return it to playCommon.cpp? >Please give me some advice. The presentation time for each incoming frame is *already* delivered to each reader (in the data sink's 'after getting' function). You do not need to modify any underlying code to give you this - you already have it! In particular, note the function "FileSink::addData()" (in "liveMedia/FileSink.cpp"). It includes a "presentationTime" parameter. You could modify this code (or, alternatively, write your own "MediaSink" subclass) to use this parameter. -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ From finlayson at live555.com Mon Aug 21 17:33:24 2006 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 21 Aug 2006 17:33:24 -0700 Subject: [Live-devel] Closing the file handle on the client side In-Reply-To: <7FB4685EA93D014C8E30AA087B66E7520337CF2F@ssdimailsrvnt01.ssdi.sharp.co.in > References: <7FB4685EA93D014C8E30AA087B66E7520337CF2F@ssdimailsrvnt01.ssdi.sharp.co.in > Message-ID: >We observed that the file pointer is not closed on the >client side, once the streaming is completed No - the line Medium::close(sessionState.sink); in the "afterPlaying()" function should cause the output file to be closed (by calling the "FileSink" destructor). -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Mon Aug 21 17:50:32 2006 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 21 Aug 2006 17:50:32 -0700 Subject: [Live-devel] frame alignment In-Reply-To: <20060818100954.64442.qmail@web35606.mail.mud.yahoo.com> References: <20060818100954.64442.qmail@web35606.mail.mud.yahoo.com> Message-ID: > However I am having problems now with mpeg2 streams. >I was under the impression from what little live code >I read that the getNextFrame call returns a complete >video frame which I can queue up in a buffer to pass >to the decoder. This seems true for MPEG4 however I >seem to be getting much smaller packets than the frame >size for MPEG2. For MPEG-1 or 2 video, each chunk of data that is delivered to a downstream reader is not actually a complete video 'frame'. Instead, it is a video 'slice'. (This makes it possible for a smart decoder to handle/display incomplete frames.) > >My question is do I have to clobber these together >myself till I make a complete frame Yes - if your decoder doesn't handle slices. -- Ross Finlayson Live Networks, Inc.
http://www.live555.com/ From finlayson at live555.com Mon Aug 21 18:48:01 2006 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 21 Aug 2006 18:48:01 -0700 Subject: [Live-devel] How to use genMakefiles to generate makefile for cross-compiler? In-Reply-To: <20060820221321.67298.qmail@web56510.mail.re3.yahoo.com> References: <20060820221321.67298.qmail@web56510.mail.re3.yahoo.com> Message-ID: >Hello all. I am JC and new here. I am trying to port liveMedia >library to Montavista Linux cross compiler environment. Is it >possible to run genMakefiles so that I can get makefiles that can >be used for the cross-compiler? Or are there step-by-step >instructions that I could follow for doing this? You will need to create a new configuration file (e.g., named "config.montavista-cross-compile"), and then run genMakefiles montavista-cross-compile I suggest using one or more of the existing "config.*" files as a model. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Mon Aug 21 19:06:52 2006 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 21 Aug 2006 19:06:52 -0700 Subject: [Live-devel] MPEG4 in RTP help In-Reply-To: <20060821145739.ff7a2dc7.lkovacs@xperts.hu> References: <20060821145739.ff7a2dc7.lkovacs@xperts.hu> Message-ID: >I am a new user of live, and I am currently hacking two IP cameras. >Both cameras emit raw MPEG4 stream in RTP. I hope they are using our software to do the MPEG-4/RTP streaming :-) >What I want is to receive the VOPs (I,P,B,S), or the entire MPEG4 >stream by Live555, and process it with my code. So far, my program >uses the MPEG4GenericRTPSource class. No, to receive MPEG-4/RTP video, you should use the "MPEG4ESVideoRTPSource" class. ("MPEG4GenericRTPSource" is for a more general RTP payload format that is (usually) used only for MPEG-4 *audio* RTP streams.) > >The problem is that only the P type VOPs are received. Could anyone >explain to me what I am doing wrong?
I set up the class like this: > >sessionState.source = MPEG4GenericRTPSource::createNew(*env, >&rtpGroupsock, 96, 90000, "MPEG4", "generic", 2048000, 2048, 2048); Instead, call sessionState.source = MPEG4ESVideoRTPSource::createNew(*env, &rtpGroupsock, 96, 90000); -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From jeremy at electrosilk.net Mon Aug 21 20:29:52 2006 From: jeremy at electrosilk.net (Jeremy) Date: Tue, 22 Aug 2006 11:29:52 +0800 Subject: [Live-devel] wis-streamer driver check Centos / RHEL In-Reply-To: References: <20060821145739.ff7a2dc7.lkovacs@xperts.hu> Message-ID: <44EA7A30.8080506@electrosilk.net> Hello, I hope this is the right place to bring this up. I have started to set-up and test wis-streamer on a Centos 4.3 system (RHEL equivalent). I have downloaded and installed wis-go7007-linux-0.9.8 and the latest version of the live software library and wis-streamer. The wis-streamer application would not start nor would the application gorecord that comes with the driver. Checking into the code I see that there is a driver check section in both sets of code. The code is looking in slightly the wrong place for the files - at least for Centos / RHEL. When I disable that check in wis-streamer by defining IGNORE_DRIVER_CHECK the program starts up and runs O.K. Applying the following diff to the gorecord.c file fixes the problem there as well < "/sys/class/video4linux/video%d/device/driver", i); --- > "/sys/class/video4linux/video%d/driver", i); Regards Jeremy From jeremy at electrosilk.net Mon Aug 21 20:42:22 2006 From: jeremy at electrosilk.net (Jeremy) Date: Tue, 22 Aug 2006 11:42:22 +0800 Subject: [Live-devel] wis-streamer no audio In-Reply-To: <44EA7A30.8080506@electrosilk.net> References: <20060821145739.ff7a2dc7.lkovacs@xperts.hu> <44EA7A30.8080506@electrosilk.net> Message-ID: <44EA7D1E.6070303@electrosilk.net> Hello, Evaluating wis-streamer I have it mostly working except for the lack of audio. Video works fine.
Running the driver utility app gorecord I can record an avi that contains picture and sound. This means my setup is working and the driver is doing the right thing. When I use wis-streamer in any mode the system reports audio is being delivered but it is totally silent. VLC reports the expected formats e.g. mpgv and mpga and the right bit rates. I have tested with mplayer and VLC. (One note is that trying to visualise the audio in VLC consistently crashes VLC.) My system logs have the following 'interesting' entries debug log Aug 21 16:49:03 ipstream kernel: application wis-streamer uses obsolete OSS audio interface message log Aug 21 16:49:32 ipstream hald[3082]: Timed out waiting for hotplug event 606. Rebasing to 607 Aug 21 16:50:04 ipstream hald[3082]: Timed out waiting for hotplug event 609. Rebasing to 610 Aside from these logs I cannot see anything unusual. Any suggestions on what steps I can take to locate and eliminate the problem? Thanks Jeremy From chenw at blrcsv.china.bell-labs.com Mon Aug 21 20:42:40 2006 From: chenw at blrcsv.china.bell-labs.com (chenwei) Date: Tue, 22 Aug 2006 11:42:40 +0800 Subject: [Live-devel] How can I know a mpeg1/2 file is PS,TS or ES? Message-ID: <20060822033044.293F58719@blrcsv.china.bell-labs.com> Dear All: Every test program in the /testProgs needs a proper type such as TS/PS/ES. But how can I know such information about mpeg1/2 files? Thanks:) From chenw at blrcsv.china.bell-labs.com Mon Aug 21 20:54:53 2006 From: chenw at blrcsv.china.bell-labs.com (chenwei) Date: Tue, 22 Aug 2006 11:54:53 +0800 Subject: [Live-devel] How can I release the port 8854 used by test programs Message-ID: <20060822034257.2CE148719@blrcsv.china.bell-labs.com> Dear All: Every time after I run a test program in the directory /testProgs, I found the port 8854 was still occupied. So I can't run another test program:( What should I do to release it? Thank you very much for your reply.
From finlayson at live555.com Mon Aug 21 21:14:34 2006 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 21 Aug 2006 21:14:34 -0700 Subject: [Live-devel] SDP headers x-dimensions and x-framerate VS. a=framerate: 25 and a=framesize:96 720-480 In-Reply-To: <2dd7fdf10608210606q6d40afb4t3bdbdab9f2aa430c@mail.gmail.com> References: <2dd7fdf10608182243q383a5177t502fd9fcb975d4cf@mail.gmail.com> <2dd7fdf10608210606q6d40afb4t3bdbdab9f2aa430c@mail.gmail.com> Message-ID: >I note liveMedia parses x-dimensions and x-framerate in SDP while >some servers use a=framerate: 25 and a=framesize:96 720-480 to >represent the video frame size as well as framrate. I'm confused and >find no clues which one is standard or adopted popularly.
The "x-" prefix for "x-dimensions" and "x-framerate" indicates that these are experimental (i.e., non-standard) SDP attributes. They are, however, used by some QuickTime-based video servers, which is why I chose to support them in the "LIVE555 Streaming Media" code. The "framerate" attribute is defined in RFC 4566 - the latest version of the SDP specification. Therefore it is a standard, and because of this, it will be supported in the LIVE555 code shortly. However, I have not seen any RFC (or Internet-Draft) that defines the "framesize" attribute. Therefore, it is probably not a standard, and should not be used (and, not being standard, it doesn't even carry an "x-" prefix). It will probably not get supported in the LIVE555 code. From finlayson at live555.com Mon Aug 21 21:24:01 2006 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 21 Aug 2006 21:24:01 -0700 Subject: [Live-devel] How can I know a mpeg1/2 file is PS,TS or ES? In-Reply-To: <20060822033044.293F58719@blrcsv.china.bell-labs.com> References: <20060822033044.293F58719@blrcsv.china.bell-labs.com> Message-ID: > Every test program in the /testProgs needs a proper type >such as TS/PS/ES. But how can I know such information about mpeg1/2 >files? You would need to inspect the contents of the file, to look for appropriate headers. (However, the "LIVE555 Streaming Media" libraries currently do not do this; you would need to write this yourself.) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Mon Aug 21 21:26:24 2006 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 21 Aug 2006 21:26:24 -0700 Subject: [Live-devel] How can I release the port 8854 used by test programs In-Reply-To: <20060822034257.2CE148719@blrcsv.china.bell-labs.com> References: <20060822034257.2CE148719@blrcsv.china.bell-labs.com> Message-ID: >Dear All: > Every time after I run a test program in the directory >/testProgs, I found the port 8854 was still occupied.
Are you sure that each test program is exiting? If it does, then the OS should release the port afterwards. (Obviously, you can't run more than one simultaneous application that uses the same RTSP server port number.) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Mon Aug 21 21:47:10 2006 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 21 Aug 2006 21:47:10 -0700 Subject: [Live-devel] wis-streamer no audio In-Reply-To: <44EA7D1E.6070303@electrosilk.net> References: <20060821145739.ff7a2dc7.lkovacs@xperts.hu> <44EA7A30.8080506@electrosilk.net> <44EA7D1E.6070303@electrosilk.net> Message-ID: >Asides from these logs I cannot see anything unusual. Any suggestions on >what steps can I take to locate and eliminate the problem? Not really. This is not a problem that I have seen so far. Unfortunately, you're going to have to go through the "wis-streamer" code yourself (Remember, You Have Complete Source Code), to figure out if it is capturing audio correctly on your system, and if not, why not. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From jeremy at electrosilk.net Mon Aug 21 22:23:16 2006 From: jeremy at electrosilk.net (Jeremy) Date: Tue, 22 Aug 2006 13:23:16 +0800 Subject: [Live-devel] How can I release the port 8854 used by test programs In-Reply-To: References: <20060822034257.2CE148719@blrcsv.china.bell-labs.com> Message-ID: <44EA94C4.4090904@electrosilk.net> Ross Finlayson wrote: >> Dear All: >> Every time after I run a test program in the directory >> /testProgs, I found the port 8854 was still occupied. >> > > Are you sure that each test program is exiting? If it does, then the > OS should release the port afterwards. > > > If it exits while in a debugger it is quite common for the port to not be released. You have to exit the debugger or have explicit code in the app to close the port. 
From ymreddy at ssdi.sharp.co.in Mon Aug 21 22:50:07 2006 From: ymreddy at ssdi.sharp.co.in (Mallikharjuna Reddy (NAVT)) Date: Tue, 22 Aug 2006 11:20:07 +0530 Subject: [Live-devel] Closing the file handle on the client side Message-ID: <7FB4685EA93D014C8E30AA087B66E7520337CF39@ssdimailsrvnt01.ssdi.sharp.co.in> Hi Ross, Thanks for the information. On the client side, afterPlaying() function is not being called once the streaming is completed. Can you suggest when we can call this function? Thanks and Regards Y. Mallikharjuna Reddy -----Original Message----- From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com]On Behalf Of Ross Finlayson Sent: Tuesday, August 22, 2006 6:03 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Closing the file handle on the client side >We observed that the file pointer is not closed on the >client side, once the streaming is completed No - the line Medium::close(sessionState.sink); in the "afterPlaying()" function should cause the output file to be closed (by calling the "FileSink" destructor). -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel From finlayson at live555.com Mon Aug 21 23:09:08 2006 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 21 Aug 2006 23:09:08 -0700 Subject: [Live-devel] Closing the file handle on the client side In-Reply-To: <7FB4685EA93D014C8E30AA087B66E7520337CF39@ssdimailsrvnt01.ssdi.sharp.co.in> References: <7FB4685EA93D014C8E30AA087B66E7520337CF39@ssdimailsrvnt01.ssdi.sharp.co.in> Message-ID: >Hi Ross, > >Thanks for the information. On the client side, afterPlaying() function is >not being called once the streaming is completed. Can you suggest when >we can call this function?
You could try calling "setByeHandler()" on the RTCPInstance, to arrange for afterPlaying() to be called if a RTCP "BYE" packet arrives. (However, if the source does not send a RTCP "BYE" packet at the end, then this won't work.) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From lkovacs at xperts.hu Tue Aug 22 00:22:44 2006 From: lkovacs at xperts.hu (Levente Kovacs) Date: Tue, 22 Aug 2006 09:22:44 +0200 Subject: [Live-devel] MPEG4 in RTP help In-Reply-To: References: <20060821145739.ff7a2dc7.lkovacs@xperts.hu> Message-ID: <20060822092244.921e94e0.lkovacs@xperts.hu> Hello Ross, > I hope they are using our software to do the MPEG-4/RTP streaming :-) Life would be much easier with live555. But they use something else. :-( We currently have 4 type of mpeg camera, and we see 4 types of MPEG stream! :-) > No, to receive MPEG-4/RTP video, you should use the > "MPEG4ESVideoRTPSource" class. ("MPEG4GenericRTPSource" is for a > more general RTP payload format that is (usually) used only for > MPEG-4 *audio* RTP streams.) [...] > Instead, call > sessionState.source = MPEG4ESVideoRTPSource::createNew(*env, > &rtpGroupsock, 96, 90000); Ah.. Ok. By the time I got this message, I tried to use SimpleRTPSource, and it was somehow working too. Thank you very much for your help! Levente > -- > > Ross Finlayson > Live Networks, Inc. 
> http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel From lkovacs at xperts.hu Tue Aug 22 00:24:47 2006 From: lkovacs at xperts.hu (Levente Kovacs) Date: Tue, 22 Aug 2006 09:24:47 +0200 Subject: [Live-devel] How can I release the port 8854 used by test programs In-Reply-To: <44EA94C4.4090904@electrosilk.net> References: <20060822034257.2CE148719@blrcsv.china.bell-labs.com> <44EA94C4.4090904@electrosilk.net> Message-ID: <20060822092447.45e2b593.lkovacs@xperts.hu> On Tue, 22 Aug 2006 13:23:16 +0800 Jeremy wrote: > If it exits while in a debugger it is quite common for the port to not > be released. You have to exit the debugger or have explicit code in the > app to close the port. Or say "kill" to the debugger. From finlayson at live555.com Tue Aug 22 00:43:48 2006 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 22 Aug 2006 00:43:48 -0700 Subject: [Live-devel] ~RTSPClient() may casue problems when doing some session management In-Reply-To: <2dd7fdf10608210920j4c9f69bbg19acf3a0a0c7bbca@mail.gmail.com> References: <2dd7fdf10608210920j4c9f69bbg19acf3a0a0c7bbca@mail.gmail.com> Message-ID: >It seems that changing their order can avoid this problem You're right. Thanks - this will be fixed in the next release of the software. >Brain Lai Is your name really "Brain" rather than "Brian"?? -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ From jeremy at electrosilk.net Tue Aug 22 01:21:00 2006 From: jeremy at electrosilk.net (Jeremy) Date: Tue, 22 Aug 2006 16:21:00 +0800 Subject: [Live-devel] wis-streamer no audio In-Reply-To: References: <20060821145739.ff7a2dc7.lkovacs@xperts.hu> <44EA7A30.8080506@electrosilk.net> <44EA7D1E.6070303@electrosilk.net> Message-ID: <44EABE6C.10901@electrosilk.net> Ross Finlayson wrote: > Unfortunately, you're going to have to go through the "wis-streamer" > code yourself (Remember, You Have Complete Source Code), to figure > out if it is capturing audio correctly on your system, and if not, > why not. > It turns out that defining IGNORE_DRIVER_CHECK in WISInput.h was causing the loss of audio problem, The resolution was to not define IGNORE_DRIVER_CHECK and manually go through the code and change the paths used for v4l and ALSA configuration to match the paths used by Centos 4.3 / RHEL There was some original code to handle different paths but only in one place. Even than the two alternates used did not match the Centos/RHEL layout The following are my patches for my immediate needs. They won't suite anyone not running Centos 4.3 or RHEL equivalent. 
166c166,167 < snprintf(sympath, sizeof sympath, "/sys/class/video4linux/video%d/device", i); --- > // snprintf(sympath, sizeof sympath, "/sys/class/video4linux/video%d/device", i); > snprintf(sympath, sizeof sympath, "/sys/class/video4linux/video%d/driver", i); // Alternate path for Centos 4.3 [J] 186c187,188 < snprintf(sympath, sizeof sympath, "/sys/class/sound/pcmC%dD0c/device", i); --- > // snprintf(sympath, sizeof sympath, "/sys/class/sound/pcmC%dD0c/device", i); > snprintf(sympath, sizeof sympath, "/sys/class/sound/pcmC%dD0c/driver", i); // Alternate path for Centos 4.3 [J] From mrtheo at gmail.com Tue Aug 22 03:24:44 2006 From: mrtheo at gmail.com (Theo Cushion) Date: Tue, 22 Aug 2006 11:24:44 +0100 Subject: [Live-devel] Altering number of bytes written to output file on openRTSP Message-ID: <9e0137740608220324y52bcdd93t3d07e973a27a304f@mail.gmail.com> Hi, I'm currently using a modified version of openRTSP to write the MPEG2 TS packets straight to a Linux TV dvr device, ready for it to be displayed. However I noticed it writes 1024 bytes at a time, then 292. Rather than the 1316 (7*188) bytes it receives in one go with each packet. Presumably this is a default setting on one of the functions that writes to the output, however looking through the Filesink I cannot see how to alter this. I notice there is a function for adding extra data, whereby I can specify how much to write at a time, but this isn't what I want to do. Is there a way of altering this 1024 behaviour? Thanks Theo -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20060822/6a853e06/attachment.html From ori_yq at yahoo.com Tue Aug 22 06:52:16 2006 From: ori_yq at yahoo.com (JC) Date: Tue, 22 Aug 2006 06:52:16 -0700 (PDT) Subject: [Live-devel] How to use genMakefiles to generate makefile for cross-compiler? In-Reply-To: Message-ID: <20060822135216.28789.qmail@web56507.mail.re3.yahoo.com> Thanks. 
I did and it works! You will need to create a new configuration file (e.g., named "config.montavista-cross-compile"), and then run genMakefiles montavista-cross-compile -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20060822/1ff61101/attachment.html From brainlai1102 at gmail.com Tue Aug 22 08:35:02 2006 From: brainlai1102 at gmail.com (Brain Lai) Date: Tue, 22 Aug 2006 23:35:02 +0800 Subject: [Live-devel] ~RTSPClient() may cause problems when doing some session management In-Reply-To: References: <2dd7fdf10608210920j4c9f69bbg19acf3a0a0c7bbca@mail.gmail.com> Message-ID: <2dd7fdf10608220835h177ccb64r9ac84dea40c6a72@mail.gmail.com> Yes, I'm really Brain, not Brian ^_^ Brain Lai 2006/8/22, Ross Finlayson : > > >It seems that changing their order can avoid this problem > > You're right. Thanks - this will be fixed in the next release of the > software. > > >Brain Lai > > Is your name really "Brain" rather than "Brian"?? > -- > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20060822/bc2e33b9/attachment.html From jmbaio at gmail.com Tue Aug 22 10:53:38 2006 From: jmbaio at gmail.com (Juan Manuel Lopez Baio) Date: Tue, 22 Aug 2006 14:53:38 -0300 Subject: [Live-devel] Event loop and threads Message-ID: Hello people. I'm fairly new with livemedia, and getting the hang of it.
I'm using it to solve media broadcasting aspects of a very specific project, and so for now I can do with the basic implementation that already comes and the RTSPServer class. The thing is that my app runs a few threads, in one of which the doEventLoop function of the BasicTaskScheduler is called. I read on the site that livemedia is not thread-safe, so I took it into account. The only "shared" information is the watchVariable used by the loop to know when to quit, which I set to true from the main thread when trying to leave the program, so that doEventLoop finally returns making it possible to join its thread and finish the program. Well, if I pass that watchVariable already set, then the loop never even starts, which is fine (if somewhat useless); but, if I try to set it afterwards, then when I try to join the thread at the end of the program (pthread_join, from POSIX threads), it never comes back! The variable's (actually, an object's attribute) state is fine, and I'm not really sure what else to check. Anyway, if someone has any idea I'd appreciate the hand. Thanks! Juan PS: I'm sending this mail to live-devel at ns.live555.com as I see the rest doing, even though when I signed up the list told me to send them to live-devel at lists.live555.com. What's with that? From finlayson at live555.com Tue Aug 22 11:17:24 2006 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 22 Aug 2006 11:17:24 -0700 Subject: [Live-devel] Event loop and threads In-Reply-To: References: Message-ID: >The only "shared" information is the watchVariable used by the loop to >know when to quit, which I set to true from the main thread when >trying to leave the program, so that doEventLoop finally returns
>Well, if I pass that watchVariable already set, then the loop never >even starts, which is fine (if somewhat useless); but, if I try to >set it afterwards, then when I try to join the thread at the end of >the program (pthread_join, from POSIX threads), it never comes back! What may be happening is that your event loop has no pending events (i.e., no incoming network packets to be handled, and no pending delayed tasks). If that happens, the event loop will (quite correctly) block indefinitely in "select()", and will never get around to checking the watch variable. For a way to overcome this, see >PS: I'm sending this mail to live-devel at ns.live555.com as I see the >rest doing, even though when I signed up the list told me to send them >to live-devel at lists.live555.com. What's with that? Right now "lists.live555.com" and "ns.live555.com" are the same host. That might not remain true in the future, though. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20060822/983cd081/attachment.html From xcsmith at rockwellcollins.com Tue Aug 22 11:26:51 2006 From: xcsmith at rockwellcollins.com (xcsmith at rockwellcollins.com) Date: Tue, 22 Aug 2006 13:26:51 -0500 Subject: [Live-devel] How can I know a mpeg1/2 file is PS,TS or ES? In-Reply-To: Message-ID: Chenwei, you might try searching for a library called "libavformat". The library apparently parses many types of file streams and returns information about them. http://www.inb.uni-luebeck.de/~boehme/using_libavcodec.html (the link name is misleading) I've never used it personally though; I just heard about it while working with Live. ~Medra Ross Finlayson wrote (to LIVE555 Streaming Media - development & use, sent by live-devel-bounces at ns.live555.com, 08/21/2006 11:24 PM), Subject: Re: [Live-devel] How can I know a mpeg1/2 file is PS,TS or ES?
> Every test program in the /testProgs needs a proper type >such as TS/PS/ES. But how can I know such information about mpeg1/2 >files? You would need to inspect the contents of the file, to look for appropriate headers. (However, the "LIVE555 Streaming Media" libraries currently do not do this; you would need to write this yourself.) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel From finlayson at live555.com Tue Aug 22 11:37:19 2006 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 22 Aug 2006 11:37:19 -0700 Subject: [Live-devel] memory leak in MultiFrameRTPSource In-Reply-To: <20060821123455.41BDD245E3@ws5-3.us4.outblaze.com> References: <20060821123455.41BDD245E3@ws5-3.us4.outblaze.com> Message-ID: >It seems that liveMedia library (version "2006.06.27") has a memory >leak in MultiFrameRTPSource class. In method >"ReorderingPacketBuffer::storePacket" there is one case where >the function returns without adding the newly created packet to the >queue and where the packet is not freed. >This happens if a sequence number has already been received >(duplicated packet). David, You're right - thanks! This will be fixed in the next release of the code. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From lkovacs at xperts.hu Wed Aug 23 03:01:03 2006 From: lkovacs at xperts.hu (Levente Kovacs) Date: Wed, 23 Aug 2006 12:01:03 +0200 Subject: [Live-devel] two RTP streams at the same time Message-ID: <20060823120103.a6c4a7e0.lkovacs@xperts.hu> Hi All, Ross, Well, video is working well, but now I'd like to add audio support. The camera streams audio inside a new RTP session. My question is rather basic. The problem is that I don't see what is the link among those function calls, which I used for video.
Do I need to have another environment? Or can I just use the same environment, and just have to create another groupsock? Do I have to create a new scheduler too? Thank you very much for your help. Levente From finlayson at live555.com Wed Aug 23 06:22:15 2006 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 23 Aug 2006 06:22:15 -0700 Subject: [Live-devel] two RTP streams at the same time In-Reply-To: <20060823120103.a6c4a7e0.lkovacs@xperts.hu> References: <20060823120103.a6c4a7e0.lkovacs@xperts.hu> Message-ID: >Well, video is working well, but now I'd like to add audio support. >The camera streams audio inside a new RTP session. My question is >rather basic. The problem is that I don't see what is the link among >those function calls, which I used for video. Do I need to have >another environment? Or can I just use the same environment, and >just have to create another groupsock? Do I have to create a new >scheduler too? No, you use the same "UsageEnvironment" and the same "TaskScheduler" (and thus the same event loop). However, you will need new "groupsock"s, and a new "RTPSink" and "RTCPInstance" for the audio RTP/RTCP stream. See the code for the "testMPEG1or2AudioVideoStreamer" demo application for an example of this. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From lorenooliveira at gmail.com Wed Aug 23 07:39:35 2006 From: lorenooliveira at gmail.com (Loreno Oliveira) Date: Wed, 23 Aug 2006 11:39:35 -0300 Subject: [Live-devel] testOnDemandRTSPServer and multi-homed hosts Message-ID: Hi list, does anybody have any experience in running testOnDemandRTSPServer on a multi-homed host? My problem is: I'm running testOnDemandRTSPServer on a machine with two IP addresses. My client application is running on a machine which belongs to one of the same networks as the server host. When I run openRTSP the application cannot talk with the server.
I believe the problem is that the RTSP server started using its IP address which is not the address the client host can reach. Well, is there any way of setting the address which the RTSP server will use for listening to clients? Regards, Loreno -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20060823/6a7c94b0/attachment.html From nagyz at nefty.hu Wed Aug 23 09:22:54 2006 From: nagyz at nefty.hu (Zoltan NAGY) Date: Wed, 23 Aug 2006 18:22:54 +0200 Subject: [Live-devel] live lib & quicktime? Message-ID: <44EC80DE.707@nefty.hu> Hello! I'm trying to use testMPEG4VideoStreamer with quicktime, but I'm getting parse errors: accept()ed connection from xxx.xxx.xxx.xxx RTSPClientSession[0x80d5cf8]::incomingRequestHandler1() read 221 bytes:GET /testStream HTTP/1.0 User-Agent: QuickTime/7.1 (qtver=7.1;os=Windows NT 5.1Service Pack 2) x-sessioncookie: DpnQlNUZAAB+0AUAFoAAAA Accept: application/x-rtsp-tunnelled Pragma: no-cache Cache-Control: no-cache parseRTSPRequestString() failed! sending response: RTSP/1.0 400 Bad Request Date: Wed, Aug 23 2006 16:21:43 GMT Allow: OPTIONS, DESCRIBE, SETUP, TEARDOWN, PLAY, PAUSE Any ideas about that? Can I get live to support QuickTime? Thanks in advance, Zoltan From jmbaio at gmail.com Wed Aug 23 10:23:41 2006 From: jmbaio at gmail.com (Juan Manuel Lopez Baio) Date: Wed, 23 Aug 2006 14:23:41 -0300 Subject: [Live-devel] Event loop and threads In-Reply-To: References: Message-ID: > What may be happening is that your event loop has no pending events (i.e., > no incoming network packets to be handled, and no pending delayed tasks). > If that happens, the event loop will (quite correctly) block indefinitely in > "select()", and will never get around to checking the watch variable.
> > > For a way to overcome this, see > Ok, so now I added a static dummy method to my class, which gets the UsageEnvironment* as argument, since it's not a global variable but an attribute of that same class (same as the watchVariable). Then, from another method where I call doEventLoop (the threaded method of the class), I added the call to the dummy task: dummyTask(this->Env); this->Env->taskScheduler().doEventLoop(&(this->stopServer)); Well, with that call there, I get a SEG FAULT. Looking at the last stack frames: inside AlarmHandler::handleTimeout, fClientData == 0x0 inside dummyTask, env == 0x0 inside UsageEnvironment::taskScheduler, this == 0x0 ;before that everything seems normal. If I remove the call to dummyTask, everything works fine. Do you know what it could be? Thanks! From jmbaio at gmail.com Wed Aug 23 10:51:43 2006 From: jmbaio at gmail.com (Juan Manuel Lopez Baio) Date: Wed, 23 Aug 2006 14:51:43 -0300 Subject: [Live-devel] Objects lifetime responsibility Message-ID: Hello, I have a new question. In my RTSPServer-using class, I create an UsageEnvironment and TaskScheduler with createNew(), as well as a ServerMediaSession (added to the RTSPServer) and one or more Subsessions (added to the ServerMediaSession) for each file streamed. My question is: should I delete or handle in any way the cleanup of all these objects? I'm calling reclaim() on the environment in the destructor of my class, but I'm not really sure of what I should do. Thanks. Juan From finlayson at live555.com Wed Aug 23 10:54:08 2006 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 23 Aug 2006 10:54:08 -0700 Subject: [Live-devel] live lib & quicktime?
In-Reply-To: <44EC80DE.707@nefty.hu> References: <44EC80DE.707@nefty.hu> Message-ID: >I'm trying to use testMPEG4VideoStreamer with quicktime, but I'm getting >parse errors: > >accept()ed connection from xxx.xxx.xxx.xxx >RTSPClientSession[0x80d5cf8]::incomingRequestHandler1() read 221 >bytes:GET /testStream HTTP/1.0 >User-Agent: QuickTime/7.1 (qtver=7.1;os=Windows NT 5.1Service Pack 2) >x-sessioncookie: DpnQlNUZAAB+0AUAFoAAAA >Accept: application/x-rtsp-tunnelled >Pragma: no-cache >Cache-Control: no-cache > > >parseRTSPRequestString() failed! >sending response: RTSP/1.0 400 Bad Request >Date: Wed, Aug 23 2006 16:21:43 GMT >Allow: OPTIONS, DESCRIBE, SETUP, TEARDOWN, PLAY, PAUSE > > >any ideas about that? Yes, the problem is that your client (QuickTime Player) is trying to send RTSP-over-HTTP (note the HTTP "GET" command), which our RTSP server implementation currently does not support. > can I get live to support quicktime? Yes, QuickTime Player will work if you stop it from trying to use RTSP-over-HTTP tunneling. Maybe you have a firewall somewhere that is blocking TCP port 8554? If so, either unblock this, or else change "testMPEG4VideoStreamer" to use TCP port 554 for RTSP (if you're on Unix, you will need to be 'root' to do this). -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From vkozhuhov at gmx.net Wed Aug 23 18:40:55 2006 From: vkozhuhov at gmx.net (Victor Kozhuhov) Date: Thu, 24 Aug 2006 04:40:55 +0300 Subject: [Live-devel] RTSP customization Message-ID: <009901c6c71e$5bfbc210$0100a8c0@sleepy> Hello, I am writing a sort of custom RTSP server application, intended to stream specific media files on demand. When application starts, I have no idea about real file names, so I am unable to create ServerMediaSession objects and submit them to the server prior to event loop. It is possible to list the directory on startup, but its content may change in run time. 
Please let me know if there is an elegant way to update the ServerMediaSession table at run time in some special case (for example, if the requested link is not listed)? I failed to find such a feature in the RTSPServer code, so I am going to implement a sort of external event handler mechanism: declare a special pure interface (some RTSPServerExtender with several events onXXX(...) ), implement this interface in my application and pass it to the server. Then the server will call the corresponding handler on an error or other special condition, so it may react properly (update the table). I found the following in the August mailing list digest: "You can add new "ServerMediaSession" objects to (or remove existing "ServerMediaSession" objects from) a running RTSPServer at any time. But you do this within the event loop - not using threads. Please, everybody, read the FAQ! " I did it - really :-) Actually, there is no problem to schedule a task for listing the directory, looking for newly appeared / removed files and updating the ServerMediaSession table at run time, but this idea has several disadvantages (for example, if there are 1000 new files in the directory, but the client will ask for only one, it is not necessary to list all of them). Please let me know if there is a way to do what I want without modifying the server code? If not - please tell me if you are interested in such a modification and a corresponding source submission? Thank you in advance, Victor. From jmbaio at gmail.com Wed Aug 23 19:06:59 2006 From: jmbaio at gmail.com (Juan Manuel Lopez Baio) Date: Wed, 23 Aug 2006 23:06:59 -0300 Subject: [Live-devel] RTSP customization In-Reply-To: <009901c6c71e$5bfbc210$0100a8c0@sleepy> References: <009901c6c71e$5bfbc210$0100a8c0@sleepy> Message-ID: On 8/23/06, Victor Kozhuhov wrote: > Hello, > > I am writing a sort of custom RTSP server application, intended to stream > specific media files on demand.
> When the application starts, I have no idea about the real file names, so I am > unable to create ServerMediaSession objects and submit them to the server > prior to the event loop. > It is possible to list the directory on startup, but its content may change > at run time. > > Please let me know if there is an elegant way to update the ServerMediaSession > table at run time in some special case (for example, if the requested link is not > listed)? Funny, I am doing something _very_ similar. > > I found the following in the August mailing list digest: > "You can add new "ServerMediaSession" objects to (or remove existing > "ServerMediaSession" objects from) a running RTSPServer at any time. > But you do this within the event loop - not using threads. Please, > everybody, read the FAQ! " > Ouch! Threads are exactly the solution I used, and for two weeks now I've been trying it with very good results (I read the faq also, but assumed that if I kept all livemedia-related objects in their own thread without sharing them with objects of my own, there would be no problem). The event loop is run in its own thread, and the main thread creates the MediaSessions and adds them to the RTSPServer as needed. There's not really much more to it, but if there is an actual danger in this, I'll review it all. So, well, I guess I'll be following this thread closely... :) From finlayson at live555.com Wed Aug 23 19:16:31 2006 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 23 Aug 2006 19:16:31 -0700 Subject: [Live-devel] RTSP customization In-Reply-To: <009901c6c71e$5bfbc210$0100a8c0@sleepy> References: <009901c6c71e$5bfbc210$0100a8c0@sleepy> Message-ID: >"You can add new "ServerMediaSession" objects to (or remove existing >"ServerMediaSession" objects from) a running RTSPServer at any time. Correct.
>Actually, there is no problem to schedule a task for listing the directory, >looking for newly appeared / removed files and updating ServerMediaSession >table in run time, but this idea has several disadvantages (for example, if >there are 1000 new files in the directory, but the client will ask for only >one, it is not necessary to list all of them). Nonetheless, this is the easiest approach, because it does not require modifying any existing code. The current server implementation assumes that "ServerMediaSession" objects are stored in a hash table, indexed by the string name contained within a RTSP "DESCRIBE" command. If you want to do something different - e.g., do a file system lookup, and create the "ServerMediaSession" on demand, iff the file is present - then you will need to modify the code. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Wed Aug 23 19:22:12 2006 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 23 Aug 2006 19:22:12 -0700 Subject: [Live-devel] RTSP customization In-Reply-To: References: <009901c6c71e$5bfbc210$0100a8c0@sleepy> Message-ID: >The event loop is run in its own thread, and the main thread creates >the MediaSessions and adds them to the RTSPServer as needed. This will work only if the "main thread" is also your event loop thread. This is explained very clearly (I thought) in the FAQ. If you want to have a periodic task that scans the file system for new files, then you must do this within the event loop (thread), using "scheduleDelayedTask". -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From chenw at blrcsv.china.bell-labs.com Wed Aug 23 20:38:33 2006 From: chenw at blrcsv.china.bell-labs.com (chenwei) Date: Thu, 24 Aug 2006 03:38:33 -0000 Subject: [Live-devel] How to rename the filename generated by openRTSP? 
Message-ID: <20060824032653.B86A08719@blrcsv.china.bell-labs.com> Dear All: I want to know whether ./testOnDemandRTSPServer can read from a stream rather than test.mpg. I want to rename the file generated by openRTSP to test.mpg so that I can test testOnDemandRTSPServer. I tried hard to find how to rename video-MPV-1 using grep but failed. Who can help me? chenwei chenw at blrcsv.china.bell-labs.com 2006-01-24 From chenw at blrcsv.china.bell-labs.com Wed Aug 23 20:41:48 2006 From: chenw at blrcsv.china.bell-labs.com (chenwei) Date: Thu, 24 Aug 2006 03:41:48 -0000 Subject: [Live-devel] How can I get a stream from a RTSP player ? Message-ID: <20060824033009.591DD8719@blrcsv.china.bell-labs.com> Dear All: How can I get a stream from the RTSP player? chenwei chenw at blrcsv.china.bell-labs.com 2006-01-24 From tilakadhya at rediffmail.com Wed Aug 23 21:40:23 2006 From: tilakadhya at rediffmail.com (Tilak Adhya) Date: 24 Aug 2006 04:40:23 -0000 Subject: [Live-devel] Streaming - switching of files Message-ID: <20060824044023.7962.qmail@webmail25.rediffmail.com> Hi, my requirement is that --- I've two (for example f1 and f2) audio files and I need to play the first n1 sec of f1, then play the whole file f2, and then again play the rest of audio file f1, i.e. switching the files using LIVE code.... What I've done is that I've created two different subsessions with f1 and f2... the client requests f1 and the server will stream out as above... but here the problem is, the server streams out both the files simultaneously. So my question is how shall I declare the above-mentioned scenario to the server. Is it possible to do that using npt? Could you please help me out of this problem, or do you have any other suggestions... Thanks in advance... Tilak *-- tilak -------------- next part -------------- An HTML attachment was scrubbed...
URL: http://lists.live555.com/pipermail/live-devel/attachments/20060823/afb34ee0/attachment.html From finlayson at live555.com Wed Aug 23 21:51:00 2006 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 23 Aug 2006 21:51:00 -0700 Subject: [Live-devel] Streaming - switching of files In-Reply-To: <20060824044023.7962.qmail@webmail25.rediffmail.com> References: <20060824044023.7962.qmail@webmail25.rediffmail.com> Message-ID: See The best way to do what you want to do is have the file switching happening in the underlying source object, so that the RTP/RTCP/RTSP code can remain ignorant of this. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20060823/8d74fb93/attachment.html From lkovacs at xperts.hu Thu Aug 24 02:04:31 2006 From: lkovacs at xperts.hu (Levente Kovacs) Date: Thu, 24 Aug 2006 11:04:31 +0200 Subject: [Live-devel] two RTP streams at the same time In-Reply-To: References: <20060823120103.a6c4a7e0.lkovacs@xperts.hu> Message-ID: <20060824110431.06aea2d1.lkovacs@xperts.hu> On Wed, 23 Aug 2006 06:22:15 -0700 Ross Finlayson wrote: > No, you use the same "UsageEnvironment" and the same "TaskScheduler" > (and thus the same event loop). However, you will need new > "groupsock"s, and a new "RTPSink" and "RTCPInstance" for the audio > RTP/RTCP stream. See the code for the > "testMPEG1or2AudioVideoStreamer" demo application for an example of > this. Thank you, it works just fine! And thank you very much in general, for the great Live555 library. 
Levente From oleg_g at mer.co.il Thu Aug 24 02:28:03 2006 From: oleg_g at mer.co.il (Oleg Galbert) Date: Thu, 24 Aug 2006 12:28:03 +0300 Subject: [Live-devel] A problem: receiving data from several multicast groups, while joined only one Message-ID: <200608241228.03810.oleg_g@mer.co.il> Hi, I have two processes - one streaming UDP data to a specific multicast group (streamer), the second - receiving data from the same group (receiver). On the streamer side I create a Groupsock object with a specific multicast IP, and start to stream the data. On the receiver side - I'm receiving the data for the multicast group with this IP. But the problem is that the receiver gets the data even when it is joined to a different multicast group. It looks like I cannot differentiate the streams by group. After some hacking I found a way to solve it. When I set ReceivingInterfaceAddr to the same group IP address, the socket is bound to this IP and everything works OK. In the original code the socket was bound to INADDR_ANY. My question is: Is it correct to bind the receiving-side socket to the same address as the multicast group address? I include my patch for the related functions in GroupsockHelper.cpp.
==============================================
--- live.2006.08.07/groupsock/GroupsockHelper.cpp	2006-08-07 09:12:26.000000000 +0300
+++ live/groupsock/GroupsockHelper.cpp	2006-08-24 11:27:19.000000000 +0300
@@ -425,9 +417,9 @@ Boolean socketJoinGroup(UsageEnvironment
   struct ip_mreq imr;
   imr.imr_multiaddr.s_addr = groupAddress;
-  imr.imr_interface.s_addr = ReceivingInterfaceAddr;
+  imr.imr_interface.s_addr = htonl(INADDR_ANY);
   if (setsockopt(socket, IPPROTO_IP, IP_ADD_MEMBERSHIP,
-                 (const char*)&imr, sizeof (struct ip_mreq)) < 0) {
+                 (const char*)&imr, sizeof (struct ip_mreq)) != 0) {
 #if defined(__WIN32__) || defined(_WIN32)
     if (env.getErrno() != 0) {
       // That piece-of-shit toy operating system (Windows) sometimes lies
@@ -466,8 +458,8 @@ Boolean socketLeaveGroup(UsageEnvironmen
 #define IP_ADD_SOURCE_MEMBERSHIP 39
 #define IP_DROP_SOURCE_MEMBERSHIP 40
 #else
-#define IP_ADD_SOURCE_MEMBERSHIP 25
-#define IP_DROP_SOURCE_MEMBERSHIP 26
+#define IP_ADD_SOURCE_MEMBERSHIP 67
+#define IP_DROP_SOURCE_MEMBERSHIP 68
 #endif
 struct ip_mreq_source {
@@ -550,6 +542,7 @@ Boolean loopbackWorks = 1;
 netAddressBits ourSourceAddressForMulticast(UsageEnvironment& env) {
   static netAddressBits ourAddress = 0;
+  netAddressBits oldReceivingInterfaceAddr = ReceivingInterfaceAddr;
   int sock = -1;
   struct in_addr testAddr;
@@ -566,6 +559,7 @@ netAddressBits ourSourceAddressForMultic
   loopbackWorks = 0; // until we learn otherwise
   testAddr.s_addr = our_inet_addr("228.67.43.91"); // arbitrary
+  ReceivingInterfaceAddr = testAddr.s_addr;
   Port testPort(15947); // ditto
   sock = setupDatagramSocket(env, testPort);
@@ -656,6 +650,7 @@ netAddressBits ourSourceAddressForMultic
   if (sock >= 0) {
     socketLeaveGroup(env, sock, testAddr.s_addr);
+    ReceivingInterfaceAddr = oldReceivingInterfaceAddr;
     closeSocket(sock);
   }
=========================================================
-------------- next part -------------- A non-text attachment was scrubbed...
Name: live_multicast.patch Type: text/x-diff Size: 1893 bytes Desc: not available Url : http://lists.live555.com/pipermail/live-devel/attachments/20060824/a94efb2d/attachment.bin From jmbaio at gmail.com Thu Aug 24 07:14:09 2006 From: jmbaio at gmail.com (Juan Manuel Lopez Baio) Date: Thu, 24 Aug 2006 11:14:09 -0300 Subject: [Live-devel] RTSP threads vs event loop [was: RTSP customization] Message-ID: On 8/23/06, Ross Finlayson wrote: > >The event loop is run in its own thread, and the main thread creates > >the MediaSessions and adds them to the RTSPServer as needed. > > This will work only if the "main thread" is also your event loop > thread. This is explained very clearly (I thought) in the FAQ. yeah, I guess it is, I just misinterpreted it. I'd like to point out that I'm not married to threads, in fact I had read in the past http://www.softpanorama.org/People/Ousterhout/Threads/tsld001.htm and thought he made a good point; but, you know, sometimes we don't have that many choices, and this project is being handled with threads so I must adapt my tools the best I can. > If you want to have a periodic task that scans the file system for > new files, then you must do this within the event loop (thread), > using "scheduleDelayedTask". > -- ok, just did and it appears to be working; and now that I have a task the watchVariable started to work too! (that thingy in the other mail where I tried the dummyTask and it segfaulted, was a silly distraction of mine, in which, as I copypasted the example, I left the last argument to scheduleDelayedTask NULL, where I should've passed the instance of the environment (which is an attribute and not a global); I guess the long weekend we recently enjoyed here left me somewhat... "fuzzy"... ;). I think it may be a good idea to add to the FAQ, in the watchVariable section, a link to http://lists.live555.com/pipermail/live-devel/2006-March/004192.html Well, thanks a lot. 
See you around, Juan From finlayson at live555.com Thu Aug 24 09:49:46 2006 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 24 Aug 2006 09:49:46 -0700 Subject: [Live-devel] Performance/Security concern Message-ID: >However, in studying the code to learn about the architecture, I >encountered something odd: The RTSP connection is blocking. In >RTSPServer::RTSPClientSession::incomingRequestHandler1(), the code >essentially waits until a full RTSP request header has arrived. As a >result, if I connect to the server and begin sending a header, but do >not finish the header, the server is wedged until I either complete the >header or disconnect. FYI, a new version (2006.08.24) of the "LIVE555 Streaming Media" code has now been released that fixes this problem. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From bidibulle at operamail.com Thu Aug 24 09:57:11 2006 From: bidibulle at operamail.com (David BERTRAND) Date: Thu, 24 Aug 2006 17:57:11 +0100 Subject: [Live-devel] Question about FileSink and presentation time Message-ID: <20060824165711.34C8343CBF@ws5-1.us4.outblaze.com> Hi Ross, I'm trying to implement a sort of video recorder by receiving an MP4 RTP stream (thanks to the MPEG4ESVideoRTPSource class) and feeding a FileSink with the corresponding frames. I'm not using openRTSP to do that (basically because my RTP stream comes from a sip phone). When I try to replay the recorded stream with the testOnDemandRTSPServer program, the clip I get runs faster than what I recorded. So, if I record 30 seconds, the 30 seconds are played within 10 seconds approximately. I suspect a timestamp problem in the RTP packets sent by testOnDemandRTSPServer (via MPEG4ESVideoRTPSink). The frame rate is supposed to be 10fps but I think there is some variable frame rate that might be the problem.
For example, I see that the first VOP received from my SIP phone (during the recording phase) is fragmented into several packets and is followed 1 second later by the second VOP, with a timestamp difference equivalent to 1 second. This would correspond to 10 frames at a constant frame rate. When I play the file back, the second VOP is sent with a timestamp difference corresponding to 100 ms (as with a normal frame rate of 10), which IMO is the problem. Do you think the MPEG4VideoStreamFramer object is able to compute presentation times and frame durations correctly from the ByteStreamFileSource in this case? How could I check that the MPEG4 frames sent by my sip phone are correctly conveying timestamping information? Any other idea that I might have missed? I would appreciate any help. Thanks a lot David -- _______________________________________________ Surf the Web in a faster, safer and easier way: Download Opera 9 at http://www.opera.com Powered by Outblaze From finlayson at live555.com Thu Aug 24 09:56:26 2006 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 24 Aug 2006 09:56:26 -0700 Subject: [Live-devel] A problem: receiving data from several multicast groups, while joined only one In-Reply-To: <200608241228.03810.oleg_g@mer.co.il> References: <200608241228.03810.oleg_g@mer.co.il> Message-ID: >I have two processes - one streaming UDP data to a specific multicast group >(streamer), the second - receiving data from the same group (receiver). >On the streamer side I create a Groupsock object with a specific multicast IP, and >start to stream the data. >On the receiver side - I'm receiving the data for the multicast group with >this IP. >But the problem is that the receiver gets the data even when it is joined to >a different multicast group. It looks like I cannot differentiate the >streams by group.
Some OSs (e.g., some versions of Linux) do not distinguish between incoming packets sent to different multicast group addresses, but with the same port number. The solution is to use different port numbers, as well as different multicast groups. >My question is: Is it correct to bind the receiving side socket to the same >address as multicast group address? No. "ReceivingInterfaceAddr" - if not set to "INADDR_ANY" must be set to the unicast IP address of one of your network interfaces, not to a multicast address. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Thu Aug 24 10:27:34 2006 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 24 Aug 2006 10:27:34 -0700 Subject: [Live-devel] Question about FileSink and presentation time In-Reply-To: <20060824165711.34C8343CBF@ws5-1.us4.outblaze.com> References: <20060824165711.34C8343CBF@ws5-1.us4.outblaze.com> Message-ID: >I'm trying to implement sort of video recorder by receiving a MP4 >RTP stream (tanks to MPEG4ESVideoRTPSource class) and feeding a >FileSink with the corresponding frames. I'm not using openRTSP to do >that (basically because my RTP stream comes from a sip phone). When >I try to replay the recorded stream with testOnDemandRTSPServer >program, the clip I get run faster than what I recorded. So, If I >record 30 seconds, the 30 seconds are played within 10 seconds >approximatively. >I suspect a timestamp problem in the RTP packets sent by >testOnDemandRTSPServer (via MPEG4ESVideoRTPSink). No, I doubt that this is a RTP timestamp related problem. More likely, the problem is that there is a problem with the MPEG timestamp information embedded within your MPEG-4 video data. ("MPEG4VideoStreamFramer" - via "MPEG4VideoStreamParser" - parses this information in order to figure out the fPresentationTime, and fDurationInMicroseconds, for each MPEG-4 video frame. See "MPEGVideoStreamFramer.cpp".) 
>The frame rate is supposed to be 10fps but I think there is some >variable frame rate that might be the problem. For example, I see >that the first VOP received from my SIP phone (during the recording >phase) is fragmented into several packets and is followed 1 second >later by the second VOP, with a timestamp difference equivalent to 1 >second. This would correspond to 10 frames at a constant frame rate. >When I play the file back, the second VOP is sent with a timestamp >difference corresponding to 100 ms (as with a normal frame rate of >10), which IMO is the problem. >Do you think the MPEG4VideoStreamFramer object is able to compute >presentation times and frame durations correctly from the >ByteStreamFileSource in this case? I don't know. I suggest reviewing the code for "MPEG4VideoStreamParser::parseVideoObjectPlane()" (in "MPEG4VideoStreamFramer.cpp"). Note that this code already contains some special cases to handle some kinds of buggy MPEG-4 video streams. Perhaps your stream is yet another type of buggy MPEG-4 video stream, or perhaps there's something else in your stream that's tripping up this code? -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From vkozhuhov at gmx.net Thu Aug 24 18:22:48 2006 From: vkozhuhov at gmx.net (Victor Kozhuhov) Date: Fri, 25 Aug 2006 04:22:48 +0300 Subject: [Live-devel] WAV files extension Message-ID: <005001c6c7e5$0414fb80$0100a8c0@sleepy> Hello, during my development I slightly modified the WAVAudioFileServerMediaSubsession and WAVAudioFileSource classes to support PCMU/PCMA content. If you are interested in such an option, please use the attached patch. (Sorry - I am not familiar with diff/patch software, so I am just sending the modified files in a ZIP.) Regards, Victor. -------------- next part -------------- A non-text attachment was scrubbed...
Name: wav_patch.zip Type: application/x-zip-compressed Size: 8542 bytes Desc: not available Url : http://lists.live555.com/pipermail/live-devel/attachments/20060824/759faf39/attachment.bin From lkovacs at xperts.hu Fri Aug 25 00:44:03 2006 From: lkovacs at xperts.hu (Levente Kovacs) Date: Fri, 25 Aug 2006 09:44:03 +0200 Subject: [Live-devel] A problem: receiving data from several multicast groups, while joined only one In-Reply-To: References: <200608241228.03810.oleg_g@mer.co.il> Message-ID: <20060825094403.e1d1a4bb.lkovacs@xperts.hu> On Thu, 24 Aug 2006 09:56:26 -0700 Ross Finlayson wrote: > This is an OS problem. Some OSs (e.g., some versions of Linux) do > not distinguish between incoming packets sent to different multicast > group addresses, but with the same port number. This sounds odd. I'd like to see the OP's routing table. Levente From finlayson at live555.com Fri Aug 25 02:04:19 2006 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 25 Aug 2006 02:04:19 -0700 Subject: [Live-devel] WAV files extension In-Reply-To: <005001c6c7e5$0414fb80$0100a8c0@sleepy> References: <005001c6c7e5$0414fb80$0100a8c0@sleepy> Message-ID: >during my development I slightly modified the WAVAudioFileServerMediaSubsession >and WAVAudioFileSource classes to support PCMU/PCMA content. >If you are interested in such an option, please use the attached patch. Victor, Thanks for the useful modifications. They will be included in the next release of the software. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From lorenooliveira at gmail.com Fri Aug 25 11:40:02 2006 From: lorenooliveira at gmail.com (Loreno Oliveira) Date: Fri, 25 Aug 2006 15:40:02 -0300 Subject: [Live-devel] need help for adjusting required bandwidth Message-ID: Hi Ross and list, I need to stream a video over a bluetooth channel but I need to do so using all the available bluetooth bandwidth (approx. 741kbps). It doesn't matter if packets arrive out of order or some of them were lost.
But I need to fill the available bandwidth with the video stream. I'm using testOnDemandRTSPServer and openRTSP for serving and consuming the stream. The sample video I'm using is an mpeg2 video, which consumes around 639kbps of bandwidth. Is there any way of configuring the stream flow to consume more bandwidth? Thanks in advance! Loreno -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20060825/52668944/attachment.html From finlayson at live555.com Sat Aug 26 04:24:39 2006 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 26 Aug 2006 04:24:39 -0700 Subject: [Live-devel] New "LIVE555 Streaming Media" version installed - changes implementation of "ByteStreamFileSource" Message-ID: A new version (2006.08.26) of the "LIVE555 Streaming Media" code has now been installed. It includes an important change to the implementation of "ByteStreamFileSource" (a class that most of your applications probably use). Reads from the "ByteStreamFileSource" open file now occur asynchronously (when new file data becomes available), rather than the open file being read synchronously. This better fits the library's event-driven execution model, and will make code that reads from files perform better - especially if multiple files are being read concurrently (e.g., within an RTSP server). NOTE: If you are running an old version of Windows - before Windows XP - then it is possible that your version of Windows does not let open files be select()able sockets. If that's the case, then you will find that "ByteStreamFileSource" no longer works. If you encounter this, then you can restore the old behavior by adding the line: #define READ_FROM_FILES_SYNCHRONOUSLY 1 near the start of "liveMedia/ByteStreamFileSource.cpp". -- Ross Finlayson Live Networks, Inc.
http://www.live555.com/ From lkovacs at xperts.hu Mon Aug 28 01:31:00 2006 From: lkovacs at xperts.hu (Levente Kovacs) Date: Mon, 28 Aug 2006 10:31:00 +0200 Subject: [Live-devel] need help for adjusting required bandwidth In-Reply-To: References: Message-ID: <20060828103100.b4f00f4f.lkovacs@xperts.hu> On Fri, 25 Aug 2006 15:40:02 -0300 "Loreno Oliveira" wrote: > I need to stream a video over a bluetooth channel but I need to do so using > all the available bluetooth bandwidth (approx. 741kbps). It doesn't matter if > packets arrive out of order or some of them were lost. But I need to fill > the available bandwidth with the video stream. > > I'm using testOnDemandRTSPServer and openRTSP for serving and consuming the > stream. The sample video I'm using is an mpeg2 video, which consumes around > 639kbps of bandwidth. Is there any way of configuring the stream flow for > consuming more bandwidth? I think the above-mentioned 741kbps is a total bitrate, including protocol headers. If you calculate all the protocol overheads, I think the 639kbps is reasonable. Levente From lorenooliveira at gmail.com Mon Aug 28 05:38:59 2006 From: lorenooliveira at gmail.com (Loreno Oliveira) Date: Mon, 28 Aug 2006 09:38:59 -0300 Subject: [Live-devel] need help for adjusting required bandwidth In-Reply-To: <20060828103100.b4f00f4f.lkovacs@xperts.hu> References: <20060828103100.b4f00f4f.lkovacs@xperts.hu> Message-ID: Hi Levente, thanks for the answer. I share this feeling with you. Maybe ~629kbps (I had a typing error :-) ) is the usable bandwidth if we consider the protocol overhead. However, I get slightly better results by making some changes in the environment. I adjusted the BT MTU and enabled only outgoing DH5 packets. These changes improved the "usable" bandwidth to ~680kbps.
Regards, Loreno On 8/28/06, Levente Kovacs wrote: > > On Fri, 25 Aug 2006 15:40:02 -0300 > "Loreno Oliveira" wrote: > > > I need to stream a video over a bluetooth channel but I need to do so > using > > all the available bluetooth bandwidth (approx. 741kbps). It doesn't > matter if > > packets arrive out of order or some of them were lost. But I need to > fill > > the available bandwidth with the video stream. > > > > I'm using testOnDemandRTSPServer and openRTSP for serving and consuming > the > > stream. The sample video I'm using is an mpeg2 video, which consumes > around > > 639kbps of bandwidth. Is there any way of configuring the stream flow > for > > consuming more bandwidth? > > I think the above-mentioned 741kbps is a total bitrate, including protocol > headers. If you calculate all the protocol overheads, I think the 639kbps is > reasonable. > > Levente > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20060828/557e54c6/attachment-0001.html From alexr at vigilanttechnology.com Tue Aug 29 01:58:05 2006 From: alexr at vigilanttechnology.com (Alex Rier) Date: Tue, 29 Aug 2006 10:58:05 +0200 Subject: [Live-devel] H264 RTP Streamer Message-ID: <683BC86C0162454BAC43B789A7FB19642E85C6@herlios.adyoron.com> Hi, I need an H264 RTP streamer along the lines of testH264OnDemandRTSPStreamer. Have you already implemented it? If yes, can I have it? If not, when can it be implemented? Thanks, Alex This mail passed through VIGILANT TECHNOLOGY Mail-SeCure. ************************************************************************************ This footnote confirms that this email message has been scanned by PineApp Mail-SeCure for the presence of malicious code, vandals & computer viruses.
************************************************************************************ From weiyutao36 at 163.com Tue Aug 29 05:17:44 2006 From: weiyutao36 at 163.com (weiyutao36) Date: Tue, 29 Aug 2006 20:17:44 +0800 (CST) Subject: [Live-devel] How to transform the presentationTime to Normal Play Time(NPT) ? Message-ID: <44F43068.000078.25229@bj163app20.163.com> Hi Ross, In your last letter, you gave me some advice and I followed that. In liveMedia/FileSink.cpp, in the function "FileSink::addData()", I could get the presentationTime of the latest frame/packet. But when I printed out the "presentationTime", I found that it had the following form: latest_NPT.tv_sec = 1156840357, presentationTime.tv_sec = 1156840357 I don't know what this means---the number is so big! And furthermore, when I want to use the "playMediaSession(MediaSession &session, start, end, scale)" function, I know there's something more to be done. The playMediaSession() function requires a "start" parameter of "float" type (am I right?) and the parameter should be in the range of the media file, such as "0-7.845000", which is the range of the media file mpg1video.mpg. So, my question is: how to transform the presentationTime to the NPT of the media? Or what is the relationship between the presentationTime and the NPT? Maybe I missed something. Any opinion is appreciated. Thank you. -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20060829/56244729/attachment.html From ori_yq at yahoo.com Tue Aug 29 07:37:16 2006 From: ori_yq at yahoo.com (JC) Date: Tue, 29 Aug 2006 07:37:16 -0700 (PDT) Subject: [Live-devel] streaming live MPEG4 with C code Message-ID: <20060829143716.37697.qmail@web56508.mail.re3.yahoo.com> Hi All, I am trying to stream from an MPEG4 encoder. I was able to save the MPEG4 stream into a test.m4v file and stream it out via testOnDemandRTSPServer. Now, I want to stream directly from the encoder.
First, I ran into the problem that the encoder libraries are written in C, while the live media libraries are written in C++. In the C code, I get each frame through a callback function. My question is: where should I start in order to feed live encoded MPEG4 frames to the live media library so that it can stream the live video out? I am looking into the code right now, but I am new to it, so any guidance would be a great help. Thanks ahead! JC -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20060829/a3e1b943/attachment.html From yossydr at gmail.com Tue Aug 29 13:10:41 2006 From: yossydr at gmail.com (Yossy) Date: Tue, 29 Aug 2006 23:10:41 +0300 Subject: [Live-devel] real time streaming Message-ID: <871c63c20608291310l7f98d017hfaaf496618929ac3@mail.gmail.com> I am using the Live Media library for sending real-time video, based on "testMPEG4VideoStreamer". When I play the stream I get a delay, maybe because the program packs several frames and then sends them. I need to send every frame immediately, and not wait for other frames. What should I do? From finlayson at live555.com Tue Aug 29 22:46:48 2006 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 29 Aug 2006 22:46:48 -0700 Subject: [Live-devel] H264 RTP Streamer In-Reply-To: <683BC86C0162454BAC43B789A7FB19642E85C6@herlios.adyoron.com> References: <683BC86C0162454BAC43B789A7FB19642E85C6@herlios.adyoron.com> Message-ID: <7.0.1.0.1.20060829224517.01e6b748@live555.com> >I need an H264 RTP streamer along the lines of testH264OnDemandRTSPStreamer. >Have you already implemented it? No - in part because there's no clear definition of what an H.264 video Elementary Stream file would look like. However, the "LIVE555 Streaming Media" code supports both sending and receiving H.264/RTP streams. Ross Finlayson Live Networks, Inc.
(LIVE555.COM) From finlayson at live555.com Tue Aug 29 22:48:37 2006 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 29 Aug 2006 22:48:37 -0700 Subject: [Live-devel] How to transform the presentationTime to Normal Play Time (NPT)? In-Reply-To: <44F43068.000078.25229@bj163app20.163.com> References: <44F43068.000078.25229@bj163app20.163.com> Message-ID: <7.0.1.0.1.20060829224751.01faaeb0@live555.com> >But when I printed out the "presentationTime", I found that it had the following form: > latest_NPT.tv_sec = 1156840357, presentationTime.tv_sec = 1156840357 >I don't know what this means---the number is so big! It is in standard Unix timestamp format - the number of seconds since January 1, 1970. Ross Finlayson Live Networks, Inc. (LIVE555.COM) From tilakadhya at rediffmail.com Tue Aug 29 22:57:11 2006 From: tilakadhya at rediffmail.com (Tilak Adhya) Date: 30 Aug 2006 05:57:11 -0000 Subject: [Live-devel] Streaming --- switching of files Message-ID: <20060830055711.16591.qmail@webmail71.rediffmail.com> Hi, I have to switch between audio files at the time of streaming. For this purpose, if I change the underlying source file (as you have suggested previously) then this can be achieved; but I want to know: if I add another subsession with another audio file (in my case an AAC file), is it possible to switch between the different subsessions at the time of streaming? More precisely, I want to achieve the following --- after playing "n1" frames from the first subsession, shift to the second subsession, and come back to the first subsession after the second subsession has finished playing completely --- using different subsessions. If it is possible, how shall I proceed? Thanks in advance... Tilak -------------- next part -------------- An HTML attachment was scrubbed...
URL: http://lists.live555.com/pipermail/live-devel/attachments/20060829/5a230d63/attachment.html From weiyutao36 at 163.com Wed Aug 30 01:24:03 2006 From: weiyutao36 at 163.com (weiyutao36) Date: Wed, 30 Aug 2006 16:24:03 +0800 (CST) Subject: [Live-devel] about presentationTime and NPT Message-ID: <44F54B23.000050.13199@bj163app26.163.com> Hi, Ross Thank you very much, but I am still confused by my second question---perhaps you missed it: The playMediaSession() function requires a "start" parameter of float type (am I right?), and the parameter should be in the range of the media file, such as "0-7.845000", which is the range of the media file mpg1video.mpg. So, my question is: how do I transform the presentationTime to the NPT of the media? Or what is the relationship between the presentationTime and the NPT? Thank you. -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20060830/cfc86d57/attachment.html From lkovacs at xperts.hu Wed Aug 30 02:12:49 2006 From: lkovacs at xperts.hu (Levente Kovacs) Date: Wed, 30 Aug 2006 11:12:49 +0200 Subject: [Live-devel] streaming live MPEG4 with C code In-Reply-To: <20060829143716.37697.qmail@web56508.mail.re3.yahoo.com> References: <20060829143716.37697.qmail@web56508.mail.re3.yahoo.com> Message-ID: <20060830111249.33580bc6.lkovacs@xperts.hu> On Tue, 29 Aug 2006 07:37:16 -0700 (PDT) JC wrote: > I am trying to stream from an MPEG4 encoder. I was able to save the MPEG4 stream into a test.m4v file and stream it out via testOnDemandServer. Now, I want to stream directly from the encoder. First, I ran into the problem that the encoder libraries are written in C, while the live media libraries are written in C++. In the C code, I get each frame through a callback function. My question is: where should I start in order to feed live encoded MPEG4 frames to the live media library so that it can stream the live video out.
I am looking into the code right now, but I am new to it, so any guidance would be a great help. Thanks ahead! This is more a general programming question than a LIVE-specific one. I faced the same problem. What you have to do is write your main program in C++; you can then include C libraries in your C++ code. For more information, contact me off-list. Levente From smartbrisk at 163.com Wed Aug 30 04:01:29 2006 From: smartbrisk at 163.com (Zhang Jinfeng) Date: Wed, 30 Aug 2006 19:01:29 +0800 (CST) Subject: [Live-devel] Question about streaming H.264 over Darwin Streaming Server Message-ID: <44F57009.000126.06339@bj163app39.163.com> Hi, all, I have implemented an H.264 streamer; it can stream H.264 RTP packets to a specified IP address and port. I used the streamer to create an SDP file and placed it in Darwin's movies folder. I can use the QuickTime player to play the stream by directly opening the SDP file. However, I cannot use the VLC player or the QuickTime player to play the stream by opening the RTSP URL. Could you tell me the reason? Thank you very much! Jinfeng Zhang -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20060830/e7b4c179/attachment-0001.html From finlayson at live555.com Wed Aug 30 04:17:22 2006 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 30 Aug 2006 04:17:22 -0700 Subject: [Live-devel] Question about streaming H.264 over Darwin Streaming Server In-Reply-To: <44F57009.000126.06339@bj163app39.163.com> References: <44F57009.000126.06339@bj163app39.163.com> Message-ID: <7.0.1.0.1.20060830041543.01fe8bc0@live555.com> > I have implemented an H.264 streamer; it can stream H.264 RTP packets to a specified IP address and port. Did you use our software ("LIVE555 Streaming Media") to implement your 'streamer' application? If not, then your question is not appropriate for this mailing list. Ross Finlayson Live Networks, Inc.
(LIVE555.COM) From finlayson at live555.com Wed Aug 30 08:44:14 2006 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 30 Aug 2006 08:44:14 -0700 Subject: [Live-devel] real time streaming In-Reply-To: <871c63c20608300647j1cbeba90p9b8cd7aab9f8607c@mail.gmail.com> References: <871c63c20608300647j1cbeba90p9b8cd7aab9f8607c@mail.gmail.com> Message-ID: <7.0.1.0.1.20060830083733.0203cd48@live555.com> At 06:47 AM 8/30/2006, you wrote: >I am using the Live Media library for sending real-time video and audio, based on the test programs (mpeg4 and wav). >When I play the stream I get: >1. Latency of about 600 ms until the video plays - maybe because the program packs several video frames before sending them. No, that doesn't happen for video. When streaming to a media player, any significant latency that you see almost always occurs inside the receiving media player. There is little latency in the "LIVE555 Streaming Media" code for transmitting RTP. >2. I get a delay between the video and the audio streams (lip-sync). Audio/video synchronization should work, *provided that*: - You implement RTCP, by creating a "RTCPInstance" object for both the audio and video streams, and - The presentation timestamps ("fPresentationTime") generated at the server end (for both audio and video) are accurate, and synchronized to 'wall clock' time (e.g., as returned by "gettimeofday()"). Because you've developed your own code, I probably can't help you more than this. But Remember, You Have Complete Source Code. Ross Finlayson Live Networks, Inc.
(LIVE555.COM) From finlayson at live555.com Wed Aug 30 08:51:49 2006 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 30 Aug 2006 08:51:49 -0700 Subject: [Live-devel] about presentationTime and NPT In-Reply-To: <44F54B23.000050.13199@bj163app26.163.com> References: <44F54B23.000050.13199@bj163app26.163.com> Message-ID: <7.0.1.0.1.20060830085014.01fe8bc0@live555.com> > The playMediaSession() function requires a "start" parameter of float type (am I right?), and the parameter should be in the range of the media file, such as "0-7.845000", which is the range of the media file mpg1video.mpg. > So, my question is: how do I transform the presentationTime to the NPT of the media? Or what is the relationship between the presentationTime and the NPT? The first presentation time you receive will correspond to NPT 0. After that, you can do the math... Ross Finlayson Live Networks, Inc. (LIVE555.COM) From finlayson at live555.com Wed Aug 30 08:52:13 2006 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 30 Aug 2006 08:52:13 -0700 Subject: [Live-devel] Streaming --- switching of files In-Reply-To: <20060830055711.16591.qmail@webmail71.rediffmail.com> References: <20060830055711.16591.qmail@webmail71.rediffmail.com> Message-ID: <7.0.1.0.1.20060830085156.020380e8@live555.com> >I have to switch between audio files at the time of streaming. For this purpose, if I change the underlying source file (as you have suggested previously) then this can be achieved; but I want to know: if I add another subsession with another audio file (in my case an AAC file), is it possible to switch between the different subsessions at the time of streaming? No. Ross Finlayson Live Networks, Inc. (LIVE555.COM)