From guiluge at gmail.com Mon Sep 1 00:09:11 2008 From: guiluge at gmail.com (Guillaume Ferry) Date: Mon, 1 Sep 2008 09:09:11 +0200 Subject: [Live-devel] RTSP Wav Receiver Message-ID: <425f13530809010009i1c73d03g5fe2e0b6e8ce77de@mail.gmail.com> Hi there, I'm interested in building a WAV receiver from an RTSP URL with Live555. I'm not very familiar with this library, and I don't know where to start. Well: I noticed you can easily grab RTSP frames and dump them to a raw file with openRTSP, but I'd like to have those frames copied into a variable, so I can use them directly in my code. Which class(es) should I use to achieve this? Thanks in advance, Regards. Guillaume Ferry. -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Sep 1 00:17:34 2008 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 1 Sep 2008 00:17:34 -0700 Subject: [Live-devel] RTSP Wav Receiver In-Reply-To: <425f13530809010009i1c73d03g5fe2e0b6e8ce77de@mail.gmail.com> References: <425f13530809010009i1c73d03g5fe2e0b6e8ce77de@mail.gmail.com> Message-ID: >I'm interested in building a WAV receiver from an RTSP URL with Live555. > >I'm not very familiar with this library, and I don't know where to >start. Well: I noticed you can easily grab RTSP frames and dump them to >a raw file with openRTSP, but I'd like to have those frames copied >into a variable, so I can use them directly in my code. > >Which class(es) should I use to achieve this? You will need to write your own "MediaSink" subclass, and use this instead of "FileSink". -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From guiluge at gmail.com Mon Sep 1 00:41:08 2008 From: guiluge at gmail.com (Guillaume Ferry) Date: Mon, 1 Sep 2008 09:41:08 +0200 Subject: [Live-devel] RTSP Wav Receiver Message-ID: <425f13530809010041m3374c608m271eb0fb92303015@mail.gmail.com> Hi, Thanks for this quick reply. I'm a little confused with this class.
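Ross's suggestion above - a custom "MediaSink" subclass used in place of "FileSink" - can be sketched roughly as follows. This is only an outline modeled on the existing "FileSink.cpp"; the "MemorySink" name and the buffer handling are invented for illustration, and it is untested against any particular liveMedia version:

```cpp
// Sketch of a MediaSink subclass that copies each incoming frame into a
// memory buffer instead of writing it to a file (compare "FileSink.cpp").
// Untested outline; names other than the liveMedia classes are invented.
class MemorySink: public MediaSink {
public:
  static MemorySink* createNew(UsageEnvironment& env, unsigned bufferSize = 100000) {
    return new MemorySink(env, bufferSize);
  }

protected:
  MemorySink(UsageEnvironment& env, unsigned bufferSize)
    : MediaSink(env), fBufferSize(bufferSize) {
    fBuffer = new unsigned char[bufferSize];
  }
  virtual ~MemorySink() { delete[] fBuffer; }

  // Static callback with the signature that getNextFrame() expects:
  static void afterGettingFrame(void* clientData, unsigned frameSize,
                                unsigned numTruncatedBytes,
                                struct timeval presentationTime,
                                unsigned durationInMicroseconds) {
    MemorySink* sink = (MemorySink*)clientData;
    // ... at this point sink->fBuffer holds "frameSize" bytes of data;
    // hand it to the rest of the application here ...
    sink->continuePlaying(); // then ask the source for the next frame
  }

  virtual Boolean continuePlaying() {
    if (fSource == NULL) return False;
    fSource->getNextFrame(fBuffer, fBufferSize,
                          afterGettingFrame, this,
                          onSourceClosure, this);
    return True;
  }

private:
  unsigned char* fBuffer;
  unsigned fBufferSize;
};
```

The key pieces are "continuePlaying()", which requests the next frame from the upstream source, and the "afterGettingFrame()" callback, inside which the buffer holds the received frame.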
Do you have some hints, or better, a minimal example :-D ? Thanks again, Guillaume. PS: is there somewhere a good tutorial to start with? -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Sep 1 00:46:17 2008 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 1 Sep 2008 00:46:17 -0700 Subject: [Live-devel] RTSP Wav Receiver In-Reply-To: <425f13530809010041m3374c608m271eb0fb92303015@mail.gmail.com> References: <425f13530809010041m3374c608m271eb0fb92303015@mail.gmail.com> Message-ID: >Hi, > >Thanks for this quick reply. > >I'm a little confused with this class. > >Do you have some hints, or better, a minimal example :-D ? Look at the existing "FileSink" class for hints. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From guiluge at gmail.com Mon Sep 1 01:48:12 2008 From: guiluge at gmail.com (Guillaume Ferry) Date: Mon, 1 Sep 2008 10:48:12 +0200 Subject: [Live-devel] RTSP Wav Receiver Message-ID: <425f13530809010148p2c760c39s65414911bdf247b7@mail.gmail.com> Ok, I made a simple subclass of FileSink, and I tested it with openRTSP. The sink initializes well, but it doesn't 'see' any frames. When I try to enable getNextFrame in my code, I get a conversion error. I must be missing something... :) -------------- next part -------------- An HTML attachment was scrubbed... URL: From guiluge at gmail.com Mon Sep 1 02:01:29 2008 From: guiluge at gmail.com (Guillaume Ferry) Date: Mon, 1 Sep 2008 11:01:29 +0200 Subject: [Live-devel] RTSP Wav Receiver Message-ID: <425f13530809010201r5150e8b7v6c31b903e6486a64@mail.gmail.com> Ok, I found out - it was a silly syntax error :) Now I receive frames, but strangely, frameSize is set to 0. Do I need to specify some input buffer sizes, or something else? Thanks Guillaume. -------------- next part -------------- An HTML attachment was scrubbed...
URL: From guiluge at gmail.com Mon Sep 1 04:57:55 2008 From: guiluge at gmail.com (Guillaume Ferry) Date: Mon, 1 Sep 2008 13:57:55 +0200 Subject: [Live-devel] RTSP Wav Receiver Message-ID: <425f13530809010457k5757369au14eb765dc05b1d58@mail.gmail.com> Ok, Sorry for all those previous posts, I'm almost there :) Basing my code largely on openRTSP, I have a full RTSP client that (almost) works! My last issue is that I can't read the subsession source ("subsession->readSource()" returns NULL). I must be missing something; any idea? Guillaume. -------------- next part -------------- An HTML attachment was scrubbed... URL: From mrbrain at email.it Mon Sep 1 09:43:12 2008 From: mrbrain at email.it (MrBrain) Date: Mon, 1 Sep 2008 18:43:12 +0200 Subject: [Live-devel] Lost Packets Ratio Message-ID: <138a0a35342b6fb4fd6b4932cbf51ee5@79.19.255.118> Hello, I am using testMP3Streamer and testMP3Receiver. Is there a way to know how many packets (RTP or UDP) have been lost? I need to know it while data is still being transmitted, not at the end. Thanks in advance. Marco Dell'Anna. -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Sep 1 11:03:37 2008 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 1 Sep 2008 11:03:37 -0700 Subject: [Live-devel] Lost Packets Ratio In-Reply-To: <138a0a35342b6fb4fd6b4932cbf51ee5@79.19.255.118> References: <138a0a35342b6fb4fd6b4932cbf51ee5@79.19.255.118> Message-ID: >I am using testMP3Streamer and testMP3Receiver. >Is there a way to know how many packets (RTP or UDP) have been lost? Yes - you can look at the "RTPReceptionStats" for the "RTPSource" object.
Note how "openRTSP" implements its "-Q" option, using the code for "qosMeasurementRecord::periodicQOSMeasurement()" (in "testProgs/playCommon.cpp"). -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From mike at subfocal.net Tue Sep 2 12:15:21 2008 From: mike at subfocal.net (Mike Mueller) Date: Tue, 2 Sep 2008 15:15:21 -0400 Subject: [Live-devel] openRTSP producing bad video files In-Reply-To: References: <20080828203112.GM20854@samus.subfocal.net> <20080828232435.GO20854@samus.subfocal.net> <48B7A429.1080007@smartt.com> Message-ID: <20080902191521.GR20854@samus.subfocal.net> On Fri, Aug 29, 2008 at 01:16:28PM -0700, Ross Finlayson wrote: >> Something has changed. > > The code that writes '.mov' or '.mp4' files is "QuickTimeFileSink.cpp". > In the past year or so, there have been no significant changes to this > file, apart from adding support for H.264 video (which is not relevant > for your stream). I understand your point, Ross. However, on the off chance that an older version *does* work for me, is there a way for me to get one? Is there a Subversion repository, or a place to get older versions? Thanks, Mike PS - Ron, thanks for the detailed writeup. I'm going to play with your suggestions today. -- Mike Mueller mike at subfocal.net From finlayson at live555.com Tue Sep 2 13:43:54 2008 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 2 Sep 2008 13:43:54 -0700 Subject: [Live-devel] openRTSP producing bad video files In-Reply-To: <20080902191521.GR20854@samus.subfocal.net> References: <20080828203112.GM20854@samus.subfocal.net> <20080828232435.GO20854@samus.subfocal.net> <48B7A429.1080007@smartt.com> <20080902191521.GR20854@samus.subfocal.net> Message-ID: We don't make available old versions of the code, and offer absolutely no support for old versions. Everyone should work with the newest version of the code. -- Ross Finlayson Live Networks, Inc.
http://www.live555.com/ From rmcouat at smartt.com Tue Sep 2 17:47:02 2008 From: rmcouat at smartt.com (Ron McOuat) Date: Tue, 02 Sep 2008 17:47:02 -0700 Subject: [Live-devel] openRTSP producing bad video files In-Reply-To: References: <20080828203112.GM20854@samus.subfocal.net> <20080828232435.GO20854@samus.subfocal.net> <48B7A429.1080007@smartt.com> <20080902191521.GR20854@samus.subfocal.net> Message-ID: <48BDDE86.9000709@smartt.com> Ross Finlayson wrote: > We don't make available old versions of the code, and offer absolutely > no support for old versions. Everyone should work with the newest > version of the code. I am still working to find the cause of what I am observing and have not fully identified it yet; here is what I have found so far, in case that is enough. The Axis camera is set for a 320x240 image size at 7 frames/sec. The GOV setting is 8, which I believe means an I-frame followed by 7 P-frames, after which the cycle repeats. I tested several versions of live555 and was only able to reproduce this in the current version, 2008-07-25. If I play the resulting movie back in QuickTime Player and step it one frame at a time, the seconds display embedded in the image advances one second for each frame, which is not right. After a few seconds of wall-clock playback time, or 10-12 seconds of video frame time, the playback becomes normal with respect to watching time advance. I also see the full 7 frames for each second in the image while single-stepping in the later part of the video, instead of just one frame as at the beginning. VLC also plays the recorded files the same way. When recording from an Axis camera it is like joining a live broadcast in progress: the first frames could be P-frames, because on playback I have observed a green screen until the first I-frame comes in. I mention this in case it is relevant.
The annoying part of finding this is that it doesn't always happen; on average about 50% of the recordings have this high-speed playback at the front. It could be that the initial part is very short in some cases and difficult to observe. I tested 10 times with each version to get some level of confidence that the fast video was not on the front of recordings made by earlier versions. The camera and recording system (MacBook Pro, 2.33 GHz) are on the same 100 Mbit/sec LAN. Looking at a diff of the source against the previous version, the only thing I can see that remotely looks like it might affect what is recorded is that the class MultiFramedRTPSource was changed to treat the first packet as having packet loss. My next step is to patch that change out and see whether the recorded video files still act the same or clear up. Is there a utility out there that will scan over a recorded stream in a file and show the frames by type, etc.? I could record in a format other than QuickTime and use a different playback program. I got Dumpster from the Apple site, but it doesn't get into the frame details. I suppose I could write one, but other work has higher priority. Any comments or suggestions welcome. Ron From finlayson at live555.com Tue Sep 2 18:28:41 2008 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 2 Sep 2008 18:28:41 -0700 Subject: [Live-devel] openRTSP producing bad video files In-Reply-To: <48BDDE86.9000709@smartt.com> References: <20080828203112.GM20854@samus.subfocal.net> <20080828232435.GO20854@samus.subfocal.net> <48B7A429.1080007@smartt.com> <20080902191521.GR20854@samus.subfocal.net> <48BDDE86.9000709@smartt.com> Message-ID: I suggest that you ask on a QuickTime mailing list why your recorded file is not (always?) playing properly. (You may need to provide a link to an example file.) -- Ross Finlayson Live Networks, Inc.
http://www.live555.com/ From rmcouat at smartt.com Tue Sep 2 19:45:46 2008 From: rmcouat at smartt.com (Ron McOuat) Date: Tue, 02 Sep 2008 19:45:46 -0700 Subject: [Live-devel] openRTSP producing bad video files In-Reply-To: References: <20080828203112.GM20854@samus.subfocal.net> <20080828232435.GO20854@samus.subfocal.net> <48B7A429.1080007@smartt.com> <20080902191521.GR20854@samus.subfocal.net> <48BDDE86.9000709@smartt.com> Message-ID: <48BDFA5A.5070301@smartt.com> Well actually I thought I explained fairly clearly that the file is not recorded properly using openRTSP. QuickTime is playing fine on everything else I work with. The burden of proof still remains with me so I am still working on it. Ross Finlayson wrote: > I suggest that you ask on a QuickTime mailing list why your recorded > file is not (always?) playing properly. (You may need to provide a > link to an example file.) From finlayson at live555.com Wed Sep 3 01:34:36 2008 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 3 Sep 2008 01:34:36 -0700 Subject: [Live-devel] openRTSP producing bad video files In-Reply-To: <48BDFA5A.5070301@smartt.com> References: <20080828203112.GM20854@samus.subfocal.net> <20080828232435.GO20854@samus.subfocal.net> <48B7A429.1080007@smartt.com> <20080902191521.GR20854@samus.subfocal.net> <48BDDE86.9000709@smartt.com> <48BDFA5A.5070301@smartt.com> Message-ID: >Well actually I thought I explained fairly clearly that the file is >not recorded properly using openRTSP. Right, but it would be nice to know exactly why the file doesn't play properly. You might need a QuickTime expert to tell you why. -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ From leo at alaxarxa.net Wed Sep 3 12:11:45 2008 From: leo at alaxarxa.net (Leopold Palomo-Avellaneda) Date: Wed, 3 Sep 2008 21:11:45 +0200 Subject: [Live-devel] Obtaining frames Message-ID: <200809032111.46158.leo@alaxarxa.net> Hi, probably many people have asked this before, but I'm having a conceptual problem with an example. I'm trying to develop an RTSP client. I have looked at the openRTSP example and the mplayer part (as suggested on the web site). My question is about: env->taskScheduler().doEventLoop(); So, if I would like to obtain a frame from another thread using the subsession, should I stop the event loop, or can I obtain the frame by calling Subsession->readSource()->getNextFrame(...)? Regards, Leo From finlayson at live555.com Wed Sep 3 12:27:30 2008 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 3 Sep 2008 12:27:30 -0700 Subject: [Live-devel] Obtaining frames In-Reply-To: <200809032111.46158.leo@alaxarxa.net> References: <200809032111.46158.leo@alaxarxa.net> Message-ID: >I'm trying to develop an RTSP client. I have looked at the openRTSP example >and the mplayer part (as suggested on the web site). > >My question is about: >env->taskScheduler().doEventLoop(); > >So, if I would like to obtain a frame from another thread using the subsession, >should I stop the event loop, or can I obtain the frame by calling > >Subsession->readSource()->getNextFrame(...)? Before entering the event loop, you call "startPlaying()" on each 'sink' object. This starts things running. You then call "doEventLoop()" to enter the event loop. You do *not* return from this function (except to exit the application). I.e., all processing of incoming data happens within the event loop. Each sink object processes incoming data in the function that it passed as a parameter to "getNextFrame()". (This all happens within the event loop.) -- Ross Finlayson Live Networks, Inc.
http://www.live555.com/ From leo at alaxarxa.net Wed Sep 3 13:26:58 2008 From: leo at alaxarxa.net (Leopold Palomo Avellaneda) Date: Wed, 3 Sep 2008 22:26:58 +0200 Subject: [Live-devel] Obtaining frames In-Reply-To: References: <200809032111.46158.leo@alaxarxa.net> Message-ID: <200809032227.03593.leo@alaxarxa.net> On Wednesday 3 September 2008, Ross Finlayson wrote: > >I'm trying to develop an RTSP client. I have looked at the openRTSP example > > and the mplayer part (as suggested on the web site). > > > >My question is about: > >env->taskScheduler().doEventLoop(); > > > >So, if I would like to obtain a frame from another thread using the > > subsession, should I stop the event loop, or can I obtain the frame > > by calling > > > >Subsession->readSource()->getNextFrame(...)? > > Before entering the event loop, you call "startPlaying()" on each > 'sink' object. This starts things running. OK. > You then call "doEventLoop()" to enter the event loop. You do *not* > return from this function (except to exit the application). I.e., > all processing of incoming data happens within the event loop. I know, but can't I create and call this function in one thread, which will block, and make calls to it from another thread? > Each sink object processes incoming data in the function that it > passed as a parameter to "getNextFrame()". (This all happens within > the event loop.) I think this is the classic question. I would like to develop a GUI using the lib - the typical RTSP GUI client. I can encapsulate all the live555 code in one class, where the init parameters define the env, as in the openRTSP example. So I create one thread with this class (where I run doEventLoop, which doesn't return), and from the other threads I obtain frames periodically.
The main doubt is whether I can do that and call Subsession->readSource()->getNextFrame(...) without problems, or whether I should stop the event loop, call getNextFrame, and then re-engage the loop; or should I derive from the taskScheduler class, re-implement doEventLoop, extract the data from the internal structures, and put it in some global/accessible variables in the class? Regards, Leo -- -- Linux User 152692 PGP: 0xF944807E Catalonia -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 197 bytes Desc: This is a digitally signed message part. URL: From erik at hovland.org Wed Sep 3 13:49:16 2008 From: erik at hovland.org (Erik Hovland) Date: Wed, 3 Sep 2008 13:49:16 -0700 Subject: [Live-devel] [PATCH] rebased patches In-Reply-To: <20080724023039.GB7393@hovland.org> References: <20080724023039.GB7393@hovland.org> Message-ID: <20080903204916.GA21250@hovland.org> On Wed, Jul 23, 2008 at 07:30:39PM -0700, Erik Hovland wrote: > Resubmission of both the uninitialized ctor and compiler warning > patches. I recently made some updates to this patch, so I am sending it as a reply to my last submission. It is compressed to get through the attachment filter. The compiler-warning patch is not included, as I have not changed it since the last submission. E -- Erik Hovland mail: erik at hovland.org web: http://hovland.org/ PGP/GPG public key available on request -------------- next part -------------- A non-text attachment was scrubbed...
Name: liveMedia-uninit-in-ctor.patch.bz2 Type: application/octet-stream Size: 9146 bytes Desc: not available URL: From finlayson at live555.com Wed Sep 3 14:45:11 2008 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 3 Sep 2008 14:45:11 -0700 Subject: [Live-devel] Obtaining frames In-Reply-To: <200809032227.03593.leo@alaxarxa.net> References: <200809032111.46158.leo@alaxarxa.net> <200809032227.03593.leo@alaxarxa.net> Message-ID: > > You then call "doEventLoop()" to enter the event loop. You do *not* >> return from this function (except to exit the application). I.e., >> all processing of incoming data happens within the event loop. > >I know, but can't I create and call this function in one thread, which will >block, and make calls to it from another thread? No, you can't! Please read the FAQ. The event loop, and all event handlers, must be run within a single thread. >The main doubt is whether I can do that and call >Subsession->readSource()->getNextFrame(...) > >without problems, or whether I should stop the event loop, call >getNextFrame, and then re-engage the loop? or, No. You don't stop the event loop. The call to "getNextFrame()" is done within your sink object's "continuePlaying()" virtual function (i.e., in the "FramedSink" subclass that you will write to do decoding/rendering). This all happens within the event loop. >should I derive from the taskScheduler class and re-implement doEventLoop, If your GUI also works using an event loop, then you might wish to write your own subclass of "TaskScheduler" so that the event loop handles both incoming network packets (as it does now) and GUI events. (That's not mandatory, though.) -- Ross Finlayson Live Networks, Inc.
http://www.live555.com/ From leo at alaxarxa.net Wed Sep 3 15:11:29 2008 From: leo at alaxarxa.net (Leopold Palomo Avellaneda) Date: Thu, 4 Sep 2008 00:11:29 +0200 Subject: [Live-devel] Obtaining frames In-Reply-To: References: <200809032111.46158.leo@alaxarxa.net> <200809032227.03593.leo@alaxarxa.net> Message-ID: <200809040011.29938.leo@alaxarxa.net> On Wednesday 3 September 2008, Ross Finlayson wrote: > > > You then call "doEventLoop()" to enter the event loop. You do *not* > >> > >> return from this function (except to exit the application). I.e., > >> all processing of incoming data happens within the event loop. > > > >I know, but can't I create and call this function in one thread, which > > will block, and make calls to it from another thread? > > No, you can't! Please read the FAQ. The event loop, and all event > handlers, must be run within a single thread. OK, I have read the FAQ, but it's not easy to understand it all - that's why I ask! :-) But if I have a GUI, I have another thread, and I have to ask this part of the code for data, I guess. > >The main doubt is whether I can do that and call > >Subsession->readSource()->getNextFrame(...) > > > >without problems, or whether I should stop the event loop, call > >getNextFrame, and then re-engage the loop? or, > > No. You don't stop the event loop. The call to "getNextFrame()" is > done within your sink object's "continuePlaying()" virtual function > (i.e., in the "FramedSink" subclass that you will write to do > decoding/rendering). This all happens within the event loop. > > >should I derive from the taskScheduler class and re-implement > > doEventLoop, > > If your GUI also works using an event loop, then you might wish to > write your own subclass of "TaskScheduler" so that the event loop > handles both incoming network packets (as it does now) and GUI > events. (That's not mandatory, though.) OK, I have to look into it more.
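The single-thread rule discussed above is usually reconciled with a GUI by running "doEventLoop()" in a dedicated thread and using its 'watch variable' parameter as the only cross-thread touch point: the live555 thread blocks inside the loop, and another thread does nothing more than set that variable to make the loop return. A minimal sketch, assuming the usual "BasicTaskScheduler" setup (the function names are invented for illustration):

```cpp
// Sketch: run the liveMedia event loop in its own thread and let another
// thread request shutdown via the watch variable. All liveMedia calls
// (startPlaying(), getNextFrame(), ...) stay on the event-loop thread;
// other threads only ever touch "eventLoopWatchVariable".
char eventLoopWatchVariable = 0;

void networkThreadMain(UsageEnvironment* env) {
  // ... create the RTSP client, subsessions, and sinks here, and call
  // startPlaying() on each sink ...
  env->taskScheduler().doEventLoop(&eventLoopWatchVariable);
  // returns only when eventLoopWatchVariable becomes nonzero
}

void guiThreadRequestsStop() {
  eventLoopWatchVariable = 1; // the only cross-thread communication
}
```

Any richer data exchange (e.g. handing received frames to the GUI) should likewise be arranged so that every liveMedia call stays on the event-loop thread.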
Someone published a tutorial about it [1], but the URL returns a 404. Thanks, Best regards, Leo [1] http://lists.live555.com/pipermail/live-devel/2007-June/007030.html -- -- Linux User 152692 PGP: 0xF944807E Catalonia -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 197 bytes Desc: This is a digitally signed message part. URL: From peeyushduttamishra at gmail.com Wed Sep 3 11:07:43 2008 From: peeyushduttamishra at gmail.com (Peeyush Mishra) Date: Wed, 3 Sep 2008 23:37:43 +0530 Subject: [Live-devel] RTSP authentication using vlc does not work Message-ID: Hi Ross, I want to implement authentication in our video multicast server. I enabled authentication at the server side, but when I fire a request from VLC (rtsp://server ip:port/track), after the Describe method the server sends "401 Authorization fails" and then communication fails. If I try through the openRTSP client with the "-u" flag then it works fine. What should we do to make it work with the VLC client? Thanks Peeyush Mishra -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Sep 3 16:46:29 2008 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 3 Sep 2008 16:46:29 -0700 Subject: [Live-devel] RTSP authentication using vlc does not work In-Reply-To: References: Message-ID: >I want to implement authentication in our video multicast server. >I enabled authentication at the server side, but when I fire a request from >VLC (rtsp://server ip:port/track), after the Describe method the server >sends "401 Authorization fails" and then communication fails. >If I try through the openRTSP client with the "-u" flag then it works fine. >What should we do to make it work with the VLC client? This is a question about VLC's user interface, and therefore belongs on the "vlc at videolan.org" mailing list - not this list.
(VLC is not our software (although it does use some of our software).) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From jimv at iwaynet.net Wed Sep 3 17:13:53 2008 From: jimv at iwaynet.net (Jim Vaigl) Date: Wed, 3 Sep 2008 20:13:53 -0400 Subject: [Live-devel] streaming mpeg-2 with Message-ID: <52DCB00010724BFE89C9FC911B735D01@PLANCK> Hi, I have a newbie question, just trying to get a nudge in the right direction. I'm pulling mpeg-2 data off of a card on a windows PC. I have written a console app that spools this to a file. What I'd like to do is instead of writing my input buffer continuously to a file, stream it to VLC in real-time. I've found the 'testMPEG2TransportStreamer' example, and can build it and use it to play my pre-recorded files in VLC successfully. I'm having a hard time trying to understand how I could modify the program to, instead of reading from the file, allow me to point to one of the buffers of mpeg2 data and say 'stream this now'. I've been looking at the class library, and it seems like I need to implement a framed source that gives the next spot in my buffer to the caller, but I'm not sure, and I can't tell if there's already a derived class that's well suited to what I want. Can you point me to which classes to look at, or a better example program if testMPEG2TransportStreamer isn't what I should be looking at for what I'm trying to do? Thanks for all you do! --Jim __________________________________________________________________ Jim Vaigl Rockbridge Software, LLC jimv at iwaynet.net -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Wed Sep 3 18:12:23 2008 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 3 Sep 2008 18:12:23 -0700 Subject: [Live-devel] streaming mpeg-2 with In-Reply-To: <52DCB00010724BFE89C9FC911B735D01@PLANCK> References: <52DCB00010724BFE89C9FC911B735D01@PLANCK> Message-ID: >I've found the 'testMPEG2TransportStreamer' example, and can build >it and use it to play my pre-recorded files in VLC successfully. >I'm having a hard time trying to understand how I could modify the >program to, instead of reading from the file, allow me to point to >one of the buffers of mpeg2 data and say 'stream this now'. See http://www.live555.com/liveMedia/faq.html#liveInput -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From amit.yedidia at elbitsystems.com Thu Sep 4 01:24:24 2008 From: amit.yedidia at elbitsystems.com (Yedidia Amit) Date: Thu, 4 Sep 2008 11:24:24 +0300 Subject: [Live-devel] Capturing video from device Message-ID: Hi, I am implementing a live RTSP video streaming server which captures video from a live camera and encodes it in hardware; its output is NAL units. I noticed the DeviceSource template and have some questions about using it. 1. Should my flow be DeviceSource::doGetNextFrame -> H264Framer -> H264RTPSink? 2. Is the framer a source or a filter? 3. If my encoder's output is NAL units, what is the function of the framer? 4. What functionality should be given to DeviceSource::deliverFrame()? 5. If my camera has its own frame rate (PAL, 25 fps), how do I implement a different frame rate using the scheduler? Thank you, Regards, Amit Yedidia Email: amit.yedidia at elbitsystems.com ---------------------------------------------------------- The information in this e-mail transmission contains proprietary and business sensitive information. Unauthorized interception of this e-mail may constitute a violation of law.
If you are not the intended recipient, you are hereby notified that any review, dissemination, distribution or duplication of this communication is strictly prohibited. You are also asked to contact the sender by reply email and immediately destroy all copies of the original message. -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Sep 4 01:41:04 2008 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 4 Sep 2008 01:41:04 -0700 Subject: [Live-devel] Capturing video from device In-Reply-To: References: Message-ID: >I am implementing a live RTSP video streaming server which captures >video from a live camera and encodes it in hardware; its output is >NAL units. > >I noticed the DeviceSource template and have some questions about using it. > >1. Should my flow be > > DeviceSource::doGetNextFrame -> H264Framer -> H264RTPSink? Yes, sort of. The data flow is: Your source of NAL units -> Your subclass of H264VideoStreamFramer -> H264VideoRTPSink > >2. Is the framer a source or a filter? "H264VideoStreamFramer", and thus the subclass of "H264VideoStreamFramer" that you will write, is a filter. > >3. If my encoder's output is NAL units, what is the function of the framer? Two things. First, you must implement the virtual function "currentNALUnitEndsAccessUnit()", which returns True iff the current NAL unit is the end of a complete video frame. Second, you must set the "fPresentationTime" variable appropriately for each NAL unit. > >4. What functionality should be given to DeviceSource::deliverFrame()? Delivering a single NAL unit, in response to each call to "doGetNextFrame()". > >5. If my camera has its own frame rate (PAL, 25 fps), how do I implement a different frame rate using the scheduler? If your capture device delivers NAL units in 'real time', then you don't have to do anything special. The NAL units will get consumed at the rate at which they arrive. -- Ross Finlayson Live Networks, Inc.
http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From amit.yedidia at elbitsystems.com Thu Sep 4 02:46:22 2008 From: amit.yedidia at elbitsystems.com (Yedidia Amit) Date: Thu, 4 Sep 2008 12:46:22 +0300 Subject: [Live-devel] Capturing video from device In-Reply-To: Message-ID: Thanks Ross, It helps a lot. So deliverFrame() should handle what is needed to get the encoded data from my encoder, and doGetNextFrame() should call deliverFrame(). 1. Should deliverFrame() block until it acquires new frame data? 2. Who initiates the call: does the encoder notify the sink that there is new frame data, so the sink can call doGetNextFrame(), or does the sink, when finished handling one frame, immediately call doGetNextFrame() and block until it returns with a new frame? Thanks. Amit Yedidia -------------- next part -------------- An HTML attachment was scrubbed... URL: From giorglev at yahoo.com Thu Sep 4 02:53:30 2008 From: giorglev at yahoo.com (Giorgio Levantini) Date: Thu, 4 Sep 2008 02:53:30 -0700 (PDT) Subject: [Live-devel] how to manage unexpected network problems when receiving rtp packets Message-ID: <94209.66599.qm@web59706.mail.ac4.yahoo.com> Hi, I have to receive a stream from an Axis camera for several days, using an RTSPClient instance (similarly to openRTSP's implementation). Now, if for some reason a network problem occurs (for example, a LAN switch is shut down), and the normal situation is restored after some hours, is there a way to make the client automatically continue receiving packets as soon as the problem is solved, without adding any logic to manage that? Regards, Giorgio L -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Sep 4 07:46:02 2008 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 4 Sep 2008 07:46:02 -0700 Subject: [Live-devel] Capturing video from device In-Reply-To: References: Message-ID: >So deliverFrame() should handle what is needed to get the >encoded data from my encoder, >and doGetNextFrame() should call deliverFrame(). >1. Should deliverFrame() block until it acquires new frame data? No, neither "doGetNextFrame()" nor "deliverFrame()" should block (because that would prevent other events from being handled). >2. Who initiates the call: >does the encoder notify the sink that there is new frame data, so the sink >can call doGetNextFrame(), >or >does the sink, when finished handling one frame, immediately call >doGetNextFrame() and block until it returns with a new frame? The latter. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From bendeguy at gmail.com Thu Sep 4 11:10:53 2008 From: bendeguy at gmail.com (Miklos Szeles) Date: Thu, 4 Sep 2008 20:10:53 +0200 Subject: [Live-devel] H264 over RTP problem Message-ID: Hi, I'm trying to receive an H.264 video stream (unicast RTP) from a camera. It works well for a 320x240 video stream, but when I increase the resolution to 640x480 some frames can't be decoded. It looks like their end is missing. Is there any limitation on the size of the video frames? If there is, where/how can I change it?
Best Regards, Miklos From finlayson at live555.com Thu Sep 4 11:43:41 2008 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 4 Sep 2008 11:43:41 -0700 Subject: [Live-devel] H264 over RTP problem In-Reply-To: References: Message-ID: >I'm trying to receive an H264 video stream (unicast RTP) from a camera. It >works well for a 320x240 video stream, but when I increase the >resolution to 640x480, some frames can't be decoded. It looks like >their end is missing. >Is there any limitation on the size of the video frames? No. However, the *sink* object that you use to receive incoming H.264 NAL units needs to be big enough for each NAL unit. Perhaps you just need to use a larger buffer size? (You didn't say exactly how (i.e., with what 'sink') you are receiving the data.) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From liuxy66 at gmail.com Thu Sep 4 18:07:33 2008 From: liuxy66 at gmail.com (YiMing Liu) Date: Fri, 5 Sep 2008 09:07:33 +0800 Subject: [Live-devel] problem on how to assemble the audio and video file into a total file use openRTSP Message-ID: hi all: When I use openRTSP to download a .mpg file, openRTSP.exe rtsp://172.28.40.123/video.mpg (I use liveMediaServer as RTSP Server), I get two files, audio and video, separately, and now I want to get a single file with both audio and video. How can I do it? I read the openRTSP code, and I know that the MediaSubsession will save the audio and video data, but how do I save the audio and video in one file, like .mp4? Thanks in advance! YiMing Liu -------------- next part -------------- An HTML attachment was scrubbed...
URL: From finlayson at live555.com Thu Sep 4 18:44:15 2008 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 4 Sep 2008 18:44:15 -0700 Subject: [Live-devel] problem on how to assemble the audio and video file into a total file use openRTSP In-Reply-To: References: Message-ID: >When I use openRTSP to download a .mpg file, openRTSP.exe >rtsp://172.28.40.123/video.mpg (I >use liveMediaServer as RTSP Server) > >I got two files: audio and video file separately, and now I want to >get a total file with audio and video, how can I do it? There's currently no way to combine the audio and video streams into an MPEG Program Stream file (like the original). However, you *might* be able to output a ".mp4" or ".mov"-format file, or perhaps even a ".avi" format file, that contains the audio+video. See http://www.live555.com/openRTSP/#quicktime -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From liuxy66 at gmail.com Thu Sep 4 19:07:19 2008 From: liuxy66 at gmail.com (YiMing Liu) Date: Fri, 5 Sep 2008 10:07:19 +0800 Subject: [Live-devel] problem on how to assemble the audio and video file into a total file use openRTSP In-Reply-To: References: Message-ID: thanks, the original file is a .mpg file, and my openRTSP command is: openRTSP -4 >haha.mp4 rtsp://172.28.40.101/video.mpg but the resulting .mp4 also has no audio when I play it with Windows Media Player. Did I make some mistake? I want to save it to an mp4 file with video and audio. ~~ Many thanks. 2008/9/5 Ross Finlayson > When I use openRTSP to download a .mpg file, openRTSP.exe rtsp:// > 172.28.40.123/video.mpg (I use liveMediaServer as RTSP Server) > > > I got two files: audio and video file separately, and now I want to get a > total file with audio and video, how can I do it? > > > There's currently no way to combine the audio and video streams into an MPEG > Program Stream file (like the original).
However, you *might* be able to > output a ".mp4" or ".mov"-format file, or perhaps even a ".avi" format file, > that contains the audio+video. See > http://www.live555.com/openRTSP/#quicktime > > -- > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From liuxy66 at gmail.com Thu Sep 4 19:21:28 2008 From: liuxy66 at gmail.com (YiMing Liu) Date: Fri, 5 Sep 2008 10:21:28 +0800 Subject: [Live-devel] problem on how to assemble the audio and video file into a total file use openRTSP In-Reply-To: References: Message-ID: thanks, the original file is a .mpg file, and my openRTSP command is: openRTSP -4 >haha.mp4 rtsp://172.28.40.101/video.mpg but the resulting .mp4 also has no audio when I play it with Windows Media Player. Did I make some mistake? I want to save it to an mp4 file with video and audio. ~~ Many thanks. ////////////////////////////////////////////////////////////////////////////////////////////////////////////////////// ////////////////////////////////////////////////////////////////////////////////////////////////////////////////////// ////////////////////////////////////////////////////////////////////////////////////////////////////////////////////// D:\MyApp\HK\RTSPPOC\TestApp\openRTSP\debug>openRTSP.exe -4 >haha.mp4 rtsp://172.28.40.101/video.mpg Warning: The -q, -4 or -i option was used, but not -w. Assuming a video width of 240 pixels Warning: The -q, -4 or -i option was used, but not -h. Assuming a video height of 180 pixels Warning: The -q, -4 or -i option was used, but not -f.
Assuming a video frame rate of 15 frames-per-second Sending request: OPTIONS rtsp://172.28.40.101/video.mpg RTSP/1.0 CSeq: 1 User-Agent: sdkTest.exe (LIVE555 Streaming Media v2008.07.24) Received OPTIONS response: RTSP/1.0 200 OK CSeq: 1 Date: Fri, Sep 05 2008 01:56:23 GMT Public: OPTIONS, DESCRIBE, SETUP, TEARDOWN, PLAY, PAUSE Sending request: DESCRIBE rtsp://172.28.40.101/video.mpg RTSP/1.0 CSeq: 2 Accept: application/sdp User-Agent: sdkTest.exe (LIVE555 Streaming Media v2008.07.24) Received DESCRIBE response: RTSP/1.0 200 OK CSeq: 2 Date: Fri, Sep 05 2008 01:56:23 GMT Content-Base: rtsp://172.28.40.101/video.mpg/ Content-Type: application/sdp Content-Length: 455 Need to read 455 extra bytes Read 455 extra bytes: v=0 o=- 5631314726 1 IN IP4 172.28.40.101 s=MPEG-1 or 2 Program Stream, streamed by the LIVE555 Media Server i=video.mpg t=0 0 a=tool:LIVE555 Streaming Media v2008.02.08 a=type:broadcast a=control:* a=range:npt=0-170.097 a=x-qt-text-nam:MPEG-1 or 2 Program Stream, streamed by the LIVE555 Media Server a=x-qt-text-inf:video.mpg m=video 0 RTP/AVP 32 c=IN IP4 0.0.0.0 a=control:track1 m=audio 0 RTP/AVP 14 c=IN IP4 0.0.0.0 a=control:track2 Opened URL "rtsp://172.28.40.101/video.mpg", returning a SDP description: v=0 o=- 5631314726 1 IN IP4 172.28.40.101 s=MPEG-1 or 2 Program Stream, streamed by the LIVE555 Media Server i=video.mpg t=0 0 a=tool:LIVE555 Streaming Media v2008.02.08 a=type:broadcast a=control:* a=range:npt=0-170.097 a=x-qt-text-nam:MPEG-1 or 2 Program Stream, streamed by the LIVE555 Media Server a=x-qt-text-inf:video.mpg m=video 0 RTP/AVP 32 c=IN IP4 0.0.0.0 a=control:track1 m=audio 0 RTP/AVP 14 c=IN IP4 0.0.0.0 a=control:track2 Created receiver for "video/MPV" subsession (client ports 2272-2273) Created receiver for "audio/MPA" subsession (client ports 2274-2275) Sending request: SETUP rtsp://172.28.40.101/video.mpg/track1 RTSP/1.0 CSeq: 3 Transport: RTP/AVP;unicast;client_port=2272-2273 User-Agent: sdkTest.exe (LIVE555 Streaming Media
v2008.07.24) Received SETUP response: RTSP/1.0 200 OK CSeq: 3 Date: Fri, Sep 05 2008 01:56:28 GMT Transport: RTP/AVP;unicast;destination=172.28.40.100;source=172.28.40.101;client_port=2272-2273;server_port=6970-6971 Session: 1 Setup "video/MPV" subsession (client ports 2272-2273) Sending request: SETUP rtsp://172.28.40.101/video.mpg/track2 RTSP/1.0 CSeq: 4 Transport: RTP/AVP;unicast;client_port=2274-2275 Session: 1 User-Agent: sdkTest.exe (LIVE555 Streaming Media v2008.07.24) Received SETUP response: RTSP/1.0 200 OK CSeq: 4 Date: Fri, Sep 05 2008 01:56:28 GMT Transport: RTP/AVP;unicast;destination=172.28.40.100;source=172.28.40.101;client_port=2274-2275;server_port=6972-6973 Session: 1 Setup "audio/MPA" subsession (client ports 2274-2275) Warning: We don't implement a QuickTime Video Media Data Type for the "MPV" track, so we'll insert a dummy "????" Media Data Atom instead. A separate, codec-specific editing pass will be needed before this track can be played. Warning: We don't implement a QuickTime Audio Media Data Type for the "MPA" track, so we'll insert a dummy "????" Media Data Atom instead. A separate, codec-specific editing pass will be needed before this track can be played. Sending request: PLAY rtsp://172.28.40.101/video.mpg/ RTSP/1.0 CSeq: 5 Session: 1 Range: npt=0.000-170.097 User-Agent: sdkTest.exe (LIVE555 Streaming Media v2008.07.24) Received PLAY response: RTSP/1.0 200 OK CSeq: 5 Date: Fri, Sep 05 2008 01:56:28 GMT Range: npt=0.000-170.097 Session: 1 RTP-Info: url=rtsp://172.28.40.101/video.mpg/track1;seq=6291;rtptime=2147487686, url=rtsp://172.28.40.101/video.mpg/track2;seq=5005;rtptime=6938 Started playing session Receiving streamed data (for up to 175.097000 seconds)...
Sending request: TEARDOWN rtsp://172.28.40.101/video.mpg/ RTSP/1.0 CSeq: 6 Session: 1 User-Agent: sdkTest.exe (LIVE555 Streaming Media v2008.07.24) Received TEARDOWN response: RTSP/1.0 200 OK CSeq: 6 Date: Fri, Sep 05 2008 01:59:24 GMT ////////////////////////////////////////////////////////////////////////////////////////////////////////////////////// ////////////////////////////////////////////////////////////////////////////////////////////////////////////////////// ////////////////////////////////////////////////////////////////////////////////////////////////////////////////////// 2008/9/5 YiMing Liu > thanks, > > the original file is a .mpg file, and my openRTSP command is :openRTSP -4 > >haha.mp4 rtsp://172.28.40.101/video.mpg > but the test.mp4 also without audio when I use Windows Media Player play > it. > Am I made some mistake? > > I want to save it to a mp4 file with video and audio. ~~ > > Many thanks. > > > 2008/9/5 Ross Finlayson > >> When I useing openRTSP to down a mpg file, openRTSP.exe rtsp:// >> 172.28.40.123/video.mpg (I use liveMediaServer as RTSP Server) >> >> >> I got two files: audio and video file separately, and now I want to get a >> total file with audio and video, how can I do it? >> >> >> There's currently no way to combine the audio and video streams into a >> MPEG Program Stream file (like the original). However, you *might* be able >> to output a ".mp4" or ".mov"-format file, or perhaps even a ".avi" format >> file, that contains the audio+video. See >> http://www.live555.com/openRTSP/#quicktime >> >> -- >> >> >> Ross Finlayson >> Live Networks, Inc. >> http://www.live555.com/ >> >> _______________________________________________ >> live-devel mailing list >> live-devel at lists.live555.com >> http://lists.live555.com/mailman/listinfo/live-devel >> >> > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Thu Sep 4 23:16:28 2008 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 4 Sep 2008 23:16:28 -0700 Subject: [Live-devel] problem on how to assemble the audio and video file into a total file use openRTSP In-Reply-To: References: Message-ID: >thanks, > >the original file is a .mpg file, and my openRTSP command is >:openRTSP -4 >haha.mp4 >rtsp://172.28.40.101/video.mpg >but the resulting .mp4 also has no audio when I play it with Windows Media Player. >Did I make some mistake? Yes, it was a mistake to use Windows Media Player - it's not standards compliant. Instead, try QuickTime Player, or VLC. Also, you *must* specify the correct video width, height, and frame rate, using the -w, -h, and -f options to openRTSP, respectively. If you don't use these options, you'll get default values, which might not be right. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From bendeguy at gmail.com Fri Sep 5 01:22:59 2008 From: bendeguy at gmail.com (Miklos Szeles) Date: Fri, 5 Sep 2008 10:22:59 +0200 Subject: [Live-devel] H264 over RTP problem In-Reply-To: References: Message-ID: The sink I use has a big enough buffer, so if there is some data loss, it occurs before the sink. I saw the following lines in MultiFramedRTPSource: // Try to use a big receive buffer for RTP: increaseReceiveBufferTo(env, RTPgs->socketNum(), 50*1024); is it possible that my problem is caused by this 50*1024 setting? On Thu, Sep 4, 2008 at 8:43 PM, Ross Finlayson wrote: >> I'm trying to receive an H264 video stream (unicast RTP) from a camera. It >> works well for a 320x240 video stream, but when I increase the >> resolution to 640x480, some frames can't be decoded. It looks like >> their end is missing. >> Is there any limitation on the size of the video frames? > > No.
However, the *sink* object that you use to receive incoming H.264 NAL > units needs to be big enough for each NAL unit. Perhaps you just need to > use a larger buffer size? (You didn't say exactly how (i.e., with what > 'sink') you are receiving the data.) > -- > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > From finlayson at live555.com Fri Sep 5 01:32:56 2008 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 5 Sep 2008 01:32:56 -0700 Subject: [Live-devel] H264 over RTP problem In-Reply-To: References: Message-ID: >The used sink has a big enough buffer, so if there is some data loss, >that appear before the sink. I saw the following lines in >MultiFramedRTPSource: > // Try to use a big receive buffer for RTP: > increaseReceiveBufferTo(env, RTPgs->socketNum(), 50*1024); >is it possible that my problem caused by this 50*1024 setting? Yes, possibly. See http://www.live555.com/liveMedia/faq.html#packet-loss -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From mike at subfocal.net Fri Sep 5 11:33:47 2008 From: mike at subfocal.net (Mike Mueller) Date: Fri, 5 Sep 2008 14:33:47 -0400 Subject: [Live-devel] openRTSP producing bad video files In-Reply-To: References: <20080828203112.GM20854@samus.subfocal.net> <20080828232435.GO20854@samus.subfocal.net> <48B7A429.1080007@smartt.com> <20080902191521.GR20854@samus.subfocal.net> Message-ID: <20080905183347.GY20854@samus.subfocal.net> On Tue, Sep 02, 2008 at 01:43:54PM -0700, Ross Finlayson wrote: > We don't make available old versions of the code, and offer absolutely no > support for old versions. Everyone should work with the newest version > of the code. Ok Ross, one more question... Can openRTSP cope with a variable framerate stream? 
If, for example, my encoding hardware can't keep up and is sending out a stream that changes framerates, is it possible that openRTSP would produce a file that plays at the wrong speed? If so, is there anything I can do (besides forcing myself to a lower framerate) to mitigate this? Thanks, Mike -- Mike Mueller mike at subfocal.net From finlayson at live555.com Fri Sep 5 12:47:30 2008 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 5 Sep 2008 12:47:30 -0700 Subject: [Live-devel] openRTSP producing bad video files In-Reply-To: <20080905183347.GY20854@samus.subfocal.net> References: <20080828203112.GM20854@samus.subfocal.net> <20080828232435.GO20854@samus.subfocal.net> <48B7A429.1080007@smartt.com> <20080902191521.GR20854@samus.subfocal.net> <20080905183347.GY20854@samus.subfocal.net> Message-ID: >On Tue, Sep 02, 2008 at 01:43:54PM -0700, Ross Finlayson wrote: >> We don't make available old versions of the code, and offer absolutely no >> support for old versions. Everyone should work with the newest version >> of the code. > >Ok Ross, one more question... Can openRTSP cope with a variable >framerate stream? If, for example, my encoding hardware can't keep up >and is sending out a stream that changes framerates, is it possible that >openRTSP would produce a file that plays at the wrong speed? The problem is not openRTSP, or RTP/RTSP in general. The problem is the ".mp4"/".mov" file format. This file format is badly designed for what openRTSP is trying to do: record - into the file - incoming RTP streams. The code - for recording into ".mp4"/".mov" files - assumes that the frame rate is fixed (and specified in advance with the "-f" flag to openRTSP). If the frame rate is actually variable, then I'm not sure the resulting file will play properly. If, however, your stream really does have a fixed frame rate, but packets just happen to be sent at non-fixed intervals, then that should be OK. -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ From diego.barberio at redmondsoftware.com Fri Sep 5 13:00:47 2008 From: diego.barberio at redmondsoftware.com (Diego Barberio) Date: Fri, 5 Sep 2008 17:00:47 -0300 Subject: [Live-devel] Synchronizing video and audio using OnDemandServerMediaSubsessions Message-ID: <008f01c90f92$186601f0$800101df@redmondsoftware.com> Hi all, I'm new to the live555 library. I have a MediaServerSession with two SubSessions (one for H263 video and the other for G.711 A-law audio); both SubSessions extend from OnDemandServerMediaSubsession. The one I use for the video is called CH263plusVideoDXServerMediaSubsession and the other is called CALawAudioDXServerMediaSubsession. I also have two FramedSources, one for each SubSession: one called CH263plusVideoDXFrameSource and the other CAlawAudioDXFrameSource. The streaming for both media works perfectly, but the audio is delayed about 1.5 seconds from the video. To solve this I've tried to delay the video subsession by adding 1500 milliseconds to the fPresentationTime attribute in the CH263plusVideoDXServerMediaSubsession::doGetNextFrame method; however, no change was perceived. So I started googling this problem, until I reached the question "Why do most RTP sessions use separate streams for audio and video? How can a receiving client synchronize these streams?" in the FAQ. The problem is that I don't know where I should create the instance of the RTCPInstance class, and there's no variable or field where I can store it. I looked in the OnDemandServerMediaSubsession and FramedSource classes. Is there any way to delay the video streaming without using RTCPInstance? If not, where should I create it and where should I store it? If you need anything else, please ask for it. Diego -------------- next part -------------- An HTML attachment was scrubbed...
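A note on the fPresentationTime attribute Diego mentions: the approach described in the live555 FAQ entry he cites is to stamp every frame, audio and video alike, with the wall-clock time at which it was captured, taken from one shared clock, rather than shifting one stream by a fixed offset. A standalone sketch of such a helper follows; the PresentationTime struct and function name are invented for illustration (in the library itself fPresentationTime is a struct timeval, typically filled in via gettimeofday()):

```cpp
#include <cassert>
#include <chrono>

// Invented stand-in for live555's struct timeval presentation time.
struct PresentationTime {
    long sec;
    long usec;
};

// Derive a frame's presentation time from a wall-clock capture
// instant. Both the audio source and the video source would call
// this with the same clock, so frames captured together get equal
// timestamps.
PresentationTime presentationTimeAt(
        std::chrono::system_clock::time_point t) {
    using namespace std::chrono;
    long long us =
        duration_cast<microseconds>(t.time_since_epoch()).count();
    return PresentationTime{ static_cast<long>(us / 1000000),
                             static_cast<long>(us % 1000000) };
}
```

With such a helper, the receiver's RTCP-based synchronization lines the two streams up from the timestamps alone, which is why adding an artificial 1.5-second offset to one stream has no visible effect.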
URL: From finlayson at live555.com Fri Sep 5 14:27:13 2008 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 5 Sep 2008 14:27:13 -0700 Subject: [Live-devel] Synchronizing video and audio using OnDemandServerMediaSubsessions In-Reply-To: <008f01c90f92$186601f0$800101df@redmondsoftware.com> References: <008f01c90f92$186601f0$800101df@redmondsoftware.com> Message-ID: To get proper audio/video synchronization, you must create a "RTCPInstance" for each "RTPSink". However, the "OnDemandServerMediaSubsession" class does this automatically, so because you're subclassing this, you don't need to do anything special to implement RTCP - you already have it. However, the second important thing that you need is that the presentation times that you give to each frame (that feeds into each "RTPSink") *must* be accurate. It is those presentation times that get delivered to the receiver, and used (by the receiver) to do audio/video synchronization. Delaying the transmission (or not) does not affect this at all; it doesn't matter if video packets get sent slightly ahead of audio packets (or vice versa). What's important is the *presentation times* that you give each frame. If those are correct, then you will get audio/video synchronization at the receiver. This is assuming, of course, that your *receiver* implements standard RTCP-based synchronization correctly. (If your receiver uses our library, then it will.) But if your receiver is not standards compliant and doesn't implement this, then audio/video synchronization will never work. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From paul at spidersweb.freeserve.co.uk Sat Sep 6 10:23:50 2008 From: paul at spidersweb.freeserve.co.uk (Paul Webster) Date: Sat, 6 Sep 2008 18:23:50 +0100 Subject: [Live-devel] Unusual RTSP audio stream - l1.nl Message-ID: First off - a quick apology.
I realise that this is a list for developers rather than users, but I have got as far as I can in diagnosing the problem and I don't have a development environment to get further. If not appropriate for here then either ignore this or tell me. http://www.l1.nl provides the audio stream link as (Radio / Web Radio) mms://194.171.15.26:554/l1radio (I have also tried with rtsp://194.171.15.26:554/l1radio ) I can play it with Windows Media Player - but not with mplayer or vlc I think I have worked out that it is RTSP over TCP. The final error I see (in mplayer) is: Failed to initiate "audio/X-ASF-PF" RTP subsession: RTP payload format unknown or not supported It seems to be coming from a Microsoft server - WMServer/9.1.1.3814 and 'WMFSDKVersion' = '10.00.00.3997' I found a reference to problems with handling X-ASF-PF back in December 2006. So I presume that this is the cause of the problem here. My guess is that the broadcaster is not deliberately trying to restrict access - so is there a setting on their server that I could ask them to tweak to make the stream more widely playable? Paul Webster From finlayson at live555.com Sat Sep 6 15:24:39 2008 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 6 Sep 2008 15:24:39 -0700 Subject: [Live-devel] Unusual RTSP audio stream - l1.nl In-Reply-To: References: Message-ID: >The final error I see (in mplayer) is: >Failed to initiate "audio/X-ASF-PF" RTP subsession: RTP payload format >unknown or not supported Please read the FAQ! -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ From paul at spidersweb.freeserve.co.uk Sat Sep 6 16:07:50 2008 From: paul at spidersweb.freeserve.co.uk (Paul Webster) Date: Sun, 7 Sep 2008 00:07:50 +0100 Subject: [Live-devel] Unusual RTSP audio stream - l1.nl Message-ID: <312207424706b8b45674.846930886.paul@spidersweb.freeserve.co.uk> >>The final error I see (in mplayer) is: >>Failed to initiate "audio/X-ASF-PF" RTP subsession: RTP payload format >>unknown or not supported >Please read the FAQ! I have seen the FAQ comment about X- payloads. My hope was that someone might know how to reconfigure a Microsoft streaming server to make the problem go away. Ross Finlayson Live Networks, Inc. http://www.live555.com/ _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel From infotek at datasync.com Sun Sep 7 13:52:01 2008 From: infotek at datasync.com (Jason L. Ellison) Date: Sun, 7 Sep 2008 15:52:01 -0500 (CDT) Subject: [Live-devel] Open Source Studio to Transmitter Link (STL) using Darkice, FreeBSD, liveCaster, icecast and vlc Message-ID: List, I created a (mostly) Open Source Studio to Transmitter Link using Darkice, FreeBSD, liveCaster, icecast and vlc. I posted about this back in March of 2007. It has been running with no problems for over a year now. I am hosting a PDF that tries to document what I did. See: OSSTL: Open Source Studio to Transmitter Link A Studio to Transmitter Link (STL) created with Open Source Software (In use by a local FM and two local AM radio stations) http://jasonellison.net/projects.html My original post to the darkice list in March of 2007.
http://lists.tyrell.hu/pipermail/darkice-list/2007-March/000074.html -Jason Ellison From rcarre at m2x.nl Mon Sep 8 06:49:04 2008 From: rcarre at m2x.nl (=?ISO-8859-1?Q?Rafa=EBl_Carr=E9?=) Date: Mon, 08 Sep 2008 15:49:04 +0200 Subject: [Live-devel] [PATCH] Sets RTP-Info for every subsession Message-ID: <1220881744.6796.38.camel@bouffe> Hello, I came across this problem while messing with a WMS stream. It has 3 subsessions in this order: - audio - application - video VLC uses the video track for pts calculation, but getNormalPlayTime() would return 0 because the rtpInfo hadn't been set for this track. In playMediaSession(), an iterator is used to associate the rtpinfo to each subsession in this way (pseudo-code): if(parseRTPInfoHeader()) /* this will increment the string pointer */ { for_each_subsession() { associate_rtpinfo(); if(!parseRTPInfoHeader()) break; /* here we fail because we didn't find valid rtpinfo data */ } } In my case, the rtpinfo would not be associated with the video track, and since the stream is not synchronized with rtcp, I would get 0 npt. I understand that the 2nd call to parseRTPInfoHeader() should go away: parse the rtpinfo once, and then associate it to all subsessions. Attached patch does what is described. -- Rafaël Carré -------------- next part -------------- A non-text attachment was scrubbed... Name: live-rtpinfo.diff Type: text/x-patch Size: 886 bytes Desc: not available URL: From dmaljur at elma.hr Mon Sep 8 07:09:05 2008 From: dmaljur at elma.hr (Dario) Date: Mon, 8 Sep 2008 16:09:05 +0200 Subject: [Live-devel] RTSP Response was truncated Message-ID: <000601c911bc$7475f0a0$ec03000a@gen47> Hi. I sometimes receive the following errors when closing a session. RTSP response was truncated RTSP response was truncated We received a response not ending with Failed to read response: We received a response not ending with Failed to read response: ERROR Closing session Note that we are using a custom-made streaming server based on live555 code.
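For reference on the RTP-Info thread above: an "RTP-Info:" header value is a comma-separated sequence of per-stream parameter lists, each of the form url=...;seq=...;rtptime=... (RFC 2326). A rough standalone parser, illustrative only and not the library's code:

```cpp
#include <cassert>
#include <cstddef>
#include <string>
#include <vector>

// One parsed per-stream entry of an RTP-Info header.
struct RtpInfo {
    std::string url;
    unsigned long seq = 0;
    unsigned long rtptime = 0;
};

// Split the header value on "," into per-stream lists, then split
// each list on ";" into url/seq/rtptime fields.
std::vector<RtpInfo> parseRtpInfo(const std::string& header) {
    std::vector<RtpInfo> result;
    std::size_t pos = 0;
    while (pos < header.size()) {
        std::size_t end = header.find(',', pos);
        if (end == std::string::npos) end = header.size();
        RtpInfo info;
        std::size_t p = pos;
        while (p < end && header[p] == ' ') ++p;  // skip ", " padding
        while (p < end) {
            std::size_t q = header.find(';', p);
            if (q == std::string::npos || q > end) q = end;
            std::string field = header.substr(p, q - p);
            if (field.compare(0, 4, "url=") == 0)
                info.url = field.substr(4);
            else if (field.compare(0, 4, "seq=") == 0)
                info.seq = std::stoul(field.substr(4));
            else if (field.compare(0, 8, "rtptime=") == 0)
                info.rtptime = std::stoul(field.substr(8));
            p = q + 1;
        }
        result.push_back(info);
        pos = end + 1;
    }
    return result;
}
```

Matching each parsed entry to a subsession would then be done by comparing the url field against each subsession's control URL, rather than by relying on the order of the lists.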
Receiving client ends the session in this way: - The user presses the "Stop" button. - customRTSPClient executes Shutdown() Shutdown code: Medium::close(aviOut); // aviOut is an AviFileSink object MediaSubsessionIterator iter(*session); MediaSubsession* subsession; while ((subsession = iter.next()) != NULL) { Medium::close(subsession->sink); subsession->sink = NULL; } pThisRTSPClient->teardownMediaSession(*session); // pThisRTSPClient is an RTSPClient object Medium::close(session); Medium::close(pThisMediaClient); AfterPlayingFunc() is empty. (It's difficult to merge static functions with object code.) Why does this happen, and does it have any influence on memory deallocation for the session? Is this the right way to end a session, or *must* this be done in afterPlayingFunc()? Thanks. ELMA Kurtalj d.o.o. (ELMA Kurtalj ltd.) Vitezićeva 1a, 10000 Zagreb, Hrvatska (Viteziceva 1a, 10000 Zagreb, Croatia) Tel: 01/3035555, Faks: 01/3035599 (Tel: ++385-1-3035555, Fax: ++385-1-3035599 ) Www: www.elma.hr; shop.elma.hr E-mail: elma at elma.hr (elma at elma.hr) pitanje at elma.hr (questions at elma.hr) primjedbe at elma.hr (complaints at elma.hr) prodaja at elma.hr (sales at elma.hr) servis at elma.hr (servicing at elma.hr) shop at elma.hr (shop at elma.hr) skladiste at elma.hr (warehouse at elma.hr) -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Sep 8 23:07:53 2008 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 8 Sep 2008 23:07:53 -0700 Subject: [Live-devel] RTSP Response was truncated In-Reply-To: <000601c911bc$7475f0a0$ec03000a@gen47> References: <000601c911bc$7475f0a0$ec03000a@gen47> Message-ID: >Hi. > >I sometimes receive the following errors when closing a session.
> >RTSP response was truncated >RTSP response was truncated >We received a response not ending with >Failed to read response: >We received a response not ending with >Failed to read response: >ERROR Closing session > >Note that we are using a custom-made streaming server based on live555 code. Sorry, but we can't really help you with problems with your custom server (but fortunately, you have complete source code). Do you see the same problem when you use the (unmodified) server applications that we provide: "testOnDemandRTSPServer" or "live555MediaServer"? -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Sep 8 23:35:21 2008 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 8 Sep 2008 23:35:21 -0700 Subject: [Live-devel] [PATCH] Sets RTP-Info for every subsession In-Reply-To: <1220881744.6796.38.camel@bouffe> References: <1220881744.6796.38.camel@bouffe> Message-ID: >I understand that the 2nd call to parseRTPInfoHeader() should go away: >parse the rtpinfo once, and then associate it to all subsessions. That's not correct. Each subsession should have a *separate* list of parameters, within the "RTP-Info:" header. E.g., consider the example given in RFC 2326: RTP-Info: url=rtsp://foo.com/bar.avi/streamid=0;seq=45102,url=rtsp://foo.com/bar.avi/streamid=1;seq=30211 Note the two parameter lists, separated by ",". As far as I can tell, the existing code correctly handles responses from compliant servers. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From jd.vvd at web.de Mon Sep 8 23:58:23 2008 From: jd.vvd at web.de (Joerg Desch) Date: Tue, 9 Sep 2008 08:58:23 +0200 Subject: [Live-devel] How to stream a h264 720p ES (NAL) source?
Message-ID: <20080909085823.3e66386a@anene> Hi all together, I'm currently playing with an eval board running a TI DM6467 (ARM+DSP, the successor of the DM355 used by the neuros.org device) with a sample 720p CODEC. I'm absolutely new to "MPEG4/h264" and "multimedia streams". So I want to try to stream the result of this hardware encoder to a VLC client using raw UDP or RTP. As mentioned in the FAQ, the live555 library isn't able to stream h264 because of the missing h264-Framer. What do I have to do to stream this to VLC? My current sample code consists of three threads. The first thread "CAPTURE" grabs the raw frame, converts the color space, and passes the frame to the next thread. The second thread "ENCODE" feeds the DSP. The last thread "WRITER" waits for the output of the DSP (I think this is where I get the NAL packets, which are around 90 kB in size) and writes it to disk. Instead of writing the NAL packets to disk, I would like to stream them to a VLC client. What would be the right way? The recorded stream is identified by the file command as "JVT NAL sequence, H.264 video, baseline @ L 31" Any hints? Thanks in advance. -- Email: Joerg Desch From anilanilg at gmail.com Tue Sep 9 00:55:05 2008 From: anilanilg at gmail.com (anil s) Date: Tue, 9 Sep 2008 13:25:05 +0530 Subject: [Live-devel] [live-devel]: Question on rate control in Live555 Message-ID: <70cdc9780809090055gea1446aq8d43bdedd0f28bbc@mail.gmail.com> Hi, This is my first mail to the list. Thanks a lot in advance. I have a problem streaming a file which is generated by another server. The server that I have generates PCRs every 80-100 ms; however, the PCRs do not indicate the correct bit rate, i.e., if the difference between successive PCR values is 100 ms, then the amount of data between those PCRs does not correspond to 100 ms. The server simply streams the frames as and when they are generated and does not have any mechanism for rate control.
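The mismatch anil describes can be illustrated with a toy calculation: the bit rate implied by two successive PCRs is the amount of data carried between them divided by the PCR delta (PCR base values tick at 90 kHz, and transport stream packets are 188 bytes). All numbers and names below are invented for illustration; this is not live555 code:

```cpp
#include <cassert>

// PCR base clock frequency and TS packet size per MPEG-2 Systems.
const double kPcrBaseHz = 90000.0;
const int kTsPacketBytes = 188;

// Bit rate implied by two successive PCR base values and the number
// of 188-byte TS packets carried between them.
double impliedBitsPerSecond(long long pcrBase1, long long pcrBase2,
                            long long packetsBetween) {
    double seconds = (pcrBase2 - pcrBase1) / kPcrBaseHz;
    return (packetsBetween * kTsPacketBytes * 8) / seconds;
}
```

If the sender emits more (or fewer) packets between two PCRs than the PCR delta implies, a receiver or a paced re-streamer that trusts the PCRs will run at a different rate than the original stream, which is the kind of inconsistency described in this mail.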
If I capture the stream and try streaming the same file using Live555, the video does not play out smoothly. The client runs on a PC. As an experiment I calculated the differences between the PCRs and the RTP timestamps at that particular instant. I see that these differences do not match: say, if the difference between PCRs is 100 ms, then the difference between RTP timestamps goes up to 220 ms. Could you please tell me how live555 decides at what rate it is going to stream out a given file? Does live555 determine the rate by looking at the PCR values in the file? Thanks, Anil S -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Sep 9 01:10:42 2008 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 09 Sep 2008 01:10:42 -0700 Subject: [Live-devel] [live-devel]: Question on rate control in Live555 In-Reply-To: <70cdc9780809090055gea1446aq8d43bdedd0f28bbc@mail.gmail.com > References: <70cdc9780809090055gea1446aq8d43bdedd0f28bbc@mail.gmail.com> Message-ID: <200809090813.m898DeKd006842@ns.live555.com> You didn't say so explicitly, but I assume you're talking about MPEG Transport Stream files. (Our server can stream many other kinds of file.) >Could you please tell me how live555 decides at what rate >it is going to stream out a given file? Does live555 determine >the rate by looking at the PCR values in the file? Yes - specifically the class "MPEG2TransportStreamFramer". (Note how the variable "fDurationInMicroseconds" is calculated. That variable determines the rate at which the data is streamed.) Ross Finlayson Live Networks, Inc.
(LIVE555.COM) From rcarre at m2x.nl Tue Sep 9 01:29:02 2008 From: rcarre at m2x.nl (=?ISO-8859-1?Q?Rafa=EBl_Carr=E9?=) Date: Tue, 09 Sep 2008 10:29:02 +0200 Subject: [Live-devel] [PATCH] Sets RTP-Info for every subsession In-Reply-To: References: <1220881744.6796.38.camel@bouffe> Message-ID: <1220948942.7246.4.camel@bouffe> On Mon, 2008-09-08 at 23:35 -0700, Ross Finlayson wrote: > >I understand that the 2nd call to parseRTPInfoHeader() should go away: > >parse the rtpinfo once, and then associate it to all subsessions. > > That's not correct. Each subsession should have a *separate* list of > parameters, within the "RTP-Info:" header. E.g., consider the > example given in RFC 2326: > RTP-Info: > url=rtsp://foo.com/bar.avi/streamid=0;seq=45102,url=rtsp://foo.com/bar.avi/streamid=1;seq=30211 > > Note the two parameter lists, separated by ",". Indeed, that's what the server sends me. RTP-Info: url=rtsp://example.org/bar.asf/audio;seq=34421;rtptime=0, url=rtsp://example.org/bar.asf/video;seq=24151;rtptime=0 > As far as I can tell, the existing code correctly handles responses > from compliant servers. The code just iterates over the lists separated by "," on one side, and over the subsessions on the other side, but it doesn't check that the list it is parsing is associated with the particular subsession. The 3 subsessions are declared in this order: audio, application, video. So application gets associated with the rtpinfo of the video subsession, and the video with nothing. I'll send another patch shortly, thanks for the details. -- Rafaël Carré From rcarre at m2x.nl Tue Sep 9 02:28:43 2008 From: rcarre at m2x.nl (=?ISO-8859-1?Q?Rafa=EBl_Carr=E9?=) Date: Tue, 09 Sep 2008 11:28:43 +0200 Subject: [Live-devel] [PATCH] Sets RTP-Info for every subsession In-Reply-To: <1220948942.7246.4.camel@bouffe> References: <1220881744.6796.38.camel@bouffe> <1220948942.7246.4.camel@bouffe> Message-ID: <1220952523.7246.7.camel@bouffe> On Tue, 2008-09-09 at 10:29 +0200, Rafaël Carré
wrote: > I'll send another patch shortly, thanks for the details. Here it is. I am not very familiar with C++, which is why I use a fixed 1023-byte length to compare the URL in the RTP-Info with the control URL. The control field may hold relative URLs, which is why I compare the strings starting from the end. -- Rafaël Carré -------------- next part -------------- A non-text attachment was scrubbed... Name: live-rtpinfo.diff Type: text/x-patch Size: 4300 bytes Desc: not available URL: From finlayson at live555.com Tue Sep 9 06:05:10 2008 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 09 Sep 2008 06:05:10 -0700 Subject: [Live-devel] [PATCH] Sets RTP-Info for every subsession In-Reply-To: <1220948942.7246.4.camel@bouffe> References: <1220881744.6796.38.camel@bouffe> <1220948942.7246.4.camel@bouffe> Message-ID: <200809091312.m89DCfKU020323@ns.live555.com> Thanks for the clarification. This will get fixed in a future release of the software. Ross Finlayson Live Networks, Inc. (LIVE555.COM) From mamille1 at rockwellcollins.com Tue Sep 9 11:45:35 2008 From: mamille1 at rockwellcollins.com (mamille1 at rockwellcollins.com) Date: Tue, 9 Sep 2008 13:45:35 -0500 Subject: [Live-devel] Filter chain question - chain seems to stall In-Reply-To: Message-ID: All We've inserted a custom filter into our transport stream recording chain. Almost all the time it works (and has been working pretty well for many months), but lately we've seen this problem: data is being streamed towards our unit, and RTP packets are being buffered; when our problem occurs, one of the 'downstream' filters seems to stop requesting new data; RTP packets 'pile up' in memory and no data is written to our recording file. We wanted to check our understanding of the normal source-to-sink chain processing. The network side of our chain depends on being called from its sink-ward filter to get started, but afterwards continues on its own to get RTP packets from the network.
When called on, it will provide data to its sink, but the part that pulls in RTP packets from the network won't hang or stall if its getNextFrame() is never called by its sink. Thanks! -=- Mike Miller SW Engineer Rockwell Collins, Inc. Cedar Rapids, IA -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Sep 9 14:15:51 2008 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 09 Sep 2008 14:15:51 -0700 Subject: [Live-devel] Filter chain question - chain seems to stall In-Reply-To: References: Message-ID: <200809092123.m89LNIrR032220@ns.live555.com> >We wanted to check our understanding of the normal source-to-sink >chain processing. The network side of our chain depends on being >called from its sink-ward filter to get started, but afterwards >continues on its own to get RTP packets from the network. Not really. It is *always* the sink (i.e., at the end of the chain) that initiates the data flow for each piece of data. It (in its "continuePlaying()" virtual function) will call "getNextFrame()" on its upstream source object. Its 'after getting' function (which was passed in the call to "getNextFrame()") will get called after it has received its data from upstream. This function will usually then call "continuePlaying()" (and thus "getNextFrame()") again. For 'filter' objects, things work similarly. Its "doGetNextFrame()" virtual function gets called (as a result of its downstream object calling "getNextFrame()"). This function will usually call "getNextFrame()" on the upstream object. Its 'after getting' function (which was passed in the call to "getNextFrame()") will get called after it has received its data from upstream. This function will usually call "FramedSource::afterGetting(this)" (ugly, I know) to tell the downstream object (filter or sink) that new data is available for it. You should be able to check each of these things, to see where/why the chain is stalling.
Ross Finlayson Live Networks, Inc. (LIVE555.COM) From dmaljur at elma.hr Wed Sep 10 01:44:44 2008 From: dmaljur at elma.hr (Dario) Date: Wed, 10 Sep 2008 10:44:44 +0200 Subject: [Live-devel] RTSP Response was truncated Message-ID: <000701c91321$79d4fda0$ec03000a@gen47> I understand. But can you tell me is the provided code in the first post right way to end session? And what is the function of AfterPlayingFunc()? ELMA Kurtalj d.o.o. (ELMA Kurtalj ltd.) Vitezi?eva 1a, 10000 Zagreb, Hrvatska (Viteziceva 1a, 10000 Zagreb, Croatia) Tel: 01/3035555, Faks: 01/3035599 (Tel: ++385-1-3035555, Fax: ++385-1-3035599 ) Www: www.elma.hr; shop.elma.hr E-mail: elma at elma.hr (elma at elma.hr) pitanje at elma.hr (questions at elma.hr) primjedbe at elma.hr (complaints at elma.hr) prodaja at elma.hr (sales at elma.hr) servis at elma.hr (servicing at elma.hr) shop at elma.hr (shop at elma.hr) skladiste at elma.hr (warehouse at elma.hr) -------------- next part -------------- An HTML attachment was scrubbed... URL: From mamille1 at rockwellcollins.com Wed Sep 10 07:45:02 2008 From: mamille1 at rockwellcollins.com (mamille1 at rockwellcollins.com) Date: Wed, 10 Sep 2008 09:45:02 -0500 Subject: [Live-devel] Filter chain question - chain seems to stall In-Reply-To: Message-ID: Ross Thanks for the quick reply and the advice. Just a note about the MultiFramedRTPSource class we're using. The first time doGetNextFrame() is called, it starts the network read handler. This handler continues to run, pulling packets off the network and buffering them, until the object is destroyed, an error occurs or we call doStopGettingFrames(). Calling getNextFrame() on this object will copy data from its list of buffered packets to the caller's buffer in typical fashion. When our chain stalls, getNextFrame() isn't being called or isn't returning, so data pulled in by the network read handler piles up in memory. We're certain that the problem is on our end rather than in the current live555 code. 
Wish us luck as we dig into this. -=- Mike Miller SW Engineer, Govt. Systems Rockwell Collins, Inc. Cedar Rapids, IA -------------- next part -------------- An HTML attachment was scrubbed... URL: From mamille1 at rockwellcollins.com Wed Sep 10 12:58:00 2008 From: mamille1 at rockwellcollins.com (mamille1 at rockwellcollins.com) Date: Wed, 10 Sep 2008 14:58:00 -0500 Subject: [Live-devel] Filter chain question - chain seems to stall - found our bug In-Reply-To: Message-ID: Ross Your advice was very helpful. We found the place in our code where we meant to drop a frame but instead stopped calling the afterGetting() function. -=- Mike Miller SW Engineer, Govt. Systems Rockwell Collins, Inc. Cedar Rapids, IA -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Sep 10 14:43:51 2008 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 10 Sep 2008 14:43:51 -0700 Subject: [Live-devel] Filter chain question - chain seems to stall In-Reply-To: References: Message-ID: <200809102149.m8ALna9r063512@ns.live555.com> >Just a note about the MultiFramedRTPSource class we're using. The >first time doGetNextFrame() is called, it starts the network read >handler. This handler continues to run, pulling packets off the >network and buffering them This is reinventing the wheel. The operating system already buffers incoming packets, and does so at least as efficiently as you could do yourself in user space. (You might need to increase the OS's buffer size, but you can do this easily (see the FAQ).) Please don't modify the existing "MultiFramedRTPSource" code. By doing so, you're not only wasting your time, but you won't be able to get any support for it on this mailing list. Ross Finlayson Live Networks, Inc. 
(LIVE555.COM) From xzou999 at gmail.com Thu Sep 11 00:59:12 2008 From: xzou999 at gmail.com (xiang zou) Date: Thu, 11 Sep 2008 15:59:12 +0800 Subject: [Live-devel] something about the class Message-ID: <3180c4620809110059i7ac7f409qc6399263adb66f8b@mail.gmail.com> Hi everybody: Recently I've been reading the live555 source code, and I found something weird: in the class MPEG4GenericRTPSource, which is derived from the class MultiFramedRTPSource, the member function processSpecialHeader finds the AU-header section but doesn't provide accessor methods for attributes like "AU-size", "AU-index", "CTS-delta", etc. The parent class MultiFramedRTPSource's virtual member function "doGetNextFrame1" just skips this AU-header section, so what we get from every MPEG-4 elementary stream RTP payload is the Auxiliary section (if present) + the ADU section, right? What confuses me is: how can the decoder decode the ADU section without CTS or DTS if the RTP payload is multi-framed, especially for a variable-bitrate stream? Only the first frame can be associated with the RTP timestamp; the others have no time information. -------------- next part -------------- An HTML attachment was scrubbed... URL: From mehmet.ozgul at turkcellteknoloji.com.tr Thu Sep 11 04:47:15 2008 From: mehmet.ozgul at turkcellteknoloji.com.tr (mehmet.ozgul at turkcellteknoloji.com.tr) Date: Thu, 11 Sep 2008 14:47:15 +0300 Subject: [Live-devel] Leak: Orphan item is left in MediaLookUp table if AVI file creation fails. Message-ID: <145F7A0D1A064846969048230B5A41F4088FA5FF@exhmbx03.turkcell.entp.tgc> Hi, I noticed that if the OpenOutputFile(env, outputFileName) call in the AVIFileSink constructor fails, the AVIFileSink remains in the MediaLookUp table. Addition to the table is done in the constructor of Medium, which is called in the AVIFileSink constructor's initializer list. Since AVIFileSink::createNew() returns NULL in this case, the client application does not receive a reference to close the AVIFileSink.
The orphan AVIFileSink reference in the MediaLookUp table prevents the Environment from being correctly reclaimed, since liveMediaPriv remains non-NULL because the table is not empty. It looks like this also applies to QuickTimeFileSink. However, FileSink::createNew() does not construct the object if the file creation fails, which prevents such an orphan reference. Regards, Mehmet Ozgul ************************************************************************ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jd.vvd at web.de Thu Sep 11 05:24:41 2008 From: jd.vvd at web.de (Joerg Desch) Date: Thu, 11 Sep 2008 14:24:41 +0200 Subject: [Live-devel] H264 RTP streaming tutorial References: Message-ID: <20080911142441.7bc2b134@anene> On Fri, 29 Aug 2008 18:01:24 +0800 "M Haris Lye" wrote: > May I inquire if anybody has the H264 RTP streaming tutorial as > mentioned in the previous posting > http://www.mail-archive.com/live-devel at lists.live555.com/msg00238.html > > I hope it can be shared with live555 newbie like me. I need to refer to > it to create a customized streaming server for a research project. I'm looking for exactly the same tarball... and couldn't find it. Have you already got a copy? The whole search for information and "howtos" is really time-consuming. Which H264 source do you use? I have a TI DaVinci evaluation board which delivers H.264 NAL units. But I don't know how to start streaming with this kind of packets.... -- Email: Joerg Desch From finlayson at live555.com Thu Sep 11 07:15:49 2008 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 11 Sep 2008 07:15:49 -0700 Subject: [Live-devel] Leak: Orphan item is left in MediaLookUp table if AVI file creation fails.
In-Reply-To: <145F7A0D1A064846969048230B5A41F4088FA5FF@exhmbx03.turkcell .entp.tgc> References: <145F7A0D1A064846969048230B5A41F4088FA5FF@exhmbx03.turkcell.entp.tgc> Message-ID: <200809111419.m8BEJUmA000199@ns.live555.com> >I noticed that if OpenOutputFile(env, outputFileName) method call in >AVIFileSink constructor fails, the AVIFileSink remains in the >MediaLookUp table. > >Addition to the table is done in the constructor of Medium and is >called in AVIFileSink constructor's initializer list. Since >AVIFileSink::createNew() returns NULL in this case, the client >application does not receive a reference to close the AVIFileSink. Look at the code for "AVIFileSink::createNew()" again. You'll see that it deletes the object if the output 'fid' was not created (i.e., is NULL). The same is true for "QuickTimeFileSink". Ross Finlayson Live Networks, Inc. (LIVE555.COM) From finlayson at live555.com Thu Sep 11 07:29:39 2008 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 11 Sep 2008 07:29:39 -0700 Subject: [Live-devel] Leak: Orhpan item is left in MediaLookUp table if AVI file creation fails. Message-ID: <200809111432.m8BEWaf0015205@ns.live555.com> Ahh... I looked at this code again, and you're right - there is a bug. In the "createNew()" function, replace delete newSink; with Medium::close(newSink); This should be done for both "AVIFileSink" and "QuickTimeFileSink". This will be fixed in the next release of the software. Ross Finlayson Live Networks, Inc. (LIVE555.COM) From soumya.patra at lge.com Fri Sep 12 03:29:09 2008 From: soumya.patra at lge.com (soumya patra) Date: Fri, 12 Sep 2008 15:59:09 +0530 Subject: [Live-devel] RTSP authentication Message-ID: <20080912102909.6D32555800D@LGEMRELSE6Q.lge.com> Hi Ross, Thanks for your advice regarding RTSP authentication for Dynamic RTSP server. Now I want add stream session id along with username & password for Authentication. Can you give some idea regarding session id for RTSP authentication. 
Waiting for your response. Regards Soumya -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Sep 12 06:02:56 2008 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 12 Sep 2008 06:02:56 -0700 Subject: [Live-devel] RTSP authentication In-Reply-To: <20080912102909.6D32555800D@LGEMRELSE6Q.lge.com> References: <20080912102909.6D32555800D@LGEMRELSE6Q.lge.com> Message-ID: <200809121305.m8CD5sFn029780@ns.live555.com> > Thanks for your advice regarding RTSP authentication for Dynamic > RTSP server. Now I want add stream session id along with username & > password for >Authentication. This is non-standard, and will not work with any standard RTSP client (which will expect to authenticate using username,password only). Therefore, you will get no support from me on this. Ross Finlayson Live Networks, Inc. (LIVE555.COM) From venugopalpaikr at tataelxsi.co.in Mon Sep 15 04:24:05 2008 From: venugopalpaikr at tataelxsi.co.in (venugopalpaikr) Date: Mon, 15 Sep 2008 16:54:05 +0530 Subject: [Live-devel] Difference between live555 and jrtplib Message-ID: <006e01c91725$90b3c8a0$3c033c0a@telxsi.com> Hi, I wanted to know if the RTP/RTCP implemented in live555 has any resemblance to the RTP/RTCP implemented in jrtplib. IF not what is the difference between the two libraries? Kindly help. Regards, Venugopal The information contained in this electronic message and any attachments to this message are intended for the exclusive use of the addressee(s) and may contain proprietary, confidential or privileged information. If you are not the intended recipient, you should not disseminate, distribute or copy this e-mail. Please notify the sender immediately and destroy all copies of this message and any attachments contained in it. 
From finlayson at live555.com Mon Sep 15 05:22:09 2008 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 15 Sep 2008 05:22:09 -0700 Subject: [Live-devel] Difference between live555 and jrtplib In-Reply-To: <006e01c91725$90b3c8a0$3c033c0a@telxsi.com> References: <006e01c91725$90b3c8a0$3c033c0a@telxsi.com> Message-ID: <200809151225.m8FCPDNh006078@ns.live555.com> > I wanted to know if the RTP/RTCP implemented in live555 has any >resemblance to the RTP/RTCP implemented in jrtplib. No, other than that they both implement the same (standard) protocols. >IF not what is the >difference between the two libraries? They're completely different implementations, developed by different people, and written in different languages. Ross Finlayson Live Networks, Inc. (LIVE555.COM) From xzou999 at gmail.com Mon Sep 15 07:59:20 2008 From: xzou999 at gmail.com (xiang zou) Date: Mon, 15 Sep 2008 22:59:20 +0800 Subject: [Live-devel] what if MultiFramedRTPSource::doGetNextFrame1 get a multiframed packet? Message-ID: <3180c4620809150759t5f949d13nb80bcd169fa26414@mail.gmail.com> Hi, When I was reading the source code of "MultiFramedRTPSource::doGetNextFrame1" in MultiFramedRTPSource.cpp, I found that it will get data from the buffered packet (class ReorderingPacketBuffer) until it gets a complete frame (if the frame has been fragmented). But what if the RTP payload cached in the buffered packet is a multi-framed packet? In that case, what we get from "MultiFramedRTPSource::doGetNextFrame1" will be more than one frame. For example, suppose we received an RTP packet which contains 3 MPEG4ES video frames and saved it to the buffered packet; how do we then pick out the single frames and their config information, like timestamp and length, from this multi-framed data? As far as I know, both the class MPEG4ESVideoRTPSource and MPEG4GenericRTPSource have not been completed; do they need something like a multi-frame split, as in the last question? -------------- next part -------------- An HTML attachment was scrubbed...
URL: From finlayson at live555.com Mon Sep 15 21:25:20 2008 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 15 Sep 2008 21:25:20 -0700 Subject: [Live-devel] what if MultiFramedRTPSource::doGetNextFrame1 get a multiframed packet? In-Reply-To: <3180c4620809150759t5f949d13nb80bcd169fa26414@mail.gmail.co m> References: <3180c4620809150759t5f949d13nb80bcd169fa26414@mail.gmail.com> Message-ID: <200809160433.m8G4XbD8016745@ns.live555.com> At 07:59 AM 9/15/2008, you wrote: >but what if the RTP payload which cached in buffered packet is a >multiframed packet?In that case,what we get from >"MultiFramedRTPSource::doGetNextFrame1" will be more than one frames That's OK. However, not all of these frames will be delivered to the reading object at once. Instead, only one frame will be delivered at a time. This is implemented by the virtual function "nextEnclosedFrameSize()", which calculates the size of each frame within the packet. >,for example,we received a RTP packet which contain 3 MPEG4ES video >frames before and save it to the buffered packet,then how to pick >out the single frames and their config information like >timestamp,length or else from this multiframed data?As I know,both >class MPEG4ESVideoRTPSource and MPEG4GenericRTPSource have not been >completed,do they need something like multiframe split as in last question? "MPEG4GenericRTPSource" implements multiple frames per packet (except currently for interleaving); note the function "MPEG4GenericBufferedPacket::nextEnclosedFrameSize()". However, you're correct that "MPEG4ESVideoRTPSource" currently does not implement multiple frames per packet (it currently does not reimplement the "nextEnclosedFrameSize()" virtual function). Ross Finlayson Live Networks, Inc. (LIVE555.COM) From xzou999 at gmail.com Mon Sep 15 22:45:51 2008 From: xzou999 at gmail.com (xiang zou) Date: Tue, 16 Sep 2008 13:45:51 +0800 Subject: [Live-devel] what if MultiFramedRTPSource::doGetNextFrame1 get a multiframed packet? 
In-Reply-To: <200809160433.m8G4XbD8016745@ns.live555.com> References: <3180c4620809150759t5f949d13nb80bcd169fa26414@mail.gmail.com> <200809160433.m8G4XbD8016745@ns.live555.com> Message-ID: <3180c4620809152245h44bb770bhd747b1788cdf5ad1@mail.gmail.com> I read the source code again,yes,"MPEG4GenericRTPSource" indeed implements multiple frames per packet by reimplementation of the virtual function "MPEG4GenericBufferedPacket::nextEnclosedFrameSize()",but it had not update the "fPresentationTime" after the first received frame,since "frameDurationInMicroseconds" is always 0.In that case,from a multiframed packet,we will get several frames with the same timestamp,is it a bug? 2008/9/16 Ross Finlayson > At 07:59 AM 9/15/2008, you wrote: > >> but what if the RTP payload which cached in buffered packet is a >> multiframed packet?In that case,what we get from >> "MultiFramedRTPSource::doGetNextFrame1" will be more than one frames >> > > That's OK. However, not all of these frames will be delivered to the > reading object at once. Instead, only one frame will be delivered at a > time. This is implemented by the virtual function > "nextEnclosedFrameSize()", which calculates the size of each frame within > the packet. > > ,for example,we received a RTP packet which contain 3 MPEG4ES video frames >> before and save it to the buffered packet,then how to pick out the single >> frames and their config information like timestamp,length or else from this >> multiframed data?As I know,both class MPEG4ESVideoRTPSource and >> MPEG4GenericRTPSource have not been completed,do they need something like >> multiframe split as in last question? >> > > "MPEG4GenericRTPSource" implements multiple frames per packet (except > currently for interleaving); note the function > "MPEG4GenericBufferedPacket::nextEnclosedFrameSize()". 
> > However, you're correct that "MPEG4ESVideoRTPSource" currently does not > implement multiple frames per packet (it currently does not reimplement the > "nextEnclosedFrameSize()" virtual function). > > > Ross Finlayson > Live Networks, Inc. (LIVE555.COM) > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Sep 15 23:26:45 2008 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 15 Sep 2008 23:26:45 -0700 Subject: [Live-devel] what if MultiFramedRTPSource::doGetNextFrame1 get a multiframed packet? In-Reply-To: <3180c4620809152245h44bb770bhd747b1788cdf5ad1@mail.gmail.co m> References: <3180c4620809150759t5f949d13nb80bcd169fa26414@mail.gmail.com> <200809160433.m8G4XbD8016745@ns.live555.com> <3180c4620809152245h44bb770bhd747b1788cdf5ad1@mail.gmail.com> Message-ID: <200809160631.m8G6VJ0C039682@ns.live555.com> At 10:45 PM 9/15/2008, you wrote: >I read the source code again,yes,"MPEG4GenericRTPSource" indeed >implements multiple frames per packet by reimplementation of the >virtual function >"MPEG4GenericBufferedPacket::nextEnclosedFrameSize()",but it had not >update the "fPresentationTime" after the first received frame,since >"frameDurationInMicroseconds" is always 0.In that case,from a >multiframed packet,we will get several frames with the same >timestamp,is it a bug? In principle, yes. However, in practice, because we don't actually decode the media data, it would be difficult for an implementation to figure out the duration of each frame within the packet, so we don't try to do this. Ross Finlayson Live Networks, Inc. 
(LIVE555.COM) From xzou999 at gmail.com Tue Sep 16 00:28:48 2008 From: xzou999 at gmail.com (xiang zou) Date: Tue, 16 Sep 2008 15:28:48 +0800 Subject: [Live-devel] what if MultiFramedRTPSource::doGetNextFrame1 get a multiframed packet? In-Reply-To: <200809160631.m8G6VJ0C039682@ns.live555.com> References: <3180c4620809150759t5f949d13nb80bcd169fa26414@mail.gmail.com> <200809160433.m8G4XbD8016745@ns.live555.com> <3180c4620809152245h44bb770bhd747b1788cdf5ad1@mail.gmail.com> <200809160631.m8G6VJ0C039682@ns.live555.com> Message-ID: <3180c4620809160028p6bd36833qf229d958bbfd4886@mail.gmail.com> +------+------+------------+------+------------+------+------------+------------+-----------------+ | RTP | VP |Video Packet| VP |Video Packet | VP |Video Packet| |header |header| (1) |header| (2) |header| (3) | +------+------+------------+------+------------+------+------------+------------+------------------+ Just like the figure above(I copy it from RFC3016),can we get timestamp information of every Video Packet from VP header?As you said,"we don't actually decode the media data", what does it mean?since in my opinion,most time we not only get every frame but send them to the decoder,doesn't the decoder need timestamp information for every frame?And a new question, I found some figures(like before)in RFC3016 use Video Packet and VOP,but the expression "Video Packet" can hardly be seen in ISO14496,is there anything wrong with it? 2008/9/16 Ross Finlayson > At 10:45 PM 9/15/2008, you wrote: > >> I read the source code again,yes,"MPEG4GenericRTPSource" indeed implements >> multiple frames per packet by reimplementation of the virtual function >> "MPEG4GenericBufferedPacket::nextEnclosedFrameSize()",but it had not update >> the "fPresentationTime" after the first received frame,since >> "frameDurationInMicroseconds" is always 0.In that case,from a multiframed >> packet,we will get several frames with the same timestamp,is it a bug? >> > > In principle, yes. 
However, in practice, because we don't actually decode > the media data, it would be difficult for an implementation to figure out > the duration of each frame within the packet, so we don't try to do this. > > > > Ross Finlayson > Live Networks, Inc. (LIVE555.COM) > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Sep 16 00:44:28 2008 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 16 Sep 2008 00:44:28 -0700 Subject: [Live-devel] what if MultiFramedRTPSource::doGetNextFrame1 get a multiframed packet? In-Reply-To: <3180c4620809160028p6bd36833qf229d958bbfd4886@mail.gmail.co m> References: <3180c4620809150759t5f949d13nb80bcd169fa26414@mail.gmail.com> <200809160433.m8G4XbD8016745@ns.live555.com> <3180c4620809152245h44bb770bhd747b1788cdf5ad1@mail.gmail.com> <200809160631.m8G6VJ0C039682@ns.live555.com> <3180c4620809160028p6bd36833qf229d958bbfd4886@mail.gmail.com> Message-ID: <200809160747.m8G7lh3I019637@ns.live555.com> At 12:28 AM 9/16/2008, you wrote: >Just like the figure above(I copy it from RFC3016) Note that RFC3016 describes the media type (RTP payload format) "video/MP4V-ES", for which we *do not* currently return separate frames from a multi-frame RTP packet. >,can we get timestamp information of every Video Packet from VP >header?As you said,"we don't actually decode the media data", what >does it mean?since in my opinion,most time we not only get every >frame but send them to the decoder,doesn't the decoder need >timestamp information for every frame? I think that most decoders should be able to handle blocks of multiple frames that have been assigned the same presentation time. Ross Finlayson Live Networks, Inc. 
(LIVE555.COM) From RGlobisch at csir.co.za Tue Sep 16 02:24:16 2008 From: RGlobisch at csir.co.za (Ralf Globisch) Date: Tue, 16 Sep 2008 11:24:16 +0200 Subject: [Live-devel] Difference between live555 and jrtplib In-Reply-To: References: Message-ID: <48CF9758.5DA9.004D.0@csir.co.za> My two cents from my experience: jRtpLib has a lower entry level, i.e. it is much easier to get started, but deals primarily with RTP and RTCP. I'm not sure if it has been extended since. LiveMedia is a much more complex and more complete framework, will require more time to learn but also deals with RTSP, RTP payloads, RTCP synchronisation, has its own event loop, etc. It is also continuously being updated and improved and has a very active mailing list. -- This message is subject to the CSIR's copyright terms and conditions, e-mail legal notice, and implemented Open Document Format (ODF) standard. The full disclaimer details can be found at http://www.csir.co.za/disclaimer.html. This message has been scanned for viruses and dangerous content by MailScanner, and is believed to be clean. MailScanner thanks Transtec Computers for their support. From amol.ghode at gmail.com Tue Sep 16 04:37:34 2008 From: amol.ghode at gmail.com (Amol Ghode) Date: Tue, 16 Sep 2008 17:07:34 +0530 Subject: [Live-devel] Receiving RTP from VLC media player. Message-ID: Hi, I am new to video streaming. As far as I know, VLC uses the LiveMedia stack for streaming. I am facing some problems with RTP streaming. 1. VLC to VLC RTP streaming works perfectly. 2. Streaming from VLC to a LiveMedia test program is not working for me. The receiver, which is a LiveMedia test program, cannot dump the RTP packets, even though Wireshark traces show proper packet arrival. Can anybody help me with this? Thanks in advance. Amol.
From gbonneau at matrox.com Tue Sep 16 13:14:59 2008 From: gbonneau at matrox.com (Guy Bonneau) Date: Tue, 16 Sep 2008 16:14:59 -0400 Subject: [Live-devel] BufferedPacket size of buffer Message-ID: Hi, In file MultiFramedRTPSource.cpp the BufferedPacket implementation creates a fBuf buffer to receive a RTP packet. The size of the fBuf is set to MAX_PACKET_SIZE which is defined as 10000. This value is much higher than the MTU size of ethernet which should be close to 1500 bytes. I guess this is to support other networking like ATM that has maximum MTU size of 9192 ? Can I assume if the network router I am using to send RTP packet has a maximum MTU size of 1500 that it is safe to set MAX_PACKET_SIZE to 1500 for a RTP receiver implementation using the live555 library? Thanks Guy Bonneau -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Sep 16 14:39:14 2008 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 16 Sep 2008 14:39:14 -0700 Subject: [Live-devel] BufferedPacket size of buffer In-Reply-To: References: Message-ID: <200809162146.m8GLk84I009023@ns.live555.com> >In file MultiFramedRTPSource.cpp the BufferedPacket implementation >creates a fBuf buffer to receive a RTP packet. The size of the fBuf >is set to MAX_PACKET_SIZE which is defined as 10000. This value is >much higher than the MTU size of ethernet which should be close to >1500 bytes. I guess this is to support other networking like ATM >that has maximum MTU size of 9192 ? > >Can I assume if the network router I am using to send RTP packet has >a maximum MTU size of 1500 that it is safe to set MAX_PACKET_SIZE to >1500 for a RTP receiver implementation using the live555 library? Yes. (Note, though, that in most cases there's only one of these buffers allocated per "RTPSource", so the memory saving will be quite small.) Ross Finlayson Live Networks, Inc. 
(LIVE555.COM) From cbitsunil at gmail.com Wed Sep 17 04:40:23 2008 From: cbitsunil at gmail.com (sunil sunil) Date: Wed, 17 Sep 2008 20:40:23 +0900 Subject: [Live-devel] Regarding TS stream pause Message-ID: Hi, I am using the livemedia library to send a TS stream via RTP (sending only). Sometimes I need to pause the streaming for a few milliseconds. Is it possible to do that? I need to adjust the sending according to my receiver's speed, so in between I need to stop streaming and then start again when I get an acknowledgement. Please help me. Thanks in advance. Sunil. -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Sep 17 07:56:37 2008 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 17 Sep 2008 07:56:37 -0700 Subject: [Live-devel] Regarding TS stream pause In-Reply-To: References: Message-ID: <200809171503.m8HF3frm091156@ns.live555.com> >I am using the livemedia library to send a TS stream via RTP (sending only). >Sometimes I need to pause the streaming for a few milliseconds. >Is it possible to do that? >I need to adjust the sending according to my receiver's speed. You should try to provide more buffering in your receiver, because the streaming should be happening at an appropriate bitrate, ***on average***. However, if you *really* want to adjust the rate at which the TS data gets sent, then the best way to do this would be to adjust the calculation of "fDurationInMicroseconds" in "liveMedia/MPEG2TransportStreamFramer.cpp". (As always, though, once you've made changes to the provided code, you can't expect much more help on this mailing list.) Ross Finlayson Live Networks, Inc.
(LIVE555.COM) From gbonneau at matrox.com Wed Sep 17 13:34:58 2008 From: gbonneau at matrox.com (Guy Bonneau) Date: Wed, 17 Sep 2008 16:34:58 -0400 Subject: [Live-devel] Potential Issues in ReorderingPacketBuffer Class Implementation Message-ID: <5239B0E1745647BBBF32B568725BB07D@dorvalmatrox.matrox.com> Ross, While reviewing the code of the ReorderingPacketBuffer class in file MultiFramedRTPSource.cpp, I think I might have found potential issues. The first issue is related to "delete fHeadPacket" when the reset() method of ReorderingPacketBuffer is called. The fHeadPacket pointer can be NULL if it is called in the context of ~ReorderingPacketBuffer() when the medium source is closed. I know it is legal in C++, but it is confusing for someone trying to understand the code. I have not thoroughly checked, but I also worry that in some cases, when the medium is closed, fHeadPacket and fSavedPacket are identical and non-NULL, causing a double delete. The second issue is related to the allocation of BufferedPacket. If the medium source is closed while many buffered packets are chained together because of a reordering issue at that moment, then I believe the BufferedPackets allocated by the getFreePacket method won't be freed and will cause memory leaks. If I am right, then some mechanism should be provided to walk the chained list and free all the allocated buffered packets. Regards Guy Bonneau From finlayson at live555.com Wed Sep 17 14:17:20 2008 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 17 Sep 2008 14:17:20 -0700 Subject: [Live-devel] Potential Issues in ReorderingPacketBuffer Class Implementation In-Reply-To: <5239B0E1745647BBBF32B568725BB07D@dorvalmatrox.matrox.com> References: <5239B0E1745647BBBF32B568725BB07D@dorvalmatrox.matrox.com> Message-ID: <200809172120.m8HLKYHL082808@ns.live555.com> >The first issue is related to "delete fHeadPacket" when the reset() method of >ReorderingPacketBuffer is called. The fHeadPacket pointer can be NULL if it >is called in the context of ~ReorderingPacketBuffer() when the medium >source is closed. I know it is legal in C++, but it is confusing for someone >trying to understand the code. There are many places in the code where delete might be called on a pointer that happens to be NULL. As you noted, this is legal (and it simplifies code considerably). > I have not thoroughly checked, but I also >worry that in some cases, when the medium is closed, fHeadPacket and >fSavedPacket are identical and non-NULL, causing a double delete. No, that won't happen, because if "fHeadPacket" and "fSavedPacket" are identical (which is actually the common case), then "fSavedPacketFree" will be False, so the call to "delete fSavedPacket;" won't happen. >The second issue is related to the allocation of BufferedPacket. If the >medium source is closed while many buffered packets are chained together >because of a reordering issue at that moment, then I believe the >BufferedPackets allocated by the getFreePacket method won't be freed and will >cause memory leaks. No, each of the "BufferedPacket"s in the chain will be deleted as a result of the call to "delete fHeadPacket;", because the "BufferedPacket" destructor includes the line "delete fNextPacket;". Ross Finlayson Live Networks, Inc. (LIVE555.COM) From cbitsunil at gmail.com Wed Sep 17 17:53:18 2008 From: cbitsunil at gmail.com (sunil sunil) Date: Thu, 18 Sep 2008 09:53:18 +0900 Subject: [Live-devel] Regarding TS stream pause In-Reply-To: <200809171503.m8HF3frm091156@ns.live555.com> References: <200809171503.m8HF3frm091156@ns.live555.com> Message-ID: Hi Ross, Thanks for the reply. Instead of adjusting "fDurationInMicroseconds", can I not stop streaming for a few milliseconds and restart again (pausing)? Because if the client is not ready to receive the data, I shouldn't send. If I use "RTPSink->stopPlaying", it completely stops, so I cannot use this. Please help me Sunil.
On Wed, Sep 17, 2008 at 11:56 PM, Ross Finlayson wrote: > > I am using livemedia library to send the TS Stream via RTP.(only sending) >> Some times I need to Pause the streaming for few milli seconds. >> Is it possible to do that.? >> According to my receiver's speed I need to adjust the sending. >> > > You should try to provide more buffering in your receiver, because the > streaming should be happening at an appropriate bitrate, ***on average***. > > However, if you *really* want to adjust the rate at which the TS data gets > sent, then the best way to do this would be to adjust the calculation of > "fDurationInMicroseconds" in "liveMedia/MPEG2TransportStreamFramer.cpp". > (As always, though, once you've made changes to the provided code, you > can't expect much more help on this mailing list.) > > > Ross Finlayson > Live Networks, Inc. (LIVE555.COM ) > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Sep 17 18:00:12 2008 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 17 Sep 2008 18:00:12 -0700 Subject: [Live-devel] Regarding TS stream pause In-Reply-To: References: <200809171503.m8HF3frm091156@ns.live555.com> Message-ID: <200809180104.m8I14gEm016122@ns.live555.com> >Instead of adjusting the "fDurationInMicroseconds" , can not I stop >streaming for few >milliseconds and restart again.? (pausing) Yes, of course you could do that. But it's not the right way to do what you want to do. The "fDurationInMicroseconds" variable does exactly what you want: It tells the code how long to wait before sending the next packet. Ross Finlayson Live Networks, Inc. 
(LIVE555.COM) From gbonneau at matrox.com Thu Sep 18 05:45:30 2008 From: gbonneau at matrox.com (Guy Bonneau) Date: Thu, 18 Sep 2008 08:45:30 -0400 Subject: [Live-devel] Potential Issues in ReorderingPacketBuffer Class Implementation In-Reply-To: <200809172120.m8HLKYHL082808@ns.live555.com> References: <5239B0E1745647BBBF32B568725BB07D@dorvalmatrox.matrox.com> <200809172120.m8HLKYHL082808@ns.live555.com> Message-ID: > No, each of the "BufferedPacket"s in the chain will be > deleted as a result of the call to "delete fHeadPacket;", > because the "BufferedPacket" destructor includes the line > "delete fNextPacket;". Indeed!!! Thanks for the clarification. It's much appreciated. From lucr at live.be Thu Sep 18 07:43:26 2008 From: lucr at live.be (Luc Rules) Date: Thu, 18 Sep 2008 16:43:26 +0200 Subject: [Live-devel] FW: Bi-directional audio Message-ID: Hi, Maybe a stupid question, but would it somehow be possible to use an RTSP client to send audio back to the server? The idea is to create an RTSP connection to the server to receive its live video and audio stream and, in the meantime, to have a way to send audio from the client to the server over the same RTSP connection, to achieve a bidirectional audio link. best regards, Luc Roebels -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Sep 18 11:37:58 2008 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 18 Sep 2008 11:37:58 -0700 Subject: [Live-devel] FW: Bi-directional audio In-Reply-To: References: Message-ID: <200809181841.m8IIfFAw017945@ns.live555.com> >Maybe a stupid question, but would it somehow be possible to use an >RTSP client to send audio back to the server?
There's no standard way to do this in RTSP, because that is predominantly a protocol for server->client streaming. A better protocol for two-way media transfer is SIP. We implement a SIP client (note the "playSIP" demo application), but currently only in the server->client direction. (To add client->server streaming, you would need to create "RTPSink" objects for each stream.) Ross Finlayson Live Networks, Inc. (LIVE555.COM) From sureshs at tataelxsi.co.in Thu Sep 18 23:04:23 2008 From: sureshs at tataelxsi.co.in (sureshs) Date: Fri, 19 Sep 2008 11:34:23 +0530 Subject: [Live-devel] H264 Streaming Error In-Reply-To: Message-ID: <000a01c91a1d$910d46f0$51053c0a@telxsi.com> Hi, I am streaming the test.h264 file using the code provided in the mailing list by Mike Qilorma. I could stream the H264 file, but MPlayer failed to play the streamed file; I am attaching the MPlayer messages here.
//////////////////////////Message//////////////////////////////
[root at rameshkarigarb ~]# ./testH264Streamer
Play this stream using the URL "rtsp://10.60.5.56:8554/testStream"
Beginning streaming...
Beginning to read from file...
...done reading from file
Beginning to read from file...
[root at rameshkarigarb ~]# mplayer rtsp://10.60.5.56:8554/testStream
MPlayer 1.0rc2-4.1.1 (C) 2000-2007 MPlayer Team
CPU: Intel(R) Pentium(R) 4 CPU 3.00GHz (Family: 15, Model: 4, Stepping: 9)
CPUflags: MMX: 1 MMX2: 1 3DNow: 0 3DNow2: 0 SSE: 1 SSE2: 1
Compiled for x86 CPU with extensions: MMX MMX2 SSE SSE2
Playing rtsp://10.60.5.56:8554/testStream.
Resolving 10.60.5.56 for AF_INET6...
Couldn't resolve name for AF_INET6: 10.60.5.56
Connecting to server 10.60.5.56[10.60.5.56]: 8554...
rtsp_session: unsupported RTSP server. Server type is 'unknown'.
STREAM_LIVE555, URL: rtsp://10.60.5.56:8554/testStream
Stream not seekable!
file format detected.
Initiated "video/H264" RTP subsession on port 18888
demux_rtp: Failed to guess the video frame rate
VIDEO: [H264] 0x0 0bpp 0.000 fps 0.0 kbps ( 0.0 kbyte/s)
FPS not specified in the header or invalid, use the -fps option.
No stream found.
Exiting... (End of file)

AFTER SPECIFYING THE FPS AS 30, MPLAYER IS GIVING THE FOLLOWING MESSAGES,
[root at rameshkarigarb ~]# mplayer -fps 5 rtsp://10.60.5.56:8554/testStream
MPlayer 1.0rc2-4.1.1 (C) 2000-2007 MPlayer Team
CPU: Intel(R) Pentium(R) 4 CPU 3.00GHz (Family: 15, Model: 4, Stepping: 9)
CPUflags: MMX: 1 MMX2: 1 3DNow: 0 3DNow2: 0 SSE: 1 SSE2: 1
Compiled for x86 CPU with extensions: MMX MMX2 SSE SSE2
Playing rtsp://10.60.5.56:8554/testStream.
Resolving 10.60.5.56 for AF_INET6...
Couldn't resolve name for AF_INET6: 10.60.5.56
Connecting to server 10.60.5.56[10.60.5.56]: 8554...
rtsp_session: unsupported RTSP server. Server type is 'unknown'.
STREAM_LIVE555, URL: rtsp://10.60.5.56:8554/testStream
Stream not seekable!
file format detected.
Initiated "video/H264" RTP subsession on port 18888
VIDEO: [H264] 0x0 0bpp 0.000 fps 0.0 kbps ( 0.0 kbyte/s)
==========================================================================
Opening video decoder: [ffmpeg] FFmpeg's libavcodec codec family
Selected video codec: [ffh264] vfm: ffmpeg (FFmpeg H.264)
==========================================================================
Audio: no sound
FPS forced to be 30.000
Starting playback...
V: 0.0 0/ 0 ??% ??% ??,?% 0 0
Exiting... (End of file)
[root at rameshkarigarb ~]#
///////////////////////////////////////////////////////////////////////////////////
What could be the problem? Regards, Suresh S The information contained in this electronic message and any attachments to this message are intended for the exclusive use of the addressee(s) and may contain proprietary, confidential or privileged information. If you are not the intended recipient, you should not disseminate, distribute or copy this e-mail.
Please notify the sender immediately and destroy all copies of this message and any attachments contained in it. From ibgg_11 at hotmail.com Fri Sep 19 10:01:48 2008 From: ibgg_11 at hotmail.com (israel garcia) Date: Fri, 19 Sep 2008 17:01:48 +0000 Subject: [Live-devel] Can I read from a video camera? Message-ID: In your test code, the source is a file test.*, but I would like to know if I can read from a video camera or a microphone, for example whether I can read from /dev/video0. Thanks for your help -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Sep 19 12:35:14 2008 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 19 Sep 2008 12:35:14 -0700 Subject: [Live-devel] Can I read from a video camera? In-Reply-To: References: Message-ID: <200809191938.m8JJciKt082276@ns.live555.com> At 10:01 AM 9/19/2008, you wrote: >In your test code, the source is a file test.*, but I would like to >know if I can read from a video camera or a microphone, for example >whether I can read from /dev/video0. Please read the FAQ!!! Ross Finlayson Live Networks, Inc. (LIVE555.COM) From erik at hovland.org Fri Sep 19 16:45:35 2008 From: erik at hovland.org (Erik Hovland) Date: Fri, 19 Sep 2008 16:45:35 -0700 Subject: [Live-devel] [patch] whitespace removal, spelling fixes, FSF address change Message-ID: <20080919234535.GA27609@hovland.org> I ran Krazy2 on the latest available live555. It turned up some tidbits. Whitespace at the end of line removal: http://hovland.org/live555-whitespace-removal.bz2 Spelling fixes (just two): http://hovland.org/live555-spelling-errors The FSF postal address has changed. The license headers need to reflect that: http://hovland.org/live555-fix-fsf-address.bz2 They are stacked in my tree in this order.
I can rearrange the order if it isn't desirable to apply a whitespace removal patch at this time. E -- Erik Hovland mail: erik at hovland.org web: http://hovland.org/ PGP/GPG public key available on request From finlayson at live555.com Fri Sep 19 17:14:58 2008 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 19 Sep 2008 17:14:58 -0700 Subject: [Live-devel] [patch] whitespace removal, spelling fixes, FSF address change In-Reply-To: <20080919234535.GA27609@hovland.org> References: <20080919234535.GA27609@hovland.org> Message-ID: <200809200019.m8K0JYIj075530@ns.live555.com> Thanks. BTW, I haven't forgotten about your earlier patches. I'll be reviewing/applying them, over time. Ross Finlayson Live Networks, Inc. (LIVE555.COM) From erik at hovland.org Fri Sep 19 17:21:09 2008 From: erik at hovland.org (Erik Hovland) Date: Fri, 19 Sep 2008 17:21:09 -0700 Subject: [Live-devel] [patch] whitespace removal, spelling fixes, FSF address change In-Reply-To: <200809200019.m8K0JYIj075530@ns.live555.com> References: <20080919234535.GA27609@hovland.org> <200809200019.m8K0JYIj075530@ns.live555.com> Message-ID: > BTW, I haven't forgotten about your earlier patches. I'll be > reviewing/applying them, over time. Thanks for the update. I appreciate it. I got a hold of 2008.09.02 and I have a couple of updates to one of my patches. I will get it out when I am sure it is ready. E -- Erik Hovland erik at hovland.org http://hovland.org/ From cbitsunil at gmail.com Sat Sep 20 02:00:23 2008 From: cbitsunil at gmail.com (sunil sunil) Date: Sat, 20 Sep 2008 18:00:23 +0900 Subject: [Live-devel] TimeStamp Message-ID: Hi, Is the time stamp calculation different between VLC and a livemedia stream (TS stream using RTP), or is the header different? My receiver is able to accept the data from VLC streaming, but unable to accept it if I send it through my application using the livemedia library. But VLC on the client PC is able to play the stream sent from my application.
What might be the problem with my client? Thanks in advance. Sunil. -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sat Sep 20 04:42:15 2008 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 20 Sep 2008 04:42:15 -0700 Subject: [Live-devel] TimeStamp In-Reply-To: References: Message-ID: <200809201146.m8KBk384099934@ns.live555.com> >My receiver is able to accept the data from VLC streaming, but >unable to accept it if I send it through my application >using the livemedia library. >But VLC on the client PC is able to play the stream sent from my application. >What might be the problem with my client? I don't know; only you can debug your own code. Note, though, that VLC - when receiving an RTSP/RTP stream - uses the "LIVE555 Streaming Media" libraries. I suggest that you first make sure that you can receive your data using our unmodified "openRTSP" application. Ross Finlayson Live Networks, Inc. (LIVE555.COM) From cbitsunil at gmail.com Sat Sep 20 05:24:40 2008 From: cbitsunil at gmail.com (sunil sunil) Date: Sat, 20 Sep 2008 21:24:40 +0900 Subject: [Live-devel] TimeStamp In-Reply-To: <200809201146.m8KBk384099934@ns.live555.com> References: <200809201146.m8KBk384099934@ns.live555.com> Message-ID: One more thing, Ross: is the time stamp technique used by LiveMedia and VLC the same? Thanks, Sunil. On Sat, Sep 20, 2008 at 8:42 PM, Ross Finlayson wrote: > > My receiver is able to accept the data from VLC streaming, but unable to >> accept it if I send it through my application >> using the livemedia library. >> But VLC on the client PC is able to play the stream sent from my application. >> What might be the problem with my client? >> > > I don't know; only you can debug your own code. Note, though, that VLC - > when receiving an RTSP/RTP stream - uses the "LIVE555 Streaming Media" > libraries. > > I suggest that you first make sure that you can receive your data using our > unmodified "openRTSP" application.
> > > Ross Finlayson > Live Networks, Inc. (LIVE555.COM ) > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sat Sep 20 05:28:48 2008 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 20 Sep 2008 05:28:48 -0700 Subject: [Live-devel] TimeStamp In-Reply-To: References: <200809201146.m8KBk384099934@ns.live555.com> Message-ID: <200809201232.m8KCWIR2048256@ns.live555.com> >Is the time stamp technique used by LiveMedia and VLC the same? Yes - both follow the RTP standard. Ross Finlayson Live Networks, Inc. (LIVE555.COM) From cbitsunil at gmail.com Sat Sep 20 21:33:26 2008 From: cbitsunil at gmail.com (sunil sunil) Date: Sun, 21 Sep 2008 13:33:26 +0900 Subject: [Live-devel] TimeStamp In-Reply-To: <200809201232.m8KCWIR2048256@ns.live555.com> References: <200809201146.m8KBk384099934@ns.live555.com> <200809201232.m8KCWIR2048256@ns.live555.com> Message-ID: But the timestamp difference between two consecutive packets in the VLC case is different from LiveMedia. I compared the first 5 packets and played the same file. The timestamp difference in the case of VLC is on average 0xdd, but in the case of LiveMedia it is 0x0f (average). What's the reason? Thanks and regards, Sunil. On Sat, Sep 20, 2008 at 9:28 PM, Ross Finlayson wrote: > > Is the time stamp technique used by LiveMedia and VLC the same? >> > > Yes - both follow the RTP standard. > > > > Ross Finlayson > Live Networks, Inc. (LIVE555.COM ) > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > -------------- next part -------------- An HTML attachment was scrubbed...
URL: From amit.yedidia at elbitsystems.com Sun Sep 21 05:44:34 2008 From: amit.yedidia at elbitsystems.com (Yedidia Amit) Date: Sun, 21 Sep 2008 15:44:34 +0300 Subject: [Live-devel] H264 streaming Message-ID: Hi Mike I would like to know if there is an updated version of the H264VideoStreamFramer.cpp you published a few months ago? Thank you very much. Regards, Amit Yedidia The information in this e-mail transmission contains proprietary and business sensitive information. Unauthorized interception of this e-mail may constitute a violation of law. If you are not the intended recipient, you are hereby notified that any review, dissemination, distribution or duplication of this communication is strictly prohibited. You are also asked to contact the sender by reply email and immediately destroy all copies of the original message. -------------- next part -------------- An HTML attachment was scrubbed... URL: From sukhbir.singh20 at gmail.com Mon Sep 22 05:07:56 2008 From: sukhbir.singh20 at gmail.com (Sukhbir Singh) Date: Mon, 22 Sep 2008 17:37:56 +0530 Subject: [Live-devel] NewBie Queries Message-ID: <3b2878e60809220507k647e654am4ef41ba9199a4e3f@mail.gmail.com> Hi All, I am a new user of live555 and I have some queries about it. I know these are very simple queries and that, by putting in some effort, I could easily find the answers myself, but due to a shortage of time I cannot do that:
1. Does live555 also do coding/decoding, or do we need to use the ffmpeg library for that?
2. Does live555 also provide SIP UAS functionality? As far as I know, it provides SIP UAC functionality. If not, please tell me where I can get code for a SIP UAS.
3. The test programs that are part of live555 provide only single-call support. I need to write an application over live555 that can support multiple simultaneous calls. Do I need to change the existing live555 code for that, or should I implement the logic for multiple-call support in my application?
Thanks in Advance -------------- next part -------------- An HTML attachment was scrubbed... URL: From mike at subfocal.net Mon Sep 22 15:51:17 2008 From: mike at subfocal.net (Mike Mueller) Date: Mon, 22 Sep 2008 18:51:17 -0400 Subject: [Live-devel] Discarding interleaved RTP or RTCP packet Message-ID: <20080922225117.GC19240@samus.subfocal.net> Hi all, I'm hitting a problem (via vlc), where playback behaves normally for a minute or two, and then a stream of messages appears on the console, and playback hangs: Discarding interleaved RTP or RTCP packet (1444 bytes, channel id 86) Repeated infinitely. This is happening when playing an MPEG4 RTSP stream from an Axis 243SA video server. Any insight as to what this might mean? It doesn't seem to happen consistently, but it's happening a lot today. Seen with both VLC 0.8.6i and 0.9.2. Thanks, Mike -- Mike Mueller mike at subfocal.net From finlayson at live555.com Mon Sep 22 17:40:38 2008 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 22 Sep 2008 17:40:38 -0700 Subject: [Live-devel] Discarding interleaved RTP or RTCP packet In-Reply-To: <20080922225117.GC19240@samus.subfocal.net> References: <20080922225117.GC19240@samus.subfocal.net> Message-ID: <200809230047.m8N0laxf014977@ns.live555.com> >I'm hitting a problem (via vlc), where playback behaves normally for a >minute or two, and then a stream of messages appears on the console, >and playback hangs: > >Discarding interleaved RTP or RTCP packet (1444 bytes, channel id 86) > >Repeated infinitely. This is happening when playing an MPEG4 RTSP >stream from an Axis 243SA video server. Any insight as to what this >might mean? It doesn't seem to happen consistently, but it's happening >a lot today. Seen with both VLC 0.8.6i and 0.9.2. This may be a problem with your server (although I can't say for sure). The only thing I can suggest right now is to not use RTP-over-TCP streaming. 
(That means, of course, that you can't have any firewall between your server and client that's blocking UDP packets.) Ross Finlayson Live Networks, Inc. (LIVE555.COM) From sureshs at tataelxsi.co.in Mon Sep 22 20:33:08 2008 From: sureshs at tataelxsi.co.in (sureshs) Date: Tue, 23 Sep 2008 09:03:08 +0530 Subject: [Live-devel] Regards your question in live555 mailing list In-Reply-To: Message-ID: <00c501c91d2d$19a33530$51053c0a@telxsi.com> Refer to the June 2008 archive. Regards, Suresh S Phone: +91-80-22979919 Mobile: +91-99020-66995 -----Original Message----- From: Yedidia Amit [mailto:amit.yedidia at elbitsystems.com] Sent: Sunday, September 21, 2008 3:33 PM To: sureshs Subject: Regards your question in live555 mailing list Dear Suresh, I don't have an answer to your question but can you please send me a link to the code you talked about? (H264 streaming) Regards, Amit Yedidia Elbit System Ltd. Email: amit.yedidia at elbitsystems.com Tel: 972-4-8318905 -------------- next part -------------- A non-text attachment was scrubbed...
Name: winmail.dat Type: application/ms-tnef Size: 4320 bytes Desc: not available URL: From AidenYuan at coomosoft.com Tue Sep 23 02:58:20 2008 From: AidenYuan at coomosoft.com (Aiden_Coomo) Date: Tue, 23 Sep 2008 17:58:20 +0800 Subject: [Live-devel] question about the live555 RTP sending speed In-Reply-To: References: Message-ID: <000001c91d62$e8ffab80$9909a8c0@shuweiyuan> Hi, guys! I have added an H.264 payload into live555MediaServer, but I can't smoothly play an H.264 file; its size is 108,333,536 bytes and its playing time is 255 seconds. After a simple analysis, I found live555MediaServer spent 30 seconds sending about 1000 RTP packets. If we suppose that one RTP packet has 1500 bytes (max MTU), the maximum number of bytes sent in 30 seconds is about 1000*1500 = 1,500,000, so the maximum sending rate is 1,500,000/30 = 300 Kbyte/s (2,400 Kbit/s). So I want to know: can I smoothly play a media file whose rate is over 300 Kbyte/s, and how? I tried to play a .m4e file using the latest live555MediaServer as the server; after capturing packets, I found the average RTP sending rate is about 20 packets/s. Notes: I used QuickTime and RealPlayer as clients, and Ethereal to capture the packets. The local network is sufficient (1 Gbps). Best Regards -----Original Message----- From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of live-devel-request at ns.live555.com Sent: 23 September 2008 8:48 To: live-devel at ns.live555.com Subject: live-devel Digest, Vol 59, Issue 13 Send live-devel mailing list submissions to live-devel at lists.live555.com To subscribe or unsubscribe via the World Wide Web, visit http://lists.live555.com/mailman/listinfo/live-devel or, via email, send a message with subject or body 'help' to live-devel-request at lists.live555.com You can reach the person managing the list at live-devel-owner at lists.live555.com When replying, please edit your Subject line so it is more specific than "Re: Contents of live-devel digest..."
From Anoop_P.A at pmc-sierra.com Tue Sep 23 05:16:46 2008 From: Anoop_P.A at pmc-sierra.com (Anoop P.A.) Date: Tue, 23 Sep 2008 05:16:46 -0700 Subject: [Live-devel] getting parsePESPacket error while streaming MPEG2 Message-ID: Hi, I have compiled this particular source code (http://www.live555.com/liveMedia/public/live555-latest.tar.gz) on my FC5 machine, and I streamed this video ( http://www.leadcodecs.com/Download/mpeg/WhatBox_MPEG2_720x480_q5.mpg ). Once I opened the rtsp link from a Win XP machine in the same subnet, I got the following error on the server side:
./live555MediaServer
LIVE555 Media Server version 0.19 (LIVE555 Streaming Media library version 2008.09.02).
Play streams from this server using the URL rtsp://172.16.48.50/<filename> where <filename> is a file present in the current directory.
Each file's type is inferred from its name suffix:
".aac" => an AAC Audio (ADTS format) file
".amr" => an AMR Audio file
".m4e" => a MPEG-4 Video Elementary Stream file
".mp3" => a MPEG-1 or 2 Audio file
".mpg" => a MPEG-1 or 2 Program Stream (audio+video) file
".ts" => a MPEG Transport Stream file (a ".tsx" index file - if present - provides server 'trick play' support)
".wav" => a WAV Audio file
See http://www.live555.com/mediaServer/ for additional documentation.
MPEGProgramStreamParser::parsePESPacket() error: PES_packet_length (10034) exceeds max frame size asked for (10000)
MPEGProgramStreamParser::parsePESPacket() error: PES_packet_length (18846) exceeds max frame size asked for (10000)
The video I am getting in VLC is scrambled. I tried opening the link using the QuickTime player and got only audio. I tried recording using the openRTSP client; even though I got the same kind of prints on the server side, I was able to download. To check whether it is a problem with my FC5 box, I downloaded the binary from ( http://www.live555.com/mediaServer/linux/live555MediaServer ) and ran it on a RHEL4 machine, and found the same issue there too:
./live555MediaServer
LIVE555 Media Server version 0.19 (LIVE555 Streaming Media library version 2008.02.08).
Play streams from this server using the URL rtsp://209.68.166.125:8554/<filename> where <filename> is a file present in the current directory.
Each file's type is inferred from its name suffix:
".aac" => an AAC Audio (ADTS format) file
".amr" => an AMR Audio file
".m4e" => a MPEG-4 Video Elementary Stream file
".mp3" => a MPEG-1 or 2 Audio file
".mpg" => a MPEG-1 or 2 Program Stream (audio+video) file
".ts" => a MPEG Transport Stream file (a ".tsx" index file - if present - provides server 'trick play' support)
".wav" => a WAV Audio file
See http://www.live555.com/mediaServer/ for additional documentation.
MPEGProgramStreamParser::parsePESPacket() error: PES_packet_length (10034) exceeds max frame size asked for (10000)
MPEGProgramStreamParser::parsePESPacket() error: PES_packet_length (18846) exceeds max frame size asked for (10000)
MPEGProgramStreamParser::parsePESPacket() error: PES_packet_length (19322) exceeds max frame size asked for (10000)
MPEGProgramStreamParser::parsePESPacket() error: PES_packet_length (18858) exceeds max frame size asked for (10000)
MPEGProgramStreamParser::parsePESPacket() error: PES_packet_length (18858) exceeds max frame size asked for (10000)
I am new to video streaming. Kindly let me know if I am doing anything wrong. Thanks Anoop -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Sep 23 06:11:05 2008 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 23 Sep 2008 06:11:05 -0700 Subject: [Live-devel] question about the live555 RTP sending speed In-Reply-To: <000001c91d62$e8ffab80$9909a8c0@shuweiyuan> References: <000001c91d62$e8ffab80$9909a8c0@shuweiyuan> Message-ID: <200809231314.m8NDERRU095003@ns.live555.com> The transmission speed is determined by the value of the "fDurationInMicroseconds" variable.
Your "*Framer*" object (in this case, your "H264VideoStreamFramer" subclass) needs to properly set this for each H.264 NAL unit that it delivers. Ross Finlayson Live Networks, Inc. (LIVE555.COM) From amit.yedidia at elbitsystems.com Tue Sep 23 06:27:28 2008 From: amit.yedidia at elbitsystems.com (Yedidia Amit) Date: Tue, 23 Sep 2008 16:27:28 +0300 Subject: [Live-devel] Implementing H264VideoStreamFramer Message-ID: Hi Ross, I am trying to implement H264VideoStreamFramer but have run into some difficulties. The first time H264VideoStreamFramer::continueReadProcessing() runs, it calls fParser->parse(frameDuration). This call returns with an exception, which, according to other examples, is expected. However, I couldn't figure out who fills the buffer, on what schedule, and when it was initialized. In addition, H264VideoRTPSink::doGetNextFrame() is no longer being called. Where was it supposed to be scheduled? Thank you. Regards, Amit Yedidia Elbit System Ltd. Email: amit.yedidia at elbitsystems.com Tel: 972-4-8318905 ---------------------------------------------------------- The information in this e-mail transmission contains proprietary and business sensitive information. Unauthorized interception of this e-mail may constitute a violation of law. If you are not the intended recipient, you are hereby notified that any review, dissemination, distribution or duplication of this communication is strictly prohibited. You are also asked to contact the sender by reply email and immediately destroy all copies of the original message. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Tue Sep 23 07:20:32 2008 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 23 Sep 2008 07:20:32 -0700 Subject: [Live-devel] getting parsePESPacket error while streaming MPEG2 In-Reply-To: References: Message-ID: <200809231426.m8NEQ0J8070171@ns.live555.com> >MPEGProgramStreamParser::parsePESPacket() error: PES_packet_length >(10034) exceeds max frame size asked for (10000) >MPEGProgramStreamParser::parsePESPacket() error: PES_packet_length >(18846) exceeds max frame size asked for (10000) You can overcome these error messages by increasing the size of the "fBuf" array (in "MPEG1or2FileServerDemux.cpp", line 189) to 20000 (or more). (However, I suspect that there may be a problem with your MPEG file, because when I play it using VLC, I hear the audio, but don't see any video.) Ross Finlayson Live Networks, Inc. (LIVE555.COM) From Anoop_P.A at pmc-sierra.com Tue Sep 23 07:42:00 2008 From: Anoop_P.A at pmc-sierra.com (Anoop P.A.) Date: Tue, 23 Sep 2008 07:42:00 -0700 Subject: [Live-devel] getting parsePESPacket error while streaming MPEG2 In-Reply-To: <200809231426.m8NEQ0J8070171@ns.live555.com> Message-ID: Hi Ross, Thanks for your immediate response. I tried other MPEG files as well and got the same error. >You can overcome these error messages by increasing the size of the >"fBuf" array (in "MPEG1or2FileServerDemux.cpp", line 189) to 20000 (or >more). If I increase fBuf, what will the impact be? 
Thanks Anoop -----Original Message----- From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Tuesday, September 23, 2008 7:51 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] getting parsePESPacket error while streaming MPEG2 >MPEGProgramStreamParser::parsePESPacket() error: PES_packet_length >(10034) exceeds max frame size asked for (10000) >MPEGProgramStreamParser::parsePESPacket() error: PES_packet_length >(18846) exceeds max frame size asked for (10000) You can overcome these error messages by increasing the size of the "fBuf" array (in "MPEG1or2FileServerDemux.cpp", line 189) to 20000 (or more). (However, I suspect that there may be a problem with your MPEG file, because when I play it using VLC, I hear the audio, but don't see any video.) Ross Finlayson Live Networks, Inc. (LIVE555.COM) _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel From Anoop_P.A at pmc-sierra.com Tue Sep 23 08:13:37 2008 From: Anoop_P.A at pmc-sierra.com (Anoop P.A.) Date: Tue, 23 Sep 2008 08:13:37 -0700 Subject: [Live-devel] getting parsePESPacket error while streaming MPEG2 In-Reply-To: Message-ID: Hi Ross, This might be a bug in VLC, as this particular file opens fine with Windows Media Player; but if we open it with VLC, it does not show the video fully. Thanks Anoop -----Original Message----- From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Anoop P.A. Sent: Tuesday, September 23, 2008 8:12 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] getting parsePESPacket error while streaming MPEG2 Hi Ross, Thanks for your immediate response. 
I tried other MPEG files as well and got the same error. >You can overcome these error messages by increasing the size of the >"fBuf" array (in "MPEG1or2FileServerDemux.cpp", line 189) to 20000 (or >more). If I increase fBuf, what will the impact be? Thanks Anoop -----Original Message----- From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Tuesday, September 23, 2008 7:51 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] getting parsePESPacket error while streaming MPEG2 >MPEGProgramStreamParser::parsePESPacket() error: PES_packet_length >(10034) exceeds max frame size asked for (10000) >MPEGProgramStreamParser::parsePESPacket() error: PES_packet_length >(18846) exceeds max frame size asked for (10000) You can overcome these error messages by increasing the size of the "fBuf" array (in "MPEG1or2FileServerDemux.cpp", line 189) to 20000 (or more). (However, I suspect that there may be a problem with your MPEG file, because when I play it using VLC, I hear the audio, but don't see any video.) Ross Finlayson Live Networks, Inc. (LIVE555.COM) _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel From mrbrain at email.it Tue Sep 23 08:54:51 2008 From: mrbrain at email.it (MrBrain) Date: Tue, 23 Sep 2008 17:54:51 +0200 Subject: [Live-devel] mp3 streaming does not work on different machines Message-ID: <61159659944aabd02ec9d5aa8bfd19e4@212.189.140.12> I am using testMP3Streamer on the sending side, and testMP3Receiver on the receiving side. 
If sender and receiver are on the same machine, everything works fine; but if they are on different machines, testMP3Receiver does not give any error, but does not receive anything either (it just prints out "Beginning receiving multicast stream"). There is no firewall; do you have any idea of the possible reason? -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Sep 23 10:42:21 2008 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 23 Sep 2008 10:42:21 -0700 Subject: [Live-devel] mp3 streaming does not work on different machines In-Reply-To: <61159659944aabd02ec9d5aa8bfd19e4@212.189.140.12> References: <61159659944aabd02ec9d5aa8bfd19e4@212.189.140.12> Message-ID: <200809231749.m8NHntll083709@ns.live555.com> At 08:54 AM 9/23/2008, MrBrain wrote: This is a professional mailing list - not "MySpace". Pseudonyms are not welcome here. >I am using testMP3Streamer on the sending side, and testMP3Receiver >on the receiving side. > >If sender and receiver are on the same machine, everything works >fine, but if they are on different machines, testMP3Receiver does >not give any error, but does not receive anything either (it prints >out "Beginning receiving multicast stream"). > >There is no firewall, do you have any idea about the possible reason? There might not be a firewall between the sender and receiver, but apparently you have a router there that (for whatever reason) does not relay IP multicast packets. Because of this, you should use unicast streaming instead. Use "testOnDemandRTSPServer" or "live555MediaServer" as your server, and "openRTSP" (or a media player, like "VLC") as your client. 
Ross Finlayson Live Networks, Inc. (LIVE555.COM) From finlayson at live555.com Tue Sep 23 10:52:04 2008 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 23 Sep 2008 10:52:04 -0700 Subject: [Live-devel] getting parsePESPacket error while streaming MPEG2 In-Reply-To: References: Message-ID: <200809231759.m8NHxrxI094135@ns.live555.com> >This might be a bug in VLC, as this particular file opens fine >with Windows Media Player; but if we open it with VLC, it does not show >the video fully. Please report this to the VLC developers (at "vlc at videolan.org"). They will be interested to learn of files that VLC does not play properly (but other media players do). Ross Finlayson Live Networks, Inc. (LIVE555.COM) From cbitsunil at gmail.com Tue Sep 23 18:32:10 2008 From: cbitsunil at gmail.com (sunil sunil) Date: Wed, 24 Sep 2008 10:32:10 +0900 Subject: [Live-devel] VLC manipulation of input stream Message-ID: Hi All, Does VLC manipulate the input stream while streaming via RTP, compared to the liveMedia library? I compared both using Wireshark, and found that liveMedia streams the data without manipulating the input stream, but VLC modifies it. Does anybody know what exactly the difference is between the two when streaming data? Thanks in advance. Sunil. -------------- next part -------------- An HTML attachment was scrubbed... URL: From tjhk99 at mst.edu Tue Sep 23 20:22:04 2008 From: tjhk99 at mst.edu (Harms, Tyler Jacob (S&T-Student)) Date: Tue, 23 Sep 2008 22:22:04 -0500 Subject: [Live-devel] AAC Streaming Via Darwin Message-ID: Hi, I am attempting to stream AAC audio (from an encoder) using live555 libraries through a Darwin Streaming Server. I think I have configured everything properly, but I get the error 'MPEG4GenericRTPSource Warning: Unknown or unsupported "mode": (null)' when attempting to stream the audio. 
My understanding is that the "mode" variable in MPEG4GenericRTPSource is set on creation, and the only place I see it created is from within the MediaSubsession creation, where fMode is passed to the new MPEG4GenericRTPSource. Unfortunately, I don't see any way to initialize fMode without using an SDP file, which is counter to what we are attempting to do with Darwin. What method should be used to set this variable? Can I do it on initialization of the SimpleRTPSink? My audio RTP sink is set up as follows (in a larger context, this is a modification of the testMPEG4VideoToDarwin example): audioSink = SimpleRTPSink::createNew(*env, &rtpGroupsockAudio, 97, 44100, "audio", "mpeg4-generic", 2); Thanks, -Tyler From malte at frontbase.com Tue Sep 23 22:56:12 2008 From: malte at frontbase.com (Malte Tancred) Date: Wed, 24 Sep 2008 07:56:12 +0200 Subject: [Live-devel] question about the live555 RTP sending speed In-Reply-To: <200809231314.m8NDERRU095003@ns.live555.com> References: <000001c91d62$e8ffab80$9909a8c0@shuweiyuan> <200809231314.m8NDERRU095003@ns.live555.com> Message-ID: On 23 sep 2008, at 15.11, Ross Finlayson wrote: > The transmission speed is determined by the value of the > "fDurationInMicroseconds" variable. Your "*Framer* object (in this > case, your "H264VideoStreamFramer" subclass) needs to properly set > this for each H.264 NAL unit that it delivers. I suppose this is correct. However, "properly set" doesn't necessarily mean what you first think it means. These are my findings: - all NAL units that belong to the same frame, except the last NAL unit in that frame, should have a duration of 0 (zero). - the last NAL unit in each frame should have a duration matching that of its frame. - all NAL units in a frame should have the same presentation time. 
(Please note that what I call "frame" above is called an "access unit" in the liveMedia library) What brought my attention to this was the following message to the list, regarding zero duration "frames" delivered by the MPEG stream framer: http://lists.live555.com/pipermail/live-devel/2004-April/000650.html Regards, Malte From finlayson at live555.com Wed Sep 24 00:16:22 2008 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 24 Sep 2008 00:16:22 -0700 Subject: [Live-devel] AAC Streaming Via Darwin In-Reply-To: References: Message-ID: <200809240721.m8O7Lpbk029775@ns.live555.com> >I am attempting to stream AAC audio (from an encoder) using live555 >libraries through a Darwin Streaming Server. A reminder once again: You don't need to use a separate 'Darwin' server. We have our own RTSP server implementation which you could use instead. However... >I think I have configured everything properly, but I get the error >'MPEG4GenericRTPSource Warning Why are you creating a "RTPSource" at all?? "RTPSource" objects are used to *receive* RTP streams. Because you are transmitting an RTP stream, not receiving one, you use an "RTPSink" - specifically, in this case, a "MPEG4GenericRTPSink". >audioSink = SimpleRTPSink::createNew(*env, &rtpGroupsockAudio, 97, >44100, "audio", "mpeg4-generic", 2); Don't use a "SimpleRTPSink"; instead use a "MPEG4GenericRTPSink", which was specifically designed for streaming AAC audio. Pass "audio" as the "sdpMediaTypeString" parameter, and "AAC-hbr" as the "mpeg4Mode" parameter. (Once again, if you'd used our RTSP server implementation, then a lot of this work would already be done for you.) Ross Finlayson Live Networks, Inc. 
(LIVE555.COM) From finlayson at live555.com Wed Sep 24 00:26:26 2008 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 24 Sep 2008 00:26:26 -0700 Subject: [Live-devel] question about the live555 RTP sending speed In-Reply-To: References: <000001c91d62$e8ffab80$9909a8c0@shuweiyuan> <200809231314.m8NDERRU095003@ns.live555.com> Message-ID: <200809240732.m8O7W4vJ040408@ns.live555.com> >>The transmission speed is determined by the value of the >>"fDurationInMicroseconds" variable. Your "*Framer* object (in this >>case, your "H264VideoStreamFramer" subclass) needs to properly set >>this for each H.264 NAL unit that it delivers. > >I suppose this is correct. However, "properly set" doesn't necessarily >mean what you first think it means. Since I wrote the software, I'm the person who gets to define what "properly set" means :-) >These are my findings: > >- all NAL units that belong to the same frame, except the last NAL > unit in that frame, should have a duration of 0 (zero). > >- the last NAL unit in each frame should have a duration matching > that of its frame. > >- all NAL units in a frame should have the same presentation time. None of this is inconsistent with what I said earlier. Your "H264VideoStreamFramer" subclass implementation should set the "fPresentationTime" and "fDurationInMicroseconds" variables in accordance with what you just said. Ross Finlayson Live Networks, Inc. (LIVE555.COM) From finlayson at live555.com Wed Sep 24 00:28:52 2008 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 24 Sep 2008 00:28:52 -0700 Subject: [Live-devel] VLC manipulationof input stream In-Reply-To: References: Message-ID: <200809240737.m8O7bCVF045740@ns.live555.com> Questions about VLC should be sent to the "vlc at videolan.org" mailing list. Ross Finlayson Live Networks, Inc. 
(LIVE555.COM) From venugopalpaikr at tataelxsi.co.in Wed Sep 24 02:10:06 2008 From: venugopalpaikr at tataelxsi.co.in (venugopalpaikr) Date: Wed, 24 Sep 2008 14:40:06 +0530 Subject: [Live-devel] Query about how to build an RTSP server Message-ID: <006001c91e25$56b474e0$3c033c0a@telxsi.com> Hi, I am trying to develop an RTSP server using the RTSPServer.cpp code. Is it enough for me to make an object of the class RTSPServer? After initializing this object, will the code automatically take care of listening to requests from the client and responding to them, or should any modification be done? Kindly provide me with some insights. Thanks, Venu The information contained in this electronic message and any attachments to this message are intended for the exclusive use of the addressee(s) and may contain proprietary, confidential or privileged information. If you are not the intended recipient, you should not disseminate, distribute or copy this e-mail. Please notify the sender immediately and destroy all copies of this message and any attachments contained in it. From malte at frontbase.com Wed Sep 24 02:12:19 2008 From: malte at frontbase.com (Malte Tancred) Date: Wed, 24 Sep 2008 11:12:19 +0200 Subject: [Live-devel] question about the live555 RTP sending speed In-Reply-To: <200809240732.m8O7W4vJ040408@ns.live555.com> References: <000001c91d62$e8ffab80$9909a8c0@shuweiyuan> <200809231314.m8NDERRU095003@ns.live555.com> <200809240732.m8O7W4vJ040408@ns.live555.com> Message-ID: On 24 sep 2008, at 09.26, Ross Finlayson wrote: >> I suppose this is correct. However, "properly set" doesn't >> necessarily >> mean what you first think it means. > > Since I wrote the software, I'm the person who gets to define what > "properly set" means :-) Ah, definitely! I never meant to suggest otherwise. I used the wording "what you first think" to mean "you" in general, not you Ross in particular. Perhaps "what one first thinks" would have been a better wording? Anyway, sorry for the confusion. 
I blame it on English not being my native tongue. :-) > None of this is inconsistent with what I said earlier. Nor was it my intent to say it was. Just wanted to point out a potential pitfall (one that I sadly fell into). > Your "H264VideoStreamFramer" subclass implementation should set the > "fPresentationTime" and "fDurationInMicroseconds" variables in > accordance with what you just said. Good to know! Thanks, Malte From finlayson at live555.com Wed Sep 24 02:15:48 2008 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 24 Sep 2008 02:15:48 -0700 Subject: [Live-devel] Query about how to build an RTSP server In-Reply-To: <006001c91e25$56b474e0$3c033c0a@telxsi.com> References: <006001c91e25$56b474e0$3c033c0a@telxsi.com> Message-ID: <200809240921.m8O9Ln5P054951@ns.live555.com> > I am trying to develop an RTSP server using the RTSPServer.cpp code. Is it >enough for me to make an object of the class RTSPServer? After initializing >this object, will the code automatically take care of listening to requests >from the client and responding to them Yes. I suggest looking at the code for the existing "testOnDemandRTSPServer" demo application for help. Ross Finlayson Live Networks, Inc. (LIVE555.COM) From dmaljur at elma.hr Wed Sep 24 02:31:23 2008 From: dmaljur at elma.hr (Dario) Date: Wed, 24 Sep 2008 11:31:23 +0200 Subject: [Live-devel] Creating multiple RTSP receiver clients in one process. Message-ID: <001301c91e28$4f7e1d40$ec03000a@gen47> How many concurrent streams can live555 handle? At one time I would need to handle 64 concurrent streams. What is the best way to create multiple RTSPClients that will all receive an MPEG-4 stream at the same time? ELMA Kurtalj d.o.o. (ELMA Kurtalj ltd.) 
Vitezićeva 1a, 10000 Zagreb, Hrvatska (Viteziceva 1a, 10000 Zagreb, Croatia) Tel: 01/3035555, Faks: 01/3035599 (Tel: ++385-1-3035555, Fax: ++385-1-3035599 ) Www: www.elma.hr; shop.elma.hr E-mail: elma at elma.hr (elma at elma.hr) pitanje at elma.hr (questions at elma.hr) primjedbe at elma.hr (complaints at elma.hr) prodaja at elma.hr (sales at elma.hr) servis at elma.hr (servicing at elma.hr) shop at elma.hr (shop at elma.hr) skladiste at elma.hr (warehouse at elma.hr) -------------- next part -------------- An HTML attachment was scrubbed... URL: From venugopalpaikr at tataelxsi.co.in Wed Sep 24 03:06:39 2008 From: venugopalpaikr at tataelxsi.co.in (venugopalpaikr) Date: Wed, 24 Sep 2008 15:36:39 +0530 Subject: [Live-devel] Query about how to build an RTSP server In-Reply-To: <200809240921.m8O9Ln5P054951@ns.live555.com> Message-ID: <007801c91e2d$3d2f4740$3c033c0a@telxsi.com> Hi Ross, Thanks for the information. I did check out the testOnDemandRTSPServer code, wherein the RTSP server is initiated and so is the streaming of stored data. I am developing an RTSP server that waits for a request from a client and, on receiving the request, initiates streaming from a camera. Can you please provide me more insights as to how I should change the existing code for this scenario? Regards, Venu -----Original Message----- From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com]On Behalf Of Ross Finlayson Sent: Wednesday, September 24, 2008 2:46 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Query about how to build an RTSP server > I am trying to develop an RTSP server using the RTSPServer.cpp code. Is it >enough for me to make an object of the class RTSPServer? After initializing >this object, will the code automatically take care of listening to requests >from the client and responding to them Yes. I suggest looking at the code for the existing "testOnDemandRTSPServer" demo application for help. 
Ross Finlayson Live Networks, Inc. (LIVE555.COM) _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel From finlayson at live555.com Wed Sep 24 06:55:24 2008 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 24 Sep 2008 06:55:24 -0700 Subject: [Live-devel] Query about how to build an RTSP server In-Reply-To: <007801c91e2d$3d2f4740$3c033c0a@telxsi.com> References: <200809240921.m8O9Ln5P054951@ns.live555.com> <007801c91e2d$3d2f4740$3c033c0a@telxsi.com> Message-ID: <200809241358.m8ODwibd044127@ns.live555.com> > Thanks for the information. I did check out the >testOnDemandRTSPServer code, wherein the RTSP server is initiated and so is >the streaming of stored data. I am developing an RTSP server that waits for >a request from a client and, on receiving the request, initiates streaming from >a camera. Can you please provide me more insights as to how I should change >the existing code for this scenario? See Ross Finlayson Live Networks, Inc. (LIVE555.COM) From finlayson at live555.com Wed Sep 24 07:28:12 2008 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 24 Sep 2008 07:28:12 -0700 Subject: [Live-devel] Creating multiple RTSP receiver clients in one process. In-Reply-To: <001301c91e28$4f7e1d40$ec03000a@gen47> References: <001301c91e28$4f7e1d40$ec03000a@gen47> Message-ID: <200809241431.m8OEVc7B078644@ns.live555.com> At 02:31 AM 9/24/2008, you wrote: >How many concurrent streams can live555 handle? 
> >At one time I would need to handle 64 concurrent streams. >What is the best way to create multiple RTSPClients that will all >receive an MPEG-4 stream at the same time? In principle you can create multiple "RTSPClient" objects, each using a single event loop (and a single thread). However, because our "RTSPClient" implementation currently uses synchronous (blocking) network I/O, this will not work well. Therefore, for now, I suggest creating multiple *processes*, each with its own "RTSPClient". Ross Finlayson Live Networks, Inc. (LIVE555.COM) From AidenYuan at coomosoft.com Wed Sep 24 23:10:04 2008 From: AidenYuan at coomosoft.com (Aiden_Coomo) Date: Thu, 25 Sep 2008 14:10:04 +0800 Subject: [Live-devel] Re: question about the live555 RTP sending speed In-Reply-To: <200809231314.m8NDERRU095003@ns.live555.com> References: <000001c91d62$e8ffab80$9909a8c0@shuweiyuan> <200809231314.m8NDERRU095003@ns.live555.com> Message-ID: <000201c91ed5$5a6b2b50$9909a8c0@shuweiyuan> Hi, Ross! Thanks for your suggestion; I can do better now. But during playback, quickly moving objects are still not legible. Did I miss other variables like "fDurationInMicroseconds"? -----Original Message----- From: Ross Finlayson [mailto:finlayson at live555.com] Sent: September 23, 2008 21:11 To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] question about the live555 RTP sending speed The transmission speed is determined by the value of the "fDurationInMicroseconds" variable. Your "*Framer*" object (in this case, your "H264VideoStreamFramer" subclass) needs to properly set this for each H.264 NAL unit that it delivers. Ross Finlayson Live Networks, Inc. 
(LIVE555.COM) From venugopalpaikr at tataelxsi.co.in Wed Sep 24 23:55:15 2008 From: venugopalpaikr at tataelxsi.co.in (venugopalpaikr) Date: Thu, 25 Sep 2008 12:25:15 +0530 Subject: [Live-devel] Query regarding our_inet_ntoa() Message-ID: <00c301c91edb$aa9f5690$3c033c0a@telxsi.com> Hi, I am trying to develop an RTSP server using RTSPServer.cpp. During compilation I am getting the following errors: RTSPServer.cpp:157: error: 'our_inet_ntoa' cannot be used as a function inet.c:17: error: 'cp' was not declared in this scope inet.c:18: error: expected ',' or ';' before 'char' inet.c:19: error: expected unqualified-id before '{' token inet.c:24: error: 'in' was not declared in this scope inet.c:25: error: expected ',' or ';' before 'struct' inet.c:26: error: expected unqualified-id before '{' token inet.c:87: error: 'name' was not declared in this scope inet.c:88: error: expected ',' or ';' before 'char' inet.c:89: error: expected unqualified-id before '{' token GroupsockHelper.hh:127: error: 'char* our_inet_ntoa(in_addr)' redeclared as different kind of symbol inet.c:87: error: previous declaration of 'hostent* our_gethostbyname' inet.c:99: error: previous declaration of 'long int our_random()' with C++ linkage Has anybody else encountered these kinds of errors? Regards Venu 
From AidenYuan at coomosoft.com Wed Sep 24 23:51:54 2008 From: AidenYuan at coomosoft.com (Aiden_Coomo) Date: Thu, 25 Sep 2008 14:51:54 +0800 Subject: [Live-devel] Re: question about the live555 RTP sending speed In-Reply-To: References: <000001c91d62$e8ffab80$9909a8c0@shuweiyuan><200809231314.m8NDERRU095003@ns.live555.com> Message-ID: <000301c91edb$321f4130$9909a8c0@shuweiyuan> Hi, Malte! Thanks. My questions are inline below, marked [Aiden]. -----Original Message----- From: Malte Tancred [mailto:malte at frontbase.com] Sent: September 24, 2008 13:56 To: live-devel at ns.live555.com Subject: Re: [Live-devel] question about the live555 RTP sending speed On 23 sep 2008, at 15.11, Ross Finlayson wrote: > The transmission speed is determined by the value of the > "fDurationInMicroseconds" variable. Your "*Framer*" object (in this > case, your "H264VideoStreamFramer" subclass) needs to properly set > this for each H.264 NAL unit that it delivers. I suppose this is correct. However, "properly set" doesn't necessarily mean what you first think it means. These are my findings: - all NAL units that belong to the same frame, except the last NAL unit in that frame, should have a duration of 0 (zero). [Aiden:] What is the meaning of "duration"? I guess you mean that consecutive NAL units in the same frame have the same timestamp, right? - the last NAL unit in each frame should have a duration matching that of its frame. [Aiden:] Currently, all NAL units in an access unit have the same timestamp in my code. Is that what you mean? - all NAL units in a frame should have the same presentation time. [Aiden:] What is the presentation time? The timestamp? 
(Please note that what I call "frame" above is called an "access unit" in the liveMedia library) What brought my attention to this was the following message to the list, regarding zero duration "frames" delivered by the MPEG stream framer: http://lists.live555.com/pipermail/live-devel/2004-April/000650.html Regards, Malte -------------- next part -------------- An HTML attachment was scrubbed... URL: From AidenYuan at coomosoft.com Thu Sep 25 00:25:16 2008 From: AidenYuan at coomosoft.com (Aiden_Coomo) Date: Thu, 25 Sep 2008 15:25:16 +0800 Subject: [Live-devel] Re: question about the live555 RTP sending speed In-Reply-To: <200809240732.m8O7W4vJ040408@ns.live555.com> References: <000001c91d62$e8ffab80$9909a8c0@shuweiyuan><200809231314.m8NDERRU095003@ns.live555.com> <200809240732.m8O7W4vJ040408@ns.live555.com> Message-ID: <000801c91edf$dbc08f60$9909a8c0@shuweiyuan> Sorry, I couldn't grasp "properly set". ^_^ Another question, about "fPresentationTime": the sending rate depends on "fDurationInMicroseconds", so what is "fPresentationTime" related to? The timestamp, right? -----Original Message----- From: Ross Finlayson [mailto:finlayson at live555.com] Sent: September 24, 2008 15:26 To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] question about the live555 RTP sending speed >>The transmission speed is determined by the value of the >>"fDurationInMicroseconds" variable. Your "*Framer*" object (in this >>case, your "H264VideoStreamFramer" subclass) needs to properly set >>this for each H.264 NAL unit that it delivers. > >I suppose this is correct. However, "properly set" doesn't necessarily >mean what you first think it means. Since I wrote the software, I'm the person who gets to define what "properly set" means :-) >These are my findings: > >- all NAL units that belong to the same frame, except the last NAL > unit in that frame, should have a duration of 0 (zero). 
> >- the last NAL unit in each frame should have a duration matching > that of its frame. > >- all NAL units in a frame should have the same presentation time. None of this is inconsistent with what I said earlier. Your "H264VideoStreamFramer" subclass implementation should set the "fPresentationTime" and "fDurationInMicroseconds" variables in accordance with what you just said. Ross Finlayson Live Networks, Inc. (LIVE555.COM) From finlayson at live555.com Thu Sep 25 00:52:28 2008 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 25 Sep 2008 00:52:28 -0700 Subject: [Live-devel] Re: question about the live555 RTP sending speed In-Reply-To: <000301c91edb$321f4130$9909a8c0@shuweiyuan> References: <000001c91d62$e8ffab80$9909a8c0@shuweiyuan> <200809231314.m8NDERRU095003@ns.live555.com> <000301c91edb$321f4130$9909a8c0@shuweiyuan> Message-ID: <200809250758.m8P7wLwd069058@ns.live555.com> At 11:51 PM 9/24/2008, you wrote: >[Aiden:] What is the meaning of "duration"? >I guess you mean that consecutive NAL >units in the same frame have the same timestamp, right? No, "fDurationInMicroseconds" just tells the code how long to wait after sending each RTP packet. This is independent of the presentation time, which is set by the "fPresentationTime" variable. >[Aiden:] What is the presentation time? The timestamp? Effectively, yes. ('Timestamps' are used only internally, by the RTP protocol. They represent presentation times, which is what programmers (who use our library) actually see at either end of the link.) Ross Finlayson Live Networks, Inc. 
(LIVE555.COM) From venugopalpaikr at tataelxsi.co.in Thu Sep 25 22:34:25 2008 From: venugopalpaikr at tataelxsi.co.in (venugopalpaikr) Date: Fri, 26 Sep 2008 11:04:25 +0530 Subject: [Live-devel] Query regarding turnOnBackgroundReadHandling() Message-ID: <003001c91f99$8a5826a0$3c033c0a@telxsi.com> Hi everybody, My understanding of the following function is very vague. Can anybody please help with the following doubts... env.taskScheduler().turnOnBackgroundReadHandling(fServerSocket, (TaskScheduler::BackgroundHandlerProc*)&incomingConnectionHandler, this); } 1. I am building an RTSPServer using the RTSPServer.cpp code. So will the above function continuously read a particular port until it gets a request from the client, or do we need to add a while loop anywhere in the code to make it read continuously? 2. OR do we have to use env.taskScheduler().doEventLoop() to continuously monitor for requests? If so, what should the parameter to this function be? 3. If it gets a request, will it automatically go to the function incomingConnectionHandler() and create an instance of the server class? Kindly help me with the above-mentioned queries. Regards, Venu The information contained in this electronic message and any attachments to this message are intended for the exclusive use of the addressee(s) and may contain proprietary, confidential or privileged information. If you are not the intended recipient, you should not disseminate, distribute or copy this e-mail. Please notify the sender immediately and destroy all copies of this message and any attachments contained in it.
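[Editor's note] The NAL-unit timing rules discussed in the thread above — every NAL unit of a frame carries the same presentation time, and only the last NAL unit of the frame carries a non-zero duration — can be sketched as a small standalone model. This is illustrative C++ only, with invented struct and function names; it is not liveMedia code, and fPresentationTime (really a struct timeval in the library) is modeled here as a plain microsecond count:

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

// Hypothetical stand-in for the fields a H264VideoStreamFramer subclass
// would set on each NAL unit it delivers.
struct NalTiming {
  int64_t presentationTimeUs;   // identical for every NAL unit of a frame
  unsigned durationUs;          // 0 except on the frame's last NAL unit
};

// Assign timing to the NAL units of one frame, per the list discussion:
// only the last NAL unit carries the frame's duration, so the sender
// waits a full frame period only once the frame is complete.
std::vector<NalTiming> timeFrame(int64_t framePtsUs, unsigned frameDurationUs,
                                 unsigned nalsInFrame) {
  std::vector<NalTiming> out(nalsInFrame);
  for (unsigned i = 0; i < nalsInFrame; ++i) {
    out[i].presentationTimeUs = framePtsUs;            // same PTS throughout
    out[i].durationUs =
        (i + 1 == nalsInFrame) ? frameDurationUs : 0;  // duration on last only
  }
  return out;
}
```

At 25 fps (a 40000 microsecond frame period), a frame split into three NAL units would get durations {0, 0, 40000} and three equal presentation times, which is exactly the behavior Malte described and Ross confirmed.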
From finlayson at live555.com Thu Sep 25 22:48:04 2008 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 25 Sep 2008 22:48:04 -0700 Subject: [Live-devel] Query regarding turnOnBackgroundReadHandling() In-Reply-To: <003001c91f99$8a5826a0$3c033c0a@telxsi.com> References: <003001c91f99$8a5826a0$3c033c0a@telxsi.com> Message-ID: <200809260550.m8Q5oxBQ037801@ns.live555.com> > env.taskScheduler().turnOnBackgroundReadHandling(fServerSocket, > (TaskScheduler::BackgroundHandlerProc*)&incomingConnectionHandler, > this); > } > >1. I am building an RTSPServer using the RTSPServer.cpp code. So will the above >function continuously read a particular port until it gets a request from the >client, or do we need to add a while loop anywhere in the code to make it >read continuously? No, you don't need to modify the existing "RTSPServer" code at all. >2. OR do we have to use env.taskScheduler().doEventLoop() to continuously >monitor for requests? Yes. Your application (like all applications that use the "LIVE555 Streaming Media" code) must use an event loop. > If so, what should the parameter to this function be? "doEventLoop()" need not take a parameter. See the code for the "testOnDemandRTSPServer" demo application for an example of RTSP server code. Ross Finlayson Live Networks, Inc. (LIVE555.COM) From cbitsunil at gmail.com Fri Sep 26 00:56:19 2008 From: cbitsunil at gmail.com (sunil sunil) Date: Fri, 26 Sep 2008 16:56:19 +0900 Subject: [Live-devel] fPlayTimePerFrame Message-ID: Hi, I am working with the live555 library for TS streaming via RTP. When creating a "ByteStreamFileSource" instance, how can I set the "fPlayTimePerFrame" value? If I set this parameter to null, it won't add the last frame's play time to the presentation time. I want to add the last frame's play time to the "presentation time". How can I set the value? Thanking you, Sunil. -------------- next part -------------- An HTML attachment was scrubbed...
URL: From finlayson at live555.com Fri Sep 26 01:54:57 2008 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 26 Sep 2008 01:54:57 -0700 Subject: [Live-devel] fPlayTimePerFrame In-Reply-To: References: Message-ID: <200809260858.m8Q8wfCC033155@ns.live555.com> > When creating "ByteStreamFileSource" instance, how can I set > "fPlayTimePerFrame" >value. Using the "playTimePerFrame" parameter to "ByteStreamFileSource::createNew()". (Isn't this obvious from the source code??) Ross Finlayson Live Networks, Inc. (LIVE555.COM) From venugopalpaikr at tataelxsi.co.in Fri Sep 26 02:16:30 2008 From: venugopalpaikr at tataelxsi.co.in (venugopalpaikr) Date: Fri, 26 Sep 2008 14:46:30 +0530 Subject: [Live-devel] Query regarding turnOnBackgroundReadHandling() In-Reply-To: <200809260550.m8Q5oxBQ037801@ns.live555.com> Message-ID: <008901c91fb8$908a0e20$3c033c0a@telxsi.com> Thanks Ross. Additionally, I wanted to know how the flow will go to the function void RTSPServer::incomingConnectionHandler(void* instance, int /*mask*/) from the function env.taskScheduler().turnOnBackgroundReadHandling(fServerSocket, (TaskScheduler::BackgroundHandlerProc*)&incomingConnectionHandler, this); } I also went through "onDemandRTSPServer" and found that a server session is created initially, mentioning the stream name. But my requirement is to wait for an incoming connection and then create a session. So what I have found out is that after the function env.taskScheduler().turnOnBackgroundReadHandling() my flow should go to incomingConnectionHandler() to incomingConnectionHandler1(), where the client object is created. But I am not able to achieve this flow, as the control goes back to the main() function after env.taskScheduler().turnOnBackgroundReadHandling(). Is my approach right? If not, where is the flaw?
-----Original Message----- From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com]On Behalf Of Ross Finlayson Sent: Friday, September 26, 2008 11:18 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Query regarding turnOnBackgroundReadHandling() > env.taskScheduler().turnOnBackgroundReadHandling(fServerSocket, > (TaskScheduler::BackgroundHandlerProc*)&incomingConnectionHandler, > this); > } > >1. Am building an RTSPServer using RTSPServer.cpp code. So will the above >function continously read a particular port until it gets a request from the >client or do we need to add a while loop anywhere in the code to make it >read continously? No, you don't need to modify the existing "RTSPServer" code at all. >2. OR Do we have to use env.taskScheduler().doEventLoop() to continously >monitor for requests? Yes. Your application (like all applications that use the "LIVE555 Streaming Media" code) must use an event loop. > If so wat should the parameter to this function? "doEventLoop()" need not take a parameter. See the code for the "testOnDemandRTSPServer" demo application for an example of RTSP server code. Ross Finlayson Live Networks, Inc. (LIVE555.COM) _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel From Anoop_P.A at pmc-sierra.com Fri Sep 26 02:19:08 2008 From: Anoop_P.A at pmc-sierra.com (Anoop P.A.)
Date: Fri, 26 Sep 2008 02:19:08 -0700 Subject: [Live-devel] mp4 streaming In-Reply-To: <200809260858.m8Q8wfCC033155@ns.live555.com> Message-ID: Hi all, How do I stream mp4 files? I am using live555MediaServer. Do I need to download any other server to stream mp4? Thanks Anoop From sureshs at tataelxsi.co.in Fri Sep 26 03:00:41 2008 From: sureshs at tataelxsi.co.in (sureshs) Date: Fri, 26 Sep 2008 15:30:41 +0530 Subject: [Live-devel] Streaming Live H264 encoded bit stream Message-ID: <007901c91fbe$bcb33c00$51053c0a@telxsi.com> Hi, I am trying to stream a live-captured H.264 encoded bit stream through the OnDemandRTSP server. I modified the OnDemandRTSPServer application to read the .h264 file and could successfully stream the file. Now I would like to stream the live camera-captured H.264 encoded bit stream. What changes do I have to make in the testOnDemandRTSP server application to send the real-time, live camera-captured H.264 encoded bit stream? My H.264 encoder output provides 1 NAL unit per frame. I went through the FAQ but couldn't get much help. Regards, Suresh S From cbitsunil at gmail.com Fri Sep 26 04:02:41 2008 From: cbitsunil at gmail.com (sunil sunil) Date: Fri, 26 Sep 2008 20:02:41 +0900 Subject: [Live-devel] fPlayTimePerFrame In-Reply-To: <200809260858.m8Q8wfCC033155@ns.live555.com> References: <200809260858.m8Q8wfCC033155@ns.live555.com> Message-ID: Yeah, that's true. But how can I get the playTimePerFrame value so that I can pass it to "createNew"? Maybe this is a basic question. But I am a newbie.
Do I need to calculate the playTimePerFrame from the TS file? Please clarify. On Fri, Sep 26, 2008 at 5:54 PM, Ross Finlayson wrote: > > When creating "ByteStreamFileSource" instance, how can I set >> "fPlayTimePerFrame" >> value. >> > > Using the "playTimePerFrame" parameter to > "ByteStreamFileSource::createNew()". > > (Isn't this obvious from the source code??) > > > Ross Finlayson > Live Networks, Inc. (LIVE555.COM ) > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Sep 26 06:30:39 2008 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 26 Sep 2008 06:30:39 -0700 Subject: [Live-devel] mp4 streaming In-Reply-To: References: <200809260858.m8Q8wfCC033155@ns.live555.com> Message-ID: <200809261336.m8QDaiim023614@ns.live555.com> >How do I stream mp4 files? I am using live555MediaServer. It depends what you mean by "mp4 file". If you mean an "MPEG-4 Elementary Stream video" file, then yes, our "live555MediaServer" can stream this already, but the filename suffix must be ".m4e". Similarly, we can stream an MPEG-4 audio file (in ADTS format) that has the filename suffix ".aac". If, however, you mean a ".mp4"-format file (that contains both audio and video), then we currently cannot stream such files. Ross Finlayson Live Networks, Inc.
(LIVE555.COM) From finlayson at live555.com Fri Sep 26 06:36:40 2008 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 26 Sep 2008 06:36:40 -0700 Subject: [Live-devel] Query regarding turnOnBackgroundReadHandling() In-Reply-To: <008901c91fb8$908a0e20$3c033c0a@telxsi.com> References: <200809260550.m8Q5oxBQ037801@ns.live555.com> <008901c91fb8$908a0e20$3c033c0a@telxsi.com> Message-ID: <200809261344.m8QDi6O1031286@ns.live555.com> >Additionally, I wanted to know how the flow will go to the >function >void RTSPServer::incomingConnectionHandler(void* instance, int /*mask*/) This is an event handler, and is called from within the event loop whenever incoming data is available on the specified socket number. (See the code for "BasicTaskScheduler".) Ross Finlayson Live Networks, Inc. (LIVE555.COM) From finlayson at live555.com Fri Sep 26 06:40:47 2008 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 26 Sep 2008 06:40:47 -0700 Subject: [Live-devel] fPlayTimePerFrame In-Reply-To: References: <200809260858.m8Q8wfCC033155@ns.live555.com> Message-ID: <200809261349.m8QDnAlL036556@ns.live555.com> At 04:02 AM 9/26/2008, you wrote: >Yeah, that's true. But how can I get the playTimePerFrame value so that I >can pass it to "createNew"? Maybe this is a basic question. >But I am a newbie. >Do I need to calculate the playTimePerFrame from the TS file? If you are streaming a Transport Stream file then you *do not* pass a "playTimePerFrame" parameter to "ByteStreamFileSource::createNew()". This is because each (188-byte) Transport Stream packet *does not* have a fixed duration. Instead, we use a separate "MPEG2TransportStreamFramer" object to compute, on the fly, the duration of each Transport Stream packet. If you are streaming a Transport Stream file, you should not modify the existing code - it already works. Ross Finlayson Live Networks, Inc.
(LIVE555.COM) From finlayson at live555.com Fri Sep 26 06:58:11 2008 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 26 Sep 2008 06:58:11 -0700 Subject: [Live-devel] Streaming Live H264 encoded bit stream In-Reply-To: <007901c91fbe$bcb33c00$51053c0a@telxsi.com> References: <007901c91fbe$bcb33c00$51053c0a@telxsi.com> Message-ID: <200809261401.m8QE1hi8049586@ns.live555.com> >I am trying to stream a live-captured H.264 encoded bit stream through >the OnDemandRTSP server. >I modified the OnDemandRTSPServer application to read the .h264 file and >could successfully stream the file. >Now I would like to stream the live camera-captured H.264 encoded bit >stream. What changes do I have to make in the testOnDemandRTSP >server application to send the real-time, live camera-captured H.264 encoded >bit stream? >My H.264 encoder output provides 1 NAL unit per frame. > >I went through the FAQ but couldn't get much help. See http://www.live555.com/liveMedia/faq.html#h264-streaming http://www.live555.com/liveMedia/faq.html#liveInput http://www.live555.com/liveMedia/faq.html#liveInput-unicast I.e., your data chain will be: YourH264InputSource -> YourH264VideoStreamFramerSubclass -> H264VideoRTPSink Since you already claim to be streaming from a ".h264" file (whatever that means), then you presumably already have: ByteStreamFileSource -> YourH264VideoStreamFramerSubclass -> H264VideoRTPSink So, just replace "ByteStreamFileSource" with "YourH264InputSource" (a new class that you will need to write, to encapsulate your H.264 input device). (Just be sure to change the variable "reuseFirstSource" to "True" in your "OnDemandServerMediaSubsession" subclass (which you have presumably already written to create the data chain).)
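[Editor's note] The data-chain advice above — swap "ByteStreamFileSource" for a live-input source, keep the framer and the sink — works because liveMedia uses a pull model: the sink requests a frame from the framer, and the framer in turn pulls from whatever source it was built on. The following is a standalone C++ toy of that delegation chain, with invented class names; it is not liveMedia code, just the shape of the pipeline:

```cpp
#include <cassert>
#include <string>
#include <vector>

// Toy pull-model chain in the shape of liveMedia's
// source -> framer -> sink pipeline. All names are invented.
struct ToySource {                    // stands in for ByteStreamFileSource
  std::vector<std::string> nals;      // pretend encoded input data
  size_t next = 0;
  bool getNext(std::string& out) {
    if (next >= nals.size()) return false;
    out = nals[next++];
    return true;
  }
};

struct ToyFramer {                    // stands in for the framer subclass
  ToySource* source;                  // the only piece swapped for live input
  bool getNextFrame(std::string& out) {
    return source->getNext(out);      // framer pulls from its source
  }
};

struct ToySink {                      // stands in for H264VideoRTPSink
  ToyFramer* framer;
  std::vector<std::string> sent;
  void playAll() {                    // the sink drives the whole chain
    std::string frame;
    while (framer->getNextFrame(frame)) sent.push_back(frame);
  }
};
```

Replacing the file-backed ToySource with a camera-backed one leaves ToyFramer and ToySink untouched, which is the point of the suggestion: only the head of the chain changes.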
(LIVE555.COM) From gbonneau at matrox.com Fri Sep 26 07:17:08 2008 From: gbonneau at matrox.com (Guy Bonneau) Date: Fri, 26 Sep 2008 10:17:08 -0400 Subject: [Live-devel] FEC Callback Message-ID: <27582C125377498B9B0A141E4ED3D3DF@dorvalmatrox.matrox.com> Ross, Let's say that someone would like to add FEC based on RFC 2733 to the library. My understanding is that some FEC source filter implementation would need to receive the RTP packets formatted and sent by the MultiFramedRTPSink implementation, to be further FEC processed. At first glance I would have been tempted to suggest adding some callback mechanism to the method MultiFramedRTPSink::sendPacketIfNecessary(), after the fRTPInterface.sendPacket(...). But I was wondering: is there a better way to do that without modifying MultiFramedRTPSink? Maybe at a lower level? Thanks Guy Bonneau -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Sep 26 16:14:24 2008 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 26 Sep 2008 16:14:24 -0700 Subject: [Live-devel] FEC Callback In-Reply-To: <27582C125377498B9B0A141E4ED3D3DF@dorvalmatrox.matrox.com> References: <27582C125377498B9B0A141E4ED3D3DF@dorvalmatrox.matrox.com> Message-ID: <200809262322.m8QNMCpM034170@ns.live555.com> >Let's say that someone would like to add FEC based on RFC 2733 to the library [...] >But I was wondering. Is there a better way to do that without >modifying MultiFramedRTPSink? Probably not, although right now I can't say for sure. Ross Finlayson Live Networks, Inc. (LIVE555.COM) From cbitsunil at gmail.com Fri Sep 26 22:46:02 2008 From: cbitsunil at gmail.com (sunil sunil) Date: Sat, 27 Sep 2008 14:46:02 +0900 Subject: [Live-devel] fPlayTimePerFrame In-Reply-To: <200809261349.m8QDnAlL036556@ns.live555.com> References: <200809260858.m8Q8wfCC033155@ns.live555.com> <200809261349.m8QDnAlL036556@ns.live555.com> Message-ID: Hi Ross, Thanks for the reply. One more question!
After receiving the data in the receiver-side software, do I need to manipulate the timestamp value or any other field in the TS packets for the TV decoder to work properly? Sunil. On Fri, Sep 26, 2008 at 10:40 PM, Ross Finlayson wrote: > At 04:02 AM 9/26/2008, you wrote: > >> Yeah, that's true. But how can I get the playTimePerFrame value so that I can >> pass it to "createNew"? Maybe this is a basic question. >> But I am a newbie. >> Do I need to calculate the playTimePerFrame from the TS file? >> > > If you are streaming a Transport Stream file then you *do not* pass a > "playTimePerFrame" parameter to "ByteStreamFileSource::createNew()". This > is because each (188-byte) Transport Stream packet *does not* have a fixed > duration. Instead, we use a separate "MPEG2TransportStreamFramer" object to > compute, on the fly, the duration of each Transport Stream packet. > > If you are streaming a Transport Stream file, you should not modify the > existing code - it already works. > > > > Ross Finlayson > Live Networks, Inc. (LIVE555.COM ) > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sat Sep 27 00:24:52 2008 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 27 Sep 2008 00:24:52 -0700 Subject: [Live-devel] fPlayTimePerFrame In-Reply-To: References: <200809260858.m8Q8wfCC033155@ns.live555.com> <200809261349.m8QDnAlL036556@ns.live555.com> Message-ID: <200809270728.m8R7SbHU040864@ns.live555.com> >One more question! >After receiving the data in the receiver-side software, do I need to >manipulate >the timestamp value or any other field in the TS packets for the TV >decoder to work properly? No! Just stream the Transport Stream file as is, using our existing code. Ross Finlayson Live Networks, Inc.
(LIVE555.COM) From venugopalpaikr at tataelxsi.co.in Sun Sep 28 21:47:42 2008 From: venugopalpaikr at tataelxsi.co.in (venugopalpaikr) Date: Mon, 29 Sep 2008 10:17:42 +0530 Subject: [Live-devel] Query regarding turnOnBackgroundReadHandling() In-Reply-To: <200809261344.m8QDi6O1031286@ns.live555.com> Message-ID: <003c01c921ee$82883f30$3c033c0a@telxsi.com> Hi Ross, My understanding was not right, so please ignore my previous mail. But I am still not able to understand the LIVE555 code. I have no clue as to where I should proceed after the code below. I don't want to create a server media session, as I am trying to establish a connection using the VLC player, and so I don't know the stream details as yet.

int main(int argc, char** argv) {
  // Set up the task scheduler
  UsageEnvironment* env;
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  env = BasicUsageEnvironment::createNew(*scheduler);
  UserAuthenticationDatabase* authDB = NULL;
#ifdef ACCESS_CONTROL
  // to implement client access control
  authDB = new UserAuthenticationDatabase;
  authDB->addUserRecord("xx", "xxx123");
#endif
  RTSPServer* rtspServer = RTSPServer::createNew(*env, 554, authDB, 45);

My main question is regarding turnOnBackgroundReadHandling(). In this function a particular socket is set for reading using FD_SET. Once a request has come to this socket, what action will be taken by this code? Also, since incomingConnectionHandler() is a static void private function, which function in the server class can access this function?
-----Original Message----- From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Friday, September 26, 2008 7:07 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Query regarding turnOnBackgroundReadHandling() >Additionally, I wanted to know how the flow will go to the >function >void RTSPServer::incomingConnectionHandler(void* instance, int /*mask*/) This is an event handler, and is called from within the event loop whenever incoming data is available on the specified socket number. (See the code for "BasicTaskScheduler".) Ross Finlayson Live Networks, Inc. (LIVE555.COM) _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel From anitarajan at tataelxsi.co.in Tue Sep 30 23:19:38 2008 From: anitarajan at tataelxsi.co.in (Anita Rajan) Date: Wed, 1 Oct 2008 11:49:38 +0530 Subject: [Live-devel] HTTP Tunneling Message-ID: <002d01c9238d$afab2680$4d053c0a@telxsi.com> Hi All, I would like to know whether the Live555 code supports RTSP tunneling over HTTP, and also whether Section 10.12 of RFC 2326 (the binary interleaved data method) is well supported by live555. Regards, Me
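[Editor's note] Returning to the turnOnBackgroundReadHandling() thread above: the dispatch pattern Ross describes — register a handler for a socket, then let a single doEventLoop() call invoke it whenever that socket becomes readable — can be modeled without any networking. This is a toy C++ sketch of that scheduler shape, with invented names; the real BasicTaskScheduler blocks in select() on actual file descriptors rather than taking a ready set as an argument:

```cpp
#include <cassert>
#include <functional>
#include <map>
#include <set>

// Toy model of TaskScheduler's background-read dispatch: handlers are
// registered per "socket" (here just an int key), and each pass of the
// loop invokes the matching handler for every socket reported readable.
class ToyScheduler {
public:
  using Handler = std::function<void(int /*socket*/)>;

  // Analogue of turnOnBackgroundReadHandling(): no reading happens here;
  // we only remember which handler belongs to which socket, then return.
  void turnOnBackgroundReadHandling(int socket, Handler h) {
    handlers_[socket] = std::move(h);
  }

  // Analogue of one iteration of doEventLoop(): in the real library this
  // is where select() blocks; here the "readable" set is handed in.
  void doOnePass(const std::set<int>& readable) {
    for (int s : readable) {
      auto it = handlers_.find(s);
      if (it != handlers_.end()) it->second(s);  // event-driven callback
    }
  }

private:
  std::map<int, Handler> handlers_;
};
```

This mirrors the answers in the thread: registering the handler does nothing by itself (which is why control returns to main() immediately after turnOnBackgroundReadHandling()), and the handler — incomingConnectionHandler() in RTSPServer — fires only while the event loop is running, and only for the socket that actually became readable.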