From Beth.Turk at drs-c3a.com Tue Jan 3 06:23:39 2012 From: Beth.Turk at drs-c3a.com (Turk, Beth (SA-1)) Date: Tue, 3 Jan 2012 09:23:39 -0500 Subject: [Live-devel] question In-Reply-To: References: Message-ID:

Hi Ross, What I mean by "tagging" is essentially the ability to bookmark a place in the video with something (maybe a timestamp) and then the ability to fast-forward or reverse to that point in the video. Beth

________________________________ From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Thursday, December 22, 2011 9:53 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] question

Does this code allow tagging of the incoming video data?

Can you explain more what you mean by "tagging"? Ross Finlayson Live Networks, Inc. http://www.live555.com/

From finlayson at live555.com Tue Jan 3 07:27:58 2012 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 3 Jan 2012 07:27:58 -0800 Subject: [Live-devel] question In-Reply-To: References: Message-ID: <0F8BBF5D-91AE-4C71-AC82-4F573BED8731@live555.com>

> What I mean by "tagging" is essentially the ability to bookmark a place in the video with something (maybe a timestamp) and then the ability to fast-forward or reverse to that point in the video.

Well, our FAQ has an entry that talks about our support for 'trick play' (seeking, fast-forward, and reverse play): http://www.live555.com/liveMedia/faq.html#trick-mode though I'm not sure if that answers your question... Ross Finlayson Live Networks, Inc. http://www.live555.com/

From stephan.depoorter at codecompetence.com Wed Jan 4 01:12:53 2012 From: stephan.depoorter at codecompetence.com (stephan depoorter) Date: Wed, 4 Jan 2012 10:12:53 +0100 Subject: [Live-devel] Suggestion to improve logging inside live555 Message-ID:

Hi, I've been using Live555 and I like it. Although things are starting to run smoothly, I have a suggestion to improve the logging in the library; this would be useful when tracking difficult problems that happen rarely. Using the current codebase, I implemented logging using a BasicUsageEnvironment-derived class, and so I received some logging from the library classes as well. However, live555 also sends useful info to stderr, and to catch that as well, one also has to redirect stderr. These statements do not contain time info, and are triggered only in debug builds. For example, in RTSPServer.cpp:

#ifdef DEBUG
fprintf(stderr, "parseRTSPRequestString() failed\n");
#endif

#ifdef DEBUG
envir() << "accept()ed connection from " << our_inet_ntoa(clientAddr.sin_addr) << "\n";
#endif

So my suggestion is to:
- do all logging in the same way (through the environment class)
- always add timestamps
- add levels of logging (verbose, info, error)
- always log, not only in debug builds; the specific environment class can decide what to do with it
- for convenience, provide a basic logger that does the writing to the logfile(s)

Regards, Steph
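A minimal sketch of the "timestamps through the environment class" idea above: a subclass of "BasicUsageEnvironment" that prefixes each logged chunk with wall-clock time. This assumes the "operator<<" overloads are virtual (they are declared so in "UsageEnvironment") and that the protected constructor is reachable from a subclass; the class and helper names here are illustrative, not part of LIVE555:

#include "BasicUsageEnvironment.hh"
#include <stdio.h>
#include <time.h>

class TimestampedUsageEnvironment: public BasicUsageEnvironment {
public:
  static TimestampedUsageEnvironment* createNew(TaskScheduler& taskScheduler) {
    return new TimestampedUsageEnvironment(taskScheduler);
  }
  virtual UsageEnvironment& operator<<(char const* str) {
    writeTimestamp(); // prefix this chunk with the current time
    return BasicUsageEnvironment::operator<<(str);
  }
protected:
  TimestampedUsageEnvironment(TaskScheduler& taskScheduler)
    : BasicUsageEnvironment(taskScheduler) {}
private:
  void writeTimestamp() {
    time_t now = time(NULL);
    char buf[32];
    strftime(buf, sizeof buf, "[%Y-%m-%d %H:%M:%S] ", localtime(&now));
    fputs(buf, stderr); // "BasicUsageEnvironment" itself writes to stderr
  }
};

One caveat on this design: each "<<" call is a separate chunk, so a log line built from several "<<" calls would get several timestamps; a production version would buffer output until a newline, and would also override the numeric "operator<<" overloads.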
From finlayson at live555.com Wed Jan 4 01:41:49 2012 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 4 Jan 2012 01:41:49 -0800 Subject: [Live-devel] Suggestion to improve logging inside live555 In-Reply-To: References: Message-ID:

Yes, I pretty much agree with these suggestions, some of which may be implemented sometime in the future. One thing to note, however, is that the existing 'debugging output' code (i.e., the code in between "#ifdef DEBUG" and "#endif") was intended specifically for that purpose - i.e., for finding bugs - rather than for general-purpose logging. That's why most of this code writes to 'stderr'. Nonetheless, you're correct - this code could be generalized and improved. Ross Finlayson Live Networks, Inc. http://www.live555.com/

From david.myers at panogenics.com Wed Jan 4 08:32:13 2012 From: david.myers at panogenics.com (David J Myers) Date: Wed, 4 Jan 2012 16:32:13 -0000 Subject: [Live-devel] H264VideoStreamDiscreteFramer problems Message-ID: <00c001cccafe$6ab718b0$40254a10$@myers@panogenics.com>

Hi, I'm having problems getting a valid stream out of my live video server now that I have switched to using the H264VideoStreamDiscreteFramer. I was using H264VideoStreamFramer but I couldn't avoid frame truncation problems. I-frames from my encoder look like this:

00 00 00 01 27 42 00 32 8b 68 02 18 0f 33 02 48
04 00 00 00 01 28 ce 05 0a c8 00 00 00 01 25 b8 ....................

So I send these frames in three parts, removing the 00 00 00 01 headers, copying the rest of the NALU bytes to the fTo buffer and then calling FramedSource::afterGetting(this);

So, in the above example I send:
- the first part, the SPS, 13 bytes: 27 42 00 32 8b 68 02 18 0f 33 02 48 04
- then the PPS, 5 bytes: 28 ce 05 0a c8
- then the frame bytes: 25 b8 etc.

I don't see any errors on my server but VLC rejects my stream with:

Main video output warning: late picture skipped
Live555 demux warning: unsupported NAL type for H264

I also get the error:

Avcodec decoder error: more than 5 seconds of late video

Thanks again, David

From finlayson at live555.com Wed Jan 4 14:07:58 2012 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 4 Jan 2012 14:07:58 -0800 Subject: [Live-devel] H264VideoStreamDiscreteFramer problems In-Reply-To: <00c001cccafe$6ab718b0$40254a10$@myers@panogenics.com> References: <00c001cccafe$6ab718b0$40254a10$@myers@panogenics.com> Message-ID:

> I'm having problems getting a valid stream out of my live video server now that I have switched to using the H264VideoStreamDiscreteFramer. I was using H264VideoStreamFramer but I couldn't avoid frame truncation problems.
>
> I-frames from my encoder look like this:
> 00 00 00 01 27 42 00 32
> 8b 68 02 18 0f 33 02 48
> 04 00 00 00 01 28 ce 05
> 0a c8 00 00 00 01 25 b8 ....................
> So I send these frames in three parts, removing the 00 00 00 01 headers, copying the rest of the NALU bytes to the fTo buffer and then calling FramedSource::afterGetting(this);
> So, in the above example I send:
> the first part, the SPS, 13 bytes: 27 42 00 32 8b 68 02 18 0f 33 02 48 04
> then the PPS, 5 bytes: 28 ce 05 0a c8
> then the frame bytes: 25 b8 etc.

Are you setting "fFrameSize" correctly in your 'frame source' class, before you call "FramedSource::afterGetting(this)"? (Remember to *not* count the four header bytes, because you stripped those off.)
You can verify that things are working OK by checking the "nal_unit_type" that's extracted in each call to "H264VideoStreamDiscreteFramer::afterGettingFrame1()" (see "liveMedia/H264VideoStreamDiscreteFramer.cpp", lines 67-73). In your example data, "nal_unit_type" in the first call should be 7 (the SPS); in the second call, 8 (the PPS); and in the third call, 5. Ross Finlayson Live Networks, Inc. http://www.live555.com/

From david.myers at panogenics.com Wed Jan 4 14:43:43 2012 From: david.myers at panogenics.com (David J Myers) Date: Wed, 4 Jan 2012 22:43:43 -0000 Subject: [Live-devel] H264VideoStreamDiscreteFramer problems Message-ID: <00cf01cccb32$50dc9b20$f295d160$@myers@panogenics.com>

>> I-frames from my encoder look like this:
>> 00 00 00 01 27 42 00 32 8b 68
>> 02 18 0f 33 02 48
>> 04 00 00 00 01 28 ce 05
>> 0a c8 00 00 00 01 25 b8 ....................
>> So, in the above example I send:
>> the first part, the SPS, 13 bytes: 27 42 00 32 8b 68 02 18 0f 33 02 48 04
>> then the PPS, 5 bytes: 28 ce 05 0a c8
>> then the frame bytes: 25 b8 etc.

> Are you setting "fFrameSize" correctly in your 'frame source' class, before you call "FramedSource::afterGetting(this)"? (Remember to *not* count the four header bytes, because you stripped those off.)

Yes, I think so. So in my example, for the first SPS NAL, I set fFrameSize to 13. For the PPS, fFrameSize is 5, etc. Could there be a problem with fPresentationTime? Should this be set to the same value for each NAL unit in the frame? Live555 on the server seems happy, it's just the client which rejects the stream. - David

From finlayson at live555.com Wed Jan 4 14:59:56 2012 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 4 Jan 2012 14:59:56 -0800 Subject: [Live-devel] H264VideoStreamDiscreteFramer problems In-Reply-To: <00cf01cccb32$50dc9b20$f295d160$@myers@panogenics.com> References: <00cf01cccb32$50dc9b20$f295d160$@myers@panogenics.com> Message-ID: <8E121873-A25F-4948-9CAC-33D16E1A9EE6@live555.com>

> > Are you setting "fFrameSize" correctly in your 'frame source' class, before you call "FramedSource::afterGetting(this)"? (Remember to *not* count the four header bytes, because you stripped those off.)
>
> Yes, I think so. So in my example, for the first SPS NAL, I set fFrameSize to 13. For the PPS, fFrameSize is 5, etc.
> Could there be a problem with fPresentationTime? Should this be set to the same value for each NAL unit in the frame?

Just calling "gettimeofday()" to get "fPresentationTime" should be OK.

> Live555 on the server seems happy, it's just the client which rejects the stream.

If you're sure that you're delivering each NAL unit individually (i.e., not ever delivering two or more NAL units concatenated together, whether or not they have 00 00 00 01 headers in between), then I don't see what could be wrong. I suggest using "openRTSP" as your client (give it the "-d <duration>" flag, to record a specific length of time). You should end up with a file named something like "VIDEO-H264-1". Rename this file to "test.h264". See whether or not you can play it using VLC. If you can't, then email it to us (if it's short) or else put it on a web server and send us the URL - so we can inspect it ourselves. Ross Finlayson Live Networks, Inc. http://www.live555.com/
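To make the delivery step above concrete, here is a minimal sketch of the kind of "FramedSource" subclass code being discussed, assuming the encoder hands over one NAL unit at a time. The class name and the "fNALPayload"/"fNALPayloadSize" members are illustrative, not LIVE555 names; the "f"-prefixed members used here (fTo, fMaxSize, fFrameSize, fNumTruncatedBytes, fPresentationTime) are real "FramedSource" members:

void MyH264EncoderSource::deliverNALUnit() {
  // fNALPayload points just past the 00 00 00 01 start code;
  // fNALPayloadSize does NOT include those four header bytes
  if (fNALPayloadSize > fMaxSize) {
    fFrameSize = fMaxSize;
    fNumTruncatedBytes = fNALPayloadSize - fMaxSize;
  } else {
    fFrameSize = fNALPayloadSize;
    fNumTruncatedBytes = 0;
  }
  memcpy(fTo, fNALPayload, fFrameSize);
  gettimeofday(&fPresentationTime, NULL); // as noted above, this is OK
  FramedSource::afterGetting(this); // must be the last thing done here
}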
From gsnetplayer at hotmail.com Wed Jan 4 01:07:24 2012 From: gsnetplayer at hotmail.com (GS Net Player) Date: Wed, 4 Jan 2012 10:07:24 +0100 Subject: [Live-devel] Test relay ?! Message-ID:

Hallo, I would like to modify testRelay.cpp to receive live UDP input and then output RTSP (rtsp://xxx.xxx.xxx.xxx:4444). The UDP-to-UDP relay works excellently, but I want to forward it in RTSP so that others can see it. Can you show me how? I am a beginner in this! greeting Igor

From cat29076 at gmail.com Thu Jan 5 00:11:29 2012 From: cat29076 at gmail.com (leroi cat) Date: Thu, 5 Jan 2012 09:11:29 +0100 Subject: [Live-devel] GET_PARAMETER Message-ID:

Hello, I just started with the live555 media server and my client starts with the GET_PARAMETER method rather than OPTIONS. After reading RTSPServer.cpp I found that the GET_PARAMETER method is implemented just as a 'keep alive' and it sends an empty response. I understand that I must define a subclass of "RTSPServer" but unfortunately I don't know how to do it!!! Is this the solution to start up my client? I'm blocked :( please help

From anthony at haploid.fr Thu Jan 5 09:35:18 2012 From: anthony at haploid.fr (Anthony Nevo) Date: Thu, 5 Jan 2012 18:35:18 +0100 Subject: [Live-devel] Broadcasting live events with an iPhone using Live555 media libraries Message-ID: <0A4CD9FC-D64E-49C3-8174-1A40D7EB65E0@haploid.fr>

Hi all, I am working for a small company specialized in developing iOS and Android applications. One of our clients asked us to add a Qik-like or Ustream-like functionality to one of their apps and, after some investigation, it seems to me that the Live555 libraries are a very good starting point to do that. The idea is that a user would broadcast an event (concert, birthday party, ...) to his/her friends using only an iPhone. His/her friends would be able to watch the event "live" on their PCs or phones. The architecture we are building is based on a Helix Media Server that will be used to broadcast the stream to PCs, phones, ... Then, on the iPhone part, we're building an app that will access the camera and then, maybe by using the Live555 libraries, stream it to the Helix Media Server (H264/AAC using RTP). A good example of what we are trying to achieve is the Livu app (http://stevemcfarlin.com/livu/index.html). I've compiled the Live555 libraries for the iPhone and I've checked the test programs (particularly testH264VideoStreamer and testMPEG4VideoToDarwin). TestMPEG4VideoToDarwin seems to be very close to what we want to do, but I was wondering if someone had some advice on how to achieve this goal. Are we on the right path or not? Thanks a million for your help, Cheers, Anthony Nevo

From finlayson at live555.com Thu Jan 5 12:04:34 2012 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 5 Jan 2012 12:04:34 -0800 Subject: [Live-devel] Broadcasting live events with an iPhone using Live555 media libraries In-Reply-To: <0A4CD9FC-D64E-49C3-8174-1A40D7EB65E0@haploid.fr> References: <0A4CD9FC-D64E-49C3-8174-1A40D7EB65E0@haploid.fr> Message-ID:

> The architecture we are building is based on a Helix Media Server that will be used to broadcast the stream to PCs, phones, ...
> Then, on the iPhone part, we're building an app that will access the camera and then, maybe by using the Live555 libraries, stream it to the Helix Media Server (H264/AAC using RTP). A good example of what we are trying to achieve is the Livu app (http://stevemcfarlin.com/livu/index.html).
>
> I've compiled the Live555 libraries for the iPhone and I've checked the test programs (particularly testH264VideoStreamer and testMPEG4VideoToDarwin). TestMPEG4VideoToDarwin seems to be very close to what we want to do, but I was wondering if someone had some advice on how to achieve this goal.
> Are we on the right path or not?

Yes, you probably are, provided that the protocol that the "Helix Media Server" uses to receive incoming data is the same protocol (or at least a similar protocol) as the one that's used by the "Darwin Streaming Server". Personally, I dislike the idea of clients 'pushing' data into servers (see and ), but I recognize that there are legacy servers out there that support that model. Ross Finlayson Live Networks, Inc. http://www.live555.com/

From mcmordie at viionsystems.com Thu Jan 5 15:59:36 2012 From: mcmordie at viionsystems.com (Dave McMordie) Date: Thu, 5 Jan 2012 18:59:36 -0500 Subject: [Live-devel] Starting and stopping video within MediaSessions Message-ID: <2b58c94af11910255cded5069c20880e@mail.gmail.com>

We are using Live555 to stream (unicast) short bursts of video (currently MJPG, soon to be H.264) and metadata as objects are tracked throughout a scene. For a given tracked object, we have information about the object (id, location, etc), which we send as XML metadata in a MediaSubsession. Each video/metadata sequence has a start and an end, and multiple (up to 25) of these sequences may be active at one time (think of each as coming from a separate camera trained on a separate subject).

Currently the way we are doing this (just as a proof of concept) is to use a single SMS containing one MJPG stream and one XML stream, with each video sequence multiplexed into this single image/metadata channel. In this model, the MJPG stream has the frames tagged with a serial number and put into correspondence with the XML stream in a buffer on the receiving end.

This is very clunky, and will not scale to better video codecs due to the uncorrelated (multiplexed) sequence of frames. We would like to move toward a model where we have a single session open, but each video/metadata sequence is dynamically added/removed from that session as objects appear and disappear.

Is this possible using Live555? Is this something that can't be done due to limitations in RTSP/RTCP? If this can be done, at what level can we dynamically add 'streams' to an open session? Can we simply add new ServerMediaSessions after the unicast session has begun?

Any guidance on the 'right way' to tackle this would be much appreciated. Best regards, Dave McMordie

*David McMordie* *CTO* 1751 Richardson, Suite 8.123 Montreal, QC H3K 1G6 mcmordie at viionsystems.com www.viionsystems.com

*Confidentiality Message* This message is intended only for the designated recipient(s). It may contain confidential or proprietary information and may be subject to the attorney-client privilege or other confidentiality protections. If you are not a designated recipient, you may not review, copy or distribute this message.
If you receive this in error, please notify the sender by reply email and delete this message.

From finlayson at live555.com Thu Jan 5 19:13:47 2012 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 5 Jan 2012 19:13:47 -0800 Subject: Re: [Live-devel] Starting and stopping video within MediaSessions In-Reply-To: <2b58c94af11910255cded5069c20880e@mail.gmail.com> References: <2b58c94af11910255cded5069c20880e@mail.gmail.com> Message-ID: <3411493A-11B4-4745-A2A7-C1F46D923870@live555.com>

> We are using Live555 to stream (unicast) short bursts of video (currently MJPG, soon to be H.264) and metadata as objects are tracked throughout a scene. For a given tracked object, we have information about the object (id, location, etc), which we send as XML metadata in a MediaSubsession. Each video/metadata sequence has a start and an end, and multiple (up to 25) of these sequences may be active at one time (think of each as coming from a separate camera trained on a separate subject).
>
> Currently the way we are doing this (just as a proof of concept) is to use a single SMS containing one MJPG stream and one XML stream, with each video sequence multiplexed into this single image/metadata channel. In this model, the MJPG stream has the frames tagged with a serial number and put into correspondence with the XML stream in a buffer on the receiving end.
>
> This is very clunky, and will not scale to better video codecs due to the uncorrelated (multiplexed) sequence of frames. We would like to move toward a model where we have a single session open, but each video/metadata sequence is dynamically added/removed from that session as objects appear and disappear.
>
> Is this possible using Live555? Is this something that can't be done due to limitations in RTSP/RTCP?

Hmm... I'm not 100% sure that I understand what you're trying to do, but I think there are 3 separate issues here:
1/ Does RTP/RTCP support what you're trying to do?
2/ Does RTSP support what you're trying to do?
3/ Does the "LIVE555 Streaming Media" code support what you're trying to do?

For 1/, I think the answer is yes. RTP/RTCP supports multiple (logical or physical) sources within a session - with each source having its own "SSRC" (basically, a 'RTP source identifier'). Receiver(s) can demultiplex the different media sources based on "SSRC", on the media type (RTP payload format code), and of course on multiple port numbers. Normally, this is done within a multicast session, but in your case you are (from what I can gather) using unicast (with multiple logical sources effectively sharing a single unicast stream). That should be OK as well.

For 2/, the answer is yes (I think) - but perhaps not in the way that you think. Because you have effectively just two media types - JPEG video and XML text - in your session, there should be just two "m=" lines in your server's SDP description, and thus (in a LIVE555 implementation) just two "ServerMediaSubsession" objects within your "ServerMediaSession". The fact that many different logical sources may appear/disappear within each (sub)stream is irrelevant, as far as RTSP is concerned. This is something that affects the contents of the media stream(s), but not the way that they are described/set up via RTSP.
So, your LIVE555-based server implementation should continue to have just two "ServerMediaSubsession" objects within your "ServerMediaSession".

Now the bad news: For 3/, the answer is no (for now, at least). The LIVE555 code currently does not support demultiplexing based on SSRC. This means that a LIVE555-based receiver will have to do its own demultiplexing upon the single stream of RTP data that it receives. For your XML substream, that's not a problem, because you can have a 'source id' there that you can demultiplex upon. However, as you noted, for H.264, it's going to be a problem. I'm not sure what you can do about this, but perhaps there's some 'user data' field defined for a H.264 NAL that you could use to add a tag that your receiver could use for demultiplexing?? Ross Finlayson Live Networks, Inc. http://www.live555.com/

From finlayson at live555.com Fri Jan 6 01:03:53 2012 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 6 Jan 2012 01:03:53 -0800 Subject: [Live-devel] Test relay ?! In-Reply-To: References: Message-ID:

> Hallo, I would like to modify testRelay.cpp to receive live UDP input and then output RTSP (rtsp://xxx.xxx.xxx.xxx:4444). The UDP-to-UDP relay works excellently, but I want to forward it in RTSP so that others can see it. Can you show me how? I am a beginner in this!

Yes, you can do this; however, "testRelay" is not the best application to use as a starting point, because most of the functionality of your new application will be in the RTSP server. Therefore, I suggest that you instead use "testOnDemandRTSPServer" as a model. For information on how to adapt this to use a UDP stream as a data source, see http://www.live555.com/liveMedia/faq.html#liveInput-unicast As noted in the FAQ, you will need to define your own subclass of "OnDemandServerMediaSubsession", and redefine the "createNewStreamSource()" virtual function. This is where you would use the "testRelay" code as a guide. Your subclass's "createNewStreamSource()" virtual function can be quite simple - basically just creating a "groupsock" for your IP multicast address, and then creating a "BasicUDPSource" using that "groupsock" object. Ross Finlayson Live Networks, Inc. http://www.live555.com/

From finlayson at live555.com Fri Jan 6 01:22:48 2012 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 6 Jan 2012 01:22:48 -0800 Subject: [Live-devel] GET_PARAMETER In-Reply-To: References: Message-ID:

> I just started with the live555 media server and my client starts with the GET_PARAMETER method rather than OPTIONS.
> After reading RTSPServer.cpp I found that the GET_PARAMETER method is implemented just as a 'keep alive' and it sends an empty response.

Does the server send back an empty "200 OK" response, or does it instead send back an error "405 Method not allowed" response? If it's the latter, then we probably can't help you. (Just over a week ago, we had a report of a set-top-box that sends an erroneous "GET_PARAMETER" command that causes the server to respond with a "405" error.) If, however, the server really does send back an empty "200 OK" response, then you can change this by reimplementing the server by defining and implementing your own subclass of "RTSPServer", and reimplementing the "handleCmd_GET_PARAMETER()" virtual function. Note, however, that doing this requires a good knowledge of C++ programming. Ross Finlayson Live Networks, Inc. http://www.live555.com/
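To make the "createNewStreamSource()" suggestion above concrete, here is a minimal sketch of such an "OnDemandServerMediaSubsession" subclass. The class name, multicast address, port, TTL and bitrate estimate are all illustrative (not from the thread), and a complete subclass must also implement "createNewRTPSink()":

#include "OnDemandServerMediaSubsession.hh"
#include "BasicUDPSource.hh"
#include "GroupsockHelper.hh"

class UDPRelayServerMediaSubsession: public OnDemandServerMediaSubsession {
protected:
  virtual FramedSource* createNewStreamSource(unsigned /*clientSessionId*/,
                                              unsigned& estBitrate) {
    estBitrate = 5000; // kbps; an assumed figure for this sketch
    struct in_addr inputAddress;
    inputAddress.s_addr = our_inet_addr("239.255.42.42"); // example group
    Groupsock* inputGroupsock
      = new Groupsock(envir(), inputAddress, Port(1234), 255/*ttl*/);
    return BasicUDPSource::createNew(envir(), inputGroupsock);
  }
  // "createNewRTPSink()" - chosen to match the payload format - goes here
};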
From 6.45.vapuru at gmail.com Thu Jan 5 00:54:52 2012 From: 6.45.vapuru at gmail.com (Novalis Vapuru) Date: Thu, 5 Jan 2012 10:54:52 +0200 Subject: [Live-devel] Closing and deleting environment at the shutdownStream [ testRTSPClient and OpenRTSPClient] Message-ID:

Hi, I want to simply close and delete the existing UsageEnvironment in a shutdown method, without calling "exit"... What I have done so far:

1. First: the reclaim method, env.reclaim(); but this does NOT delete the environment, since env.liveMediaPriv != NULL... Of course I can set liveMediaPriv to NULL manually, but is that a good way?

2. Second: void Medium::close(UsageEnvironment& env, char const* name) - well, I cannot understand what kind of name it wants, because the environment is created without a name. So I tried with NULL, and with the application name given to the RTSPClient object, but I get an access violation error in the MediaLookupTable::remove(char const* name) method, where it calls Medium* medium = lookup(name);

Best Wishes

From sharmafrequent at gmail.com Thu Jan 5 03:55:23 2012 From: sharmafrequent at gmail.com (Nishit Sharma) Date: Thu, 5 Jan 2012 17:25:23 +0530 Subject: [Live-devel] Require some info regarding live555 In-Reply-To: References: Message-ID:

Dear All, I am new to this Live555. I want to know whether, in the present scenario, it is possible to multicast different streams on different multicast IPs. In the source code I found that if I need to do so, I need to create another instance of the server, but I didn't find anything about how I can do the same without creating more instances. Hoping to get a reply. Thanks and Regards Nishit Sharma

From tayeb.dotnet at gmail.com Thu Jan 5 00:34:19 2012 From: tayeb.dotnet at gmail.com (Meftah Tayeb) Date: Thu, 5 Jan 2012 10:34:19 +0200 Subject: [Live-devel] Require some info regarding live555 References: Message-ID: <557516E7D1B9403DA09D2418C846D67A@work>

hello, what kind of stream source are you willing to multicast? thank you

----- Original Message ----- From: Nishit Sharma To: live-devel at ns.live555.com Sent: Thursday, January 05, 2012 1:55 PM Subject: [Live-devel] Require some info regarding live555

Dear All, I am new to this Live555. I want to know whether, in the present scenario, it is possible to multicast different streams on different multicast IPs. In the source code I found that if I need to do so, I need to create another instance of the server, but I didn't find anything about how I can do the same without creating more instances. Hoping to get a reply. Thanks and Regards Nishit Sharma
From david.myers at panogenics.com Fri Jan 6 04:19:10 2012 From: david.myers at panogenics.com (David J Myers) Date: Fri, 6 Jan 2012 12:19:10 -0000 Subject: [Live-devel] H264VideoStreamDiscreteFramer problems Message-ID: <001a01cccc6d$659ad810$30d08830$@myers@panogenics.com>

You wrote:
> I suggest using "openRTSP" as your client (give it the "-d <duration>" flag, to record a specific length of time).
> You should end up with a file named something like "VIDEO-H264-1".
> Rename this file to "test.h264". See whether or not you can play it using VLC.
> If you can't, then email it to us (if it's short) or else put it on a web server and send us the URL - so we can inspect it ourselves.

Ok, openRTSP works, provided I use the -b option to bump up the input buffer to handle our frame size (I used -b 1000000). VLC can then play the file produced; however, it plays them about 5 times too fast. A 10 sec clip plays in 2 secs; a 30 sec clip plays in 5 secs. You can download the files to test them yourself at:

http://www.panogenics.com/stream0.h264 (a 10 second clip)
http://www.panogenics.com/stream0-001.h264 (a 30 second clip)

Thanks, David

From zhanm at join.net.cn Fri Jan 6 05:23:02 2012 From: zhanm at join.net.cn (=?gb2312?B?1bLD9w==?=) Date: Fri, 6 Jan 2012 21:23:02 +0800 Subject: [Live-devel] Extend RTSPServer and RTSPClient problems In-Reply-To: References: Message-ID: <005f01cccc76$519888e0$f4c99aa0$@net.cn>

Hi, I want to subclass the RTSPServer and RTSPClient classes to extend my specific functions. However, with the private class RequestQueue and some other private attributes defined within the RTSPClient class, I found it hard to override just a few methods, because my overridden methods can't access those private attributes in the RTSPClient class.

Also, when I want to add another video format to be processed by the server, I found that I need to override the createSourceObjects method to generate my specific RTPSource. So, I subclassed the MediaSubsession class (e.g., myMediaSubsession) and overrode its createSourceObjects. However, createSourceObjects is called by the MediaSubsession::initiate() method, which in turn requires me to override it. After I had done that, I found the myMediaSubsession::initiate() method still calls MediaSubsession::createSourceObjects rather than myMediaSubsession::createSourceObjects.

I wonder: if I want to subclass the RTSPServer and RTSPClient classes, what is the minimal set of classes that I also need to subclass to make it work? (RTSPClientSession, MediaSubsession, or ...?)

From sharmafrequent at gmail.com Fri Jan 6 02:30:57 2012 From: sharmafrequent at gmail.com (Nishit Sharma) Date: Fri, 6 Jan 2012 16:00:57 +0530 Subject: [Live-devel] Require some info regarding live555 In-Reply-To: <557516E7D1B9403DA09D2418C846D67A@work> References: <557516E7D1B9403DA09D2418C846D67A@work> Message-ID:

hi Meftah, Thanks for replying. Below is my query:

In the current Live555 source we have a sample application *testMPEG2TransportStreamer*, which reads a MPEG Transport Stream file (named "test.ts") and streams it, using RTP, to the multicast group 239.255.42.42, port 1234; I can change the file also. Now my question is: how can I multicast the same file on multiple multicast addresses using the same application, and how can I multicast other source (.ts) files on multiple multicast addresses using the same application? Waiting for a reply.
Thanks

On Thu, Jan 5, 2012 at 2:04 PM, Meftah Tayeb wrote:
> hello, what kind of stream source are you willing to multicast? thank you
>
> ----- Original Message ----- From: Nishit Sharma To: live-devel at ns.live555.com Sent: Thursday, January 05, 2012 1:55 PM Subject: [Live-devel] Require some info regarding live555
>
> Dear All, I am new to this Live555. I want to know whether, in the present scenario, it is possible to multicast different streams on different multicast IPs. In the source code I found that if I need to do so, I need to create another instance of the server, but I didn't find anything about how I can do the same without creating more instances. Hoping to get a reply. Thanks and Regards Nishit Sharma

From viraj.mehta at kritnu.com Fri Jan 6 05:56:34 2012 From: viraj.mehta at kritnu.com (Viraj Mehta) Date: Fri, 6 Jan 2012 19:26:34 +0530 Subject: [Live-devel] RTSP Streaming on iOS Message-ID: <02A3D55E-F3BF-4B13-BD83-5DB71E6C238C@kritnu.com>

Hello All, I was wondering if you folks can help me stream video which is available to me using RTSP. I have compiled the Live555 source code for iOS and integrated it into my project.
After supplying the URL to the below code:

session = [[RTSPClientSession alloc] initWithURL:[NSURL URLWithString:stringContainingURL]];
[session setup];
NSLog(@"getSDP: --> %@", [session getSDP]);
NSArray *array = [session getSubsessions];
for (int i = 0; i < [array count]; i++) {
    RTSPSubsession *subsession = [array objectAtIndex:i];
    [session setupSubsession:subsession useTCP:YES];
    subsession.delegate = self;
    [subsession increaseReceiveBufferTo:2000000];
    NSLog(@"[subsession getProtocolName] = %@", [subsession getProtocolName]);
    NSLog(@"[subsession getCodecName] = %@", [subsession getCodecName]);
    NSLog(@"[subsession getMediumName] = %@", [subsession getMediumName]);
}
[session play];
NSLog(@"error: --> %@", [session getLastErrorString]);

I get the following log:

2012-01-06 19:06:18.857 Sample_Video[25829:207] getSDP: -->
v=0
o=- 13260000 1 IN IP4 0.0.0.0
s=Session streamed by "Object RTSPServer"
i=H264
t=0 0
a=tool:LIVE555 Streaming Media v2011.07.08
a=type:broadcast
a=control:*
a=range:npt=0-
a=x-qt-text-nam:Session streamed by "Object RTSPServer"
a=x-qt-text-inf:H264
m=video 0 RTP/AVP 96
c=IN IP4 0.0.0.0
b=AS:500
a=rtpmap:96 H264/90000
a=fmtp:96 packetization-mode=1;profile-level-id=42000D;sprop-parameter-sets=J0IADapAoPk3AgICQAAAAwBAAAAGeAC5zgCAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA,KM4EcgAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA==
a=control:track1
2012-01-06 19:06:19.045 Sample_Video[25829:207] [subsession getProtocolName] = RTP
2012-01-06 19:06:19.046 Sample_Video[25829:207] [subsession getCodecName] = H264
2012-01-06 19:06:19.046 Sample_Video[25829:207] [subsession getMediumName] = video
2012-01-06 19:06:19.197 Sample_Video[25829:207] error: --> liveMedia4
2012-01-06 19:06:19.345 Sample_Video[25829:207] didReceiveFrame
2012-01-06 19:06:19.346 Sample_Video[25829:207] frameDataLength = 220
2012-01-06 19:06:19.347 Sample_Video[25829:207] presentationTime.tv_sec = 1325856979
2012-01-06 19:06:19.498 Sample_Video[25829:207] didReceiveFrame
2012-01-06 19:06:19.499 Sample_Video[25829:207] frameDataLength = 316
2012-01-06 19:06:19.500 Sample_Video[25829:207] presentationTime.tv_sec = 1325856979

It's my understanding that I am receiving the video frames correctly, but I am unsure how to proceed from here to display the video on the device. Any prod in the right direction is much appreciated. Thanks for your time guys. Best Regards, Viraj Mehta Software Engineer Kritnu IT Solutions Private Limited No. 7 | 3rd Cross | B Street | Link Road | Malleshwaram East Link | Bangalore | 560003 | India Phone- +91 80 23564841 +91 - 9538453154 | viraj.mehta at kritnu.com www.kritnu.com

From finlayson at live555.com Fri Jan 6 06:09:13 2012 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 6 Jan 2012 06:09:13 -0800 Subject: [Live-devel] H264VideoStreamDiscreteFramer problems In-Reply-To: <001a01cccc6d$659ad810$30d08830$@myers@panogenics.com> References: <001a01cccc6d$659ad810$30d08830$@myers@panogenics.com> Message-ID: <05DFF9F3-A5E8-4F3E-928F-B8AE1074FE73@live555.com>

> > I suggest using "openRTSP" as your client (give it the "-d <duration>" flag, to record a specific length of time).
> >You should end up with a file named something like "VIDEO-H264-1". > >Rename this file to "test.h264". See whether or not you can play it using VLC. > >If you can't, then email it to us (if it's short) or else put it on a web server and send us the URL - so we can inspect it ourselves. > > Ok, OpenRTSP works, provided I use the ?b option to bump up the input buffer to handle our frame size (I used ?b 1000000). > VLC can then play the file produced, however it plays them about 5 times too fast. A 10sec clip plays in 2secs, a 30sec clip plays in 5secs. OK, because "openRTSP" - when run on your stream - generated a file that could be played by "VLC", this shows that there is nothing inherently wrong with the LIVE555-related code in your server. Whatever problem(s) that you might have are likely caused by your encoder - and thus are off-topic for this mailing list. (If you were to generate a raw H.264 video file directly from your encoder - without going through any LIVE555 server code at all - you would likely end up with the same result, when the file is played by VLC.) The problem with VLC playing your file too fast needs to be raised on a VLC mailing list, not this one. (However, I suggest that you look at how your encoder is generating frame-rate-related parameters in your stream's SPS NAL unit. Perhaps it's not setting "num_units_in_tick" and "time_scale" properly? But again, off-topic for this mailing list.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Jan 6 06:53:44 2012 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 6 Jan 2012 06:53:44 -0800 Subject: [Live-devel] Extend RTSPServer and RTSPClient problems In-Reply-To: <005f01cccc76$519888e0$f4c99aa0$@net.cn> References: <005f01cccc76$519888e0$f4c99aa0$@net.cn> Message-ID: <71684847-C612-44CD-B1DE-86E1A46AC14A@live555.com> > I want to subclass RTSPServer and RTSPClient classes to extend my specific functions. I don't recommend trying to modify or 'extend' the RTSP protocol, because - if you do so - you will end up with a protocol that noone else will understand. So, I'm not convinced that you need to subclass "RTSPClient" at all. Note that 'custom' functionality can often be added to RTSP using the standard "GET_PARAMETER" and "SET_PARAMETER" commands, which you can send using the "RTSPClient::sendGetParameterCommand()" and "RTSPClient::sendSetParameterCommand()" functions - i.e., without making any modifications to "RTSPClient", or to the RTSP protocol in general. See, for example http://lists.live555.com/pipermail/live-devel/2011-January/013072.html > Also, when I want to add another video format to be processed by the server I think you mean "client" here. > , I found that I need to override the ceateSourceObjects method to generate my specific RTPSource. That's correct. However, the preferred mechanism for doing this is described in the comment that's near the top of the header file "liveMedia/include/MediaSession.hh" - i.e., the comment that begins: /* NOTE: To support receiving your own custom RTP payload format, ... > I wonder if I want to subclass RTSPServer and RTSPClient classes, what is the minimum classes that I also need to subclass them to make it work? (RTSPClientSession, MediaSubSession, or ?) As I noted above, I don't think that you need to subclass "RTSPClient". 
From finlayson at live555.com Fri Jan 6 06:53:44 2012 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 6 Jan 2012 06:53:44 -0800 Subject: [Live-devel] Extend RTSPServer and RTSPClient problems In-Reply-To: <005f01cccc76$519888e0$f4c99aa0$@net.cn> References: <005f01cccc76$519888e0$f4c99aa0$@net.cn> Message-ID: <71684847-C612-44CD-B1DE-86E1A46AC14A@live555.com>

> I want to subclass the RTSPServer and RTSPClient classes to extend my specific functions.

I don't recommend trying to modify or 'extend' the RTSP protocol, because - if you do so - you will end up with a protocol that no one else will understand. So, I'm not convinced that you need to subclass "RTSPClient" at all. Note that 'custom' functionality can often be added to RTSP using the standard "GET_PARAMETER" and "SET_PARAMETER" commands, which you can send using the "RTSPClient::sendGetParameterCommand()" and "RTSPClient::sendSetParameterCommand()" functions - i.e., without making any modifications to "RTSPClient", or to the RTSP protocol in general. See, for example http://lists.live555.com/pipermail/live-devel/2011-January/013072.html

> Also, when I want to add another video format to be processed by the server

I think you mean "client" here.

> , I found that I need to override the createSourceObjects method to generate my specific RTPSource.

That's correct. However, the preferred mechanism for doing this is described in the comment that's near the top of the header file "liveMedia/include/MediaSession.hh" - i.e., the comment that begins:

/* NOTE: To support receiving your own custom RTP payload format, ...

> I wonder: if I want to subclass the RTSPServer and RTSPClient classes, what is the minimal set of classes that I also need to subclass to make it work? (RTSPClientSession, MediaSubsession, or ...?)

As I noted above, I don't think that you need to subclass "RTSPClient". You will not need to subclass "RTSPServer" either, *unless* you want to extend the functionality of one or more of the standard RTSP commands - most likely "GET_PARAMETER" and/or "SET_PARAMETER", as noted above. Again, for more information on how to do this, see http://lists.live555.com/pipermail/live-devel/2011-January/013072.html Also, if you want your server to stream from a new kind of data source - e.g., to stream from an input device, rather than from a file - then you will need to define your own subclass of "OnDemandServerMediaSubsession". (I'm assuming here that you want to stream via unicast, rather than via multicast.) This has been described several times on this mailing list, and in this FAQ entry: http://www.live555.com/liveMedia/faq.html#liveInput-unicast Ross Finlayson Live Networks, Inc. http://www.live555.com/

From finlayson at live555.com Fri Jan 6 07:06:57 2012 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 6 Jan 2012 07:06:57 -0800 Subject: [Live-devel] Require some info regarding live555 In-Reply-To: References: <557516E7D1B9403DA09D2418C846D67A@work> Message-ID:

> In the current Live555 source we have a sample application testMPEG2TransportStreamer, which reads a MPEG Transport Stream file (named "test.ts") and streams it, using RTP, to the multicast group 239.255.42.42, port 1234; I can change the file also.
>
> Now my question is: how can I multicast the same file on multiple multicast addresses using the same application, and how can I multicast other source (.ts) files on multiple multicast addresses using the same application?

You can do this very easily - using the existing "testMPEG2TransportStreamer" code as a model. For each multicast address that you want to send to, you:
1/ create two "Groupsock" objects - one for RTP, the other for RTCP. (The port number for RTP should be even; the next higher (i.e., odd) port number should be for RTCP.)
2/ create a new "SimpleRTPSink" object from the RTP "Groupsock"
3/ create a new "RTCPInstance" object from the RTCP "Groupsock"
4/ create a new "ByteStreamFileSource" object (for the file that you want to read from)
5/ create a new "MPEG2TransportStreamFramer" object (for the "ByteStreamFileSource" object that you created in step 4/)
6/ call "startPlaying()" on the "SimpleRTPSink" object that you created in step 2/, taking, as parameter, the "MPEG2TransportStreamFramer" object that you created in step 5/

I.e., you do steps 1/ through 6/ for each file+multicast address that you want to stream. Then (and only then), once you've done all this, you call

env->taskScheduler().doEventLoop();

to enter the application's 'event loop' (which is where the actual streaming gets done). Ross Finlayson Live Networks, Inc. http://www.live555.com/
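For illustration, those six steps condensed into code, modeled on "testMPEG2TransportStreamer.cpp". The address, ports, TTL and bandwidth figure are example values; "env" is assumed to have been created as in the test programs, and "afterPlaying" is a callback the application defines:

struct in_addr destinationAddress;
destinationAddress.s_addr = our_inet_addr("239.255.42.42"); // example group
const Port rtpPort(1234), rtcpPort(1234+1); // RTP port even; RTCP = RTP+1
const unsigned char ttl = 7;
Groupsock rtpGroupsock(*env, destinationAddress, rtpPort, ttl);    // step 1/
Groupsock rtcpGroupsock(*env, destinationAddress, rtcpPort, ttl);
RTPSink* videoSink = SimpleRTPSink::createNew(*env, &rtpGroupsock,
    33, 90000, "video", "MP2T", 1, True, False/*no 'M' bit*/);     // step 2/
const unsigned estimatedSessionBandwidth = 5000; // in kbps
unsigned char CNAME[101];
gethostname((char*)CNAME, 100); CNAME[100] = '\0';
RTCPInstance::createNew(*env, &rtcpGroupsock, estimatedSessionBandwidth,
    CNAME, videoSink, NULL /*we're a server*/);                    // step 3/
ByteStreamFileSource* fileSource
    = ByteStreamFileSource::createNew(*env, "test.ts");            // step 4/
FramedSource* videoSource
    = MPEG2TransportStreamFramer::createNew(*env, fileSource);     // step 5/
videoSink->startPlaying(*videoSource, afterPlaying, videoSink);    // step 6/
// Repeat the above for each file+address pair; then (and only then):
// env->taskScheduler().doEventLoop();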
From finlayson at live555.com Fri Jan 6 07:11:27 2012 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 6 Jan 2012 07:11:27 -0800 Subject: [Live-devel] RTSP Streaming on iOS In-Reply-To: <02A3D55E-F3BF-4B13-BD83-5DB71E6C238C@kritnu.com> References: <02A3D55E-F3BF-4B13-BD83-5DB71E6C238C@kritnu.com> Message-ID:

> It's my understanding that I am receiving the video frames correctly, but I am unsure how to proceed from here to display the video on the device.
> Any prod in the right direction is much appreciated.

I can't help you with Objective-C programming, unfortunately. However, I suggest that you look at the (C++) code for the new "testRTSPClient" application. Note, in particular, how we define our own "MediaSink" subclass (which, in this demo application, we call "DummySink") to receive each media frame. You would do something similar: Define your own "MediaSink" subclass for rendering/decoding/displaying each received frame. Ross Finlayson Live Networks, Inc. http://www.live555.com/
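For reference, a skeleton of such a "MediaSink" subclass, shaped like "testRTSPClient"'s "DummySink". The class name, buffer size and the decoder hand-off are placeholders for whatever the application actually does with each frame:

class DecodingSink: public MediaSink {
public:
  static DecodingSink* createNew(UsageEnvironment& env) {
    return new DecodingSink(env);
  }
private:
  DecodingSink(UsageEnvironment& env): MediaSink(env) {}
  virtual Boolean continuePlaying() {
    if (fSource == NULL) return False;
    fSource->getNextFrame(fReceiveBuffer, sizeof fReceiveBuffer,
                          afterGettingFrame, this,
                          onSourceClosure, this);
    return True;
  }
  static void afterGettingFrame(void* clientData, unsigned frameSize,
                                unsigned /*numTruncatedBytes*/,
                                struct timeval presentationTime,
                                unsigned /*durationInMicroseconds*/) {
    DecodingSink* sink = (DecodingSink*)clientData;
    // hand sink->fReceiveBuffer (frameSize bytes), along with
    // "presentationTime", to the decoder/renderer here...
    sink->continuePlaying(); // then ask the source for the next frame
  }
  u_int8_t fReceiveBuffer[100000];
};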
From finlayson at live555.com Fri Jan 6 18:32:40 2012 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 6 Jan 2012 18:32:40 -0800 Subject: Re: [Live-devel] Closing and deleting environment at the shutdownStream [ testRTSPClient and OpenRTSPClient] In-Reply-To: References: Message-ID: <7D4BD304-1AEF-4C65-826E-AD625308B20B@live555.com>

> I want to simply close and delete the existing UsageEnvironment

The (one and only) way to do this is to call "reclaim()" on the "UsageEnvironment" object. However, as you noted, this will not actually delete the object if the "liveMediaPriv" pointer is non-NULL. That happens if you call "reclaim()" on the "UsageEnvironment" when some "Medium" objects still exist, and/or some socket(s) are still open. So, if you really want to reclaim the (minuscule) memory used by a "UsageEnvironment", then you need to first make sure that all "Medium" objects, and all sockets, have been closed. But, if you've really finished all LIVE555-related computation, and want to reclaim all of its state, then *by far* the easiest/best way to do this is to put all of the LIVE555-related computation in its own process (i.e., application), and just have this application "exit()". (But apparently many modern-day developers don't understand processes :-) Ross Finlayson Live Networks, Inc. http://www.live555.com/

From erin.yang at mstarsemi.com Sat Jan 7 01:22:13 2012 From: erin.yang at mstarsemi.com (Erin Yang) Date: Sat, 7 Jan 2012 17:22:13 +0800 Subject: [Live-devel] How to distinguish new RTP packets from out-of-date RTP packets when randomly accessing the video Message-ID: <879149D60CA247A7BC73D9503E41FA87@mstarsemi.com.tw>

Hi, I would like to randomly access the video, so I send a PAUSE command and then a PLAY command with a specified time, as follows.

PAUSE rtsp://113.31.34.14:554/work/500/115/969/967/500.3gp RTSP/1.0
SeqNo: 3
Session: 6347526623097789397
(==> without a Range header)

...

PLAY rtsp://113.31.34.14:554/work/500/115/969/967/500.3gp RTSP/1.0
SeqNo: 4
Range: npt=12-100 (==> set the start play point at the 12th second)
Session: 6347526623097789397

However, I can't control the network traffic, so I may still receive some old RTP packets after receiving the PLAY response. Is there any information to distinguish new RTP packets from out-of-date RTP packets? With that information, I can discard the old RTP packets and start playing video only upon receiving the first new RTP packet. Thx a lot!

From finlayson at live555.com Sat Jan 7 01:45:46 2012 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 7 Jan 2012 01:45:46 -0800 Subject: Re: [Live-devel] How to distinguish new RTP packets from out-of-date RTP packets when randomly accessing the video In-Reply-To: <879149D60CA247A7BC73D9503E41FA87@mstarsemi.com.tw> References: <879149D60CA247A7BC73D9503E41FA87@mstarsemi.com.tw> Message-ID:

> I would like to randomly access the video, so I send a PAUSE command and then a PLAY command with a specified time, as follows.
>
> PAUSE rtsp://113.31.34.14:554/work/500/115/969/967/500.3gp RTSP/1.0
> SeqNo: 3
> Session: 6347526623097789397
> (==> without a Range header)
>
> ...
>
> PLAY rtsp://113.31.34.14:554/work/500/115/969/967/500.3gp RTSP/1.0
> SeqNo: 4
> Range: npt=12-100 (==> set the start play point at the 12th second)
> Session: 6347526623097789397
>
> However, I can't control the network traffic, so I may still receive some old RTP packets after receiving the PLAY response.

Are you actually seeing this happen with this server (a Darwin Streaming Server)? Our RTP reception software will automatically discard out-of-order incoming RTP packets (by checking the RTP sequence number). So, if you're seeing "old RTP packets" after the "PLAY", then presumably you're seeing some 'old RTP packets', followed by only 'new RTP packets'. You should not be seeing 'old RTP packets' mixed with 'new RTP packets'. In this case, I wouldn't worry too much about this. It's unlikely that you're seeing very many 'old RTP packets' (unless your server is badly broken), so you can probably just feed all incoming data to your decoder, as usual, and things should be OK. It'll be merely as if you slightly delayed sending the "PAUSE" command. Ross Finlayson Live Networks, Inc. http://www.live555.com/

From finlayson at live555.com Sat Jan 7 02:23:55 2012 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 7 Jan 2012 02:23:55 -0800 Subject: [Live-devel] New LIVE555 version - adds a "StreamReplicator" class (and a demo application for this) Message-ID:

People have often asked for a way to 'replicate' an incoming stream into two or more replicas, so that they can (for example) write one replica to a file, and transmit the other replica over a network, or perhaps feed it into a decoder, etc. I've now installed a new version (2012.01.07) of the "LIVE555 Streaming Media" software that adds a new class "StreamReplicator". You can create an object of this class - using an input stream - and then call "createStreamReplica()" on it several times, to create 'replicas' of the input stream. I've also added a new demo application - to the "testProgs" directory - called "testReplicator". This demo application receives a UDP multicast stream, replicates it, and retransmits one replica stream to another (multicast or unicast) address & port, and writes the other replica stream to a file. Ross Finlayson Live Networks, Inc. http://www.live555.com/

From erin.yang at mstarsemi.com Sat Jan 7 06:09:12 2012 From: erin.yang at mstarsemi.com (Erin Yang) Date: Sat, 7 Jan 2012 22:09:12 +0800 Subject: [Live-devel] How to distinguish new RTP packets from out-of-date RTP packets when randomly accessing the video In-Reply-To: Message-ID: <7296D93801D64058BCFA70A1410CAD85@GH>

Yes. It happened. Sometimes, network traffic may be congested.
When randomly accessing the video, we can't guarantee that the last old RTP packets arrive earlier than the first new RTP packet, due to the congested network. If I just feed all incoming data to the decoder, as usual, the player will display old video frames at the new position. So, we need more information (e.g., the RTP header, the command response, ...) to overcome the problem resulting from a congested network.

-----Original Message----- From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Saturday, January 07, 2012 5:46 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] How to distinguish new RTP packets from out-of-date RTP packets when randomly accessing the video

I would like to randomly access the video, so I send a PAUSE command and then a PLAY command with a specified time, as follows.

PAUSE rtsp://113.31.34.14:554/work/500/115/969/967/500.3gp RTSP/1.0
SeqNo: 3
Session: 6347526623097789397
(==> without a Range header)

...

PLAY rtsp://113.31.34.14:554/work/500/115/969/967/500.3gp RTSP/1.0
SeqNo: 4
Range: npt=12-100 (==> set the start play point at the 12th second)
Session: 6347526623097789397

However, I can't control the network traffic, so I may still receive some old RTP packets after receiving the PLAY response.

Are you actually seeing this happen with this server (a Darwin Streaming Server)? Our RTP reception software will automatically discard out-of-order incoming RTP packets (by checking the RTP sequence number). So, if you're seeing "old RTP packets" after the "PLAY", then presumably you're seeing some 'old RTP packets', followed by only 'new RTP packets'. You should not be seeing 'old RTP packets' mixed with 'new RTP packets'. In this case, I wouldn't worry too much about this. It's unlikely that you're seeing very many 'old RTP packets' (unless your server is badly broken), so you can probably just feed all incoming data to your decoder, as usual, and things should be OK. It'll be merely as if you slightly delayed sending the "PAUSE" command. Ross Finlayson Live Networks, Inc. http://www.live555.com/

From finlayson at live555.com Sat Jan 7 07:29:15 2012 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 7 Jan 2012 07:29:15 -0800 Subject: Re: [Live-devel] How to distinguish new RTP packets from out-of-date RTP packets when randomly accessing the video In-Reply-To: <7296D93801D64058BCFA70A1410CAD85@GH> References: <7296D93801D64058BCFA70A1410CAD85@GH> Message-ID: <532465D2-78F9-43E8-AE01-F9D5CE0A5297@live555.com>

> Are you actually seeing this happen with this server (a Darwin Streaming Server)? Our RTP reception software will automatically discard out-of-order incoming RTP packets (by checking the RTP sequence number). So, if you're seeing "old RTP packets" after the "PLAY", then presumably you're seeing some 'old RTP packets', followed by only 'new RTP packets'. You should not be seeing 'old RTP packets' mixed with 'new RTP packets'.
>
> Yes. It happened. Sometimes, network traffic may be congested.

If you're seeing 'old RTP packets' mixed with 'new RTP packets' (i.e., not just 'old RTP packets' followed by 'new RTP packets'), then that means that much more than "network congestion" is happening here. If that is what you're seeing, then packets are being not just delayed, but reordered - *excessively* reordered.
In fact, they're being reordered so much that our RTP sequence number check is not catching it - which means that some packets are out-of-order by more than 32767 (i.e., 2^15 - 1, half the 16-bit sequence-number space) packets *within the same stream*. Something is badly broken with your server (most likely), or your network. That's what you should be fixing.

> So, we need more information (e.g., the RTP header, the command response, ...) to overcome the problem resulting from a congested network.

Our RTP implementation already takes care of this - so that it should be delivering RTP payloads in order. If that's not happening, then - as I noted above - your server or network is badly broken, and there's nothing that our software can do to overcome this. The first thing I recommend that you do is check whether you have the latest software version for your "Darwin Streaming Server" (either that, or use a better server). Ross Finlayson Live Networks, Inc. http://www.live555.com/

From 6.45.vapuru at gmail.com Sat Jan 7 02:00:18 2012 From: 6.45.vapuru at gmail.com (Novalis Vapuru) Date: Sat, 7 Jan 2012 12:00:18 +0200 Subject: Re: [Live-devel] Closing and deleting environment at the shutdownStream [ testRTSPClient and OpenRTSPClient] In-Reply-To: <7D4BD304-1AEF-4C65-826E-AD625308B20B@live555.com> References: <7D4BD304-1AEF-4C65-826E-AD625308B20B@live555.com> Message-ID:

Well, I ran my test on the testRTSPClient example. It has the method shutdownStream(RTSPClient* rtspClient, int exitCode), in which I deleted the exit(exitCode); statement and put env.reclaim() there, but env.liveMediaPriv != NULL, as I said before. How do I modify the testRTSPClient example's shutdownStream method, or add a new method (call it DestroyEverything) so that all sockets and environments are deleted?

Best Wishes Novalis

PS: Why do I not use exit? Since I am developing a library which others can use... so they can start and stop everything in their program, without memory leaks or open sockets, programmatically... As simple as:

RTSPClient* client = new RTSPClient(rtspUrl);
client->Start();
client->Shutdown(); // everything should be gone at this point

or

RTSPClient* client = new RTSPClient(rtspUrl);
client->Start();
client->Stop();
delete client; // everything should be gone at this point

Users of this library may create many clients without exiting the program... I have no chance to start/stop clients in another process...

From finlayson at live555.com Sat Jan 7 08:32:36 2012 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 7 Jan 2012 08:32:36 -0800 Subject: Re: [Live-devel] Closing and deleting environment at the shutdownStream [ testRTSPClient and OpenRTSPClient] In-Reply-To: References: <7D4BD304-1AEF-4C65-826E-AD625308B20B@live555.com> Message-ID: <896FBD18-DB99-44C1-BD6F-B5F9180BEAA7@live555.com>

> Well, I ran my test on the testRTSPClient example. It has the method shutdownStream(RTSPClient* rtspClient, int exitCode), in which I deleted the exit(exitCode); statement and put env.reclaim() there, but env.liveMediaPriv != NULL, as I said before.

Sorry, but I just tried this myself, with the actual "testRTSPClient" code, and found that the "liveMediaPriv" pointer *is* NULL, and the "UsageEnvironment" object *does* get deleted. Bzzzt! Thank you for playing... Ross Finlayson Live Networks, Inc. http://www.live555.com/
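For completeness, a sketch of the teardown order that makes "reclaim()" succeed, as described in this thread. "env" and "scheduler" are assumed to be the variables created at startup, and every "Medium" object must already have been closed (e.g., via "shutdownStream()" in "testRTSPClient"):

// ...after all RTSPClient/MediaSession/sink objects have been closed:
env->reclaim(); env = NULL;         // deletes the environment once liveMediaPriv is NULL
delete scheduler; scheduler = NULL; // then the TaskScheduler can go too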
From 6.45.vapuru at gmail.com Fri Jan 6 04:09:54 2012
From: 6.45.vapuru at gmail.com (Novalis Vapuru)
Date: Fri, 6 Jan 2012 14:09:54 +0200
Subject: [Live-devel] Parser for sprop-parameter-sets at describe response to get width-height...
Message-ID:

Hi,

I check the video stream width and height from the subsession: scs.subsession->videoWidth(), scs.subsession->videoHeight()...

They give me the right dimensions for a server whose describe response includes "a=x-dimensions:%d,%d", &width, &height)"... But they give the wrong value (0) for a server which does NOT include "a=x-dimensions:%d,%d", &width, &height)"... but INCLUDES "sprop-parameter-sets......."... [for an h264 stream]...

So I have to parse that subsession->fSpropParameterSets to get the width..

Is there a parser for the fSpropParameterSets parameters in Live555 with which I can extract the video width and height?

Best Wishes
Novalis

From finlayson at live555.com Mon Jan 9 15:41:48 2012
From: finlayson at live555.com (Ross Finlayson)
Date: Mon, 9 Jan 2012 15:41:48 -0800
Subject: Re: [Live-devel] Parser for sprop-parameter-sets at describe response to get width-height...
In-Reply-To: References:
Message-ID: <27CD841A-249A-4A45-A740-3E98B952D734@live555.com>

> I check the video stream width and height from the subsession:
> scs.subsession->videoWidth(), scs.subsession->videoHeight()...
>
> They give me the right dimensions for a server whose describe response includes
> "a=x-dimensions:%d,%d", &width, &height)"...
>
> But they give the wrong value (0) for a server which does NOT include
> "a=x-dimensions:%d,%d", &width, &height)"

Exactly. The "MediaSubsession::videoHeight()" and "MediaSubsession::videoWidth()" member functions (and other "MediaSession" member functions) return the values that were obtained by parsing the stream's SDP description. If, however, the corresponding fields are not in the stream's SDP description, then 'null' values will be returned instead.

> So I have to parse that subsession->fSpropParameterSets to get the width..

Yes. Just as you have to parse this, and all of the other NAL units, if you want to decode and play the H.264 video.

> Is there a parser for the fSpropParameterSets parameters in Live555
> with which I can extract the video width and height?

No.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From gsnetplayer at hotmail.com Sun Jan 8 16:37:43 2012
From: gsnetplayer at hotmail.com (GS Net Player)
Date: Mon, 9 Jan 2012 01:37:43 +0100
Subject: Re: [Live-devel] Test relay ?!
In-Reply-To: References:
Message-ID:

hello, can you show me how to read text from a .txt file? I want to get the stream URL from the text which is in "C://text.txt" (rtsp://xxx.xxx.xxx.xxx:8554/textfromfile)! I tried with

  FILE* fp = fopen("text.txt", "rb");

and then I put it in:

  ServerMediaSession* sms = ServerMediaSession::createNew(*env, fp, inputFileName,
      "Session streamed by \"testMPEG4VideoStreamer\"", isSSM);
  sms->addSubsession(PassiveServerMediaSubsession::createNew(*videoSink, rtcp));
  rtspServer->addServerMediaSession(sms);

but without success!

regards
Igor

From: finlayson at live555.com
Date: Fri, 6 Jan 2012 01:03:53 -0800
To: live-devel at ns.live555.com
Subject: Re: [Live-devel] Test relay ?!

Hallo, I would like to modify testRelay.cpp to receive live UDP input and then output it over RTSP (rtsp://xxx.xxx.xxx.xxx:4444); the UDP-to-UDP relay works excellently, but I want to forward it over RTSP so that others can see it. Can you show me how? I am a beginner in this ?!
Yes, you can do this; however, "testRelay" is not the best application to use as a starting point, because most of the functionality of your new application will be in the RTSP server. Therefore, I suggest that you instead use "testOnDemandRTSPServer" as a model. For information on how to adapt this to use a UDP stream as a data source, see http://www.live555.com/liveMedia/faq.html#liveInput-unicast

As noted in the FAQ, you will need to define your own subclass of "OnDemandServerMediaSubsession", and redefine the "createNewStreamSource()" virtual function. This is where you would use the "testRelay" code as a guide. Your subclass's "createNewStreamSource()" virtual function can be quite simple - basically just creating a "groupsock" for your IP multicast address, and then creating a "BasicUDPSource" using that "groupsock" object.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From cat29076 at gmail.com Mon Jan 9 04:57:21 2012
From: cat29076 at gmail.com (leroi cat)
Date: Mon, 9 Jan 2012 13:57:21 +0100
Subject: [Live-devel] Fwd: testOnDemandRTSPServer
Message-ID:

---------- Forwarded message ----------
From: leroi cat
Date: 2012/1/9
Subject: testOnDemandRTSPServer
To: live-devel-request at lists.live555.com, finlayson at live555.com, live-devel at lists.live555.com

Hi,

I run the server with the command ./testOnDemandRTSPServer, but using the request rtsp://192.168.1.3:8554/mpeg2TransportStreamTest I can read only one stream, "test.ts". The problem is that I want to read multiple "*.ts" streams within the same runtime. How can I distinguish them? And what is the query that I must use to read different streams, for example test.ts and test1.ts?

Thanks!!

From finlayson at live555.com Mon Jan 9 16:53:55 2012
From: finlayson at live555.com (Ross Finlayson)
Date: Mon, 9 Jan 2012 16:53:55 -0800
Subject: Re: [Live-devel] testOnDemandRTSPServer
In-Reply-To: References:
Message-ID: <62419E80-9EC5-43A1-A7B1-7976AC7AD89C@live555.com>

> I run the server with the command ./testOnDemandRTSPServer, but using the request rtsp://192.168.1.3:8554/mpeg2TransportStreamTest I can read only one stream, "test.ts".
> The problem is that I want to read multiple "*.ts" streams within the same runtime. How can I distinguish them?

Very easily. Note the following code in "testOnDemandRTSPServer.cpp":

  // A MPEG-2 Transport Stream:
  {
    char const* streamName = "mpeg2TransportStreamTest";
    char const* inputFileName = "test.ts";
    char const* indexFileName = "test.tsx";
    ServerMediaSession* sms
      = ServerMediaSession::createNew(*env, streamName, streamName, descriptionString);
    sms->addSubsession(MPEG2TransportFileServerMediaSubsession
                       ::createNew(*env, inputFileName, indexFileName, reuseFirstSource));
    rtspServer->addServerMediaSession(sms);
    announceStream(rtspServer, sms, streamName, inputFileName);
  }

If you want to support streaming some other Transport Stream file - e.g., named "test1.ts" - then just use the same code, but with:

  char const* streamName = "mpeg2TransportStreamTest1";
  char const* inputFileName = "test1.ts";
  char const* indexFileName = "test1.tsx";

> And what is the query that I must use to read different streams, for example test.ts and test1.ts?
The "rtsp://" URL will be different - depending on the "streamName" string. E.g., using the above example, the URL will be rtsp://:/mpeg2TransportStreamTest for the original stream, and rtsp://:/mpeg2TransportStreamTest1 for the new stream. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jshanab at smartwire.com Tue Jan 10 15:08:26 2012 From: jshanab at smartwire.com (Jeff Shanab) Date: Tue, 10 Jan 2012 23:08:26 +0000 Subject: [Live-devel] Non Blocking socket returns WSAEWOULDBLOCK Message-ID: <615FD77639372542BF647F5EBAA2DBC20B1B13A6@IL-BOL-EXCH01.smartwire.com> I started having problems connecting to some RTSP streams and traced it to line 400 in the RTSPClient 00399 if (connect(socketNum, (struct sockaddr*) &remoteName, sizeof remoteName) != 0) { 00400 if (envir().getErrno() == EINPROGRESS) { 00401 // The connection is pending; we'll need to handle it later. Wait for our socket to be 'writable', or have an exception. 00402 envir().taskScheduler().setBackgroundHandling(socketNum, SOCKET_WRITABLE|SOCKET_EXCEPTION, 00403 (TaskScheduler::BackgroundHandlerProc*)&connectionHandler, this); 00404 return 0; 00405 } 00406 envir().setResultErrMsg("connect() failed: "); 00407 if (fVerbosityLevel >= 1) envir() << "..." << envir().getResultMsg() << "\n"; 00408 return -1; 00409 } My windows API started returning WSAEWOULDBLOCK which according to MSDN is fine to be rescheduled for later WSAEWOULDBLOCK = 10035 Resource temporarily unavailable. This error is returned from operations on nonblocking sockets that cannot be completed immediately, for example recv when no data is queued to be read from the socket. It is a nonfatal error, and the operation should be retried later. It is normal for WSAEWOULDBLOCK to be reported as the result from calling connect on a nonblocking SOCK_STREAM socket, since some time must elapse for the connection to be established. Should I change the if clause on line 400? Or did I somehow mess up the sockets on my computer. I have rebooted, there was a windows update this am. I am on Win7 64bit but running a win32 app. I noticed the setupStreams enables non-blocking sockets by default -------------- next part -------------- An HTML attachment was scrubbed... URL: From jshanab at smartwire.com Tue Jan 10 15:16:20 2012 From: jshanab at smartwire.com (Jeff Shanab) Date: Tue, 10 Jan 2012 23:16:20 +0000 Subject: [Live-devel] Non Blocking socket returns WSAEWOULDBLOCK (addendum) Message-ID: <615FD77639372542BF647F5EBAA2DBC20B1B13D2@IL-BOL-EXCH01.smartwire.com> Changing code to: ... int myerrno = envir().getErrno(); envir() << "socket ERROR = " << myerrno << "\n"; if (myerrno == EINPROGRESS || myerrno == WSAEWOULDBLOCK) { ... Immediately restored operation -------------- next part -------------- An HTML attachment was scrubbed... URL: From 6.45.vapuru at gmail.com Mon Jan 9 23:57:09 2012 From: 6.45.vapuru at gmail.com (Novalis Vapuru) Date: Tue, 10 Jan 2012 09:57:09 +0200 Subject: [Live-devel] Parser for sprop-parameter-sets at desribe response to get width- height... In-Reply-To: <27CD841A-249A-4A45-A740-3E98B952D734@live555.com> References: <27CD841A-249A-4A45-A740-3E98B952D734@live555.com> Message-ID: Well, actually i do not decode h264 stream but just need width, height info... For h264 stream, here is simple buggy parser which gets width height from fSpropParameterSets... Maybe someone also need a starting point for parsing those data... 
I started having problems connecting to some RTSP streams and traced it to line 400 in the RTSPClient:

  00399   if (connect(socketNum, (struct sockaddr*) &remoteName, sizeof remoteName) != 0) {
  00400     if (envir().getErrno() == EINPROGRESS) {
  00401       // The connection is pending; we'll need to handle it later. Wait for our socket to be 'writable', or have an exception.
  00402       envir().taskScheduler().setBackgroundHandling(socketNum, SOCKET_WRITABLE|SOCKET_EXCEPTION,
  00403           (TaskScheduler::BackgroundHandlerProc*)&connectionHandler, this);
  00404       return 0;
  00405     }
  00406     envir().setResultErrMsg("connect() failed: ");
  00407     if (fVerbosityLevel >= 1) envir() << "..." << envir().getResultMsg() << "\n";
  00408     return -1;
  00409   }

My Windows API started returning WSAEWOULDBLOCK, which according to MSDN is fine to be rescheduled for later:

  WSAEWOULDBLOCK = 10035: Resource temporarily unavailable. This error is returned from operations on nonblocking sockets that cannot be completed immediately, for example recv when no data is queued to be read from the socket. It is a nonfatal error, and the operation should be retried later. It is normal for WSAEWOULDBLOCK to be reported as the result from calling connect on a nonblocking SOCK_STREAM socket, since some time must elapse for the connection to be established.

Should I change the if clause on line 400? Or did I somehow mess up the sockets on my computer? I have rebooted; there was a Windows update this a.m. I am on Win7 64-bit but running a Win32 app. I noticed that setupStreams enables non-blocking sockets by default.

From jshanab at smartwire.com Tue Jan 10 15:16:20 2012
From: jshanab at smartwire.com (Jeff Shanab)
Date: Tue, 10 Jan 2012 23:16:20 +0000
Subject: [Live-devel] Non Blocking socket returns WSAEWOULDBLOCK (addendum)
Message-ID: <615FD77639372542BF647F5EBAA2DBC20B1B13D2@IL-BOL-EXCH01.smartwire.com>

Changing the code to:

  ...
  int myerrno = envir().getErrno();
  envir() << "socket ERROR = " << myerrno << "\n";
  if (myerrno == EINPROGRESS || myerrno == WSAEWOULDBLOCK) {
  ...

immediately restored operation.

From 6.45.vapuru at gmail.com Mon Jan 9 23:57:09 2012
From: 6.45.vapuru at gmail.com (Novalis Vapuru)
Date: Tue, 10 Jan 2012 09:57:09 +0200
Subject: Re: [Live-devel] Parser for sprop-parameter-sets at describe response to get width-height...
In-Reply-To: <27CD841A-249A-4A45-A740-3E98B952D734@live555.com>
References: <27CD841A-249A-4A45-A740-3E98B952D734@live555.com>
Message-ID:

Well, actually I do not decode the h264 stream; I just need the width and height info... For an h264 stream, here is a simple, buggy parser which gets the width and height from fSpropParameterSets... Maybe someone else also needs a starting point for parsing this data...

Best Wishes

How to use: if, in the describe response, you get those parameters from scs.subsession->fmtp_spropparametersets(), just use this simple, buggy parser like this:

  H264SPropParameterSetParser parser(scs.subsession->fmtp_spropparametersets());
  int w = parser.GetWidth();
  int h = parser.GetHeight();

Here is the code; there are two classes, BitStreamReader and H264SPropParameterSetParser:

  // H264SPropParameterSetParser
  #pragma once
  #ifndef _H264_SPROP_PARAMETER_SET_PARSER_
  #define _H264_SPROP_PARAMETER_SET_PARSER_
  #include <string>
  #include <cctype>    // isalnum
  #include <algorithm> // std::copy
  #include "BitStreamReader.h"
  using namespace std;

  static const std::string base64_chars =
      "ABCDEFGHIJKLMNOPQRSTUVWXYZ"
      "abcdefghijklmnopqrstuvwxyz"
      "0123456789+/";

  class H264SPropParameterSetParser {
  public:
    H264SPropParameterSetParser(string spsFromSdp) {
      this->spsFromSdp = spsFromSdp;
      height = 0;
      width = 0;
      StartParsing();
    }

    int GetWidth() {
      int width = (pic_width_in_mbs_minus1 + 1) * 16;
      return width;
    }

    int GetHeight() {
      int height = (pic_height_in_map_units_minus1 + 1) * 16;
      return height;
    }

    ~H264SPropParameterSetParser(void) {
      delete bitStreamReader;
      bitStreamReader = NULL;
    }

  private:
    void StartParsing() {
      ConvertFromBase64IntoByteArray();
      ParseSequenceParameterSet();
    }

    void ConvertFromBase64IntoByteArray() {
      string decodedString = DecodeBase64(spsFromSdp);
      binaryData = new unsigned char[decodedString.size()];
      std::copy(decodedString.begin(), decodedString.end(), binaryData);
    }

    void ParseSequenceParameterSet() {
      unsigned __int32 temp;
      bitStreamReader = new BitStreamReader(binaryData);
      bitStreamReader->U(8); // skip the NAL unit type byte
      profile_idc = bitStreamReader->U(8);
      constraint_set0_flag = bitStreamReader->U(1);
      constraint_set1_flag = bitStreamReader->U(1);
      constraint_set2_flag = bitStreamReader->U(1);
      constraint_set3_flag = bitStreamReader->U(1);
      reserved_zero_4bits = bitStreamReader->U(4);
      level_idc = bitStreamReader->U(8);
      seq_parameter_set_id = bitStreamReader->Uev();
      if (profile_idc == 100 || profile_idc == 110 || profile_idc == 122 || profile_idc == 144) {
        chroma_format_idc = bitStreamReader->Uev();
        if (chroma_format_idc == 3) {
          separate_colour_plane_flag = bitStreamReader->U(1);
        }
        bit_depth_luma_minus8 = bitStreamReader->Uev();
        bit_depth_chroma_minus8 = bitStreamReader->Uev();
        qpprime_y_zero_transform_bypass_flag = bitStreamReader->U(1);
        seq_scaling_matrix_present_flag = bitStreamReader->U(1);
        if (seq_scaling_matrix_present_flag) {
          for (unsigned int ix = 0; ix < 8; ix++) {
            temp = bitStreamReader->U(1);
            if (temp) {
              ScalingList(ix, ix < 6 ? 16 : 64);
            }
          }
        }
      }
      log2_max_frame_num_minus4 = bitStreamReader->Uev();
      pic_order_cnt_type = bitStreamReader->Uev();
      if (pic_order_cnt_type == 0) {
        log2_max_pic_order_cnt_lsb_minus4 = bitStreamReader->Uev();
      } else if (pic_order_cnt_type == 1) {
        delta_pic_order_always_zero_flag = bitStreamReader->U(1);
        offset_for_non_ref_pic = bitStreamReader->Sev();
        offset_for_top_to_bottom_field = bitStreamReader->Sev();
        num_ref_frames_in_pic_order_cnt_cycle = bitStreamReader->Uev();
        for (int i = 0; i < num_ref_frames_in_pic_order_cnt_cycle; i++) {
          int skippedParameter = bitStreamReader->Sev();
        }
      }
      num_ref_frames = bitStreamReader->Uev();
      gaps_in_frame_num_value_allowed_flag = bitStreamReader->U(1);
      pic_width_in_mbs_minus1 = bitStreamReader->Uev();
      pic_height_in_map_units_minus1 = bitStreamReader->Uev();
      frame_mbs_only_flag = bitStreamReader->U(1);
      if (!frame_mbs_only_flag) {
        mb_adaptive_frame_field_flag = bitStreamReader->U(1);
      }
      direct_8x8_inference_flag = bitStreamReader->U(1);
      frame_cropping_flag = bitStreamReader->U(1);
      if (frame_cropping_flag) {
        frame_crop_left_offset = bitStreamReader->Uev();
        frame_crop_right_offset = bitStreamReader->Uev();
        frame_crop_top_offset = bitStreamReader->Uev();
        frame_crop_bottom_offset = bitStreamReader->Uev();
      }
      vui_parameters_present_flag = bitStreamReader->U(1);
    }

    // Utility: read (and discard) a scaling list
    void ScalingList(unsigned int ix, unsigned int sizeOfScalingList) {
      unsigned int lastScale = 8;
      unsigned int nextScale = 8;
      unsigned int jx;
      int deltaScale;
      for (jx = 0; jx < sizeOfScalingList; jx++) {
        if (nextScale != 0) {
          deltaScale = bitStreamReader->Sev();
          nextScale = (lastScale + deltaScale + 256) % 256;
        }
        lastScale = (nextScale == 0) ? lastScale : nextScale;
      }
    }

    std::string DecodeBase64(std::string const& encodedString) {
      int inlen = encodedString.size();
      int i = 0;
      int j = 0;
      int in = 0;
      unsigned char charArray4[4], charArray3[3];
      std::string ret;
      while (inlen-- && (encodedString[in] != '=') && IsBase64(encodedString[in])) {
        charArray4[i++] = encodedString[in];
        in++;
        if (i == 4) {
          for (i = 0; i < 4; i++) charArray4[i] = base64_chars.find(charArray4[i]);
          charArray3[0] = (charArray4[0] << 2) + ((charArray4[1] & 0x30) >> 4);
          charArray3[1] = ((charArray4[1] & 0xf) << 4) + ((charArray4[2] & 0x3c) >> 2);
          charArray3[2] = ((charArray4[2] & 0x3) << 6) + charArray4[3];
          for (i = 0; i < 3; i++) ret += charArray3[i];
          i = 0;
        }
      }
      if (i) {
        for (j = i; j < 4; j++) charArray4[j] = 0;
        for (j = 0; j < 4; j++) charArray4[j] = base64_chars.find(charArray4[j]);
        charArray3[0] = (charArray4[0] << 2) + ((charArray4[1] & 0x30) >> 4);
        charArray3[1] = ((charArray4[1] & 0xf) << 4) + ((charArray4[2] & 0x3c) >> 2);
        charArray3[2] = ((charArray4[2] & 0x3) << 6) + charArray4[3];
        for (j = 0; j < i - 1; j++) ret += charArray3[j];
      }
      return ret;
    }

    bool IsBase64(unsigned char c) {
      return (isalnum(c) || (c == '+') || (c == '/'));
    }

    // Data to parse
    unsigned char* binaryData;
    // Parameter from the DESCRIBE RTSP response: a Base64 string
    string spsFromSdp;
    // Utility to read bits easily
    BitStreamReader* bitStreamReader;
    // Parameters
    int profile_idc;
    int constraint_set0_flag;
    int constraint_set1_flag;
    int constraint_set2_flag;
    int constraint_set3_flag;
    int reserved_zero_4bits;
    int level_idc;
    int seq_parameter_set_id;
    int chroma_format_idc;
    int separate_colour_plane_flag;
    int bit_depth_luma_minus8;
    int bit_depth_chroma_minus8;
    int qpprime_y_zero_transform_bypass_flag;
    int seq_scaling_matrix_present_flag;
    int log2_max_frame_num_minus4;
    int pic_order_cnt_type;
    int log2_max_pic_order_cnt_lsb_minus4;
    int delta_pic_order_always_zero_flag;
    int offset_for_non_ref_pic;
    int offset_for_top_to_bottom_field;
    int num_ref_frames_in_pic_order_cnt_cycle;
    int num_ref_frames;
    int gaps_in_frame_num_value_allowed_flag;
    unsigned __int32 pic_width_in_mbs_minus1;
    unsigned __int32 pic_height_in_map_units_minus1;
    int frame_mbs_only_flag;
    int mb_adaptive_frame_field_flag;
    int direct_8x8_inference_flag;
    int frame_cropping_flag;
    int frame_crop_left_offset;
    int frame_crop_right_offset;
    int frame_crop_top_offset;
    int frame_crop_bottom_offset;
    int vui_parameters_present_flag;
    int height;
    int width;
  };
  #endif

Then:

  // BitStreamReader
  #pragma once
  #ifndef __BIT_STREAM_READER__
  #define __BIT_STREAM_READER__
  #include <string>

  class BitStreamReader {
  public:
    BitStreamReader(unsigned char* dataToRead) {
      position = 0;
      binaryData = dataToRead;
    }

    ~BitStreamReader(void) {
      delete[] binaryData;
      binaryData = NULL;
    }

    void SkipBits(int n) {
      for (int i = 0; i < n; i++) { SkipBit(); }
    }

    int GetBits(int n) { // same as U(int n)
      int result = 0;
      for (int i = 0; i < n; i++) { result = result * 2 + GetBit(); }
      return result;
    }

    int U(int n) {
      int result = 0;
      for (int i = 0; i < n; i++) { result = result * 2 + GetBit(); }
      return result;
    }

    int Uev() { return Ev(false); }
    int Sev() { return Ev(true); }

  private:
    int GetBit() {
      int mask = 1 << (7 - (position & 7));
      int index = position >> 3;
      position++;
      return ((binaryData[index] & mask) == 0) ? 0 : 1;
    }

    void SkipBit() { position++; }

    int Ev(bool isItSigned) {
      int bitCount = 0;
      std::string expGolomb;
      while (GetBit() == 0) {
        expGolomb += '0';
        bitCount++;
      }
      expGolomb += "/1";
      int result = 1;
      for (int i = 0; i < bitCount; i++) {
        int b = GetBit();
        expGolomb += b;
        result = result * 2 + b;
      }
      result--;
      if (isItSigned) {
        result = (result + 1) / 2 * (result % 2 == 0 ? -1 : 1);
      }
      return result;
    }

    unsigned char* binaryData;
    int position;
  };
  #endif
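One caveat about the GetWidth()/GetHeight() above: the (minus1 + 1) * 16 arithmetic ignores the frame_cropping offsets and the frame_mbs_only_flag that ParseSequenceParameterSet() already reads, so it over-reports dimensions that are not multiples of 16 (e.g. 1080-line video). The standard correction, assuming 4:2:0 chroma (H.264 spec, clause 7.4.2.1):

  int width  = (pic_width_in_mbs_minus1 + 1) * 16;
  int height = (2 - frame_mbs_only_flag) * (pic_height_in_map_units_minus1 + 1) * 16;
  if (frame_cropping_flag) {
    int cropUnitX = 2;                             // SubWidthC for 4:2:0
    int cropUnitY = 2 * (2 - frame_mbs_only_flag); // SubHeightC * (2 - frame_mbs_only_flag)
    width  -= (frame_crop_left_offset + frame_crop_right_offset) * cropUnitX;
    height -= (frame_crop_top_offset  + frame_crop_bottom_offset) * cropUnitY;
  }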
From finlayson at live555.com Tue Jan 10 15:54:18 2012
From: finlayson at live555.com (Ross Finlayson)
Date: Tue, 10 Jan 2012 15:54:18 -0800
Subject: Re: [Live-devel] Non Blocking socket returns WSAEWOULDBLOCK (addendum)
In-Reply-To: <615FD77639372542BF647F5EBAA2DBC20B1B13D2@IL-BOL-EXCH01.smartwire.com>
References: <615FD77639372542BF647F5EBAA2DBC20B1B13D2@IL-BOL-EXCH01.smartwire.com>
Message-ID: <6229701E-1739-49EE-9FB1-09C5160A7E57@live555.com>

> Changing the code to:
> ...
> int myerrno = envir().getErrno();
> envir() << "socket ERROR = " << myerrno << "\n";
> if (myerrno == EINPROGRESS || myerrno == WSAEWOULDBLOCK) {
> ...
>
> immediately restored operation.

I'll make this change (although with "EWOULDBLOCK" - the platform-independent definition) in the next release of the software.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From finlayson at live555.com Tue Jan 10 17:00:27 2012
From: finlayson at live555.com (Ross Finlayson)
Date: Tue, 10 Jan 2012 17:00:27 -0800
Subject: Re: [Live-devel] Parser for sprop-parameter-sets at describe response to get width-height...
In-Reply-To: References: <27CD841A-249A-4A45-A740-3E98B952D734@live555.com>
Message-ID:

> Well, actually I do not decode the h264 stream; I just need the width and height info...
>
> For an h264 stream, here is a simple, buggy parser which gets the width and height
> from fSpropParameterSets...
> Maybe someone else also needs a starting point for parsing this data...

Note that we already provide a function: parseSPropParameterSets() (defined in "liveMedia/include/H264VideoRTPSource.hh") that parses the "MediaSubsession::fmtp_spropparametersets()" string, Base64-decodes it, etc. - returning an array of NAL units (in binary form). So you don't need to do the string-to-binary conversion. Instead, you need to focus on parsing the NAL unit binaries.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
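To make that concrete, a sketch of the call (the function and the SPropRecord type are real - see "H264VideoRTPSource.hh" - but the surrounding loop and the 'subsession' variable are illustrative):

  #include "H264VideoRTPSource.hh" // for parseSPropParameterSets() and SPropRecord

  unsigned numSPropRecords;
  SPropRecord* sPropRecords
    = parseSPropParameterSets(subsession->fmtp_spropparametersets(), numSPropRecords);
  for (unsigned i = 0; i < numSPropRecords; ++i) {
    // Each record is one NAL unit (typically the SPS, then the PPS), already
    // Base64-decoded: sPropRecords[i].sPropBytes, sPropRecords[i].sPropLength
  }
  delete[] sPropRecords;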
From nmenne at dcscorp.com Wed Jan 11 14:53:40 2012
From: nmenne at dcscorp.com (Menne, Neil)
Date: Wed, 11 Jan 2012 17:53:40 -0500
Subject: [Live-devel] "no frame!"
Message-ID:

I'm working on my Live555/FFmpeg video player, and I ran into an interesting problem that has kept me stumped for several days. I am taking the buffer that is delivered to my MediaSink (like the example in testRTSPClient), and I am passing the buffer and the size to FFmpeg to decode. It says that there is "no frame!" I'm stumped as to why that is the case when I'm taking the buffer after the "afterGettingFrame" function is called. I was wondering if there was something else that must be done to that buffer for it to be a true frame that can be decoded.

My first guess is that the decoder needs more information, which brings me to my next question: the SDP description that I'm pulling down doesn't contain the width/height, so I'm guessing I need to pull that out of the sprop_parameter_sets; is this the case?

My second guess is that there is no frame separator; I noticed that in one particular file, H264VideoFileSink, they prepend a 0x00000001 along with the sprop_parameters. Is this a potential problem?

Video is not my expertise, so I'm spending most days reading anything/everything on these problems, but this one's got me stuck.

-Neil

P.S. My video is an H.264 video stream

From steve at stevemcfarlin.com Wed Jan 11 16:31:01 2012
From: steve at stevemcfarlin.com (Steve McFarlin)
Date: Wed, 11 Jan 2012 16:31:01 -0800
Subject: Re: [Live-devel] "no frame!"
In-Reply-To: References:
Message-ID: <217F3E4F-B18F-4926-87A2-CBDD0C09A2D9@stevemcfarlin.com>

It sounds like you're not setting up FFmpeg properly. The SDP will contain the SPS and PPS in the sprop-parameter-sets. You need to decode these (Base64 encoded) and place them in the extradata field of the AVCodecContext. Additionally, every NALU passed to FFmpeg must be in Annex B format. That is, it must be prefixed with the NALU start code of 0x00000001. This start code needs to be in network byte order.

If width and height information is not specified in the SDP, you can get this from the SPS. See ISO/IEC 14496-10, section 7.3.2.1. If I recall correctly, Live555 has a method to inspect the SPS. It might be able to be used to dig this information out of the SPS. Sorry, my Live555 knowledge is starting to get fuzzy.

You might check out the dropcam sources on github (iPhone project). It uses Live555 and FFmpeg for decoding H264.

Thanks,
Steve
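A sketch of the extradata step Steve describes, using the records that liveMedia's parseSPropParameterSets() returns (shown earlier in this digest); the Annex-B packing and the helper name are my own, and the FFmpeg calls are the circa-2012 API:

  #include <string.h>              // memcpy
  extern "C" {
  #include <libavcodec/avcodec.h>  // av_mallocz, FF_INPUT_BUFFER_PADDING_SIZE
  }
  #include "H264VideoRTPSource.hh" // SPropRecord

  // Build Annex-B extradata from the SPS/PPS records and attach it to the context:
  static void setH264Extradata(AVCodecContext* ctx, SPropRecord* recs, unsigned numRecs) {
    unsigned size = 0;
    for (unsigned i = 0; i < numRecs; ++i) size += 4 + recs[i].sPropLength;
    uint8_t* extra = (uint8_t*)av_mallocz(size + FF_INPUT_BUFFER_PADDING_SIZE);
    unsigned off = 0;
    for (unsigned i = 0; i < numRecs; ++i) {
      extra[off + 3] = 1; off += 4; // 00 00 00 01 start code (buffer is pre-zeroed)
      memcpy(extra + off, recs[i].sPropBytes, recs[i].sPropLength);
      off += recs[i].sPropLength;
    }
    ctx->extradata = extra;
    ctx->extradata_size = (int)size;
  }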
From av at bsbc.nb.ca Wed Jan 11 08:30:10 2012
From: av at bsbc.nb.ca (Anthony Brown)
Date: Wed, 11 Jan 2012 12:30:10 -0400
Subject: [Live-devel] Trick Play and .mkv / .webm files
Message-ID: <4F0DB912.109@bsbc.nb.ca>

Hi there:

I'm trying to use live media server to stream out video, captured live from an HD camera, but streamed on demand starting at an arbitrary point. My plan is to use dvgrab (in Linux) to grab the FireWire HDV stream, pipe it into ffmpeg, and transcode to a file that can be streamed by live555. Then I can use a trick-play command from my streaming client (VLC) to seek to the desired point in the stream. This involves streaming from a file that is still being written, but this seems to work. At least it works with MPEG PS files, except for the fact that there is an A/V sync issue after the seek operation.

I'm having a problem when trying to trick play from mkv files produced by ffmpeg, in that I cannot seek in them. I don't seem to be able to play back webm files at all (at least, VLC doesn't like them).

Are there any restrictions on which mkv/webm files support trick play? Any special settings needed in ffmpeg when generating them? Or any chance that the A/V sync problem with seeking in PS files will get resolved?

Anthony
--
Anthony Brown
Audiovisual coordinator
Brunswick Street Baptist Church
Telephone: (506)-458-8348 (leave message)
Email: av at bsbc.nb.ca

From finlayson at live555.com Wed Jan 11 16:44:14 2012
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 11 Jan 2012 16:44:14 -0800
Subject: Re: [Live-devel] "no frame!"
In-Reply-To: References:
Message-ID:

> I'm working on my Live555/FFmpeg video player, and I ran into an interesting problem that has kept me stumped for several days. I am taking the buffer that is delivered to my MediaSink (like the example in testRTSPClient), and I am passing the buffer and the size to FFmpeg to decode. It says that there is "no frame!" I'm stumped as to why that is the case when I'm taking the buffer after the "afterGettingFrame" function is called. I was wondering if there was something else that must be done to that buffer for it to be a true frame that can be decoded.

Well, remember that you have complete source code - for "ffmpeg", as well as our code. If necessary, you should be able to find out why "ffmpeg" is complaining.

> My first guess is that the decoder needs more information, which brings me to my next question: the SDP description that I'm pulling down doesn't contain the width/height, so I'm guessing I need to pull that out of the sprop_parameter_sets; is this the case?

Yes. You should take the string (from "MediaSubsession::fmtp_spropparametersets()"), and parse this string into a set of SPS and PPS NAL units, using the function "parseSPropParameterSets()". You should then insert these NAL units into your decoder (before the NAL units that come from the RTP stream).

> My second guess is that there is no frame separator; I noticed that in one particular file, H264VideoFileSink, they prepend a 0x00000001 along with the sprop_parameters. Is this a potential problem?

Perhaps, depending upon how your decoder works. Perhaps your decoder needs to see a 'start code' (0x00000001) before each NAL unit. If so, then you'll need to prepend it to each NAL unit before you feed it to your decoder.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From jshanab at smartwire.com Wed Jan 11 15:35:44 2012
From: jshanab at smartwire.com (Jeff Shanab)
Date: Wed, 11 Jan 2012 23:35:44 +0000
Subject: Re: [Live-devel] "no frame!"
In-Reply-To: References:
Message-ID: <615FD77639372542BF647F5EBAA2DBC20B1B1A2F@IL-BOL-EXCH01.smartwire.com>

I have been working with avcodec_decode_video2 for a bit. Where [7] is an SPS, [8] is a PPS, [5] is a keyframe slice, and [1]'s are the difference frames, I have found it needs a stream like this:

  00 00 01 [7] 00 00 01 [8] 00 00 01 [5] 00 00 01 [1] 00 00 01 [1] ...

If you are calling it in a loop, it will report no error and no frame until it has enough info.
This often means you get the output frame with a delay of one frame, and must keep calling until it is done outputting frames. I personally aggregate the [7], [8] and [5] NAL units into a keyframe and pass the whole thing to avcodec_decode_video2. It can survive without the [7]s and [8]s after it has got them at least once, but a bad packet here or there disrupts the video for a while, so I cache them and insert them if the stream does not carry them. It is a small price to pay in bandwidth that gives me a faster first frame out in VLC or our own player.

So, assuming "[n]" means "prefix each n with 00 00 01", this will work:

  [7][8][5][1][1][1][1][1][5][1][1][1][1].....

But I prefer:

  [7][8][5][1][1][1][1][1][7][8][5][1][1][1][1].....

Also note that at high resolution it may be:

  [7][8][5][5][5][1][1][1][1][1]...

Multiple slices make up huge keyframes; for these you may have to call the decoder multiple times before you get a frame. This does not touch on one more reason to call the decode in a loop until it says it is done: frames may arrive in a different order than they are displayed with advanced high-resolution H264 profiles (and even MPEG4). The bi-directional predictive frames must arrive out of order, so the decode order differs from the display order.
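A sketch of what Jeff describes: feeding one start-code-prefixed NAL unit at a time to libavcodec and tolerating the "no frame yet" case. The 'nalu'/'naluSize' inputs and the already-opened 'ctx'/'frame' objects are assumed; the FFmpeg calls are the circa-2012 API:

  #include <vector>
  extern "C" {
  #include <libavcodec/avcodec.h>
  }

  // Prefix the NAL unit with the Annex-B start code, then decode:
  std::vector<uint8_t> annexB;
  uint8_t const startCode[4] = { 0x00, 0x00, 0x00, 0x01 };
  annexB.insert(annexB.end(), startCode, startCode + 4);
  annexB.insert(annexB.end(), nalu, nalu + naluSize);

  AVPacket pkt;
  av_init_packet(&pkt);
  pkt.data = &annexB[0];
  pkt.size = (int)annexB.size();

  int gotPicture = 0;
  if (avcodec_decode_video2(ctx, frame, &gotPicture, &pkt) >= 0 && gotPicture) {
    // 'frame' now holds a decoded picture (often one frame behind the
    // input, as Jeff notes); render or convert it here.
  }
  // gotPicture == 0 is normal until the decoder has seen SPS, PPS and a keyframe.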
From Jeremiah.Morrill at econnect.tv Wed Jan 11 16:50:30 2012
From: Jeremiah.Morrill at econnect.tv (Jer Morrill)
Date: Thu, 12 Jan 2012 00:50:30 +0000
Subject: Re: [Live-devel] "no frame!"
In-Reply-To: References:
Message-ID: <80C795F72B3CB241A9256DABF0A04EC5FA2E87@CH1PRD0702MB107.namprd07.prod.outlook.com>

I just implemented a custom H264 source and decoder with FFMPEG. Here's how I did it (in concept):

1.) Get the subsession->fmtp_configuration() string
2.) Do something like this, with H264VideoRTPSource.hh included: "auto records = parseSPropParameterSets(fmtp_configuration, recordCount);"
3.) For each of the returned records from step 2, prepend a 0x00000001
4.) For each sample you send to FFMPEG, prepend each of your records from step 3 to your sample buffer

I'm not sure if I'm telling you incorrect information, but this worked for me!

-Jer

From finlayson at live555.com Wed Jan 11 16:59:48 2012
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 11 Jan 2012 16:59:48 -0800
Subject: Re: [Live-devel] Trick Play and .mkv / .webm files
In-Reply-To: <4F0DB912.109@bsbc.nb.ca>
References: <4F0DB912.109@bsbc.nb.ca>
Message-ID: <5017991B-35D9-4B84-9B2A-A75328995C29@live555.com>

> I'm having a problem when trying to trick play from mkv files produced by ffmpeg, in that I cannot seek in them. I don't seem to be able to play back webm files at all (at least, VLC doesn't like them).

OK, so your first task should be to find out (perhaps using a VLC mailing list) why VLC can't play your ".webm" files. If there's something wrong with these files such that VLC cannot play them, then it's also likely that our server will be unable to stream them.

> Are there any restrictions on which mkv/webm files support trick play?

Yes, they need to have "Cue Points". (And seeking will be supported to the points in the file that are specified by these Cue Points.)

> Any special settings needed in ffmpeg when generating them?

I can't help you there. (This is not a "ffmpeg" mailing list :-)

> Or any chance that the A/V sync problem with seeking in PS files will get resolved?

That's a possibility, but fixing this is a low priority, so it's unlikely to happen anytime soon. (MPEG Program Stream files don't get very much use these days...)

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From r63400 at gmail.com Wed Jan 11 10:08:15 2012
From: r63400 at gmail.com (Ricardo Acosta)
Date: Wed, 11 Jan 2012 19:08:15 +0100
Subject: [Live-devel] RTCP Receiver Reports use
Message-ID:

Hi Ross

We have implemented a sender and a receiver for MPEG2TS using Livemedia. I found some old posts from 2004 talking about Receiver Reports:

"Note, however, that data from RTCP "RR" (Receiver Report) packets (i.e., coming from receivers back to the sender) are currently not processed at all - except to record the identity (SSRC) of the receiver. (This allows the sender to call the "RTCPInstance::numMembers()" member function to get a count of the number of receivers, for example.)"

Right now I send (and receive on the client side) RTCP sender reports over the port+1 port, but I am not able to see the receiver report messages. I tried with some other tests, such as testMP3Streamer and testMP3Receiver, and I don't receive RTCP receiver reports.

My questions are:

1. Can you please let me know what I am missing to activate RRs?
2. Does livemedia now have more implementations using the receiver reports?
3. In which test can we see some examples of using information from the sender reports?

Thank you in advance

Ricardo Leon
Aserticom

From finlayson at live555.com Wed Jan 11 20:08:08 2012
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 11 Jan 2012 20:08:08 -0800
Subject: Re: [Live-devel] RTCP Receiver Reports use
In-Reply-To: References:
Message-ID: <8EAAB55A-5609-4C23-B417-A2E773686725@live555.com>
I found some old posts from 2004 talking about Receiver reports. "Note, however, that data from RTCP "RR" (Receiver Report) packets (i.e., coming from receivers back to the sender) are currently not processed at all - except to record the identity (SSRC) of the receiver. (This allows the sender to call the "RTCPInstance::numMembers()" member function to get a count of the number of receivers, for example.)" Right now I send (and receive on the client side) RTCP sender reports over the port+1 port, but I am not able to see the Sender Receiver messages. I tried with some other test as testMP3Streamer and testMP3receiver, and I dont receive RTCP receive reports. My questions are 1. Can you please let me know what do I missing to activate RR ? 2. Does now livemedia has more implementations using the receiving report? 3. In which test can we see some examples about using information from sender report. Thank you in advance Ricardo Leon Aserticom -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Jan 11 20:08:08 2012 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 11 Jan 2012 20:08:08 -0800 Subject: [Live-devel] RTCP Receiver Reports use In-Reply-To: References: Message-ID: <8EAAB55A-5609-4C23-B417-A2E773686725@live555.com> > We have implemented a sender and a receiver for MPEG2TS using Livemedia. > Do 'we' not have our own domain name? :-) > I found some old posts from 2004 talking about Receiver reports. > You realize, I hope, that 2004 was 8 years ago :-) We've had full support for RTCP "RR" and "SR" packets for many years now. > Right now I send (and receive on the client side) RTCP sender reports over the port+1 port, but I am not able to see the Sender Receiver messages. > You may be confused by the difference between RTCP "SR" (Sender Report) packets and RTCP "RR" (Reception Report) packets. We implement (the sending and receiving) of both kinds of RTCP packet. Note, however, that a RTP server (transmitter) sends RTCP "SR" packets (and receives RTCP "RR" packets from receiver(s)). A RTP client (receiver) sends RTCP "RR" packets (and receives RTCP "SR" packets from the server). But anyway, the important thing to note is that these packets get sent and received automatically (by the "RTCPInstance" objects). You don't need to do anything special to implement them. > I tried with some other test as testMP3Streamer and testMP3receiver, and I dont receive RTCP receive reports. "testMP3Streamer" sends RTCP "SR" packets, and will receive RTCP "RR" reports from any (multicast-connected receivers). "testMP3Receiver" sends RTCP "RR" packets, and will receiver RTCP "SR" packets from "testMP3Streamer" (provided, of course, that multicast routing is enabled between the computers that run "testMP3Streamer" and "testMP3Receiver"). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From ferry at bertin.fr Wed Jan 11 23:47:55 2012 From: ferry at bertin.fr (Guillaume Ferry) Date: Thu, 12 Jan 2012 08:47:55 +0100 Subject: [Live-devel] "no frame!" In-Reply-To: References: Message-ID: When you mention ffmpeg, do you mean the application, or the underlying library (libav) ? I had the same problematic few years ago, and I found it can be handled nicely with a reduced amount of code around avcodec_decode_video. I suggest you to have a look on AVCodecParserContext for this matter. If you're still stuck, I can even provide you a code sample. 
Regards, Guillaume. Le Wed, 11 Jan 2012 23:53:40 +0100, Menne, Neil a ?crit: > > I?m working on my Live555/FFmpeg video player, and I ran into an > interesting problem that has kept me stumped for several days. I am > taking the buffer that is delivered to my MediaSink (like >the example > in testRTSPClient), and I am passing the buffer and the size to FFmpeg > to decode. It says that there is ?no frame!? I?m stumped as to why that > is the case when I?m taking the buffer >after the ?afterGettingFrame? > function is called. I was wondering if there was something else that > must be done to that buffer for it to be a true frame that can be > decoded. > > > My first guess is that the decoder needs more information which brings > me to my next question: the SDP description that I?m pulling down > doesn?t contain the width/height, so I?m guessing I >need to pull that > out of the sprop_parameter_sets; is this the case? > > > My second guess is that there is no frame separator; I noticed that in > one particular file: H264VideoFileSink they prepend a 0x00000001 along > with the sprop_parameters. Is this a potential >problem? > > > Video is not my expertise, so I?m spending most days reading > anything/everything on these problems, but this one?s got me stuck. > > > -Neil > > > P.S. My video is an H.264 video stream > > > > -- Guillaume FERRY Ing?nieur Traitement de l'Information et du Contenu Tel : 01 39 30 62 09 Fax : 01 39 30 62 45 ferry at bertin.fr http://www.bertin.fr -------------- next part -------------- An HTML attachment was scrubbed... URL: From anthony at haploid.fr Thu Jan 12 00:47:47 2012 From: anthony at haploid.fr (Anthony Nevo) Date: Thu, 12 Jan 2012 09:47:47 +0100 Subject: [Live-devel] Broadcasting live events with an iPhone using Live555 media libraries In-Reply-To: References: <0A4CD9FC-D64E-49C3-8174-1A40D7EB65E0@haploid.fr> Message-ID: <30DF1826-2E92-4989-ABB5-CA07A45DA27A@haploid.fr> Thanks very much Ross for your answer. I'm all for "pulling" data from the client but my main problem is that, because we'll be mainly on cellular networks, all clients will be behind a WISP and, thus, expose the same IP address to the internet world. As I suppose it's the server that has to initiate the connection, how can it knows from where to pull the stream if all mobile phones have the same IP address ? Thanks again for your help, Anthony Le 5 janv. 2012 ? 21:04, Ross Finlayson a ?crit : >> The architecture we are building is based on an Helix Media Server that will be used to broadcast the stream to PCs, phones, ... >> Then, on the iPhone part, we're building an app that will access the camera, and then maybe by using Live555 libraries, streaming it to the Helix Media Server (H264/AAC using RTP). A good example of what we are trying to achieve is the Livu app (http://stevemcfarlin.com/livu/index.html). >> >> I've compiled the Live555 libraries for the iPhone and I've checked the test programs (particularly testH264VideoStreamer or testMPEG4VideoToDarwin). TestMPEG4VideoToDarwin seems to be very close to what we want to do but I was wondering if someone had some advices on how to achieve this goal. >> Are we on the right path or not ? > > Yes, you probably are, provided that the protocol that the "Helix Media Server" uses to receive incoming data is the same protocol (or at least a similar protocol) as the one that's used by the "Darwin Streaming Server". 
>
> Personally, I dislike the idea of clients 'pushing' data into servers (see and ), but I recognize that there are legacy servers out there that support that model.
>
> Ross Finlayson
> Live Networks, Inc.
> http://www.live555.com/

From gsnetplayer at hotmail.com Thu Jan 12 04:12:26 2012
From: gsnetplayer at hotmail.com (GS Net Player)
Date: Thu, 12 Jan 2012 13:12:26 +0100
Subject: [Live-devel] RTSP only in Lan - Lan ?!
Message-ID:

Can someone tell me why my RTSP code only works in the local network (LAN - LAN) but not on Windows Server 2008 (hosting)? Here's my code:

  #include "liveMedia.hh"
  #include "BasicUsageEnvironment.hh"
  #include "GroupsockHelper.hh"

  UsageEnvironment* env;
  Boolean const isSSM = True;
  char const* inputFileName = "udp://@239.255.42.42:8888";
  MPEG1or2VideoStreamFramer* videoSource;
  RTPSink* videoSink;
  void play(); // forward
  Boolean reuseFirstSource = True;
  Boolean iFramesOnly = False;
  static void announceStream(RTSPServer* rtspServer, ServerMediaSession* sms,
                             char const* streamName, char const* inputFileName); // fwd

  int main(int argc, char** argv) {
    // Begin by setting up our usage environment:
    TaskScheduler* scheduler = BasicTaskScheduler::createNew();
    env = BasicUsageEnvironment::createNew(*scheduler);

    // Create a 'groupsock' for the input multicast group,port:
    char const* inputAddressStr
  #ifdef USE_SSM
      = "232.255.42.42";
  #else
      = "239.255.42.42";
  #endif
    struct in_addr inputAddress;
    inputAddress.s_addr = our_inet_addr(inputAddressStr);
    Port const inputPort(8888);
    unsigned char const inputTTL = 0; // we're only reading from this mcast group
  #ifdef USE_SSM
    char* sourceAddressStr = "udp://@239.255.42.42:8888"; // replace this with the real source address
    struct in_addr sourceFilterAddress;
    sourceFilterAddress.s_addr = our_inet_addr(sourceAddressStr);
    Groupsock inputGroupsock(*env, inputAddress, sourceFilterAddress, inputPort);
  #else
    Groupsock inputGroupsock(*env, inputAddress, inputPort, inputTTL);
  #endif

    // Then create a liveMedia 'source' object, encapsulating this groupsock:
    FramedSource* source = BasicUDPSource::createNew(*env, &inputGroupsock);
    FramedSource* source2 = BasicUDPSource::createNew(*env, &inputGroupsock);

    char const* outputAddressStr = "239.255.43.43"; // this could also be unicast
    // Note: You may change "outputAddressStr" to use a different multicast
    // (or unicast) address, but do *not* change it to use the same multicast
    // address as "inputAddressStr".
    struct in_addr outputAddress;
    outputAddress.s_addr = our_inet_addr(outputAddressStr);
    Port const outputPort(4444);
    unsigned char const outputTTL = 255;
    Groupsock outputGroupsock(*env, outputAddress, outputPort, outputTTL);

    // Create a 'MPEG-4 Video RTP' sink from the RTP 'groupsock':
    unsigned const maxPacketSize = 65536; // allow for large UDP packets
    videoSink = SimpleRTPSink::createNew(*env, &outputGroupsock, 33, 90000, "video", "mp2t",
                                         1, True, False /*no 'M' bit*/);
    // MediaSink* sink = SimpleRTPSink::createNew(*env, &outputGroupsock, 33, 90000, "video", "mp2t",
    //                                            1, True, False /*no 'M' bit*/);

    const unsigned estimatedSessionBandwidth = 4500; // in kbps; for RTCP b/w share
    const unsigned maxCNAMElen = 100;
    unsigned char CNAME[maxCNAMElen+1];
    gethostname((char*)CNAME, maxCNAMElen);
    CNAME[maxCNAMElen] = '\0'; // just in case
    RTCPInstance* rtcp = RTCPInstance::createNew(*env, &inputGroupsock, estimatedSessionBandwidth,
                                                 CNAME, videoSink, NULL /* we're a server */, isSSM);

    RTSPServer* rtspServer = RTSPServer::createNew(*env, 8554);
    if (rtspServer == NULL) {
      *env << "Failed to create RTSP server: " << env->getResultMsg() << "\n";
      exit(1);
    }
    ServerMediaSession* sms = ServerMediaSession::createNew(*env, "testStream", inputFileName,
        "Session streamed by \"testMPEG4VideoStreamer\"", isSSM);
    sms->addSubsession(PassiveServerMediaSubsession::createNew(*videoSink, rtcp));
    rtspServer->addServerMediaSession(sms);
    char* url = rtspServer->rtspURL(sms);
    *env << "Play this stream using the URL \"" << url << "\"\n";
    delete[] url;

    videoSource = MPEG1or2VideoStreamDiscreteFramer::createNew(*env, source);
    videoSink->startPlaying(*videoSource, NULL, NULL);

    if (rtspServer->setUpTunnelingOverHTTP(80) || rtspServer->setUpTunnelingOverHTTP(8000)
        || rtspServer->setUpTunnelingOverHTTP(8080)) {
      *env << "\n(We use port " << rtspServer->httpServerPortNum()
           << " for optional RTSP-over-HTTP tunneling.)\n";
    } else {
      *env << "\n(RTSP-over-HTTP tunneling is not available.)\n";
    }

    // sink->startPlaying(*source2, NULL, NULL);
    env->taskScheduler().doEventLoop(); // does not return
    return 0; // only to prevent compiler warning
  }

  static void announceStream(RTSPServer* rtspServer, ServerMediaSession* sms,
                             char const* streamName, char const* inputFileName) {
    char* url = rtspServer->rtspURL(sms);
    UsageEnvironment& env = rtspServer->envir();
    env << "\n\"" << streamName << "\" stream, from the file \"" << inputFileName << "\"\n";
    env << "Play this stream using the URL \"" << url << "\"\n";
    delete[] url;
  }

greetings
Igor

From finlayson at live555.com Thu Jan 12 06:47:11 2012
From: finlayson at live555.com (Ross Finlayson)
Date: Thu, 12 Jan 2012 06:47:11 -0800
Subject: Re: [Live-devel] RTSP only in Lan - Lan ?!
In-Reply-To: References:
Message-ID:

> Can someone tell me why my RTSP code only works in the local network (LAN - LAN) but not on Windows Server 2008 (hosting)?

Probably because you don't have IP multicast routing between the sending computer (that's sending to multicast address 239.255.42.42, port 8888) and the computer that's running your application, and/or between the computer that's running your application (that's sending data to multicast address 239.255.43.43, port 4444) and the receiving computer.

(Also, because you're not using 'source-specific multicast', you should be setting "isSSM" to "False", not "True". And because "USE_SSM" is not defined, you can remove all of the code from the #if branch of the "#ifdef USE_SSM ...
#endif" code, because the #if branch of that code doesn't get executed (which is just as well, because the "sourceAddressStr" line that you've put there is very wrong).)

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From Marlon at scansoft.co.za Thu Jan 12 06:55:59 2012
From: Marlon at scansoft.co.za (Marlon Reid)
Date: Thu, 12 Jan 2012 16:55:59 +0200
Subject: Re: [Live-devel] RTSP only in Lan - Lan ?!
In-Reply-To: References:
Message-ID: <002962EA5927BE45B2FFAB0B5B5D67970A8ACB@SSTSVR1.sst.local>

Make sure that your firewall allows communications on the network for the specified port.
From jshanab at smartwire.com Thu Jan 12 06:25:04 2012
From: jshanab at smartwire.com (Jeff Shanab)
Date: Thu, 12 Jan 2012 14:25:04 +0000
Subject: Re: [Live-devel] "no frame!"
In-Reply-To: <80C795F72B3CB241A9256DABF0A04EC5FA2E87@CH1PRD0702MB107.namprd07.prod.outlook.com>
References: <80C795F72B3CB241A9256DABF0A04EC5FA2E87@CH1PRD0702MB107.namprd07.prod.outlook.com>
Message-ID: <615FD77639372542BF647F5EBAA2DBC20B1B1D71@IL-BOL-EXCH01.smartwire.com>

You may be doing a bit more work than is necessary. Most H264 streams have the SPS and PPS embedded periodically. Here is my setup; note that the extradata is set to NULL and its size set to 0. This you need to do; then, on decode of the first frame, libavcodec fills in the rest of the context.
  pCodec_ = avcodec_find_decoder(CODEC_ID_H264);
  pCodecCtx_ = avcodec_alloc_context();
  if (!pCodecCtx_) {
    throw std::exception("[CH264Filter]: Can't allocate pCodecCtx_!");
  }
  pCodecCtx_->skip_frame = AVDISCARD_DEFAULT;
  pCodecCtx_->skip_idct = AVDISCARD_DEFAULT;
  pCodecCtx_->skip_loop_filter = AVDISCARD_DEFAULT;
  pCodecCtx_->error_concealment = FF_EC_GUESS_MVS | FF_EC_DEBLOCK;
  pCodecCtx_->workaround_bugs = FF_BUG_AUTODETECT | FF_BUG_MS;
  pCodecCtx_->strict_std_compliance = FF_COMPLIANCE_NORMAL | FF_COMPLIANCE_UNOFFICIAL;
  pCodecCtx_->error_recognition = FF_ER_CAREFUL;
  pCodecCtx_->codec_type = CODEC_TYPE_VIDEO;
  pCodecCtx_->codec_id = CODEC_ID_H264;
  pCodecCtx_->codec_tag = MKTAG('H', '2', '6', '4');
  pCodecCtx_->extradata = NULL;
  pCodecCtx_->extradata_size = 0;
  res = avcodec_open(pCodecCtx_, pCodec_);
  if (res < 0) throw std::exception("Can't open video codec!");

I actually create the SPS and PPS from the sprops and insert them into the stream if the encoder does not already do that for me (like VLC does when capturing).
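The caching Jeff mentions at the end can be as small as the following sketch (deliverToDecoder() and the member vectors are hypothetical names of mine). The NAL unit type is the low five bits of the first byte of the unit:

  #include <vector>
  #include <stdint.h>

  std::vector<uint8_t> fCachedSPS, fCachedPPS; // hypothetical members

  void handleNALUnit(uint8_t const* nalu, unsigned size) {
    uint8_t nalType = nalu[0] & 0x1F;
    if (nalType == 7)      fCachedSPS.assign(nalu, nalu + size);  // SPS
    else if (nalType == 8) fCachedPPS.assign(nalu, nalu + size);  // PPS
    else if (nalType == 5 && !fCachedSPS.empty() && !fCachedPPS.empty()) {
      // Keyframe slice: re-prime the decoder first, as Jeff describes
      deliverToDecoder(&fCachedSPS[0], fCachedSPS.size());
      deliverToDecoder(&fCachedPPS[0], fCachedPPS.size());
    }
    deliverToDecoder(nalu, size); // each delivery is one Annex-B NAL unit
  }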
From gsnetplayer at hotmail.com Thu Jan 12 08:53:33 2012 From: gsnetplayer at hotmail.com (GS Net Player) Date: Thu, 12 Jan 2012 17:53:33 +0100 Subject: [Live-devel] RTSP only in Lan - Lan ?! In-Reply-To: References: , Message-ID: Hi Ross, thank you for your response. I changed everything as you suggested but still cannot get a stream from rtsp://xxx.xxx.xxx.xxx:8554/testStream or rtsp://xxx.xxx.xxx.xxx:8000/testStream ! Otherwise, when I put in my code: { char const* streamName = "StreamTest"; char const* inputFileName = "test.ts"; char const* indexFileName = "test.tsx"; ServerMediaSession* sms = ServerMediaSession::createNew(*env, streamName, streamName, descriptionString); sms->addSubsession(MPEG2TransportFileServerMediaSubsession ::createNew(*env, inputFileName, indexFileName, reuseFirstSource)); rtspServer->addServerMediaSession(sms); announceStream(rtspServer, sms, streamName, inputFileName); } I can then get rtsp://xxx.xxx.xxx.xxx:8554/StreamTest or rtsp://xxx.xxx.xxx.xxx:8000/StreamTest ( I have already allowed the ports 8554 and 8000 through the firewall ). Please show me how to get the stream to work on the Internet. Igor From: finlayson at live555.com Date: Thu, 12 Jan 2012 06:47:11 -0800 To: live-devel at ns.live555.com Subject: Re: [Live-devel] RTSP only in Lan - Lan ?! Can someone tell me why my rtsp code only works in the local network ( Lan - Lan ) but not on Windows Server 2008 ( hosting ) ? Probably because you don't have IP multicast routing between the sending computer (that's sending to multicast address 239.255.42.42, port 8888) and the computer that's running your application, and/or between the computer that's running your application (that's sending data to multicast address 239.255.43.43, port 4444) and the receiving computer. (Also, because you're not using 'source-specific multicast', you should be setting "isSSM" to "False", not "True". And because "IS_SSM" is not defined, you can remove all of the code from the #if branch of the #ifdef USE_SSM ... #endif code, because the #if branch of that code doesn't get executed (which is just as well, because the "sourceAddressStr" line that you've put there is very wrong).) Ross Finlayson Live Networks, Inc. http://www.live555.com/ _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From MikeS212 at intelligentvideo.tv Thu Jan 12 10:40:44 2012 From: MikeS212 at intelligentvideo.tv (Mike Stewart) Date: Thu, 12 Jan 2012 18:40:44 +0000 Subject: [Live-devel] StreamReplicator active replicas count Message-ID: <4F0F292C.6070303@intelligentvideo.tv> Hello All, I have been looking at the new StreamReplicator class and find I am hitting the 'Internal Error 2' logging in StreamReplicator::deliverReceivedFrame() when a StreamReplica is removed whilst another remains active. This could be due to a double-decrement of fNumActiveReplicas, because StreamReplicator::deactivateStreamReplica() is called both from StreamReplica::doStopGettingFrames() and subsequently via removeStreamReplica() from the StreamReplica destructor. One suggestion would be to add a condition to the last instruction of StreamReplicator::removeStreamReplica() as follows: if (replicaBeingRemoved->fFrameIndex != -1) deactivateStreamReplica(replicaBeingRemoved); Apologies if I am wide of the mark on this.
Many thanks, Mike -- Mike Stewart Intelligent Video Limited From liquidl at email.com Thu Jan 12 16:42:46 2012 From: liquidl at email.com (liquidl at email.com) Date: Thu, 12 Jan 2012 19:42:46 -0500 Subject: [Live-devel] Accessing RR stats Message-ID: <20120113004246.5400@gmx.com> Hi, I am using a class derived from OnDemandServerMediaSubsession. I know that a class derived from OnDemandServerMediaSubsession automatically instantiates an RTCPInstance for each stream, and I can very well see the VLC client and my RTSP server talking the RTCP protocol. Now my issue is that I'd like to examine the statistics generated by the RTCP RR packets that are coming in from the VLC client. I'm unsure how I'd access this data. Would I access the RTPTransmissionStatsDB in the function RTSPServer::RTSPClientSession::noteLiveness() in RTSPServer.cpp? This is because this function is triggered on each RR packet reception. If true, then how do I access the RTPTransmissionStatsDB from this function? Alternatively, is there another (cleaner) way to do this without modifying core library code, i.e. RTSPServer.cpp? I am getting confused: since RTCP is instantiated automatically for me, I'm not sure how to register my own RR handler function and access the RTPTransmissionStatsDB. I have searched the list and spent some time racking my brains. Any tips would be greatly appreciated. Regards, UQ From finlayson at live555.com Thu Jan 12 21:37:14 2012 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 12 Jan 2012 21:37:14 -0800 Subject: [Live-devel] StreamReplicator active replicas count In-Reply-To: <4F0F292C.6070303@intelligentvideo.tv> References: <4F0F292C.6070303@intelligentvideo.tv> Message-ID: <05D54C08-EB4E-434D-8E53-65D745242AB9@live555.com> Mike, Thanks for noticing this bug. Your suggested fix looks good, and will be included in the next release of the software. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Jan 12 22:41:43 2012 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 12 Jan 2012 22:41:43 -0800 Subject: [Live-devel] Accessing RR stats In-Reply-To: <20120113004246.5400@gmx.com> References: <20120113004246.5400@gmx.com> Message-ID: <9CE1C28D-1947-4655-B3C7-D8D70E4FC71C@live555.com> > Would I access the RTPTransmissionStatsDB in the function RTSPServer::RTSPClientSession::noteLiveness() in RTSPServer.cpp? No; that function is used only by the RTCP implementation (when it receives an incoming "RR" packet). It's not a function that you would call (or modify) yourself. > Alternatively, is there another (cleaner) way to do this without modifying core library code, i.e. RTSPServer.cpp? Yes. Note that each "RTPTransmissionStatsDB" object (a 'database of RTP transmission stats') is for a particular "RTPSink" object (and accessed using "RTPSink::transmissionStatsDB()"); therefore, you access these stats by first accessing a "RTPSink" object. Note that a "RTPSink" object is used for a particular server->client (sub)session - i.e., for a particular media substream (audio or video) of a particular server->client stream. Therefore, you access a "RTPSink" object via a "RTSPServer::RTSPClientSession" object. So, you need to do the following: 1/ Subclass "RTSPServer::RTSPClientSession".
2/ Subclass "RTSPServer" (only to redefine the "createNewClientSession()" virtual function to create objects of your new "RTSPServer::RTSPClientSession" subclass, rather than just the "RTSPServer::RTSPClientSession" base class (the default behavior)). 3/ In your "RTSPServer::RTSPClientSession" subclass - whenever you wish to access the transmission stats (e.g., you might choose to do this periodically, using a timer) - you would do so using code like the following:

    for (unsigned i = 0; i < fNumStreamStates; ++i) {
      // we can do this cast because we know that we are using "OnDemandServerMediaSubsession"s:
      StreamState* streamState = (StreamState*)fStreamStates[i].streamToken;
      RTPTransmissionStatsDB& transmissionStatsDB = streamState->rtpSink()->transmissionStatsDB();
    }

Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From achraf.gazdar at gmail.com Fri Jan 13 01:09:52 2012 From: achraf.gazdar at gmail.com (Achraf Gazdar) Date: Fri, 13 Jan 2012 10:09:52 +0100 Subject: [Live-devel] OnDemand MPEG 2 TS UDP source Relay Message-ID: Hi, I have implemented a new class called "MPEG2TransportUDPServerMediaSubsession" that handles on-demand relaying of a Basic UDP source transporting MPEG2 TS Video to any client requesting it through rtsp. To test it please add the following block to testOnDemandRTSPServer: { char const* tvServiceName = "tv"; char const* inputAddressStr = "0";//unicast UDP source Port const inputPort(1234); unsigned char const inputTTL = 0; ServerMediaSession* sms = ServerMediaSession::createNew(*env, tvServiceName, tvServiceName, descriptionString); sms->addSubsession(MPEG2TransportUDPServerMediaSubsession::createNew(*env, inputAddressStr,inputPort,inputTTL, True)); rtspServer->addServerMediaSession(sms); announceStream(rtspServer, sms,tvServiceName,tvServiceName); } The new class is attached to this email. Thanks. -- Achraf Gazdar Associate Professor on Computer Science Hana Research Unit, CRISTAL Lab. http://www.hanalab.org Tunisia -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: UDPSourcerRelay.tar.gz Type: application/x-gzip Size: 1680 bytes Desc: not available URL: From gsnetplayer at hotmail.com Fri Jan 13 03:02:04 2012 From: gsnetplayer at hotmail.com (GS Net Player) Date: Fri, 13 Jan 2012 12:02:04 +0100 Subject: [Live-devel] OnDemand MPEG 2 TS UDP source Relay In-Reply-To: References: Message-ID: When I try to compile I get this: msvcrt.lib(crtexe.obj) : error LNK2019: unresolved external symbol _main referenced in function ___tmainCRTStartup testReplicator.exe : fatal error LNK1120: 1 unresolved externals NMAKE : fatal error U1077: '"c:\Program Files\Microsoft Visual Studio 9.0\VC\BIN\link.EXE"' : return code '0x460' Stop. How can I solve this? Igor Date: Fri, 13 Jan 2012 10:09:52 +0100 From: achraf.gazdar at gmail.com To: live-devel at ns.live555.com Subject: [Live-devel] OnDemand MPEG 2 TS UDP source Relay Hi, I have implemented a new class called "MPEG2TransportUDPServerMediaSubsession" that handles on-demand relaying of a Basic UDP source transporting MPEG2 TS Video to any client requesting it through rtsp.
To test it please add the following block to testOnDemandRTSPServer: { char const* tvServiceName = "tv"; char const* inputAddressStr = "0";//unicast UDP source Port const inputPort(1234); unsigned char const inputTTL = 0; ServerMediaSession* sms = ServerMediaSession::createNew(*env, tvServiceName, tvServiceName, descriptionString); sms->addSubsession(MPEG2TransportUDPServerMediaSubsession::createNew(*env, inputAddressStr,inputPort,inputTTL, True)); rtspServer->addServerMediaSession(sms); announceStream(rtspServer, sms,tvServiceName,tvServiceName); } The new class is attached to this email. Thanks. -- Achraf Gazdar Associate Professor on Computer Science Hana Research Unit, CRISTAL Lab. http://www.hanalab.org Tunisia _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From r63400 at gmail.com Fri Jan 13 05:25:03 2012 From: r63400 at gmail.com (Ricardo Acosta) Date: Fri, 13 Jan 2012 14:25:03 +0100 Subject: [Live-devel] RTCP Receiver Reports use in unicast sessions Message-ID: Hi Ross Thank you for your response > > I tried with some other tests such as testMP3Streamer and testMP3Receiver, and > I don't receive RTCP receiver reports. > > > "testMP3Streamer" sends RTCP "SR" packets, and will receive RTCP "RR" > reports from any (multicast-connected) receivers. "testMP3Receiver" sends > RTCP "RR" packets, and will receive RTCP "SR" packets from > "testMP3Streamer" (provided, of course, that multicast routing is enabled > between the computers that run "testMP3Streamer" and "testMP3Receiver"). > > I tried again testMP3Streamer in multicast and now I receive back the RR. Is there any special reason that this works only in multicast (knowing the RR is sent back in unicast to the original sender)? I am trying to see how it can work in unicast, but when I call the RTCPInstance, there is no other parameter to change. Sender: videoSink = SimpleRTPSink::createNew(*env, &(*rtpGroupSock), 33, 90000, "video", "mp2t", 1, True, False /*no 'M' bit*/); rtcpInstance = RTCPInstance::createNew(*env, &(*rtcpGroupSock), estimatedSessionBandwidth, cname,videoSink, NULL /* we're a server */, isSSM); (having isSSM false for unicast) Receiver: rtpSource = SimpleRTPSource::createNew(*env, &(*rtpGroupSock),33,90000, "video/",0,false ); rtcpInstance = RTCPInstance::createNew(*env,&(*rtcpGroupSock),estimatedSessionBandwidth,cname,NULL/* we're a client */,rtpSource); Thank you Ricardo > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Jan 13 06:38:28 2012 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 13 Jan 2012 06:38:28 -0800 Subject: [Live-devel] RTCP Receiver Reports use in unicast sessions In-Reply-To: References: Message-ID: <40E7E051-63ED-47BC-BC38-46F73BEA77EC@live555.com> > I tried again testMP3Streamer in multicast and now I receive back the RR. > Is there any special reason that this works only in multicast (knowing the RR is sent back in unicast to the original sender)? I am trying to see how it can work in unicast It works just fine with unicast if you use RTSP.
Try using "testOnDemandRTSPServer" (or "live555MediaServer") as your server (transmitter), and "testRTSPClient" (or "openRTSP") as your client (receiver). You'll see that RTCP "RR" reports get received by the server just fine. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From nmenne at dcscorp.com Fri Jan 13 10:42:22 2012 From: nmenne at dcscorp.com (Menne, Neil) Date: Fri, 13 Jan 2012 13:42:22 -0500 Subject: [Live-devel] "no frame!" In-Reply-To: References: Message-ID: I am in fact talking about the libraries for the underlying parts. In what way was the parser context useful? I'm working on the SPS and PPS stuff, but if that's useful for anything, it'd be a worthwhile addition. -Neil From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Guillaume Ferry Sent: Thursday, January 12, 2012 2:48 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] "no frame!" When you mention ffmpeg, do you mean the application, or the underlying library (libav)? I had the same problem a few years ago, and I found it can be handled nicely with a reduced amount of code around avcodec_decode_video. I suggest you have a look at AVCodecParserContext for this matter. If you're still stuck, I can even provide you a code sample. Regards, Guillaume. Le Wed, 11 Jan 2012 23:53:40 +0100, Menne, Neil a écrit: I'm working on my Live555/FFmpeg video player, and I ran into an interesting problem that has kept me stumped for several days. I am taking the buffer that is delivered to my MediaSink (like the example in testRTSPClient), and I am passing the buffer and the size to FFmpeg to decode. It says that there is "no frame!" I'm stumped as to why that is the case when I'm taking the buffer after the "afterGettingFrame" function is called. I was wondering if there was something else that must be done to that buffer for it to be a true frame that can be decoded. My first guess is that the decoder needs more information, which brings me to my next question: the SDP description that I'm pulling down doesn't contain the width/height, so I'm guessing I need to pull that out of the sprop_parameter_sets; is this the case? My second guess is that there is no frame separator; I noticed that in one particular file, H264VideoFileSink, they prepend a 0x00000001 along with the sprop_parameters. Is this a potential problem? Video is not my expertise, so I'm spending most days reading anything/everything on these problems, but this one's got me stuck. -Neil P.S. My video is an H.264 video stream -- Guillaume FERRY Ingénieur Traitement de l'Information et du Contenu Tel : 01 39 30 62 09 Fax : 01 39 30 62 45 ferry at bertin.fr http://www.bertin.fr -------------- next part -------------- An HTML attachment was scrubbed... URL: From nmenne at dcscorp.com Fri Jan 13 13:11:45 2012 From: nmenne at dcscorp.com (Menne, Neil) Date: Fri, 13 Jan 2012 16:11:45 -0500 Subject: [Live-devel] "no frame!" In-Reply-To: <615FD77639372542BF647F5EBAA2DBC20B1B1A2F@IL-BOL-EXCH01.smartwire.com> References: <615FD77639372542BF647F5EBAA2DBC20B1B1A2F@IL-BOL-EXCH01.smartwire.com> Message-ID: Thanks everyone for your help. The 0x00000001 in front of each piece as explained by Jeff was the trick. I would also like to add that Jeff's preferred method does run better for performance.
From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Jeff Shanab Sent: Wednesday, January 11, 2012 6:36 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] "no frame!" I have been working with avcodec_decode_video2 for a bit. Say [7] is the SPS, [8] is the PPS, [5] is a keyframe slice, and [1]'s are the difference frames. I have found it needs a stream like this: 00 00 01 [7]00 00 01[8]00 00 01[5]00 00 01[1]00 00 01[1].... If you are calling it in a loop, it will have no error and no frame until it has enough info. This often means you get the output frame with a delay of 1 frame and must keep calling until it is done outputting frames. I personally aggregate the 7,8,5's into a keyframe and pass the whole thing to avcodec_decode_video2. It can survive without the 7's and 8's after it has got them at least once, but a bad packet here or there disrupts the video for a while, so I cache them and insert them if the stream does not have them. It is a small price to pay in bandwidth that gives me a faster first frame out in VLC or our own player. So, assuming [n] means "prefix each n with 00 00 01": This will work [7][8][5][1][1][1][1][1][5][1][1][1][1]..... But I prefer [7][8][5][1][1][1][1][1][7][8][5][1][1][1][1]..... Also note that at high resolution it may be [7][8][5][5][5][1][1][1][1][1]...: multiple slices making up huge keyframes; for these you may have to call the decoder multiple times before you get a frame. This does not touch on one more reason to call the decoder in a loop until it says it is done: frames may arrive in a different order than they are displayed with high-resolution, advanced H264 profiles (and even MPEG4). The bi-directional predictive frames must arrive out of order, so the decode order differs from the display order. From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Menne, Neil Sent: Wednesday, January 11, 2012 4:54 PM To: live-devel at lists.live555.com Subject: [Live-devel] "no frame!" I'm working on my Live555/FFmpeg video player, and I ran into an interesting problem that has kept me stumped for several days. I am taking the buffer that is delivered to my MediaSink (like the example in testRTSPClient), and I am passing the buffer and the size to FFmpeg to decode. It says that there is "no frame!" I'm stumped as to why that is the case when I'm taking the buffer after the "afterGettingFrame" function is called. I was wondering if there was something else that must be done to that buffer for it to be a true frame that can be decoded. My first guess is that the decoder needs more information, which brings me to my next question: the SDP description that I'm pulling down doesn't contain the width/height, so I'm guessing I need to pull that out of the sprop_parameter_sets; is this the case? My second guess is that there is no frame separator; I noticed that in one particular file, H264VideoFileSink, they prepend a 0x00000001 along with the sprop_parameters. Is this a potential problem? Video is not my expertise, so I'm spending most days reading anything/everything on these problems, but this one's got me stuck. -Neil P.S. My video is an H.264 video stream ________________________________ No virus found in this message. Checked by AVG - www.avg.com Version: 2012.0.1901 / Virus Database: 2109/4736 - Release Date: 01/11/12 -------------- next part -------------- An HTML attachment was scrubbed...
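As a rough illustration of Jeff's "call the decoder in a loop" advice, a sketch against the 2012-era libavcodec API used in this thread; "ctx", "frame" (an AVFrame from avcodec_alloc_frame()), "buf" and "bufSize" are assumed to exist, with "buf" holding start-code-prefixed NAL units as described above:

    AVPacket pkt;
    av_init_packet(&pkt);
    pkt.data = buf;
    pkt.size = bufSize;

    int gotPicture = 0;
    while (pkt.size > 0) {
      int used = avcodec_decode_video2(ctx, frame, &gotPicture, &pkt);
      if (used < 0) break; // decode error; drop the rest of this packet
      pkt.data += used;
      pkt.size -= used;
      if (gotPicture) {
        // "frame" holds a decoded picture - possibly delayed relative to
        // the input, as Jeff notes - so display or process it here.
      }
    }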
From Anarchist666 at yandex.ru Sat Jan 14 01:30:17 2012 From: Anarchist666 at yandex.ru (=?koi8-r?B?5sHazMXF1yDtwcvTyc0=?=) Date: Sat, 14 Jan 2012 13:30:17 +0400 Subject: [Live-devel] DeviceSource and OpenCV Message-ID: <267121326533417@web127.yandex.ru> I am creating an application using OpenCV. There is a need to use the video stream. I created a class based on DeviceSource. Based on the debug output, the frame-reception cycle from the camera never starts. If I need to use event triggers, I do not understand how to do that. Source code is attached. -------------- next part -------------- A non-text attachment was scrubbed... Name: h264videocamservermediasubsession.cpp Type: application/octet-stream Size: 3661 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: h264videocamservermediasubsession.h Type: application/octet-stream Size: 1186 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: main.cpp Type: application/octet-stream Size: 2361 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: videosource.cpp Type: application/octet-stream Size: 5085 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: videosource.h Type: application/octet-stream Size: 919 bytes Desc: not available URL: From iminimup at gmail.com Sat Jan 14 07:38:36 2012 From: iminimup at gmail.com (imin imup) Date: Sat, 14 Jan 2012 09:38:36 -0600 Subject: [Live-devel] looking for motion JPEG over RTP over UDP streaming server. Message-ID: Hello, I'm looking for a Motion JPEG over RTP over UDP streaming server (RFC 2435). Can LiveStreamingServer support this? Any help on setting up the server and client is appreciated. Best Imin -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sun Jan 15 03:35:55 2012 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 15 Jan 2012 03:35:55 -0800 Subject: [Live-devel] looking for motion JPEG over RTP over UDP streaming server. In-Reply-To: References: Message-ID: > I'm looking for a Motion JPEG over RTP over UDP streaming server (RFC 2435). Can LiveStreamingServer support this? No, the "LIVE555 Media Server" product (i.e., pre-built binary application) cannot stream motion JPEG. However, our RTSP server implementation *can* be used - with some additional programming - to stream motion JPEG. However, this is discouraged, because JPEG is a very poor codec for video streaming. See http://www.live555.com/liveMedia/faq.html#jpeg-streaming Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sun Jan 15 04:30:24 2012 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 15 Jan 2012 04:30:24 -0800 Subject: [Live-devel] DeviceSource and OpenCV In-Reply-To: <267121326533417@web127.yandex.ru> References: <267121326533417@web127.yandex.ru> Message-ID: <64F88190-CD27-44EB-B764-D9D81E663B6A@live555.com> > I am creating an application using OpenCV. There is a need to use the video stream. I created a class based on DeviceSource. Based on the debug output, the frame-reception cycle from the camera never starts. If I need to use event triggers, I do not understand how to do that.
You need to arrange for your "signalNewFrameData()" function to be called whenever a new frame (or, in your case, a new H.264 NAL unit) becomes available. That is something that you will have to figure out how to do yourself (because only you understand the environment in which your application will run). Another thing that's wrong with your code is your "H264VideoCamServerMediaSubsession::createNewStreamSource()" function. It needs to feed its input source into a "H264VideoStreamDiscreteFramer" (because "H264VideoRTPSink" expects a "H264VideoStreamFramer" (or a subclass) as input). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From iminimup at gmail.com Sun Jan 15 08:32:25 2012 From: iminimup at gmail.com (imin imup) Date: Sun, 15 Jan 2012 10:32:25 -0600 Subject: [Live-devel] looking for motion JPEG over RTP over UDP streaming server. In-Reply-To: References: Message-ID: Thank you, Ross. > However, our RTSP server implementation *can* be used - with some > additional programming - to stream motion JPEG. > I'm a newbie on RTP. Could you please detail the additional programming? > However, this is discouraged, because JPEG is a very poor codec for video > streaming. See http://www.live555.com/liveMedia/faq.html#jpeg-streaming > Thanks for the reminder. The application runs in an embedded device over a LAN, and motion JPEG is suitable for its low codec cost. It is preferred not to support TCP if possible. So I'm wondering: in its simplest form, will it work if I only implement an RTP streaming server (RFC 2435) without RTSP? Kind of like an RTP push streamer. Let's suppose the server knows the client address/port and uses a fixed profile. Best Imin -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sun Jan 15 21:33:21 2012 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 15 Jan 2012 21:33:21 -0800 Subject: [Live-devel] looking for motion JPEG over RTP over UDP streaming server. In-Reply-To: References: Message-ID: > However, our RTSP server implementation *can* be used - with some additional programming - to stream motion JPEG. > > I'm a newbie on RTP. Could you please detail the additional programming? The two links in the FAQ entry that I showed you should give you the information that you need. In fact, I recommend that you read the entire FAQ. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From ferry at bertin.fr Mon Jan 16 00:01:55 2012 From: ferry at bertin.fr (Guillaume Ferry) Date: Mon, 16 Jan 2012 09:01:55 +0100 Subject: [Live-devel] "no frame!" In-Reply-To: References: Message-ID: Well, it's really simple: the av_parser_parse2 method is called on input buffers, and it will eventually update a variable when sufficient data has been received to identify a complete frame. It works like a charm for me when receiving videos streamed by live555 servers. Glad you found out another trick, anyway :) Regards, Guillaume. Le Fri, 13 Jan 2012 19:42:22 +0100, Menne, Neil a écrit: > I am in fact talking about the libraries for the underlying parts. In what way was the parser context useful? I'm working on the SPS and PPS stuff, but if that's useful for anything, it'd be a worthwhile addition.
> -Neil
>
> From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Guillaume Ferry
> Sent: Thursday, January 12, 2012 2:48 AM
> To: LIVE555 Streaming Media - development & use
> Subject: Re: [Live-devel] "no frame!"
>
> When you mention ffmpeg, do you mean the application, or the underlying library (libav)? I had the same problem a few years ago, and I found it can be handled nicely with a reduced amount of code around avcodec_decode_video. I suggest you have a look at AVCodecParserContext for this matter. If you're still stuck, I can even provide you a code sample.
>
> Regards, Guillaume.
>
> Le Wed, 11 Jan 2012 23:53:40 +0100, Menne, Neil a écrit:
>> I'm working on my Live555/FFmpeg video player, and I ran into an interesting problem that has kept me stumped for several days. I am taking the buffer that is delivered to my MediaSink (like the example in testRTSPClient), and I am passing the buffer and the size to FFmpeg to decode. It says that there is "no frame!" I'm stumped as to why that is the case when I'm taking the buffer after the "afterGettingFrame" function is called. I was wondering if there was something else that must be done to that buffer for it to be a true frame that can be decoded.
>> My first guess is that the decoder needs more information, which brings me to my next question: the SDP description that I'm pulling down doesn't contain the width/height, so I'm guessing I need to pull that out of the sprop_parameter_sets; is this the case?
>> My second guess is that there is no frame separator; I noticed that in one particular file, H264VideoFileSink, they prepend a 0x00000001 along with the sprop_parameters. Is this a potential problem?
>> Video is not my expertise, so I'm spending most days reading anything/everything on these problems, but this one's got me stuck.
>> -Neil
>> P.S. My video is an H.264 video stream
>
> --
> Guillaume FERRY
> Ingénieur Traitement de l'Information et du Contenu
> Tel : 01 39 30 62 09 Fax : 01 39 30 62 45
> ferry at bertin.fr http://www.bertin.fr

-- Guillaume FERRY Ingénieur Traitement de l'Information et du Contenu Tel : 01 39 30 62 09 Fax : 01 39 30 62 45 ferry at bertin.fr http://www.bertin.fr -------------- next part -------------- An HTML attachment was scrubbed... URL: From achraf.gazdar at gmail.com Mon Jan 16 03:18:28 2012 From: achraf.gazdar at gmail.com (Achraf Gazdar) Date: Mon, 16 Jan 2012 12:18:28 +0100 Subject: [Live-devel] Adding new feature to live555 Message-ID: Hi Ross, I have developed and submitted (in my previous email) a new feature for live555: an on-demand relay of an MPEG TS stream coming in through a basic UDP source. Is this new feature not important enough to be added to the new release? Regards -- Achraf Gazdar Associate Professor on Computer Science Hana Research Unit, CRISTAL Lab. http://www.hanalab.org Tunisia -------------- next part -------------- An HTML attachment was scrubbed...
URL: From finlayson at live555.com Mon Jan 16 18:45:16 2012 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 16 Jan 2012 18:45:16 -0800 Subject: [Live-devel] Adding new feature to live555 In-Reply-To: References: Message-ID: <9B1423A0-952C-4A4B-98BE-126CB5E2237B@live555.com> > I have developped and submitted (in my previous email) a new feature to live555 which is the onDemand Relay relaying an MPEG TS stream coming through basic UDP source. > Is this new feature not so important that can not be added to the new relaese ? I wasn't planning on adding it to our source code release; however, it remains available - in the mailing list archives - for anyone who wants it: http://lists.live555.com/pipermail/live-devel/2012-January/014454.html I might end up adding this (or something like it) to the released source code sometime in the future, but right now it seems a bit too special-purpose (e.g., raw UDP input sent at the server, and Transport Stream-only output). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From nmenne at dcscorp.com Tue Jan 17 15:04:03 2012 From: nmenne at dcscorp.com (Menne, Neil) Date: Tue, 17 Jan 2012 18:04:03 -0500 Subject: [Live-devel] SDP Question Message-ID: I'm trying to add the functionality of opening a video stream given an SDP file like the one included. Given that there's no URL, it doesn't seem possible to parse it for the RTSP URL then initialize the RTSPClient; however, if you look at the SDP file, there isn't a URL to find. Is there a way to figure this out using the MediaSession as it is created using a sdp description? Or is there another method that I just haven't encountered? Thanks for everything! -Neil -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: test.sdp Type: application/octet-stream Size: 287 bytes Desc: test.sdp URL: From finlayson at live555.com Tue Jan 17 15:20:30 2012 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 17 Jan 2012 15:20:30 -0800 Subject: [Live-devel] SDP Question In-Reply-To: References: Message-ID: > I?m trying to add the functionality of opening a video stream given an SDP file like the one included. Given that there?s no URL, it doesn?t seem possible to parse it for the RTSP URL then initialize the RTSPClient; however, if you look at the SDP file, there isn?t a URL to find. > > Is there a way to figure this out using the MediaSession as it is created using a sdp description? Well, you could do this if the SDP description was for a multicast stream, because then the SDP description would contain the IP multicast address in its "c=" line. However, in your case the "c=" line contains only the 'null' address "0.0.0.0", which usually means that an additional control protocol (like RTSP or SIP) is necessary to set up the reception of the stream. So the real question here is: Where is this video stream coming from? And how did you get this SDP description? If the stream's server supports RTSP, then you should be able to access directly using a RTSP client (which means that the SDP description gets processed internally, and you shouldn't need to concern yourself with it). But if the stream's server *doesn't* support RTSP, then how did you get the SDP description? Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
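For reference, receiving from a raw SDP description (rather than via RTSP) can be sketched roughly as follows; this is an illustration under assumptions, with "sdpDescription" assumed to hold the contents of the .sdp file, and with a usable address in its "c=" line:

    MediaSession* session = MediaSession::createNew(*env, sdpDescription);
    if (session == NULL) { /* the SDP description could not be parsed */ }

    MediaSubsessionIterator iter(*session);
    MediaSubsession* subsession;
    while ((subsession = iter.next()) != NULL) {
      if (!subsession->initiate()) continue; // creates the RTP/RTCP sockets
      // Then create a MediaSink on "subsession->readSource()" and call
      // "startPlaying()" on it, as in "testRTSPClient", before entering
      // the event loop.
    }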
From Anarchist666 at yandex.ru Tue Jan 17 23:57:27 2012 From: Anarchist666 at yandex.ru (=?koi8-r?B?5sHazMXF1yDtwcvTyc0=?=) Date: Wed, 18 Jan 2012 11:57:27 +0400 Subject: [Live-devel] DeviceSource and OpenCV In-Reply-To: <64F88190-CD27-44EB-B764-D9D81E663B6A@live555.com> References: <267121326533417@web127.yandex.ru> <64F88190-CD27-44EB-B764-D9D81E663B6A@live555.com> Message-ID: <256221326873447@web63.yandex.ru> An HTML attachment was scrubbed... URL: From tomis66 at hotmail.com Wed Jan 18 01:30:59 2012 From: tomis66 at hotmail.com (Tom S) Date: Wed, 18 Jan 2012 09:30:59 +0000 Subject: [Live-devel] vc++ solution file(s) Message-ID: Dear all, I am doing a student project where I have to take screenshots from the camera stream. I decided to use openRTSP. Unfortunately, I am having a lot of trouble and I am not able to build the project using visual c++ 2008 on winxp. It would be great if somebody could send me a solution file(s) to successfully build openRTSP and shorten my agony :). Thank you for any help. With kind regards, Tom -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Jan 18 01:49:12 2012 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 18 Jan 2012 01:49:12 -0800 Subject: [Live-devel] DeviceSource and OpenCV In-Reply-To: <256221326873447@web63.yandex.ru> References: <267121326533417@web127.yandex.ru> <64F88190-CD27-44EB-B764-D9D81E663B6A@live555.com> <256221326873447@web63.yandex.ru> Message-ID: > In my case the device appears as VideoCapture from OpenCV. VideoCapture, through the function read(), returns a frame in BGR format. Then the frame is encoded using libx264. How do I use signalNewFrameData() > in my case? Should getting frames be in a separate thread? Yes, if you are using a separate thread (i.e. a thread separate from the thread that runs LIVE555) to read and encode frames, then this thread should call "signalNewFrameData()" whenever new frame data becomes available. > How do I transfer data from my device to fTo? That is done in the "deliverFrame()" function (assuming that you're using the "DeviceSource" code as a model). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From joao_dealmeida at hotmail.com Wed Jan 18 02:08:32 2012 From: joao_dealmeida at hotmail.com (Joao Almeida) Date: Wed, 18 Jan 2012 10:08:32 +0000 Subject: [Live-devel] vc++ solution file(s) In-Reply-To: References: Message-ID: Don't count on it; the people here are apologists for the old makefiles in consoles. The same ones I used 25 years ago, before Borland and later Microsoft Visual came along with integrated compile-and-debug visual interfaces. These guys like the old, hard way of getting the job done :) Anyway, in previous mailings you'll find some guys who try to explain how to use VS solutions for openRTSP. From: tomis66 at hotmail.com To: live-devel at ns.live555.com Date: Wed, 18 Jan 2012 09:30:59 +0000 Subject: [Live-devel] vc++ solution file(s) Dear all, I am doing a student project where I have to take screenshots from the camera stream. I decided to use openRTSP. Unfortunately, I am having a lot of trouble and I am not able to build the project using visual c++ 2008 on winxp. It would be great if somebody could send me a solution file(s) to successfully build openRTSP and shorten my agony :). Thank you for any help.
With kind regards, Tom _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From nmenne at dcscorp.com Wed Jan 18 06:50:41 2012 From: nmenne at dcscorp.com (Menne, Neil) Date: Wed, 18 Jan 2012 09:50:41 -0500 Subject: [Live-devel] SDP Question In-Reply-To: References: Message-ID: I just realized what the issue was. In this particular case, the SDP file is used to receive an old RTSP stream that is being looped via sending a pcap of the video. Seeing as there is no RTSP server to communicate with, that part isn't used but everything else regarding the session, subsession, and sink is fine (for anyone trying to follow along: see testRTSPClient). By changing that "0.0.0.0" to the IP address that the video was being streamed to, the sink was able to find and receive the video. I am grateful for your help and prompt responses, Ross. Keep up the good work. -Neil From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Tuesday, January 17, 2012 6:21 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] SDP Question I'm trying to add the functionality of opening a video stream given an SDP file like the one included. Given that there's no URL, it doesn't seem possible to parse it for the RTSP URL then initialize the RTSPClient; however, if you look at the SDP file, there isn't a URL to find. Is there a way to figure this out using the MediaSession as it is created using a sdp description? Well, you could do this if the SDP description was for a multicast stream, because then the SDP description would contain the IP multicast address in its "c=" line. However, in your case the "c=" line contains only the 'null' address "0.0.0.0", which usually means that an additional control protocol (like RTSP or SIP) is necessary to set up the reception of the stream. So the real question here is: Where is this video stream coming from? And how did you get this SDP description? If the stream's server supports RTSP, then you should be able to access directly using a RTSP client (which means that the SDP description gets processed internally, and you shouldn't need to concern yourself with it). But if the stream's server *doesn't* support RTSP, then how did you get the SDP description? Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From Jeremiah.Morrill at econnect.tv Wed Jan 18 09:44:09 2012 From: Jeremiah.Morrill at econnect.tv (Jer Morrill) Date: Wed, 18 Jan 2012 17:44:09 +0000 Subject: [Live-devel] vc++ solution file(s) In-Reply-To: References: Message-ID: <80C795F72B3CB241A9256DABF0A04EC5022D0E24@CH1PRD0710MB391.namprd07.prod.outlook.com> Visual Studio 2010 is what I use for development on Windows. I found it to be no more than a 15 minute process to make a visual studio solution for Live555, which is not even necessary because of the make files the author uses. It's only my opinion, but I don't believe the author should be responsible for maintaining a VS solution and I find the make files to be a great cross platform solution. 
Here's how my VS solution looks (nothing magic): Static Library Project for: BasicUsageEnvironment directory Static Library Project for: groupsock directory Static Library Project for: liveMedia directory Static Library Project for: UsageEnvironment directory To compile the examples in VS, simply create a new Win32 Console project template, add source files, the libs from BasicUsageEnviroment, groupsock, liveMedia and UsageEnvironment, add your header paths and you'll be on your way. -Jer From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Joao Almeida Sent: Wednesday, January 18, 2012 2:09 AM To: live-devel at ns.live555.com Subject: Re: [Live-devel] vc++ solution file(s) don't count on it, here they are apologist of the old make files in consoles. The same i used 25 years ago, before borland and later microsoft visual come with the integrated compile and debug visual interfaces. This guys like the old hard way to do the job done :) Anyway, in previous mailling you found some guys who try to explain how to use the vs solutions for openRTSP. ________________________________ From: tomis66 at hotmail.com To: live-devel at ns.live555.com Date: Wed, 18 Jan 2012 09:30:59 +0000 Subject: [Live-devel] vc++ solution file(s) Dear all, I am doing a student project where I have to take screenshots from the camera stream. I decided to use openRTSP. Unfortunately, I am having a lot of troubles and I am not able to buld the project using visual c++ 2008 on winxp. It would be great if somebody can send me a solution file(s) to successfully build openRTSP and shorten my agony :). Thank you for any help. With kind regards, Tom _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From isambhav at gmail.com Wed Jan 18 07:55:24 2012 From: isambhav at gmail.com (Sambhav) Date: Wed, 18 Jan 2012 21:25:24 +0530 Subject: [Live-devel] Receiving RTP packets by a RTSP Client app Message-ID: Hi, The testRTSPClient app shows how to receive data for a subsession. Can the application get the RTP packets instead/along with media data ? Regards, Sambhav -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Jan 18 18:35:48 2012 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 18 Jan 2012 18:35:48 -0800 Subject: [Live-devel] Receiving RTP packets by a RTSP Client app In-Reply-To: References: Message-ID: <3C3A57F1-A885-4777-82BA-60747D97654E@live555.com> > The testRTSPClient app shows how to receive data for a subsession. > Can the application get the RTP packets instead/along with media data ? No. The LIVE555 library does all of the RTP/RTCP protocol handling for you, so applications that use it do not see RTP packet headers (or RTCP packets) at all. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From isambhav at gmail.com Wed Jan 18 21:02:29 2012 From: isambhav at gmail.com (Sambhav) Date: Thu, 19 Jan 2012 10:32:29 +0530 Subject: [Live-devel] Receiving RTP packets by a RTSP Client app In-Reply-To: <3C3A57F1-A885-4777-82BA-60747D97654E@live555.com> References: <3C3A57F1-A885-4777-82BA-60747D97654E@live555.com> Message-ID: Thanks Ross. Is there a way to get these RTP packets out of the live555 library for usage by application ? 
And which class in the library handles H264 RTP packets and extracts the media data? On Thu, Jan 19, 2012 at 8:05 AM, Ross Finlayson wrote: > The testRTSPClient app shows how to receive data for a subsession. > Can the application get the RTP packets instead/along with media data ? > > > No. The LIVE555 library does all of the RTP/RTCP protocol handling for > you, so applications that use it do not see RTP packet headers (or RTCP > packets) at all. > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Jan 18 21:13:35 2012 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 18 Jan 2012 21:13:35 -0800 Subject: [Live-devel] Receiving RTP packets by a RTSP Client app In-Reply-To: References: <3C3A57F1-A885-4777-82BA-60747D97654E@live555.com> Message-ID: > Thanks Ross. Is there a way to get these RTP packets out of the live555 library for usage by application ? > And which class in the library handles H264 RTP packets and extracts the media data? Let's step back a bit. Could you please explain specifically what you want to do with the incoming H.264/RTP stream, and why you feel that you can't do this with the LIVE555 library as it currently stands? Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From isambhav at gmail.com Wed Jan 18 21:41:34 2012 From: isambhav at gmail.com (Sambhav) Date: Thu, 19 Jan 2012 11:11:34 +0530 Subject: [Live-devel] Receiving RTP packets by a RTSP Client app In-Reply-To: References: <3C3A57F1-A885-4777-82BA-60747D97654E@live555.com> Message-ID: I have a low-end device with a Live555 RTSP server serving a live H264 stream. My application needs to act as an RTSP client and do the following 2 things: a) Store this H264 data in mp4 format. b) Re-stream the H264 data to another application (can be more than one) for doing some other processing. Feature a) can be done directly using the current Live555 library. To implement feature b), one option is for my application to packetize the elementary H264 data into RTP and stream this data on a UDP socket. On Thu, Jan 19, 2012 at 10:43 AM, Ross Finlayson wrote: > Thanks Ross. Is there a way to get these RTP packets out of the live555 > library for usage by application ? > And which class in the library handles H264 RTP packets and extracts the > media data? > > > Let's step back a bit. Could you please explain specifically what you > want to do with the incoming H.264/RTP stream, and why you feel that you > can't do this with the LIVE555 library as it currently stands? > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -------------- next part -------------- An HTML attachment was scrubbed...
URL: From finlayson at live555.com Wed Jan 18 21:57:36 2012 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 18 Jan 2012 21:57:36 -0800 Subject: [Live-devel] Receiving RTP packets by a RTSP Client app In-Reply-To: References: <3C3A57F1-A885-4777-82BA-60747D97654E@live555.com> Message-ID: > I have a low end device with Live555 RTSP server serving live H264 stream. > > My application needs acts as a RTSP client and do the following 2 things > a) Store this H264 data in mp4 format. > b) Re - stream the H264 data to another application(can be more than one) for a doing some other processing. > > feature a) can be directly done using the current Live555 library. > To implement feature b) one option is , my application has to packetize the elementary H264 data into RTP and stream this data on a UDP socket. OK, this is completely different to what you were asking about before. You don't need to 'receive RTP packets' at all. You can do what you want to do by using the new "StreamReplicator" class (see the "testReplicator" demo application for hints on how to use this). First, take the incoming H.264 NAL unit stream from your RTSP client (i.e., from "subsession->readSource()") and feed it into a new "StreamReplicator" object. Then create two 'replica' streams (using "createStreamReplica()") from the "StreamReplicator" object. Use one replica for your 'mp4' file output (i.e., a "QuickTimeFileSink"). Use the other replica for streaming (using a "H264VideoStreamDiscreteFramer" fed into a "H264VideoRTPSink"). Call "startPlaying()" on each of these 'sink' objects before entering the event loop. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From isambhav at gmail.com Thu Jan 19 06:45:35 2012 From: isambhav at gmail.com (Sambhav) Date: Thu, 19 Jan 2012 20:15:35 +0530 Subject: [Live-devel] Receiving RTP packets by a RTSP Client app In-Reply-To: References: <3C3A57F1-A885-4777-82BA-60747D97654E@live555.com> Message-ID: Hi Ross, I am trying to implement my app as suggested in the previous mail. In the testRTSPClient app, after calling "rtspClient->sendDescribeCommand(continueAfterDESCRIBE); " the program enters the doEventLoop. Rest of the processing is done in asynchronous callbacks. you mentioned >> Call "startPlaying()" on each of these 'sink' objects before entering the doEventLoop. StreamReplicator cannot be initialized before entering doEventLoop. (Am I understanding it wrong here) I called startPlaying() on the sink objects in continueAfterPLAY function and got this error. "FramedSource[0x10b001ad0]::getNextFrame(): attempting to read more than once at the same time!" Regards, Sambhav On Thu, Jan 19, 2012 at 11:43 AM, Sambhav wrote: > Thanks a lot. This is exactly what I need for my application. > > On Thu, Jan 19, 2012 at 11:27 AM, Ross Finlayson wrote: > >> I have a low end device with Live555 RTSP server serving live H264 >> stream. >> >> My application needs acts as a RTSP client and do the following 2 things >> a) Store this H264 data in mp4 format. >> b) Re - stream the H264 data to another application(can be more than >> one) for a doing some other processing. >> >> feature a) can be directly done using the current Live555 library. >> To implement feature b) one option is , my application has to packetize >> the elementary H264 data into RTP and stream this data on a UDP socket. >> >> >> OK, this is completely different to what you were asking about before. 
>> You don't need to 'receive RTP packets' at all. >> >> You can do what you want to do by using the new "StreamReplicator" class >> (see the "testReplicator" demo application for hints on how to use this). >> >> First, take the incoming H.264 NAL unit stream from your RTSP client >> (i.e., from "subsession->readSource()") and feed it into a new >> "StreamReplicator" object. >> >> Then create two 'replica' streams (using "createStreamReplica()") from >> the "StreamReplicator" object. >> >> Use one replica for your 'mp4' file output (i.e., a "QuickTimeFileSink"). >> Use the other replica for streaming (using a >> "H264VideoStreamDiscreteFramer" fed into a "H264VideoRTPSink"). Call >> "startPlaying()" on each of these 'sink' objects before entering the event >> loop. >> >> >> Ross Finlayson >> Live Networks, Inc. >> http://www.live555.com/ >> >> >> _______________________________________________ >> live-devel mailing list >> live-devel at lists.live555.com >> http://lists.live555.com/mailman/listinfo/live-devel >> >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Jan 19 07:00:12 2012 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 19 Jan 2012 07:00:12 -0800 Subject: [Live-devel] Receiving RTP packets by a RTSP Client app In-Reply-To: References: <3C3A57F1-A885-4777-82BA-60747D97654E@live555.com> Message-ID: > I am trying to implement my app as suggested in the previous mail. > > In the testRTSPClient app, after calling "rtspClient->sendDescribeCommand(continueAfterDESCRIBE); " the program enters the doEventLoop. > Rest of the processing is done in asynchronous callbacks. > > you mentioned >> Call "startPlaying()" on each of these 'sink' objects before entering the doEventLoop. > StreamReplicator cannot be initialized before entering doEventLoop. Yes, I misspoke here. Calling "startPlaying()" from within an event handler (as we currently do in "continueAfterSETUP()" in the "testRTSPClient" code) is OK. > I called startPlaying() on the sink objects in continueAfterPLAY function and got this error. > "FramedSource[0x10b001ad0]::getNextFrame(): attempting to read more than once at the same time!" That error message means exactly what it says: You're trying to read from some object more than once at the same time, which you can't do. Make sure you didn't accidentally call "startPlaying()" more than once on the same 'sink' object. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From Jussi.Hattara at turkuamk.fi Fri Jan 20 00:26:03 2012 From: Jussi.Hattara at turkuamk.fi (Hattara Jussi) Date: Fri, 20 Jan 2012 10:26:03 +0200 Subject: [Live-devel] Forwarding RTSP with live555 on demand Message-ID: Hello, I have an AXIS IP-camera that can output either an MJPEG or MPEG-4 stream over at least RTSP. I was wondering if it's possible to use live555 on a server to read the stream from the camera and then forward it over RTSP on demand? -- Jussi Hattara Projektisuunnittelija, Ins. (AMK) / Project Planner, B.Eng Turun ammattikorkeakoulu / Turku University of Applied Sciences Tekniikka, ympäristö ja talous / Technology, Environment and Business Sepänkatu 1, 20700 Turku, Finland Tel. +358 44 907 2078 jussi.hattara at turkuamk.fi www.turkuamk.fi / www.tuas.fi -------------- next part -------------- An HTML attachment was scrubbed...
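Returning to the "StreamReplicator" recipe quoted above, here is a condensed sketch of the two-replica setup (names such as "replicator" and "rtpGroupsock" are illustrative, and error handling is omitted):

    FramedSource* inStream = subsession->readSource(); // from the RTSP client

    StreamReplicator* replicator = StreamReplicator::createNew(*env, inStream);

    // Replica 1: hand this one to your file-writing sink (Ross suggests a
    // "QuickTimeFileSink" for the .mp4 output):
    FramedSource* fileReplica = replicator->createStreamReplica();

    // Replica 2: re-stream over RTP:
    FramedSource* streamReplica = replicator->createStreamReplica();
    H264VideoStreamDiscreteFramer* framer
      = H264VideoStreamDiscreteFramer::createNew(*env, streamReplica);
    H264VideoRTPSink* rtpSink
      = H264VideoRTPSink::createNew(*env, rtpGroupsock, 96 /*dynamic payload type*/);
    rtpSink->startPlaying(*framer, NULL, NULL);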
URL: From finlayson at live555.com Fri Jan 20 01:02:47 2012 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 20 Jan 2012 01:02:47 -0800 Subject: [Live-devel] Forwarding RTSP with live555 on demand In-Reply-To: References: Message-ID: <269424A8-DF2D-483A-B0F1-83A8F061CDDB@live555.com> > I have an AXIS IP-camera that can output either MJPEG or MPEG-4 stream over at least RTSP. I was wondering if it?s possible to use live555 on a server to read the stream from the camera and then forward it over RTSP on demand? This sort of thing (basically, RTSP proxying) is something that is often requested. I think it should be possible with a little programming - by using the output from a RTSP client as the input to a RTSP server - but it's not something that's provided with the supplied code 'as is'. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From michel.promonet at thalesgroup.com Fri Jan 20 04:34:33 2012 From: michel.promonet at thalesgroup.com (PROMONET Michel) Date: Fri, 20 Jan 2012 13:34:33 +0100 Subject: [Live-devel] use of unitialized memory in our_MD5End Message-ID: <25520_1327062881_4F195F61_25520_17724_1_1BE8971B6CFF3A4F97AF4011882AA2550155F99ABEE5@THSONEA01CMS01P.one.grp> Hi, Valgrind report use of memory that is not initialized in our_MD5End. Use of uninitialised value of size 8 (see: http://valgrind.org/docs/manual/mc-manual.html#mc-manual.uninitvals) at 0xE5558F: our_MD5End (our_md5hl.c:34) by 0xE55751: our_MD5Data (our_md5hl.c:67) by 0xE54104: Authenticator::setRealmAndRandomNonce(char const*) (DigestAuthentication.cpp:79) by 0xE19552: RTSPServer::RTSPClientSession::authenticationOK(char const*, char const*, char const*, char const*) (RTSPServer.cpp:1511) by 0xE164FD: RTSPServer::RTSPClientSession::handleCmd_DESCRIBE(char const*, char const*, char const*, char const*) (RTSPServer.cpp:611) Adding in Authenticator::setRealmAndRandomNonce memset(&seedData,0,sizeof seedData); seems to force this initialization. What do you think ? Best Regards, Michel. [@@THALES GROUP RESTRICTED@@] -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Jan 20 06:18:19 2012 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 20 Jan 2012 06:18:19 -0800 Subject: [Live-devel] use of unitialized memory in our_MD5End In-Reply-To: <25520_1327062881_4F195F61_25520_17724_1_1BE8971B6CFF3A4F97AF4011882AA2550155F99ABEE5@THSONEA01CMS01P.one.grp> References: <25520_1327062881_4F195F61_25520_17724_1_1BE8971B6CFF3A4F97AF4011882AA2550155F99ABEE5@THSONEA01CMS01P.one.grp> Message-ID: <95DDF4CF-B2DB-40FB-8C4F-3F267B5DE2BA@live555.com> > What do you think ? I think that "valgrind" is incorrect in this case. In the implementation of "Authenticator::setRealmAndRandomNonce()", both fields of the "seedData" structure are initialized. (The "timestamp" field is initialized via the call to "gettimeofday()"; the "counter" field is set as well.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
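A side note on why valgrind can complain here even though both fields are assigned: if the compiler inserts padding inside (or after) the struct, those padding bytes remain uninitialized when the whole struct is fed to MD5. A sketch (the field names follow the discussion above; the actual live555 layout may differ):

    struct {
      struct timeval timestamp;
      unsigned counter;
    } seedData; // sizeof may include padding bytes the fields never touch

    memset(&seedData, 0, sizeof seedData);   // zeroes any padding bytes too
    gettimeofday(&seedData.timestamp, NULL); // then initialize the real fields
    seedData.counter = ++ourCounter;         // "ourCounter" is illustrative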
From Jussi.Hattara at turkuamk.fi Fri Jan 20 06:30:36 2012 From: Jussi.Hattara at turkuamk.fi (Hattara Jussi) Date: Fri, 20 Jan 2012 16:30:36 +0200 Subject: [Live-devel] Forwarding RTSP with live555 on demand In-Reply-To: <269424A8-DF2D-483A-B0F1-83A8F061CDDB@live555.com> References: <269424A8-DF2D-483A-B0F1-83A8F061CDDB@live555.com> Message-ID: Unfortunately I'm not fluent with C++ and it would take me way too much time to figure out what to do with the code. I already managed to store the feed into a file and transmit that file over to the RTSP server, but that's where my skills end. Fortunately I managed to get this working using Darwin Streaming Server, although I'd rather use live555, if possible. -- Jussi Hattara Projektisuunnittelija, Ins. (AMK) / Project Planner, B.Eng Turun ammattikorkeakoulu / Turku University of Applied Sciences Tekniikka, ympäristö ja talous / Technology, Environment and Business Sepänkatu 1, 20700 Turku, Finland Tel. +358 44 907 2078 jussi.hattara at turkuamk.fi www.turkuamk.fi / www.tuas.fi From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: 20 January 2012 11:03 To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Forwarding RTSP with live555 on demand I have an AXIS IP camera that can output either an MJPEG or an MPEG-4 stream over (at least) RTSP. I was wondering if it's possible to use live555 on a server to read the stream from the camera and then forward it over RTSP on demand? This sort of thing (basically, RTSP proxying) is something that is often requested. I think it should be possible with a little programming - by using the output from a RTSP client as the input to a RTSP server - but it's not something that's provided with the supplied code 'as is'. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From liam at europa-network.com Fri Jan 20 06:35:41 2012 From: liam at europa-network.com (Liam Carter) Date: Fri, 20 Jan 2012 15:35:41 +0100 Subject: [Live-devel] RTSP a Live Stream to an AMINO A125 Message-ID: <4F197BBD.5030204@europa-network.com> Good Afternoon. I have been given the task of getting LIVE555 to work with our current system. You have communicated with my colleague Ben Wheway (See Below). I have read the email below and I am quite confused. Could I have a few more pointers please. We have managed to get a test.ts file to stream to an Amino A125 using ./testOnDemandRTSPServer and the file is contained in the same directory. We point the Amino (using a custom html page) to http://xxx.xxx.xxx.xxx:xxxx/mpeg2TransportStreamTest I am stuck on the ingest. I have tried to find an example of the "createNewStreamSource()" function so I can copy and re-write it. I have however not found an example. Could you advise further please. Thanks for any help in advance. Many Regards Liam *From:*live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] *On Behalf Of *Ross Finlayson *Sent:* 16 December 2011 02:57 *To:* LIVE555 Streaming Media - development & use *Subject:* Re: [Live-devel] Multicast to rtsp with Amino A125 We have many Amino A125 STBs. The server we currently use to RTSP to them is outdated and we need a new system to stream to them. Your software appears to be able to do this but I'm struggling to find guides to achieve this.
Here is a background of what we have, in terms of streams etc: We multicast from our encoders to our current streaming server. This server then RTSP's out to the Amino STB. The multicast input stream is MPEG4/h264 TS UDP. We then RTSP over UDP unicast out to the STB. So we need to input UDP multicast to the live555 server and then RTSP UDP unicast to the Amino STB. Yes, you should be able to do this fairly easily. I suggest using the "testOnDemandRTSPServer" demo application as a model; note, in particular, the code for streaming Transport Stream data (lines 215 through 218 of "testProgs/testOnDemandRTSPServer.cpp"). The one change that you'll need to make to this code is that rather than adding a "MPEG2TransportFileServerMediaSubsession" to the "ServerMediaSession" object, you'll be adding an object of a different "OnDemandServerMediaSubsession" - one that you will write yourself. In fact, I suggest that you subclass "MPEG2TransportFileServerMediaSubsession". If you do that, then you will need only to redefine the "createNewStreamSource()" virtual function. In your subclass's constructor, when it calls the parent class ("MPEG2TransportFileServerMediaSubsession")'s constructor, you should set the "fileName" and "indexFile" parameters to NULL, and set "reuseFirstSource" to True. (This tells the server to use the same input source object, even if more than one client is streaming from the server concurrently.) Your subclass's "createNewStreamSource()" virtual function can be quite simple - basically just creating a "groupsock" for your IP multicast address, and then creating a "BasicUDPSource" using that "groupsock" object. I suggest looking at the "testRelay" demo application code for a hint about how to do this. (Because your input is Transport Stream data packed into UDP packets, I don't think that you'll need a separate 'framer' object in front of the "BasicUDPSource" object. Instead, you'll probably be able to transfer the contents of each incoming UDP multicast packet directly into output UDP unicast packets. The method that I've outlined above should do that.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From liam at europa-network.com Fri Jan 20 06:50:46 2012 From: liam at europa-network.com (Liam Carter) Date: Fri, 20 Jan 2012 15:50:46 +0100 Subject: [Live-devel] RTSP a Live Stream to an AMINO A125 In-Reply-To: <4F197BBD.5030204@europa-network.com> References: <4F197BBD.5030204@europa-network.com> Message-ID: <4F197F46.9040405@europa-network.com> Sorry for the error; the correct address is rtsp://xxx.xxx.xxx.xxx:xxxx/mpeg2TransportStreamTest Regards Liam On 01/20/2012 03:35 PM, Liam Carter wrote: > Good Afternoon. > > I have been given the task of getting LIVE555 to work with our current > system. > > You have communicated with my colleague Ben Wheway (See Below). > > I have read the email below and I am quite confused. > > Could I have a few more pointers please. > > We have managed to get a test.ts file to stream to an Amino A125 using > ./testOnDemandRTSPServer and the file is contained in the same directory. > > We point the Amino (using a custom html page) to > http://xxx.xxx.xxx.xxx:xxxx/mpeg2TransportStreamTest > > I am stuck on the ingest. I have tried to find an example of the > "createNewStreamSource()" function so I can copy and re-write it. I > have however not found an example. > > Could you advise further please. > > Thanks for any help in advance.
> > Many Regards > > Liam > > *From:*live-devel-bounces at ns.live555.com > [mailto:live-devel-bounces at ns.live555.com] *On Behalf Of *Ross Finlayson > *Sent:* 16 December 2011 02:57 > *To:* LIVE555 Streaming Media - development & use > *Subject:* Re: [Live-devel] Multicast to rtsp with Amino A125 > > We have many Amino A125 STBs. The server we currently use to RTSP > to them is outdated and we need a new system to stream to them. > Your software appears to be able to do this but I'm struggling to > find guides to achieve this. Here is a background of what we have, > in terms of streams etc: > > We multicast from our encoders to our current streaming server. > This server then RTSP's out to the Amino STB. The multicast input > stream is MPEG4/h264 TS UDP. We then RTSP over UDP unicast out to > the STB. > > So we need to input UDP multicast to the live555 server and then RTSP > UDP unicast to the Amino STB. > > Yes, you should be able to do this fairly easily. I suggest using the > "testOnDemandRTSPServer" demo application as a model; note, in > particular, the code for streaming Transport Stream data (lines 215 > through 218 of "testProgs/testOnDemandRTSPServer.cpp"). > > The one change that you'll need to make to this code is that rather > than adding a "MPEG2TransportFileServerMediaSubsession" to the > "ServerMediaSession" object, you'll be adding an object of a different > "OnDemandServerMediaSubsession" - one that you will write yourself. > In fact, I suggest that you subclass > "MPEG2TransportFileServerMediaSubsession". If you do that, then you > will need only to redefine the "createNewStreamSource()" virtual > function. In your subclass's constructor, when it calls the parent > class ("MPEG2TransportFileServerMediaSubsession")'s constructor, you > should set the "fileName" and "indexFile" parameters to NULL, and set > "reuseFirstSource" to True. (This tells the server to use the same > input source object, even if more than one client is streaming from > the server concurrently.) > > Your subclass's "createNewStreamSource()" virtual function can be > quite simple - basically just creating a "groupsock" for your IP > multicast address, and then creating a "BasicUDPSource" using that > "groupsock" object. I suggest looking at the "testRelay" demo > application code for a hint about how to do this. > > (Because your input is Transport Stream data packed into UDP packets, > I don't think that you'll need a separate 'framer' object in front of > the "BasicUDPSource" object. Instead, you'll probably be able to > transfer the contents of each incoming UDP multicast packet directly > into output UDP unicast packets. The method that I've outlined above > should do that.) > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Jan 20 07:10:00 2012 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 20 Jan 2012 07:10:00 -0800 Subject: [Live-devel] RTSP a Live Stream to an AMINO A125 In-Reply-To: <4F197BBD.5030204@europa-network.com> References: <4F197BBD.5030204@europa-network.com> Message-ID: <1ACBFA8A-8653-4D69-B1E4-D46770634685@live555.com> > I have been given the task of getting LIVE555 to work with our current system. > > You have communicated with my colleague Ben Wheway (See Below).
> I have read the email below and I am quite confused. > > Could I have a few more pointers please. Unfortunately I can give only limited help to people who are "confused". To use this software, you need to be well-versed in C++ programming, have read the FAQ (especially, in your case, ), and, in your case, understand what the code for the "testOnDemandRTSPServer" demo application does. Fortunately, another developer recently contributed code that does what (I think) you are looking to do: Take an input Transport Stream UDP source, and stream it - using RTSP - to one or more clients. (The one change that you would make to this is change the "inputAddressStr" variable to a string that contains your desired input IP multicast address, and change the port number in "inputPort".) See http://lists.live555.com/pipermail/live-devel/2012-January/014454.html > I am stuck on the ingest. I have tried to find an example of the "createNewStreamSource()" function so I can copy and re-write it. I have however not found an example. You haven't looked very hard. Look again. (Hint: the "liveMedia" directory.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From liam at europa-network.com Fri Jan 20 07:46:39 2012 From: liam at europa-network.com (Liam Carter) Date: Fri, 20 Jan 2012 16:46:39 +0100 Subject: [Live-devel] RTSP a Live Stream to an AMINO A125 In-Reply-To: <1ACBFA8A-8653-4D69-B1E4-D46770634685@live555.com> References: <4F197BBD.5030204@europa-network.com> <1ACBFA8A-8653-4D69-B1E4-D46770634685@live555.com> Message-ID: <4F198C5F.7000306@europa-network.com> Ross. Thanks for the prompt reply. I have no experience in C++; typically I mainly use PHP. I have read the link and implemented the test, but the class file was removed from the thread. Do you by any chance have the class file? Regards Liam On 01/20/2012 04:10 PM, Ross Finlayson wrote: >> I have been given the task of getting LIVE555 to work with our >> current system. >> >> You have communicated with my colleague Ben Wheway (See Below). >> >> I have read the email below and I am quite confused. >> >> Could I have a few more pointers please. > > Unfortunately I can give only limited help to people who are > "confused". To use this software, you need to be well-versed in C++ > programming, have read the FAQ (especially, in your case, > ), and, > in your case, understand what the code for the > "testOnDemandRTSPServer" demo application does. > > Fortunately, another developer recently contributed code that does > what (I think) you are looking to do: Take an input Transport Stream > UDP source, and stream it - using RTSP - to one or more clients. (The > one change that you would make to this is change the "inputAddressStr" > variable to a string that contains your desired input IP multicast > address, and change the port number in "inputPort".) > > See http://lists.live555.com/pipermail/live-devel/2012-January/014454.html > > >> I am stuck on the ingest. I have tried to find an example of the >> "createNewStreamSource()" function so I can copy and re-write it. I >> have however not found an example. > > You haven't looked very hard. Look again. (Hint: the "liveMedia" > directory.) > > > Ross Finlayson > Live Networks, Inc.
> http://www.live555.com/ > > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Jan 20 07:58:18 2012 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 20 Jan 2012 07:58:18 -0800 Subject: [Live-devel] RTSP a Live Stream to an AMINO A125 In-Reply-To: <4F198C5F.7000306@europa-network.com> References: <4F197BBD.5030204@europa-network.com> <1ACBFA8A-8653-4D69-B1E4-D46770634685@live555.com> <4F198C5F.7000306@europa-network.com> Message-ID: <7E38D5AE-1707-4997-8F4D-5A0C7F73C447@live555.com> > I have no experience in C++; typically I mainly use PHP. Then this software is not for you. I'm sorry to sound harsh, but I'm just telling it as it is. > I have read the link and implemented the test, but the class file was removed from the thread. Do you by any chance have the class file? There's an attachment linked at the end of this message: http://lists.live555.com/pipermail/live-devel/2012-January/014454.html It contains the files that you're looking for. (I'll let you figure out how to extract them.) (This will be my last posting on this topic.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From Aleksandar.Milenkovic at rt-rk.com Fri Jan 20 09:05:04 2012 From: Aleksandar.Milenkovic at rt-rk.com (Aleksandar Milenkovic) Date: Fri, 20 Jan 2012 18:05:04 +0100 Subject: [Live-devel] General tips would be appreciated Message-ID: <4F199EC0.9050909@rt-rk.com> Hi all. I didn't really know how to name this, hope nobody minds.... Anyway, I'm getting into live555 code. I compiled the whole thing as a library, ported the mediaServer to Android and now comes the interesting part. A Java app sends a byte[] through JNI to a C routine. The C part should have the mediaserver embedded into it, and in such a way that the mediaserver can be controlled just like a real object (I know this doesn't make sense yet, but it will). That means the server should behave like 'expected' upon calling the wrapper's Init(), Start/Stop(), Deinit()... So far I've got some slight understanding of the UsageEnvironment classes, but I'm having slight problems on how to break it up. I could use an explanation of what it does etc, a background story. So far, env->doEventLoop(shouldIQuit); will do the trick, I hope. On top of that, I want to use the passed-in byte[] for streaming, not files. I found this - ByteStreamMemoryBufferSource, so I'll try to dump the byte[] into that and feed that to the mediaServer; Basically, I'm not-so-politely asking how to break up the Media Server app into pieces; this approach sounds feasible but might be impractical or there might be better ways to accomplish this. That's why I'm asking you if you've got any tips for the new guy here :) I'd like to avoid reimplementing UsageEnvironments and TaskSchedulers if possible, and only delete/move the code around :) Kind regards, Aleksandar -- *Aleksandar Milenkovic* Software Engineer Phone: +381-(0)21-4801-139 Fax: +381-(0)21-450-721 Mobile: +381-(0)64-31-666-82 E-mail: Aleksandar.Milenkovic at rt-rk.com RT-RK Computer Based Systems LLC Fruskogorska 11 21000 Novi Sad, Serbia www.rt-rk.com *RT-RK* invites you to visit us @ *IBC2011*, September 9-13 2011, stand *5.A01*, Amsterdam RAI. For more information please visit www.bbt.rs
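Since the question above is essentially "how do I start and stop the event loop from a wrapper", here is a minimal sketch of the usual pattern (the Init/Start/Stop names are the poster's; the 'watch variable' is the mechanism that "doEventLoop()" already provides, and per the FAQ, setting that variable is the one thing another thread may safely do):

char shouldQuit = 0; // the event loop's 'watch variable'

void runServer() {
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);
  // ... create the RTSPServer and its ServerMediaSessions here,
  //     as in "testOnDemandRTSPServer" ...
  env->taskScheduler().doEventLoop(&shouldQuit); // returns once shouldQuit != 0
  // ... close the Media objects, then env->reclaim() and delete scheduler ...
}

void stopServer() { shouldQuit = 1; } // e.g., called from the JNI side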
-------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Jan 20 13:14:39 2012 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 20 Jan 2012 13:14:39 -0800 Subject: [Live-devel] General tips would be appreciated In-Reply-To: <4F199EC0.9050909@rt-rk.com> References: <4F199EC0.9050909@rt-rk.com> Message-ID: <8C170A7A-DCB8-4670-89AF-71FAD75D08E9@live555.com> > Basically, I'm not-so-politely asking how to break up the Media Server app into pieces The "LIVE555 Media Server" application is meant to be a standalone application, for use - unmodified - by end users. In particular, it was not intended to be used as a model for programmers who are developing their own RTSP server-based applications. For that purpose, you should instead look at the code for "testOnDemandRTSPServer" (in the "testProgs" directory). And you should read the FAQ (as you were asked to do before posting to the mailing list). It contains at least one entry that relates (I think) to what you are trying to do. > I'd like to avoid reimplementing UsageEnvironments and TaskSchedulers if possible, and only delete/move the code around :) This code is Open Source; you can do whatever you like with it (subject, of course, to the LGPL). However, if you choose to modify the supplied source code (rather than extending it via subclassing), you are far less likely to get support on this mailing list. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From 6.45.vapuru at gmail.com Sat Jan 21 01:37:36 2012 From: 6.45.vapuru at gmail.com (Novalis Vapuru) Date: Sat, 21 Jan 2012 11:37:36 +0200 Subject: [Live-devel] Suggestions for RTSPClient [ testRTSPClient example] Message-ID: As a user of your library, I have some problems with your RTSPClient while integrating it into my app. So I decided to write down my "pains" and my "ideal" client example for testRTSPClient -- What do I need to create a client? Do I have to know "env" or the "task scheduler"? string rtspUrl = "rtsp://..../video.h264" string userName = "user"; string password = "password" // This factory automatically creates a single environment object // pass this reference to the newly created client and // start env->taskScheduler().doEventLoop(&watchVariable) in a new thread // so that it will not block anything OurRTSPClient* ourClient = ClientFactory::GiveMeNewClient(rtspUrl, userName, password); If I need "env" then I can get it from our client or ClientFactory... -- Do I need the details of the RTSP protocol? ourClient->Start(); // send "describe" request and if successful, then continue ourClient->Stop(); // just send teardown and close all sockets for that client -- Do I need to worry about the "dirty details" of deleting/closing resources? // Just delete the object // Anyway, we are using C++ delete ourClient; -- At the end, I want to close everything // stop do event loop // delete env and scheduler ClientFactory::CloseEnv(); -- What if something goes wrong? Why not use exceptions?
try { OurRTSPClient* ourClient = ClientFactory::GiveMeNewClient(rtspUrl, userName, password); ourClient->Start(); // send "describe" request and if successful, then continue Sleep(30); ourClient->Stop(); // just send teardown and close all sockets for that client delete ourClient; ourClient = NULL; ClientFactory::CloseEnv(); } catch(const Live555CannotGetSDPDescription& e) { // maybe I create a new client and try to re-connect } catch(const Live555ServerConnectionGone& e) { //.... } catch(const Live555CanNotCloseEnvironment& e) { //.... } Best Wishes From Anarchist666 at yandex.ru Sat Jan 21 01:48:59 2012 From: Anarchist666 at yandex.ru (Fazleev Maksim) Date: Sat, 21 Jan 2012 13:48:59 +0400 Subject: [Live-devel] x264 nal units and H264VideoStreamDiscreteFramer Message-ID: <349751327139339@web107.yandex.ru> libx264 returns a pointer to an array of NAL units during encoding using the function x264_encoder_encode. But I can't feed them to H264VideoStreamDiscreteFramer. My code in DeviceSource: ... newFrameDataStart = (u_int8_t*)(nals[current_nal].p_payload); newFrameSize = nals[current_nal].i_payload; printf("nal type = %i\n", nals[current_nal].i_type); ... printf("newFrameSize = %i fTo[0]&0x1F = %i\n", newFrameSize, fTo[0]&0x1F); Output: nal type = 7 newFrameSize = 28 fTo[0]&0x1F = 0 Warning: Invalid 'nal_unit_type': 0. Does the NAL unit begin with a MPEG 'start code' by mistake? nal type = 8 newFrameSize = 8 fTo[0]&0x1F = 0 Warning: Invalid 'nal_unit_type': 0. Does the NAL unit begin with a MPEG 'start code' by mistake? nal type = 6 newFrameSize = 575 fTo[0]&0x1F = 0 Warning: Invalid 'nal_unit_type': 0. Does the NAL unit begin with a MPEG 'start code' by mistake? nal type = 5 newFrameSize = 5170 fTo[0]&0x1F = 0 Warning: Invalid 'nal_unit_type': 0. Does the NAL unit begin with a MPEG 'start code' by mistake? nal type = 1 newFrameSize = 344 fTo[0]&0x1F = 0 Warning: Invalid 'nal_unit_type': 0. Does the NAL unit begin with a MPEG 'start code' by mistake? From finlayson at live555.com Sat Jan 21 13:12:00 2012 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 21 Jan 2012 13:12:00 -0800 Subject: [Live-devel] Suggestions for RTSPClient [ testRTSPClient example] In-Reply-To: References: Message-ID: <195811F9-CB1A-4445-9FAE-0158D20A9C78@live555.com> I'm not sure I totally understand what you're asking/suggesting (but since you use a non-professional email address, I really don't care :-). But it sounds like you want to perform RTSP client operations in a 'synchronous' way, whereby you block - waiting for the response to each RTSP request - before sending the next request. We actually provide an optional alternative "RTSPClient" interface that lets you program RTSP clients in this manner. Look for RTSPCLIENT_SYNCHRONOUS_INTERFACE in "liveMedia/include/RTSPClient.hh". Be warned, however, that this 'synchronous' interface is not recommended, and will someday likely be removed. Although it arguably makes it easier to program a single RTSP client (that does nothing else), it makes it difficult to implement multiple concurrent RTSP clients (or other concurrent activities in addition to a single RTSP client). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL:
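One note on the blocking "Sleep(30); ourClient->Stop();" idea in the sketch above: in the library's event-driven model, the same timing is normally expressed as a delayed task instead, so that nothing blocks the event loop. A minimal sketch (the "OurRTSPClient"/"Stop()" wrapper names are the poster's hypothetical API, not part of the library):

void stopClient(void* clientData) {
  OurRTSPClient* ourClient = (OurRTSPClient*)clientData;
  ourClient->Stop(); // send TEARDOWN and close this client's sockets
}

// Instead of Sleep(30): run "stopClient" 30 seconds from now, inside the event loop:
env->taskScheduler().scheduleDelayedTask(30*1000000 /*microseconds*/,
                                         (TaskFunc*)stopClient, ourClient);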
From finlayson at live555.com Sat Jan 21 13:20:46 2012 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 21 Jan 2012 13:20:46 -0800 Subject: [Live-devel] x264 nal units and H264VideoStreamDiscreteFramer In-Reply-To: <349751327139339@web107.yandex.ru> References: <349751327139339@web107.yandex.ru> Message-ID: <26DE2E4D-6C51-4E5B-A9B5-BF55285C7826@live555.com> > Warning: Invalid 'nal_unit_type': 0. Does the NAL unit begin with a MPEG 'start code' by mistake? This message means exactly what it says (and you can very easily see - in the "H264VideoStreamDiscreteFramer" source code - where/why it's displayed). Remember, You Have Complete Source Code. Most likely, the H.264 NAL units that you are (trying to) feed into the "H264VideoStreamDiscreteFramer" begin with a MPEG 'start code' (0x00000001 or 0x000001). You must not do this. Instead, you need to skip over these 'start code' bytes before feeding the NAL unit data into the "H264VideoStreamDiscreteFramer". (Remember also to reduce "fFrameSize" accordingly.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From Anarchist666 at yandex.ru Sat Jan 21 22:50:07 2012 From: Anarchist666 at yandex.ru (Fazleev Maksim) Date: Sun, 22 Jan 2012 10:50:07 +0400 Subject: [Live-devel] x264 nal units and H264VideoStreamDiscreteFramer In-Reply-To: <26DE2E4D-6C51-4E5B-A9B5-BF55285C7826@live555.com> References: <349751327139339@web107.yandex.ru> <26DE2E4D-6C51-4E5B-A9B5-BF55285C7826@live555.com> Message-ID: <802711327215007@web23.yandex.ru> An HTML attachment was scrubbed... URL: From finlayson at live555.com Sat Jan 21 23:30:52 2012 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 21 Jan 2012 23:30:52 -0800 Subject: [Live-devel] x264 nal units and H264VideoStreamDiscreteFramer In-Reply-To: <802711327215007@web23.yandex.ru> References: <349751327139339@web107.yandex.ru> <26DE2E4D-6C51-4E5B-A9B5-BF55285C7826@live555.com> <802711327215007@web23.yandex.ru> Message-ID: > There is also an option: int b_repeat_headers; /* put SPS/PPS before each keyframe */. Should it be set to 0? I suggest setting this to 1, to ensure that the downstream "H264VideoStreamDiscreteFramer" gets to see the SPS and PPS NAL units at least once. (This is important, otherwise the RTSP server won't be able to compute the proper 'configuration' string to put in the stream's SDP description.) Ross.
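Pulling the advice in this thread together, here is a sketch of the copy step inside a "doGetNextFrame()" implementation, with the Annex B start code stripped before the NAL unit is handed downstream (x264 emits 4-byte start codes on SPS/PPS and 3- or 4-byte ones elsewhere, so both are checked; "nal" is assumed to be one x264_nal_t returned by x264_encoder_encode()):

u_int8_t* p = nal.p_payload;
unsigned len = nal.i_payload;
if (len >= 4 && p[0] == 0 && p[1] == 0 && p[2] == 0 && p[3] == 1) { p += 4; len -= 4; }
else if (len >= 3 && p[0] == 0 && p[1] == 0 && p[2] == 1) { p += 3; len -= 3; }
if (len > fMaxSize) { fNumTruncatedBytes = len - fMaxSize; len = fMaxSize; }
fFrameSize = len;
memmove(fTo, p, fFrameSize);
// (fTo[0] & 0x1F) is now the real nal_unit_type: 7 = SPS, 8 = PPS, 5 = IDR, 1 = non-IDR
FramedSource::afterGetting(this);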
From Anarchist666 at yandex.ru Sun Jan 22 01:55:50 2012 From: Anarchist666 at yandex.ru (Fazleev Maksim) Date: Sun, 22 Jan 2012 13:55:50 +0400 Subject: [Live-devel] x264 nal units and H264VideoStreamDiscreteFramer In-Reply-To: References: <349751327139339@web107.yandex.ru> <26DE2E4D-6C51-4E5B-A9B5-BF55285C7826@live555.com> <802711327215007@web23.yandex.ru> Message-ID: <10501327226150@web154.yandex.ru> If I set it to 1, then there is a SIGSEGV and this output: referenceCount = 1 nals[i].i_type = 7 newFrameSize = 23 fTo[0]&0x1F = 7 nals[i].i_type = 8 newFrameSize = 4 fTo[0]&0x1F = 8 nals[i].i_type = 6 newFrameSize = 571 fTo[0]&0x1F = 5 nals[i].i_type = 5 newFrameSize = 39523 fTo[0]&0x1F = 8 referenceCount = 0 If I set it to 0, then the output is: referenceCount = 1 nals[i].i_type = 5 newFrameSize = 39565 fTo[0]&0x1F = 5 nals[i].i_type = 1 newFrameSize = 8263 fTo[0]&0x1F = 1 nals[i].i_type = 1 newFrameSize = 5403 fTo[0]&0x1F = 1 nals[i].i_type = 5 newFrameSize = 13161 fTo[0]&0x1F = 5 nals[i].i_type = 1 newFrameSize = 4298 fTo[0]&0x1F = 1 nals[i].i_type = 1 newFrameSize = 3844 fTo[0]&0x1F = 1 nals[i].i_type = 5 newFrameSize = 13416 fTo[0]&0x1F = 5 nals[i].i_type = 1 newFrameSize = 3557 fTo[0]&0x1F = 1 .. Sources are attached. 22.01.2012, 11:30, "Ross Finlayson" : >> There is also an option: int b_repeat_headers; /* put SPS/PPS before each keyframe */. Should it be set to 0? > > I suggest setting this to 1, to ensure that the downstream "H264VideoStreamDiscreteFramer" gets to see the SPS and PPS NAL units at least once. (This is important, otherwise the RTSP server won't be able to compute the proper 'configuration' string to put in the stream's SDP description.) > > Ross. > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- A non-text attachment was scrubbed... Name: config.h Type: application/octet-stream Size: 9193 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: H264VideoCamServerMediaSubsession.cpp Type: application/octet-stream Size: 3750 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: H264VideoCamServerMediaSubsession.hh Type: application/octet-stream Size: 1189 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: H264VideoCamSource.cpp Type: application/octet-stream Size: 8731 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: H264VideoCamSource.hh Type: application/octet-stream Size: 1542 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed...
Name: main.cpp Type: application/octet-stream Size: 2740 bytes Desc: not available URL: From finlayson at live555.com Sun Jan 22 03:48:56 2012 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 22 Jan 2012 03:48:56 -0800 Subject: [Live-devel] x264 nal units and H264VideoStreamDiscreteFramer In-Reply-To: <10501327226150@web154.yandex.ru> References: <349751327139339@web107.yandex.ru> <26DE2E4D-6C51-4E5B-A9B5-BF55285C7826@live555.com> <802711327215007@web23.yandex.ru> <10501327226150@web154.yandex.ru> Message-ID: <3E353F59-9571-401F-9124-E51596329BE6@live555.com> As I noted before, SPS (nal_unit_type 7) and PPS (nal_unit_type 8) NAL units *must* appear - at least once - in the sequence of NAL units that you feed to "H264VideoStreamDiscreteFramer". Therefore, in your case, you must set your "b_repeat_headers" variable to 1. If this is causing a segmentation fault, then this is a bug in your code that you will need to solve. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From gsnetplayer at hotmail.com Sun Jan 22 07:36:02 2012 From: gsnetplayer at hotmail.com (GS Net Player) Date: Sun, 22 Jan 2012 16:36:02 +0100 Subject: [Live-devel] RTSP only in Lan - Lan ?! In-Reply-To: References: , , , Message-ID: Please can somebody show me how to switch from multicast to unicast streaming? I have been trying for several weeks but without success! Can someone help with OnDemandServerMediaSubsession to create "createNewStreamSource()" and "createNewRTPSink()"? From: gsnetplayer at hotmail.com To: live-devel at ns.live555.com Date: Thu, 12 Jan 2012 17:53:33 +0100 Subject: Re: [Live-devel] RTSP only in Lan - Lan ?! Hi Ross, thank you for your response. I changed everything as you suggested but still cannot get a stream from rtsp://xxx.xxx.xxx.xxx:8554/testStream or rtsp://xxx.xxx.xxx.xxx:8000/testStream! Otherwise, when I put in my code: { char const* streamName = "StreamTest"; char const* inputFileName = "test.ts"; char const* indexFileName = "test.tsx"; ServerMediaSession* sms = ServerMediaSession::createNew(*env, streamName, streamName, descriptionString); sms->addSubsession(MPEG2TransportFileServerMediaSubsession ::createNew(*env, inputFileName, indexFileName, reuseFirstSource)); rtspServer->addServerMediaSession(sms); announceStream(rtspServer, sms, streamName, inputFileName); } I can then get rtsp://xxx.xxx.xxx.xxx:8554/StreamTest or rtsp://xxx.xxx.xxx.xxx:8000/StreamTest ( I have already allowed the ports 8554 and 8000 through the firewall ). Please show me how to get the stream to work on the Internet Igor From: finlayson at live555.com Date: Thu, 12 Jan 2012 06:47:11 -0800 To: live-devel at ns.live555.com Subject: Re: [Live-devel] RTSP only in Lan - Lan ?! Can someone tell me why my rtsp code only works in the local network ( Lan - Lan ) but not on Windows Server 2008 ( hosting ) ? Probably because you don't have IP multicast routing between the sending computer (that's sending to multicast address 239.255.42.42, port 8888) and the computer that's running your application, and/or between the computer that's running your application (that's sending data to multicast address 239.255.43.43, port 4444) and the receiving computer. (Also, because you're not using 'source-specific multicast', you should be setting "isSSM" to "False", not "True". And because "IS_SSM" is not defined, you can remove all of the code from the #if branch of the #ifdef USE_SSM ...
#endif code, because the #if branch of that code doesn't get executed (which is just as well, because the "sourceAddressStr" line that you've put there is very wrong).) Ross Finlayson Live Networks, Inc. http://www.live555.com/ _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From achraf.gazdar at gmail.com Sun Jan 22 12:29:48 2012 From: achraf.gazdar at gmail.com (Achraf Gazdar) Date: Sun, 22 Jan 2012 21:29:48 +0100 Subject: [Live-devel] RTSP a Live Stream to an AMINO A125 Message-ID: > > I have no experience in C++; typically I mainly use PHP. > > Then this software is not for you. I'm sorry to sound harsh, but I'm just > telling it as it is. > > > > I have read the link and implemented the test, but the class file was > removed from the thread. Do you by any chance have the class file? > > There's an attachment linked at the end of this message: > > http://lists.live555.com/pipermail/live-devel/2012-January/014454.html > It contains the files that you're looking for. (I'll let you figure out > how to extract them.) > Here you have to put the .hh file into the include directory within the liveMedia directory, and the .cpp file into the liveMedia directory itself. You have to update the Makefile to take account of the newly added source files. Finally, add the test example (as detailed in my previous post, pointed to by Ross) to testOnDemandRTSPServer.cpp. Remake the library and testProgs, and run the testOnDemandRTSPServer binary. > > (This will be my last posting on this topic.) > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > -- Achraf Gazdar Associate Professor on Computer Science Hana Research Unit, CRISTAL Lab. http://www.hanalab.org Tunisia -------------- next part -------------- An HTML attachment was scrubbed... URL: From Aleksandar.Milenkovic at rt-rk.com Mon Jan 23 06:01:18 2012 From: Aleksandar.Milenkovic at rt-rk.com (Aleksandar Milenkovic) Date: Mon, 23 Jan 2012 15:01:18 +0100 Subject: [Live-devel] General tips would be appreciated In-Reply-To: <4F199EC0.9050909@rt-rk.com> References: <4F199EC0.9050909@rt-rk.com> Message-ID: <4F1D682E.1030607@rt-rk.com> After some more digging and looking through the testApps (such as testOnDemandRTSPServer) I saw this: ServerMediaSession* sms = ServerMediaSession::createNew(*env, streamName, streamName, descriptionString); sms->addSubsession(H264VideoFileServerMediaSubsession::createNew(*env, inputFileName, reuseFirstSource)); rtspServer->addServerMediaSession(sms); So I figured that in order to stream anything, I need to pass it to H264VideoFileServerMediaSubsession... then I looked into the H264VideoFileServerMediaSubsession class to find that it references ByteStreamFileSource... now, from the previous message and looking through the docs I realized that all I need to do is swap the ByteStreamFileSource with ByteStreamMemoryBufferSource (since I don't have the file and the byte[] containing video data is just sitting there waiting to be consumed). So I should be able to write a H264VideoMemoryBufferServerMediaSubsession that uses MemoryBuffer instead of File and feed that to my ServerMediaSession and subsequently rtspServer and it should work?
Alternatively (sorry for bad spelling btw) should I perhaps adapt MPEG2TSFileSubsession to use the MemoryBufferSource since my byte[] holds raw TS data? Thank you in advance for quick and precise answers. Kind regards, Aleksandar P.S. I've read the FAQ last time; the part I think you were referring to wasn't really referring to me :) esp. the one about not reading the FAQ. *Aleksandar Milenkovic* Software Engineer Phone: +381-(0)21-4801-139 Fax: +381-(0)21-450-721 Mobile: +381-(0)64-31-666-82 E-mail: Aleksandar.Milenkovic at rt-rk.com RT-RK Computer Based Systems LLC Fruskogorska 11 21000 Novi Sad, Serbia www.rt-rk.com *RT-RK* invites you to visit us @ *IBC2011*, September 9-13 2011, stand *5.A01*, Amsterdam RAI. For more information please visit www.bbt.rs On 1/20/2012 6:05 PM, Aleksandar Milenkovic wrote: > Hi all. I didn't really know how to name this, hope nobody minds.... > > Anyway, I'm getting into live555 code. I compiled the whole thing as a > library, ported the mediaServer to Android and now comes the > interesting part. > > A Java app sends a byte[] through JNI to a C routine. The C part should > have the mediaserver embedded into it, and in such a way that the > mediaserver can be controlled just like a real object (I know this > doesn't make sense yet, but it will). That means the server should > behave like 'expected' upon calling the wrapper's Init(), > Start/Stop(), Deinit()... > > So far I've got some slight understanding of the UsageEnvironment > classes, but I'm having slight problems on how to break it up. I could > use an explanation of what it does etc, a background story. So far, > env->doEventLoop(shouldIQuit); will do the trick, I hope. > > On top of that, I want to use the passed-in byte[] for streaming, not > files. I found this - ByteStreamMemoryBufferSource, so I'll try to > dump the byte[] into that and feed that to the mediaServer; > > Basically, I'm not-so-politely asking how to break up the Media Server > app into pieces; this approach sounds feasible but might be > impractical or there might be better ways to accomplish this. That's > why I'm asking you if you've got any tips for the new guy here :) > > I'd like to avoid reimplementing UsageEnvironments and TaskSchedulers > if possible, and only delete/move the code around :) > > Kind regards, > Aleksandar > -- > > *Aleksandar Milenkovic* > Software Engineer > > Phone: +381-(0)21-4801-139 > Fax: +381-(0)21-450-721 > Mobile: +381-(0)64-31-666-82 > E-mail: Aleksandar.Milenkovic at rt-rk.com > > > RT-RK Computer Based Systems LLC > Fruskogorska 11 > 21000 Novi Sad, Serbia > www.rt-rk.com > > *RT-RK* invites you to visit us @ *IBC2011*, September 9-13 2011, > stand *5.A01*, Amsterdam RAI. > For more information please visit www.bbt.rs > -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Jan 23 06:53:54 2012 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 23 Jan 2012 06:53:54 -0800 Subject: [Live-devel] General tips would be appreciated In-Reply-To: <4F1D682E.1030607@rt-rk.com> References: <4F199EC0.9050909@rt-rk.com> <4F1D682E.1030607@rt-rk.com> Message-ID: <438D770B-2F55-43BD-8441-6154FB21A42A@live555.com> > P.S. I've read the FAQ last time; the part I think you were referring to wasn't really referring to me :) esp. the one about not reading the FAQ.
No, the part of the FAQ that relates specifically to what you're trying to do (I think) is: http://www.live555.com/liveMedia/faq.html#liveInput-unicast You should also note: http://www.live555.com/liveMedia/faq.html#modifying-and-extending > Alternatively (sorry for bad spelling btw) should I perhaps adapt MPEG2TSFileSubsession to use the MemoryBufferSource since my byte[] holds raw TS data? Thank you in advance for quick and precise answers. The RTP payload format for Transport Stream data is independent of the particular codec(s) that the Transport Stream contains. Therefore, because you are streaming Transport Stream data, you should not be using (or looking at) any of the *H264* classes. Instead, you should be writing and using your own new subclass of "OnDemandServerMediaSubsession" that (as noted in the first FAQ entry referred to above): 1/ In its constructor, sets the "reuseFirstSource" variable in the base class constructor to True, and 2/ Implements the two pure virtual functions "createNewStreamSource()" and "createNewRTPSink()". Your "createNewRTPSink()" implementation can be exactly the same as the one that's in "MPEG2TransportFileServerMediaSubsession" - i.e. RTPSink* yourServerMediaSubsession::createNewRTPSink(Groupsock* rtpGroupsock, unsigned char /*rtpPayloadTypeIfDynamic*/, FramedSource* /*inputSource*/) { return SimpleRTPSink::createNew(envir(), rtpGroupsock, 33, 90000, "video", "MP2T", 1, True, False /*no 'M' bit*/); } Your "createNewStreamSource()" implementation would probably create a "ByteStreamMemoryBufferSource", and feed it into a "MPEG2TransportStreamFramer". Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From michel.promonet at thalesgroup.com Mon Jan 23 07:28:49 2012 From: michel.promonet at thalesgroup.com (PROMONET Michel) Date: Mon, 23 Jan 2012 16:28:49 +0100 Subject: [Live-devel] use of uninitialized memory in our_MD5End In-Reply-To: <95DDF4CF-B2DB-40FB-8C4F-3F267B5DE2BA@live555.com> References: <25520_1327062881_4F195F61_25520_17724_1_1BE8971B6CFF3A4F97AF4011882AA2550155F99ABEE5@THSONEA01CMS01P.one.grp> <95DDF4CF-B2DB-40FB-8C4F-3F267B5DE2BA@live555.com> Message-ID: <32014_1327332542_4F1D7CBE_32014_464_1_1BE8971B6CFF3A4F97AF4011882AA2550155FA0D8D72@THSONEA01CMS01P.one.grp> Hi Ross, Sorry, but I guess that valgrind is right, due to padding bytes at the end of the structure. seedData contains: - timestamp (so 2 longs) - counter (int) In 64 bits it's not OK - timeval is 2*8 bytes, int is 4 bytes, and the structure size is 24 bytes. These 4 bytes of padding are not initialized but are used in the MD5. In 32 bits it's OK: timeval is 2*4 bytes, int is 4 bytes, and the structure size is 12 bytes, so there is no padding. One way is to change counter from "int" to "long", but implicit things are usually hard to maintain.... Best Regards, Michel. [@@THALES GROUP RESTRICTED@@] From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Friday, 20 January 2012 15:18 To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] use of uninitialized memory in our_MD5End What do you think? I think that "valgrind" is incorrect in this case. In the implementation of "Authenticator::setRealmAndRandomNonce()", both fields of the "seedData" structure are initialized. (The "timestamp" field is initialized via the call to "gettimeofday()"; the "counter" field is set as well.) Ross Finlayson Live Networks, Inc.
http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Jan 23 15:33:42 2012 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 23 Jan 2012 15:33:42 -0800 Subject: [Live-devel] use of uninitialized memory in our_MD5End In-Reply-To: <32014_1327332542_4F1D7CBE_32014_464_1_1BE8971B6CFF3A4F97AF4011882AA2550155FA0D8D72@THSONEA01CMS01P.one.grp> References: <25520_1327062881_4F195F61_25520_17724_1_1BE8971B6CFF3A4F97AF4011882AA2550155F99ABEE5@THSONEA01CMS01P.one.grp> <95DDF4CF-B2DB-40FB-8C4F-3F267B5DE2BA@live555.com> <32014_1327332542_4F1D7CBE_32014_464_1_1BE8971B6CFF3A4F97AF4011882AA2550155FA0D8D72@THSONEA01CMS01P.one.grp> Message-ID: <8A21770F-CD67-47ED-A7F2-BCD6204C81FE@live555.com> In any case, this isn't a concern, because we're using MD5 here to generate a 'pseudo-random' value. It doesn't matter if some of the memory that MD5 reads happens to be 'uninitialized'. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From michel.promonet at thalesgroup.com Tue Jan 24 02:34:30 2012 From: michel.promonet at thalesgroup.com (PROMONET Michel) Date: Tue, 24 Jan 2012 11:34:30 +0100 Subject: [Live-devel] use of uninitialized memory in our_MD5End In-Reply-To: <8A21770F-CD67-47ED-A7F2-BCD6204C81FE@live555.com> References: <25520_1327062881_4F195F61_25520_17724_1_1BE8971B6CFF3A4F97AF4011882AA2550155F99ABEE5@THSONEA01CMS01P.one.grp> <95DDF4CF-B2DB-40FB-8C4F-3F267B5DE2BA@live555.com> <32014_1327332542_4F1D7CBE_32014_464_1_1BE8971B6CFF3A4F97AF4011882AA2550155FA0D8D72@THSONEA01CMS01P.one.grp> <8A21770F-CD67-47ED-A7F2-BCD6204C81FE@live555.com> Message-ID: <5776_1327401283_4F1E8943_5776_2288_1_1BE8971B6CFF3A4F97AF4011882AA2550155FA21F001@THSONEA01CMS01P.one.grp> Hi Ross, Clearly you are right: the memory is surely allocated, and it will not end with a segmentation violation. I will put an exception in the valgrind rules, but using uninitialized memory is probably not good practice. In fact, what annoys me is keeping a valgrind exception (or a patch) up to date with each live555 release. Best Regards, Michel. [@@THALES GROUP RESTRICTED@@] From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Tuesday, 24 January 2012 00:34 To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] use of uninitialized memory in our_MD5End In any case, this isn't a concern, because we're using MD5 here to generate a 'pseudo-random' value. It doesn't matter if some of the memory that MD5 reads happens to be 'uninitialized'. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From Marlon at scansoft.co.za Tue Jan 24 06:04:01 2012 From: Marlon at scansoft.co.za (Marlon Reid) Date: Tue, 24 Jan 2012 16:04:01 +0200 Subject: [Live-devel] Correct way of stopping and closing rtsp client Message-ID: <002962EA5927BE45B2FFAB0B5B5D67970A8AE7@SSTSVR1.sst.local> Hi, What is the preferred way of stopping and closing a client after it has received a stream? Currently, I am calling Shutdown() and it works fine the first time around. When starting another stream after the first one completes, the client crashes in the following function: void GetSDPDescription(RTSPClient::responseHandler* afterFunc) { if (RTSPClient) RTSPClient->sendDescribeCommand(afterFunc, NULL); } Any assistance will be appreciated. Thank you.
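The reply that follows points at "shutdownStream()" in the "testRTSPClient" demo code; for convenience, its flow looks roughly like this (a paraphrase, not the verbatim demo code - there, the client's "MediaSession" is held in a "StreamClientState" member rather than passed in):

void shutdownStream(RTSPClient* rtspClient, MediaSession* session) {
  if (session != NULL) {
    // First close each subsession's sink; this stops the reading from it:
    MediaSubsessionIterator iter(*session);
    MediaSubsession* subsession;
    while ((subsession = iter.next()) != NULL) {
      Medium::close(subsession->sink);
      subsession->sink = NULL;
    }
    // Then tell the server to tear the stream down (the response can be ignored):
    rtspClient->sendTeardownCommand(*session, NULL);
  }
  Medium::close(rtspClient); // reclaims the client's own state
}

Only after this completes should a new DESCRIBE be issued, on a fresh "RTSPClient" object.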
-------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Jan 24 10:15:54 2012 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 24 Jan 2012 10:15:54 -0800 Subject: [Live-devel] Correct way of stopping and closing rtsp client In-Reply-To: <002962EA5927BE45B2FFAB0B5B5D67970A8AE7@SSTSVR1.sst.local> References: <002962EA5927BE45B2FFAB0B5B5D67970A8AE7@SSTSVR1.sst.local> Message-ID: > What is the preferred way of stopping and closing a client after it has received a stream? Currently, I am calling Shutdown() and it works fine the first time around. When starting another stream after the first one completes, the client crashes in the following function: > > void GetSDPDescription(RTSPClient::responseHandler* afterFunc) > { > if (RTSPClient) > RTSPClient->sendDescribeCommand(afterFunc, NULL); > } I presume that you're referring to the "openRTSP" code here (although there, the function is called "getSDPDescription()", not "GetSDPDescription()"). Because you've made your own modifications to this code, I can't really help you with this. Instead, I suggest that you refer to the "testRTSPClient" code - in particular, the "shutdownStream()" function - for guidance. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From r63400 at gmail.com Tue Jan 24 07:56:54 2012 From: r63400 at gmail.com (Ricardo Acosta) Date: Tue, 24 Jan 2012 16:56:54 +0100 Subject: [Live-devel] Port forwarding issue when receiving streams in live555 Message-ID: Hi Ross > > I know this is not an issue entirely due to live555, but if you could, I > would like to ask you for some clues or advice about a networking issue > related to my streaming client application > > Using live555, our app can now send and receive unicast streams to a > server; in the LAN everything works well. > > But when the client is outside our LAN (the real case), behind a NAT router > with a private IP address, incoming UDP packets from the server are not routed > "automatically" to the private IP address. This is normal, since UDP is a > connection-less protocol. TCP connections are routed with no problem. > > So basically the problem is having to put our hands on every router where we > would like to install our client, each time creating PAT rules to forward > port X to a 192.X.X.X IP address. > > The question is whether you know how to eliminate this problem easily. > Doing some research, one option is using UPnP to dynamically write > the port/address rule into the router; this is "softer" than > doing it via a PAT table, but we still have to assume that the router has > the UPnP option activated. > > This issue comes up often in VoIP or video streaming when using UDP unicast > streams, and I am not even talking here about a firewall that will drop any > unrelated connection. > > I would like to know if you can give me some information on how to work on > it. > > Thank you in advance > Ricardo > > > I tried testMP3Streamer again in multicast and now I receive back the RR. > Is there any special reason that it works only in multicast (knowing the RR is > sent back in unicast to the original sender)? I am trying to see how it can > work in unicast > > > It works just fine with unicast if you use RTSP. > > Try using "testOnDemandRTSPServer" (or "live555MediaServer") as your > server (transmitter), and "testRTSPClient" (or "openRTSP") as your client > (receiver). You'll see that RTCP "RR" reports get received by the server
You'll see that RTCP "RR" reports get received by the server > just fine. > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > > > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Jan 24 19:49:33 2012 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 24 Jan 2012 19:49:33 -0800 Subject: [Live-devel] New LIVE555 version, supports RTSP server streaming from a UDP Transport Stream input source Message-ID: <35F91E19-139B-4149-9F5D-4F05DECD789B@live555.com> Recently there has been a lot of interest - by several people - in being able to develop a RTSP server that takes, as input, a UDP Transport Stream (that arrives via IP multicast, or via unicast). Achraf Gazdar demonstrated how the LIVE555 libraries can easily be used to do this. I wasn't originally planning on updating the released LIVE555 code to help make this even easier, but because there's been so much interest in this, I've changed my mind. I've now released a new version (2012.01.25) of the LIVE555 code, that adds a new class "MPEG2TransportUDPServerMediaSubsession" (a subclass of "OnDemandServerMediaSubsession") that can be used to build a RTSP server that can takes a UDP (raw UDP or RTP/UDP) Transport Stream as input (via IP multicast, or unicast). (Thanks again to Achraf Gazdar for this suggestion.) I also updated the "testOnDemandRTSPServer" demo application to show how a RTSP server can take input from the (IP multicast) Transport Stream that's sent by the "testMPEG2TransportStreamer" demo application. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From cat29076 at gmail.com Wed Jan 25 01:23:16 2012 From: cat29076 at gmail.com (leroi cat) Date: Wed, 25 Jan 2012 10:23:16 +0100 Subject: [Live-devel] live555 media Server log files Message-ID: hi all, can i have log files or traces generated by the server and containing the establishment of the connection between the LIVE555 and the client? and in which directory can i find them? thank youuu!! -------------- next part -------------- An HTML attachment was scrubbed... URL: From david.myers at panogenics.com Wed Jan 25 08:52:11 2012 From: david.myers at panogenics.com (David J Myers) Date: Wed, 25 Jan 2012 16:52:11 -0000 Subject: [Live-devel] Server disconnects clients every 60 seconds Message-ID: <017601ccdb81$af97d850$0ec788f0$@myers@panogenics.com> Hi, I have a strange problem with my live embedded rtsp H.264 server when connected to a certain 3rd party client software package. Every 60 seconds my FramedSource derived class(StreamSource) is getting deleted. 
If I connect via VLC or openRTSP, this does not happen, but I'm fairly sure it is not the client which is tearing down the session, here is a debug trace from my server:- parseRTSPRequestString() succeeded, returning cmdName "OPTIONS", urlPreSuffix "", urlSuffix "stream0", CSeq "1", Content-Length 0 parseRTSPRequestString() succeeded, returning cmdName "DESCRIBE", urlPreSuffix "", urlSuffix "stream0", CSeq "2", Content-Length 0 parseRTSPRequestString() succeeded, returning cmdName "DESCRIBE", urlPreSuffix "", urlSuffix "stream0", CSeq "3", Content-Length 0 parseRTSPRequestString() succeeded, returning cmdName "SETUP", urlPreSuffix "stream0", urlSuffix "track1", CSeq "4", Content-Length 0 ../src/RTSPWrapper.cpp(76): StreamSource constructor, this=0x0xed1d0, ref count =2, StreamIndex = 0 parseRTSPRequestString() succeeded, returning cmdName "PLAY", urlPreSuffix "stream0", urlSuffix "", CSeq "5", Content-Length 0 . . 60 seconds later . ../src/RTSPWrapper.cpp(97): ~StreamSource this=0x0xed1d0, ref count=1, StreamIndex = 0 Any ideas, what could be causing my FramedSource class to self-destruct every 60 seconds? This must be some kind of timeout. Why does this not happen for VLC client connections? Thanks and regards - David -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Jan 25 13:04:31 2012 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 25 Jan 2012 13:04:31 -0800 Subject: [Live-devel] Server disconnects clients every 60 seconds In-Reply-To: <017601ccdb81$af97d850$0ec788f0$@myers@panogenics.com> References: <017601ccdb81$af97d850$0ec788f0$@myers@panogenics.com> Message-ID: <4F40B780-155A-4D7E-ACB9-996FE15B816C@live555.com> > Any ideas, what could be causing my FramedSource class to self-destruct every 60 seconds? This must be some kind of timeout. Exactly. The server is timing out the client connection (and reclaiming its state: sockets, ports, and memory) because it is not seeing any sign of activity from the client within the timeout period (which, by default, is actually 65 seconds, not 60 seconds). The problem is your client. It is apparently not sending any periodic RTCP "RR" packets - which it is supposed to do as per the RTP/RTCP standard. (Alternatively, if the client were periodically sending certain 'no-op' RTSP commands - such as "GET_PARAMETER", then the server would also treat that as a sign of client 'liveness', and not timeout the connection.) The solution is to fix your "3rd party client software package" so that it sends RTCP "RR" packets, as it is supposed to. (Alternatively, just use our RTSP/RTP/RTCP client implementation, which works properly :-) If you *really* want to avoid the timeout, without fixing your client, then you can do so by setting the (otherwise optional) "reclamationTestSeconds" parameter in "RTSPServer::createNew()" to 0. I don't recommend this, however, because then the server will have no way of reclaiming state from a client that dies without doing a RTSP "TEARDOWN". > Why does this not happen for VLC client connections? Because VLC properly sends periodic RTCP "RR" packets. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
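For reference, the timeout discussed above is controlled by the (optional) last parameter of "RTSPServer::createNew()"; a sketch of its use (the 65-second value is the library default at this time; "authDB" may be NULL):

RTSPServer* rtspServer = RTSPServer::createNew(*env, 8554, authDB,
                                               65 /*reclamationTestSeconds*/);
// Passing 0 instead disables the client-liveness timeout - not recommended,
// because state from clients that die without a TEARDOWN is then never reclaimed.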
URL: From r63400 at gmail.com Wed Jan 25 07:32:09 2012 From: r63400 at gmail.com (Ricardo Acosta) Date: Wed, 25 Jan 2012 16:32:09 +0100 Subject: [Live-devel] testRelay and testReplicator don't send packets when using same incoming/outgoing port Message-ID: Hi Ross I've been testing the new StreamReplicator class that we received for the New Year! Very helpful, in addition to testRelay. Using testReplicator I am simulating a server that replicates and re-dispatches live streams from its connected clients. So far it works very well, but today I was trying to use the same UDP port for sending and receiving to the same client. Doing that, it keeps receiving but it stops sending any stream, including the ones that do not have the same address/port char const* inputAddressStr = "172.16.9.161"; struct in_addr inputAddress; inputAddress.s_addr = our_inet_addr(inputAddressStr); inputPort(5000) unsigned char const inputTTL = 0; for Groupsock inputGroupsock(*env, inputAddress, inputPort, inputTTL); and startReplicaUDPSink(replicator, "172.16.9.161", 5000); Normally a UDP port can be incoming and outgoing at the same time, right? Why would I need to send over the same port? Because I found (doing a client-to-client test) that when sending over the same port you receive on, the router forwards the packet to the right client behind the router (private IP address). I tried with testRelay and I had the same "stop" problem. Can you please let me know if I am doing something wrong? thank you Ricardo -------------- next part -------------- An HTML attachment was scrubbed... URL: From aviadr1 at gmail.com Wed Jan 25 08:44:16 2012 From: aviadr1 at gmail.com (aviad rozenhek) Date: Wed, 25 Jan 2012 18:44:16 +0200 Subject: [Live-devel] Fwd: patch for live555 lib In-Reply-To: References: Message-ID: Hi, I have attached a patch for live555, which splits MediaLookupTable into its own .hh file. This enables me to check that all the lookup tables are empty when calling reclaim() on the usage environment, to make sure there are no memory leaks in my code. I would appreciate it if it could be added to the mainline live555. Many thanks, -- Aviad Rozenhek Media Technologies Architect RayV.com -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: return_table.diff Type: application/octet-stream Size: 558 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: split_medialookuptable.diff Type: application/octet-stream Size: 2203 bytes Desc: not available URL: From finlayson at live555.com Wed Jan 25 17:28:13 2012 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 25 Jan 2012 17:28:13 -0800 Subject: [Live-devel] testRelay and testReplicator don't send packets when using same incoming/outgoing port In-Reply-To: References: Message-ID: <2F066120-AE7E-4007-AB45-B674CA6BE213@live555.com> > char const* inputAddressStr = "172.16.9.161"; This is wrong. "inputAddressStr" must be an IP multicast address, or - if you're receiving via unicast instead - "0.0.0.0". But anyway, I suspect that the LIVE555 code can't handle what you're trying to do - not because you're trying to send and receive using the same port number, but because you have two different "Groupsock" objects that use the same port number. If you were to restructure the code so that only one "Groupsock" object were used, then it might work. But you're probably better off not trying to do this at all...
Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Jan 25 17:34:45 2012 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 25 Jan 2012 17:34:45 -0800 Subject: Re: [Live-devel] Fwd: patch for live555 lib In-Reply-To: References: Message-ID: <55525C3D-CB71-4391-BC45-AE7911366FD6@live555.com> No. Because the "MediaLookupTable" class is used only internally within "Media.cpp", and is not intended to be used outside this implementation, I don't want to expose it outside that file. > this enables me to check that all the lookup tables are empty when calling reclaim() on the usage environment, to make sure there are no memory leaks in my code. I presume that you want to call MediaLookupTable::ourMedia(*env); to check whether its value is NULL or not. But you can already check the values of env->liveMediaPriv and env->groupsockPriv which will give you the same information, without requiring any change to the supplied code. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From david.myers at panogenics.com Thu Jan 26 01:32:22 2012 From: david.myers at panogenics.com (David J Myers) Date: Thu, 26 Jan 2012 09:32:22 -0000 Subject: [Live-devel] Server disconnects clients every 60 seconds Message-ID: <002101ccdc0d$68e22b40$3aa681c0$@myers@panogenics.com> >The problem is your client. It is apparently not sending any periodic RTCP "RR" packets - which it is supposed to do as per the RTP/RTCP standard. But my RTSP server (based on testOnDemandRTSPServer) doesn't start RTCP (or RTP). So, if the client doesn't send GET_PARAMETER messages like VLC does, I won't see the RTCP packets and the session won't stay alive. Could you point me to a unicast example which does start RTCP? And do I also need to start RTP? Thanks again - David -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Jan 26 01:42:14 2012 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 26 Jan 2012 01:42:14 -0800 Subject: Re: [Live-devel] Server disconnects clients every 60 seconds In-Reply-To: <002101ccdc0d$68e22b40$3aa681c0$@myers@panogenics.com> References: <002101ccdc0d$68e22b40$3aa681c0$@myers@panogenics.com> Message-ID: <47CA741F-0F22-4387-95FF-37071A3C4990@live555.com> > >The problem is your client. It is apparently not sending any periodic RTCP "RR" packets - which it is supposed to do as per the RTP/RTCP standard. > But my RTSP server (based on testOnDemandRTSPServer) doesn't start RTCP (or RTP). Huh? If your RTSP server is "based on testOnDemandRTSPServer", then it most certainly *is* 'starting' both RTP and RTCP. That's what a unicast RTSP server does. You seem very confused here. But the issue here is your *client*. It is apparently not sending back periodic RTCP "RR" (Reception Report) packets in response to the RTP packets that it's receiving from the server. That's why the server is timing out the stream. This doesn't happen with VLC as a client, because VLC properly sends back RTCP packets. ("GET_PARAMETER" has nothing to do with this.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed...
URL: From aviadr1 at gmail.com Thu Jan 26 05:01:21 2012 From: aviadr1 at gmail.com (aviad rozenhek) Date: Thu, 26 Jan 2012 15:01:21 +0200 Subject: [Live-devel] Fwd: patch for live555 lib In-Reply-To: <55525C3D-CB71-4391-BC45-AE7911366FD6@live555.com> References: <55525C3D-CB71-4391-BC45-AE7911366FD6@live555.com> Message-ID: On Thu, Jan 26, 2012 at 03:34, Ross Finlayson wrote: > No, because the "MediaLookupTable" class is used only internally within > "Media.cpp", and is not intended to be use outside this implementation, I > don't want to expose it outside that file. > > > this enables me to check that all the lookup tables are empty when calling > reclaim() on the usage environment, to make sure there are no memory leaks > in my code. > > > I presume that you want to call > MediaLookupTable::ourMedia(*env); > to check whether its value is NULL or not. But you can already check the > value of > env->liveMediaPriv > and > env->groupsockPriv > which will give you the same information, without requiring any change to > the supplied code. > > > actually, I'm going through the tables to print out which of the objects have leaked so I can debug my code. I need MediaLookupTable in an .h file for that if (m_pEnv) { if(_Tables* pTables = _Tables::getOurTables(*m_pEnv, False)) { if(MediaLookupTable* pMediaTable = (MediaLookupTable*) pTables->mediaTable) { HashTable::Iterator* it = HashTable::Iterator::create(pMediaTable->getTable()); const char* name = NULL; while(Medium* medium = (Medium*) it->next(name)) { RAYVLOG_PUBLIC_ERROR(s, GUI_LOG, s << "memory leak: object " << name << " of type " << typeid(*medium).name()); } delete it; } } m_pEnv->reclaim(); m_pEnv = 0; } -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Jan 26 07:50:14 2012 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 26 Jan 2012 07:50:14 -0800 Subject: [Live-devel] Fwd: patch for live555 lib In-Reply-To: References: <55525C3D-CB71-4391-BC45-AE7911366FD6@live555.com> Message-ID: > actually, I'm going through the tables to print out which of the objects have leaked so I can debug my code. I need MediaLookupTable in an .h file for that OK, what I'll do in the next release of the code is add the "MediaLookupTable" definition to the "include/Media.hh" header file (and also add your new "getTable()" member function). ("MediaLookupTable" isn't significant enough to deserve its own header file.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From david.myers at panogenics.com Thu Jan 26 08:59:27 2012 From: david.myers at panogenics.com (David J Myers) Date: Thu, 26 Jan 2012 16:59:27 -0000 Subject: [Live-devel] Server disconnects clients every 60 seconds Message-ID: <00a001ccdc4b$ddd03710$9970a530$@myers@panogenics.com> Hi Ross, > Huh? If your RTSP server is "based on testOnDemandRTSPServer", then it most certainly *is* 'starting' both RTP and RTCP. That's what a unicast RTSP server does. You seem very confused here. >But the issue here is your *client*. It is apparently not sending back periodic RTCP "RR" (Reception Report) packets in response to the RTP packets that it's receiving from the server. That's why the server is timing out the stream I've talked to the developers of the NVR software (our client) and they tell me they do send regular RTCP reports, but they don't send GET_PARAMETER messages like VLC does. 
However, my streams don't stay alive, and I don't see their RTCP report messages coming in. When I look at testOnDemandRTSPServer, I don't see any specific RTCPInstance::createNew() call; that's why I thought RTCP wasn't running. I guess you're saying that RTCP and RTP are implicit in unicast RTSP. Where can I put some debug to catch the RTCP reports coming in? I enabled the debug in RTSPServer::RTSPClientSession::handleRequestBytes() after parseRTSPRequestString(), but I don't see the RTCP messages there. Thanks for your help - David -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Jan 26 09:12:21 2012 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 26 Jan 2012 09:12:21 -0800 Subject: Re: [Live-devel] Server disconnects clients every 60 seconds In-Reply-To: <00a001ccdc4b$ddd03710$9970a530$@myers@panogenics.com> References: <00a001ccdc4b$ddd03710$9970a530$@myers@panogenics.com> Message-ID: > I've talked to the developers of the NVR software (our client) and they tell me they do send regular RTCP reports, but they don't send GET_PARAMETER messages like VLC does. However, my streams don't stay alive and I don't see their RTCP report messages coming in. Then your client - despite the claims of its developers - must not be sending RTCP reports correctly. (I encourage these developers to get in touch with us - via this mailing list - to help fix this.) (Note, BTW, that "openRTSP" - which you noted works correctly with your server - doesn't send "GET_PARAMETER" requests either. But it keeps the session alive, because it sends periodic RTCP "RR" packets.) > When I look at testOnDemandRTSPServer, I don't see any specific RTCPInstance::createNew() call That is done in the "OnDemandServerMediaSubsession" class. > Where can I put some debug to catch the RTCP reports coming in? Add #define DEBUG 1 to the start of "liveMedia/RTCP.cpp". You will see reports of RTCP "SR" packets being sent by the server, and - if your client is working correctly (e.g., "openRTSP") - RTCP "RR" packets arriving from the client. > I enabled the debug in RTSPServer::RTSPClientSession::handleRequestBytes() after parseRTSPRequestString(), but I don't see the RTCP messages there. No. That function is used only for incoming RTSP commands, not RTCP packets. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From Randy.Roberts at flir.com Thu Jan 26 09:49:06 2012 From: Randy.Roberts at flir.com (Roberts, Randy) Date: Thu, 26 Jan 2012 17:49:06 +0000 Subject: [Live-devel] Server disconnects clients every 60 seconds In-Reply-To: References: <00a001ccdc4b$ddd03710$9970a530$@myers@panogenics.com> Message-ID: <796F61FBED160B43BBD65F5837F570FB33BE5675@PDX-MAIL1.zone1.flir.net> Hi Ross, Does the requirement of the RTCP (keepalive) change at all when using RTP over RTSP? We've seen a few clients that DO NOT send keepalives when configured for RTP over RTSP, as the RTP "data" is interleaved over the same TCP "session" as the RTSP control. Apparently, they have determined that the TCP session "liveness" can be determined by the TCP stack, and that that is "adequate". Live555 certainly does still require the keepalives in that "configuration". Certainly, applications get notifications (from the TCP stack) when their peer "goes away" (stops responding to TCP keepalives).
Are you aware of any specifications (based on an alternate configuration) that remove the obligation of the RTCP (or no-op message) keepalives? Thanks, Randy Roberts -------------- next part -------------- An HTML attachment was scrubbed... URL: From david.myers at panogenics.com Thu Jan 26 09:55:21 2012 From: david.myers at panogenics.com (David J Myers) Date: Thu, 26 Jan 2012 17:55:21 -0000 Subject: [Live-devel] Server disconnects clients every 60 seconds Message-ID: <00a801ccdc53$aced4950$06c7dbf0$@myers@panogenics.com> Hi Ross, >> Where can I put some debug to catch the RTCP reports coming in? >Add >#define DEBUG 1 >to the start of "liveMedia/RTCP.cpp". You will see reports of RTCP "SR" packets being sent by the server, and - if >your client is working correctly (e.g., "openRTSP") - RTCP "RR" packets arriving from the client. Ok, here's the only type of debug I get, every 10 seconds or so:

sending REPORT
sending RTCP packet 80c80006 648fe821 d2cc0ac5 ad5347a6 3dbe6644 000011f6 00634f20 81ca0004 648fe821 01073336 302d4361 6d000000
schedule(2.195006->1327598663.967054)
sending REPORT
sending RTCP packet 80c80006 648fe821 d2cc0ac7 f8837718 3dc18ca5 00001489 0071960f 81ca0004 648fe821 01073336 302d4361 6d000000
schedule(4.012164->1327598668.067517)
schedule(0.564588->1327598668.634021)

I don't see anything being received at all.
So am I correct to assume that the client software is definitely NOT sending RTCP "RR" - or indeed any RTCP packets? Regards, David -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Jan 26 12:04:26 2012 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 26 Jan 2012 12:04:26 -0800 Subject: Re: [Live-devel] Server disconnects clients every 60 seconds In-Reply-To: <796F61FBED160B43BBD65F5837F570FB33BE5675@PDX-MAIL1.zone1.flir.net> References: <00a001ccdc4b$ddd03710$9970a530$@myers@panogenics.com> <796F61FBED160B43BBD65F5837F570FB33BE5675@PDX-MAIL1.zone1.flir.net> Message-ID: > Does the requirement of the RTCP (keepalive) change at all when using RTP over RTSP? No. > Certainly, applications get notifications (from the TCP stack) when their peer "goes away" (stops responding to TCP keepalives). Yes, in principle. However, this might take a long time (possibly a very long time), depending upon the particular TCP implementation, especially if the server is not currently sending data (e.g., it had received a RTSP "PAUSE" command beforehand). > Are you aware of any specifications (based on an alternate configuration) that remove the obligation of the RTCP (or no-op message) keepalives? No, and note that RTCP "RR" packets can be useful in their own right (for statistics); not just as a 'keepalive'. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Jan 26 12:09:34 2012 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 26 Jan 2012 12:09:34 -0800 Subject: Re: [Live-devel] Server disconnects clients every 60 seconds In-Reply-To: <00a801ccdc53$aced4950$06c7dbf0$@myers@panogenics.com> References: <00a801ccdc53$aced4950$06c7dbf0$@myers@panogenics.com> Message-ID: > I don't see anything being received at all. So am I correct to assume that the client software is definitely NOT sending RTCP "RR" - or indeed any RTCP packets? No, but you can definitely assume that RTCP packets are not arriving at the server. It's possible, I suppose, that the client - for some reason - is sending its RTCP packets to the wrong IP address and/or port number. Or that you have a firewall somewhere that's blocking incoming RTCP packets. But you can easily verify the latter by running "openRTSP" (or "VLC") on the same computer as the problematic client software, and checking whether the RTCP packets (which those applications definitely *do* send) arrive at the server. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From ripcurl01 at gmail.com Fri Jan 27 06:16:38 2012 From: ripcurl01 at gmail.com (Ron Ytsma) Date: Fri, 27 Jan 2012 15:16:38 +0100 Subject: [Live-devel] Multiple mplayers always decoding last stream when using sdp file Message-ID: Hi, I am dealing with a weird issue in either the mplayer or the live555 library, and I hope you can help me on this one. Perhaps it is a known bug. The failure: * When I invoke the 1st mplayer with stream1.sdp and * wait till the 1st mplayer starts decoding the 1st video stream, then * add a 2nd mplayer with stream2.sdp => the 1st and 2nd mplayer both start decoding the second video. When I stop the 2nd mplayer, the 1st mplayer will either exit or start decoding the first video again. This effect continues while adding more videos (3rd, 4th, etc.): they all start decoding the most recently added video.
Monitoring the network shows that both streams are still streaming; the PARSE_PMT messages show that the stream being decoded switches when the 2nd mplayer allocates and initiates the 2nd stream. This is visible through the PID numbers. It is also visible in the index numbers. The 1st mplayer shows (also visible below in the output snippet): * stream index 0 is the 1st stream's video channel * stream index 1 is the 2nd stream's video channel * stream index 2 is the 2nd stream's audio channel Groupsock shows "4" for both mplayer instances, with different multicast addresses. The behavior and setup: * I am hosting 2 separate videos through VLC on multicast RTP 239.192.140.1 and 239.192.140.2 on port 5004 (same port for both!) on a Windows laptop * I am streaming from a Linux laptop with the mplayer-vaapi client * stream1 is video only * stream2 is a video and audio stream. * When I invoke mplayer with stream1.sdp, the 1st stream starts correctly. * When I invoke mplayer and start stream2.sdp, the 2nd stream starts correctly. => OK, so my videos are hosted correctly (I think). The problem does not occur when I use e.g. port 5006 on the 2nd video and 5004 on the first video. The problem does not occur when I use rtp://239.192.140.1:5004 and rtp://239.192.140.2:5004 (but that is not using LIVE555). The problem occurs only when using multiple instances of mplayer with local SDP file definitions which have video streams defined on the same port (5004). It looks to me as if a stream is overwritten or incorrectly selected from a different instance of mplayer inside the liveMedia library. I have tried more or less all live555 library versions from 2009-06-02 up to 2012-01-13, and most mplayer versions from svn revision 32xxx to 34xxxx. They all behave the same. I also googled a bit and found some more or less related issues that occurred in the past: * http://lists.live555.com/pipermail/live-devel/2010-August/012455.html (resulting in changelog * http://web.archiveorange.com/archive/v/rmoqPnRacu5kiQlVWtKM My questions: Can someone reproduce this behavior? Am I overlooking something, or is this a real bug? Is decoding multiple videos on 1 port a known limitation of either live555 (or mplayer)? Oh mighty mailing list, can you help me? ;) I am able to compile/code/stream/test suggestions.
-------------------- sdp file details: * stream1.sdp (videostream definition 1) cat stream1.sdp v=0 o=- 15184584643367601205 15184584643367601205 IN IP4 gebakkie s=Unnamed1 i=N/A c=IN IP4 239.192.140.1 m=video 5004 RTP/AVP 33 * stream2.sdp (videostream definition 2) cat 1.sdp v=0 o=- 15184584643367601207 15184584643367601207 IN IP4 gebakkie s=Unnamed2 i=N/A c=IN IP4 239.192.140.2 m=video 5004 RTP/AVP 33 -------------------- my mplayer invocation fluff: * mplayer -vo vaapi -x 512 -y 300 -geometry +0+0 sdp://stream1.sdp * mplayer -vo vaapi -x 512 -y 300 -geometry+512+0 sdp://stream2.sdp -------------------- mplayer details: MPlayer SVN-r34365-4.3.4 (C) 2000-2011 MPlayer Team configure --prefix=/usr --confdir=/etc/mplayer --disable-mencoder --disable-gui --disable-langinfo --disable-lirc --disable-joystick --disable-apple-remote --disable-apple-ir --disable-xf86keysym --disable-radio --disable-radio-capture --disable-radio-v4l2 --disable-radio-bsdbt848 --disable-tv --disable-tv-v4l1 --disable-tv-v4l2 --disable-tv-bsdbt848 --disable-pvr --disable-smb --disable-librtmp --disable-vcd --disable-bluray --disable-dvdnav --disable-dvdread --disable-cdparanoia --disable-cddb --disable-sortsub --disable-maemo --disable-macosx-finder --disable-macosx-bundle --disable-inet6 --disable-ftp --disable-vstream --disable-w32threads --disable-ass-internal --disable-ass --disable-arts --disable-esd --disable-pulse --disable-jack --disable-nas --disable-sgiaudio --disable-sunaudio --disable-kai --disable-dart --disable-win32waveout --disable-vidix --disable-dga2 --disable-dga1 --disable-vesa --disable-svga --disable-sdl --disable-kva --disable-aa --disable-caca --disable-ggi --disable-ggiwmh --disable-direct3d --disable-directx --disable-dxr2 --disable-dxr3 --disable-ivtv --disable-v4l2 --disable-dvb --disable-mga --disable-xmga --disable-vdpau --disable-3dfx --disable-tdfxfb --disable-wii --disable-directfb --disable-zr --disable-bl --disable-tga --disable-pnm --disable-md5sum --disable-quartz --disable-xanim --disable-fbdev --disable-matrixview --disable-ffmpeg_so ----------- ffmpeg configure --disable-doc --enable-network --enable-vaapi --disable-vdpau --disable-vda --disable-vda --disable-dxva2 --disable-encoders --disable-bsfs --disable-indevs --disable-nonfree --enable-gpl ------------ 1st Mplayer output on switchover moment: COLLECT_SECTION, start: 64, size: 17, collected: 17 SKIP: 0+1, TID: 0, TLEN: 13, COLLECTED: 17 PARSE_PAT: section_len: 13, section 0/0 PROG: 1 (1-th of 1), PMT: 66 COLLECT_SECTION, start: 64, size: 22, collected: 22 SKIP: 0+1, TID: 2, TLEN: 18, COLLECTED: 22 FILL_PMT(prog=1), PMT_len: 22, IS_START: 64, TS_PID: 66, SIZE=22, M=0, ES_CNT=1, IDX=0, PMT_PTR=0xa28ea20 PARSE_PMT(1 INDEX 0), STREAM: 0, FOUND pid=0x55 (85), type=0x10000005, ES_DESCR_LENGTH: 0, bytes left: 0 ---------------------------- V:13572.0 164/164 2% 1% 0.0% 0 0 0% ^[[J^M07:12:27 Groupsock(4: 239.192.140.1, 5004, 255): read 1328 bytes from 172.16.24.3 07:12:27 Groupsock(4: 239.192.140.1, 5004, 255): read 1328 bytes from 172.16.24.3 07:12:27 Groupsock(4: 239.192.140.1, 5004, 255): read 1328 bytes from 172.16.24.3 07:12:27 Groupsock(4: 239.192.140.1, 5004, 255): read 1328 bytes from 172.16.24.3 07:12:27 Groupsock(4: 239.192.140.1, 5004, 255): read 1328 bytes from 172.16.24.3 07:12:27 Groupsock(4: 239.192.140.1, 5004, 255): read 1328 bytes from 172.16.24.3 COLLECT_SECTION, start: 64, size: 17, collected: 17 SKIP: 0+1, TID: 0, TLEN: 13, COLLECTED: 17 PARSE_PAT: section_len: 13, section 0/0 PROG: 1 (1-th of 1), PMT: 66 
07:12:27 Groupsock(4: 239.192.140.1, 5004, 255): read 1328 bytes from 172.16.24.3
07:12:27 Groupsock(4: 239.192.140.1, 5004, 255): read 1328 bytes from 172.16.24.3
COLLECT_SECTION, start: 64, size: 48, collected: 22
SKIP: 0+1, TID: 2, TLEN: 44, COLLECTED: 48
FILL_PMT(prog=1), PMT_len: 48, IS_START: 64, TS_PID: 66, SIZE=48, M=0, ES_CNT=1, IDX=0, PMT_PTR=0xa28ea20
PROG DESCR, TAG=1d, LEN=13(d)
PARSE_MP4_DESCRIPTORS, len=11
TAG=2 (0x2), DESCR_len=7, len=11, j=3
PARSE_MP4_IOD: len=7, IOD_ID=1
...descr id: 0xa, len=4
Language Descriptor: eng
PARSE_PMT(1 INDEX 1), STREAM: 0, FOUND pid=0x90 (144), type=0x4134504d, ES_DESCR_LENGTH: 6, bytes left: 5
PARSE_PMT(1 INDEX 2), STREAM: 1, FOUND pid=0x91 (145), type=0x10000005, ES_DESCR_LENGTH: 0, bytes left: 0
-----------------------------
-------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Jan 27 08:21:59 2012 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 27 Jan 2012 08:21:59 -0800 Subject: Re: [Live-devel] Multiple mplayers always decoding last stream when using sdp file In-Reply-To: References: Message-ID: You could try using VLC instead of MPlayer, but I suspect you'd get similar results. Your problem is caused by the fact that your SDP files specify different IP multicast addresses, but the same port number (in this case 5004). I.e., you are trying to receive two separate multicast streams that use the same port number. Some operating systems - most notably Linux - do not handle this well; incoming multicast packets might end up getting delivered to the same receiver, regardless of what multicast group(s) they've joined. I consider this a bug in the Linux kernel, but some Linux experts apparently disagree. Your best solution is simply to change one of your streams' port numbers, so that both the multicast addresses *and* the port numbers are different. If you can't do that, then the only solution I know is to use a different operating system - e.g., FreeBSD - which doesn't have this problem. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From michel.promonet at thalesgroup.com Fri Jan 27 10:24:02 2012 From: michel.promonet at thalesgroup.com (PROMONET Michel) Date: Fri, 27 Jan 2012 19:24:02 +0100 Subject: [Live-devel] memory leak in RTSPClient when SETUP is processed Message-ID: <14787_1327688648_4F22EBC8_14787_8874_1_1BE8971B6CFF3A4F97AF4011882AA2550155FA58E33D@THSONEA01CMS01P.one.grp> Hi Ross, When the RTSPClient executes a SETUP request, the sessionId is stored in the MediaSubsession. valgrind reports:

36 bytes in 4 blocks are definitely lost in loss record 469 of 573 (see: http://valgrind.org/docs/manual/mc-manual.html#mc-manual.leaks)
at 0x4C27939: operator new[](unsigned long) (vg_replace_malloc.c:305)
by 0xF100DD: strDup(char const*) (strDup.cpp:27)
by 0xECFE97: RTSPClient::handleSETUPResponse(MediaSubsession&, char const*, char const*, unsigned int) (RTSPClient.cpp:959)
by 0xED2010: RTSPClient::handleResponseBytes(int) (RTSPClient.cpp:1520)
by 0xED1230: RTSPClient::incomingDataHandler1() (RTSPClient.cpp:1318)
by 0xED11A6: RTSPClient::incomingDataHandler(void*, int) (RTSPClient.cpp:1311)

Obviously, it is possible to free this memory when we close the subsession, by adding something like this to our code:

if (subsession->sessionId) delete[] subsession->sessionId;
subsession->sessionId = NULL;

Let me know if you plan to free this memory somehow when the MediaSession is deleted. Best Regards, Michel.
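PS: for reference, the complete workaround we use looks roughly like this (just a sketch; "session" is our "MediaSession*", and the cast is needed because "sessionId" is declared const):

MediaSubsessionIterator iter(*session);
MediaSubsession* subsession;
while ((subsession = iter.next()) != NULL) {
  if (subsession->sessionId != NULL) {
    delete[] (char*)subsession->sessionId; // it was allocated with strDup()
    subsession->sessionId = NULL;
  }
}
Medium::close(session); // then close the session as usual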
-------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Jan 27 13:06:52 2012 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 27 Jan 2012 13:06:52 -0800 Subject: Re: [Live-devel] memory leak in RTSPClient when SETUP is processed In-Reply-To: <14787_1327688648_4F22EBC8_14787_8874_1_1BE8971B6CFF3A4F97AF4011882AA2550155FA58E33D@THSONEA01CMS01P.one.grp> References: <14787_1327688648_4F22EBC8_14787_8874_1_1BE8971B6CFF3A4F97AF4011882AA2550155FA58E33D@THSONEA01CMS01P.one.grp> Message-ID: <656281C7-C395-47D4-81C1-CB378CC3254B@live555.com> > When the RTSPClient executes a SETUP request, the sessionId is stored in the MediaSubsession. And it gets delete[]d later, when the RTSPClient does a "TEARDOWN". So this isn't really a memory leak. However, I can see why "valgrind" might get confused by this. The "sessionId" field in "MediaSubsession" is a bit strange, because it - unlike most class member fields - is not managed by the class's member functions and/or its constructor/destructor. Instead, it is managed by whatever code happens to use the "sessionId" field - in this case, the "RTSPClient" code. (Note that "MediaSession"s can be used without RTSP - e.g., if you want to receive a multicast stream using only its SDP description. So "sessionId" is not an inherent property of a "MediaSession".) However, because you're at least the third 'valgrinerd' :-) to have asked about this over the years, I might end up changing this behavior, so I don't keep getting questions about this... Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From Anarchist666 at yandex.ru Sat Jan 28 06:35:17 2012 From: Anarchist666 at yandex.ru (Fazleev Maksim) Date: Sat, 28 Jan 2012 18:35:17 +0400 Subject: Re: [Live-devel] x264 nal units and H264VideoStreamDiscreteFramer In-Reply-To: <3E353F59-9571-401F-9124-E51596329BE6@live555.com> References: <349751327139339@web107.yandex.ru> <26DE2E4D-6C51-4E5B-A9B5-BF55285C7826@live555.com> <802711327215007@web23.yandex.ru> <10501327226150@web154.yandex.ru> <3E353F59-9571-401F-9124-E51596329BE6@live555.com> Message-ID: <279191327761317@web132.yandex.ru> An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: H264VideoCamServerMediaSubsession.cpp Type: application/octet-stream Size: 3750 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: H264VideoCamServerMediaSubsession.hh Type: application/octet-stream Size: 1189 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: H264VideoCamSource.cpp Type: application/octet-stream Size: 8327 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: H264VideoCamSource.hh Type: application/octet-stream Size: 1465 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed...
Name: main.cpp Type: application/octet-stream Size: 2704 bytes Desc: not available URL: From david.myers at panogenics.com Sat Jan 28 09:38:22 2012 From: david.myers at panogenics.com (David J Myers) Date: Sat, 28 Jan 2012 17:38:22 -0000 Subject: [Live-devel] Multiple clients on a single FramedSource substream Message-ID: <002401ccdde3$a24c8420$e6e58c60$@myers@panogenics.com> Hi, I don't think I'm handling multiple client connections properly in my camera's live rtsp server. My camera has multiple streams and I derive a class from FramedSource for each of my substreams, the constructor looks like this:- StreamSource::StreamSource(UsageEnvironment &env,streamSubStream_t *substream) : FramedSource(env), m_substream(substream) { FUNCTION_TRACE; substream->source = this; m_listsources.push_back(this); ++referenceCount; if (eventTriggerId == 0) { eventTriggerId = envir().taskScheduler().createEventTrigger(deliverFrame0); } m_BufferOffset = 0; m_NalIndex = 0; INFO("StreamSource constructor, this=0x%p, ref count =%d, StreamIndex = %d\r\n", this, referenceCount, substream->iStreamNo); } So I'm saving a pointer to the source in a parameter I pass in. The trouble is, I think, when a second client connects to the same stream, another instance of the class is created, and this pointer gets overwritten with the latest instance. Then, when I need to trigger the event when I have a NAL unit to send, I call this function:- void signalNewFrameData(StreamSource* source) { TaskScheduler* ourScheduler = &(source->envir().taskScheduler()); StreamSource* ourDevice = source; if (ourScheduler != NULL) { // sanity check TRACE("About to trigger Deliver event\r\n"); ourScheduler->triggerEvent(StreamSource::eventTriggerId, ourDevice); } } I only seem to call this function for the last source on this substream. I've inherited a lot of this code but do I need to trigger the event for all sources connected or am I totally misunderstanding this mechanism? Thanks - David -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sat Jan 28 13:28:55 2012 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 28 Jan 2012 13:28:55 -0800 Subject: [Live-devel] Multiple clients on a single FramedSource substream In-Reply-To: <002401ccdde3$a24c8420$e6e58c60$@myers@panogenics.com> References: <002401ccdde3$a24c8420$e6e58c60$@myers@panogenics.com> Message-ID: <31A15B46-438E-4A35-A282-309A2B227782@live555.com> > The trouble is, I think, when a second client connects to the same stream, another instance of the class is created In your "OnDemandServerMediaSubsession" subclass constructor, are you setting the "reuseFirstSource" parameter (in the parent class constructor) to True? This is important if - as in your case - you're streaming from a live input source. It prevents a new input source object from being created each time a new client connects. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From david.myers at panogenics.com Sat Jan 28 15:58:34 2012 From: david.myers at panogenics.com (David J Myers) Date: Sat, 28 Jan 2012 23:58:34 -0000 Subject: [Live-devel] Multiple clients on a single FramedSource substream Message-ID: <000301ccde18$bf612d60$3e238820$@myers@panogenics.com> >In your "OnDemandServerMediaSubsession" subclass constructor, are you setting the "reuseFirstSource" parameter (in the parent class constructor) to True? 
This is important if - as in your case - you're streaming from a live input source. It prevents a new input source object from being created each time a new client connects. Yes, I believe I am setting this flag to true, so I can't understand why I'm getting more source objects (and an increasing reference count). I'm setting it in the constructor shown below; is this enough?

LiveMediaSubsession::LiveMediaSubsession( UsageEnvironment& env, streamSubStream_t *substream) : OnDemandServerMediaSubsession(env, true /* reuse first source */), _substream(substream) { FUNCTION_TRACE; }

Cheers - David -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sat Jan 28 20:34:54 2012 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 28 Jan 2012 20:34:54 -0800 Subject: Re: [Live-devel] Multiple clients on a single FramedSource substream In-Reply-To: <000301ccde18$bf612d60$3e238820$@myers@panogenics.com> References: <000301ccde18$bf612d60$3e238820$@myers@panogenics.com> Message-ID: > >In your "OnDemandServerMediaSubsession" subclass constructor, are you setting the "reuseFirstSource" parameter (in the parent class constructor) to True? This is important if - as in your case - you're streaming from a live input source. It prevents a new input source object from being created each time a new client connects. > > Yes, I believe I am setting this flag to true, so I can't understand why I'm getting more source objects (and an increasing reference count). OK. However, it turns out that your "createNewStreamSource()" function gets called twice (but no more), even if you've set "reuseFirstSource" to True. The first call is used to create 'dummy' source objects that might (depending upon the codec) be needed in order to determine the stream's SDP description (which the server will return in the response to the RTSP "DESCRIBE" command). Then this dummy source object gets closed. And then afterwards, when the first client does a RTSP "PLAY", "createNewStreamSource()" will get called again. (But because you've set "reuseFirstSource" to True, it won't get called again, even if more clients connect.) So, your code needs to be prepared for the following, in order: 1/ Your "createNewStreamSource()" gets called. 2/ The destructor of your source object (the one that was returned by the first call to "createNewStreamSource()") gets called. 3/ Your "createNewStreamSource()" gets called again. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From isambhav at gmail.com Sat Jan 28 22:44:36 2012 From: isambhav at gmail.com (Sambhav) Date: Sun, 29 Jan 2012 12:14:36 +0530 Subject: [Live-devel] Live H264 RTP source to RTSP Server Message-ID: Hi, I have an application which uses the H264VideoRTPSource class to receive a live H264 RTP stream. This application should now act as an RTSP server for the received H264 video. One way I was thinking to do it: - Receive the RTP H264 video using H264VideoRTPSource - Dump the H264 elementary video stream using H264VideoFileSink to a Linux PIPE - To the RTSP server, add a H264VideoFileServerMediaSubsession with the above-mentioned Linux PIPE as the input (A rough sketch of what I mean appears after this message.) Is there a better way to do this? Thanks & Regards, Sambhav -------------- next part -------------- An HTML attachment was scrubbed...
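i.e., roughly like this - a sketch only (it assumes the FIFO was created beforehand with "mkfifo /tmp/h264fifo", that "afterPlaying" is my own completion handler, and that the two halves run as separate programs, since a PIPE read/write blocks):

// Receiving program: write the incoming H.264 elementary stream into the FIFO.
// Passing the stream's sprop-parameter-sets makes the sink prepend SPS/PPS:
MediaSink* fifoSink = H264VideoFileSink::createNew(*env, "/tmp/h264fifo",
                                                   subsession->fmtp_spropparametersets());
fifoSink->startPlaying(*subsession->readSource(), afterPlaying, NULL);

// Serving program: treat the FIFO as if it were a ".264" file.
ServerMediaSession* sms = ServerMediaSession::createNew(*env, "h264Live");
sms->addSubsession(H264VideoFileServerMediaSubsession::createNew(
    *env, "/tmp/h264fifo", True /*reuseFirstSource*/));
rtspServer->addServerMediaSession(sms);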
URL: From finlayson at live555.com Sun Jan 29 00:38:58 2012 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 29 Jan 2012 00:38:58 -0800 Subject: Re: [Live-devel] Live H264 RTP source to RTSP Server In-Reply-To: References: Message-ID: > I have an application which uses the H264VideoRTPSource class to receive a live H264 RTP stream. > This application should now act as an RTSP server for the received H264 video. > > One way I was thinking to do it: > Receive the RTP H264 video using H264VideoRTPSource > Dump the H264 elementary video stream using H264VideoFileSink to a Linux PIPE > To the RTSP server, add a H264VideoFileServerMediaSubsession with the above-mentioned Linux PIPE as the input Yes, that should work, I think... Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From 6.45.vapuru at gmail.com Mon Jan 30 01:17:19 2012 From: 6.45.vapuru at gmail.com (Novalis Vapuru) Date: Mon, 30 Jan 2012 11:17:19 +0200 Subject: [Live-devel] How to stream an h264 encoded stream in an MP4 container with Live555 Message-ID: Hi, I want to stream an H.264-encoded stream in an MP4 container. So I investigated the testOnDemandRTSPServer.cpp example. Basically it can successfully stream a raw H.264 stream [I used VLC as a client]: [ H264 raw stream file ] --input---> testOnDemandRTSPServer I know how to get/extract the raw H.264 stream from an MP4 container. So what I want is: [ h264 encoded data in MP4 container ] --> MP4 Demuxer ---> [ h264 raw frame data ] --input--> testOnDemandRTSPServer So how or where should I modify testOnDemandRTSPServer so that instead of reading from a file, it just gets the output of the MP4 demuxer? Best Wishes Novalis From finlayson at live555.com Mon Jan 30 01:33:59 2012 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 30 Jan 2012 01:33:59 -0800 Subject: Re: [Live-devel] How to stream an h264 encoded stream in an MP4 container with Live555 In-Reply-To: References: Message-ID: <4F6E4F22-EBA4-4F62-86B3-81B4546377CC@live555.com> > So how or where should I modify testOnDemandRTSPServer so that instead > of reading from a file, > it just gets the output of the MP4 demuxer? You will first need to define and implement your own new subclass of "OnDemandServerMediaSubsession" that reimplements the two pure virtual functions "createNewStreamSource()" and "createNewRTPSink()". This new subclass will probably be similar to the existing "H264VideoFileServerMediaSubsession" class (which is used for streaming from a H.264 Video Elementary Stream file). Ross Finlayson Live Networks, Inc. http://www.live555.com/ From 6.45.vapuru at gmail.com Mon Jan 30 03:33:30 2012 From: 6.45.vapuru at gmail.com (Novalis Vapuru) Date: Mon, 30 Jan 2012 13:33:30 +0200 Subject: Re: [Live-devel] How to stream an h264 encoded stream in an MP4 container with Live555 In-Reply-To: <4F6E4F22-EBA4-4F62-86B3-81B4546377CC@live555.com> References: <4F6E4F22-EBA4-4F62-86B3-81B4546377CC@live555.com> Message-ID: Well, I created a new class which is a subclass of "OnDemandServerMediaSubsession" [call it MyCustomServerMediaSubsession], which overrides the "createNewStreamSource()" and "createNewRTPSink()" methods.
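In outline it looks like this (a sketch only - "MP4DemuxedVideoSource" is a placeholder for my own FramedSource wrapper around the MP4 demuxer):

class MyCustomServerMediaSubsession: public OnDemandServerMediaSubsession {
  // ... static createNew(), constructor and destructor omitted ...
protected:
  virtual FramedSource* createNewStreamSource(unsigned clientSessionId,
                                              unsigned& estBitrate);
  virtual RTPSink* createNewRTPSink(Groupsock* rtpGroupsock,
                                    unsigned char rtpPayloadTypeIfDynamic,
                                    FramedSource* inputSource);
};

FramedSource* MyCustomServerMediaSubsession
::createNewStreamSource(unsigned /*clientSessionId*/, unsigned& estBitrate) {
  estBitrate = 500; // kbps; just an estimate
  // "MP4DemuxedVideoSource" delivers one complete NAL unit at a time:
  FramedSource* demuxedSource = MP4DemuxedVideoSource::createNew(envir());
  // Because the source delivers discrete NAL units, use the *discrete* framer:
  return H264VideoStreamDiscreteFramer::createNew(envir(), demuxedSource);
}

RTPSink* MyCustomServerMediaSubsession
::createNewRTPSink(Groupsock* rtpGroupsock, unsigned char rtpPayloadTypeIfDynamic,
                   FramedSource* /*inputSource*/) {
  return H264VideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic);
}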
But the original code in testOnDemandRTSPServer for H.264 is:

// A H.264 video elementary stream:
{
  char const* streamName = "h264ESVideoTest";
  char const* inputFileName = "test.264";
  ServerMediaSession* sms = ServerMediaSession::createNew(*env, streamName, streamName, descriptionString);
  sms->addSubsession(H264VideoFileServerMediaSubsession ::createNew(*env, inputFileName, reuseFirstSource));
  rtspServer->addServerMediaSession(sms);
  announceStream(rtspServer, sms, streamName, inputFileName);
}

So should I replace H264VideoFileServerMediaSubsession with my custom MyCustomServerMediaSubsession? If so, how should I modify this code to use my custom media subsession? Best Wishes Novalis From 6.45.vapuru at gmail.com Mon Jan 30 08:01:00 2012 From: 6.45.vapuru at gmail.com (Novalis Vapuru) Date: Mon, 30 Jan 2012 18:01:00 +0200 Subject: [Live-devel] How to create my custom FramedSource based on "DeviceSource" model. Message-ID: I want to create my custom FramedSource based on the "DeviceSource" model. Suppose that: -- I have a shared global queue, call it IncomingQueue. -- IncomingQueue is a FIFO queue. -- IncomingQueue.getNextFrame() --> returns a new Frame -- Frame.getData() returns unsigned char data -- Frame.getDataSize() returns the data size So can anyone give me a concrete example which fills in the "%%% TO BE WRITTEN %%%" part of the "DeviceSource" model code, based on my FIFO queue? Best wishes Novalis
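PS - here is my current attempt at the delivery part, following the comments in "DeviceSource.cpp" (a sketch only; "QueueDeviceSource" is my class name, and I assume IncomingQueue.getNextFrame() is non-blocking and returns NULL when the queue is empty):

void QueueDeviceSource::deliverFrame() {
  if (!isCurrentlyAwaitingData()) return; // we're not ready for the data yet

  Frame* frame = IncomingQueue.getNextFrame(); // NULL if the queue is empty
  if (frame == NULL) return;

  unsigned char* newFrameData = frame->getData();
  unsigned newFrameSize = frame->getDataSize();

  // Deliver the data, truncating if it won't fit in the downstream buffer:
  if (newFrameSize > fMaxSize) {
    fFrameSize = fMaxSize;
    fNumTruncatedBytes = newFrameSize - fMaxSize;
  } else {
    fFrameSize = newFrameSize;
    fNumTruncatedBytes = 0;
  }
  gettimeofday(&fPresentationTime, NULL); // or a capture time carried in the Frame
  memmove(fTo, newFrameData, fFrameSize);

  // After delivering the data, inform the reader that it is now available:
  FramedSource::afterGetting(this);
}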
From iminimup at gmail.com Mon Jan 30 10:57:12 2012 From: iminimup at gmail.com (imin imup) Date: Mon, 30 Jan 2012 12:57:12 -0600 Subject: [Live-devel] motion JPEG over RTP over UDP streaming server (RFC2435). Message-ID: Hello Ross, This email is a bit long. Please bear with me for a moment. After going through your 2 excellent FAQs and the Elphel source, am I correct to say that the quickest way to build an MJPEG streamer (from JPEG files, in a prototype) would be to modify ElphelJPEGDeviceSource.cpp only? Please help me understand the execution flow of your ElphelJPEGDeviceSource.cpp: - main() in ElphelStreamer.cpp calls through createNew() to the constructor to startCapture(), which issues JPEG_CMD_ACQUIRE to get a new frame. - A callback is set up in the constructor. If newFrameHandler() is called back after doGetNextFrame(), the captured frame will be read out and sent by deliverFrameToClient() immediately. Otherwise, the captured frame will be read and sent when JPEGVideoRTPSink next calls doGetNextFrame(). Then deliverFrameToClient() calls startCapture() to get the next new frame. Questions here (just trying to understand how it works): - What is the cmd JPEG_CMD_CATCHUP doing? - It seems the 1st frame sent out in a stream "may" be obsolete - captured a long time ago and left in the camera when doGetNextFrame() is called for the first time. Is this true? In my own version: - In the constructor: since I'll read JPEG files, the newFrameHandler callback cannot be set, as there is no socket. - In deliverFrameToClient(): I will also need the following statement to notify JPEGVideoRTPSink: // Switch to another task, and inform the reader that he has data: nextTask() = envir().taskScheduler().scheduleDelayedTask(0, (TaskFunc*)FramedSource::afterGetting, this); Any other suggestions or comments? Thanks a lot, Imin On Sun, Jan 15, 2012 at 11:33 PM, Ross Finlayson wrote: > However, our RTSP server implementation *can* be used - with some >> additional programming - to stream motion JPEG. >> > I'm a newbie on RTP. Could you please detail the additional programming? > > The two links in the FAQ entry that I showed you should give you the information that you need. In fact, I recommend that you read the entire FAQ. > -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Jan 30 19:14:51 2012 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 30 Jan 2012 19:14:51 -0800 Subject: Re: [Live-devel] motion JPEG over RTP over UDP streaming server (RFC2435). In-Reply-To: References: Message-ID: <3FA7311B-D23E-4522-B510-DC3E500DB0EA@live555.com> > After going through your 2 excellent FAQs and the Elphel source, am I correct to say that the quickest way to build an MJPEG streamer (from JPEG files, in a prototype) would be to modify ElphelJPEGDeviceSource.cpp only? Actually, I don't recommend that you bother with streaming JPEG at all (especially as this is just a hobby for you). JPEG streaming is very inefficient. See http://www.live555.com/liveMedia/faq.html#jpeg-streaming Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From xzha286 at aucklanduni.ac.nz Mon Jan 30 20:01:44 2012 From: xzha286 at aucklanduni.ac.nz (James Zhang) Date: Tue, 31 Jan 2012 17:01:44 +1300 Subject: [Live-devel] question about parseSPropParameterSets() Message-ID: Hello everyone I have a question about the parseSPropParameterSets() function. Based on my understanding, I think this function will read in the SPS and PPS data, then do a base64 decode to generate NAL units. I have fetched the SPS and PPS data by using this: unsigned int num=0; SPropRecord * sps = parseSPropParameterSets(context->subsession->fmtp_spropparametersets(), num); My question is: how can I store the SPropRecord * data into an NSData and send it as extradata to decode? Thanks a lot Best regards James -- James Zhang BE (Hons) Department of Electrical and Computer Engineering THE UNIVERSITY OF AUCKLAND -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Jan 30 21:52:01 2012 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 30 Jan 2012 21:52:01 -0800 Subject: Re: [Live-devel] question about parseSPropParameterSets() In-Reply-To: References: Message-ID: <4ACBF1A3-9F91-4CE5-95D2-E955BF39CF88@live555.com> G'day James, it's nice to hear from another University of Auckland person. > I have a question about the parseSPropParameterSets() function. > Based on my understanding, I think this function will read in the SPS and PPS data, Yes, it will read in a coded ASCII string that represents the SPS and PPS NAL units. > then do a base64 decode to generate NAL units.
> I have fetched the SPS and PPS data by using this:
>
> unsigned int num=0;
> SPropRecord * sps = parseSPropParameterSets(context->subsession->fmtp_spropparametersets(), num);

After the call, "sps" will be an array of "num" "SPropRecord"s - each one containing the data for a NAL unit (usually SPS or PPS). So you can do, for example:

for (unsigned i = 0; i < num; ++i) {
  unsigned nalUnitSize = sps[i].sPropLength;
  unsigned char* nalUnitBytes = sps[i].sPropBytes; // this is a byte array, of size "nalUnitSize".
  // Then do whatever you like with this NAL unit data
}

> My question is: how can I store the SPropRecord * data into an NSData and send it as extradata to decode?

I don't know what a "NSData" is (it's apparently something outside our libraries), but I hope it should be obvious from the implementation of the function in "liveMedia/H264VideoRTPSource.cpp" how it works. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From xzha286 at aucklanduni.ac.nz Mon Jan 30 22:27:44 2012 From: xzha286 at aucklanduni.ac.nz (James Zhang) Date: Tue, 31 Jan 2012 19:27:44 +1300 Subject: Re: [Live-devel] question about parseSPropParameterSets() In-Reply-To: <4ACBF1A3-9F91-4CE5-95D2-E955BF39CF88@live555.com> References: <4ACBF1A3-9F91-4CE5-95D2-E955BF39CF88@live555.com> Message-ID: Hello Ross Thank you very much for your fast and nice reply. After parseSPropParameterSets, I should be able to get the SPS and PPS (in binary format?) But when I use a print command to print them as strings, I get some weird data. I'm not sure why it looks like that. The code:

for (unsigned i = 0; i < num; ++i) {
  printf("%s\n", sps[i].sPropBytes);
}

-- James Zhang BE (Hons) Department of Electrical and Computer Engineering THE UNIVERSITY OF AUCKLAND -------------- next part -------------- An HTML attachment was scrubbed...
URL: From finlayson at live555.com Mon Jan 30 23:17:05 2012 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 30 Jan 2012 23:17:05 -0800 Subject: Re: [Live-devel] question about parseSPropParameterSets() In-Reply-To: References: Message-ID: <01CE4C42-C1EA-4EC1-929C-F122E0C06F6E@live555.com> > After parseSPropParameterSets, I should be able to get the SPS and PPS (in binary format?) Yes, and that's exactly what the "sPropBytes" field is! It's a pointer to an array of "sPropLength" bytes (i.e., binary). > But when I use a print command to print them as strings, I get some weird data. No! You can't print the "sPropBytes" field as a string, because it's not a string - it's the binary data!
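If you want to inspect the data, print it in hex instead - e.g., something like:

for (unsigned i = 0; i < num; ++i) {
  printf("NAL unit %u (%u bytes):", i, sps[i].sPropLength);
  for (unsigned j = 0; j < sps[i].sPropLength; ++j) {
    printf(" %02x", sps[i].sPropBytes[j]);
  }
  printf("\n");
}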
> > > Yes, and that's exactly what the "sPropBytes" field is! ?It's a pointer to > an array of "sPropLength" bytes (i.e., binary). > > > But when i use print command to print the strings, I got some wired data. > > > No! ?You can't print the "sProbBytes" field as a string, because it's not a > string - it's the binary data! > > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > From finlayson at live555.com Tue Jan 31 00:28:44 2012 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 31 Jan 2012 00:28:44 -0800 Subject: [Live-devel] Live H264 RTP source to RTSP Server In-Reply-To: References: Message-ID: <75D701BF-DAFC-4D80-8D5A-B11B88A85A22@live555.com> > In H264VideoFileServerMediaSubsession::createNewStreamSource instead of ByteStreamFileSource can H264VideoRTPSource be used so that I can get live input ? I think that will work. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jkburges at gmail.com Tue Jan 31 01:05:59 2012 From: jkburges at gmail.com (Jon Burgess) Date: Tue, 31 Jan 2012 20:05:59 +1100 Subject: [Live-devel] question about parseSPropParameterSets() Message-ID: > > > > > SPropRecord * data to a NSData and send into extradata to decode? > > I don't know what a "NSData" is (it's apparently something outside our > libraries), but I hope it should be obvious from the implementation of the > function in "liveMedia/H264VideoRTPSource.cpp" how it works. > > "NSData" is basically a glorified byte buffer available on cocoa/Mac/iOS. Cheers, Jon Burgess -------------- next part -------------- An HTML attachment was scrubbed... URL: From isambhav at gmail.com Tue Jan 31 01:47:46 2012 From: isambhav at gmail.com (Sambhav) Date: Tue, 31 Jan 2012 15:17:46 +0530 Subject: [Live-devel] Live H264 RTP source to RTSP Server In-Reply-To: <75D701BF-DAFC-4D80-8D5A-B11B88A85A22@live555.com> References: <75D701BF-DAFC-4D80-8D5A-B11B88A85A22@live555.com> Message-ID: It is not working. Its crashing. The gdb stacktrace is given below. The onDemandRTSP Server will initialize the source and sink when it receives a DESCRIBE from the client. Now when it starts to form the SDP for which it starts looking for bitstream, if there is no data received at the input will it be rescheduled ? Does direct replace of ByteStreamFileSource to H264VideoRTPSource work ? Am I missing something due to which its crashing ? Program received signal EXC_BAD_ACCESS, Could not access memory. 
Reason: KERN_INVALID_ADDRESS at address: 0x000000010c180938 0x000000010003cd41 in BasicTaskScheduler::setBackgroundHandling (this=0x1002008c0, socketNum=1606414464, conditionSet=2, handlerProc=0xfffffffe, clientData=0x100204320) at BasicTaskScheduler.cpp:191 #0 BasicTaskScheduler::setBackgroundHandling (this=0x1002008c0, socketNum=1606414464, conditionSet=2, handlerProc=0x10001d534 , clientData=0x100204320) at BasicTaskScheduler.cpp:191 #1 0x0000000100020056 in RTPInterface::startNetworkReading (this=0x1002043b0, handlerProc=0x10001d534 ) at UsageEnvironment.hh:156 #2 0x000000010001d570 in MultiFramedRTPSource::doGetNextFrame (this=0x1002008c0) at MultiFramedRTPSource.cpp:119 #3 0x0000000100034d49 in StreamParser::ensureValidBytes1 (this=0x1002046b0, numBytesNeeded=490640) at StreamParser.cpp:159 #4 0x000000010000aeeb in StreamParser::ensureValidBytes () at /Users/sam/Documents/Development/OpenSource/Protocols/Live555/live/liveMedia/StreamParser.hh:556 #5 0x000000010000aeeb in StreamParser::test4Bytes () at /Users/sam/Documents/Development/OpenSource/Protocols/Live555/live/liveMedia/StreamParser.hh:54 #6 0x000000010000aeeb in H264VideoStreamParser::parse (this=0x1002008c0) at H264VideoStreamFramer.cpp:556 #7 0x0000000100005ff3 in MPEGVideoStreamFramer::continueReadProcessing (this=0x100204550) at MPEGVideoStreamFramer.cpp:154 #8 0x00000001000184fd in H264FUAFragmenter::doGetNextFrame (this=0x1002008c0) at H264VideoRTPSink.cpp:167 #9 0x000000010001ecc3 in MultiFramedRTPSink::packFrame (this=0x1002008c0) at MultiFramedRTPSink.cpp:216 #10 0x000000010001f13e in MultiFramedRTPSink::continuePlaying (this=0x1002008c0) at MultiFramedRTPSink.cpp:152 #11 0x00000001000296ab in H264VideoFileServerMediaSubsession::getAuxSDPLine (this=0x1002008c0, rtpSink=0x5fbff080, inputSource=0x2) at H264VideoFileServerMediaSubsession.cpp:94 #12 0x00000001000286ce in OnDemandServerMediaSubsession::setSDPLinesFromRTPSink (this=0x100201320, rtpSink=0x5fbff080, inputSource=0x2, estBitrate=1606414464) at OnDemandServerMediaSubsession.cpp:308 #13 0x0000000100028d6a in OnDemandServerMediaSubsession::sdpLines (this=0x1002008c0) at OnDemandServerMediaSubsession.cpp:67 #14 0x0000000100027b67 in ServerMediaSession::generateSDPDescription (this=0x7fff5fbff270) at ServerMediaSession.cpp:214 #15 0x00000001000258b7 in RTSPServer::RTSPClientSession::handleCmd_DESCRIBE (this=0x100800600, cseq=0x7fff5fbff570 "3", urlPreSuffix=0x7fff5fbff700 "", urlSuffix=0x100201260 "?X\a", fullRequestStr=0x7fff5fbff390 "??_?") at RTSPServer.cpp:624 #16 0x0000000100024f1d in RTSPServer::RTSPClientSession::handleRequestBytes (this=0x100800600, newBytesRead=8390375) at RTSPServer.cpp:469 #17 0x00000001000253c7 in RTSPServer::RTSPClientSession::incomingRequestHandler (instance=0x100800600, unnamed_arg=1606414464) at RTSPServer.cpp:360 #18 0x000000010003d10c in BasicTaskScheduler::SingleStep (this=0x1002008c0, maxDelayTime=0) at BasicTaskScheduler.cpp:146 #19 0x000000010003c24b in BasicTaskScheduler0::doEventLoop (this=0x1002008c0, watchVariable=0x0) at BasicTaskScheduler0.cpp:80 #20 0x0000000100002072 in main (argc=2099392, argv=0x5fbff080) at testOnDemandRTSPServer.cpp:340 On Tue, Jan 31, 2012 at 1:58 PM, Ross Finlayson wrote: > In H264VideoFileServerMediaSubsession::createNewStreamSource instead > of ByteStreamFileSource can H264VideoRTPSource be used so that I can get > live input ? > > > I think that will work. > > > Ross Finlayson > Live Networks, Inc. 
From finlayson at live555.com  Tue Jan 31 01:56:38 2012
From: finlayson at live555.com (Ross Finlayson)
Date: Tue, 31 Jan 2012 01:56:38 -0800
Subject: [Live-devel] Live H264 RTP source to RTSP Server
In-Reply-To: 
References: <75D701BF-DAFC-4D80-8D5A-B11B88A85A22@live555.com>
Message-ID: <3F3904D4-F5C4-4132-B5FC-4EC294D7885D@live555.com>

One thing that you're definitely doing wrong is feeding your incoming H.264 NAL units into a "H264VideoStreamFramer". Because you are reading discrete NAL units - i.e., complete NAL units, one at a time - from your source, you *must* feed them into a "H264VideoStreamDiscreteFramer" instead.

But in any case, I don't have time to do a lot of hand-holding for people who use "@gmail.com" email addresses. You're pretty much on your own here...

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
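A sketch of the substitution being discussed here - an "H264VideoRTPSource" wrapped in an "H264VideoStreamDiscreteFramer" inside "createNewStreamSource()". (Frame #6 of the stack trace above shows "H264VideoStreamParser::parse" in "H264VideoStreamFramer.cpp" - i.e., the byte-stream framer that Ross is warning about.) The subsession class, its "fRTPGroupsock" member, and the payload type 96 are illustrative assumptions, not code from the library:

FramedSource* MyLiveH264Subsession::createNewStreamSource(
    unsigned /*clientSessionId*/, unsigned& estBitrate) {
  estBitrate = 500; // kbps; just an estimate

  // Receive complete H.264 NAL units from the incoming RTP session.
  // "fRTPGroupsock" is assumed to be a Groupsock bound to the port on
  // which the live source is sending RTP.
  H264VideoRTPSource* rtpSource = H264VideoRTPSource::createNew(
      envir(), fRTPGroupsock, 96 /*dynamic payload type*/, 90000);

  // The source delivers discrete NAL units, one at a time, so it must be
  // wrapped in the *discrete* framer; H264VideoStreamFramer would try to
  // parse an Annex-B byte stream that isn't there.
  return H264VideoStreamDiscreteFramer::createNew(envir(), rtpSource);
}

A subsession built this way would presumably also need the kind of "getAuxSDPLine()" handling that "H264VideoFileServerMediaSubsession" does (see frame #11 of the trace), because the SPS/PPS needed for the SDP aren't known until data starts arriving.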
From xzha286 at aucklanduni.ac.nz  Tue Jan 31 02:17:13 2012
From: xzha286 at aucklanduni.ac.nz (James Zhang)
Date: Tue, 31 Jan 2012 23:17:13 +1300
Subject: [Live-devel] question about parseSPropParameterSets()
In-Reply-To: 
References: 
Message-ID: 

Hello everyone

Thanks for everybody's suggestions. It looks like I have made it start to do something, instead of producing no frames and failing to decode. I keep getting errors like the ones below. Is it because I am putting the wrong NAL units into the decoder? My NAL structure is: 0x1 SPS 0x1 PPS

[h264 @ 0x700f400]slice type too large (2) at 0 0
[h264 @ 0x700f400]decode_slice_header error
[h264 @ 0x700f400]non-existing PPS 0 referenced
[h264 @ 0x700f400]decode_slice_header error
[h264 @ 0x700f400]B picture before any references, skipping
[h264 @ 0x700f400]decode_slice_header error
[h264 @ 0x700f400]B picture before any references, skipping
[h264 @ 0x700f400]decode_slice_header error
[h264 @ 0x700f400]B picture before any references, skipping
[h264 @ 0x700f400]decode_slice_header error
[h264 @ 0x700f400]B picture before any references, skipping
[h264 @ 0x700f400]decode_slice_header error
[h264 @ 0x700f400]slice type too large (3) at 0 0
[h264 @ 0x700f400]decode_slice_header error
[h264 @ 0x700f400]non-existing PPS 0 referenced
[h264 @ 0x700f400]decode_slice_header error
[h264 @ 0x700f400]B picture before any references, skipping
[h264 @ 0x700f400]decode_slice_header error
[h264 @ 0x700f400]B picture before any references, skipping
[h264 @ 0x700f400]decode_slice_header error
[h264 @ 0x700f400]slice type too large (3) at 0 0
[h264 @ 0x700f400]decode_slice_header error
[h264 @ 0x700f400]AVC: nal size 0
[h264 @ 0x700f400]AVC: nal size 8459

Thank you very much
Best regards
James

--
James Zhang
BE (Hons)
Department of Electrical and Computer Engineering
THE UNIVERSITY OF AUCKLAND

From ivan.mayo.calvo at gmail.com  Tue Jan 31 02:03:09 2012
From: ivan.mayo.calvo at gmail.com (Ivan Mayo)
Date: Tue, 31 Jan 2012 11:03:09 +0100
Subject: [Live-devel] H264, MPEG4, MJPEG recording and streaming with proper timing information.
Message-ID: 

Hello,

I am currently developing stream recorder software based on the Live555 libs; I am recording MPEG-4, H.264 and MJPEG, and streaming the recordings later using XXXFileServerMediaSubsession. The main problem is that, since the files contain the raw recorded stream, I don't have any timing information such as the frame rate or a timestamp for each image. If I want to seek to an exact timestamp, I can only estimate the byte offset to seek to in the file, which is quite imprecise if you want an accuracy of a few hundred milliseconds.

My question is how one is meant to record those streams with timing information and the proper streaming parameters such as the frame rate. Should I use AVIFileSink? If so, I don't think there is an AVIFileServerMediaSubsession as an example - is there any example of how to stream an AVI? Which is the best way to record those RTSP streams with that kind of timing accuracy for later streaming? It turns out that the H.264 streams I am recording do not contain any VUI in their SPS NAL units.

I also have this problem with MJPEG, because I don't know how to get the qFactor of the JPEG frames in the stream. I know that JPEGVideoRTPSource knows about it, but I don't know how to sink it to the file, nor which file format would be best for recording an MJPEG stream with timing information.

Thank you very much and best regards.

From Marlon at scansoft.co.za  Tue Jan 31 02:30:05 2012
From: Marlon at scansoft.co.za (Marlon Reid)
Date: Tue, 31 Jan 2012 12:30:05 +0200
Subject: [Live-devel] Mp3 Tags
Message-ID: <002962EA5927BE45B2FFAB0B5B5D67970A8B0E@SSTSVR1.sst.local>

Hi,

It seems to me that Live555 removes ID3 tags from MP3 files somewhere before or during streaming. Is this so? If so, where is this done? I checked MP3FileSource and MPEG1or2AudioRTPSink but cannot see anything that seems to relate to ID3 tags.

Thank you.

From finlayson at live555.com  Tue Jan 31 02:38:58 2012
From: finlayson at live555.com (Ross Finlayson)
Date: Tue, 31 Jan 2012 02:38:58 -0800
Subject: [Live-devel] H264, MPEG4, MJPEG recording and streaming with proper timing information.
In-Reply-To: 
References: 
Message-ID: <57648D07-59C6-4ACD-9443-A67EAD1FF039@live555.com>

Unfortunately, none of the output 'multimedia' file formats that we currently support - .mov/.mp4, or .avi - are very good at recording accurate timestamp information along with each frame, so they are not well suited to recording incoming streams for later re-streaming. (The Matroska file format is much better for this, and someday I hope to support recording into Matroska files; we can already stream from such files.)
If you're recording incoming streams only for later re-streaming (i.e., you don't also plan to play the files locally using a video player), then I suggest coming up with your own simple file format that records a timestamp (i.e., presentation time) and frame size along with each frame's data.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
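A minimal sketch of such a format - one fixed-size header, holding the presentation time and the frame size, in front of each frame's data. The names and layout are invented for illustration; a real implementation would also want to write the fields with an explicit byte order rather than dumping the struct raw:

#include <cstdio>
#include <stdint.h>
#include <sys/time.h> // struct timeval, as delivered to afterGettingFrame()

// One of these precedes each frame's data in the file.
struct FrameHeader {
  uint64_t ptsMicroseconds; // presentation time, in microseconds
  uint32_t frameSize;       // number of data bytes following this header
};

// Typically called from a MediaSink's afterGettingFrame() callback:
void writeFrame(FILE* out, struct timeval presentationTime,
                unsigned char const* data, unsigned size) {
  FrameHeader h;
  h.ptsMicroseconds =
      (uint64_t)presentationTime.tv_sec * 1000000 + presentationTime.tv_usec;
  h.frameSize = size;
  fwrite(&h, sizeof h, 1, out); // header first...
  fwrite(data, 1, size, out);   // ...then the raw frame data
}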
From finlayson at live555.com  Tue Jan 31 02:45:25 2012
From: finlayson at live555.com (Ross Finlayson)
Date: Tue, 31 Jan 2012 02:45:25 -0800
Subject: [Live-devel] Mp3 Tags
In-Reply-To: <002962EA5927BE45B2FFAB0B5B5D67970A8B0E@SSTSVR1.sst.local>
References: <002962EA5927BE45B2FFAB0B5B5D67970A8B0E@SSTSVR1.sst.local>
Message-ID: <2BEC7ECC-651C-49AA-8ADE-1893F05A249B@live555.com>

> It seems to me that Live555 removes ID3 tags from MP3 files somewhere before or during streaming. Is this so?

Yes, because there's no standard way defined - in the RTP protocol - for streaming the 'ID3' tag information along with the MP3 audio data. Our software deals only with the MP3 audio frames.

If you really want your receiver to get ID3 tags for MP3 files, then you should be doing file transfer, not streaming.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From Marlon at scansoft.co.za  Tue Jan 31 03:11:40 2012
From: Marlon at scansoft.co.za (Marlon Reid)
Date: Tue, 31 Jan 2012 13:11:40 +0200
Subject: [Live-devel] Mp3 Tags
In-Reply-To: <2BEC7ECC-651C-49AA-8ADE-1893F05A249B@live555.com>
References: <002962EA5927BE45B2FFAB0B5B5D67970A8B0E@SSTSVR1.sst.local> <2BEC7ECC-651C-49AA-8ADE-1893F05A249B@live555.com>
Message-ID: <002962EA5927BE45B2FFAB0B5B5D67970A8B0F@SSTSVR1.sst.local>

Thanks for the reply. I agree that stripping off the tags is the correct behaviour; I was just wondering how you accomplish this. Where in Live555 does it strip out the tags?

Regards.

From Marlon at scansoft.co.za  Tue Jan 31 03:14:21 2012
From: Marlon at scansoft.co.za (Marlon Reid)
Date: Tue, 31 Jan 2012 13:14:21 +0200
Subject: [Live-devel] FW: Mp3 Tags
Message-ID: <002962EA5927BE45B2FFAB0B5B5D67970A8B11@SSTSVR1.sst.local>

I found it, in MP3StreamState.cpp. Thanks again.

From ivan.mayo.calvo at gmail.com  Tue Jan 31 02:59:07 2012
From: ivan.mayo.calvo at gmail.com (Ivan Mayo)
Date: Tue, 31 Jan 2012 11:59:07 +0100
Subject: [Live-devel] H264, MPEG4, MJPEG recording and streaming with proper timing information.
In-Reply-To: <57648D07-59C6-4ACD-9443-A67EAD1FF039@live555.com>
References: <57648D07-59C6-4ACD-9443-A67EAD1FF039@live555.com>
Message-ID: 

Great, thank you very much!

So, your idea is to create my own file sink with my own file format, recording each frame's timestamp and size, and then create my own ByteStreamFileSource-style source that serves the data packets as they were written, filtering out those timestamps and frame sizes. I could use a header value such as 0xAABBCCDD to mark the timestamp entry points - did I understand you properly? Would you suggest using a different header, or none at all, to mark the entry points?

Thank you again for your amazing feedback.

From jmm55 at psu.edu  Tue Jan 31 07:48:56 2012
From: jmm55 at psu.edu (Justin Miller)
Date: Tue, 31 Jan 2012 10:48:56 -0500
Subject: [Live-devel] OpenRTSP quicktime file issues
Message-ID: 

When we are writing a QuickTime file from an IP address stream, the recording turns out fine. The issue we are having is that when viewing the file in QuickTime after the recording, we get compression artifacts when scrubbing through the timeline; additionally, the audio sync is off after scrubbing, but perfectly synced during fresh playback. Is openRTSP creating a transport-stream QuickTime file, and is that the issue? If so, is there any way to record to a program-stream file?

Thanks,
Justin M. Miller
Multimedia Specialist
Media Commons Project Manager
212A Rider bld.
University Park, PA 16802
814-863-7764
http://mediacommons.psu.edu
From finlayson at live555.com  Tue Jan 31 08:01:36 2012
From: finlayson at live555.com (Ross Finlayson)
Date: Tue, 31 Jan 2012 08:01:36 -0800
Subject: [Live-devel] H264, MPEG4, MJPEG recording and streaming with proper timing information.
In-Reply-To: 
References: <57648D07-59C6-4ACD-9443-A67EAD1FF039@live555.com>
Message-ID: <3A82D089-3E4D-4644-AA2E-11B802CBF0DC@live555.com>

> So, your idea is to create my own file sink with my own file format, recording each frame's timestamp and size, and then create my own ByteStreamFileSource-style source that serves the data packets as they were written, filtering out those timestamps and frame sizes.

Yes.

> I could use a header value such as 0xAABBCCDD to mark the timestamp entry points - did I understand you properly?

You don't need to do that, if your headers are all the same size and contain the frame size. Instead, the frame size value will tell you how many bytes you need to read to get to the next header.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From finlayson at live555.com  Tue Jan 31 08:05:42 2012
From: finlayson at live555.com (Ross Finlayson)
Date: Tue, 31 Jan 2012 08:05:42 -0800
Subject: [Live-devel] OpenRTSP quicktime file issues
In-Reply-To: 
References: 
Message-ID: 

> When we are writing a QuickTime file from an IP address stream, the recording turns out fine. The issue we are having is that when viewing the file in QuickTime after the recording, we get compression artifacts when scrubbing through the timeline; additionally, the audio sync is off after scrubbing, but perfectly synced during fresh playback. Is openRTSP creating a transport-stream QuickTime file, and is that the issue?

There's no such thing as a "Transport Stream QuickTime file". "Transport Stream" and "QuickTime file" are two completely different file formats.

> If so, is there any way to record to a program-stream file?

No.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From ivan.mayo.calvo at gmail.com  Tue Jan 31 08:17:16 2012
From: ivan.mayo.calvo at gmail.com (Ivan Mayo)
Date: Tue, 31 Jan 2012 17:17:16 +0100
Subject: [Live-devel] H264, MPEG4, MJPEG recording and streaming with proper timing information.
In-Reply-To: <3A82D089-3E4D-4644-AA2E-11B802CBF0DC@live555.com>
References: <57648D07-59C6-4ACD-9443-A67EAD1FF039@live555.com> <3A82D089-3E4D-4644-AA2E-11B802CBF0DC@live555.com>
Message-ID: 

> > I could use a header value such as 0xAABBCCDD to mark the timestamp entry points - did I understand you properly?
>
> You don't need to do that, if your headers are all the same size and contain the frame size. Instead, the frame size value will tell you how many bytes you need to read to get to the next header.

Well, I was thinking about some sanity check, because finding a wrong timestamp could do serious damage to the stream. However, I am now thinking it may be cleaner to create a separate file - something like a .vsif ("video streaming information file") - that records some known data such as the frame rate or JPEG quality, plus a byte-offset/timestamp pair every second or so, and to have the FileServerMediaSubsession use this file to manage seeking, scaling and framer creation.

Thank you! I will take a look at these two possibilities.
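Continuing the earlier sketch, this is what that read-by-frame-size walk looks like: because every header is the same size and carries the frame size, a reader can hop from header to header with no sync marker at all. The scan is linear; the separate index file proposed above would only change where the scan starts, not how it works. Same invented layout as before:

#include <cstdio>
#include <stdint.h>

struct FrameHeader {        // same invented layout as the writer sketch
  uint64_t ptsMicroseconds;
  uint32_t frameSize;
};

// Returns the file offset of the header of the first frame whose
// presentation time is >= targetPTS, or -1 if the file ends first.
long seekToPTS(FILE* in, uint64_t targetPTSMicroseconds) {
  FrameHeader h;
  for (;;) {
    long headerPos = ftell(in);
    if (fread(&h, sizeof h, 1, in) != 1) return -1; // end of file
    if (h.ptsMicroseconds >= targetPTSMicroseconds) {
      fseek(in, headerPos, SEEK_SET); // leave the stream positioned at the header
      return headerPos;
    }
    fseek(in, h.frameSize, SEEK_CUR); // the size tells us where the next header is
  }
}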
From iminimup at gmail.com  Tue Jan 31 09:52:12 2012
From: iminimup at gmail.com (imin imup)
Date: Tue, 31 Jan 2012 11:52:12 -0600
Subject: [Live-devel] motion JPEG over RTP over UDP streaming server (RFC2435).
In-Reply-To: <3FA7311B-D23E-4522-B510-DC3E500DB0EA@live555.com>
References: <3FA7311B-D23E-4522-B510-DC3E500DB0EA@live555.com>
Message-ID: 

On Mon, Jan 30, 2012 at 9:14 PM, Ross Finlayson wrote:

> > After going through your two excellent FAQ entries and the Elphel source, am I correct to say the quickest way to build an MJPEG streamer (from JPEG files, as a prototype) would be to modify ElphelJPEGDeviceSource.cpp only?
>
> Actually, I don't recommend that you bother with streaming JPEG at all (especially as this is just a hobby for you). JPEG streaming is very inefficient. See http://www.live555.com/liveMedia/faq.html#jpeg-streaming

Thanks for your suggestion. I understand MJPEG consumes a much higher data rate (it looks like 5x to 10x). However, Motion JPEG is widely used in addition to H.264 in my target application. My job is to build a field-mode Motion JPEG streamer from a live interlaced video source (it seems Elphel only supports progressive scan).

My plan is (a) to set up an MJPEG RTP streamer on a PC first and test it, and (b) if it works in push mode without RTSP/TCP, move it onto the target device, which prefers to support UDP only.

Any help and advice from you are appreciated.
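For what it's worth, the route the FAQ entry describes is to subclass "JPEGVideoSource": "JPEGVideoRTPSink" asks its source for the RFC 2435 payload-header fields through virtual functions. A skeleton only, with the capture and JFIF-header parsing left out; the class name and members are invented, and the type/qFactor values are merely examples:

class MJPEGLiveSource: public JPEGVideoSource {
public:
  static MJPEGLiveSource* createNew(UsageEnvironment& env) {
    return new MJPEGLiveSource(env);
  }

  // JPEGVideoSource virtuals, used by JPEGVideoRTPSink to build the
  // RFC 2435 payload header:
  virtual u_int8_t type()    { return 1; }   // e.g. 1 = YUV 4:2:0 (0 = 4:2:2)
  virtual u_int8_t qFactor() { return 255; } // >= 128 => Q tables sent in-band
  virtual u_int8_t width()   { return fLastWidth; }  // frame width, in 8-pixel units
  virtual u_int8_t height()  { return fLastHeight; } // frame height, in 8-pixel units
  // With qFactor() >= 128, quantizationTables() should also be overridden,
  // returning the tables parsed out of each frame's JFIF headers.

protected:
  MJPEGLiveSource(UsageEnvironment& env)
    : JPEGVideoSource(env), fLastWidth(0), fLastHeight(0) {}

  // FramedSource virtual: grab one JPEG image, strip its JFIF headers,
  // copy at most fMaxSize payload bytes into fTo, set fFrameSize and
  // fPresentationTime, update fLastWidth/fLastHeight, then call
  // FramedSource::afterGetting(this).
  virtual void doGetNextFrame() {
    // ... capture and JFIF parsing elided ...
  }

private:
  u_int8_t fLastWidth, fLastHeight;
};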