From henxy21 at gmail.com  Wed Jul  1 02:36:40 2009
From: henxy21 at gmail.com (Z.T Huang)
Date: Wed, 1 Jul 2009 17:36:40 +0800
Subject: [Live-devel] openRTSP transmission question

Thanks, Ross :-)

I printed each packet before sending it, and the problem was gone - so I
think the streamer was probably sending packets too fast (server and client
are both on the same machine) for the receiver to keep up.

Z.T

2009/6/30 Ross Finlayson

>> I'm trying to send a JPEG file, and wrote my own JPEGVideoSource. I use
>> openRTSP as the client, and my input test file is just a single frame, but
>> I ran into some trouble: sometimes openRTSP received the stream and saved
>> it, but sometimes it didn't - the received file was 0 bytes - and I always
>> got "BYE" and "TEARDOWN" messages. I don't know what is wrong. Could
>> anyone point me in the right direction, or suggest how and where to put
>> some debug code in openRTSP to see whether it receives packets?
>
> When you don't receive a frame, it's probably because one or more of the
> RTP packets got lost.  If any of the (very many) RTP packets that make up a
> JPEG frame get lost, then the whole frame is discarded.
>
> JPEG is a very poor codec for video streaming.  You should use a more
> efficient codec (designed specifically for video) instead.
> --
>
> Ross Finlayson
> Live Networks, Inc.
> http://www.live555.com/

From soumya.patra at lge.com  Wed Jul  1 03:41:05 2009
From: soumya.patra at lge.com (soumya patra)
Date: Wed, 1 Jul 2009 16:11:05 +0530
Subject: [Live-devel] video-MP4V-ES-1 not playing in case of testOnDemandRTSPServer
Message-ID: <20090701104105.B25AC558008@LGEMRELSE6Q.lge.com>

Hi Ross,

I am using openRTSP to receive the mpeg4ESVideoTest stream from
testOnDemandRTSPServer, giving test.m4e as input. It creates the output
file video-MP4V-ES-1, but when I try to play that file with MPlayer, it
doesn't play. In the case of testMPEG4VideoStreamer, however, the stream
plays fine.

Waiting for your help.

Regards
Soumya

From finlayson at live555.com  Wed Jul  1 03:53:38 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 1 Jul 2009 03:53:38 -0700
Subject: [Live-devel] video-MP4V-ES-1 not playing in case of testOnDemandRTSPServer
In-Reply-To: <20090701104105.B25AC558008@LGEMRELSE6Q.lge.com>
References: <20090701104105.B25AC558008@LGEMRELSE6Q.lge.com>

> I am using openRTSP to receive the mpeg4ESVideoTest stream
> from testOnDemandRTSPServer, giving test.m4e as input. It creates
> the output file video-MP4V-ES-1, but when I try to play it with
> MPlayer, it doesn't play.

That's simply because MPlayer does not know how to play such a file (an
MPEG-4 Video Elementary Stream file). The place to complain about this is
on a MPlayer mailing list - not this one (because MPlayer is not our
software).
--

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
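
A note on the above: a raw MPEG-4 elementary stream can be made playable by
wrapping it in a container at receive time. If the openRTSP version in use
supports them, its "-4" option writes an MP4-format file to stdout (and "-q"
a QuickTime ".mov" file); the server URL below is illustrative:

  openRTSP -4 rtsp://server/test.m4e > test.mp4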
From mrussell at frontiernet.net  Wed Jul  1 12:23:14 2009
From: mrussell at frontiernet.net (mrussell at frontiernet.net)
Date: Wed, 1 Jul 2009 19:23:14 +0000 (UTC)
Subject: [Live-devel] numChannels Parameter
In-Reply-To: <708701629.3868721246474988586.JavaMail.root@cl04-host03.roch.ny.frontiernet.net>
Message-ID: <756622128.3873561246476194724.JavaMail.root@cl04-host03.roch.ny.frontiernet.net>

Hi Ross,

I have a SimpleRTPSink that is fed a transport stream from
MPEG2TransportStreamFramer. That transport stream contains one MPEG-4 video
elementary stream plus one MPEG-1, Layer 3 stereo (2-channel) audio stream.
When I call SimpleRTPSink::createNew(), what value should I use for the
numChannels parameter? I looked at the source for SimpleRTPSink::createNew(),
but it is still not clear to me.

Thanks in advance,
Mike.

From finlayson at live555.com  Wed Jul  1 18:33:45 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 1 Jul 2009 18:33:45 -0700
Subject: [Live-devel] numChannels Parameter

> I have a SimpleRTPSink that is fed a transport stream from
> MPEG2TransportStreamFramer. That transport stream contains one
> MPEG-4 video elementary stream plus one MPEG-1, Layer 3 stereo
> (2-channel) audio stream. When I call SimpleRTPSink::createNew(),
> what value should I use for the numChannels parameter?

When streamed over RTP, an MPEG Transport Stream - regardless of its actual
contents - is described in SDP as being a "video" stream, and therefore the
"numChannels" parameter is ignored, but should be set to 1. See, for
example, the code in "MPEG2TransportFileServerMediaSubsession.cpp":

    return SimpleRTPSink::createNew(envir(), rtpGroupsock,
                                    33, 90000, "video",
                                    "MP2T",
                                    1, True, False /*no 'M' bit*/);

Note also that - for network efficiency - you should pack as much Transport
Stream data as possible (but with a granularity of 188 bytes) into each RTP
packet. Using the default MTU, this usually means 7*188 bytes of Transport
Stream data per RTP packet. See, for example, the code in
"MPEG2TransportFileServerMediaSubsession.cpp" (for streaming via unicast),
or "testMPEG2TransportStreamer" (for streaming via multicast).
--

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
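
Putting both pieces of advice together, a minimal Transport Stream sender
along the lines of "testMPEG2TransportStreamer" looks roughly like this (a
sketch only: the file name is illustrative, and 'env', 'rtpGroupsock', and
'afterPlaying' are assumed to be set up as in the test program):

  #include "liveMedia.hh"
  #include "BasicUsageEnvironment.hh"

  void startTransportStream(UsageEnvironment& env, Groupsock& rtpGroupsock,
                            void (*afterPlaying)(void*)) {
    // Read the file in 7*188-byte chunks, so that each outgoing RTP
    // packet carries seven Transport Stream packets:
    unsigned const inputChunkSize = 7 * 188;
    ByteStreamFileSource* fileSource
      = ByteStreamFileSource::createNew(env, "test.ts", inputChunkSize);
    FramedSource* videoSource
      = MPEG2TransportStreamFramer::createNew(env, fileSource);

    // 33 is the static RTP payload type for MP2T; numChannels is 1:
    RTPSink* videoSink
      = SimpleRTPSink::createNew(env, &rtpGroupsock, 33, 90000,
                                 "video", "MP2T", 1, True,
                                 False /*no 'M' bit*/);
    videoSink->startPlaying(*videoSource, afterPlaying, videoSink);
  }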
From bourdag at gmail.com  Thu Jul  2 04:23:50 2009
From: bourdag at gmail.com (bourda guillaume)
Date: Thu, 2 Jul 2009 13:23:50 +0200
Subject: [Live-devel] One Source and Multiple Sink

Hi!

In the Source/Sink live model, I can create one source, create one sink,
and "attach" them with sink->startPlaying(source).

I would like to know if I can "attach" many sinks to one source - for
example, in order to create different files from one single RTP stream.
I hope my question is clear enough.

Thanks in advance for your answers.

Guillaume

From finlayson at live555.com  Thu Jul  2 05:10:04 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Thu, 2 Jul 2009 05:10:04 -0700
Subject: [Live-devel] One Source and Multiple Sink

> In the Source/Sink live model, I can create one source, create one
> sink, and "attach" them with sink->startPlaying(source).
>
> I would like to know if I can "attach" many sinks to one source?

No. Each source object can be fed to only one sink.

To do what you want, in general, you would need to write a special
'duplicator' object that created multiple source objects from a single
input - and then feed each of these sources to its own sink. (That would
be a bit tricky to do, though.)

> For example, in order to create different files from one single RTP stream.

For this particular example, the easiest thing to do would be for you to
write a new 'sink' class - similar to "FileSink" - that writes to multiple
output files. I.e., it would just be one 'sink' object, that happens to
write to multiple files.
--

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
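
One possible shape for such a multi-file sink, modeled on "FileSink" (a
sketch only - the class name, file handling, and buffer size here are
invented for illustration):

  class MultiFileSink: public MediaSink {
  public:
    static MultiFileSink* createNew(UsageEnvironment& env,
                                    FILE** files, unsigned numFiles) {
      return new MultiFileSink(env, files, numFiles);
    }
  protected:
    MultiFileSink(UsageEnvironment& env, FILE** files, unsigned numFiles)
      : MediaSink(env), fFiles(files), fNumFiles(numFiles) {}
    virtual Boolean continuePlaying() {
      if (fSource == NULL) return False;
      fSource->getNextFrame(fBuffer, sizeof fBuffer,
                            afterGettingFrame, this, onSourceClosure, this);
      return True;
    }
  private:
    static void afterGettingFrame(void* clientData, unsigned frameSize,
                                  unsigned /*numTruncatedBytes*/,
                                  struct timeval /*presentationTime*/,
                                  unsigned /*durationInMicroseconds*/) {
      MultiFileSink* sink = (MultiFileSink*)clientData;
      // Write the same frame to every output file:
      for (unsigned i = 0; i < sink->fNumFiles; ++i) {
        fwrite(sink->fBuffer, 1, frameSize, sink->fFiles[i]);
      }
      sink->continuePlaying(); // then ask the source for the next frame
    }
    FILE** fFiles;
    unsigned fNumFiles;
    unsigned char fBuffer[100000];
  };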
From mrussell at frontiernet.net  Thu Jul  2 16:26:07 2009
From: mrussell at frontiernet.net (Michael Russell)
Date: Thu, 02 Jul 2009 19:26:07 -0400
Subject: [Live-devel] .m4v / .mp3 Synchronization
Message-ID: <4A4D420F.7070706@frontiernet.net>

Hi Ross -

I am prototyping a streaming application using an MPEG-1, Layer 3 (.mp3)
audio file and an MPEG-4 video elementary stream (.m4v) file as inputs. I
am doing this to simulate our actual encoder outputs, since they are not
yet available. I recorded these files from two different physical sources
on two different days. When I wrap them into an MPEG-2 transport stream,
I experience synchronization issues. I seem to be dropping a lot of audio
packets, making it sound like the audio is "jumping ahead" to catch up to
the video (VLC client).

Here is a depiction of what I am doing:

  .m4v file -> ByteStreamFileSource -> MPEG4VideoStreamFramer ----+
                                                                  |
                  MPEG2TransportStreamFromESSource <--------------+
                               |                                  |
                               v                                  |
                  MPEG2TransportStreamFramer -> SimpleRTPSink     |
                                                                  |
  .mp3 file -> ByteStreamFileSource -> MPEG1or2AudioStreamFramer -+

(I didn't show the RTCPInstance associated with my RTP sink.)

Am I doing something wrong here? How does Live555 ensure synchronization?
Note that if I "disconnect" one of the inputs, the resulting transport
stream (audio or video) plays fine in my VLC client.

Thanks a ton,
Mike.

From teresa525 at gmail.com  Thu Jul  2 23:20:00 2009
From: teresa525 at gmail.com (Teresa)
Date: Fri, 3 Jul 2009 14:20:00 +0800
Subject: [Live-devel] Support stream avi file using mediaServer
Message-ID: <9f05cb70907022320p65f65500j18a71f34dc2405cd@mail.gmail.com>

Hi,

Does live555MediaServer support streaming the AVI container format? I
searched and found out that it does not. So, I am wondering how to
implement it in the live555 library. Can anyone give me a guide to what
functions I need to implement?

Thanks.

From finlayson at live555.com  Fri Jul  3 00:02:28 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Fri, 3 Jul 2009 00:02:28 -0700
Subject: [Live-devel] .m4v / .mp3 Synchronization
In-Reply-To: <4A4D420F.7070706@frontiernet.net>
References: <4A4D420F.7070706@frontiernet.net>

> I am prototyping a streaming application using an MPEG-1, Layer 3
> (.mp3) audio file and an MPEG-4 video elementary stream (.m4v) file
> as inputs. I am doing this to simulate our actual encoder outputs
> since they are not yet available. I recorded these files from two
> different physical sources on two different days. When I wrap them
> into an MPEG-2 transport stream, I experience synchronization issues.

This has nothing to do with RTP streaming. When you stream a Transport
Stream, it is done as a single RTP stream (*not* as separate audio and
video streams). All synchronization happens within the Transport Stream.

I suspect that you'll see the same synchronization problem if you just
*play* your Transport Stream locally using a media player (rather than
trying to stream it via RTP). That is what you should test first.

Your synchronization problem occurred when you *created* (multiplexed)
your Transport Stream from your audio and video inputs. If you used our
software to create your Transport Stream (it wasn't really clear from
your message whether or not you did), then you should make sure that
you're setting correct timestamps on each frame of data that you feed
into the multiplexor.
--

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From mrussell at frontiernet.net  Fri Jul  3 08:27:55 2009
From: mrussell at frontiernet.net (Michael Russell)
Date: Fri, 03 Jul 2009 11:27:55 -0400
Subject: [Live-devel] .m4v / .mp3 Synchronization
In-Reply-To: <4A4D420F.7070706@frontiernet.net>
References: <4A4D420F.7070706@frontiernet.net>
Message-ID: <4A4E237B.8060908@frontiernet.net>

Ross Finlayson wrote:
>
> Your synchronization problem occurred when you *created* (multiplexed)
> your Transport Stream from your audio and video inputs. If you used
> our software to create your Transport Stream (it wasn't really clear
> from your message whether or not you did), then you should make sure
> that you're setting correct timestamps on each frame of data that you
> feed into the multiplexor.

Hi Ross,

Thanks for your reply. Yes, I used the live555
"MPEG2TransportStreamFromESSource" object to create my transport stream.
Audio is fed from "MPEG1or2AudioStreamFramer" and video is fed from
"MPEG4VideoStreamFramer". My actual .m4v and .mp3 files came from two
other sources.

You mentioned "setting the correct timestamps on each frame" - I too
suspect this to be the root of the problem. But where do these timestamps
originate? Do I have any control over them? (I suspect that they get
generated by my two framers. My .mp3 and .m4v files were recorded at
different times and may not be the ~exact~ same duration, but I doubt
that would matter.)

From finlayson at live555.com  Fri Jul  3 23:04:41 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Fri, 3 Jul 2009 23:04:41 -0700
Subject: [Live-devel] .m4v / .mp3 Synchronization
In-Reply-To: <4A4E237B.8060908@frontiernet.net>
References: <4A4D420F.7070706@frontiernet.net> <4A4E237B.8060908@frontiernet.net>

> Yes, I used the live555 "MPEG2TransportStreamFromESSource" object to
> create my transport stream. Audio is fed from
> "MPEG1or2AudioStreamFramer" and video is fed from
> "MPEG4VideoStreamFramer". My actual .m4v and .mp3 files came from
> two other sources.
>
> You mentioned "setting the correct timestamps on each frame" - I
> too suspect this to be the root of the problem. But where do these
> timestamps originate? Do I have any control over them? (I suspect
> that they get generated by my two framers.

That's correct. The timestamps (specifically, the "fPresentationTime"
variable) should be set by each Framer object. These are used to set the
SCR timestamps in the resulting Transport Stream. So I'm not sure why
this isn't working for you; you're going to have to track this down
yourself. In particular, look at how the "fSCR" variable is set in the
"InputESSourceRecord::afterGettingFrame()" function, in
"MPEG2TransportStreamFromESSource".
--

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
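
For reference, the usual pattern for stamping frames that feed the
multiplexor looks something like the fragment below (a sketch only - the
member names other than "fPresentationTime" are invented, and a fixed frame
rate is assumed; a live source would instead just call
gettimeofday(&fPresentationTime, NULL) as each frame is captured):

  // Inside a framer-like source, once a frame has been read:
  void MyFramer::setTimestampForCurrentFrame() {
    if (fFrameCount == 0) gettimeofday(&fStartTime, NULL);
    double offset = fFrameCount / fFrameRate; // seconds since first frame
    fPresentationTime.tv_sec = fStartTime.tv_sec + (long)offset;
    fPresentationTime.tv_usec
      = fStartTime.tv_usec + (long)((offset - (long)offset) * 1000000);
    if (fPresentationTime.tv_usec >= 1000000) { // carry the overflow
      fPresentationTime.tv_usec -= 1000000;
      ++fPresentationTime.tv_sec;
    }
    ++fFrameCount;
  }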
From microqq001 at gmail.com  Sat Jul  4 04:55:23 2009
From: microqq001 at gmail.com (qiqi z)
Date: Sat, 4 Jul 2009 19:55:23 +0800
Subject: [Live-devel] Where can I find the format document of H.264 Elementary Stream?
Message-ID: <78b5e57e0907040455l22e8388gfe02cf9aa23003c6@mail.gmail.com>

I have googled, but could not find details.

Is there a standard format for an H.264 ES stream?

Thanks

From finlayson at live555.com  Sat Jul  4 05:42:55 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Sat, 4 Jul 2009 05:42:55 -0700
Subject: [Live-devel] Where can I find the format document of H.264 Elementary Stream?
In-Reply-To: <78b5e57e0907040455l22e8388gfe02cf9aa23003c6@mail.gmail.com>
References: <78b5e57e0907040455l22e8388gfe02cf9aa23003c6@mail.gmail.com>

> I have googled, but could not find details.
>
> Is there a standard format for an H.264 ES stream?

RFC 3984
--

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From woods.biz at gmail.com  Sat Jul  4 20:24:14 2009
From: woods.biz at gmail.com (Woods)
Date: Sun, 5 Jul 2009 11:24:14 +0800
Subject: [Live-devel] livemedia on cygwin
Message-ID: <3ba8ebc0907042024r492ee34fi7a2151941022f2f6@mail.gmail.com>

Hi Experts,

I want to know if liveMedia can be compiled using Cygwin?

Thanks for your suggestion.

--
Woods

From finlayson at live555.com  Sun Jul  5 00:58:20 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Sun, 5 Jul 2009 00:58:20 -0700
Subject: [Live-devel] livemedia on cygwin
In-Reply-To: <3ba8ebc0907042024r492ee34fi7a2151941022f2f6@mail.gmail.com>
References: <3ba8ebc0907042024r492ee34fi7a2151941022f2f6@mail.gmail.com>

> I want to know if liveMedia can be compiled using Cygwin?

Did you see the file "config.cygwin"?

Just run
        genMakefiles cygwin
like you would for any Unix/Linux system, and then run
        make
--

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
From stas.oskin at gmail.com  Sun Jul  5 02:07:47 2009
From: stas.oskin at gmail.com (Stas Oskin)
Date: Sun, 5 Jul 2009 12:07:47 +0300
Subject: [Live-devel] Live 555 multi-thread support
Message-ID: <77938bc20907050207s2fe2d283g95f839bada248b99@mail.gmail.com>

Hi.

I read the FAQ, but I'm still not 100% sure about this:

"
Longer answer: More than one thread can still use this code, if only one
thread runs the library code proper, and the other thread(s) communicate
with the library only by setting global 'flag' variables. (For one
possible use of this technique, see the answer to this question.)

Another possible way to access the code from multiple threads is to have
each thread use its own "UsageEnvironment" and "TaskScheduler" objects,
and thus its own event loop. The objects created by each thread (i.e.,
using its own "UsageEnvironment") must not interact (except via global
variables).
"

Are there any good guidelines for running Live555 (or the openRTSP
example) in a multi-threaded fashion? Or would my best bet be to run
separate processes?

Please note that I intend to use multi-core CPUs, so multi-threading is
quite important here.

Thanks for any information!

From finlayson at live555.com  Sun Jul  5 02:58:10 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Sun, 5 Jul 2009 02:58:10 -0700
Subject: [Live-devel] Live 555 multi-thread support
In-Reply-To: <77938bc20907050207s2fe2d283g95f839bada248b99@mail.gmail.com>
References: <77938bc20907050207s2fe2d283g95f839bada248b99@mail.gmail.com>

> Are there any good guidelines for running Live555 (or the openRTSP
> example) in a multi-threaded fashion?
>
> Or would my best bet be to run separate processes?

Yes, that's the best way - use multiple processes if you can.
--

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
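
For the 'flag variable' technique that the FAQ quotes above, the hook is
the optional "watchVariable" argument to doEventLoop(); a minimal sketch
(function names here are invented):

  // Thread A runs the LIVE555 event loop; any other thread stops it by
  // setting the flag, which the event loop checks on each iteration.
  static char stopEventLoop = 0;

  void runLive555EventLoop(UsageEnvironment* env) {
    env->taskScheduler().doEventLoop(&stopEventLoop);
    // doEventLoop() returns soon after another thread sets the flag.
  }

  void requestLive555Shutdown() {
    stopEventLoop = 1; // the only cross-thread communication needed
  }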
From woods.biz at gmail.com  Sun Jul  5 06:47:54 2009
From: woods.biz at gmail.com (Woods)
Date: Sun, 5 Jul 2009 21:47:54 +0800
Subject: [Live-devel] livemedia on cygwin
In-Reply-To: <3ba8ebc0907042024r492ee34fi7a2151941022f2f6@mail.gmail.com>
Message-ID: <3ba8ebc0907050647o77fb978cla23ab480e17128c2@mail.gmail.com>

Hi,

Thanks for the suggestion. I have it compiled and working now.

But there is one small amendment: I needed to comment out the following
block, which would otherwise conflict with the structure definition in my
Cygwin multicast socket header file. For your info, I am using Cygwin 1.7.

  // The source-specific join/leave operations require special setsockopt()
  // commands, and a special structure (ip_mreq_source). If the include files
  // didn't define these, we do so here:
  #if !defined(IP_ADD_SOURCE_MEMBERSHIP) || defined(__CYGWIN32__)
  // NOTE TO CYGWIN DEVELOPERS:
  // The "defined(__CYGWIN32__)" test was added above, because - as of
  // January 2007 - the Cygwin header files define IP_ADD_SOURCE_MEMBERSHIP
  // (and IP_DROP_SOURCE_MEMBERSHIP), but do not define ip_mreq_source.
  // This has been acknowledged as a bug (see
  // <http://cygwin.com/ml/cygwin/2007-01/msg00516.html>), but it's not clear
  // when it is going to be fixed. When the Cygwin header files finally
  // define "ip_mreq_source", this code will no longer compile, due to
  // "ip_mreq_source" being defined twice. When this happens, please let us
  // know, by sending email to the "live-devel" mailing list.
  // (See to subscribe to that mailing list.)
  // END NOTE TO CYGWIN DEVELOPERS
  struct ip_mreq_source {
    struct in_addr imr_multiaddr;  /* IP multicast address of group */
    struct in_addr imr_sourceaddr; /* IP address of source */
    struct in_addr imr_interface;  /* local IP address of interface */
  };
  #endif

Regards,
Woods

On Sun, Jul 5, 2009 at 3:58 PM, Ross Finlayson wrote:
> Did you see the file "config.cygwin"?
>
> Just run "genMakefiles cygwin" like you would for any Unix/Linux system,
> and then run "make".

--
Woods

From finlayson at live555.com  Sun Jul  5 07:12:26 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Sun, 5 Jul 2009 07:12:26 -0700
Subject: [Live-devel] livemedia on cygwin
In-Reply-To: <3ba8ebc0907050647o77fb978cla23ab480e17128c2@mail.gmail.com>
References: <3ba8ebc0907042024r492ee34fi7a2151941022f2f6@mail.gmail.com> <3ba8ebc0907050647o77fb978cla23ab480e17128c2@mail.gmail.com>

> Thanks for the suggestion. I have it compiled and working now.
>
> But there is one small amendment: I needed to comment out the
> following block, which would otherwise conflict with the structure
> definition in my Cygwin multicast socket header file.

So, is the Cygwin bug (referred to in the comments in that block) now
fixed? I.e., do the Cygwin header files now define
IP_ADD_SOURCE_MEMBERSHIP (and IP_DROP_SOURCE_MEMBERSHIP), ***and also***
define ip_mreq_source?

Could someone (preferably, someone who uses a professional email address
:-) please confirm (or deny) this?
--

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From woods.biz at gmail.com  Tue Jul  7 01:43:02 2009
From: woods.biz at gmail.com (Woods)
Date: Tue, 7 Jul 2009 16:43:02 +0800
Subject: [Live-devel] mp3 streamer doubt
Message-ID: <3ba8ebc0907070143q9bc9f8bm51d074e0f90c50a1@mail.gmail.com>

Dear Experts,

I have a question about my MP3 streamer. This streamer is based on the
testMP3Streamer code, but I changed it to use BasicUDPSink, and it works
fine. As you know, in the play() function, the code was like:

  source = MP3FileSource::createNew(*env, fileName);

But then I slightly changed my code as follows:

  MP3FileSource* mp3fileSource = MP3FileSource::createNew(*env, fileName);
  MPEG2TransportStreamFromESSource* m2ts
    = MPEG2TransportStreamFromESSource::createNew(*env);
  m2ts->addNewAudioSource(mp3fileSource, 1);
  source = MPEG2TransportStreamFramer::createNew(*env, m2ts);

What I want is to stream mp3 within MPEG-2 TS over UDP. For the first
round of play, this is perfect. But when the mp3 file reaches its end,
the stop() and play() functions are called in sequence, and the program
segfaults when it starts playing again. If I comment out the
Medium::close() in the stop() function (only for testing purposes), I
can stream that mp3 file repeatedly.

After testing in different ways, I suspect there is something wrong in
how I use MPEG2TransportStreamFromESSource, especially when I delete it.
But according to my understanding, in my code the delete operations
should be: stop() function --> Medium::close(source) --> delete
MPEG2TransportStreamFramer --> delete MPEG2TransportStreamFromESSource
--> delete MP3FileSource. It looks OK! But what is wrong?

Could anyone give me a suggestion? Thanks

--
Woods
From ottavio at videotec.com  Wed Jul  8 03:36:39 2009
From: ottavio at videotec.com (Ottavio Campana)
Date: Wed, 08 Jul 2009 12:36:39 +0200
Subject: [Live-devel] problems with DeviceSource and OnDemandServerMediaSubsession
Message-ID: <4A5476B7.8010106@videotec.com>

Hi, I'm trying to create a subclass of DeviceSource, namely
VideoDeviceSource, that extracts the NAL units of an H.264 compressed
video from a queue and feeds them to a subclass of
OnDemandServerMediaSubsession called VideoOnDemandServerMediaSubsession.

As client I use openRTSP to fetch the stream. It connects, but it never
receives data. I noticed that VideoDeviceSource::deliverFrame() is never
called, but I cannot understand why. I suspect it's a problem with
VideoOnDemandServerMediaSubsession, but I'd like to ask you for feedback.

My implementation of VideoOnDemandServerMediaSubsession is attached. Do
you see anything wrong? (f_rtsp_layer is a pointer to a struct holding a
bunch of variables.)

  #include "VideoOnDemandServerMediaSubsession.h"
  #include "VideoDeviceSource.h"
  #include <liveMedia.hh>

  VideoOnDemandServerMediaSubsession*
  VideoOnDemandServerMediaSubsession::createNew(UsageEnvironment& env,
                                                Boolean reuseFirstSource,
                                                rtsp_layer_t* rtsp_layer) {
    return new VideoOnDemandServerMediaSubsession(env, reuseFirstSource,
                                                  rtsp_layer);
  }

  VideoOnDemandServerMediaSubsession::VideoOnDemandServerMediaSubsession(
      UsageEnvironment& env, Boolean reuseFirstSource,
      rtsp_layer_t* rtsp_layer)
    : OnDemandServerMediaSubsession(env, reuseFirstSource),
      f_rtsp_layer(rtsp_layer) {
  }

  VideoOnDemandServerMediaSubsession::~VideoOnDemandServerMediaSubsession() {
  }

  FramedSource* VideoOnDemandServerMediaSubsession::createNewStreamSource(
      unsigned clientSessionId, unsigned& estBitrate) {
    /* This number of kbps is for now totally fake; we just use it as a
       reminder. */
    estBitrate = 2000;
    return VideoDeviceSource::createNew(envir(), f_rtsp_layer);
  }

  RTPSink* VideoOnDemandServerMediaSubsession::createNewRTPSink(
      Groupsock* rtpGroupsock, unsigned char rtpPayloadTypeIfDynamic,
      FramedSource* inputSource) {
    return H264VideoRTPSink::createNew(envir(), rtpGroupsock, 96,
        f_rtsp_layer->profile_idc,
        ((VideoDeviceSource*)inputSource)->sprop_parameter_sets());
  }
From mrussell at frontiernet.net  Tue Jul  7 19:16:19 2009
From: mrussell at frontiernet.net (Michael Russell)
Date: Tue, 07 Jul 2009 22:16:19 -0400
Subject: [Live-devel] .m4v / .mp3 Synchronization
In-Reply-To: <4A4E237B.8060908@frontiernet.net>
References: <4A4D420F.7070706@frontiernet.net> <4A4E237B.8060908@frontiernet.net>
Message-ID: <4A540173.40007@frontiernet.net>

Thanks for your help, Ross. I investigated my synchronization problem
further and I have some additional information, in case you (or anyone
else, for that matter) are able to offer further suggestions or direction.

To recap, here is what my data chain looks like after I took your
suggestion to write my transport stream to a file instead of using RTP:

I have two independent ByteStreamFileSource objects. One feeds MPEG-4
video elementary stream (.m4v) data to an MPEG4VideoStreamFramer. One
feeds MPEG-1, Layer 3 (.mp3) audio data to an MPEG1or2AudioStreamFramer.
Those framers then each feed a MPEG2TransportStreamFromESSource object.
The resultant transport stream feeds a FileSink object that writes my
.ts file.

The synchronization problem that I originally described has turned into
a "truncated video" problem. The only difference now is that I'm not
streaming over the network anymore. Weird, but whatever. Here is what I
have observed/concluded so far:

1) If I remove either the audio or video framer, the resultant transport
stream file plays fine. (Just audio or video, of course, but it plays
fine.)

2) With both the audio and video framers in place, the resultant
transport stream contains a truncated video stream (about 7% of its
original length). I have verified this with various media analysis
tools. Of course, when I try to play the file, the video portion soon
stops while the audio portion continues to play to completion.

3) I enabled the debug output in LiveMedia and noted that the PTS and
SCR calculations for both audio and video data streams look correct. I
saw nothing out of the ordinary in the debug output except for an early
end to PTS/SCR calculations for the video stream (further evidence of
its truncation).

My own additional debug output suggests that my
MPEG2TransportStreamFromESSource object just stops calling
getNextFrame() on the video input source (MPEG4VideoStreamFramer). I
have no idea why this would be. It seems that the
MPEG2TransportStreamFromESSource is running into trouble when it has to
deal with my two framer objects feeding it at the same time.

Here's a thought: Should I be using ByteStreamMultiFileSource instead of
two separate ByteStreamFileSource objects? Any other ideas for me to try?

Regards,
Mike.

From woods.biz at gmail.com  Wed Jul  8 08:09:21 2009
From: woods.biz at gmail.com (Woods)
Date: Wed, 8 Jul 2009 23:09:21 +0800
Subject: [Live-devel] mp3 streamer doubt
In-Reply-To: <3ba8ebc0907070143q9bc9f8bm51d074e0f90c50a1@mail.gmail.com>
References: <3ba8ebc0907070143q9bc9f8bm51d074e0f90c50a1@mail.gmail.com>
Message-ID: <3ba8ebc0907080809j52ec5b88u2d8e5f75341b3f45@mail.gmail.com>

Dear all,

This is a follow-up email regarding my MP3 streamer; my previous mail,
"mp3 streamer doubt", appears earlier in this thread. Below is my full
code, which compiles and runs. As I tested, the code crashes when it
plays test.mp3 for the second time. But if you switch to an .mpg or .ts
file, the code is OK. Also, as I mentioned in my previous mail, if I
directly use MP3FileSource, without the
MPEG2TransportStreamFromESSource-related classes, the code runs without
faulting.

Experts, please suggest whether my code is wrong or whether there is a
potential bug in the liveMedia library.
Thanks

Woods

-----------------------------------------------------------------------

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"
#include "GroupsockHelper.hh"

#define TRANSPORT_PACKETS_PER_NETWORK_PACKET 7
#define TRANSPORT_PACKET_SIZE 188

class MyStreamer {
public:
  UsageEnvironment* env;
  char* fileName;
  char* indexName;
  char* address;
  unsigned short port;
  Groupsock* rtpGroupsock;
  MediaSink* sink;
  MPEG1or2Demux* baseDemultiplexor;
  FramedSource* source;

public:
  MyStreamer(UsageEnvironment* env, char* file, char* index,
             char* udpAddress, unsigned short udpPort);
  virtual ~MyStreamer();
  void play();
  void stop();
};

static void replay(void* param) {
  MyStreamer* streamer = (MyStreamer*)param;
  streamer->stop();
  streamer->play();
}

MyStreamer::MyStreamer(UsageEnvironment* environment, char* file,
                       char* index, char* udpAddress,
                       unsigned short udpPort) {
  env = environment;
  indexName = new char[strlen(index)+1];
  strcpy(indexName, index);
  address = new char[strlen(udpAddress)+1];
  strcpy(address, udpAddress);
  fileName = new char[strlen(file)+1];
  strcpy(fileName, file);
  port = udpPort;

  const unsigned char ttl = 16; // reduce it in case routers don't admin scope
  struct in_addr destinationAddress;
  destinationAddress.s_addr = our_inet_addr(address);
  Port destPort(udpPort);
  rtpGroupsock = new Groupsock(*env, destinationAddress, destPort, ttl);
  sink = BasicUDPSink::createNew(*env, rtpGroupsock, 65536);

  *env << "Streamer " << indexName << " starting...\n";
  play();
}

MyStreamer::~MyStreamer() {
  *env << "Streamer " << indexName << " stopping...\n";
  stop();
  Medium::close(sink); sink = NULL;
  if (rtpGroupsock != NULL) {
    delete rtpGroupsock; rtpGroupsock = NULL;
  }
  delete[] indexName;
  delete[] address;
  delete[] fileName;
  *env << "Streamer closed\n";
}

void MyStreamer::play() {
  unsigned const inputDataChunkSize
    = TRANSPORT_PACKETS_PER_NETWORK_PACKET * TRANSPORT_PACKET_SIZE;
  ByteStreamFileSource* fileSource
    = ByteStreamFileSource::createNew(*env, fileName, inputDataChunkSize);
  if (fileSource == NULL) {
    *env << "file not found\n";
    source = NULL;
    return;
  }

  char const* extension = strrchr(fileName, '.');
  if (strcmp(extension, ".mpg") == 0) {
    baseDemultiplexor = MPEG1or2Demux::createNew(*env, fileSource);
    MPEG1or2DemuxedElementaryStream* pesSource
      = baseDemultiplexor->newRawPESStream();
    FramedSource* tsFrames
      = MPEG2TransportStreamFromPESSource::createNew(*env, pesSource);
    source = MPEG2TransportStreamFramer::createNew(*env, tsFrames);
  } else if (strcmp(extension, ".ts") == 0) {
    source = MPEG2TransportStreamFramer::createNew(*env, fileSource);
  } else if (strcmp(extension, ".mp3") == 0) {
    Medium::close(fileSource);
    source = MP3FileSource::createNew(*env, fileName);
    MPEG2TransportStreamFromESSource* m2ts
      = MPEG2TransportStreamFromESSource::createNew(*env);
    m2ts->addNewAudioSource(source, 1);
    source = MPEG2TransportStreamFramer::createNew(*env, m2ts);
  }

  *env << "Streaming file " << fileName << "...\n";
  sink->startPlaying(*source, replay, this);
}

void MyStreamer::stop() {
  *env << "Stop sink\n";
  if (sink != NULL) sink->stopPlaying();
  *env << "Close source\n";
  Medium::close(source); source = NULL;
  if (baseDemultiplexor != NULL) {
    *env << "Close MPEG-related sources\n";
    Medium::close(baseDemultiplexor); baseDemultiplexor = NULL;
  }
  *env << "Streamer stopped\n";
}

int main(int argc, char** argv) {
  // char file[] = "test.mpg";
  // char file[] = "test.ts";
  char file[] = "test.mp3";

  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);
  MyStreamer* streamer1
    = new MyStreamer(env, file, "first", "234.100.1.1", 9000);
  MyStreamer* streamer2
    = new MyStreamer(env, file, "second", "234.100.1.2", 9000);

  scheduler->doEventLoop();
  return 0;
}

-----------------------------------------------------------------------

--
Woods
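
One thing worth checking in the code above (an observation only, not a
confirmed diagnosis): baseDemultiplexor is assigned only on the ".mpg"
path and is never initialized in the constructor, so on the ".mp3" path
stop() tests - and may Medium::close() - an uninitialized pointer.
Initializing the members would rule that out:

  // At the top of MyStreamer's constructor:
  sink = NULL;
  source = NULL;
  baseDemultiplexor = NULL; // otherwise stop() reads garbage on the .mp3 path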
From Jerry.Johns at nuvation.com  Wed Jul  8 11:14:18 2009
From: Jerry.Johns at nuvation.com (Jerry Johns)
Date: Wed, 8 Jul 2009 11:14:18 -0700
Subject: [Live-devel] Live 555 multi-thread support
Message-ID: <274F7B50569A1B4C9D7BCAB17A9C7BE1019C8B52@mailguy3.skynet.nuvation.com>

You don't really have to run it under multiple processes, especially
since it works fine under its own thread (a pthread on Linux).

The heart of its implementation revolves around the select() call
involving multiple sockets, so really, it is 'safe' as long as no one
else uses those sockets/ports and you don't try to run multiple
instances of LiveMedia on multiple threads.

I've successfully used LiveMedia in designs using the above concept,
and as long as you lower its thread priority relative to the other
threads in your system, it chugs along just fine :-)

Jerry Johns
Design Engineer
Nuvation Research Corp - Canada
Tel: (519) 746-2304 ext. 221
www.nuvation.com

From slaine at slaine.org  Thu Jul  9 01:15:48 2009
From: slaine at slaine.org (Glen Gray)
Date: Thu, 9 Jul 2009 09:15:48 +0100
Subject: [Live-devel] [vlc-devel] RTSP client 'trick play' support. When will it ever work??
In-Reply-To: <20090708224713.GA20713@via.ecp.fr>
References: <20090708224713.GA20713@via.ecp.fr>
Message-ID: <80F92DD8-DC1E-4D0C-89BB-912E27052C90@slaine.org>

On 9 Jul 2009, at 06:14, Ross Finlayson wrote:

>> It is simple: in VLC we enable trickplay support for RTSP
>> (pause/seek/fast forward/backward) if and only if the play time is
>> known, that is, if p_sys->ms->playEndTime() returns a non-zero value.
>> (It's the only pseudo-reliable way to detect VOD that I know of.)
>>
>> Unfortunately, using live555MediaServer and the live555 library (even
>> with the ts index), it is not the case. It is non-zero at first (I
>> think after the initial setup), but is reset to zero later.
>
> Ahh... Here's what's actually happening (with VLC 1.0.0 and the
> "LIVE555 Media Server") in this case:
>
> 1/ The RTSP server returns a SDP description (in response to VLC's
> RTSP "DESCRIBE" command), which contains a "a=range:" attribute
> *with* an end time - e.g.
>
> v=0
> o=- 1247115776948082 1 IN IP4 192.168.0.4
> s=MPEG Transport Stream, streamed by the LIVE555 Media Server
> i=osbournes.ts
> t=0 0
> a=tool:LIVE555 Streaming Media v2009.06.02
> a=type:broadcast
> a=control:*
> a=range:npt=0-45.277
> a=x-qt-text-nam:MPEG Transport Stream, streamed by the LIVE555 Media Server
> a=x-qt-text-inf:osbournes.ts
> m=video 0 RTP/AVP 33
> c=IN IP4 0.0.0.0
> a=control:track1
>
> 2/ VLC, in response, is sending a "PLAY" command *without* an end time:
>
> PLAY rtsp://192.168.0.4:8554/osbournes.ts/ RTSP/1.0
> CSeq: 12
> Session: 1
> Range: npt=0.000-
> User-Agent: VLC media player (LIVE555 Streaming Media v2009.06.02)
>
> 3/ The RTSP server then sends back a "PLAY" response without an end time:
>
> RTSP/1.0 200 OK
> CSeq: 12
> Date: Thu, Jul 09 2009 05:02:56 GMT
> Range: npt=0.000-
> Session: 1
> RTP-Info: url=rtsp://192.168.0.4:8554/osbournes.ts/track1;seq=4083;rtptime=4253405358
>
> So, the real problem is that VLC - in step 2 - is sending a "PLAY"
> command without an end time, despite the fact that the SDP
> description (returned in response to "DESCRIBE") had a range end
> time. "ms->playEndTime()" *should* be non-zero (because "*ms" was
> created using the SDP description). Could you please check this??

I was working on this area the other day actually, to enable support
for Anevia VOD servers on our dated version of VLC. What's happening is
that the live555 RTSP libraries are re-parsing the end time from the
Range: header of the PLAY response. This is needed for the case of an
Anevia VOD server, as for some reason their SDP data is missing the
range attribute "a=range:npt=.....".
They only send a Range: header in the PLAY response.

So perhaps the answer is to patch live555 so that it only parses the
Range: header if its subsession's playEndTime is 0? If so, patch attached.

[Attachment: playEndTime.patch, application/octet-stream, 777 bytes]

--
Glen Gray
slaine at slaine.org

From teresa525 at gmail.com  Thu Jul  9 01:57:31 2009
From: teresa525 at gmail.com (Teresa)
Date: Thu, 9 Jul 2009 16:57:31 +0800
Subject: [Live-devel] AVI demultiplexer
Message-ID: <9f05cb70907090157x2fb242du7d5334d33b1628c2@mail.gmail.com>

Hi experts,

When will an "AVI demultiplexer" be supported? Is there any schedule for
that?

Thanks

From Jerry.Johns at nuvation.com  Thu Jul  9 07:54:16 2009
From: Jerry.Johns at nuvation.com (Jerry Johns)
Date: Thu, 9 Jul 2009 07:54:16 -0700
Subject: [Live-devel] Live 555 multi-thread support
In-Reply-To: <274F7B50569A1B4C9D7BCAB17A9C7BE1019C8B52@mailguy3.skynet.nuvation.com>
References: <77938bc20907050207s2fe2d283g95f839bada248b99@mail.gmail.com> <274F7B50569A1B4C9D7BCAB17A9C7BE1019C8B52@mailguy3.skynet.nuvation.com>
Message-ID: <274F7B50569A1B4C9D7BCAB17A9C7BE1019C8C8F@mailguy3.skynet.nuvation.com>

I forgot to add: if you want to have multiple threads interact with
LiveMedia, use pipes and shared FIFO memory as your main base of data
transfer/communication. Pipes allow you to expose file handles that can
then be 'listened' on by LiveMedia's task scheduler using the select()
call - it will just add this file handle to its list of schedulable
sleeping tasks, and once data gets transferred on that pipe, your
function handler should awake. That, combined with some sort of shared
memory, makes it a complete cinch to implement various packetizers.

I usually have a base class that implements inter-thread data transfer,
sub-classing DeviceSource - then I just latch my packetizer classes on
top of this, so that it becomes quite seamless. This way, you can have
multiple streams and multiple codecs, all implemented seamlessly, as
long as your base class does its job well of relaying data between
threads.

A key thing to note is not to use mutexes/condition variables in the
thread running LiveMedia, as then you might potentially be blocking the
thread outside of the select() call - this incurs 'double' scheduling,
and would prevent LiveMedia from servicing network events on time.

Thanks,

Jerry Johns
Design Engineer
Nuvation Research Corp - Canada
Tel: (519) 746-2304 ext. 221
www.nuvation.com
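
A sketch of the pipe technique described above (the handler body and the
names here are illustrative; the producer thread would write one byte per
frame queued into the shared FIFO):

  #include <unistd.h>
  #include "BasicUsageEnvironment.hh"

  static int fPipeFds[2];

  static void incomingDataHandler(void* /*clientData*/, int /*mask*/) {
    // Called from within the LIVE555 event loop when the producer
    // thread writes to the pipe; drain the signal byte, then fetch the
    // corresponding frame from the shared FIFO here.
    char dummy;
    read(fPipeFds[0], &dummy, 1);
  }

  void registerPipeWithScheduler(UsageEnvironment* env) {
    pipe(fPipeFds); // read end is handed to the task scheduler below
    env->taskScheduler().turnOnBackgroundReadHandling(
        fPipeFds[0],
        (TaskScheduler::BackgroundHandlerProc*)incomingDataHandler,
        NULL);
  }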
From finlayson at live555.com  Thu Jul  9 08:27:41 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Thu, 9 Jul 2009 08:27:41 -0700
Subject: [Live-devel] [vlc-devel] RTSP client 'trick play' support. When will it ever work??
In-Reply-To: <80F92DD8-DC1E-4D0C-89BB-912E27052C90@slaine.org>
References: <20090708224713.GA20713@via.ecp.fr> <80F92DD8-DC1E-4D0C-89BB-912E27052C90@slaine.org>

First, Glen, *please* trim your responses!

> So perhaps the answer is to patch live555 so that it only parses the
> Range: header if its subsession's playEndTime is 0?

No, because the server might have decided - for whatever reason - to
stream only a subset of the range that the client requested. The client
needs to respect the range that the server specified in its "PLAY"
response, even if the client already thinks it knows the range.

But, as I noted earlier, the problem is occurring earlier, before VLC
sends "PLAY", when it is apparently not paying attention to the range
end that was specified in the SDP description.
--

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From SRawling at pelco.com  Thu Jul  9 08:56:43 2009
From: SRawling at pelco.com (Rawling, Stuart)
Date: Thu, 09 Jul 2009 08:56:43 -0700
Subject: [Live-devel] Live 555 multi-thread support
In-Reply-To: <274F7B50569A1B4C9D7BCAB17A9C7BE1019C8B52@mailguy3.skynet.nuvation.com>

>> I've successfully used LiveMedia in designs using the above concept,
>> and as long as you lower its thread priority relative to the other
>> threads in your system, it chugs along just fine :-)

Just curious as to why you lowered the thread priority?
From Jerry.Johns at nuvation.com  Thu Jul  9 13:27:21 2009
From: Jerry.Johns at nuvation.com (Jerry Johns)
Date: Thu, 9 Jul 2009 13:27:21 -0700
Subject: [Live-devel] Live 555 multi-thread support
In-Reply-To: <274F7B50569A1B4C9D7BCAB17A9C7BE1019C8B52@mailguy3.skynet.nuvation.com>
Message-ID: <274F7B50569A1B4C9D7BCAB17A9C7BE1019C8D4E@mailguy3.skynet.nuvation.com>

In my initial investigations with LiveMedia two years ago, I noticed
that putting it in the highest-priority thread ended up chewing a lot of
CPU cycles, and potentially starving other threads as well (embedded
Linux, 2.6.10). The solution was to relegate it to being the
lowest-priority thread in my system, in which case it performed just
fine.

This might have changed with the latest LiveMedia libraries -
theoretically, the select() should prevent idle busy-looping in the
thread, but what may have been happening is repeated scheduling by
certain classes with short cycle times.

Jerry Johns
Design Engineer
Nuvation Research Corp - Canada
Tel: (519) 746-2304 ext. 221
www.nuvation.com
From Jerry.Johns at nuvation.com  Thu Jul  9 13:29:27 2009
From: Jerry.Johns at nuvation.com (Jerry Johns)
Date: Thu, 9 Jul 2009 13:29:27 -0700
Subject: [Live-devel] Quicktime Problems
Message-ID: <274F7B50569A1B4C9D7BCAB17A9C7BE1019C8D51@mailguy3.skynet.nuvation.com>

Anyone having problems playing back streams with the latest QuickTime
player? I've checked my application against an older version (7.1.3):
both unicast and multicast work fine with the older one, while the newer
one connects but shows no video/audio. In its place, a QuickTime
busy-waiting animation loop plays. I'm streaming H.264 video with proper
SDP hinting (sprop parameter sets).

Thanks,

Jerry Johns
Design Engineer
Nuvation Research Corp - Canada
Tel: (519) 746-2304 ext. 221
www.nuvation.com

From finlayson at live555.com  Thu Jul  9 14:57:16 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Thu, 9 Jul 2009 14:57:16 -0700
Subject: [Live-devel] New LIVE555 Streaming Media version - improves RTSP server support for 'trick play'

I have now installed a new version (2009.07.09) of the "LIVE555
Streaming Media" libraries - and the "LIVE555 Media Server" application
binaries - to better support RTSP 'trick play' operations for some
clients. In particular, the server now works around a bug (arguably) in
VLC - so now VLC (version 1.0.0) can be used as a 'trick play' client
for our servers.
--

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From finlayson at live555.com  Thu Jul  9 18:31:05 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Thu, 9 Jul 2009 18:31:05 -0700
Subject: [Live-devel] problems with DeviceSource and OnDemandServerMediaSubsession
In-Reply-To: <4A5476B7.8010106@videotec.com>
References: <4A5476B7.8010106@videotec.com>

> I'm trying to create a subclass of DeviceSource, namely
> VideoDeviceSource, that extracts the NAL units of an H.264 compressed
> video from a queue and feeds them to a subclass of
> OnDemandServerMediaSubsession called VideoOnDemandServerMediaSubsession.
>
> As client I use openRTSP to fetch the stream. It connects, but it never
> receives data. I noticed that VideoDeviceSource::deliverFrame() is never
> called, but I cannot understand why. I suspect it's a problem with
> VideoOnDemandServerMediaSubsession

It's far more likely to just be a problem with your "VideoDeviceSource"
implementation. In particular, does every call to your "doGetNextFrame()"
implementation eventually result in a call to your "deliverFrame()"
implementation?
--

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
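
For reference, that contract usually ends up looking like the fragment
below in a "DeviceSource"-style class (a sketch only: 'fFrameQueue' and
'Frame' are hypothetical; the f-prefixed members and afterGetting() are
the real FramedSource ones):

  void VideoDeviceSource::doGetNextFrame() {
    if (!fFrameQueue.empty()) {
      deliverFrame(); // data already waiting: deliver it right away
    }
    // Otherwise, deliverFrame() must later be called by whatever code
    // notices the arrival of new data (e.g., an event-loop handler).
  }

  void VideoDeviceSource::deliverFrame() {
    if (!isCurrentlyAwaitingData()) return; // the sink hasn't asked yet
    Frame f = fFrameQueue.front(); fFrameQueue.pop();
    fFrameSize = f.size > fMaxSize ? fMaxSize : f.size;
    fNumTruncatedBytes = f.size - fFrameSize;
    memcpy(fTo, f.data, fFrameSize);
    gettimeofday(&fPresentationTime, NULL);
    FramedSource::afterGetting(this); // completes the pending getNextFrame()
  }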
From mrussell at frontiernet.net  Thu Jul  9 18:32:30 2009
From: mrussell at frontiernet.net (Michael Russell)
Date: Thu, 09 Jul 2009 21:32:30 -0400
Subject: [Live-devel] .m4v / .mp3 Synchronization
In-Reply-To: <4A540173.40007@frontiernet.net>
References: <4A4D420F.7070706@frontiernet.net> <4A4E237B.8060908@frontiernet.net> <4A540173.40007@frontiernet.net>
Message-ID: <4A569A2E.5010805@frontiernet.net>

I wrote:
> I have two independent ByteStreamFileSource objects. One feeds MPEG-4
> video elementary stream (.m4v) data to an MPEG4VideoStreamFramer. One
> feeds MPEG-1, Layer 3 (.mp3) audio data to an MPEG1or2AudioStreamFramer.
>
> Those framers then each feed a MPEG2TransportStreamFromESSource object.
> The resultant transport stream feeds a FileSink object that writes my
> .ts file.
>
> My own additional debug output suggests that my
> MPEG2TransportStreamFromESSource object just stops calling
> getNextFrame() on the video input source (MPEG4VideoStreamFramer). I
> have no idea why this would be. It seems that the
> MPEG2TransportStreamFromESSource is running into trouble when it has
> to deal with my two framer objects feeding it at the same time.

Here's an update: (See InputESSourceRecord::askForNewData() in
MPEG2TransportStreamFromESSource.cpp.)

I have confirmed that the MPEG2TransportStreamFromESSource object stops
calling getNextFrame() on my video framer source because
fInputSource->isCurrentlyAwaitingData() almost always returns True for
my MPEG4VideoStreamFramer object.

Since both my audio and video framer objects work fine independently, I
wonder if there is a bug in MPEG2TransportStreamFromESSource that causes
it to get its sources confused (and hence thinks that the
MPEG4VideoStreamFramer object is awaiting data when it really isn't). A
very similar problem was cited on this list before, but I don't think
there was ever a resolution to it.

From anon.hui at gmail.com  Thu Jul  9 22:43:15 2009
From: anon.hui at gmail.com (Anon Sricharoenchai)
Date: Fri, 10 Jul 2009 12:43:15 +0700
Subject: [Live-devel] Accessing RTP packet header extension via live 555
In-Reply-To: <77938bc20906280842x3393e085u2ab42080717e6f8a@mail.gmail.com>
Message-ID: <5914604b0907092243i1a32ef2k5f22ec1cf47ea760@mail.gmail.com>

On 6/29/09, Ross Finlayson wrote:
>> Is it possible to access the RTP packet header extension (with
>> application data), which comes right after the CSRC list, via the
>> live555 library? I mean, whether live555 allows going to such depth
>> when working with RTP packets.
>
> No, not at present, unfortunately.

Could we access it using the auxiliary read handler?

  void myfunc(void* data, unsigned char* packet, unsigned packetSize);

  subsession->readSource()->setAuxilliaryReadHandler(myfunc, data);

From finlayson at live555.com  Thu Jul  9 23:22:01 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Thu, 9 Jul 2009 23:22:01 -0700
Subject: [Live-devel] Accessing RTP packet header extension via live 555
In-Reply-To: <5914604b0907092243i1a32ef2k5f22ec1cf47ea760@mail.gmail.com>
References: <77938bc20906280842x3393e085u2ab42080717e6f8a@mail.gmail.com> <5914604b0907092243i1a32ef2k5f22ec1cf47ea760@mail.gmail.com>

> Could we access it using the auxiliary read handler?
>
>   subsession->readSource()->setAuxilliaryReadHandler(myfunc, data);

Yes, I guess so. That mechanism (which I had actually forgotten about
until you mentioned it) is a bit of a hack, though, and might end up
getting removed sometime in the future.
--

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
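
Inside such a handler, the header extension can be located by hand from
the raw packet bytes (a sketch only, following RFC 3550's fixed layout
and the handler signature shown above; no library support is assumed):

  void myfunc(void* /*clientData*/, unsigned char* packet,
              unsigned packetSize) {
    if (packetSize < 12) return;                    // shorter than a fixed RTP header
    Boolean hasExtension = (packet[0] & 0x10) != 0; // the 'X' bit
    unsigned numCSRCs = packet[0] & 0x0F;           // the 'CC' field
    unsigned extOffset = 12 + 4*numCSRCs;           // extension follows the CSRC list
    if (!hasExtension || packetSize < extOffset + 4) return;

    unsigned char* ext = &packet[extOffset];
    u_int16_t profile = (ext[0] << 8) | ext[1];
    u_int16_t extWords = (ext[2] << 8) | ext[3];    // length, in 32-bit words
    unsigned char* extData = ext + 4;               // 4*extWords bytes of app data
    // ... inspect 'profile' and 'extData' here ...
  }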
From finlayson at live555.com Thu Jul 9 23:24:26 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Thu, 9 Jul 2009 23:24:26 -0700
Subject: [Live-devel] Quicktime Problems
In-Reply-To: <274F7B50569A1B4C9D7BCAB17A9C7BE1019C8D51@mailguy3.skynet.nuvation.com>
References: <274F7B50569A1B4C9D7BCAB17A9C7BE1019C8D51@mailguy3.skynet.nuvation.com>
Message-ID: 

>Anyone having problems playing back streams with the latest
>QuickTime player? I've checked my application against an older
>version (7.1.3) and both unicast/multicast work fine with the older
>one, while the newer one connects but shows no video/audio. In
>its place, a QuickTime busy-waiting animation loop plays. I'm
>streaming H.264 video with proper SDP hinting (sprop parameter sets).

Does VLC work on the same streams? If so, then the problem is with QuickTime Player, and you'll (therefore) need to ask about this on a QuickTime mailing list, not this one.
-- 
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From juanfhj at gmail.com Fri Jul 10 21:20:07 2009
From: juanfhj at gmail.com (Juan Fernando Herrera J.)
Date: Fri, 10 Jul 2009 23:20:07 -0500
Subject: [Live-devel] Fwd: Why is RTSP needed apart from SDP?
In-Reply-To: <1fb8b0210907102108p4f8c8ed9y852e5485cadf8ffb@mail.gmail.com>
References: <1fb8b0210907102108p4f8c8ed9y852e5485cadf8ffb@mail.gmail.com>
Message-ID: <1fb8b0210907102120k52ce9db3n372a1a0a9deb66b1@mail.gmail.com>

Why can't one play unicast just from an SDP description? From a previous message:

> Ross Finlayson finlayson at live555.com
> Wed Oct 1 03:04:23 PDT 2008
> You can't (in general) play a unicast stream just by reading a
> SDP description, because the SDP description does not, by
> itself, contain enough information about the stream endpoints.

What's that missing information?

In detail: I want to transmit live-captured unicast audio/video, and I'm trying to use testMP3Streamer and ffmpeg running freely and independently (without establishing sessions, because I know my clients beforehand); the clients would then use VLC with an SDP file.

That's not working. I'm able to play the audio and video independently; it's only concurrently that it fails. Is an RTSP server necessary? My understanding is that audio/video synchronization just comes from wall-clock time, so why would an RTSP server be needed additionally?

Thanks,
Juan Herrera
Universidad EAFIT

From finlayson at live555.com Fri Jul 10 22:19:50 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Fri, 10 Jul 2009 22:19:50 -0700
Subject: [Live-devel] Fwd: Why is RTSP needed apart from SDP?
In-Reply-To: <1fb8b0210907102120k52ce9db3n372a1a0a9deb66b1@mail.gmail.com>
References: <1fb8b0210907102108p4f8c8ed9y852e5485cadf8ffb@mail.gmail.com> <1fb8b0210907102120k52ce9db3n372a1a0a9deb66b1@mail.gmail.com>
Message-ID: 

>Why can't one play unicast just from an SDP description?
>
>What's that missing information?

The client's port numbers.

>In detail: I want to transmit live-captured unicast audio/video [...]
>That's not working. I'm able to play the audio and video independently;
>it's only concurrently that it fails. Is an RTSP server necessary?

In general, yes, if you are streaming via unicast. We implemented our RTSP server for a reason. If you want to stream via unicast without using a RTSP server, then you might (perhaps) be able to get this to work, but you will get no support from me.
-- 
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
From juanfhj at gmail.com Fri Jul 10 23:00:41 2009
From: juanfhj at gmail.com (Juan Fernando Herrera J.)
Date: Sat, 11 Jul 2009 01:00:41 -0500
Subject: [Live-devel] Fwd: Why is RTSP needed apart from SDP?
In-Reply-To: 
References: <1fb8b0210907102108p4f8c8ed9y852e5485cadf8ffb@mail.gmail.com> <1fb8b0210907102120k52ce9db3n372a1a0a9deb66b1@mail.gmail.com>
Message-ID: <1fb8b0210907102300w32c5c052lcd4cf49563d3026a@mail.gmail.com>

>>What's that missing information?
>
> The client's port numbers.

I'm sorry for my misunderstanding. What's the difference between those client port numbers and the port numbers that appear in the SDP m= description?

Thanks

From finlayson at live555.com Fri Jul 10 23:22:26 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Fri, 10 Jul 2009 23:22:26 -0700
Subject: [Live-devel] Fwd: Why is RTSP needed apart from SDP?
In-Reply-To: <1fb8b0210907102300w32c5c052lcd4cf49563d3026a@mail.gmail.com>
References: <1fb8b0210907102108p4f8c8ed9y852e5485cadf8ffb@mail.gmail.com> <1fb8b0210907102120k52ce9db3n372a1a0a9deb66b1@mail.gmail.com> <1fb8b0210907102300w32c5c052lcd4cf49563d3026a@mail.gmail.com>
Message-ID: 

>What's the difference between those client port numbers and the port
>numbers that appear in the SDP m= description?

Generally speaking, the SDP description describes the session information that is known to the server. The port numbers, therefore, are *server* port numbers (that the client would use when sending RTCP reports back to the server, for example). The client port numbers are not in the SDP description.

For multicast sessions, this is not a problem, because - for such sessions - the client and server port numbers are always the same (as is the IP (multicast) address). For unicast sessions, however, the port numbers in the SDP description are typically specified as 0, and the RTSP protocol is used to determine the client and server port numbers (and the client IP address).

Once again, our RTSP server implementation does all of this automatically.
-- 
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
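Since this question comes up constantly on the list: the unicast path that RTSP automates is exactly what "testOnDemandRTSPServer" demonstrates. A stripped-down sketch (the port, stream name and file name are arbitrary placeholders; error handling mostly omitted):

    #include "liveMedia.hh"
    #include "BasicUsageEnvironment.hh"

    int main() {
      TaskScheduler* scheduler = BasicTaskScheduler::createNew();
      UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

      RTSPServer* rtspServer = RTSPServer::createNew(*env, 8554);
      if (rtspServer == NULL) {
        *env << "Failed to create RTSP server: " << env->getResultMsg() << "\n";
        return 1;
      }

      // One on-demand session; the client and server port numbers are
      // negotiated by RTSP, so the SDP needs no fixed ports.
      ServerMediaSession* sms =
        ServerMediaSession::createNew(*env, "mp3Stream", "test.mp3",
                                      "Session streamed by sketch");
      sms->addSubsession(MP3AudioFileServerMediaSubsession::createNew(
          *env, "test.mp3", False /*reuseFirstSource*/,
          False /*don't generate ADUs*/, NULL /*no interleaving*/));
      rtspServer->addServerMediaSession(sms);

      char* url = rtspServer->rtspURL(sms);
      *env << "Play this stream using the URL \"" << url << "\"\n";
      delete[] url;

      env->taskScheduler().doEventLoop(); // does not return
      return 0;
    }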
From avicode at yahoo.com Sun Jul 12 06:30:20 2009
From: avicode at yahoo.com (Jonny Parker)
Date: Sun, 12 Jul 2009 06:30:20 -0700 (PDT)
Subject: [Live-devel] audio rtp packet size
Message-ID: <257651.36531.qm@web63303.mail.re1.yahoo.com>

Hi,

I am trying to stream (unicast) H.264 and G.711 a-law from a HW encoder (TI DaVinci). How can I control the RTP audio packet size?

Thanks

From finlayson at live555.com Sun Jul 12 06:51:35 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Sun, 12 Jul 2009 06:51:35 -0700
Subject: [Live-devel] audio rtp packet size
In-Reply-To: <257651.36531.qm@web63303.mail.re1.yahoo.com>
References: <257651.36531.qm@web63303.mail.re1.yahoo.com>
Message-ID: 

>I am trying to stream (unicast) H.264 and G.711 a-law from a HW encoder
>(TI DaVinci). How can I control the RTP audio packet size?

"MultiFramedRTPSink::setPacketSizes()"
-- 
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
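For the record, that call takes a preferred and a maximum RTP packet size, in bytes. A hedged sketch of shrinking the audio packets, assuming a "SimpleRTPSink" carrying G.711 a-law (static payload type 8, 8000 Hz, mono); the groupsock and the two sizes are placeholders:

    SimpleRTPSink* audioSink =
      SimpleRTPSink::createNew(*env, rtpGroupsock,
                               8 /*PCMA*/, 8000, "audio", "PCMA", 1);
    // Inherited from MultiFramedRTPSink:
    audioSink->setPacketSizes(160 /*preferred*/, 200 /*maximum*/);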
From avicode at yahoo.com Tue Jul 14 09:11:50 2009
From: avicode at yahoo.com (avicode at yahoo.com)
Date: Tue, 14 Jul 2009 09:11:50 -0700 (PDT)
Subject: [Live-devel] HW encoder DaVinci
Message-ID: <403915.27024.qm@web63307.mail.re1.yahoo.com>

Hi,

I can stream H.264 from the HW encoder (DaVinci) to VLC, but I have problems when streaming both G.711 and H.264. I think it is related to the synchronization between my threads: VidEnc, AudEnc and the live555 thread.

I understand that BasicTaskScheduler::SingleStep() waits for events using select(), and that if I'm using two pipes - one for the video encoder, a second for the audio encoder - then when VidEnc or AudEnc writes a frame into its pipe, the select() wakes up and DeviceSource::doGetNextFrame() is called (my implementation is based on wis-streamer). Is that correct?

Does that also trigger the RTP sink to send a packet?

Thanks

From hichetu at gmail.com Sun Jul 12 18:10:25 2009
From: hichetu at gmail.com (Chetan Raj)
Date: Mon, 13 Jul 2009 10:10:25 +0900
Subject: [Live-devel] config.uClinux config changes
Message-ID: <27b710b10907121810q1d670c8cl30143dc730ade530@mail.gmail.com>

The config.uClinux provided in the live 2009.07.09 version does not properly set the compiler options for the uClinux environment. I suggest the following fixes to config.uClinux to make the liveMedia streaming server compile properly in a uClinux environment.

- Chetan

==============================================================
CROSS_COMPILE= arc-linux-uclibc-
COMPILE_OPTS = $(INCLUDES) -I. -O2 -DSOCKLEN_T=socklen_t -D_LARGEFILE_SOURCE=1 -D_FILE_OFFSET_BITS=64
C = c
C_COMPILER = $(CROSS_COMPILE)gcc
CFLAGS += $(COMPILE_OPTS)
C_FLAGS = $(CFLAGS)
CPP = cpp
CPLUSPLUS_COMPILER = $(CROSS_COMPILE)g++
CPLUSPLUS_FLAGS = $(COMPILE_OPTS) -Wall -DBSD=1
CPLUSPLUS_FLAGS += $(CPPFLAGS) -fexceptions
OBJ = o
LINK = $(CROSS_COMPILE)g++ -o
LINK_OPTS = -L. $(LDFLAGS)
CONSOLE_LINK_OPTS = $(LINK_OPTS)
LIBRARY_LINK = $(CROSS_COMPILE)ld -o
LIBRARY_LINK_OPTS = -L. -r -Bstatic
LIB_SUFFIX = a
LIBS_FOR_CONSOLE_APPLICATION = $(CXXLIBS)
LIBS_FOR_GUI_APPLICATION = $(LIBS_FOR_CONSOLE_APPLICATION)
EXE =

From mrussell at frontiernet.net Sun Jul 12 09:49:52 2009
From: mrussell at frontiernet.net (Michael Russell)
Date: Sun, 12 Jul 2009 12:49:52 -0400
Subject: [Live-devel] MP3FileSource
Message-ID: <4A5A1430.6050400@frontiernet.net>

My application creates a FramedSource from an MPEG-1, Layer 3 (.mp3) audio file and feeds it to an input of MPEG2TransportStreamFromESSource like this:

>ByteStreamFileSource* audioFileSource =
>  ByteStreamFileSource::createNew(*env, filename);
>FramedSource* audioES = audioFileSource;
>
>MPEG1or2AudioStreamFramer* inputSource =
>  MPEG1or2AudioStreamFramer::createNew(*env, audioES, False);
>
>MPEG2TransportStreamFromESSource* newTransportStream =
>  MPEG2TransportStreamFromESSource::createNew(*env);
>
>newTransportStream->addNewAudioSource(inputSource, 1);

I just discovered the MP3FileSource object. Should I be using that instead? It seems to *partially* fix some multiplexing issues that I was having. e.g. -

>MP3FileSource* inputSource =
>  MP3FileSource::createNew(*env, filename);
>
>newTransportStream->addNewAudioSource(inputSource, 1);

From jon_huangjun at hotmail.com Mon Jul 13 03:01:10 2009
From: jon_huangjun at hotmail.com (huang jon)
Date: Mon, 13 Jul 2009 18:01:10 +0800
Subject: [Live-devel] How to develop an RTSP server for streaming AVI
Message-ID: 

Hi,

I've looked at live555, and I can run it in Linux to stream an MPEG-4 file. Now I want to use it to stream a media buffer that can contain three video formats: H.264/MPEG-4/MJPEG. How can I do that? Can it write the buffer to an AVI file and stream that AVI file?

Thanks very much!

From woods.biz at gmail.com Mon Jul 13 23:27:53 2009
From: woods.biz at gmail.com (Woods)
Date: Tue, 14 Jul 2009 14:27:53 +0800
Subject: [Live-devel] MP4 Streamer Doubt
Message-ID: <3ba8ebc0907132327w6f7fdf48g35c62f4ad04bff07@mail.gmail.com>

Hi experts,

I am writing an MP4 streamer, which will stream MPEG-4 video and audio (MP3, MP2, whatever) as an MPEG-2 Transport Stream. I will write code to extract the elementary stream payload from the MP4 file. Subsequently, there are two candidate liveMedia sources: MPEG2TransportStreamFromPESSource and MPEG2TransportStreamFromESSource.

Which one should I use? What is their key difference? From roughly going through the code, it looks as if MPEG2TransportStreamFromPESSource is only for MPEG-1 and MPEG-2, but MPEG2TransportStreamFromESSource can be used for MPEG 1, 2, and 4. If so, I should use MPEG2TransportStreamFromESSource. Am I right? Please correct me if I am wrong. :-)

Thanks for your suggestions.

-- 
Woods

From smile2bob at gmail.com Wed Jul 15 02:26:21 2009
From: smile2bob at gmail.com (=?big5?B?q1SnUaRqpGo=?=)
Date: Wed, 15 Jul 2009 17:26:21 +0800
Subject: [Live-devel] Some questions about MJPEG streaming
Message-ID: <002401ca052e$5353dd40$ee25010a@mychat967177ed>

Hi All,

I'm working on the live555 library for an MJPEG streaming server. It's really complicated for me to understand, even though I have read the FAQ, RFC 2035 and the Elphel sample code.

As I understand it, what I need to do is to set up the RTP, RTCP and RTSP APIs and provide the JPEG header and JPEG payload to feed the relevant parameters; then it will packetize the data and send it out. It seems easy, but some things confuse me:

1) How do I count the JPEG header length and the payload length?
2) How is the value of fMaxSize decided? Its value is always smaller than fFrameSize. Can I make it bigger?
3) Does it always abandon the frame data over fMaxSize (I mean fNumTruncatedBytes)?

Any suggestions would be appreciated.

Regards,
-Bob
From yunjnz at yahoo.com Sun Jul 12 14:21:35 2009
From: yunjnz at yahoo.com (Sean)
Date: Sun, 12 Jul 2009 14:21:35 -0700 (PDT)
Subject: [Live-devel] Use RTP Separately
Message-ID: <534547.2625.qm@web110710.mail.gq1.yahoo.com>

Greetings,

Does anyone know if I can use the RTP stack separately with live555? I'm going to transmit an MPEG-4 stream from one endpoint to another, and RTSP/SIP is not involved in the transmission. Is there any test program for live555 for such usage?

BRs,
Sean

From finlayson at live555.com Wed Jul 15 05:12:26 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 15 Jul 2009 14:12:26 +0200
Subject: [Live-devel] config.uClinux config changes
In-Reply-To: <27b710b10907121810q1d670c8cl30143dc730ade530@mail.gmail.com>
References: <27b710b10907121810q1d670c8cl30143dc730ade530@mail.gmail.com>
Message-ID: 

>The config.uClinux provided in the live 2009.07.09 version does not
>properly set the compiler options for the uClinux environment.
>
>I suggest the following fixes to config.uClinux to make the liveMedia
>streaming server compile properly in a uClinux environment.

Thanks. I'll include this in the next release of the software.
-- 
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From finlayson at live555.com Wed Jul 15 05:16:12 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 15 Jul 2009 14:16:12 +0200
Subject: [Live-devel] MP3FileSource
In-Reply-To: <4A5A1430.6050400@frontiernet.net>
References: <4A5A1430.6050400@frontiernet.net>
Message-ID: 

>I just discovered the MP3FileSource object. Should I be using that instead?

"MP3FileSource" is a specialized source class for MPEG-1 or 2 audio that includes its own 'framer'. If you wish, you could use "MP3FileSource" instead of "ByteStreamFileSource"+"MPEG1or2AudioStreamFramer".
-- 
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
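Putting the question and the answer together, a sketch of the simplified pipeline (same variable names as in the original question; the file name is a placeholder):

    MP3FileSource* inputSource = MP3FileSource::createNew(*env, "test.mp3");

    MPEG2TransportStreamFromESSource* newTransportStream =
      MPEG2TransportStreamFromESSource::createNew(*env);

    // MP3FileSource frames the MPEG audio itself, so no
    // MPEG1or2AudioStreamFramer is needed in front of it:
    newTransportStream->addNewAudioSource(inputSource, 1 /*MPEG-1*/);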
From finlayson at live555.com Wed Jul 15 05:17:49 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 15 Jul 2009 14:17:49 +0200
Subject: [Live-devel] Use RTP Separately
In-Reply-To: <534547.2625.qm@web110710.mail.gq1.yahoo.com>
References: <534547.2625.qm@web110710.mail.gq1.yahoo.com>
Message-ID: 

>Does anyone know if I can use the RTP stack separately with live555?
>I'm going to transmit an MPEG-4 stream from one endpoint to another,
>and RTSP/SIP is not involved in the transmission

If you're streaming MPEG-4 video via RTP, then you must use a RTSP server. Note our demo applications - "testMPEG4VideoStreamer" (for multicast streaming) and "testOnDemandRTSPServer" (for unicast streaming).
-- 
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From finlayson at live555.com Wed Jul 15 06:02:56 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 15 Jul 2009 15:02:56 +0200
Subject: [Live-devel] MP4 Streamer Doubt
In-Reply-To: <3ba8ebc0907132327w6f7fdf48g35c62f4ad04bff07@mail.gmail.com>
References: <3ba8ebc0907132327w6f7fdf48g35c62f4ad04bff07@mail.gmail.com>
Message-ID: 

>Subsequently, there are two candidate liveMedia sources:
>MPEG2TransportStreamFromPESSource and MPEG2TransportStreamFromESSource.
>
>Which one should I use? What is their key difference?

"MPEG2TransportStreamFromPESSource" should be used only when your input is already in the form of 'PES packets'.

Because that's not true in your case, you should use "MPEG2TransportStreamFromESSource".
-- 
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From woods.biz at gmail.com Wed Jul 15 16:22:05 2009
From: woods.biz at gmail.com (Woods)
Date: Thu, 16 Jul 2009 07:22:05 +0800
Subject: [Live-devel] MP4 Streamer Doubt
In-Reply-To: 
References: <3ba8ebc0907132327w6f7fdf48g35c62f4ad04bff07@mail.gmail.com>
Message-ID: <3ba8ebc0907151622r4d398438t5d473131d95c161b@mail.gmail.com>

Hi,

Thanks for your reply. So, can I say: any source that is workable for MPEG2TransportStreamFromPESSource is ALWAYS applicable to MPEG2TransportStreamFromESSource?

Regards,
Woods

On Wed, Jul 15, 2009 at 9:02 PM, Ross Finlayson wrote:
> "MPEG2TransportStreamFromPESSource" should be used only when your input is
> already in the form of 'PES packets'.
>
> Because that's not true in your case, you should use
> "MPEG2TransportStreamFromESSource".

From mrusse05 at harris.com Wed Jul 15 18:36:08 2009
From: mrusse05 at harris.com (Russell, Michael (mrusse05))
Date: Wed, 15 Jul 2009 21:36:08 -0400
Subject: [Live-devel] Use RTP Separately
References: <534547.2625.qm@web110710.mail.gq1.yahoo.com>
Message-ID: 

Hi Ross - Your response to Sean prompts me to ask for clarification for both of us. You replied to Sean:

>If you're streaming MPEG-4 video via RTP, then you must use a RTSP server.

It is my understanding that one purpose of RTSP when streaming MPEG-4 video is to communicate the all-important SDP info to the client player. And I understand that's why Sean and I (and the whole rest of the world) require RTSP to stream MPEG-4 video. But you also replied to me (in an earlier message):

>In principle, all RTP media players need to first receive a SDP
>description that describes the stream.

So if one is streaming MPEG-2 video instead of MPEG-4, the SDP is still required. But then how would one's client player even get an SDP, presuming MPEG-2 means no RTSP?
From finlayson at live555.com Wed Jul 15 22:58:45 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Thu, 16 Jul 2009 07:58:45 +0200
Subject: [Live-devel] MP4 Streamer Doubt
In-Reply-To: <3ba8ebc0907151622r4d398438t5d473131d95c161b@mail.gmail.com>
References: <3ba8ebc0907132327w6f7fdf48g35c62f4ad04bff07@mail.gmail.com> <3ba8ebc0907151622r4d398438t5d473131d95c161b@mail.gmail.com>
Message-ID: 

>So, can I say: any source that is workable for
>MPEG2TransportStreamFromPESSource is ALWAYS applicable to
>MPEG2TransportStreamFromESSource?

You can say that, but you would be wrong.

Once again: If your input is in the form of 'PES packets' (this usually occurs only if you are demultiplexing a Program Stream), then you use "MPEG2TransportStreamFromPESSource". Otherwise, you use a "MPEG2TransportStreamFromESSource".

Therefore, in your case, you should use a "MPEG2TransportStreamFromESSource".
-- 
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From finlayson at live555.com Wed Jul 15 23:24:35 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Thu, 16 Jul 2009 08:24:35 +0200
Subject: [Live-devel] Use RTP Separately
In-Reply-To: 
References: <534547.2625.qm@web110710.mail.gq1.yahoo.com>
Message-ID: 

>So if one is streaming MPEG-2 video instead of MPEG-4, the SDP is
>still required. But then how would one's client player even get an
>SDP presuming MPEG-2 means no RTSP?

Who said that "MPEG-2 means no RTSP"? I certainly didn't.

It's really quite simple:

1/ If you are streaming RTP via unicast, then you should use RTSP (in order to ensure that the client will know both the client's and server's port numbers for RTP/RTCP).

2/ If you are streaming RTP via multicast, then RTSP is optional. However, if you don't use RTSP, then you will need to give a SDP description to the client some other way (e.g., by having it read a file). (Therefore, RTSP is recommended, even for multicast streams.)

Depending on your client and your codec, you might sometimes be able to violate these rules. For example, if you are streaming MPEG-1 or 2 audio, video, or Transport Stream data to VLC via multicast, then you might not need to give VLC a SDP description (because VLC can sometimes 'guess' the codec). And if you somehow know for sure that the server and client will use the same port numbers for RTP and RTCP (this usually means having only one client), then you might sometimes be able to stream via unicast without using a RTSP server. However, if you try doing either of these, you will get no support from me.

For simplicity, everyone should just use RTSP for all RTP streams.
-- 
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
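Case 2/ in code form, modeled loosely on "testMPEG2TransportStreamer" with its optional RTSP server enabled; the multicast address, ports and bandwidth figure are placeholders, "tsSource" stands for whatever Transport Stream source is being streamed, and "afterPlaying" is a completion handler you supply:

    unsigned char const CNAME[] = "streamer-host"; // placeholder CNAME
    void afterPlaying(void* clientData);           // your completion handler

    struct in_addr destAddr;
    destAddr.s_addr = our_inet_addr("239.255.42.42");
    Groupsock rtpGroupsock(*env, destAddr, Port(1234), 7 /*ttl*/);
    Groupsock rtcpGroupsock(*env, destAddr, Port(1235), 7 /*ttl*/);

    RTPSink* videoSink = SimpleRTPSink::createNew(*env, &rtpGroupsock,
        33, 90000, "video", "MP2T", 1, True, False /*no 'M' bit*/);
    RTCPInstance* rtcp = RTCPInstance::createNew(*env, &rtcpGroupsock,
        5000 /*kbps, estimated*/, CNAME, videoSink, NULL /*we're a server*/);

    // The RTSP server exists here only to hand the SDP description to
    // clients; the media goes to the multicast group regardless:
    RTSPServer* rtspServer = RTSPServer::createNew(*env, 8554);
    ServerMediaSession* sms = ServerMediaSession::createNew(*env, "tsStream");
    sms->addSubsession(PassiveServerMediaSubsession::createNew(*videoSink, rtcp));
    rtspServer->addServerMediaSession(sms);

    videoSink->startPlaying(*tsSource, afterPlaying, NULL);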
From Jerry.Johns at nuvation.com Thu Jul 16 08:24:55 2009
From: Jerry.Johns at nuvation.com (Jerry Johns)
Date: Thu, 16 Jul 2009 08:24:55 -0700
Subject: [Live-devel] Proper destruction of subsessions
Message-ID: <274F7B50569A1B4C9D7BCAB17A9C7BE1019C92B5@mailguy3.skynet.nuvation.com>

I've searched in the list archive but couldn't find anything regarding this. I currently have liveMedia running on its own thread, with doEventLoop() the last instruction before it goes off into its own world.

Ultimately, I would like to have this thread return gracefully: first invoking teardowns of any existing unicast streaming sessions (we're streaming out), sending any BYEs, and then shutting down the task scheduler using a watch variable. I'm familiar with stopping the scheduler, and with tearing down multicast sessions, but I am not sure how to properly tear down / invoke the destructor for the unicast sessions - do you know how to do this?

Thank you,

Jerry Johns
Design Engineer
Nuvation Research Corp - Canada
Tel: (519) 746-2304 ext. 221
www.nuvation.com

From SRawling at pelco.com Thu Jul 16 11:47:39 2009
From: SRawling at pelco.com (Rawling, Stuart)
Date: Thu, 16 Jul 2009 11:47:39 -0700
Subject: [Live-devel] Proper destruction of subsessions
In-Reply-To: <274F7B50569A1B4C9D7BCAB17A9C7BE1019C92B5@mailguy3.skynet.nuvation.com>
Message-ID: 

I believe the best thing to do is to run your doEventLoop with a watch variable. Then, when you wish to exit and clean up, set the watch variable, which will cause the doEventLoop call to return. At this point you will be able to clean up all of the session stuff (hint: use Medium::close()).

On 7/16/09 8:24 AM, "Jerry Johns" wrote:
> Ultimately, I would like to have this thread return gracefully by first
> invoking any teardowns of existing unicast streaming sessions (we're
> streaming out), sending any BYEs, and then shutting down the task
> scheduler using a watch variable. [...]
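A sketch of the shutdown order Stuart describes, assuming the event loop was entered with a watch variable; "rtspServer" and "videoSink" are placeholders for whatever objects were created earlier:

    char stopEventLoop = 0; // the watch variable

    // ... create scheduler/env/server/sessions, then:
    env->taskScheduler().doEventLoop(&stopEventLoop);
    // Some task (or another thread, with care) sets stopEventLoop = 1,
    // and doEventLoop() then returns here.

    // Clean up in reverse order of creation:
    Medium::close(rtspServer); // its ServerMediaSessions go with it
    Medium::close(videoSink);
    env->reclaim();
    delete scheduler;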
From Jerry.Johns at nuvation.com Thu Jul 16 12:49:02 2009
From: Jerry.Johns at nuvation.com (Jerry Johns)
Date: Thu, 16 Jul 2009 12:49:02 -0700
Subject: [Live-devel] Proper destruction of subsessions
Message-ID: <274F7B50569A1B4C9D7BCAB17A9C7BE1019C9342@mailguy3.skynet.nuvation.com>

I am aware of the watch variable method and of Medium::close(), but closing the ServerMediaSession or the subsession does not entail sending a BYE to the connected client - the only way I have found so far to send a BYE is to do a handleClosure() from within my framer source when it detects a shutdown of the application. Ultimately, the problem is that I want to ensure the BYE is sent, and THEN liveMedia is shut down (i.e., the task scheduler returns), and that this order is obeyed. Having the watch variable is fantastic, but currently I have no way to really ensure that my handleClosure() triggers the BYE before the scheduler shuts down.

Jerry Johns
Design Engineer
Nuvation Research Corp - Canada
Tel: (519) 746-2304 ext. 221
www.nuvation.com

From patbob at imoveinc.com Mon Jul 20 09:58:38 2009
From: patbob at imoveinc.com (Patrick White)
Date: Mon, 20 Jul 2009 09:58:38 -0700
Subject: [Live-devel] Proper destruction of subsessions
In-Reply-To: <274F7B50569A1B4C9D7BCAB17A9C7BE1019C9342@mailguy3.skynet.nuvation.com>
References: <274F7B50569A1B4C9D7BCAB17A9C7BE1019C9342@mailguy3.skynet.nuvation.com>
Message-ID: <200907200958.38941.patbob@imoveinc.com>

Matt & I went through this issue a while back for live streams. It's probably similar if you're trying to shut down a file-based stream before it gets to its end...

To force a session to end before the client requests a teardown, we ended up deleting the RTSPClientSession instance out from under the session via a scheduled task. That's the way Ross does it in live555 when a session times out, and it is a safe way to terminate a (live) session. However, when you do that, a BYE doesn't get sent. Matt chased that end of things, and IIRC he found that the library goes through the motions to send the BYE, but then cleans the session up before it can get sent. You'll have to modify the source to make it get sent.

If you're shutting down a file stream, you might be able to shut it down and get the BYE sent by somehow convincing it that it's already at the end.

hope that helps,
patbob

On Thursday 16 July 2009 12:49 pm, Jerry Johns wrote:
> I am aware of the watch variable method and of Medium::close(), but
> closing the ServerMediaSession or the subsession does not entail
> sending a BYE to the connected client [...] Having the
> watch variable is fantastic, but currently I have no way to really ensure
> that my handleClosure() triggers the BYE before the scheduler shuts
> down.
From dmaljur at elma.hr Wed Jul 22 02:04:23 2009
From: dmaljur at elma.hr (Dario)
Date: Wed, 22 Jul 2009 11:04:23 +0200
Subject: [Live-devel] Closing the media does not free allocated memory?
Message-ID: <000601ca0aab$6b187210$ec03000a@gen47>

Hello.

When the task scheduler is created via BasicTaskScheduler::createNew() and the usage environment is created via BasicUsageEnvironment::createNew(scheduler), allocated memory shows 1892 KB. After I create a Medium via:

  Medium* ourClient = RTSPClient::createNew(usageEnvironment, 0, "app1", 0);

memory usage goes to 1960 KB. After calling Medium::close(ourClient), memory jumps to 1968 KB? I see that the data in ourClient is all nullified, but the memory allocated by createNew does not go back to the OS. I'm guessing the same thing is happening with MediaSession also? How do I release the memory allocated for ourClient, since the destructor is a private operation?

ELMA Kurtalj d.o.o., Viteziceva 1a, 10000 Zagreb, Croatia - www.elma.hr

From ottavio at videotec.com Thu Jul 23 02:45:42 2009
From: ottavio at videotec.com (Ottavio Campana)
Date: Thu, 23 Jul 2009 11:45:42 +0200
Subject: [Live-devel] problem trying to transmit raw audio data to mplayer
Message-ID: <4A683146.9040407@videotec.com>

For some tests I'm trying to transmit raw audio data to mplayer. It's basically CD quality, i.e. a sampling rate of 44100 Hz, 2 channels and 16 bits per sample, little endian.

If I try to download the stream with openRTSP it works fine; I just open the received stream with Audacity and I can listen to it. The output of openRTSP is:

Opened URL "rtsp://192.168.0.100:8554/video_stream", returning a SDP description:
v=0
o=- 1076509460391806 1 IN IP4 192.168.0.100
s=Session streamed by my program
i=LIVE555 Streaming Media v
t=0 0
a=tool:LIVE555 Streaming Media v2009.06.02
a=type:broadcast
a=control:*
a=source-filter: incl IN IP4 * 192.168.0.100
a=rtcp-unicast: reflection
a=range:npt=0-
a=x-qt-text-nam:Session streamed by my program
a=x-qt-text-inf:LIVE555 Streaming Media v
m=audio 18890 RTP/AVP 98
c=IN IP4 232.35.236.86/255
a=rtpmap:98 L16/44100/2
a=control:track1

Created receiver for "audio/L16" subsession (client ports 18890-18891)
Setup "audio/L16" subsession (client ports 18890-18891)
Created output file: "audio-L16-1"
Started playing session
Receiving streamed data (signal with "kill -HUP 10123" or "kill -USR1 10123" to terminate)...

When I try to listen to it with mplayer I get this output:

MPlayer dev-SVN-r26940
CPU: Intel(R) Core(TM)2 CPU 6600 @ 2.40GHz (Family: 6, Model: 15, Stepping: 6)
CPUflags: MMX: 1 MMX2: 1 3DNow: 0 3DNow2: 0 SSE: 1 SSE2: 1
Compiled with runtime CPU detection.
Can't open joystick device /dev/input/js0: No such file or directory
Can't init input joystick
mplayer: could not connect to socket
mplayer: No such file or directory
Failed to open LIRC support. You will not be able to use your remote control.

Playing rtsp://192.168.0.100:8554/video_stream.
Resolving 192.168.0.100 for AF_INET6...
Couldn't resolve name for AF_INET6: 192.168.0.100
Connecting to server 192.168.0.100[192.168.0.100]: 8554...
rtsp_session: unsupported RTSP server. Server type is 'unknown'.
STREAM_LIVE555, URL: rtsp://192.168.0.100:8554/video_stream
Stream not seekable!
file format detected.
Initiated "audio/L16" RTP subsession on port 18890
No stream found.
Exiting... (End of file)

What can be the reason for this error? I create the RTPSink with this code:

rtsp_layer->audio_sink =
  SimpleRTPSink::createNew(*((UsageEnvironment*) rtsp_layer->env),
                           (Groupsock*) rtsp_layer->audio_rtp_groupsock,
                           98, 44100, "audio", "L16", 2);

But I don't know if it is correct; I just took testWAVAudioStreamer.cpp as an example.

Thank you for your help,

Ottavio
From llsurreal at live.cn Mon Jul 20 04:25:50 2009
From: llsurreal at live.cn (DannyLuo)
Date: Mon, 20 Jul 2009 19:25:50 +0800
Subject: [Live-devel] testMP3Streamer.cpp and testMP3Receive.cpp
Message-ID: 

Hi,

I am still confused by the use of testMP3Streamer.exe and testMP3Receive.exe. When I test testMP3Streamer.exe and testMP3Receive.exe on two computers with the IP addresses 192.168.12.20 and 192.168.12.25 respectively, the program works fine. But when I test them on two computers with the IP addresses 222.205.120.136 and 222.205.120.137 respectively, the program does not work correctly.

The only difference in the code between these two tests is the IP address in testMP3Streamer.cpp and testMP3Receive.cpp. And I do not define the USE_SSM macro. I have already opened up the firewall; is there still a problem? How can I make the program work on the computers with the IP addresses 222.205.120.136 and 222.205.120.137?

Thanks!

best regards

From dliu.cn at gmail.com Mon Jul 20 07:22:36 2009
From: dliu.cn at gmail.com (Dong Liu)
Date: Mon, 20 Jul 2009 10:22:36 -0400
Subject: [Live-devel] get the rtp discontinuity info
Message-ID: <49f9a8960907200722s2fa94920p7f23c2ff09ab630c@mail.gmail.com>

Hi,

I'm writing a Microsoft DirectShow source filter based on liveMedia's RTP. I'm interested in getting discontinuity information, such as packets lost, packets out of order etc., from the RTP layer. Is there an API for this?

Thanks!

Dong

From dliu.cn at gmail.com Mon Jul 20 22:18:28 2009
From: dliu.cn at gmail.com (Dong Liu)
Date: Tue, 21 Jul 2009 01:18:28 -0400
Subject: [Live-devel] H264VideoRTPSource question
Message-ID: <49f9a8960907202218k326791f6hd560f3e19a678eb0@mail.gmail.com>

Hi,

I'm trying to write a Microsoft DirectShow source filter based on liveMedia's RTP implementation. For MPEG-4 it works perfectly, but for H.264 the video is always jumpy, sometimes with a lot of artifacts. I tried the source filter with CyberLink's H.264/AVC decoder, mpc-hc's MPCVideo decoder and ffdshow's software decoder. All have the same problem.

What I did was according to Microsoft's documentation:

http://msdn.microsoft.com/en-us/library/dd757808(VS.85).aspx

I created a media type with 'AVC1' as the type, used the sprop-parameter-sets from the SDP as the extra data of the video format, and set dwFlags to 4. Then I call fSource->getNextFrame() to get the next NALU; in the callback function, I put a 4-byte length before the NALU and pass them to the decoder.

BTW, the sources are VLC and a Haivision encoder. VLC plays them perfectly. And VLC is using liveMedia as well :(

Thanks,

Dong
BTW, sources are from vlc and Haivision encoder. vlc play them perfectly. And vlc is using livemedia as well :( Thanks, Dong -------------- next part -------------- An HTML attachment was scrubbed... URL: From spigao at gmail.com Tue Jul 21 04:20:34 2009 From: spigao at gmail.com (Sergio) Date: Tue, 21 Jul 2009 13:20:34 +0200 Subject: [Live-devel] OpenRTSP video dumping Message-ID: <71e4505a0907210420y125cc7d9uf106613f3b02b410@mail.gmail.com> Hi, Let me introduce my problem a bit and I'll very glad if you can give a hint. I'm using openRTSP to receive and RTP stream and dump it in to a mpeg file. Then, I transform this video into frames (pnm) with mplayer, in order to compare it to the original video and calculate psnr. Since there are RTP packet losses and the video file doesn't keep temporal reference of the pictures shown to the user, I can't do this straight forward. Thus, I'd like to extract an index of the received frames from the RTP stream reading the timestamps. The problem is that the resulting frames from the video are less than the RTP packets with different timestamps, so maybe openRTSP drops some packets when receiving them. Can this be? Any help with this or experiences calculating PSNR with lost frames would be very helpful Thanks in advance, Sergio From tm18174 at salle.url.edu Wed Jul 22 02:57:37 2009 From: tm18174 at salle.url.edu (tm18174 at salle.url.edu) Date: Wed, 22 Jul 2009 11:57:37 +0200 (CEST) Subject: [Live-devel] RTSP Server Message-ID: Hello, I'd like to know if there's any tutorial about RTSP Servers, because im a newbie programmer and i've never worked with Live555, and there's few information about how to use it on the internet. I want to develop a server which streams H264 video (not on demand, just begins the streaming as soon as one client is connected, and uses the same stream for all the others clients connected; stops streaming when there's no clients connected). As far as i've seen, i have to: -create my own H264MediaSubsession class -create RTSPServer and add my new subsession class -create H264RTPSource and provide the video source to this class -create H264RTPSink to handle the streaming itself Am I wrong? Please give me some advice or examples if possible. Daniel Collado Multimedia Engineering Degree Student Media Technologies Department (DTM) La Salle University (Barcelona, Spain) From chengchengxin at yahoo.com Wed Jul 22 08:26:53 2009 From: chengchengxin at yahoo.com (Catherine Cheng) Date: Wed, 22 Jul 2009 08:26:53 -0700 (PDT) Subject: [Live-devel] Question on RTSP field OPTIONS syntax Message-ID: <712188.71018.qm@web112602.mail.gq1.yahoo.com> Hello, I'm using RTSP/RTP over TCP to get video data from an IP camera. As soon as I send out OPTIONS command, the camera server close the connection. However, the VLC player works fine. Its OPTIONS has many lines recognizable and unrecognizable and its User-Agent is LIVE555. My OPTIONS lines are as follow, OPTIONS rtsp://192.168.1.187/rtsp_tunnel RTSP/1.0 CSeq: 1 Do I miss something? Any help would be appreciated. -Cath From amit.yedidia at elbitsystems.com Thu Jul 23 02:55:17 2009 From: amit.yedidia at elbitsystems.com (Yedidia Amit) Date: Thu, 23 Jul 2009 12:55:17 +0300 Subject: [Live-devel] Cancelling waiting to RTCP-RR (keep alive) Message-ID: Hi all, Is it possible to disbale the fact that the streamer is waiting for RTCP-RR from the receiver, and if not received withing a period of time the session is termianed by the streamer? Regards, Amit Yedidia Elbit System Ltd. 
From finlayson at live555.com Thu Jul 23 03:48:56 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Thu, 23 Jul 2009 12:48:56 +0200
Subject: [Live-devel] Closing the media does not free allocated memory?
In-Reply-To: <000601ca0aab$6b187210$ec03000a@gen47>
References: <000601ca0aab$6b187210$ec03000a@gen47>
Message-ID: 

>How do I release the memory allocated for ourClient, since the
>destructor is a private operation?

To reclaim an object derived from the "Medium" class, call "Medium::close()". That is sufficient.

In general, you should reclaim objects in the reverse order in which they were created. So, to reclaim your "UsageEnvironment" and "TaskScheduler" objects, do the following (after reclaiming other objects):

  env->reclaim();
  delete scheduler;

(Yes, this is all really ugly and inconsistent. Someday it might get improved...)
-- 
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From sebastien-devel at celeos.eu Thu Jul 23 04:57:05 2009
From: sebastien-devel at celeos.eu (=?iso-8859-1?b?U+liYXN0aWVu?= Escudier)
Date: Thu, 23 Jul 2009 13:57:05 +0200
Subject: [Live-devel] [BUG] changing system time with live555 running
Message-ID: <1248350225.4a68501110ddb@imp.celeos.eu>

Hi,

If I launch a RTSP connection like this:

  openRTSP rtsp://myip/mystream

and then I set my system time back one hour (if I add one hour, there is no problem), then live555 becomes buggy: it uses 100% of my CPU.

Any idea?

Thanks.

From kidjan at gmail.com Thu Jul 23 08:25:10 2009
From: kidjan at gmail.com (Jeremy Noring)
Date: Thu, 23 Jul 2009 08:25:10 -0700
Subject: [Live-devel] H264VideoRTPSource question
In-Reply-To: <49f9a8960907202218k326791f6hd560f3e19a678eb0@mail.gmail.com>
References: <49f9a8960907202218k326791f6hd560f3e19a678eb0@mail.gmail.com>
Message-ID: 

I've written something similar using FFDShow's decoder filter that works fine, so it sounds like something in your DirectShow filter implementation is wrong; I'd try the dshow newsgroups.

> I'm trying to write a Microsoft DirectShow source filter based on
> liveMedia's RTP implementation. For MPEG-4 it works perfectly, but for
> H.264 the video is always jumpy, sometimes with a lot of artifacts. [...]
> BTW, the sources are VLC and a Haivision encoder. VLC plays them
> perfectly. And VLC is using liveMedia as well :(
-- 
Where are we going?
And why am I in this hand-basket?

From kidjan at gmail.com Thu Jul 23 08:31:06 2009
From: kidjan at gmail.com (Jeremy Noring)
Date: Thu, 23 Jul 2009 08:31:06 -0700
Subject: [Live-devel] RTSP Server
In-Reply-To: 
References: 
Message-ID: 

On Wed, Jul 22, 2009 at 2:57 AM, wrote:
> I want to develop a server which streams H264 video (not on demand, just
> begins the streaming as soon as one client is connected, and uses the same
> stream for all the other clients connected; stops streaming when there's
> no clients connected).

What you just described is an "on-demand" RTSP server; when a client connects, the stream is started. Take a look at the samples; they function exactly how you describe it (the stream starts when the first client connects, and subsequent connections reuse the first stream).

> As far as i've seen, i have to:
>
> -create my own H264MediaSubsession class
> -create RTSPServer and add my new subsession class
> -create H264RTPSource and provide the video source to this class
> -create H264RTPSink to handle the streaming itself

That's pretty much it. You also need to know something about the H.264 stream your encoder produces (how to get the SPS/PPS, something about what NALUs it outputs, etc.)

From finlayson at live555.com Thu Jul 23 09:03:40 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Thu, 23 Jul 2009 18:03:40 +0200
Subject: [Live-devel] Cancelling waiting to RTCP-RR (keep alive)
In-Reply-To: 
References: 
Message-ID: 

>Is it possible to disable the behavior where the streamer waits for an
>RTCP-RR from the receiver and, if none is received within a period of
>time, terminates the session?

Yes, set the "reclamationTestSeconds" parameter in "RTSPServer::createNew()" to 0.
-- 
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
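In code, that is the fourth argument (a sketch; the port and authentication database are placeholders):

    RTSPServer* rtspServer =
      RTSPServer::createNew(*env, 8554,
                            NULL /*no authentication database*/,
                            0 /*reclamationTestSeconds: liveness check off*/);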
From dmaljur at elma.hr Fri Jul 24 01:03:37 2009
From: dmaljur at elma.hr (Dario)
Date: Fri, 24 Jul 2009 10:03:37 +0200
Subject: [Live-devel] Closing the media does not free allocated memory?
Message-ID: <000d01ca0c35$3ffac700$ec03000a@gen47>

So, I can't free the memory allocated by RTSPClient::createNew? I have a wrapper class around RTSPClient which I use in the manner of CRTSPClient *client = new CRTSPClient(...). And I should not use "delete client;" but internally close the open Medium and then just reuse "*client" without deleting it? I'm just curious why the live555 team decided to do it that way. Isn't it easier just to use standard new and delete for allocation (not a factory)?

>To reclaim an object derived from the "Medium" class, call
>"Medium::close()". That is sufficient.
>In general, you should reclaim objects in the reverse order in which they
>were created. So, to reclaim your "UsageEnvironment" and
>"TaskScheduler" objects, do the following (after reclaiming other
>objects):
>  env->reclaim();
>  delete scheduler;
>(Yes, this is all really ugly and inconsistent. Someday it might get
>improved...)

ELMA Kurtalj d.o.o., Viteziceva 1a, 10000 Zagreb, Croatia - www.elma.hr

From andrew_umlauf at yahoo.com Thu Jul 23 13:26:41 2009
From: andrew_umlauf at yahoo.com (Andrew Umlauf)
Date: Thu, 23 Jul 2009 13:26:41 -0700 (PDT)
Subject: [Live-devel] MPEG2 to TS Conversion Problems
Message-ID: <856633.30137.qm@web82003.mail.mud.yahoo.com>

I am having issues with the testMPEG1or2ProgramToTransportStream program. Here is what I have done.

1. At first, when I ran the program I would get the following error:

  Beginning to read...
  BasicTaskScheduler::SingleStep(): select() fails: No error

I found that this error comes from line 89 in BasicTaskScheduler.cpp and originates in the call to select() on line 70. To remedy (work around) it, I commented out the exit(0) on line 91.

2. Now the program runs all the way through. Here is my output:

  Beginning to read...
  BasicTaskScheduler::SingleStep(): select() fails: No error
  Done reading.
  Wrote output file: "out.ts"

Although this runs, my output file (out.ts) is 0 bytes.

3. Finally, to try to isolate the problem a little more, I edited testMPEG1or2ProgramToTransportStream.cpp to read as follows:

int main(int argc, char** argv) {
  ...
  // Open the input file as a 'byte-stream file source':
  FramedSource* inputSource = ByteStreamFileSource::createNew(*env, inputFileName);
  if (inputSource == NULL) {
    *env << "Unable to open file \"" << inputFileName
         << "\" as a byte-stream file source\n";
    exit(1);
  }

  /*// Create a MPEG demultiplexor that reads from that source.
  MPEG1or2Demux* baseDemultiplexor = MPEG1or2Demux::createNew(*env, inputSource);

  // Create, from this, a source that returns raw PES packets:
  MPEG1or2DemuxedElementaryStream* pesSource = baseDemultiplexor->newRawPESStream();

  // And, from this, a filter that converts to MPEG-2 Transport Stream frames:
  FramedSource* tsFrames =
    MPEG2TransportStreamFromPESSource::createNew(*env, pesSource);
  */
  FramedSource* tsFrames = inputSource;

  // Open the output file as a 'file sink':
  MediaSink* outputSink = FileSink::createNew(*env, outputFileName);
  ...
}

In the above code, the part that translates the .mpg stream into a TS stream was commented out and "short-circuited". The result was an out.ts file that was the same size as the in.mpg file. This implies that there is a problem in the translation code. I am new to live555, however, so I have no idea where to begin. Has anybody else had these issues? If so, how do I fix them?

P.S. I had the same output as in #1 when I tried to run MPEG2TransportStreamIndexer.exe and other test programs.

-Andrew U
From dliu.cn at gmail.com Thu Jul 23 19:11:33 2009
From: dliu.cn at gmail.com (Dong Liu)
Date: Thu, 23 Jul 2009 22:11:33 -0400
Subject: [Live-devel] H264VideoRTPSource question
In-Reply-To: 
References: <49f9a8960907202218k326791f6hd560f3e19a678eb0@mail.gmail.com>
Message-ID: <4A691855.5090801@gmail.com>

Jeremy,

Thanks. It works on some H.264 sources, but not on some other sources. I came across this message on the vlc list:

http://mailman.videolan.org/pipermail/vlc-devel/2005-July/018249.html

and a reply:

http://mailman.videolan.org/pipermail/vlc-devel/2005-July/018262.html

It mentioned that you need to gather a whole frame before giving it to the ffmpeg decoder. Maybe I should take a look at that.

BTW, the decoder I'm interested in using is the MPCVideoDecoder from the mpc-hc project; it supports hardware acceleration, but it also uses ffmpeg code.

Thanks,

Dong

Jeremy Noring wrote:
> I've written something similar using FFDShow's decoder filter that works
> fine, so it sounds like something in your DirectShow filter
> implementation is wrong; I'd try the dshow newsgroups. [...]
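A hedged sketch of the repackaging step being discussed here: liveMedia's "H264VideoRTPSource" delivers one NAL unit (without start codes) per getNextFrame(), while an 'AVC1'-typed DirectShow sample wants each NAL unit prefixed by a 4-byte big-endian length; the function name and output buffer are hypothetical:

    #include <cstring>

    // 'nal'/'nalSize' as handed to the afterGettingFrame() callback;
    // 'sample' must have room for nalSize + 4 bytes.
    unsigned packAVC1Sample(const unsigned char* nal, unsigned nalSize,
                            unsigned char* sample) {
      sample[0] = (unsigned char)((nalSize >> 24) & 0xFF); // big-endian
      sample[1] = (unsigned char)((nalSize >> 16) & 0xFF); //   4-byte
      sample[2] = (unsigned char)((nalSize >> 8) & 0xFF);  //   length
      sample[3] = (unsigned char)(nalSize & 0xFF);         //   prefix
      std::memcpy(sample + 4, nal, nalSize);
      return nalSize + 4;
    }

And, per the vlc-devel posts above, the decoder may additionally expect one complete access unit (all NAL units sharing a presentation time) per sample, so NAL units may need to be collected until the timestamp changes.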
From woods.biz at gmail.com Thu Jul 23 19:19:04 2009
From: woods.biz at gmail.com (Woods)
Date: Fri, 24 Jul 2009 10:19:04 +0800
Subject: [Live-devel] testMP3Streamer.cpp and testMP3Receive.cpp
In-Reply-To: 
References: 
Message-ID: <3ba8ebc0907231919y33072d07i51d8101c32932420@mail.gmail.com>

Can you use sniffer software to CONFIRM that you have received streaming packets on the destination machine?

2009/7/20 DannyLuo wrote:
> Hi,
> I am still confused by the use of testMP3Streamer.exe and
> testMP3Receive.exe. [...] How can I make the program work on the
> computers with the IP addresses 222.205.120.136 and 222.205.120.137?

-- 
Woods

From santosh.k at knoahsoft.com Thu Jul 23 23:50:05 2009
From: santosh.k at knoahsoft.com (Santosh Karankoti)
Date: Fri, 24 Jul 2009 12:20:05 +0530
Subject: [Live-devel] Trying to play mp3 file or live streaming to wmplayer client from default MediaServer code
Message-ID: 

Hi,

I have downloaded and compiled the makefiles for all the LIVE555 streaming open source code. I am trying to play an mp3 file by sending requests to the default MediaServer code using Windows Media Player v10. In this scenario, WMPlayer sends a DESCRIBE request to the MediaServer, to which the server responds with the SDP details in the DESCRIBE success response. For some reason, it seems that WMPlayer does not recognize the response.

The same request URL works when I provide it to the MediaServer using openRTSP or VLC Player. The mp3 file is a song of 3 minutes 43 seconds. With both these clients, the whole RTSP message flow finishes smoothly, as follows:

  VLC Player / openRTSP                MediaServer
  |--> OPTIONS  ------------------------------>  |
  |<------------------------------ 200 OK  <---  |
  |--> DESCRIBE ------------------------------>  |
  |<------------------------ 200 OK + SDP  <---  |
  |--> SETUP    ------------------------------>  |
  |<------------------------------ 200 OK  <---  |
  |--> PLAY     ------------------------------>  |
  |<------------------------------ 200 OK  <---  |

I am able to hear the song with VLC Player but not with the openRTSP client. I am not bothered about that, because it reports that it is receiving the started stream and I am able to see the RTP transmission in Wireshark. But WMPlayer does not recognize the LIVE555 DESCRIBE response at all and retransmits the request:

  Windows Media Player                 MediaServer
  |--> DESCRIBE ------------------------------>  |
  |<------------------------ 200 OK + SDP  <---  |
  |--> DESCRIBE ------------------------------>  |
  |<------------------------------ 200 OK  <---  |

Finally, after 30 seconds, a timeout happens at WMPlayer and it reports that it is not able to play the file. My concern is that WMPlayer does not recognize the LIVE555 server's DESCRIBE success response.

Can anyone provide information on how I can make Windows Media Player understand the DESCRIBE response of the default MediaServer of LIVE555? Do I need to enable or disable any setting in Windows Media Player, or do I need to change the code of MediaServer? If so, kindly provide the steps for the settings in Windows Media Player if you know them, or the reason and the solution for changing the default code to run with Windows Media Player. I can take care of the coding; I want to know which files I would need to enhance, if required, to run with Windows Media Player. Do I need to recompile the source code with any changes in the default MediaServer makefile?

Regards,
Santosh
KnoahSoft
From finlayson at live555.com  Fri Jul 24 03:09:47 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Fri, 24 Jul 2009 12:09:47 +0200
Subject: [Live-devel] get the rtp discontinuity info
In-Reply-To: <49f9a8960907200722s2fa94920p7f23c2ff09ab630c@mail.gmail.com>
References: <49f9a8960907200722s2fa94920p7f23c2ff09ab630c@mail.gmail.com>
Message-ID: 

>I'm writing a Microsoft directshow source filter based on live
>media's rtp. I'm interested in getting discontinuity information, such
>as packets lost, packets out of order etc., from the rtp layer.

No, you don't need this information. Our RTP reception implementation takes care of this automatically, so that your receiver is presented with whole, properly-ordered frames.

If, however, you really want to look at the QOS (including packet loss) statistics that the RTP/RTCP reception implementation collects, then you can do this by looking at the "RTPReceptionStats". See, for example, how the "openRTSP" demo application implements its "-Q" option.
-- 

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From anon.hui at gmail.com  Fri Jul 24 04:21:11 2009
From: anon.hui at gmail.com (Anon Sricharoenchai)
Date: Fri, 24 Jul 2009 18:21:11 +0700
Subject: [Live-devel] bug: testProgs/playCommon.cpp: task is not properly unscheduled in continuous play
Message-ID: <5914604b0907240421qa7630fbgbba14169a1a449f2@mail.gmail.com>

When playing continuously, it should unschedule the following tasks before continuing to play.

testProgs/playCommon.cpp:

 void sessionAfterPlaying(void* /*clientData*/) {
   if (!playContinuously) {
     shutdown(0);
   } else {
+    // Unschedule these tasks, since they will be rescheduled by
+    // startPlayingStreams():
+    if (env != NULL) {
+      env->taskScheduler().unscheduleDelayedTask(sessionTimerTask);
+      env->taskScheduler().unscheduleDelayedTask(arrivalCheckTimerTask);
+      env->taskScheduler().unscheduleDelayedTask(interPacketGapCheckTimerTask);
+      env->taskScheduler().unscheduleDelayedTask(qosMeasurementTimerTask);
+      sessionTimerTask = NULL;
+      arrivalCheckTimerTask = NULL;
+      interPacketGapCheckTimerTask = NULL;
+      qosMeasurementTimerTask = NULL;
+    }
+    // Reset this, to ensure that checkInterPacketGaps(), called by
+    // startPlayingStreams(), will not close the session:
+    totNumPacketsReceived = ~0;
     // We've been asked to play the stream(s) over again:
     startPlayingStreams();
   }
 }

From dliu.cn at gmail.com  Fri Jul 24 06:24:53 2009
From: dliu.cn at gmail.com (Dong Liu)
Date: Fri, 24 Jul 2009 09:24:53 -0400
Subject: Re: [Live-devel] get the rtp discontinuity info
References: <49f9a8960907200722s2fa94920p7f23c2ff09ab630c@mail.gmail.com>
Message-ID: <4A69B625.5040808@gmail.com>

Ross Finlayson wrote:
>> I'm writing a Microsoft directshow source filter based on live media's
>> rtp. I'm interested in getting discontinuity information, such as
>> packets lost, packets out of order etc., from the rtp layer.
>
> No, you don't need this information. Our RTP reception implementation
> takes care of this automatically, so that your receiver is presented
> with whole, properly-ordered frames.
>
> If, however, you really want to look at the QOS (including packet loss)
> statistics that the RTP/RTCP reception implementation collects, then you
> can do this by looking at the "RTPReceptionStats". See, for example,
> how the "openRTSP" demo application implements its "-Q" option.

But, in Microsoft DirectShow, when you provide a sample to a downstream filter, you can indicate whether there is a discontinuity in the stream. I think this info would be useful to other receivers too.

Thanks,

Dong

From finlayson at live555.com  Fri Jul 24 12:56:53 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Fri, 24 Jul 2009 21:56:53 +0200
Subject: Re: [Live-devel] bug: testProgs/playCommon.cpp: task is not properly unscheduled in continuous play
In-Reply-To: <5914604b0907240421qa7630fbgbba14169a1a449f2@mail.gmail.com>
References: <5914604b0907240421qa7630fbgbba14169a1a449f2@mail.gmail.com>
Message-ID: 

Thanks. This will get fixed in a future release.
-- 

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From finlayson at live555.com  Fri Jul 24 13:38:56 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Fri, 24 Jul 2009 22:38:56 +0200
Subject: Re: [Live-devel] MPEG2 to TS Conversion Problems
In-Reply-To: <856633.30137.qm@web82003.mail.mud.yahoo.com>
References: <856633.30137.qm@web82003.mail.mud.yahoo.com>
Message-ID: 

>I am having issues with the testMPEG1or2ProgramToTransportStream file.
>
>Here is what I have done.
>
>1. At first, when I ran the program I would get the following error:
>
>    Beginning to read...
>    BasicTaskScheduler::SingleStep(): select() fails: No error

This is your problem; any attempt to 'work around' it will be futile.

That message usually means a problem with your networking interface, or network interface library.

What OS are you running this on?

From kidjan at gmail.com  Fri Jul 24 14:11:09 2009
From: kidjan at gmail.com (Jeremy Noring)
Date: Fri, 24 Jul 2009 14:11:09 -0700
Subject: Re: [Live-devel] get the rtp discontinuity info
Message-ID: 

On Fri, Jul 24, 2009 at 3:09 AM, Ross Finlayson wrote:

>> I'm writing a Microsoft directshow source filter based on live media's
>> rtp. I'm interested in getting discontinuity information, such as packets
>> lost, packets out of order etc., from the rtp layer.
>
> No, you don't need this information. Our RTP reception implementation
> takes care of this automatically, so that your receiver is presented with
> whole, properly-ordered frames.

Dumb question: what if a frame is lost on the wire? For example, if it's H.264, and a keyframe was lost, does that information not need to be conveyed to the decoder?

From finlayson at live555.com  Fri Jul 24 14:52:51 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Fri, 24 Jul 2009 23:52:51 +0200
Subject: Re: [Live-devel] get the rtp discontinuity info
Message-ID: 

>Dumb question: what if a frame is lost on the wire? For example, if
>it's H.264, and a keyframe was lost, does that information not need
>to be conveyed to the decoder?

Probably not. The decoder knows what kind of frame it's getting each time.
-- 

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
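[For anyone who does want to read those statistics programmatically, a sketch along the lines of what openRTSP's "-Q" option does, assuming "subsession" is a "MediaSubsession" that has already been initiated and "env" is a "UsageEnvironment*":

  RTPSource* src = subsession->rtpSource();
  if (src != NULL) {
    RTPReceptionStatsDB::Iterator iter(src->receptionStatsDB());
    RTPReceptionStats* stats;
    while ((stats = iter.next(True/*include inactive sources*/)) != NULL) {
      unsigned expected = stats->totNumPacketsExpected();
      unsigned received = stats->totNumPacketsReceived();
      *env << "Received " << received << " of " << expected
           << " expected packets\n";
    }
  }
]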
From woods.biz at gmail.com  Fri Jul 24 23:23:00 2009
From: woods.biz at gmail.com (Woods)
Date: Sat, 25 Jul 2009 14:23:00 +0800
Subject: [Live-devel] MP4 Demux problem
Message-ID: <3ba8ebc0907242323m7ef9c823n8ec7d56154ddf4a5@mail.gmail.com>

Dear all,

I am writing an MP4Demux and an MP4DemuxedElementaryStream, mimicking MPEG1or2Demux and MPEG1or2DemuxedElementaryStream. Specifically, my MP4Demux uses FFMPEG to read AVPackets from an MP4 file, and feeds each packet->data to the MP4DemuxedElementaryStream. In case packet->size is larger than the maxSize of doGetNextFrame(), I feed packet->data in several steps, with respect to the given maxSize. But I don't know how to set the presentation time and duration, so I just set them to 0, as MPEG1or2Demux does.

Then I tested my classes with a simple streamer, feeding into MPEG2TransportStreamFromESSource and BasicUDPSink. I noticed that it does not work properly, in that the streamer doesn't do proper scheduling during streaming: it reads all the packets from the MP4 file very fast, and sends all of them out immediately.

Could anyone give me a suggestion on the cause of this? Thank you.

-- 
Woods
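[One likely cause of the missing pacing: downstream objects schedule their reads using the source's "fDurationInMicroseconds", so leaving it at 0 lets the file be read as fast as possible. A sketch of filling both fields from the packet just read; "pkt" (the AVPacket) and "stream" (its AVStream) are placeholder names:

  // pkt->pts is in units of stream->time_base; convert to microseconds:
  double ptsSeconds = pkt->pts * av_q2d(stream->time_base);
  int64_t ptsUsec = (int64_t)(ptsSeconds * 1000000.0);
  fPresentationTime.tv_sec  = (long)(ptsUsec / 1000000);
  fPresentationTime.tv_usec = (long)(ptsUsec % 1000000);
  if (pkt->duration > 0) {
    fDurationInMicroseconds =
      (unsigned)(pkt->duration * av_q2d(stream->time_base) * 1000000.0);
  }
]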
From SRawling at pelco.com  Sat Jul 25 10:25:19 2009
From: SRawling at pelco.com (Rawling, Stuart)
Date: Sat, 25 Jul 2009 10:25:19 -0700
Subject: [Live-devel] Memory leak
Message-ID: 

Hi Ross,

In teardownMediaSubsession in RTSPClient, the subsession's sessionId variable is deleted and reset - however, *only if* the teardown succeeds. If the teardown fails and the MediaSubsession is subsequently closed using the proper method, then the sessionId will not be freed. I suggest also deleting the pointer in the destructor of MediaSubsession, in case it has not already been deleted. I know that you are reworking RTSPClient, but I still think this is worth handling in MediaSubsession as an edge case.

Regards,
Stuart

Patch:

--- MediaSession.cpp    2009-07-25 03:22:31.000000000 -0700
+++ MediaSession.cpp.new    2009-07-25 03:23:53.000000000 -0700
@@ -563,7 +563,8 @@
   delete[] fConnectionEndpointName;
   delete[] fSavedSDPLines; delete[] fMediumName; delete[] fCodecName; delete[] fProtocolName;
-  delete[] fControlPath; delete[] fConfig; delete[] fMode; delete[] fSpropParameterSets;
+  delete[] fControlPath; delete[] fConfig; delete[] fMode; delete[] fSpropParameterSets;
+  delete[] sessionId;
 
   delete fNext;
 #ifdef SUPPORT_REAL_RTSP

From andrew_umlauf at yahoo.com  Sat Jul 25 15:32:34 2009
From: andrew_umlauf at yahoo.com (Andrew Umlauf)
Date: Sat, 25 Jul 2009 15:32:34 -0700 (PDT)
Subject: Re: [Live-devel] MPEG2 to TS Conversion Problems
Message-ID: <656385.21840.qm@web82004.mail.mud.yahoo.com>

I am running on Windows XP, compiled with VS 2005. However, I also get the error using the .mak files, following the directions provided on the VS2005 hints page. I tried compiling on several computers at my workplace with no success. I additionally tried compiling with VC7, with no success. I was going to try VC6, but I could not obtain a license before quitting time. Unfortunately, I am forced to run this in a Windows environment.

After further investigation, I assumed that there was a problem with the way that ByteStreamFileSource was handling IO on Windows. I replaced the ByteStreamFileSource with a SimpleRTPSource. I no longer get the select() error, and I get a TS file that is larger than 0 bytes. However, the TS file does not display anything when played (VLC). When I comment out the TS conversion code, I get the same kind of result (different in size, but it plays the same). I tried this with multiple non-audio mpeg files. Same result.

This is the code:

  int main(int argc, char** argv) {
    // Begin by setting up our usage environment:
    TaskScheduler* scheduler = BasicTaskScheduler::createNew();
    env = BasicUsageEnvironment::createNew(*scheduler);

    char const* srcAddrStr = "147.159.167.150";
    struct in_addr srcAddress;
    srcAddress.s_addr = our_inet_addr(srcAddrStr);
    const Port rtpPort(8888);
    const unsigned char ttl = 7;
    Groupsock rtpGroupsock(*env, srcAddress, rtpPort, ttl);

    //FramedSource* inputSource = MPEG1or2VideoRTPSource::createNew(*env, &rtpGroupsock); // doesn't work
    FramedSource* inputSource
      = SimpleRTPSource::createNew(*env, &rtpGroupsock, 33, 90000, "video");
    if (inputSource == NULL) {
      *env << "Unable to open source\n";
      exit(1);
    }

    // Create a MPEG demultiplexor that reads from that source:
    MPEG1or2Demux* baseDemultiplexor = MPEG1or2Demux::createNew(*env, inputSource);

    // Create, from this, a source that returns raw PES packets:
    MPEG1or2DemuxedElementaryStream* pesSource = baseDemultiplexor->newRawPESStream();

    // And, from this, a filter that converts to MPEG-2 Transport Stream frames:
    FramedSource* tsFrames
      = MPEG2TransportStreamFromPESSource::createNew(*env, pesSource);
    // FramedSource* tsFrames = inputSource; // short circuit

    // Open the output file as a 'file sink':
    MediaSink* outputSink = FileSink::createNew(*env, outputFileName);
    if (outputSink == NULL) {
      *env << "Unable to open file \"" << outputFileName << "\" as a file sink\n";
      exit(1);
    }

    // Finally, start playing:
    *env << "Beginning to read...\n";
    outputSink->startPlaying(*tsFrames, afterPlaying, NULL);

    env->taskScheduler().doEventLoop(); // does not return
    return 0; // only to prevent compiler warning
  }

Being new to Live555, my invalid out.ts is probably just a result of improperly utilizing the SimpleRTPSource. However, as I look through the forums and do google searches, it seems a lot of people are having problems with the select() error. This error keeps me from running a lot of the test executables. Fortunately, I think I will be able to get around using a ByteStreamFileSource in the solution that I need it for.

Let me know what you think. I appreciate the help.

-Andrew U.
--- On Fri, 7/24/09, Ross Finlayson wrote:

> From: Ross Finlayson
> Subject: Re: [Live-devel] MPEG2 to TS Conversion Problems
> To: "LIVE555 Streaming Media - development & use"
> Date: Friday, July 24, 2009, 4:38 PM
>
> >I am having issues with the testMPEG1or2ProgramToTransportStream file.
> >
> >Here is what I have done.
> >
> >1. At first, when I ran the program I would get the following error:
> >
> >    Beginning to read...
> >    BasicTaskScheduler::SingleStep(): select() fails: No error
>
> This is your problem; any attempt to 'work around' it will be futile.
>
> That message usually means a problem with your networking interface,
> or network interface library.
>
> What OS are you running this on?

From finlayson at live555.com  Sun Jul 26 07:49:29 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Sun, 26 Jul 2009 16:49:29 +0200
Subject: Re: [Live-devel] Memory leak
Message-ID: 

>In teardownMediaSubsession in RTSPClient, the subsession's sessionId
>variable is deleted and reset - however, *only if* the teardown
>succeeds. If the teardown fails and the MediaSubsession is
>subsequently closed using the proper method, then the sessionId will
>not be freed. I suggest also deleting the pointer in the destructor
>of MediaSubsession, in case it has not already been deleted.

No, I don't want to do this, because - in general - the entity that allocates memory should be the one that deletes it. In this case, the memory (the "sessionId" string) is allocated (and used) only by the "RTSPClient" - and so the "RTSPClient" should be the only thing that deletes it.

The actual solution in this case is simply for the "RTSPClient" to behave as if the RTSP "TEARDOWN" succeeded, regardless of the actual RTSP response that it gets from the server. I.e., as far as the client is concerned, the session has ended.

I'll fix this (in "RTSPClient") in the next release of the software.
-- 

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From finlayson at live555.com  Sun Jul 26 08:06:03 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Sun, 26 Jul 2009 17:06:03 +0200
Subject: Re: [Live-devel] MPEG2 to TS Conversion Problems
In-Reply-To: <656385.21840.qm@web82004.mail.mud.yahoo.com>
References: <656385.21840.qm@web82004.mail.mud.yahoo.com>
Message-ID: 

>After further investigation, I assumed that there was a problem with
>the way that ByteStreamFileSource was handling IO on Windows.

On Windows, the "READ_FROM_FILES_SYNCHRONOUSLY" constant must be defined (to 1) in "ByteStreamFileSource.cpp". This should happen automatically (provided that __WIN32__ or _WIN32 is defined), but you should check to make sure.
-- 

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From finlayson at live555.com  Sun Jul 26 12:17:59 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Sun, 26 Jul 2009 21:17:59 +0200
Subject: Re: [Live-devel] problem trying to transmit raw audio data to mplayer
In-Reply-To: <4A683146.9040407@videotec.com>
References: <4A683146.9040407@videotec.com>
Message-ID: 

>If I try to download the stream with openRTSP it works fine, I just open
>the received stream with audacity and I can listen to it.
[...]
>When I try to listen to it with mplayer I get this output:
>
>MPlayer dev-SVN-r26940
>CPU: Intel(R) Core(TM)2 CPU 6600 @ 2.40GHz (Family: 6, Model: 15, Stepping: 6)
>CPUflags: MMX: 1 MMX2: 1 3DNow: 0 3DNow2: 0 SSE: 1 SSE2: 1
>Compiled with runtime CPU detection.
>Can't open joystick device /dev/input/js0: No such file or directory
>Can't init input joystick
>mplayer: could not connect to socket
>mplayer: No such file or directory
>Failed to open LIRC support. You will not be able to use your remote
>control.

This is not a MPlayer mailing list, so we can't help you with this. However, I suggest trying VLC instead. You might have better luck with that.
-- 

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From finlayson at live555.com  Sun Jul 26 12:46:12 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Sun, 26 Jul 2009 21:46:12 +0200
Subject: Re: [Live-devel] Question on RTSP field OPTIONS syntax
In-Reply-To: <712188.71018.qm@web112602.mail.gq1.yahoo.com>
References: <712188.71018.qm@web112602.mail.gq1.yahoo.com>
Message-ID: 

>Hello,
>
>I'm using RTSP/RTP over TCP to get video data from an IP camera. As
>soon as I send out the OPTIONS command, the camera server closes the
>connection. However, the VLC player works fine. Its OPTIONS request has
>many lines, recognizable and unrecognizable, and its User-Agent is
>LIVE555. My OPTIONS lines are as follows,

Unfortunately I don't really understand your question, because VLC (as a RTSP client) uses our library, and therefore uses our "RTSPClient::sendOptionsCmd()" function, which is what you should be using also.
-- 

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
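[For reference, a sketch of using that function with the synchronous client interface from this era of the library; the URL is a placeholder and error handling is minimal:

  RTSPClient* client = RTSPClient::createNew(*env, 1/*verbosity*/, "myClient");
  char* optionsResponse = client->sendOptionsCmd("rtsp://camera-address/stream");
  if (optionsResponse == NULL) {
    *env << "OPTIONS failed: " << env->getResultMsg() << "\n";
  } else {
    *env << "OPTIONS response: " << optionsResponse << "\n";
    delete[] optionsResponse; // the response string was allocated with new[]
  }
]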
From Deepti.Saraswat at sdc.canon.co.in  Mon Jul 27 00:42:42 2009
From: Deepti.Saraswat at sdc.canon.co.in (Deepti.Saraswat at sdc.canon.co.in)
Date: Mon, 27 Jul 2009 13:12:42 +0530
Subject: [Live-devel] Regarding IPV6
Message-ID: 

Hello Ross,

I want to know how we can extend the Groupsock class to support IPv6.

Regards,
Deepti

From finlayson at live555.com  Mon Jul 27 00:56:18 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Mon, 27 Jul 2009 09:56:18 +0200
Subject: Re: [Live-devel] Regarding IPV6
Message-ID: 

>I want to know how we can extend the Groupsock class to support IPv6.

That is not how we will be supporting IPv6. Instead, we will first be completely replacing the "Groupsock" mechanism with a new 'network packet' abstraction (treating incoming network interfaces as 'framed sources' and outgoing network interfaces as 'framed sinks' - the same way we already treat RTP media frames). Then our code will be able to handle any network interface, whether IPv4, IPv6, or synthetic.

Unfortunately this will be a lot of work, and there's no ETA for it.
-- 

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From finlayson at live555.com  Mon Jul 27 01:35:21 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Mon, 27 Jul 2009 10:35:21 +0200
Subject: Re: [Live-devel] RTSP Server
Message-ID: 

>I'd like to know if there's any tutorial about RTSP servers, because I'm a
>newbie programmer

Unfortunately, this software is not intended for "newbie programmers".

>I want to develop a server which streams H264 video (not on demand; it
>just begins streaming as soon as one client is connected, uses the same
>stream for all the other connected clients, and stops streaming when
>there are no clients connected).
>
>As far as I've seen, I have to:
>
>-create my own H264MediaSubsession class

Yes. This should be a subclass of "OnDemandServerMediaSubsession", and must set the "reuseFirstSource" parameter to True (because you want to use the same input source for all downstream clients).

>-create RTSPServer and add my new subsession class

Yes.

>-create H264RTPSource

No! "RTPSource"s are used only for RTP *receivers* (i.e., clients). You are writing a server, not a client. Instead, you need to write your own subclass of "H264VideoStreamFramer" (to encapsulate your input source, and deliver H.264 NAL units).

>-create H264RTPSink to handle the streaming itself

Yes.
-- 

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From sebastien-devel at celeos.eu  Mon Jul 27 05:29:07 2009
From: sebastien-devel at celeos.eu (Sébastien Escudier)
Date: Mon, 27 Jul 2009 14:29:07 +0200
Subject: Re: [Live-devel] [BUG] changing system time with live555 running
In-Reply-To: <1248350225.4a68501110ddb@imp.celeos.eu>
References: <1248350225.4a68501110ddb@imp.celeos.eu>
Message-ID: <1248697747.4a6d9d9352ca6@imp.celeos.eu>

Quoting Sébastien Escudier:

> If I launch a rtsp connection like this:
> openRTSP rtsp://myip/mystream
>
> And then I subtract one hour from my system time (if I add one hour,
> there is no problem), then live555 becomes buggy. It uses 100% of my CPU.

Hi Ross,

Do you plan to look at this bug?

In my case, as a temporary hack, I replaced gettimeofday with a monotonic clock [1] in the file RTCP.cpp. It solves the issue for the "openRTSP" test prog, but there may be other places to fix in the code.

[1] clock_gettime(CLOCK_MONOTONIC, &ts)
http://www.opengroup.org/onlinepubs/009695399/functions/clock_gettime.html

Regards,
Seb.
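[For reference, a sketch of the substitution Sébastien describes (POSIX; on older glibc, link with -lrt). CLOCK_MONOTONIC never jumps when the wall clock is changed, which is what avoids the busy loop - but note that monotonic time is not wall-clock time, so this is only suitable as a local workaround:

  #include <time.h>
  #include <sys/time.h>

  struct timespec ts;
  clock_gettime(CLOCK_MONOTONIC, &ts);
  struct timeval tv;
  tv.tv_sec  = ts.tv_sec;
  tv.tv_usec = ts.tv_nsec / 1000;
]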
From Deepti.Saraswat at sdc.canon.co.in  Mon Jul 27 06:11:02 2009
From: Deepti.Saraswat at sdc.canon.co.in (Deepti.Saraswat at sdc.canon.co.in)
Date: Mon, 27 Jul 2009 18:41:02 +0530
Subject: [Live-devel] Regarding IPV6
Message-ID: 

Hi Ross,

I am creating a RTSPClient application using LIVE555. Right now my application displays only a JPEG stream. For this, I derived a sink class from the MediaSink class of LIVE555, and using the doEventLoop() function I am getting data in this media-sink-derived class.

My query is: can I receive H.264 data as well inside this simple media-sink-derived class, or do I need to do something more? Please reply soon.

Regards,
Deepti

From finlayson at live555.com  Mon Jul 27 06:43:49 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Mon, 27 Jul 2009 15:43:49 +0200
Subject: Re: [Live-devel] Regarding IPV6
Message-ID: 

>I am creating a RTSPClient application using LIVE555.
>Right now my application displays only a JPEG stream. For this, I
>derived a sink class from the MediaSink class of LIVE555, and using the
>doEventLoop() function I am getting data in this media-sink-derived class.
>
>My query is: can I receive H.264 data as well

Yes. Look at our existing RTSP client demo application - "openRTSP". It can receive H.264 video (as well as many other codecs).
-- 

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From garesuru at gmail.com  Mon Jul 27 08:40:50 2009
From: garesuru at gmail.com (gerardo ares)
Date: Mon, 27 Jul 2009 12:40:50 -0300
Subject: [Live-devel] sync sample issue with flash players
Message-ID: <38f061a30907270840j4757a2b3k1caee0e4feeb7680@mail.gmail.com>

Hi,

I am using your openRTSP software to get a stream from an Axis M1031 device. openRTSP gets the H.264 video, and I can create the mp4 file with the -4 option. Everything works well, but I have an issue when I need to play those videos in the JW flash player or flowplayer: the time-shift functionality was not working with the openRTSP mp4 files. After doing some research and hours of work (I'm not an mp4 expert), I found that openRTSP does not create the "sync sample" metadata, and that was the reason the time-shift functionality was not working with flash players.

Based on the documentation, the "sync sample" metadata is optional, so I think openRTSP is ok. But the solution I found is to change the QuickTimeFileSink.cpp file to support the sync sample atom. I added the addAtom(stss) function (code below), based on the Time-To-Sample atom, and the new openRTSP videos started to work well.

Could you add the sync sample metadata to your software, as an option? Then I could use a solution based on mp4-expert developers' code.

Thanks in advance.

Gerardo.

The code... could have some bugs, but it is working:

addAtom(stss); // Sync-Sample
  size += addWord(0x00000000); // Version+flags

  // First, add a dummy "Number of entries" field
  // (and remember its position).  We'll fill this field in later:
  unsigned numEntriesPosition = ftell(fOutFid);
  size += addWord(0); // dummy for "Number of entries"

  // Then, count the total number of samples, by running through the
  // chunk descriptors:
  unsigned numEntries = 0, numSamplesSoFar = 0;
  unsigned i;
  unsigned const samplesPerFrame = fCurrentIOState->fQTSamplesPerFrame;
  ChunkDescriptor* chunk = fCurrentIOState->fHeadChunk;
  while (chunk != NULL) {
    unsigned const numSamples = chunk->fNumFrames*samplesPerFrame;
    numSamplesSoFar += numSamples;
    chunk = chunk->fNextChunk;
  }

  // Then, write out the entries:
  for (i = 0; i < numSamplesSoFar; i += 12) {
    size += addWord(i + 1);
    ++numEntries;
  }
  if (i != (numSamplesSoFar - 1)) {
    size += addWord(numSamplesSoFar);
    ++numEntries;
  }

  // Now go back and fill in the "Number of entries" field:
  setWord(numEntriesPosition, numEntries);
addAtomEnd;

ps: the magic 12 I got from the ffmpeg command. When I was doing some research, I used the ffmpeg command to transcode the openRTSP mp4 file to a new one with the libx264 encoder. I saw that ffmpeg put the following sync samples: 1, 13, 25, 37... so I used the same.

From andrew_umlauf at yahoo.com  Mon Jul 27 11:16:30 2009
From: andrew_umlauf at yahoo.com (andrew_umlauf at yahoo.com)
Date: Mon, 27 Jul 2009 11:16:30 -0700 (PDT)
Subject: Re: [Live-devel] MPEG2 to TS Conversion Problems
Message-ID: <316512.22229.qm@web82007.mail.mud.yahoo.com>

Indeed, READ_FROM_FILES_SYNCHRONOUSLY is defined as 1, and I still get the error. I have noticed other people reporting the same issue in previous posts on this list. In one response, the user was told to comment out the exit(0) (line 91) in BasicTaskScheduler. Another response said to make sure READ_FROM_FILES_SYNCHRONOUSLY is defined as 1. Unfortunately, none of the posts regarding issues similar to mine have complete follow-ups, and I have not had luck with anything that was suggested.

As mentioned before, I also get weird behavior when I replace the ByteStreamFileSource with a SimpleRTPSource or SimpleUDPSource: I no longer get the select() error with a zero-byte output, but I get an output that does not play. I assume that all of these issues are related to improper Windows compilation, but I don't really know how to attack the problem. I have tried compiling with Visual Studio 9, 8, 7 and 6 on multiple machines, following the Windows build instructions provided on the live555 site.

-Andrew Umlauf

--- On Sun, 7/26/09, Ross Finlayson wrote:

> From: Ross Finlayson
> Subject: Re: [Live-devel] MPEG2 to TS Conversion Problems
> To: "LIVE555 Streaming Media - development & use"
> Date: Sunday, July 26, 2009, 11:06 AM
>
> > After further investigation, I assumed that there was a problem with
> > the way that ByteStreamFileSource was handling IO on Windows.
>
> On Windows, the "READ_FROM_FILES_SYNCHRONOUSLY" constant
> must be defined (to 1) in "ByteStreamFileSource.cpp".
> This should happen automatically (provided that __WIN32__ or
> _WIN32 is defined), but you should check to make sure.
> --
> Ross Finlayson
> Live Networks, Inc.
> http://www.live555.com/
From finlayson at live555.com  Mon Jul 27 12:48:04 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Mon, 27 Jul 2009 21:48:04 +0200
Subject: Re: [Live-devel] sync sample issue with flash players
In-Reply-To: <38f061a30907270840j4757a2b3k1caee0e4feeb7680@mail.gmail.com>
References: <38f061a30907270840j4757a2b3k1caee0e4feeb7680@mail.gmail.com>
Message-ID: 

Gerardo,

Thanks for the contribution.

I will likely add this to the installed code. However, your suggested patch is incomplete, because you didn't say where to add the code that actually adds this atom. I.e., at some point in the code, there needs to be a statement like

  size += addAtom_stss();

Can you please let us know where you think that statement should go?
-- 

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From garesuru at gmail.com  Mon Jul 27 15:40:27 2009
From: garesuru at gmail.com (gerardo ares)
Date: Mon, 27 Jul 2009 19:40:27 -0300
Subject: Re: [Live-devel] sync sample issue with flash players
In-Reply-To: 
References: <38f061a30907270840j4757a2b3k1caee0e4feeb7680@mail.gmail.com>
Message-ID: <38f061a30907271540l77285be8nc265b61dfbff5eb0@mail.gmail.com>

Hi,

Yes, sorry. I added it between the stts and stsc atoms:

addAtom(stbl);
  size += addAtom_stsd();
  size += addAtom_stts();
  size += addAtom_stss();
  size += addAtom_stsc();
  size += addAtom_stsz();
  size += addAtom_stco();
addAtomEnd;

Gerardo.

On Mon, Jul 27, 2009 at 4:48 PM, Ross Finlayson wrote:

> Gerardo,
>
> Thanks for the contribution.
>
> I will likely add this to the installed code. However, your suggested
> patch is incomplete, because you didn't say where to add the code that
> actually adds this atom. I.e., at some point in the code, there needs to be
> a statement like
>   size += addAtom_stss();
>
> Can you please let us know where you think that statement should go?
> --
>
> Ross Finlayson
> Live Networks, Inc.
> http://www.live555.com/

From finlayson at live555.com  Mon Jul 27 18:13:07 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Tue, 28 Jul 2009 03:13:07 +0200
Subject: Re: [Live-devel] sync sample issue with flash players
In-Reply-To: <38f061a30907270840j4757a2b3k1caee0e4feeb7680@mail.gmail.com>
References: <38f061a30907270840j4757a2b3k1caee0e4feeb7680@mail.gmail.com>
Message-ID: 

OK, I've now released a new version (2009.07.28) of the "LIVE555 Streaming Media" software that includes this change.

However, I changed your suggested code so that it adds the "stss" atom only for video streams. (For audio streams, such an atom doesn't appear to make sense (see ), and would lead to an extremely large number of entries anyway.)

Also, I'm rather nervous about the assumption that 'key frames' occur every 12 frames in each video stream. This might be true for ffmpeg-generated H.264 streams, but for many (if not most) other video streams that won't be the case, so I'm worried about what this might break. So if anyone runs into any problems with this change, please let us know.
-- 

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
From yogeshu at interfaceinfosoft.com  Mon Jul 27 23:32:16 2009
From: yogeshu at interfaceinfosoft.com (Yogesh)
Date: Tue, 28 Jul 2009 12:02:16 +0530
Subject: [Live-devel] Streaming to mobile device using live555.
Message-ID: <001001ca0f4d$28a375a0$79ea60e0$@com>

Hello Support,

We are developing a streaming application in which we want to stream live video from a server to a mobile device (for example BlackBerry, iPhone) using the RTSP protocol and the live555 library. We are planning to use MPEG-4 video for streaming, as the mobile devices support the MPEG-4 format. To test this, we streamed using the testMPEG4VideoStreamer example; we got the stream in VLC player on another desktop PC, but we are unable to get the stream on the BlackBerry mobile device simulator. Can we use this sample directly for streaming to a mobile device simulator, or is some modification required? We have an encoder which gives us MPEG-2 TS, MPEG-4, etc. encoded data from live video. How can we feed this live encoded video data to the live555 streamer?

Can you provide some direction on this?

Any help on this will be very helpful. Waiting for your reply.

Thanks and Regards,
Yogesh

From finlayson at live555.com  Tue Jul 28 02:28:58 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Tue, 28 Jul 2009 11:28:58 +0200
Subject: Re: [Live-devel] Streaming to mobile device using live555.
In-Reply-To: <001001ca0f4d$28a375a0$79ea60e0$@com>
References: <001001ca0f4d$28a375a0$79ea60e0$@com>
Message-ID: 

>We are developing a streaming application in which we want to stream
>live video from a server to a mobile device (for example BlackBerry,
>iPhone) using the RTSP protocol and the live555 library. We are planning
>to use MPEG-4 video for streaming, as the mobile devices support the
>MPEG-4 format. To test this, we streamed using the testMPEG4VideoStreamer
>example; we got the stream in VLC player on another desktop PC, but we are
>unable to get the stream on the BlackBerry mobile device simulator. Can we
>use this sample directly for streaming to a mobile device simulator, or
>is some modification required?

"testMPEG4VideoStreamer" streams using multicast. Not all networks and/or clients support IP multicast. Instead, try using "testOnDemandRTSPServer" or "live555MediaServer", which stream via unicast (using RTSP/RTP).

>We have an encoder which gives us MPEG-2 TS, MPEG-4, etc. encoded data
>from live video. How can we feed this live encoded video data to the
>live555 streamer?

Please read the FAQ!
-- 

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From henxy21 at gmail.com  Tue Jul 28 01:55:17 2009
From: henxy21 at gmail.com (huang)
Date: Tue, 28 Jul 2009 16:55:17 +0800
Subject: Re: [Live-devel] Streaming to mobile device using live555.
In-Reply-To: <001001ca0f4d$28a375a0$79ea60e0$@com>
References: <001001ca0f4d$28a375a0$79ea60e0$@com>
Message-ID: 

Dear Yogesh,

live555 can stream to a mobile device, but the mobile device has to support RTSP and the RTP payload formats provided by both live555 and your encoder. I have found that many video players say they support RTSP and many video codec formats, but that doesn't mean they support all of the RTP payload types for those codec formats.

huang

2009/7/28 Yogesh

> Hello Support,
>
> We are developing a streaming application in which we want to stream live
> video from a server to a mobile device (for example BlackBerry, iPhone)
> using the RTSP protocol and the live555 library. We are planning to use
> MPEG-4 video for streaming, as the mobile devices support the MPEG-4
> format. To test this, we streamed using the testMPEG4VideoStreamer
> example; we got the stream in VLC player on another desktop PC, but we
> are unable to get the stream on the BlackBerry mobile device simulator.
> Can we use this sample directly for streaming to a mobile device
> simulator, or is some modification required? We have an encoder which
> gives us MPEG-2 TS, MPEG-4, etc. encoded data from live video. How can
> we feed this live encoded video data to the live555 streamer?
>
> Can you provide some direction on this?
>
> Any help on this will be very helpful. Waiting for your reply.
>
> Thanks and Regards,
> Yogesh

From woods.biz at gmail.com  Tue Jul 28 02:13:35 2009
From: woods.biz at gmail.com (Woods)
Date: Tue, 28 Jul 2009 17:13:35 +0800
Subject: [Live-devel] MP Demux doubt (more details)
Message-ID: <3ba8ebc0907280213w47caf3c9tb4e3f4be30145f17@mail.gmail.com>

Dear all,

I am writing an MP4Demux and an MP4DemuxedElementaryStream, in the same way as MPEG1or2Demux and MPEG1or2DemuxedElementaryStream. Specifically, my MP4Demux uses FFMPEG to read AVPackets from an MP4 file, and feeds each packet->data to the MP4DemuxedElementaryStream.

I tested my code in the following ways, with different results.

Audio-based test:

(1) Only getAudioStream() from my MP4DemuxedElementaryStream (I set the presentation time stamp here according to the AVPacket's pts value) and feed the stream to a BasicUDPSink as its source. In this way, the streamed audio can be perfectly replayed in my VLC player.

(2) Still getAudioStream() from my MP4DemuxedElementaryStream, but this time feed it into MPEG2TransportStreamFromESSource + MPEG2TransportStreamFramer + BasicUDPSink. This time the program reads through the input mp4 file very fast, without the correct delays between audio frames. Of course, this stream is not playable in VLC. Why, even though (as I mentioned in test 1) the presentation time is set correctly?

Video-based test (this is a mess):

Whether I use BasicUDPSink alone or MPEG2TransportStreamFromESSource + MPEG2TransportStreamFramer + BasicUDPSink, the stream is not playable in VLC player. But when I only use BasicUDPSink, the delays between video frames are correct. And in both cases, the streamed video is recognized by VLC. I even recorded the video packets to be sent as a .m4e file, which is still not playable in VLC. So I want to know: are the extracted AVPackets the wanted video elementary-stream packets? Can they be used to feed liveMedia classes as I described?

I have tried this code for 2 weeks; no more ideas now. :-( I would appreciate it if someone could give me a suggestion on this! Thank you very much.

-- 
Woods

From Deepti.Saraswat at sdc.canon.co.in  Tue Jul 28 03:13:15 2009
From: Deepti.Saraswat at sdc.canon.co.in (Deepti.Saraswat at sdc.canon.co.in)
Date: Tue, 28 Jul 2009 15:43:15 +0530
Subject: [Live-devel] How to transfer a new RTP Payload type.
Message-ID: 

Hello Ross,

I think that LIVE555 serves only specific RTP payload types. I am creating a RTSPClient application using LIVE555, but this time I have to stream some metadata through RTP only, and this metadata has a payload format that is not registered inside LIVE555. What do I need to do to receive this new kind of RTP payload?
Regards,
Deepti

From vijay.rathi at gmail.com  Tue Jul 28 03:53:32 2009
From: vijay.rathi at gmail.com (Vijay Rathi)
Date: Tue, 28 Jul 2009 16:23:32 +0530
Subject: [Live-devel] Live 555 server and live streaming
Message-ID: 

Hi,

I want to use several network cameras as input sources (like rtsp://192.168.0.1/StreamingMount1) to the Live 555 server, with each camera getting a dynamic mount point (rtsp://livebox/camera1). The server should pull data from the network source (the camera streams out an MPEG4-format stream) and act as a reflector. I could use the Darwin injector, but I would prefer to use Live 555 as the streaming server!

Thanks in advance,
Vijay

From victor at dialog.su  Tue Jul 28 04:14:37 2009
From: victor at dialog.su (Victor V. Vinokurov)
Date: Tue, 28 Jul 2009 15:14:37 +0400
Subject: [Live-devel] Bug in windows version of the BasicTaskScheduler.cpp
Message-ID: <4A6EDD9D.5000001@dialog.su>

I use the live555 lib under Windows. I wrote this code to get an MPEG file's duration:

  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  BasicUsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);
  MPEG1or2FileServerDemux* demux
    = MPEG1or2FileServerDemux::createNew(*env, inputFileName, False);
  if (demux) return double(demux->fileDuration());

In earlier versions of live555 this all worked fine, but in the newest version the process terminates abnormally on the MPEG1or2FileServerDemux::createNew() line. I checked the source code and found a bug in BasicTaskScheduler.cpp. In this file, the lines from line 73:

  #if defined(__WIN32__) || defined(_WIN32)
      int err = WSAGetLastError();
      // For some unknown reason, select() in Windoze sometimes fails with WSAEINVAL if
      // it was called with no entries set in "readSet".  If this happens, ignore it:
      if (err == WSAEINVAL && readSet.fd_count == 0) {
        err = 0; // BUG at this line
        // To stop this from happening again, create a dummy readable socket:
        int dummySocketNum = socket(AF_INET, SOCK_DGRAM, 0);
        FD_SET((unsigned)dummySocketNum, &fReadSet);
      }
      if (err != EINTR) {
  #else

must look like this:

  #if defined(__WIN32__) || defined(_WIN32)
      int err = WSAGetLastError();
      // For some unknown reason, select() in Windoze sometimes fails with WSAEINVAL if
      // it was called with no entries set in "readSet".  If this happens, ignore it:
      if (err == WSAEINVAL && readSet.fd_count == 0) {
        err = EINTR; // err SHOULD be EINTR for the comparison further down
        // To stop this from happening again, create a dummy readable socket:
        int dummySocketNum = socket(AF_INET, SOCK_DGRAM, 0);
        FD_SET((unsigned)dummySocketNum, &fReadSet);
      }
      if (err != EINTR) {
  #else

Is this right?

-- 
See you!
---
Vityusha V. Vinokurov - programmer
mailto:victor at dialog.su
http://www.dialog.su
From finlayson at live555.com  Tue Jul 28 04:27:36 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Tue, 28 Jul 2009 13:27:36 +0200
Subject: Re: [Live-devel] How to transfer a new RTP Payload type.
Message-ID: 

>I am creating a RTSPClient application using LIVE555, but this time I
>have to stream some metadata through RTP only, and this metadata has a
>payload format that is not registered inside LIVE555. What do I need to
>do to receive this new kind of RTP payload?

You would need to define and implement two new classes:

1/ A new subclass of "MultiFramedRTPSink", for sending RTP packets in the new payload format.

2/ A new subclass of "MultiFramedRTPSource", for receiving RTP packets in the new payload format.
-- 

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From finlayson at live555.com  Tue Jul 28 04:57:18 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Tue, 28 Jul 2009 13:57:18 +0200
Subject: Re: [Live-devel] Bug in windows version of the BasicTaskScheduler.cpp
In-Reply-To: <4A6EDD9D.5000001@dialog.su>
References: <4A6EDD9D.5000001@dialog.su>
Message-ID: 

>I use the live555 lib under Windows.

I'm sorry :-)

>I checked the source code and found a bug in BasicTaskScheduler.cpp.
>In this file, the lines from line 73
[...]
>must look like this:
>
>  if (err == WSAEINVAL && readSet.fd_count == 0) {
>    err = EINTR; // err SHOULD be EINTR for the comparison further down
[...]
>Is this right?

Yes, that will work. This issue (which apparently has existed for several versions now) will be fixed in the next release of the software.
-- 

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From finlayson at live555.com  Tue Jul 28 05:00:48 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Tue, 28 Jul 2009 14:00:48 +0200
Subject: Re: [Live-devel] Live 555 server and live streaming
Message-ID: 

>I want to use several network cameras as input sources (like
>rtsp://192.168.0.1/StreamingMount1)
>to the Live 555 server, with each camera getting a dynamic mount point
>(rtsp://livebox/camera1). The server should pull data from the network
>source (the camera streams out an MPEG4-format stream) and act as a
>reflector.

What you're looking for is a 'RTSP proxy' - i.e., a RTSP server that also acts as a RTSP client. Unfortunately our software currently does not support this.
-- 

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From finlayson at live555.com  Tue Jul 28 05:48:26 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Tue, 28 Jul 2009 14:48:26 +0200
Subject: Re: [Live-devel] Trying to play mp3 file or live streaming to wmplayer client from default MediaServer code
Message-ID: 

>Can anyone provide information on how I can make Windows Media Player
>understand the DESCRIBE response of the default MediaServer of LIVE555?

Complain to Microsoft. The Windows Media Player is notoriously non-standards-compliant. Instead, I suggest using VLC as your media player.
-- 

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
From woods.biz at gmail.com  Tue Jul 28 06:25:25 2009
From: woods.biz at gmail.com (Woods)
Date: Tue, 28 Jul 2009 21:25:25 +0800
Subject: [Live-devel] Fwd: MP Demux doubt (more details) - follow up
Message-ID: <3ba8ebc0907280625s72993e01k95e4aa591550247@mail.gmail.com>

Hi Ross,

Regarding the previous email, I also used an MPEG4VideoStreamFramer to connect to my MP4DemuxedElementaryStream::getVideoStream(). I noticed that the MPEG4VideoStreamFramer does not output any packets to the final BasicUDPSink. Does this mean that the retrieved AVPacket->data is not a correct MPEG-4 video packet, so that it is filtered out by the MPEG4VideoStreamParser?

Thanks for your suggestions! Has anyone done FFMPEG + liveMedia work before? I would appreciate your advice on making this work!

Best Regards,
Woods

From devaureshy at gmail.com  Tue Jul 28 08:03:48 2009
From: devaureshy at gmail.com (Steve Jiekak)
Date: Tue, 28 Jul 2009 17:03:48 +0200
Subject: [Live-devel] level and parameters for H264VideoRTPSink constructor
Message-ID: <614230390907280803s1a7e2ce5q60ab24f14827fa37@mail.gmail.com>

I am encoding using the Baseline profile, level 1.3, and I want to know which hexadecimal integer matches this for the constructor of H264VideoRTPSink. Also, how do I determine the right values for sprop_parameter_sets_str?

Best regards,
Steve Jiekak
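[For reference, a sketch of how these two values are commonly derived. The "sps"/"pps" buffers and their sizes are placeholders for the parameter sets obtained from the encoder; base64Encode() is the helper declared in the library's "Base64.hh". For Baseline (profile_idc 66 = 0x42) at level 1.3 (level_idc 13 = 0x0D), the profile-level-id would have the form 0x42xx0D, where the middle byte carries the constraint flags and must be taken from your own SPS (0xE0 is common for Baseline):

  char* spsBase64 = base64Encode((char const*)sps, spsSize);
  char* ppsBase64 = base64Encode((char const*)pps, ppsSize);
  char* sprop = new char[strlen(spsBase64) + strlen(ppsBase64) + 2];
  sprintf(sprop, "%s,%s", spsBase64, ppsBase64);
  // ... pass "sprop" as the sprop_parameter_sets_str argument ...
]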
From kidjan at gmail.com  Tue Jul 28 08:52:10 2009
From: kidjan at gmail.com (Jeremy Noring)
Date: Tue, 28 Jul 2009 08:52:10 -0700
Subject: [Live-devel] Problem with fPresentationTime
Message-ID: 

I have an OnDemand RTSP server based around Live555 that streams out live H.264; I implemented a subsession and framer. On the client side, I have a receiver application based loosely around openRTSP, but with my own custom H264 MediaSink class.

My problem is fPresentationTime; the values I hand off to Live555 on the server side are not the values I get on the client side. Furthermore, the _first_ value I receive on the client side is always crazy, and the values thereafter appear to increase monotonically (although they still differ from what was supplied to the server). I think I'm misunderstanding something here.

In the server, my samples come to me timestamped with a REFERENCE_TIME, which is an __int64 value in 100-nanosecond units. So to convert from this to the timeval, I do the following:

  // Our samples are in 100 ns units. So to get to usec, we divide by 10.
  fDurationInMicroseconds = sample->m_duration / 10;
  REFERENCE_TIME sampleTimeInUsec = sample->m_time / 10;
  fPresentationTime.tv_sec = (long) (sampleTimeInUsec / 1000000);
  fPresentationTime.tv_usec = (long) (sampleTimeInUsec % 1000000);

...I am likely doing something wrong, but I don't understand why the first timestamp I get on the receiver is out in left field. Could it be that running on Windows has some effect on the time stamp? (Both server and client are running on Win32.)

Any help is much appreciated.

From kidjan at gmail.com  Tue Jul 28 09:00:25 2009
From: kidjan at gmail.com (Jeremy Noring)
Date: Tue, 28 Jul 2009 09:00:25 -0700
Subject: Re: [Live-devel] level and parameters for H264VideoRTPSink constructor
In-Reply-To: <614230390907280803s1a7e2ce5q60ab24f14827fa37@mail.gmail.com>
References: <614230390907280803s1a7e2ce5q60ab24f14827fa37@mail.gmail.com>
Message-ID: 

On Tue, Jul 28, 2009 at 8:03 AM, Steve Jiekak wrote:

> Also, how do I determine the right values for sprop_parameter_sets_str?

You'll get these values (SPS and PPS) from your H.264 encoder. You base64-encode them, and then delimit them with a comma. So something like:

  sprop-parameter-sets=<base64-encoded SPS>,<base64-encoded PPS>

You'll also get the profile info from your encoder.

From matt at schuckmannacres.com  Tue Jul 28 09:52:52 2009
From: matt at schuckmannacres.com (Matt Schuckmann)
Date: Tue, 28 Jul 2009 09:52:52 -0700
Subject: Re: [Live-devel] sync sample issue with flash players
In-Reply-To: 
References: <38f061a30907270840j4757a2b3k1caee0e4feeb7680@mail.gmail.com>
Message-ID: <4A6F2CE4.9010203@schuckmannacres.com>

I would never make any assumption about a key frame every 12 frames. I don't know exactly what these atoms are, but I regularly use FFMPEG and other encoders to create H.264 streams with very different key-frame intervals.

Matt S.

Ross Finlayson wrote:
> OK, I've now released a new version (2009.07.28) of the "LIVE555
> Streaming Media" software that includes this change.
>
> However, I changed your suggested code so that it adds the "stss" atom
> only for video streams. (For audio streams, such an atom doesn't
> appear to make sense, and would lead to an extremely large number of
> entries anyway.)
>
> Also, I'm rather nervous about the assumption that 'key frames' occur
> every 12 frames in each video stream. This might be true for
> ffmpeg-generated H.264 streams, but for many (if not most) other video
> streams that won't be the case, so I'm worried about what this might
> break. So if anyone runs into any problems with this change, please
> let us know.
From Deepti.Saraswat at sdc.canon.co.in  Tue Jul 28 22:44:36 2009
From: Deepti.Saraswat at sdc.canon.co.in (Deepti.Saraswat at sdc.canon.co.in)
Date: Wed, 29 Jul 2009 11:14:36 +0530
Subject: [Live-devel] How to transfer a new RTP Payload type.
Message-ID: 

Hello Ross,

My query is:

1. I have to transfer some metadata, in XML format, inside RTP packets from an RTP server to a client. For implementing the client-side functionality I have used LIVE555. As of now I am receiving JPEG image data using a class derived from MediaSink.

2. I peeped inside the LIVE555 source code and found that it processes only a limited set of payload types (JPEG, MPEG, H.264, etc.), but to transfer the metadata I will be creating a new kind of RTP payload, which liveMedia doesn't handle.

3. My query is: how can I receive that new type of metadata payload? Below is the method you sent me:

>>You would need to define and implement two new classes:
>>1/ A new subclass of "MultiFramedRTPSink", for sending RTP packets in
>>the new payload format.
>>2/ A new subclass of "MultiFramedRTPSource", for receiving RTP
>>packets in the new payload format.

But how will I ensure that all the RTP packets belonging to the new RTP type fall under the subclass of "MultiFramedRTPSource", while the rest of my video/audio packets fall under the subclass of MediaSink (which is running smoothly)?

Hope this time I am able to explain my problem clearly. Please reply.

Regards,
Deepti

From tm18174 at salle.url.edu  Wed Jul 29 01:08:11 2009
From: tm18174 at salle.url.edu (tm18174 at salle.url.edu)
Date: Wed, 29 Jul 2009 10:08:11 +0200 (CEST)
Subject: [Live-devel] H264VideoStreamFramer
Message-ID: <966efad23dc555ba761e1d84e04ad71a.squirrel@webmail.salle.url.edu>

Hi,

I have to create a H264VideoStreamFramer subclass for my H264 RTSP server. This subclass will be the one that provides frames (or in this case, NAL units) to the H264VideoRTPSink class. As I understand it, H264VideoStreamFramer is only a filter that makes me implement the method:

  virtual Boolean currentNALUnitEndsAccessUnit() = 0;

I also have to implement doGetNextFrame() from the FramedSource class. The thing is that I cannot see exactly what my doGetNextFrame() function has to do, because getNextFrame() calls doGetNextFrame() after initializing some parameters and checking some things. I guess I have to give the NAL units to the sink class, but how? (I have my own NAL-unit buffer class, which reads from an ffmpeg file and provides me NAL units.)

Thanks for your attention,
Daniel Collado
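[For context, a minimal sketch of a DeviceSource-style doGetNextFrame() for this situation. "fNALUnitBuffer->nextNALUnit()" is a hypothetical call that copies at most maxSize bytes of the next NAL unit (without a start code) into the given buffer and returns the NAL unit's full size, or 0 at end of stream:

  void MyH264Source::doGetNextFrame() {
    unsigned nalSize = fNALUnitBuffer->nextNALUnit(fTo, fMaxSize); // hypothetical
    if (nalSize == 0) { handleClosure(this); return; } // end of stream
    if (nalSize > fMaxSize) {
      fFrameSize = fMaxSize;
      fNumTruncatedBytes = nalSize - fMaxSize;
    } else {
      fFrameSize = nalSize;
      fNumTruncatedBytes = 0;
    }
    gettimeofday(&fPresentationTime, NULL);
    // Tell the downstream object (e.g., a H264VideoRTPSink) that data is
    // ready. If reading were asynchronous, this would instead be called
    // from a scheduled task once the data had arrived:
    FramedSource::afterGetting(this);
  }
]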
From finlayson at live555.com  Wed Jul 29 08:24:54 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 29 Jul 2009 17:24:54 +0200
Subject: [Live-devel] How to transfer a new RTP Payload type.
In-Reply-To:
References:
Message-ID:

>3. My query is: how can I receive that new type of metadata payload?
>Below is the method you sent me:
>
>>>You would need to define and implement two new classes:
>>>1/ A new subclass of "MultiFramedRTPSink", for sending RTP packets in
>>>the new payload format.
>>>2/ A new subclass of "MultiFramedRTPSource", for receiving RTP packets
>>>in the new payload format.

Sorry, I forgot to mention a 3rd thing that you need to do:

3/ Modify "MediaSubsession::initiate()" so that it recognizes the SDP
description for your payload type, and creates an appropriate "RTPSource"
subclass. For an example of how to do this, search for "H264" in
"liveMedia/MediaSession.cpp".
--
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
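As for keeping the new payload separate from the existing video/audio:
each RTP stream is a distinct "MediaSubsession" with its own UDP ports, so
the packets cannot mix. A sketch of the receiving side is below; the sink
classes MyMetadataSink and MyJPEGSink, the codec name "X-METADATA", and
subsessionAfterPlaying() are hypothetical placeholders, while the
iterator, initiate(), codecName(), mediumName(), and readSource() calls
are standard liveMedia API:

// Sketch: attach a different sink to each subsession after SDP parsing.
MediaSubsessionIterator iter(*session);
MediaSubsession* subsession;
while ((subsession = iter.next()) != NULL) {
  if (!subsession->initiate()) continue; // creates this stream's RTPSource

  if (strcmp(subsession->codecName(), "X-METADATA") == 0) {
    // hypothetical MediaSink subclass for the new metadata payload:
    subsession->sink = MyMetadataSink::createNew(env, *subsession);
  } else if (strcmp(subsession->mediumName(), "video") == 0) {
    subsession->sink = MyJPEGSink::createNew(env, *subsession); // existing sink
  }
  if (subsession->sink != NULL) {
    subsession->sink->startPlaying(*subsession->readSource(),
                                   subsessionAfterPlaying, subsession);
  }
}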
From finlayson at live555.com  Wed Jul 29 11:14:39 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 29 Jul 2009 20:14:39 +0200
Subject: [Live-devel] Problem with fPresentationTime
In-Reply-To:
References:
Message-ID:

>My problem is fPresentationTime; the values I hand off to Live555 on
>the server side are not the values I get on the client side.

The presentation times that you generate at the server end *must* be
aligned with 'wall clock' time - i.e., the times that you would generate
by calling "gettimeofday()".

> Furthermore, the _first_ value I receive on the client side is always
> crazy

This is normal, and expected. It's because the first few presentation
times - before RTCP synchronization occurs - are just 'guesses' made by
the receiving code (based on the receiver's 'wall clock' and the RTP
timestamp). However, once RTCP synchronization occurs, all subsequent
presentation times will be accurate (provided that the server generated
its presentation times properly; see above).

This means that a receiver should be prepared for the fact that the first
few presentation times (until RTCP synchronization starts) will not be
accurate. The code, however, can check this by calling
"RTPSource::hasBeenSynchronizedUsingRTCP()". If this returns False, then
the presentation times are not accurate, and should be treated with 'a
grain of salt'. However, once the call returns True, then the presentation
times (from then on) will be accurate.

>In the server, my samples come to me timestamped with a REFERENCE_TIME,
>which is an __int64 value with 100-nanosecond units. So to convert from
>this to the timeval, I do the following:
>
>    // Our samples are 100 NS units. So to get to usec, we divide by 10.
>    fDurationInMicroseconds = sample->m_duration / 10;
>    REFERENCE_TIME sampleTimeInUsec = sample->m_time / 10;
>    fPresentationTime.tv_sec = (long) (sampleTimeInUsec / 1000000);
>    fPresentationTime.tv_usec = (long) (sampleTimeInUsec % 1000000);
>
>...I am likely doing something wrong

Yes, you're not aligning these presentation times with 'wall clock' time -
i.e., the time that you would get by calling "gettimeofday()".
--
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From kidjan at gmail.com  Wed Jul 29 11:50:18 2009
From: kidjan at gmail.com (Jeremy Noring)
Date: Wed, 29 Jul 2009 11:50:18 -0700
Subject: [Live-devel] Problem with fPresentationTime
In-Reply-To:
References:
Message-ID:

Thank you for the clarification. One question about 'wall clock' time:

>> In the server, my samples come to me timestamped with a REFERENCE_TIME,
>> which is an __int64 value with 100-nanosecond units. So to convert from
>> this to the timeval, I do the following:
>>
>>     // Our samples are 100 NS units. So to get to usec, we divide by 10.
>>     fDurationInMicroseconds = sample->m_duration / 10;
>>     REFERENCE_TIME sampleTimeInUsec = sample->m_time / 10;
>>     fPresentationTime.tv_sec = (long) (sampleTimeInUsec / 1000000);
>>     fPresentationTime.tv_usec = (long) (sampleTimeInUsec % 1000000);
>>
>> ...I am likely doing something wrong
>
> Yes, you're not aligning these presentation times with 'wall clock'
> time - i.e., the time that you would get by calling "gettimeofday()".

So, the sample times I get from my encoder start at 0 and increase (i.e. a
relative timestamp); when you say 'wall clock' time, does this mean I
should be taking note of the absolute start time? My video source is live,
so the timestamps I get from the encoder are (somewhat) arbitrary.

From finlayson at live555.com  Wed Jul 29 12:26:05 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 29 Jul 2009 21:26:05 +0200
Subject: [Live-devel] Problem with fPresentationTime
In-Reply-To:
References:
Message-ID:

>>Yes, you're not aligning these presentation times with 'wall clock'
>>time - i.e., the time that you would get by calling "gettimeofday()".
>
>So, the sample times I get from my encoder start at 0 and increase (i.e.
>a relative timestamp); when you say 'wall clock' time, does this mean I
>should be taking note of the absolute start time?

Yes. Note the start 'wall clock' time (by calling "gettimeofday()" when
you get your first (0) sample time), and then add it to the sample time
each time you set "fPresentationTime".
--
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
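A minimal sketch of that alignment, under the poster's stated assumptions
(100-ns REFERENCE_TIME samples whose first value is 0); the function and
class names are hypothetical, and only fPresentationTime comes from
live555's FramedSource:

// Sketch: align encoder-relative sample times with 'wall clock' time.
void MyFramedSource::setPresentationTimeFrom(REFERENCE_TIME sampleTime100ns) {
  static struct timeval sessionStart = {0, 0};
  if (sessionStart.tv_sec == 0 && sessionStart.tv_usec == 0) {
    gettimeofday(&sessionStart, NULL); // note wall-clock time at the first sample
  }

  u_int64_t sampleUsec = sampleTime100ns / 10; // 100-ns units -> microseconds
  u_int64_t t = (u_int64_t)sessionStart.tv_sec * 1000000
              + sessionStart.tv_usec + sampleUsec;
  fPresentationTime.tv_sec  = (long)(t / 1000000);
  fPresentationTime.tv_usec = (long)(t % 1000000);
}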
From udi_kl at hotmail.com  Wed Jul 29 14:27:43 2009
From: udi_kl at hotmail.com (udi kleers)
Date: Wed, 29 Jul 2009 17:27:43 -0400
Subject: [Live-devel] support for mp4
Message-ID:

Is there any trick to make the Live555 server broadcast mp4? Would
appreciate any response.

Thanks,
Udi

From andrew_umlauf at yahoo.com  Thu Jul 30 07:43:36 2009
From: andrew_umlauf at yahoo.com (Andrew Umlauf)
Date: Thu, 30 Jul 2009 07:43:36 -0700 (PDT)
Subject: [Live-devel] UTP to RTP
Message-ID: <445866.80634.qm@web82007.mail.mud.yahoo.com>

I am trying to configure an app to take a UDP source and stream it via
RTP. However, I cannot get it to work. When I trace through the code, I
find that on line 155 of MPEGVideoStreamFramer.cpp, acquiredFrameSize is
never greater than 0. The following is my code:

-Andrew

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"
#include "GroupsockHelper.hh"

MediaSource* videoSource;
void afterPlaying(void*);

int main() {
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  // Create our sink variables...
  char const* destinationAddrStr = "147.159.167.150";
  const unsigned char ttl = 7;
  const Port rtpPort(8888);
  struct in_addr destinationAddress;
  destinationAddress.s_addr = our_inet_addr(destinationAddrStr);
  Groupsock rtpGroupsock(*env, destinationAddress, rtpPort, ttl);

  // Create our sink...
  RTPSink* videoSink = MPEG1or2VideoRTPSink::createNew(*env, &rtpGroupsock);
  if (videoSink == NULL) {
    *env << "Unable to create sink \n";
    exit(1);
  }

  // Create our source variables...
  char const* srcAddrStr = "10.0.0.10";
  struct in_addr srcAddress;
  srcAddress.s_addr = our_inet_addr(srcAddrStr);
  const Port udpPort(1234);
  Groupsock udpGroupsock(*env, srcAddress, udpPort, ttl);

  // Create our source...
  FramedSource* input = BasicUDPSource::createNew(*env, &udpGroupsock);
  if (input == NULL) {
    *env << "Unable to open source \n";
    exit(1);
  }

  // Create our framer...
  videoSource = MPEG1or2VideoStreamFramer::createNew(*env, input);

  // Start to stream the data...
  videoSink->startPlaying(*videoSource, afterPlaying, videoSink);

  env->taskScheduler().doEventLoop();
  return 0;
}

void afterPlaying(void*) {
  Medium::close(videoSource);
}
From knaaait at gmail.com  Thu Jul 30 07:35:29 2009
From: knaaait at gmail.com (Kip)
Date: Thu, 30 Jul 2009 16:35:29 +0200
Subject: [Live-devel] Looking for ARQ developer.
In-Reply-To:
References:
Message-ID:

Hello,

Not sure if this is the right place to ask: I'm looking for someone who
has implemented some kind of retransmission for RTP packets that could get
lost (ARQ?). Is there anyone on here who has implemented something like
this already, or who is willing to take a (paid) programming job to do
such a thing? Please respond off list.

Regards,
Kip.

From finlayson at live555.com  Thu Jul 30 08:02:58 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Thu, 30 Jul 2009 17:02:58 +0200
Subject: [Live-devel] UTP to RTP
In-Reply-To: <445866.80634.qm@web82007.mail.mud.yahoo.com>
References: <445866.80634.qm@web82007.mail.mud.yahoo.com>
Message-ID:

>I am trying to configure an app to take a UDP source and stream it via
>RTP. However, I cannot get it to work. When I trace through the code, I
>find that on line 155 of MPEGVideoStreamFramer.cpp, acquiredFrameSize is
>never greater than 0.

Because your input source is a sequence of discrete frames, rather than a
byte stream, you should use "MPEG1or2VideoStreamDiscreteFramer", instead
of "MPEG1or2VideoStreamFramer".
--
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From fabrizio.agnoli at gmail.com  Thu Jul 30 09:41:38 2009
From: fabrizio.agnoli at gmail.com (Fabrizio Agnoli)
Date: Thu, 30 Jul 2009 18:41:38 +0200
Subject: [Live-devel] H263 live encoder source
Message-ID: <3a8482a10907300941r69ee5b12ic50ad4e85ac3ca80@mail.gmail.com>

Hi all,

I have an application which gets images from a webcam, then live-encodes
them. I need to live-stream this to a mobile device via a live555 server.
How can I manage this input stream (based on socket communication between
the two apps) in live555? Could BasicUDPSource be the class which reads
from the buffer? May I use H263plusVideoStreamFramer as the framer for the
H263 stream?

Thanks for the help,
Fabrizio

From finlayson at live555.com  Thu Jul 30 10:41:06 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Thu, 30 Jul 2009 19:41:06 +0200
Subject: [Live-devel] H263 live encoder source
In-Reply-To: <3a8482a10907300941r69ee5b12ic50ad4e85ac3ca80@mail.gmail.com>
References: <3a8482a10907300941r69ee5b12ic50ad4e85ac3ca80@mail.gmail.com>
Message-ID:

>I have an application which gets images from a webcam, then live-encodes
>them. I need to live-stream this to a mobile device via a live555 server.
>How can I manage this input stream (based on socket communication between
>the two apps) in live555? Could BasicUDPSource be the class which reads
>from the buffer?

It could, if your input is UDP packets. However, it'd be far better to
have your input source be just an unstructured byte stream (i.e., a device
file, a pipe, or a TCP connection). Then you could use
"H263plusVideoStreamFramer".

> May I use H263plusVideoStreamFramer as the framer for the H263 stream?

If (and only if) your input is a byte stream. If, instead, it's a sequence
of discrete frames, then you would need a new class
"H263plusVideoStreamDiscreteFramer" (which you would need to write)
instead.
--
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
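For illustration, a minimal sketch of the byte-stream approach Ross
suggests, using a named pipe (FIFO) that the encoder application writes
into. The FIFO path, multicast address, port, and payload format value are
hypothetical; the classes used are standard liveMedia ones:

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"
#include "GroupsockHelper.hh"

// Sketch (not a complete server): stream H.263+ read from a named pipe.
int main() {
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  struct in_addr destAddr;
  destAddr.s_addr = our_inet_addr("239.255.42.42"); // hypothetical address
  Groupsock rtpGroupsock(*env, destAddr, Port(8888), 255);
  RTPSink* videoSink = H263plusVideoRTPSink::createNew(*env, &rtpGroupsock, 96);

  // The encoder app writes its raw H.263+ byte stream into this FIFO:
  FramedSource* byteStream =
    ByteStreamFileSource::createNew(*env, "/tmp/h263.fifo");

  // The framer parses the byte stream into discrete frames for the sink:
  H263plusVideoStreamFramer* framer =
    H263plusVideoStreamFramer::createNew(*env, byteStream);

  videoSink->startPlaying(*framer, NULL, NULL);
  env->taskScheduler().doEventLoop();
  return 0;
}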
From andrew_umlauf at yahoo.com  Thu Jul 30 11:42:43 2009
From: andrew_umlauf at yahoo.com (Andrew Umlauf)
Date: Thu, 30 Jul 2009 11:42:43 -0700 (PDT)
Subject: [Live-devel] UTP to RTP
Message-ID: <83669.1192.qm@web82007.mail.mud.yahoo.com>

I appreciate the quick response. I am still having problems though. I made
the change and I am not getting errors. I have tried with multiple MPEG-2
files. Here is the command-line output:

    .......
    Warning: MPEG1or2VideoRTPSink::doSpecialFrameHandling saw strange
    first 4 bytes 4700441B, but we're not a fragment
    Warning: MPEG1or2VideoRTPSink::doSpecialFrameHandling saw strange
    first 4 bytes 47404412, but we're not a fragment
    Warning: MPEG1or2VideoRTPSink::doSpecialFrameHandling saw strange
    first 4 bytes 47004419, but we're not a fragment

I do not get any video output except for an initial frame. This is the
code:

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"
#include "GroupsockHelper.hh"

FramedSource* videoSource;
void afterPlaying(void*);

int main() {
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  // Create our sink variables...
  char const* destinationAddrStr = "147.159.167.150";
  const unsigned char ttl = 1;
  const Port rtpPort(8888);
  struct in_addr destinationAddress;
  destinationAddress.s_addr = our_inet_addr(destinationAddrStr);
  Groupsock rtpGroupsock(*env, destinationAddress, rtpPort, ttl);

  // Create our sink...
  RTPSink* videoSink = MPEG1or2VideoRTPSink::createNew(*env, &rtpGroupsock);
  if (videoSink == NULL) {
    *env << "Unable to create sink \n";
    exit(1);
  }

  // Create our source variables...
  char const* srcAddrStr = "10.0.0.10";
  struct in_addr srcAddress;
  srcAddress.s_addr = our_inet_addr(srcAddrStr);
  const Port udpPort(1234);
  Groupsock udpGroupsock(*env, srcAddress, udpPort, ttl);

  // Create our source...
  FramedSource* input = BasicUDPSource::createNew(*env, &udpGroupsock);
  if (input == NULL) {
    *env << "Unable to open source \n";
    exit(1);
  }

  // Create our framer...
  // videoSource = MPEG1or2VideoStreamFramer::createNew(*env, input);
  videoSource = MPEG1or2VideoStreamDiscreteFramer::createNew(*env, input);

  // Start to stream the data...
  videoSink->startPlaying(*videoSource, afterPlaying, NULL);

  env->taskScheduler().doEventLoop();
  return 0;
}

void afterPlaying(void*) {
  Medium::close(videoSource);
}

-Andrew

P.S. I meant to say UDP, not UTP, in the subject heading :-)

--- On Thu, 7/30/09, Ross Finlayson wrote:

> From: Ross Finlayson
> Subject: Re: [Live-devel] UTP to RTP
> To: "LIVE555 Streaming Media - development & use"
> Date: Thursday, July 30, 2009, 11:02 AM
>
> > I am trying to configure an app to take a UDP source and stream it
> > via RTP. However, I cannot get it to work. When I trace through the
> > code, I find that on line 155 of MPEGVideoStreamFramer.cpp,
> > acquiredFrameSize is never greater than 0.
>
> Because your input source is a sequence of discrete frames, rather
> than a byte stream, you should use
> "MPEG1or2VideoStreamDiscreteFramer", instead of
> "MPEG1or2VideoStreamFramer".
> --
> Ross Finlayson
> Live Networks, Inc.
> http://www.live555.com/

From dliu.cn at gmail.com  Thu Jul 30 22:04:30 2009
From: dliu.cn at gmail.com (Dong Liu)
Date: Fri, 31 Jul 2009 01:04:30 -0400
Subject: [Live-devel] multiple instances of live555 library
Message-ID: <4A727B5E.5080608@gmail.com>

Hi,

I'm developing an H264 RTP source filter for the Microsoft DirectShow
platform. The DirectShow filter is basically a shared library which is
loaded dynamically. I create a BasicTaskScheduler, a BasicUsageEnvironment,
and a MediaSession when the filter is loaded. If there is only one
instance of the filter, everything works. But if I have two filters
loaded, both videos have a lot of distortion; it seems to me the two RTP
streams somehow got mixed, but they are on totally different UDP ports. Is
there anything I have to be aware of to use multiple instances of the
live555 library in a single process?

Thanks,
Dong

From woods.biz at gmail.com  Fri Jul 31 01:10:12 2009
From: woods.biz at gmail.com (Woods)
Date: Fri, 31 Jul 2009 16:10:12 +0800
Subject: [Live-devel] Can LiveMedia consume Flash video
Message-ID: <3ba8ebc0907310110j2c691ea3ne1f6442b20e8b090@mail.gmail.com>

Hi,

I wish to know whether the liveMedia library can consume a Flash Video
source (through HTTP or an FLV file) and send it as MPEG-1 or 2 to a UDP
sink?

Thank you!
--
Woods

From finlayson at live555.com  Fri Jul 31 01:39:53 2009
From: finlayson at live555.com (Ross Finlayson)
Date: Fri, 31 Jul 2009 10:39:53 +0200
Subject: [Live-devel] multiple instances of live555 library
In-Reply-To: <4A727B5E.5080608@gmail.com>
References: <4A727B5E.5080608@gmail.com>
Message-ID:

>Is there anything I have to be aware of to use multiple instances of the
>live555 library in a single process?

You don't do this. Please everybody, read the FAQ! The "LIVE555 Streaming
Media" software uses an event loop (a single thread of control) for
concurrency, instead of using multiple threads.
--
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
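For illustration, a minimal sketch of that single-event-loop model: one
TaskScheduler/UsageEnvironment created once per process and shared by
every session, with the library's standard watch-variable idiom for ending
the loop. The startAllStreams() entry point, and where the individual
sessions come from, are hypothetical:

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"

// One scheduler/environment per *process*, shared by all sessions:
static TaskScheduler* scheduler = NULL;
static UsageEnvironment* env = NULL;
static char eventLoopWatchVariable = 0;

void startAllStreams() { // hypothetical entry point
  scheduler = BasicTaskScheduler::createNew();
  env = BasicUsageEnvironment::createNew(*scheduler);

  // ... set up *all* RTP sessions here, each with its own ports,
  // but all using the same "env" ...

  env->taskScheduler().doEventLoop(&eventLoopWatchVariable);
  // setting eventLoopWatchVariable to a non-zero value ends the loop
}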
From sushil.chaudhri at gmail.com  Fri Jul 31 04:06:35 2009
From: sushil.chaudhri at gmail.com (Sushil Chaudhari)
Date: Fri, 31 Jul 2009 16:36:35 +0530
Subject: [Live-devel] Regarding the Full HD movie(1920x1080) ts file Streaming
In-Reply-To:
References:
Message-ID:

Hi All,

We are using liveMedia to stream full-HD MPEG-2 Transport Stream files on
the Windows platform, but we are seeing packet loss on the streamer side
whenever network usage drops. We have also observed that when packets are
lost, or network usage drops on the receiver side, the video is distorted.
We have increased the OS buffer to 20000 for HD support, as per the FAQ,
but there is still packet loss from the streamer.

I would like to mention that this problem occurs only with high-resolution
(1920x1080) videos. Also, the behaviour is random within the same video
stream, so we cannot say that there is a problem with the video itself. We
also see this video distortion over a direct peer-to-peer connection
between the streamer and receiver.

Any help or pointers regarding improving liveMedia for HD streaming would
be a great help for me. Thanks in advance.

Regards
Sushil