From yinlijie2011 at 163.com Mon Aug 1 05:19:11 2011 From: yinlijie2011 at 163.com (yinlijie2011) Date: Mon, 1 Aug 2011 20:19:11 +0800 (CST) Subject: [Live-devel] problem about create file Message-ID: <806d8e.6cb9.131854895bb.Coremail.yinlijie2011@163.com> Dear, I use the live555 library in my program, which receives streaming media from an RTSP server and saves it to a new MP4 file every minute. As time goes by, more and more MP4 files are created. I know that QuickTimeFileSink::createNew() can create an MP4 file object, but I don't know which function frees that object once it has received enough media. In openRTSP's source, Medium::close() is used to free the MP4 file object. I tried this function, but my program can't receive streaming media after using it. Why? My steps are: first, send a PAUSE command; second, use Medium::close() to free the MP4 file object and create a new one; finally, send a PLAY command. Thank you! Yin Lijie -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Aug 2 07:59:22 2011 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 2 Aug 2011 10:59:22 -0400 Subject: [Live-devel] problem about create file In-Reply-To: <806d8e.6cb9.131854895bb.Coremail.yinlijie2011@163.com> References: <806d8e.6cb9.131854895bb.Coremail.yinlijie2011@163.com> Message-ID: <9F622341-6BDE-4912-AA8D-4D4074389059@live555.com> On Aug 1, 2011, at 8:19 AM, yinlijie2011 wrote: > I use the live555 library in my program, which receives streaming media from an RTSP server and saves it to a new MP4 file every minute. As time goes by, more and more MP4 files are created. > I know QuickTimeFileSink::createNew() can create an MP4 file object, but I don't know which function frees that object once it has received enough media. In openRTSP's source, Medium::close() is used to free the MP4 file object. I tried this function, but my program can't receive streaming media after using it. Why?
> My steps are: first, send a PAUSE command; second, use Medium::close() to free the MP4 file object and create a new one; finally, send a PLAY command. This should work (I think): 1/ Send the RTSP "PAUSE" command (using "RTSPClient::sendPauseCommand()") 2/ Call Medium::close(pointer-to-your-MP4-file-object) 3/ Create a new MP4-file-object 4/ Call "startPlaying()" on your new MP4-file-object 5/ Send the RTSP "PLAY" command (using "RTSPClient::sendPlayCommand()") Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From matt at schuckmannacres.com Tue Aug 2 15:45:37 2011 From: matt at schuckmannacres.com (Matt Schuckmannn) Date: Tue, 02 Aug 2011 15:45:37 -0700 Subject: [Live-devel] Confused about how to generate fmtp line for H.264 source for SDP Message-ID: <4E387E11.10604@schuckmannacres.com> I'm working on upgrading our use of the Live555 RTSP server code to the latest version of the library; our old version was at least a couple of years old. In the new code it appears that the default behavior is to obtain the SPS, PPS, etc. from the H.264 fragmenter (see auxSDPLine() for H264VideoRTPSink); however, the fragmenter is not created until continuePlaying() is called, which is way too late to generate the fmtp line for the SDP. So I'm confused about how this is all supposed to work. I'm not sure if I should override auxSDPLine() in my class derived from H264VideoRTPSink and format the fmtp line by hand, or if I should try creating the fragmenter when I construct my H.264 RTPSink class, or? Please advise. Thanks, Matt S.
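[Editor's note: the five-step PAUSE/close/recreate/PLAY sequence from the reply earlier in this thread can be sketched in code. This is only a hedged sketch: the actual live555 calls appear in the comments, the printed step tags are stand-ins so the sequence compiles on its own, and the function name rotateOutputFile is invented for illustration.]

```cpp
#include <string>
#include <vector>

// Returns the per-minute rotation steps in order. Each tag stands in for
// the real live555 call named in the comment beside it.
std::vector<std::string> rotateOutputFile() {
    std::vector<std::string> steps;
    steps.push_back("PAUSE");         // 1/ rtspClient->sendPauseCommand(afterPause)
    steps.push_back("close");         // 2/ Medium::close(qtFileSink)  -- frees the old MP4 file object
    steps.push_back("createNew");     // 3/ qtFileSink = QuickTimeFileSink::createNew(env, session, "next.mp4", ...)
    steps.push_back("startPlaying");  // 4/ qtFileSink->startPlaying(afterPlaying, NULL)
    steps.push_back("PLAY");          // 5/ rtspClient->sendPlayCommand(session, afterPlay)
    return steps;
}
```

The point of the ordering is that the new sink must exist and be playing (steps 3 and 4) before the server resumes sending data (step 5), otherwise incoming frames have nowhere to go.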
From finlayson at live555.com Wed Aug 3 06:14:13 2011 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 3 Aug 2011 09:14:13 -0400 Subject: [Live-devel] Confused about how to generate fmtp line for H.264 source for SDP In-Reply-To: <4E387E11.10604@schuckmannacres.com> References: <4E387E11.10604@schuckmannacres.com> Message-ID: <022F0059-F199-4798-9AA3-4EACD8A2DD3F@live555.com> On Aug 2, 2011, at 6:45 PM, Matt Schuckmannn wrote: > I'm working on upgrading our use of the Live555 RTSP server code to the latest version of the library; our old version was at least a couple of years old. Good heavens; there have been *many* improvements and bug fixes since then! > In the new code it appears that the default behavior is to obtain the sps, pps, etc from the h.264 fragmenter Yes. Now, the SPS and PPS NAL units are assumed to be in the input NAL unit stream (and are extracted from there). That means that if we're streaming a H.264 stream 'on demand' (e.g., from a unicast RTSP server), we have to do a little trick (hack) to get this information for use in the stream's SDP description, before we start delivering to the first client. Basically, we have to 'stream' the input source to a dummy sink, until we see the data that we need. The place to do this is in your subclass of "ServerMediaSubsession" for H.264 video. Specifically, you reimplement the "getAuxSDPLine()" virtual function. For a model of how to do this, see our implementation of "H264VideoFileServerMediaSubsession". You will presumably do something similar, except with your own subclass. (Of course, as always, you will also implement the "createNewStreamSource()" and "createNewRTPSink()" virtual functions.) > I'm not sure if I should override the auxSDPLine() in my class derived from H264VideoRTPSink No, you shouldn't need to change (or reimplement) that code. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed...
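[Editor's note: the 'dummy sink' trick described in this reply can be compressed into a sketch, modeled loosely on H264VideoFileServerMediaSubsession::getAuxSDPLine(). This is a hedged illustration: the RTP sink here is a stub whose aux SDP line becomes known after a few 'frames', standing in for the fragmenter seeing the SPS/PPS NAL units in the input stream, and the real code polls from the event loop (checkForAuxSDPLine) rather than busy-looping.]

```cpp
#include <string>

// Stand-in for H264VideoRTPSink: auxSDPLine() returns the a=fmtp line once
// SPS/PPS have been seen in the input stream, otherwise NULL.
struct StubH264RTPSink {
    int framesSeen = 0;
    const char* auxSDPLine() {
        ++framesSeen;
        return framesSeen >= 3
            ? "a=fmtp:96 packetization-mode=1;sprop-parameter-sets=..."
            : nullptr;
    }
};

// Stand-in for the subsession's getAuxSDPLine(): keep 'streaming' into a
// dummy sink until the line is known, then hand it to the SDP generator.
std::string getAuxSDPLineFor(StubH264RTPSink& sink) {
    const char* line = nullptr;
    while (line == nullptr) line = sink.auxSDPLine();
    return line;
}
```

The design point is the same as in the real code: the SDP description is needed before the first client plays, so the source is briefly driven just far enough to observe the parameter sets.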
URL: From matt at schuckmannacres.com Wed Aug 3 10:04:53 2011 From: matt at schuckmannacres.com (Matt Schuckmannn) Date: Wed, 03 Aug 2011 10:04:53 -0700 Subject: [Live-devel] Confused about how to generate fmtp line for H.264 source for SDP In-Reply-To: <022F0059-F199-4798-9AA3-4EACD8A2DD3F@live555.com> References: <4E387E11.10604@schuckmannacres.com> <022F0059-F199-4798-9AA3-4EACD8A2DD3F@live555.com> Message-ID: <4E397FB5.4090703@schuckmannacres.com> On Wednesday, August 03, 2011 6:14:13 AM, Ross Finlayson wrote: > On Aug 2, 2011, at 6:45 PM, Matt Schuckmannn wrote: > >> I'm working on upgrading our use of the Live555 RTSP server code to the >> latest version of the library; our old version was at least a couple >> of years old. > > Good heavens; there have been *many* improvements and bug fixes since then! Yes well, when we first adopted live555 we needed RTSP over TCP to work, with commands like SET_PARAMETER sent during the stream, so we did our own fix for that, which made it difficult to keep up with your changes, and until recently we didn't have time to convert over to your new code. > Yes. Now, the SPS and PPS NAL units are assumed to be in the input NAL > unit stream (and are extracted from there). Is that a safe assumption? Isn't it optional to include the SPS and PPS NAL units in the stream, or at the very least to make them very infrequent? > > That means that if we're streaming a H.264 stream 'on demand' (e.g., > from a unicast RTSP server), we have to do a little trick (hack) to get > this information for use in the stream's SDP description, before we > start delivering to the first client. Basically, we have to 'stream' the > input source to a dummy sink, until we see the data that we need. > > The place to do this is in your subclass of "ServerMediaSubsession" for > H.264 video. Specifically, you reimplement the "getAuxSDPLine()" virtual > function.
> Hmm, I'll look into it, but my encoder gives me the sps and pps when I initialize it, it seems like it would be easier to just hand the sps and pps to the rtp sink and just re-implement auxSDPLine in my class, pretty much like I used to do. Is there a reason you're recommending this approach? > For a model of how to do this, see our implementation of > "H264VideoFileServerMediaSubsession". You will presumably do something > similar, except with your own subclass. (Of course, as always, you will > also implement the "createNewStreamSource()" and "createNewRTPSink()" > virtual functions.) Thanks, Matt S. From warren at etr-usa.com Wed Aug 3 16:28:55 2011 From: warren at etr-usa.com (Warren Young) Date: Wed, 03 Aug 2011 17:28:55 -0600 Subject: [Live-devel] Trick play based pause interacts badly with Enseo STBs Message-ID: <4E39D9B7.2020301@etr-usa.com> If you use live555MediaServer to stream MPEG-2 in a TS to an Enseo HD2000 STB, pause and resume works fine as long as you haven't used the indexer to build .tsx files, or if you disable trick play handling in MPEG2TransportFileServerMediaSubsession::pauseStream():

> void MPEG2TransportFileServerMediaSubsession
> ::pauseStream(unsigned clientSessionId, void* streamToken) {
>   if (fIndexFile != NULL) { // we support 'trick play'
>     ClientTrickPlayState* client = lookupClient(clientSessionId);
>     if (client != NULL) {
>       client->updateStateOnPlayChange(False);
>     }
>   }
>
>   // Call the original, default version of this routine:
>   OnDemandServerMediaSubsession::pauseStream(clientSessionId, streamToken);
> }

That is to say, the STB will resume a paused RTSP stream for an indexed MPEG-2 TS correctly if you remove everything but the last line, causing the server to behave as if the TS file doesn't have an index purely for the purposes of pause handling. With the .tsx file and problem code present, resuming causes the displayed video to look like digital satellite TV when there's snow on the dish.
Jerky playback, occasional pauses, macroblocks decoding in the wrong place/time...ugly stuff. Additionally, it looks like the STB is giving up and trying to restart the stream at some point. While debugging this, we noticed that VLC doesn't send RTSP PAUSE when you press its Pause button. We assume it's working like a DVR in this instance, just buffering the streamed data to use when you unpause. I mention this because that's the only other RTSP client we have on hand. We lack another client that does send RTSP PAUSE, so as far as we can see, there's nothing wrong with disabling trick play support for pause. We assume there *is* some bad consequence, since the code probably wasn't written for no reason. Which client does this hack break? We have a solution to our problem, hacky though it is, but we'd be happier if the server just did the right thing with Enseo STBs out of the box, so we're willing to help you look into this, Ross. We can send you one of these STBs, plus tools and information that will help you debug the issue. I've attached a transcript of the RTSP conversation between an Enseo HD2000 and live555MediaServer. You see it start the stream, then pause and attempt to resume it. One of the things you'll notice in the transcript is that the STB sends a GET_PARAMETER request for "position" on pause. I couldn't find any documentation online saying what the client is supposed to get in response, only the RTSP RFC saying this is an open extension mechanism. We don't know which RTSP server implements this parameter (Kasenna or SeaChange, probably), but we assume the purpose is to let the STB tell the server where to resume from. We considered trying to modify the server to send a real reply for this parameter by looking at the trick play index, but then stumbled across our current hacky fix. -------------- next part -------------- An embedded and charset-unspecified text was scrubbed...
Name: enseo-play-pause-resume.txt URL: From finlayson at live555.com Wed Aug 3 18:51:37 2011 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 3 Aug 2011 21:51:37 -0400 Subject: [Live-devel] Confused about how to generate fmtp line for H.264 source for SDP In-Reply-To: <4E397FB5.4090703@schuckmannacres.com> References: <4E387E11.10604@schuckmannacres.com> <022F0059-F199-4798-9AA3-4EACD8A2DD3F@live555.com> <4E397FB5.4090703@schuckmannacres.com> Message-ID: <62005C6C-8C02-435E-95D4-730CA0EAA19C@live555.com> >> Yes. Now, the SPS and PPS NAL units are assumed to be in the input NAL unit stream (and are extracted from there). > > Is that a safe assumption? Isn't it optional to include the SPS and PPS NAL units in the stream, or at the very least to make them very infrequent? Yes, it's uncommon for a supplier of a H.264 stream to know the SPS and PPS NAL units, but for those NAL units not to appear in the stream. Most commonly, these NAL units appear in the stream, but you don't know what they are - and can find them out only by scanning the stream. We now do that for you. > Hmm, I'll look into it, but my encoder gives me the sps and pps when I initialize it, it seems like it would be easier to just hand the sps and pps to the rtp sink and just re-implement auxSDPLine in my class, pretty much like I used to do. Is there a reason you're recommending this approach? Yes, the reason was that it would probably save you work. If the SPS and PPS NAL units also appear in the stream (which, for most encoders, they will), then you don't have to do any extra work (except perhaps duplicate some code that we already have in "H264VideoFileServerMediaSubsession"). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed...
URL: From finlayson at live555.com Wed Aug 3 21:12:10 2011 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 4 Aug 2011 00:12:10 -0400 Subject: [Live-devel] Trick play based pause interacts badly with Enseo STBs In-Reply-To: <4E39D9B7.2020301@etr-usa.com> References: <4E39D9B7.2020301@etr-usa.com> Message-ID: > We assume there *is* some bad consequence, since the code probably wasn't written for no reason. Which client does this hack break? The intention of the code is to make sure that the server's state (specifically, its record of where in the stream it is) is accurate, so that a subsequent RTSP "PLAY" command will cause the stream to get resumed from the correct place. This code was written specifically for the Amino STB (which was the first client we used that did 'trick play' operations). > we're willing to help you look into this, Ross. We can send you one of these STBs, plus tools and information that will help you debug the issue. Sure. You can find the mailing address on the front page of our web site: http://www.live555.com/ > One of the things you'll notice in the transcript is that the STB sends a GET_PARAMETER request for "position" on pause. I couldn't find any documentation online saying what the client is supposed to get in response This hasn't been standardized at all. If your STB is relying on this, then it's broken. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From aamer.786pk at gmail.com Thu Aug 4 03:28:13 2011 From: aamer.786pk at gmail.com (Aamer Sattar) Date: Thu, 4 Aug 2011 12:28:13 +0200 Subject: [Live-devel] Current playing position in live555MediaServer Message-ID: Hi, I am working on the "live555MediaServer" class. Now, to make our own media server, we can reuse MPEG1or2DemuxedServerMediaSubsession, which is a subclass of OnDemandServerMediaSubsession.
Now my question is this: the function float MPEG1or2DemuxedServerMediaSubsession::duration() const { return fOurDemux.fileDuration(); } gives us the duration of any media file, in seconds (including the fractional part). How can we get the current playing (seeking) position, inside a MPEG1or2DemuxedServerMediaSubsession subclass, while streaming a media file (e.g. .mpg) using the live555MediaServer program? Regards, AAMER -------------- next part -------------- An HTML attachment was scrubbed... URL: From matt at schuckmannacres.com Thu Aug 4 09:19:22 2011 From: matt at schuckmannacres.com (Matt Schuckmannn) Date: Thu, 04 Aug 2011 09:19:22 -0700 Subject: [Live-devel] Confused about how to generate fmtp line for H.264 source for SDP In-Reply-To: <62005C6C-8C02-435E-95D4-730CA0EAA19C@live555.com> References: <4E387E11.10604@schuckmannacres.com> <022F0059-F199-4798-9AA3-4EACD8A2DD3F@live555.com> <4E397FB5.4090703@schuckmannacres.com> <62005C6C-8C02-435E-95D4-730CA0EAA19C@live555.com> Message-ID: <4E3AC68A.1090000@schuckmannacres.com> Ok, thanks for the input. Matt S. On Wednesday, August 03, 2011 6:51:37 PM, Ross Finlayson wrote: >>> Yes. Now, the SPS and PPS NAL units are assumed to be in the input >>> NAL unit stream (and are extracted from there). >> >> Is that a safe assumption? Isn't it optional to include the SPS and PPS >> NAL units in the stream, or at the very least to make them very infrequent? > > Yes, it's uncommon for a supplier of a H.264 stream to know the SPS and > PPS NAL units, but for those NAL units not to appear in the stream. Most > commonly, these NAL units appear in the stream, but you don't know what > they are - and can find them out only by scanning the stream. We now do > that for you.
> > >> Hmm, I'll look into it, but my encoder gives me the sps and pps when I >> initialize it, it seems like it would be easier to just hand the sps >> and pps to the rtp sink and just re-implement auxSDPLine in my class, >> pretty much like I used to do. Is there a reason you're recommending >> this approach > > Yes, the reason was that it would probably save you work. If the SPS and > PPS NAL units also appear in the stream (which, for most encoders, they > will), then you don't have to do any extra work (except perhaps > duplicate some code that we already have in > "H264VideoFileServerMediaSubsession"). > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel From kidjan at gmail.com Thu Aug 4 08:19:38 2011 From: kidjan at gmail.com (Jeremy Noring) Date: Thu, 4 Aug 2011 08:19:38 -0700 Subject: [Live-devel] Confused about how to generate fmtp line for H.264 source for SDP In-Reply-To: <62005C6C-8C02-435E-95D4-730CA0EAA19C@live555.com> References: <4E387E11.10604@schuckmannacres.com> <022F0059-F199-4798-9AA3-4EACD8A2DD3F@live555.com> <4E397FB5.4090703@schuckmannacres.com> <62005C6C-8C02-435E-95D4-730CA0EAA19C@live555.com> Message-ID: On Wed, Aug 3, 2011 at 6:51 PM, Ross Finlayson wrote: > Yes, it's uncommon for a supplier of a H.264 stream to know the SPS and PPS > NAL units, but for those NAL units not to appear in the stream. Most > commonly, these NAL units appear in the stream, but you don't know what they > are - and can find them out only by scanning the stream. We now do that for > you. > I don't think this is actually that common--most encoders go out of their way to provide this data for you, and to get things running quickly it can be very important to have SPS/PPS on hand.
Most container formats, such as MP4, make this data readily available (you can put it in-stream, but it's bad form, IMO). > > Hmm, I'll look into it, but my encoder gives me the sps and pps when I > initialize it, it seems like it would be easier to just hand the sps and pps > to the rtp sink and just re-implement auxSDPLine in my class, pretty much > like I used to do. Is there a reason you're recommending this approach? > > > Yes, the reason was that it would probably save you work. If the SPS and > PPS NAL units also appear in the stream (which, for most encoders, they > will), then you don't have to do any extra work (except perhaps duplicate > some code that we already have in "H264VideoFileServerMediaSubsession"). > FWIW, I ran into the same issue when I updated my Live555 version. It's hard to write an on-demand session because there's no clean way of handing off the SPS/PPS info before things are up and running. -------------- next part -------------- An HTML attachment was scrubbed... URL: From pranav.tipnis at einfochips.com Mon Aug 8 05:53:59 2011 From: pranav.tipnis at einfochips.com (Pranav Tipnis) Date: Mon, 08 Aug 2011 18:23:59 +0530 Subject: [Live-devel] C application using Live555 library Message-ID: <4E3FDC67.4030000@einfochips.com> Hello, I am very new to the Live555 library. Studying a bit on Live555, I found that the library is written in C++. I have not yet studied the code of the library or the test programs. My question is: Is it possible to call Live555 library APIs in an application written in C? OR Is it necessary to write the application in C++ only? I have a C program which captures video frames from a camera. I want to stream those frames using Live555.
Regards, Pranav From imaldonado at semex.com.mx Mon Aug 8 14:50:26 2011 From: imaldonado at semex.com.mx (Ivan Maldonado Zambrano) Date: Mon, 08 Aug 2011 16:50:26 -0500 Subject: [Live-devel] new Socket connection Message-ID: <1312840226.2566.14.camel@ivan.semex> Hi all, I'm trying to integrate the live555MediaServer app into a Qt Linux app, and I have a doubt about the live555MediaServer app. When rtsp://192.168.100.228/ is played from VLC, the live555MediaServer app goes into the DynamicRTSPServer.cpp file and enters the ServerMediaSession* DynamicRTSPServer::lookupServerMediaSession(char const* streamName) function. - Who is calling the DynamicRTSPServer::lookupServerMediaSession function? I suppose that somewhere there is a function called newSocketConnection that executes it. - Can somebody help me understand what happens at the moment rtsp://192.168.100.228/ is played? Regards and thanks in advance, Iván Maza From finlayson at live555.com Mon Aug 8 15:39:07 2011 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 8 Aug 2011 15:39:07 -0700 Subject: [Live-devel] C application using Live555 library In-Reply-To: <4E3FDC67.4030000@einfochips.com> References: <4E3FDC67.4030000@einfochips.com> Message-ID: <454131A1-E775-43C2-9ACB-4DEC3045FD71@live555.com> > Is it possible to call Live555 library APIs in an application written in C? Yes, but you would first need to write a set of functions that act as an interface between your (C) application code and the (C++) LIVE555 library. These functions would have C linkage (i.e., they would be declared using 'extern "C"'), but they would be implemented in C++. In other words, you can do this, but you still need to be familiar with C++ programming and the C++ class model (so that you'll know how to write the interface functions). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed...
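[Editor's note: the C-linkage interface layer described in this reply can be sketched as follows. The wrapper names (stream_open, stream_send_frame, stream_close) are invented for illustration; in a real program the Streamer class would drive live555 objects rather than just counting bytes.]

```cpp
#include <cstddef>

// Stands in for the C++ side (in practice, live555 objects and an event loop).
class Streamer {
public:
    unsigned long bytesSent = 0;
    void sendFrame(const unsigned char* /*data*/, std::size_t size) { bytesSent += size; }
};

extern "C" {
    // These functions have C linkage, so a plain-C capture program can
    // declare and call them; they are implemented in C++ and compiled with
    // the C++ compiler, exactly as the reply above describes.
    void* stream_open() { return new Streamer; }
    void  stream_send_frame(void* h, const unsigned char* d, std::size_t n) {
        static_cast<Streamer*>(h)->sendFrame(d, n);
    }
    unsigned long stream_bytes_sent(void* h) { return static_cast<Streamer*>(h)->bytesSent; }
    void  stream_close(void* h) { delete static_cast<Streamer*>(h); }
}
```

The opaque void* handle is the usual way to hide a C++ object behind a C API: the C side never sees the class definition, only the handle and the functions that operate on it.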
URL: From finlayson at live555.com Mon Aug 8 18:22:27 2011 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 8 Aug 2011 18:22:27 -0700 Subject: [Live-devel] new Socket connection In-Reply-To: <1312840226.2566.14.camel@ivan.semex> References: <1312840226.2566.14.camel@ivan.semex> Message-ID: > - Who is calling DynamicRTSPServer::lookupServerMediaSession FUNCTION? I > suppose that in some place there is a function called > newSocketConnection that executes this function. Yes, and you can find it by running grep lookupServerMediaSession on the code. (It's in "RTSPServer", of which "DynamicRTSPServer" is a subclass.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From rpardridge at integrity-apps.com Mon Aug 8 19:45:49 2011 From: rpardridge at integrity-apps.com (Pardridge, Robert) Date: Tue, 9 Aug 2011 02:45:49 +0000 Subject: [Live-devel] Streaming video with metadata Message-ID: I want to stream video via RTSP, where the container has 2 streams: one is video and the other is data. I can stream a video-only file just fine with live555, but if there is a data stream I get the message "Saw unexpected code", which I traced back to MPEG1or2VideoStreamFramer.cpp. I was wondering how I could modify the live555 library to send the data stream as well as the video stream. Thanks! Robert -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Aug 8 22:21:59 2011 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 8 Aug 2011 22:21:59 -0700 Subject: [Live-devel] Streaming video with metadata In-Reply-To: References: Message-ID: On Aug 8, 2011, at 7:45 PM, Pardridge, Robert wrote: > I want to stream video via RTSP, where the container has 2 streams: one is video and the other is data. I can stream a video-only file just fine with live555, but if there is a data stream I get the message "Saw unexpected code"
which I traced back to MPEG1or2VideoStreamFramer.cpp > > I was wondering how I could modify the live555 library to send the data stream as well as the video stream. There are a couple of issues here: 1/ How is the metadata stored within the MPEG Program Stream file? If it's stored in PES packets - with a well-defined stream_id for 'metadata' - then it could probably be extracted in the same way that audio and video is extracted, by modifying the "MPEG1or2Demux" code. If, however, the metadata is stored within the video stream (which is suggested by the "Saw unexpected code" error), then extracting it would be harder. 2/ (The big issue) What RTP payload format would you use to stream this metadata? There probably isn't one defined, which means that you'd need to define your own. But then you'd have the problem of no standard receiver being able to understand the metadata stream. In any case, it would be useful to see an example file that contains 'metadata'. Please put an example on a web server, and send a link to it, so we can download and test it for ourselves. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From sonntex at gmail.com Mon Aug 8 23:43:06 2011 From: sonntex at gmail.com (Sonntex) Date: Tue, 9 Aug 2011 10:43:06 +0400 Subject: [Live-devel] Fwd: Jpeg video rtp source bug In-Reply-To: References: Message-ID: Hello! Sorry for my bad English. The JPEG video RTP source works fine for JPEGs with two quantization tables, but I am trying to play a motion-JPEG video with only one quantization table using VLC, and there are some bugs in the JPEG video RTP source: the JPEG header size is evaluated incorrectly, and the index of the quantization table is evaluated incorrectly for the third component. Attached are an example JPEG file with only one quantization table, and a patch file for the JPEG video RTP source that fixes this. -------------- next part -------------- An HTML attachment was scrubbed...
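[Editor's note: the first point in the metadata reply above - a well-defined stream_id that a demultiplexer can key on - can be illustrated with a toy scanner. This is only a sketch: it reports the byte after each 00 00 01 PES start code; stream_id 0xE0 is a video stream and 0xFC is the 'metadata stream' id in MPEG-2 systems (0xBD, private_stream_1, is another common carrier). A real demux, such as live555's MPEG1or2Demux, also parses packet lengths and headers.]

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Returns the stream_id byte following each 00 00 01 start-code prefix.
std::vector<uint8_t> pesStreamIds(const std::vector<uint8_t>& ps) {
    std::vector<uint8_t> ids;
    for (std::size_t i = 0; i + 3 < ps.size(); ++i)
        if (ps[i] == 0x00 && ps[i + 1] == 0x00 && ps[i + 2] == 0x01)
            ids.push_back(ps[i + 3]);
    return ids;
}
```

A demux extended along these lines would route packets whose stream_id marks them as metadata to a separate output, parallel to the existing audio and video outputs.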
URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: 1.jpg Type: image/jpeg Size: 12658 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: JPEGVideoRTPSource.cpp.patch Type: text/x-patch Size: 310 bytes Desc: not available URL: From finlayson at live555.com Mon Aug 8 23:49:25 2011 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 8 Aug 2011 23:49:25 -0700 Subject: [Live-devel] Fwd: Jpeg video rtp source bug In-Reply-To: References: Message-ID: <415C379F-C703-4411-A7E1-F7C9533EB29E@live555.com> Thanks. This change will be included in the next release of the software. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From itay at action-item.co.il Tue Aug 9 00:33:29 2011 From: itay at action-item.co.il (Itay Levy) Date: Tue, 9 Aug 2011 10:33:29 +0300 Subject: [Live-devel] Getting FPS from rtsp stream In-Reply-To: References: Message-ID: Hi, I'm using live555 on an Android device (using VLC). Do you happen to know how I can extract the current FPS and bitrate from the stream? Do you support it? Thanks, Itay -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Aug 9 00:48:22 2011 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 9 Aug 2011 00:48:22 -0700 Subject: [Live-devel] Getting FPS from rtsp stream In-Reply-To: References: Message-ID: > I'm using live555 on an Android device (using VLC). > Do you happen to know how I can extract the current FPS and bitrate from the stream? > > Do you support it? No, not really. This information (FPS and bitrate) is usually deduced by the decoding software from the input data that it receives. It (especially FPS) is not usually supplied to the decoding software by the streaming software. We are not responsible for the decoding software in VLC.
Therefore, you should ask a VLC mailing list about this. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From johannes.ebersold at dfki.de Tue Aug 9 08:03:34 2011 From: johannes.ebersold at dfki.de (Johannes Ebersold) Date: Tue, 09 Aug 2011 17:03:34 +0200 Subject: [Live-devel] Socket of RTSP-Server still blocked, when restarting the application In-Reply-To: <4E2D81CD.1090902@dfki.de> References: <4E2D81CD.1090902@dfki.de> Message-ID: <4E414C46.1050203@dfki.de> Hi, I am facing a problem with the binding of the RTSP server's port. The port used by the server is not freed correctly, although the application ends correctly and is not killed using ctrl-c, as described here: http://www.mail-archive.com/live-devel at lists.live555.com/msg06837.html When I want to restart the application immediately, the port is still in use / blocked / open / whatever, and the application throws an exception when creating the RTSP server. However, this is not 100% reproducible, but it happens most of the time. The application streams H.264 video inside an MPEG-2 transport stream via unicast. I initialize the server as pictured in testOnDemandRTSPServer, and after the event loop returns I remove the ServerMediaSession from the server and then call RTSPServer::close(rtsp_server);. Is there anything else I have to do? I expected the port to be correctly freed by calling close on the rtsp_server. Thanks for advice, hints and help. Regards, Johannes From matt at schuckmannacres.com Tue Aug 9 11:02:10 2011 From: matt at schuckmannacres.com (Matt Schuckmannn) Date: Tue, 09 Aug 2011 11:02:10 -0700 Subject: [Live-devel] Streaming video with metadata In-Reply-To: References: Message-ID: <4E417622.4000308@schuckmannacres.com> An HTML attachment was scrubbed...
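[Editor's note: returning to the FPS question earlier in this digest - one way a receiver can estimate frame rate itself is from the 90 kHz RTP timestamp delta between successive video frames. This helper is an illustration of that arithmetic, not a live555 API; live555 surfaces the timestamps as presentation times.]

```cpp
#include <cstdint>

// RTP video clocks run at 90000 Hz, so the timestamp difference between two
// successive frames is the frame period in 90 kHz ticks. Unsigned
// subtraction also handles timestamp wraparound correctly.
double estimateFps(uint32_t prevRtpTs, uint32_t curRtpTs) {
    uint32_t delta = curRtpTs - prevRtpTs;
    return delta == 0 ? 0.0 : 90000.0 / delta;
}
```

In practice one would average this over many frames, since per-frame deltas jitter; bitrate can be estimated the same way from received byte counts over wall-clock time.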
URL: From finlayson at live555.com Tue Aug 9 13:14:10 2011 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 9 Aug 2011 13:14:10 -0700 Subject: [Live-devel] Socket of RTSP-Server still blocked, when restarting the application In-Reply-To: <4E414C46.1050203@dfki.de> References: <4E2D81CD.1090902@dfki.de> <4E414C46.1050203@dfki.de> Message-ID: <6E703C95-B07F-4191-B655-A6ED4320D31C@live555.com> On Aug 9, 2011, at 8:03 AM, Johannes Ebersold wrote: > I am facing a problem with the binding of the RTSP server's port. The port used by the server is not freed correctly, although the application ends correctly and is not killed using ctrl-c, as described here: http://www.mail-archive.com/live-devel at lists.live555.com/msg06837.html Actually, if you read the follow-up messages in that thread, you'll see that the port (and socket) used by the server *is* freed correctly. However, the TCP protocol implementation (in the operating system) deliberately waits a few seconds afterwards before it can mark the TCP connection as being completely closed (and thus before the port can be reused by another server application, or by rerunning the same server application). For more information, see http://www.unixguide.net/network/socketfaq/2.7.shtml http://developerweb.net/viewtopic.php?id=2941 and http://www.ssfnet.org/Exchange/tcp/tcpTutorialNotes.html in particular: "The main thing to recognize about connection teardown is that a connection in the TIME_WAIT state cannot move to the CLOSED state until it has waited for two times the maximum amount of time an IP datagram might live in the Internet. The reason for this is that while the local side of the connection has sent an ACK in response to the other side's FIN segment, it does not know that the ACK was successfully delivered. As a consequence this other side might retransmit its FIN segment, and this second FIN segment might be delayed in the network.
If the connection were allowed to move directly to the CLOSED state, then another pair of application processes might come along and open the same connection, and the delayed FIN segment from the earlier incarnation of the connection would immediately initiate the termination of the later incarnation of that connection." There *is* a potential hack (involving SO_LINGER, as noted in the original thread) that could eliminate this delay. However, it's non-portable, and introduces the (albeit remote) possibility of incorrect behavior. So, I won't be adding this to the code. If you close a server, just be patient (and wait a few seconds) before trying to restart it again. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL:
From finlayson at live555.com Wed Aug 10 01:35:19 2011 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 10 Aug 2011 01:35:19 -0700 Subject: [Live-devel] The last slice or frame is not sent out by the media server In-Reply-To: References: Message-ID: <74030B06-7901-4FFC-8BD2-844AF51D70C3@live555.com> On Jul 25, 2011, at 4:30 AM, Tim wrote: > Hi, > when I use live555mediaServer as my rtsp server, I found that it never sends > the last frame (mpeg4) or slice (h264). I looked into the source code; it seems that > when it comes to the last frame, the parser can't find the next start code, so > the server can't handle the last frame's data correctly. I'm really not familiar > with the live555 library, so can anybody tell me some useful information to > sort out this bug? I would really appreciate it, thank you. Thanks for the note. You found a bug in the code. It will be fixed in the next release of the software (probably within a week). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed...
URL:
From ken at starseedsoft.com Wed Aug 10 09:32:06 2011 From: ken at starseedsoft.com (ken at starseedsoft.com) Date: Wed, 10 Aug 2011 09:32:06 -0700 (PDT) Subject: [Live-devel] MP4 file creation In-Reply-To: <6130FBDE-7357-4769-AA95-C64EDB4A4768@live555.com> Message-ID: <1312993926.46155.YahooMailClassic@web1201.biz.mail.gq1.yahoo.com> Is it really this simple to generate an MP4 file? It is simply a serialized h264 stream? I don't need some other library in order to create an MP4 file? Thanks... /Ken --- On Fri, 7/29/11, Ross Finlayson wrote: From: Ross Finlayson Subject: Re: [Live-devel] Problem about openRTSP receive MP4 To: "LIVE555 Streaming Media - development & use" Date: Friday, July 29, 2011, 11:49 PM Use the command "openRTSP -4 rtsp://... > test.mp4" to receive the media stream. At the end, QuickTime can't play the mp4 file; why? Unfortunately, I don't know. However, it's very important that you specify the correct video frame rate, width and height - using the -f -w and -h options - see http://www.live555.com/openRTSP/#quicktime You might also try playing your file using VLC: http://www.videolan.org/vlc (VLC can often play files that QuickTime Player cannot.) PS: My English is so bad, don't laugh at me. Your English is good (and much better than my Chinese :-) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -----Inline Attachment Follows----- _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed...
URL:
From finlayson at live555.com Wed Aug 10 16:13:01 2011 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 10 Aug 2011 16:13:01 -0700 Subject: [Live-devel] MP4 file creation In-Reply-To: <1312993926.46155.YahooMailClassic@web1201.biz.mail.gq1.yahoo.com> References: <1312993926.46155.YahooMailClassic@web1201.biz.mail.gq1.yahoo.com> Message-ID: <5243DD84-F6B0-4475-993C-B71C685B7C79@live555.com> On Aug 10, 2011, at 9:32 AM, ken at starseedsoft.com wrote: > Is it really this simple to generate an MP4 file? It is simply a serialized h264 stream? No it's not. A "serialized H.264 stream" is a '.264' or '.h264' file. A '.mp4' file has a completely different format (though it often contains H.264 video). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL:
From ken at starseedsoft.com Wed Aug 10 20:37:06 2011 From: ken at starseedsoft.com (ken at starseedsoft.com) Date: Wed, 10 Aug 2011 20:37:06 -0700 (PDT) Subject: [Live-devel] MP4 file creation In-Reply-To: <5243DD84-F6B0-4475-993C-B71C685B7C79@live555.com> Message-ID: <1313033826.19285.YahooMailClassic@web1206.biz.mail.gq1.yahoo.com> Arg... is FFMPEG/LibAV the best library to write an MP4 file? Thx /K --- On Wed, 8/10/11, Ross Finlayson wrote: From: Ross Finlayson Subject: Re: [Live-devel] MP4 file creation To: "LIVE555 Streaming Media - development & use" Date: Wednesday, August 10, 2011, 6:13 PM On Aug 10, 2011, at 9:32 AM, ken at starseedsoft.com wrote: Is it really this simple to generate an MP4 file? It is simply a serialized h264 stream? No it's not. A "serialized H.264 stream" is a '.264' or '.h264' file. A '.mp4' file has a completely different format (though it often contains H.264 video). Ross Finlayson Live Networks, Inc.
http://www.live555.com/ -----Inline Attachment Follows----- _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From kidjan at gmail.com Wed Aug 10 22:33:32 2011 From: kidjan at gmail.com (Jeremy Noring) Date: Wed, 10 Aug 2011 22:33:32 -0700 Subject: [Live-devel] MP4 file creation In-Reply-To: <1313033826.19285.YahooMailClassic@web1206.biz.mail.gq1.yahoo.com> References: <5243DD84-F6B0-4475-993C-B71C685B7C79@live555.com> <1313033826.19285.YahooMailClassic@web1206.biz.mail.gq1.yahoo.com> Message-ID: Try this: http://code.google.com/p/mp4v2/ On Wed, Aug 10, 2011 at 8:37 PM, wrote: > Arg...is FFMPEG/LibAV the best library to write an MP4 file? > Thx > /K > > > --- On *Wed, 8/10/11, Ross Finlayson * wrote: > > > From: Ross Finlayson > Subject: Re: [Live-devel] MP4 file creation > > To: "LIVE555 Streaming Media - development & use" < > live-devel at ns.live555.com> > Date: Wednesday, August 10, 2011, 6:13 PM > > > > On Aug 10, 2011, at 9:32 AM, ken at starseedsoft.comwrote: > > Is it really this simple to generate an MP4 file? It is simply a > serialized h264 stream? > > > > No it's not. A "serialized H.264 stream" is a '.264' or '.h264' file. A > '.mp4' file has a completely different format (though it often contains > H.264 video). > > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > -----Inline Attachment Follows----- > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From mcmordie at gmail.com Thu Aug 11 08:08:21 2011 From: mcmordie at gmail.com (Dave McMordie) Date: Thu, 11 Aug 2011 11:08:21 -0400 Subject: [Live-devel] Sending periodic metadata XML fragments-- should I use SimpleRTPSource or modify MediaSession? Message-ID: Hi Ross / Live Developers, Great library-- it was a pleasure to integrate! We are transmitting a sequence of uncorrelated image frames using a subclass of JPEGVideoSource. We would also like to transmit a stream of XML fragments (one-to-one with the video frames) which contain information about each frame. What I have understood so far from previous threads is that I can create a custom payload format using subclasses of MultiFramedSource / MultiFramedSync. I would then modify the MediaSession::initiate method to add explicit support for this payload format. On the server side, I also create a new MediaSubsession containing the source/sink and add it to my MediaSession. It seems like modifying the liveMedia source to add our application-specific format would be a little inelegant. Is this really the best option, or do you have existing payload formats which could be subclassed to provide a container for a generic data stream? For example, would it be possible to use subclasses of SimpleRTPSource/Sink to send the raw stream? If the former option is preferable, is there a way to do this without modifying liveMedia, or should I just jump right in? Thanks again for your hard work and for making this code available. Best, Dave McMordie -------------- next part -------------- An HTML attachment was scrubbed... URL: From jeclark2006 at aim.com Thu Aug 11 10:06:41 2011 From: jeclark2006 at aim.com (John Clark) Date: Thu, 11 Aug 2011 10:06:41 -0700 Subject: [Live-devel] Frame -- as used in the live555 code, is this a 'video' frame? Message-ID: <3A0BA209-20E8-4D14-97AC-0CB81C476353@aim.com> I am developing a package to use FEC code on video streams. 
Previously I was just applying the FEC code to any packet that was coming from a camera. However, in looking at the live555 code, it seems that, for example, an RTP stream is received and then video data is extracted in terms of 'frames'. It would in fact be beneficial to apply FEC on a 'frame' basis rather than just arbitrary packets. The question I have is: are these codec frames such as I-frames, P-frames, and the like? If so, what part of the code detects what type of frame, if there is any frame type detection? While this would probably be clear with some amount of study of the code, if someone could give a brief overview, and pointers to specific code chunks, I'd appreciate it. Thanks, John Clark.
From finlayson at live555.com Thu Aug 11 21:33:47 2011 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 11 Aug 2011 21:33:47 -0700 Subject: [Live-devel] Sending periodic metadata XML fragments-- should I use SimpleRTPSource or modify MediaSession? In-Reply-To: References: Message-ID: On Aug 11, 2011, at 8:08 AM, Dave McMordie wrote: > Great library-- it was a pleasure to integrate! We are transmitting a sequence of uncorrelated image frames using a subclass of JPEGVideoSource. > > We would also like to transmit a stream of XML fragments (one-to-one with the video frames) which contain information about each frame. > Do "we" not have our own domain name? :-) Rather than using a separate RTP stream for this per-frame XML data (which would require a separate RTP payload format, and a lot of extra code for both the server and receiver(s)), is it possible to store this per-frame data somewhere within the JPEG frame itself? I.e., does the JPEG format allow for application-specific 'metadata' somewhere within the JPEG frame? If so, then this would be (by far) the best solution. (Of course, you really shouldn't be streaming JPEG at all; it's a terrible codec for video streaming.)
> What I have understood so far from previous threads is that I can create a custom payload format using subclasses of MultiFramedSource / MultiFramedSync. I would then modify the MediaSession::initiate method to add explicit support for this payload format. On the server side, I also create a new MediaSubsession containing the source/sink and add it to my MediaSession. "MediaSession" and "MediaSubsession" are used only by receivers. Servers (transmitters) use "ServerMediaSession" and (subclasses of) "ServerMediaSubsession". But you're correct. Unfortunately, to support a new RTP payload format in receivers, you need to modify the "MediaSession::initiate()" code. (Someday I hope to fix this, because it's better if developers don't have to modify the supplied code.) Anyway, there isn't really a suitable RTP payload format for your application (except, perhaps, the payload type "text/t140", defined by RFC 4103 "RTP Payload for Text Conversation"). So you'd need to define your own. But, as I noted above, the best solution would be to (somehow) store your per-frame XML data within each JPEG frame. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Aug 11 21:44:10 2011 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 11 Aug 2011 21:44:10 -0700 Subject: [Live-devel] Frame -- as used in the live555 code, is this a 'video' frame? In-Reply-To: <3A0BA209-20E8-4D14-97AC-0CB81C476353@aim.com> References: <3A0BA209-20E8-4D14-97AC-0CB81C476353@aim.com> Message-ID: <9814EC74-D8E7-49CB-8D7A-FA26E06CE2B5@live555.com> The IETF has defined new payload formats (and encoding) for doing FEC on RTP streams. However, we don't yet implement any of them (although we might in the future). But implementing these requires a lot more work than just slapping some FEC encoding and decoding code onto the existing code. Sorry. Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL:
From thomas at familie-gries.de Fri Aug 12 13:43:18 2011 From: thomas at familie-gries.de (Thomas Gries) Date: Fri, 12 Aug 2011 22:43:18 +0200 Subject: [Live-devel] H264 To web page Message-ID: <4E459066.7010303@familie-gries.de> Hello together, perhaps somebody can point me in the right direction. I have an embedded device with an ARM-based processor running Linux. This device has a built-in camera, giving me raw H264 data. I want to stream this live data to a web page. This web page must display the video on all platforms (Windows, Mac, Linux), so I want to use standard decoders. I am able to stream the H264 to VLC, no problem. But to a web page? Any solution / hint would be very helpful. Regards Thomas
From finlayson at live555.com Fri Aug 12 17:41:24 2011 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 12 Aug 2011 17:41:24 -0700 Subject: [Live-devel] H264 To web page In-Reply-To: <4E459066.7010303@familie-gries.de> References: <4E459066.7010303@familie-gries.de> Message-ID: > I have an embedded device with an ARM-based processor running Linux. > This device has > a built-in camera, giving me raw H264 data. I want to > stream this live data to a web page. This web page must display the video on all > platforms (Windows, Mac, Linux), so I want to use standard decoders. It's not currently possible to do this with our software (unless your web browsers use an RTSP/RTP 'plugin' of some sort). Sorry. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed...
URL: From finlayson at live555.com Sat Aug 13 02:02:11 2011 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 13 Aug 2011 02:02:11 -0700 Subject: [Live-devel] [PATCH] Allow streaming from subdirectories In-Reply-To: <1311777869-13206-1-git-send-email-mike@mikebwilliams.com> References: <1311777869-13206-1-git-send-email-mike@mikebwilliams.com> Message-ID: FYI, this is now supported in the latest version (2011.08.13) of the "LIVE555 Streaming Media" code (and in the installed "LIVE555 Media Server" application binaries). Thanks again for the suggestion. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From matthias.traub at bt-websolutions.at Tue Aug 16 02:55:05 2011 From: matthias.traub at bt-websolutions.at (matthias.traub at bt-websolutions.at) Date: Tue, 16 Aug 2011 11:55:05 +0200 (CEST) Subject: [Live-devel] =?iso-8859-1?q?Streaming_single_QImages?= Message-ID: <20110816095505.33BAD5D412E6@517.bces.de> Hi I?m trying to stream single Images. My application receives iplimages and converts these to QImages for further manipulation. After that I simply want to stream these images. I?m not very familiar with video encoding or stream etc. If tried to implement a DeviceSource like in one example that if found in the web but I still not receive any images when I?m trying to watch the stream. It would be really nice if you could help me with that problem. With kind regards Matthias /// StreamingServer.cpp /// author: Traub Matthias #include "StreamingServer.h" #include StreamingServer::StreamingServer() : menv(NULL), minputFileName("test.m4v"), mvideoSource(NULL), mvideoSink(NULL), mfileSource(NULL) { } StreamingServer::~StreamingServer() { } void StreamingServer::run() { std::cout << "SERVER STARTET AS THREAD!" 
<< std::endl; // Begin by setting up our usage environment: TaskScheduler* scheduler = BasicTaskScheduler::createNew(); menv = BasicUsageEnvironment::createNew(*scheduler); // Create 'groupsocks' for RTP and RTCP: struct in_addr destinationAddress; destinationAddress.s_addr = chooseRandomIPv4SSMAddress(*menv); const unsigned short rtpPortNum = 18888; const unsigned short rtcpPortNum = rtpPortNum+1; const unsigned char ttl = 255; const Port rtpPort(rtpPortNum); const Port rtcpPort(rtcpPortNum); Groupsock rtpGroupsock(*menv, destinationAddress, rtpPort, ttl); rtpGroupsock.multicastSendOnly(); // we're a SSM source Groupsock rtcpGroupsock(*menv, destinationAddress, rtcpPort, ttl); rtcpGroupsock.multicastSendOnly(); // we're a SSM source // Create a 'MPEG-4 Video RTP' sink from the RTP 'groupsock': mvideoSink = MPEG4ESVideoRTPSink::createNew(*menv, &rtpGroupsock, 96); // Create (and start) a 'RTCP instance' for this RTP sink: const unsigned estimatedSessionBandwidth = 500; // in kbps; for RTCP b/w share const unsigned maxCNAMElen = 100; unsigned char CNAME[maxCNAMElen+1]; gethostname((char*)CNAME, maxCNAMElen); CNAME[maxCNAMElen] = '\0'; // just in case RTCPInstance* rtcp = RTCPInstance::createNew(*menv, &rtcpGroupsock, estimatedSessionBandwidth, CNAME, mvideoSink, NULL /* we're a server */, True /* we're a SSM source */); // Note: This starts RTCP running automatically RTSPServer* rtspServer = RTSPServer::createNew(*menv, 8554); if (rtspServer == NULL) { *menv << "Failed to create RTSP server: " << menv->getResultMsg() << "\n"; exit(1); } ServerMediaSession* sms = ServerMediaSession::createNew(*menv, "testStream", minputFileName, "Session streamed by \"testMPEG4VideoStreamer\"", True /*SSM*/); sms->addSubsession(PassiveServerMediaSubsession::createNew(*mvideoSink, rtcp)); rtspServer->addServerMediaSession(sms); char* url = rtspServer->rtspURL(sms); *menv << "Play this stream using the URL \"" << url << "\"\n"; delete[] url; // Start the streaming: *menv << "Beginning 
streaming...\n"; // Open the input source DeviceParameters params; mfileSource = DeviceSourceImage::createNew(*menv, params); // Save Source in member Variable //this->mfileSource = fileSource; if (mfileSource == NULL) { *menv << "Unable to open source\n"; exit(1); } play(); //FramedSource* videoES = mfileSource; //// Create a framer for the Video Elementary Stream: //mvideoSource = MPEG4VideoStreamFramer::createNew(*menv, videoES); //// Finally, start playing: //*menv << "Beginning to read from file...\n"; //mvideoSink->startPlaying(*mvideoSource, NULL, mvideoSink); menv->taskScheduler().doEventLoop(); // does not return } void StreamingServer::play() { FramedSource* videoES = mfileSource; // Create a framer for the Video Elementary Stream: mvideoSource = MPEG4VideoStreamFramer::createNew(*menv, videoES); // Finally, start playing: *menv << "Beginning to read from file...\n"; mvideoSink->startPlaying(*mvideoSource, NULL, mvideoSink); } void StreamingServer::receiveFrame(QImage img/*IplImage *image*/) { // FWD to DeviceSourceImage mfileSource->receiveFrame(img); } ----------------------------------------------------------------------------------------------------- /// StreamingServer.h /// author: Traub Matthias #include #include #include #include "BasicUsageEnvironment.hh" #include "DeviceSourceImage.h" #include "GroupsockHelper.hh" #include "liveMedia.hh" #include "cv.h" #include #include class StreamingServer : public QThread { public: StreamingServer(); ~StreamingServer(); void receiveFrame(QImage img); virtual void run(); private: UsageEnvironment* menv; char const* minputFileName; MPEG4VideoStreamFramer* mvideoSource; RTPSink* mvideoSink; DeviceSourceImage* mfileSource; void play(); }; ------------------------------------------------------------------------------------------------ /// DeviceSourceImage.cpp /// author: Traub Matthias #include "DeviceSourceImage.h" #include // for "gettimeofday()" #include #include #include #include DeviceSourceImage* 
DeviceSourceImage::createNew(UsageEnvironment& env, DeviceParameters params) { return new DeviceSourceImage(env, params); } DeviceSourceImage::DeviceSourceImage(UsageEnvironment& env, DeviceParameters params) : DeviceSource(env, params) { std::cout << "DeviceSourceImage::DeviceSourceImage() called!" << std::endl; mJpegBufferUpToDate = false; } DeviceSourceImage::~DeviceSourceImage() { std::cout << "DeviceSourceImage::~DeviceSourceImage() called!" << std::endl; } void DeviceSourceImage::doGetNextFrame() { // This function is called (by our 'downstream' object) when it asks for new data. std::cout << "DeviceSourceImage::doGetNextFrame() called!" << std::endl; deliverFrame(); } void DeviceSourceImage::receiveFrame(QImage img) { mImg = img; mJpegBufferUpToDate = false; } void DeviceSourceImage::deliverFrame() { // This function is called when new frame data is available from the device. // We deliver this data by copying it to the 'downstream' object, using the following parameters (class members): // 'in' parameters (these should *not* be modified by this function): // fTo: The frame data is copied to this address. // (Note that the variable "fTo" is *not* modified. Instead, // the frame data is copied to the address pointed to by "fTo".) // fMaxSize: This is the maximum number of bytes that can be copied // (If the actual frame is larger than this, then it should // be truncated, and "fNumTruncatedBytes" set accordingly.) // 'out' parameters (these are modified by this function): // fFrameSize: Should be set to the delivered frame size (<= fMaxSize). // fNumTruncatedBytes: Should be set iff the delivered frame would have been // bigger // than "fMaxSize", in which case it's set to the number of bytes // that have been omitted. // fPresentationTime: Should be set to the frame's presentation time // (seconds, microseconds). This time must be aligned with 'wall-clock time' - i.e., the time that you would get // by calling "gettimeofday()". 
// fDurationInMicroseconds: Should be set to the frame's duration, if known. // If, however, the device is a 'live source' (e.g., encoded from a camera or microphone), then we probably don't need // to set this variable, because - in this case - data will never arrive 'early'. // Note the code below. // Check if buffer is out of date and image is available if (!mJpegBufferUpToDate && mImg != QImage()) { QBuffer buffer(&mJpegBuffer); buffer.open(QIODevice::WriteOnly); bool ok = mImg.save(&buffer, "JPG", 15); // writes image into a buffer in JPG format buffer.close(); mJpegBufferUpToDate = true; } fFrameSize = 0; static int framesOk = 0; static int framesSkipped = 0; gettimeofday(&fPresentationTime, 0); // Check if Buffer is filled and smaller than outputbuffer size if (mJpegBuffer.size() && mJpegBuffer.size() <= fMaxSize) { // Copy imagedata onto outputbuffer (fTo) fFrameSize = mJpegBuffer.size(); memcpy_s(fTo, fMaxSize, mJpegBuffer.constData(), mJpegBuffer.size()); ++framesOk; } // Just for testing to fill up buffer else if(mJpegBuffer.size()) { fFrameSize = fMaxSize; memcpy_s(fTo, fMaxSize, mJpegBuffer.constData() , fMaxSize); ++framesSkipped; } // just for testing of performance changes Sleep(1); // Finished writting into outputbuffer and schedule nextTask() = envir().taskScheduler().scheduleDelayedTask(3000, (TaskFunc*)FramedSource::afterGetting, this); } --------------------------------------------------------------------------------------------- /// DeviceSourceImage.h /// author: Traub Matthias #ifndef _DEVICE_SOURCE_IMAGE_HH #define _DEVICE_SOURCE_IMAGE_HH #include "DeviceSource.hh" #include #include #include class DeviceSourceImage: public DeviceSource { public: static DeviceSourceImage* createNew(UsageEnvironment& env, DeviceParameters params); void receiveFrame(QImage img); protected: DeviceSourceImage(UsageEnvironment& env, DeviceParameters params); // called only by createNew(), or by subclass constructors virtual ~DeviceSourceImage(); private: // 
redefined virtual functions: virtual void doGetNextFrame(); private: void deliverFrame(); private: QImage mImg; QByteArray mJpegBuffer; bool mJpegBufferUpToDate; }; #endif
From imaldonado at semex.com.mx Tue Aug 16 10:10:11 2011 From: imaldonado at semex.com.mx (Ivan Maldonado Zambrano) Date: Tue, 16 Aug 2011 12:10:11 -0500 Subject: [Live-devel] bitrate and data from memory location Message-ID: <1313514611.2542.86.camel@ivan.semex> Hi all, Actually I have a problem with my Stream Engine application and I'd like to know if somebody can help me. I have two questions: 1) Is there a way to vary the transfer bitrate? I mean, is there a way in Live555 to reduce or increment the number of bits sent per packet? I was looking around on the internet and I found something about the "estBitrate" variable, but it did not work. 2) Live555 normally takes data from a file or a pipe; is there a way to indicate that data is located in a specific memory direction/location? I mean, is there a way to indicate to Live555 that it should take data from a memory location with a specific size? Regards and thanks in advance Iván Maldonado Zambrano
From finlayson at live555.com Tue Aug 16 20:18:38 2011 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 16 Aug 2011 20:18:38 -0700 Subject: [Live-devel] Streaming single QImages In-Reply-To: <20110816095505.33BAD5D412E6@517.bces.de> References: <20110816095505.33BAD5D412E6@517.bces.de> Message-ID: > I'm trying to stream single Images. I hope you mean "a sequence of frames", because that's what streaming video is. > My application receives iplimages > and converts these to QImages for further manipulation. I don't know what "QImages" are. What codec is used to encode them? Your use of "MPEG4ESVideoRTPSink" and "MPEG4VideoStreamFramer" assumes that they are MPEG-4 video frames; I hope that's the case.
Also (assuming that your input frames are MPEG-4 video), if your input class is delivering frames one-at-a-time, you should use "MPEG4VideoStreamDiscreteFramer" rather than "MPEG4VideoStreamFramer". Finally, because you're basing your code on our "DeviceSource" example, you should use an up-to-date version of this - which uses a much easier to use "EventTrigger" mechanism. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Aug 16 20:27:07 2011 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 16 Aug 2011 20:27:07 -0700 Subject: [Live-devel] bitrate and data from memory location In-Reply-To: <1313514611.2542.86.camel@ivan.semex> References: <1313514611.2542.86.camel@ivan.semex> Message-ID: On Aug 16, 2011, at 10:10 AM, Ivan Maldonado Zambrano wrote: > 1) Is there a way to vary transfer bitrate? I mean, is there a way in > Live555 to reduce or increment the number of bits send by package?. No. > 2) Live555 normally takes data from a file or a pipe, is there a way to > indicate that data is located in a specific memory direction/location? I > mean, is there a way to indicate Live555 that take data from a memory > location with a specific size? Yes, you (the server implementor) can stream from a "ByteStreamMemoryBufferSource", instead of from a "ByteStreamFileSource". To use this, you would write a new subclass of "ServerMediaSubsession", and reimplement the virtual function "createNewStreamSource()". Note, though, that to use "ByteStreamMemoryBufferSource", the memory buffer must be static - i.e., fixed-size, and filled in with data ahead of time. If, instead, you are using the memory as a FIFO, then you must write your own source class (perhaps using our "DeviceSource" code as a model) to encapsulate it. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL:
From mszeles at netavis.hu Thu Aug 18 02:03:09 2011 From: mszeles at netavis.hu (Miklos Szeles) Date: Thu, 18 Aug 2011 11:03:09 +0200 Subject: [Live-devel] RTSP client runs on 100% CPU Message-ID: <4E4CD54D.7050509@netavis.hu> Hi, We are using live555 to get RTSP streams from IP cameras. It has been running fine with every kind of camera for years. Unfortunately, recently we've experienced problems with Sanyo cameras. In the beginning everything runs smoothly, and after approximately 60-70 secs the camera closes the TCP socket for the RTSP stream. The RTP stream runs without problem after this issue, but the streamer eats 100% of the CPU from this point. It looks like select() always returns without blocking after the camera has closed the socket, so it results in an endless loop of calling select(). The RTSPClient's packet handler can read 0 bytes all the time and simply writes an error message, but no handling of this issue happens. We are using a relatively old version of the live555 library. We haven't upgraded since, because till now everything worked fine. Is it possible that the latest version can correctly handle this issue? Best regards, Miklós void BasicTaskScheduler::SingleStep(unsigned maxDelayTime) { fd_set readSet = fReadSet; // make a copy for this select() call DelayInterval const& timeToDelay = fDelayQueue.timeToNextAlarm(); struct timeval tv_timeToDelay; tv_timeToDelay.tv_sec = timeToDelay.seconds(); tv_timeToDelay.tv_usec = timeToDelay.useconds(); // Very large "tv_sec" values cause select() to fail.
// Don't make it any larger than 1 million seconds (11.5 days) const long MAX_TV_SEC = MILLION; if (tv_timeToDelay.tv_sec > MAX_TV_SEC) { tv_timeToDelay.tv_sec = MAX_TV_SEC; } // Also check our "maxDelayTime" parameter (if it's > 0): if (maxDelayTime > 0 && (tv_timeToDelay.tv_sec > (long)maxDelayTime/MILLION || (tv_timeToDelay.tv_sec == (long)maxDelayTime/MILLION && tv_timeToDelay.tv_usec > (long)maxDelayTime%MILLION))) { tv_timeToDelay.tv_sec = maxDelayTime/MILLION; tv_timeToDelay.tv_usec = maxDelayTime%MILLION; } int selectResult = select(fMaxNumSockets, &readSet, NULL, NULL, &tv_timeToDelay); if (selectResult < 0) { #if defined(__WIN32__) || defined(_WIN32) int err = WSAGetLastError(); // For some unknown reason, select() in Windoze sometimes fails with WSAEINVAL if // it was called with no entries set in "readSet". If this happens, ignore it: if (err == WSAEINVAL && readSet.fd_count == 0) { err = 0; // To stop this from happening again, create a dummy readable socket: int dummySocketNum = socket(AF_INET, SOCK_DGRAM, 0); FD_SET((unsigned)dummySocketNum, &fReadSet); } if (err != 0) { #else if (errno != EINTR && errno != EAGAIN) { #endif // Unexpected error - treat this as fatal: #if !defined(_WIN32_WCE) perror("BasicTaskScheduler::SingleStep(): select() fails"); #endif exit(0); } } -- Miklós Szeles NETAVIS Kft. Web: www.netavis.net Mail: mszeles at netavis.hu Hűvösvölgyi út 54. H-1021 Budapest Hungary Tel: +36 1 225 31 38 Fax: +36 1 225 31 39
From finlayson at live555.com Thu Aug 18 06:45:46 2011 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 18 Aug 2011 06:45:46 -0700 Subject: [Live-devel] RTSP client runs on 100% CPU In-Reply-To: <4E4CD54D.7050509@netavis.hu> References: <4E4CD54D.7050509@netavis.hu> Message-ID: > We are using a relatively old version of the live555 library. Then I can't help you (see ). Sorry. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed...
URL:
From finlayson at live555.com Thu Aug 18 06:51:03 2011 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 18 Aug 2011 06:51:03 -0700 Subject: [Live-devel] RTSP client runs on 100% CPU In-Reply-To: <4E4CD54D.7050509@netavis.hu> References: <4E4CD54D.7050509@netavis.hu> Message-ID: <4826D55A-A7CA-4562-9682-FC4863E14F82@live555.com> > Is it possible that the latest version can correctly handle this issue? Of course it's possible, because new versions of the library don't just add new functionality, but improve upon (i.e., fix bugs in) old versions. Why not just upgrade? (If you're having trouble upgrading because you've modified the supplied code 'in place', then this illustrates why you should not be doing that. See ) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL:
From mszeles at netavis.hu Thu Aug 18 07:01:33 2011 From: mszeles at netavis.hu (Miklos Szeles) Date: Thu, 18 Aug 2011 16:01:33 +0200 Subject: [Live-devel] RTSP client runs on 100% CPU In-Reply-To: References: <4E4CD54D.7050509@netavis.hu> Message-ID: <4E4D1B3D.5000008@netavis.hu> No problem; meanwhile I managed to update the live555 library to the latest one, and the problem does not exist anymore. Best regards, Miklós On 2011.08.18. 15:45, Ross Finlayson wrote: >> We are using a relatively old version of the live555 library. > > Then I can't help you (see > ). Sorry. > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed...
URL: From mszeles at netavis.hu Thu Aug 18 07:27:30 2011 From: mszeles at netavis.hu (Miklos Szeles) Date: Thu, 18 Aug 2011 16:27:30 +0200 Subject: [Live-devel] RTSP multicast problem Message-ID: <4E4D2152.5030208@netavis.hu> Hi Ross, I've found the following mailing-list post regarding "Many multicast sources with same port problem": http://www.mail-archive.com/live-devel at lists.live555.com/msg05777.html Unfortunately, in one situation we have the same problem that Zed had: two IP cameras multicasting their streams with different multicast addresses but with the same port, and both RTSP clients receive the packets from all the devices. I've tried to adapt the code according to your proposed changes, but unfortunately it didn't work for us. Can you please advise how we can modify the code to make this work correctly? Unfortunately, patching the Linux kernel is not an available option for us in this case. Best regards, Miklós From finlayson at live555.com Thu Aug 18 07:47:26 2011 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 18 Aug 2011 07:47:26 -0700 Subject: [Live-devel] RTSP multicast problem In-Reply-To: <4E4D2152.5030208@netavis.hu> References: <4E4D2152.5030208@netavis.hu> Message-ID: <0212F04A-3C56-4D87-81B0-CB12163FA5DD@live555.com> Unfortunately I don't know of a solution to this, other than 1/ Fixing the OS kernel, or 2/ Using a different OS (e.g. FreeBSD) that doesn't have the problem, or 3/ Changing at least one of the cameras to use a different port, or 4/ Running your two client applications on separate hosts. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From imaldonado at semex.com.mx Thu Aug 18 08:55:10 2011 From: imaldonado at semex.com.mx (Ivan Maldonado Zambrano) Date: Thu, 18 Aug 2011 10:55:10 -0500 Subject: [Live-devel] How does Live555 deal with low bandwidth?
In-Reply-To: <1313514611.2542.86.camel@ivan.semex> References: <1313514611.2542.86.camel@ivan.semex> Message-ID: <1313682910.2526.18.camel@ivan.semex> Hi all, If Live555 cannot vary the transfer bitrate, how does Live555 deal with low bandwidth? I mean, at the moment Live555 streams data over a low-bandwidth link, it is bound to lose frames/data. Does Live555 implement a mechanism to avoid this kind of problem? Suppose I have a client that sends feedback: then I can know if I have a bandwidth problem, and probably I can modify a Live555 variable to reduce the loss of frames/data. Regards and thanks in advance Iván Maldonado Zambrano -------------------------------------------------------------------------------- Date: Tue, 16 Aug 2011 20:27:07 -0700 From: Ross Finlayson To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] bitrate and data from memory location Message-ID: Content-Type: text/plain; charset="us-ascii" On Aug 16, 2011, at 10:10 AM, Ivan Maldonado Zambrano wrote: > 1) Is there a way to vary transfer bitrate? I mean, is there a way in > Live555 to reduce or increment the number of bits send by package?. No. > 2) Live555 normally takes data from a file or a pipe, is there a way to > indicate that data is located in a specific memory direction/location? I > mean, is there a way to indicate Live555 that take data from a memory > location with a specific size? Yes, you (the server implementor) can stream from a "ByteStreamMemoryBufferSource", instead of from a "ByteStreamFileSource". To use this, you would write a new subclass of "ServerMediaSubsession", and reimplement the virtual function "createNewStreamSource()". Note, though, that to use "ByteStreamMemoryBufferSource", the memory buffer must be static - i.e., fixed-size, and filled in with data ahead of time. If, instead, you are using the memory as a FIFO, then you must write your own source class (perhaps using our "DeviceSource" code as a model) to encapsulate it.
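For the memory-as-FIFO case described above, the core of such a custom source class is just a ring buffer that the producer thread fills and the framer drains. A minimal, self-contained sketch in plain C++ follows; the class and method names are invented for illustration, and the live555 plumbing (doGetNextFrame() delivery, event triggers, locking) is deliberately omitted:

```cpp
#include <cstddef>
#include <vector>

// Minimal byte FIFO, as a stand-in for the buffer inside a custom
// live555 source class. A real integration would deliver read() results
// via doGetNextFrame(); names here are hypothetical.
class ByteRingBuffer {
public:
  explicit ByteRingBuffer(size_t capacity)
    : fBuf(capacity), fHead(0), fSize(0) {}

  // Append up to "len" bytes; returns how many were actually stored.
  size_t write(const unsigned char* data, size_t len) {
    size_t n = 0;
    while (n < len && fSize < fBuf.size()) {
      fBuf[(fHead + fSize) % fBuf.size()] = data[n++];
      ++fSize;
    }
    return n;
  }

  // Remove up to "len" bytes into "out"; returns how many were read.
  size_t read(unsigned char* out, size_t len) {
    size_t n = 0;
    while (n < len && fSize > 0) {
      out[n++] = fBuf[fHead];
      fHead = (fHead + 1) % fBuf.size();
      --fSize;
    }
    return n;
  }

  size_t bytesAvailable() const { return fSize; }

private:
  std::vector<unsigned char> fBuf;
  size_t fHead, fSize;
};
```

The encoder thread would call write(), and the source's frame-delivery code would call read() whenever the downstream object asks for data.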
Ross Finlayson Live Networks, Inc. http://www.live555.com/ On Tue, 2011-08-16 at 12:10 -0500, Ivan Maldonado Zambrano wrote: > Hi all, > > Actually I have a problem with my Stream Engine application and I'd like > to know if somebody can help me. > > I have two questions: > > 1) Is there a way to vary transfer bitrate? I mean, is there a way in > Live555 to reduce or increment the number of bits send by package?. > I was looking for in internet and I found something about > "estBitrate" variable but it did not work. > > 2) Live555 normally takes data from a file or a pipe, is there a way to > indicate that data is located in a specific memory direction/location? I > mean, is there a way to indicate Live555 that take data from a memory > location with a specific size? > > Regards and thanks in advance > Iván Maldonado Zambrano From matt at schuckmannacres.com Thu Aug 18 09:16:40 2011 From: matt at schuckmannacres.com (Matt Schuckmannn) Date: Thu, 18 Aug 2011 09:16:40 -0700 Subject: [Live-devel] How does Live555 deal with low bandwidth? In-Reply-To: <1313682910.2526.18.camel@ivan.semex> References: <1313514611.2542.86.camel@ivan.semex> <1313682910.2526.18.camel@ivan.semex> Message-ID: <4E4D3AE8.8070201@schuckmannacres.com> Live555 can't reduce the content, which is what determines the bit rate; Live555 is just the mechanism for sending the content, so Live555 can't do anything to change the bit rate. Since Live555 implements RTSP and RTP, and those protocols have some mechanisms to detect and report bandwidth problems, theoretically Live555 could tell your application about bandwidth problems, and your application could do something to reduce the bit rate if possible - e.g. a live camera encoder could dynamically reduce the bit rate or frame rate. Matt S. On 8/18/2011 8:55 AM, Ivan Maldonado Zambrano wrote: > Hi all, > > If Live555 can not vary transfer bitrate. How does Live555 deal with low
I mean, at the moment that Live555 stream data and there is a > low bandwidth it suppose to loss frames/data. Does Live555 implement a > mechanism to avoid this kind of problems? > > Suppose I have a client that feedback, I can know if I have a problem > with my bandwidth and probably I can modify a Live555 variable to reduce > loss frames/data. > > Regards and thanks in advance > Iv?n Maldonado Zambrano > > -------------------------------------------------------------------------------- > > Date: Tue, 16 Aug 2011 20:27:07 -0700 > From: Ross Finlayson > To: LIVE555 Streaming Media - development& use > > Subject: Re: [Live-devel] bitrate and data from memory location > Message-ID: > Content-Type: text/plain; charset="us-ascii" > > > On Aug 16, 2011, at 10:10 AM, Ivan Maldonado Zambrano wrote: > >> 1) Is there a way to vary transfer bitrate? I mean, is there a way in >> Live555 to reduce or increment the number of bits send by package?. > No. > >> 2) Live555 normally takes data from a file or a pipe, is there a way > to >> indicate that data is located in a specific memory direction/location? > I >> mean, is there a way to indicate Live555 that take data from a memory >> location with a specific size? > Yes, you (the server implementor) can stream from a > "ByteStreamMemoryBufferSource", instead of from a > "ByteStreamFileSource". To use this, you would write a new subclass of > "ServerMediaSubsession", and reimplement the virtual function > "createNewStreamSource()". > > Note, though, that to use "ByteStreamMemoryBufferSource", the memory > buffer must be static - i.e., fixed-size, and filled in with data ahead > of time. If, instead, you are using the memory as a FIFO, then you must > write your own source class (perhaps using our "DeviceSource" code as a > model) to encapsulate it. > > > Ross Finlayson > Live Networks, Inc. 
> http://www.live555.com/ > > On Tue, 2011-08-16 at 12:10 -0500, Ivan Maldonado Zambrano wrote: >> Hi all, >> >> Actually I have a problem with my Stream Engine application and I'd like >> to know if somebody can help me. >> >> I have two questions: >> >> 1) Is there a way to vary transfer bitrate? I mean, is there a way in >> Live555 to reduce or increment the number of bits send by package?. >> I was looking for in internet and I found something about >> "estBitrate" variable but it did not work. >> >> 2) Live555 normally takes data from a file or a pipe, is there a way to >> indicate that data is located in a specific memory direction/location? I >> mean, is there a way to indicate Live555 that take data from a memory >> location with a specific size? >> >> Regards and thanks in advance >> Iván Maldonado Zambrano > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel From hatrungttha at gmail.com Thu Aug 18 03:27:37 2011 From: hatrungttha at gmail.com (Ha, Tuan Trung) Date: Thu, 18 Aug 2011 17:27:37 +0700 Subject: [Live-devel] Implement custom RTP payload formats Message-ID: Hi everyone, I am a newbie to the liveMedia library. I want to ask: where is the best place to implement a custom RTP payload format? Thanks in advance. Trung. From mail2ashi.86 at gmail.com Thu Aug 18 05:04:43 2011 From: mail2ashi.86 at gmail.com (Ashish Mathur) Date: Thu, 18 Aug 2011 12:04:43 +0000 (UTC) Subject: [Live-devel] RTSP client implementation issue using live555 libs Message-ID: Hi, I am currently writing an RTSP client application for a STB (set-top box) using the live555 libs. The idea is to play an RTSP stream residing on an RTSP server through an RTSP client application residing on the STB. I was taking my reference from the testProgs, like playCommon.cpp and openRTSP.cpp. I have been able to open an RTSP session and now want to start playing the stream.
However, for playing the RTSP stream I want to read the data from the RTPSource socket buffers and write it into a buffer which my STB application can use for demuxing, decoding, etc. But the testProgs make use of a FileSink (or some class derived from MediaSink) and then read the data from the RTPSource into the FileSink. I don't want to write the RTPSource data to a file. So, what I can't figure out is whether I can directly read data from the RTPSource and write it into a local buffer that my STB application can use. If yes, then how? Regards, Ashish From rob.krakora at messagenetsystems.com Thu Aug 18 08:31:44 2011 From: rob.krakora at messagenetsystems.com (Robert Krakora) Date: Thu, 18 Aug 2011 11:31:44 -0400 Subject: [Live-devel] RTSP multicast problem In-Reply-To: <0212F04A-3C56-4D87-81B0-CB12163FA5DD@live555.com> References: <4E4D2152.5030208@netavis.hu> <0212F04A-3C56-4D87-81B0-CB12163FA5DD@live555.com> Message-ID: Unicast from the cameras, and connect to them via a Linux machine that can then act as a multicast gateway running a live555 server. This way the multicast addresses/ports can be managed easily. You will need a 1 Gbps or 10 Gbps connection to the Linux machine, based on the number of cameras and their aggregate bitrate; probably 1 Gbps would suffice. 2011/8/18 Ross Finlayson > Unfortunately I don't know of a solution to this, other than > 1/ Fixing the OS kernel, or > 2/ Using a different OS (e.g. FreeBSD) that doesn't have the problem, or > 3/ Changing at least one of the cameras to use a different port, or > 4/ Running your two client applications on separate hosts. > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -- Rob Krakora MessageNet Systems 101 East Carmel Dr.
Suite 105 Carmel, IN 46032 (317)566-1677 Ext 212 (317)663-0808 Fax -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Aug 18 20:16:01 2011 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 18 Aug 2011 20:16:01 -0700 Subject: [Live-devel] Implement custom RTP payload formats In-Reply-To: References: Message-ID: On Aug 18, 2011, at 3:27 AM, Ha, Tuan Trung wrote: > I want to ask where is the best place to implement a custom RTP payload formats? For sending, write a new subclass of "MultiFramedRTPSink". Also, if you want to stream this from a RTSP server, you must also write a new subclass of "ServerMediaSubsession". For receiving, write a new subclass of "MultiFramedRTPSource". You must also modify the "MediaSession::initiate()" code to include your new payload format. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Aug 18 20:19:31 2011 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 18 Aug 2011 20:19:31 -0700 Subject: [Live-devel] RTSP client implementation issue using live555 libs In-Reply-To: References: Message-ID: > So, what i can't figure out is that can i directly read data from RTPSource > and write it into a local buffer that my STB application can use? If yes, then > how? If your RTSP/RTP is a single video-only source (or a Transport Stream), then the simplest way to do this is to just run "openRTSP", and pipe its output into your STB's decoding/rendering application (which would read from stdin). Alternatively, you can write your own application code (using our existing "openRTSP" code as a model), but using your own custom subclass of "MediaSink" (instead of the existing "FileSink") to receive and process the data. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
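The custom-"MediaSink" approach suggested above amounts to reimplementing FileSink's frame-handling step so that each received frame is appended to a memory buffer instead of being written to a file. The following self-contained sketch shows just that buffering step; the class and method names here are invented to mirror the shape of live555's afterGettingFrame() callback, and are not the real live555 API:

```cpp
#include <cstddef>
#include <vector>

// Illustration of buffering received frames in memory (cf. live555's
// FileSink, whose frame callback writes to a file instead).
// Names are hypothetical; this is not the real live555 API.
class MemoryBufferSink {
public:
  // In a real sink this logic would live in the after-getting-frame
  // callback: copy the just-received frame, then request the next one.
  void afterGettingFrame(const unsigned char* frame, size_t frameSize) {
    fFrameOffsets.push_back(fBuffer.size());  // remember frame boundary
    fBuffer.insert(fBuffer.end(), frame, frame + frameSize);
  }

  size_t numFrames() const { return fFrameOffsets.size(); }
  const std::vector<unsigned char>& buffer() const { return fBuffer; }

private:
  std::vector<unsigned char> fBuffer;   // contiguous payload bytes
  std::vector<size_t> fFrameOffsets;    // start of each frame in fBuffer
};
```

A STB application could then hand buffer() to its demuxer/decoder instead of reading a file back in.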
URL: From mszeles at netavis.hu Fri Aug 19 00:34:27 2011 From: mszeles at netavis.hu (Miklos Szeles) Date: Fri, 19 Aug 2011 09:34:27 +0200 Subject: [Live-devel] RTSP multicast problem In-Reply-To: <0212F04A-3C56-4D87-81B0-CB12163FA5DD@live555.com> References: <4E4D2152.5030208@netavis.hu> <0212F04A-3C56-4D87-81B0-CB12163FA5DD@live555.com> Message-ID: <4E4E1203.5020306@netavis.hu> Thanks for your answer Ross. Can you give some hints regarding the OS kernel fixing? Where/what should we change to solve this issue by patching the kernel? Best regards, Miklós 2011.08.18. 16:47 keltezéssel, Ross Finlayson írta: > Unfortunately I don't know of a solution to this, other than > 1/ Fixing the OS kernel, or > 2/ Using a different OS (e.g. FreeBSD) that doesn't have the problem, or > 3/ Changing at least one of the cameras to use a different port, or > 4/ Running your two client applications on separate hosts. > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -- Miklós Szeles NETAVIS Kft. Web: www.netavis.net Mail: mszeles at netavis.hu Hűvösvölgyi út 54. H-1021 Budapest Hungary Tel: +36 1 225 31 38 Fax: +36 1 225 31 39 -------------- next part -------------- An HTML attachment was scrubbed... URL: From imaldonado at semex.com.mx Fri Aug 19 09:09:32 2011 From: imaldonado at semex.com.mx (Ivan Maldonado Zambrano) Date: Fri, 19 Aug 2011 11:09:32 -0500 Subject: [Live-devel] Detect and report bandwidth problems Message-ID: <1313770172.6504.14.camel@ivan.semex> Hi all, In a previous mail I got this answer: ------------------------------------------------------------------------- Live555 can't reduce the content, which is what determines the bit rate, Live555 is just the mechanism for sending the content so Live555 can't do anything to change the bit rate.
Since Live555 implements RTSP and RTP, and those protocols have some mechanisms to detect and report bandwidth problems, theoretically Live555 could tell your application about bandwidth problems, and your application could do something to reduce the bit rate if possible - e.g. a live camera encoder could dynamically reduce the bit rate or frame rate. Matt S. ------------------------------------------------------------------------- I was looking into how to detect bandwidth problems, and I found that a good way is by counting lost frames in my client app. I mean, the client app will count the number of lost frames over a period of time and send feedback to the server app. In this way the server app can vary the bitrate. My question is: can you suggest another mechanism to detect bandwidth problems? Or, if one exists, how can a Live555 client detect a bandwidth problem? As a note: my server implements Live555 and my client is VLC. Regards and thanks in advance Iván Maldonado Zambrano From finlayson at live555.com Fri Aug 19 23:22:02 2011 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 19 Aug 2011 23:22:02 -0700 Subject: [Live-devel] New LIVE555 version - supports receiving new RTP payload types without modifying the supplied code Message-ID: <14AB6A93-AAA1-4BE3-BF3B-D80758EB18A5@live555.com> FYI, the latest version (2011.08.20) of the "LIVE555 Streaming Media" code has modified the "MediaSession" and "MediaSubsession" classes to make it possible for developers to add support for receiving new RTP payload formats, without having to modify the "MediaSession" or "MediaSubsession" code itself. To do this, you define your own subclasses of "MediaSession" and "MediaSubsession". For more details, see the comment near the top of "liveMedia/include/MediaSession.hh". Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed...
URL: From nouri at soroush.net Sun Aug 21 00:56:25 2011 From: nouri at soroush.net (Mojtaba Nouri) Date: Sun, 21 Aug 2011 12:26:25 +0430 Subject: [Live-devel] A problem with indexed file duration Message-ID: Hi Ross Here is a ts file which I recorded directly from a dvb-t card. http://dl.dropbox.com/u/7920298/ts.tar.gz The file's duration is 5 s, but when I index it and get its duration, it returns 11630 s. Could you detect the problem with this ts? Thanks Mojtaba Nouri -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sun Aug 21 02:45:25 2011 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 21 Aug 2011 02:45:25 -0700 Subject: [Live-devel] A problem with indexed file duration In-Reply-To: References: Message-ID: <2C0A06DE-EF00-46EF-90FF-3383AEAECA77@live555.com> > Here is a ts file which I recorded directly from a dvb-t card. > http://dl.dropbox.com/u/7920298/ts.tar.gz > The file's duration is 5 s, but when I index it and get its duration, it returns 11630 s. > Could you detect the problem with this ts? The problem is that this file's PCR timestamps are all messed up. Very early in the file, the PCR timestamp suddenly decreases by about 10831 seconds. Our indexing software handles this OK, by pretending that the decrease didn't happen. About 5.6 seconds later, however, the PCR timestamp suddenly jumps - by about 11619 seconds. Our indexing software believes this (it has no reason not to), which is why it ends up thinking that the whole file is 11630 seconds long. In summary: Your encoder is broken; you should fix it. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From nouri at soroush.net Sun Aug 21 02:57:32 2011 From: nouri at soroush.net (Mojtaba Nouri) Date: Sun, 21 Aug 2011 14:27:32 +0430 Subject: [Live-devel] A problem with indexed file duration In-Reply-To: <2C0A06DE-EF00-46EF-90FF-3383AEAECA77@live555.com> References: <2C0A06DE-EF00-46EF-90FF-3383AEAECA77@live555.com> Message-ID: Thank you, I will work on it. On Sun, Aug 21, 2011 at 2:15 PM, Ross Finlayson wrote: > Here is a ts file which I recorded directly from a dvb-t card. > http://dl.dropbox.com/u/7920298/ts.tar.gz > The file's duration is 5 s, but when I index it and get its duration, it > returns 11630 s. > Could you detect the problem with this ts? > > > The problem is that this file's PCR timestamps are all messed up. > > Very early in the file, the PCR timestamp suddenly decreases by about 10831 > seconds. Our indexing software handles this OK, by pretending that the > decrease didn't happen. > > About 5.6 seconds later, however, the PCR timestamp suddenly jumps - by > about 11619 seconds. Our indexing software believes this (it has no reason > not to), which is why it ends up thinking that the whole file is 11630 > seconds long. > > In summary: Your encoder is broken; you should fix it. > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From vad at vad.pp.ru Sun Aug 21 21:02:37 2011 From: vad at vad.pp.ru (Vadim Kosarev) Date: Mon, 22 Aug 2011 10:02:37 +0600 Subject: [Live-devel] Nested TaskScheduler::doEventLoop causes bug with deleting RTSPClientSession Message-ID: <54F5E0FCD1C54438936816039007E82B@vadhomepc> Hello all! I'm developing RTSP server based on liveMedia library. I have started separate thread for liveMedia in my class RMXFRTSPThread and call doEventLoop(). 
Then I start VLC and connect to the RTSP stream. This stream must take media data from my live H264 video source (I have written a child class of FramedSource and a child class of OnDemandServerMediaSubsession to do this). The live video source takes about 10 seconds to initialize (in a separate thread). My child of FramedSource triggers an event in the task scheduler to signal that the live stream is ready. Then H264VideoRTPSink must get the SPS and PPS from this live stream to compose the SDP lines. To wait for the live source and the SDP lines, my class H264VideoDeviceServerMediaSubsession (a child of OnDemandServerMediaSubsession) calls doEventLoop(), exactly as the H264VideoFileServerMediaSubsession library class does. But the RTSP client (VLC) closes its TCP connection to the server before all initialization is complete. This stream initialization process causes an exception - library code accesses an RTSPClientSession object which has already been deleted. This is the stack when the RTSPClientSession object is deleted:

1. RTSPServer::RTSPClientSession::handleRequestBytes
2. RTSPServer::RTSPClientSession::incomingRequestHandler1
3. RTSPServer::RTSPClientSession::incomingRequestHandler
4. BasicTaskScheduler::SingleStep
5. RMXFRTSPTaskScheduler::SingleStep
6. BasicTaskScheduler0::doEventLoop
7. H264VideoDeviceServerMediaSubsession::getAuxSDPLine
8. OnDemandServerMediaSubsession::setSDPLinesFromRTPSink
9. OnDemandServerMediaSubsession::sdpLines
10. ServerMediaSession::generateSDPDescription
11. RTSPServer::RTSPClientSession::handleCmd_DESCRIBE
12. RTSPServer::RTSPClientSession::handleRequestBytes
13. RTSPServer::RTSPClientSession::incomingRequestHandler1
14. RTSPServer::RTSPClientSession::incomingRequestHandler
15. BasicTaskScheduler::SingleStep
16. RMXFRTSPTaskScheduler::SingleStep
17. BasicTaskScheduler0::doEventLoop
18. RMXFRTSPThread::ThreadProc

Some explanations of the names in the stack: RMXFRTSPThread::ThreadProc - my class with a single thread for the liveMedia lib; RMXFRTSPTaskScheduler - my child class of BasicTaskScheduler; H264VideoDeviceServerMediaSubsession - my child class of OnDemandServerMediaSubsession. What causes this exception? Here is the explanation: RTSPServer::RTSPClientSession::handleRequestBytes, on top of the stack, deletes the current RTSPClientSession object because the incoming client TCP connection is closed (the newBytesRead argument is 0 - see the first 'if' at the top of the handleRequestBytes method). But the same RTSPClientSession object is handling the DESCRIBE client command lower in the stack (lines 11-14; it waits for the SDP), and when the live video device's SDP lines are ready, library code accesses the already-deleted RTSPClientSession object. I think there is an error in the choice of the RTSPClientSession deletion point (because of possible nested doEventLoop calls). Best regards, Vadim Kosarev From finlayson at live555.com Sun Aug 21 22:02:06 2011 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 21 Aug 2011 22:02:06 -0700 Subject: [Live-devel] Nested TaskScheduler::doEventLoop causes bug with deleting RTSPClientSession In-Reply-To: <54F5E0FCD1C54438936816039007E82B@vadhomepc> References: <54F5E0FCD1C54438936816039007E82B@vadhomepc> Message-ID: Are you using the latest version of the code? The problem that you describe looks somewhat similar to a bug that was fixed in a recent release. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed...
URL: From finlayson at live555.com Sun Aug 21 22:12:46 2011 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 21 Aug 2011 22:12:46 -0700 Subject: [Live-devel] Nested TaskScheduler::doEventLoop causes bug with deleting RTSPClientSession In-Reply-To: <54F5E0FCD1C54438936816039007E82B@vadhomepc> References: <54F5E0FCD1C54438936816039007E82B@vadhomepc> Message-ID: > calls doEventLoop() exactly as H264VideoFileServerMediaSubsession library class. One more thing. Make sure that you're using a recent version of the "H264VideoFileServerMediaSubsession" code here. This was the code that was modified in a recent revision of the code to fix a bug that was similar to the problem that you're seeing. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From vad at vad.pp.ru Sun Aug 21 23:00:31 2011 From: vad at vad.pp.ru (Vadim Kosarev) Date: Mon, 22 Aug 2011 12:00:31 +0600 Subject: [Live-devel] Nested TaskScheduler::doEventLoop causes bug with In-Reply-To: Message-ID: <36CF53EA7F3349F085072FA6F321C47C@vadhomepc> Hello Ross! > One more thing. Make sure that you're using a recent version > of the "H264VideoFileServerMediaSubsession" code here. > This was the code that was modified in a recent revision of the code > to fix a bug that was similar to the problem that you're seeing. Do You mean this fix? 2011.06.14: - Fixed a race condition in "H264VideoFileServerMediaSubsession" and "MPEG4VideoFileServerMediaSubsession" that could have been triggered when two separate clients tried to stream the same file concurrently. (Thanks to Bruno Abreu for reporting this.) I have written my child class of OnDemandServerMediaSubsession using 2011.03.14 version of liveMedia (without this fix). Now I have included this fix to my class code and update liveMedia to 2011.08.13 but the bug I described still arises. 
The fix 2011.06.14 refers to TWO clients connecting to one stream (so there were TWO RTSPClientSession objects waiting for the SDP of one stream). The bug I described refers to ONE client connecting to one stream whose source has a long initialization period. The cause of the bug (deleting the RTSPClientSession object in a nested doEventLoop call while this object is still being used in the outer doEventLoop call) is not affected by this fix. Best regards, Vadim Kosarev From vad at vad.pp.ru Mon Aug 22 00:08:31 2011 From: vad at vad.pp.ru (Vadim Kosarev) Date: Mon, 22 Aug 2011 13:08:31 +0600 Subject: [Live-devel] Nested TaskScheduler::doEventLoop causes bug with In-Reply-To: Message-ID: <8D065BC78DFA45A2AFCE7B270156DFDD@vadhomepc> Hello, Ross! Maybe this patch will fix the bug (changes in the RTSPClientSession deletion algorithm). I'm still testing it now. http://vad.pp.ru/liveMedia/2.diff Best regards, Vadim Kosarev From finlayson at live555.com Mon Aug 22 00:44:02 2011 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 22 Aug 2011 00:44:02 -0700 Subject: [Live-devel] Nested TaskScheduler::doEventLoop causes bug with In-Reply-To: <8D065BC78DFA45A2AFCE7B270156DFDD@vadhomepc> References: <8D065BC78DFA45A2AFCE7B270156DFDD@vadhomepc> Message-ID: Yes, I think you've identified a problem with the code that I'll need to fix. Something like your proposed patch might work. I need to think about this some more. Anyway, thanks for reporting the issue. I hope to come out with a new version of the code that fixes it soon. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From vad at vad.pp.ru Mon Aug 22 00:53:41 2011 From: vad at vad.pp.ru (Vadim Kosarev) Date: Mon, 22 Aug 2011 13:53:41 +0600 Subject: [Live-devel] Nested TaskScheduler::doEventLoop causes bug with In-Reply-To: Message-ID: <02083BFE82EC4F2F8BE6872E0F2215A1@vadhomepc> Thanks Ross!
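For readers following this thread, the nested event loop in question follows live555's watch-variable pattern: doEventLoop(&flag) keeps dispatching tasks until some task sets the flag non-zero, which is how getAuxSDPLine() waits for the SPS/PPS to become available. The toy model below illustrates that pattern in a self-contained way; the scheduler class is invented for illustration and is not the real TaskScheduler:

```cpp
#include <functional>
#include <queue>
#include <utility>

// Toy model of a doEventLoop(char* watchVariable) style loop:
// run queued tasks until some task sets the watch variable non-zero.
// This is a sketch of the "done flag" idiom, not the real live555 code.
class ToyScheduler {
public:
  void scheduleTask(std::function<void()> task) {
    fTasks.push(std::move(task));
  }

  // Returns the number of tasks executed before the flag was raised
  // (or before the task queue ran dry).
  int doEventLoop(char* watchVariable) {
    int executed = 0;
    while ((watchVariable == nullptr || *watchVariable == 0) && !fTasks.empty()) {
      std::function<void()> task = std::move(fTasks.front());
      fTasks.pop();
      task();   // a task may set *watchVariable, ending the loop
      ++executed;
    }
    return executed;
  }

private:
  std::queue<std::function<void()>> fTasks;
};
```

The hazard discussed in this thread is that objects can be deleted by tasks running inside the inner loop while the outer loop still holds pointers to them, which is why the deletion point matters.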
From yinlijie2011 at 163.com Mon Aug 22 01:12:05 2011 From: yinlijie2011 at 163.com (Yin Lijie) Date: Mon, 22 Aug 2011 16:12:05 +0800 (CST) Subject: [Live-devel] Problem about create many MP4-file Message-ID: <2ff918b2.ce93.131f08be6e6.Coremail.yinlijie2011@163.com> Hi all, I want to use the Live555 library for my program. I want to receive H.264 video and AAC audio from an RTSP server, and save them to an MP4 file every ten minutes. These are my steps: 1/ Send the RTSP "PAUSE" command when the time is up 2/ Call Medium::close(pointer-to-your-MP4-file-object) 3/ Create a new MP4-file-object 4/ Call "startPlaying()" on your new MP4-file-object 5/ Send the RTSP "PLAY" command But the program dies in TaskScheduler::doEventLoop() (I use Visual Studio 6 to debug). The OS (Windows XP) says: memory can't be written. How can I solve this problem? Thank you! Best wishes from Yin Lijie! -------------- next part -------------- An HTML attachment was scrubbed... URL: From raffin at ermes-cctv.com Mon Aug 22 02:15:34 2011 From: raffin at ermes-cctv.com (Mario Raffin) Date: Mon, 22 Aug 2011 11:15:34 +0200 Subject: [Live-devel] Mario Raffin is out of office (was: live-devel Digest, Vol 94, Issue 3) Message-ID: <4E521E36.7040204@ermes-cctv.com> An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: raffin.vcf Type: text/x-vcard Size: 266 bytes Desc: not available URL: From raffin at ermes-cctv.com Mon Aug 22 02:15:33 2011 From: raffin at ermes-cctv.com (Mario Raffin) Date: Mon, 22 Aug 2011 11:15:33 +0200 Subject: [Live-devel] Mario Raffin is out of office (was: live-devel Digest, Vol 94, Issue 2) Message-ID: <4E521E35.4070504@ermes-cctv.com> An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed...
Name: raffin.vcf Type: text/x-vcard Size: 266 bytes Desc: not available URL:
From finlayson at live555.com Mon Aug 22 02:34:37 2011 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 22 Aug 2011 02:34:37 -0700 Subject: [Live-devel] Mario Raffin is out of office (was: live-devel Digest, Vol 94, Issue 2) In-Reply-To: <4E521E35.4070504@ermes-cctv.com> References: <4E521E35.4070504@ermes-cctv.com> Message-ID: Arggh! Sorry, folks, but this person's mail system is apparently broken - it should not be sending these 'out of office' responses back to the mailing list.
I've turned off mailing list delivery to this person - at least until he fixes his mail system. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From matt at schuckmannacres.com Mon Aug 22 09:12:23 2011 From: matt at schuckmannacres.com (Matt Schuckmannn) Date: Mon, 22 Aug 2011 09:12:23 -0700 Subject: [Live-devel] Detect and report bandwidth problems In-Reply-To: <1313770172.6504.14.camel@ivan.semex> References: <1313770172.6504.14.camel@ivan.semex> Message-ID: <4E527FE7.1080801@schuckmannacres.com> Read the RTP/RTCP spec. (specifically the RTCP part). Then look at the Live555 server code. Matt S. On Friday, August 19, 2011 9:09:32 AM, Ivan Maldonado Zambrano wrote: > Hi all, > > In a previous mail I got this answer: > > ------------------------------------------------------------------------- > Live555 can't reduce the content, which is what determines the bit > rate; Live555 is just the mechanism for sending the content, so Live555 can't > do anything to change the bit rate. > Since Live555 implements RTSP and RTP, and those protocols have some > mechanisms to detect and report bandwidth problems, theoretically Live555 > could tell your application about bandwidth problems and your > application could do something to reduce the bit rate if possible, e.g. > a live camera encoder could dynamically reduce the bit rate or frame > rate or ... > > Matt S. > ------------------------------------------------------------------------- > > I was looking for how to detect bandwidth problems and I found that a > good way is by counting lost frames in my client-app. I mean, the > client-app will count the number of lost frames in a period of time and > send feedback to the server-app. In this way the server-app can vary the bitrate. > > My question is: can you suggest another mechanism to detect bandwidth > problems? Or, if one exists, how can a Live555 client detect a bandwidth > problem?
> > As a note: my server implements Live555 and my client is VLC. > > Regards and thanks in advance > Iván Maldonado Zambrano > > > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel From finlayson at live555.com Mon Aug 22 15:05:18 2011 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 22 Aug 2011 15:05:18 -0700 Subject: [Live-devel] Nested TaskScheduler::doEventLoop causes bug with In-Reply-To: <02083BFE82EC4F2F8BE6872E0F2215A1@vadhomepc> References: <02083BFE82EC4F2F8BE6872E0F2215A1@vadhomepc> Message-ID: <5E97D890-7247-40E9-AC9E-8CE52EEA6543@live555.com> Vadim, FYI, I have now released a new version (2011.08.22) of the "LIVE555 Streaming Media" code that should fix this problem. Please let me know if it works OK for you now. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From vad at vad.pp.ru Mon Aug 22 22:13:26 2011 From: vad at vad.pp.ru (Vadim Kosarev) Date: Tue, 23 Aug 2011 11:13:26 +0600 Subject: [Live-devel] Nested TaskScheduler::doEventLoop causes bug with In-Reply-To: <5E97D890-7247-40E9-AC9E-8CE52EEA6543@live555.com> Message-ID: <04BA7D8EC5B8434DB4C78509CACC619A@vadhomepc> Hello Ross! Yes, in version 2011.08.22 the bug has disappeared! Thank You very much! From mszeles at netavis.hu Tue Aug 23 06:25:22 2011 From: mszeles at netavis.hu (Miklos Szeles) Date: Tue, 23 Aug 2011 15:25:22 +0200 Subject: [Live-devel] RTSP multicast problem In-Reply-To: <4E4E1203.5020306@netavis.hu> References: <4E4D2152.5030208@netavis.hu> <0212F04A-3C56-4D87-81B0-CB12163FA5DD@live555.com> <4E4E1203.5020306@netavis.hu> Message-ID: <4E53AA42.4040108@netavis.hu> Hi, Can anybody help with this issue? We are using CentOS 5.5. Is there any patch available for CentOS which solves this issue? What other Linux distributions work correctly besides FreeBSD?
BR, Miklós 2011.08.19. 9:34 keltezéssel, Miklos Szeles írta: > Thanks for your answer Ross. Can you give some hints regarding the OS > kernel fixing? Where/what should we change to solve this issue by > patching the kernel? > Best regards, > Miklós > > 2011.08.18. 16:47 keltezéssel, Ross Finlayson írta: >> Unfortunately I don't know of a solution to this, other than >> 1/ Fixing the OS kernel, or >> 2/ Using a different OS (e.g. FreeBSD) that doesn't have the problem, or >> 3/ Changing at least one of the cameras to use a different port, or >> 4/ Running your two client applications on separate hosts. >> >> >> Ross Finlayson >> Live Networks, Inc. >> http://www.live555.com/ >> >> >> >> _______________________________________________ >> live-devel mailing list >> live-devel at lists.live555.com >> http://lists.live555.com/mailman/listinfo/live-devel > > > -- > Miklós Szeles > > NETAVIS Kft. > Web: www.netavis.net > Mail: mszeles at netavis.hu > Hűvösvölgyi út 54. > H-1021 Budapest > Hungary > Tel: +36 1 225 31 38 > Fax: +36 1 225 31 39 > > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -- Miklós Szeles NETAVIS Kft. Web: www.netavis.net Mail: mszeles at netavis.hu Hűvösvölgyi út 54. H-1021 Budapest Hungary Tel: +36 1 225 31 38 Fax: +36 1 225 31 39 -------------- next part -------------- An HTML attachment was scrubbed... URL: From gbonneau at miranda.com Tue Aug 23 20:15:34 2011 From: gbonneau at miranda.com (BONNEAU Guy) Date: Wed, 24 Aug 2011 03:15:34 +0000 Subject: [Live-devel] RTSP multicast problem In-Reply-To: <4E53AA42.4040108@netavis.hu> References: <4E4D2152.5030208@netavis.hu> <0212F04A-3C56-4D87-81B0-CB12163FA5DD@live555.com> <4E4E1203.5020306@netavis.hu> <4E53AA42.4040108@netavis.hu> Message-ID: <24665DDC0D7CF047BD6471A56E615EA631901EF2@CA-OPS-MAILBOX.miranda.com> We also had the same issue with live555 (as many people have!).
We had the issue with Linux only. We were greatly helped by the explanations at these 2 links: http://www.nmsl.cs.ucsb.edu/MulticastSocketsBook/ http://stackoverflow.com/questions/2741936 Look at the mcreceive.c example from the first link, which shows how to solve the issue. Basically we made a minor change in the live555 library so that the setupDatagramSocket method in GroupsockHelper binds to the multicast group address on Linux. This solved the issue. You can also check this thread: http://lists.live555.com/pipermail/live-devel/2010-October/012630.html It will give you an explanation of why Ross thinks it is a bad idea to implement the above solution in the live555 library. I have included the patches that we used on the library released 2009 07 28; they kept compatibility with the library yet solved the issue. Unfortunately we haven't worked with the latest library yet. Hope this helps. Guy From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Miklos Szeles Sent: Tuesday, August 23, 2011 9:25 To: live-devel at ns.live555.com Subject: Re: [Live-devel] RTSP multicast problem Hi, Can anybody help with this issue? We are using CentOS 5.5. Is there any patch available for CentOS which solves this issue? What other Linux distributions work correctly besides FreeBSD? BR, Miklós 2011.08.19. 9:34 keltezéssel, Miklos Szeles írta: Thanks for your answer Ross. Can you give some hints regarding the OS kernel fixing? Where/what should we change to solve this issue by patching the kernel? Best regards, Miklós 2011.08.18. 16:47 keltezéssel, Ross Finlayson írta: Unfortunately I don't know of a solution to this, other than 1/ Fixing the OS kernel, or 2/ Using a different OS (e.g. FreeBSD) that doesn't have the problem, or 3/ Changing at least one of the cameras to use a different port, or 4/ Running your two client applications on separate hosts. Ross Finlayson Live Networks, Inc.
http://www.live555.com/ _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel -- Miklós Szeles NETAVIS Kft. Web: www.netavis.net Mail: mszeles at netavis.hu Hűvösvölgyi út 54. H-1021 Budapest Hungary Tel: +36 1 225 31 38 Fax: +36 1 225 31 39 _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel -- Miklós Szeles NETAVIS Kft. Web: www.netavis.net Mail: mszeles at netavis.hu Hűvösvölgyi út 54. H-1021 Budapest Hungary Tel: +36 1 225 31 38 Fax: +36 1 225 31 39 -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: GroupsockHelper.cpp.diff Type: application/octet-stream Size: 1136 bytes Desc: GroupsockHelper.cpp.diff URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: GroupsockHelper.hh.diff Type: application/octet-stream Size: 509 bytes Desc: GroupsockHelper.hh.diff URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: NetInterface.cpp.diff Type: application/octet-stream Size: 716 bytes Desc: NetInterface.cpp.diff URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: NetInterface.hh.diff Type: application/octet-stream Size: 489 bytes Desc: NetInterface.hh.diff URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: Groupsock.cpp.diff Type: application/octet-stream Size: 1601 bytes Desc: Groupsock.cpp.diff URL: -------------- next part -------------- A non-text attachment was scrubbed...
Name: Groupsock.hh.diff Type: application/octet-stream Size: 454 bytes Desc: Groupsock.hh.diff URL: From taylesworth at realitymobile.com Wed Aug 24 16:07:24 2011 From: taylesworth at realitymobile.com (Tom Aylesworth) Date: Wed, 24 Aug 2011 19:07:24 -0400 Subject: [Live-devel] Streaming from client to server Message-ID: I'm new to Live555 but hope to be able to use it as the basis for an RTSP client. Viewing a stream from our server using Live555 is very straightforward, thanks to the sample code. But I'd also like to be able to send a stream to the server. Our server is expecting an ANNOUNCE, followed by a SETUP, and then a RECORD to start receiving the stream from the client. I'm able to create an SDP, send it with an ANNOUNCE (using RTSPClient::sendAnnounceCommand), and then send SETUP and RECORD commands. But I'm unsure of the best way to set up the source for streaming the data. RTSPClient wants a MediaSubsession which appears to be designed for only receiving a stream. I can't find a way to attach my own custom source to a MediaSubsession; nor can I see a way to use a ServerMediaSubsession with an RTSPClient. Am I missing something, or is this correct? If it is correct, does anyone have a suggestion for the best way to do what I want? I'm thinking of subclassing RTSPClient to either allow me to use a ServerMediaSubsession or a new MediaSubsession class. Does that seem like a reasonable approach? Thanks in advance, Thomas Aylesworth From finlayson at live555.com Wed Aug 24 21:27:49 2011 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 24 Aug 2011 21:27:49 -0700 Subject: [Live-devel] Streaming from client to server In-Reply-To: References: Message-ID: <1CFFBEB7-4627-4C49-A917-6B5C498447B5@live555.com> > I'm new to Live555 but hope to be able to use it as the basis for an RTSP client. Viewing a stream from our server using Live555 is very straightforward, thanks to the sample code. But I'd also like to be able to send a stream to the server. 
Unfortunately this is something that hasn't been fully standardized, and is (IMHO) not a particularly good idea anyway. (IMHO, it's better for the server to 'pull' its input data from a data source, rather than having a data source 'push' the input data to the server.) However, we do have some legacy code in the "LIVE555 Streaming Media" source code that supports this 'push' model for one particular case: Streaming to Apple 'Darwin Streaming Servers'. Although we no longer support this code, you might find it useful. Note the "DarwinInjector" class, and the demo applications "testMPEG1or2AudioVideoToDarwin" and "testMPEG4VideoToDarwin", which illustrate how to use the "DarwinInjector" class. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From taylesworth at realitymobile.com Thu Aug 25 15:40:47 2011 From: taylesworth at realitymobile.com (Tom Aylesworth) Date: Thu, 25 Aug 2011 18:40:47 -0400 Subject: [Live-devel] Streaming from client to server In-Reply-To: References: Message-ID: <0D75E0E38A78ED40B5933CCBB98119F393D8373C7F@hermes.realitymobile.local> The DarwinInjector class was exactly what I was looking for. Thanks for the quick response. Just because I'm curious (and I recognize this goes beyond support of the Live555 library so no problem if you don't have time to respond), could you expand on why you think the push model is a bad idea, and what you would recommend instead? Thanks again for the help, and for a great library. Thomas Aylesworth ------------------------------ > I'm new to Live555 but hope to be able to use it as the basis for an RTSP client. Viewing a stream from our server using Live555 is very straightforward, thanks to the sample code. But I'd also like to be able to send a stream to the server. Unfortunately this is something that hasn't been fully standardized, and is (IMHO) not a particularly good idea anyway.
(IMHO, it's better for the server to 'pull' its input data from a data source, rather than having a data source 'push' the input data to the server.) However, we do have some legacy code in the "LIVE555 Streaming Media" source code that supports this 'push' model for one particular case: Streaming to Apple 'Darwin Streaming Servers'. Although we no longer support this code, you might find it useful. Note the "DarwinInjector" class, and the demo applications "testMPEG1or2AudioVideoToDarwin" and "testMPEG4VideoToDarwin", that illustrates how to use the "DarwinInjector" class. Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Thu Aug 25 21:07:51 2011 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 25 Aug 2011 21:07:51 -0700 Subject: [Live-devel] Streaming from client to server In-Reply-To: <0D75E0E38A78ED40B5933CCBB98119F393D8373C7F@hermes.realitymobile.local> References: <0D75E0E38A78ED40B5933CCBB98119F393D8373C7F@hermes.realitymobile.local> Message-ID: <7A8544C9-759F-4875-83CD-4E4D28FF5F34@live555.com> > Just because I'm curious (and I recognize this goes beyond support of the Live555 library so no problem if you don't have time to respond), could you expand on why you think the push model is a bad idea The biggest problem that I see with the 'push' model is that it complicates the server implementation, because the arrival of the data into the server is decoupled from the streaming of the data out of the server (i.e., to regular clients). I.e., an implementation of a server that supports the 'push' model will need to include some sort of buffering mechanism (to buffer the incoming data). Then, when streaming this data out to clients, the server will read - i.e., 'pull' - from this buffer. But because the server is, in reality, 'pulling' the data that it streams out to clients, then why not just have it 'pull' the data directly from the source? This is much easier to implement. 
(In fact, if the underlying OS supports a 'networked file' abstraction, then it's trivial to implement; the server can just continue to think that it's streaming from a file.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From alexander.stevens at uqconnect.edu.au Fri Aug 26 20:33:14 2011 From: alexander.stevens at uqconnect.edu.au (Alexander Stevens) Date: Sat, 27 Aug 2011 13:33:14 +1000 Subject: [Live-devel] Unusual H264VideoStreamDiscreteFramer Compile Error Message-ID: <1314415994.3890.13.camel@milky> Hey guys, I have an unusual compile error that "logically" shouldn't even occur. g++ -Wall -pedantic -g main.c -c g++ -Wall -pedantic -g v4l2_camera.c -c g++ -Wall -pedantic -g -I../UsageEnvironment/include -I../groupsock/include -I../liveMedia/include -I../BasicUsageEnvironment/include live555.cpp -c live555.cpp: In function 'void play()': live555.cpp:60:63: error: no matching function for call to 'RTPSink::startPlaying(H264VideoStreamDiscreteFramer&, void (&)(), RTPSink*&)' ../liveMedia/include/MediaSink.hh:34:11: note: candidate is: Boolean MediaSink::startPlaying(MediaSource&, void (*)(void*), void*) make: *** [live555] Error 1 So, essentially I have copied the testH264VideoStreamer.cpp test program, modified it to use my own custom DeviceSource called x264EncoderSource (this implementation could be why I'm getting the error, but I'm not too sure), and then changed the framer to H264VideoStreamDiscreteFramer instead of H264VideoStreamFramer. Otherwise, I haven't changed all too much of the underlying code. I have turned a few variables over to being global - dirty, I know, but I just want to get something functional before I start refining. What's even more unusual is that when I go back to H264VideoStreamFramer, the same compile error comes back up.
This is the bazaar repository link to my source code (hosted on launchpad): http://bazaar.launchpad.net/~alex-stevens/+junk/spyPanda/files/31 Cheers, Alex Stevens Mechatronic Engineering Student University of Queensland, Australia From finlayson at live555.com Sat Aug 27 17:28:43 2011 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 27 Aug 2011 17:28:43 -0700 Subject: [Live-devel] Unusual H264VideoStreamDiscreteFramer Compile Error In-Reply-To: <1314415994.3890.13.camel@milky> References: <1314415994.3890.13.camel@milky> Message-ID: <0B3BFCAE-FB7D-4967-AD45-C2D45139197F@live555.com> The problem is your "afterPlaying()" function (in "live555.cpp"). You defined it as void afterPlaying(void); It should be void afterPlaying(void*); Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From alexander.stevens at uqconnect.edu.au Sat Aug 27 19:00:56 2011 From: alexander.stevens at uqconnect.edu.au (Alexander Stevens) Date: Sun, 28 Aug 2011 12:00:56 +1000 Subject: [Live-devel] Unusual H264VideoStreamDiscreteFramer Compile Error In-Reply-To: <0B3BFCAE-FB7D-4967-AD45-C2D45139197F@live555.com> References: <1314415994.3890.13.camel@milky> <0B3BFCAE-FB7D-4967-AD45-C2D45139197F@live555.com> Message-ID: <1314496856.1747.21.camel@milky> Haha, yeah, I got a kind email from my supervisor pointing out my embarrassing little mistake... Cheers though Ross! Unfortunately, I've hit yet another snag... Even though I'm linking to the Live555 libraries, I'm receiving undefined reference errors galore. Still not sure why though... maybe someone can clarify? My program is located at the base directory of the live libraries and I am just pointing to each include directory using -I as shown below and -L for each library... My machine is running 64 bit Ubuntu 11.04 with G++ v4.5.2 and I have compiled the live555 libraries with the linux-64bit configuration.
My program is also including the BasicUsageEnvironment.hh, liveMedia.hh, GroupsockHelper.hh and FramedSource.hh (for my custom DeviceSource implementation). Once again, my source is here http://bazaar.launchpad.net/~alex-stevens/+junk/spyPanda/files/33 (Sorry for the massive post) g++ -Wall -pedantic -g main.c -c g++ -Wall -pedantic -g v4l2_camera.c -c g++ -Wall -pedantic -g -I../UsageEnvironment/include -I../groupsock/include -I../liveMedia/include -I../BasicUsageEnvironment/include -O2 live555.cpp -c g++ -Wall -pedantic -g -I../UsageEnvironment/include -I../groupsock/include -I../liveMedia/include -I../BasicUsageEnvironment/include -O2 x264EncoderSource.cpp -c g++ -Wall -pedantic -g -L../UsageEnvironment/libUsageEnvironment.a -L../groupsock/libgroupsock.a -L../liveMedia/libliveMedia.a -L../BasicUsageEnvironment/libBasicUsageEnvironment.a -lx264 -lv4l2 -lUsageEnvironment -lBasicUsageEnvironment -lliveMedia -lgroupsock -O2 *.o -o spyPanda live555.o: In function `play()': /home/alex/Documents/Thesis/Research/live/spyPanda/live555.cpp:56: undefined reference to `H264VideoStreamDiscreteFramer::createNew(UsageEnvironment&, FramedSource*)' live555.o: In function `afterPlaying(void*)': /home/alex/Documents/Thesis/Research/live/spyPanda/live555.cpp:44: undefined reference to `Medium::close(Medium*)' live555.o: In function `live555_thread(void*)': /home/alex/Documents/Thesis/Research/live/spyPanda/live555.cpp:66: undefined reference to `BasicTaskScheduler::createNew()' /home/alex/Documents/Thesis/Research/live/spyPanda/live555.cpp:67: undefined reference to `BasicUsageEnvironment::createNew(TaskScheduler&)' /home/alex/Documents/Thesis/Research/live/spyPanda/live555.cpp:71: undefined reference to `chooseRandomIPv4SSMAddress(UsageEnvironment&)' /home/alex/Documents/Thesis/Research/live/spyPanda/live555.cpp:77: undefined reference to `Port::Port(unsigned short)' /home/alex/Documents/Thesis/Research/live/spyPanda/live555.cpp:78: undefined reference to `Port::Port(unsigned 
short)' /home/alex/Documents/Thesis/Research/live/spyPanda/live555.cpp:80: undefined reference to `Groupsock::Groupsock(UsageEnvironment&, in_addr const&, Port, unsigned char)' /home/alex/Documents/Thesis/Research/live/spyPanda/live555.cpp:81: undefined reference to `Groupsock::multicastSendOnly()' /home/alex/Documents/Thesis/Research/live/spyPanda/live555.cpp:82: undefined reference to `Groupsock::Groupsock(UsageEnvironment&, in_addr const&, Port, unsigned char)' /home/alex/Documents/Thesis/Research/live/spyPanda/live555.cpp:83: undefined reference to `Groupsock::multicastSendOnly()' /home/alex/Documents/Thesis/Research/live/spyPanda/live555.cpp:86: undefined reference to `OutPacketBuffer::maxSize' /home/alex/Documents/Thesis/Research/live/spyPanda/live555.cpp:87: undefined reference to `H264VideoRTPSink::createNew(UsageEnvironment&, Groupsock*, unsigned char)' /home/alex/Documents/Thesis/Research/live/spyPanda/live555.cpp:98: undefined reference to `RTCPInstance::createNew(UsageEnvironment&, Groupsock*, unsigned int, unsigned char const*, RTPSink*, RTPSource const*, unsigned int)' /home/alex/Documents/Thesis/Research/live/spyPanda/live555.cpp:101: undefined reference to `Port::Port(unsigned short)' /home/alex/Documents/Thesis/Research/live/spyPanda/live555.cpp:101: undefined reference to `RTSPServer::createNew(UsageEnvironment&, Port, UserAuthenticationDatabase*, unsigned int)' /home/alex/Documents/Thesis/Research/live/spyPanda/live555.cpp:109: undefined reference to `ServerMediaSession::createNew(UsageEnvironment&, char const*, char const*, char const*, unsigned int, char const*)' /home/alex/Documents/Thesis/Research/live/spyPanda/live555.cpp:110: undefined reference to `PassiveServerMediaSubsession::createNew(RTPSink&, RTCPInstance*)' /home/alex/Documents/Thesis/Research/live/spyPanda/live555.cpp:110: undefined reference to `ServerMediaSession::addSubsession(ServerMediaSubsession*)' /home/alex/Documents/Thesis/Research/live/spyPanda/live555.cpp:111: undefined 
reference to `RTSPServer::addServerMediaSession(ServerMediaSession*)' /home/alex/Documents/Thesis/Research/live/spyPanda/live555.cpp:113: undefined reference to `RTSPServer::rtspURL(ServerMediaSession const*, int) const' /home/alex/Documents/Thesis/Research/live/spyPanda/live555.cpp:82: undefined reference to `Groupsock::~Groupsock()' /home/alex/Documents/Thesis/Research/live/spyPanda/live555.cpp:80: undefined reference to `Groupsock::~Groupsock()' /home/alex/Documents/Thesis/Research/live/spyPanda/live555.cpp:82: undefined reference to `Groupsock::~Groupsock()' /home/alex/Documents/Thesis/Research/live/spyPanda/live555.cpp:80: undefined reference to `Groupsock::~Groupsock()' live555.o: In function `__static_initialization_and_destruction_0': /home/alex/Documents/Thesis/Research/live/spyPanda/../BasicUsageEnvironment/include/DelayQueue.hh:114: undefined reference to `DELAY_SECOND' /home/alex/Documents/Thesis/Research/live/spyPanda/../BasicUsageEnvironment/include/DelayQueue.hh:114: undefined reference to `operator*(short, DelayInterval const&)' /home/alex/Documents/Thesis/Research/live/spyPanda/../BasicUsageEnvironment/include/DelayQueue.hh:115: undefined reference to `operator*(short, DelayInterval const&)' /home/alex/Documents/Thesis/Research/live/spyPanda/../BasicUsageEnvironment/include/DelayQueue.hh:116: undefined reference to `operator*(short, DelayInterval const&)' live555.o: In function `play()': /home/alex/Documents/Thesis/Research/live/spyPanda/live555.cpp:60: undefined reference to `MediaSink::startPlaying(MediaSource&, void (*)(void*), void*)' x264EncoderSource.o: In function `~x264EncoderSource': /home/alex/Documents/Thesis/Research/live/spyPanda/x264EncoderSource.cpp:49: undefined reference to `FramedSource::~FramedSource()' x264EncoderSource.o: In function `x264EncoderSource': /home/alex/Documents/Thesis/Research/live/spyPanda/x264EncoderSource.cpp:40: undefined reference to `FramedSource::FramedSource(UsageEnvironment&)' 
/home/alex/Documents/Thesis/Research/live/spyPanda/x264EncoderSource.cpp:45: undefined reference to `x264EncoderSource::errorHandler(void*)' /home/alex/Documents/Thesis/Research/live/spyPanda/x264EncoderSource.cpp:40: undefined reference to `FramedSource::~FramedSource()' x264EncoderSource.o: In function `x264EncoderSource::doGetNextFrame()': /home/alex/Documents/Thesis/Research/live/spyPanda/x264EncoderSource.cpp:54: undefined reference to `FramedSource::handleClosure(void*)' x264EncoderSource.o: In function `~x264EncoderSource': /home/alex/Documents/Thesis/Research/live/spyPanda/x264EncoderSource.cpp:49: undefined reference to `FramedSource::~FramedSource()' x264EncoderSource.o: In function `x264EncoderSource::deliverFrame()': /home/alex/Documents/Thesis/Research/live/spyPanda/x264EncoderSource.cpp:85: undefined reference to `FramedSource::afterGetting(FramedSource*)' x264EncoderSource.o:(.rodata._ZTV17x264EncoderSource[vtable for x264EncoderSource]+0x10): undefined reference to `MediaSource::isSource() const' x264EncoderSource.o:(.rodata._ZTV17x264EncoderSource[vtable for x264EncoderSource]+0x18): undefined reference to `Medium::isSink() const' x264EncoderSource.o:(.rodata._ZTV17x264EncoderSource[vtable for x264EncoderSource]+0x20): undefined reference to `Medium::isRTCPInstance() const' x264EncoderSource.o:(.rodata._ZTV17x264EncoderSource[vtable for x264EncoderSource]+0x28): undefined reference to `Medium::isRTSPClient() const' x264EncoderSource.o:(.rodata._ZTV17x264EncoderSource[vtable for x264EncoderSource]+0x30): undefined reference to `Medium::isRTSPServer() const' x264EncoderSource.o:(.rodata._ZTV17x264EncoderSource[vtable for x264EncoderSource]+0x38): undefined reference to `Medium::isMediaSession() const' x264EncoderSource.o:(.rodata._ZTV17x264EncoderSource[vtable for x264EncoderSource]+0x40): undefined reference to `Medium::isServerMediaSession() const' x264EncoderSource.o:(.rodata._ZTV17x264EncoderSource[vtable for x264EncoderSource]+0x48): undefined 
reference to `Medium::isDarwinInjector() const' x264EncoderSource.o:(.rodata._ZTV17x264EncoderSource[vtable for x264EncoderSource]+0x60): undefined reference to `MediaSource::getAttributes() const' x264EncoderSource.o:(.rodata._ZTV17x264EncoderSource[vtable for x264EncoderSource]+0x68): undefined reference to `MediaSource::MIMEtype() const' x264EncoderSource.o:(.rodata._ZTV17x264EncoderSource[vtable for x264EncoderSource]+0x70): undefined reference to `FramedSource::isFramedSource() const' x264EncoderSource.o:(.rodata._ZTV17x264EncoderSource[vtable for x264EncoderSource]+0x78): undefined reference to `MediaSource::isRTPSource() const' x264EncoderSource.o:(.rodata._ZTV17x264EncoderSource[vtable for x264EncoderSource]+0x80): undefined reference to `MediaSource::isMPEG1or2VideoStreamFramer() const' x264EncoderSource.o:(.rodata._ZTV17x264EncoderSource[vtable for x264EncoderSource]+0x88): undefined reference to `MediaSource::isMPEG4VideoStreamFramer() const' x264EncoderSource.o:(.rodata._ZTV17x264EncoderSource[vtable for x264EncoderSource]+0x90): undefined reference to `MediaSource::isH264VideoStreamFramer() const' x264EncoderSource.o:(.rodata._ZTV17x264EncoderSource[vtable for x264EncoderSource]+0x98): undefined reference to `MediaSource::isDVVideoStreamFramer() const' x264EncoderSource.o:(.rodata._ZTV17x264EncoderSource[vtable for x264EncoderSource]+0xa0): undefined reference to `MediaSource::isJPEGVideoSource() const' x264EncoderSource.o:(.rodata._ZTV17x264EncoderSource[vtable for x264EncoderSource]+0xa8): undefined reference to `MediaSource::isAMRAudioSource() const' x264EncoderSource.o:(.rodata._ZTV17x264EncoderSource[vtable for x264EncoderSource]+0xb0): undefined reference to `FramedSource::maxFrameSize() const' x264EncoderSource.o:(.rodata._ZTV17x264EncoderSource[vtable for x264EncoderSource]+0xc0): undefined reference to `FramedSource::doStopGettingFrames()' x264EncoderSource.o:(.rodata._ZTI17x264EncoderSource[typeinfo for x264EncoderSource]+0x10): undefined 
reference to `typeinfo for FramedSource' collect2: ld returned 1 exit status make: *** [all] Error 1 On Sat, 2011-08-27 at 17:28 -0700, Ross Finlayson wrote: > The problem is your "afterPlaying()" function (in "live555.cpp"). You > defined it as > void afterPlaying(void); > It should be > void afterPlaying(void*); > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel From simon.mal at xl.wp.pl Sat Aug 27 02:34:27 2011 From: simon.mal at xl.wp.pl (Szymon Malewski) Date: Sat, 27 Aug 2011 11:34:27 +0200 Subject: [Live-devel] Server moved to 64bit - packet loss In-Reply-To: <4e58b1a168fea2.53331364@wp.pl> References: <4e58b1a168fea2.53331364@wp.pl> Message-ID: <4e58ba233c04f6.15724432@wp.pl> Hi, I'm using Live555 for my client-server system. It's using my own payload type, but with SimpleRTPSink and SimpleRTPSource. At earlier stages I provide frames of 1.2 MB. Bitrate is very high (about 480Mbps) - I'm testing it in a private network with no cross traffic. I've increased the system buffers (in Ubuntu 11.04 Natty Narwhal) and called "increaseSendBufferTo()" in the server and "increaseReceiveBufferTo()" in the client. It all worked fine. Then I moved the server to another machine and I'm experiencing severe packet loss (60-70%) on the client side. I'm not familiar with statistics in Live555; for a quick check I've added packet counters in GroupsockHelper.cpp in writeSocket and readSocket. The client receives about 1/3 of sent packets. Lowering the framerate didn't help much. On the other hand, tcpstat shows correct network traffic on both sides. The main difference I see between these two machines is that the new one is 64-bit. So it seems that the client doesn't get packets from the system when the server is 64-bit, while it gets them when the server is 32-bit. When the server and client are running on the same machine it works in all cases.
Has anyone experienced such strange behaviour? Is there anything in the live555 library that might cause this (e.g. some options on socket initialization)? Or what system configuration should I check and compare? Thank you, Szymon From finlayson at live555.com Sat Aug 27 22:52:11 2011 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 27 Aug 2011 22:52:11 -0700 Subject: [Live-devel] Server moved to 64bit - packet loss In-Reply-To: <4e58ba233c04f6.15724432@wp.pl> References: <4e58b1a168fea2.53331364@wp.pl> <4e58ba233c04f6.15724432@wp.pl> Message-ID: > Then I moved the server to another machine and I'm experiencing severe packet loss (60-70%) on the client side. > I'm not familiar with the statistics in Live555; for a quick check I've added packet counters in GroupsockHelper.cpp in writeSocket and readSocket. The client receives about 1/3 of the sent packets. Lowering the framerate didn't help much. > On the other hand, tcpstat shows correct network traffic on both sides. Well, if the *only* difference between the two setups is that the server is running on a different machine, then it should be easy to see - by looking at the network traffic - what's different about the new server's packets that's causing packet loss at the client. I suspect, however, that this is not the only difference between the two setups. Are you sure that the new server is not running on a different LAN, or that it's not behind a firewall or a different router? Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed...
URL: From alexander.stevens at uqconnect.edu.au Sun Aug 28 21:15:07 2011 From: alexander.stevens at uqconnect.edu.au (Alexander Stevens) Date: Mon, 29 Aug 2011 14:15:07 +1000 Subject: [Live-devel] Unusual H264VideoStreamDiscreteFramer Compile Error In-Reply-To: <1314496856.1747.21.camel@milky> References: <1314415994.3890.13.camel@milky> <0B3BFCAE-FB7D-4967-AD45-C2D45139197F@live555.com> <1314496856.1747.21.camel@milky> Message-ID: <1314591307.1830.10.camel@milky> Alright, so after a bit of tinkering, and a more in-depth look into how Live555 compiles its own test programs... I've resolved my problem by using the compile structure set out like this... It should resolve any undefined-reference errors for Live555. LIVE_INCLUDES=-I../UsageEnvironment/include -I../groupsock/include -I../liveMedia/include -I../BasicUsageEnvironment/include LIVE_LIBS=../liveMedia/libliveMedia.a ../groupsock/libgroupsock.a ../BasicUsageEnvironment/libBasicUsageEnvironment.a ../UsageEnvironment/libUsageEnvironment.a g++ $(LIVE_INCLUDES) yourProgramSource.c -c g++ -o yourProgramName -L. yourProgramSource.o $(LIVE_LIBS) Sorry to those on whom I've inflicted many facepalms. Haha. -------------- next part -------------- An HTML attachment was scrubbed... URL: From yangwudi3110 at gmail.com Sun Aug 28 23:09:30 2011 From: yangwudi3110 at gmail.com (=?GB2312?B?0e6zrA==?=) Date: Mon, 29 Aug 2011 14:09:30 +0800 Subject: [Live-devel] how to stream TS file to STB via Raw UDP Message-ID: I have just read the FAQ, and it mentioned that "*STBs typically handle only MPEG Transport Streams, not Elementary Streams or other data formats. (Because MPEG Transport Streams contain their own timestamps, they can be streamed via raw-UDP, with RTP timestamps not being required.)* ". Then what should I do to achieve this purpose? And if I want to use "testOnDemandRTSPServer.cpp" as a template, will I have to modify many related files? -------------- next part -------------- An HTML attachment was scrubbed...
URL: From finlayson at live555.com Sun Aug 28 23:49:46 2011 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 28 Aug 2011 23:49:46 -0700 Subject: [Live-devel] how to stream TS file to STB via Raw UDP In-Reply-To: References: Message-ID: > I have just read the FAQ, and it mentioned that "STBs typically handle only MPEG Transport Streams, not Elementary Streams or other data formats. (Because MPEG Transport Streams contain their own timestamps, they can be streamed via raw-UDP, with RTP timestamps not being required.) ". Then what should I do to achieve this purpose? It's the RTSP client - not the server - that chooses whether or not the Transport Stream data is to be sent via RTP-over-UDP (the standard), or via raw-UDP. If your client requests raw-UDP streaming, and does so in a way that our server understands (because, as I noted in the FAQ, there is no one standard way to request this), then our server will stream via raw-UDP. If, however, it requests RTP-over-UDP streaming (the standard), then our server will stream the data that way. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From tech at digiton.ru Mon Aug 29 01:53:33 2011 From: tech at digiton.ru (Dmitriy Vasil'ev) Date: Mon, 29 Aug 2011 12:53:33 +0400 Subject: [Live-devel] bug in the mp3 receiver Message-ID: <9EA36DA1B13E4BA3B1F7740A18FCA6BC@tech> Hello, I have found a bug in testMP3Receiver. I am testing the pair testMP3Streamer and testMP3Receiver. The receiver stops receiving sound if I restart testMP3Streamer. For example: I have a file, test.mp3: 128 kbit/sec, 44100 Hz, stereo, 1 hour long. Start the receiver: testMP3Receiver > ttt.mp3 Start the streamer. After 10 min I stop the streamer (Ctrl+C) and start it again. After another 5 min I stop the streamer (Ctrl+C) and start it again. After another 5 min I stop the streamer (Ctrl+C) and start it again. Stop the receiver. Stop the streamer.
Check ttt.mp3. This sound file is only 10 min long! I tested on Windows and Linux: the same result. What can I check? Can you fix this bug? Best regards, Dmitriy -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Aug 29 02:19:48 2011 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 29 Aug 2011 02:19:48 -0700 Subject: [Live-devel] bug in the mp3 receiver In-Reply-To: <9EA36DA1B13E4BA3B1F7740A18FCA6BC@tech> References: <9EA36DA1B13E4BA3B1F7740A18FCA6BC@tech> Message-ID: <688F18DD-9664-467A-B454-16771EA425CE@live555.com> I'm not sure why this is happening; it seems to arise from the fact that both the streamer and receiver applications are running on the same host. For some reason, after the streamer application is killed, the OS is disabling the receiver's reception of the multicast group. Again, I don't understand why this is happening, and I don't know what - if anything - can be done in our code to prevent it. One way to prevent it, though, is to run "testMP3Receiver" on a different computer (but on the same LAN) as "testMP3Streamer". Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From tech at digiton.ru Mon Aug 29 02:44:50 2011 From: tech at digiton.ru (Dmitriy Vasil'ev) Date: Mon, 29 Aug 2011 13:44:50 +0400 Subject: [Live-devel] bug in the mp3 receiver In-Reply-To: <688F18DD-9664-467A-B454-16771EA425CE@live555.com> References: <9EA36DA1B13E4BA3B1F7740A18FCA6BC@tech> <688F18DD-9664-467A-B454-16771EA425CE@live555.com> Message-ID: <74016DB8091B4F6DBC5A9845892F7F40@tech> Hello Ross, I have done this test on separate computers and on the same host. The same result.
Best regards, Dmitriy From: Ross Finlayson Sent: Monday, August 29, 2011 1:19 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] bug in the mp3 receiver I'm not sure why this is happening; it seems to arise from the fact that both the streamer and receiver applications are running on the same host. For some reason, after the streamer application is killed, the OS is (for some reason) disabling the receiver's reception of the multicast group. Again, I don't understand why this is happening, and I don't know what - if anything - can be done in our code to prevent it. One way to prevent it, though, is to run "testMP3Receiver" on a different computer (but on the same LAN) as "testMP3Streamer". Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------------------------------------------------------------------------- _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Aug 29 11:17:53 2011 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 29 Aug 2011 11:17:53 -0700 Subject: [Live-devel] bug in the mp3 receiver In-Reply-To: <74016DB8091B4F6DBC5A9845892F7F40@tech> References: <9EA36DA1B13E4BA3B1F7740A18FCA6BC@tech> <688F18DD-9664-467A-B454-16771EA425CE@live555.com> <74016DB8091B4F6DBC5A9845892F7F40@tech> Message-ID: <0F8258F0-E68C-451C-9ED9-0D5D93A26B80@live555.com> OK, the problem is that the receiver is expecting a single, continuous RTP stream. When you stop, then restart the streamer, there's a 50% chance that the next RTP sequence numbers (which start out at a randomly-chosen value) will be less than the previous ones. If that happens, the new incoming RTP packets will be rejected. This is not a bug. You are simply not using the applications in the way that they were intended. 
Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From ken at starseedsoft.com Mon Aug 29 17:53:07 2011 From: ken at starseedsoft.com (Ken Dunne) Date: Mon, 29 Aug 2011 17:53:07 -0700 (PDT) Subject: [Live-devel] how to stream both H264 video and AAC audio ? Message-ID: <1314665587.57844.YahooMailNeo@web1211.biz.mail.gq1.yahoo.com> Hi; I've gotten H264 streaming to work, thanks to live555 and old posts to this list. Now, I am trying to add audio to the same stream. I've been reading through old posts in this list, and have gathered that to have the audio and video taken together, I must add both to the same ServerMediaSession. I've also read, for AAC, that I need to make a class (AACSMS) based on 'ADTSAudioFileServerMediaSubsession' and provide the guts of 'createNewStreamSource' and 'createNewRTPSink', which I have done. I have made 'AACSMS' inherit from 'OnDemandServerMediaSubsession' since it is a live AAC source. I have also created a class that acquires the live frames, 'AACLiveSource', a subclass of FramedSource, which is created within 'createNewStreamSource'. My question is how to put this all together correctly. I have had to make AACSMS's two 'create' members public in order to do this, so I'm wondering if I've missed something in how to set this up. Also, is it OK for both audio and video to use the same 'Groupsock'? Thanks for your clarifications!
-------------------------
Groupsock rtpGroupsock(*env, destinationAddress, rtpPort, ttl);
RTSPServer* rtspServer = RTSPServer::createNew(*env, 8554);
ServerMediaSession* sms = ServerMediaSession::createNew(*env, "testStream", "Streaming live H264", "xxx", True);
// Video subsession: (works, no problems)
MediaSource* videoSource = H264VideoStreamDiscreteFramer::createNew(*env, videoRawBuffer);
RTPSink* videoSink = H264VideoRTPSink::createNew(*env, &rtpGroupsock, 96);
ServerMediaSubsession* subsessionVideo = PassiveServerMediaSubsession::createNew(*videoSink, rtcp);
sms->addSubsession(subsessionVideo);
// Audio subsession:
// for audio that is linked to this video, add a second subsession
AACSMS* subsessionAudio = AACSMS::createNew(*env, true);
// the AACLiveSource instance:
FramedSource* audioSource = subsessionAudio->createNewStreamSource(1, estBitrate);
RTPSink* audioSink = subsessionAudio->createNewRTPSink(&rtpGroupsock, 97, audioSource);
sms->addSubsession(subsessionAudio);
rtspServer->addServerMediaSession(sms);
-------------------------
From freefis at gmail.com Mon Aug 29 21:55:43 2011 From: freefis at gmail.com (free.wang) Date: Tue, 30 Aug 2011 12:55:43 +0800 Subject: [Live-devel] can i upload stream to the server In-Reply-To: References: Message-ID: Hi, every developer. I am new to this group. I am having trouble with the LIVE555 Media Server: can I upload a stream to the server? Thanks a lot. :) -- The Crankiness of Belief achieves Great, not the Trick of Regulation. -------------- next part -------------- An HTML attachment was scrubbed... URL: From Jasleen at beesys.com Mon Aug 29 22:16:14 2011 From: Jasleen at beesys.com (Jasleen Kaur) Date: Tue, 30 Aug 2011 05:16:14 +0000 Subject: [Live-devel] building live555 in VS 2010 In-Reply-To: References: Message-ID: I have been trying to build the make files using Visual Studio 10. I am facing problems while running nmake on the .mak files. I have attached a screenshot of the problem. It says 'Fatal Error U1073: don't know how to make 'include/livemedia_version.hh'' on building livemedia.mak. Please guide. Best Regards Jasleen Kaur -------------- next part -------------- An HTML attachment was scrubbed...
URL: From finlayson at live555.com Tue Aug 30 00:51:51 2011 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 30 Aug 2011 00:51:51 -0700 Subject: [Live-devel] how to stream both H264 video and AAC audio ? In-Reply-To: <1314665587.57844.YahooMailNeo@web1211.biz.mail.gq1.yahoo.com> References: <1314665587.57844.YahooMailNeo@web1211.biz.mail.gq1.yahoo.com> Message-ID: <62600B29-19DD-402A-B2EC-B3AC315664DA@live555.com> > I've gotten H264 streaming to work, thanks to live555 and old posts to this list. > Now, I am trying to add audio to the same stream. > > I've been reading through old posts in this list, and have gathered that to have the audio and video taken together, I must add both to the same ServerMediaSession. Yes. (See, for example, how the "testOnDemandRTSPServer" demo application implements the streaming of audio+video substreams from a MPEG Program Stream: "testProgs/testOnDemandRTSPServer.cpp", lines 96-111.) > I've also read, for AAC, that I need to make a class (AACSMS) based on 'ADTSAudioFileServerMediaSubsession' and provide the guts of 'createNewStreamSource' and 'createNewRTPSink', which I have done. I have made 'AACSMS' inherit from 'OnDemandServerMediaSubsession' since it is a live AAC source. I have also created a class that acquires the live frames, 'AACLiveSource', a subclass of FramedSource, which is created within 'createNewStreamSource'. > > My question is how to put this all together correctly. I have had to make AACSMS's two 'create' members public in order to do this Actually, it's sufficient to make them "protected:", because they're called only by their ancestor class "OnDemandServerMediaSubsession". > , so I'm wondering if I've missed something in how to set this up. Yes - note my comments below! Once you've sorted this all out, I suggest that you begin by testing your server with our "openRTSP" command-line client. Another 'gotcha' that you'll face will be time synchronization.
It's essential that when you set the "fPresentationTime" values for both audio and video (i.e., when you implement the two "doGetNextFrame()" virtual functions), that you set them to accurate values that are synchronized with each other. > Also, is it OK for both Audio and Video to use the same 'GroupSock' ? No! They must use different "Groupsock" objects. But you shouldn't be caring about this, because the "Groupsock" objects are created automatically from the "OnDemandServerMediaSubsession" code. You don't need to deal with them yourself. I.e., your code should *not* be creating "Groupsock" objects, and it should *not* be calling "createNewStreamSource()" and "createNewRTPSink()". This is done automatically by "OnDemandServerMediaSubsession". All you need to do is define those two functions, not call them. Again, I suggest looking at the "testOnDemandRTSPServer" code for guidance. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From yangwudi3110 at gmail.com Tue Aug 30 01:22:12 2011 From: yangwudi3110 at gmail.com (=?GB2312?B?0e6zrA==?=) Date: Tue, 30 Aug 2011 16:22:12 +0800 Subject: [Live-devel] how to stream TS file to STB via Raw UDP Message-ID: >Thanks for your reply. I did some tests just now. >Server: ./testOnDemandRTSPServer >Client: vlc media player >The default way is RTSP & RTP/UDP, that's OK. And I notice it can also use Tunneling. >So I use port 8000 as "RTSP-over-HTTP tunneling". I compare the two processes, and found > when use tunneling, the OPTION packet is like this: >"TCP OPTIONS rtsp://192.168.10.152:8000/mpeg2TransportStreamTest"; > And when transferring, the sizeof single UDP packet is 1328 (size of default RTP/UDP packet is 1336). 1336 - 8 (RTP head) = 1328. Does that means it really uses RTSP-over-TCP and transfers content via RAW-UDP? 
From finlayson at live555.com Tue Aug 30 02:01:53 2011 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 30 Aug 2011 02:01:53 -0700 Subject: [Live-devel] can i upload stream to the server In-Reply-To: References: Message-ID: > i got trouble with live555 media server. > > can i upload stream to the server? No, the "LIVE555 Media Server" application currently supports streaming from files only. So, you can't use it to stream from a remote data source, unless you have a way of accessing that data source as a file. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Aug 30 02:07:18 2011 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 30 Aug 2011 02:07:18 -0700 Subject: [Live-devel] how to stream TS file to STB via Raw UDP In-Reply-To: References: Message-ID: <63B2EA89-B051-46C5-A229-73544FBCE118@live555.com> If you use "RTSP-over-HTTP" tunneling, then RTSP commands and responses, and RTP packets, and RTCP packets, are *all* carried over HTTP (and thus TCP), not UDP. If you don't use "RTSP-over-HTTP" tunneling, then RTSP commands and responses are carried over HTTP (and thus TCP), but RTP packets and RTCP packets are carried over UDP. This is the default behavior. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From jshanab at smartwire.com Tue Aug 30 07:21:39 2011 From: jshanab at smartwire.com (Jeff Shanab) Date: Tue, 30 Aug 2011 14:21:39 +0000 Subject: [Live-devel] Streaming from client to server In-Reply-To: <7A8544C9-759F-4875-83CD-4E4D28FF5F34@live555.com> References: <0D75E0E38A78ED40B5933CCBB98119F393D8373C7F@hermes.realitymobile.local> <7A8544C9-759F-4875-83CD-4E4D28FF5F34@live555.com> Message-ID: <615FD77639372542BF647F5EBAA2DBC20B167175@IL-BOL-EXCH01.smartwire.com> Thanks for this thread, I am also dealing with rtsp push but quite a different animal. Traditional pull requires ports open thru firewalls and static IP's (or dyndns) A push (or confiscated connection from initial push) is one way to stream from behind a firewall without setting up port forwarding. From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Thursday, August 25, 2011 11:08 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Streaming from client to server Just because I'm curious (and I recognize this goes beyond support of the Live555 library so no problem if you don't have time to respond), could you expand on why you think the push model is a bad idea The biggest problem that I see with the 'push' model is that it complicates the server implementation, because the arrival of the data into the server is decoupled from the streaming of the data out of the server (i.e., to regular clients). I.e., an implementation of a server that supports the 'push' model will need to include some sort of buffering mechanism (to buffer the incoming data). Then, when streaming this data out to clients, the server will read - i.e., 'pull' - from this buffer. But because the server is, in reality, 'pulling' the data that it streams out to clients, then why not just have it 'pull' the data directly from the source? This is much easier to implement. 
(In fact, if the underlying OS supports a 'networked file' abstraction, then it's trivial to implement; the server can just continue to think that it's streaming from a file.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From taylesworth at realitymobile.com Tue Aug 30 08:05:22 2011 From: taylesworth at realitymobile.com (Tom Aylesworth) Date: Tue, 30 Aug 2011 11:05:22 -0400 Subject: [Live-devel] Trouble streaming JPEG Message-ID: I'm using the DarwinInjector class to stream JPEG to our server. (Yes, I know -- we ultimately don't plan to use JPEG but it lets us get the framework in place quickly.) I have no trouble sending JPEG frames of ~10K (plus or minus 2K) but when sending higher resolution frames (480x360) of ~40K we see a lot of image corruption starting about halfway through the image that looks very much like what I'd expect from a buffer being overwritten before its prior contents had been sent. Strangely, this doesn't appear related to frame rate. If I limit my video source to only produce a frame per second, I get the same results. I've attached my JpegCaptureDeviceSource class which provides the JPEGVideoSource implementation for my video source. It's heavily based on the ElphelJPEGDeviceSource sample but since our video source doesn't provide a pull mechanism, I added two new public methods for it to provide data to JpegCaptureDeviceSource. The gotAFrame() method provides the image data and JPEG quality. The close() method tells it that the source won't provide more data. It's using the Live 555 event mechanism to trigger our task scheduler to deliver the frame. The only other change from Elphel is that we need to pull the quantization tables from the JPEG header. Our video capture code (which we already had and works great in our existing product) runs on its own thread, captures an image in JPEG format and calls JpegCaptureDeviceSource::gotAFrame(). 
It also passes in a frameObj which contains state data about the frame. It then waits and doesn't provide another frame until JpegCaptureDeviceSource::deliverFrameToClient() calls doneWithFrame() with that same frameObj. This makes me somewhat confident that there are no buffer overwrites on our end. By time we start capturing another frame, we've already copied the image data into the fTo buffer in FramedSource. Here's the code that sets up the DarwinInjector, source, and sink. (If the syntax looks strange it's because it's Objective-C++ which we are using as a thin interface layer between our Objective-C application and the Live555 C++ code.) Obviously, this runs on its own thread. // create Darwin injector DarwinInjector * injector = DarwinInjector::createNew(*self.rtspEnvironment, progName, verbosityLevel); // create video sink struct in_addr dummyDestAddress; dummyDestAddress.s_addr = 0; Groupsock rtpGroupsockVideo(*self.rtspEnvironment, dummyDestAddress, 0, 0); Groupsock rtcpGroupsockVideo(*self.rtspEnvironment, dummyDestAddress, 0, 0); self.videoSink = JPEGVideoRTPSink::createNew(*self.rtspEnvironment, &rtpGroupsockVideo); // create video source self.videoSource = JpegCaptureDeviceSource::createNew(*self.rtspEnvironment); if (self.videoSource == NULL) { *self.rtspEnvironment << "Unable to open video source: " << self.rtspEnvironment->getResultMsg() << "\n"; [self shutdownWithExitCode:1]; } *self.rtspEnvironment << "Starting video capture device...\n"; self.videoSink->startPlaying(*self.videoSource, afterPlaying, self.videoSink); // create rtcp channel const unsigned estimatedSessionBandwidthVideo = 2000; // in kbps; for RTCP b/w share const unsigned maxCNAMElen = 100; unsigned char CNAME[maxCNAMElen+1]; gethostname((char*)CNAME, maxCNAMElen); CNAME[maxCNAMElen] = '\0'; // just in case RTCPInstance* videoRTCP = RTCPInstance::createNew(*self.rtspEnvironment, &rtcpGroupsockVideo, estimatedSessionBandwidthVideo, CNAME, videoSink, NULL /* we're a server */); // 
connect rtp and rtcp channels to Darwin injector injector->addStream(videoSink, videoRTCP); // connect to server and start streaming data if (!injector->setDestination([self.rtspServer UTF8String], [self.rtspFile UTF8String], "Cannonball", "Cannonball live stream")) { *self.rtspEnvironment << "injector->setDestination() failed: " << self.rtspEnvironment->getResultMsg() << "\n"; [self shutdownWithExitCode:1]; return; } *self.rtspEnvironment << "Play this stream (from the Darwin Streaming Server) using the URL:\n" << "\trtsp://" << [self.rtspServer UTF8String] << "/" << [self.rtspFile UTF8String] << "\n"; // never returns self.rtspEnvironment->taskScheduler().doEventLoop(); Any thoughts about what might be causing the issue we are seeing with streaming larger images? Is there something I need to configure to handle larger images, maybe on the sink (RTP) side? Thanks, Thomas Aylesworth -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: JpegCaptureDeviceSource.cpp Type: application/octet-stream Size: 6097 bytes Desc: JpegCaptureDeviceSource.cpp URL: -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: JpegCaptureDeviceSource.hh Type: application/octet-stream Size: 1897 bytes Desc: JpegCaptureDeviceSource.hh URL: -------------- next part -------------- An HTML attachment was scrubbed... URL: From michael.hasenburger at cable.vol.at Wed Aug 31 02:03:55 2011 From: michael.hasenburger at cable.vol.at (michael.hasenburger at cable.vol.at) Date: Wed, 31 Aug 2011 11:03:55 +0200 Subject: [Live-devel] 'Stuttering' TS video transcoded by ffmpeg Message-ID: Hello, I have problems with H.264/AAC transport streams transcoded by FFmpeg. The playback isn't smoothly by RTSP server and VLC as client. Other TS files f.e. 
transcoded by VLC are OK, but I'd like to use ffmpeg directly. If I play the TS file locally or via HTTP Apple streaming, then playback is fine. I uploaded an example TS: https://rapidshare.com/files/1763103204/mp4.zip FFmpeg is the newest revision from Git. I use the following command for transcoding: ffmpeg -i mp2.ts -y -re -isync -vcodec libx264 -r 25 -b 600k -vprofile baseline -preset fast -tune grain -level 1.2 -acodec libfaac -ab 64k -ac 2 -f mpegts mp4.ts I hope you can help me. Thank you! BR, Mike From finlayson at live555.com Wed Aug 31 02:29:28 2011 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 31 Aug 2011 02:29:28 -0700 Subject: [Live-devel] 'Stuttering' TS video transcoded by ffmpeg In-Reply-To: References: Message-ID: Our streaming code has trouble with Transport Stream files that are extremely VBR (variable bit rate). (For files like this, it has trouble figuring out the right times to send the output packets so as to avoid both 'buffer underflow' and 'buffer overflow' (or packet loss) at the receiver.) Someday we may improve our code to handle such files better, but for now, your best option is to try to encode your Transport Stream files so as to not make them excessively VBR. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From peppino_in_casa at hotmail.com Wed Aug 31 08:02:11 2011 From: peppino_in_casa at hotmail.com (Peppino Incasa) Date: Wed, 31 Aug 2011 15:02:11 +0000 Subject: [Live-devel] Mythtv and IPTV over HTTP Message-ID: Hi, I want to use mythtv to see IPTV over HTTP. How can I edit the source of LiveMedia for to do this? Thanks, Peppino. -------------- next part -------------- An HTML attachment was scrubbed...
URL: From finlayson at live555.com Wed Aug 31 09:50:31 2011 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 31 Aug 2011 09:50:31 -0700 Subject: [Live-devel] Mythtv and IPTV over HTTP In-Reply-To: References: Message-ID: <6A40A77E-CF71-42B3-8139-9F2B5D75A9A9@live555.com> On Aug 31, 2011, at 8:02 AM, Peppino Incasa wrote: > Hi, > I want to use mythtv to see IPTV over HTTP. > How can I edit the source of LiveMedia for to do this? You don't 'edit' the source at all - just as you don't edit the source of any other source-code library that you use. If you want to extend the functionality of our library, you do so via C++ subclassing - i.e., by writing new code of your own; not by modifying the existing code. But in any case, I don't think your question is relevant to this mailing list, because the only (client) HTTP streaming that our library supports is for RTSP-over-HTTP, which is already fully implemented. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From dekarl at spaetfruehstuecken.org Wed Aug 31 10:38:16 2011 From: dekarl at spaetfruehstuecken.org (Karl Dietz) Date: Wed, 31 Aug 2011 19:38:16 +0200 Subject: [Live-devel] Mythtv and IPTV over HTTP In-Reply-To: References: Message-ID: <4E5E7188.2020502@spaetfruehstuecken.org> Hello Peppino, > I want to use mythtv to see IPTV over HTTP. You should ask that over at the MythTV lists at mythtv-users at mythtv.org Might be good to know what you mean by "IPTV over HTTP". Can you add a link to a stream to your message to the mythtv list? > How can I edit the source of LiveMedia for to do this? What Ross said. 
Regards, Karl From warren at etr-usa.com Wed Aug 31 12:00:49 2011 From: warren at etr-usa.com (Warren Young) Date: Wed, 31 Aug 2011 13:00:49 -0600 Subject: [Live-devel] 'Stuttering' TS video transcoded by ffmpeg In-Reply-To: References: Message-ID: <4E5E84E1.4090808@etr-usa.com> On 8/31/2011 3:29 AM, Ross Finlayson wrote: > > Someday we may improve our code to handle such files better, but for > now, your best option is to try to encode your Transport Stream files so > as to not make them excessively VBR. The -muxrate parameter to ffmpeg is supposed to make it null-stuff the streams to bring them up to CBR, or near-CBR. I haven't tried it.