From christophe at lemoine-fr.com Tue Feb 1 01:12:18 2011 From: christophe at lemoine-fr.com (Christophe Lemoine) Date: Tue, 01 Feb 2011 11:12:18 +0200 Subject: [Live-devel] TS file with high VBR In-Reply-To: References: Message-ID: <4D47CE72.3090105@lemoine-fr.com> Hi Ross, Thanks for spending time on this issue. I understand that this is a major change and it has to be considered carefully. I could not find a good tuning that works with very high-VBR TS files, so I ended up using CBR files for now. Not only does this create very big files (generating a CBR file is not easy, and I have to keep a very high margin to make sure that all frames fit), but it also increases the network usage. I was thinking of writing a new media subsession and a new framer that would read the file ahead (it would work only for file streaming), but I found it quite hard to just extend the existing classes (MPEG2TransportStreamFramer, MPEG2TransportFileServerMediaSubsession and DynamicRTSPServer), as many methods are private instead of protected: is this intentional? BR Christophe On 01/29/2011 02:43 AM, Ross Finlayson wrote: >>> - Or: use the index file to store current frame BR (so use >>> the actual BR of a frame instead of an average based on past frames) >> >> Yes, for Transport Stream files for which we have pre-computed an >> index file, we do - in principle - have enough 'future' information >> (using PCR values) from which we could compute a more accurate >> 'duration' for each outgoing Transport Stream packet. In the past I >> had considered modifying "MPEG2TransportStreamFramer" to use this >> information, but didn't (because such a solution would work only for >> Transport Stream files for which we have index files, whereas the >> existing code works for any Transport Stream data). 
>> >> However, because - as you noted - excessively VBR streams are more >> likely to be a problem with H.264 Transport Streams, I think I'll >> take another look at this, to see if we can use the index file - when >> present - to generate more accurate Transport Stream 'durations'. > > As promised, I looked into whether it would be feasible to do this > (use the index file to compute more accurate 'durations' for each > outgoing RTP packet, when streaming an indexed Transport Stream file). > (When streaming from a non-indexed file, or a live stream, the > existing code would continue to be used.) I came up with a solution > that would work, but decided not to check it in, because I'm not > convinced that it would be a good idea, for several reasons: > 1/ My solution changes the format of the index file (to add a new > 'duration per transport packet' field). Old index files would no > longer work, which means that everyone would have to reindex all of > their Transport Stream files. (Also, index files would become about > 36% bigger.) > 2/ While streaming a Transport Stream file, the server would also > need to read the whole of its corresponding index file, leading to an > almost 10% overhead in file system reads. > 3/ Although we would now know more accurate 'durations' for each > outgoing packet, we would still need to adjust these using some 'fudge > factors' (tuning parameters), because otherwise we'd often end up > seeing 'buffer underflow' at the receivers (due to network jitter, for > example). (So the new scheme would not eliminate the 'fudge factors' > present in the current scheme.) > 4/ For highly VBR files, this new solution - by computing more > accurate 'durations' for each outgoing packet - would end up > generating an output network packet stream that's more bursty than the > current scheme (which works by computing a weighted average > duration). This would potentially lead to more network congestion > (and possibly more packet loss). 
> > Therefore, my current thinking is that it's probably best to retain > the existing scheme, even though its tuning parameters ('fudge > factors') sometimes need to be tweaked in special circumstances. (I > could potentially still be convinced otherwise, though. But right now > I don't plan to change the existing "MPEG2TransportStreamFramer" > implementation.) From finlayson at live555.com Tue Feb 1 01:54:52 2011 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 1 Feb 2011 01:54:52 -0800 Subject: [Live-devel] TS file with high VBR In-Reply-To: <4D47CE72.3090105@lemoine-fr.com> References: <4D47CE72.3090105@lemoine-fr.com> Message-ID: >I found it quite hard to just extend existing classes >(MPEG2TransportStreamFramer, MPEG2TransportFileServerMediaSubsession >and DynamicRTSPServer) as many methods are private instead of >protected: is this intentional ? Usually not; I just don't like declaring variables as "protected:" unless I see an immediate need for it. As always, though, if you have specific suggestions for variables that you'd like to see changed from "private:" to "protected:", just let us know. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From michalk at awpi.com Tue Feb 1 06:44:24 2011 From: michalk at awpi.com (Brian Michalk) Date: Tue, 01 Feb 2011 08:44:24 -0600 Subject: [Live-devel] Simple Filter leading to a CODEC In-Reply-To: References: <4D474044.4030806@awpi.com> Message-ID: <4D481C48.406@awpi.com> This is a brand new codec, so I'll need to specify an RTP payload for that. It's for low bitrate high packet loss networks, and works off of single frame encoding, because a dropped iFrame is not desirable. On 01/31/2011 08:38 PM, Ross Finlayson wrote: > You're getting a little ahead of yourself here. First, you need to > tell us what codec you wish to stream (via RTP). This will tell us > whether or not there is a RTP payload format defined for that codec, > and whether or not we already implement that RTP payload format. 
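[Editorial note: whatever the payload format turns out to be, the codec itself will need raw per-picture access first. Purely as a self-contained illustration - a hypothetical helper, not LIVE555 code - this is the shape of a luma-only transform on an I420 frame, the kind of per-frame processing such an encoder would do before any RTP packetization.]

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// An I420 frame stores the full-resolution Y (luma) plane first, followed
// by the quarter-resolution U and V (chroma) planes. A codec experiment
// that only touches luma can therefore operate on the first width*height
// bytes and leave the rest alone.
static void dimLumaPlane(std::vector<uint8_t>& i420Frame,
                         size_t width, size_t height) {
    const size_t lumaBytes = width * height;  // Y plane only
    if (i420Frame.size() < lumaBytes) return; // malformed frame: do nothing
    for (size_t i = 0; i < lumaBytes; ++i) {
        i420Frame[i] = static_cast<uint8_t>(i420Frame[i] / 2); // halve brightness
    }
    // Chroma bytes at i420Frame[lumaBytes..] are deliberately untouched.
}
```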
From finlayson at live555.com Tue Feb 1 08:05:48 2011 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 1 Feb 2011 08:05:48 -0800 Subject: [Live-devel] Simple Filter leading to a CODEC In-Reply-To: <4D481C48.406@awpi.com> References: <4D474044.4030806@awpi.com> <4D481C48.406@awpi.com> Message-ID: >This is a brand new codec, so I'll need to specify an RTP payload >for that. It's for low bitrate high packet loss networks, and works >off of single frame encoding, because a dropped iFrame is not >desirable. OK, let us know when you've submitted an Internet Draft that describes your proposed RTP payload format, and then I'll help you implement it using the "LIVE555 Streaming Media" libraries. (This implementation would be (at least) a subclass of "MultiFramedRTPSink" (for sending RTP), and a subclass of "MultiFramedRTPSource" (for receiving RTP).) (It's going to be interesting to see how you come up with a single-frame video encoding that's also low-bitrate, because those two usually don't go together.) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From michalk at awpi.com Tue Feb 1 08:25:16 2011 From: michalk at awpi.com (Brian Michalk) Date: Tue, 01 Feb 2011 10:25:16 -0600 Subject: [Live-devel] Simple Filter leading to a CODEC In-Reply-To: References: <4D474044.4030806@awpi.com> <4D481C48.406@awpi.com> Message-ID: <4D4833EC.6060001@awpi.com> On 2/1/2011 10:05 AM, Ross Finlayson wrote: >> This is a brand new codec, so I'll need to specify an RTP payload for >> that. It's for low bitrate high packet loss networks, and works off >> of single frame encoding, because a dropped iFrame is not desirable. > > OK, let us know when you've submitted an Internet Draft that describes > your proposed RTP payload format, and then I'll help you implement it > using the "LIVE555 Streaming Media" libraries. 
(This implementation > would be (at least) a subclass of "MultiFramedRTPSink" (for sending > RTP), and a subclass of "MultiFramedRTPSource" (for receiving RTP).) > > (It's going to be interesting to see how you come up with a > single-frame video encoding that's also low-bitrate, because those two > usually don't go together.) Thanks. Actually, my problem right now is not with RTP at all, it's with how to access the picture frames to implement the codec. I was wanting to know the "best" way to go about tinkering with the video. My thought was to take the MPEG2 sources and modify those, but perhaps there is a better way? That's why I laid out the simple goal of modifying the Y plane. Once I get there, I am confident that the rest will be fairly straightforward. From finlayson at live555.com Tue Feb 1 08:34:43 2011 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 1 Feb 2011 08:34:43 -0800 Subject: [Live-devel] Simple Filter leading to a CODEC In-Reply-To: <4D4833EC.6060001@awpi.com> References: <4D474044.4030806@awpi.com> <4D481C48.406@awpi.com> <4D4833EC.6060001@awpi.com> Message-ID: >Thanks. Actually, my problem right now is not with RTP at all, it's >with how to access the picture frames to implement the codec. If you want to do this using our library, then you would write your own 'filter' class - i.e., a subclass of "FramedFilter" - for encoding each frame. In particular, your "FramedFilter" subclass would implement the virtual function "doGetNextFrame()". I suggest that you start by taking a look at our implementation of the "uLawFromPCMAudioSource" class, because that's the closest thing that we have in our library to a 'codec'. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From edi87 at fibertel.com.ar Tue Feb 1 09:48:31 2011 From: edi87 at fibertel.com.ar (edi87 at fibertel.com.ar) Date: Tue, 01 Feb 2011 14:48:31 -0300 Subject: [Live-devel] testOnDemandRTSPServer always stream audio/MPA and video/MPV ? 
Message-ID: <47d820c6fd6.4d481d3f@fibertel.com.ar> Ok, I tried and found a way to stream this as I need. I simply do:

{
  char const* streamName = "MPEGTEST";
  char const* v_inputFileName = "test_2.m4e";
  char const* a_inputFileName = "test_2.aac";
  ServerMediaSession* sms = ServerMediaSession::createNew(*env, streamName, streamName, descriptionString);
  sms->addSubsession(MPEG4VideoFileServerMediaSubsession::createNew(*env, v_inputFileName, reuseFirstSource));
  sms->addSubsession(ADTSAudioFileServerMediaSubsession::createNew(*env, a_inputFileName, reuseFirstSource));
  rtspServer->addServerMediaSession(sms);
  announceStream(rtspServer, sms, streamName, v_inputFileName);
}

And I have a stream that has MPEG4-GENERIC and MP4V-ES. The problem I have now is the duration... I found that in both subsessions the duration is 0, and I did not find code to retrieve the duration of MPEG4Video and ADTSAudio. Isn't it implemented yet, or am I doing something wrong? Thanks! ----- Original message ----- From: Date: Monday, January 31, 2011 12:53 pm Subject: Re: [Live-devel] testOnDemandRTSPServer always stream audio/MPA and video/MPV ? > I see, so I'm not able to modify the rtspserver to use > MPEG4ES and MPEG4GENERIC? > > I mean to take testMPEG4ESVideoTest and aacAudioTest to make one > stream that openRTSP can convert to .mov? > > > > ----- Original message ----- > From: Ross Finlayson > Date: Monday, January 31, 2011 12:17 pm > Subject: Re: [Live-devel] testOnDemandRTSPServer always stream > audio/MPA and video/MPV ? > > > >So, is there any way to stream a MPEG4+AAC file with live555? > > > > No, we currently do not have code for demultiplexing a ".mp4" or > > ".mov"-format file. > > -- > > > > Ross Finlayson > > Live Networks, Inc. 
> > http://www.live555.com/ > > _______________________________________________ > > live-devel mailing list > > live-devel at lists.live555.com > > http://lists.live555.com/mailman/listinfo/live-devel > > > From dli at Cernium.com Tue Feb 1 10:00:28 2011 From: dli at Cernium.com (Dunling Li) Date: Tue, 1 Feb 2011 13:00:28 -0500 Subject: [Live-devel] iPhone CAN'T play the TS files generated by testH264VideoToTransportStream In-Reply-To: References: <03A2FA9E0D3DC841992E682BF528771802C9A7D1@lipwig.Cernium.local> Message-ID: <03A2FA9E0D3DC841992E682BF528771802C9A8D0@lipwig.Cernium.local> Hi Ross, Thanks for your useful guide. Here is my situation: We use HTTP live streaming to send TS files to the iPhone. Our server is lighttpd. We split a video clip into multiple 4-second H.264 files, then convert them to TS files on the server for streaming. We have a simple H.264-to-TS convertor (not sufficient) which allows us to see video clips on the iPhone. The iPhone uses the mobile QuickTime video player. I sent you one of our H.264 video clips before; we stream 5 or 6 similar files at once. Our goal is to use the live555 library to replace our current H.264-to-TS convertor. We converted those H.264 files to TS files using testH264VideoToTransportStream; the TS files play fine in VLC and Media Player. However, when we manually put the converted TS files on our lighttpd server for streaming, the iPhone cannot display any TS video clips from testH264VideoToTransportStream. Can you advise me how to fix it? Many thanks Dunling P.S. Probably the HTTP live streaming validator adds too much distraction here. The HTTP live streaming validator is a development tool on Mac; the link below shows some details: http://developer.apple.com/library/ios/#technotes/tn2010/tn2235.html. I am new to Mac development. I haven't found any explanation of the posted errors from the validator in http://developer.apple.com/devforums/. 
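[Editorial note: one structural detail matters for any splitting strategy like the one described above. A Transport Stream is a sequence of fixed 188-byte packets, each starting with the sync byte 0x47, and segment cuts must fall on those boundaries. A minimal self-contained check - a hypothetical helper, not part of LIVE555 or lighttpd:]

```cpp
#include <cstddef>
#include <cstdint>

static const size_t TS_PACKET_SIZE = 188;  // fixed MPEG-2 TS packet length
static const uint8_t TS_SYNC_BYTE = 0x47;  // first byte of every TS packet

// A valid segment cut point must be packet-aligned, lie inside the data,
// and begin a packet carrying the sync byte.
static bool isValidCutPoint(const uint8_t* data, size_t len, size_t offset) {
    if (offset % TS_PACKET_SIZE != 0) return false;
    if (offset >= len) return false;
    return data[offset] == TS_SYNC_BYTE;
}
```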
-----Original Message----- From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Monday, January 31, 2011 8:29 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] iPhone CAN'T play the TS files generated by testH264VideoToTransportStream >I use testH264VideoToTransportStream to generate some TS files. None >of them can be played on the iPhone. The live streaming validator shows >the following error: > >ERROR: Invalid media segment: The validator helper exited due to a >fatal error: segment duration is not finite. > >Can anyone point me how to fix this problem? That seems unlikely, considering that: 1/ You didn't provide (or post a link to) a single ".264" file that illustrates your problem. 2/ You didn't say what software (on the iPhone) you are using to try to play the resulting TS file. 3/ Perhaps you meant to say that you are trying to *stream* the TS file to the iPhone (rather than trying to play the file on the iPhone)? If that's the case, you didn't say what software you are using as a server (to stream your file). 4/ You didn't say what "the live streaming validator" is. 5/ You didn't say what "the live streaming validator"'s documentation (or online support forums or mailing lists) had to say about the meaning of the error message that you saw. -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel From finlayson at live555.com Tue Feb 1 17:06:11 2011 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 1 Feb 2011 17:06:11 -0800 Subject: [Live-devel] iPhone CAN'T play the TS files generated by testH264VideoToTransportStream In-Reply-To: <03A2FA9E0D3DC841992E682BF528771802C9A8D0@lipwig.Cernium.local> References: <03A2FA9E0D3DC841992E682BF528771802C9A7D1@lipwig.Cernium.local> <03A2FA9E0D3DC841992E682BF528771802C9A8D0@lipwig.Cernium.local> Message-ID: >We converted those H.264 files to TS files using >testH264VideoToTransportStream; the TS files play fine in VLC and Media >Player. However, when we manually put the converted TS files on our lighttpd >server for streaming, the iPhone cannot display any TS video clips from >testH264VideoToTransportStream. Can you advise me how to fix it? No, because I don't know what the problem is. How could I possibly know why the iPhone is not playing your streams? I suggest that you ask an appropriate Apple QuickTime Streaming mailing list what - if anything - might be wrong with the Transport Stream files (you'll need to give them an example) that might be making the iPhone unhappy. The only thing I can suggest is that instead of starting with multiple 4-second H.264 files, and converting each of those to Transport Stream files, you might be better off converting a single (entire) H.264 file into a Transport Stream file, and then splitting the resulting Transport Stream file into 4-second components. That might make the iPhone happier. But who knows... -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From korpal28 at gmail.com Tue Feb 1 02:48:00 2011 From: korpal28 at gmail.com (korpal) Date: Tue, 1 Feb 2011 11:48:00 +0100 Subject: [Live-devel] RTSP H264 frame inversion In-Reply-To: References: Message-ID: > > Thanks for this fast answer. 
For me, it's correct that VLC cannot read the >> file; the file I sent you contains only NAL packets (see attached screenshot >> with details), and so is not readable by VLC - correct me if I'm wrong. >> > > I don't know why your file is not readable by VLC. You'll need to ask a > VLC mailing list about that. > OK, I will check with the VLC team why I cannot open my file. > About DTS/PTS, is it possible that the presentation time is wrong? In >> regard to my code, I never give a presentation time from my device source >> (the encoder, in fact). But in debugging, I saw that a presentation time is computed >> by the Live555 stream framer. Maybe I forgot to code something? >> > > Yes, your input device object should be setting the "fPresentationTime" > field in its implementation of "doGetNextFrame()", when it delivers each NAL > unit. Concerning this point, I have a misunderstanding about how to set this value. I'll explain: my frames arrive in decoding order, which means that for an IDR period like "IBBP", I get "IPBB". So, if each time I process a "doGetNextFrame()" I set fPresentationTime to the value given by gettimeofday(), I clearly see that my video is displayed in decoding order. So, do I have to adapt my fPresentationTime to give a P-frame a later timestamp than the B-frames that depend on it? In my mind, this would be done with a picture timing value given in the SEI, but when I check the Live555 code, I just see an "empty" analyze_sei_data() function with the comment "Later, perhaps adjust "fPresentationTime" if we saw a "pic_timing" SEI payload??? #####" Regards, Stéphane -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Tue Feb 1 20:45:01 2011 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 1 Feb 2011 20:45:01 -0800 Subject: [Live-devel] RTSP H264 frame inversion In-Reply-To: References: Message-ID: >Yes, your input device object should be setting the >"fPresentationTime" field in its implementation of >"doGetNextFrame()", when it delivers each NAL unit. > > >Concerning this point, I've a misunderstanding about how set this >value. I'll explain : my frames are coming in a decoding order, that >means that for an IDR period like "IBBP", I've "IPBB". So, if each >time I proceed a "doGetNextFrame()" I set the fPresentationTime to >the value given by gettimeofday(), I see clearly that my video is >displayed in decoding order. So, do I've to adapt my >fPresentationTime to give a later timestamp for PFrames that >dependent BFrames? The last "Informative Note" in section 5.1 of RFC 3984 suggests that yes, you should do this - i.e., if you have "B" frames, the presentation times should not be monotonically increasing. >In my mind, this will be do with a picture timing value given in >SEI, but if I check the Live555 code, I just see an "empty" >analyze_sei_data() function with the comment "Later, perhaps adjust >"fPresentationTime" if we saw a "pic_timing" SEI payload??? #####" That's because - when I coded "H264VideoStreamFramer" - I had yet to see a stream that had any "pic_timing" SEI payloads. However, once you've generated such a stream, as a ".264" (and verified that VLC plays it properly, when renamed as ".h264"), please send me a copy, and I'll update our framer implementation accordingly. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
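[Editorial note: to make the decode-order vs. display-order point above concrete - with B-frames, NAL units arrive in decoding order (e.g. "IPBB" for a group displayed as "IBBP"), so a presentation time derived from each frame's display position, rather than from gettimeofday() at arrival, is necessarily non-monotonic in send order, just as the RFC 3984 note anticipates. A self-contained sketch with hypothetical helper names, assuming display positions are known from the encoder:]

```cpp
#include <cstdint>
#include <vector>

// displayIndex[i] is the display position of the i-th frame in DECODE order.
// For decode order I P B B displayed as I B B P, displayIndex is {0, 3, 1, 2}.
static std::vector<int64_t> presentationTimesUs(
        const std::vector<int>& displayIndex,
        int64_t baseUs, int64_t frameDurationUs) {
    std::vector<int64_t> pts;
    for (int d : displayIndex) {
        // The timestamp reflects WHEN the frame is shown, not when it is sent.
        pts.push_back(baseUs + d * frameDurationUs);
    }
    return pts;
}
```

For a 25 fps stream (40000 us per frame), the P-frame sent second carries a later timestamp than the two B-frames sent after it.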
URL: From finlayson at live555.com Tue Feb 1 20:55:31 2011 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 1 Feb 2011 20:55:31 -0800 Subject: [Live-devel] testOnDemandRTSPServer always stream audio/MPA and video/MPV? In-Reply-To: <47d820c6fd6.4d481d3f@fibertel.com.ar> References: <47d820c6fd6.4d481d3f@fibertel.com.ar> Message-ID: > >sms->addSubsession(MPEG4VideoFileServerMediaSubsession::createNew(*env, >v_inputFileName, reuseFirstSource)); > >sms->addSubsession(ADTSAudioFileServerMediaSubsession::createNew(*env, >a_inputFileName, reuseFirstSource)); [...] >And I have a stream that have MPEG4-GENERIC and MP4V-ES. > >The problem I got now is the duration... I found that in both >subsessiions the duration is 0... and I did not found code to >retrieve duration of MPEG4Video and ADTSAudio. No, the "fDurationInMicroseconds" field is set in both cases. For "MPEG4VideoFileServerMediaSubsession", it is set in "MPEG4VideoStreamFramer" (actually, in its parent class "MPEGVideoStreamFramer"). For "ADTSAudioFileServerMediaSubsession" it is set in "ADTSAudioFileSource". This is a 'wild goose chase'; the frame durations should be set properly. (You can verify this by noting the "durationInMicroseconds" parameter to each call to "MultiFramedRTPSink::afterGettingFrame()", for each of your two "RTPSink"s.) (As always, this assumes that you have not modified the supplied source code. If you have made *any* modification to the supplied source code, you can't expect any support on this mailing list.) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From edi87 at fibertel.com.ar Wed Feb 2 08:19:47 2011 From: edi87 at fibertel.com.ar (Jonathan Granade) Date: Wed, 02 Feb 2011 13:19:47 -0300 Subject: [Live-devel] testOnDemandRTSPServer always stream audio/MPA and video/MPV? 
In-Reply-To: References: <47d820c6fd6.4d481d3f@fibertel.com.ar> Message-ID: <4D498423.1040709@fibertel.com.ar> I learned that I should not modify the source code, Ross ;) I did not modify any part of the code; for any change I need, I make subclasses. I want to finish this, make a test, and send it here; maybe you can add it to the lib, or maybe it will be useful for somebody. I need to check fDurationInMicroseconds; I did not notice that before... The problem I have now is a different one... Remember I was dealing with an infinite video loop? I want to stream a 15-second video+audio file in an infinite loop, so the stream will be seeking back to the start over and over... What I did was create a new object (subclass), OnDemandContinuousServerMediaSubsession, and add a task to the scheduler that is called every ~1 second and compares the current stream progress with the video duration. When progress >= duration I do a 'loop' on the stream, something like (sorry, I don't have the code here):

pauseStream
seekStream (to the abs byte 0)
startPlaying

That works fine when I stream an MPEG1or2* file, but when I stream using the way I just mentioned (MPEG4 + ADTS), it does not work. Do you have any idea why this happens? Thanks in advance On 02/02/2011 01:55 AM, Ross Finlayson wrote: >> >> sms->addSubsession(MPEG4VideoFileServerMediaSubsession::createNew(*env, v_inputFileName, >> reuseFirstSource)); >> >> sms->addSubsession(ADTSAudioFileServerMediaSubsession::createNew(*env, >> a_inputFileName, reuseFirstSource)); > > [...] > >> And I have a stream that has MPEG4-GENERIC and MP4V-ES. >> >> The problem I got now is the duration... I found that in both >> subsessions the duration is 0... and I did not find code to retrieve >> the duration of MPEG4Video and ADTSAudio. > > No, the "fDurationInMicroseconds" field is set in both cases. For > "MPEG4VideoFileServerMediaSubsession", it is set in > "MPEG4VideoStreamFramer" (actually, in its parent class > "MPEGVideoStreamFramer"). 
> For "ADTSAudioFileServerMediaSubsession" it is > set in "ADTSAudioFileSource". > > This is a 'wild goose chase'; the frame durations should be set > properly. (You can verify this by noting the "durationInMicroseconds" > parameter to each call to "MultiFramedRTPSink::afterGettingFrame()", for > each of your two "RTPSink"s.) > > (As always, this assumes that you have not modified the supplied source > code. If you have made *any* modification to the supplied source code, > you can't expect any support on this mailing list.) From edi87 at fibertel.com.ar Wed Feb 2 15:49:09 2011 From: edi87 at fibertel.com.ar (Jonathan Granade) Date: Wed, 02 Feb 2011 20:49:09 -0300 Subject: [Live-devel] testOnDemandRTSPServer always stream audio/MPA and video/MPV? In-Reply-To: References: <47d820c6fd6.4d481d3f@fibertel.com.ar> Message-ID: <4D49ED75.8050204@fibertel.com.ar> Ross, I checked what you mentioned about the duration... I see that fDurationInMicroseconds is filled where you said, but the duration() method of OnDemandServerMediaSubsession always returns 0 for both streams, while if I stream an MPEG file with MPEG1or2* objects, the duration() method returns the duration of the file. Is there any way to get it? Thanks On 02/02/2011 01:55 AM, Ross Finlayson wrote: >> >> sms->addSubsession(MPEG4VideoFileServerMediaSubsession::createNew(*env, v_inputFileName, >> reuseFirstSource)); >> >> sms->addSubsession(ADTSAudioFileServerMediaSubsession::createNew(*env, >> a_inputFileName, reuseFirstSource)); > > [...] > >> And I have a stream that has MPEG4-GENERIC and MP4V-ES. >> >> The problem I got now is the duration... I found that in both >> subsessions the duration is 0... and I did not find code to retrieve >> the duration of MPEG4Video and ADTSAudio. > > No, the "fDurationInMicroseconds" field is set in both cases. For > "MPEG4VideoFileServerMediaSubsession", it is set in > "MPEG4VideoStreamFramer" (actually, in its parent class > "MPEGVideoStreamFramer"). 
For "ADTSAudioFileServerMediaSubsession" it is > set in "ADTSAudioFileSource". > > This is a 'wild goose chase'; the frame durations should be set > properly. (You can verify this by noting the "durationInMicroseconds" > parameter to each call to "MultiFramedRTPSink::afterGettingFrame()", for > each of your two "RTPSink"s.) > > (As always, this assumes that you have not modified the supplied source > code. If you have made *any* modification to the supplied source code, > you can't expect any support on this mailing list.) From finlayson at live555.com Wed Feb 2 22:56:13 2011 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 2 Feb 2011 22:56:13 -0800 Subject: [Live-devel] testOnDemandRTSPServer always stream audio/MPA and video/MPV? In-Reply-To: <4D498423.1040709@fibertel.com.ar> References: <47d820c6fd6.4d481d3f@fibertel.com.ar> <4D498423.1040709@fibertel.com.ar> Message-ID: >When progress >= duration I do a 'loop' on the stream, something >like (sorry i dont have the code here): > >pauseStream >seekStream (to the abs byte 0) >startPlaying > >That work fine when I stream a MPEG1or2* file, but when I stream >using the way I just mentioned (MPEG4 + ADTS), it does not work. > >Do you have any idea about why this happen? Yes - it's because we don't support server 'trick play' operations (which include 'seek') on these media types. See: http://www.live555.com/liveMedia/faq.html#trick-mode http://www.live555.com/mediaServer/#trick-play This is also why the "duration()" virtual function has not been implemented for these "*ServerMediaSubsession" classes (because, for example, we can't know the duration of a MPEG-4 video file unless we were to scan all the way through it). -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From edi87 at fibertel.com.ar Thu Feb 3 06:34:57 2011 From: edi87 at fibertel.com.ar (edi87 at fibertel.com.ar) Date: Thu, 03 Feb 2011 11:34:57 -0300 Subject: [Live-devel] testOnDemandRTSPServer always stream audio/MPA and video/MPV? 
Message-ID: <7bb69771447a.4d4a92e1@fibertel.com.ar> I see... I would need to add support for that as well. I think it should not be too difficult; if I'm not wrong, MPEG4 video duration and trick play support should be quite similar to MPEG1or2's. I'm not sure about ADTS; I would need to play with it... I will keep you updated about it. Thanks for your help, Ross! ----- Original message ----- From: Ross Finlayson Date: Thursday, February 3, 2011 3:56 am Subject: Re: [Live-devel] testOnDemandRTSPServer always stream audio/MPA and video/MPV? > >When progress >= duration I do a 'loop' on the stream, something > >like (sorry i dont have the code here): > > > >pauseStream > >seekStream (to the abs byte 0) > >startPlaying > > > >That work fine when I stream a MPEG1or2* file, but when I stream > >using the way I just mentioned (MPEG4 + ADTS), it does not work. > > > >Do you have any idea about why this happen? > > Yes - it's because we don't support server 'trick play' operations > (which include 'seek') on these media types. > > See: > http://www.live555.com/liveMedia/faq.html#trick-mode > http://www.live555.com/mediaServer/#trick-play > > This is also why the "duration()" virtual function has not been > implemented for these "*ServerMediaSubsession" classes (because, > for > example, we can't know the duration of a MPEG-4 video file unless > we > were to scan all the way through it). > -- > > Ross Finlayson > Live Networks, Inc. 
> http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > From sebastien-devel at celeos.eu Fri Feb 4 00:41:49 2011 From: sebastien-devel at celeos.eu (Sébastien Escudier) Date: Fri, 04 Feb 2011 09:41:49 +0100 Subject: [Live-devel] successful command but NULL result string Message-ID: <1296808909.29592.6.camel@stim-desktop> Hi, I have a case where the response to a "sendOptionsCommand" returns no error (result = 200), but the result string is NULL. I read RTSPClient.hh, and the description of the responseHandler isn't very clear about whether this is possible. Could you tell me if it's a bug in the library? I noticed you call strDup(NULL) in this case. The test URL is: rtsp://wmlive.bbc.net.uk/wms/england/lrnewcastle?bbc- Regards, Sébastien. From finlayson at live555.com Fri Feb 4 01:10:18 2011 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 4 Feb 2011 01:10:18 -0800 Subject: [Live-devel] successful command but NULL result string In-Reply-To: <1296808909.29592.6.camel@stim-desktop> References: <1296808909.29592.6.camel@stim-desktop> Message-ID: >I have a case where the response to a "sendOptionsCommand" returns no >error (result = 200), but the result string is NULL. >I read RTSPClient.hh, and the description of the responseHandler isn't >very clear about whether this is possible. >Could you tell me if it's a bug in the library? >I noticed you call strDup(NULL) in this case. > >The test URL is: >rtsp://wmlive.bbc.net.uk/wms/england/lrnewcastle?bbc-

% ./openRTSP -o 'rtsp://wmlive.bbc.net.uk/wms/england/lrnewcastle?bbc-'
Opening connection to 212.58.251.90, port 554...
...remote connection opened
Sending request: OPTIONS rtsp://wmlive.bbc.net.uk/wms/england/lrnewcastle?bbc- RTSP/1.0
CSeq: 2
User-Agent: ./openRTSP (LIVE555 Streaming Media v2011.01.24)

Received 277 new bytes of response data. 
Received a complete OPTIONS response: RTSP/1.0 200 OK Supported: com.microsoft.wm.srvppair, com.microsoft.wm.sswitch, com.microsoft.wm.eosmsg, com.microsoft.wm.fastcache, com.microsoft.wm.packetpairssrc, com.microsoft.wm.startupprofile Date: Fri, 04 Feb 2011 09:03:37 GMT CSeq: 2 Server: WMServer/9.1.1.5001 RTSP "OPTIONS" request returned: (NULL) No, this is not a bug. As you can see, the server is some brain-damaged non-standard Microsoft thing. It's not including a "Public:" header in its response to our RTSP "OPTIONS" command. So, as far as our client is concerned, this is a NULL response. (And "strDup(NULL)" is OK; I made sure that it just returns NULL in that case.) If you think that's bad, wait until you see the SDP description that this server returns in response to "DESCRIBE": v=0 o=- 201102040905570718 201102040905570718 IN IP4 127.0.0.1 s=BBC Newcastle c=IN IP4 0.0.0.0 b=AS:49 a=maxps:2261 t=0 0 a=control:rtsp://wmlive.bbc.net.uk/wms%5Cengland%5Clrnewcastle/ a=etag:{6CC7D408-8ED2-A217-F5F7-098DCE207CB1} a=range:npt=1.579-1.579 a=type:broadcast,playlist a=recvonly a=pgmpu:data:application/x-wms-contentdesc,8,language,31,0,,5,title,31,13,BBC%20Newcastle,6,author,31,13,BBC%20Newcastle,9,copyright,31,36,(c)%20British%20Broadcasting%20Corporation,35,WMS_CONTENT_DESCRIPTION_DESCRIPTION,31,0,,30,WMS_CONTENT_DESCRIPTION_RATING,31,0,,44,WMS_CONTENT_DESCRIPTION_SERVER_BRANDING_INFO,31,12,WMServer/9.1,51,WMS_CONTENT_DESCRIPTION_PLAYLIST_ENTRY_START_OFFSET,3,4,1579,47,WMS_CONTENT_DESCRIPTION_PLAYLIST_ENTRY_DURATION,3,1,0,58,WMS_CONTENT_DESCRIPTION_COPIED_METADATA_FROM_PLAYLIST_FILE,3,1,1,42,WMS_CONTENT_DESCRIPTION_PLAYLIST_ENTRY_URL,31,1,/%0D%0A 
a=pgmpu:data:application/vnd.ms.wms-hdr.asfv1;base64,MCaydY5mzxGm2QCqAGLObL8TAAAAAAAABwAAAAECMyaydY5mzxGm2QCqAGLObKgAAAAAAAAAHAAcAEoAAgACAEIAQgBDACAATgBlAHcAYwBhAHMAdABsAGUAAABCAEIAQwAgAE4AZQB3AGMAYQBzAHQAbABlAAAAKABjACkAIABCAHIAaQB0AGkAcwBoACAAQgByAG8AYQBkAGMAYQBzAHQAaQBuAGcAIABDAG8AcgBwAG8AcgBhAHQAaQBvAG4AAAAAAAAAznX4e41G0RGNggBgl8misiAAAAAAAAAAAQABAAa+AACh3KuMR6nPEY7kAMAMIFNlaAAAAAAAAAALoLIVMSJeSYbP9ao+6bYb8RMAAAAAAACQcvDWScTLAf////8AAAAAAAAAAAAAAAAAAAAAAAAAACsGAAAAAAAACQAAANUIAADVCAAABr4AALUDv18uqc8RjuMAwAwgU2WxEAAAAAAAABHS06u6qc8RjuYAwAwgU2UGAIMQAACpRkN84O/8S7IpOT7eQVyFJwAAAAAAAAABAAxlAG4ALQBnAGIAAADLpeYUcsYyQ4OZqWlSBltaWAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAmLsAACsGAAAAAAAAmLsAACsGAAAAAAAAtggAAAAAAAABAAAAAAAAAAAAAAAAAAAAXYvxJoRF7EefXw5lHwRSyRoAAAAAAAAAAgHqy/jFr1t3SIRnqoxE+kzKegAAAAAAAAACAAAAAQAMAAIAAgAAAEkAcwBWAEIAUgAAAAAAAAABADQAAAAGAAAARABlAHYAaQBjAGUAQwBvAG4AZgBvAHIAbQBhAG4AYwBlAFQAZQBtAHAAbABhAHQAZQAAAEwAMgAAAHTUBhjfyglFpLqaq8uWquhwDwAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA 
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA 
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA 
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAACRB9y3t6nPEY7mAMAMIFNlcgAAAAAAAABAnmn4TVvPEaj9AIBfXEQrUM3Dv49hzxGLsgCqALTiIAAAAAAAAAAAHAAAAAgAAAABAABizmxhAQEARKwAAHMXAAC2CBAACgAAiAAADwDZIgAAAbYItggBAABApNDSB+PSEZfwAKDJXqhQpAAAAAAAAAADABwAVwBNAEYAUwBEAEsAVgBlAHIAcwBpAG8AbgAAAAAAHAAxADAALgAwADAALgAwADAALgA0ADAAMAA3AAAAGgBXAE0ARgBTAEQASwBOAGUAZQBkAGUAZAAAAAAAFgAwAC4AMAAuADAALgAwADAAMAAwAAAADABJAHMAVgBCAFIAAAACAAQAAAAAAEBS0YYdMdARo6QAoMkDSPaqAAAAAAAAAEFS0YYdMdARo6QAoMkDSPYBAAAAAgAYAFcAaQBuAGQAbwB3AHMAIABNAGUAZABpAGEAIABBAHUAZABpAG8AIAA5AC4AMQAAACIAIAA0ADgAIABrAGIAcABzACwAIAA0ADQAIABrAEgAegAsACAAbQBvAG4AbwAgADEALQBwAGEAcwBzACAAQwBCAFIAAAACAGEBNiaydY5mzxGm2QCqAGLObDIAAAAAAAAAC6CyFTEiXkmGz/WqPum2GwAAAAAAAAAAAQE= m=audio 0 RTP/AVP 96 b=AS:49 b=X-AV:49 b=RS:0 b=RR:0 a=rtpmap:96 x-asf-pf/1000 a=control:audio a=stream:1 m=application 0 RTP/AVP 96 b=RS:0 b=RR:0 a=rtpmap:96 x-wms-rtx/1000 a=control:rtx a=stream:65536 And then, needless to say: Unable to create receiver for "audio/X-ASF-PF" subsession: RTP payload format unknown or not supported Unable to create receiver for "application/X-WMS-RTX" subsession: RTP payload format unknown or not supported -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From ottavio at videotec.com Fri Feb 4 01:29:03 2011 From: ottavio at videotec.com (Ottavio Campana) Date: Fri, 04 Feb 2011 10:29:03 +0100 Subject: [Live-devel] relaying RTSP? 
Message-ID: <4D4BC6DF.1010204@videotec.com>

Hi,

I am trying to implement a relay for RTSP, or something similar, for the
following situation:

+--------------+     +---------------------------+     +-----------+
|      pc      |     |      box with 2 NICs      |     |  camera   |
|192.168.0.1/24|<--->|192.168.0.2/24  10.0.0.1/24|<--->|10.0.0.2/24|
+--------------+     +---------------------------+     +-----------+

On the pc I have a *crappy* piece of software that needs to read some
data from the box with 2 NICs and receive the RTSP video from the camera.
The easiest solution would be opening two connections to two different
IPs on the 192.168.0.0/24 LAN, but the software can deal with just one
IP. Thus, since I only need to receive the video from the camera, I
considered the solution I just showed you: hiding the camera in the
10.0.0.0/24 LAN and relaying it.

Now the question is: how can I do this with live555? Is there something
already available? Or (but this is not related to live555) do you know
whether it is possible to do it just with NAT? And what if the camera
uses multicasting to stream?

Any hint is appreciated.

Thank you,
Ottavio

From sebastien-devel at celeos.eu  Fri Feb  4 02:02:04 2011
From: sebastien-devel at celeos.eu (Sébastien Escudier)
Date: Fri, 04 Feb 2011 11:02:04 +0100
Subject: [Live-devel] successful command but NULL result string
In-Reply-To: 
References: <1296808909.29592.6.camel@stim-desktop>
Message-ID: <1296813724.29592.10.camel@stim-desktop>

On Fri, 2011-02-04 at 01:10 -0800, Ross Finlayson wrote:
> No, this is not a bug. As you can see, the server is some
> brain-damaged non-standard Microsoft thing. It's not including a
> "Public:" header in its response to our RTSP "OPTIONS" command. So,
> as far as our client is concerned, this is a NULL response.

Ok. I thought an empty response ("") would be more logical, but why not.
But in this case it would be good to note in the function description
that this can happen.

Thanks.
Sébastien.
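The practical upshot of this thread for client code is that a responseHandler must treat "resultString" as possibly NULL even when "resultCode" indicates success. A minimal self-contained sketch (the names "strDupSafe" and "allowedCommands" are illustrative, not part of the live555 API):

```cpp
#include <cstdlib>
#include <cstring>

// Null-safe duplication, mirroring the behavior described for strDup():
// passing NULL simply yields NULL instead of crashing.
char* strDupSafe(char const* str) {
  if (str == NULL) return NULL;
  size_t len = strlen(str) + 1;
  char* copy = (char*)malloc(len);
  if (copy != NULL) memcpy(copy, str, len);
  return copy;
}

// A hypothetical OPTIONS response handler: success (resultCode == 0)
// does not guarantee a non-NULL resultString, because the server may
// have omitted the "Public:" header entirely.
char const* allowedCommands(int resultCode, char const* resultString) {
  if (resultCode != 0) return "(request failed)";
  return (resultString != NULL) ? resultString : "(none reported)";
}
```

The key design point is the last line: distinguish "header absent" (NULL) from "header present but empty" (""), which is exactly the distinction discussed above.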
From finlayson at live555.com  Fri Feb  4 05:49:29 2011
From: finlayson at live555.com (Ross Finlayson)
Date: Fri, 4 Feb 2011 05:49:29 -0800
Subject: [Live-devel] successful command but NULL result string
In-Reply-To: <1296813724.29592.10.camel@stim-desktop>
References: <1296808909.29592.6.camel@stim-desktop> <1296813724.29592.10.camel@stim-desktop>
Message-ID: 

>On Fri, 2011-02-04 at 01:10 -0800, Ross Finlayson wrote:
>> No, this is not a bug. As you can see, the server is some
>> brain-damaged non-standard Microsoft thing. It's not including a
>> "Public:" header in its response to our RTSP "OPTIONS" command. So,
>> as far as our client is concerned, this is a NULL response.
>
>Ok.
>I thought an empty response ("") would be more logical

Perhaps, though an empty string ("") would be returned if the server's
response included a "Public:" header that was empty - i.e.

Public:

>But in this case it would be good to note this can happen in the
>function description.

Yes, we already do ("RTSPClient.hh", line 54).
--

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From finlayson at live555.com  Fri Feb  4 05:54:49 2011
From: finlayson at live555.com (Ross Finlayson)
Date: Fri, 4 Feb 2011 05:54:49 -0800
Subject: [Live-devel] relaying RTSP?
In-Reply-To: <4D4BC6DF.1010204@videotec.com>
References: <4D4BC6DF.1010204@videotec.com>
Message-ID: 

>Now the question is: how can I do this with live555? Is there something
>already available?

Not really. Your simplest solution - if your camera supports
RTSP/RTP-over-TCP - would be to have your "pc" connect to the camera
directly (i.e., through your NAT gateway) using RTSP/RTP-over-TCP.
--

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From sebastien-devel at celeos.eu  Fri Feb  4 06:00:42 2011
From: sebastien-devel at celeos.eu (Sébastien Escudier)
Date: Fri, 04 Feb 2011 15:00:42 +0100
Subject: [Live-devel] successful command but NULL result string
In-Reply-To: 
References: <1296808909.29592.6.camel@stim-desktop> <1296813724.29592.10.camel@stim-desktop>
Message-ID: <1296828042.29592.31.camel@stim-desktop>

> Perhaps, though an empty string ("") would be returned if the
> server's response included a "Public:" header that was empty - i.e.
>
> Public:

Ok, that makes sense.

> >But in this case it would be good to note this can happen in the
> >function description.
>
> Yes, we already do ("RTSPClient.hh", line 54).

Yes, but it's not very clear. It says:

"resultCode": If zero, then the command completed successfully.
-> This is my case.

"resultString" for a successful "OPTIONS" command will be a list of
allowed commands.
-> OK, I am in this case, and there are no allowed commands, so I
expected an empty string. That's why I assumed I did not need to test for
a NULL string if the response was successful (i.e. resultCode == 0).

From dli at Cernium.com  Fri Feb  4 12:58:05 2011
From: dli at Cernium.com (Dunling Li)
Date: Fri, 4 Feb 2011 15:58:05 -0500
Subject: [Live-devel] iPhone CAN"T play the TS files generated by testH264VideoToTransportStream
In-Reply-To: 
References: <03A2FA9E0D3DC841992E682BF528771802C9A7D1@lipwig.Cernium.local> <03A2FA9E0D3DC841992E682BF528771802C9A8D0@lipwig.Cernium.local>
Message-ID: <03A2FA9E0D3DC841992E682BF528771802C9AFCF@lipwig.Cernium.local>

Ross,

Thanks for your suggestions.
Dunling

-----Original Message-----
From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson
Sent: Tuesday, February 01, 2011 8:06 PM
To: LIVE555 Streaming Media - development & use
Subject: Re: [Live-devel] iPhone CAN"T play the TS files generated by testH264VideoToTransportStream

>We converted those h264 files to TS files using
>testH264VideoToTransportStream; the TS files are good for VLC and media
>player. However, we manually put the converted TS files on our lighttpd
>server for streaming, and the iPhone cannot display any TS video clips
>from testH264VideoToTransportStream. Can you advise me how to fix it?

No, because I don't know what the problem is. How could I possibly know
why the iPhone is not playing your streams?

I suggest that you ask an appropriate Apple QuickTime Streaming mailing
list what - if anything - might be wrong with the Transport Stream files
(you'll need to give them an example) that might be making the iPhone
unhappy.

The only thing I can suggest is that instead of starting with multiple
4-second H.264 files, and converting each of those to Transport Stream
files, you might be better off converting a single (entire) H.264 file
into a Transport Stream file, and then splitting the resulting Transport
Stream file into 4-second components. That might make the iPhone
happier. But who knows...
--

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
_______________________________________________
live-devel mailing list
live-devel at lists.live555.com
http://lists.live555.com/mailman/listinfo/live-devel

From warren at etr-usa.com  Fri Feb  4 14:57:05 2011
From: warren at etr-usa.com (Warren Young)
Date: Fri, 04 Feb 2011 15:57:05 -0700
Subject: [Live-devel] iPhone CAN"T play the TS files generated by testH264VideoToTransportStream
In-Reply-To: <03A2FA9E0D3DC841992E682BF528771802C9A7D1@lipwig.Cernium.local>
References: <03A2FA9E0D3DC841992E682BF528771802C9A7D1@lipwig.Cernium.local>
Message-ID: <4D4C8441.3010901@etr-usa.com>

On 1/31/2011 4:23 PM, Dunling Li wrote:
>
> I use testH264VideoToTransportStream to generate some TS files. None of
> them can be played in iPhone.

QuickTime can't do that on desktop platforms, either. See the FAQ on the
MPEG-2 component's store page:

http://store.apple.com/us/product/D2187Z/A

There's no reason to suppose iOS would have TS capability when QuickTime
on desktop platforms doesn't. (And yes, I realize you're talking about
MPEG-4 and I'm talking about MPEG-2. The point is, if QuickTime had
native TS support, the MPEG-2 playback component FAQ wouldn't need to
disclaim this, since it could use the native QT support to demux the
incoming stream prior to decoding.)

Apple wants you to use MPEG-4 part 14 ("m4v") packaging for H.264 video.
Why? Because part 14 is based on the QuickTime container format, so they
don't have to do as much work.

From finlayson at live555.com  Fri Feb  4 15:41:44 2011
From: finlayson at live555.com (Ross Finlayson)
Date: Fri, 4 Feb 2011 15:41:44 -0800
Subject: [Live-devel] iPhone CAN"T play the TS files generated by testH264VideoToTransportStream
In-Reply-To: <4D4C8441.3010901@etr-usa.com>
References: <03A2FA9E0D3DC841992E682BF528771802C9A7D1@lipwig.Cernium.local> <4D4C8441.3010901@etr-usa.com>
Message-ID: 

>On 1/31/2011 4:23 PM, Dunling Li wrote:
>>I use testH264VideoToTransportStream to generate some TS files.
>>None of them can be played in iPhone.
>
>QuickTime can't do that on desktop platforms, either.

The original poster's original message was a bit unclear, but he later
clarified it. He wasn't really talking about playing Transport Stream
files natively on the iPhone. Instead, he was talking about streaming
(segmented) Transport Stream files *to* the iPhone, using Apple's new
"HTTP Live Streaming" hack. He wanted to know why the iPhone wouldn't
render such streams, when their Transport Stream files had been generated
by our "testH264VideoToTransportStream" demo application.

But in any case, this subject is OFF TOPIC for this mailing list (unless
and until someone suggests a specific change to our 'H.264 to Transport
Stream' conversion mechanism that is likely to make the iPhone happier
about playing such streams).
--

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From renatomauro at libero.it  Tue Feb  8 03:28:51 2011
From: renatomauro at libero.it (Renato MAURO (Libero))
Date: Tue, 8 Feb 2011 12:28:51 +0100
Subject: [Live-devel] Routing
References: <03A2FA9E0D3DC841992E682BF528771802C9A7D1@lipwig.Cernium.local> <4D4C8441.3010901@etr-usa.com>
Message-ID: 

Hello Ross.

I have OpenRTSP running on PC "A" and an OnDemandRTSPServer running on
PC "B".

PC "A" has its NIC set up with IP 192.168.2.27, SNM 255.255.255.0, GW 192.168.2.1.
PC "B" has its NIC set up with IP 192.168.7.2, SNM 255.255.255.0, GW 192.168.7.1.

In this scenario, is there a chance that OpenRTSP could receive a stream?
I tried with both RTP over UDP and RTP over TCP with no success.

Using JPerf I successfully checked the transmission of many 32-Kbyte UDP
packets on ports 4095, 53800 and 63400.

Using PCs "A" and "B" on the same network sub-class (both with IP
192.168.7.x), it works. Using PCs "A" and "B" on different network
sub-classes, with the SNM set to 255.255.0.0 and the gateway switched
off, it works.
Thank you very much,

Renato MAURO

From finlayson at live555.com  Tue Feb  8 17:16:26 2011
From: finlayson at live555.com (Ross Finlayson)
Date: Tue, 8 Feb 2011 17:16:26 -0800
Subject: [Live-devel] Routing
In-Reply-To: 
References: <03A2FA9E0D3DC841992E682BF528771802C9A7D1@lipwig.Cernium.local> <4D4C8441.3010901@etr-usa.com>
Message-ID: 

> I have OpenRTSP running on PC "A" and an OnDemandRTSPServer
>running on PC "B".
> PC "A" has its NIC set up with IP 192.168.2.27, SNM 255.255.255.0, GW 192.168.2.1.
> PC "B" has its NIC set up with IP 192.168.7.2, SNM 255.255.255.0, GW 192.168.7.1.
>
> In this scenario, is there a chance that OpenRTSP could receive a stream?

Whether or not there is connectivity between these two computers depends
*entirely* on (1) the computers' operating systems, and (2) the network,
router(s)/firewall(s)/NAT(s) between them. Application-level software
(like "openRTSP") has no control over this.

> I tried with both RTP over UDP and RTP over TCP with no success.
> Using JPerf I successfully checked the transmission of many
>32-Kbyte UDP packets on ports 4095, 53800 and 63400.

Even though UDP works, perhaps TCP doesn't? You probably have a firewall
somewhere that's blocking TCP connections.
--

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From Priya_Raghavendra at mindtree.com  Wed Feb  9 02:20:54 2011
From: Priya_Raghavendra at mindtree.com (Priya Raghavendra)
Date: Wed, 9 Feb 2011 10:20:54 +0000
Subject: [Live-devel] Mpeg1or2 Stream very jittery
Message-ID: 

Hi,

In our application we are receiving a live stream, doing an ffmpeg
decode, doing some overlay on the video, and then encoding it as an
MPEG-2 stream. After this we are streaming the video using live555. The
problem is that the video, when viewed in VLC, is very jittery. So I
analyzed the packets in Wireshark and observed that the timestamps of the
UDP packets are not in sequence. For example:
Timestamp: 1487557790 - Frame n
Timestamp: 1487554190 - Frame n + 1

For the next frame the timestamp will not be in sequence.

[cid:image001.jpg at 01CBC871.2288C740]

Why are the timestamps not in sequence when the packets are received? I
have observed that on the sending side, for every frame, fPresentationTime
is in the correct sequence. The code snippet where I am reading the queue
and updating the presentation time is given below.

void LiveInputSource::incomingDataHandler1() {
  int ret;

  if (!isCurrentlyAwaitingData()) return;

  ret = readData();
  if (ret < 0) {
    handleClosure(this);
  } else if (ret == 0) {
    int uSecsToDelay = 50000;
    nextTask() = envir().taskScheduler().scheduleDelayedTask(uSecsToDelay,
        (TaskFunc*)incomingDataHandler, this);
  } else {
    nextTask() = envir().taskScheduler().scheduleDelayedTask(0,
        (TaskFunc*)afterGetting, this);
  }
}

int LiveInputVideoSource::readData() {
  /* Get frame from queue */
  queue_get(frame, &frame_len);
  if (frame_len == 0) {
    return 0;
  }

  /* Copy frame to fTo */
  memcpy(fTo, frame, frame_len);
  fFrameSize = frame_len;
  gettimeofday(&fPresentationTime, NULL);
  return 1;
}

For MPEG-2 we are using the class MPEG1or2VideoStreamDiscreteFramer. When
I look at that code, I see that a lot of manipulation is done to the
presentation time. So is it possible that the timestamp values are
getting jumbled there?

void MPEG1or2VideoStreamDiscreteFramer
::afterGettingFrame1(unsigned frameSize, unsigned numTruncatedBytes,
                     struct timeval presentationTime,
                     unsigned durationInMicroseconds)

Thanks in advance.

Regards,
Priya

________________________________
http://www.mindtree.com/email/disclaimer.html
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
-------------- next part --------------
A non-text attachment was scrubbed...
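As a side note on the 50 ms polling loop above: the general alternative is event-driven delivery, where the consumer is woken exactly when a frame is queued. In live555 itself that wake-up is done with envir().taskScheduler().triggerEvent(), modelled in "liveMedia/DeviceSource.cpp"; the class below is only a generic, self-contained sketch of the pattern (not the library's API), using a condition variable:

```cpp
#include <condition_variable>
#include <mutex>
#include <queue>
#include <vector>

// Event-driven frame hand-off: pop() blocks until push() supplies a frame,
// so no fixed 50 ms delay is ever inserted between arrival and delivery.
class FrameQueue {
public:
  void push(std::vector<unsigned char> frame) {
    {
      std::lock_guard<std::mutex> lock(fMutex);
      fFrames.push(std::move(frame));
    }
    fCond.notify_one();  // wake the consumer immediately
  }

  std::vector<unsigned char> pop() {
    std::unique_lock<std::mutex> lock(fMutex);
    fCond.wait(lock, [this] { return !fFrames.empty(); });
    std::vector<unsigned char> frame = std::move(fFrames.front());
    fFrames.pop();
    return frame;
  }

private:
  std::mutex fMutex;
  std::condition_variable fCond;
  std::queue<std::vector<unsigned char>> fFrames;
};
```

Note that live555's own event loop is single-threaded; there, only triggerEvent() may be called from another thread, which is why the library provides it instead of a lock-based queue like this one.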
Name: image001.jpg
Type: image/jpeg
Size: 31597 bytes
Desc: image001.jpg
URL: 

From finlayson at live555.com  Wed Feb  9 19:45:50 2011
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 9 Feb 2011 19:45:50 -0800
Subject: [Live-devel] Mpeg1or2 Stream very jittery
In-Reply-To: 
References: 
Message-ID: 

First, rather than 'polling' (with a 50 ms delay) to check whether new
data can be read, it is better to use the new 'event trigger' mechanism.
See "liveMedia/DeviceSource.cpp" for a model of how to do this.

>For MPEG-2 we are using the class MPEG1or2VideoStreamDiscreteFramer.
>When I look at that code, I see that a lot of manipulation is done to
>the presentation time. So is it possible that the timestamp values are
>getting jumbled there?

"MPEG1or2VideoStreamDiscreteFramer" assumes that its input frames are in
'decoding order' - i.e., the order in which frames should be fed into a
decoder (at the receiving end). This is the same order in which frames
should be sent across the network (via RTP). The 'presentation time' for
each frame, however, will be different - it corresponds to the time that
each frame will be displayed on the (receiver's) screen. Therefore, the
"MPEG1or2VideoStreamDiscreteFramer" code adjusts the presentation times
as appropriate (if the stream contains "B" frames).

To summarize: If your input data frames are being delivered in decoding
order, then everything should be OK; any 'jittery' behavior that you see
must be due to something else (unrelated to our code). However, if your
input data frames are *not* being delivered (to
"MPEG1or2VideoStreamDiscreteFramer") in decoding order, then you will
need to fix this.
--

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
-------------- next part --------------
An HTML attachment was scrubbed...
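The decoding-order versus presentation-order point above can be made concrete with a tiny sketch (frame types and timestamps here are made up for illustration, not taken from the stream under discussion). For a GOP containing B frames, the stream is transmitted in decoding order, so consecutive packets legitimately carry non-monotonic presentation timestamps, which is exactly the "out of sequence" pattern seen in Wireshark:

```cpp
#include <string>
#include <vector>

struct Frame {
  std::string type;      // "I", "P", or "B"
  int presentationTime;  // display time, in arbitrary units
};

// A GOP displayed as I B B P is transmitted (and decoded) as I P B B,
// because both B frames are predicted from the later P frame and so the
// P frame must reach the decoder first.
std::vector<Frame> decodingOrder() {
  return { {"I", 0}, {"P", 3}, {"B", 1}, {"B", 2} };
}
```

In this ordering the P frame's presentation time (3) precedes the B frames' times (1, 2) on the wire, even though nothing is "jumbled": a receiver buffers and reorders before display.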
URL: 

From komal.dhote at gmail.com  Thu Feb 10 06:41:59 2011
From: komal.dhote at gmail.com (Komal Dhote)
Date: Thu, 10 Feb 2011 20:11:59 +0530
Subject: [Live-devel] Routing
In-Reply-To: 
References: <03A2FA9E0D3DC841992E682BF528771802C9A7D1@lipwig.Cernium.local> <4D4C8441.3010901@etr-usa.com>
Message-ID: 

Hi All,

I have used openRTSP for my application. My application simultaneously
receives four RTSP H.264-encoded streams (1080p) and decodes them. I have
created four clients as shown in playCommon.cpp. I am able to receive
three streams properly, but when I add a fourth channel (a fourth client)
I get a reception problem: I miss some frames (I or P). Has anybody faced
this problem? Please help in this regard.

Is it possible to receive four RTSP streams with the openRTSP library?

Regards,
Kumar Komal
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From finlayson at live555.com  Sat Feb 12 01:20:06 2011
From: finlayson at live555.com (Ross Finlayson)
Date: Sat, 12 Feb 2011 19:20:06 +1000
Subject: [Live-devel] Routing
In-Reply-To: 
References: <03A2FA9E0D3DC841992E682BF528771802C9A7D1@lipwig.Cernium.local> <4D4C8441.3010901@etr-usa.com>
Message-ID: 

>Is it possible to receive four RTSP streams with the openRTSP library?

There's no such limitation in the software.
--

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From fb8fb8 at gmail.com  Fri Feb 11 19:12:50 2011
From: fb8fb8 at gmail.com (Tom)
Date: Fri, 11 Feb 2011 21:12:50 -0600
Subject: [Live-devel] Fwd: I cannot stream my mpg file through LIVE555 Media Server
In-Reply-To: 
References: 
Message-ID: 

I just started using live555 yesterday, but I have a problem streaming an
mpg file. I have installed the LIVE555 Media Server on an Ubuntu PC, and
use the VLC player on another Ubuntu PC to get the stream from this PC. I
can stream my mp3 file this way by typing "rtsp://111.111.11.11:8554/",
but I cannot stream my mpg files this way.
What does "A MPEG-1 or 2 Program Stream file (with file name suffix
".mpg")" mean? Is it just an mpg file?

Thanks!
Bo
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From finlayson at live555.com  Sat Feb 12 15:51:55 2011
From: finlayson at live555.com (Ross Finlayson)
Date: Sun, 13 Feb 2011 09:51:55 +1000
Subject: [Live-devel] Fwd: I cannot stream my mpg file through LIVE555 Media Server
In-Reply-To: 
References: 
Message-ID: 

>I just started using live555 yesterday, but I have a problem streaming
>an mpg file.
>I have installed the LIVE555 Media Server on an Ubuntu PC, and use the
>VLC player on another Ubuntu PC to get the stream from this PC.
>I can stream my mp3 file this way by typing
>"rtsp://111.111.11.11:8554/",
>but I cannot stream my mpg files this way.

See http://www.live555.com/liveMedia/faq.html#my-file-doesnt-work
We'll need to see an example of a file that doesn't work.

>What does "A MPEG-1 or 2 Program Stream file (with file name suffix
>".mpg")" mean?

It means a MPEG-1 or 2 Program Stream file, whose file name ends with
".mpg". :-) See http://en.wikipedia.org/wiki/Program_stream

> Is it just an mpg file?

Yes.
--

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From felix at embedded-sol.com  Sun Feb 13 06:07:40 2011
From: felix at embedded-sol.com (Felix Radensky)
Date: Sun, 13 Feb 2011 16:07:40 +0200
Subject: [Live-devel] Using event triggers
Message-ID: <4D57E5AC.6020501@embedded-sol.com>

Hi,

I'm trying to integrate a live555-based H.264 RTSP streamer into an
existing application. I'd like to invoke live555 at the point where the
encoded H.264 buffer is already available to the application, so the new
event trigger mechanism looks like a suitable approach. I don't have any
file descriptors added to the live555 event loop.
This causes code in BasicTaskScheduler::SingleStep() to produce the
following error:

BasicTaskScheduler::SingleStep(): select() fails: Bad file descriptor

Am I missing something obvious, or can the current code not work as I
think it should (i.e. skip select() if no file descriptors were added to
the event loop)? I'm using version 2011.01.24.

Thanks.
Felix.

From finlayson at live555.com  Sun Feb 13 17:11:34 2011
From: finlayson at live555.com (Ross Finlayson)
Date: Mon, 14 Feb 2011 11:11:34 +1000
Subject: [Live-devel] Using event triggers
In-Reply-To: <4D57E5AC.6020501@embedded-sol.com>
References: <4D57E5AC.6020501@embedded-sol.com>
Message-ID: 

>I don't have any file descriptors added to the live555 event loop. This
>causes code in BasicTaskScheduler::SingleStep() to produce the
>following error:
>
>BasicTaskScheduler::SingleStep(): select() fails: Bad file descriptor

This shows that your code - perhaps indirectly - has called
"TaskScheduler::turnOnBackgroundReadHandling()" with an invalid socket
number (perhaps a socket number that has already been closed?). Note
that the error can also refer to network sockets, not just descriptors
from open files. (In Unix, these are one and the same.)

Does your code contain any calls to "turnOnBackgroundReadHandling()"?
Or to "ByteStreamFileSource::createNew()"? Or to
"BasicUDPSource::createNew()"?

Or perhaps you're closing some open files (or "Groupsock" or "RTPSource"
or "RTCPInstance" objects) when you shouldn't be?
--

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From marceltella at gmail.com  Thu Feb 17 02:54:38 2011
From: marceltella at gmail.com (Marcel Tella)
Date: Thu, 17 Feb 2011 11:54:38 +0100
Subject: [Live-devel] Http streaming!!
Message-ID: 

Hi!

I'm a student from Barcelona, Catalonia (Polytechnic University of
Catalonia), trying to do my final thesis on streaming on the web. The
thing is, I've got an algorithm whose output has to go directly to the
web in real time.
I've read about a lot of different technologies, and I've noticed that
live555 is for streaming purposes and written in C++.

I would like to know whether HTTP streaming is possible with live555,
using the VP8 encoding method and the WebM container.

Thank you! I really appreciate your help!
--
Marcel Tella.
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From finlayson at live555.com  Thu Feb 17 09:10:23 2011
From: finlayson at live555.com (Ross Finlayson)
Date: Fri, 18 Feb 2011 04:10:23 +1100
Subject: [Live-devel] Http streaming!!
In-Reply-To: 
References: 
Message-ID: 

>I would like to know whether HTTP streaming is possible with live555,
>using the VP8 encoding method and the WebM container.

No.
--

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From felix at embedded-sol.com  Thu Feb 17 10:49:48 2011
From: felix at embedded-sol.com (Felix Radensky)
Date: Thu, 17 Feb 2011 20:49:48 +0200
Subject: [Live-devel] Using event triggers
In-Reply-To: 
References: <4D57E5AC.6020501@embedded-sol.com>
Message-ID: <4D5D6DCC.2050902@embedded-sol.com>

Hi Ross,

On 02/14/2011 03:11 AM, Ross Finlayson wrote:
>> I don't have any file descriptors added to the live555 event loop.
>> This causes code in BasicTaskScheduler::SingleStep() to produce the
>> following error:
>>
>> BasicTaskScheduler::SingleStep(): select() fails: Bad file descriptor
>
> This shows that your code - perhaps indirectly - has called
> "TaskScheduler::turnOnBackgroundReadHandling()" with an invalid
> socket number (perhaps a socket number that has already been
> closed?). Note that the error can also refer to network sockets,
> not just descriptors from open files. (In Unix, these are one and
> the same.)
>
> Does your code contain any calls to "turnOnBackgroundReadHandling()"?
> Or to "ByteStreamFileSource::createNew()"? Or to
> "BasicUDPSource::createNew()"?
>
> Or perhaps you're closing some open files (or "Groupsock" or
> "RTPSource" or "RTCPInstance" objects) when you shouldn't be?
Thanks for the prompt reply. I was using live555 from two different
threads, which is not supported. Moving everything into a single thread
fixed my problem.

Thanks for great software!

Felix.

From rafael.madeira at idea-ip.com  Thu Feb 17 12:10:50 2011
From: rafael.madeira at idea-ip.com (Rafael Madeira)
Date: Thu, 17 Feb 2011 18:10:50 -0200
Subject: [Live-devel] Streaming with live555 using named pipes with dm6467
Message-ID: 

Hi,

I'm trying to send TS content generated by my DM6467 encoder application.
Currently, I send the TS data through a named pipe to a modified
"testMPEG2TransportStreamer" live555 program. Then I try to view the
encoded data in VLC in real time.

Everything works fine when I save a ".ts" file to the hard disk and then
send it to VLC.

But with "real time" communication between live555 and my app, with
either the RTP or the RTSP protocol, I can't see anything in VLC. VLC
reports (through its debug interface) a lot of TS discontinuity and/or
TS duplicate errors.

Does anyone know what I could be doing wrong? Is this the best way to
use live555 as a live streamer?

Thanks in advance.
--
Att,
Rafael Madeira
Idea! Electronic Systems
www.idea-ip.com
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From finlayson at live555.com  Thu Feb 17 13:02:32 2011
From: finlayson at live555.com (Ross Finlayson)
Date: Fri, 18 Feb 2011 08:02:32 +1100
Subject: [Live-devel] Streaming with live555 using named pipes with dm6467
In-Reply-To: 
References: 
Message-ID: 

>I'm trying to send TS content generated by my DM6467 encoder
>application. Currently, I send the TS data through a named pipe to a
>modified "testMPEG2TransportStreamer" live555 program. Then I try to
>view the encoded data in VLC in real time.
>
>Everything works fine when I save a ".ts" file to the hard disk and
>then send it to VLC.
>
>But with "real time" communication between live555 and my app, with
>either the RTP or the RTSP protocol, I can't see anything in VLC.
VLC reports >(through its debug interface) a lot of TS discontinuity and/or TS >duplicate errors. > >Does anyone know what I could be doing wrong? You may be seeing (IP multicast) packet loss somewhere between the streamer app and VLC. I suggest that you first try using "openRTSP" as a client, to record the incoming stream as a (Transport Stream) file - one that you can examine and try playing. > Is this the best way to use live555 as a live streamer? If you plan to have only a single client, then you should probably stream via unicast rather than via multicast. You can easily modify the "testOnDemandRTSPServer" demo application to stream from your named pipe via unicast: See http://www.live555.com/liveMedia/faq.html#liveInput-unicast -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From sebastien-devel at celeos.eu Fri Feb 18 06:49:08 2011 From: sebastien-devel at celeos.eu (=?ISO-8859-1?Q?S=E9bastien?= Escudier) Date: Fri, 18 Feb 2011 15:49:08 +0100 Subject: [Live-devel] issue with getResultMsg from the old api Message-ID: <1298040548.12401.15.camel@stim-desktop> Hi, VLC 1.1 uses the old synchronous API, and I noticed that something changed in live555 a while ago. Before (I tested with a version from March 2010), getResultMsg() after a DESCRIBE command returned: cannot handle DESCRIBE response: RTSP/1.0 401 Unauthorized The latest version now returns only: 401 Unauthorized The issue is that the message no longer contains RTSP/1.0, and VLC checks this to get the error code. Regards, Sebastien. From rafael.madeira at idea-ip.com Mon Feb 21 13:44:02 2011 From: rafael.madeira at idea-ip.com (Rafael Madeira) Date: Mon, 21 Feb 2011 18:44:02 -0300 Subject: [Live-devel] Streaming with live555 using named pipes with dm6467 In-Reply-To: References: Message-ID: Hi, thanks for the fast reply. On Thu, Feb 17, 2011 at 7:02 PM, Ross Finlayson wrote: > I'm trying to send TS content generated by my DM6467 encoder application.
>> Currently, I send TS data through a named pipe to a modified >> "testMPEG2TransportStreamer" live555 program. Then I try to see, in >> real time, the encoded data in VLC. >> >> Everything works fine when I save a ".ts" file on the hard disk and then send >> it to VLC. >> >> But with "real time" communication between live555 and my app, with either >> RTP or RTSP, I can't see anything in VLC. VLC reports (through its >> debug interface) a lot of TS discontinuity and/or TS duplicate errors. >> >> Does anyone know what I could be doing wrong? >> > > You may be seeing (IP multicast) packet loss somewhere between the streamer > app and VLC. > > I suggest that you first try using "openRTSP" as a client, to record the > incoming stream as a (Transport Stream) file - that you can examine and try > playing. > > > > Is this the best way to use live555 as a live streamer? >> > > If you plan to have only a single client, then you should probably stream > via unicast rather than via multicast. > > You can easily modify the "testOnDemandRTSPServer" demo application to > stream from your named pipe via unicast: See > http://www.live555.com/liveMedia/faq.html#liveInput-unicast Yes, I'm planning on using a single client, so I'm trying to use "testOnDemandRTSPServer". But I haven't yet been able to make it work with a named pipe. Is there a place where I can set the amount of data "testOnDemandRTSPServer" will read in each iteration? It seems to be a possible bottleneck for my app, because the apparent loss of data on the VLC side could be a pipe overflow. Thanks, > > -- > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > -- Att, Rafael Madeira Idea! Electronic Systems www.idea-ip.com -------------- next part -------------- An HTML attachment was scrubbed...
URL: From max.goettner at dstream.de Tue Feb 22 09:08:07 2011 From: max.goettner at dstream.de (=?iso-8859-1?Q?Max_G=F6ttner_-_d.stream?=) Date: Tue, 22 Feb 2011 18:08:07 +0100 Subject: [Live-devel] .avi files written by openRTSP Message-ID: <000001cbd2b3$135445a0$39fcd0e0$@goettner@dstream.de> Hi Ross, hi people, I am new to this list. I have been developing an RTSP streaming client application based on the openRTSP test program for some time now. I use the AVIFileSink to write an MPEG-4 stream from a camera to a file. The files can be played by all the desired players, but we would like to improve the seeking and reverse-playing functionality. Examples of the bad seeking performance are slow frame-wise backward transport in VirtualDub, and unreliable jumps and frame-wise moves in MCI-based video players (the intended target application for the generated files). Analyzing the .avi files shows that no index list is written; VirtualDub gives the following message: "index not found or damaged; keyframe flag reconstruction was not specified in open options and the video stream is not a known keyframe-only type. Seeking the video may be extremely slow" Now, I am no specialist in the AVI file format, but my first attempt was to add an index with other software, for example VirtualDub. I succeeded in creating the list, but it did not give satisfying results for seeking performance. Comparing the files with other files that provide good seeking performance did not show any structural difference. Is anyone familiar with my problem? Am I looking at the right issues in solving this problem, or is something about the construction of the AVI header, flags, etc. important? Could it even be a question of the video codec? Thanks in advance for your help. Greetings, Max. -------------- next part -------------- An HTML attachment was scrubbed...
URL: From komal.dhote at gmail.com Tue Feb 22 11:13:02 2011 From: komal.dhote at gmail.com (Komal Dhote) Date: Wed, 23 Feb 2011 00:43:02 +0530 Subject: [Live-devel] iPhone CAN'T play the TS files generated by testH264VideoToTransportStream In-Reply-To: <03A2FA9E0D3DC841992E682BF528771802C9A7D1@lipwig.Cernium.local> References: <03A2FA9E0D3DC841992E682BF528771802C9A7D1@lipwig.Cernium.local> Message-ID: Hi, I am creating multiple clients so that I can receive multiple streams (H.264 1080p 30fps). I have modified SingleStep() handling in BasicTaskScheduler as follows:

1. The while loop is removed:

void BasicTaskScheduler0::doEventLoop(char* watchVariable) {
  // Repeatedly loop, handling readable sockets and timed events:
  // while (1) {
  if (watchVariable != NULL && *watchVariable != 0) {
  } else SingleStep();
  // }
}

2. The callback function is modified:

Boolean FileSink::continuePlaying() {
  if (fSource == NULL) return False;
  fSource->getNextFrame(fBuffer, fBufferSize, myafterGettingFrame, this,
                        onSourceClosure, this);
  return True;
}

3. myafterGettingFrame sets a flag, data_available.

4. This function is called to get the data:

while (1) {
  env1->taskScheduler().doEventLoop();
  if (data_available == TRUE) { fileSink1->AftetGetFrame(); data_available = FALSE; }
  env2->taskScheduler().doEventLoop();
  if (data_available == TRUE) { fileSink2->AftetGetFrame(); data_available = FALSE; }
  env3->taskScheduler().doEventLoop();
  if (data_available == TRUE) { fileSink3->AftetGetFrame(); data_available = FALSE; }
  env4->taskScheduler().doEventLoop();
  if (data_available == TRUE) { fileSink4->AftetGetFrame(); data_available = FALSE; }
}

When I receive a single channel, I receive it without any errors. But when I receive two or more streams, there are errors in the (H.264) decoder. Can anybody tell me whether the way I am calling this is correct? Or do I need to call env->taskScheduler().doEventLoop() in a separate thread?
Please guide me in this regard. Komal Kumar -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Feb 22 19:19:25 2011 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 23 Feb 2011 13:19:25 +1000 Subject: [Live-devel] .avi files written by openRTSP In-Reply-To: <000001cbd2b3$135445a0$39fcd0e0$@goettner@dstream.de> References: <000001cbd2b3$135445a0$39fcd0e0$@goettner@dstream.de> Message-ID: The "AVIFileSink" class - which implements writing AVI-format files - is incomplete, and could undoubtedly be improved. I don't have time to work on this myself, however, so I'd appreciate any improvements to this code that anyone wishes to make. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Wed Feb 23 11:51:54 2011 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 24 Feb 2011 05:51:54 +1000 Subject: [Live-devel] Streaming with live555 using named pipes with dm6467 In-Reply-To: References: Message-ID: >Yes, i'm planning using a single client. So, i'm trying >use "testOnDemandRTSPServer". But, for use with a named pipe i cant, >until now, makes stuffs work. > >Is there a place where i can set the amount of >data "testOnDemandRTSPServer" will read in each interation?? Not really. This is probably irrelevant for your problem anyway. > It seems to be a possible bottleneck for my app, because the >apparent loss of data at VLC side could be a pipe overflow. The server will stream the Transport Stream data at the appropriate rate, as determined by the PCR timestamps in the data.
If the pipe between your encoder and the server is overflowing (you should be able to determine for sure whether this is happening, BTW), then it will probably be because either (i) your encoder is generating incorrect PCR timestamps (which is causing the server to stream too slowly), or, much more likely, (ii) your encoder is generating Transport Stream data too fast, and is not blocking if/when it fills up the pipe. Because your Transport Stream encoder is writing to a pipe, it needs to block if/when the pipe fills up. But because - from the encoder's point of view - the pipe is just like an open file, this should be happening automatically. In any case, this seems to be just basic Unix systems programming, and has nothing to do with our code. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Wed Feb 23 12:47:59 2011 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 24 Feb 2011 06:47:59 +1000 Subject: [Live-devel] issue with getResultMsg from the old api In-Reply-To: <1298040548.12401.15.camel@stim-desktop> References: <1298040548.12401.15.camel@stim-desktop> Message-ID: >Vlc 1.1 uses the old synchronus API, and I noticed something changed in >live555 a while ago. >Before, (I tested with a version from march 2010), the getResultMsg() >after a describe command returned : >cannot handle DESCRIBE response: RTSP/1.0 401 Unauthorized > >latest version now returns only: >401 Unauthorized > >The issue is that it does not contain RTSP/1.0 anymore, and vlc is >checking this to get the error code. Yes, the new asynchronous "RTSPClient" API (and thus also the old, synchronous API, because it's implemented using the new API) now returns result messages that begin with the response code, not with "RTSP/x.y" or "HTTP/x.y" - to make them more meaningful. I didn't realize that VLC was expecting the old form of result message; it should instead be checking for an embedded "401 ". -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ From 44072429 at qq.com Wed Feb 23 01:18:33 2011 From: 44072429 at qq.com (=?ISO-8859-1?B?NDQwNzI0Mjk=?=) Date: Wed, 23 Feb 2011 17:18:33 +0800 Subject: [Live-devel] hi.problem about rtsp server. Message-ID: There is one video source, with an 8 Mbps rate. Now 48 clients want to get the video, but only 32 clients can connect. If I use telnet 192.168.2.146 554 (my server's IP), it succeeds, but when I press Enter twice (or more), there is no 404 Bad Request response, and my server does not give any notification. What is this? Are there any config parameters, like the #define RTSP_PARAM_STRING_MAX 1024 in rtspcommon.h? Regards, Kun You. -------------- next part -------------- An HTML attachment was scrubbed... URL: From akaan.turk at gmail.com Mon Feb 28 00:23:23 2011 From: akaan.turk at gmail.com (Kaan Turkyilmaz) Date: Mon, 28 Feb 2011 10:23:23 +0200 Subject: [Live-devel] Streaming server Message-ID: Hi everyone. I am new here. I am a senior developer. Sorry for any inconvenient questions. I am planning to use MPEG-2 - streaming MPEG-2 directly, instead of H.264. Which streaming server can I use, and how? Thanks for your help and time. -- --Kaan -------------- next part -------------- An HTML attachment was scrubbed... URL: From mtamilarasu at gmail.com Wed Feb 23 03:51:50 2011 From: mtamilarasu at gmail.com (tamil arasu) Date: Wed, 23 Feb 2011 17:21:50 +0530 Subject: [Live-devel] Regarding wis-streamer issue Message-ID: Hi all, I am new to streaming applications. We are using the Appro IPNC as an RTSP server and GStreamer as an RTSP client. The MPEG-4 stream is available at the following URL: rtsp://192.168.1.168/mpeg4. When we tried to get the MPEG-4 stream from the RTSP server using four RTSP clients at the same time, we got the following error: FramedSource::getNextFrame(): attempting to read more than once at the same time! After this, wis-streamer becomes a zombie in the RTSP server application.
After killing the wis-streamer application, if we start the application again, everything works properly. Can anyone help me with this? Thanks and regards, Tamilarasu.M -------------- next part -------------- An HTML attachment was scrubbed... URL: From en_1230 at 163.com Fri Feb 25 19:43:09 2011 From: en_1230 at 163.com (en_1230) Date: Sat, 26 Feb 2011 11:43:09 +0800 (CST) Subject: [Live-devel] i don't know how to use Iunknown::CreateMediaSession Message-ID: <1462f385.1468c.12e6010129a.Coremail.en_1230@163.com> Hi, We just found the source code of "Morgan RTP DirectShow(TM) Filters". I don't know how to use Iunknown::CreateMediaSession in Morgan RTP DirectShow. We don't know how to develop a COM client. Please guide me on how to program Iunknown::CreateMediaSession on the COM client side, and how to program the COM server side in order to build the SDP. Best regards. -------------- next part -------------- An HTML attachment was scrubbed... URL: From hustmark at gmail.com Mon Feb 28 17:20:36 2011 From: hustmark at gmail.com (=?GB2312?B?wu2x887k?=) Date: Tue, 1 Mar 2011 09:20:36 +0800 Subject: [Live-devel] Bug about demo.aac Message-ID: The demo.aac is the one at http://www.live555.com/liveMedia/public/aac/. I just use live555MediaServer.exe as the server, with VLC and openRTSP as the clients. The sample rate from the SDP description is 22050 Hz, but the audio decoder (faad) in VLC reports a value of 44100 Hz; that's strange, and something may be wrong.
Here are the log messages from openRTSP:

Opened URL "rtsp://192.168.20.1:8554/demo.aac", returning a SDP description:
v=0
o=- 1298640530227418 1 IN IP4 192.168.20.1
s=AAC Audio, streamed by the LIVE555 Media Server
i=demo.aac
t=0 0
a=tool:LIVE555 Streaming Media v2011.01.19
a=type:broadcast
a=control:*
a=range:npt=0-
a=x-qt-text-nam:AAC Audio, streamed by the LIVE555 Media Server
a=x-qt-text-inf:demo.aac
m=audio 0 RTP/AVP 96
c=IN IP4 0.0.0.0
b=AS:96
a=rtpmap:96 MPEG4-GENERIC/22050/6
a=fmtp:96 streamtype=5;profile-level-id=1;mode=AAC-hbr;sizelength=13;indexlength=3;indexdeltalength=3;config=13b0
a=control:track1

From d.heinzelmann at ids-imaging.de Mon Feb 28 05:26:35 2011 From: d.heinzelmann at ids-imaging.de (Heinzelmann David) Date: Mon, 28 Feb 2011 14:26:35 +0100 Subject: [Live-devel] Live555 Shared Library Visual Studio Message-ID: <4CB3F3483954BD4ABD0B4DBB35C7E33801C48BE9@exchsrv.idszentral.local> Hi, I would like to include your library in a Windows app under the LGPL. But to include the library dynamically in Visual Studio, the classes have to be exported with "__declspec(dllexport)". Do I have to add "__declspec(dllexport)" to every class? And does this violate the LGPL? Or is there another possibility for including Live555 dynamically in Visual Studio? Regards, David.