From finlayson at live555.com Fri Jun 1 00:11:52 2007
From: finlayson at live555.com (Ross Finlayson)
Date: Fri, 1 Jun 2007 00:11:52 -0700
Subject: [Live-devel] live-devel Digest, Vol 43, Issue 24
In-Reply-To: <7EAD1AEEA7621C45899FE99123E124A0E02D53@dbde01.ent.ti.com>
References: <7EAD1AEEA7621C45899FE99123E124A0E02D53@dbde01.ent.ti.com>
Message-ID: 

Here's the problem:

>Opened URL "rtsp://172.24.141.104:554/Video/edit.mpg", returning a SDP
>description:
>v=0
>o=- 3389626461 0 IN IP4 0.0.0.0
>s=Video RTSP Server
>t=3389626461 0
>m=video 0 RTP/AVP 33
>a=rtpmap:33 H264/90000
>a=control:rtsp://172.24.141.104:554/Video/edit.mpg
>a=range:npt=0.0-144.159

Your server is explicitly saying that your stream lasts 144.159 seconds. Therefore "openRTSP", by default, records only for that length of time. If you want to (try to) record for a longer period of time, then you can do so using the "-e <end time>" option.
-- 
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From julian.lamberty at mytum.de Fri Jun 1 01:15:42 2007
From: julian.lamberty at mytum.de (Julian Lamberty)
Date: Fri, 01 Jun 2007 10:15:42 +0200
Subject: [Live-devel] UDP checksum wrong
Message-ID: <465FD5AE.1000501@mytum.de>

Hi!

Since my transcoder now transcodes, I have a new problem: I deliver complete MPEG4 frames to MPEG4VideoStreamDiscreteFramer, followed by MPEG4ESVideoRTPSink, but all the packets sent have a wrong UDP checksum. Wireshark reports that a lot of subsequent UDP packets have the SAME checksum. For example:

...
Checksum: 0xb63d [incorrect, should be 0x9e8c]
Checksum: 0xb63d [incorrect, should be 0x9e8b]
Checksum: 0xb63d [incorrect, should be 0x9e8a]
Checksum: 0xb63d [incorrect, should be 0x9e89]
...

Additionally, all the packets have the same length (347 in the example above). It also seems strange that the correct checksum should decrease by 0x1 for each packet. What's the problem here? How can that be?

Julian

From julian.lamberty at mytum.de Fri Jun 1 04:05:41 2007
From: julian.lamberty at mytum.de (Julian Lamberty)
Date: Fri, 01 Jun 2007 13:05:41 +0200
Subject: [Live-devel] UDP checksum wrong
Message-ID: <465FFD85.8000806@mytum.de>

Sorry, my fault ;) You should not run Wireshark on the same computer that sends the packets...

From lartc at manchotnetworks.net Fri Jun 1 04:42:36 2007
From: lartc at manchotnetworks.net (lartc)
Date: Fri, 01 Jun 2007 13:42:36 +0200
Subject: [Live-devel] UDP checksum wrong
In-Reply-To: <465FFD85.8000806@mytum.de>
References: <465FFD85.8000806@mytum.de>
Message-ID: <1180698156.5587.1.camel@sumatra.radius.fr>

This "error" can also occur when the checksum process is offloaded and performed by the network adapter...

On Fri, 2007-06-01 at 13:05 +0200, Julian Lamberty wrote:
> Sorry, my fault ;) You should not run Wireshark on the same computer
> that sends the packets...
-- 
"simplified chinese" is not nearly as easy as they would have you believe ... a superlative oxymoron" --anonymous

From finlayson at live555.com Fri Jun 1 04:53:14 2007
From: finlayson at live555.com (Ross Finlayson)
Date: Fri, 1 Jun 2007 04:53:14 -0700
Subject: [Live-devel] UDP checksum wrong
In-Reply-To: <465FD5AE.1000501@mytum.de>
References: <465FD5AE.1000501@mytum.de>
Message-ID: 

UDP checksumming is done by the OS or network adaptor, and has absolutely nothing to do with the "LIVE555 Streaming Media" code, which runs above all of this.
-- 
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
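(To illustrate the offloading case described above: on a Linux sender, TX checksum offload can usually be inspected with "ethtool -k eth0" and toggled with "ethtool -K eth0 tx off" - the interface name here is a placeholder - after which a local capture should show correct outgoing checksums again, at some cost in CPU.)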
From severin.schoepke at gmail.com Fri Jun 1 07:08:24 2007
From: severin.schoepke at gmail.com (Severin Schoepke)
Date: Fri, 01 Jun 2007 16:08:24 +0200
Subject: [Live-devel] Synchronizing audio and video streams
In-Reply-To: 
References: <46571067.6040101@gmail.com>
Message-ID: <46602858.7030603@gmail.com>

Hello again,

I'm referring to the following mail:

I'm working on an application that streams live generated content (audio and video) using the Darwin Streaming Server. I decided to use ffmpeg for the encoding part and live555 for the streaming part. The basic architecture is as follows: in a first thread I generate the content, and in a second one I stream the content to the DSS. Data is passed between the threads using two buffers. I implemented two FramedSource subclasses for the streaming part, one for audio and one for video data. Both work as follows: they read raw data from their input buffer, encode it using ffmpeg (using the MPEG4 and MP2 codecs), and write it to their output buffers (fTo). My sources are then connected to an MPEG4VideoStreamFramer and an MPEG1or2AudioStreamFramer respectively. These are connected to a DarwinInjector (based on the code of testMPEG4VideoToDarwin).

The problem I have now is that the streams are not synchronized when viewing them (using QuickTime or VLC). Based on debug printf's I found out that the audio source's doGetNextFrame() is called much more often than the video source's. Therefore, the audio stream is played ahead and the video stream lags behind. I set correct presentation times for both streams, so I thought that live555 would do 'the rest' of synchronizing the streams, but it seems not to work. Therefore I'd like to ask whether I should implement synchronization of the streams myself, or if I'm doing something wrong...

I followed your suggestion and tried it using your server... I got an answer:

Ross Finlayson schrieb:
> We're no longer providing support for the "DarwinInjector" code.
> Instead, we recommend that you use our own RTSP server
> implementation, rather than a separate "Darwin Streaming Server".

So I tried to use your RTSP server, but it didn't change the situation. My streams are still diverging. I investigated the problem further and found out that live555 queries my sound source much more often than it queries the video source. I suspect it has something to do with data size: my audio source provides only about 300 bytes per invocation, whereas the video source provides between 3000 and 8000 bytes per invocation (I'm referring to a call of doGetNextFrame() as an invocation). I suspect that live555 waits to stream out video until it has about the same amount of audio data. This way, my audio source gets polled about 10 times as often as the video source, even though it can only provide about twice as many frames as the video source...

I know the above sounds a bit strange and is not very clear, but I hope someone understands what I mean, and I hope that this person can tell me whether this behaviour is to be expected and how I could change it.

cheers,
Severin

From finlayson at live555.com Fri Jun 1 12:53:52 2007
From: finlayson at live555.com (Ross Finlayson)
Date: Fri, 1 Jun 2007 12:53:52 -0700
Subject: [Live-devel] Synchronizing audio and video streams
In-Reply-To: <46602858.7030603@gmail.com>
References: <46571067.6040101@gmail.com> <46602858.7030603@gmail.com>
Message-ID: 

>So I tried to use your RTSP server but it didn't change the situation.
>My streams are still diverging. I investigated the problem further and
>found out that live555 queries my sound source much more often than it
>queries the video source. I suspect it has something to do with data
>size

No, that shouldn't matter. If you (i) give your data (audio and video) accurate presentation times (the "fPresentationTime" variable) - tied to the local 'wall clock' time (e.g., using "gettimeofday()"), and (ii) use RTCP (by creating "RTCPInstance" objects for each "RTPSink"), then audio/video sync *will* work correctly at the client end.
-- 
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
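A minimal sketch of those two points, assuming a FramedSource subclass and a setup along the lines of the test programs (LiveAudioSource, rtcpGroupsock, audioSink and the bandwidth figure are illustrative names and values, not library-defined):

// (i) Stamp each frame with wall-clock time in the FramedSource subclass:
void LiveAudioSource::doGetNextFrame() {
  // ... copy one encoded frame into fTo and set fFrameSize ...
  gettimeofday(&fPresentationTime, NULL); // tied to 'wall clock' time
  fDurationInMicroseconds = 0;            // live source: poll again at once
  FramedSource::afterGetting(this);       // must be the last thing done
}

// (ii) Create an RTCPInstance alongside each RTPSink:
unsigned char CNAME[101];
gethostname((char*)CNAME, 100); CNAME[100] = '\0';
RTCPInstance::createNew(*env, &rtcpGroupsock,
                        160 /* estimated session bandwidth, in kbps */,
                        CNAME, audioSink, NULL /* no RTPSource: we send */);

The RTCP "SR" packets this generates are what let the receiver map each stream's RTP timestamps back onto a common wall clock; without them, the client has no way to line the two streams up.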
From mjn at oxysys.com Fri Jun 1 13:34:57 2007
From: mjn at oxysys.com (Marc Neuberger)
Date: Fri, 01 Jun 2007 16:34:57 -0400
Subject: [Live-devel] Performance
Message-ID: <466082F1.9050409@oxysys.com>

A couple of months ago there was a discussion of the performance of the live555 libraries on Linux, and the discussion turned to the efficiency of select() vs. epoll(). Studying the performance of my own epoll()-based scheduler, I strongly suspect that the far bigger source of inefficiency is the DelayQueue implementation that BasicTaskScheduler0 uses. This queue is a linked list, causing O(n) cost for adding and deleting timers. Which happens a lot. If I understand the behavior correctly, a 45-second idle timer is rescheduled on each packet. This almost invariably goes to the end of the scheduling queue.

With the stock code, I had results similar to Vlad Seyakov's: I petered out at about 140-150 sessions. With my rewritten scheduler, I've been able to get to 400-500 sessions. My scheduling queue is based on an STL set<> with an appropriate less-than operator. This provides O(log n) insert/delete. Even so, I find that scheduling and unscheduling timers accounts for approximately 1/3 of the CPU at 400 sessions.

I made one other observation: readSocket() in GroupsockHelper.cpp calls blockUntilReadable(). blockUntilReadable() uses select() to wait for the socket to be ready. This has two problems. First, we really shouldn't ever be blocking, since this blocks all sessions: if the data isn't ready, we should go back to the event loop. This should happen rarely, if ever, of course, since presumably we're only calling this after a select()/epoll() has triggered. The larger problem is that the use of select() limits a server to 1024 file descriptors unless you override the size of fd_sets in your build, and that, of course, creates a performance degradation. Architecturally it seems a little harder to replace parts of groupsock than to replace parts of UsageEnvironment to make changes like this.

Marc Neuberger

From finlayson at live555.com Fri Jun 1 14:09:46 2007
From: finlayson at live555.com (Ross Finlayson)
Date: Fri, 1 Jun 2007 14:09:46 -0700
Subject: [Live-devel] Performance
In-Reply-To: <466082F1.9050409@oxysys.com>
References: <466082F1.9050409@oxysys.com>
Message-ID: 

>Studying the performance of my own epoll()-based scheduler, I strongly
>suspect that the far bigger source of inefficiency is the DelayQueue
>implementation that BasicTaskScheduler0 uses. This queue is a linked
>list, causing O(n) cost for adding and deleting timers. Which happens a
>lot. If I understand the behavior correctly, a 45-second idle timer is
>rescheduled on each packet.

No, that's not correct. The RTSP server implementation's 'liveness check' timer gets rescheduled only after the receipt of an incoming *RTCP packet* (or an incoming RTSP command) - not on every (or any) outgoing packet.

>With the stock code, I had results similar to Vlad Seyakov's: I petered
>out at about 140-150 sessions. With my rewritten scheduler, I've been
>able to get to 400-500 sessions. My scheduling queue is based on an STL
>set<> with an appropriate less-than operator. This provides O(log n)
>insert/delete.

I'd be happy to consider replacing the current DelayQueue implementation with something that performs better with a large number of tasks (with a relatively small number of tasks, the existing implementation should be OK). However, I don't want to use the STL, because I don't want to make the "LIVE555 Streaming Media" code dependent upon it (because I want this code to continue to be useful for (e.g.) embedded systems that are relatively memory constrained, or which might not have the STL available for other reasons).

>I made one other observation: readSocket() in GroupsockHelper.cpp calls
>blockUntilReadable(). blockUntilReadable() uses select() to wait for the
>socket to be ready. This has two problems: first, we really shouldn't
>ever be blocking, since this blocks all sessions: if the data isn't
>ready, we should go back to the event loop.

Yes. Unfortunately there are still a few places in the code where we want to do a synchronous (blocking) read on a socket. To handle that case, we include the "select()" call in "readSocket()", even though it's usually (i.e., for asynchronous reading) a no-op. At some point, I should get rid of these (few) remaining blocking socket reads, and remove the "select()" call from "readSocket()". Actually, as you're just running a RTSP server, you can probably remove the "select()" call right now. You could give that a try, to see if it improves performance on your system.
-- 
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
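For reference, the set<>-based scheduling queue Marc describes might look roughly like this (a sketch only; TimerEntry and TimerLess are invented names, not part of his code or of the library):

#include <set>
#include <sys/time.h>

struct TimerEntry {
  struct timeval fExpiry; // when the task should fire
  long fToken;            // unique per entry; breaks ties between equal times
  // ... task function pointer, client data, etc. ...
};

struct TimerLess {
  bool operator()(TimerEntry const* a, TimerEntry const* b) const {
    if (a->fExpiry.tv_sec != b->fExpiry.tv_sec)
      return a->fExpiry.tv_sec < b->fExpiry.tv_sec;
    if (a->fExpiry.tv_usec != b->fExpiry.tv_usec)
      return a->fExpiry.tv_usec < b->fExpiry.tv_usec;
    return a->fToken < b->fToken;
  }
};

// O(log n) insert/erase; *begin() is always the next timer to fire:
typedef std::set<TimerEntry*, TimerLess> TimerQueue;

Because std::set is a balanced tree (typically red-black), rescheduling a timer is an erase plus an insert, both O(log n), instead of the O(n) list walk in the stock DelayQueue.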
Ross Finlayson wrote:
> No, that's not correct. The RTSP server implementation's 'liveness
> check' timer gets rescheduled only after the receipt of an incoming
> *RTCP packet* (or an incoming RTSP command) - not on every (or any)
> outgoing packet.

Ah good, that makes a great deal more sense.

> However, I don't want to use the STL, because I don't want to make
> the "LIVE555 Streaming Media" code dependent upon it (because I want
> this code to continue to be useful for (e.g.) embedded systems that
> are relatively memory constrained, or which might not have the STL
> available for other reasons).

Yeah, that makes sense, too. I certainly wouldn't attempt to write the equivalent of the STL class (a Red-Black tree) myself. I agree that the current implementation is perfectly fine for most uses. Largely, I'm offering a tip to others that may find themselves in my situation about where to look for performance.

> At some point, I should get rid of these (few) remaining blocking
> socket reads, and remove the "select()" call from "readSocket()".
> Actually, as you're just running a RTSP server, you can probably
> remove the "select()" call right now. You could give that a try, to
> see if it improves performance on your system.

Yes, I did this. The figures of 400-500 sessions I quoted had the call to blockUntilReadable commented out.

Marc Neuberger

From mjn at oxysys.com Fri Jun 1 15:00:48 2007
From: mjn at oxysys.com (Marc Neuberger)
Date: Fri, 01 Jun 2007 18:00:48 -0400
Subject: [Live-devel] RTCP socket blocking
Message-ID: <46609710.40802@oxysys.com>

I am having an extremely occasional hang of a live555-based Linux RTSP server under heavy load. I have induced a core dump to see where the hang occurs. It seems that we hang waiting for a packet on the RTCP socket. The RTCP socket does not appear to be set to non-blocking.

Now, at first glance it would appear that, even though it is a blocking socket, this should never happen, since we've had select() (or, in my case, epoll()) report data available. However, it turns out that the Linux kernel feels free to drop UDP packets after notifying a socket that it is readable. From the select() man page:

  Under Linux, select() may report a socket file descriptor as "ready
  for reading", while nevertheless a subsequent read blocks. This could
  for example happen when data has arrived but upon examination has
  wrong checksum and is discarded. There may be other circumstances in
  which a file descriptor is spuriously reported as ready. Thus it may
  be safer to use O_NONBLOCK on sockets that should not block.

As I thought about the complete hang I was seeing, I became suspicious of my theory, since presumably the read would return on the next RTCP packet. So I instrumented my scheduler with timings of the callbacks for turnOnBackgroundReading. I counted callbacks that take >10ms, >100ms and >1000ms. I find that I do see quite a few instances of a background read task taking over 1 second. This leads me to believe that the read of an RTCP packet has to wait for the _next_ RTCP packet from time to time. My server hangs on the exceptionally rare instance that this lost RTCP packet is the _last_ RTCP packet coming from the client.

Is there a built-in assumption that the RTCP socket is blocking? If I just change the code to make it non-blocking, will there be any ill effect on the session when such an RTCP packet is lost?

Marc Neuberger
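(The O_NONBLOCK change the man page suggests is plain POSIX; a minimal sketch follows. Note that live555 has its own wrapper for this, makeSocketNonBlocking(), which the reply from Ross further below puts to use.)

#include <fcntl.h>

// Put a socket into non-blocking mode, so that a spuriously "readable"
// socket makes read()/recvfrom() fail with EAGAIN/EWOULDBLOCK instead of
// hanging the whole event loop:
int makeNonBlocking(int sock) {
  int flags = fcntl(sock, F_GETFL, 0);
  if (flags < 0) return -1;
  return fcntl(sock, F_SETFL, flags | O_NONBLOCK);
}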
From xcsmith at rockwellcollins.com Fri Jun 1 15:34:52 2007
From: xcsmith at rockwellcollins.com (xcsmith at rockwellcollins.com)
Date: Fri, 1 Jun 2007 17:34:52 -0500
Subject: [Live-devel] openRTSP Subsession Destruction
Message-ID: 

I am looking at the source testProgs/playCommon.cpp, in particular the shutdown() and afterPlaying() functions. Does calling Medium::close(subsession->sink) call the subsession destructor also? This is what the comment says in subsessionAfterPlaying(). I set a breakpoint on MediaSubsession::~MediaSubsession() and it does not get called. Are these cleared some other way, or don't they need to be cleared?

./openRTSP -V rtsp://10.145.223.24:8554/incredibles.ts

Ross Finlayson 1/14/07:
> Generally speaking, you should destroy objects in the opposite order
> from that in which they were created. Also, subclasses of "Medium"
> are destroyed using "Medium::close()", whereas "Groupsock" objects
> are destroyed using "delete". (This is historical ugliness...)

Thx! (And thank you very much for helping me with ReceivingInterfaceAddr!)
Xochitl

From finlayson at live555.com Fri Jun 1 16:10:19 2007
From: finlayson at live555.com (Ross Finlayson)
Date: Fri, 1 Jun 2007 16:10:19 -0700
Subject: [Live-devel] RTCP socket blocking
In-Reply-To: <46609710.40802@oxysys.com>
References: <46609710.40802@oxysys.com>
Message-ID: 

>Is there a built-in assumption that the RTCP socket is blocking?

No. Both incoming RTCP and incoming RTP packets are read asynchronously, from the event loop, so their sockets don't need to be blocking.

> If I
>just change the code to make it non-blocking, will there be any ill
>effect on the session when such an RTCP packet is lost?

Probably not. The best way to do this would be to add the line

	makeSocketNonBlocking(fGS->socketNum());

to the (currently empty) "RTPInterface::RTPInterface" constructor (in "liveMedia/RTPInterface.cpp"). That way, it will work for incoming RTP packets also. (Although your application - as a RTSP server - doesn't have any incoming RTP packets, other applications do.)

Please let us know whether that works OK for you. If so, I'll add it to the next release of the code.
-- 
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From finlayson at live555.com Fri Jun 1 16:27:46 2007
From: finlayson at live555.com (Ross Finlayson)
Date: Fri, 1 Jun 2007 16:27:46 -0700
Subject: [Live-devel] openRTSP Subsession Destruction
In-Reply-To: 
References: 
Message-ID: 

>I am looking at the source testProgs/playCommon.cpp, in particular
>the shutdown() and afterPlaying() functions.
>Does calling Medium::close(subsession->sink) call the subsession
>destructor also? This is what the comment says in
>subsessionAfterPlaying().

That comment is perhaps a bit misleading. When it says that it "closes the subsession", it means that it closes down the stream that's associated with the subsession. The actual "MediaSubsession" object itself is closed within the "MediaSession" destructor, which (should) get called from within the "shutdown()" routine (in the call to "Medium::close(session)").
-- 
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
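Putting the quoted advice and this reply together, the teardown in "shutdown()" is shaped roughly like this (a sketch following playCommon.cpp's structure; error handling and the RTSP TEARDOWN itself are omitted):

// Tear down in roughly the reverse order of creation:
MediaSubsessionIterator iter(*session);
MediaSubsession* subsession;
while ((subsession = iter.next()) != NULL) {
  Medium::close(subsession->sink); // closes this subsession's stream
  subsession->sink = NULL;
}
Medium::close(session); // the MediaSession destructor also destroys each
                        // MediaSubsession - hence no explicit delete here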
From julian.lamberty at mytum.de Sat Jun 2 02:59:00 2007
From: julian.lamberty at mytum.de (Julian Lamberty)
Date: Sat, 02 Jun 2007 11:59:00 +0200
Subject: [Live-devel] FramedFilter questions
Message-ID: <46613F64.5060102@mytum.de>

Hi!

I have some questions related to the FramedFilter class. I subclassed it to transcode an MPEG2 stream to MPEG4, so the structure of my program looks like:

MPEG1or2VideoRTPSource -> Transcoder (my class) -> MPEG4VideoStreamDiscreteFramer -> MPEG4ESVideoRTPSink

My Transcoder class reads data as long as it is able to decode one frame. When one frame has been decoded, it is passed to an encoder (ffmpeg), which writes one complete frame into the fTo buffer (limited by fMaxSize). The problem is that the MPEG4 stream I send seems to be totally corrupted. Thus I have some questions: which variables do I have to set before I pass data to MPEG4VideoStreamDiscreteFramer? Do I need to care about fPresentationTime and fDurationInMicroseconds? If yes, how do they have to be set? Right now I just set them to the values I get from the source (presentationTime, and durationInMicroseconds, which equals -1, btw). Is this correct?

I'm somehow stuck right now; I do not know what's wrong. When I just pass packets through, everything works fine (of course I use another structure then: MPEG1or2VideoRTPSource -> Transcoder (just memcpys) -> MPEG2VideoStreamDiscreteFramer -> MPEG1or2VideoRTPSink).

Ideas and help appreciated ;)

Thanks!
Julian

From finlayson at live555.com Sat Jun 2 09:40:48 2007
From: finlayson at live555.com (Ross Finlayson)
Date: Sat, 2 Jun 2007 09:40:48 -0700
Subject: [Live-devel] FramedFilter questions
In-Reply-To: <46613F64.5060102@mytum.de>
References: <46613F64.5060102@mytum.de>
Message-ID: 

>Ideas and help appreciated ;)

If you haven't already done so, look at the commented section of "DeviceSource.cpp" for hints/advice. (Also, of course, the code of existing "FramedFilter" subclasses.)
-- 
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
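For comparison, the usual shape of such a filter's read path is sketched below (assuming one encoded output frame always fits in fMaxSize; MyTranscoder, fInputBuffer, fEncodedBuffer and decodeAndReencode() are illustrative names, not library code):

void MyTranscoder::doGetNextFrame() {
  // Ask the upstream source (here the MPEG1or2VideoRTPSource) for data:
  fInputSource->getNextFrame(fInputBuffer, sizeof fInputBuffer,
                             afterGettingFrame, this,
                             FramedSource::handleClosure, this);
}

void MyTranscoder::afterGettingFrame(void* clientData, unsigned frameSize,
                                     unsigned numTruncatedBytes,
                                     struct timeval presentationTime,
                                     unsigned durationInMicroseconds) {
  MyTranscoder* self = (MyTranscoder*)clientData;
  unsigned n = self->decodeAndReencode(frameSize); // hypothetical helper
  if (n == 0) { self->doGetNextFrame(); return; }  // need more input first
  if (n > self->fMaxSize) {          // never write more than fMaxSize to fTo
    self->fNumTruncatedBytes = n - self->fMaxSize;
    n = self->fMaxSize;
  } else {
    self->fNumTruncatedBytes = 0;
  }
  memcpy(self->fTo, self->fEncodedBuffer, n);
  self->fFrameSize = n;
  self->fPresentationTime = presentationTime; // pass through from the source
  self->fDurationInMicroseconds = durationInMicroseconds;
  FramedSource::afterGetting(self);  // must be the last thing done
}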
From julian.lamberty at mytum.de Sun Jun 3 02:02:02 2007
From: julian.lamberty at mytum.de (Julian Lamberty)
Date: Sun, 03 Jun 2007 11:02:02 +0200
Subject: [Live-devel] FramedFilter questions
References: 46613F64.5060102@mytum.de
Message-ID: <4662838A.4040509@mytum.de>

As you already guessed, I did that ;) I took MP3ADUTranscoder as an example, where presentation time and duration are just passed through. I don't get why my code does not work. When I write one complete frame to fTo, MPEG4VideoStreamDiscreteFramer should be able to pass it correctly to MPEG4ESVideoRTPSink, right? If that is the case, and I can pass through duration and presentation time, then I'm really stuck. I use VLC to receive the data and display the stream, but it gives me lots of "macroblock errors" and "motion vectors not available" warnings...

From ravinder.singh at ti.com Sun Jun 3 22:23:12 2007
From: ravinder.singh at ti.com (singh, Ravinder)
Date: Mon, 4 Jun 2007 10:53:12 +0530
Subject: [Live-devel] Complete file not streamed
In-Reply-To: 
Message-ID: <7EAD1AEEA7621C45899FE99123E124A0E02F9F@dbde01.ent.ti.com>

Hi Ross,

The stream is of 144.159 seconds duration only; the problem is that the server stops streaming the file after just 60 seconds, and as a result the rest of the file is not streamed. Our client waits for the full 144.159 seconds, and when it sends a TEARDOWN to the server after that duration, it reports:

Received TEARDOWN response: RTSP/1.0 454 Session Not Found
CSeq: 5
Session: 93547611

What I think is that openRTSP is not returning responses to the server while streaming is going on, as a result of which the server aborts after some time. The server plays fine with the bit band client.

Regards
Ravinder

From finlayson at live555.com Sun Jun 3 22:37:40 2007
From: finlayson at live555.com (Ross Finlayson)
Date: Sun, 3 Jun 2007 22:37:40 -0700
Subject: [Live-devel] Complete file not streamed
In-Reply-To: <7EAD1AEEA7621C45899FE99123E124A0E02F9F@dbde01.ent.ti.com>
References: <7EAD1AEEA7621C45899FE99123E124A0E02F9F@dbde01.ent.ti.com>
Message-ID: 

>Hi Ross
>Stream is of 144.159 seconds duration only; problem is server stops
>streaming the file after 60 seconds only

Well then that's a problem with your server - not our client! As far as I can tell, our client is working perfectly.

What *might* be happening is that your server is expecting some non-standard periodic keep-alive packets or requests from the client, and if it doesn't receive any within 60 seconds, then the server times out the session. But again, that is a problem with your server. The standard way for a server to check the 'liveness' of a client is via incoming RTCP "RR" packets (from the client). Our client sends such packets.

You need to contact the manufacturer of your server to ask them why it's not working properly with our (IETF standard) client.
-- 
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From bidibulle at operamail.com Mon Jun 4 01:18:45 2007
From: bidibulle at operamail.com (David Betrand)
Date: Mon, 4 Jun 2007 09:18:45 +0100
Subject: [Live-devel] Performance
Message-ID: <20070604081845.C4892CA0BF@ws5-11.us4.outblaze.com>

Hello Marc,

We use the liveMedia library in a server-side application and we noticed the same performance issue with the DelayQueue class. We tried to optimize it but couldn't get good results. Would you mind if I ask you to share your optimized code?

Ross, I understand your point regarding embedded systems, but one solution could be to define a compilation flag which determines whether STL classes can be used. This would allow using optimized code, like what Marc did in DelayQueue, for server-side solutions which require very good performance. Thanks for your feedback.

David

> > However, I don't want to use the STL, because I don't want to
> > make the "LIVE555 Streaming Media" dependent upon it (because I
> > want this code to continue to be useful for (e.g.) embedded
> > systems that are relatively memory constrained, or which might
> > not have the STL available for other reasons).
> >
> Yeah, that makes sense, too. I certainly wouldn't attempt to write
> the equivalent of the STL class (a Red-Black tree) myself. I agree
> that the current implementation is perfectly fine for most uses.
> Largely, I'm offering a tip to others that may find themselves in
> my situation about where to look for performance.
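The compilation-flag idea might look roughly like this (a sketch only; LIVE555_USE_STL and SetBasedDelayQueue are invented names, not existing build options or classes):

// Hypothetical switch in the scheduler's header:
#ifdef LIVE555_USE_STL
#include <set>
typedef SetBasedDelayQueue DelayQueueType; // O(log n), requires the STL
#else
typedef DelayQueue DelayQueueType;         // stock linked list, no STL
#endif

Embedded builds would simply not define LIVE555_USE_STL and keep the existing, dependency-free behaviour.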
From julian.lamberty at mytum.de Mon Jun 4 05:13:08 2007
From: julian.lamberty at mytum.de (Julian Lamberty)
Date: Mon, 04 Jun 2007 14:13:08 +0200
Subject: [Live-devel] MPEG4VideoStreamDiscreteFramer
Message-ID: <466401D4.4010109@mytum.de>

Hi!

I have an encoder that writes MPEG4 frames to a buffer, which I want to stream out with MPEG4ESVideoRTPSink. I'm using the MPEG4VideoStreamDiscreteFramer class in between. The problem is that the stream is corrupted, but if I write the buffer to a file like "test.m4e" it can be played correctly. There seems to be a problem passing the buffer into MPEG4VideoStreamDiscreteFramer.

I have also tried to stream the "test.m4e" I created with the test program "testMPEG4VideoStreamer"; the output is also corrupted. What could that be caused by? I have my encode function write to a buffer and memcpy that buffer into fTo before I set fFrameSize and fNumTruncatedBytes. Could that be a problem related to timestamps?

Please help me, I'm stuck...

Thanks
Julian

From severin.schoepke at gmail.com Mon Jun 4 10:32:39 2007
From: severin.schoepke at gmail.com (Severin Schoepke)
Date: Mon, 04 Jun 2007 19:32:39 +0200
Subject: [Live-devel] Synchronizing audio and video streams
In-Reply-To: 
References: <46571067.6040101@gmail.com> <46602858.7030603@gmail.com>
Message-ID: <46644CB7.1030909@gmail.com>

Hello again!

I investigated a little deeper, and here is what I came up with:

Just to recap: I have two threads; one reads audio and video frames and stores them in two queues, and another thread reads these queues, encodes the frames to MPEG4 and MP3, and streams them out via live555. The live555 part is organized as follows: I have two FramedSource subclasses that provide encoded audio and video frames respectively. These are connected to an MPEG4VideoFramer and an MPEG1or2AudioFramer respectively, which are connected to an MPEG4ESVideoRTPSink and an MPEG1or2AudioRTPSink. To provide the stream, I use a DarwinInjector (I tested your RTSPServer and it yielded the same results).

My problem is that the video source's doGetNextFrame() is called much less often than the audio source's. As a result, the video source's queue fills up with video frames that are not consumed immediately. Therefore, the video stream begins to lag behind the audio stream. Here is some debug output (sorry, it has to be that detailed and long for it to make sense):

Beginning streaming...
Beginning to read from video input...
2007-06-04 15:36:58.101 Mischpult[12008] 1180964218:69929 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:36:58.107 Mischpult[12008] 1180964218:102202 - buffering audio frame (now 2 audio frames in queue)
2007-06-04 15:36:58.133 Mischpult[12008] 1180964218:118806 - buffering video frame (now 1 video frames in queue)
2007-06-04 15:36:58.138 Mischpult[12008] 1180964218:135415 - buffering audio frame (now 3 audio frames in queue)
2007-06-04 15:36:58.152 Mischpult[12008] 1180964218:151894 - doGetNextFrame(): provided Video frame: fFrameSize = 16505 bytes with fPresentationTime = 1180964218:118806 (now 0 Video frames in queue)
2007-06-04 15:36:58.178 Mischpult[12008] 1180964218:168648 - buffering video frame (now 1 video frames in queue)
2007-06-04 15:36:58.179 Mischpult[12008] 1180964218:168648 - buffering audio frame (now 4 audio frames in queue)
2007-06-04 15:36:58.206 Mischpult[12008] 1180964218:201928 - buffering audio frame (now 5 audio frames in queue)
2007-06-04 15:36:58.207 Mischpult[12008] 1180964218:206957 - doGetNextFrame(): provided Video frame: fFrameSize = 10751 bytes with fPresentationTime = 1180964218:168648 (now 0 Video frames in queue)
Beginning to read from audio input...
2007-06-04 15:36:58.209 Mischpult[12008] 1180964218:208258 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964218:69929 (now 4 Audio frames in queue)
2007-06-04 15:36:58.209 Mischpult[12008] 1180964218:208463 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964218:102202 (now 3 Audio frames in queue)
2007-06-04 15:36:58.209 Mischpult[12008] 1180964218:208654 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964218:135415 (now 2 Audio frames in queue)
Play this stream (from the Darwin Streaming Server) using the URL:
2007-06-04 15:36:58.226 Mischpult[12008] 1180964218:218542 - buffering video frame (now 1 video frames in queue)
rtsp://127.0.0.1/test.sdp
2007-06-04 15:36:58.238 Mischpult[12008] 1180964218:235192 - buffering audio frame (now 3 audio frames in queue)
2007-06-04 15:36:58.239 Mischpult[12008] 1180964218:238262 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964218:168648 (now 2 Audio frames in queue)
2007-06-04 15:36:58.239 Mischpult[12008] 1180964218:238564 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964218:201928 (now 1 Audio frames in queue)
2007-06-04 15:36:58.239 Mischpult[12008] 1180964218:238838 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964218:235192 (now 0 Audio frames in queue)
2007-06-04 15:36:58.275 Mischpult[12008] 1180964218:268453 - buffering video frame (now 2 video frames in queue)
2007-06-04 15:36:58.276 Mischpult[12008] 1180964218:268453 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:36:58.291 Mischpult[12008] 1180964218:290250 - doGetNextFrame(): provided Video frame: fFrameSize = 33815 bytes with fPresentationTime = 1180964218:218542 (now 1 Video frames in queue)
2007-06-04 15:36:58.304 Mischpult[12008] 1180964218:301744 - buffering audio frame (now 2 audio frames in queue)
2007-06-04 15:36:58.324 Mischpult[12008] 1180964218:323193 - doGetNextFrame(): provided Video frame: fFrameSize = 18718 bytes with fPresentationTime = 1180964218:268453 (now 0 Video frames in queue)
2007-06-04 15:36:58.327 Mischpult[12008] 1180964218:326644 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964218:268453 (now 1 Audio frames in queue)
2007-06-04 15:36:58.327 Mischpult[12008] 1180964218:326893 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964218:301744 (now 0 Audio frames in queue)
2007-06-04 15:36:58.328 Mischpult[12008] 1180964218:318377 - buffering video frame (now 2 video frames in queue)
2007-06-04 15:36:58.338 Mischpult[12008] 1180964218:334998 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:36:58.338 Mischpult[12008] 1180964218:337320 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964218:334998 (now 0 Audio frames in queue)
2007-06-04 15:36:58.366 Mischpult[12008] 1180964218:365940 - doGetNextFrame(): provided Video frame: fFrameSize = 15993 bytes with fPresentationTime = 1180964218:318377 (now 0 Video frames in queue)
2007-06-04 15:36:58.376 Mischpult[12008] 1180964218:368262 - buffering video frame (now 1 video frames in queue)
2007-06-04 15:36:58.376 Mischpult[12008] 1180964218:368262 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:36:58.405 Mischpult[12008] 1180964218:401508 - buffering audio frame (now 2 audio frames in queue)
2007-06-04 15:36:58.406 Mischpult[12008] 1180964218:405368 - doGetNextFrame(): provided Video frame: fFrameSize = 11362 bytes with fPresentationTime = 1180964218:368262 (now 0 Video frames in queue)
2007-06-04 15:36:58.406 Mischpult[12008] 1180964218:405875 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964218:368262 (now 1 Audio frames in queue)
2007-06-04 15:36:58.407 Mischpult[12008] 1180964218:406090 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964218:401508 (now 0 Audio frames in queue)
2007-06-04 15:36:58.425 Mischpult[12008] 1180964218:418148 - buffering video frame (now 1 video frames in queue)
2007-06-04 15:36:58.438 Mischpult[12008] 1180964218:434783 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:36:58.438 Mischpult[12008] 1180964218:437275 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964218:434783 (now 0 Audio frames in queue)
2007-06-04 15:36:58.463 Mischpult[12008] 1180964218:462300 - doGetNextFrame(): provided Video frame: fFrameSize = 6914 bytes with fPresentationTime = 1180964218:418148 (now 0 Video frames in queue)
2007-06-04 15:36:58.475 Mischpult[12008] 1180964218:468062 - buffering video frame (now 1 video frames in queue)
2007-06-04 15:36:58.475 Mischpult[12008] 1180964218:468062 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:36:58.480 Mischpult[12008] 1180964218:479619 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964218:468062 (now 0 Audio frames in queue)
2007-06-04 15:36:58.503 Mischpult[12008] 1180964218:501314 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:36:58.504 Mischpult[12008] 1180964218:503167 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964218:501314 (now 0 Audio frames in queue)
2007-06-04 15:36:58.526 Mischpult[12008] 1180964218:517955 - buffering video frame (now 2 video frames in queue)
2007-06-04 15:36:58.537 Mischpult[12008] 1180964218:534568 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:36:58.537 Mischpult[12008] 1180964218:536853 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964218:534568 (now 0 Audio frames in queue)
2007-06-04 15:36:58.576 Mischpult[12008] 1180964218:567849 - buffering video frame (now 3 video frames in queue)
2007-06-04 15:36:58.576 Mischpult[12008] 1180964218:567849 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:36:58.576 Mischpult[12008] 1180964218:575914 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964218:567849 (now 0 Audio frames in queue)
2007-06-04 15:36:58.604 Mischpult[12008] 1180964218:601117 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:36:58.604 Mischpult[12008] 1180964218:603868 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964218:601117 (now 0 Audio frames in queue)
2007-06-04 15:36:58.625 Mischpult[12008] 1180964218:617749 - buffering video frame (now 4 video frames in queue)
2007-06-04 15:36:58.637 Mischpult[12008] 1180964218:634375 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:36:58.637 Mischpult[12008] 1180964218:636793 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964218:634375 (now 0 Audio frames in queue)
2007-06-04 15:36:58.674 Mischpult[12008] 1180964218:667639 - buffering video frame (now 5 video frames in queue)
2007-06-04 15:36:58.675 Mischpult[12008] 1180964218:667639 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:36:58.677 Mischpult[12008] 1180964218:676360 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964218:667639 (now 0 Audio frames in queue)
2007-06-04 15:36:58.703 Mischpult[12008] 1180964218:700917 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:36:58.703 Mischpult[12008] 1180964218:702700 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964218:700917 (now 0 Audio frames in queue)
2007-06-04 15:36:58.725 Mischpult[12008] 1180964218:717556 - buffering video frame (now 6 video frames in queue)
2007-06-04 15:36:58.737 Mischpult[12008] 1180964218:734194 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:36:58.737 Mischpult[12008] 1180964218:736303 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964218:734194 (now 0 Audio frames in queue)
2007-06-04 15:36:58.775 Mischpult[12008] 1180964218:767466 - buffering video frame (now 7 video frames in queue)
2007-06-04 15:36:58.776 Mischpult[12008] 1180964218:767466 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:36:58.777 Mischpult[12008] 1180964218:776165 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964218:767466 (now 0 Audio frames in queue)
2007-06-04 15:36:58.804 Mischpult[12008] 1180964218:800710 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:36:58.804 Mischpult[12008] 1180964218:803513 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964218:800710 (now 0 Audio frames in queue)
2007-06-04 15:36:58.824 Mischpult[12008] 1180964218:817343 - buffering video frame (now 8 video frames in queue)
2007-06-04 15:36:58.836 Mischpult[12008] 1180964218:833977 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:36:58.836 Mischpult[12008] 1180964218:835898 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964218:833977 (now 0 Audio frames in queue)
2007-06-04 15:36:58.874 Mischpult[12008] 1180964218:867232 - buffering video frame (now 9 video frames in queue)
2007-06-04 15:36:58.874 Mischpult[12008] 1180964218:867232 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:36:58.874 Mischpult[12008] 1180964218:873908 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964218:867232 (now 0 Audio frames in queue)
2007-06-04 15:36:58.903 Mischpult[12008] 1180964218:900515 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:36:58.903 Mischpult[12008] 1180964218:902707 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964218:900515 (now 0 Audio frames in queue)
2007-06-04 15:36:58.925 Mischpult[12008] 1180964218:917143 - buffering video frame (now 10 video frames in queue)
2007-06-04 15:36:58.936 Mischpult[12008] 1180964218:933803 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:36:58.936 Mischpult[12008] 1180964218:935908 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964218:933803 (now 0 Audio frames in queue)
2007-06-04 15:36:58.975 Mischpult[12008] 1180964218:967039 - buffering video frame (now 11 video frames in queue)
2007-06-04 15:36:58.975 Mischpult[12008] 1180964218:967039 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:36:58.976 Mischpult[12008] 1180964218:975325 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964218:967039 (now 0 Audio frames in queue)
2007-06-04 15:36:59.004 Mischpult[12008] 1180964219:318 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:36:59.004 Mischpult[12008] 1180964219:3354 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964219:318 (now 0 Audio frames in queue)
2007-06-04 15:36:59.024 Mischpult[12008] 1180964219:16939 - buffering video frame (now 12 video frames in queue)
2007-06-04 15:36:59.036 Mischpult[12008] 1180964219:33566 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:36:59.037 Mischpult[12008] 1180964219:36807 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964219:33566 (now 0 Audio frames in queue)
2007-06-04 15:36:59.073 Mischpult[12008] 1180964219:66841 - buffering video frame (now 13 video frames in queue)
2007-06-04 15:36:59.074 Mischpult[12008] 1180964219:66841 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:36:59.078 Mischpult[12008] 1180964219:77372 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964219:66841 (now 0 Audio frames in queue)
2007-06-04 15:36:59.102 Mischpult[12008] 1180964219:100110 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:36:59.103 Mischpult[12008] 1180964219:102081 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964219:100110 (now 0 Audio frames in queue)
2007-06-04 15:36:59.124 Mischpult[12008] 1180964219:116753 - buffering video frame (now 14 video frames in queue)
2007-06-04 15:36:59.136 Mischpult[12008] 1180964219:133379 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:36:59.136 Mischpult[12008] 1180964219:135859 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964219:133379 (now 0 Audio frames in queue)
2007-06-04 15:36:59.174 Mischpult[12008] 1180964219:166671 - buffering video frame (now 15 video frames in queue)
2007-06-04 15:36:59.175 Mischpult[12008] 1180964219:166671 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:36:59.175 Mischpult[12008] 1180964219:174519 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964219:166671 (now 0 Audio frames in queue)
2007-06-04 15:36:59.203 Mischpult[12008] 1180964219:199908 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:36:59.203 Mischpult[12008] 1180964219:202622 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964219:199908 (now 0 Audio frames in queue)
2007-06-04 15:36:59.224 Mischpult[12008] 1180964219:216545 - buffering video frame (now 16 video frames in queue)
2007-06-04 15:36:59.235 Mischpult[12008] 1180964219:233167 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:36:59.236 Mischpult[12008] 1180964219:235221 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964219:233167 (now 0 Audio frames in queue)
2007-06-04 15:36:59.260 Mischpult[12008] 1180964219:259716 - doGetNextFrame(): provided Video frame: fFrameSize = 5938 bytes with fPresentationTime = 1180964218:468062 (now 15 Video frames in queue)
2007-06-04 15:36:59.273 Mischpult[12008] 1180964219:266457 - buffering video frame (now 16 video frames in queue)
2007-06-04 15:36:59.274 Mischpult[12008] 1180964219:266457 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:36:59.274 Mischpult[12008] 1180964219:273770 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964219:266457 (now 0 Audio frames in queue)
2007-06-04 15:36:59.302 Mischpult[12008] 1180964219:299706 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:36:59.302 Mischpult[12008] 1180964219:301692 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964219:299706 (now 0 Audio frames in queue)
2007-06-04 15:36:59.324 Mischpult[12008] 1180964219:316339 - buffering video frame (now 17 video frames in queue)
2007-06-04 15:36:59.335 Mischpult[12008] 1180964219:332990 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:36:59.336 Mischpult[12008] 1180964219:335085 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964219:332990 (now 0 Audio frames in queue)
2007-06-04 15:36:59.374 Mischpult[12008] 1180964219:366240 - buffering video frame (now 18 video frames in queue)
2007-06-04 15:36:59.375 Mischpult[12008] 1180964219:366240 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:36:59.375 Mischpult[12008] 1180964219:374349 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964219:366240 (now 0 Audio frames in queue)
2007-06-04 15:36:59.414 Mischpult[12008] 1180964219:399538 - buffering video frame (now 19 video frames in queue)
2007-06-04 15:36:59.415 Mischpult[12008] 1180964219:399538 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:36:59.415 Mischpult[12008] 1180964219:414506 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964219:399538 (now 0 Audio frames in queue)
2007-06-04 15:36:59.443 Mischpult[12008] 1180964219:432771 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:36:59.443 Mischpult[12008] 1180964219:442352 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964219:432771 (now 0 Audio frames in queue)
2007-06-04 15:36:59.463 Mischpult[12008] 1180964219:449407 - buffering video frame (now 20 video frames in queue)
2007-06-04 15:36:59.476 Mischpult[12008] 1180964219:466037 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:36:59.477 Mischpult[12008] 1180964219:476075 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964219:466037 (now 0 Audio frames in queue)
2007-06-04 15:36:59.513 Mischpult[12008] 1180964219:499305 - buffering video frame (now 21 video frames in queue)
2007-06-04 15:36:59.514 Mischpult[12008] 1180964219:499305 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:36:59.514 Mischpult[12008] 1180964219:513593 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964219:499305 (now 0 Audio frames in queue)
2007-06-04 15:36:59.542 Mischpult[12008] 1180964219:532585 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:36:59.542 Mischpult[12008] 1180964219:541832 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964219:532585 (now 0 Audio frames in queue)
2007-06-04 15:36:59.564 Mischpult[12008] 1180964219:549203 - buffering video frame (now 22 video frames in queue)
2007-06-04 15:36:59.576 Mischpult[12008] 1180964219:565836 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:36:59.576 Mischpult[12008] 1180964219:575712 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964219:565836 (now 0 Audio frames in queue)
2007-06-04 15:36:59.613 Mischpult[12008] 1180964219:599108 - buffering video frame (now 23 video frames in queue)
2007-06-04 15:36:59.614 Mischpult[12008] 1180964219:599108 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:36:59.614 Mischpult[12008] 1180964219:613753 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964219:599108 (now 0 Audio frames in queue)
2007-06-04 15:36:59.643 Mischpult[12008] 1180964219:632373 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:36:59.643 Mischpult[12008] 1180964219:642488 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964219:632373 (now 0 Audio frames in queue)
2007-06-04 15:36:59.664 Mischpult[12008] 1180964219:649007 - buffering video frame (now 24 video frames in queue)
2007-06-04 15:36:59.675 Mischpult[12008] 1180964219:665632 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:36:59.676 Mischpult[12008] 1180964219:675335 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964219:665632 (now 0 Audio frames in queue)
2007-06-04 15:36:59.714 Mischpult[12008] 1180964219:698906 - buffering video frame (now 25 video frames in queue)
2007-06-04 15:36:59.714 Mischpult[12008] 1180964219:698906 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:36:59.718 Mischpult[12008] 1180964219:717617 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964219:698906 (now 0 Audio frames in queue)
2007-06-04 15:36:59.743 Mischpult[12008] 1180964219:732171 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:36:59.743 Mischpult[12008] 1180964219:742665 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964219:732171 (now 0 Audio frames in queue)
2007-06-04 15:36:59.764 Mischpult[12008] 1180964219:748806 - buffering video frame (now 26 video frames in queue)
2007-06-04 15:36:59.770 Mischpult[12008] 1180964219:769106 - doGetNextFrame(): provided Video frame: fFrameSize = 4694 bytes with fPresentationTime = 1180964218:517955 (now 25 Video frames in queue)
2007-06-04 15:36:59.775 Mischpult[12008] 1180964219:765447 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:36:59.776 Mischpult[12008] 1180964219:775119 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964219:765447 (now 0 Audio frames in queue)
2007-06-04 15:36:59.813 Mischpult[12008] 1180964219:798720 - buffering video frame (now 26 video frames in queue)
2007-06-04 15:36:59.813 Mischpult[12008] 1180964219:798720 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:36:59.813 Mischpult[12008] 1180964219:812770 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964219:798720 (now 0 Audio frames in queue)
2007-06-04 15:36:59.842 Mischpult[12008] 1180964219:831974 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:36:59.842 Mischpult[12008] 1180964219:841810 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964219:831974 (now 0 Audio frames in queue)
2007-06-04 15:36:59.863 Mischpult[12008] 1180964219:848610 - buffering video frame (now 27 video frames in queue)
2007-06-04 15:36:59.875 Mischpult[12008] 1180964219:865244 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:36:59.879 Mischpult[12008] 1180964219:878629 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964219:865244 (now 0 Audio frames in queue)
2007-06-04 15:36:59.913 Mischpult[12008] 1180964219:898514 - buffering video frame (now 28 video frames in queue)
2007-06-04 15:36:59.913 Mischpult[12008] 1180964219:898514 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:36:59.921 Mischpult[12008] 1180964219:920452 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964219:898514 (now 0 Audio frames in queue)
2007-06-04 15:36:59.942 Mischpult[12008] 1180964219:931772 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:36:59.942 Mischpult[12008] 1180964219:941443 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964219:931772 (now 0 Audio frames in queue)
2007-06-04 15:36:59.963 Mischpult[12008] 1180964219:948405 - buffering video frame (now 29 video frames in queue)
2007-06-04 15:36:59.975 Mischpult[12008] 1180964219:965043 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:36:59.976 Mischpult[12008] 1180964219:975041 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964219:965043 (now 0 Audio frames in queue)
2007-06-04 15:37:00.013 Mischpult[12008] 1180964219:998321 - buffering video frame (now 30 video frames in queue)
2007-06-04 15:37:00.013 Mischpult[12008] 1180964219:998321 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:37:00.014 Mischpult[12008] 1180964220:13239 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964219:998321 (now 0 Audio frames in queue)
2007-06-04 15:37:00.041 Mischpult[12008] 1180964220:31567 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:37:00.041 Mischpult[12008] 1180964220:40842 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964220:31567 (now 0 Audio frames in queue)
2007-06-04 15:37:00.063 Mischpult[12008] 1180964220:48199 - buffering video frame (now 31 video frames in queue)
2007-06-04 15:37:00.075 Mischpult[12008] 1180964220:64833 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:37:00.078 Mischpult[12008] 1180964220:77866 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964220:64833 (now 0 Audio frames in queue)
2007-06-04 15:37:00.113 Mischpult[12008] 1180964220:98101 - buffering video frame (now 32 video frames in queue)
2007-06-04 15:37:00.113 Mischpult[12008] 1180964220:98101 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:37:00.119 Mischpult[12008] 1180964220:117985 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964220:98101 (now 0 Audio frames in queue)
2007-06-04 15:37:00.141 Mischpult[12008] 1180964220:131378 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:37:00.141 Mischpult[12008] 1180964220:140775 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964220:131378 (now 0 Audio frames in queue)
2007-06-04 15:37:00.163 Mischpult[12008] 1180964220:147998 - buffering video frame (now 33 video frames in queue)
2007-06-04 15:37:00.175 Mischpult[12008] 1180964220:164647 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:37:00.175 Mischpult[12008] 1180964220:174591 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964220:164647 (now 0 Audio frames in queue)
2007-06-04 15:37:00.212 Mischpult[12008] 1180964220:197905 - buffering video frame (now 34 video frames in queue)
2007-06-04 15:37:00.213 Mischpult[12008] 1180964220:197905 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:37:00.213 Mischpult[12008] 1180964220:212492 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964220:197905 (now 0 Audio frames in queue)
2007-06-04 15:37:00.241 Mischpult[12008] 1180964220:231172 - buffering audio frame (now 1 audio frames in queue)
2007-06-04 15:37:00.241 Mischpult[12008] 1180964220:240791 - doGetNextFrame(): provided Audio frame: fFrameSize = 384 bytes with fPresentationTime = 1180964220:231172 (now 0 Audio frames in queue)
2007-06-04 15:37:00.262 Mischpult[12008] 1180964220:247802 - buffering video frame (now 35 video frames in queue)
2007-06-04 15:37:00.269 Mischpult[12008] 1180964220:268774 - doGetNextFrame(): provided Video frame: fFrameSize = 4600 bytes with fPresentationTime = 1180964218:567849 (now 34 Video frames in queue)

I also added some debug printfs to MultiframedRTPSink and found out that the sinks are queried in the same 'asynchronous' manner (the video sink is queried much less often than the audio sink). The strange numbers in the following are pointer values; they let you distinguish the audio and the video sink.

Beginning streaming...
Beginning to read from video input...
MultiframedRTPSink (0xef6a0c0) querying input source (0xef6a420)
MultiframedRTPSink (0xef6a0c0) querying input source (0xef6a420)
MultiframedRTPSink (0xef6a0c0) querying input source (0xef6a420)
MultiframedRTPSink (0xef6a0c0) querying input source (0xef6a420)
MultiframedRTPSink (0xef6a0c0) querying input source (0xef6a420)
Beginning to read from audio input...
MultiframedRTPSink (0xef67490) querying input source (0xef6a710)
MultiframedRTPSink (0xef67490) querying input source (0xef6a710)
MultiframedRTPSink (0xef67490) querying input source (0xef6a710)
Play this stream (from the Darwin Streaming Server) using the URL: rtsp://127.0.0.1/test.sdp
MultiframedRTPSink (0xef67490) querying input source (0xef6a710)
MultiframedRTPSink (0xef67490) querying input source (0xef6a710)
MultiframedRTPSink (0xef67490) querying input source (0xef6a710)
MultiframedRTPSink (0xef6a0c0) querying input source (0xef6a420)
MultiframedRTPSink (0xef67490) querying input source (0xef6a710)
MultiframedRTPSink (0xef67490) querying input source (0xef6a710)
MultiframedRTPSink (0xef67490) querying input source (0xef6a710)
MultiframedRTPSink (0xef6a0c0) querying input source (0xef6a420)
MultiframedRTPSink (0xef67490) querying input source (0xef6a710)
...(this audio sink line repeats ~70 more times)
MultiframedRTPSink (0xef6a0c0) querying input source (0xef6a420)
MultiframedRTPSink (0xef67490) querying input source (0xef6a710)

Now to my question: why are the video sink, and thus the video source, queried at a much lower frequency than their audio counterparts?

I'm sorry for the long mail, but I hope it makes sense that way...

cheers,
Severin

Ross Finlayson wrote:
>> So I tried to use your RTSP server but it didn't change the situation.
>> My streams are still diverting. I investigated the problem further and
>> fond out, that live555 queries my sound source much more often than it
>> queries the video source. I suspect it has something to do with data
>> size
>>
>
> No, that shouldn't matter.  If you (i) give your data (audio and
> video) accurate presentation times (the "fPresentationTime" variable)
> - tied to the local 'wall clock' time (e.g., using "gettimeofday()"),
> and (ii) use RTCP (by creating "RTCPInstance" objects for each
> "RTPSink"), then audio/video sync *will* work correctly at the client
> end.
>
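In code, those two conditions amount to roughly the following sketch; "MyVideoSource", "env", "rtcpGroupsock", and "videoSink" are assumed names from a typical setup (not from this thread), and the bandwidth figure is illustrative only:

#include <sys/time.h>
#include <unistd.h>

// (i) In the FramedSource subclass, stamp each outgoing frame with the
// wall-clock time at which it was captured/encoded:
void MyVideoSource::deliverFrame() {
  // ... copy at most fMaxSize bytes of encoded data into fTo, and set
  // fFrameSize / fNumTruncatedBytes accordingly ...
  gettimeofday(&fPresentationTime, NULL); // tied to the local 'wall clock'
  FramedSource::afterGetting(this);
}

// (ii) When setting up each RTPSink, pair it with an RTCPInstance:
unsigned const estimatedSessionBandwidth = 500; // in kbps; illustrative only
unsigned const maxCNAMElen = 100;
unsigned char CNAME[maxCNAMElen + 1];
gethostname((char*)CNAME, maxCNAMElen);
CNAME[maxCNAMElen] = '\0';
RTCPInstance::createNew(*env, &rtcpGroupsock, estimatedSessionBandwidth,
                        CNAME, videoSink, NULL /*source: we're sending*/);

The RTCP sender reports are what let a receiver map each stream's RTP timestamps onto a common wall-clock timeline; without the "RTCPInstance" objects the two streams cannot be locked together at the client.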
From raphael.kindt at gmail.com  Tue Jun  5 00:33:05 2007
From: raphael.kindt at gmail.com (Raphaël Kindt)
Date: Tue, 5 Jun 2007 09:33:05 +0200
Subject: [Live-devel] [live-devel] Too long delay for JPEG live image.
Message-ID: <8b9f68640706050033w75f13b16l62d8abea5417f48b@mail.gmail.com>

Hello,

I'm trying to develop an MJPEG live streaming server.
I work with a camera and our specific driver (which I also wrote) for our frame grabber.
To do this I've learned from the Elphel example.
I've written a JPEGVideoSource-derived class and overridden the doGetNextFrame function.

Now I can see live JPEG video from the camera remotely with a VLC client program.
But I observe a delay of 2 seconds between the images received by the driver and those received by the VLC client.
It seems there is too big a FIFO buffer for live images between them.
I've disabled our FIFO buffer to see what happens.
Now I copy the image (30 kB) directly to the FramedSource::fTo data member.
But the problem is still present.

How can I reduce this delay?

Thanks in advance for your help...
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://lists.live555.com/pipermail/live-devel/attachments/20070605/59ee8926/attachment.html

From finlayson at live555.com  Tue Jun  5 01:36:26 2007
From: finlayson at live555.com (Ross Finlayson)
Date: Tue, 5 Jun 2007 01:36:26 -0700
Subject: [Live-devel] [live-devel] Too long delay for JPEG live image.
In-Reply-To: <8b9f68640706050033w75f13b16l62d8abea5417f48b@mail.gmail.com>
References: <8b9f68640706050033w75f13b16l62d8abea5417f48b@mail.gmail.com>
Message-ID:

>But I observe a delay of 2 seconds between the images received by the
>driver and those received by the VLC client.
>[...]
>How can I reduce this delay?

There's no significant delay in the "LIVE555 Streaming Media" code - at either the sending end or the receiving (VLC) end.

However, VLC does have a separate jitter buffer that - by default - adds 1.2 seconds (1200 ms) at the receiving end. You can reduce this by changing VLC's
	Preferences->Input/Codecs->Demuxers->RTP/RTSP->Advanced->Caching value (ms)

Don't forget also that JPEG encoding (even in hardware) and decoding also add some delay, which you are unlikely to be able to reduce.
-- 

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
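The same jitter-buffer setting can usually also be given on VLC's command line. The exact option name varies between VLC versions, so treat the following line as an assumption to check against your build (the URL is a placeholder):

	vlc --rtsp-caching=50 rtsp://camera-host/stream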
From raphael.kindt at gmail.com  Tue Jun  5 02:26:18 2007
From: raphael.kindt at gmail.com (Raphael KINDT)
Date: Tue, 5 Jun 2007 11:26:18 +0200
Subject: [Live-devel] [live-devel] Too long delay for JPEG live image.
In-Reply-To:
Message-ID:

OK... It works fine now, just by changing the caching value to 50 ms.
Thanks! :-)

-----Original Message-----
From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson
Sent: Tuesday, June 5, 2007 10:36
To: LIVE555 Streaming Media - development & use
Subject: Re: [Live-devel] [live-devel] Too long delay for JPEG live image.

>How can I reduce this delay?

There's no significant delay in the "LIVE555 Streaming Media" code [...]
You can reduce this by changing VLC's
	Preferences->Input/Codecs->Demuxers->RTP/RTSP->Advanced->Caching value (ms)

From raphael.kindt at gmail.com  Tue Jun  5 03:35:50 2007
From: raphael.kindt at gmail.com (Raphael KINDT)
Date: Tue, 5 Jun 2007 12:35:50 +0200
Subject: [Live-devel] [live-devel] Too long delay for JPEG live image.
In-Reply-To:
Message-ID:

Hello,

I have another question.
I observe that there is a difference between my original JPEG header and the header constructed by the client.
To detect this difference I've used openRTSP.
The difference is the quantization table.
I've used a qFactor = 128 and I've overridden JPEGVideoSource::quantizationTables (with precision = 0 and length = 128).
I've forced the driver (and thus the hardware coder) to use the same table as the one JPEGVideoRTPSource uses.
The driver uses 100% for the quality factor.
After all that, there is still a difference.

What can I do to use the same quantization table as the one the original images use?

Thanks a lot.
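For reference, overriding that accessor looks roughly like the sketch below ("MyJPEGVideoSource" and "fQTable" are placeholder names; the signature should be checked against "JPEGVideoSource.hh"):

u_int8_t const* MyJPEGVideoSource::quantizationTables(u_int8_t& precision,
                                                      u_int16_t& length) {
  precision = 0;   // 8-bit quantization-table entries
  length = 128;    // 64 luminance entries followed by 64 chrominance entries
  // Return the exact 128 table bytes that the hardware coder wrote into
  // this frame's JPEG headers (in zigzag order, luminance table first):
  return fQTable;
}

Per RFC 2435, a Q factor of 128 or above means the tables travel in-band like this, and the receiver rebuilds its JPEG header from exactly these bytes - so if they are not byte-identical to the coder's real tables, the reconstructed header will differ in just the way described above.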
From adigupt_000 at rediffmail.com  Tue Jun  5 23:13:10 2007
From: adigupt_000 at rediffmail.com (aditya gupta)
Date: 6 Jun 2007 06:13:10 -0000
Subject: [Live-devel] how to use live555 with JMF
Message-ID: <20070606061310.28074.qmail@webmail90.rediffmail.com>

hi,

I want to use live555MediaServer as a streaming server and JMStudio (a JMF player) as the client-side player to play the media streamed by the live555 server. Can anyone tell me how to do that?

thanx
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://lists.live555.com/pipermail/live-devel/attachments/20070605/8d305876/attachment.html

From adigupt_000 at rediffmail.com  Tue Jun  5 23:15:14 2007
From: adigupt_000 at rediffmail.com (aditya gupta)
Date: 6 Jun 2007 06:15:14 -0000
Subject: [Live-devel] how to use live555 as a RTP server
Message-ID: <20070606061514.24514.qmail@webmail87.rediffmail.com>

hi,

Can anyone tell me how to use live555MediaServer to transmit a media file through RTP (i.e. without using any RTSP messages)?

thanx
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://lists.live555.com/pipermail/live-devel/attachments/20070605/7972b134/attachment.html

From finlayson at live555.com  Tue Jun  5 23:19:54 2007
From: finlayson at live555.com (Ross Finlayson)
Date: Tue, 5 Jun 2007 23:19:54 -0700
Subject: [Live-devel] how to use live555 with JMF
In-Reply-To: <20070606061310.28074.qmail@webmail90.rediffmail.com>
References: <20070606061310.28074.qmail@webmail90.rediffmail.com>
Message-ID:

>I want to use live555MediaServer as a streaming server and JMStudio
>(a JMF player) as the client-side player to play the media streamed by
>the live555 server. Can anyone tell me how to do that?

You will have to ask a JMF mailing list. The "live555MediaServer" is a standards-compliant RTSP/RTP server, so any standard RTSP/RTP client should be able to play from it. Unfortunately this is not the right mailing list for someone who wants to develop a client in Java.
-- 

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From finlayson at live555.com  Tue Jun  5 23:48:21 2007
From: finlayson at live555.com (Ross Finlayson)
Date: Tue, 5 Jun 2007 23:48:21 -0700
Subject: [Live-devel] how to use live555 as a RTP server
In-Reply-To: <20070606061514.24514.qmail@webmail87.rediffmail.com>
References: <20070606061514.24514.qmail@webmail87.rediffmail.com>
Message-ID:

>Can anyone tell me how to use live555MediaServer to transmit a
>media file through RTP (i.e. without using any RTSP messages)?

The "live555MediaServer" can't do this; it's a RTSP server.

However, you should look at the various demo applications in the "testProgs" directory (in the "LIVE555 Streaming Media" software). There are several applications there - called "test*Streamer" - that stream via RTP, in some cases without a built-in RTSP server.
-- 

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
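The core of those "test*Streamer" programs is small. Below is a condensed sketch modeled on "testMPEG1or2VideoStreamer", simplified in that it assumes the input file is already a video Elementary Stream (the real test program first demultiplexes a Program Stream); the multicast address, port, and file name are placeholders:

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"
#include "GroupsockHelper.hh"

UsageEnvironment* env;

void afterPlaying(void* /*clientData*/) {
  // e.g., loop the file, or exit the program
}

int main() {
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  env = BasicUsageEnvironment::createNew(*scheduler);

  // Send RTP to a multicast group - no RTSP involved:
  struct in_addr destinationAddress;
  destinationAddress.s_addr = our_inet_addr("239.255.42.42");
  Groupsock rtpGroupsock(*env, destinationAddress, Port(8888), 7 /*ttl*/);
  RTPSink* videoSink = MPEG1or2VideoRTPSink::createNew(*env, &rtpGroupsock);

  // Read the input file and frame it for RTP:
  ByteStreamFileSource* fileSource
    = ByteStreamFileSource::createNew(*env, "test.mpg");
  FramedSource* videoSource
    = MPEG1or2VideoStreamFramer::createNew(*env, fileSource);

  videoSink->startPlaying(*videoSource, afterPlaying, videoSink);
  env->taskScheduler().doEventLoop(); // does not return
  return 0;
}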
From ken.hilliard at gmail.com  Wed Jun  6 00:44:24 2007
From: ken.hilliard at gmail.com (Ken Hilliard)
Date: Wed, 6 Jun 2007 14:44:24 +0700
Subject: [Live-devel] multiple audio stream w/Live555 Media Server
Message-ID: <4b531160706060044g6973a44amedc0278a97087c29@mail.gmail.com>

I downloaded and installed the Linux version of live555MediaServer into a directory on Ubuntu. I copied my mpegts files into that directory and started streaming videos with a single audio track to an Amino 125 STB. The video was H.264 and the audio AAC. I used the Amino's javascript play function (AVMedia.Play), passing it the url "rtsp://192.168.2.5:8554/test.ts" to stream and play the video. At this point everything's OK. Then I added video files with 2 audio streams. There seems to be only 1 audio stream received--at least as reported by the Amino STB. Also, the reported audio pid is not the same value that is in the mpegts source file. To verify the mpegts file's correctness I used a media player to play the video and could select either audio stream. I also used an mpeg editor to verify the pes and stream ids.

Is there something special I must do to stream multiple audio streams? Does live555MediaServer output logging that would provide diagnostic information? Is there a simple command-line tool that could be used to connect to the RTSP server and report the available streams?

BTW: I'm a newbie. thx
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://lists.live555.com/pipermail/live-devel/attachments/20070606/3dcb804c/attachment.html

From finlayson at live555.com  Wed Jun  6 00:56:58 2007
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 6 Jun 2007 00:56:58 -0700
Subject: [Live-devel] multiple audio stream w/Live555 Media Server
In-Reply-To: <4b531160706060044g6973a44amedc0278a97087c29@mail.gmail.com>
References: <4b531160706060044g6973a44amedc0278a97087c29@mail.gmail.com>
Message-ID:

>Then I added video files with 2 audio
>streams. There seems to be only 1 audio stream received--at least as
>reported by the Amino STB.

This is an issue with your Amino STB client, not our server. Our server - when streaming MPEG Transport Stream data - does not demultiplex the Transport Stream; instead, it streams the entire Transport Stream 'as is'. Therefore, if your client cannot play the stream, then it is a problem with your client.

Unfortunately we cannot help you debug problems with the Amino STB client. Instead, you should contact Amino, or refer to some other (Amino-related) mailing list or forum.
-- 

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From adigupt_000 at rediffmail.com  Wed Jun  6 01:09:50 2007
From: adigupt_000 at rediffmail.com (aditya gupta)
Date: 6 Jun 2007 08:09:50 -0000
Subject: [Live-devel] how to stream H.264 via RTP
Message-ID: <20070606080950.1121.qmail@webmail91.rediffmail.com>

hi,

I am able to stream MPEG1/2 videos through RTP using the testMPEG1or2VideoStreamer program (testProgs), but I want to stream H.264 videos through RTP. I read http://www.live555.com/liveMedia/faq.html#h264-streaming but was not able to understand it completely. Can anyone tell me how to do this (i.e., what is the role of the "H264VideoStreamFramer" module)?

thanx
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://lists.live555.com/pipermail/live-devel/attachments/20070606/5cc2d5cf/attachment.html

From ken.hilliard at gmail.com  Wed Jun  6 01:47:45 2007
From: ken.hilliard at gmail.com (Ken Hilliard)
Date: Wed, 6 Jun 2007 15:47:45 +0700
Subject: [Live-devel] multiple audio stream w/Live555 Media Server
In-Reply-To:
References: <4b531160706060044g6973a44amedc0278a97087c29@mail.gmail.com>
Message-ID: <4b531160706060147r36af0556vbeebca0390ba9477@mail.gmail.com>

thx Ross. I was also able to verify this by using the VLC client in place of my STB, and it played all the audio tracks with no problem.

On 6/6/07, Ross Finlayson wrote:
> This is an issue with your Amino STB client, not our server. Our
> server - when streaming MPEG Transport Stream data - does not
> demultiplex the Transport Stream; instead, it streams the entire
> Transport Stream 'as is'.
> [...]
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://lists.live555.com/pipermail/live-devel/attachments/20070606/bba05441/attachment.html

From may_ank77 at yahoo.com  Wed Jun  6 01:52:01 2007
From: may_ank77 at yahoo.com (mayank agarwal)
Date: Wed, 6 Jun 2007 01:52:01 -0700 (PDT)
Subject: [Live-devel] RTSP client application
Message-ID: <931626.53465.qm@web90406.mail.mud.yahoo.com>

Hello all,

I am new to RTSP and streaming. I have downloaded the freely available RTSP streaming code from the live555.com site and am trying to understand it. My aim is to develop an RTSP client application using the files and libraries already provided in the code.

Please guide me on how I should proceed and how much time it will take to develop the required application.
Regards,
Mayank

From vinodjoshi at tataelxsi.co.in  Wed Jun  6 03:51:55 2007
From: vinodjoshi at tataelxsi.co.in (Vinod Madhav Joshi)
Date: Wed, 6 Jun 2007 16:21:55 +0530
Subject: [Live-devel] PTS for Program Stream
Message-ID: <000001c7a828$b3bfd400$022a320a@telxsi.com>

Hi all,

We are using the Live555 streaming media server to stream MPEG-2 PS to a set-top box. The server will stream audio and video on separate ports for a program stream. After receiving the streams on the client side, how do we calculate the PTS? Do we have to use the RTP timestamp values for this? Does anybody know about this, or what information I have to look for (any RFC)?

Thank You.

From julian.lamberty at mytum.de  Wed Jun  6 05:23:56 2007
From: julian.lamberty at mytum.de (Julian Lamberty)
Date: Wed, 06 Jun 2007 14:23:56 +0200
Subject: [Live-devel] Problem Streaming MPEG4ES from buffer
Message-ID: <4666A75C.80001@mytum.de>

Hi!

I have ffmpeg code that writes MPEG4 frames into a buffer. I want to stream these frames over RTP/RTSP. At the moment I pass the buffer to an "MPEG4VideoStreamDiscreteFramer" that sends it to an "MPEG4ESVideoRTPSink". But the stream I receive is totally corrupted. Using wireshark I can see that all the packets I send have nearly the same content; only a few bytes change from packet to packet.

When I write the buffer to a file ("fwrite(outbuf, 1, enc_bytes, file)"), I can stream that file perfectly via the test program "testMPEG4VideoStreamer". Taking the same buffer and copying it to fTo results in a broken stream ("memcpy(fTo, outbuf, enc_bytes)").

Do I have to change fPresentationTime and fDurationInMicroseconds? Right now I just set them to presentationTime and durationInMicroseconds... fFrameSize is set to enc_bytes if smaller than fMaxSize, that should be correct, right? Do I have to use "MPEG4VideoStreamFramer" instead, as the test program does?

Please help me hand that buffer over to live555. Thanks!

Julian
-------------- next part --------------
A non-text attachment was scrubbed...
Name: smime.p7s
Type: application/x-pkcs7-signature
Size: 5198 bytes
Desc: S/MIME Cryptographic Signature
Url : http://lists.live555.com/pipermail/live-devel/attachments/20070606/04c84ac1/attachment.bin

From julian.lamberty at mytum.de  Wed Jun  6 05:32:51 2007
From: julian.lamberty at mytum.de (Julian Lamberty)
Date: Wed, 06 Jun 2007 14:32:51 +0200
Subject: [Live-devel] Problem Streaming MPEG4ES from buffer
Message-ID: <4666A973.3050501@mytum.de>

If I dump the stream at the receiver with openRTSP I also get a corrupted stream, so it's not a problem with the player I'm using (VLC with live555 support).
-------------- next part --------------
A non-text attachment was scrubbed...
Name: smime.p7s
Type: application/x-pkcs7-signature
Size: 5198 bytes
Desc: S/MIME Cryptographic Signature
Url : http://lists.live555.com/pipermail/live-devel/attachments/20070606/53ccc9ae/attachment-0001.bin

From severin.schoepke at gmail.com  Wed Jun  6 06:04:02 2007
From: severin.schoepke at gmail.com (Severin Schoepke)
Date: Wed, 06 Jun 2007 15:04:02 +0200
Subject: [Live-devel] Problem Streaming MPEG4ES from buffer
In-Reply-To: <4666A75C.80001@mytum.de>
References: <4666A75C.80001@mytum.de>
Message-ID: <4666B0C2.3030200@gmail.com>

Hi Julian,

I can stream ffmpeg-encoded MPEG4 buffers using an MPEG4VideoStreamFramer perfectly well. I suggest you try that!

cheers,
Severin

Julian Lamberty wrote:
> Do I have to use "MPEG4VideoStreamFramer" instead, as the test program
> does?

From julian.lamberty at mytum.de  Wed Jun  6 06:25:03 2007
From: julian.lamberty at mytum.de (Julian Lamberty)
Date: Wed, 06 Jun 2007 15:25:03 +0200
Subject: [Live-devel] Problem Streaming MPEG4ES from buffer
References: 4666A75C.80001@mytum.de
Message-ID: <4666B5AF.60105@mytum.de>

Hi Severin!

How did you do that? I'm trying exactly the same thing, but when I use MPEG4VideoStreamFramer instead of the discrete one I get errors that say:

StreamParser::afterGettingBytes() warning: read 23329 bytes; expected no more than 10026
MultiFramedRTPSink::afterGettingFrame1(): The input frame data was too large for our buffer size (60752). 90747 bytes of trailing data was dropped! Correct this by increasing "OutPacketBuffer::maxSize" to at least 151499, *before* creating this 'RTPSink'. (Current value is 60000.)
MPEG4VideoStreamParser::parseVideoObjectLayer(): This appears to be a 'short video header', which we current don't support
[mpeg4 @ 0xa23148]buffer smaller than minimum size

I use the following code to fill the buffer:

enc_bytes = avcodec_encode_video(enc_codec_ctx, outbuf, fMaxSize, dec_frame);
if(enc_bytes >= 0)
{
  memcpy(fTo, outbuf, enc_bytes);

  if(enc_bytes > fMaxSize)
  {
    fFrameSize = fMaxSize;
    fNumTruncatedBytes = enc_bytes - fMaxSize;
  }
  else
  {
    fFrameSize = enc_bytes;
    fNumTruncatedBytes = 0;
  }
  fPresentationTime = presentationTime;
  fDurationInMicroseconds = durationInMicroseconds;
}
afterGetting(this);

It would be very nice if you could help me find my mistake...

Thanks!
Julian
-------------- next part --------------
A non-text attachment was scrubbed...
Name: smime.p7s
Type: application/x-pkcs7-signature
Size: 5198 bytes
Desc: S/MIME Cryptographic Signature
Url : http://lists.live555.com/pipermail/live-devel/attachments/20070606/26998584/attachment.bin

From finlayson at live555.com  Wed Jun  6 06:45:30 2007
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 6 Jun 2007 06:45:30 -0700
Subject: [Live-devel] PTS for Program Stream
In-Reply-To: <000001c7a828$b3bfd400$022a320a@telxsi.com>
References: <000001c7a828$b3bfd400$022a320a@telxsi.com>
Message-ID:

> We are using the Live555 streaming media server to stream MPEG-2 PS to
>a set-top box.
> The server will stream audio and video on separate ports for a program
>stream.
> After receiving the streams on the client side, how do we calculate the PTS?

Are you using our software to develop your client? If not, then we can't help you - sorry.

If, however, you are using our software to develop your client, then the presentation timestamp for each piece of data read from a "RTPSource" object is passed - as a parameter - to the 'after getting' function that was previously passed in the call to "getNextFrame" on the "RTPSource" object. (Note the many examples in the code.)
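In sketch form, on the receiving side (the callback, buffer, and closure-handler names here are our own, not fixed by the library):

void onSourceClosure(void* /*clientData*/) { /* tear down the stream */ }

// The library hands each frame's presentation time straight to the
// 'after getting' callback:
void afterGettingFrame(void* clientData, unsigned frameSize,
                       unsigned numTruncatedBytes,
                       struct timeval presentationTime, // <- the PTS
                       unsigned durationInMicroseconds) {
  // 'presentationTime' has already been derived from the RTP timestamps
  // and RTCP sender reports; use it directly.
}

// ... elsewhere, request the next frame from the "RTPSource":
rtpSource->getNextFrame(receiveBuffer, receiveBufferSize,
                        afterGettingFrame, NULL /*clientData*/,
                        onSourceClosure, NULL);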
> Do we have to use the RTP timestamp values for this?

No. Our software automatically calculates the presentation time from the RTP timestamps and RTCP reports. You don't need to calculate this yourself - instead, just use the "presentationTime" parameter.
-- 

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From finlayson at live555.com  Wed Jun  6 06:52:30 2007
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 6 Jun 2007 06:52:30 -0700
Subject: [Live-devel] Problem Streaming MPEG4ES from buffer
In-Reply-To: <4666B5AF.60105@mytum.de>
References: 4666A75C.80001@mytum.de <4666B5AF.60105@mytum.de>
Message-ID:

>I use the following code to fill the buffer:
>
>enc_bytes = avcodec_encode_video(enc_codec_ctx, outbuf, fMaxSize, dec_frame);
>if(enc_bytes >= 0)
>  {
>    memcpy(fTo, outbuf, enc_bytes);
>
>    if(enc_bytes > fMaxSize)

This is very bad. You must never copy more than "fMaxSize" bytes to the destination pointed to by "fTo". Therefore, you must check the data size against "fMaxSize" *before* you copy it. I.e., you should instead do:

if(enc_bytes > fMaxSize)
{
  fFrameSize = fMaxSize;
  fNumTruncatedBytes = enc_bytes - fMaxSize;
}
else
{
  fFrameSize = enc_bytes;
  fNumTruncatedBytes = 0;
}
memcpy(fTo, outbuf, fFrameSize);
-- 

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From julian.lamberty at mytum.de  Wed Jun  6 07:03:20 2007
From: julian.lamberty at mytum.de (Julian Lamberty)
Date: Wed, 06 Jun 2007 16:03:20 +0200
Subject: [Live-devel] Problem Streaming MPEG4ES from buffer
References: 4666B5AF.60105@mytum.de
Message-ID: <4666BEA8.6050304@mytum.de>

Thanks for your reply, I changed my code.

Nevertheless, enc_bytes never exceeded fMaxSize in my program before, and thus the problem still exists. VLC, which I use to play the stream, reports loads of errors regarding late pictures and damaged headers... :( Any more ideas?

Thank you!
Julian
-------------- next part --------------
A non-text attachment was scrubbed...
Name: smime.p7s
Type: application/x-pkcs7-signature
Size: 5198 bytes
Desc: S/MIME Cryptographic Signature
Url : http://lists.live555.com/pipermail/live-devel/attachments/20070606/c13cc764/attachment.bin

From homocean at ibr.cs.tu-bs.de  Wed Jun  6 07:46:14 2007
From: homocean at ibr.cs.tu-bs.de (homocean at ibr.cs.tu-bs.de)
Date: Wed, 6 Jun 2007 16:46:14 +0200 (CEST)
Subject: [Live-devel] SeqNo
In-Reply-To:
References: <1878.134.169.173.74.1180388606.squirrel@mail.ibr.cs.tu-bs.de> <3059.134.169.35.237.1180516648.squirrel@mail.ibr.cs.tu-bs.de>
Message-ID: <4808.134.169.35.237.1181141174.squirrel@mail.ibr.cs.tu-bs.de>

>>Is there any posibility to access this information (the sequence numbers
>>that make up each frame) from mplayer without modifying any live source
>
> No.

My goal is to optimize mplayer (built on the live library) for WLAN streaming, but I am not allowed to modify the live library. If I do not have access to the sequence numbers that make up a frame, how can I tell if a frame got lost? It would be at least helpful to know the seq no of each frame.

I tried interrogating bufferQueue->rtpSource()->curPacketRTPSeqNum() before each getBuffer(), but this does not behave as I would have expected. The seqNo I get does not correspond to the beginning of the frame I read with getBuffer. And this is because live continues receiving RTP packets from the server before mplayer receives its buffered frame (if I understood wrong, please correct me).

Is there another way of finding out if a frame (or parts of it) got lost?
Or is the last resort to implement my own RTP client and link it to mplayer (I might not have enough time for that)?

Regards,
Silviu

> --
>
> Ross Finlayson
> Live Networks, Inc.
> http://www.live555.com/
> _______________________________________________
> live-devel mailing list
> live-devel at lists.live555.com
> http://lists.live555.com/mailman/listinfo/live-devel

From ravinder.singh at ti.com  Wed Jun  6 22:25:06 2007
From: ravinder.singh at ti.com (singh, Ravinder)
Date: Thu, 7 Jun 2007 10:55:06 +0530
Subject: [Live-devel] getting link time errors while building for arm.linux
In-Reply-To:
Message-ID: <7EAD1AEEA7621C45899FE99123E124A0E035A4@dbde01.ent.ti.com>

Hi Ross,

I am getting a lot of undefined reference errors when cross-compiling the code for the ARM architecture; following are some of them. I am using a uclibc toolchain.

make[1]: Entering directory `/home/ravinder/rtsp/live_arm/testProgs'
arm-linux-gcc -otestMP3Streamer -L.
testMP3Streamer.o ../liveMedia/libliveMedia.a ../groupsock/libgroupsock.a
../UsageEnvironment/libUsageEnvironment.a
../BasicUsageEnvironment/libBasicUsageEnvironment.a
testMP3Streamer.o: In function `main':
testMP3Streamer.cpp:(.text+0x140): undefined reference to `operator new(unsigned int)'
testMP3Streamer.cpp:(.text+0x18c): undefined reference to `operator new(unsigned int)'
testMP3Streamer.cpp:(.text+0x280): undefined reference to `operator delete(void*)'
testMP3Streamer.cpp:(.text+0x294): undefined reference to `__gxx_personality_sj0'
../liveMedia/libliveMedia.a: In function `Medium::~Medium()':
Locale.cpp:(.text+0xbc): undefined reference to `operator delete(void*)'
../liveMedia/libliveMedia.a: In function `_Tables::getOurTables(UsageEnvironment&)':
Locale.cpp:(.text+0x160): undefined reference to `operator new(unsigned int)'
../liveMedia/libliveMedia.a: In function `_Tables::~_Tables()':
Locale.cpp:(.text+0x2a0): undefined reference to `operator delete(void*)'
../liveMedia/libliveMedia.a: In function `MediaLookupTable::ourMedia(UsageEnvironment&)':
Locale.cpp:(.text+0x35c): undefined reference to `operator new(unsigned int)'
Locale.cpp:(.text+0x3b0): undefined reference to `operator delete(void*)'
Locale.cpp:(.text+0x3c4): undefined reference to `__gxx_personality_sj0'
../liveMedia/libliveMedia.a: In function `MediaLookupTable::~MediaLookupTable()':
Locale.cpp:(.text+0x5e0): undefined reference to `operator delete(void*)'
../liveMedia/libliveMedia.a: In function `MediaSource::~MediaSource()':
Locale.cpp:(.text+0x67c): undefined reference to `operator delete(void*)'
../liveMedia/libliveMedia.a: In function `FramedSource::~FramedSource()':
Locale.cpp:(.text+0x838): undefined reference to `operator delete(void*)'
../liveMedia/libliveMedia.a: In function `FramedFileSource::~FramedFileSource()':
Locale.cpp:(.text+0xaa4): undefined reference to `operator delete(void*)'
../liveMedia/libliveMedia.a: In function `FramedFilter::~FramedFilter()':
Locale.cpp:(.text+0xba0): undefined reference to `__gxx_personality_sj0'
../liveMedia/libliveMedia.a: In function `FramedFilter::~FramedFilter()':
Locale.cpp:(.text+0xc5c): undefined reference to `__gxx_personality_sj0'
../liveMedia/libliveMedia.a: In function `FramedFilter::~FramedFilter()':
Locale.cpp:(.text+0xce0): undefined reference to `operator delete(void*)'
Locale.cpp:(.text+0xd20): undefined reference to `__gxx_personality_sj0'

Regards
Ravinder
From finlayson at live555.com  Wed Jun  6 23:13:41 2007
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 6 Jun 2007 23:13:41 -0700
Subject: [Live-devel] getting link time errors while building for arm.linux
In-Reply-To: <7EAD1AEEA7621C45899FE99123E124A0E035A4@dbde01.ent.ti.com>
References: <7EAD1AEEA7621C45899FE99123E124A0E035A4@dbde01.ent.ti.com>
Message-ID:

>I am getting a lot of undefined reference errors when cross-compiling
>the code for the ARM architecture; following are some of them. I am
>using a uclibc toolchain.
>
>make[1]: Entering directory `/home/ravinder/rtsp/live_arm/testProgs'
>arm-linux-gcc -otestMP3Streamer -L. testMP3Streamer.o
>../liveMedia/libliveMedia.a ../groupsock/libgroupsock.a
>../UsageEnvironment/libUsageEnvironment.a
>../BasicUsageEnvironment/libBasicUsageEnvironment.a
>testMP3Streamer.o: In function `main':
>testMP3Streamer.cpp:(.text+0x140): undefined reference to `operator
>new(unsigned int)'

The problem is that you're not linking with the C++ runtime libraries. Try changing the definition of LINK in "config.armlinux" from
	$(CROSS_COMPILE)gcc -o
to
	$(CROSS_COMPILE)g++ -o
Perhaps that might help.
-- 

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
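Concretely, assuming the stock layout of that file, the one line that changes in "config.armlinux" is the LINK definition, which afterwards would read:

	LINK = $(CROSS_COMPILE)g++ -o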
From may_ank77 at yahoo.com  Thu Jun  7 02:08:18 2007
From: may_ank77 at yahoo.com (mayank agarwal)
Date: Thu, 7 Jun 2007 02:08:18 -0700 (PDT)
Subject: [Live-devel] Delay between audio and video
Message-ID: <647756.61625.qm@web90414.mail.mud.yahoo.com>

Hi all,

I am running the downloaded LiveMedia code for streaming a multiplexed audio/video stream using RTSP on the client side. I was going through the code but was not able to find the code or the file in which audio/video synchronization is done.

Please help me in this regard.

Regards,
Mayank

From igor at mir2.org  Thu Jun  7 03:37:21 2007
From: igor at mir2.org (Igor Bukanov)
Date: Thu, 7 Jun 2007 12:37:21 +0200
Subject: [Live-devel] OnDemandServerMediaSubsession::deleteStream problem
Message-ID: <7dee4710706070337k14029eebrc78f3dd3f79fd577@mail.gmail.com>

The current code for OnDemandServerMediaSubsession::deleteStream,
http://live555.com/liveMedia/doxygen/html/classOnDemandServerMediaSubsession.html#55a633f69121ee4f71874003d422f41c
calls streamState->endPlaying even when destinations is null. This immediately segfaults in endPlaying on a null dereference. I observe this pretty reliably when OnDemandServerMediaSubsession is used in a video server implementation where the same source is unicast to several VLC instances. When all the VLC processes crash while decoding a particular MPEG4 frame at almost the same time, this eventually leads to deleteStream being called with an already-removed clientSessionId. The attached patch fixes that by skipping the call to streamState->endPlaying for an unknown clientSessionId.

Regards, Igor

Index: OnDemandServerMediaSubsession.cpp
===================================================================
RCS file: /var/cvsroot/repos1/videoserver/external/live/liveMedia/OnDemandServerMediaSubsession.cpp,v
retrieving revision 1.1
diff -U7 -p -r1.1 OnDemandServerMediaSubsession.cpp
--- OnDemandServerMediaSubsession.cpp	29 May 2007 11:35:10 -0000	1.1
+++ OnDemandServerMediaSubsession.cpp	7 Jun 2007 10:17:31 -0000
@@ -295,24 +295,25 @@ void OnDemandServerMediaSubsession::setS
   if (streamState != NULL && streamState->mediaSource() != NULL) {
     setStreamSourceScale(streamState->mediaSource(), scale);
   }
 }
 
 void OnDemandServerMediaSubsession::deleteStream(unsigned clientSessionId,
 						 void*& streamToken) {
+  StreamState* streamState = (StreamState*)streamToken;
+
   // Look up (and remove) the destinations for this client session:
   Destinations* destinations
     = (Destinations*)(fDestinationsHashTable->Lookup((char const*)clientSessionId));
   if (destinations != NULL) {
     fDestinationsHashTable->Remove((char const*)clientSessionId);
-  }
 
-  // Stop streaming to these destinations:
-  StreamState* streamState = (StreamState*)streamToken;
-  if (streamState != NULL) streamState->endPlaying(destinations);
+    // Stop streaming to these destinations:
+    if (streamState != NULL) streamState->endPlaying(destinations);
+  }
 
   // Delete the "StreamState" structure if it's no longer being used:
   if (streamState != NULL && streamState->referenceCount() >= 0) {
     if (streamState->referenceCount() > 0) --streamState->referenceCount();
     if (streamState->referenceCount() == 0) {
       delete streamState;
       if (fLastStreamToken == streamToken) fLastStreamToken = NULL;
-------------- next part --------------
A non-text attachment was scrubbed...
Name: OnDemandServerMediaSubsession.patch
Type: application/octet-stream
Size: 1740 bytes
Desc: not available
Url : http://lists.live555.com/pipermail/live-devel/attachments/20070607/1e8fb169/attachment.obj

From may_ank77 at yahoo.com  Thu Jun  7 06:34:14 2007
From: may_ank77 at yahoo.com (mayank agarwal)
Date: Thu, 7 Jun 2007 06:34:14 -0700 (PDT)
Subject: [Live-devel] Understanding RTSP reference code from live555.com
Message-ID: <153149.84720.qm@web90404.mail.mud.yahoo.com>

Hi all,

I have downloaded the freely available reference code from live555.com, compiled and run it, and tried to stream the test.mpg file as input in client/server mode. Actually, I am getting a delay between audio and video when streaming. Now I am trying to understand the code, and I have the following doubt: in the MPEG1or2ProgramStreamFileDuration function, the file duration is calculated in a do-while loop. I want to go into the detail of the steps involved in calculating the file duration.

Thanks and Regards,
Mayank

From finlayson at live555.com  Thu Jun  7 06:37:07 2007
From: finlayson at live555.com (Ross Finlayson)
Date: Thu, 7 Jun 2007 06:37:07 -0700
Subject: [Live-devel] OnDemandServerMediaSubsession::deleteStream problem
In-Reply-To: <7dee4710706070337k14029eebrc78f3dd3f79fd577@mail.gmail.com>
References: <7dee4710706070337k14029eebrc78f3dd3f79fd577@mail.gmail.com>
Message-ID:

Thanks for the bug report. This fix will be included in the next release of the software.
-- 

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From xcsmith at rockwellcollins.com  Thu Jun  7 14:30:52 2007
From: xcsmith at rockwellcollins.com (xcsmith at rockwellcollins.com)
Date: Thu, 7 Jun 2007 16:30:52 -0500
Subject: [Live-devel] LIVE555 code standard: comments
Message-ID:

When commenting LIVE555 code, is it preferred to have the comment block appear after the code block or function definition it pertains to? I can't tell which is the preferred location. Are larger comment blocks using // preferred over /* */? What about Doxygen comments? (/** @class **/?)

Thanks!
Xochitl
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://lists.live555.com/pipermail/live-devel/attachments/20070607/3154a8df/attachment.html

From finlayson at live555.com  Thu Jun  7 16:48:29 2007
From: finlayson at live555.com (Ross Finlayson)
Date: Thu, 7 Jun 2007 16:48:29 -0700
Subject: [Live-devel] LIVE555 code standard: comments
In-Reply-To:
References:
Message-ID:

>When commenting LIVE555 code, is it preferred to have the comment
>block appear after the code block or function definition it pertains to?

It depends. If the comment is of the form
	// We're about to do X
then I usually put it before. If the comment is of the form
	// We just did X, because Y
then I usually put it after.

>Are larger comment blocks using // preferred over /* */?

Yes, because // blocks are easier to edit.

>What about Doxygen comments? (/** @class **/?)

I don't use those. From what I can tell, Doxygen does a good job already.
-- 

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://lists.live555.com/pipermail/live-devel/attachments/20070607/8b2ecd68/attachment.html
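A throwaway illustration of those two placements, with made-up function names:

// We're about to do X:
doX();

cleanUpAfterX(); // We just did X, because Y required it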
From sony at mht.bme.hu Fri Jun 8 00:24:02 2007
From: sony at mht.bme.hu (Tran Minh Son)
Date: Fri, 08 Jun 2007 09:24:02 +0200
Subject: [Live-devel] Ports for RTSP
Message-ID: <46690412.3060302@mht.bme.hu>

Dear all,

I would like to run an RTSP server on a Linux machine with a firewall. How can I open the port for the Live555 media server? As I understand it, it uses a fixed port (554 or 8554) to receive requests from clients. After that, the Live server will open an arbitrary port to send out the media via RTP over TCP or UDP; it informs the client of this port so that the client knows where to connect. From the firewall's point of view, how can I open this on-the-fly port?

Thank you

From finlayson at live555.com Fri Jun 8 01:17:58 2007
From: finlayson at live555.com (Ross Finlayson)
Date: Fri, 8 Jun 2007 01:17:58 -0700
Subject: [Live-devel] Ports for RTSP
In-Reply-To: <46690412.3060302@mht.bme.hu>
References: <46690412.3060302@mht.bme.hu>
Message-ID: 

> Dear all,
> I would like to run an RTSP server on a Linux machine with a firewall.
> How can I open the port for the Live555 media server? As I understand
> it, it uses a fixed port (554 or 8554) to receive requests from
> clients. After that, the Live server will open an arbitrary port to
> send out the media via RTP over TCP or UDP

That's true only for UDP streaming.  If you stream over TCP, then no UDP ports are used.  In that case, you need only open the RTSP TCP port (554 or 8554).

If you're behind a firewall, and cannot open UDP ports, then you can still serve streams via TCP, provided that you open the RTSP TCP port.
--

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From sony at mht.bme.hu Fri Jun 8 01:59:18 2007
From: sony at mht.bme.hu (Tran Minh Son)
Date: Fri, 08 Jun 2007 10:59:18 +0200
Subject: [Live-devel] Ports for RTSP
In-Reply-To: 
References: <46690412.3060302@mht.bme.hu>
Message-ID: <46691A66.2090402@mht.bme.hu>

Hi Ross Finlayson,

Thank you for your answer. Can you show me how I can "order" the media over TCP, not UDP, using openRTSP? (Sorry that I did not check the syntax of openRTSP; you are here, and that is why I am lazy :-)) I also wonder how many clients the media server can serve simultaneously through TCP using only one port.

Regards

Ross Finlayson wrote:
>> Dear all,
>> I would like to run an RTSP server on a Linux machine with a firewall.
>> How can I open the port for the Live555 media server? [...]
>
> That's true only for UDP streaming.  If you stream over TCP, then no
> UDP ports are used.  In that case, you need only open the RTSP TCP
> port (554 or 8554).
>
> If you're behind a firewall, and cannot open UDP ports, then you can
> still serve streams via TCP, provided that you open the RTSP TCP port.
From finlayson at live555.com Fri Jun 8 02:06:58 2007
From: finlayson at live555.com (Ross Finlayson)
Date: Fri, 8 Jun 2007 02:06:58 -0700
Subject: [Live-devel] Ports for RTSP
In-Reply-To: <46691A66.2090402@mht.bme.hu>
References: <46690412.3060302@mht.bme.hu> <46691A66.2090402@mht.bme.hu>
Message-ID: 

> Thank you for your answer. Can you show me how I can "order" the media
> over TCP, not UDP, using openRTSP? (Sorry that I did not check the
> syntax of openRTSP; you are here, and that is why I am lazy :-))

In other words, you'd rather bother the hundreds of other people on this mailing list than take the time to read the online documentation for "openRTSP"?

If you read the documentation, you'll get the answer that you're looking for.
--

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
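(For readers of the archive: the documentation in question is the openRTSP page at http://www.live555.com/openRTSP/. The option being hinted at is "-t", which asks the server to send the media over the RTSP TCP connection itself, e.g.:

  openRTSP -t rtsp://yourserver:8554/yourStream

With "-t", the RTP and RTCP packets travel inside the single RTSP TCP connection, so only the RTSP port needs to be open in the firewall. The stream name and port here are placeholders.)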
From may_ank77 at yahoo.com Fri Jun 8 03:27:21 2007
From: may_ank77 at yahoo.com (mayank agarwal)
Date: Fri, 8 Jun 2007 03:27:21 -0700 (PDT)
Subject: [Live-devel] Delay between capture and display
Message-ID: <274257.32861.qm@web90415.mail.mud.yahoo.com>

Hi all,

I am capturing live video from a webcam, streaming it using the live555 code to an RTSP server, and then displaying the output in MPlayer. But I am getting a delay of about 4-5 seconds between the capture and what is being displayed, and I am trying to find out the reason for this delay. Any pointers or suggestions as to why this delay is happening?

Regards,
Mayank

From julian.lamberty at mytum.de Fri Jun 8 04:09:47 2007
From: julian.lamberty at mytum.de (Julian Lamberty)
Date: Fri, 08 Jun 2007 13:09:47 +0200
Subject: [Live-devel] doGetNextFrame()
Message-ID: <466938FB.7030605@mytum.de>

Hi!

I've added some stdout messages to MPEG4VideoStreamDiscreteFramer, and I can see that there are many calls to doGetNextFrame() and afterGettingFrame1() even if my source did not deliver one frame:

MPEG4VideoStreamDiscreteFramer::doGetNextFrame() passed
MPEG4VideoStreamDiscreteFramer::afterGettingFrame1() frame size: 0
MPEG4VideoStreamDiscreteFramer::doGetNextFrame() passed
MPEG4VideoStreamDiscreteFramer::afterGettingFrame1() frame size: 0
MPEG4VideoStreamDiscreteFramer::doGetNextFrame() passed
MPEG4VideoStreamDiscreteFramer::afterGettingFrame1() frame size: 0
...(repeats ~50 times)
Decoded frame 1 [I, 138484 Bytes] in 11.76 ms
Encoded frame 1 [I, 23329 Bytes] in 10.41 ms
MPEG4VideoStreamDiscreteFramer::afterGettingFrame1() assuming complete picture
MPEG4VideoStreamDiscreteFramer::afterGettingFrame1() VISUAL_OBJECT_SEQUENCE_START_CODE
MPEG4VideoStreamDiscreteFramer::afterGettingFrame1() frame size: 23329
MultiFramedRTPSink::sendPacketIfNecessary() Packet sent
MultiFramedRTPSink::sendPacketIfNecessary() Packet sent
MultiFramedRTPSink::sendPacketIfNecessary() Packet sent
...(more packets sent)
MPEG4VideoStreamDiscreteFramer::doGetNextFrame() passed
MPEG4VideoStreamDiscreteFramer::afterGettingFrame1() assuming complete picture
MPEG4VideoStreamDiscreteFramer::afterGettingFrame1() VISUAL_OBJECT_SEQUENCE_START_CODE
MPEG4VideoStreamDiscreteFramer::afterGettingFrame1() frame size: 23329
MultiFramedRTPSink::sendPacketIfNecessary() Packet sent
MultiFramedRTPSink::sendPacketIfNecessary() Packet sent
MultiFramedRTPSink::sendPacketIfNecessary() Packet sent
...(more packets sent)

How do I tell the framer to wait for one frame before asking for the next?

Thanks
Julian

From may_ank77 at yahoo.com Fri Jun 8 06:04:29 2007
From: may_ank77 at yahoo.com (mayank agarwal)
Date: Fri, 8 Jun 2007 06:04:29 -0700 (PDT)
Subject: [Live-devel] Live555 code
Message-ID: <626150.97921.qm@web90407.mail.mud.yahoo.com>

Hi all,

I want to run the downloaded live555 code as an RTSP client. Please guide me on how I can do this.

Regards,
Mayank

From julian.lamberty at mytum.de Fri Jun 8 06:24:56 2007
From: julian.lamberty at mytum.de (Julian Lamberty)
Date: Fri, 08 Jun 2007 15:24:56 +0200
Subject: [Live-devel] afterGetting(this)
Message-ID: <466958A8.9010601@mytum.de>

Hi!

I would like to know how I can
1. Request more data from a source (MPEG1or2VideoRTPSource)
2. Tell a sink (MPEG4VideoStreamDiscreteFramer) that data is completely delivered to the buffer (this should be afterGetting(this), right?)

Right now I have a code structure like:

void Transcoder::doGetNextFrame()
{
  fInputSource->getNextFrame(...);
}

void Transcoder::afterGettingFrame(...)
{
  Transcoder* transcoder = (Transcoder*)clientData;
  transcoder->afterGettingFrame1(...);
}

void Transcoder::afterGettingFrame1(...)
{
  decode_from_inbuf_to_frame
  if (successfully decoded)
  {
    encode_frame_to_fTo
    if (successfully encoded)
    {
      deliver (<- afterGetting(this)?)
    }
  }
  else
  {
    request more data (<- I don't know how to do this; if I simply call
    doGetNextFrame() once again, it complains about being read more than once)
  }
}

Thanks for your help!
Julian

From finlayson at live555.com Fri Jun 8 07:03:08 2007
From: finlayson at live555.com (Ross Finlayson)
Date: Fri, 8 Jun 2007 07:03:08 -0700
Subject: [Live-devel] doGetNextFrame()
In-Reply-To: <466938FB.7030605@mytum.de>
References: <466938FB.7030605@mytum.de>
Message-ID: 

> I've added some stdout messages to MPEG4VideoStreamDiscreteFramer,
> and I can see that there are many calls to doGetNextFrame() and
> afterGettingFrame1() even if my source did not deliver one frame:
>
> MPEG4VideoStreamDiscreteFramer::doGetNextFrame() passed
> MPEG4VideoStreamDiscreteFramer::afterGettingFrame1() frame size: 0

That's your problem: Your upstream source (your "Transcoder" object) is delivering a 0-length frame.  Are you sure that your "Transcoder" code is always correctly setting "fFrameSize" before calling "afterGetting(this)"?
--

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
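(The contract Ross is referring to is the one documented by the "DeviceSource.cpp" model source in liveMedia: before calling FramedSource::afterGetting(this), a source must have copied its data into fTo and set the inherited delivery fields. A schematic fragment; "MyLiveSource", "encodedData" and "newFrameSize" are invented placeholder names:)

  void MyLiveSource::deliverFrame() {
    // 'encodedData'/'newFrameSize' stand for whatever the encoder just produced.
    if (newFrameSize > fMaxSize) {           // fTo is only fMaxSize bytes long
      fFrameSize = fMaxSize;
      fNumTruncatedBytes = newFrameSize - fMaxSize;
    } else {
      fFrameSize = newFrameSize;             // must be set, and should be > 0
      fNumTruncatedBytes = 0;
    }
    memmove(fTo, encodedData, fFrameSize);   // copy into the caller's buffer
    gettimeofday(&fPresentationTime, NULL);  // or a proper media timestamp
    FramedSource::afterGetting(this);        // hand the frame downstream
  }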
From julian.lamberty at mytum.de Fri Jun 8 07:22:28 2007
From: julian.lamberty at mytum.de (Julian Lamberty)
Date: Fri, 08 Jun 2007 16:22:28 +0200
Subject: [Live-devel] doGetNextFrame()
References: <466938FB.7030605@mytum.de>
Message-ID: <46696624.3030402@mytum.de>

> Are you sure that your "Transcoder" code is always correctly setting
> "fFrameSize" before calling "afterGetting(this)"?

OK, I think I found the problem, but now I need support: I was calling afterGetting(this) even when no encoded frame was available. I changed that now, but there is another problem: how can I get more data from the source without a call to afterGetting(this) beforehand (I get "attempting to read more than once" messages)? I can't call afterGetting before an encoded frame is ready, but there can't be an encoded frame if the decoder doesn't have enough data (you remember that chicken-or-the-egg problem ;)).

What I need to know is how I can read multiple packets from the source before delivering one big packet to the client. Which functions do I have to call?

Julian

From RossellaFortuna at libero.it Fri Jun 8 11:25:51 2007
From: RossellaFortuna at libero.it (rossellafortuna)
Date: Fri, 8 Jun 2007 20:25:51 +0200
Subject: [Live-devel] (no subject)
Message-ID: 

Hi Ross,

I've just read this mail [http://lists.live555.com/pipermail/live-devel/2007-February/006237.html] and I was surprised to discover that my thesis work is very similar to the project you were working on. I'm trying to create an H.264/AVC SVC streaming system on RTP/UDP, and I already thank you for (indirectly) suggesting that I use the live555 libraries with the VLC server. However, I have another little complication: I have to implement a kind of bandwidth rate control, sending the base layer at a given equation-based rate Rb, which changes as a function of the packets in the receiver buffer and the number of in-flight packets. In order to set the enhancement-layer bit-rate (for now I suppose I have only 1 FGS enhancement layer), I should implement TFRC, because the rate of the enhancement layer is Rtfrc - Rb.

Now:
1) What kind of protocols should I use for the TFRC bandwidth calculation? (A kind of RTP/RTCP? DCCP (but it's unicast-only)? AVPF/RTP?)
2) Does live555 allow me to extract all the parameters I need for rate control? Does it support TFRC?
3) In conjunction with TFRC, what kind of RTP packetization (SSRC multiplexing or multiple RTP streams) is better to use for the different layers?

It's my first mail, please treat me well ;)... and excuse my English!

I already thank you for any kind of help.

Regards
Rossella

From finlayson at live555.com Fri Jun 8 11:47:09 2007
From: finlayson at live555.com (Ross Finlayson)
Date: Fri, 8 Jun 2007 11:47:09 -0700
Subject: [Live-devel] doGetNextFrame()
In-Reply-To: <46696624.3030402@mytum.de>
References: <466938FB.7030605@mytum.de> <46696624.3030402@mytum.de>
Message-ID: 

> How can I get more data from the source without a call to
> afterGetting(this) beforehand (I get "attempting to read more than
> once" messages)? I can't call afterGetting before an encoded frame is
> ready, but there can't be an encoded frame if the decoder doesn't
> have enough data (you remember that chicken-or-the-egg problem ;)).
>
> What I need to know is how I can read multiple packets from the
> source before delivering one big packet to the client. Which
> functions do I have to call?

To get data from a source object, always call "thatSource->getNextFrame()".

Note that the error about "attempting to read more than once" occurs if you call "getNextFrame()" on the *same* object a second time, before the 'after getting' function from the first call has been invoked.
I.e., it's OK to do:

  source->getNextFrame(..., afterGettingFunc, ...);

  void afterGettingFunc(...) {
    source->getNextFrame(...);
  }

but you *cannot* do:

  source->getNextFrame(..., afterGettingFunc, ...);
  source->getNextFrame(...);
--

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
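(Putting this rule into the transcoder shape discussed above - assuming, as in Julian's code, that Transcoder is a FramedFilter subclass with an input buffer "inbuf"; feedDecoder(), encoderHasCompleteFrame() and copyEncodedFrame() are invented placeholders, not library functions:)

  void Transcoder::doGetNextFrame() {
    fInputSource->getNextFrame(inbuf, INBUF_SIZE, afterGettingFrame, this,
                               FramedSource::handleClosure, this);
  }

  void Transcoder::afterGettingFrame(void* clientData, unsigned frameSize,
                                     unsigned numTruncatedBytes,
                                     struct timeval presentationTime,
                                     unsigned durationInMicroseconds) {
    Transcoder* self = (Transcoder*)clientData;
    self->feedDecoder(self->inbuf, frameSize);  // hypothetical decode step

    if (self->encoderHasCompleteFrame()) {      // hypothetical test
      // Deliver exactly once per request, with fFrameSize set (and nonzero):
      self->fFrameSize = self->copyEncodedFrame(self->fTo, self->fMaxSize);
      self->fPresentationTime = presentationTime;
      FramedSource::afterGetting(self);
    } else {
      // The previous read has completed (we are inside its 'after getting'
      // callback), so asking the same source for more data is legal here:
      self->fInputSource->getNextFrame(self->inbuf, INBUF_SIZE,
                                       afterGettingFrame, self,
                                       FramedSource::handleClosure, self);
    }
  }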
From xcsmith at rockwellcollins.com Fri Jun 8 17:40:17 2007
From: xcsmith at rockwellcollins.com (xcsmith at rockwellcollins.com)
Date: Fri, 8 Jun 2007 19:40:17 -0500
Subject: [Live-devel] Possible Patches, any interest?
Message-ID: 

Ross and Community:

I am trying to clean up my code in preparation for release. Would there be any interest in having the following items granted back to the library? Thanks for your consideration!

Xochitl

Class NPTClock:
A class used by an RTSPClient child class which receives the RTP time, sequence number, NPT start, and scale from the RTSP PLAY response and uses this information to provide the current NPT. The clock would be updated on each successful PLAY request and could be queried at any time in the PLAYING state to report an estimated NPT based on the RTSP info and the system time. Session up-time is also available.

Class RTSPClientEnhanced (couldn't think of a very good name):
- inherited publicly from RTSPClient (RTSPClient was modified)
- uses the NPTClock class to be able to report an estimated NPT at any time during the session
- allows the user to select whether or not to perform TCP teardown after each RTSP message (I know the LIVE server doesn't support these clients); defaults to connection-oriented operation
- provides an accessor function which returns the RTSP server response code to the application, for potentially more refined error handling.

Patch to RTSPClient:
- supports inherited RTSP clients
- adds an optional timeout for waiting on RTSP server responses; defaults to infinite blocking.
- when setting the transport parameters, the multicast parameter is delivered with "port" instead of "client_port" when built with RTSP_ALLOW_CLIENT_DESTINATION_SETTING. Both multicast and unicast parameters can be requested.
- SET_PARAMETER message creation altered to match the option-tags specified in RFC 2326

From finlayson at live555.com Fri Jun 8 17:50:44 2007
From: finlayson at live555.com (Ross Finlayson)
Date: Fri, 8 Jun 2007 17:50:44 -0700
Subject: [Live-devel] Possible Patches, any interest?
In-Reply-To: 
References: 
Message-ID: 

> I am trying to clean up my code in preparation for release. Would
> there be any interest in having the following items granted back to
> the library?

I'm not so interested (at least not right now) in your subclasses of "RTSPClient".  However, I'd be interested in looking at your changes to "RTSPClient" itself; I might make some of those changes to the installed version.
--

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From julian.lamberty at mytum.de Sat Jun 9 02:33:49 2007
From: julian.lamberty at mytum.de (Julian Lamberty)
Date: Sat, 09 Jun 2007 11:33:49 +0200
Subject: [Live-devel] doGetNextFrame()
In-Reply-To: 
References: <466938FB.7030605@mytum.de> <46696624.3030402@mytum.de>
Message-ID: <466A73FD.30503@mytum.de>

> Note that the error about "attempting to read more than once" occurs
> if you call "getNextFrame()" on the *same* object a second time,
> before the 'after getting' function from the first call has been
> invoked.

Hmm, I can't get it working. The structure is:

void Transcoder::doGetNextFrame()
{
  fInputSource->getNextFrame(inbuf, INBUF_SIZE, afterGettingFrame, this, handleClosure, this);
}

void Transcoder::afterGettingFrame(void* clientData, unsigned numBytesRead, unsigned numTruncatedBytes, struct timeval presentationTime, unsigned durationInMicroseconds)
{
  Transcoder* transcoder = (Transcoder*)clientData;
  transcoder->afterGettingFrame1(numBytesRead, numTruncatedBytes, presentationTime, durationInMicroseconds);
}

void Transcoder::afterGettingFrame1(unsigned numBytesRead, unsigned numTruncatedBytes, struct timeval presentationTime, unsigned durationInMicroseconds)
{
  ...
  if (complete)
  {
    afterGetting(this);
  }
  else
  {
    fInputSource->getNextFrame(inbuf, INBUF_SIZE, afterGettingFrame, this, handleClosure, this);
  }
}

This results in the "attempting to read more than once" error. I need the afterGettingFrame1 function to get new data, like in doGetNextFrame(), until complete != 0. How can I do that, i.e. how does the call to getNextFrame() have to look?

Thanks for your support!
Julian

From zhj.zhao at gmail.com Mon Jun 11 05:49:11 2007
From: zhj.zhao at gmail.com (jerry zhao)
Date: Mon, 11 Jun 2007 14:49:11 +0200
Subject: [Live-devel] question about H263plusVideo* classes
Message-ID: 

Hello, all

I have a question about the H263plusVideo* classes. I have read their source code, and I found that the input file is opened by the class ByteStreamFileSource. What I want to do is: use the RTCP packets to estimate the bandwidth, and then change the frame rate to adapt to the bandwidth. When the bandwidth is limited, I will lower the frame rate and discard some frames.

My problem is that I don't know how to implement the frame discarding. My idea is to pass the changed frame rate (from the class H263plusVideoStreamFramer) to the class ByteStreamFileSource, and then use fseek to set the file position so that I can discard some frames. I don't know whether this idea is right, and I don't know how to pass the frame-rate value to the ByteStreamFileSource class. Could anyone give me any advice? Any ideas would be greatly appreciated.

Thanks.
Jerry
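(One way to experiment with Jerry's frame-discarding idea without touching ByteStreamFileSource: a small FramedFilter - the class name below is invented, not part of the library - that sits downstream of a framer and forwards only every Nth frame. Note that this only makes sense after a framer that delivers exactly one frame per read; reads from ByteStreamFileSource itself are arbitrary byte chunks, not frames.)

  #include "FramedFilter.hh"

  class FrameDropper: public FramedFilter {
  public:
    static FrameDropper* createNew(UsageEnvironment& env,
                                   FramedSource* inputSource,
                                   unsigned keepEveryNth) {
      return new FrameDropper(env, inputSource, keepEveryNth);
    }
  protected:
    FrameDropper(UsageEnvironment& env, FramedSource* inputSource,
                 unsigned keepEveryNth)
      : FramedFilter(env, inputSource), fKeepEveryNth(keepEveryNth), fCount(0) {}
  private:
    virtual void doGetNextFrame() {
      // Read from upstream, directly into our reader's buffer:
      fInputSource->getNextFrame(fTo, fMaxSize, afterGettingFrame, this,
                                 FramedSource::handleClosure, this);
    }
    static void afterGettingFrame(void* clientData, unsigned frameSize,
                                  unsigned numTruncatedBytes,
                                  struct timeval presentationTime,
                                  unsigned durationInMicroseconds) {
      FrameDropper* self = (FrameDropper*)clientData;
      if (++self->fCount % self->fKeepEveryNth != 0) {
        // Drop this frame.  The previous read has completed, so it is
        // legal to ask the upstream source for the next one right away:
        self->fInputSource->getNextFrame(self->fTo, self->fMaxSize,
                                         afterGettingFrame, self,
                                         FramedSource::handleClosure, self);
      } else {
        // Keep it: the data is already in fTo; just forward the bookkeeping.
        self->fFrameSize = frameSize;
        self->fNumTruncatedBytes = numTruncatedBytes;
        self->fPresentationTime = presentationTime;
        self->fDurationInMicroseconds = durationInMicroseconds;
        FramedSource::afterGetting(self);
      }
    }
    unsigned fKeepEveryNth, fCount;
  };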
From xcsmith at rockwellcollins.com Mon Jun 11 09:47:28 2007
From: xcsmith at rockwellcollins.com (xcsmith at rockwellcollins.com)
Date: Mon, 11 Jun 2007 11:47:28 -0500
Subject: [Live-devel] LIVE555 code standard: comments
In-Reply-To: 
Message-ID: 

I notice that in class headers like RTSPClient, the function description appears after the function. Is it preferred to put the function explanation after or before the function?

From finlayson at live555.com Mon Jun 11 14:25:07 2007
From: finlayson at live555.com (Ross Finlayson)
Date: Mon, 11 Jun 2007 14:25:07 -0700
Subject: [Live-devel] LIVE555 code standard: comments
In-Reply-To: 
References: 
Message-ID: 

> I notice that in class headers like RTSPClient, the function
> description appears after the function. Is it preferred to put the
> function explanation after or before the function?

I usually put the function explanation after the function declaration - unless the explanation refers to more than one function, in which case I'll put it before them.
--

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From gulshanchawla_2000 at yahoo.com Tue Jun 12 02:35:39 2007
From: gulshanchawla_2000 at yahoo.com (gulshan chawla)
Date: Tue, 12 Jun 2007 02:35:39 -0700 (PDT)
Subject: [Live-devel] Unable to build with VC 8.0 !!!
Message-ID: <413964.28417.qm@web35404.mail.mud.yahoo.com>

Hi,

I have downloaded the source code for the latest version of LIVE555 Streaming Media, but I am unable to build it using the instructions given for building on Windows with Visual Studio 8.0. Please let me know if anybody has built it successfully using Visual Studio 8.0, or knows how to build it.

Thanks and Regards
gc

From bleary at harris.com Tue Jun 12 07:21:33 2007
From: bleary at harris.com (Leary, Brent)
Date: Tue, 12 Jun 2007 10:21:33 -0400
Subject: [Live-devel] Multicast Q
Message-ID: 

Ross,

I am using 'testOnDemandRTSPServer' to support unicast streaming of MPEG transport streams to a PC running VLC. This is working with no problems. (When I started, the 'LIVE555 Media Server' was unavailable.) Now I am interested in adding multicast support to the RTSP server I'm running. From reading your responses to prior posts on this list, neither 'testOnDemandRTSPServer' nor the 'LIVE555 Media Server' supports multicast playback (I'm assuming this is still the case). In order to support multicast, 'testMPEG1or2VideoStreamer' would have to be integrated/started to allow the running RTSP server to perform multicast playback. Is this correct, or has multicast support been incorporated into an RTSP server?

Does the existing RTSP server & streamer software support RFC 2326's multicast example (below)? If it does, would it be able to handle having the client set the multicast IP destination & port in the SETUP command?
Is there a client already available that supports having the client set up the multicast destination & port, instead of having them determined by the server?

Thanks,
Brent

14.4 Live Media Presentation Using Multicast

   The media server M chooses the multicast address and port. Here, we
   assume that the web server only contains a pointer to the full
   description, while the media server M maintains the full description.

   ......

   C->M: DESCRIBE rtsp://live.example.com/concert/audio RTSP/1.0
         CSeq: 1

   M->C: RTSP/1.0 200 OK
         CSeq: 1
         Content-Type: application/sdp
         Content-Length: 44

         v=0
         o=- 2890844526 2890842807 IN IP4 192.16.24.202
         s=RTSP Session
         m=audio 3456 RTP/AVP 0
         a=control:rtsp://live.example.com/concert/audio
         c=IN IP4 224.2.0.1/16

   C->M: SETUP rtsp://live.example.com/concert/audio RTSP/1.0
         CSeq: 2
         Transport: RTP/AVP;multicast

   M->C: RTSP/1.0 200 OK
         CSeq: 2
         Transport: RTP/AVP;multicast;destination=224.2.0.1;
                    port=3456-3457;ttl=16
         Session: 0456804596

-Brent

From finlayson at live555.com Tue Jun 12 07:55:35 2007
From: finlayson at live555.com (Ross Finlayson)
Date: Tue, 12 Jun 2007 07:55:35 -0700
Subject: [Live-devel] Multicast Q
In-Reply-To: 
References: 
Message-ID: 

> I am using 'testOnDemandRTSPServer' to support unicast streaming of
> MPEG transport streams to a PC running VLC. This is working with no
> problems. [...] In order to support multicast,
> 'testMPEG1or2VideoStreamer' would have to be integrated/started to
> allow the running RTSP server to perform multicast playback. Is this
> correct, or has multicast support been incorporated into an RTSP
> server?

Yes, our RTSP server implementation supports multicast streams.  This is demonstrated in the various "test*Streamer" demo applications.  E.g., you can demonstrate this by uncommenting the line
  //#define IMPLEMENT_RTSP_SERVER 1
in "testProgs/testMPEG1or2VideoStreamer.cpp".

> Does the existing RTSP server & streamer software support RFC 2326's
> multicast example (below)?

Yes.

> If it does, would it be able to handle having the client set the
> multicast IP destination & port in the SETUP command?

No, because that is a silly idea.  It's the role of the server, not clients, to choose whether - and with what address/port - a stream is multicast.  See
http://lists.live555.com/pipermail/live-devel/2007-February/006273.html

> Is there a client already available that supports having the client
> set up the multicast destination & port

No.
--

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
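(For reference, the pattern that uncommenting IMPLEMENT_RTSP_SERVER enables, paraphrased from the demo program - exact function signatures vary between library versions, and "videoSink"/"rtcp"/"inputFileName" are the demo's own variables:)

  RTSPServer* rtspServer = RTSPServer::createNew(*env);  // RTSP on port 554 by default
  ServerMediaSession* sms
    = ServerMediaSession::createNew(*env, "testStream", inputFileName,
                                    "Session streamed by \"testMPEG1or2VideoStreamer\"",
                                    True /*SSM*/);
  sms->addSubsession(PassiveServerMediaSubsession::createNew(*videoSink, rtcp));
  rtspServer->addServerMediaSession(sms);
  // Clients can now request rtsp://<server>/testStream; the SETUP response
  // carries the multicast destination address and port, much as in the
  // RFC 2326 example quoted above.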
From severin.schoepke at gmail.com Tue Jun 12 08:12:59 2007
From: severin.schoepke at gmail.com (Severin Schoepke)
Date: Tue, 12 Jun 2007 17:12:59 +0200
Subject: [Live-devel] Combining Audio and Video Streams
Message-ID: <466EB7FB.8050302@gmail.com>

Hi list,

I already posted once about problems regarding the synchronization of audio and video streams. To recapitulate: I have a multithreaded app; one thread reads live-generated audio and video frames, and another encodes them (using ffmpeg) and streams them using live555. The frames are transferred between the two threads using two queues. The first thread stores a video frame every 40ms and an audio frame every 24ms.

When I stream out only the video stream, everything works fine. But when I add the audio stream, the video source gets polled far too few times and the video starts lagging behind. I built a small test app that approximately simulates this behaviour (see attachments). The results are the same: if I stream only video, everything works as it should; if I add the audio stream, everything starts to stutter and lag.

Could someone please verify that my pipeline (CustomSource -> MPEG1or2AudioStreamFramer -> MPEG1or2AudioRTPSink and CustomSource -> MPEG4VideoStreamFramer -> MPEG4ESVideoRTPSink) is set up correctly? And could someone point me in the right direction for synchronizing the two streams?

Any answers are highly appreciated!

cheers,
Severin

-------------- next part --------------
Five source-file attachments were scrubbed: ffmpeg2live555.cpp, FFMPEGDummySource.cpp, FFMPEGDummySource.hh, FFMPEGEncoder.cpp, FFMPEGEncoder.hh

From xcsmith at rockwellcollins.com Tue Jun 12 08:48:08 2007
From: xcsmith at rockwellcollins.com (xcsmith at rockwellcollins.com)
Date: Tue, 12 Jun 2007 10:48:08 -0500
Subject: [Live-devel] Multicast Q
In-Reply-To: 
Message-ID: 

> In order to support multicast, 'testMPEG1or2VideoStreamer' would have
> to be integrated/started to allow the running RTSP server to perform
> multicast playback. Is this correct, or has multicast support been
> incorporated into an RTSP server?

As you will see in the mailing list, I already tried to add multicast to the on-demand servers. I would really recommend that you switch to using unicast if you want to have on-demand sessions. LIVE555 can use unicast to "fake" on-demand multicast sessions (send the same data to all clients via unicast), so your users should not be able to tell the difference. I tried to dig through the code and see if I could implement on-demand multicast, but it would be a real pain to do without changing/adding a lot of source, because PassiveServerMediaSubsession requires the Groupsocks to be set up before construction, among other things. Also, it really does not make that much sense to implement.
> Is there a client already available that supports having the client
> set up the multicast destination & port

If you want to use the streamer applications, you will have to implement this yourself in the RTSPClient class, and also make a small change to PassiveServerMediaSubsession, but it is not so bad to do.

xo

From neo_star2007 at yahoo.es Tue Jun 12 13:29:34 2007
From: neo_star2007 at yahoo.es (Joan Manuel Zabala)
Date: Tue, 12 Jun 2007 22:29:34 +0200 (CEST)
Subject: [Live-devel] Help Urgent! testMPEG1or2VideoStreamer problems
Message-ID: <20070612202934.90579.qmail@web28113.mail.ukl.yahoo.com>

Hello to all!

Could somebody tell me why, when I run testMPEG1or2VideoStreamer, it prints these messages? I do not know whether it is sending data to the network:

./testMPEG1or2VideoStreamer
Unable to determine our source address: This computer has an invalid IP address: 0x0
Unable to determine our source address: This computer has an invalid IP address: 0x0
Beginning streaming...
Beginning to read from file...

I do not have a network card, but I have configured a unicast IP address for the machine. Also, I would like somebody to explain to me why the destination is an SSM multicast address, and what address I must use as the source address of the stream.

Please, somebody help me.

Joan Zabala - Venezuela

From jnitin at ssdi.sharp.co.in Wed Jun 13 05:27:01 2007
From: jnitin at ssdi.sharp.co.in (Nitin Jain)
Date: Wed, 13 Jun 2007 17:57:01 +0530
Subject: [Live-devel] Speed Header Implementation in media Server
References: <20070612202934.90579.qmail@web28113.mail.ukl.yahoo.com>
Message-ID: <9219A756FBA09B4D8CE75D4D7987D6CB562760@KABEX01.sharpamericas.com>

Hi all,

We want to implement the "Speed:" header, to deliver data to the client at a particular speed, as per RFC 2326. Presently this feature is not supported in the LIVE555 media server. Can anyone give some hints on how to start working on this?

Regards
Nitin Jain

From finlayson at live555.com Wed Jun 13 06:37:28 2007
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 13 Jun 2007 06:37:28 -0700
Subject: [Live-devel] Speed Header Implementation in media Server
In-Reply-To: <9219A756FBA09B4D8CE75D4D7987D6CB562760@KABEX01.sharpamericas.com>
References: <20070612202934.90579.qmail@web28113.mail.ukl.yahoo.com> <9219A756FBA09B4D8CE75D4D7987D6CB562760@KABEX01.sharpamericas.com>
Message-ID: 

> Hi all,
>
> We want to implement the "Speed:" header, to deliver data to the
> client at a particular speed, as per RFC 2326. Presently this feature
> is not supported in the LIVE555 media server.

Almost no one implements the "Speed:" header, and in fact, because of this, the IETF is considering removing it from future revisions of the RTSP standard.

Do you understand that the "Speed:" header is *not* intended to be used for 'fast forward' play?  Fast-forward play is implemented using the "Scale:" header (which our server already supports, for some media types).
--

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
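(What the "Scale:" header looks like on the wire, in a hypothetical RFC 2326-style exchange - a value of 2.0 asks for playback at twice the normal rate, with the server deciding how to thin the media; "Speed:", by contrast, only changes the wall-clock delivery rate of the same content:)

  C->S: PLAY rtsp://example.com/stream RTSP/1.0
        CSeq: 5
        Session: 12345678
        Scale: 2.0
        Range: npt=10-

  S->C: RTSP/1.0 200 OK
        CSeq: 5
        Session: 12345678
        Scale: 2.0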
From yanivz at hotmail.com Wed Jun 13 07:11:53 2007
From: yanivz at hotmail.com (Yaniv Zohar)
Date: 13 Jun 2007 07:11:53 -0700
Subject: [Live-devel] Hello All
Message-ID: 

Hello All,

I have a small problem: the system() command gets a segmentation fault while the RTSP code is running. I try to call out from my C code to a system command: I run the RTSP code as a thread, then I execute a system command - any system command, even a null command - and I get a segmentation fault.

Does anyone have an idea, or a solution for this problem?

Best Regards
ZManTK

From tom.deblauwe at vsk.be Wed Jun 13 08:19:29 2007
From: tom.deblauwe at vsk.be (Tom Deblauwe)
Date: Wed, 13 Jun 2007 17:19:29 +0200
Subject: [Live-devel] One stream to multiple clients
Message-ID: <46700B01.5040102@vsk.be>

Hello,

I have a question about streaming one stream to multiple clients - for example, a live camera stream coming from a driver which delivers the video in a callback function (in my case). So I get this one stream from the camera, and 'somehow' give it to the liveMedia library. But suppose multiple clients want to see it at the same time, using a TCP connection or a non-multicast UDP 'connection'. So two clients ask for, e.g., "rtsp://ipaddress/showCam?1", and they should both see the same video. Do I need to worry about duplicating the stream to those two clients, or does the library take care of that? If not, how should I go about doing such a thing?

Thanks a lot,
Best regards,
Tom Deblauwe

From gulshanchawla_2000 at yahoo.com Wed Jun 13 09:08:40 2007
From: gulshanchawla_2000 at yahoo.com (gulshan chawla)
Date: Wed, 13 Jun 2007 09:08:40 -0700 (PDT)
Subject: [Live-devel] testOnDemandRTSPServer fails with strange byte error !!!
Message-ID: <141228.11746.qm@web35410.mail.mud.yahoo.com>

Hi,

I am trying to test testOnDemandRTSPServer with a test.mpg file, playing it as mpeg1or2AudioVideoTest, but I get the following errors when I issue a PLAY request from the VLC player:

MPEG1or2VideoStreamParser::parseSlice(): Saw unexpected code 0x1b5
MPEG1or2VideoStreamParser::parseSlice(): Saw unexpected code 0x1b5
MPEG1or2VideoStreamParser::parseSlice(): Saw unexpected code 0x1b5
MPEG1or2VideoStreamParser::parseSlice(): Saw unexpected code 0x1d9
MPEG1or2VideoStreamParser::parseSlice(): Saw unexpected code 0x1ef

Any comments are more than welcome.

Regards
So I get this one stream from >the camera, and 'somehow' give it to the livemedia library. But suppose >multiple clients want to see it at the same time, using a TCP connection >or a non-multicast UDP 'connection'. So two clients ask e.g. >"rtsp://ipaddress/showCam?1". So they should both see the same video. Do >I need to worry about duplicating the stream to those 2 clients, or does >the library take care of that? Yes, our RTSP server implementation takes care of that, *provided that* you set the "reuseFirstSource" parameter to True when you create each "ServerMediaSubsession" object. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Wed Jun 13 11:50:23 2007 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 13 Jun 2007 11:50:23 -0700 Subject: [Live-devel] testOnDemandRTSPServer fails with strange byte error !!! In-Reply-To: <141228.11746.qm@web35410.mail.mud.yahoo.com> References: <141228.11746.qm@web35410.mail.mud.yahoo.com> Message-ID: >Hi , > I am trying to test testOnDemandRTSPServer with a >test.mpg file to play it as an mpeg1or2AudioVideoTest >but getting the following errors when i request a >play request from VLC player > >MPEG1or2VideoStreamParser::parseSlice(): Saw >unexpected code 0x1b5 >MPEG1or2VideoStreamParser::parseSlice(): Saw >unexpected code 0x1b5 >MPEG1or2VideoStreamParser::parseSlice(): Saw >unexpected code 0x1b5 >MPEG1or2VideoStreamParser::parseSlice(): Saw >unexpected code 0x1d9 >MPEG1or2VideoStreamParser::parseSlice(): Saw >unexpected code 0x1ef > > Any comments are more than welcome . Your "test.mpg" file is probably not a MPEG-1 or 2 Program Stream file. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From adigupt_000 at rediffmail.com Wed Jun 13 23:18:48 2007 From: adigupt_000 at rediffmail.com (aditya gupta) Date: 14 Jun 2007 06:18:48 -0000 Subject: [Live-devel] MPEG4 RTP transmission Message-ID: <20070614061848.21604.qmail@webmail99.rediffmail.com> hi, Can any one tell me how to transmit mpeg4 media via RTP .. i had made sufficient changes in testMPEG4VideoStreamer ( i.e. provide unicast address) but things are not working for me .. can any one tell me how to do it .. ? thanx -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070613/91fe140f/attachment.html From finlayson at live555.com Wed Jun 13 23:29:25 2007 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 13 Jun 2007 23:29:25 -0700 Subject: [Live-devel] MPEG4 RTP transmission In-Reply-To: <20070614061848.21604.qmail@webmail99.rediffmail.com> References: <20070614061848.21604.qmail@webmail99.rediffmail.com> Message-ID: >hi, > >Can any one tell me how to transmit mpeg4 media via RTP .. i had made >sufficient changes in testMPEG4VideoStreamer ( i.e. provide unicast >address) but things are not working for me .. Does it work OK for you if you *don't* make any changes to the code?? The supplied application code (for "testMPEG4VideoStreamer") is there for a reason - it works. Anyway, the best way to stream via unicast is to use "testOnDemandRTSPServer" (or "live555MediaServer"). -- Ross Finlayson Live Networks, Inc. 
From adigupt_000 at rediffmail.com Thu Jun 14 02:39:21 2007
From: adigupt_000 at rediffmail.com (aditya gupta)
Date: 14 Jun 2007 09:39:21 -0000
Subject: [Live-devel] MPEG4 RTP transmission
Message-ID: <20070614093921.9163.qmail@webmail94.rediffmail.com>

On Thu, 14 Jun 2007 Ross Finlayson wrote:
> Does it work OK for you if you *don't* make any changes to the code??
> The supplied application code (for "testMPEG4VideoStreamer") is there
> for a reason - it works.
>
> Anyway, the best way to stream via unicast is to use
> "testOnDemandRTSPServer" (or "live555MediaServer").

Hi,

Yes, it works fine unmodified. But "testOnDemandRTSPServer" and the unmodified "testMPEG4VideoStreamer" both use RTSP. I want to transmit the media through plain RTP/RTCP, and that is not working.

Thanks

From finlayson at live555.com Thu Jun 14 02:51:14 2007
From: finlayson at live555.com (Ross Finlayson)
Date: Thu, 14 Jun 2007 02:51:14 -0700
Subject: [Live-devel] MPEG4 RTP transmission
In-Reply-To: <20070614093921.9163.qmail@webmail94.rediffmail.com>
References: <20070614093921.9163.qmail@webmail94.rediffmail.com>
Message-ID: 

> Yes, it works fine unmodified. But "testOnDemandRTSPServer" and the
> unmodified "testMPEG4VideoStreamer" both use RTSP. I want to transmit
> the media through plain RTP/RTCP, and that is not working.

RTSP is not a 'transmission' protocol; it is a control protocol.  RTSP servers *do* transmit their media using RTP/RTCP.

To stream MPEG-4 video via unicast, you will need to use a RTSP/RTP/RTCP server - e.g., "testOnDemandRTSPServer" - and receive/play it using a RTSP/RTP/RTCP client - e.g., VLC.
--

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From adigupt_000 at rediffmail.com Thu Jun 14 03:45:38 2007
From: adigupt_000 at rediffmail.com (aditya gupta)
Date: 14 Jun 2007 10:45:38 -0000
Subject: [Live-devel] MPEG4 RTP transmission
Message-ID: <20070614104538.13894.qmail@webmail104.rediffmail.com>

On Thu, 14 Jun 2007 Ross Finlayson wrote:
> RTSP is not a 'transmission' protocol; it is a control protocol.
> RTSP servers *do* transmit their media using RTP/RTCP.
>
> To stream MPEG-4 video via unicast, you will need to use a
> RTSP/RTP/RTCP server - e.g., "testOnDemandRTSPServer" - and
> receive/play it using a RTSP/RTP/RTCP client - e.g., VLC.

Hi,

That is fine, but when I made changes (providing a unicast address) exactly as in testMPEG1or2VideoStreamer, I was unable to play the stream with the VLC player. I *am* able to play MPEG-1/2 media streamed through testMPEG1or2VideoStreamer, and also MPEG-4 media streamed through testMPEG4VideoStreamer (unchanged), i.e. transmitted with the RTSP server.

Thanks
From jnitin at ssdi.sharp.co.in Thu Jun 14 04:59:40 2007
From: jnitin at ssdi.sharp.co.in (Nitin Jain)
Date: Thu, 14 Jun 2007 17:29:40 +0530
Subject: [Live-devel] How to use setMediaSessionParameter() and getMediaSessionParameter() from LIVE555 media Server
References: <20070614104538.13894.qmail@webmail104.rediffmail.com>
Message-ID: <9219A756FBA09B4D8CE75D4D7987D6CB562761@KABEX01.sharpamericas.com>

Hi All,

I want to use the GET_PARAMETER and SET_PARAMETER RTSP request methods from both the LIVE555 RTSP client and the LIVE555 media server, as per RFC 2326. But by looking at the codebase, I found that the corresponding APIs in the file RTSPClient.cpp:

1) Boolean setMediaSessionParameter()
2) Boolean getMediaSessionParameter()

can be used for sending GET_PARAMETER and SET_PARAMETER requests only from the RTSP client, and not from the media server. Please tell me how I can use these two APIs from the LIVE555 media server as well, as per RFC 2326.

Regards
Nitin

From finlayson at live555.com Thu Jun 14 05:38:19 2007
From: finlayson at live555.com (Ross Finlayson)
Date: Thu, 14 Jun 2007 05:38:19 -0700
Subject: [Live-devel] How to use setMediaSessionParameter() and getMediaSessionParameter() from LIVE555 media Server
In-Reply-To: <9219A756FBA09B4D8CE75D4D7987D6CB562761@KABEX01.sharpamericas.com>
References: <20070614104538.13894.qmail@webmail104.rediffmail.com> <9219A756FBA09B4D8CE75D4D7987D6CB562761@KABEX01.sharpamericas.com>
Message-ID: 

> But by looking at the codebase, I found that the corresponding APIs
> in the file RTSPClient.cpp:
>
> 1) Boolean setMediaSessionParameter()
> 2) Boolean getMediaSessionParameter()
>
> can be used for sending GET_PARAMETER and SET_PARAMETER requests only
> from the RTSP client, and not from the media server.
>
> Please tell me how I can use these two APIs from the LIVE555 media
> server

You can't, because those functions are defined in the "RTSPClient" class - i.e., a RTSP client, not a RTSP server.

If you really want to implement these two requests in our RTSP *server* implementation, then you would need to modify the "RTSPServer" class.  (Note that there's already a stub there for implementing "GET_PARAMETER".)  You're on your own here...
--

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From tom.deblauwe at vsk.be Thu Jun 14 06:01:50 2007
From: tom.deblauwe at vsk.be (Tom Deblauwe)
Date: Thu, 14 Jun 2007 15:01:50 +0200
Subject: [Live-devel] One stream to multiple clients
In-Reply-To: 
References: <46700B01.5040102@vsk.be>
Message-ID: <46713C3E.2080900@vsk.be>

Ross Finlayson wrote:
> Yes, our RTSP server implementation takes care of that, *provided
> that* you set the "reuseFirstSource" parameter to True when you
> create each "ServerMediaSubsession" object.

Nice!

So, I was checking out the whole picture of the library, just trying to understand it. It's a pity there is not a good overview of the different classes and how they are used. So, after looking at the sources, I will try to explain here; correct me if I'm wrong!

To implement your own RTSP server, subclass RTSPServer and reimplement lookupServerMediaSession. A media session represents an RTSP URL that a client can request from the RTSP server, and represents one source of video and/or audio data.
In lookupServerMediaSession, you check whether the session already exists and, if not, create a standard one using the static function createNew of ServerMediaSession. Then you need to add a subsession to this session, and then you add the ServerMediaSession to your RTSP server object using rtspServer->addServerMediaSession.

A subsession class is subclassed from OnDemandServerMediaSubsession and implements createNew to create a new subsession to add to your session. You also reimplement createNewRTPSink so you can stream TO the network (from a file or whatever). This function must give back an RTPSink object, so you subclass MultiFramedRTPSink (see SimpleRTPSink for an example) and use it to create that RTPSink object. As far as I understand, the RTPSink object gives us the chance to set some RTP-specific parameters, like a marker bit or something, according to the contents of the individual RTP packet that will be sent out. SimpleRTPSink sets the marker bit on the last packet of a big frame, for example.

You also need to subclass createNewStreamSource of the OnDemandServerMediaSubsession class if you want to be able to receive data FROM the network. You need to give back a FramedSource pointer in this function.

From here on I'm not so sure anymore :) We have a bunch of classes that contain the name 'Framer', which are subclasses of FramedSource, and which (I think) write the incoming data to a file in most cases. We also have MultiFramedRTPSource, which is also a FramedSource; SimpleRTPSource is subclassed from it, so we can return an object of this class in createNewStreamSource.

Now I can't explain any further how the sink gets its incoming data, and how the source gets its data - where is it? I don't understand. Can someone help me out here?

Thanks a lot,
Best regards,
Tom Deblauwe

So you need to subclass FramedFilter and implement the createNew function. A FramedFilter is also a FramedSource, and so you (must?) return an instance of your subclass in the createNewStreamSource function. How the
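(To make Tom's walkthrough concrete, here is a minimal sketch of a live-source subsession. The two virtual functions are OnDemandServerMediaSubsession's real extension points; "CameraFramedSource" is an invented stand-in for a FramedSource that wraps the camera driver's callback, not a library class:)

  #include "OnDemandServerMediaSubsession.hh"
  #include "SimpleRTPSink.hh"

  class CameraServerMediaSubsession: public OnDemandServerMediaSubsession {
  public:
    static CameraServerMediaSubsession* createNew(UsageEnvironment& env,
                                                  Boolean reuseFirstSource) {
      return new CameraServerMediaSubsession(env, reuseFirstSource);
    }
  protected:
    CameraServerMediaSubsession(UsageEnvironment& env, Boolean reuseFirstSource)
      : OnDemandServerMediaSubsession(env, reuseFirstSource) {}

    // Called whenever a new input source is needed (once only, if
    // "reuseFirstSource" is True):
    virtual FramedSource* createNewStreamSource(unsigned /*clientSessionId*/,
                                                unsigned& estBitrate) {
      estBitrate = 500; // kbps; used for RTCP bandwidth apportioning
      return CameraFramedSource::createNew(envir()); // hypothetical class
    }

    // Called to create the RTPSink that will packetize that source:
    virtual RTPSink* createNewRTPSink(Groupsock* rtpGroupsock,
                                      unsigned char rtpPayloadTypeIfDynamic,
                                      FramedSource* /*inputSource*/) {
      return SimpleRTPSink::createNew(envir(), rtpGroupsock,
                                      rtpPayloadTypeIfDynamic,
                                      90000, "video", "MP2T", 1);
    }
  };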
From neo_star2007 at yahoo.es Thu Jun 14 07:23:00 2007
From: neo_star2007 at yahoo.es (Joan Manuel Zabala)
Date: Thu, 14 Jun 2007 16:23:00 +0200 (CEST)
Subject: [Live-devel] Help! how a program RTSP client works with multicast
Message-ID: <20070614142300.84750.qmail@web28101.mail.ukl.yahoo.com>

Hi,

If I have an RTSP client program that works with multicast, must my machine have a multicast IP address, or some other kind of address? And can I test the server's video transmission on the same machine where I run the client?

Joan Zabala - Venezuela

From finlayson at live555.com Thu Jun 14 09:08:31 2007
From: finlayson at live555.com (Ross Finlayson)
Date: Thu, 14 Jun 2007 09:08:31 -0700
Subject: [Live-devel] One stream to multiple clients
In-Reply-To: <46713C3E.2080900@vsk.be>
References: <46700B01.5040102@vsk.be> <46713C3E.2080900@vsk.be>
Message-ID: 

> To implement your own RTSP server, subclass RTSPServer

No, in most cases you don't need to write your own subclass; the existing "RTSPServer" class will work just fine.  This is especially true if the set of accessible streams is fixed, and known in advance.  (See, for example, the code for "testOnDemandRTSPServer".)

However, to repeat the answer to the original question: if you want the contents of a stream to be duplicated to each concurrent client, then make sure that you set the "reuseFirstSource" parameter to True when you create each "ServerMediaSubsession" object.
--

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From ymreddy at ssdi.sharp.co.in Thu Jun 14 09:43:52 2007
From: ymreddy at ssdi.sharp.co.in (Mallikharjuna Reddy (NAVT))
Date: Thu, 14 Jun 2007 22:13:52 +0530
Subject: [Live-devel] Frame rate SDP attribute
References: <46700B01.5040102@vsk.be> <46713C3E.2080900@vsk.be>
Message-ID: <9219A756FBA09B4D8CE75D4D7987D6CB8C599F@KABEX01.sharpamericas.com>

Hello Everybody,

We would like to include the frame rate attribute as part of the SDP generated during RTSP streaming from server to client. We found that the frame rate is retrieved from the stream only after the SDP description has been sent. Any clues on how to include the frame rate as part of the SDP?

Thanks and Regards
Y. Mallikharjuna Reddy
From finlayson at live555.com Thu Jun 14 11:35:06 2007
From: finlayson at live555.com (Ross Finlayson)
Date: Thu, 14 Jun 2007 11:35:06 -0700
Subject: [Live-devel] Frame rate SDP attribute
In-Reply-To: <9219A756FBA09B4D8CE75D4D7987D6CB8C7BA4@KABEX01.sharpamericas.com>
References: <9219A756FBA09B4D8CE75D4D7987D6CB8C7BA4@KABEX01.sharpamericas.com>
Message-ID: 

Please don't send the same message to the list multiple times!

> We would like to include the frame rate attribute as part of the SDP
> generated during RTSP streaming from server to client. We found that
> the frame rate is retrieved from the stream only after the SDP
> description has been sent. Any clues on how to include the frame rate
> as part of the SDP?

What type of video data is this?

For most video codecs, the frame rate can be derived from information that's carried in-band (in codec-specific frame headers), rather than in SDP.

There is no standard SDP attribute for "frame rate".
--

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
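(What "derived from information that's carried in-band" means for MPEG-1/2 video, concretely: the video sequence header - start code 0x000001B3 - carries a 4-bit frame_rate_code. An illustrative parser, not the library's own code; the library's MPEG framers do this internally:)

  // 'p' points at an MPEG-1/2 video sequence header (00 00 01 B3 ...).
  double mpegFrameRateFromSequenceHeader(unsigned char const* p) {
    static double const frameRateTable[16] = {
      0.0, 23.976, 24.0, 25.0, 29.97, 30.0, 50.0, 59.94, 60.0,
      0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0   // codes 9..15 are reserved
    };
    // After the 4-byte start code: 12 bits horizontal_size, 12 bits
    // vertical_size, 4 bits aspect_ratio_information, 4 bits frame_rate_code:
    unsigned frameRateCode = p[7] & 0x0F;
    return frameRateTable[frameRateCode];
  }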
From adigupt_000 at rediffmail.com Thu Jun 14 21:50:03 2007 From: adigupt_000 at rediffmail.com (aditya gupta) Date: 15 Jun 2007 04:50:03 -0000 Subject: [Live-devel] MPEG4 RTP transmission Message-ID: <20070615045003.9318.qmail@webmail90.rediffmail.com> hi, i want to transmit MPEG4 media through RTP/RTCP (i don't want to use RTSP).. so for this i made changes (providing a unicast address) exactly as in testMPEG1or2VideoStreamer, but i am unable to play it with the vlc player.. i am able to play MPEG1/2 media streamed through testMPEG1or2VideoStreamer, and also MPEG4 media streamed through testMPEG4VideoStreamer (unchanged), i.e. transmitted via its RTSP server.. can anyone tell me how to do it.. thanx -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070614/4e8fe9e2/attachment.html From finlayson at live555.com Thu Jun 14 22:05:06 2007 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 14 Jun 2007 22:05:06 -0700 Subject: [Live-devel] Frame rate SDP attribute In-Reply-To: <9219A756FBA09B4D8CE75D4D7987D6CB8C7BA5@KABEX01.sharpamericas.com> References: <9219A756FBA09B4D8CE75D4D7987D6CB8C7BA5@KABEX01.sharpamericas.com> Message-ID: >The video data is MPEG2 ES, PS and TS streams. Then you don't convey the frame rate in SDP, because it can be derived from the video data itself. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070614/d7f9e52c/attachment.html From finlayson at live555.com Thu Jun 14 23:12:28 2007 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 14 Jun 2007 23:12:28 -0700 Subject: [Live-devel] MPEG4 RTP transmission In-Reply-To: <20070615045003.9318.qmail@webmail90.rediffmail.com> References: <20070615045003.9318.qmail@webmail90.rediffmail.com> Message-ID: >i want to transmit MPEG4 media through RTP/RTCP (i don't want to use >RTSP).. Then you're out of luck. To stream MPEG-4 using our library, you need to use RTSP. We have both multicast ("testMPEG4VideoStreamer") and unicast ("testOnDemandRTSPServer") demo applications for this. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From adigupt_000 at rediffmail.com Thu Jun 14 23:27:29 2007 From: adigupt_000 at rediffmail.com (aditya gupta) Date: 15 Jun 2007 06:27:29 -0000 Subject: [Live-devel] MPEG4 RTP transmission Message-ID: <20070615062729.30748.qmail@webmail106.rediffmail.com> On Fri, 15 Jun 2007 Ross Finlayson wrote : > >i want to transmit MPEG4 media through RTP/RTCP (i don't want to use > >RTSP).. > >Then you're out of luck. To stream MPEG-4 using our library, you >need to use RTSP. We have both multicast ("testMPEG4VideoStreamer") >and unicast ("testOnDemandRTSPServer") demo applications for this. >-- > >Ross Finlayson >Live Networks, Inc. >http://www.live555.com/ hi, i think testMPEG1or2VideoStreamer is not using RTSP (correct me if i am wrong), so can't MPEG4 media also be transmitted in the same way through testMPEG4VideoStreamer? thanx... -------------- next part -------------- An HTML attachment was scrubbed...
URL: http://lists.live555.com/pipermail/live-devel/attachments/20070614/376ebfbc/attachment.html From tom.deblauwe at vsk.be Thu Jun 14 23:35:02 2007 From: tom.deblauwe at vsk.be (Tom Deblauwe) Date: Fri, 15 Jun 2007 08:35:02 +0200 Subject: [Live-devel] One stream to multiple clients In-Reply-To: References: <46700B01.5040102@vsk.be> <46713C3E.2080900@vsk.be> Message-ID: <46723316.2020600@vsk.be> Ross Finlayson wrote: >> To implement your own RTSP server, subclass RTSPServer >> > > No, in most cases you don't need to write your own subclass; the > existing "RTSPServer" class will work just fine. This is especially > true if the set of accessible streams is fixed, and known in advance. > (See, for example, the code for "testOnDemandRTSPServer".) However, > to repeat the answer to the original question, if you want the > contents of a stream to be duplicated to each concurrent client, then > make sure that you set the "reuseFirstSource" parameter to True when > you create each "ServerMediaSubsession" object. > OK, I've re-read the first question in the FAQ, and now I understand where we get the data from. Thanks! Regards, Tom From finlayson at live555.com Fri Jun 15 01:26:13 2007 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 15 Jun 2007 01:26:13 -0700 Subject: [Live-devel] MPEG4 RTP transmission In-Reply-To: <20070615062729.30748.qmail@webmail106.rediffmail.com> References: <20070615062729.30748.qmail@webmail106.rediffmail.com> Message-ID: >i think testMPEG1or2VideoStreamer is not using RTSP (correct me if i >am wrong), so can't MPEG4 media also be transmitted in the same way >through testMPEG4VideoStreamer? No - MPEG-4 video is different. Because this is a frequently asked question, I have now added it to the FAQ: http://www.live555.com/liveMedia/faq.html#rtsp-needed -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070615/39b15fb2/attachment.html From adigupt_000 at rediffmail.com Fri Jun 15 01:49:31 2007 From: adigupt_000 at rediffmail.com (aditya gupta) Date: 15 Jun 2007 08:49:31 -0000 Subject: [Live-devel] MPEG4 RTP transmission Message-ID: <20070615084931.9932.qmail@webmail89.rediffmail.com> On Fri, 15 Jun 2007 Ross Finlayson wrote : >>i think testMPEG1or2VideoStreamer is not using RTSP (correct me if i >>am wrong), so can't MPEG4 media also be transmitted in the same way >>through testMPEG4VideoStreamer? > >No - MPEG-4 video is different. > >Because this is a frequently asked question, I have now added it to the FAQ: >http://www.live555.com/liveMedia/faq.html#rtsp-needed >-- >Ross Finlayson >Live Networks, Inc. >http://www.live555.com/ thanx for all the replies.. i will look out for some other way.. -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070615/8de88742/attachment.html From finlayson at live555.com Fri Jun 15 02:13:37 2007 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 15 Jun 2007 02:13:37 -0700 Subject: [Live-devel] MPEG4 RTP transmission In-Reply-To: <20070615084931.9932.qmail@webmail89.rediffmail.com> References: <20070615084931.9932.qmail@webmail89.rediffmail.com> Message-ID: >thanx for all the replies.. i will look out for some other way.. Or you could just use RTSP. -- Ross Finlayson Live Networks, Inc.
http://www.live555.com/ From lucabe72 at email.it Fri Jun 15 02:15:20 2007 From: lucabe72 at email.it (Luca Abeni) Date: Fri, 15 Jun 2007 11:15:20 +0200 Subject: [Live-devel] MPEG4 RTP transmission In-Reply-To: References: <20070615062729.30748.qmail@webmail106.rediffmail.com> Message-ID: <467258A8.7060400@email.it> Hi Ross, Ross Finlayson wrote: >> i think testMPEG1or2VideoStreamer is not using RTSP (correct me if i >> am wrong), so can't MPEG4 media also be transmitted in the same way >> through testMPEG4VideoStreamer? > > No - MPEG-4 video is different. > > Because this is a frequently asked question, I have now added it to the FAQ: > http://www.live555.com/liveMedia/faq.html#rtsp-needed Sorry for jumping in on this thread, but this new FAQ makes me curious: is the out-of-band transmission of MPEG4 global headers mandatory? I mean: my understanding from reading RFC 3016 is that configuration information may be transmitted out-of-band, but we are not forced to do so... Did I misunderstand something? Thanks, Luca From julian.lamberty at mytum.de Fri Jun 15 03:25:08 2007 From: julian.lamberty at mytum.de (Julian Lamberty) Date: Fri, 15 Jun 2007 12:25:08 +0200 Subject: [Live-devel] vobStreamer questions Message-ID: <46726904.2000502@mytum.de> Hi! I would like to know how vobStreamer (more specifically, MPEG1or2VideoRTPSource) encapsulates the video stream from a DVD into RTP packets. Does it extract the elementary streams from the program stream and pack the elementary streams according to RFC 2250? Why does Wireshark identify the RTP packets as MPEG-1? It should rather be MPEG-2, right? Thanks Julian -------------- next part -------------- A non-text attachment was scrubbed... Name: smime.p7s Type: application/x-pkcs7-signature Size: 5198 bytes Desc: S/MIME Cryptographic Signature Url : http://lists.live555.com/pipermail/live-devel/attachments/20070615/a76fb752/attachment.bin From jnitin at ssdi.sharp.co.in Fri Jun 15 05:01:03 2007 From: jnitin at ssdi.sharp.co.in (Nitin Jain) Date: Fri, 15 Jun 2007 17:31:03 +0530 Subject: [Live-devel] Calling RTSPClient::incomingRequestHandler1() from RTSP Client to handle RTSP request from Server Message-ID: <9219A756FBA09B4D8CE75D4D7987D6CB5631FE@KABEX01.sharpamericas.com> Hi Everybody, I saw a reply by Ross in the mailing list about using SET_PARAMETER from the RTSP server, as below: ************************************************************************ >But anyway, if you *really* want to handle the "SET_PARAMETER" (or >any other) RTSP command coming from the server, then you could modify >the function "RTSPClient::incomingRequestHandler1()". (However, once >you've modified the code, you can, in general, expect no support.) >Ross Finlayson >Live Networks, Inc. >http://www.live555.com/ ************************************************************************* My question is: 1) From where can I call the RTSPClient::incomingRequestHandler1() function, so that the client can handle an RTSP request from the server? Presently, when I try to send a SET_PARAMETER request from the server, the RTSP client treats it as a response and not as an RTSP request. Please provide your valuable feedback.
Regards Nitin From ravinder.singh at ti.com Fri Jun 15 06:28:44 2007 From: ravinder.singh at ti.com (singh, Ravinder) Date: Fri, 15 Jun 2007 18:58:44 +0530 Subject: [Live-devel] response to VOD server In-Reply-To: Message-ID: <7EAD1AEEA7621C45899FE99123E124A0E6657F@dbde01.ent.ti.com> Hi Ross, I would like to clarify the following things: 1) Is a response sent to the server after every data packet received? If yes, from which file/function? 2) Can openRTSP handle a TS+H264+MPEG1l2 combo in trick play mode? 3) Can I control the VOD server's data streaming rate? My buffer is getting overflowed because my server is sending data at a faster rate than it is being consumed; in spite of maintaining a 10MB circular buffer, my data is getting overflowed. By any means, can I control the rate at which the server is streaming the data? Regards Ravinder From finlayson at live555.com Fri Jun 15 07:49:20 2007 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 15 Jun 2007 07:49:20 -0700 Subject: [Live-devel] MPEG4 RTP transmission In-Reply-To: <467258A8.7060400@email.it> References: <20070615062729.30748.qmail@webmail106.rediffmail.com> <467258A8.7060400@email.it> Message-ID: >Hi Ross, > >Ross Finlayson wrote: >>> i think testMPEG1or2VideoStreamer is not using RTSP (correct me if i >>> am wrong), so can't MPEG4 media also be transmitted in the same way >>> through testMPEG4VideoStreamer? > >> No - MPEG-4 video is different. >> >> Because this is a frequently asked question, I have now added it to the FAQ: >> http://www.live555.com/liveMedia/faq.html#rtsp-needed >Sorry for jumping in on this thread, but this new FAQ makes me curious: is >the out-of-band transmission of MPEG4 global headers mandatory? > >I mean: my understanding from reading RFC 3016 is that configuration >information may be transmitted out-of-band, but we are not forced to do >so... Did I misunderstand something? No, that is correct. However, if the config information is not carried at all (not even in-band, as part of the MPEG-4 video RTP stream), then receivers may not be able to decode the stream properly. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Fri Jun 15 07:52:32 2007 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 15 Jun 2007 07:52:32 -0700 Subject: [Live-devel] vobStreamer questions In-Reply-To: <46726904.2000502@mytum.de> References: <46726904.2000502@mytum.de> Message-ID: >I would like to know how vobStreamer (more specifically, >MPEG1or2VideoRTPSource) encapsulates the video stream from a DVD into >RTP packets. Does it extract the elementary streams from the program >stream and pack the elementary streams according to RFC 2250? Yes, except that the *audio* stream - being AC-3 rather than MPEG audio - is sent using a different RTP payload format (which, in fact, does not currently conform to the IETF standard RFC for AC-3 RTP streams; that code will need to be updated). > >Why does Wireshark identify the RTP packets as MPEG-1? I don't know; you'll have to ask the author of "Wireshark". -- Ross Finlayson Live Networks, Inc.
http://www.live555.com/ From finlayson at live555.com Fri Jun 15 07:59:51 2007 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 15 Jun 2007 07:59:51 -0700 Subject: [Live-devel] response to VOD server In-Reply-To: <7EAD1AEEA7621C45899FE99123E124A0E6657F@dbde01.ent.ti.com> References: <7EAD1AEEA7621C45899FE99123E124A0E6657F@dbde01.ent.ti.com> Message-ID: >1) Is a response sent to the server after every data packet received? Are you asking if an RTSP/RTP client sends back a 'response' to the server after every incoming RTP packet that it receives?? Most definitely not. The only things that clients send back to the server are occasional RTCP Reception Report ("RR") packets, and RTSP commands (like "TEARDOWN", to end the stream). >2) Can openRTSP handle a TS+H264+MPEG1l2 combo in trick play mode? The "openRTSP" client application does not support 'trick mode' operations at all. >3) Can I control the VOD server's data streaming rate? My buffer is >getting overflowed because my server is sending data at a faster rate >than it is being consumed; in spite of maintaining a 10MB circular >buffer, my data is getting overflowed. By any means, can I control the >rate at which the server is streaming the data? Are you talking about *our* RTSP server implementation? If so, then our server implementation streams data at its natural rate (on average). You seem very confused about lots of things... -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From jhansonxi at gmail.com Sat Jun 16 20:52:25 2007 From: jhansonxi at gmail.com (Jeff Hanson) Date: Sat, 16 Jun 2007 23:52:25 -0400 Subject: [Live-devel] Problem with RTSP stream playback in VLC Message-ID: I'm using Ubuntu Feisty and installed VLC via Automatix. When I try to play an RTSP streaming video it refuses to play. The terminal output shows that it's using live555, and the error is "main input error: no suitable access module for `rtsp:...". I would just like to know if it's a limitation or an actual bug, and whether it's live555, VLC, or the package maintainer I should report it to. The video link file (Rob Zombie video) and terminal log are attached.
-------------- next part -------------- vlc VLC media player 0.8.6 Janus Sending request: OPTIONS rtsp://st21g1.services.att-idns.net/v1/774/2334/14/f8/14f8724c1a3f56462e24b0633f1789fa.rm?ts=1139089010&ttl=300&cs=C5AA4D6E6F862925D8BFC6BAD1D478DD846ED431 RTSP/1.0 CSeq: 1 User-Agent: VLC media player (LIVE555 Streaming Media v2006.03.16) Received OPTIONS response: RTSP/1.0 200 OK CSeq: 1 Date: Sun, 17 Jun 2007 03:40:32 GMT Server: Helix Server Version 9.0.6.1262 (win32) (RealServer compatible) Public: OPTIONS, DESCRIBE, ANNOUNCE, PLAY, SETUP, GET_PARAMETER, SET_PARAMETER, TEARDOWN RealChallenge1: 18b2533a7d6733af7a183ba2d9afc5d9 StatsMask: 3 Via: 1.0 nycny113ins (NetCache NetApp/5.6.2D24) Sending request: DESCRIBE rtsp://st21g1.services.att-idns.net/v1/774/2334/14/f8/14f8724c1a3f56462e24b0633f1789fa.rm?ts=1139089010&ttl=300&cs=C5AA4D6E6F862925D8BFC6BAD1D478DD846ED431 RTSP/1.0 CSeq: 2 Accept: application/sdp User-Agent: VLC media player (LIVE555 Streaming Media v2006.03.16) Received DESCRIBE response: RTSP/1.0 200 OK CSeq: 2 Date: Sun, 17 Jun 2007 03:40:32 GMT Set-Cookie: cbid=hfgjihilcjcfflgheoorkugqmrjrptducfhjkipicgdkclplpnlorumtorrsltpuhkqgcmfi;path=/;expires=Thu,31-Dec-2037 23:59:59 GMT vsrc: http://sppoc.intr.icdsatt.net:80/viewsource/template.html?nuyhtgaaiwz6imDxE7pD02b7sqy7y626DAwe31g4o7sa63b7h4b43ecgsxy7ilyxqe5C0aE64jduwexurwc8mzlfngxya8sg0C55h7i76mb8s1Dcn53Esez8y3wBm5zl91u9082dn7wix9z71ut7b7C5l1A9mr72u9Blzzz9cEDyE7mgms580BqCz9000000000000000000000000000000000000000000000000000000craEg7000000000000000000000000iCh6stbczzb0000000000000000000000000000000000000wEyzb0000000000000000000000000000000000000000000000000ui0000000000000000000000xii5E6000000rog000A3B000tixa3rfzcxh0000000000000000000000000000000dbpcy5h582v9Dtxgqe000000000000000000000000tjcB2Byug000aia9r4qqxob8000000000000Di7000 Last-Modified: Wed, 23 Jul 2003 01:35:09 GMT Content-type: application/sdp Content-base: rtsp://st21g1.services.att-idns.net/v1/774/2334/14/f8/14f8724c1a3f56462e24b0633f1789fa.rm?ts=1139089010&ttl=300&cs=C5AA4D6E6F862925D8BFC6BAD1D478DD846ED431/ Via: 1.0 nycny113ins (NetCache NetApp/5.6.2D24) Content-Length: 5611 Need to read 5611 extra bytes Read 1788 extra bytes: v=0 o=- 1058924109 1058924109 IN IP4 199.106.211.175 s= i= c=IN IP4 199.106.211.144 t=0 0 a=SdpplinVersion:1610641492 a=Flags:integer;0 a=IsRealDataType:integer;1 a=StreamCount:integer;2 a=ASMRuleBook:string;"#($Bandwidth < 21800),Stream0Bandwidth = 8000, Stream1Bandwidth = 9399;#($Bandwidth >= 21800) && ($Bandwidth < 29000),Stream0Bandwidth = 8000, Stream1Bandwidth = 13800;#($Bandwidth >= 29000) && ($Bandwidth < 60799),Stream0Bandwidth = 8000, Stream1Bandwidth = 21000;#($Bandwidth >= 60799) && ($Bandwidth < 80999),Stream0Bandwidth = 16193, Stream1Bandwidth = 44606;#($Bandwidth >= 80999) && ($Bandwidth < 204799),Stream0Bandwidth = 16193, Stream1Bandwidth = 64806;#($Bandwidth >= 204799) && ($Bandwidth < 272999),Stream0Bandwidth = 44100, Stream1Bandwidth = 160699;#($Bandwidth >= 272999),Stream0Bandwidth = 44100, Stream1Bandwidth = 228899;" a=range:npt=0- m=audio 0 RTP/AVP 101 b=AS:44 a=control:streamid=0 a=range:npt=0-239.076000 a=length:npt=239.076000 a=rtpmap:101 x-pn-realaudio/1000 a=mimetype:string;"audio/x-pn-realaudio" a=ActualPreroll:integer;2304 a=AvgBitRate:integer;44100 a=AvgPacketSize:integer;640 a=EndOneRuleEndAll:integer;1 a=EndTime:integer;237679 a=MaxBitRate:integer;44100 a=MaxPacketSize:integer;640 a=MinimumSwitchOverlap:integer;200 a=Preroll:integer;4608 a=SeekGreaterOnSwitch:integer;0 a=StartTime:integer;0 
a=OpaqueData:buffer;"TUxUSQAIAAMAAwAAAAAAAQABAAIAAgAEAAAAVi5yYf0ABQAALnJhNWYFYccABQAAAEYAAAAAASAAA6gAAADqYAAAAAAACAEgACAAAAAAH0AAAB9AAAAAEAABZ2VucmNvb2sBAAAAAAAACAEAAAEBAAAMAAAAVi5yYf0ABQAALnJhNWYFYccABQAAAEYAAgAAAjQAB13gAAHaZwAAAAAACAI0AC8AAAAAViIAAFYiAAAAEAABZ2VucmNvb2sBAAAAAAAACAEAAAECAAASAAAAVi5yYf0ABQAALnJhNWYFYccABQAAAEYABQAAAoAAFAAAAAUL/gAAAAAAEAKAAIAAAAAArEQAAKxEAAAAE Read 1460 extra bytes: AABZ2VucmNvb2sBAAAAAAAACAEAAAIEAAAvAAAAVi5yYf0ABQAALnJhNWYFYccABQAAAEYAAAAAASAAA6gAAADqYAAAAAAACAEgACAAAAAAH0AAAB9AAAAAEAABZ2VucmNvb2sBAAAAAAAACAEAAAEBAAAM" a=RMFF 1.0 Flags:buffer;"AAgAAgAAAAIAAAACAAAAAgAA" a=ASMRuleBook:string;"#($OldPNMPlayer),AverageBandwidth=8000,priority=5,PNMKeyframeRule=T;#($OldPNMPlayer),AverageBandwidth=0,priority=5,PNMNonKeyframeRule=T;#($Bandwidth < 16193),AverageBandwidth=8000,Priority=5;#($Bandwidth < 16193),AverageBandwidth=0,Priority=5,OnDepend=\"2\", OffDepend=\"2\";#($Bandwidth >= 16193) && ($Bandwidth < 44100),AverageBandwidth=16193,Priority=5;#($Bandwidth >= 16193) && ($Bandwidth < 44100),AverageBandwidth=0,Priority=5,OnDepend=\"4\", OffDepend=\"4\";#($Bandwidth >= 44100),AverageBandwidth=44100,Priority=5;#($Bandwidth >= 44100),AverageBandwidth=0,Priority=5,OnDepend=\"6\", OffDepend=\"6\";" a=intrinsicDurationType:string;"intrinsicDurationContinuous" a=StreamName:string;"audio/x-pn-multirate-realaudio logical stream" m=video 0 RTP/AVP 101 b=AS:229 a=control:streamid=1 a=range:npt=0-237.683000 a=length:npt=237.683000 a=rtpmap:101 x-pn-realvideo/1000 a=mimetype:string;"video/x-pn-realvideo" a=AvgBitRate:integer;228899 a=AvgPacketSize:integer;821 a=EndOneRuleEndAll:integer;1 a=EndTime:integer;237679 a=MaxBitRate:integer;228899 a=MaxPacketSize:integer;1007 a=MinimumSwitchOverlap:integer;0 a=Preroll:integer;9666 a=SeekGreaterOnSwitch:integer;1 a=StartTime:integer;0 a=OpaqueData:buffe Read 1460 extra bytes: r;"TUxUSQASAAcABwAIAAkAAgACAAEAAQAAAAAABAAEAAMAAwAGAAYABQAFAAoAAAAkAAAAJFZJRE9SVjMwAPAAtAAMAAAAAAAKAAABCZAwMCAgAiwgAAAAJAAAACRWSURPUlYzMADwALQADAAAAAAACgAAAQmQMDAgIAIsIAAAACQAAAAkVklET1JWMzAA8AC0AAwAAAAAAAoAAAEJkDAwICACLCAAAAAkAAAAJFZJRE9SVjMwAPAAtAAMAAAAAAAMAAABCZAwMCAgAiwgAAAAJAAAACRWSURPUlYzMADwALQADAAAAAAADAAAAQmQMDAgIAIsIAAAACQAAAAkVklET1JWMzAA8AC0AAwAAAAAAA8AAAEJkDAwICACLCAAAAAkAAAAJFZJRE9SVjMwAPAAtAAMAAAAAAAPAAABCZAwMCAgAiwgAAAAJAAAACRWSURPUlYzMADwALQADAAAAAAACgAAAQmQMDAgIAIsIAAAACQAAAAkVklET1JWMzAA8AC0AAwAAAAAAAoAAAEJkDAwICACLCAAAAAkAAAAJFZJRE9SVjMwAPAAtAAMAAAAAAAKAAABCZAwMCAgAiwg" a=RMFF 1.0 Flags:buffer;"ABIAAgAAAAAAAgACAAAAAgAAAAIAAAACAAAAAgAAAAIAAAACAAA=" a=ASMRuleBook:string;"#(($Bandwidth >= 21000) && ($OldPNMPlayer)),AverageBandwidth=21000,priority=9,PNMKeyframeRule=T;#(($Bandwidth >= 21000) && ($OldPNMPlayer)),AverageBandwidth=0,priority=5,PNMNonKeyframeRule=T;#(($Bandwidth < 21000) && ($OldPNMPlayer)),TimestampDelivery=T,DropByN=T,priority=9,PNMThinningRule=T;#($Bandwidth < 9399),TimestampDelivery=T,DropByN=T,priority=9;#($Bandwidth >= 9399) && ($Bandwidth < 13800),AverageBandwidth=9399,Priority=9;#($Bandwidth >= 9399) && ($Bandwidth < 13800),AverageBandwidth=0,Priority=5,OnDepend=\"4\";#($Bandwidth >= 13800) && ($Bandwidth < 21000),AverageBandwidth=13800,Priority=9;#($Bandwidth >= 13800) && ($Bandwidth < 21000),AverageBandwidth=0,Priority=5,OnDepend=\"6\";#($Bandwidth >= 21000) && ($Bandwidth < 44606),AverageBandwidt Read 903 extra bytes: h=21000,Priority=9;#($Bandwidth >= 21000) && ($Bandwidth < 44606),AverageBandwidth=0,Priority=5,OnDepend=\"8\";#($Bandwidth >= 44606) && ($Bandwidth < 64806),AverageBandwidth=44606,Priority=9;#($Bandwidth >= 
44606) && ($Bandwidth < 64806),AverageBandwidth=0,Priority=5,OnDepend=\"10\";#($Bandwidth >= 64806) && ($Bandwidth < 160699),AverageBandwidth=64806,Priority=9;#($Bandwidth >= 64806) && ($Bandwidth < 160699),AverageBandwidth=0,Priority=5,OnDepend=\"12\";#($Bandwidth >= 160699) && ($Bandwidth < 228899),AverageBandwidth=160699,Priority=9;#($Bandwidth >= 160699) && ($Bandwidth < 228899),AverageBandwidth=0,Priority=5,OnDepend=\"14\";#($Bandwidth >= 228899),AverageBandwidth=228899,Priority=9;#($Bandwidth >= 228899),AverageBandwidth=0,Priority=5,OnDepend=\"16\";" a=intrinsicDurationType:string;"intrinsicDurationContinuous" a=StreamName:string;"video/x-pn-multirate-realvideo logical stream" [00000298] live555 demuxer: real codec detected, using real-RTSP instead [00000298] live555 demuxer error: Nothing to play for rtsp://st21g1.services.att-idns.net/v1/774/2334/14/f8/14f8724c1a3f56462e24b0633f1789fa.rm?ts=1139089010&ttl=300&cs=C5AA4D6E6F862925D8BFC6BAD1D478DD846ED431 [00000296] main input error: no suitable access module for `rtsp://st21g1.services.att-idns.net/v1/774/2334/14/f8/14f8724c1a3f56462e24b0633f1789fa.rm?ts=1139089010&ttl=300&cs=C5AA4D6E6F862925D8BFC6BAD1D478DD846ED431' [00000287] main playlist: nothing to play -------------- next part -------------- A non-text attachment was scrubbed... Name: 00_dragula.smi Type: application/smil Size: 572 bytes Desc: not available Url : http://lists.live555.com/pipermail/live-devel/attachments/20070616/f6077d75/attachment-0001.bin From finlayson at live555.com Sat Jun 16 21:50:41 2007 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 16 Jun 2007 21:50:41 -0700 Subject: [Live-devel] Problem with RTSP stream playback in VLC In-Reply-To: References: Message-ID: >I'm using Ubuntu Feisty and installed VLC via Automatix. When I try >to play a rtsp streaming video it refuses to play. The terminal >output shows that it's using live555 and the error is "main input >error: no suitable access module for `rtsp:...". I would just like to >know if it's a limitation or an actual bug and if it's live555, VLC, >or the package maintainer I should report it to. The video link file >(Rob Zombie video) and terminal log is attached. The problem is that the stream that you are trying to play uses non-standard RealNetworks-specific audio/video codecs (and RTP payload formats) that VLC does not support. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From jhansonxi at gmail.com Sat Jun 16 22:29:28 2007 From: jhansonxi at gmail.com (Jeff Hanson) Date: Sun, 17 Jun 2007 01:29:28 -0400 Subject: [Live-devel] Problem with RTSP stream playback in VLC In-Reply-To: References: Message-ID: On 6/17/07, Ross Finlayson wrote: > The problem is that the stream that you are trying to play uses > non-standard RealNetworks-specific audio/video codecs (and RTP > payload formats) that VLC does not support. Thanks! From frahm.c at googlemail.com Sun Jun 17 09:11:29 2007 From: frahm.c at googlemail.com (Christian Frahm) Date: Sun, 17 Jun 2007 18:11:29 +0200 Subject: [Live-devel] Fwd: Sink taking too long to call doGetNextFrame - bad fDurationInMicroseconds? In-Reply-To: <8039fa140706160403m49a83de8s8bd9b69f39615d6a@mail.gmail.com> References: <8039fa140706160403m49a83de8s8bd9b69f39615d6a@mail.gmail.com> Message-ID: <8039fa140706170911mc52baf8m53dcf882e9ff1f3e@mail.gmail.com> Hello to everyone, Im currently developing a source class which reads from a USB DVB-T device. 
The aim is to select 1 audio and 1 video stream and stream them (in PES format) separately to another host in our network. To demultiplex the MPEG-TS stream, I have modified mplex and built it into my code. For testing purposes, I filter out only 1 video stream, connect my class to an MPEG1or2VideoDiscreteFramer and use an MPEG1or2VideoRTPSink to stream. Now - MPEG PES can be very large (up to 65535 bytes). So the first problem is: the MPEG1or2VideoRTPSink has fMaxSize set to 60000 bytes - so sometimes I get errors because it says I should make MPEG1or2VideoRTPSink's buffer larger. Can anyone tell me how I do that? However, the biggest problem is that (after adding some timestamps to my debugs) - *many times* the MPEG1or2VideoDiscreteFramer takes up to half a second to call my doGetNextFrame(). In the trace below, such a situation is shown. Everything seems to be working fine, then this happens. I can assure you that the PES packets are all well-formed. Would there be any reason for the MPEG1or2VideoDiscreteFramer/MPEG1or2VideoRTPSink to take so long in calling my function again? The system has more than enough resources. By the way, I always set fDurationInMicroseconds=0 and gettimeofday(&fPresentationTime, NULL); inside deliverFrame(). Is this correct? Does this influence the rate at which sinks ask for more data from a source? Thank you very much in advance, Christian Frahm ************** TRACE ************* doGetNextFrame() - 2007-06-16 12:18:03.985 PES Size:15008 End of doGetNextFrame() - 2007-06-16 12:18:03.985 deliverFrame() - 2007-06-16 12:18:03.985 ******* Copying PES to OutBuffer ************* Deliver Frame(): we have 15008 bytes to deliver. (fMaxSize = 60800. End of deliverFrame() - 2007-06-16 12:18:03.985 doGetNextFrame() - 2007-06-16 12:18:03.985 PES Size:34952 End of doGetNextFrame() - 2007-06-16 12:18:03.986 deliverFrame() - 2007-06-16 12:18:03.986 ******* Copying PES to OutBuffer ************* Deliver Frame(): we have 34952 bytes to deliver. (fMaxSize = 60800. End of deliverFrame() - 2007-06-16 12:18:03.986 ============= almost 400 ms!!!!!!!!!!!!!!!! ================== doGetNextFrame() - 2007-06-16 12:18:04.343 PES Size:15080 End of doGetNextFrame() - 2007-06-16 12:18:04.343 deliverFrame() - 2007-06-16 12:18:04.343 ******* Copying PES to OutBuffer ************* Deliver Frame(): we have 15080 bytes to deliver. (fMaxSize = 60800. End of deliverFrame() - 2007-06-16 12:18:04.344 -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070617/3966b3c7/attachment.html From finlayson at live555.com Sun Jun 17 13:30:29 2007 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 17 Jun 2007 13:30:29 -0700 Subject: [Live-devel] Fwd: Sink taking too long to call doGetNextFrame - bad fDurationInMicroseconds? In-Reply-To: <8039fa140706170911mc52baf8m53dcf882e9ff1f3e@mail.gmail.com> References: <8039fa140706160403m49a83de8s8bd9b69f39615d6a@mail.gmail.com> <8039fa140706170911mc52baf8m53dcf882e9ff1f3e@mail.gmail.com> Message-ID: >Now - MPEG PES can be very large (up to 65535 bytes). So the first >problem is: the MPEG1or2VideoRTPSink has fMaxSize set to 60000 bytes >- so sometimes I get errors because it says I should make >MPEG1or2VideoRTPSink's buffer larger. Can anyone tell me how I do that? Yes, add OutPacketBuffer::maxSize = 70000; to your code, before you create any "RTPSink"s (or create a "RTSPServer"). >By the way, I always set fDurationInMicroseconds=0 and >gettimeofday(&fPresentationTime, NULL); inside deliverFrame(). Is this >correct?
Using "gettimeofday()" to set "fPresentationTime" is correct. However, you should not set "fDurationInMicroseconds" to zero (otherwise the "RTPSink" object will ask for more data immediately after sending it). Instead, you should set "fDurationInMicroseconds" to the actual frame duration. > Does this influence the rate at which sinks ask for more data from a source? Yes, "fDurationInMicroseconds" determines the rate at which the "RTPSink" asks for more data from its source (see "MultiFramedRTPSink.cpp", line 307). -- Ross Finlayson Live Networks, Inc. http://www.live555.com/
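Taken together, the two answers above amount to something like the following sketch; "MyPESSource" and the 25 fps figure are illustrative assumptions, not code from the thread:

  #include "FramedSource.hh"
  #include <sys/time.h>

  // Elsewhere, once, before any RTPSink (or RTSPServer) is created:
  //   OutPacketBuffer::maxSize = 70000;

  class MyPESSource: public FramedSource { // hypothetical source class
  protected:
    MyPESSource(UsageEnvironment& env): FramedSource(env) {}
    virtual void doGetNextFrame() { deliverFrame(); }
  private:
    void deliverFrame() {
      // ... copy the next frame into fTo (at most fMaxSize bytes), setting
      // fFrameSize and, if the frame had to be clipped, fNumTruncatedBytes ...
      gettimeofday(&fPresentationTime, NULL); // wall-clock presentation time
      fDurationInMicroseconds = 1000000/25;   // the real frame duration (assumed
                                              // 25 fps), not 0, so the sink
                                              // paces its next request
      FramedSource::afterGetting(this);       // hand the completed frame to the sink
    }
  };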
From david.rignac at free.fr Mon Jun 18 03:02:49 2007 From: david.rignac at free.fr (david.rignac at free.fr) Date: Mon, 18 Jun 2007 12:02:49 +0200 Subject: [Live-devel] Performance of the RTSP server/client Message-ID: <1182160969.46765849dcfe0@imp.free.fr> First of all, congratulations on Live555. But I am not getting the performance that I had hoped for. I am developing an embedded application running under Linux 2.6.19 with a processor at 260MHz (with 128MB DDR2 memory), and I would like to use live555MediaServer as the RTSP server and multiple openRTSP instances as RTSP clients. (I have modified playCommon.cpp to create the output file with a name "dd-mm-yy-video-MP2T-1".) The RTSP server and the RTSP clients are running on the same processor (on the board). The first step is to start the live555MediaServer and start only 1 openRTSP application ("./openRTSP rtsp://ip-address/file1.ts"). The result is good: a complete video output file and less than 2% of CPU used during the streaming and recording. The second step is to start 8 openRTSP applications. The result is bad: 8 incomplete video output files and 100% of CPU used. Is this normal? What is the origin of the problem: live555 or the hardware configuration? What is the minimal configuration needed to stream 8 video files? Thanks From jnitin at ssdi.sharp.co.in Mon Jun 18 04:11:24 2007 From: jnitin at ssdi.sharp.co.in (Nitin Jain) Date: Mon, 18 Jun 2007 16:41:24 +0530 Subject: [Live-devel] Speed Header Implementation in media Server Message-ID: <9219A756FBA09B4D8CE75D4D7987D6CB5631FF@KABEX01.sharpamericas.com> >Almost no one implements the "Speed:" header, and in fact, because of >this, the IETF is considering removing it from future revisions of >the RTSP standard. >Do you understand that the "Speed:" header is *not* intended to be >used for 'fast forward' play? Fast forward play is implemented using >the "Scale:" header (which our server already supports, for >some media types). I understand that the fast-forward feature is implemented using the Scale header in the LIVE555 RTSP server; however, I want to implement server transmission-rate adaptation using the Speed header. How can I do that in the LIVE555 RTSP server? Please provide your inputs. Regards Nitin From marete at edgenet.co.ke Mon Jun 18 05:09:17 2007 From: marete at edgenet.co.ke (Brian Gitonga Marete) Date: Mon, 18 Jun 2007 15:09:17 +0300 Subject: [Live-devel] (live555MediaServer) Software Client for RTSP Trick Play mode Message-ID: <1182168557.1791.6.camel@delta.local> Hello all, Could anyone point me to a (preferably free/freeware) graphical RTSP-capable client that can take advantage of live555MediaServer's trick play capabilities in their current state? (I believe that currently this requires indexed .ts files, and those must contain MPEG-2 video.) Are the latest (SVN) versions of VLC able to do this? Thanks. Brian G. Marete. From clem.taylor at gmail.com Mon Jun 18 12:17:47 2007 From: clem.taylor at gmail.com (Clem Taylor) Date: Mon, 18 Jun 2007 15:17:47 -0400 Subject: [Live-devel] calling a function each time a new user joins a multicast stream? Message-ID: Hi, I'm adding some code to insert an IVOP in a live MPEG4 stream each time a new user joins the stream. In the unicast case I just redefined startStream in my OnDemandServerMediaSubsession subclass and sent a message to the encoder to insert an IVOP. This works great. However, the OnDemandServerMediaSubsession class is not used in the multicast case. I'm not sure of the right place to insert my code. Should I just subclass PassiveServerMediaSubsession and redefine startStream, or is there another preferred place to add some code when a new user joins a multicast stream? This may be a stupid question, so feel free to laugh... Thanks, Clem From finlayson at live555.com Mon Jun 18 14:25:33 2007 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 18 Jun 2007 14:25:33 -0700 Subject: [Live-devel] (live555MediaServer) Software Client for RTSP Trick Play mode In-Reply-To: <1182168557.1791.6.camel@delta.local> References: <1182168557.1791.6.camel@delta.local> Message-ID: >Could anyone point me to a (preferably free/freeware) graphical >RTSP-capable client that can take advantage of live555MediaServer's >trick play capabilities in their current state? (I believe that >currently this requires indexed .ts files, and those must contain >MPEG-2 video.) Are the latest (SVN) versions of VLC able to do this? No - not yet. However, the VLC developers are currently working on supporting this in a future release. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Mon Jun 18 14:27:39 2007 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 18 Jun 2007 14:27:39 -0700 Subject: [Live-devel] calling a function each time a new user joins a multicast stream? In-Reply-To: References: Message-ID: >I'm adding some code to insert an IVOP in a live MPEG4 stream each >time a new user joins the stream. In the unicast case I just redefined >startStream in my OnDemandServerMediaSubsession subclass and sent a >message to the encoder to insert an IVOP. This works great. However, >the OnDemandServerMediaSubsession class is not used in the multicast >case. I'm not sure of the right place to insert my code.
>Should I just subclass PassiveServerMediaSubsession and redefine startStream? Yes. "PassiveServerMediaSubsession" is used when streaming a multicast stream using RTSP. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From frahm.c at googlemail.com Tue Jun 19 06:22:34 2007 From: frahm.c at googlemail.com (Christian Frahm) Date: Tue, 19 Jun 2007 15:22:34 +0200 Subject: [Live-devel] PES packet sizes Message-ID: <8039fa140706190622y74eb581cga3ba825119910b13@mail.gmail.com> Hi Everyone, I am having a few problems while passing PES stripped from DVB-T MPEG-TS to the MPEG demux class. PES coming from DVB-T are indeed very peculiar. First of all, they can be very big. Some channels had PESs up to 150KB... so the first question: is this correct? Can a PES really be unbounded? The problem is - the MPEG1or2Demux class reads the size of a PES packet from the header. So packets that are larger than 65536 bytes will not be processed (they usually have length = 0). Right? Can I simply chop a PES (let's say before a GOP) and pass that to the demux? Will it get mixed up? Or am I missing something here... Secondly, since our aim is to stream the ES... how can I strip the ES from the PES stream? Lastly - MPEG1or2Demux does not output any audio PES. I notice that the PTS and DTS are null. I am using FramedSource* audioPES = mpegDemux->newAudioStream(); as the source. Will it accept PES? I ask because I had the same problem with video - which I solved after creating a FramedSource via newRawPESStream(). ******* Copying PES to OutBuffer ************* PES Size:5888 Type: Audio Deliver Frame(): we have 5888 bytes to deliver. (fMaxSize = 8688. Hacking PES length - Bytes in Buffer: 5888 PES starts with start of buffer. PES size was 16FA and is now 170 (5888)parsing pack header found packet start code 0x1c0 instead of pack header parsing PES packet saw packet_start_code_prefix 13, saw audio stream: 0x00 PES_packet_length: 5888 audio stream, packet presentation_time_stamp: 0x000000000 packet decoding_time_stamp: 0x000000000 -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070619/2dfece31/attachment.html From julian.lamberty at mytum.de Tue Jun 19 07:13:54 2007 From: julian.lamberty at mytum.de (Julian Lamberty) Date: Tue, 19 Jun 2007 16:13:54 +0200 Subject: [Live-devel] PES packet sizes In-Reply-To: <8039fa140706190622y74eb581cga3ba825119910b13@mail.gmail.com> References: <8039fa140706190622y74eb581cga3ba825119910b13@mail.gmail.com> Message-ID: <4677E4A2.1010401@mytum.de> Hi Christian ;) > I am having a few problems while passing PES stripped from DVB-T > MPEG-TS to the MPEG demux class. > > PES coming from DVB-T are indeed very peculiar. First of all, they can > be very big. Some channels had PESs up to 150KB... so the first question: > is this correct? Can a PES really be unbounded? Yes, PES packets can be arbitrarily long in TSs. > Secondly, since our aim is to stream the ES... how can I strip the ES > from the PES stream? Just strip off the PES headers; after the header, a PES packet just contains concatenated bytes from an ES. See you tomorrow! Julian -------------- next part -------------- A non-text attachment was scrubbed... Name: smime.p7s Type: application/x-pkcs7-signature Size: 5198 bytes Desc: S/MIME Cryptographic Signature Url : http://lists.live555.com/pipermail/live-devel/attachments/20070619/2e35e683/attachment.bin
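To make "strip off the PES headers" concrete, here is a hedged sketch of locating the ES payload within one PES packet; the function name is made up, it assumes a video/audio stream_id whose packets carry the optional PES header fields (the common case for DVB), and real code should also handle PES_packet_length == 0 and locate start codes itself:

  // Sketch: return a pointer to the ES bytes inside one PES packet
  // (ISO/IEC 13818-1). 'pes' must point at the 00 00 01 start-code prefix.
  unsigned char const* esPayload(unsigned char const* pes, unsigned pesSize,
                                 unsigned& esSize) {
    if (pesSize < 9 || pes[0] != 0x00 || pes[1] != 0x00 || pes[2] != 0x01)
      return NULL;                           // not a well-formed PES packet
    unsigned hdrDataLen = pes[8];            // PES_header_data_length
    unsigned payloadOffset = 9 + hdrDataLen; // 9 fixed bytes + PTS/DTS/stuffing
    if (payloadOffset >= pesSize) return NULL;
    esSize = pesSize - payloadOffset;
    return pes + payloadOffset;              // concatenating these yields the ES
  }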
From finlayson at live555.com Tue Jun 19 07:32:42 2007 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 19 Jun 2007 07:32:42 -0700 Subject: [Live-devel] PES packet sizes In-Reply-To: <8039fa140706190622y74eb581cga3ba825119910b13@mail.gmail.com> References: <8039fa140706190622y74eb581cga3ba825119910b13@mail.gmail.com> Message-ID: >I am having a few problems while passing PES stripped from DVB-T >MPEG-TS to the MPEG demux class. You can't do this. The "MPEG1or2Demux" class is used for parsing MPEG Program Stream data. >Secondly, since our aim is to stream the ES... how can I strip the >ES from the PES stream? You will need to write your own filter class that parses PES packet data (only), to produce ES data. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From rjbrennn at gmail.com Tue Jun 19 14:10:26 2007 From: rjbrennn at gmail.com (Russell Brennan) Date: Tue, 19 Jun 2007 17:10:26 -0400 Subject: [Live-devel] Buffered MPEG2-TS to RTP how-to Message-ID: Hello, I am trying to take a buffer of MPEG2-TS data which is constantly written to, and hook it up to live555 for output via RTP. I am currently basing my work on the testMPEG2TransportStreamer code, but obviously this was designed to stream a file. My current idea is not really panning out, due to the fact that I am apparently confused... ByteStreamFileSource is used in the test program, but it seems that I will need to use a different type of source. Is there an existing MediaSource that will work, or will I need to create my own? Also, MPEG2TransportStreamFramer is used as the video source for videoSink->startPlaying. This would seem to still be valid for what I am doing, correct? Lastly, doEventLoop seems to continuously call SingleStep(), and from what I can make of this function, it seems that in some way this is where the data to be sent out is packaged and sent. Assuming that I get acknowledgement each time my buffer is ready to be sent out, can I simply call SingleStep() each time I have data to send? I hope I was clear enough... Thanks in advance! -- Russell Brennan -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070619/38257e91/attachment.html From finlayson at live555.com Tue Jun 19 14:47:33 2007 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 19 Jun 2007 14:47:33 -0700 Subject: [Live-devel] Buffered MPEG2-TS to RTP how-to In-Reply-To: References: Message-ID: >I am trying to take a buffer of MPEG2-TS data which is constantly >written to, and hook it up to live555 for output via RTP. The best way to do this is to make your buffer an OS pipe, and then just have your LIVE555-based application read from this. For example, if you write your MPEG-2 TS data to standard output (in one process), and then pipe this to your LIVE555-based application (in another process), that reads from "stdin" (see below). If you do this, then you won't have to modify any existing code (except the name of the input file in "testMPEG2TransportStreamer"). > I am currently basing my work on the testMPEG2TransportStreamer >code, but obviously this was designed to stream a file. Yes - however, the special file name "stdin" can be used to read from standard input. (I'm assuming that you're running a Unix system (including Linux). If instead you're running Windows, then I don't know what you'd do (although I think Windows has 'named pipes').) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/
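A compilable sketch of what that can look like in a single process, modeled on "testMPEG2TransportStreamer": the multicast address, port, and the producer are placeholder assumptions, and it relies on "ByteStreamFileSource" having the "createNew()" variant that accepts an open "FILE*" (present in recent library versions). Depending on your library version, reads may block until the producer writes data, and RTCP is omitted for brevity:

  #include <unistd.h>
  #include <cstdio>
  #include "liveMedia.hh"
  #include "BasicUsageEnvironment.hh"
  #include "GroupsockHelper.hh"

  int main() {
    int pfd[2];
    if (pipe(pfd) < 0) { perror("pipe"); return 1; }
    // ... a producer (thread or forked process) writes 188-byte TS packets to pfd[1] ...

    TaskScheduler* scheduler = BasicTaskScheduler::createNew();
    UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

    FILE* fid = fdopen(pfd[0], "rb"); // wrap the pipe's read end as a FILE*
    FramedSource* input = ByteStreamFileSource::createNew(*env, fid);
    FramedSource* framer = MPEG2TransportStreamFramer::createNew(*env, input);

    struct in_addr destAddr;
    destAddr.s_addr = our_inet_addr("239.255.42.42"); // placeholder destination
    const Port rtpPort(1234);                          // placeholder port
    Groupsock rtpGroupsock(*env, destAddr, rtpPort, 7 /*ttl*/);

    // 33 = the standard RTP payload type for an MPEG-2 transport stream
    RTPSink* videoSink = SimpleRTPSink::createNew(*env, &rtpGroupsock, 33, 90000,
                                                  "video", "MP2T", 1, True,
                                                  False /*no 'M' bit*/);
    videoSink->startPlaying(*framer, NULL, NULL);
    env->taskScheduler().doEventLoop(); // does not return
    return 0;
  }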
From xcsmith at rockwellcollins.com Tue Jun 19 15:32:09 2007 From: xcsmith at rockwellcollins.com (xcsmith at rockwellcollins.com) Date: Tue, 19 Jun 2007 17:32:09 -0500 Subject: [Live-devel] Trick Mode FPS question Message-ID: I notice that my trick mode streams seem to have the same average data rate regardless of the specified scale factor. Does the trick mode implementation send an I-frame at a rate of 30 fps, but repeat I-frames to fill in the gaps, since there are only 2 I-frames / second in my stream? So for example, with 2x playback, 4 different I-frames are sent per second, but 30 frames are sent in total every second? Thanks! Xochitl -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070619/4f60b00b/attachment.html From pushkal.mishra at gmail.com Tue Jun 19 16:17:00 2007 From: pushkal.mishra at gmail.com (Pushkal Mishra) Date: Tue, 19 Jun 2007 19:17:00 -0400 Subject: [Live-devel] Changing media directory for vod server In-Reply-To: References: <3cc3561f0704160355i680fc9f4x4a306d12bd0ae94d@mail.gmail.com> <3cc3561f0704160801x59d7db11n809ca771ec3b060a@mail.gmail.com> Message-ID: <467863ec.0e0c360a.28f0.022f@mx.google.com> Hi, Currently, our live555 vod server is playing TS assets out of the current directory. Instead, can we pull vod assets from any particular directory? If so, can you please point us to the file where we can make the change? I plan to use a getenv() call to get our vod library path, but I am not sure where to make the code change yet. Thank you very much. Regards, Pushkal From finlayson at live555.com Tue Jun 19 16:46:44 2007 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 19 Jun 2007 16:46:44 -0700 Subject: [Live-devel] Changing media directory for vod server In-Reply-To: <467863ec.0e0c360a.28f0.022f@mx.google.com> References: <3cc3561f0704160355i680fc9f4x4a306d12bd0ae94d@mail.gmail.com> <3cc3561f0704160801x59d7db11n809ca771ec3b060a@mail.gmail.com> <467863ec.0e0c360a.28f0.022f@mx.google.com> Message-ID: >Hi, > >Currently, our live555 vod server is playing TS assets out of the current >directory. Instead, can we pull vod assets from any particular directory? If >so, can you please point us to the file where we can make the change? I plan >to use a getenv() call to get our vod library path, but I am not sure where to >make the code change yet. You could do this in the implementation of "DynamicRTSPServer::lookupServerMediaSession()": 1/ Prepend the new directory name to "streamName", before calling "fopen()", and 2/ Prepend the new directory name to "streamName", before calling "createNewSMS()". -- Ross Finlayson Live Networks, Inc. http://www.live555.com/
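A sketch of that change, for concreteness; "MEDIA_DIR" is an invented environment variable, the snippet goes inside "DynamicRTSPServer::lookupServerMediaSession()" in "mediaServer/DynamicRTSPServer.cpp", and the file needs <cstdio> and <cstdlib>. Building the path into a fresh buffer each time (rather than strcat()'ing onto a shared one) avoids accidentally appending the stream name twice:

  // Inside DynamicRTSPServer::lookupServerMediaSession(char const* streamName):
  char const* mediaDir = getenv("MEDIA_DIR");    // assumed env var, e.g. "/var/media"
  if (mediaDir == NULL) mediaDir = ".";
  char path[1024];
  snprintf(path, sizeof path, "%s/%s", mediaDir, streamName);

  FILE* fid = fopen(path, "rb");                 // 1/ the prefixed name goes here
  // ... the existing code that checks "fid" and looks up the session ...
  ServerMediaSession* sms = createNewSMS(envir(), path, fid); // 2/ ... and here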
From pushkal.mishra at gmail.com Tue Jun 19 17:24:25 2007 From: pushkal.mishra at gmail.com (Pushkal Mishra) Date: Tue, 19 Jun 2007 20:24:25 -0400 Subject: [Live-devel] Changing media directory for vod server In-Reply-To: References: <3cc3561f0704160355i680fc9f4x4a306d12bd0ae94d@mail.gmail.com> <3cc3561f0704160801x59d7db11n809ca771ec3b060a@mail.gmail.com> <467863ec.0e0c360a.28f0.022f@mx.google.com> Message-ID: <467873b9.093a360a.6d3e.204a@mx.google.com> Hey Ross, thanks for the quick help. I made the changes. I can see the server trying the right file, but the client doesn't see any video stream. I will dig some more and see what I am missing.. FILE* fid = fopen(strcat(str, streamName), "rb"); ..... sms = createNewSMS(envir(), strcat(str, streamName), fid); -----Original Message----- From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Tuesday, June 19, 2007 7:47 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Changing media directory for vod server >Hi, > >Currently, our live555 vod server is playing TS assets out of the current >directory. Instead, can we pull vod assets from any particular directory? If >so, can you please point us to the file where we can make the change? I plan >to use a getenv() call to get our vod library path, but I am not sure where to >make the code change yet. You could do this in the implementation of "DynamicRTSPServer::lookupServerMediaSession()": 1/ Prepend the new directory name to "streamName", before calling "fopen()", and 2/ Prepend the new directory name to "streamName", before calling "createNewSMS()". -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel From finlayson at live555.com Tue Jun 19 18:59:03 2007 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 19 Jun 2007 18:59:03 -0700 Subject: [Live-devel] Trick Mode FPS question In-Reply-To: References: Message-ID: >I notice that my trick mode streams seem to have the same average >data rate regardless of the specified scale factor. Does the trick >mode implementation send an I-frame at a rate of 30 fps, but repeat >I-frames to fill in the gaps, since there are only 2 I-frames / >second in my stream? > >So for example, with 2x playback, 4 different I-frames are sent per >second, but 30 frames are sent in total every second? Yes. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070619/ed5e9f06/attachment.html From austinf at cetoncorp.com Wed Jun 20 00:31:51 2007 From: austinf at cetoncorp.com (Austin Foxley) Date: Wed, 20 Jun 2007 00:31:51 -0700 Subject: [Live-devel] rfc2326 Message-ID: <4678D7E7.6070707@cetoncorp.com> A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 251 bytes Desc: OpenPGP digital signature Url : http://lists.live555.com/pipermail/live-devel/attachments/20070620/c70c6e0a/attachment.bin From finlayson at live555.com Wed Jun 20 00:37:50 2007 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 20 Jun 2007 00:37:50 -0700 Subject: [Live-devel] rfc2326 In-Reply-To: <4678D7E7.6070707@cetoncorp.com> References: <4678D7E7.6070707@cetoncorp.com> Message-ID: Do you have a specific question about the "LIVE555 Streaming Media" software?? -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From austinf at cetoncorp.com Wed Jun 20 01:08:33 2007 From: austinf at cetoncorp.com (Austin Foxley) Date: Wed, 20 Jun 2007 01:08:33 -0700 Subject: [Live-devel] rfc2326 In-Reply-To: References: <4678D7E7.6070707@cetoncorp.com> Message-ID: <4678E081.4020007@cetoncorp.com> Ross Finlayson wrote: > Do you have a specific question about the "LIVE555 Streaming Media" software?? Not really... I was trying to use it, but found that its concept of a session is out of spec with RFC 2326, in that it depends on the state of a particular TCP connection.
I'm working on changing it to suit my needs... I will post a diff for anyone interested, when I'm done. -- Austin Foxley -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 251 bytes Desc: OpenPGP digital signature Url : http://lists.live555.com/pipermail/live-devel/attachments/20070620/190362f1/attachment.bin From rjbrennn at gmail.com Wed Jun 20 06:54:44 2007 From: rjbrennn at gmail.com (Russell Brennan) Date: Wed, 20 Jun 2007 09:54:44 -0400 Subject: [Live-devel] Buffered MPEG2-TS to RTP how-to In-Reply-To: References: Message-ID: Yes, I'm on RedHat 5... Piping it sounds like a great idea to me; I'm going to try to put something together and I'll post some code snippets for future reference when I get things running smoothly. Thanks, Russell On 6/19/07, Ross Finlayson wrote: > > >I am trying to take a buffer of MPEG2-TS data which is constantly > >written to, and hook it up to live555 for output via RTP. > > The best way to do this is to make your buffer an OS pipe, and then > just have your LIVE555-based application read from this. For > example, if you write your MPEG-2 TS data to standard output (in one > process), and then pipe this to your LIVE555-based application (in > another process), that reads from "stdin" (see below). > > If you do this, then you won't have to modify any existing code > (except the name of the input file in "testMPEG2TransportStreamer"). > > > I am currently basing my work on the testMPEG2TransportStreamer > >code, but obviously this was designed to stream a file. > > Yes - however, the special file name "stdin" can be used to read from > standard input. > > (I'm assuming that you're running a Unix system (including Linux). > If instead you're running Windows, then I don't know what you'd do > (although I think Windows has 'named pipes').) > -- > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > -- Russell Brennan RJBrennn at gmail.com (708) 699-7314 -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070620/b9303725/attachment.html From lroels at hotmail.com Wed Jun 20 07:36:00 2007 From: lroels at hotmail.com (Luc Roels) Date: Wed, 20 Jun 2007 14:36:00 +0000 Subject: [Live-devel] Adding a codec Message-ID: Hi, Suppose a video server has been created that streams a non-compliant H.264 stream (a so-called modified H.264 that's absolutely not compatible with the H.264 bitstream, no NALs, etc...). I have to create a client application that can connect to such a server using RTSP, receive the modified H.264 video stream, decode it and render it on the screen. How do I go about this? Since the MediaSubsession::initiate() function will not recognize the codec, I suppose I will have to change this function to create a 'ModifiedH264VideoRTPSource' and then create a ModifiedH264VideoRTPSource class derived from MultiFramedRTPSource. Am I correct? Is this the only thing that needs to be done? Bonus question. If the doNormalMBitRule is active, what happens if, let's say, an RTP packet belonging to a frame gets lost? Do you get a partial frame or none at all? How can you detect that a frame or part of a frame is lost?
best regards, Luc Roels -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070620/14d7b5c9/attachment.html From finlayson at live555.com Wed Jun 20 11:05:34 2007 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 20 Jun 2007 11:05:34 -0700 Subject: [Live-devel] Adding a codec In-Reply-To: References: Message-ID: >Suppose a video server has been created that streams a non-compliant >H.264 stream (a so-called modified H.264 that's absolutely not >compatible with the H.264 bitstream, no NALs, etc...). > >I have to create a client application that can connect to such >a server using RTSP, receive the modified H.264 video stream, decode >it and render it on the screen. To do this properly, you should also define a new RTP payload format (for your new bitstream format), and present it (as an Internet Draft) to the IETF for standardization. > How do I go about this? Since the MediaSubsession::initiate() >function will not recognize the codec, I suppose I will have to >change this function to create a 'ModifiedH264VideoRTPSource' and >then create a ModifiedH264VideoRTPSource class derived from >MultiFramedRTPSource. Am I correct? Is this the only thing that >needs to be done? No, you will also need to use a different RTP media type - i.e., *not* "video/H264" - so that receivers will know that they're not getting the standard H.264 RTP payload format. > >Bonus question. If the doNormalMBitRule is active, what happens if, >let's say, an RTP packet belonging to a frame gets lost? Do you get >a partial frame or none at all? None at all. > How can you detect that a frame or part of a frame is lost? At a higher level - e.g., by inspecting the presentation times of the frames that you *do* receive. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/
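A sketch of what "inspecting the presentation times" can look like on the receiving side, inside the callback passed to "FramedSource::getNextFrame()"; the 40 ms expected interval is an assumption for constant 25 fps video, not something the reply specifies:

  #include <sys/time.h>

  // Called for each received frame (the afterGettingFunc signature used by
  // FramedSource::getNextFrame()).
  void afterGettingFrame(void* /*clientData*/, unsigned /*frameSize*/,
                         unsigned /*numTruncatedBytes*/,
                         struct timeval presentationTime,
                         unsigned /*durationInMicroseconds*/) {
    static struct timeval prev = {0, 0};
    long const expectedUs = 40000; // assumed frame interval at 25 fps
    if (prev.tv_sec != 0 || prev.tv_usec != 0) {
      long gapUs = (presentationTime.tv_sec - prev.tv_sec) * 1000000L
                 + (presentationTime.tv_usec - prev.tv_usec);
      if (gapUs > expectedUs * 3 / 2) {
        // one or more whole frames were lost upstream; resynchronize here
      }
    }
    prev = presentationTime;
    // ... consume the frame, then call getNextFrame() again ...
  }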
> > > I am currently basing my work on the testMPEG2TransportStreamer > > >code, but obviously this was designed to stream a file. > > > > Yes - however, the special file name "stdin" can be used to read from > > standard input. > > > > (I'm assuming that you're running a Unix system (including Linux). > > If instead you're running Windows, then I don't know what you'd do > > (although I think Windows has 'named pipes').) > > -- > > > > Ross Finlayson > > Live Networks, Inc. > > http://www.live555.com/ > > _______________________________________________ > > live-devel mailing list > > live-devel at lists.live555.com > > http://lists.live555.com/mailman/listinfo/live-devel > > -- Russell Brennan RJBrennn at gmail.com (708) 699-7314
-------------- next part --------------
An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070620/66c5a8e0/attachment-0001.html

From matthew.trim at eds.com Thu Jun 21 01:20:44 2007
From: matthew.trim at eds.com (Trim, Matthew L)
Date: Thu, 21 Jun 2007 18:20:44 +1000
Subject: [Live-devel] wis-streamer with external sound card?
Message-ID: <5963F1C68F53754894E6168A6D6B6AD401BA50E2@aubwm231.apac.corp.eds.com>

Hi, I just discovered wis-streamer and am wondering if there is an option to use an external sound card? I want to feed a digital AES signal in, and obviously the hardware capture devices only have RCA analogue inputs. Cheers, Matt

From gerrit at erg.abdn.ac.uk Thu Jun 21 04:38:02 2007
From: gerrit at erg.abdn.ac.uk (Gerrit Renker)
Date: Thu, 21 Jun 2007 12:38:02 +0100
Subject: [Live-devel] The IPv6 question again
Message-ID: <200706211238.02299@strip-the-willow>

Browsing through the archives showed that the question has been raised several times in the past: is there a chance for RTSP over TCP/IPv6 in live555? I wonder if someone is already working on it, or if patches exist / are acceptable. We need RTSPv6 support for a research project and may be able to put in some work, but unfortunately no funding. Gerrit

From finlayson at live555.com Thu Jun 21 07:43:12 2007
From: finlayson at live555.com (Ross Finlayson)
Date: Thu, 21 Jun 2007 07:43:12 -0700
Subject: [Live-devel] wis-streamer with external sound card?
In-Reply-To: <5963F1C68F53754894E6168A6D6B6AD401BA50E2@aubwm231.apac.corp.eds.com> References: <5963F1C68F53754894E6168A6D6B6AD401BA50E2@aubwm231.apac.corp.eds.com> Message-ID:

>I just discovered wis-streamer and am wondering if there is an option to >use an external Sound Card?

By default, the code uses the PCM audio input associated with the Linux WIS GO7007 driver. However, you could change this in the file "WISInput.cpp".
-- Ross Finlayson Live Networks, Inc. http://www.live555.com/

From MHosseini at newheights.com Thu Jun 21 11:44:57 2007
From: MHosseini at newheights.com (Mojtaba Hosseini)
Date: Thu, 21 Jun 2007 12:44:57 -0600
Subject: [Live-devel] H264 RTP Streaming: A Tutorial
Message-ID:

Hello, For the past week or so I've been working on getting H264 RTP streaming working with the live555Media libraries. From what I can see in the mailing list archives, many others have done the same and will probably want to do so in the future. So once I got the *basic* thing working, I wrote up some documentation in the form of a 'tutorial' and included some diagrams that helped me understand what was happening. I have also included code samples that may help others by way of providing a basic starting point.
The tutorial documentation and sample code are here: http://www.white.ca/patrick/tutorial.tar.gz

For those that are working in the same area, your feedback would be much appreciated. For those with better familiarity with live555Media than me, correcting the mistakes in the documentation and suggesting how things could be done better would really help. Thank you, and I look forward to your comments. Mojtaba Hosseini
-------------- next part --------------
An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070621/7e4519ef/attachment.html

From dweber at robotics.net Thu Jun 21 13:16:56 2007
From: dweber at robotics.net (Dan Weber)
Date: Thu, 21 Jun 2007 16:16:56 -0400
Subject: [Live-devel] Status of multiple payload types in liveMedia
Message-ID: <20070621201656.GA12772@Barney.robotics.net>

Hey Ross, I asked you some time ago when you were implementing multiple payload type support in liveMedia. What's the current status? Dan

From rjbrennn at gmail.com Thu Jun 21 14:53:02 2007
From: rjbrennn at gmail.com (Russell Brennan)
Date: Thu, 21 Jun 2007 17:53:02 -0400
Subject: [Live-devel] Buffered MPEG2-TS to RTP how-to
In-Reply-To: References: Message-ID:

Ok, after some hours of tinkering, this (non-working) bit of code is what I have come up with:

*********************
int pipeFile[2]; // Pipe file descriptors
FILE* fdFile;
....
if (pipe(pipeFile) == -1) // create the pipe
{
  perror("pipe");
  exit(1);
}
if ((fdFile = fdopen(pipeFile[0], "r")) == NULL) // open the read end of our pipe as a FILE
{
  perror("fdopen");
  exit(1);
}
...
while (1)
{
  // Grab to our unix pipe
  m_grabx(hin_, &cfInBuf_[0], ngot_); // this fills the buffer cfInBuf_
  if (ngot_ > 0)
  {
    // Send out the packet to Unix pipe
    write(pipeFile[1], &cfInBuf_[0], ngot_*sizeof(real_4)); // write to our pipe
  }
}

... and then,

ByteStreamFileSource* fileSource = ByteStreamFileSource::createNew(*env, fdFile, deleteFidOnClose, inputDataChunkSize);

and so on. However, I am getting the error

BasicTaskScheduler::SingleStep(): select() fails: Bad file descriptor

Does anything look awry offhand? I would have liked to just pass createNew my pipe file descriptor, but I had to convert to the FILE struct to fit the createNew prototype. Thanks!

Russell

On 6/20/07, Russell Brennan wrote: > > Well, I have been experimenting with piping and I can get file descriptors > for pipes using the pipe() call, but it seems that I need a FILE struct > rather than a file descriptor. Any ideas how to convert one to the other? > > Russell > > > On 6/20/07, Russell Brennan wrote: > > > > Yes, I'm on RedHat 5... Piping it sounds like a great idea to me, I'm > > going to try to put something together and I'll post some code snippets for > > future reference when I get things running smoothly. Thanks, > > > > Russell > > > > > > On 6/19/07, Ross Finlayson wrote: > > > > > > >I am trying to take a buffer of MPEG2-TS data which is constantly > > > >written to, and hook it up to live555 for output via RTP. > > > > > > The best way to do this is to make your buffer an OS pipe, and then > > > just have your LIVE555-based application read from this. For > > > example, if you write your MPEG-2 TS data to standard output (in one > > > process), and then pipe this to your LIVE555-based application (in > > > another process), that reads from "stdin" (see below). > > > > > > If you do this, then you won't have to modify any existing code > > > (except the name of the input file in "testMPEG2TransportStreamer").
> > > I am currently basing my work on the testMPEG2TransportStreamer > > >code, but obviously this was designed to stream a file. > > > > Yes - however, the special file name "stdin" can be used to read from > > > standard input. > > > > (I'm assuming that you're running a Unix system (including Linux). > > > If instead you're running Windows, then I don't know what you'd do > > > (although I think Windows has 'named pipes').) > > > -- > > > > Ross Finlayson > > > Live Networks, Inc. > > > http://www.live555.com/ > > > _______________________________________________ > > > live-devel mailing list > > > live-devel at lists.live555.com > > > http://lists.live555.com/mailman/listinfo/live-devel > > > > -- > > Russell Brennan > > RJBrennn at gmail.com > > (708) 699-7314 > > -- > Russell Brennan > RJBrennn at gmail.com > (708) 699-7314 -- Russell Brennan RJBrennn at gmail.com (708) 699-7314
-------------- next part --------------
An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070621/38e273b8/attachment.html

From finlayson at live555.com Thu Jun 21 15:02:02 2007
From: finlayson at live555.com (Ross Finlayson)
Date: Thu, 21 Jun 2007 15:02:02 -0700
Subject: [Live-devel] Buffered MPEG2-TS to RTP how-to
In-Reply-To: References: Message-ID:

>Ok, after some hours of tinkering, this (non-working) bit of code

I think you're misunderstanding what I have in mind. You should not have to change *any* existing LIVE555 code (except to change "test.ts" to "stdin" in "testMPEG2TransportStreamer.cpp"). Instead, just run your MPEG-2 TS grabbing code in a separate process (that writes to stdout), and just pipe it to "testMPEG2TransportStreamer" using the command line - i.e.,

yourGrabberApplication | testMPEG2TransportStreamer

Alternatively, if your Transport Stream source is available as a named file (e.g., device), then just run

testMPEG2TransportStreamer < yourTransportStreamDeviceFileName

(It baffles me that so many Unix programmers these days seem to know so little about pipes, filters, and stdio.)
-- Ross Finlayson Live Networks, Inc. http://www.live555.com/

From lroels at hotmail.com Fri Jun 22 02:33:21 2007
From: lroels at hotmail.com (Luc Roels)
Date: Fri, 22 Jun 2007 09:33:21 +0000
Subject: [Live-devel] (no subject)
Message-ID:

In the FAQ, I read that to implement live streaming you have to create your own FramedSource subclass to encapsulate your input source. In essence, in the frame-delivery routine you have to copy your frame data to fTo. What I don't understand is where the frame memory (pointed to by the fTo pointer) is allocated and where fMaxSize is set? Can you shed some light on this?
_________________________________________________________________
Discover Windows Live Hotmail, the ultimate online mail program! http://get.live.com/mail/overview
-------------- next part --------------
An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070622/484698fc/attachment-0001.html

From tl11305 at salle.url.edu Fri Jun 22 02:53:15 2007
From: tl11305 at salle.url.edu (Ramon Martin de Pozuelo Genis)
Date: Fri, 22 Jun 2007 11:53:15 +0200 (CEST)
Subject: [Live-devel] Extension Header
Message-ID: <1385.172.16.11.129.1182505995.squirrel@webmail.salle.url.edu>

Hi!! What do I have to do to add an extension RTP header? I saw there are some functions in MultiFramedRTPSink that seem to do this (setSpecialHeaderWord / setSpecialHeaderBytes)? Is it OK?
In that case, where and when exactly do I have to call these functions for a correct implementation? Thanks in advance, Ramon

From rjbrennn at gmail.com Fri Jun 22 06:02:55 2007
From: rjbrennn at gmail.com (Russell Brennan)
Date: Fri, 22 Jun 2007 09:02:55 -0400
Subject: [Live-devel] Buffered MPEG2-TS to RTP how-to
In-Reply-To: References: Message-ID:

I would love to be able to do this, but unfortunately this is just a small part of a larger system, and I need to be able to call it with one command line argument, so I can't use a shell pipe to do this (although I do think that your method will be an excellent way to verify that my system as a whole will work properly once all of the pipe code is in place). Sorry if I seem to know so little about unix pipes; I'm not an expert programmer by any means, and I have never had to use pipes in the way that I am attempting to use them.

Russell

On 6/21/07, Ross Finlayson wrote: > > >Ok, after some hours of tinkering, this (non-working) bit of code > > I think you're misunderstanding what I have in mind. You should not > have to change *any* existing LIVE555 code (except to change > "test.ts" to "stdin" in "testMPEG2TransportStreamer.cpp"). Instead, > just run your MPEG-2 TS grabbing code in a separate process (that > writes to stdout), and just pipe it to "testMPEG2TransportStreamer" > using the command line - i.e., > > yourGrabberApplication | testMPEG2TransportStreamer > > Alternatively, if your Transport Stream source is available as a > named file (e.g., device), then just run > testMPEG2TransportStreamer < yourTransportStreamDeviceFileName > > (It baffles me that so many Unix programmers these days seem to know > so little about pipes, filters, and stdio.) > -- > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > -- Russell Brennan RJBrennn at gmail.com (708) 699-7314
-------------- next part --------------
An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070622/558642eb/attachment.html

From MHosseini at newheights.com Fri Jun 22 06:07:35 2007
From: MHosseini at newheights.com (Mojtaba Hosseini)
Date: Fri, 22 Jun 2007 07:07:35 -0600
Subject: [Live-devel] (no subject)
References: Message-ID:

On the sender side, the output buffer is encapsulated within the OutPacketBuffer class (see MediaSink.cpp). In its constructor, a buffer is allocated (called fBuf). It is *that* buffer that gets passed to the input source. For example, the MultiFramedRTPSink class passes the pointer to the fBuf buffer (by calling the curPtr() method of the OutPacketBuffer class) to the source (through the getNextFrame method of the source). This pointer is called 'fTo' within the context of the source. The same goes for 'fMaxSize': it is the size of the OutPacketBuffer, passed from the sink to the source in 'getNextFrame'. IMPORTANT NOTE: the maximum size of the output buffer is hardcoded to be 60KBytes. You need to change this value if your output is larger than this (see OutPacketBuffer::maxSize in MediaSink.cpp). Would it make sense to allow the user to change this maximum value dynamically? In my application I had to increase the hardcoded 60KBytes and recompile the live555Media library. The same goes for the receiver-side buffer.
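To make the fTo/fMaxSize handshake concrete, here is a minimal sketch of a FramedSource subclass. The class name MyLiveSource and the frameBuf/frameBytes members are hypothetical stand-ins for your own capture/encode code; only the f-prefixed members and afterGetting() are real LIVE555 names:

#include "FramedSource.hh"
#include <string.h>
#include <sys/time.h>

class MyLiveSource: public FramedSource {
public:
  MyLiveSource(UsageEnvironment& env): FramedSource(env), frameBytes(0) {}
private:
  virtual void doGetNextFrame() {
    // 'frameBuf'/'frameBytes' stand in for the output of your own
    // capture/encoding code; they are not LIVE555 names.
    if (frameBytes > fMaxSize) {
      fFrameSize = fMaxSize;                      // deliver only what fits...
      fNumTruncatedBytes = frameBytes - fMaxSize; // ...and report the overflow
    } else {
      fFrameSize = frameBytes;
      fNumTruncatedBytes = 0;
    }
    memcpy(fTo, frameBuf, fFrameSize); // fTo points into the sink's OutPacketBuffer
    gettimeofday(&fPresentationTime, NULL);
    FramedSource::afterGetting(this);  // hand the completed frame downstream
  }
  unsigned char frameBuf[100000];
  unsigned frameBytes;
};

Note that the source never allocates fTo or sets fMaxSize itself; both arrive from the downstream sink, exactly as described above.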
Mojtaba Hosseini

-----Original Message-----
From: live-devel-bounces at ns.live555.com on behalf of Luc Roels
Sent: Fri 6/22/2007 3:33 AM
To: Live555 group
Subject: [Live-devel] (no subject)

In the FAQ, I read that to implement live streaming you have to create your own FramedSource subclass to encapsulate your input source. In essence, in the frame-delivery routine you have to copy your frame data to fTo. What I don't understand is where the frame memory (pointed to by the fTo pointer) is allocated and where fMaxSize is set? Can you shed some light on this?
_________________________________________________________________
Discover Windows Live Hotmail, the ultimate online mail program! http://get.live.com/mail/overview
-------------- next part --------------
A non-text attachment was scrubbed... Name: not available Type: application/ms-tnef Size: 3410 bytes Desc: not available Url : http://lists.live555.com/pipermail/live-devel/attachments/20070622/c8e8e6ed/attachment.bin

From Steve.Drew at gdcanada.com Fri Jun 22 06:58:39 2007
From: Steve.Drew at gdcanada.com (Drew, Steve)
Date: Fri, 22 Jun 2007 09:58:39 -0400
Subject: [Live-devel] Creating Transport Stream file from MPEG-2 Elementary video file
Message-ID:

Hello, I am having problems trying to create a Transport Stream file from MPEG-2 elementary video files. I have successfully converted from a program stream using testMPEG1or2ProgramToTransportStream, and have successfully streamed using testMPEG1or2VideoStreamer, but I would like to send using a transport stream instead. I have made some slight modifications to testMPEG1or2ProgramToTransportStream, as noted below. This program results in a choppy and condensed transport stream file where the video seems to lack any sense of timing. I was hoping someone could verify the following program. I am using Windows XP with the MinGW compiler.
#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"

char const* inputFileName = "in.mpg";
char const* outputFileName = "out.ts";

void afterPlaying(void* clientData); // forward

UsageEnvironment* env;

int main(int argc, char** argv) {
  // Begin by setting up our usage environment:
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  env = BasicUsageEnvironment::createNew(*scheduler);

  // Open the input file as a 'byte-stream file source':
  FramedSource* inputSource = ByteStreamFileSource::createNew(*env, inputFileName);
  if (inputSource == NULL) {
    *env << "Unable to open file \"" << inputFileName << "\" as a byte-stream file source\n";
    exit(1);
  }

  MPEG2TransportStreamFromESSource* tsFrames = MPEG2TransportStreamFromESSource::createNew(*env);
  // Add the elementary video file stream source
  tsFrames->addNewVideoSource(inputSource, 2);

  // Open the output file as a 'file sink':
  MediaSink* outputSink = FileSink::createNew(*env, outputFileName);
  if (outputSink == NULL) {
    *env << "Unable to open file \"" << outputFileName << "\" as a file sink\n";
    exit(1);
  }

  // Finally, start playing:
  *env << "Beginning to read...\n";
  outputSink->startPlaying(*tsFrames, afterPlaying, NULL);

  env->taskScheduler().doEventLoop(); // does not return
  return 0; // only to prevent compiler warning
}

void afterPlaying(void* /*clientData*/) {
  *env << "Done reading.\n";
  *env << "Wrote output file: \"" << outputFileName << "\"\n";
  exit(0);
}
___________________
Thanks, Steve
-------------- next part --------------
An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070622/aa97eeaa/attachment-0001.html

From finlayson at live555.com Fri Jun 22 07:37:36 2007
From: finlayson at live555.com (Ross Finlayson)
Date: Fri, 22 Jun 2007 07:37:36 -0700
Subject: [Live-devel] (no subject)
In-Reply-To: References: Message-ID:

>In the FAQ, I read that to implement live streaming you have to >create your own FramedSource subclass to encapsulate your input >source. In essence, in the frame-delivery routine you have to copy your frame >data to fTo. > >What I don't understand is where the frame memory (pointed to by >the fTo pointer) is allocated and where fMaxSize is set?

Those variables are set for you by the downstream object (when it called "getNextFrame()"). I.e., when you implement your "doGetNextFrame()" virtual function, you should assume that those variables have already been set for you.
-- Ross Finlayson Live Networks, Inc. http://www.live555.com/

From MHosseini at newheights.com Fri Jun 22 07:40:38 2007
From: MHosseini at newheights.com (Mojtaba Hosseini)
Date: Fri, 22 Jun 2007 08:40:38 -0600
Subject: [Live-devel] Maximum Size of Buffers
Message-ID:

I'll just correct my latest post: In order to increase the sender's maximum buffer size, one only needs to add the line

OutPacketBuffer::maxSize = [the maximum size your program requires];

before any data sink is created.
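For example, in a sketch based on the test programs (the value 500000 and the names env and rtpGroupsock are illustrative assumptions, set up as in those programs):

// Must run *before* the sink below is created, so that its
// OutPacketBuffer is allocated with the larger size:
OutPacketBuffer::maxSize = 500000; // bytes, instead of the hardcoded 60KBytes

RTPSink* videoSink = MPEG4ESVideoRTPSink::createNew(*env, rtpGroupsock, 96);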
I found that I had to change the MAX_PACKET_SIZE variable in MultiFramedRTPSource.cpp for the *receiver* to handle sizes greater than 10KBytes. Am I wrong again? Is there a way for my calling program to change this dynamically?

Mojtaba
-------------- next part --------------
An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070622/5c4e3405/attachment.html

From finlayson at live555.com Fri Jun 22 07:54:26 2007
From: finlayson at live555.com (Ross Finlayson)
Date: Fri, 22 Jun 2007 07:54:26 -0700
Subject: [Live-devel] Extension Header
In-Reply-To: <1385.172.16.11.129.1182505995.squirrel@webmail.salle.url.edu> References: <1385.172.16.11.129.1182505995.squirrel@webmail.salle.url.edu> Message-ID:

>What do I have to do to add an extension RTP header? I saw there are some functions >in MultiFramedRTPSink that seem to do this (setSpecialHeaderWord / >setSpecialHeaderBytes)? Is it OK? In that case, where and when exactly do I >have to call these functions for a correct implementation?

Ramon, The code currently does not support RTP extension headers (except to ignore them if it sees them in incoming packets). You would need to modify "MultiFramedRTPSink" (for sending) and "MultiFramedRTPSource" (for receiving) to support them.
-- Ross Finlayson Live Networks, Inc. http://www.live555.com/

From finlayson at live555.com Fri Jun 22 08:04:14 2007
From: finlayson at live555.com (Ross Finlayson)
Date: Fri, 22 Jun 2007 08:04:14 -0700
Subject: [Live-devel] Creating Transport Stream file from MPEG-2 Elementary video file
In-Reply-To: References: Message-ID:

The problem with your code is that you are feeding a "ByteStreamFileSource" (an unstructured byte stream) directly into a "MPEG2TransportStreamFromESSource". Instead, you should do

ByteStreamFileSource -> MPEG1or2VideoStreamFramer -> MPEG2TransportStreamFromESSource

i.e., insert a "MPEG1or2VideoStreamFramer" in between. This will parse the MPEG Elementary Stream to generate appropriate timestamps that will be used downstream when creating the Transport Stream.
-- Ross Finlayson Live Networks, Inc. http://www.live555.com/

From frahm.c at googlemail.com Fri Jun 22 08:36:51 2007
From: frahm.c at googlemail.com (Christian Frahm)
Date: Fri, 22 Jun 2007 17:36:51 +0200
Subject: [Live-devel] MPEG1or2Demux
Message-ID: <8039fa140706220836n3a25308r20deaa3dd50d5749@mail.gmail.com>

I am currently working on a project to stream DVB-T signals. To do that, I extract the PES packets from the MPEG-TS stream and send them to the Demux class. Upon investigating why the Demux sometimes runs out of buffer space (fMaxSize very small), I have come to the conclusion that *the Demux class only empties its buffer once the buffer is full*. Is there any way of ordering the Demux to forward frames (here, PES packets) to their respective sinks as soon as they arrive? This would also reduce delay and sporadic bursts in network traffic. Thanks, Christian Frahm
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://lists.live555.com/pipermail/live-devel/attachments/20070622/a00dee3f/attachment.html

From Christopher.Koehnen at iem.fh-friedberg.de Fri Jun 22 10:36:48 2007
From: Christopher.Koehnen at iem.fh-friedberg.de (Christopher Köhnen)
Date: Fri, 22 Jun 2007 19:36:48 +0200
Subject: [Live-devel] VS8 linker problem
Message-ID: <467C08B0.4070909@iem.fh-friedberg.de>

Hi, my name is Christopher, and I am trying to integrate RTP functionality from your liveMedia implementation into a DirectShow filter, using Microsoft Visual Studio 2005 and the latest Windows SDK. To give it a first try, I created an empty DS filter and copied the code from the MPEG2TransportStreamer there, without calling it. Now, when I compile my filter, it gives me several linker errors, most of them related to the ::createNew methods of the factory classes, e.g. RTCPInstance::createNew(... Just for testing purposes I tried making the constructor of this class public and using it directly, instead of calling createNew(..., and this worked fine! No more linker errors like this. This shows me that the lib is linked correctly, as I can call other methods from there. So, what goes wrong? I know that's not the way it should work, but can anyone give me a hint how to solve this, please? See below the error message from the linker. Regards Christopher

error LNK2019: unresolved external symbol "public: static class RTCPInstance * __stdcall RTCPInstance::createNew(class UsageEnvironment &,class Groupsock *,unsigned int,unsigned char const *,class RTPSink *,class RTPSource const *,unsigned int)" (?createNew@RTCPInstance@@SGPAV1@AAVUsageEnvironment@@PAVGroupsock@@IPBEPAVRTPSink@@PBVRTPSource@@I@Z) in function "int __stdcall RTPmain(int,char * *)" (?RTPmain@@YGHHPAPAD@Z).
-------------- next part --------------
An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070622/39688246/attachment.html
-------------- next part --------------
A non-text attachment was scrubbed... Name: Christopher.Koehnen.vcf Type: text/x-vcard Size: 451 bytes Desc: not available Url : http://lists.live555.com/pipermail/live-devel/attachments/20070622/39688246/attachment.vcf

From julian.lamberty at mytum.de Sat Jun 23 12:27:03 2007
From: julian.lamberty at mytum.de (Julian Lamberty)
Date: Sat, 23 Jun 2007 21:27:03 +0200
Subject: [Live-devel] FramedFilter Performance & Questions
Message-ID: <467D7407.1090607@mytum.de>

Hi! I've implemented a FramedFilter subclass that transcodes video from MPEG-2 to MPEG-4 using the libavcodec library. Therefore I use a live-"chain" that looks like:

MPEG1or2VideoRTPSource -> Transcoder (my class) -> MPEG4VideoStreamDiscreteFramer -> MPEG4ESVideoRTPSink

The Transcoder's structure is:

//Begin Code
doGetNextFrame()
{
  gettimeofday(&begin_frame, NULL);
  fInputSource->getNextFrame(..., afterGettingFrame, ...);
}

afterGettingFrame(void* clientData, unsigned numBytesRead, ...)
{
  Transcoder* transcoder = (Transcoder*)clientData;
  transcoder->afterGettingFrame1(numBytesRead, ...);
}

afterGettingFrame1(unsigned numBytesRead, ...)
{
  size = numBytesRead;
  while(size > 0)
  {
    gettimeofday(&begin_decoding_slice, NULL);
    dec_bytes = decode();
    size -= dec_bytes;
    gettimeofday(&end_decoding_slice, NULL);
    decoding_frame_time += decoding_slice_time;
    if(got_frame)
    {
      decoding_frame_time = 0;
      gettimeofday(&begin_encoding_frame, NULL);
      enc_bytes = encode();
      gettimeofday(&end_encoding_frame, NULL);
      memcpy(fTo, outbuf, ...);
      frame_ready = true;
    }
  }
  if(!frame_ready)
  {
    fInputSource->getNextFrame(..., afterGettingFrame, ...);
  }
  else
  {
    frame_ready = false;
    afterGetting(this);
    gettimeofday(&end_frame, NULL);
  }
}
//End Code

From the timevals inserted I calculate the decoding, encoding and total frame processing times. While decoding takes between 3ms and 10ms per frame and encoding takes between 8ms and 15ms per frame, the total processing time is sometimes greater than 50ms and exceeds the sum of the decoding and encoding times by 15ms or even more. This seems to be quite a lot just for fetching new data from the source... where could this come from? With more than 40ms average processing time, the Transcoder loses the ability to do its work in real time.

Another question: Can I use a MPEG1or2VideoStreamFramer after the MPEG1or2VideoRTPSource, so that I can pass complete frames into the decoder instead of chunks? Thanks Julian
-------------- next part --------------
A non-text attachment was scrubbed... Name: smime.p7s Type: application/x-pkcs7-signature Size: 5198 bytes Desc: S/MIME Cryptographic Signature Url : http://lists.live555.com/pipermail/live-devel/attachments/20070623/10a3d675/attachment.bin

From finlayson at live555.com Sat Jun 23 20:28:12 2007
From: finlayson at live555.com (Ross Finlayson)
Date: Sat, 23 Jun 2007 20:28:12 -0700
Subject: [Live-devel] FramedFilter Performance & Questions
In-Reply-To: <467D7407.1090607@mytum.de> References: <467D7407.1090607@mytum.de> Message-ID:

>I've implemented a FramedFilter subclass that transcodes video from >MPEG-2 to MPEG-4 using the libavcodec library. > >Therefore I use a live-"chain" that looks like: > >MPEG1or2VideoRTPSource -> Transcoder (my class) -> >MPEG4VideoStreamDiscreteFramer -> MPEG4ESVideoRTPSink

If your "Transcoder" filter delivers discrete MPEG-4 frames, and sets "fPresentationTime" and "fDurationInMicroseconds" properly, then you don't need to insert a "MPEG4VideoStreamDiscreteFramer".

>From the timevals inserted I calculate the decoding, encoding and total >frame processing times. >While decoding takes between 3ms and 10ms per frame and encoding >takes between 8ms and 15ms per frame, the total processing time is >sometimes greater than 50ms and exceeds the sum of the decoding and >encoding times by 15ms or even more. > >This seems to be quite a lot just for fetching new data from the >source... where could this come from?

I don't know; you'll need to instrument this for yourself to figure it out. Make sure, though, that you are setting "fDurationInMicroseconds" correctly at each point in the chain, because that field determines how long "MPEG4ESVideoRTPSink" can wait after sending each packet before requesting new data.

>Another question: Can I use a MPEG1or2VideoStreamFramer after the >MPEG1or2VideoRTPSource, so that I can pass complete frames into the >decoder instead of chunks?

No. ("MPEG1or2VideoStreamFramer" takes an unstructured byte stream as input.) You will need to write your own filter - inserted after the "MPEG1or2VideoRTPSource" object - to assemble incoming slices into a complete frame, prior to delivering it to your decoder.
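As a rough illustration of such a filter (this is not LIVE555 code: SliceAccumulator, its buffer sizes, and the stubbed-out frameIsComplete() test are all hypothetical - the real 'end of frame' check depends on your bitstream):

#include "FramedFilter.hh"
#include <string.h>

class SliceAccumulator: public FramedFilter {
public:
  SliceAccumulator(UsageEnvironment& env, FramedSource* inputSource)
    : FramedFilter(env, inputSource), fFrameBytes(0) {}
private:
  virtual void doGetNextFrame() {
    // Ask the upstream source (e.g. a MPEG1or2VideoRTPSource) for the next slice:
    fInputSource->getNextFrame(fSliceBuf, sizeof fSliceBuf,
                               afterGettingSlice, this,
                               FramedSource::handleClosure, this);
  }
  static void afterGettingSlice(void* clientData, unsigned numBytes,
                                unsigned /*numTruncatedBytes*/,
                                struct timeval presentationTime,
                                unsigned durationInMicroseconds) {
    ((SliceAccumulator*)clientData)
      ->appendSlice(numBytes, presentationTime, durationInMicroseconds);
  }
  void appendSlice(unsigned numBytes, struct timeval presentationTime,
                   unsigned durationInMicroseconds) {
    if (fFrameBytes + numBytes <= sizeof fFrameBuf) { // accumulate this slice
      memcpy(fFrameBuf + fFrameBytes, fSliceBuf, numBytes);
      fFrameBytes += numBytes;
    }
    if (!frameIsComplete()) { doGetNextFrame(); return; } // keep collecting slices
    // Deliver the assembled frame downstream:
    fFrameSize = fFrameBytes > fMaxSize ? fMaxSize : fFrameBytes;
    fNumTruncatedBytes = fFrameBytes - fFrameSize;
    memcpy(fTo, fFrameBuf, fFrameSize);
    fPresentationTime = presentationTime;
    fDurationInMicroseconds = durationInMicroseconds;
    fFrameBytes = 0;
    FramedSource::afterGetting(this);
  }
  // Stub: replace with a real picture-boundary test for your bitstream.
  Boolean frameIsComplete() const { return True; }
  unsigned char fSliceBuf[100000];
  unsigned char fFrameBuf[400000];
  unsigned fFrameBytes;
};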
(Better yet, fix your decoder so that it can handle individual slices - that way, you will handle data loss much better.)
-- Ross Finlayson Live Networks, Inc. http://www.live555.com/

From julian.lamberty at mytum.de Sun Jun 24 01:01:57 2007
From: julian.lamberty at mytum.de (Julian Lamberty)
Date: Sun, 24 Jun 2007 10:01:57 +0200
Subject: [Live-devel] FramedFilter Performance & Questions
In-Reply-To: References: <467D7407.1090607@mytum.de> Message-ID: <467E24F5.5090903@mytum.de>

> If your "Transcoder" filter delivers discrete MPEG-4 frames, and sets > "fPresentationTime" and "fDurationInMicroseconds" properly, then you > don't need to insert a "MPEG4VideoStreamDiscreteFramer".

It does, but if I leave out the framer, the sink doesn't even call doGetNextFrame() of my class...

> I don't know; you'll need to instrument this for yourself to figure > it out. Make sure, though, that you are setting > "fDurationInMicroseconds" correctly at each point in the chain, > because that field determines how long "MPEG4ESVideoRTPSink" can wait > after sending each packet before requesting new data.

I tried both ways: set to 0 or to 40000 (according to the 25Hz frame rate); it doesn't change anything (maybe because I use the framer?).

> No. ("MPEG1or2VideoStreamFramer" takes an unstructured byte stream as input.) > Better yet, fix your decoder so that it can handle individual slices - that way, > you will handle data loss much better.)

OK, it actually does handle slices.
-------------- next part --------------
A non-text attachment was scrubbed... Name: smime.p7s Type: application/x-pkcs7-signature Size: 5198 bytes Desc: S/MIME Cryptographic Signature Url : http://lists.live555.com/pipermail/live-devel/attachments/20070624/575b2795/attachment.bin

From julian.lamberty at mytum.de Mon Jun 25 07:17:57 2007
From: julian.lamberty at mytum.de (Julian Lamberty)
Date: Mon, 25 Jun 2007 16:17:57 +0200
Subject: [Live-devel] FramedFilter Performance & Questions
Message-ID: <467FCE95.5060503@mytum.de>

OK, I've investigated the problem further by adding timestamps at the beginning and the end of doGetNextFrame(), afterGettingFrame() and afterGettingFrame1(). As you can see from the code snippet in this thread, every time my transcoder needs more data to complete the frame, it calls fInputSource->getNextFrame(..., afterGettingFrame, ...). afterGettingFrame1() actually encapsulates the transcoding functions. Now I've added the timestamps (all in ms) and got the following output (similar for every frame):

Transcoder::doGetNextFrame() start: 1182779980073.611084
Transcoder::doGetNextFrame() end: 1182779980073.627930
Transcoder::afterGetttingFrame() start: 1182779980073.635986
Transcoder::afterGetttingFrame1() start: 1182779980073.643066
Transcoder::afterGettingFrame1() end: 1182779980073.673096
Transcoder::afterGettingFrame() end: 1182779980073.679932
...(timestamps are very close here)...
Transcoder::afterGettingFrame1() end: 1182779980073.902100
*Transcoder::afterGettingFrame() end: 1182779980073.907959*
*Transcoder::afterGetttingFrame() start: 1182779980096.835938*
Transcoder::afterGetttingFrame1() start: 1182779980096.875000
Decoded frame 2 [B, 9140 Bytes] in 13.95 ms
[mpeg4 @ 0xb7e14144]warning, too many b frames in a row
Encoded frame 2 [P, 16834 Bytes] in 12.15 ms, PSNR = 48.86 dB
Total frame processing time 49.47 ms
Transcoder::afterGettingFrame1() end: 1182779980123.104980
Transcoder::afterGettingFrame() end: 1182779980123.113037
Transcoder::doGetNextFrame() start: 1182779980127.437012

As you can see at the marked position, there is a gap of about 23ms just before the frame is completely decodable; the timestamps before and after this position seem OK to me. Why is this gap there and where does it come from? Why is the last call to afterGettingFrame delayed when the other calls are not? Am I doing something wrong, or does the parsing of the final slice (by MPEG1or2VideoRTPSource) take so much time? I really don't get where these 23ms come from; that's actually as much as I need for decoding and encoding... :( Thanks for your help Julian
-------------- next part --------------
A non-text attachment was scrubbed... Name: smime.p7s Type: application/x-pkcs7-signature Size: 5198 bytes Desc: S/MIME Cryptographic Signature Url : http://lists.live555.com/pipermail/live-devel/attachments/20070625/8b1f880f/attachment.bin

From rjbrennn at gmail.com Mon Jun 25 07:35:14 2007
From: rjbrennn at gmail.com (Russell Brennan)
Date: Mon, 25 Jun 2007 10:35:14 -0400
Subject: [Live-devel] calling SingleStep()
Message-ID:

I'm trying to bypass doEventLoop() and call SingleStep() in my own while loop, but after going through a tangle of code trying to figure out how to call it, I didn't see how I could call it, so I created the following:

void BasicTaskScheduler0::doSingleStep(char* watchVariable) {
  if (watchVariable == NULL || *watchVariable == 0) SingleStep();
}

Is there a better way to do this? If not, this might be something worth adding to make the library more flexible.
-- Russell Brennan RJBrennn at gmail.com (708) 699-7314
-------------- next part --------------
An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070625/477e5f21/attachment.html

From nantonop at orbitech.gr Mon Jun 25 10:59:20 2007
From: nantonop at orbitech.gr (Nikos Antonopoulos)
Date: Mon, 25 Jun 2007 20:59:20 +0300
Subject: [Live-devel] MPEG2TransportStreamIndexer & h264...
Message-ID: <46800278.6000604@orbitech.gr>

hi everyone, I've been trying to use trick play by creating an index file (using MPEG2TransportStreamIndexer) for an MPEG-TS (h264/mp3) file. I've had no luck with it - I've been getting an empty index file. I understand that the indexer is designed to work with mpeg1-2 files. The question is whether I should spend some time making it handle h264 files as well. Would that be too difficult? I've taken my slim chances with a quick and very naive hack in MPEG2IFrameIndexFromTransportStream::analyzePMT()

...
if (stream_type == 1 || stream_type == 2) {
  fVideo_PID = elementary_PID;
  return;
}
...

and added || stream_type == 27 (you can see the desperation, surely...) and got to the point of a filled index file, but naturally the index is messed up, I think, and trick play does not work properly... If anyone could lend an expert opinion on this I'd be very grateful...
Thanks Nik

From rjbrennn at gmail.com Mon Jun 25 12:11:03 2007
From: rjbrennn at gmail.com (Russell Brennan)
Date: Mon, 25 Jun 2007 15:11:03 -0400
Subject: [Live-devel] calling SingleStep()
In-Reply-To: References: Message-ID:

As an update, I decided for simplicity to just make this function public instead of protected.

On 6/25/07, Russell Brennan wrote: > > I'm trying to bypass doEventLoop() and call SingleStep() in my own while > loop, but after going through a tangle of code trying to figure out how to > call it, I didn't see how I could call it, so I created the following: > > void BasicTaskScheduler0::doSingleStep(char* watchVariable) { > if (watchVariable == NULL || *watchVariable == 0) SingleStep(); > } > > Is there a better way to do this? If not, this might be something worth > adding to make the library more flexible. > > -- > Russell Brennan > RJBrennn at gmail.com > (708) 699-7314 -- Russell Brennan RJBrennn at gmail.com (708) 699-7314
-------------- next part --------------
An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070625/d0fd7c73/attachment-0001.html

From rjbrennn at gmail.com Mon Jun 25 12:28:04 2007
From: rjbrennn at gmail.com (Russell Brennan)
Date: Mon, 25 Jun 2007 15:28:04 -0400
Subject: [Live-devel] Buffered MPEG2-TS to RTP how-to
In-Reply-To: References: Message-ID:

I have figured out how to do this, so here is the basis of my code, and some hanging unanswered questions... the code is mostly testMPEG2TransportStreamer.cpp.

#include <cstdio>
#include <iostream>
#include <fstream>
#include <string>
#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"
#include "GroupsockHelper.hh"

using namespace std;

// To stream using "source-specific multicast" (SSM), uncomment the following:
//#define USE_SSM 1
#ifdef USE_SSM
Boolean const isSSM = True;
#else
Boolean const isSSM = False;
#endif

// To set up an internal RTSP server, uncomment the following:
//#define IMPLEMENT_RTSP_SERVER 1
// (Note that this RTSP server works for multicast only)

#define TRANSPORT_PACKET_SIZE 188
#define TRANSPORT_PACKETS_PER_NETWORK_PACKET 7

class pipeToRTP {
public:
  // Constructor and destructor
  pipeToRTP();
  // Primitive execution method
  void execute();
protected:
  //-----------------------------------------------------------------------
  // Variable Declarations
  //-----------------------------------------------------------------------
  char rtpFileName_[L_tmpnam];
  string destAddressStr_;
  unsigned short rtpPortNum_;
  //-----------------------------------------------------------------------
  // Method Declarations
  //-----------------------------------------------------------------------
  void preamble();
  void setupRTP();
  void compute();
  void play();
  void postamble();
  // Buffers
  char cBuffer_[MAX_GRAB]; // Output of FEC is a type 1000 SB
  // RTP objects
  UsageEnvironment* env;
  FramedSource* videoSource;
  RTPSink* videoSink;
  BasicTaskScheduler* scheduler;
  ByteStreamFileSource* Source;
  ofstream outPipe_;
};

/**
 * pipeToRTP Constructor
 */
pipeToRTP::pipeToRTP() {
  tmpnam(rtpFileName_); // make a random filename
}

/**
 * Class Execution
 */
void pipeToRTP::execute() {
  preamble();
  m_sync();
  setupRTP();
  //compute(); // Called via setupRTP()
  postamble();
}

/**
 * Preamble
 * Initializes file headers, variables, and buffers
 */
void pipeToRTP::preamble() {}

/**
 * Main loop
 * This is where the data packets are pushed to a file stream
 */
void pipeToRTP::compute() {
  while (m_do(lper_, hin_.xfer_len)) {
    // Grab to our unix pipe
    m_grabx(hin_, cBuffer_, ngot_);
    if (ngot_ > 0) {
      // Send out the packet to Unix pipe
      outPipe_.write(cBuffer_, ngot_); // write a block of data
      scheduler->SingleStep(0); // Made this guy public
    }
  }
}

/**
 * Postamble
 * Performs post-processing tasks such as closing file headers and freeing
 * memory.
 */
void pipeToRTP::postamble() {
#ifdef DEBUG
  *env << "...done reading from file\n";
#endif
  Medium::close(videoSource);
  // Note that this also closes the input file that this source read from.
  outPipe_.close();
  exit(1);
}

/**
 * Main Routine
 * Instantiates an instance of the class and calls the execution method
 */
void mainroutine() {
  pipeToRTP p;
  try {
    p.execute();
  } catch (...) {
    m_error("Primitive execution failed (pipeToRTP)");
  }
}

/**
 * setupRTP
 * Initializes RTP environment.
 */
void pipeToRTP::setupRTP() {
  // Begin by setting up our usage environment:
  scheduler = BasicTaskScheduler::createNew();
  env = BasicUsageEnvironment::createNew(*scheduler);
  ...
#ifdef DEBUG
  *env << "Beginning streaming...\n";
#endif
  outPipe_.open(rtpFileName_); // Open the file
  if (!outPipe_) {
    cerr << "Could not open fifo for output" << endl;
    exit(1);
  }
  play();
  compute();
}

void afterPlaying(void* /*clientData*/) {
  // Just for the compiler
}

void pipeToRTP::play() {
  unsigned const inputDataChunkSize = TRANSPORT_PACKETS_PER_NETWORK_PACKET*TRANSPORT_PACKET_SIZE;
  // Open the input as a 'byte-stream file source':
  Source = ByteStreamFileSource::createNew(*env, rtpFileName_, inputDataChunkSize);
  if (Source == NULL) {
    *env << "Unable to open file \"" << rtpFileName_ << "\" as a byte-stream file source\n";
    exit(1);
  }
  // Create a 'framer' for the input source (to give us proper inter-packet gaps):
  videoSource = MPEG2TransportStreamFramer::createNew(*env, Source);
  // Finally, start playing:
#ifdef DEBUG
  *env << "Beginning to read from file...\n";
#endif
  videoSink->startPlaying(*videoSource, afterPlaying, videoSink);
}

Right, so the only way I could manage to get this working was to use an ofstream object to write my data to, using a random filename which was passed to ByteStreamFileSource::createNew. Also, the above functions play() and compute() HAD to be called within setupRTP(), or else I would get a bad file descriptor error from select()... this probably has something to do with scope, but I will not investigate this further. Performance seems to be good; this uses minimal CPU cycles on my machine. If anyone has a better alternative to ofstream, I would be happy to hear it!

Russell

On 6/19/07, Russell Brennan wrote: > > Hello, > > I am trying to take a buffer of MPEG2-TS data which is constantly written > to, and hook it up to live555 for output via RTP. I am currently basing my > work on the testMPEG2TransportStreamer code, but obviously this was designed > to stream a file. > My current idea is not really panning out, due to the fact that I am > apparently confused... ByteStreamFileSource is used in the test program, > but it seems that I will need to use a different type of source. Is there > an existing MediaSource that will work, or will I need to create my own? > > Also, MPEG2TransportStreamFramer is used as the videosource to > videoSink->startPlaying. This would seem to still be valid for what I am > doing, correct? > > Lastly, doEventLoop seems to continuously call SingleStep(), and from what > I can make of this function, it seems that in some way this is where the > data to be sent out is packaged and sent. Assuming that I get > acknowledgement each time my buffer is ready to be sent out, can I simply > call SingleStep() each time I have data to send? > > I hope I was clear enough... Thanks in advance!
> -- > Russell Brennan > RJBrennn at gmail.com > (708) 699-7314 -- Russell Brennan RJBrennn at gmail.com (708) 699-7314
-------------- next part --------------
An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070625/e6ea7dbf/attachment.html

From finlayson at live555.com Mon Jun 25 19:26:06 2007
From: finlayson at live555.com (Ross Finlayson)
Date: Mon, 25 Jun 2007 19:26:06 -0700
Subject: [Live-devel] MPEG2TransportStreamIndexer & h264...
In-Reply-To: <46800278.6000604@orbitech.gr> References: <46800278.6000604@orbitech.gr> Message-ID:

>I've been trying to use trick play by creating an index file (using >MPEG2TransportStreamIndexer) for an MPEG-TS (h264/mp3) file. >I've had no luck with it - I've been getting an empty index file. > >I understand that the indexer is designed to work with mpeg1-2 files. >The question is whether I should spend some time making it handle >h264 files as well. Would that be too difficult?

It's on our "to do" list. What makes it non-trivial is that the index file generation code works by analyzing each of the video frames in the Video Elementary Stream embedded within the Transport Stream - in order to figure out which frames are 'key frames' (i.e., I-frames), and where they are. I.e., the indexing code would need to know the structure of H.264 video frames (just as it currently knows about MPEG-2 video frames).

>I've taken my slim chances with a quick and very naive hack in >MPEG2IFrameIndexFromTransportStream::analyzePMT() >... > if (stream_type == 1 || stream_type == 2) { > fVideo_PID = elementary_PID; > return; > } >... > >and added >|| stream_type == 27 (you can see the desperation, surely...) > >and got to the point of a filled index file, but naturally the index is >messed up, I think, and trick play does not work properly...

See above.
-- Ross Finlayson Live Networks, Inc. http://www.live555.com/

From neo_star2007 at yahoo.es Tue Jun 26 08:14:39 2007
From: neo_star2007 at yahoo.es (Joan Manuel Zabala)
Date: Tue, 26 Jun 2007 17:14:39 +0200 (CEST)
Subject: [Live-devel] Help with the testMPEG1or2VideoStreamer
Message-ID: <953643.95074.qm@web28105.mail.ukl.yahoo.com>

hello! I hope somebody can help me. In the testMPEG1or2VideoStreamer test program, does the transmission of the video over the network begin once startPlaying() is called, or do I instead have to write a function that starts sending the packets to the network - and if so, how would I do that? On the other hand, in a program that receives video over RTP, does playback of the video begin once the data has been fetched from the network and startPlaying() is called, or do I need a separate decoder? I ask this because I am working on a project for the reception of live video using RTP/RTCP/RTSP and must deliver the results of this development soon, around the middle of July, and I am on the brink of madness. Please, if somebody could help me I would be thankful, because it is my graduation project and it is very important to me. Thanks in advance.
Joan Zabala - Venezuela
---------------------------------
Call any PC in the world for free. Calls to landlines and mobiles from 1 cent per minute. http://es.voice.yahoo.com
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://lists.live555.com/pipermail/live-devel/attachments/20070626/9c4b0651/attachment.html

From rjbrennn at gmail.com Tue Jun 26 09:12:22 2007
From: rjbrennn at gmail.com (Russell Brennan)
Date: Tue, 26 Jun 2007 12:12:22 -0400
Subject: [Live-devel] Help with the testMPEG1or2VideoStreamer
In-Reply-To: <953643.95074.qm@web28105.mail.ukl.yahoo.com> References: <953643.95074.qm@web28105.mail.ukl.yahoo.com> Message-ID:

If I understand your question correctly, you are simply asking if packets are sent when startPlaying() is called. I believe the answer is no: doEventLoop() calls a function called SingleStep() in a while loop, and each time SingleStep() is executed, a packet is sent. That's as deep as I understand it.

Russell

On 6/26/07, Joan Manuel Zabala wrote: > > hello! I hope somebody can help me. In the testMPEG1or2VideoStreamer test > program, does the transmission of the video over the network begin once > startPlaying() is called, or do I instead have to write a function that > starts sending the packets to the network - and if so, how would I do that? > On the other hand, in a program that receives video over RTP, does playback > of the video begin once the data has been fetched from the network and > startPlaying() is called, or do I need a separate decoder? > I ask this because I am working on a project for the reception of live video > using RTP/RTCP/RTSP and must deliver the results of this development soon, > around the middle of July, and I am on the brink of madness. Please, if > somebody could help me I would be thankful, because it is my graduation > project and it is very important to me. Thanks in advance. > *Joan Zabala - Venezuela* > > ------------------------------ > > Call any PC in the world for free. > Calls to landlines and mobiles from 1 cent per minute. > http://es.voice.yahoo.com > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > -- Russell Brennan RJBrennn at gmail.com (708) 699-7314
-------------- next part --------------
An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070626/19221986/attachment-0001.html

From xcsmith at rockwellcollins.com Tue Jun 26 13:05:58 2007
From: xcsmith at rockwellcollins.com (xcsmith at rockwellcollins.com)
Date: Tue, 26 Jun 2007 15:05:58 -0500
Subject: [Live-devel] Question on ClientTrickPlayState.fScale
Message-ID:

I am working with mediaServer and the trick mode code now, and I have the following problem: When I set up a session and send the very first PLAY request with a scale of 2.0, I get a regular 1x stream. I ran the debugger for a while and noticed that in ClientTrickPlayState, fScale is initialized to 2.0. In the function MPEG2TransportFileServerMediaSubsession::startStream(), client->areChangingState() appears to return false. I think this is because I want to play with scale 2 but fScale was already = 2.0. Therefore my trick mode filters are never created. When I send my first PLAY, everything works OK if my scale is 4.

Do you think I am doing something wrong? Why is fScale initialized to 2.0? Thanks very much!
------------------------------------------
Xochitl Smith
GS Software Engineer; Computer-E
ph: 319.263.0191
xcsmith at rockwellcollins.com
------------------------------------------
-------------- next part --------------
An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070626/7290ab62/attachment.html
-------------- next part --------------
A non-text attachment was scrubbed... Name: not available Type: image/jpeg Size: 2784 bytes Desc: not available Url : http://lists.live555.com/pipermail/live-devel/attachments/20070626/7290ab62/attachment.jpe

From finlayson at live555.com Tue Jun 26 16:56:49 2007
From: finlayson at live555.com (Ross Finlayson)
Date: Tue, 26 Jun 2007 16:56:49 -0700
Subject: [Live-devel] Question on ClientTrickPlayState.fScale
In-Reply-To: References: Message-ID:

>Do you think I am doing something wrong? Why is fScale initialized to 2.0?

Oops, you've found a bug. I had used that initialization when debugging the code, but I had forgotten to change it back to 1.0 afterwards. (I never noticed it at the time, because I used the Amino STB to test the code, and it uses a scale of 6.0 (by default).) So please change 2.0 to 1.0 in "MPEG2TransportFileServerMediaSubsession.cpp", line 258. (This fix will get included in the next release of the software.)
-- Ross Finlayson Live Networks, Inc. http://www.live555.com/
-------------- next part --------------
An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070626/19e22a46/attachment.html

From julian.lamberty at mytum.de Wed Jun 27 03:12:26 2007
From: julian.lamberty at mytum.de (Julian Lamberty)
Date: Wed, 27 Jun 2007 12:12:26 +0200
Subject: [Live-devel] FramedFilter Performance & Questions
In-Reply-To: <467FCE95.5060503@mytum.de> References: <467FCE95.5060503@mytum.de> Message-ID: <4682380A.2010003@mytum.de>

Am I right in assuming that my afterGettingFrame() function is called by the TaskScheduler? Sometimes it happens that afterGettingFrame() is called before the first call has completed. Could anyone explain why I sometimes have a delay of >10ms between two calls to afterGettingFrame()?
-------------- next part --------------
A non-text attachment was scrubbed... Name: smime.p7s Type: application/x-pkcs7-signature Size: 5198 bytes Desc: S/MIME Cryptographic Signature Url : http://lists.live555.com/pipermail/live-devel/attachments/20070627/97d543d3/attachment.bin

From finlayson at live555.com Wed Jun 27 03:29:00 2007
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 27 Jun 2007 03:29:00 -0700
Subject: [Live-devel] FramedFilter Performance & Questions
In-Reply-To: <4682380A.2010003@mytum.de> References: <467FCE95.5060503@mytum.de> <4682380A.2010003@mytum.de> Message-ID:

>Am I right in assuming that my afterGettingFrame() function is called >by the TaskScheduler?

No. Assuming that your "afterGettingFrame()" function was passed as a parameter to "getNextFrame()", then it will be called by "FramedSource::afterGetting()" (which your "doGetNextFrame()" implementation should have called once it completed delivery of incoming data).
-- Ross Finlayson Live Networks, Inc. http://www.live555.com/

From julian.lamberty at mytum.de Wed Jun 27 04:01:54 2007
From: julian.lamberty at mytum.de (Julian Lamberty)
Date: Wed, 27 Jun 2007 13:01:54 +0200
Subject: [Live-devel] FramedFilter Performance & Questions
In-Reply-To: References: <467FCE95.5060503@mytum.de> <4682380A.2010003@mytum.de> Message-ID: <468243A2.8070206@mytum.de>

> No.
> Assuming that your "afterGettingFrame()" function was passed as > a parameter to "getNextFrame()", then it will be called by > "FramedSource::afterGetting()" (which your "doGetNextFrame()" > implementation should have called once it completed delivery of > incoming data).

So is my code correct? If the frame is not completely decodable, I call getNextFrame() again (on MPEG1or2VideoRTPSource). As I said, sometimes (irregularly) there is a delay of ~10 to 20ms between consecutive calls of afterGettingFrame(). Sometimes the afterGettingFrame() function is called twice even if the first one has not come to an end. How can that be? Could you please have a look at the code? Thank you very much!

void Transcoder::doGetNextFrame()
{
  fInputSource->getNextFrame(inbuf, INBUF_SIZE, afterGettingFrame, this, handleClosure, this);
}

void Transcoder::afterGettingFrame(void* clientData, unsigned numBytesRead, unsigned numTruncatedBytes, struct timeval presentationTime, unsigned durationInMicroseconds)
{
  Transcoder* transcoder = (Transcoder*)clientData;
  transcoder->afterGettingFrame1(numBytesRead, numTruncatedBytes, presentationTime, durationInMicroseconds);
}

void Transcoder::afterGettingFrame1(unsigned numBytesRead, unsigned numTruncatedBytes, struct timeval presentationTime, unsigned durationInMicroseconds)
{
  //Decoding
  dec_bytes = avcodec_decode_video(dec_codec_ctx, dec_frame, &got_frame, inbuf, size); //decoding inbuf (data chunks) to dec_frame
  if(got_frame) //decoded one complete frame
  {
    //Encoding
    gettimeofday(&begin_encoding_frame, NULL);
    enc_bytes = avcodec_encode_video(enc_codec_ctx, outbuf, fMaxSize, dec_frame); //encoding dec_frame to outbuf
    if(enc_bytes > fMaxSize) //more bytes encoded than the framer can handle
    {
      fFrameSize = fMaxSize;
      fNumTruncatedBytes = enc_bytes - fMaxSize;
    }
    else
    {
      fFrameSize = enc_bytes;
      fNumTruncatedBytes = 0;
    }
    //Delivering
    memcpy(fTo, outbuf, fFrameSize); //copy outbuf to fTo for the framer
    fPresentationTime = begin_encoding_frame;
    fDurationInMicroseconds = durationInMicroseconds; //just pass through durationInMicroseconds
    frame_ready = true; //frame ready for delivery
  }
  if(!frame_ready) //get more data
  {
    fInputSource->getNextFrame(inbuf, INBUF_SIZE, afterGettingFrame, this, handleClosure, this); //*problems*
  }
  else //deliver data and return
  {
    frame_ready = false;
    afterGetting(this);
  }
}

After that I have the MPEG4VideoStreamDiscreteFramer (without that, Transcoder::doGetNextFrame() isn't even called... why is that, btw?) and an MPEG4ESVideoRTPSink. The streaming actually works, but I temporarily exceed 40ms to transcode one frame.
-------------- next part --------------
A non-text attachment was scrubbed...
Name: smime.p7s Type: application/x-pkcs7-signature Size: 5198 bytes Desc: S/MIME Cryptographic Signature Url : http://lists.live555.com/pipermail/live-devel/attachments/20070627/4db65092/attachment-0001.bin

From julian.lamberty at mytum.de Wed Jun 27 06:31:44 2007
From: julian.lamberty at mytum.de (Julian Lamberty)
Date: Wed, 27 Jun 2007 15:31:44 +0200
Subject: [Live-devel] FramedFilter Performance & Questions
In-Reply-To: <468243A2.8070206@mytum.de> References: <467FCE95.5060503@mytum.de> <4682380A.2010003@mytum.de> <468243A2.8070206@mytum.de> Message-ID: <468266C0.8050400@mytum.de>

Another interesting effect is that the average total processing time for one frame (measured from the start of doGetNextFrame() until after afterGetting(this)) is always close to 40ms, no matter how long my transcoder needs to process one frame (assuming that this time stays below ~35ms, which it does). Could that be related to "durationInMicroseconds"?
-------------- next part --------------
A non-text attachment was scrubbed... Name: smime.p7s Type: application/x-pkcs7-signature Size: 5198 bytes Desc: S/MIME Cryptographic Signature Url : http://lists.live555.com/pipermail/live-devel/attachments/20070627/00be4c20/attachment.bin

From finlayson at live555.com Wed Jun 27 06:54:23 2007
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 27 Jun 2007 06:54:23 -0700
Subject: [Live-devel] FramedFilter Performance & Questions
In-Reply-To: <468243A2.8070206@mytum.de> References: <467FCE95.5060503@mytum.de> <4682380A.2010003@mytum.de> <468243A2.8070206@mytum.de> Message-ID:

>>No. Assuming that your "afterGettingFrame()" function was passed >>as a parameter to "getNextFrame()", then it will be called by >>"FramedSource::afterGetting()" (which your "doGetNextFrame()" >>implementation should have called once it completed delivery of >>incoming data). >> >So is my code correct? If the frame is not completely decodable, I >call getNextFrame() again (on MPEG1or2VideoRTPSource). As I said, >sometimes (irregularly) there is a delay of ~10 to 20ms between >consecutive calls of afterGettingFrame(). Sometimes the >afterGettingFrame() function is called twice even if the first one >has not come to an end. How can that be? Could you please have a >look at the code?

In general I don't have time to examine people's custom code in detail. (But Remember, You Have Complete Source Code.) However, in your case, you need to look at the code for whatever object your "Transcoder" object is reading *from* - i.e., the 'upstream' data source for your "Transcoder". It's this 'upstream' object that is causing your Transcoder's "afterGettingFrame" function to be called (when your 'upstream' object calls "FramedSource::afterGetting()").

>After that I have the MPEG4VideoStreamDiscreteFramer (without that, >Transcoder::doGetNextFrame() isn't even called... why is that, btw?

This is quite clear if you look at the code for "MPEG4ESVideoRTPSink". Its upstream object *must* be a "MPEG4VideoStreamFramer" (or a subclass).
-- Ross Finlayson Live Networks, Inc. http://www.live555.com/

From kartikraop at gmail.com Wed Jun 27 23:11:30 2007
From: kartikraop at gmail.com (Kartik Rao)
Date: Thu, 28 Jun 2007 11:41:30 +0530
Subject: [Live-devel] Live stream frequently getting stuck
Message-ID:

Hi. I am attempting to perform MPEG4 live streaming (on Windows) using live555 and DirectShow (SDK 9). I did the following:

1. Using DirectShow, I grab the camera feed, encode it, and dump the encoded data into a named pipe.
From kartikraop at gmail.com Wed Jun 27 23:11:30 2007
From: kartikraop at gmail.com (Kartik Rao)
Date: Thu, 28 Jun 2007 11:41:30 +0530
Subject: [Live-devel] Live stream frequently getting stuck
Message-ID: 

Hi. I am attempting to perform MPEG4 live streaming (on Windows) using live555 and DirectShow (SDK 9). I did the following:

1. Using DirectShow, I grab the camera feed, encode it and dump the encoded data into a named pipe.
2. I modified the testMPEG4VideoStreamer program so that it can be spawned as a separate thread. I also created "ByteStreamPipeSource" and "FramedPipeSource" classes - similar to ByteStreamFileSource and FramedFileSource - which read the encoded data from a named pipe.

Using VLC as a client, I observed the following: Initially, streaming works fine (although with a small delay). Then it gets stuck and plays no further. I set the "rtsp-caching" parameter and disabled the "drop late frames" and "skip frames" parameters. Now when it gets stuck, it recovers a few seconds later and plays for a few seconds before getting stuck again.

When I dump the encoded feed into a file and stream that file at a later point in time, it plays smoothly. Currently, I am not modifying the presentation times. The differences between presentation times for successive packets are, however, different when streaming the live feed and when streaming from the file. Should I attempt to modify the presentation times? Kindly advise on how I can avoid the problem of the live stream frequently getting stuck.

Thanks,
Kartik

From armandopoulos at yahoo.fr Thu Jun 28 05:16:38 2007
From: armandopoulos at yahoo.fr (Armando Ko)
Date: Thu, 28 Jun 2007 14:16:38 +0200 (CEST)
Subject: [Live-devel] Using the live555 to implement RTCP
Message-ID: <589634.7386.qm@web25912.mail.ukl.yahoo.com>

Dear all,

My idea is to make two separate applications: one RTP streaming server (rtp.exe) for JPEG streaming, and a second application for RTCP (rtcp.exe). To do this I would write the SSRC from rtp.exe to a file and read it from that file with rtcp.exe; the connection point between the two applications is the SSRC. Is that a good idea? Is it possible to implement rtcp.exe as a standalone application using live555? If yes, how can I do that? Is there an example of doing this? I have looked in the FAQ and the test programs but I can't find a sample just for RTCP.

Thanks for giving me a hint about my idea.

Armando

From lroels at hotmail.com Thu Jun 28 06:14:05 2007
From: lroels at hotmail.com (Luc Roels)
Date: Thu, 28 Jun 2007 13:14:05 +0000
Subject: [Live-devel] Lost packets
Message-ID: 

Hi Ross,

I've been able to create a simple streaming server for my 'modified H.264' video encoder card and created a simple viewing client in just a couple of days using the livemedia library - and it might even have been faster if there were some good documentation available :-). Even so, liveMedia is great; to do this from scratch would have taken me several weeks.

One more question though, regarding packet loss. In a previous post you told me that I can detect packet loss by inspecting the presentation times at a higher level. I don't see how this can work properly. Suppose we are streaming live MPEG4 video using RTP over the internet. If a P frame isn't delivered because one or more of its composing packets are lost, the client should stop decoding until it receives a new and complete I frame.
I don't see how the client can detect the packet loss by simply looking at the presentation time. If the streaming server delivers a variable frame rate, then there is no way to know that a frame is lost by looking at its presentation time - or am I wrong? The only way to detect the frame loss would be if the higher level had access to the frame's beginning and ending RTP sequence numbers, or am I mistaken? What would be the simplest way to detect this?

best regards,
Luc Roels

From rjbrennn at gmail.com Thu Jun 28 06:31:55 2007
From: rjbrennn at gmail.com (Russell Brennan)
Date: Thu, 28 Jun 2007 09:31:55 -0400
Subject: [Live-devel] Live stream frequently getting stuck
In-Reply-To: 
References: 
Message-ID: 

It sounds like you are implementing your pipes in an inefficient manner - for instance, maybe you are writing the data to disk and then reading it back, when really you could be writing it to RAM and reading it from there instead. Check out my prior post on streaming from buffers.

Russell

-- 
Russell Brennan
RJBrennn at gmail.com
(708) 699-7314
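Russell's suggestion amounts to replacing the named pipe with an in-memory queue shared between the encoder thread and the liveMedia thread. A minimal sketch of such a queue, shown with POSIX threads for brevity (on Windows one would use a CRITICAL_SECTION); the class name FrameQueue is illustrative and not part of liveMedia:

#include <deque>
#include <vector>
#include <pthread.h>

// Hypothetical thread-safe frame queue: the DirectShow/encoder thread
// pushes encoded frames, the liveMedia source pops them. This replaces
// the named pipe (and any disk round-trip) with RAM.
class FrameQueue {
public:
  FrameQueue() { pthread_mutex_init(&fMutex, NULL); }
  ~FrameQueue() { pthread_mutex_destroy(&fMutex); }

  void push(unsigned char const* data, unsigned size) {
    pthread_mutex_lock(&fMutex);
    fFrames.push_back(std::vector<unsigned char>(data, data + size));
    pthread_mutex_unlock(&fMutex);
  }

  // Returns false if no frame is available yet.
  bool pop(std::vector<unsigned char>& frame) {
    pthread_mutex_lock(&fMutex);
    bool ok = !fFrames.empty();
    if (ok) { frame.swap(fFrames.front()); fFrames.pop_front(); }
    pthread_mutex_unlock(&fMutex);
    return ok;
  }

private:
  pthread_mutex_t fMutex;
  std::deque<std::vector<unsigned char> > fFrames;
};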
From MHosseini at newheights.com Thu Jun 28 07:00:37 2007
From: MHosseini at newheights.com (Mojtaba Hosseini)
Date: Thu, 28 Jun 2007 08:00:37 -0600
Subject: [Live-devel] Lost packets
References: 
Message-ID: 

Hello,

I agree. I was also able to create an RTP streaming application for my H264 encoder within a week. More documentation would surely be appreciated by new users. Towards this, I documented my work and put it here (it has sample code + UML diagrams):

http://www.white.ca/patrick/tutorial.tar.gz

Do you think you would be able to do the same, Luc?

I'm also interested in your question about packet loss. I have not yet had time to look at that part of RTP, but I will have to very soon. Are we to assume that presentation times will be regular (like every 33 ms), so that if there is a gap between them on the receiving end we know a frame was lost? I may be wrong, but that doesn't seem like an elegant solution...

Mojtaba Hosseini

From finlayson at live555.com Thu Jun 28 14:09:15 2007
From: finlayson at live555.com (Ross Finlayson)
Date: Thu, 28 Jun 2007 14:09:15 -0700
Subject: [Live-devel] Using the live555 to implement RTCP
In-Reply-To: <589634.7386.qm@web25912.mail.ukl.yahoo.com>
References: <589634.7386.qm@web25912.mail.ukl.yahoo.com>
Message-ID: 

>My idea is to make two separate applications: one RTP streaming
>server (rtp.exe) for JPEG streaming, and a second application for
>RTCP (rtcp.exe).

This is a really silly idea. RTP and RTCP are not independent protocols - for example, RTCP reports include statistics about RTP packet loss. It doesn't make any sense to implement RTP and RTCP as separate applications. Instead, your RTP server should include RTCP, which our existing code does automatically (using the "RTCPInstance" class).

-- 
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
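For reference, attaching RTCP to an existing RTPSink is only a few lines, following the pattern used in the testProgs (the bandwidth figure and the helper function addRtcp here are illustrative):

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"
#include <unistd.h>

// Sketch: create an RTCPInstance alongside an existing RTPSink, as the
// testProgs do. RTCP then runs automatically inside the same event loop -
// no separate application is needed.
void addRtcp(UsageEnvironment& env, Groupsock* rtcpGroupsock,
             RTPSink* videoSink) {
  unsigned const estimatedSessionBandwidth = 500; // in kbps; used for RTCP b/w share
  unsigned const maxCNAMElen = 100;
  unsigned char CNAME[maxCNAMElen + 1];
  gethostname((char*)CNAME, maxCNAMElen); // CNAME is conventionally the host name
  CNAME[maxCNAMElen] = '\0';

  RTCPInstance::createNew(env, rtcpGroupsock,
                          estimatedSessionBandwidth, CNAME,
                          videoSink, // we're a server, so pass the sink...
                          NULL,      // ...and no RTPSource
                          False);    // True only for SSM multicast sessions
}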
From lroels at hotmail.com Fri Jun 29 00:34:45 2007
From: lroels at hotmail.com (Luc Roels)
Date: Fri, 29 Jun 2007 07:34:45 +0000
Subject: [Live-devel] Lost packets
Message-ID: 

Hi Mojtaba,

Writing a tutorial takes some time, which for the moment I don't really have, sorry. But in short, what I did to create my streaming server was:

1) Create a custom DeviceSource class derived from FramedSource (see the FAQ; based on liveMedia/DeviceSource.cpp and liveMedia/DeviceSource.hh). Basically, the doGetNextFrame() and deliverFrame() functions contain code similar to the code in your x264VideoStreamFramer.cpp file (the else{} part of your doGetNextFrame() function is in deliverFrame()). I also created a simple FIFO class, based on an STL queue, to store encoded video frames.

2) Create a custom RTPSink class derived from VideoRTPSink. This is essentially a modified copy of liveMedia/MPEG4ESVideoRTPSink.cpp, adapted to handle my type of frames.

3) Create a custom MediaSubsession class, overriding the three functions createNewStreamSource(), createNewRTPSink() and getAuxSDPLine(). The first two create instances of the custom device source and the custom RTP sink. getAuxSDPLine() starts the RTP sink and waits until my system header is created, so that it can be put in the config=... part of the reply to the DESCRIBE. (A sketch of such a subsession follows this message.)

4) Create a custom RTSPServer class based on mediaServer/DynamicRTSPServer.hh and mediaServer/DynamicRTSPServer.cpp, changing the createNewSMS() function to handle my media type.

That's it! I also set the max packet length (OutPacketBuffer::maxSize) in MediaSink.cpp to 120 KB. The whole thing is started up by (see mediaServer/live555MediaServer.cpp):

- Creating a BasicTaskScheduler
- Creating a BasicUsageEnvironment
- Creating the RTSP server
- Entering doEventLoop()

By the way, thanks for your tutorial, it helped...

About the packet loss: for my application it will be necessary to create some container format which will hold the video and audio frames. To simply fix the packet-loss detection problem I could of course put a frame number in the container format, which would avoid any changes to liveMedia, but it would be nice if some function or mechanism were added to report to the higher level that a frame was lost.
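Step 3 above is the one that most often trips people up, so here is a minimal sketch of such a subsession. "MyDeviceSource" and "MyRTPSink" stand in for the custom classes Luc describes; they are not liveMedia classes, and the getAuxSDPLine() machinery is omitted:

#include "liveMedia.hh"
#include "OnDemandServerMediaSubsession.hh"

// Sketch: a subsession that plugs a custom live source and a custom RTP
// sink into the RTSP server. The server calls the two virtuals below for
// each client (or once in total, if reuseFirstSource==True).
class MyMediaSubsession : public OnDemandServerMediaSubsession {
public:
  static MyMediaSubsession* createNew(UsageEnvironment& env, Boolean reuseFirstSource) {
    return new MyMediaSubsession(env, reuseFirstSource);
  }

protected:
  MyMediaSubsession(UsageEnvironment& env, Boolean reuseFirstSource)
    : OnDemandServerMediaSubsession(env, reuseFirstSource) {}

  virtual FramedSource* createNewStreamSource(unsigned /*clientSessionId*/,
                                              unsigned& estBitrate) {
    estBitrate = 2000; // kbps; an estimate, used for RTCP bandwidth sharing
    return MyDeviceSource::createNew(envir());
  }

  virtual RTPSink* createNewRTPSink(Groupsock* rtpGroupsock,
                                    unsigned char rtpPayloadTypeIfDynamic,
                                    FramedSource* /*inputSource*/) {
    return MyRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic);
  }
};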
From lroels at hotmail.com Fri Jun 29 00:53:15 2007
From: lroels at hotmail.com (Luc Roels)
Date: Fri, 29 Jun 2007 07:53:15 +0000
Subject: [Live-devel] Server shutdown
Message-ID: 

Hi Ross,

One more question: what is the proper way to shut down the LIVE555 media server? The demo source just enters an endless loop, and if the program is forced to exit you get lots of memory leaks. Could you enlighten me on the proper shutdown procedure?

From finlayson at live555.com Fri Jun 29 01:32:52 2007
From: finlayson at live555.com (Ross Finlayson)
Date: Fri, 29 Jun 2007 01:32:52 -0700
Subject: [Live-devel] Server shutdown
In-Reply-To: 
References: 
Message-ID: 

>One more question, what is the proper way to shutdown the LIVE555
>media server.

Just Control-C (or the "kill" command).

> The demo source just enters an endless loop

The "live555MediaServer" code (unlike the code for the applications in the "testProgs" directory) is not 'demo source'. It's the source for a complete application (product).

>, and if the program is forced to exit you get lots of memory leaks.

Not at all. Once any application ends (no matter how it ends), all of its memory is reclaimed by the operating system. (Similarly, all of its network sockets get closed.)

> Could you enlight me with the proper
>shutdown procedure?

Just kill it.

-- 
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From MHosseini at newheights.com Fri Jun 29 06:01:34 2007
From: MHosseini at newheights.com (Mojtaba Hosseini)
Date: Fri, 29 Jun 2007 07:01:34 -0600
Subject: [Live-devel] Server shutdown
References: 
Message-ID: 

Hi Luc,

The RTSP server is probably one level higher than the RTP streaming server I've created, but this is how I end the RTP session (I think I have this code in the VideoLiveMediaRtpSocket.cpp of my tutorial):

// variables dynamically allocated, initialized and used when streaming:
char mEventLoopController = 0;
RTPSink* mVideoSenderSink;       // my H264 RTP sink
MediaSource* mVideoSenderSource; // my H264 framed source
Port* mRtpPort;                  // port for RTP
Port* mRtcpPort;                 // port for RTCP
Groupsock* mRtpGroupsock;        // socket for RTP
Groupsock* mRtcpGroupsock;       // socket for RTCP
RTCPInstance* mRtcpInstance;
UsageEnvironment* mEnv;
TaskScheduler* mScheduler;

// the function below runs in its own thread of execution
void ThreadExecute()
{
  mEnv->taskScheduler().doEventLoop(&mEventLoopController);
}

// Stop function, called by the user or a signal (like Ctrl-C) to end the session
void Stop()
{
  // set this variable to non-zero to allow doEventLoop() to return
  // (you may have to introduce a dummy delayed task to ensure that
  // doEventLoop() returns - see the FAQ)
  mEventLoopController = 0xFF;
  // also stop the thread by calling pthread_join() on the thread mentioned above

  mVideoSenderSink->stopPlaying();
  Medium::close(mVideoSenderSource);
  Medium::close(mVideoSenderSink);
  Medium::close(mRtcpInstance); // I had a problem with this call on the server side;
                                // the client side can call it with no problem though
  delete mScheduler;
  mEnv->reclaim();
  delete mRtpPort;
  delete mRtcpPort;
  delete mRtpGroupsock;
  delete mRtcpGroupsock;
}
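The "dummy delayed task" mentioned in the comment above is the usual trick for making doEventLoop() notice the watch variable promptly, since the loop only tests the variable between events. A minimal sketch; the function name checkTermination is illustrative:

#include "UsageEnvironment.hh"

// Sketch: a periodic no-op task that guarantees the event loop wakes up
// at least every 100 ms and re-tests its watch variable, even when no
// network events are pending.
static void checkTermination(void* clientData) {
  UsageEnvironment* env = (UsageEnvironment*)clientData;
  env->taskScheduler().scheduleDelayedTask(100000 /* 100 ms */,
                                           checkTermination, env);
}

// Before entering doEventLoop(&watchVariable), schedule the first check:
//   checkTermination(env);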
From MHosseini at newheights.com Fri Jun 29 06:02:58 2007
From: MHosseini at newheights.com (Mojtaba Hosseini)
Date: Fri, 29 Jun 2007 07:02:58 -0600
Subject: [Live-devel] Lost packets
References: 
Message-ID: 

Thanks Luc,

I just wanted to know how other people are implementing the same thing. It looks like we have done similar work, which means I was not too far off track. Thanks for your explanation.

In the very near future I will be looking at the packet loss issue through liveMedia. I'll report back if I find a way of supporting it through liveMedia, so we wouldn't have to do it at the layer above (RTP already keeps sequence numbers, so it should be able to detect packet loss and, as you said, provide a mechanism for the calling application to deal with it).

Mojtaba
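For the record, the sequence-number bookkeeping Mojtaba refers to is straightforward. A generic sketch (not liveMedia-specific) of detecting lost packets from the 16-bit RTP sequence number, handling wraparound but ignoring reordering:

#include <stdint.h>

// Sketch: report how many RTP packets were skipped before each arrival.
// The 16-bit subtraction wraps correctly modulo 2^16; out-of-order
// delivery is not handled here and would show up as spurious "loss".
class SeqLossDetector {
public:
  SeqLossDetector() : fHaveFirst(false), fLastSeq(0) {}

  // Call per received packet; returns 0 if nothing was lost before it.
  unsigned onPacket(uint16_t seq) {
    unsigned lost = 0;
    if (fHaveFirst) {
      uint16_t expected = fLastSeq + 1;       // wraps mod 2^16
      lost = (uint16_t)(seq - expected);      // gap size, also mod 2^16
    }
    fHaveFirst = true;
    fLastSeq = seq;
    return lost;
  }

private:
  bool fHaveFirst;
  uint16_t fLastSeq;
};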
From julian.lamberty at mytum.de Fri Jun 29 07:22:54 2007
From: julian.lamberty at mytum.de (Julian Lamberty)
Date: Fri, 29 Jun 2007 16:22:54 +0200
Subject: [Live-devel] presentationTime and B-Frames
Message-ID: <468515BE.6090105@mytum.de>

Hi!

I'm transcoding MPEG-2 (with B-frames) into MPEG-4 (without B-frames, just I and P). Currently I'm only transcoding video, but later the transcoded stream should be able to be synchronised with the source's audio stream:

           -------> Audio -------------------------------
          |                                              |
     ----------                                      --------
     | Source |                                      | Sink |
     ----------                                      --------
          |                 --------------               |
           -------> Video ->| Transcoder |---------------
                            --------------

The problem is setting the timestamps correctly. Right now I generate them when a frame enters the encoder. But then I lose the timing information of the incoming frame and won't be able to sync to audio later on, right?

When I just pass the times through by setting fPresentationTime = presentationTime, the transcoded stream "flickers", as if the frames were presented in the wrong order.

Is there a possibility to set the presentation times in a way that doesn't lose the ability to sync to the corresponding audio stream?

Thank you!
Julian

From weiyutao36 at 163.com Fri Jun 29 07:24:19 2007
From: weiyutao36 at 163.com (weiyutao36)
Date: Fri, 29 Jun 2007 22:24:19 +0800 (CST)
Subject: [Live-devel] Use live555 library as a streaming server--
Message-ID: <13340346.888151183127059444.JavaMail.coremail@bj163app25.163.com>

Hi Ross,

I want to use the live555 Streaming Media library as a streaming server and I have several questions:

(1) Can the live555 library be used in a practical application, for example a VoD system, as a streaming server?
(2) How many concurrent users can it support, or does it have a limit on the number of concurrent users?
(3) Can the streaming server run stably enough for normal use?
(4) I know that currently the server can stream an MPEG-4 elementary stream, and I want to know further: a) do you have plans to make the library support streaming an MPEG-4 file including both audio and video; b) is it possible for me to modify the code myself to add this functionality? If it is possible, what should I do?

Thank you VERY much. (so many questions)

-------------------Yutao Wei

From julian.lamberty at mytum.de Fri Jun 29 07:24:55 2007
From: julian.lamberty at mytum.de (Julian Lamberty)
Date: Fri, 29 Jun 2007 16:24:55 +0200
Subject: [Live-devel] RTCP question
Message-ID: <46851637.6010405@mytum.de>

I've set up an RTCPInstance for my sink exactly as shown in the testProgs, but no RTCP packets are sent. Also, vobStreamer sends just one single RTCP packet. Why is that? Shouldn't there be more RTCP traffic?

Julian

From lroels at hotmail.com Fri Jun 29 07:32:27 2007
From: lroels at hotmail.com (Luc Roels)
Date: Fri, 29 Jun 2007 14:32:27 +0000
Subject: [Live-devel] Server shutdown
Message-ID: 

Ok, it's a fact that the memory will be reclaimed by the operating system on program exit, but it's not nice to see all those memory leaks reported by BoundsChecker on exit (I'm running under Windows, by the way). Suppose the server ran inside a program that added a watch variable to doEventLoop() to stop it and later start it again - then you would need to clean up, right? So the question was not which button to hit to exit the program :-), but what the proper way is to clean up, and in what order?

P.S. Sorry for calling the media server a demo program :-)
From finlayson at live555.com Sat Jun 30 00:34:21 2007
From: finlayson at live555.com (Ross Finlayson)
Date: Sat, 30 Jun 2007 00:34:21 -0700
Subject: [Live-devel] presentationTime and B-Frames
In-Reply-To: <468515BE.6090105@mytum.de>
References: <468515BE.6090105@mytum.de>
Message-ID: 

>Is there a possibility to set the presentation times in a way
>that doesn't lose the ability to sync to the corresponding audio stream?

Note that in your original MPEG-2 stream, with B-frames, the frames will be in decoding order, which is different from the display order, and thus different from the order of frames in the resulting MPEG-4 stream (because that doesn't have B-frames). In particular, the presentation times in the original MPEG-2 stream, with B-frames, will *not* be monotonically increasing. You will need to reorder the presentation times accordingly when you convert the stream to MPEG-4 (without B-frames).

-- 
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
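One way to do that reordering, sketched below under two assumptions: the decoder hands out frames in display order (as libavcodec's MPEG-2 decoder does after its internal reordering), while the presentation times arrive attached to the coded pictures in decode order. Buffering the incoming times in a min-heap and popping the smallest one for each emitted frame then yields monotonically increasing timestamps. The class PtsReorderer is illustrative, not code from the thread:

#include <sys/time.h>
#include <queue>
#include <vector>

// Comparator: "a is later than b", so the priority_queue's top() is the
// earliest buffered presentation time.
struct LaterTime {
  bool operator()(const timeval& a, const timeval& b) const {
    return a.tv_sec > b.tv_sec ||
           (a.tv_sec == b.tv_sec && a.tv_usec > b.tv_usec);
  }
};

// Sketch: push() the presentation time of each coded picture as it enters
// the decoder; popEarliest() when a decoded frame is re-encoded and
// delivered. The heap only ever holds as many entries as the encoder's
// reordering depth (a couple of frames for MPEG-2 B-frame GOPs).
class PtsReorderer {
public:
  void push(const timeval& pts) { fHeap.push(pts); }

  timeval popEarliest() {
    timeval t = fHeap.top();
    fHeap.pop();
    return t;
  }

private:
  std::priority_queue<timeval, std::vector<timeval>, LaterTime> fHeap;
};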
From alex.cameron at digitaltx.tv Sat Jun 30 13:51:52 2007
From: alex.cameron at digitaltx.tv (Alexander Cameron)
Date: Sat, 30 Jun 2007 21:51:52 +0100
Subject: [Live-devel] RTSP Server Mods - opening files & transcoding
Message-ID: <003501c7bb58$7d134700$0300a8c0@AlexLaptop>

Hi - my first post on the live developer mailing list, and I think my questions should be fairly simple to answer (I've been looking through the archives and the FAQ, but they don't seem to have been covered there).

We've been using the MediaServer for a few weeks now and it's been brilliant. There are a couple of quirks I'd like to iron out, though, and I was wondering if someone could point me in the right direction so I can alter the source code and re-compile it to work the way I want. I know these things are rarely as simple as that, and despite not having a whole lot of C++ knowledge I'm willing to give it a go (or get a friend to tweak it for me).

a) How can I stream file names with spaces in them? (e.g. rtsp://<server>/i am a file.ts instead of rtsp://<server>/file.ts)

b) Is it possible to stream files outside the working directory, or to create virtual paths?

c) Ideally I'd like to be able to drag and drop files into a directory and transcode them on the fly to any format I set, in the same way this can be done manually with VLC. I know this has been covered before, but could someone give me an overview of how VLC's engine or something like FFmpeg could be integrated with the live RTSP server, so that streams are served and transcoded on demand?

Thanks in advance - apologies for any avoidable ignorance.

Alex