From tbatra18 at gmail.com Tue Jan 1 22:37:03 2013 From: tbatra18 at gmail.com (Tarun Batra) Date: Wed, 2 Jan 2013 12:07:03 +0530 Subject: [Live-devel] Is there any provision in Live 555 to get the rtsp stream from RTSP server and save only the NAL unit in a file or an array? Message-ID: Hello Sir, Is there any provision in Live 555 to get the RTSP stream from an RTSP server and save only the NAL units in a file or an array? Thanks Tarun -------------- next part -------------- An HTML attachment was scrubbed... URL: From rafael.gdm at vaelsys.com Wed Jan 2 05:06:27 2013 From: rafael.gdm at vaelsys.com (Rafael Gil) Date: Wed, 2 Jan 2013 14:06:27 +0100 Subject: [Live-devel] RTSP/1.0 454 Session Not Found In-Reply-To: <50E16768.4030205@sound4.biz> References: <50E16768.4030205@sound4.biz> Message-ID: Hi Eric I'll check this out. Thank you. Regards 2012/12/31 Eric HEURTEL > Hi, > > At some point my RTSP source answers with error code "RTSP/1.0 454 Session > Not Found" to RTSP commands, but this seems not to be handled by the RTSP client. > Why? Is this an invalid error code? > > Yes. It is usually returned by the server when the client attempts to > access a stream using an invalid RTSP 'session id' string. That shouldn't > be happening (with our RTSP client or proxy code), so I'm curious as to why > you're getting this error. (See below.) > > There you go. I had to "play" a little bit with it before the error occurred, > so the log is pretty big (also I added some printfs myself), but the whole > sequence is there and you get the error at the end. > I'm curious how I would have to reset/close the connection so it could > be reopened by the client to establish a new session. If the error occurs, > at least I would like the system to recover, even if it is with a new > session. Is this possible? 
> > > I had a problem which may be related to this one and may help here: > the RTSP client adds a '/' at the end of baseURL, which cannot be used for > reconnection, else a 454 Session Not Found occurs. > By "reconnection" I mean calling sendDescribeCommand() after a > disconnection, for an automatic reconnection, without destroying the RTSP > client object. > As a workaround I had to save the original URL in an additional field for > this to work. > > > I'm talking about a relatively old Live version (LIVE555 2012.09.12) and > have not checked if this has been changed since. > > > Cheers > > > E. HEURTEL > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -- Rafael Gil Vaelsys -------------- next part -------------- An HTML attachment was scrubbed... URL: From rafael.gdm at vaelsys.com Wed Jan 2 05:04:54 2013 From: rafael.gdm at vaelsys.com (Rafael Gil) Date: Wed, 2 Jan 2013 14:04:54 +0100 Subject: [Live-devel] RTSP/1.0 454 Session Not Found In-Reply-To: <5B43BD47-D46C-4366-B0E4-B321DC83DB42@live555.com> References: <22EFD09E-5E20-44BB-8C20-C57D9A9F81F2@live555.com> <5B43BD47-D46C-4366-B0E4-B321DC83DB42@live555.com> Message-ID: Hi Ross Thanks for the reply. You are right: I had done some other modifications to the code. But here is the same problem with the plain proxyServer executable (see attached log). As you can see, the problem persists. Indeed, VLC sends a TEARDOWN. The problem is that when I try to reconnect to the proxy I cannot, because the back-end connection cannot recover from this error. Any ideas? Thanks and happy new year! Rafael 2012/12/28 Ross Finlayson > Well, what's happening here is that - for some strange reason - the RTSP > connection between your front-end client and the proxy is timing out (due > to inactivity). 
I say "for some strange reason" because - prior to this - > the proxy was receiving periodic RTCP "RR" packets from the client OK, but > (again, for some strange reason) did not recognize these RTCP packets as > indicating client 'liveness'. > > You are going to have to figure out for yourself why this is happening, > because (1) I don't understand why it's happening, and (2) I'm not > convinced that the problem is not somehow caused by your own modifications > to the supplied code. (You have already modified the code to add your own > debugging printfs, but I suspect you have done more than this.) In any > case, you're on your own here. > > The resulting 454 errors - in response to the front-end client's > "TEARDOWN" request - are therefore understandable, but you shouldn't be > caring at all about this. The front-end client (VLC) has chosen to close > the stream - which turns out to have already been closed by the proxy. Big > deal. > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -- Rafael Gil Vaelsys -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: session-not-found.log Type: application/octet-stream Size: 111393 bytes Desc: not available URL: From eric at sound4.biz Wed Jan 2 09:48:52 2013 From: eric at sound4.biz (Eric HEURTEL) Date: Wed, 02 Jan 2013 18:48:52 +0100 Subject: [Live-devel] SDP attribute extension In-Reply-To: References: Message-ID: <50E47304.2010803@sound4.biz> >> I have to deal with application-specific SDP attributes (a=my_attribute:value). I have my own subclass of MediaSubsession. >> I was wondering if parsing of subsession description lines (c=, b=, a=rtpmap... 
) could be done in a new virtual method (i.e. MediaSubsession::parseSDPline()) that could be overridden, instead of being hardcoded in MediaSession::initializeWithSDP(). >> What do you think? > This is a possibility, but I consider changes like this - whose sole purpose is to support non-standard protocol extensions - to be low priority. > > What 'application-specific SDP attributes' do you have in mind? We are targeting professional live audio for studios. For precise sync and low delay we have to carry information giving the relation between the RTP timestamp and the PTP timestamp (the NTP timestamp carried in RTCP is not precise enough). I will not give details here, but we retrieve this constant delta through a new SDP "a=timesync:value" media-level attribute. It is a legal extension to the standard, not yet published. You can also think of other extensions, like "a=orient:portrait" and the ones given in Section 6 of RFC 4566, or "a=cliprect:..." (QuickTime) or "a=crypto" (for RTP/SAVP, RFC 4568), or de-facto standard extensions, like "a=x-dimensions", which you've chosen to implement... Ideally, it would be comfortable to be able to extend "Media Level" and "Session Level" attributes through virtual methods. Best wishes E. HEURTEL -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Jan 2 10:00:58 2013 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 3 Jan 2013 07:00:58 +1300 Subject: [Live-devel] RTSP/1.0 454 Session Not Found In-Reply-To: References: <22EFD09E-5E20-44BB-8C20-C57D9A9F81F2@live555.com> <5B43BD47-D46C-4366-B0E4-B321DC83DB42@live555.com> Message-ID: > Thanks for the reply. You are right: I had done some other modifications to the code. But here is the same problem with the plain proxyServer executable (see attached log). It's not quite the "plain proxyServer", because - once again - you have turned on unnecessary debugging "printf"s, producing a long and confusing log. 
It would have been far clearer if you had started by doing absolutely *no* modifications to the code - not even defining DEBUG - and just running "live555ProxyServer" with the -V flag. But anyway, what's happening here is that: 1/ Your back-end server (camera) is not properly recognizing RTCP "RR" packets - coming from the proxy - as indicating that its client (i.e., the proxy) is still alive. Consequently, your back-end server (camera) is timing out and closing the (back-end) session that comes from the proxy, even though the proxy is still alive. 2/ Our proxy server is noticing that the back-end stream has closed, but is not handling this as cleanly as it should. That's why subsequent RTSP commands from the proxy to the back-end server are generating "454 Session Not Found" responses. I can probably update the proxy server implementation to deal with both of these situations better. However, because the fundamental problem here is your broken (non-compliant) back-end server (i.e., camera), this is not a super high priority for me. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Jan 2 10:37:03 2013 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 3 Jan 2013 07:37:03 +1300 Subject: [Live-devel] SDP attribute extension In-Reply-To: <50E47304.2010803@sound4.biz> References: <50E47304.2010803@sound4.biz> Message-ID: <6832B76D-D784-4854-B3EF-69B5B9CCE618@live555.com> >> What 'application-specific SDP attributes' do you have in mind? > We are targetting professional live audio for studio. For precise sync and low delay we have to carry information giving the relation between RTP timestamp and PTP timestamp (NTP timestamp carried in RTCP is not precise enough). The following work - currently underway in the IETF - might be relevant here. (If so, you should adopt it and/or contribute to this effort.) 
http://tools.ietf.org/html/draft-ietf-avtcore-clksrc-01 (If/when this work becomes an IETF standard, I will update the LIVE555 code to support this new "ts-refclk" SDP attribute, if requested.) > You can also think to other extensions, like "a=orient:portrait" and the ones given in Section 6 of RFC456, or "a=cliprect:..." (Quicktime) or "a=crypto" (for RTP/SAVP RFC4568), or de-facto standard extensions, like "a=x-dimensions" that you've chosen to implement... Yes, however - although I have (reluctantly) implemented some non-standard attributes that are already in widespread use, I am disinclined to support *new* non-standard SDP attributes; only those SDP attributes that get defined as parts of new IETF standards. Sorry. > Ideally, it would be cumfortable to be able to extend "Media Level" and "Session Level" attributes through virtual methods. That would be for your RTSP client. What about your RTSP server? It would also need to generate these new SDP attribute(s). (Is your RTSP server not also implemented using our code? If not, why not? :-) But anyway, as noted above, modifying our code solely to make it easier for people to use non-standard SDP attributes is not a priority for me. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Jan 2 16:52:18 2013 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 3 Jan 2013 13:52:18 +1300 Subject: [Live-devel] QuickTime does not work with proxyServer program In-Reply-To: <345787DF-6DAA-439D-B3E1-BAD7C44127D9@live555.com> References: <56D328FC-D8D1-4E82-93EB-4FD7CBDAB575@live555.com> <345787DF-6DAA-439D-B3E1-BAD7C44127D9@live555.com> Message-ID: <4C612C3C-46FE-4E73-AF02-B6B91D5D0ADF@live555.com> >> The other solution, setting reclamationTestSeconds to 0, does not seems to work. In fact, it ends very bad as the program terminates unexpectedly (see attached log). Any ideas on this one? 
> > This might be an independent bug. I'm currently investigating this. This did, indeed, turn out to be a separate bug in our RTSP server code. I have just installed a new version of our software - 2013.01.03 - that should fix this. Ross Finlayson Live Networks, Inc. http://www.live555.com/ From hh3.kim at samsung.com Wed Jan 2 17:57:22 2013 From: hh3.kim at samsung.com (=?euc-kr?B?sejH9sij?=) Date: Thu, 03 Jan 2013 01:57:22 +0000 (GMT) Subject: [Live-devel] LIVE555 Crash since version 20120913. Message-ID: <0MG100J2W1FOV080@mailout1.samsung.com> An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: live555_RtpInterface.htm Type: application/octet-stream Size: 398841 bytes Desc: not available URL: From finlayson at live555.com Wed Jan 2 18:29:28 2013 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 3 Jan 2013 15:29:28 +1300 Subject: [Live-devel] LIVE555 Crash since version 20120913. In-Reply-To: <0MG100J2W1FOV080@mailout1.samsung.com> References: <0MG100J2W1FOV080@mailout1.samsung.com> Message-ID: <75E5C0BB-BA5A-4A39-BDB4-DDE5D9129C1B@live555.com> > _Tables* _Tables::getOurTables(UsageEnvironment& env, Boolean createIfNotPresent) { > --> here!! --> if (env.liveMediaPriv == NULL && createIfNotPresent) { What specifically is wrong here? I.e., why is a crash occurring at that line? > When I inserted the following code, the problem was solved. > Please review the changes to live555 again. > > void SocketDescriptor::tcpReadHandler1(int mask) { > ... > removeSocketDescription(fEnv, fOurSocketNum); // <-- When this code is added, the problem was solved... > delete this; > I still don't understand what the problem is, and why making an extra call to "removeSocketDescriptor()" before calling "delete this;" (which will cause "removeSocketDescriptor()" to be called from the "SocketDescriptor" destructor anyway) would change anything?? Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From tonygriffen at gmail.com Wed Jan 2 13:26:38 2013 From: tonygriffen at gmail.com (Tony Griffen) Date: Wed, 2 Jan 2013 13:26:38 -0800 Subject: [Live-devel] RTSP/1.0 404 File Not Found, Or In Incorrect Format Message-ID: Hi, I have read many threads on the dreaded message "RTSP/1.0 404 File Not Found, Or In Incorrect Format" when trying to stream from Live555MediaServer, yet I was not able to find any solution. Below I attach output from the openRTSP client, strace from live555MediaServer, the directory structure, and the live555MediaServer version output. I even copied the binary to the same directory as the file I am trying to stream and am invoking it with ./live555MediaServer from that directory. Nothing helps. I have spent so many hours on this, and I have no idea what is going on. Any suggestions will be appreciated. Thanks, t. *** openRTSP report: Sending request: DESCRIBE rtsp://10.0.100.51/welcome.wav RTSP/1.0 CSeq: 3 User-Agent: openRTSP (LIVE555 Streaming Media v2012.02.04) Accept: application/sdp Received 101 new bytes of response data. Received a complete DESCRIBE response: RTSP/1.0 404 File Not Found, Or In Incorrect Format CSeq: 3 *** strace on the server: select(5, [3 4], [], [], {0, 9972}) = 1 (in [3], left {0, 8117}) accept(3, {sa_family=AF_INET, sin_port=htons(58106), sin_addr=inet_addr("10.0.100.201")}, [16]) = 5 fcntl64(5, F_GETFL) = 0x2 (flags O_RDWR) fcntl64(5, F_SETFL, O_RDWR|O_NONBLOCK) = 0 getsockopt(5, SOL_SOCKET, SO_SNDBUF, [102400], [4]) = 0 getsockopt(5, SOL_SOCKET, SO_SNDBUF, [102400], [4]) = 0 gettimeofday({1357160101, 791935}, NULL) = 0 gettimeofday({1357160101, 792016}, NULL) = 0 gettimeofday({1357160101, 792037}, NULL) = 0 select(6, [3 4 5], [], [], {0, 7737}) = 1 (in [5], left {0, 7735}) recvfrom(5, "OPTIONS rtsp://10.0.100.51/welco"..., 10000, 0, {sa_family=0xfe08 /* AF_??? 
*/, sa_data="\212\0\fp/\10\334\307/\10\360r/\10"}, [0]) = 120 gettimeofday({1357160101, 792173}, NULL) = 0 time(NULL) = 1357160101 send(5, "RTSP/1.0 200 OK\r\nCSeq: 2\r\nDate: "..., 152, 0) = 152 gettimeofday({1357160101, 792286}, NULL) = 0 gettimeofday({1357160101, 792306}, NULL) = 0 select(6, [3 4 5], [], [], {0, 7468}) = 1 (in [5], left {0, 7282}) recvfrom(5, "DESCRIBE rtsp://10.0.100.51/welc"..., 10000, 0, {sa_family=0xfe08 /* AF_??? */, sa_data="\212\0\fp/\10\334\307/\10\360r/\10"}, [0]) = 146 gettimeofday({1357160101, 792617}, NULL) = 0 open("welcome.wav", O_RDONLY|O_LARGEFILE) = 6 close(6) = 0 open("welcome.wav", O_RDONLY|O_LARGEFILE) = 6 fstat64(6, {st_mode=S_IFREG|0644, st_size=582038, ...}) = 0 mmap2(NULL, 4096, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_ANONYMOUS, -1, 0) = 0xb77ad000 read(6, "RIFF\216\341\10\0WAVEfmt \20\0\0\0\1\0\2\0D\254\0\0\20\261\2\0"..., 4096) = 4096 close(6) = 0 munmap(0xb77ad000, 4096) = 0 time(NULL) = 1357160101 send(5, "RTSP/1.0 404 File Not Found, Or "..., 101, 0) = 101 gettimeofday({1357160101, 793091}, NULL) = 0 gettimeofday({1357160101, 793110}, NULL) = 0 select(6, [3 4 5], [], [], {0, 6664}) = 1 (in [5], left {0, 6433}) recvfrom(5, "", 10000, 0, {sa_family=0xfe08 /* AF_??? */, sa_data="\212\0\fp/\10\334\307/\10\360r/\10"}, [0]) = 0 gettimeofday({1357160101, 793459}, NULL) = 0 close(5) = 0 *** Directory structure: [/var/live555] # ls -la drwxr-xr-x. 2 root root 4096 Jan 2 12:32 . drwxr-xr-x. 27 root root 4096 Jan 2 10:45 .. -rwxr-xr-x. 1 root root 20413 Jan 2 11:47 live555MediaServer -rw-r--r--. 
1 ftp ftp 582038 Jan 2 11:07 welcome.wav *** live555MediaServer version output: LIVE555 Media Server version 0.74 (LIVE555 Streaming Media library version 2012.02.04) From darko at toplifikacija.net Thu Jan 3 00:34:51 2013 From: darko at toplifikacija.net (Darko Mijatovic) Date: Thu, 3 Jan 2013 09:34:51 +0100 Subject: [Live-devel] RTP unicast Message-ID: <002601cde98d$32d4f490$c864a8c0@darko> Hi, I have a device that is capable only of pushing RTP by multicast or unicast. Is there a way to use live555ProxyServer or anything else to serve content to multiple clients? The main goal is to use one stream from the device, via the internet, to some server, and from that point to serve clients (not all in the LAN). Thanks, Darko -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Jan 3 02:01:03 2013 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 3 Jan 2013 23:01:03 +1300 Subject: [Live-devel] RTP unicast In-Reply-To: <002601cde98d$32d4f490$c864a8c0@darko> References: <002601cde98d$32d4f490$c864a8c0@darko> Message-ID: <4F7C632A-81CC-4E00-B076-790A4310AFD5@live555.com> > I have a device that is capable only of pushing RTP by multicast or unicast. Is there a way to use live555ProxyServer or anything else to serve content to multiple clients? You can't use a proxy RTSP server, because that requires that the input have its own RTSP server. However, with a little programming, you can use our regular RTSP server implementation. I suggest you look at the code for our "testOnDemandRTSPServer" demo application, noting in particular how we implement the "mpeg2TransportStreamFromUDPSourceTest" stream. You would probably need to write your own "ServerMediaSubsession" subclass, depending on what codec(s) your input source uses. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Thu Jan 3 02:14:10 2013 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 3 Jan 2013 23:14:10 +1300 Subject: [Live-devel] RTSP/1.0 404 File Not Found, Or In Incorrect Format In-Reply-To: References: Message-ID: > I have read many thread on the dreaded message "RTSP/1.0 404 File Not > Found, Or In Incorrect Format" when trying to stream from > Live555MediaServer, yet I was not able to find any solution. There's nothing 'dreaded' about that error message; it means exactly what it says. The server is either (1) unable to find the specified file, or (2) it did find the file, but discovered that it's in a format that it can't handle. Because "fopen()" apparently succeeds on the file name, we know that (2) must be true. Therefore, we need to find out why the server doesn't like the file "welcome.wav". See http://www.live555.com/liveMedia/faq.html#my-file-doesnt-work Also, because you've read the FAQ (as everyone is asked to do before they post to this mailing list), you know that we support only the latest version of the "LIVE555 Streaming Media" software. You are using an old version (2012.02.04). You should upgrade to the latest version ASAP. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From tonygriffen at gmail.com Thu Jan 3 09:47:53 2013 From: tonygriffen at gmail.com (Tony Griffen) Date: Thu, 3 Jan 2013 09:47:53 -0800 Subject: [Live-devel] RTSP/1.0 404 File Not Found, Or In Incorrect Format In-Reply-To: References: Message-ID: Thank you for your suggestions. I've downloaded the latest source code and rebuilt it. 
It is now: LIVE555 Streaming Media library version 2013.01.03 welcome.wav still doesn't stream, and since file shows: # file welcome.wav welcome.wav: RIFF (little-endian) data, WAVE audio, Microsoft PCM, 16 bit, stereo 44100 Hz and since it plays everywhere I tried it, it would not occur to me that live555MediaServer might not be able to read it (apparently generateSDPDescription() in ServerMediaSession.cpp fails and then I get this: setRTSPResponse("404 File Not Found, Or In Incorrect Format")). So, following your hint, I created a streaming video mpg file and tried that. Worked like a charm, and the output from openRTSP is exactly perfect for what I want to show. Still perplexed about the WAV file, but it is no longer important to my task. Thanks, t. On Thu, Jan 3, 2013 at 2:14 AM, Ross Finlayson wrote: > I have read many thread on the dreaded message "RTSP/1.0 404 File Not > Found, Or In Incorrect Format" when trying to stream from > Live555MediaServer, yet I was not able to find any solution. > > > There's nothing 'dreaded' about that error message; it means exactly what it > says. The server is either (1) unable to find the specified file, or (2) it > did find the file, but discovered that it's in a format that it can't > handle. Because "fopen()" apparently succeeds on the file name, we know > that (2) must be true. > > Therefore, we need to find out why the server doesn't like the file > "welcome.wav". See > http://www.live555.com/liveMedia/faq.html#my-file-doesnt-work > > Also, because you've read the FAQ (as everyone is asked to do before they > post to this mailing list), you know that we support only the latest version > of the "LIVE555 Streaming Media" software. You are using an old version > (2012.02.04). You should upgrade to the latest version ASAP. > > > Ross Finlayson > Live Networks, Inc. 
> http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > From finlayson at live555.com Thu Jan 3 13:05:01 2013 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 4 Jan 2013 10:05:01 +1300 Subject: [Live-devel] RTSP/1.0 404 File Not Found, Or In Incorrect Format In-Reply-To: References: Message-ID: > welcome.wav still doesn't stream, and since file shows: > > # file welcome.wav > welcome.wav: RIFF (little-endian) data, WAVE audio, Microsoft > PCM That's presumably the problem. Our WAV file parsing code can't handle WAV files that contain 'Microsoft PCM' (whatever that is), because I don't think there is a standard RTP payload format defined for how to stream this. Our WAV file parsing code recognizes regular PCM, PCMU (i.e., u-law), PCMA (i.e., a-law), and IMA ADPCM WAV files only. (Fortunately these are by far the most commonly-seen types of WAV file.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From temp2010 at forren.org Thu Jan 3 13:27:43 2013 From: temp2010 at forren.org (temp2010 at forren.org) Date: Thu, 3 Jan 2013 16:27:43 -0500 Subject: [Live-devel] How to release frame; How to improve horrible video quality Message-ID: Please consider these two questions. Thanks very much. BACKGROUND: I adapted testH264VideoStreamer.cpp and DeviceSource.cpp/hpp so that I could send a live video feed out over the network. I currently test using VLC. I generate H.264 encoded frames using a Microsoft Media Foundation (MF) back end to my existing MF streaming software. There is a call to env->taskScheduler().doEventLoop() in my adapted testH264VideoStreamer.cpp, which is in a separate thread. Thus, my primary thread handles the MF streaming, and this secondary thread makes sure doEventLoop() happens. 
QUESTION ONE: My h.264 encoding repeatedly creates new [output] buffers that I pass to the live555 side, using an adapted signalNewFrameData() function that goes through the scheduler trigger event. This is fine. But how can I figure out when the buffer is no longer in use? I need to return it to the pool, de-allocate it, or otherwise free it up. Otherwise, I'll leak memory usage indefinitely. QUESTION TWO: Right now, the video I see using VLC is truly horrible. I'm no h.264 or mpeg-4 expert, but it appears as if half my detail frames are missing. It's like watching satellite TV during a rain storm. This could be related to the above, but I don't think so due to a prior single-thread solution that looked the same but guaranteed appropriate buffer allocation/de-allocation. So why might my video look so bad, or how can I go about figuring out why it's so? Being sufficiently ignorant, I can only guess at how you may answer to assist me. One thing I guess about is a method to trace the h.264 encoded output to see if it's all getting though, I frames P frames we call frame for Ice cream, or whatever. Thanks very much. -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Jan 3 13:54:33 2013 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 4 Jan 2013 10:54:33 +1300 Subject: [Live-devel] How to release frame; How to improve horrible video quality In-Reply-To: References: Message-ID: > BACKGROUND: I adapted testH264VideoStreamer.cpp and DeviceSouce.cpp/hpp so that I could send a live video feed out over the network. I currently test using VLC. I generate h.264 encoded frames using a Microsoft Media Foundation (MF) back end to my existing MF streaming software. There is a call to env->taskScheduler().doEventLoop() in my adapted testH264VideoStreamer.cpp, which is in a separate thread. Thus, my primary thread handles the MF streaming, and this secondary thread makes sure doEventLoop() happens. 
I'm not quite sure what you mean by those last two sentences, but I hope you've read . Only one thread can call LIVE555 library code, including "doEventLoop()". The one exception to this is that a separate thread can call "triggerEvent()", as it appears you are doing. That's OK. > > QUESTION ONE: My h.264 encoding repeatedly creates new [output] buffers that I pass to the live555 side, using an adapted signalNewFrameData() function that goes through the scheduler trigger event. This is fine. But how can I figure out when the buffer is no longer in use? Once the buffer's data has been copied to the downstream object's buffer - e.g., after the call to "memmove()" on line 137 in the "DeviceSource.cpp" example code - then the buffer is no longer needed. > QUESTION TWO: Right now, the video I see using VLC is truly horrible. I'm no h.264 or mpeg-4 expert, but it appears as if half my detail frames are missing. It's like watching satellite TV during a rain storm. This could be related to the above, but I don't think so due to a prior single-thread solution that looked the same but guaranteed appropriate buffer allocation/de-allocation. So why might my video look so bad, or how can I go about figuring out why it's so? Being sufficiently ignorant, I can only guess at how you may answer to assist me. One thing I guess about is a method to trace the h.264 encoded output to see if it's all getting though, I frames P frames we call frame for Ice cream, or whatever. Unfortunately we can't help you specifically with VLC, as it's not our software. Instead, I suggest that you start by running our own RTSP client applications - "testRTSPClient", and then "openRTSP". In particular, "testRTSPClient" will tell you the sizes of each of the H.264 NAL units that it receives. You can then compare these to those that your encoder is generating. 
I suspect that some of your NAL units (for I-frames) may be too large for the streamer's output buffer (for RTP streaming). (However, if that were the case, you'd be seeing console warning messages about the need to increase "OutPacketBuffer::maxSize". You didn't mention seeing any warning messages like that...) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From chris at gotowti.com Thu Jan 3 16:30:49 2013 From: chris at gotowti.com (Chris Richardson (WTI)) Date: Thu, 3 Jan 2013 16:30:49 -0800 Subject: [Live-devel] RTSPServer crash Message-ID: <005701cdea12$bfb65180$3f22f480$@com> Hi Ross, I have seen a crash in RTSPServer that I can reproduce using QuickTime, and I have attached a patch that fixes it. This has been tested using version 2013-01-03 of the LIVE555 libraries with zero modifications. Here are steps to reproduce: 1. Set QuickTime to use HTTP as the transport. 2. Launch testOnDemandRTSPServer.exe 3. Use QuickTime to connect to one of the streams served by the testOnDemandRTSPServer. 4. Pause the stream in QuickTime. 5. Kill and restart the testOnDemandRTSPServer. 6. Play the stream again in QuickTime. It seems that after following this sequence, QuickTime will send a POST as the first command to the RTSP server. If this happens, the server crashes due to fOurServer.fClientConnectionsForHTTPTunneling being NULL in handleHTTPCmd_TunnelingPOST. The attached patch simply creates the fOurServer.fClientConnectionsForHTTPTunneling the same way handleHTTPCmd_TunnelingGET does. With the patch applied, the server returns a 405 error to QuickTime and the whole conversation can be retried. If this is not the correct way to fix this issue please let me know, as this is easy for me to reproduce and I can test a different fix if necessary. Thanks, Chris Richardson WTI -------------- next part -------------- An HTML attachment was scrubbed... 
URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: RTSPServer.cpp.patch Type: application/octet-stream Size: 827 bytes Desc: not available URL: From finlayson at live555.com Thu Jan 3 21:09:14 2013 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 4 Jan 2013 18:09:14 +1300 Subject: [Live-devel] RTSPServer crash In-Reply-To: <005701cdea12$bfb65180$3f22f480$@com> References: <005701cdea12$bfb65180$3f22f480$@com> Message-ID: <221705AA-803F-4069-8441-2BF02E3804FE@live555.com> Chris, Thanks for the report. I've just installed a new version - 2013.01.04 - that includes this fix. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From tbatra18 at gmail.com Thu Jan 3 07:28:08 2013 From: tbatra18 at gmail.com (Tarun Batra) Date: Thu, 3 Jan 2013 20:58:08 +0530 Subject: [Live-devel] buffers in Open RTSP client Message-ID: Hello Sir, In the openRTSP client, where are the buffers that receive the data coming from the RTSP server? I was trying to find them but was not able to. Thanks Tarun -------------- next part -------------- An HTML attachment was scrubbed... URL: From Reynel.Castelino at fike.com Fri Jan 4 03:01:30 2013 From: Reynel.Castelino at fike.com (Castelino, Reynel) Date: Fri, 4 Jan 2013 11:01:30 +0000 Subject: [Live-devel] re-streaming rtsp In-Reply-To: <2C55BDC7C60FF64D894FC9E84A0B6F2EC4E126@BLSUSAEXCH1.fike.corp> References: <2C55BDC7C60FF64D894FC9E84A0B6F2EC4E126@BLSUSAEXCH1.fike.corp> Message-ID: <2C55BDC7C60FF64D894FC9E84A0B6F2EC4E14F@BLSUSAEXCH1.fike.corp> Hi Ross, Maybe you can shed light on this. We receive RTSP streams from cameras (e.g. rtsp://10.0.0.27:554/codec-h264/channel0.sdp) that we wish to re-stream over an HTTP tunnel (and use a nonstandard HTTP port, e.g. 9100). Is this possible using the LIVE555 Media Server? We tried using the proxy server but it doesn't fit the bill. We have also looked at VLC (libvlc). 
On some forums you do mention that RTSP over HTTP is supported, but I think that's on the client side. Thanks Reynel Castelino Engineer, FIKE -------------- next part -------------- An HTML attachment was scrubbed... URL: From temp2010 at forren.org Fri Jan 4 04:49:55 2013 From: temp2010 at forren.org (temp2010 at forren.org) Date: Fri, 4 Jan 2013 07:49:55 -0500 Subject: [Live-devel] How to release frame; How to improve horrible video quality In-Reply-To: References: Message-ID: Thanks very much for your help, Ross. BACKGROUND: (CLARIFICATION ONLY) Indeed I only have one thread calling doEventLoop(). That's all I meant by my penultimate background sentence. The last background sentence points out a separate thread for MF, that's independent of Live555, and is in fact the thread taking advantage of the one exception within Live555: calling triggerEvent(). FOLLOWUP QUESTION: Having incorporated your answer to question one, my output is better and now only "very poor" rather than "horrible". My uncompressed stream is 640 x 480 x 30fps. It passes through NV12 format, which when considered as YUV I think has about 16 bits of info per pixel (8 bits for Y, 8 bits for UV, like YUV422). So an uncompressed 640 x 480 frame should have nearly 5Mbits of info. However, the compressed frames I'm sending to Live555 are generally 160kbits each (after an initial 384kbit frame). I realize you're not responsible for MF, but perhaps you can give your opinion, please. This suggests to me that my MF H.264 encoder is compressing roughly 5Mbits/160kbits = 31 times. Perhaps my remaining "very poor" quality is because this is too much compression. I reconfigured the encoder for 10 times the average bit rate, but this typical 160kbits/frame output size didn't really change. DO YOU AGREE that I ought to be able to increase encoder output quality by increasing average bit rate, and that the 160kbits per compressed frame should rise toward 5Mbits? Thanks again.
-------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Jan 4 11:06:38 2013 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 5 Jan 2013 08:06:38 +1300 Subject: [Live-devel] re-streaming rtsp In-Reply-To: <2C55BDC7C60FF64D894FC9E84A0B6F2EC4E14F@BLSUSAEXCH1.fike.corp> References: <2C55BDC7C60FF64D894FC9E84A0B6F2EC4E126@BLSUSAEXCH1.fike.corp> <2C55BDC7C60FF64D894FC9E84A0B6F2EC4E14F@BLSUSAEXCH1.fike.corp> Message-ID: <4CD9849E-03C9-4F38-87CB-6BF190E4E5E5@live555.com> > May be you can shed light on this. We receive RTSP streams from cameras ( e.g. rtsp://10.0.0.27:554/codec-h264/channel0.sdp) that we wish to re-stream over a http tunnel (and use a nonstandard http port e.g. 9100). Is this possible using Live555 Media Server. Tried using the proxy server but it doesn't fit the bill. Why not? The "LIVE555 Proxy Server" is intended just for situations like this. > On some forums There's only one 'forum' for supporting the "LIVE555 Streaming Media" code - this mailing list. > you do mention that RTSP over HTTP is supported but I think that's on client side. No, we implement RTSP-over-HTTP tunneling both for RTSP clients and RTSP servers (including our RTSP proxy server). The "LIVE555 Proxy Server" application (in the "proxyServer" directory) sets up an HTTP server (for optional RTSP-over-HTTP tunneling) by attempting to use three port numbers that are typically used for HTTP: 80, 8000, and 8080. (Note the code in "live555ProxyServer.cpp".) I suggest that you continue to use one of these three ports, if possible (because they are most likely to pass through firewalls). However you could, of course, modify the code to use some other port (like 9100) instead. (I note, however, that port 9100 has been registered - in the Unix /etc/services file - for use by what appears to be some HP printing service.) Ross Finlayson Live Networks, Inc.
http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From renatomauro at libero.it Fri Jan 4 05:47:06 2013 From: renatomauro at libero.it (Renato MAURO (Libero)) Date: Fri, 04 Jan 2013 14:47:06 +0100 Subject: [Live-devel] How to release frame; How to improve horrible video quality In-Reply-To: References: Message-ID: <50E6DD5A.9020008@libero.it> NV12 is similar to I420 (or YUV420, if you prefer), so it is 12 bits per pixel, 8 for luminance and 4 for CrCb (or U and V if you prefer). See http://www.fourcc.org/yuv.php#NV12. Obviously, "4 bits for CrCb" means that each byte is used for 4 pixels (a 2x2 quad), and so NV12 is a planar-only format. So: 640 x 480 x 1.5 = 450 KiB = 3.51 Mibit uncompressed => 3.51 Mibit / 160 Kibit = 22.5 times. I suggest rendering the stream for the luminance plane only (like old TV sets!!): if you get a good output, the problem is the way the renderer works on the U and V plane. Regards, Renato On 04/01/2013 13.49, temp2010 at forren.org wrote: > Thanks very much for your help, Ross. > > BACKGROUND: (CLARIFICATION ONLY) Indeed I only have one thread calling > doEventLoop(). That's all I meant by my penultimate background > sentence. The last background sentence points out a separate thread > for MF, that's independent of Live555, and is in fact the thread > taking advantage of the one exception within Live555: calling > triggerEvent(). > > FOLLOWUP QUESTION: Having incorporated your question one answer, my > output is better and now only "very poor" rather than "horrible". My > uncompressed stream is 640 x 480 x 30fps. It passes through NV12 > format, which when considered as YUV I think has about 16 bits of info > per pixel (8 bits for Y, 8 bits for UV, like YUV422). So an > uncompressed 640 x 480 frame should have nearly 5Mbits of info. > However, the compressed frames I'm sending to Live555 are generally > 160kbits each (after an initial 384kbit frame).
I realize you're not > responsible for MF, but perhaps you can give your opinion, > please. This suggests to me that my MF H.264 encoder is compressing > roughly 5Mbits/160kbits = 31 times. Perhaps my remaining "very poor" > quality is because this is too much compression. I reconfigured the > encoder for 10 times the average bit rate, but this typical > 160kbits/frame output size didn't really change. DO YOU AGREE that I > ought to be able to increase encoder output quality by increasing > average bit rate, and that the 160kbits per compressed frame should > rise toward 5Mbits? > > Thanks again. > > > > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From chris at gotowti.com Fri Jan 4 11:35:39 2013 From: chris at gotowti.com (Chris Richardson (WTI)) Date: Fri, 4 Jan 2013 11:35:39 -0800 Subject: [Live-devel] RTSPServer Access Control Change Request Message-ID: <009e01cdeab2$aed18f60$0c74ae20$@com> Hi Ross, First, thanks for incorporating my previous fix to RTSPServer so quickly, and sorry to send another change request so soon. I would like to propose an additional virtual function to be added to RTSPServer: specialClientUserAccessCheck. This function would allow an RTSPServer subclass to easily perform user-specific access checks on individual streams, after performing normal digest authentication. Note that this function would only be used to further restrict access to a given stream, not to gain any additional access that was not already present. The attached patch shows the requested changes, which I hope you will accept. If you do not like the idea or would like me to restructure it please let me know. Thanks, Chris Richardson WTI -------------- next part -------------- An HTML attachment was scrubbed... 
URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: RTSPServerAccessCheck.patch Type: application/octet-stream Size: 2637 bytes Desc: not available URL: From CERLANDS at arinc.com Fri Jan 4 11:57:38 2013 From: CERLANDS at arinc.com (Erlandsson, Claes P (CERLANDS)) Date: Fri, 4 Jan 2013 19:57:38 +0000 Subject: [Live-devel] Windows make-files broken Message-ID: When downloading and building the latest version (2013-01-04) on Windows I ran into problems. I get a fatal error for all projects. Example for liveMedia: liveMedia.mak(54) : fatal error U1036: syntax error : too many names to left of '=' Stop. I noticed Makefile.tail has changed recently and it seems like the Windows compiler (VS2008 & 2010 at least) doesn't like these lines (only PREFIX for testProgs & mediaServer): PREFIX ?= /usr/local LIBDIR ?= $(PREFIX)/lib When commenting out those lines it compiles just fine. This is of course just a hack to make it compile. Not sure how to properly include the recently added makefile install section on Windows. /Claes -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: smime.p7s Type: application/x-pkcs7-signature Size: 5740 bytes Desc: not available URL: From temp2010 at forren.org Fri Jan 4 12:08:29 2013 From: temp2010 at forren.org (temp2010 at forren.org) Date: Fri, 4 Jan 2013 15:08:29 -0500 Subject: [Live-devel] How to release frame; How to improve horrible video quality In-Reply-To: <50E6DD5A.9020008@libero.it> References: <50E6DD5A.9020008@libero.it> Message-ID: Renato, How would one go about rendering the stream for only the luminance plane? Do you mean going into my MF H.264 encoder? You must. Otherwise I don't know how to analyze the compressed output to omit the color. And for the MF H.264 encoder, I don't think there's any way to tell it to do only Y. Is this what you were suggesting?
If so, then how would one get around the problem I just indicated? If not, what did you really mean. On Fri, Jan 4, 2013 at 8:47 AM, Renato MAURO (Libero) wrote: > NV12 is similar to I420 (or YUV420, if you prefer), so it is 12 bit per > pixel, 8 for luminance and 4 for CrCb (or U and V if you prefer). See > http://www.fourcc.org/yuv.php#NV12. > Obviously, "4 bits for CrCb" means that each byte is used for 4 pixels (a > 2x2 quad), and so NV12 is a planar only format. > > So: 640 x 480 x 1,5 = 450 kBi = 3.51 Mbi uncompressed => 3.51 Mbi / 160 > kbi = 22,5 times. > > I suggest to render the stream for the luminance plane only (like old TV > sets!!): if you get a good output, the problem is the way the renderer > works on the U and V plane. > > > Regards, > > Renato > > > Il 04/01/2013 13.49, temp2010 at forren.org ha scritto: > > Thanks very much for your help, Ross. > > BACKGROUND: (CLARIFICATION ONLY) Indeed I only have one thread calling > doEventLoop(). That's all I meant by my penultimate background sentence. > The last background sentence points out a separate thread for MF, that's > independent of Live555, and is in fact the thread taking advantage of the > one exception within Live555: calling triggerEvent(). > > FOLLOWUP QUESTION: Having incorporated your question one answer, my > output is better and now only "very poor" rather than "horrible". My > uncompressed stream is 640 x 480 x 30fps. It passes through NV12 format, > which when considered as YUV I think has about 16 bits of info per pixel (8 > bits for Y, 8 bits for UV, like YUV422). So an uncompressed 640 x 480 > frame should have nearly 5Mbits of info. However, the compressed frames > I'm sending to Live555 are generally 160kbits each (after an initial > 384kbit frame). I realize you're not responsible for MF, but perhaps you > can give your opinion, please. This suggests to me that my MF H.264 encoder > is compressing roughly 5Mbits/160kbits = 31 times. 
Perhaps my remaining > "very poor" quality is because this is too much compression. I > reconfigured the encoder for 10 times the average bit rate, but this > typical 160kbits/frame output size didn't really change. DO YOU AGREE that > I ought to be able to increase encoder output quality by increasing average > bit rate, and that the 160kbits per compressed frame should rise toward > 5Mbits? > > Thanks again. > > > > > > _______________________________________________ > live-devel mailing listlive-devel at lists.live555.comhttp://lists.live555.com/mailman/listinfo/live-devel > > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Jan 4 12:11:02 2013 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 5 Jan 2013 09:11:02 +1300 Subject: [Live-devel] RTSPServer Access Control Change Request In-Reply-To: <009e01cdeab2$aed18f60$0c74ae20$@com> References: <009e01cdeab2$aed18f60$0c74ae20$@com> Message-ID: I'm confused. There's already a "specialClientAccessCheck()" virtual function (defined in "liveMedia/include/RTSPServer.hh") that I added in April 2007 in order to allow special-purpose access control - beyond the usual digest authentication mechanism. Is that not sufficient for your purpose? I don't plan on adding yet another virtual function like this. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Fri Jan 4 13:00:12 2013 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 5 Jan 2013 10:00:12 +1300 Subject: [Live-devel] Windows make-files broken In-Reply-To: References: Message-ID: > I noticed Makefile.tail has changed recently and it seems like the Windows compiler (VS2008 & 2010 at least) doesn't like these lines (only PREFIX for testProgs & mediaServer): > PREFIX ?= /usr/local > LIBDIR ?= $(PREFIX)/lib Arggh! Those lines were suggested by the Debian guy, who (I'm guessing) wanted to make it possible for PREFIX and/or LIBDIR to be defined as something else on the compile line. I should have guessed that they would likely break in at least some versions of Windows. What I'll do - in the next release - is change those lines to PREFIX = /usr/local LIBDIR = /usr/local/lib but move them to the "Makefile.head" file (in each library directory), so that if anyone wants to redefine those constants, they can do so in their "config." file. (Of course, those constants are actually used only for the "install" Makefile target, which is not intended for use in Windows anyway.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From chris at gotowti.com Fri Jan 4 13:26:15 2013 From: chris at gotowti.com (Chris Richardson (WTI)) Date: Fri, 4 Jan 2013 13:26:15 -0800 Subject: [Live-devel] RTSPServer Access Control Change Request In-Reply-To: References: <009e01cdeab2$aed18f60$0c74ae20$@com> Message-ID: <00c501cdeac2$21d7fad0$6587f070$@com> Hi Ross, The specialClientAccessCheck function can be used to test the IP address/port against the stream suffix, but not the user name. If a subclass needs to check the user name, it will have to parse the request itself, which usually leads to error prone code duplication. 
Then, in the event that the subclass decides to allow the request, the RTSPServer must parse the request once again and then potentially deny the user access anyways. It just doesn't seem that useful to me, with the exception of providing IP address/port blacklisting. I briefly considered changing the signature of specialClientAccessCheck to include an optional parameter for the user name, but that would not only change the semantics of the operation (it would have to be called after parsing the request), it would also obviously break existing subclasses silently. I will try to think of another way to perform user-specific access control without disturbing the code too much. Also, if anybody else has ideas, I am all ears and happy to implement whatever is decided on. Otherwise I will just carry this patch. Thanks, Chris Richardson WTI From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Friday, January 04, 2013 12:11 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] RTSPServer Access Control Change Request I'm confused. There's already a "specialClientAccessCheck()" virtual function (defined in "liveMedia/include/RTSPServer.hh") that I added in April 2007 in order to allow special-purpose access control - beyond the usual digest authentication mechanism. Is that not sufficient for your purpose? I don't plan on adding yet another virtual function like this. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Fri Jan 4 15:42:51 2013 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 5 Jan 2013 12:42:51 +1300 Subject: [Live-devel] RTSPServer Access Control Change Request In-Reply-To: <00c501cdeac2$21d7fad0$6587f070$@com> References: <009e01cdeab2$aed18f60$0c74ae20$@com> <00c501cdeac2$21d7fad0$6587f070$@com> Message-ID: <54E33551-3B18-4793-BA1B-6CF3E1B731AB@live555.com> Grumble... I don't particularly like this, but I didn't particularly like the "specialClientAccessCheck()" hack either. But a pair of related hacks isn't much worse than one, so I'll go ahead and add your patch to the next release of the software. (At some point the whole API for custom server authentication probably needs to be rethought...) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From bdrung at debian.org Sat Jan 5 01:53:18 2013 From: bdrung at debian.org (Benjamin Drung) Date: Sat, 05 Jan 2013 10:53:18 +0100 Subject: [Live-devel] Windows make-files broken In-Reply-To: References: Message-ID: <1357379598.3069.2.camel@deep-thought> On Saturday, 05.01.2013, at 10:00 +1300, Ross Finlayson wrote: > > I noticed Makefile.tail has changed recently and it seems like the > > Windows compiler (VS2008 & 2010 at least) doesn't like these lines > > (only PREFIX for testProgs & mediaServer): > > PREFIX ?= /usr/local > > LIBDIR ?= $(PREFIX)/lib > > > Arggh! Those lines were suggested by the Debian guy, who (I'm > guessing) wanted to make it possible for PREFIX and/or LIBDIR to be > defined as something else on the compile line. I should have guessed > that they would likely break in at least some versions of Windows. > > > What I'll do - in the next release - is change those lines to > > > PREFIX = /usr/local > LIBDIR = /usr/local/lib Changing ?= to = has no drawback for me. Overwriting these variables by calling "make PREFIX=/usr" still works (I have tested it).
Can you please leave LIBDIR = $(PREFIX)/lib instead of hard-coding the libdir? If a user sets PREFIX to /usr, the default libdir should be /usr/lib instead of /usr/local/lib. -- Benjamin Drung Debian & Ubuntu Developer From finlayson at live555.com Sat Jan 5 02:30:22 2013 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 5 Jan 2013 23:30:22 +1300 Subject: [Live-devel] Windows make-files broken In-Reply-To: <1357379598.3069.2.camel@deep-thought> References: <1357379598.3069.2.camel@deep-thought> Message-ID: <2D5AF6F6-0298-474E-969B-BEE470C0C96E@live555.com> >> PREFIX = /usr/local >> LIBDIR = /usr/local/lib > > Changing ?= to = has no drawback for me. Overwriting these variables by > calling "make PREFIX=/usr" still works (I have tested it). > > Can you please leave LIBDIR = $(PREFIX)/lib instead of hard-coding the > libdir? OK, I'll make that change in the next release. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From bdrung at debian.org Sat Jan 5 03:23:27 2013 From: bdrung at debian.org (Benjamin Drung) Date: Sat, 05 Jan 2013 12:23:27 +0100 Subject: [Live-devel] Missing LDFLAGS Message-ID: <1357385007.3069.6.camel@deep-thought> Hi, The build system should honor LDFLAGS for the shared library (like it does for the static library). linker-flags.patch is attached. There are some trailing spaces in the Makefiles. Some variables in config.* have trailing spaces. The Makefiles should not rely on having a space at the end of a variable. strip-trailing-spaces.patch is attached. -- Benjamin Drung Debian & Ubuntu Developer -------------- next part -------------- A non-text attachment was scrubbed... Name: linker-flags.patch Type: text/x-patch Size: 616 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... 
Name: strip-trailing-spaces.patch Type: text/x-patch Size: 23945 bytes Desc: not available URL: From renatomauro at libero.it Sat Jan 5 04:56:00 2013 From: renatomauro at libero.it (Renato MAURO (Libero)) Date: Sat, 05 Jan 2013 13:56:00 +0100 Subject: [Live-devel] How to release frame; How to improve horrible video quality In-Reply-To: References: <50E6DD5A.9020008@libero.it> Message-ID: <50E822E0.50302@libero.it> >Do you mean going into my MF H.264 encoder? You must. Oops, no, not really! Let me rewrite my suggestion (the previous one applies when working at the receiver side): *grab* the stream for the luminance plane only (like old TV sets!!): if you get a good output, the problem is the way the *encoder* works on the U and V plane (or what you are guessing it expects as input). So, I mean memset all the U and V bytes of the grabber output to 0x80 (see http://www.vidcheck.com/whitepapers.asp to know why 0x80 and not 0x00, which would give the default green color). Obviously this test works only if the decoder input must be a planar format; in case of a packed format (YUV422), it will fail and you have to manage what the grabber outputs and what the encoder accepts. A more verbose description follows. Regards, Renato A) I'll try to describe some basics about a video stream chain, although it's very easy to find information on the Internet. Usually the chain is made of:
1) a grabber, which gives complete frames in some FOURCC format (RGB, YUV422, YUV420, ...);
2) an encoder, which, for a given run, accepts one of the FOURCC formats and deflates the frames (in this case the MF encoder);
3) a streamer, which sends data to a receiver (in this case Live555);
4) a stream receiver, which receives data (in this case Live555 embedded in VLC);
5) a decoder, which inflates data (in this case FFMPEG embedded in VLC) and gives frames in the same FOURCC format as point 2;
6) a renderer, which outputs the images on a video card (in this case an output class embedded in VLC (e.g.
on Windows a DirectX surface for the same FOURCC format as point 5)); Sometimes - though it's a badly performing design (heavy CPU load) - the chain also has one or both of the following: 1-bis) a FOURCC format converter to adapt the grabber output to the encoder input; 5-bis) a FOURCC format converter to adapt the decoder output to the renderer input. B) Now, if you are right when you say that your grabber outputs NV12 format frames (point 1), please check if your encoder accepts this format or if it expects YUV422 (point 2). If the format does not match, a solution is implementing point 1-bis; but it's better to change the grabber output or the encoder input configuration (or the encoder altogether). C) Another check is to verify that the decoder output (point 5) matches the encoder input (point 2), regardless of the FOURCC format of the data you feed into the encoder. Since you use VLC, please read its documentation to know how to save each decoded frame. D) Another check is to verify that the receiver output (point 4) matches the streamer input (point 3), using openRTSP at the receiver side (not VLC). On 04/01/2013 21.08, temp2010 at forren.org wrote: > Renato, > > How would one go about rendering the stream for only the luminance > plane? Do you mean going into my MF H.264 encoder? You must. > Otherwise I don't know how to analyze the compressed output to omit > the color. And for the MF H.264 encoder, I don't think there's any > way to tell it to do only Y. Is this what you were suggesting? If > so, then how would one get around the problem I just indicated? If > not, what did you really mean. > > > On Fri, Jan 4, 2013 at 8:47 AM, Renato MAURO (Libero) > > wrote: > > NV12 is similar to I420 (or YUV420, if you prefer), so it is 12 > bit per pixel, 8 for luminance and 4 for CrCb (or U and V if you > prefer). See http://www.fourcc.org/yuv.php#NV12.
> Obviously, "4 bits for CrCb" means that each byte is used for 4 > pixels (a 2x2 quad), and so NV12 is a planar only format. > > So: 640 x 480 x 1,5 = 450 kBi = 3.51 Mbi uncompressed => 3.51 Mbi > / 160 kbi = 22,5 times. > > I suggest to render the stream for the luminance plane only (like > old TV sets!!): if you get a good output, the problem is the way > the renderer works on the U and V plane. > > > Regards, > > Renato > > > Il 04/01/2013 13.49, temp2010 at forren.org > ha scritto: >> Thanks very much for your help, Ross. >> >> BACKGROUND: (CLARIFICATION ONLY) Indeed I only have one thread >> calling doEventLoop(). That's all I meant by my penultimate >> background sentence. The last background sentence points out a >> separate thread for MF, that's independent of Live555, and is in >> fact the thread taking advantage of the one exception within >> Live555: calling triggerEvent(). >> >> FOLLOWUP QUESTION: Having incorporated your question one answer, >> my output is better and now only "very poor" rather than >> "horrible". My uncompressed stream is 640 x 480 x 30fps. It >> passes through NV12 format, which when considered as YUV I think >> has about 16 bits of info per pixel (8 bits for Y, 8 bits for UV, >> like YUV422). So an uncompressed 640 x 480 frame should have >> nearly 5Mbits of info. However, the compressed frames I'm >> sending to Live555 are generally 160kbits each (after an initial >> 384kbit frame). I realize you're not responsible for MF, but >> perhaps you can give your opinion, please. This suggests to me >> that my MF H.264 encoder is compressing roughly 5Mbits/160kbits = >> 31 times. Perhaps my remaining "very poor" quality is because >> this is too much compression. I reconfigured the encoder for 10 >> times the average bit rate, but this typical 160kbits/frame >> output size didn't really change. 
DO YOU AGREE that I ought to >> be able to increase encoder output quality by increasing average >> bit rate, and that the 160kbits per compressed frame should rise >> toward 5Mbits? >> >> Thanks again. >> >> >> >> >> >> _______________________________________________ >> live-devel mailing list >> live-devel at lists.live555.com >> http://lists.live555.com/mailman/listinfo/live-devel > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sat Jan 5 06:11:47 2013 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 6 Jan 2013 03:11:47 +1300 Subject: [Live-devel] Missing LDFLAGS In-Reply-To: <1357385007.3069.6.camel@deep-thought> References: <1357385007.3069.6.camel@deep-thought> Message-ID: <3F174F51-FA4D-475E-8F86-B51C4E9D3448@live555.com> I'll make the first suggested change (adding LDFLAGS), but not the second. The "$(LIBRARY_LINK)$@" has to remain as is, because - for some configurations (e.g., "sunos") - there can't be a space there. Also, I'm not going to "fix what's not broken". Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From bdrung at debian.org Sat Jan 5 15:43:26 2013 From: bdrung at debian.org (Benjamin Drung) Date: Sun, 06 Jan 2013 00:43:26 +0100 Subject: [Live-devel] Missing LDFLAGS In-Reply-To: <3F174F51-FA4D-475E-8F86-B51C4E9D3448@live555.com> References: <1357385007.3069.6.camel@deep-thought> <3F174F51-FA4D-475E-8F86-B51C4E9D3448@live555.com> Message-ID: <1357429406.22672.1.camel@deep-thought> On Sunday, 06.01.2013, at 03:11 +1300, Ross Finlayson wrote: > I'll make the first suggested change (adding LDFLAGS), but not the > second. The "$(LIBRARY_LINK)$@" has to remain as is, because - for > some configurations (e.g., "sunos") - there can't be a space there. > Also, I'm not going to "fix what's not broken". Thanks. I have attached a smaller, less intrusive patch for the trailing space removal. These removals have no bad side effect. -- Benjamin Drung Debian & Ubuntu Developer -------------- next part -------------- A non-text attachment was scrubbed... Name: strip-trailing-spaces-v2.patch Type: text/x-patch Size: 5677 bytes Desc: not available URL: From Reynel.Castelino at fike.com Mon Jan 7 14:26:55 2013 From: Reynel.Castelino at fike.com (Castelino, Reynel) Date: Mon, 7 Jan 2013 22:26:55 +0000 Subject: [Live-devel] re-streaming rtsp In-Reply-To: <4CD9849E-03C9-4F38-87CB-6BF190E4E5E5@live555.com> References: <2C55BDC7C60FF64D894FC9E84A0B6F2EC4E126@BLSUSAEXCH1.fike.corp> <2C55BDC7C60FF64D894FC9E84A0B6F2EC4E14F@BLSUSAEXCH1.fike.corp> <4CD9849E-03C9-4F38-87CB-6BF190E4E5E5@live555.com> Message-ID: <2C55BDC7C60FF64D894FC9E84A0B6F2EC4E1F6@BLSUSAEXCH1.fike.corp> Hi Ross, I compile and run the Proxy Server from the command line on Windows (Win 7, 64-bit). After a few minutes the app crashes with a "BasicTaskScheduler::SingleStep():select() fails: No error". Any clue how to prevent this?
Thanks Reynel From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Friday, January 04, 2013 2:07 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] re-streaming rtsp May be you can shed light on this. We receive RTSP streams from cameras ( e.g. rtsp://10.0.0.27:554/codec-h264/channel0.sdp) that we wish to re-stream over a http tunnel (and use a nonstandard http port e.g. 9100). Is this possible using Live555 Media Server. Tried using the proxy server but it doesn't fit the bill. Why not? The "LIVE555 Proxy Server" is intended just for situations like this. On some forums There's only one 'forum' for supporting the "LIVE555 Streaming Media" code - this mailing list. you do mention that RTSP over HTTP is supported but I think that's on client side. No, we implement RTSP-over-HTTP tunneling both for RTSP clients and RTSP servers (including our RTSP proxy server). The "LIVE555 Proxy Server" application (in the "proxyServer" directory) sets up a HTTP server (for optional RTSP-over-HTTP tunneling) by attempting to use three port numbers that are typically used for HTTP: 80, 8000, and 8080. (Note the code in "live555ProxyServer.cpp".) I suggest that you continue to use one of these three ports, if possible (because they are most likely to pass through firewalls). However you could, of course, modify the code to use some other port (like 9100) instead. (I note, however, that port 9100 has been registered - in the Unix /etc/services file - for use by what appears to be some HP printing service.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Mon Jan 7 14:42:04 2013 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 8 Jan 2013 11:42:04 +1300 Subject: [Live-devel] re-streaming rtsp In-Reply-To: <2C55BDC7C60FF64D894FC9E84A0B6F2EC4E1F6@BLSUSAEXCH1.fike.corp> References: <2C55BDC7C60FF64D894FC9E84A0B6F2EC4E126@BLSUSAEXCH1.fike.corp> <2C55BDC7C60FF64D894FC9E84A0B6F2EC4E14F@BLSUSAEXCH1.fike.corp> <4CD9849E-03C9-4F38-87CB-6BF190E4E5E5@live555.com> <2C55BDC7C60FF64D894FC9E84A0B6F2EC4E1F6@BLSUSAEXCH1.fike.corp> Message-ID: <12C33335-3DA8-4320-9213-2BCFB0CA4232@live555.com> > I compile and run the Proxy Server using command line on windows ( win 7 64). After few minutes the app crashes with a "BasicTaskScheduler::SingleStep():select() fails: No error". Any clue how to prevent this. You could try using an operating system other than Windows (e.g., Linux or FreeBSD). (Generally speaking, Windows is ill-suited for running server software.) But other than that, unfortunately I don't know what's wrong - in part because you haven't provided much information. So please download and build the latest version of the software (which prints more diagnostic output when this error occurs). Also, run the "live555ProxyServer" application with the -V flag (to print additional diagnostic output - as is recommended on our web site). Please post the complete diagnostic output to this mailing list. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed...
URL: From chris at gotowti.com Mon Jan 7 14:48:29 2013 From: chris at gotowti.com (Chris Richardson (WTI)) Date: Mon, 7 Jan 2013 14:48:29 -0800 Subject: [Live-devel] RTSPServer Access Control Change Request In-Reply-To: <54E33551-3B18-4793-BA1B-6CF3E1B731AB@live555.com> References: <009e01cdeab2$aed18f60$0c74ae20$@com> <00c501cdeac2$21d7fad0$6587f070$@com> <54E33551-3B18-4793-BA1B-6CF3E1B731AB@live555.com> Message-ID: <015001cded29$1e0b1fe0$5a215fa0$@com> Hi Ross, Thank you for accepting the patch and for cleaning it up as well. I appreciate not having to apply it each time I update. Chris Richardson WTI From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Friday, January 04, 2013 3:43 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] RTSPServer Access Control Change Request Grumble... I don't particularly like this, but I didn't particularly like the "specialClientAccessCheck()" hack either. But a pair of related hacks isn't much worse than one, so I'll go ahead and add your patch to the next release of the software. (At some point the whole API for custom server authentication probably needs to be rethought...) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From ashfaque at iwavesystems.com Mon Jan 7 23:39:35 2013 From: ashfaque at iwavesystems.com (Ashfaque) Date: Tue, 8 Jan 2013 13:09:35 +0530 Subject: [Live-devel] Live555 test on WEC7 Message-ID: Hi Ross, I have ported the Live555 RTSP streaming library to Windows Embedded Compact 7 (WEC7) Platform Builder and am testing it with the H264 test application (testH264VideoStreamer.cpp) provided in the package. I have installed the SDK for the Freescale i.MX6 SABRE SDP WEC7 platform. The compilation and linking are successful. 
But when I run the application, the IP address in the RTSP URL provided by Live555 is different from the board's IP address. Streaming does not work when I try to connect with VLC Player. I have tested a similar application on a Linux machine, where both IP addresses are the same and streaming works fine. I am able to ping the board with its own IP address, but not with the IP address provided by Live555. Why is Live555 reading the incorrect IP address for the streaming URL, and how can I configure it correctly? Does any multicast routing need to be enabled? I am testing the streaming with VLC Player running on a Windows XP PC connected to the same network. Please let me know the cause. Regards, Ashfaque -------------- next part -------------- An HTML attachment was scrubbed... URL: From kingaceck at 163.com Mon Jan 7 21:32:17 2013 From: kingaceck at 163.com (kingaceck) Date: Tue, 8 Jan 2013 13:32:17 +0800 Subject: [Live-devel] two network adapters issue Message-ID: <201301081332126252488@163.com> Hi, There are two network adapters in my server. When I run live555 on this server, how will live555 decide which IP to use? Can I assign a fixed IP to live555? Thank you very much. 2013-01-08 kingaceck -------------- next part -------------- An HTML attachment was scrubbed... URL: From kritisinghal23 at gmail.com Mon Jan 7 23:19:18 2013 From: kritisinghal23 at gmail.com (kriti singhal) Date: Tue, 8 Jan 2013 12:49:18 +0530 Subject: [Live-devel] testMPEG2TransportStreamer Query Message-ID: Hello Sir, I am new to the LIVE555 media libraries, so let me first explain what I am doing: I am using your *testMPEG2TransportStreamer* to stream a .ts file (without implementing an internal RTSP server in *testMPEG2TransportStreamer*) and using *testOnDemandRTSPServer* to receive the stream. 
I just want to know: when I look at Wireshark, the packets are sent using UDP from *testMPEG2TransportStreamer*. Can these packets be sent using TCP from *testMPEG2TransportStreamer*, without implementing an internal RTSP server in *testMPEG2TransportStreamer*? If yes, what would I have to do? And if no, please also tell me how to send packets over TCP from *testMPEG2TransportStreamer*, as I want the streaming over TCP. Thanks. -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Jan 8 19:12:38 2013 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 8 Jan 2013 19:12:38 -0800 Subject: [Live-devel] Live555 test on WEC7 In-Reply-To: References: Message-ID: > Why is Live555 reading the incorrect IP address It's probably not an 'incorrect' address, but just one that's different from the one that you wanted. > for the streaming URL, and how can I configure it correctly? > Does any multicast routing need to be enabled? Yes. The easiest way to ensure that one particular interface is chosen is to make that interface the one for which multicast routing is enabled - i.e., the interface that has a route for 224.0.0.0/4. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From sam.lu at nokia.com Wed Jan 9 01:22:51 2013 From: sam.lu at nokia.com (sam.lu at nokia.com) Date: Wed, 9 Jan 2013 09:22:51 +0000 Subject: [Live-devel] Can live555 support MPEG Dash? Message-ID: <9D1EC863A877FF49A472A1404913CFAA4EA84E@008-AM1MPN1-073.mgdnok.nokia.com> Hello there: Can I use live555 to set up an MPEG-DASH server? If yes, please provide some information about how to do this. Otherwise, could you please also tell me which tool I can use to set up an MPEG-DASH server? Many thanks! BR Sam Lu. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Wed Jan 9 01:52:02 2013 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 9 Jan 2013 01:52:02 -0800 Subject: [Live-devel] Can live555 support MPEG Dash? In-Reply-To: <9D1EC863A877FF49A472A1404913CFAA4EA84E@008-AM1MPN1-073.mgdnok.nokia.com> References: <9D1EC863A877FF49A472A1404913CFAA4EA84E@008-AM1MPN1-073.mgdnok.nokia.com> Message-ID: <3C2CF617-5ACE-4D34-8BF3-DFD5ECAC05D1@live555.com> > Can I use live555 to setup a MPEG Dash server? No, we support "HTTP Live Streaming" (see ), but not MPEG-DASH. Sorry. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From CERLANDS at arinc.com Wed Jan 9 14:41:47 2013 From: CERLANDS at arinc.com (Erlandsson, Claes P (CERLANDS)) Date: Wed, 9 Jan 2013 22:41:47 +0000 Subject: [Live-devel] Unhandled exception during TEARDOWN Message-ID: I experience an issue with TEARDOWN that I can't resolve. I've attached an example (testRTSPClientCycleStreams.cpp) that for me results in an Unhandled exception after a while; it usually happens after 10 minutes to an hour. The example is the testRTSPClient example with some modifications to cycle between two streams every 10s. Summary of the changes: - Added 10s delayed task in continueAfterPLAY() that calls streamTimerHandler(). - streamTimerHandler() shuts down the running client and starts a new one. Here are two hardcoded URL's that it switches between. Will most likely work as well using one stream; just easier to spot different streams in the output with two. - Broken up the last part of shutdownStream() and put it in cleanUpStream(). The sendTeardownCommand() function now uses the continueAfterTEARDOWN response handler. Using the response handler appears to be what is causing the exception. Never seen the exception when passing NULL. - Added cleanUpStream() that just performs the last cleanup. 
If you have the time to compile and test, it should be enough to change the URL's on line 391. The exception appears to occur deep inside the event loop in standard libraries and I haven't been able to make anything out of it. Would be great if someone can confirm it happens for them or have any input on how to fix it. Also, of course, if you see any issues with the code, please let me know. I run on a Windows client and have tested with Cisco VSM6 & VSM7 servers. /Claes -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: testRTSPClientCycleStreams.cpp URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: smime.p7s Type: application/x-pkcs7-signature Size: 5740 bytes Desc: not available URL: From finlayson at live555.com Wed Jan 9 19:43:38 2013 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 9 Jan 2013 19:43:38 -0800 Subject: [Live-devel] two network adapters issue In-Reply-To: <201301081332126252488@163.com> References: <201301081332126252488@163.com> Message-ID: <3D868F27-D740-4A17-8AE6-969A8A5FBD13@live555.com> > There are two network adapters in my server.When I run the live555 on this server, how will the live555 decide which IP to use ? See http://lists.live555.com/pipermail/live-devel/2012-June/015361.html Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Jan 9 19:56:59 2013 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 9 Jan 2013 19:56:59 -0800 Subject: [Live-devel] testMPEG2TransportStreamer Query In-Reply-To: References: Message-ID: <8BFB2B4A-5EE0-4BAA-9BF6-F96C2FFE0557@live555.com> > I just want to know when i saw the wireshark the packets are sent using UDP from testMPEG2TransportStreamer can these packets be sent using TCP from testMPEG2TransportStreamer No. Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From tbatra18 at gmail.com Wed Jan 9 05:53:58 2013 From: tbatra18 at gmail.com (Tarun Batra) Date: Wed, 9 Jan 2013 19:23:58 +0530 Subject: [Live-devel] Proxy Server Query Message-ID: Hello Sir, I made a streaming application in which I am streaming a transport stream file through your program testMPEG2TransportStreamer.cpp, with an RTSP server implemented in it, and then I am receiving the stream using the proxy server. In the documentation of the proxy server you mention using the "-t" option "to request that each 'back-end' RTSP server stream RTP and RTCP data packets over its TCP connection, instead of using UDP packets." But after giving the URL like "-t rtsp://192.168.15.192/streamname" to the proxy server, I saw in Wireshark that "testMPEG2TransportStreamer" is still sending UDP packets. Then I made a change in "testMPEG2TransportStreamer": I replaced this line of code:
sms->addSubsession(PassiveServerMediaSubsession::createNew(*videoSink, rtcp)); // original
with this line of code:
sms->addSubsession(MPEG2TransportUDPServerMediaSubsession::createNew(*env, destinationAddressStr, rtpPortNum, inputStreamIsRawUDP));
Still I am not able to see TCP packets streaming from "testMPEG2TransportStreamer". What is the issue, and what should be done to resolve it? Thanks Tarun -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From kritisinghal23 at gmail.com Wed Jan 9 22:02:17 2013 From: kritisinghal23 at gmail.com (kriti singhal) Date: Thu, 10 Jan 2013 11:32:17 +0530 Subject: [Live-devel] testMPEG2TransportStreamer Query In-Reply-To: <8BFB2B4A-5EE0-4BAA-9BF6-F96C2FFE0557@live555.com> References: <8BFB2B4A-5EE0-4BAA-9BF6-F96C2FFE0557@live555.com> Message-ID: Sir, But here is a puzzle: I made my own RTSP client, and when the client asks the testOnDemandRTSPServer for the stream using the RTSP-over-HTTP tunnelling port, the testOnDemandRTSPServer does stream TCP packets. Why is that? On Thu, Jan 10, 2013 at 9:26 AM, Ross Finlayson wrote: > I just want to know: when I look at Wireshark, the packets are sent using > UDP from *testMPEG2TransportStreamer*. Can these packets be sent using TCP > from *testMPEG2TransportStreamer*? > > > No. > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From yogesh_marathe at ti.com Thu Jan 10 00:47:31 2013 From: yogesh_marathe at ti.com (Marathe, Yogesh) Date: Thu, 10 Jan 2013 08:47:31 +0000 Subject: [Live-devel] Live555 performance numbers Message-ID: <6EFDC7CD849764409289BF330AF9704A3E9EC167@DBDE01.ent.ti.com> Hi, I want to use live555 for receiving H.264 encoded video streams over an IP network through RTP. Is there any information about performance numbers of live555 in terms of CPU cycles or CPU load % when it comes to receiving streams? Also, is live555 claimed to be optimized for receiving video streams over a network? Or is there scope for further optimization? Thanks in advance. Regards, Yogesh. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Thu Jan 10 01:00:42 2013 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 10 Jan 2013 01:00:42 -0800 Subject: [Live-devel] Live555 performance numbers In-Reply-To: <6EFDC7CD849764409289BF330AF9704A3E9EC167@DBDE01.ent.ti.com> References: <6EFDC7CD849764409289BF330AF9704A3E9EC167@DBDE01.ent.ti.com> Message-ID: <67E79E60-627A-4FA3-9E5F-59503EFE2567@live555.com> > I want to use live555 for receiving H.264 encoded video streams over IP network through RTP. Is there any information about performance numbers of live555 in terms of CPU cycles or CPU load % when it comes to receiving streams. It's difficult, if not impossible, to answer questions like this, because the performance of the system depends so much on the particular hardware that you're using. Generally speaking, though, the performance overhead of LIVE555's RTSP/RTP/RTCP reception/processing code is usually insignificant compared to the cost of receiving/decoding/rendering the incoming video. E.g., I suggest that you run the VLC media player application, that uses the "LIVE555 Streaming Media" code as its RTSP/RTP/RTCP client, and which can receive and play H.264 video. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From yogesh_marathe at ti.com Thu Jan 10 03:29:55 2013 From: yogesh_marathe at ti.com (Marathe, Yogesh) Date: Thu, 10 Jan 2013 11:29:55 +0000 Subject: [Live-devel] Live555 performance numbers In-Reply-To: <67E79E60-627A-4FA3-9E5F-59503EFE2567@live555.com> References: <6EFDC7CD849764409289BF330AF9704A3E9EC167@DBDE01.ent.ti.com> <67E79E60-627A-4FA3-9E5F-59503EFE2567@live555.com> Message-ID: <6EFDC7CD849764409289BF330AF9704A3E9EC299@DBDE01.ent.ti.com> Hello Ross, Thanks for the prompt response. 
Assuming I'm not bothered about the decoding and rendering part of incoming video (as I can offload that), I want to know if any profiling has been done on just LIVE555's receiving part on any processor ever? I had observed in one of my experiments with openRTSP that if I periodically open multiple threads (using rtspRead()) to receive video streams from different sources, the CPU load was going high. The application was just receiving and dropping packets. I want to know what measures can be taken in live555 that would enable my system to open the maximum possible number of threads/processes, ensuring minimal overhead of receiving and waiting, if any. Regards, Yogesh. From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Thursday, January 10, 2013 2:31 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Live555 performance numbers I want to use live555 for receiving H.264 encoded video streams over an IP network through RTP. Is there any information about performance numbers of live555 in terms of CPU cycles or CPU load % when it comes to receiving streams? It's difficult, if not impossible, to answer questions like this, because the performance of the system depends so much on the particular hardware that you're using. Generally speaking, though, the performance overhead of LIVE555's RTSP/RTP/RTCP reception/processing code is usually insignificant compared to the cost of receiving/decoding/rendering the incoming video. E.g., I suggest that you run the VLC media player application, that uses the "LIVE555 Streaming Media" code as its RTSP/RTP/RTCP client, and which can receive and play H.264 video. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From bruno.abreu at livingdata.pt Thu Jan 10 06:12:18 2013 From: bruno.abreu at livingdata.pt (Bruno Abreu) Date: Thu, 10 Jan 2013 14:12:18 +0000 Subject: [Live-devel] StreamReplicator with FileSink problem Message-ID: <20130110132339.M35692@livingdata.pt> Hello Ross, We have been running for some time a RTSPServer that acquires video from a live source (with our own DeviceSource) from which we create 2 or 3 replicas: one for multicast streaming and another to feed a FileSink for file storage. The third replica is created on demand for unicast streaming. Recently we detected a problem on some servers when the file partition on which the files are being stored becomes full. When that happens, all pipelines (FileSink, multicast and unicast) stop running. The FileSink halt was expected and we are currently implementing a strategy to deal with disk full issues. But, at least, the multicast streaming should continue and we were somewhat surprised with these symptoms, so we tried to understand what was going on. This is what we concluded: when the partition becomes full, an error must be detected in FileSink::afterGettingFrame(...) when fflush(fOutFid) is called. After that, no next frame is requested from our upstream source, even though there are several replicas in the replicator. We think this is because in MediaSink::onSourceClosure(...) fSource is set to NULL and, when MediaSink::stopPlaying() is called, fSource->stopGettingFrames() is NOT CALLED since it was previously set to NULL. And, since fSource->stopGettingFrames() is not being called, neither is StreamReplica::doStopGettingFrames(). Now, we believe that our file sink replica was the master replica in StreamReplicator and, since it's not being deactivated, no other replica requests any more frames. 
The following patch to MediaSink solved our current problem:

--- liveMedia/MediaSink.cpp (revision 16076)
+++ liveMedia/MediaSink.cpp (working copy)
@@ -92,7 +92,7 @@
 void MediaSink::onSourceClosure(void* clientData) {
   MediaSink* sink = (MediaSink*)clientData;
-  sink->fSource = NULL; // indicates that we can be played again
+  // sink->fSource = NULL; // indicates that we can be played again
   if (sink->fAfterFunc != NULL) {
     (*(sink->fAfterFunc))(sink->fAfterClientData);
   }

We're just not sure that this is a correct solution, or if we should override MediaSink::stopPlaying() to set MediaSink::fSource before 'if (fSource != NULL) fSource->stopGettingFrames()' is called, or something else entirely. I've attached a small modification to testProgs/testReplicator.cpp that creates 2 replicas from a ByteStreamFileSource that reads the video "bipbop-gear1-all.ts" (available from the LIVE555 site). We run this app with the following script:

#--------
#!/bin/sh
./testReplicator &
DISK_AVAIL=`df -h . | sed '1d' | awk '{print $4}' | cut -d'%' -f1`
echo "type to continue and allocate ${DISK_AVAIL} on current partition"
echo "ATTENTION: This will allocate all free space"
read cont
fallocate -l ${DISK_AVAIL} dummy.1
#--------

It allows us to check that, after fallocate completely fills up the available space, both the file and the UDP sink replicas stop working. Any help will be appreciated. Thank You, Bruno Abreu -- Living Data - Sistemas de Informação e Apoio à Decisão, Lda. LxFactory - Rua Rodrigues de Faria, 103, edifício I - 4º piso Phone: +351 213622163 1300-501 LISBOA Fax: +351 213622165 Portugal URL: www.livingdata.pt -------------- next part -------------- A non-text attachment was scrubbed... 
Name: testReplicator.cpp Type: text/x-c++src Size: 4663 bytes Desc: not available URL: From jshanab at smartwire.com Thu Jan 10 05:21:08 2013 From: jshanab at smartwire.com (Jeff Shanab) Date: Thu, 10 Jan 2013 13:21:08 +0000 Subject: [Live-devel] Live555 performance numbers In-Reply-To: <6EFDC7CD849764409289BF330AF9704A3E9EC299@DBDE01.ent.ti.com> References: <6EFDC7CD849764409289BF330AF9704A3E9EC167@DBDE01.ent.ti.com> <67E79E60-627A-4FA3-9E5F-59503EFE2567@live555.com>, <6EFDC7CD849764409289BF330AF9704A3E9EC299@DBDE01.ent.ti.com> Message-ID: <615FD77639372542BF647F5EBAA2DBC2252454FB@IL-BOL-EXCH01.smartwire.com> I can only give a few data points. A lot depends on the bandwidth per stream. I receive and save to disk from security cameras. These are, on avg, set for 10fps D1 (704x480). On one desktop PC (i7-950, 12GB RAM, solid-state drive) I was testing my software throughput. A rather high-end machine, to eliminate it as a variable. I received 400 streams and wrote to disk; the limit was the single gigabit network connection. I routinely record 20-50 streams on a mini PC (a bookend PC with a notebook drive). Same software. Architecture is multi-threaded, running up to 60 streams per thread/environment. Live555 uses an event loop, and the 60 came from a Windows limitation on WaitForMultipleObjects, since removed. ________________________________ From: live-devel-bounces at ns.live555.com [live-devel-bounces at ns.live555.com] on behalf of Marathe, Yogesh [yogesh_marathe at ti.com] Sent: Thursday, January 10, 2013 5:29 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Live555 performance numbers Hello Ross, Thanks for the prompt response. Assuming I'm not bothered about the decoding and rendering part of incoming video (as I can offload that), I want to know if any profiling has been done on just LIVE555's receiving part on any processor ever? 
I had observed in one of my experiments with openRTSP that if I periodically open multiple threads (using rtspRead()) to receive video streams from different sources, the CPU load was going high. The application was just receiving and dropping packets. I want to know what measures can be taken in live555 that would enable my system to open the maximum possible number of threads/processes, ensuring minimal overhead of receiving and waiting, if any. Regards, Yogesh. From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Thursday, January 10, 2013 2:31 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Live555 performance numbers I want to use live555 for receiving H.264 encoded video streams over an IP network through RTP. Is there any information about performance numbers of live555 in terms of CPU cycles or CPU load % when it comes to receiving streams? It's difficult, if not impossible, to answer questions like this, because the performance of the system depends so much on the particular hardware that you're using. Generally speaking, though, the performance overhead of LIVE555's RTSP/RTP/RTCP reception/processing code is usually insignificant compared to the cost of receiving/decoding/rendering the incoming video. E.g., I suggest that you run the VLC media player application, that uses the "LIVE555 Streaming Media" code as its RTSP/RTP/RTCP client, and which can receive and play H.264 video. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Thu Jan 10 11:00:43 2013 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 10 Jan 2013 11:00:43 -0800 Subject: [Live-devel] StreamReplicator with FileSink problem In-Reply-To: <20130110132339.M35692@livingdata.pt> References: <20130110132339.M35692@livingdata.pt> Message-ID: > - sink->fSource = NULL; // indicates that we can be played again > + // sink->fSource = NULL; // indicates that we can be played again [...] > We're just not sure that this is a correct solution No, it's not. Instead, try making the following change to "FileSink::afterGettingFrame()" (lines 130-132 of "liveMedia/FileSink.cpp"): Change the order of the calls to onSourceClosure(this); and stopPlaying(); so that "stopPlaying()" is called first, before "onSourceClosure(this)". Let us know if this works for you. (If so, I'll make the change in the next release of the software.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Jan 10 16:42:31 2013 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 10 Jan 2013 16:42:31 -0800 Subject: [Live-devel] Unhandled exception during TEARDOWN In-Reply-To: References: Message-ID: <7BB9F6CA-38D7-4EA3-8EFE-1722A7130C0A@live555.com> I ran your modified application for several hours (on FreeBSD, under GDB), and also on Linux using "valgrind", but unfortunately was unable to find any problem. I suspect that whatever bug is causing this is something that (for some reason) is causing an exception only in your environment. I'm still interested in learning the cause, of course (especially if it is - as it appears to be - a bug in our code), but it looks like you're probably going to have to track this down yourself. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From CERLANDS at arinc.com Thu Jan 10 19:42:14 2013 From: CERLANDS at arinc.com (Erlandsson, Claes P (CERLANDS)) Date: Fri, 11 Jan 2013 03:42:14 +0000 Subject: [Live-devel] Unhandled exception during TEARDOWN In-Reply-To: <7BB9F6CA-38D7-4EA3-8EFE-1722A7130C0A@live555.com> References: <7BB9F6CA-38D7-4EA3-8EFE-1722A7130C0A@live555.com> Message-ID: I ran your modified application for several hours (on FreeBSD, under GDB), and also on Linux using "valgrind", but unfortunately was unable to find any problem. I suspect that whatever bug is causing this is something that (for some reason) is causing an exception only in your environment. I'm still interested in learning the cause, of course (especially if it is - as it appears to be - a bug in our code), but it looks like you're probably going to have to track this down yourself. Ross Finlayson Live Networks, Inc. http://www.live555.com/ Thanks a lot for taking the time to test it. I'm not really sure how to proceed, but I guess I can try to locate a Linux client to compile and run the test on. It can of course be the Cisco server that does something that somehow triggers the issue to occur in the client. Cisco VSM7 should however be a complete rewrite from VSM6 and since the behavior in this case is exactly the same it would surprise me a bit, but any "oddities" can of course be carried over to a new version, rewrite or not. I've however never seen anything strange in the logs indicating this. As already mentioned, the issue always happens after calling sendTeardownCommand() (with a response handler). When looking at the callstack after the unhandled exception I often see slightly different locations, where the last reference point is to some system DLL, e.g. ntdll or msvcr90d.dll. The last RTSP client function call in the callstack is often BasicTaskScheduler::SingleStep(). Would it be of any help to keep track of and pass on the callstacks (or any other info)? 
As expected, the debug output (from the previously attached test program) always ends with the line "Opening connection to 192.168.1.103, port 554..." after a TEARDOWN request has been sent. I've attached an output example; not sure if it might be of any help. It can be seen that the TEARDOWN response often comes after the new stream has been started, but I assume that is ok. /Claes -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: smime.p7s Type: application/x-pkcs7-signature Size: 5740 bytes Desc: not available URL: From tbatra18 at gmail.com Thu Jan 10 05:52:17 2013 From: tbatra18 at gmail.com (Tarun Batra) Date: Thu, 10 Jan 2013 19:22:17 +0530 Subject: [Live-devel] How to check the received frame size? Message-ID: Hello sir, Can you please guide me on how to check the received frame size(packet size) that we receive in test on demand rtsp server and proxy server please? Thanks you -------------- next part -------------- An HTML attachment was scrubbed... URL: From kritisinghal23 at gmail.com Thu Jan 10 06:37:59 2013 From: kritisinghal23 at gmail.com (kriti singhal) Date: Thu, 10 Jan 2013 20:07:59 +0530 Subject: [Live-devel] rtsp over http tunnelling option conformation Message-ID: Hello Ross sir i saw an rtsp over http tunnelling option in your programs i just want to know that how can we conform that the stream we asre receiving is rtsp over http one of my way is that we know rtsp packet size is 1328 and if it is ovet tcp thaen it should be 1332(4 bytes of tcp header). Is there any way to know in your new proxy server program that its 1332 or is there any other way? 
And is there any way to print verbose output from library on console, i saw that u have written in your libraries that "# Comment out the following line to produce Makefiles that generate debuggable code: NODEBUG=1" But if i did that i get an _ITERATOR_DEBUG_LEVEL error, Thanks -------------- next part -------------- An HTML attachment was scrubbed... URL: From yogesh_marathe at ti.com Thu Jan 10 23:03:18 2013 From: yogesh_marathe at ti.com (Marathe, Yogesh) Date: Fri, 11 Jan 2013 07:03:18 +0000 Subject: [Live-devel] Live555 performance numbers In-Reply-To: <615FD77639372542BF647F5EBAA2DBC2252454FB@IL-BOL-EXCH01.smartwire.com> References: <6EFDC7CD849764409289BF330AF9704A3E9EC167@DBDE01.ent.ti.com> <67E79E60-627A-4FA3-9E5F-59503EFE2567@live555.com>, <6EFDC7CD849764409289BF330AF9704A3E9EC299@DBDE01.ent.ti.com> <615FD77639372542BF647F5EBAA2DBC2252454FB@IL-BOL-EXCH01.smartwire.com> Message-ID: <6EFDC7CD849764409289BF330AF9704A3E9EC6C3@DBDE01.ent.ti.com> Jeff, Thank you for your input. Is this the max performance on your system? Or CPU is still not fully loaded with 400 D1 at 10fps receive and write. It will be of a great help if you could comment on CPU performance when you are only receiving and not writing. I will see if I can extrapolate these stats for my system. Unfortunately, I'm devoid of such processing power and memory you mentioned. Regards, Yogesh. From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Jeff Shanab Sent: Thursday, January 10, 2013 6:51 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Live555 performance numbers I can only give a few data points. A lot depends on the bandwidth per stream. I receive and save to disk from security cameras. These are, on avg, set for 10fps d1(704x480) On one desktop PC I7-950, 12G ram and Solid-state drive I was testing my software thruput. A rather high end machine to eliminate it as a variable. 
I received 400 streams and wrote to disk, The limit was the single gigibit network connection. I routinely record 20-50 streams on a mini PC. bookend PC with notebook drive. Same software. Architecture is multi-threaded running up to 60 streams per thread/environment. Live555 uses an event loop and the 60 came from the Windows limitation on WaitForMultipleObjects since removed. ________________________________ From: live-devel-bounces at ns.live555.com [live-devel-bounces at ns.live555.com] on behalf of Marathe, Yogesh [yogesh_marathe at ti.com] Sent: Thursday, January 10, 2013 5:29 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Live555 performance numbers Hello Ross, Thanks for the prompt response. Assuming I'm not bothered about decoding and rendering part of incoming video (as I can offload that), I want to know if any profiling has been done on just LIVE555's receiving part on any processor ever? I had observed in one of the experiments with OpenRTSP, if I periodically open multiple threads (using rtspRead()) to receive video streams from different sources, CPU load was going high. The application was just receiving and dropping packets. I want know measures that can be taken in live555 that would enable my system to open maximum possible threads/processes ensuring minimal overhead of receiving and waiting if any. Regards, Yogesh. From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Thursday, January 10, 2013 2:31 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Live555 performance numbers I want to use live555 for receiving H.264 encoded video streams over IP network through RTP. Is there any information about performance numbers of live555 in terms of CPU cycles or CPU load % when it comes to receiving streams. 
It's difficult, if not impossible, to answer questions like this, because the performance of the system depends so much on the particular hardware that you're using. Generally speaking, though, the performance overhead of LIVE555's RTSP/RTP/RTCP reception/processing code is usually insignificant compared to the cost of receiving/decoding/rendering the incoming video. E.g., I suggest that you run the VLC media player application, that uses the "LIVE555 Streaming Media" code as its RTSP/RTP/RTCP client, and which can receive and play H.264 video. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Jan 10 23:38:42 2013 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 10 Jan 2013 23:38:42 -0800 Subject: [Live-devel] Live555 performance numbers In-Reply-To: <6EFDC7CD849764409289BF330AF9704A3E9EC6C3@DBDE01.ent.ti.com> References: <6EFDC7CD849764409289BF330AF9704A3E9EC167@DBDE01.ent.ti.com> <67E79E60-627A-4FA3-9E5F-59503EFE2567@live555.com>, <6EFDC7CD849764409289BF330AF9704A3E9EC299@DBDE01.ent.ti.com> <615FD77639372542BF647F5EBAA2DBC2252454FB@IL-BOL-EXCH01.smartwire.com> <6EFDC7CD849764409289BF330AF9704A3E9EC6C3@DBDE01.ent.ti.com> Message-ID: <226DE422-12B0-447B-B060-F56EBE1E7F86@live555.com> Rather than speculating endlessly about how the "LIVE555 Streaming Media" software might perform on your system, why not just download it and try it yourself? It will take you less than 5 minutes to download and build the software. Because you are interested in RTSP client functionality, I suggest that you run the "testRTSPClient" demo application, which receives stream data (into a memory buffer), but does not decode or render it. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From yogesh_marathe at ti.com Fri Jan 11 00:19:42 2013 From: yogesh_marathe at ti.com (Marathe, Yogesh) Date: Fri, 11 Jan 2013 08:19:42 +0000 Subject: [Live-devel] Live555 performance numbers In-Reply-To: <226DE422-12B0-447B-B060-F56EBE1E7F86@live555.com> References: <6EFDC7CD849764409289BF330AF9704A3E9EC167@DBDE01.ent.ti.com> <67E79E60-627A-4FA3-9E5F-59503EFE2567@live555.com>, <6EFDC7CD849764409289BF330AF9704A3E9EC299@DBDE01.ent.ti.com> <615FD77639372542BF647F5EBAA2DBC2252454FB@IL-BOL-EXCH01.smartwire.com> <6EFDC7CD849764409289BF330AF9704A3E9EC6C3@DBDE01.ent.ti.com> <226DE422-12B0-447B-B060-F56EBE1E7F86@live555.com> Message-ID: <6EFDC7CD849764409289BF330AF9704A3E9ED798@DBDE01.ent.ti.com> I have done this before, as I mentioned in my previous mail in the same thread. Maybe that was quite long ago; let me try it out with the latest Live555. Regards, Yogesh. From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Friday, January 11, 2013 1:09 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Live555 performance numbers Rather than speculating endlessly about how the "LIVE555 Streaming Media" software might perform on your system, why not just download it and try it yourself? It will take you less than 5 minutes to download and build the software. Because you are interested in RTSP client functionality, I suggest that you run the "testRTSPClient" demo application, which receives stream data (into a memory buffer), but does not decode or render it. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed...
URL: From CERLANDS at arinc.com Fri Jan 11 07:16:26 2013 From: CERLANDS at arinc.com (Erlandsson, Claes P (CERLANDS)) Date: Fri, 11 Jan 2013 15:16:26 +0000 Subject: [Live-devel] Unhandled exception during TEARDOWN References: <7BB9F6CA-38D7-4EA3-8EFE-1722A7130C0A@live555.com> Message-ID: I have included the attachment I mentioned below... It shows the last minute or so of the output from the modified testRTSPClient. /Claes Thanks a lot for taking the time to test it. I'm not really sure how to proceed, but I guess I can try to locate a Linux client to compile and run the test on. It could of course be that the Cisco server does something that somehow triggers the issue in the client. Cisco VSM7 should however be a complete rewrite of VSM6, and since the behavior in this case is exactly the same, that would surprise me a bit; but any "oddities" can of course be carried over to a new version, rewrite or not. I have, however, never seen anything strange in the logs indicating this. As already mentioned, the issue always happens after calling sendTeardownCommand() (with a response handler). When looking at the callstack after the unhandled exception I often see slightly different locations, where the last reference point is to some system DLL, e.g. ntdll or msvcr90d.dll. The last RTSP client function call in the callstack is often BasicTaskScheduler::SingleStep(). Would it be of any help to keep track of and pass on the callstacks (or any other info)? As expected, the debug output (from the previously attached test program) always ends with the line "Opening connection to 192.168.1.103, port 554..." after a TEARDOWN request has been sent. I've attached an output example; not sure if it might be of any help. It can be seen that the TEARDOWN response often comes after the new stream has been started, but I assume that is ok. /Claes -------------- next part -------------- An HTML attachment was scrubbed...
URL: -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: TestRTSP-UnhandledExceptionOutput.txt URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: smime.p7s Type: application/x-pkcs7-signature Size: 5740 bytes Desc: not available URL: From jshanab at smartwire.com Fri Jan 11 05:28:49 2013 From: jshanab at smartwire.com (Jeff Shanab) Date: Fri, 11 Jan 2013 13:28:49 +0000 Subject: [Live-devel] Live555 performance numbers In-Reply-To: <6EFDC7CD849764409289BF330AF9704A3E9EC6C3@DBDE01.ent.ti.com> References: <6EFDC7CD849764409289BF330AF9704A3E9EC167@DBDE01.ent.ti.com> <67E79E60-627A-4FA3-9E5F-59503EFE2567@live555.com>, <6EFDC7CD849764409289BF330AF9704A3E9EC299@DBDE01.ent.ti.com> <615FD77639372542BF647F5EBAA2DBC2252454FB@IL-BOL-EXCH01.smartwire.com>, <6EFDC7CD849764409289BF330AF9704A3E9EC6C3@DBDE01.ent.ti.com> Message-ID: <615FD77639372542BF647F5EBAA2DBC225245B72@IL-BOL-EXCH01.smartwire.com> Oh, Sorry, Less than 25% CPU with the 400 streams. Leading me to believe I can handle a lot more on a rack server with multiple interfaces and a better upstream. Indeed we now have over 500 cameras on a co-located dl380. I would like to know what is the best way in such a situation to know when you get behind. Other than watching the video, I can check my index which gives me a frame count. Is there a hook to tell me at the time live555 is forced to drop a frame in RTSP? or does it reconnect at that point. ________________________________ From: live-devel-bounces at ns.live555.com [live-devel-bounces at ns.live555.com] on behalf of Marathe, Yogesh [yogesh_marathe at ti.com] Sent: Friday, January 11, 2013 1:03 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Live555 performance numbers Jeff, Thank you for your input. Is this the max performance on your system? Or CPU is still not fully loaded with 400 D1 at 10fps receive and write. 
It would be of great help if you could comment on CPU performance when you are only receiving and not writing. I will see if I can extrapolate these stats for my system. Unfortunately, I'm devoid of the processing power and memory you mentioned. Regards, Yogesh. -------------- next part -------------- An HTML attachment was scrubbed... URL: From jshanab at smartwire.com Fri Jan 11 08:47:10 2013 From: jshanab at smartwire.com (Jeff Shanab) Date: Fri, 11 Jan 2013 16:47:10 +0000 Subject: [Live-devel] Live555 performance numbers In-Reply-To: <615FD77639372542BF647F5EBAA2DBC225245B72@IL-BOL-EXCH01.smartwire.com> References: <6EFDC7CD849764409289BF330AF9704A3E9EC167@DBDE01.ent.ti.com> <67E79E60-627A-4FA3-9E5F-59503EFE2567@live555.com>, <6EFDC7CD849764409289BF330AF9704A3E9EC299@DBDE01.ent.ti.com> <615FD77639372542BF647F5EBAA2DBC2252454FB@IL-BOL-EXCH01.smartwire.com>, <6EFDC7CD849764409289BF330AF9704A3E9EC6C3@DBDE01.ent.ti.com>, <615FD77639372542BF647F5EBAA2DBC225245B72@IL-BOL-EXCH01.smartwire.com> Message-ID: <615FD77639372542BF647F5EBAA2DBC225245CC2@IL-BOL-EXCH01.smartwire.com> Actually, as I re-read your question I see you asked about not writing; I do not have an answer for that, as writing is what my software does. Also, it is a 32-bit build, so the process is only given the maximum 2 GB of address space; it just means other processes on that box are not impacting its memory footprint. (A good trick for isolating system load from the app on Windows: run a 32-bit process on a 64-bit OS.) ________________________________ From: live-devel-bounces at ns.live555.com [live-devel-bounces at ns.live555.com] on behalf of Jeff Shanab [jshanab at smartwire.com] Sent: Friday, January 11, 2013 7:28 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Live555 performance numbers Oh, sorry: less than 25% CPU with the 400 streams, leading me to believe I can handle a lot more on a rack server with multiple interfaces and a better upstream.
Indeed we now have over 500 cameras on a co-located DL380. I would like to know the best way, in such a situation, to tell when you have fallen behind. Other than watching the video, I can check my index, which gives me a frame count. Is there a hook to tell me at the moment live555 is forced to drop a frame in RTSP? Or does it reconnect at that point? ________________________________ From: live-devel-bounces at ns.live555.com [live-devel-bounces at ns.live555.com] on behalf of Marathe, Yogesh [yogesh_marathe at ti.com] Sent: Friday, January 11, 2013 1:03 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Live555 performance numbers Jeff, Thank you for your input. Is this the maximum performance on your system, or is the CPU still not fully loaded with 400 D1-at-10fps streams received and written? It would be of great help if you could comment on CPU performance when you are only receiving and not writing. I will see if I can extrapolate these stats for my system. Unfortunately, I'm devoid of the processing power and memory you mentioned. Regards, Yogesh. -------------- next part -------------- An HTML attachment was scrubbed... URL: From zanglan at yahoo.com Fri Jan 11 06:30:15 2013 From: zanglan at yahoo.com (Lan Zang) Date: Fri, 11 Jan 2013 22:30:15 +0800 (CST) Subject: [Live-devel] Can live555 read RTP data from ffmpeg? Message-ID: <1357914615.45736.YahooMailNeo@web15803.mail.cnb.yahoo.com> Hi, I am sending MPEG TS data over RTP with ffmpeg, like "ffmpeg -i file.ts -c copy -f mpegts rtp://localhost:1234", and I want live555 to get this RTP data. I modified testOnDemandRTSPServer.cpp to use a unicast address for the mpeg2TransportStreamFromUDPSourceTest item. I then ran testOnDemandRTSPServer, but it seems that my video player (VLC) got nothing from testOnDemandRTSPServer while playing a URL like "rtsp://192.168.133.195:8554/mpeg2TransportStreamFromUDPSourceTest". Is there anything more that needs to be done to make this work?
Shall I change SimpleRTPSource in MPEG2TransportUDPServerMediaSubsession::createNewStreamSource() to another RTPSource? Regards, Lan Zang(Sander) -------------- next part -------------- An HTML attachment was scrubbed... URL: From xingskycn at 163.com Fri Jan 11 07:52:10 2013 From: xingskycn at 163.com (Leon) Date: Fri, 11 Jan 2013 23:52:10 +0800 Subject: [Live-devel] Live555 can not running in iphone5(armv7s) Message-ID: Dear all, Live555 is not running on the iPhone 5 (armv7s). Please see the output below. Thanks! schedule(3.094665->1357919186.566343) [0x20873780]saw incoming RTCP packet (from address 192.168.0.99, port 5557) 80c80006 2a2c646c bc17cca7 840420f7 ad02cccc 0000127d 0065b9e7 81ca0004 2a2c646c 0106286e 6f6e6529 00000000 SR RR validated RTCP subpacket (type 2): 0, 200, 0, 0x2a2c646c UNSUPPORTED TYPE(0xca) validated RTCP subpacket (type 2): 1, 202, 12, 0x2a2c646c validated entire RTCP packet sending REPORT sending RTCP packet 81c90007 19d97f0b 2a2c646c 93000429 00014fad 0000524a cca78404 00021962 81ca0006 19d97f0b 010f4c65 6f6e7465 6b692d69 50686f6e 65000000 schedule(5.211426->1357919191.781338) [0x20873780]saw incoming RTCP packet (from address 192.168.0.99, port 5557) 80c80006 2a2c646c bc17ccad 80cbc05d ad0b05c0 00001d8e 00a29980 81ca0004 2a2c646c 0106286e 6f6e6529 00000000 SR RR validated RTCP subpacket (type 2): 0, 200, 0, 0x2a2c646c UNSUPPORTED TYPE(0xca) validated RTCP subpacket (type 2): 1, 202, 12, 0x2a2c646c validated entire RTCP packet schedule(0.230628->1357919192.012120) sending REPORT sending RTCP packet 81c90007 19d97f0b 2a2c646c d7000e3c 00015b9f 000025c1 ccad80cb 0001b9c6 81ca0006 19d97f0b 010f4c65 6f6e7465 6b692d69 50686f6e 65000000 reap: checking SSRC 0x2a2c646c: 4 (threshold 0) schedule(2.373337->1357919194.388745) schedule(3.627554->1357919198.017503) sending REPORT sending RTCP packet 81c90007 19d97f0b 2a2c646c db001382 000161c3 00005e60 ccad80cb 0007bb47 81ca0006 19d97f0b 010f4c65 6f6e7465 6b692d69 50686f6e 65000000 schedule(4.116388->1357919202.136998)
[0x20873780]saw incoming RTCP packet (from address 192.168.0.99, port 5557) 80c80006 2a2c646c bc17ccb3 784ed6fe ad133778 00002866 00de5263 81ca0004 2a2c646c 0106286e 6f6e6529 00000000 SR RR validated RTCP subpacket (type 2): 0, 200, 0, 0x2a2c646c UNSUPPORTED TYPE(0xca) validated RTCP subpacket (type 2): 1, 202, 12, 0x2a2c646c validated entire RTCP packet schedule(0.033425->1357919202.170639) schedule(1.982397->1357919204.153927) sending REPORT sending RTCP packet 81c90007 19d97f0b 2a2c646c f600204a 00016f10 000079c6 ccb3784e 0003b70a 81ca0006 19d97f0b 010f4c65 6f6e7465 6b692d69 50686f6e 65000000 schedule(4.181883->1357919208.339101) 2013-01-11 23:46:44.584 NewKenIPCam[563:3d13] -[RTSPPlayViewController subsession:didReceiveFrame:presentationTime:durationInMicroseconds:] [Line 456] image size -> {640, 480}, data length -> 64388 [0x20873780]saw incoming RTCP packet (from address 192.168.0.99, port 5557) 80c80006 2a2c646c bc17ccb9 85e84f09 ad1b8785 00003387 011ba86f 81ca0004 2a2c646c 0106286e 6f6e6529 00000000 SR RR validated RTCP subpacket (type 2): 0, 200, 0, 0x2a2c646c UNSUPPORTED TYPE(0xca) validated RTCP subpacket (type 2): 1, 202, 12, 0x2a2c646c validated entire RTCP packet -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Jan 11 22:08:59 2013 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 11 Jan 2013 22:08:59 -0800 Subject: [Live-devel] Can live555 read RTP data from ffmpeg? In-Reply-To: <1357914615.45736.YahooMailNeo@web15803.mail.cnb.yahoo.com> References: <1357914615.45736.YahooMailNeo@web15803.mail.cnb.yahoo.com> Message-ID: > I am sending MPEG TS data over RTP by ffmpeg, like "ffmpeg -i file.ts -c copy -f mpegts rtp://localhost:1234". I want live555 can get these RTP data. If your only source of MPEG TS data is from files, then you don't need to use "ffmpeg" at all. Instead, you can stream your "file.ts" directly from our server. 
(If you are using "testOnDemandRTSPServer", then you rename "file.ts" as "test.ts". If you are using "live555MediaServer", then you don't need to rename your file, as long as its name ends with ".ts".) > I modified the testOnDemandRTSPServer.cpp to use unicast address for the mpeg2TransportStreamFromUDPSourceTest item. By this I presume that you changed the definition of "inputAddressStr" to: char const* inputAddressStr = NULL; > I then run testOnDemandRTSPServer. But it seems that my video player(VLC player) got nothing from testOnDemandRTSPServer while playing URL like "rtsp://192.168.133.195:8554/mpeg2TransportStreamFromUDPSourceTest". I suggest first using "testRTSPClient" as your RTSP client application, instead of VLC. > Is there anything more need to be done to make this thing work? No, I don't think so, assuming that your "ffmpeg" command generates RTP-encapsulated MPEG Transport Stream packets (with the correct RTP payload format code: 33). However, I don't know enough about "ffmpeg" to say for sure whether it's doing what you intend. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Jan 11 22:11:16 2013 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 11 Jan 2013 22:11:16 -0800 Subject: [Live-devel] Live555 can not running in iphone5(armv7s) In-Reply-To: References: Message-ID: <02D2647C-D5C0-470A-84DA-CFFE0F755BF7@live555.com> I don't understand why you think that the LIVE555 code is "not running". All you are showing us is the diagnostic output from our RTCP implementation, which shows that it is both sending and receiving RTCP packets OK. (Note that the "UNSUPPORTED TYPE" message is *not* an error.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From zanglan at yahoo.com Sat Jan 12 06:00:52 2013 From: zanglan at yahoo.com (Lan Zang) Date: Sat, 12 Jan 2013 22:00:52 +0800 (CST) Subject: [Live-devel] Can live555 read RTP data from ffmpeg? In-Reply-To: References: <1357914615.45736.YahooMailNeo@web15803.mail.cnb.yahoo.com> Message-ID: <1357999252.34252.YahooMailNeo@web15804.mail.cnb.yahoo.com> Ross, Thanks for the quick reply. I am trying to simplify my question, so there might be some errors in my description. I am not actually using a plain media file. I get video from a camera and encode it with "ffmpeg". The weird thing about ffmpeg is that although the command options say that the output is RTP, the captured UDP packets show that it is not. The output is MPEG TS data over UDP directly, and one group of TS data has been fragmented into several UDP packets. So, in this case, shall I use BasicUDPSource instead of SimpleRTPSource as the source? Can BasicUDPSource handle fragmented UDP data? Regards, Lan Zang(Sander) ________________________________ From: Ross Finlayson To: LIVE555 Streaming Media - development & use Sent: Saturday, January 12, 2013 2:08 PM Subject: Re: [Live-devel] Can live555 read RTP data from ffmpeg? I am sending MPEG TS data over RTP by ffmpeg, like "ffmpeg -i file.ts -c copy -f mpegts rtp://localhost:1234". I want live555 to get this RTP data. If your only source of MPEG TS data is from files, then you don't need to use "ffmpeg" at all. Instead, you can stream your "file.ts" directly from our server. (If you are using "testOnDemandRTSPServer", then you rename "file.ts" as "test.ts". If you are using "live555MediaServer", then you don't need to rename your file, as long as its name ends with ".ts".) I modified the testOnDemandRTSPServer.cpp to use a unicast address for the mpeg2TransportStreamFromUDPSourceTest item. By this I presume that you changed the definition of "inputAddressStr" to: char const* inputAddressStr = NULL; I then ran testOnDemandRTSPServer.
But it seems that my video player (VLC) got nothing from testOnDemandRTSPServer while playing a URL like "rtsp://192.168.133.195:8554/mpeg2TransportStreamFromUDPSourceTest". I suggest first using "testRTSPClient" as your RTSP client application, instead of VLC. Is there anything more that needs to be done to make this work? No, I don't think so, assuming that your "ffmpeg" command generates RTP-encapsulated MPEG Transport Stream packets (with the correct RTP payload format code: 33). However, I don't know enough about "ffmpeg" to say for sure whether it's doing what you intend. Ross Finlayson Live Networks, Inc. http://www.live555.com/ _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sat Jan 12 06:51:17 2013 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 12 Jan 2013 06:51:17 -0800 Subject: [Live-devel] Can live555 read RTP data from ffmpeg? In-Reply-To: <1357999252.34252.YahooMailNeo@web15804.mail.cnb.yahoo.com> References: <1357914615.45736.YahooMailNeo@web15803.mail.cnb.yahoo.com> <1357999252.34252.YahooMailNeo@web15804.mail.cnb.yahoo.com> Message-ID: > I am not actually using a plain media file. I get video from a camera and encode it with "ffmpeg". The weird thing about ffmpeg is that although the command options say that the output is RTP, the captured UDP packets show that it is not. The output is MPEG TS data over UDP directly, and one group of TS data has been fragmented into several UDP packets. So, in this case, shall I use BasicUDPSource instead of SimpleRTPSource as the source? Yes, but the way to do this is to change the "testOnDemandRTSPServer" code to set the constant "inputStreamIsRawUDP" to True. (See "testProgs/testOnDemandRTSPServer.cpp", line 338.) Ross Finlayson Live Networks, Inc.
http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From markuss at sonicfoundry.com Sun Jan 13 10:22:02 2013 From: markuss at sonicfoundry.com (Markus Schumann) Date: Sun, 13 Jan 2013 18:22:02 +0000 Subject: [Live-devel] negative presentation time in "afterGettingFrame" sink callback Message-ID: <1ED2F9A76678E0428E90FB2B6F93672D0108A6D7@postal.sonicfoundry.net> All, I am getting a negative value for presentationTime.tv_sec in MySink::afterGettingFrame. Its absolute value increases (when ignoring the sign). In live555's custom sink sample code, the timestamps (of type timeval) are casually cast to an unsigned. Is it safe to ignore the sign and do the cast? What's the story behind the timestamps being negative and increasing in absolute value? // // used to output the 'microseconds' part of the presentation time // char uSecsStr[6+1]; sprintf(uSecsStr, "%06u", (unsigned) presentationTime.tv_usec); envir() << ".\tPresentation time: " << (unsigned)presentationTime.tv_sec << "." << uSecsStr; FYI: this is a great piece of software and has been working flawlessly so far - very nice work! Thanks Markus From finlayson at live555.com Sun Jan 13 11:15:57 2013 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 13 Jan 2013 11:15:57 -0800 Subject: [Live-devel] negative presentation time in "afterGettingFrame" sink callback In-Reply-To: <1ED2F9A76678E0428E90FB2B6F93672D0108A6D7@postal.sonicfoundry.net> References: <1ED2F9A76678E0428E90FB2B6F93672D0108A6D7@postal.sonicfoundry.net> Message-ID: <979D0602-3001-495C-9FC1-008445613BA2@live555.com> You haven't said specifically how you are using our software. Are you using it as an RTSP/RTP client? If so, then what is *transmitting* the RTP (& RTCP) packets? (Does the transmitter also use our software?) The "tv_sec" value in a presentation time (a "struct timeval") is supposed to be treated as a signed integer, not as an unsigned.
It *can* be negative, but usually won't be, because they will usually be (but are not required to be, unless the transmitter uses our server software) aligned with 'wall clock' time. In any case, they should not be "negative numbers, increasing in absolute value". If you are a RTP/RTCP receiver, and are seeing values like this, then your transmitter's RTP/RTCP implementation is probably broken. Or are your presentation times coming from somewhere else (other than a "RTPSource")? If so, please explain. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From markuss at sonicfoundry.com Sun Jan 13 11:49:07 2013 From: markuss at sonicfoundry.com (Markus Schumann) Date: Sun, 13 Jan 2013 19:49:07 +0000 Subject: [Live-devel] negative presentation time in "afterGettingFrame" sink callback In-Reply-To: <979D0602-3001-495C-9FC1-008445613BA2@live555.com> References: <1ED2F9A76678E0428E90FB2B6F93672D0108A6D7@postal.sonicfoundry.net> <979D0602-3001-495C-9FC1-008445613BA2@live555.com> Message-ID: <1ED2F9A76678E0428E90FB2B6F93672D0108A708@postal.sonicfoundry.net> Thanks for the response! I am using live555 as an RTSP/RTP client, with a custom H.264 and audio sink, connecting to a commercial off-the-shelf IP camera. My earlier statement was wrong: the timestamps are increasing. The absolute value of the negative timestamp is going down (see below). Life is good - sorry for the confusion. I just got puzzled by seeing the output of the following lines of code that I copied and pasted: sprintf(uSecsStr, "%06u", (unsigned)presentationTime.tv_usec); envir() << ".\tPresentation time: " << (unsigned)presentationTime.tv_sec << "." << uSecsStr; Thanks again. Markus. DEBUG output (with unsigned cast removed): Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 476 bytes. Presentation time: -850904334.959992 NPT: 29677 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 511 bytes.
Presentation time: -850904333.026992 NPT: 29677 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850904333.074820 NPT: 127507 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 450 bytes. Presentation time: -850904333.093992 NPT: 29677.1 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 461 bytes. Presentation time: -850904333.160992 NPT: 29677.2 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850904333.204102 NPT: 127507 From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Sunday, January 13, 2013 1:16 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] negative presentation time in "afterGettingFrame" sink callback You haven't said specifically how you are using our software. Are you using it as a RTSP/RTP client? If so, then what is *transmitting* the RTP (& RTCP) packets? (Does the transmitter also use our software?) The "tv_sec" value in a presentation time (a "struct timeval") is supposed to be treated as an integer, not as an unsigned. It *can* be negative, but usually won't be, because they will usually be (but are not required to be, unless the transmitter uses our server software) aligned with 'wall clock' time. In any case, they should not be "negative numbers, increasing in absolute value". If you are a RTP/RTCP receiver, and are seeing values like this, then your transmitter's RTP/RTCP implementation is probably broken. Or are your presentation times coming from somewhere else (other than a "RTPSource")? If so, please explain. Ross Finlayson Live Networks, Inc.
http://www.live555.com/ From finlayson at live555.com Sun Jan 13 17:01:27 2013 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 13 Jan 2013 17:01:27 -0800 Subject: [Live-devel] negative presentation time in "afterGettingFrame" sink callback In-Reply-To: <1ED2F9A76678E0428E90FB2B6F93672D0108A708@postal.sonicfoundry.net> References: <1ED2F9A76678E0428E90FB2B6F93672D0108A6D7@postal.sonicfoundry.net> <979D0602-3001-495C-9FC1-008445613BA2@live555.com> <1ED2F9A76678E0428E90FB2B6F93672D0108A708@postal.sonicfoundry.net> Message-ID: > I just got puzzled by seeing the output of the following lines of code that I copied and pasted: > > sprintf(uSecsStr, "%06u", (unsigned)presentationTime.tv_usec); > envir() << ".\tPresentation time: " << (unsigned)presentationTime.tv_sec << "." << uSecsStr; Yes, the "(unsigned)" cast in the second line was incorrect. It will be changed to "(int)" in the next release of the software. > DEBUG output (with unsigned cast removed): > > Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 476 bytes. Presentation time: -850904334.959992 NPT: 29677 > Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 511 bytes. Presentation time: -850904333.026992 NPT: 29677 > Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850904333.074820 NPT: 127507 > Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 450 bytes. Presentation time: -850904333.093992 NPT: 29677.1 > Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 461 bytes. Presentation time: -850904333.160992 NPT: 29677.2 > Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850904333.204102 NPT: 127507 > Life is good Perhaps, although I'm a bit puzzled by the "NPT" values - in particular, why they aren't in sync. 
Could you send another message, this time with the *complete* RTSP/RTP debugging output (ending with the first few 'negative' presentation times, as you've done above)? Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From kingaceck at 163.com Sun Jan 13 17:34:49 2013 From: kingaceck at 163.com (=?utf-8?B?a2luZ2FjZWNr?=) Date: Mon, 14 Jan 2013 09:34:49 +0800 Subject: [Live-devel] =?utf-8?q?the_time_that_had_played_is_error?= Message-ID: <201301140934392507179@163.com> Hi, I have extended the live555 media server to support mp4 files. When playing with VLC, I click the pause button and then click the play button; the elapsed time shown on the VLC panel becomes 00:00. I played an mpg file to test this issue, and there the elapsed time on the VLC panel is OK. Why does the issue only occur when playing my mp4 file (H.264, AAC, 1080p)? Are there any steps missing in my *ServerMediaSubsession for mp4 files, or is there any other problem:

/*
 * ffmpeg_H264_server_media_subsession.cpp
 */
#include "BasicUsageEnvironment.hh"
#include "liveMedia.hh"
#include "../ffmpeg_demux.h"
#include "../ffmpeg_server_demux.h"
#include "../ffmpeg_demuxed_elementary_stream.h"
#include "ffmpeg_h264_server_media_subsession.h"

FfmpegH264ServerMediaSubsession *FfmpegH264ServerMediaSubsession::CreateNew(
    FfmpegServerDemux& demux, u_int8_t stream_id, Boolean reuse_source) {
  return new FfmpegH264ServerMediaSubsession(demux, stream_id, reuse_source);
}

FfmpegH264ServerMediaSubsession::~FfmpegH264ServerMediaSubsession() {
  //delete (FfmpegServerDemux*)(&ffmpeg_demux_);
  Medium::close((FfmpegServerDemux*)(&ffmpeg_demux_));
}

FfmpegH264ServerMediaSubsession::FfmpegH264ServerMediaSubsession(
    FfmpegServerDemux& demux, u_int8_t stream_id, Boolean reuse_source)
  : H264VideoFileServerMediaSubsession(demux.envir(), NULL, reuse_source),
    ffmpeg_demux_(demux), stream_id_(stream_id) {
}

FramedSource *FfmpegH264ServerMediaSubsession::createNewStreamSource(
    unsigned clientSessionId, unsigned& estBitrate) {
  estBitrate = 500; // kbps, estimate
  FramedSource* es = ffmpeg_demux_.NewElementaryStream(clientSessionId, stream_id_);
  if (es == NULL) return NULL;
  //return H264VideoStreamDiscreteFramer::createNew(envir(), es);
  return H264VideoStreamFramer::createNew(envir(), es);
}

void FfmpegH264ServerMediaSubsession::seekStreamSource(FramedSource* inputSource,
    double& seekNPT, double /*streamDuration*/, u_int64_t& /*numBytes*/) {
  H264VideoStreamFramer* framer = (H264VideoStreamFramer*)inputSource;
  framer->flushInput();
  // "inputSource" is a filter; its input source is the original elem stream source:
  FfmpegDemuxedElementaryStream* elemStreamSource =
      (FfmpegDemuxedElementaryStream*)(((FramedFilter*)inputSource)->inputSource());
  // Next, get the original source demux:
  FfmpegDemux& sourceDemux = elemStreamSource->sourceDemux();
  // and flush its input buffers:
  sourceDemux.FlushInput();
  sourceDemux.seekStreamSource(seekNPT);
}

float FfmpegH264ServerMediaSubsession::duration() const {
  return ffmpeg_demux_.fileDuration();
}

2013-01-14 kingaceck -------------- next part -------------- An HTML attachment was scrubbed... URL: From kingaceck at 163.com Sun Jan 13 19:47:49 2013 From: kingaceck at 163.com (=?utf-8?B?a2luZ2FjZWNr?=) Date: Mon, 14 Jan 2013 11:47:49 +0800 Subject: [Live-devel] =?utf-8?q?about_elapsed_time_of_mpg_file?= Message-ID: <201301141147491098401@163.com> Hi, I said in the previous mail that the elapsed time of the mpg file is OK. But now I find that if I click the pause button and then the play button during the first 5 seconds of playing the mpg video, the elapsed time also becomes 00:00. But if I pause and then play the video after 20 seconds of playback, the elapsed time is OK. Why?
Below is the response to the PLAY method; I can't find the *Range* information. Is it possible to send the Range to the client? If so, how can I obtain the elapsed time in the LIVE555 Media Server? Thank you very much.

RTSP/1.0 200 OK
CSeq: 10
Date: Mon, Jan 14 2013 03:24:47 GMT
Session: 9892A748
RTP-Info: url=rtsp://129.1.5.156/test.mpg/track1;seq=63754;rtptime=2880301312,url=rtsp://129.1.5.156/test.mpg/track2;seq=12018;rtptime=1598034933

2013-01-14
kingaceck
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From finlayson at live555.com  Sun Jan 13 21:02:50 2013
From: finlayson at live555.com (Ross Finlayson)
Date: Sun, 13 Jan 2013 21:02:50 -0800
Subject: [Live-devel] about elapsed time of mpg file
In-Reply-To: <201301141147491098401@163.com>
References: <201301141147491098401@163.com>
Message-ID: <1AD0FCB9-A058-4CC4-8BFC-F96BFC99E189@live555.com>

> Below is the response to the PLAY method; I can't find the *Range* information. Is it possible to send the Range to the client?

Yes. If you reimplement the virtual function
    float duration() const
in your "ServerMediaSubsession" subclass to return the stream's duration, in seconds, then this value will be returned in the "Range:" header in the RTSP response.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From markuss at sonicfoundry.com  Mon Jan 14 14:57:13 2013
From: markuss at sonicfoundry.com (Markus Schumann)
Date: Mon, 14 Jan 2013 22:57:13 +0000
Subject: [Live-devel] negative presentation time in "afterGettingFrame" sink callback
In-Reply-To:
References: <1ED2F9A76678E0428E90FB2B6F93672D0108A6D7@postal.sonicfoundry.net> <979D0602-3001-495C-9FC1-008445613BA2@live555.com> <1ED2F9A76678E0428E90FB2B6F93672D0108A708@postal.sonicfoundry.net>
Message-ID: <1ED2F9A76678E0428E90FB2B6F93672D0108A88C@postal.sonicfoundry.net>

Ross,

Here you are:

Opening connection to 10.0.70.22, port 554...
...remote connection opened Sending request: DESCRIBE rtsp://10.0.70.22/video1+audio1 RTSP/1.0 CSeq: 2 User-Agent: C:\sofodev\Mediasite\main\Core\Capture\Debug\Test_SfLive555Client.exe (LIVE555 Streaming Media v2012.10.11) Accept: application/sdp Received 759 new bytes of response data. Received a complete DESCRIBE response: RTSP/1.0 200 OK CSeq: 2 x-Accept-Dynamic-Rate: 1 Content-Type: application/sdp Content-Base: rtsp://10.0.70.22:554/video1+audio1 Content-Length: 602 v=0 o=- 1438766090 1357921184 IN IP4 10.0.70.22 s= RTSP server i=audio video live media server a=type:broadcast c=IN IP4 0.0.0.0 t=0 0 m=video 0 RTP/AVP 96 i=Video channel in H264 VBR format a=mpeg4-esid:201 a=control:trackID=0 a=rtpmap:96 H264/90000 a=fmtp:96 packetization-mode=1;profile-level-id=428032;sprop-parameter-sets=Z0KAMtoCgPSVIAAADwAAAwHCDAgAXjgA1AXvfC8IhGo=,aM48gA== m=audio 0 RTP/AVP 0 i=Audio channel in standard mu-Low (G711) format a=control:trackID=1 m=application 0 RTP/AVP 107 i=ONVIF metadata a=control:events a=sendonly a=rtpmap:107 vnd.onvif.metadata/90000 [URL:"rtsp://10.0.70.22:554/video1+audio1"]: Got a SDP description: v=0 o=- 1438766090 1357921184 IN IP4 10.0.70.22 s= RTSP server i=audio video live media server a=type:broadcast c=IN IP4 0.0.0.0 t=0 0 m=video 0 RTP/AVP 96 i=Video channel in H264 VBR format a=mpeg4-esid:201 a=control:trackID=0 a=rtpmap:96 H264/90000 a=fmtp:96 packetization-mode=1;profile-level-id=428032;sprop-parameter-sets=Z0KAMtoCgPSVIAAADwAAAwHCDAgAXjgA1AXvfC8IhGo=,aM48gA== m=audio 0 RTP/AVP 0 i=Audio channel in standard mu-Low (G711) format a=control:trackID=1 m=application 0 RTP/AVP 107 i=ONVIF metadata a=control:events a=sendonly a=rtpmap:107 vnd.onvif.metadata/90000 [URL:"rtsp://10.0.70.22:554/video1+audio1"]: Initiated the "video/H264" subsession (client ports 56932-56933) Sending request: SETUP rtsp://10.0.70.22:554/video1+audio1/trackID=0 RTSP/1.0 CSeq: 3 User-Agent: C:\sofodev\Mediasite\main\Core\Capture\Debug\Test_SfLive555Client.exe (LIVE555 
Streaming Media v2012.10.11) Transport: RTP/AVP;unicast;client_port=56932-56933 Received 138 new bytes of response data. Received a complete SETUP response: RTSP/1.0 200 OK CSeq: 3 Session: 872219556 Transport: RTP/AVP;unicast;server_port=64702-64703;client_port=56932-56933;ssrc=1a5c1a37 [URL:"rtsp://10.0.70.22:554/video1+audio1"]: Set up the "video/H264" subsession (client ports 56932-56933) [URL:"rtsp://10.0.70.22:554/video1+audio1"]: Created a data sink for the "video/H264" subsession [URL:"rtsp://10.0.70.22:554/video1+audio1"]: Initiated the "audio/PCMU" subsession (client ports 56934-56935) Sending request: SETUP rtsp://10.0.70.22:554/video1+audio1/trackID=1 RTSP/1.0 CSeq: 4 User-Agent: C:\sofodev\Mediasite\main\Core\Capture\Debug\Test_SfLive555Client.exe (LIVE555 Streaming Media v2012.10.11) Transport: RTP/AVP;unicast;client_port=56934-56935 Session: 872219556 Received 138 new bytes of response data. Received a complete SETUP response: RTSP/1.0 200 OK CSeq: 4 Session: 872219556 Transport: RTP/AVP;unicast;server_port=64802-64803;client_port=56934-56935;ssrc=f74a6c57 [URL:"rtsp://10.0.70.22:554/video1+audio1"]: Set up the "audio/PCMU" subsession (client ports 56934-56935) [URL:"rtsp://10.0.70.22:554/video1+audio1"]: Created a data sink for the "audio/PCMU" subsession [URL:"rtsp://10.0.70.22:554/video1+audio1"]: Initiated the "application/VND.ONVIF.METADATA" subsession (client ports 56936-56937) Sending request: SETUP rtsp://10.0.70.22:554/video1+audio1/events RTSP/1.0 CSeq: 5 User-Agent: C:\sofodev\Mediasite\main\Core\Capture\Debug\Test_SfLive555Client.exe (LIVE555 Streaming Media v2012.10.11) Transport: RTP/AVP;unicast;client_port=56936-56937 Session: 872219556 Received 138 new bytes of response data. 
Received a complete SETUP response: RTSP/1.0 200 OK CSeq: 5 Session: 872219556 Transport: RTP/AVP;unicast;server_port=64902-64903;client_port=56936-56937;ssrc=10ebbe51 [URL:"rtsp://10.0.70.22:554/video1+audio1"]: Set up the "application/VND.ONVIF.METADATA" subsession (client ports 56936-56937) [URL:"rtsp://10.0.70.22:554/video1+audio1"]: Failed to create a data sink for the "application/VND.ONVIF.METADATA" subsession: liveMedia9 Sending request: PLAY rtsp://10.0.70.22:554/video1+audio1 RTSP/1.0 CSeq: 6 User-Agent: C:\sofodev\Mediasite\main\Core\Capture\Debug\Test_SfLive555Client.exe (LIVE555 Streaming Media v2012.10.11) Session: 872219556 Range: npt=0.000- Received 252 new bytes of response data. Received a complete PLAY response: RTSP/1.0 200 OK CSeq: 6 Session: 872219556 Range: npt=now- RTP-Info: url=rtsp://10.0.70.22:554/video1+audio1/trackID=0;seq=35711,url=rtsp://10.0.70.22:554/video1+audio1/trackID=1;seq=62268,url=rtsp://10.0.70.22:554/video1+audio1/events;seq=4796 [URL:"rtsp://10.0.70.22:554/video1+audio1"]: Started playing session... Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 32 bytes. Presentation time: 1358204056.791836! NPT: 32409.6 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 4 bytes. Presentation time: 1358204056.791836! NPT: 32409.6 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 23815 bytes. Presentation time: 1358204056.791836! NPT: 32409.6 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: 1358204056.835742! NPT: 130239 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 2331 bytes. Presentation time: 1358204056.858836! NPT: 32409.7 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 658 bytes. Presentation time: 1358204056.925836! NPT: 32409.7 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: 1358204056.963742! 
NPT: 130240 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 614 bytes. Presentation time: 1358204056.991836! NPT: 32409.8 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 631 bytes. Presentation time: 1358204057.058836! NPT: 32409.9 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850806167.638055 NPT: 130240 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 632 bytes. Presentation time: -850806167.672293 NPT: 32409.9 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 407 bytes. Presentation time: -850806167.739293 NPT: 32410 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850806167.766055 NPT: 130240 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 392 bytes. Presentation time: -850806167.805293 NPT: 32410.1 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 390 bytes. Presentation time: -850806167.872293 NPT: 32410.1 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850806167.894055 NPT: 130240 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 239 bytes. Presentation time: -850806167.939293 NPT: 32410.2 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 264 bytes. Presentation time: -850806166.005293 NPT: 32410.3 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850806166.022055 NPT: 130240 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 235 bytes. Presentation time: -850806166.071635 NPT: 32410.3 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 245 bytes. Presentation time: -850806166.138635 NPT: 32410.4 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. 
Presentation time: -850806166.150445 NPT: 130240 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 211 bytes. Presentation time: -850806166.205635 NPT: 32410.5 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 139 bytes. Presentation time: -850806166.271635 NPT: 32410.5 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850806166.278445 NPT: 130240 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 107 bytes. Presentation time: -850806166.338635 NPT: 32410.6 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 248 bytes. Presentation time: -850806166.405635 NPT: 32410.7 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850806166.406445 NPT: 130240 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 256 bytes. Presentation time: -850806166.472635 NPT: 32410.7 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850806166.534445 NPT: 130241 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 331 bytes. Presentation time: -850806166.536635 NPT: 32410.8 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 542 bytes. Presentation time: -850806166.603635 NPT: 32410.8 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850806166.662445 NPT: 130241 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 885 bytes. Presentation time: -850806166.672635 NPT: 32410.9 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 2186 bytes. Presentation time: -850806166.739635 NPT: 32411 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850806166.790445 NPT: 130241 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 1263 bytes. 
Presentation time: -850806166.805635 NPT: 32411.1 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 1987 bytes. Presentation time: -850806166.872635 NPT: 32411.1 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850806166.918445 NPT: 130241 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 1137 bytes. Presentation time: -850806166.939635 NPT: 32411.2 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 1730 bytes. Presentation time: -850806165.005635 NPT: 32411.3 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850806165.046445 NPT: 130241 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 1651 bytes. Presentation time: -850806165.072635 NPT: 32411.3 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 1634 bytes. Presentation time: -850806165.139635 NPT: 32411.4 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850806165.174445 NPT: 130241 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 1573 bytes. Presentation time: -850806165.206635 NPT: 32411.5 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 1633 bytes. Presentation time: -850806165.272635 NPT: 32411.5 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850806165.302445 NPT: 130241 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 32 bytes. Presentation time: -850806165.339635 NPT: 32411.6 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 4 bytes. Presentation time: -850806165.339635 NPT: 32411.6 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 23827 bytes. Presentation time: -850806165.339635 NPT: 32411.6 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 2500 bytes. 
Presentation time: -850806165.406635 NPT: 32411.7 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850806165.430445 NPT: 130241 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 742 bytes. Presentation time: -850806165.473635 NPT: 32411.7 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 652 bytes. Presentation time: -850806165.539635 NPT: 32411.8 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850806165.558445 NPT: 130242 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 638 bytes. Presentation time: -850806165.606635 NPT: 32411.9 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 429 bytes. Presentation time: -850806165.673635 NPT: 32411.9 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850806165.686445 NPT: 130242 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 401 bytes. Presentation time: -850806165.740635 NPT: 32412 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 288 bytes. Presentation time: -850806165.806635 NPT: 32412.1 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850806165.814445 NPT: 130242 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 275 bytes. Presentation time: -850806165.873635 NPT: 32412.1 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 273 bytes. Presentation time: -850806165.940635 NPT: 32412.2 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850806165.942445 NPT: 130242 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 262 bytes. Presentation time: -850806164.006635 NPT: 32412.3 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 255 bytes. 
Presentation time: -850806164.073635 NPT: 32412.3 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850806164.070445 NPT: 130242 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 149 bytes. Presentation time: -850806164.140635 NPT: 32412.4 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850806164.198445 NPT: 130242 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 122 bytes. Presentation time: -850806164.206635 NPT: 32412.5 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 129 bytes. Presentation time: -850806164.272635 NPT: 32412.5 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850806164.326445 NPT: 130242 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 132 bytes. Presentation time: -850806164.340635 NPT: 32412.6 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 153 bytes. Presentation time: -850806164.407635 NPT: 32412.7 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850806164.454445 NPT: 130243 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 218 bytes. Presentation time: -850806164.474635 NPT: 32412.7 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 638 bytes. Presentation time: -850806164.540635 NPT: 32412.8 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850806164.582445 NPT: 130243 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 625 bytes. Presentation time: -850806164.607635 NPT: 32412.9 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 571 bytes. Presentation time: -850806164.674635 NPT: 32412.9 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. 
Presentation time: -850806164.710445 NPT: 130243 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 2521 bytes. Presentation time: -850806164.740635 NPT: 32413 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 1208 bytes. Presentation time: -850806164.807635 NPT: 32413.1 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850806164.838445 NPT: 130243 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 1234 bytes. Presentation time: -850806164.874635 NPT: 32413.1 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 1853 bytes. Presentation time: -850806164.941635 NPT: 32413.2 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850806164.966445 NPT: 130243 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 1740 bytes. Presentation time: -850806163.006635 NPT: 32413.3 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 1115 bytes. Presentation time: -850806163.073635 NPT: 32413.3 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850806163.094445 NPT: 130243 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 1741 bytes. Presentation time: -850806163.141635 NPT: 32413.4 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 1615 bytes. Presentation time: -850806163.208635 NPT: 32413.5 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850806163.222445 NPT: 130243 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 1662 bytes. Presentation time: -850806163.274635 NPT: 32413.5 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 32 bytes. Presentation time: -850806163.341635 NPT: 32413.6 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 4 bytes. 
Presentation time: -850806163.341635 NPT: 32413.6 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 23808 bytes. Presentation time: -850806163.341635 NPT: 32413.6 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850806163.350445 NPT: 130243 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 2621 bytes. Presentation time: -850806163.408635 NPT: 32413.7 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 637 bytes. Presentation time: -850806163.475635 NPT: 32413.7 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850806163.478445 NPT: 130244 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 636 bytes. Presentation time: -850806163.541635 NPT: 32413.8 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 626 bytes. Presentation time: -850806163.608635 NPT: 32413.9 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850806163.606445 NPT: 130244 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 587 bytes. Presentation time: -850806163.675635 NPT: 32413.9 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850806163.734445 NPT: 130244 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 402 bytes. Presentation time: -850806163.737635 NPT: 32414 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 342 bytes. Presentation time: -850806163.804635 NPT: 32414 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850806163.862445 NPT: 130244 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 389 bytes. Presentation time: -850806163.875635 NPT: 32414.1 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 242 bytes. 
Presentation time: -850806163.942635 NPT: 32414.2 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850806163.990445 NPT: 130244 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 257 bytes. Presentation time: -850806162.008635 NPT: 32414.3 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 233 bytes. Presentation time: -850806162.075635 NPT: 32414.3 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850806162.118445 NPT: 130244 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 262 bytes. Presentation time: -850806162.142635 NPT: 32414.4 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 168 bytes. Presentation time: -850806162.209635 NPT: 32414.5 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850806162.246445 NPT: 130244 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 162 bytes. Presentation time: -850806162.275635 NPT: 32414.5 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 149 bytes. Presentation time: -850806162.342635 NPT: 32414.6 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850806162.374445 NPT: 130244 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 134 bytes. Presentation time: -850806162.409635 NPT: 32414.7 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 217 bytes. Presentation time: -850806162.476635 NPT: 32414.7 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850806162.502445 NPT: 130245 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 430 bytes. Presentation time: -850806162.541635 NPT: 32414.8 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 618 bytes. 
Presentation time: -850806162.608635 NPT: 32414.9 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850806162.630445 NPT: 130245 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 620 bytes. Presentation time: -850806162.676635 NPT: 32414.9 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 2441 bytes. Presentation time: -850806162.742635 NPT: 32415 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850806162.758445 NPT: 130245 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 1209 bytes. Presentation time: -850806162.809635 NPT: 32415.1 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 1958 bytes. Presentation time: -850806162.876635 NPT: 32415.1 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850806162.886445 NPT: 130245 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 1089 bytes. Presentation time: -850806162.943635 NPT: 32415.2 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 1784 bytes. Presentation time: -850806161.009635 NPT: 32415.3 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850806161.014445 NPT: 130245 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 1705 bytes. Presentation time: -850806161.076635 NPT: 32415.3 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 1118 bytes. Presentation time: -850806161.143635 NPT: 32415.4 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850806161.142445 NPT: 130245 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 1575 bytes. Presentation time: -850806161.210635 NPT: 32415.5 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. 
Presentation time: -850806161.270445 NPT: 130245 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 1611 bytes. Presentation time: -850806161.276635 NPT: 32415.5 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 32 bytes. Presentation time: -850806161.343852 NPT: 32415.6 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 4 bytes. Presentation time: -850806161.343852 NPT: 32415.6 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 23823 bytes. Presentation time: -850806161.343852 NPT: 32415.6 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850806161.398445 NPT: 130245 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 2642 bytes. Presentation time: -850806161.410852 NPT: 32415.7 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 695 bytes. Presentation time: -850806161.477852 NPT: 32415.7 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850806161.526445 NPT: 130246 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 623 bytes. Presentation time: -850806161.543852 NPT: 32415.8 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 580 bytes. Presentation time: -850806161.610852 NPT: 32415.9 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850806161.654445 NPT: 130246 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 582 bytes. Presentation time: -850806161.677852 NPT: 32415.9 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 335 bytes. Presentation time: -850806161.743852 NPT: 32416 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850806161.782445 NPT: 130246 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 412 bytes. 
Presentation time: -850806161.810852 NPT: 32416.1
Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 345 bytes. Presentation time: -850806161.877852 NPT: 32416.1
Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850806161.910445 NPT: 130246
Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 230 bytes. Presentation time: -850806161.945852 NPT: 32416.2

From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson
Sent: Sunday, January 13, 2013 7:01 PM
To: LIVE555 Streaming Media - development & use
Subject: Re: [Live-devel] negative presentation time in "afterGettingFrame" sink callback

I just got puzzled by seeing the output of the following lines of code that I copied and pasted:

    sprintf(uSecsStr, "%06u", (unsigned)presentationTime.tv_usec);
    envir() << ".\tPresentation time: " << (unsigned)presentationTime.tv_sec << "." << uSecsStr;

Yes, the "(unsigned)" cast in the second line was incorrect. It will be changed to "(int)" in the next release of the software.

DEBUG output (with unsigned cast removed):

Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 476 bytes. Presentation time: -850904334.959992 NPT: 29677
Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 511 bytes. Presentation time: -850904333.026992 NPT: 29677
Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850904333.074820 NPT: 127507
Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 450 bytes. Presentation time: -850904333.093992 NPT: 29677.1
Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 461 bytes. Presentation time: -850904333.160992 NPT: 29677.2
Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes.
Presentation time: -850904333.204102 NPT: 127507

Life is good

Perhaps, although I'm a bit puzzled by the "NPT" values - in particular, why they aren't in sync. Could you send another message, this time with the *complete* RTSP/RTP debugging output (ending with the first few 'negative' presentation times, as you've done above)?

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From bdrung at debian.org  Mon Jan 14 16:02:28 2013
From: bdrung at debian.org (Benjamin Drung)
Date: Tue, 15 Jan 2013 01:02:28 +0100
Subject: [Live-devel] pkg-config file
Message-ID: <1358208148.28126.4.camel@deep-thought>

Hi,

Attached is a patch that adds a pkg-config file in case a shared library is built and installed. Note that the version will be needed for the pkg-config file. Would it make sense to have pkg-config files for each library individually, in addition to the combined one?

--
Benjamin Drung
Debian & Ubuntu Developer
-------------- next part --------------
A non-text attachment was scrubbed...
Name: pkgconfig.patch
Type: text/x-patch
Size: 1239 bytes
Desc: not available
URL:

From finlayson at live555.com  Mon Jan 14 19:18:26 2013
From: finlayson at live555.com (Ross Finlayson)
Date: Mon, 14 Jan 2013 19:18:26 -0800
Subject: [Live-devel] pkg-config file
In-Reply-To: <1358208148.28126.4.camel@deep-thought>
References: <1358208148.28126.4.camel@deep-thought>
Message-ID:

No, you can do this yourself for your Debian distribution, if you wish. Our 'distribution' is the existing source code tar file. From my point of view, it's fine as it is.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From yogesh_marathe at ti.com  Mon Jan 14 23:48:58 2013
From: yogesh_marathe at ti.com (Marathe, Yogesh)
Date: Tue, 15 Jan 2013 07:48:58 +0000
Subject: [Live-devel] Live555 performance numbers
In-Reply-To: <615FD77639372542BF647F5EBAA2DBC225245CC2@IL-BOL-EXCH01.smartwire.com>
References: <6EFDC7CD849764409289BF330AF9704A3E9EC167@DBDE01.ent.ti.com> <67E79E60-627A-4FA3-9E5F-59503EFE2567@live555.com> <6EFDC7CD849764409289BF330AF9704A3E9EC299@DBDE01.ent.ti.com> <615FD77639372542BF647F5EBAA2DBC2252454FB@IL-BOL-EXCH01.smartwire.com> <6EFDC7CD849764409289BF330AF9704A3E9EC6C3@DBDE01.ent.ti.com> <615FD77639372542BF647F5EBAA2DBC225245B72@IL-BOL-EXCH01.smartwire.com> <615FD77639372542BF647F5EBAA2DBC225245CC2@IL-BOL-EXCH01.smartwire.com>
Message-ID: <6EFDC7CD849764409289BF330AF9704A3E9EF2D1@DBDE01.ent.ti.com>

Thanks Jeff.

Regards,
Yogesh.

From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Jeff Shanab
Sent: Friday, January 11, 2013 10:17 PM
To: LIVE555 Streaming Media - development & use
Subject: Re: [Live-devel] Live555 performance numbers

Actually, as I re-read your question I see you asked about *not* writing; I do not have an answer for that, since writing is what my software does. Also, it is a 32-bit build, so the process is only given the maximum 2 GB of address space; it just means other processes on that box are not impacting its memory footprint. (A good trick for isolating system load from the app on Windows: run a 32-bit process on a 64-bit OS.)
________________________________
From: live-devel-bounces at ns.live555.com [live-devel-bounces at ns.live555.com] on behalf of Jeff Shanab [jshanab at smartwire.com]
Sent: Friday, January 11, 2013 7:28 AM
To: LIVE555 Streaming Media - development & use
Subject: Re: [Live-devel] Live555 performance numbers

Oh, sorry - less than 25% CPU with the 400 streams, leading me to believe I can handle a lot more on a rack server with multiple interfaces and a better upstream.
Indeed, we now have over 500 cameras on a co-located DL380. I would like to know the best way, in such a situation, to tell when you get behind. Other than watching the video, I can check my index, which gives me a frame count. Is there a hook to tell me when live555 is forced to drop a frame in RTSP, or does it reconnect at that point?

________________________________
From: live-devel-bounces at ns.live555.com [live-devel-bounces at ns.live555.com] on behalf of Marathe, Yogesh [yogesh_marathe at ti.com]
Sent: Friday, January 11, 2013 1:03 AM
To: LIVE555 Streaming Media - development & use
Subject: Re: [Live-devel] Live555 performance numbers

Jeff, Thank you for your input. Is this the maximum performance on your system, or is the CPU still not fully loaded with 400 D1 streams at 10fps being received and written? It would be a great help if you could comment on CPU performance when you are only receiving and not writing. I will see if I can extrapolate these stats for my system. Unfortunately, I don't have the kind of processing power and memory you mentioned.

Regards,
Yogesh.
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From michel.promonet at thalesgroup.com  Tue Jan 15 04:19:56 2013
From: michel.promonet at thalesgroup.com (PROMONET Michel)
Date: Tue, 15 Jan 2013 13:19:56 +0100
Subject: [Live-devel] RTP header extension
Message-ID: <12191_1358252478_50F549BE_12191_2520_1_1BE8971B6CFF3A4F97AF4011882AA255015607AAEDDB@THSONEA01CMS01P.one.grp>

Hi Ross, Looking in the mailing list I found the subject was discussed in http://lists.live555.com/pipermail/live-devel/2009-April/010424.html It seems that the current code just drops the RTP header extension, doesn't it? Do you plan to include a callback to give the opportunity to manage the RTP header without rewriting MultiFramedRTPSource?

Thanks & Regards,
Michel.

[@@THALES GROUP RESTRICTED@@]
-------------- next part --------------
An HTML attachment was scrubbed...
URL: From finlayson at live555.com Tue Jan 15 05:38:30 2013 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 15 Jan 2013 05:38:30 -0800 Subject: [Live-devel] negative presentation time in "afterGettingFrame" sink callback In-Reply-To: <1ED2F9A76678E0428E90FB2B6F93672D0108A88C@postal.sonicfoundry.net> References: <1ED2F9A76678E0428E90FB2B6F93672D0108A6D7@postal.sonicfoundry.net> <979D0602-3001-495C-9FC1-008445613BA2@live555.com> <1ED2F9A76678E0428E90FB2B6F93672D0108A708@postal.sonicfoundry.net> <1ED2F9A76678E0428E90FB2B6F93672D0108A88C@postal.sonicfoundry.net> Message-ID: <1AA92EA0-5D36-4D3C-8C08-6C9CC8D19049@live555.com> > Received a complete PLAY response: > RTSP/1.0 200 OK > CSeq: 6 > Session: 872219556 > Range: npt=now- > RTP-Info: url=rtsp://10.0.70.22:554/video1+audio1/trackID=0;seq=35711,url=rtsp://10.0.70.22:554/video1+audio1/trackID=1;seq=62268,url=rtsp://10.0.70.22:554/video1+audio1/events;seq=4796 FYI, it turned out that this "RTP-Info:" header - although legal - wasn't useful to us, because it didn't specify *both* the "seq" and "rtptime" parameters. Our RTSP client code was getting confused by this, which explains why the "NPT" values were turning out strange. I've just installed a new version of the code that ignores "RTP-Info:" headers - like this - that don't specify *both* the "seq" and "rtptime" parameters. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
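[Editor's note] Ross's point, that an "RTP-Info:" entry is only usable when it carries *both* the "seq" and "rtptime" parameters, can be illustrated with a small stand-alone check. This is not LIVE555 code; the helper names below are made up for illustration:

```cpp
#include <cassert>
#include <string>
#include <vector>

// Report whether one "RTP-Info:" entry carries both the "seq" and "rtptime"
// parameters. Without "rtptime", a client cannot map RTP timestamps to NPT,
// which is why LIVE555 now ignores such headers.
static bool hasSeqAndRtptime(const std::string& entry) {
  return entry.find(";seq=") != std::string::npos &&
         entry.find(";rtptime=") != std::string::npos;
}

// Split an "RTP-Info:" header value into its per-track entries. Entries are
// separated by commas that precede "url="; a plain split on ",url=" is enough
// for well-formed headers.
static std::vector<std::string> splitEntries(const std::string& header) {
  std::vector<std::string> out;
  size_t start = 0;
  while (true) {
    size_t next = header.find(",url=", start);
    if (next == std::string::npos) {
      out.push_back(header.substr(start));
      break;
    }
    out.push_back(header.substr(start, next - start));
    start = next + 1; // keep "url=" with the next entry
  }
  return out;
}
```

Run against the "RTP-Info:" header from the log above, every entry fails the check (each has "seq" but no "rtptime"), matching Ross's diagnosis.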
URL: From finlayson at live555.com Tue Jan 15 05:39:27 2013 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 15 Jan 2013 05:39:27 -0800 Subject: [Live-devel] RTP header extension In-Reply-To: <12191_1358252478_50F549BE_12191_2520_1_1BE8971B6CFF3A4F97AF4011882AA255015607AAEDDB@THSONEA01CMS01P.one.grp> References: <12191_1358252478_50F549BE_12191_2520_1_1BE8971B6CFF3A4F97AF4011882AA255015607AAEDDB@THSONEA01CMS01P.one.grp> Message-ID: <8A4A85CF-DEE3-4CE6-BAE5-FC02B9C58ABD@live555.com> > Do you plan to include a callback to give the opportunity to manage the RTP header without rewriting MultiFramedRTPSource ? Yes, at some point... Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From markuss at sonicfoundry.com Tue Jan 15 08:16:00 2013 From: markuss at sonicfoundry.com (Markus Schumann) Date: Tue, 15 Jan 2013 16:16:00 +0000 Subject: [Live-devel] negative presentation time in "afterGettingFrame" sink callback In-Reply-To: <1AA92EA0-5D36-4D3C-8C08-6C9CC8D19049@live555.com> References: <1ED2F9A76678E0428E90FB2B6F93672D0108A6D7@postal.sonicfoundry.net> <979D0602-3001-495C-9FC1-008445613BA2@live555.com> <1ED2F9A76678E0428E90FB2B6F93672D0108A708@postal.sonicfoundry.net> <1ED2F9A76678E0428E90FB2B6F93672D0108A88C@postal.sonicfoundry.net> <1AA92EA0-5D36-4D3C-8C08-6C9CC8D19049@live555.com> Message-ID: <1ED2F9A76678E0428E90FB2B6F93672D0108AB91@postal.sonicfoundry.net> Ross, I downloaded your latest and ran it. The output is below. I don't see much difference between the versions and I couldn't pin point your change diffing the sources. Am I missing something? Thanks Markus. Opening connection to 10.0.70.22, port 554... 
...remote connection opened Sending request: DESCRIBE rtsp://10.0.70.22/video1+audio1 RTSP/1.0 CSeq: 2 User-Agent: C:\sofodev\Mediasite\main\Core\Capture\Debug\Test_SfLive555Client.exe (LIVE555 Streaming Media v2012.11.16) Accept: application/sdp Received 759 new bytes of response data. Received a complete DESCRIBE response: RTSP/1.0 200 OK CSeq: 2 x-Accept-Dynamic-Rate: 1 Content-Type: application/sdp Content-Base: rtsp://10.0.70.22:554/video1+audio1 Content-Length: 602 v=0 o=- 1438766090 1357921184 IN IP4 10.0.70.22 s= RTSP server i=audio video live media server a=type:broadcast c=IN IP4 0.0.0.0 t=0 0 m=video 0 RTP/AVP 96 i=Video channel in H264 VBR format a=mpeg4-esid:201 a=control:trackID=0 a=rtpmap:96 H264/90000 a=fmtp:96 packetization-mode=1;profile-level-id=428032;sprop-parameter-sets=Z0KAMtoCgPSVIAAADwAAAwHCDAgAXjgA1AXvfC8IhGo=,aM48gA== m=audio 0 RTP/AVP 0 i=Audio channel in standard mu-Low (G711) format a=control:trackID=1 m=application 0 RTP/AVP 107 i=ONVIF metadata a=control:events a=sendonly a=rtpmap:107 vnd.onvif.metadata/90000 [URL:"rtsp://10.0.70.22:554/video1+audio1"]: Got a SDP description: v=0 o=- 1438766090 1357921184 IN IP4 10.0.70.22 s= RTSP server i=audio video live media server a=type:broadcast c=IN IP4 0.0.0.0 t=0 0 m=video 0 RTP/AVP 96 i=Video channel in H264 VBR format a=mpeg4-esid:201 a=control:trackID=0 a=rtpmap:96 H264/90000 a=fmtp:96 packetization-mode=1;profile-level-id=428032;sprop-parameter-sets=Z0KAMtoCgPSVIAAADwAAAwHCDAgAXjgA1AXvfC8IhGo=,aM48gA== m=audio 0 RTP/AVP 0 i=Audio channel in standard mu-Low (G711) format a=control:trackID=1 m=application 0 RTP/AVP 107 i=ONVIF metadata a=control:events a=sendonly a=rtpmap:107 vnd.onvif.metadata/90000 [URL:"rtsp://10.0.70.22:554/video1+audio1"]: Initiated the "video/H264" subsession (client ports 50154-50155) Sending request: SETUP rtsp://10.0.70.22:554/video1+audio1/trackID=0 RTSP/1.0 CSeq: 3 User-Agent: C:\sofodev\Mediasite\main\Core\Capture\Debug\Test_SfLive555Client.exe (LIVE555 
Streaming Media v2012.11.16) Transport: RTP/AVP;unicast;client_port=50154-50155 Received 138 new bytes of response data. Received a complete SETUP response: RTSP/1.0 200 OK CSeq: 3 Session: 301667356 Transport: RTP/AVP;unicast;server_port=64702-64703;client_port=50154-50155;ssrc=fec56a4b [URL:"rtsp://10.0.70.22:554/video1+audio1"]: Set up the "video/H264" subsession (client ports 50154-50155) [URL:"rtsp://10.0.70.22:554/video1+audio1"]: Created a data sink for the "video/H264" subsession [URL:"rtsp://10.0.70.22:554/video1+audio1"]: Initiated the "audio/PCMU" subsession (client ports 50156-50157) Sending request: SETUP rtsp://10.0.70.22:554/video1+audio1/trackID=1 RTSP/1.0 CSeq: 4 User-Agent: C:\sofodev\Mediasite\main\Core\Capture\Debug\Test_SfLive555Client.exe (LIVE555 Streaming Media v2012.11.16) Transport: RTP/AVP;unicast;client_port=50156-50157 Session: 301667356 Received 138 new bytes of response data. Received a complete SETUP response: RTSP/1.0 200 OK CSeq: 4 Session: 301667356 Transport: RTP/AVP;unicast;server_port=64802-64803;client_port=50156-50157;ssrc=74562d4d [URL:"rtsp://10.0.70.22:554/video1+audio1"]: Set up the "audio/PCMU" subsession (client ports 50156-50157) [URL:"rtsp://10.0.70.22:554/video1+audio1"]: Created a data sink for the "audio/PCMU" subsession [URL:"rtsp://10.0.70.22:554/video1+audio1"]: Initiated the "application/VND.ONVIF.METADATA" subsession (client ports 50158-50159) Sending request: SETUP rtsp://10.0.70.22:554/video1+audio1/events RTSP/1.0 CSeq: 5 User-Agent: C:\sofodev\Mediasite\main\Core\Capture\Debug\Test_SfLive555Client.exe (LIVE555 Streaming Media v2012.11.16) Transport: RTP/AVP;unicast;client_port=50158-50159 Session: 301667356 Received 138 new bytes of response data. 
Received a complete SETUP response: RTSP/1.0 200 OK CSeq: 5 Session: 301667356 Transport: RTP/AVP;unicast;server_port=64902-64903;client_port=50158-50159;ssrc=b1dca64c [URL:"rtsp://10.0.70.22:554/video1+audio1"]: Set up the "application/VND.ONVIF.METADATA" subsession (client ports 50158-50159) [URL:"rtsp://10.0.70.22:554/video1+audio1"]: Failed to create a data sink for the "application/VND.ONVIF.METADATA" subsession: liveMedia9 Sending request: PLAY rtsp://10.0.70.22:554/video1+audio1 RTSP/1.0 CSeq: 6 User-Agent: C:\sofodev\Mediasite\main\Core\Capture\Debug\Test_SfLive555Client.exe (LIVE555 Streaming Media v2012.11.16) Session: 301667356 Range: npt=0.000- Received 252 new bytes of response data. Received a complete PLAY response: RTSP/1.0 200 OK CSeq: 6 Session: 301667356 Range: npt=now- RTP-Info: url=rtsp://10.0.70.22:554/video1+audio1/trackID=0;seq=35993,url=rtsp://10.0.70.22:554/video1+audio1/trackID=1;seq=62348,url=rtsp://10.0.70.22:554/video1+audio1/events;seq=4808 [URL:"rtsp://10.0.70.22:554/video1+audio1"]: Started playing session... Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 32 bytes. Presentation time: 1358266133.440908! NPT: 46765.1 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 4 bytes. Presentation time: 1358266133.440908! NPT: 46765.1 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 23811 bytes. Presentation time: 1358266133.440908! NPT: 46765.1 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 2070 bytes. Presentation time: 1358266133.507908! NPT: 46765.2 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: 1358266133.539475! NPT: 144595 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 988 bytes. Presentation time: 1358266133.575908! NPT: 46765.3 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 628 bytes. Presentation time: 1358266133.642908! 
NPT: 46765.3 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: 1358266133.667475! NPT: 144595 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 655 bytes. Presentation time: 1358266133.707908! NPT: 46765.4 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 650 bytes. Presentation time: 1358266133.774908! NPT: 46765.5 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: 1358266133.795475! NPT: 144595 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 403 bytes. Presentation time: 1358266133.842908! NPT: 46765.5 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 418 bytes. Presentation time: -850744096.878427 NPT: 46765.6 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850744096.891680 NPT: 144595 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 396 bytes. Presentation time: -850744096.944427 NPT: 46765.7 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 281 bytes. Presentation time: -850744095.011427 NPT: 46765.7 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850744095.019680 NPT: 144596 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 260 bytes. Presentation time: -850744095.078427 NPT: 46765.8 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 238 bytes. Presentation time: -850744095.144427 NPT: 46765.9 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850744095.147680 NPT: 144596 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 299 bytes. Presentation time: -850744095.210427 NPT: 46765.9 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. 
Presentation time: -850744095.275680 NPT: 144596 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 253 bytes. Presentation time: -850744095.277578 NPT: 46766 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 141 bytes. Presentation time: -850744095.344578 NPT: 46766.1 Stream "rtsp://10.0.70.22:554/video1+audio1"; audio/PCMU: Received 1024 bytes. Presentation time: -850744095.403680 NPT: 144596 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 133 bytes. Presentation time: -850744095.411578 NPT: 46766.1 Stream "rtsp://10.0.70.22:554/video1+audio1"; video/H264: Received 134 bytes. Presentation time: -850744095.478578 NPT: 46766.2 From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Tuesday, January 15, 2013 7:39 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] negative presentation time in "afterGettingFrame" sink callback Received a complete PLAY response: RTSP/1.0 200 OK CSeq: 6 Session: 872219556 Range: npt=now- RTP-Info: url=rtsp://10.0.70.22:554/video1+audio1/trackID=0;seq=35711,url=rtsp://10.0.70.22:554/video1+audio1/trackID=1;seq=62268,url=rtsp://10.0.70.22:554/video1+audio1/events;seq=4796 FYI, it turned out that this "RTP-Info:" header - although legal - wasn't useful to us, because it didn't specify *both* the "seq" and "rtptime" parameters. Our RTSP client code was getting confused by this, which explains why the "NPT" values were turning out strange. I've just installed a new version of the code that ignores "RTP-Info:" headers - like this - that don't specify *both* the "seq" and "rtptime" parameters. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: 

From finlayson at live555.com  Tue Jan 15 10:55:48 2013
From: finlayson at live555.com (Ross Finlayson)
Date: Tue, 15 Jan 2013 10:55:48 -0800
Subject: [Live-devel] negative presentation time in "afterGettingFrame" sink callback
In-Reply-To: <1ED2F9A76678E0428E90FB2B6F93672D0108AB91@postal.sonicfoundry.net>
References: <1ED2F9A76678E0428E90FB2B6F93672D0108A6D7@postal.sonicfoundry.net> <979D0602-3001-495C-9FC1-008445613BA2@live555.com> <1ED2F9A76678E0428E90FB2B6F93672D0108A708@postal.sonicfoundry.net> <1ED2F9A76678E0428E90FB2B6F93672D0108A88C@postal.sonicfoundry.net> <1AA92EA0-5D36-4D3C-8C08-6C9CC8D19049@live555.com> <1ED2F9A76678E0428E90FB2B6F93672D0108AB91@postal.sonicfoundry.net>
Message-ID: <826A8282-6335-4F63-813C-912BB2BA9240@live555.com>

> I downloaded your latest and ran it. The output is below.
> I don't see much difference between the versions and I couldn't pin point your change diffing the sources.
>
> Am I missing something?

Yes, you didn't upgrade your client application. It's still running version 2012.11.16 of our software.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From michel.promonet at thalesgroup.com  Wed Jan 16 02:13:56 2013
From: michel.promonet at thalesgroup.com (PROMONET Michel)
Date: Wed, 16 Jan 2013 11:13:56 +0100
Subject: [Live-devel] RTP header extension
In-Reply-To: <8A4A85CF-DEE3-4CE6-BAE5-FC02B9C58ABD@live555.com>
References: <12191_1358252478_50F549BE_12191_2520_1_1BE8971B6CFF3A4F97AF4011882AA255015607AAEDDB@THSONEA01CMS01P.one.grp> <8A4A85CF-DEE3-4CE6-BAE5-FC02B9C58ABD@live555.com>
Message-ID: <28364_1358331320_50F67DB8_28364_17971_1_1BE8971B6CFF3A4F97AF4011882AA255015607AE98BF@THSONEA01CMS01P.one.grp>

Hi Ross, I guess it could be interesting to carry information inside the stream independently of the codec used. Actually we are using RTSP GET_PARAMETER to poll for such extra information, which is quite awkward. The patch posted in the mailing list at http://lists.live555.com/pipermail/live-devel/2009-April/010424.html seems good, doesn't it?

Best Regards,
Michel.

[@@THALES GROUP RESTRICTED@@]

De : live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] De la part de Ross Finlayson
Envoyé : mardi 15 janvier 2013 14:39
À : LIVE555 Streaming Media - development & use
Objet : Re: [Live-devel] RTP header extension

Do you plan to include a callback to give the opportunity to manage the RTP header without rewriting MultiFramedRTPSource? Yes, at some point...

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From finlayson at live555.com  Wed Jan 16 07:04:33 2013
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 16 Jan 2013 07:04:33 -0800
Subject: [Live-devel] RTP header extension
In-Reply-To: <28364_1358331320_50F67DB8_28364_17971_1_1BE8971B6CFF3A4F97AF4011882AA255015607AE98BF@THSONEA01CMS01P.one.grp>
References: <12191_1358252478_50F549BE_12191_2520_1_1BE8971B6CFF3A4F97AF4011882AA255015607AAEDDB@THSONEA01CMS01P.one.grp> <8A4A85CF-DEE3-4CE6-BAE5-FC02B9C58ABD@live555.com> <28364_1358331320_50F67DB8_28364_17971_1_1BE8971B6CFF3A4F97AF4011882AA255015607AE98BF@THSONEA01CMS01P.one.grp>
Message-ID: <8D621190-17B3-42EF-A2FC-227D355FF585@live555.com>

> I guess it could be interesting to carry information inside the stream independently of the codec used.

That might be "interesting", but not necessarily appropriate. It depends on what sort of 'information' this is. The use of an RTP header extension is appropriate ***only if*** the information is directly related to the RTP packets (not just the stream as a whole). For example, one can imagine some RTP packets carrying an extra timestamp (e.g., a 'decoding timestamp'), in addition to the usual RTP timestamp (from which a 'presentation timestamp' is derived).
If the 'information' is static, and unchanging, then it could be put in the stream's SDP description (e.g., the 'info' or 'description' SDP lines). There are (optional) parameters to "ServerMediaSession::createNew()" to provide this information, and also - at the receiving end - member functions of "MediaSession" to get this information: sessionName(); sessionDescription(); Another way to get information that's static (or doesn't change much) is to use the RTSP "GET_PARAMETER" command, as you've done. For information that is time-based - i.e., changes over time - but is not directly related to an existing media stream (i.e., the audio or video stream) - then the information could itself be its own RTP media stream - e.g., using the "text" media type. Note, for example, that we support time-varying T.140 text streams over RTP, using the class "T140TextRTPSink". (That's used for transmitting text over RTP; for receiving such streams, we just use "SimpleRTPSource".) We use such streams to transmit the 'subtitle' tracks from Matroska files (and VLC, when used as a RTP receiver, will also display these as subtitles). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
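[Editor's note] For readers following the RFC 3550 details behind this thread: when the X bit of an RTP packet is set, a 4-byte extension header (a 16-bit profile ID and a 16-bit length counted in 32-bit words) follows the CSRC list. A minimal stand-alone parser, independent of MultiFramedRTPSource, might look like:

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>

// Result of looking for an RFC 3550 header extension in a raw RTP packet.
struct RtpExtensionInfo {
  bool present;
  uint16_t profile;
  size_t lengthBytes; // extension payload, excluding the 4-byte extension header
};

RtpExtensionInfo parseExtension(const uint8_t* pkt, size_t len) {
  RtpExtensionInfo info{false, 0, 0};
  if (len < 12) return info;                 // shorter than the fixed RTP header
  if ((pkt[0] & 0x10) == 0) return info;     // X bit clear: no extension
  unsigned csrcCount = pkt[0] & 0x0F;
  size_t off = 12 + 4 * csrcCount;           // extension starts after the CSRC list
  if (len < off + 4) return info;            // truncated packet
  info.present = true;
  info.profile = (uint16_t(pkt[off]) << 8) | pkt[off + 1];
  info.lengthBytes = ((size_t(pkt[off + 2]) << 8) | pkt[off + 3]) * 4;
  return info;
}
```

A callback of the kind Michel asks for would essentially hand `profile` and the `lengthBytes` of payload to user code instead of skipping them.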
URL: 

From kritisinghal23 at gmail.com  Wed Jan 16 04:01:37 2013
From: kritisinghal23 at gmail.com (kriti singhal)
Date: Wed, 16 Jan 2013 17:31:37 +0530
Subject: [Live-devel] Question regarding streaming application
Message-ID: 

Hello sir, I made a streamer using your libraries which streams live video from a camera:

int initLm555Settings(void)
{
  scheduler = BasicTaskScheduler::createNew();
  env = BasicUsageEnvironment::createNew(*scheduler);

  destinationAddressStr
#ifdef USE_SSM
      = "232.255.42.42";
#else
      = StreamingIp;
#endif

  const unsigned short rtpPortNum = 18888;
  const unsigned short rtcpPortNum = rtpPortNum + 1;
  const unsigned char ttl = 7;

  struct in_addr destinationAddress;
  destinationAddress.s_addr = our_inet_addr(destinationAddressStr);
  const Port rtpPort(rtpPortNum);
  const Port rtcpPort(rtcpPortNum);

  Groupsock rtpGroupsock(*env, destinationAddress, rtpPort, ttl);
  Groupsock rtcpGroupsock(*env, destinationAddress, rtcpPort, ttl);
#ifdef USE_SSM
  rtpGroupsock.multicastSendOnly();
  rtcpGroupsock.multicastSendOnly();
#endif

  g_ExitEventLoop = 0;

  videoSink = SimpleRTPSink::createNew(*env, &rtpGroupsock, 33, 90000,
                                       "video", "MP2T", 1, True, False /*no 'M' bit*/);
  setSendBufferTo(*env, rtpGroupsock.socketNum(), 1024 * 1024);

  // Create (and start) a 'RTCP instance' for this RTP sink:
  const unsigned estimatedSessionBandwidth = 5000; // in kbps; for RTCP b/w share
  const unsigned maxCNAMElen = 100;
  unsigned char CNAME[maxCNAMElen + 1];
  gethostname((char*)CNAME, maxCNAMElen);
  CNAME[maxCNAMElen] = '\0'; // just in case
  RTCPInstance* rtcp = RTCPInstance::createNew(*env, &rtcpGroupsock,
                                               estimatedSessionBandwidth, CNAME,
                                               videoSink, NULL /* we're a server */, isSSM);

  UserAuthenticationDatabase* authDB = NULL;
  portNumBits rtspServerPortNum = 554;
  unsigned reclamationTestSeconds = 65U;
  rtspServer = RTSPServer::createNew(*env, rtspServerPortNum, authDB, reclamationTestSeconds);
  if (rtspServer == NULL) {
    *env << "Failed to create RTSP server: " << env->getResultMsg() << "\n";
    rtspServerPortNum = 8554;
    rtspServer = RTSPServer::createNew(*env, rtspServerPortNum);
    if (rtspServer == NULL) {
      return 0;
    } else {
      *env << "Created RTSP server..\n";
    }
  } else {
    *env << "Created RTSP server..\n";
  }

  Boolean const inputStreamIsRawUDP = False;
  char const* descriptionString = {"Session streamed by \"testOnDemandRT\""};
  sms = ServerMediaSession::createNew(*env, streamName, streamName, descriptionString);
  sms->addSubsession(MPEG2TransportUDPServerMediaSubsession::createNew(
      *env, destinationAddressStr, rtpPortNum1, inputStreamIsRawUDP));
  rtspServer->addServerMediaSession(sms);
  char* url = rtspServer->rtspURL(sms);
  *env << "Play this stream using the URL \"" << url << "\"\n";
  delete[] url;

  if (rtspServer->setUpTunnelingOverHTTP(sport)) {
    cout << "\n\n\n(We use port " << rtspServer->httpServerPortNum()
         << " for optional RTSP-over-HTTP tunneling.)\n";
  } else {
    cout << "\n\n\n(RTSP-over-HTTP tunneling is not available.)";
  }

  play();
  env->taskScheduler().doEventLoop(&g_ExitEventLoop);

  if (rtspServer) Medium::close(rtspServer);
  if (rtcp) Medium::close(rtcp);
  if (videoSink) Medium::close(videoSink);
  if (fileSource) Medium::close(fileSource);
  rtpGroupsock.removeAllDestinations();
  rtcpGroupsock.removeAllDestinations();
  env->reclaim();
  delete scheduler;
  return 0; // only to prevent compiler warning
}

void afterPlaying(void* /*clientData*/)
{
  *env << "...done reading from file\n";
  videoSink->stopPlaying();
  // Note that this also closes the input file that this source read from.
  Medium::close(videoSource);
  // Start playing once again:
  play();
}

//================================================================
// play(): Play the input source.
//=================================================================
void play()
{
  // Open the input file as a 'byte-stream file source':
  fi_params.nFICardFrameSize = TRANSPORT_PACKETS_PER_NETWORK_PACKET * TRANSPORT_PACKET_SIZE;
  fi_params.pfnGetRTPPayload = GetRTPPayload;
  fi_params.socketNum = videoSink->groupsockBeingUsed().socketNum();
  DeviceParameters temp;
  fileSource = DeviceSourceFICard::createNew(*env, fi_params, temp);
  if (fileSource == NULL) {
    *env << "Unable to open Foresight card as a byte-stream file source\n";
    exit(1);
  }
  FramedSource* videoES = fileSource;
  // Create a framer for the Video Elementary Stream:
  videoSource = MPEG1or2VideoStreamDiscreteFramer::createNew(*env, videoES); // original
  // Finally, start playing:
  *env << "Beginning to read from file...\n";
  videoSink->startPlaying(*videoSource, afterPlaying, videoSink);
  // env->taskScheduler().scheduleDelayedTask(uSecsToDelay, (TaskFunc*)periodicbrMeasurement1, videoSink);
}

void StartRTPProcess(void)
{
  g_hRtpComThread = CreateThread((LPSECURITY_ATTRIBUTES)NULL, 0,
                                 (LPTHREAD_START_ROUTINE)initLm555Settings,
                                 0, 0, &g_dwRtpComThreadID);
  if (g_hRtpComThread)
    SetThreadPriority(g_hRtpComThread, THREAD_PRIORITY_LOWEST /*THREAD_PRIORITY_NORMAL*/);
}

int StopRTProcess(void)
{
  try {
    if (videoSource) videoSource->stopGettingFrames();
    *env << "in StopRTProcess\n";
    Sleep(500);
    Medium::close(rtspServer);
    g_ExitEventLoop = 1;
    g_ExitEventLoop = 0;
    g_hRtpComThread = 0;
    g_dwRtpComThreadID = 0;
    return 0;
  } catch (...) {
    return 0;
  }
}

The stream produced by the above streamer is picked up by a proxy server, to which I give the URL printed by the streamer at the line "char* url". This stream is then viewed by the client using the proxy server's URL. When the client asks to stop the stream, the streamer calls its method StopRTProcess(), but it gets stuck at the line "Medium::close(rtspServer);". Can you please tell me why? I know I have modified your code, but I still need some of your help.

Thanks
-------------- next part --------------
An HTML attachment was scrubbed...
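[Editor's note] A hang in Medium::close(rtspServer) when called from a second thread is consistent with tearing down LIVE555 objects outside the event-loop thread. The usual pattern is to signal the loop via its watch variable and do all teardown after doEventLoop() returns, on the loop thread itself. A library-free sketch of that hand-off (the names below are illustrative, not LIVE555 API):

```cpp
#include <atomic>

// All media objects are owned by the thread running the event loop; other
// threads request shutdown only by flipping the watch variable (modeled here
// as an atomic flag), mirroring doEventLoop(&g_ExitEventLoop).
std::atomic<char> g_watchVariable{0};
std::atomic<bool> g_teardownRanOnLoopThread{false};

void runEventLoop() {
  while (g_watchVariable.load() == 0) {
    // analogue of scheduler->SingleStep(): process one pending event
  }
  // Teardown happens HERE, after the loop returns, still on the loop thread;
  // this is the analogue of calling Medium::close(rtspServer) after
  // doEventLoop() - never from another thread while the loop is running.
  g_teardownRanOnLoopThread = true;
}

void requestStopFromAnotherThread() {
  g_watchVariable = 1; // the only cross-thread operation
}
```

Applied to the code above, that would mean setting g_ExitEventLoop from StopRTProcess() and moving the Medium::close() calls to after doEventLoop() returns (where initLm555Settings() already has them).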
URL: 

From tbatra18 at gmail.com  Wed Jan 16 05:42:32 2013
From: tbatra18 at gmail.com (Tarun Batra)
Date: Wed, 16 Jan 2013 19:12:32 +0530
Subject: [Live-devel] How to send TEAR DOWN request from Proxy Server?
Message-ID: 

Hello all, can you please guide me on how to send a TEARDOWN request from the Proxy Server?

Thanks
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From tayeb.dotnet at gmail.com  Wed Jan 16 11:53:22 2013
From: tayeb.dotnet at gmail.com (Tayeb Meftah)
Date: Wed, 16 Jan 2013 20:53:22 +0100
Subject: [Live-devel] Live555 Proxy server feature request
Message-ID: <3FA5E3391B754EDA95A51D6CA05BCA08@worksc08f920f1>

Hello, I have the following setup:
1. an RTSP back-end server (VoD) that contains about 50 videos;
2. a Live555 Proxy Server in front of it.
But I'm obliged to put all 50 movies into the URL list. Is there a way to make the Live555 Proxy Server proxy a single IP for any RTSP query?

Thanks
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From felix at embedded-sol.com  Thu Jan 17 06:28:20 2013
From: felix at embedded-sol.com (Felix Radensky)
Date: Thu, 17 Jan 2013 16:28:20 +0200
Subject: [Live-devel] Problem streaming live audio and video
Message-ID: <50F80A84.1060903@embedded-sol.com>

Hi, I'm trying to stream live H.264 video and live G.711 audio, using testMPEG1or2AudioVideoStreamer.cpp as an example. Both video and audio come from hardware encoders. As soon as an encoded buffer is available, the live555 code is notified via an event trigger associated with the media source. I'm using VLC to play the stream.

The problem is that video freezes almost immediately. If only video is streamed, there's no problem.

I suspect that audio can starve video, since the audio frame size produced by the encoder is 160 bytes and some kind of aggregation is required. Does this sound right?

Thanks a lot.
Felix.
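[Editor's note] As a sanity check on Felix's numbers: G.711 runs at 8000 one-byte samples per second, so a 160-byte frame spans 20 ms, i.e. 50 packets per second per audio stream if no aggregation is done:

```cpp
// G.711 (mu-law/A-law) is 8000 one-byte samples per second, so a 160-byte
// frame covers 160 / 8000 s = 20 ms, and sending each frame as its own RTP
// packet costs 1000 / 20 = 50 packets per second.
constexpr int kSampleRateHz = 8000;
constexpr int kFrameBytes = 160;

constexpr int frameDurationMs() { return kFrameBytes * 1000 / kSampleRateHz; }
constexpr int packetsPerSecond() { return 1000 / frameDurationMs(); }
```

That rate alone rarely starves a video stream; as Ross's reply suggests, wrong presentation times are the more common culprit for this symptom.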
From jagomez at indra.es  Thu Jan 17 02:14:24 2013
From: jagomez at indra.es (=?iso-8859-1?Q?G=F3mez_Tovar=2C_Jos=E9_Andr=E9s?=)
Date: Thu, 17 Jan 2013 11:14:24 +0100
Subject: [Live-devel] vlc 2.0.5 dont try to reconnect
In-Reply-To: <20130110132339.M35692@livingdata.pt>
References: <20130110132339.M35692@livingdata.pt>
Message-ID: <9601D59775A8DC4F814B839F5FA8AA0001EFE974@MADARRMAILBOX03.indra.es>

Hi, why, when I connect to an RTSP server and lose the signal for 10 to 20 seconds, doesn't VLC try to reconnect?

Steps:
1. Connect to the RTSP server
2. Receive video OK
3. Server loses the link
4. If the server recovers quickly, VLC continues receiving data; but if the link is lost for 20 seconds, for example, VLC drops the connection

Is this correct? Why? I propose that the RTSP manager try to reconnect to the RTSP server every X milliseconds until it receives the stop.

Anyway, thank you.

More info: http://forum.videolan.org/viewtopic.php?f=14&t=107653

PS: I suppose it is a live555 behaviour

This email and any file attached to it (when applicable) contain(s) confidential information that is exclusively addressed to its recipient(s). If you are not the indicated recipient, you are informed that reading, using, disseminating and/or copying it without authorisation is forbidden in accordance with the legislation in effect. If you have received this email by mistake, please immediately notify the sender of the situation by resending it to their email address. Avoid printing this message if it is not absolutely necessary.

From conchi.ap at vaelsys.com  Thu Jan 17 06:03:02 2013
From: conchi.ap at vaelsys.com (Conchi Abasolo)
Date: Thu, 17 Jan 2013 15:03:02 +0100
Subject: [Live-devel] FramedSource getNextFrame() problem
Message-ID: <50F80496.8040109@vaelsys.com>

Hello, I'm experiencing the same problem described in the following thread: http://lists.live555.com/pipermail/live-devel/2005-August/002993.html using the live555ProxyServer code included in the "proxyServer" folder (2013.01.15 version), and connecting to an MPEG-4 stream through RTSP-over-HTTP. You can see the detailed log below:

Requesting RTSP-over-HTTP tunneling (on port 80)
Sending request: GET /mpeg4/media.amp/ HTTP/1.0
CSeq: 1
Authorization: Basic cm9vdDp0aXM=
User-Agent: ProxyRTSPClient (LIVE555 Streaming Media v2013.01.15)
x-sessioncookie: 7a5fa728ba8348bbf409313
Accept: application/x-rtsp-tunnelled
Pragma: no-cache
Cache-Control: no-cache

Received 325 new bytes of response data.
Received a complete GET response:
HTTP/1.0 403 Forbidden
Date: Thu, 17 Jan 2013 13:30:11 GMT
Accept-Ranges: bytes
Connection: close
Content-Type: text/html; charset=ISO-8859-1
(plus 176 additional bytes)

FramedSource[0x82850d8]::getNextFrame(): attempting to read more than once at the same time!
Aborted

When this happens, the getNextFrame method calls the UsageEnvironment's "internalError", which makes an abort(). I would be grateful if anyone could help me find a way to close the connection with the back-end without exiting the entire program.

Thank you in advance
Conchi Abasolo
-------------- next part --------------
An HTML attachment was scrubbed...
URL: From finlayson at live555.com Thu Jan 17 06:50:56 2013 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 17 Jan 2013 06:50:56 -0800 Subject: [Live-devel] Problem streaming live audio and video In-Reply-To: <50F80A84.1060903@embedded-sol.com> References: <50F80A84.1060903@embedded-sol.com> Message-ID: > I'm trying to stream live H.264 video and live G.711 audio, using > testMPEG1or2AudioVideoStreamer.cpp as an example. I'm not sure how much of an 'example' the "testMPEG1or2AudioVideoStreamer" code can be, given that it streams from a file source, and uses completely different codecs (and thus uses completely different "RTPSink" subclasses). But anyway... > Both video > and audio come from hardware encoders. As soon as encoded buffer > is available, live555 code is notified via event trigger associated > with media source. I'm using VLC to play the stream. > > The problem is that video freezes almost immediately. > If only video is streamed, there's no problem. > > I suspect that audio can starve video since audio frame size produced > by encoder is 160 bytes and some find of aggregation is required. > Does this sound right ? That's a possibility. However a more likely cause of problems like this is that you are setting "fPresentationTime" incorrectly for one or both of your audio and video substreams - so they end up being non-synchronized at the receiver (VLC) end. For each frame delivered by your input source, "fPresentationTime" must be set properly, and must be aligned with 'wall clock' time - i.e., the time that you'd get by calling "gettimeofday()". Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
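[Editor's note] Ross's requirement can be sketched outside the library: whatever structure stands in for the FramedSource subclass, its fPresentationTime must come from the wall clock. The struct below is a stand-in for illustration, not the LIVE555 class:

```cpp
#include <sys/time.h>

// Minimal stand-in for a FramedSource subclass's deliverFrame(): each frame
// is stamped with the wall clock ("gettimeofday()" time), which is what
// RTCP-based audio/video synchronization at the receiver expects.
struct FakeFrame {
  struct timeval fPresentationTime;
};

void stampFrame(FakeFrame& f) {
  // Wall-clock time, NOT a media clock or a counter starting from zero.
  gettimeofday(&f.fPresentationTime, nullptr);
}
```

A common mistake is to stamp frames with the encoder's own timebase (e.g. starting at zero); the audio and video substreams then drift apart at the client, producing freezes like the one described.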
URL: From finlayson at live555.com Thu Jan 17 06:51:56 2013 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 17 Jan 2013 06:51:56 -0800 Subject: [Live-devel] vlc 2.0.5 dont try to reconnect In-Reply-To: <9601D59775A8DC4F814B839F5FA8AA0001EFE974@MADARRMAILBOX03.indra.es> References: <20130110132339.M35692@livingdata.pt> <9601D59775A8DC4F814B839F5FA8AA0001EFE974@MADARRMAILBOX03.indra.es> Message-ID: > Why when I connect to a rtsp server and loose the signal during 10 to 20 second the vlc don't try to reconnect? Because VLC is not programmed that way. However, VLC is not our software, so your question is off-topic for this mailing list. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Jan 17 07:15:02 2013 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 17 Jan 2013 07:15:02 -0800 Subject: [Live-devel] FramedSource getNextFrame() problem In-Reply-To: <50F80496.8040109@vaelsys.com> References: <50F80496.8040109@vaelsys.com> Message-ID: > You can see the detailed log below: > Requesting RTSP-over-HTTP tunneling (on port 80) > Sending request: GET /mpeg4/media.amp/ HTTP/1.0 > CSeq: 1 > Authorization: Basic cm9vdDp0aXM= > User-Agent: ProxyRTSPClient (LIVE555 Streaming Media v2013.01.15) > x-sessioncookie: 7a5fa728ba8348bbf409313 > Accept: application/x-rtsp-tunnelled > Pragma: no-cache > Cache-Control: no-cache > Received 325 new bytes of response data. > Received a complete GET response: > HTTP/1.0 403 Forbidden > Date: Thu, 17 Jan 2013 13:30:11 GMT > Accept-Ranges: bytes > Connection: close > Content-Type: text/html; charset=ISO-8859-1 > (plus 176 additional bytes) > FramedSource[0x82850d8]::getNextFrame(): attempting to read more than once at the same time! > Aborted Unfortunately I haven't been able to reproduce this error. However, it is easy to avoid: Just don't try to access that stream using RTSP-over-HTTP! 
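For background on that abort message: a "FramedSource" permits only one outstanding read at a time - getNextFrame() checks a pending-read flag and calls the environment's internal-error handler (which aborts) if a second read is requested before the first has been delivered. Below is a simplified, self-contained model of that guard; the class and member names are illustrative, not LIVE555 code, and the toy version sets an error flag instead of aborting:

```cpp
#include <cassert>

// Simplified model of the check behind the
// "attempting to read more than once at the same time!" abort.
// LIVE555's FramedSource keeps a similar flag (fIsCurrentlyAwaitingData)
// and calls the environment's internalError() - which aborts - when a
// read is requested while another is still pending.
class ToySource {
public:
  ToySource() : fIsCurrentlyAwaitingData(false), fErrorFlagged(false) {}

  // Returns false (instead of aborting) when a second read is requested
  // while the first is still outstanding.
  bool getNextFrame() {
    if (fIsCurrentlyAwaitingData) {
      fErrorFlagged = true; // the real code would abort here
      return false;
    }
    fIsCurrentlyAwaitingData = true;
    return true;
  }

  // Called when the pending frame has been delivered to the reader.
  void afterGetting() { fIsCurrentlyAwaitingData = false; }

  bool errorFlagged() const { return fErrorFlagged; }

private:
  bool fIsCurrentlyAwaitingData; // true while a read is outstanding
  bool fErrorFlagged;            // stands in for internalError()/abort()
};
```

The real class also records the caller's buffer and completion callback before returning, and clears the flag when the frame is handed over; the point here is only that overlapping calls are a programming error the library detects deliberately.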
The "HTTP/1.0 403 Forbidden" message - returned by the back-end server - means that that server does not support RTSP-over-HTTP tunneling (at least, not on port 80). So don't try it. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From felix at embedded-sol.com Thu Jan 17 07:25:12 2013 From: felix at embedded-sol.com (Felix Radensky) Date: Thu, 17 Jan 2013 17:25:12 +0200 Subject: [Live-devel] Problem streaming live audio and video In-Reply-To: References: <50F80A84.1060903@embedded-sol.com> Message-ID: <50F817D8.4000603@embedded-sol.com> An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Jan 17 08:13:25 2013 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 17 Jan 2013 08:13:25 -0800 Subject: [Live-devel] Problem streaming live audio and video In-Reply-To: <50F817D8.4000603@embedded-sol.com> References: <50F80A84.1060903@embedded-sol.com> <50F817D8.4000603@embedded-sol.com> Message-ID: <4085115F-BE30-4736-BFC3-973D2EC3031A@live555.com> >> That's a possibility. However a more likely cause of problems like this is that you are setting "fPresentationTime" incorrectly for one or both of your audio and video substreams - so they end up being non-synchronized at the receiver (VLC) end. >> >> For each frame delivered by your input source, "fPresentationTime" must be set properly, and must be aligned with 'wall clock' time - i.e., the time that you'd get by calling "gettimeofday()". > > I do set fPresentationTime using gettimeofday() in my implementation of DeviceSource. It's the same class for both video and audio. That's good. Nonetheless, the symptoms you describe are consistent with bad presentation times, so - just to be sure - I suggest that you try running "testRTSPClient" as your client application, and check that the frame presentation times are sane. 
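To make the 'wall clock' requirement concrete, here is a minimal standalone sketch (not LIVE555 code; the function names are illustrative) of stamping each frame with a gettimeofday()-based presentation time, as a "DeviceSource"-style class would assign to its inherited "fPresentationTime" member for every frame of every substream:

```cpp
#include <sys/time.h>
#include <stddef.h>

// Illustrative sketch: each delivered frame must carry a wall-clock
// presentation time. In a DeviceSource-style class this value would be
// assigned to the inherited fPresentationTime member just before the
// frame is handed to the downstream RTPSink.
struct timeval makePresentationTime() {
  struct timeval presentationTime;
  gettimeofday(&presentationTime, NULL); // wall-clock 'now'
  return presentationTime;
}

// Microseconds from a to b. Receivers (e.g. VLC) recover these times -
// via RTP timestamps plus RTCP sender reports - to keep the audio and
// video substreams synchronized, which is why both substreams must use
// the same wall-clock source.
long long diffUSecs(struct timeval const& a, struct timeval const& b) {
  return (long long)(b.tv_sec - a.tv_sec) * 1000000
       + (b.tv_usec - a.tv_usec);
}
```

If the two substreams stamp frames from different or non-wall-clock time bases, each stream plays fine alone but drifts apart when played together - consistent with the symptom described above.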
Also, because you report that you can stream video-only, but have a problem only when you try to stream both audio and video, I suggest that you try streaming audio-only.
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
-------------- next part --------------
An HTML attachment was scrubbed...
URL:
From conchi.ap at vaelsys.com Thu Jan 17 08:59:25 2013
From: conchi.ap at vaelsys.com (Conchi Abasolo)
Date: Thu, 17 Jan 2013 17:59:25 +0100
Subject: [Live-devel] ProxyServerMediaSubsession Segmentation fault
Message-ID: <50F82DED.2090809@vaelsys.com>
Hello
I'm connecting to an MPEG-4 stream over TCP (because I can't use UDP), using the example live555ProxyServer (with the -t option).
The connection succeeds and I can see the video transmission, but when the back end sends a BYE, live555ProxyServer crashes after trying to close the ProxyServerMediaSubsessions.
I would like to know what's triggering this issue and the best way to solve it. Just as a suggestion: is it possible that the program was trying to use a deleted object or something like that?
Here is the complete log:
./live555ProxyServer -V -t rtsp://usr:pswd at 172.31.17.67:554/mpeg4/media.amp
LIVE555 Proxy Server
(LIVE555 Streaming Media library version 2013.01.15)
Opening connection to 172.31.17.67, port 554...
RTSP stream, proxying the stream "rtsp://usr:pswd at 172.31.17.67:554/mpeg4/media.amp"
Play this stream using the URL: rtsp://172.25.6.230:8555/proxyStream
(We use port 8000 for optional RTSP-over-HTTP tunneling.)
...remote connection opened
Sending request: DESCRIBE rtsp://usr:pswd at 172.31.17.67:554/mpeg4/media.amp RTSP/1.0
CSeq: 2
User-Agent: ProxyRTSPClient (LIVE555 Streaming Media v2013.01.15)
Accept: application/sdp
Received 73 new bytes of response data.
Received a complete DESCRIBE response: RTSP/1.0 401 Unauthorized
CSeq: 2
WWW-Authenticate: Basic realm="/"
Resending...
Sending request: DESCRIBE rtsp://usr:pswd at 172.31.17.67:554/mpeg4/media.amp RTSP/1.0 CSeq: 3 Authorization: Basic cm9vdDp0aXM= User-Agent: ProxyRTSPClient (LIVE555 Streaming Media v2013.01.15) Accept: application/sdp Received 857 new bytes of response data. Received a complete DESCRIBE response: RTSP/1.0 200 OK CSeq: 3 Content-Base: rtsp://172.31.17.67:554/mpeg4/media.amp/ Content-Type: application/sdp Content-Length: 721 v=0 o=- 1358442587034508 1358442587034519 IN IP4 172.31.17.67 s=Media Presentation e=NONE c=IN IP4 0.0.0.0 b=AS:300 t=0 0 a=control:* a=range:npt=now- a=mpeg4-iod: "data:application/mpeg4-iod;base64,AoDUAE8BAf8DAQOAbwABQFBkYXRhOmFwcGxpY2F0aW9uL21wZWc0LW9kLWF1O2Jhc2U2NCxBUjBCR3dVZkF4Y0F5U1FBWlFRTklCRUVrK0FBQkpQZ0FBU1Q0QVlCQkE9PQQNAQUABAAAAAAAAAAAAAYJAQAAAAAAAAAAAzoAAkA2ZGF0YTphcHBsaWNhdGlvbi9tcGVnNC1iaWZzLWF1O2Jhc2U2NCx3QkFTWVFTSVVFVUZQd0E9BBICDQAAAgAAAAAAAAAABQMAAEAGCQEAAAAAAAAAAA==" a=isma-compliance:1,1.0,1 m=video 0 RTP/AVP 96 b=AS:300 a=framerate:10.0 a=control:trackID=1 a=rtpmap:96 MP4V-ES/90000 a=fmtp:96 profile-level-id=3; config=000001B003000001B5891300000100000001200086C40FA28782168A21 a=mpeg4-esid:201 ProxyServerMediaSession["rtsp://172.31.17.67:554/mpeg4/media.amp/"] added new "ProxyServerMediaSubsession" for RTP/video/MP4V-ES track ProxyServerMediaSubsession["MP4V-ES"]::createNewStreamSource(session id 0) Initiated: ProxyServerMediaSubsession["MP4V-ES"] ProxyServerMediaSubsession["MP4V-ES"]::createNewRTPSink() ProxyServerMediaSubsession["MP4V-ES"]::closeStreamSource() ProxyServerMediaSubsession["MP4V-ES"]::createNewStreamSource(session id 564885010) Sending request: SETUP rtsp://172.31.17.67:554/mpeg4/media.amp/trackID=1 RTSP/1.0 CSeq: 4 Authorization: Basic cm9vdDp0aXM= User-Agent: ProxyRTSPClient (LIVE555 Streaming Media v2013.01.15) Transport: RTP/AVP/TCP;unicast;interleaved=0-1 ProxyServerMediaSubsession["MP4V-ES"]::createNewRTPSink() Received 124 new bytes of response data. 
Received a complete SETUP response: RTSP/1.0 200 OK CSeq: 4 Session: 0157492160;timeout=60 Transport: RTP/AVP/TCP;unicast;interleaved=130-131;mode="PLAY" ProxyRTSPClient["rtsp://172.31.17.67:554/mpeg4/media.amp/"]::continueAfterSETUP(): head codec: MP4V-ES; numSubsessions 1 queue: MP4V-ES Sending request: PLAY rtsp://172.31.17.67:554/mpeg4/media.amp/ RTSP/1.0 CSeq: 5 Authorization: Basic cm9vdDp0aXM= User-Agent: ProxyRTSPClient (LIVE555 Streaming Media v2013.01.15) Session: 0157492160 Received a complete PLAY response: RTSP/1.0 200 OK CSeq: 5 Session: 0157492160 Range: npt=now- RTP-Info: url=trackID=1;seq=29566;rtptime=2487697867 Sending request: OPTIONS rtsp://172.31.17.67:554/mpeg4/media.amp/ RTSP/1.0 CSeq: 6 Authorization: Basic cm9vdDp0aXM= User-Agent: ProxyRTSPClient (LIVE555 Streaming Media v2013.01.15) ProxyServerMediaSubsession["MP4V-ES"]: received RTCP "BYE" ProxyServerMediaSubsession["MP4V-ES"]::closeStreamSource() Sending request: PAUSE rtsp://172.31.17.67:554/mpeg4/media.amp/ RTSP/1.0 CSeq: 7 Authorization: Basic cm9vdDp0aXM= User-Agent: ProxyRTSPClient (LIVE555 Streaming Media v2013.01.15) Session: 0157492160 ProxyRTSPClient["rtsp://172.31.17.67:554/mpeg4/media.amp/"]: lost connection to server ('errno': 22). Resetting... ProxyServerMediaSubsession["Hz? -ES"]::closeStreamSource() Opening connection to 172.31.17.67, port 554... Segmentation fault Many thanks Conchi Abasolo -------------- next part -------------- An HTML attachment was scrubbed... URL: From markuss at sonicfoundry.com Thu Jan 17 13:04:20 2013 From: markuss at sonicfoundry.com (Markus Schumann) Date: Thu, 17 Jan 2013 21:04:20 +0000 Subject: [Live-devel] stream descriptor only reachable by HTTP Message-ID: <1ED2F9A76678E0428E90FB2B6F93672D0108AE4F@postal.sonicfoundry.net> All, I have an RTP source where it's stream descriptor is only available via HTTP - any advice on how to go about it? Thanks Markus. 
URL: http://10.0.70.25/stream.sdp
Browser output:
v=0
o=- 8161451 8161451 IN IP4 10.0.70.25
s=ESP H264 STREAM
e=NONE
t=0 0
m=video 8800 RTP/AVP 96
c=IN IP4 10.0.71.24
a=rtpmap:96 H264/90000
a=fmtp:96 media=video; clock-rate=90000; encoding-name=H264; sprop-parameter-sets=Z0IAH6aAUAW5AA==,aM44gAAAAAA=
-------------- next part --------------
An HTML attachment was scrubbed...
URL:
From finlayson at live555.com Thu Jan 17 16:37:10 2013
From: finlayson at live555.com (Ross Finlayson)
Date: Thu, 17 Jan 2013 16:37:10 -0800
Subject: [Live-devel] stream descriptor only reachable by HTTP
In-Reply-To: <1ED2F9A76678E0428E90FB2B6F93672D0108AE4F@postal.sonicfoundry.net>
References: <1ED2F9A76678E0428E90FB2B6F93672D0108AE4F@postal.sonicfoundry.net>
Message-ID:
> I have an RTP source where its stream descriptor is only available via HTTP - any advice on how to go about it?
[...]
> Browser output:
>
> v=0
> o=- 8161451 8161451 IN IP4 10.0.70.25
> s=ESP H264 STREAM
> e=NONE
> t=0 0
> m=video 8800 RTP/AVP 96
> c=IN IP4 10.0.71.24
> a=rtpmap:96 H264/90000
> a=fmtp:96 media=video; clock-rate=90000; encoding-name=H264; sprop-parameter-sets=Z0IAH6aAUAW5AA==,aM44gAAAAAA=
Grumble.... I wish people (in this case, the developers of that server) would stop thinking that they can avoid implementing RTSP. The RTSP protocol was designed and standardized for a good reason!
3/ Then, iterate through the "MediaSubsession" objects (there should be just one) of the "MediaSession" object, and call yourMediaSinkObject->startPlaying(*(subsession->readSource()), ); One potential issue here is that - using this approach - RTCP "RR" packets from you (the receiver) will not get sent back to the server (because the receiver - because it didn't use RTSP - has no way of knowing the server's IP address). This means that if the server uses "RR" packets coming from the receiver to indicate 'liveness', then it will eventually time out the connection. If you have any control over the server, it would be better if you told it to stream via multicast instead of unicast, or - even better - to implement RTSP! Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Jan 18 02:30:43 2013 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 18 Jan 2013 02:30:43 -0800 Subject: [Live-devel] Live555 Proxy server feature request In-Reply-To: <3FA5E3391B754EDA95A51D6CA05BCA08@worksc08f920f1> References: <3FA5E3391B754EDA95A51D6CA05BCA08@worksc08f920f1> Message-ID: > Please, i have the following: > a RTSP Back end server (Vod) that contin about 50Video's > 2. Live555 Proxy server in front of it > but i'm obligated to put akll the 50 movie's into the url list, is there a way to put Live555 Proxy Server to proxy a single ip for any rtsp query? No, not with the current implementation. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From alexander.sidorov at iss.ru Fri Jan 18 03:51:44 2013 From: alexander.sidorov at iss.ru (Alexander Sidorov) Date: Fri, 18 Jan 2013 15:51:44 +0400 Subject: [Live-devel] JPEG quanrizer tables origin Message-ID: Hi, everybody! Could you tell me the origin of 'luma' and 'chroma' quantizer tables specified at JPEGVideoRTPSource.cpp? 
They differ from the tables specified in RFC 2435, but some IP cameras work correctly only with these tables.
So where are they from?
Alexander
-- Using Opera's revolutionary email client: http://www.opera.com/mail/
From finlayson at live555.com Fri Jan 18 08:56:38 2013
From: finlayson at live555.com (Ross Finlayson)
Date: Fri, 18 Jan 2013 08:56:38 -0800
Subject: [Live-devel] JPEG quanrizer tables origin
In-Reply-To:
References:
Message-ID: <3652BEBB-63BE-4795-8CD2-CC498A0FE805@live555.com>
> Could you tell me the origin of the 'luma' and 'chroma' quantizer tables specified in JPEGVideoRTPSource.cpp?
They were contributed to the code about 10 years ago.
> They differ from the tables specified in RFC 2435.
The values in the tables are actually the same, but in a different ('zigzag') order. I'm not an expert on JPEG, but apparently the tables in RFC 2435 should be copied to the JPEG headers in 'zigzag' order, but the sample code in RFC 2435 does not do this. I.e., the sample code in RFC 2435 is apparently in error.
Our code corrects this by reordering the tables, so that they can be copied to the JPEG headers using a simple "memcpy()". Other people's implementations do the same thing. See
http://massapi.com/source/fmj/src/net/sf/fmj/media/codec/video/jpeg/RFC2035.java.html (a Java implementation)
ftp://202.38.97.197/open/multimedia/VLC/ffmpeg/libavformat/rtpdec_jpeg.c (a C implementation for FFMPEG)
http://social.msdn.microsoft.com/Forums/nl-NL/mediafoundationdevelopment/thread/ece2b8f5-6487-4c23-8244-cf3bde315f5f (a C# implementation)
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
-------------- next part --------------
An HTML attachment was scrubbed...
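The reordering described above can be illustrated with a short, self-contained sketch. The index table below is the standard JPEG zigzag scan order; the function name is hypothetical, not the actual JPEGVideoRTPSource.cpp code (which ships its tables already reordered so they can be memcpy()'d directly):

```cpp
// The standard JPEG zigzag scan: kZigzag[i] is the natural (row-major)
// index of the i-th coefficient in zigzag order. Pre-applying this
// permutation to a quantizer table is what makes a plain memcpy() into
// a JPEG DQT header segment correct.
static const int kZigzag[64] = {
   0,  1,  8, 16,  9,  2,  3, 10,
  17, 24, 32, 25, 18, 11,  4,  5,
  12, 19, 26, 33, 40, 48, 41, 34,
  27, 20, 13,  6,  7, 14, 21, 28,
  35, 42, 49, 56, 57, 50, 43, 36,
  29, 22, 15, 23, 30, 37, 44, 51,
  58, 59, 52, 45, 38, 31, 39, 46,
  53, 60, 61, 54, 47, 55, 62, 63
};

// Convert a table given in natural (row-major, RFC 2435 Appendix A)
// order into zigzag order.
void naturalToZigzag(const unsigned char natural[64],
                     unsigned char zigzag[64]) {
  for (int i = 0; i < 64; ++i) zigzag[i] = natural[kZigzag[i]];
}
```

Applying this permutation once is why the two sets of tables contain the same 64 values but look different when compared entry by entry.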
URL: From CERLANDS at arinc.com Fri Jan 18 11:33:34 2013 From: CERLANDS at arinc.com (Erlandsson, Claes P (CERLANDS)) Date: Fri, 18 Jan 2013 19:33:34 +0000 Subject: [Live-devel] Windows build, some projects failing Message-ID: fyi: While downloading and compiling the latest (live.2013.01.15.tar) code I noticed testProgs, mediaServer and proxyServer still fails on Windows, i.e. Makefile.tail contains ?= which nmake doesn't like: PREFIX ?= /usr/local /Claes -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: smime.p7s Type: application/x-pkcs7-signature Size: 5740 bytes Desc: not available URL: From jshanab at smartwire.com Fri Jan 18 08:48:41 2013 From: jshanab at smartwire.com (Jeff Shanab) Date: Fri, 18 Jan 2013 16:48:41 +0000 Subject: [Live-devel] RTP header extension In-Reply-To: <8D621190-17B3-42EF-A2FC-227D355FF585@live555.com> References: <12191_1358252478_50F549BE_12191_2520_1_1BE8971B6CFF3A4F97AF4011882AA255015607AAEDDB@THSONEA01CMS01P.one.grp> <8A4A85CF-DEE3-4CE6-BAE5-FC02B9C58ABD@live555.com> <28364_1358331320_50F67DB8_28364_17971_1_1BE8971B6CFF3A4F97AF4011882AA255015607AE98BF@THSONEA01CMS01P.one.grp>, <8D621190-17B3-42EF-A2FC-227D355FF585@live555.com> Message-ID: <615FD77639372542BF647F5EBAA2DBC225249896@IL-BOL-EXCH01.smartwire.com> The containers using RTSP already define stream types for meta data beyond Video and Audio. Often used for analytics data for security camera video for example. Is this what we are talking about? ________________________________ From: live-devel-bounces at ns.live555.com [live-devel-bounces at ns.live555.com] on behalf of Ross Finlayson [finlayson at live555.com] Sent: Wednesday, January 16, 2013 9:04 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] RTP header extension I guess this could be interesting to carry information inside the stream independandly of codec used. 
That might be "interesting", but not necessarily appropriate. It depends on what sort of 'information' this is. The use of a RTP header extension is appropriate ***only if*** the information is directly related to the RTP packets (not just the stream as a whole). For example, one can imagine some RTP packets carrying an extra timestamp (e.g., a 'decoding timestamp'), in addition to the usual RTP timestamp (from which a 'presentation timestamp' is derived). If the 'information' is static, and unchanging, then it could be put in the stream's SDP description (e.g., the 'info' or 'description' SDP lines). There are (optional) parameters to "ServerMediaSession::createNew()" to provide this information, and also - at the receiving end - member functions of "MediaSession" to get this information: sessionName(); sessionDescription(); Another way to get information that's static (or doesn't change much) is to use the RTSP "GET_PARAMETER" command, as you've done. For information that is time-based - i.e., changes over time - but is not directly related to an existing media stream (i.e., the audio or video stream) - then the information could itself be its own RTP media stream - e.g., using the "text" media type. Note, for example, that we support time-varying T.140 text streams over RTP, using the class "T140TextRTPSink". (That's used for transmitting text over RTP; for receiving such streams, we just use "SimpleRTPSource".) We use such streams to transmit the 'subtitle' tracks from Matroska files (and VLC, when used as a RTP receiver, will also display these as subtitles). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Fri Jan 18 12:27:00 2013 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 18 Jan 2013 12:27:00 -0800 Subject: [Live-devel] Windows build, some projects failing In-Reply-To: References: Message-ID: <41205B01-5B54-48B9-AB75-CB4D998696EF@live555.com> On Jan 18, 2013, at 11:33 AM, "Erlandsson, Claes P (CERLANDS)" wrote: > fyi: While downloading and compiling the latest (live.2013.01.15.tar) code I noticed testProgs, mediaServer and proxyServer still fails on Windows, i.e. Makefile.tail contains ?= which nmake doesn't like: > PREFIX ?= /usr/local Yes, I forgot to change this to PREFIX = /usr/local in those directories. (I had done this in the library directories only.) I'll fix this in the next release of the software. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sat Jan 19 03:13:32 2013 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 19 Jan 2013 03:13:32 -0800 Subject: [Live-devel] ProxyServerMediaSubsession Segmentation fault In-Reply-To: <50F82DED.2090809@vaelsys.com> References: <50F82DED.2090809@vaelsys.com> Message-ID: <36DDD53D-A573-4945-AAFC-C10EC51E534A@live555.com> Thanks for the report. I've just installed a new version (2013.01.19) of the "LIVE555 Streaming Media" code that - I think - may fix this bug. Please try it, and if you find any more problems, please let us know. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sat Jan 19 03:16:33 2013 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 19 Jan 2013 03:16:33 -0800 Subject: [Live-devel] How to send TEAR DOWN request from Proxy Server? In-Reply-To: References: Message-ID: > Can you please guide me How to send TEAR DOWN request from Proxy Server? 
Normally you won't need to do this, because if no 'front-end' clients are playing the stream, the proxy server will "PAUSE" the back-end stream, so it will consume no significant resources, even though the connection to the back-end server remains open.
However (with the most recent version of the code), the proxy server will send a "TEARDOWN" request if you remove the "ProxyServerMediaSession" object from the server.
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
-------------- next part --------------
An HTML attachment was scrubbed...
URL:
From josan.chiao at fuho.com.tw Mon Jan 21 00:31:11 2013
From: josan.chiao at fuho.com.tw (=?big5?B?LrNuxemzoV9JUENBTb3SX7PsrMA=?=)
Date: Mon, 21 Jan 2013 08:31:11 +0000
Subject: [Live-devel] Some strange runtime error come from our TI IPCam~
In-Reply-To: <66BA2E7AFB5D2E4099C04AA75EE078582351926F@ex2k10.fuho.com.tw>
References: <66BA2E7AFB5D2E4099C04AA75EE078582351926F@ex2k10.fuho.com.tw>
Message-ID: <66BA2E7AFB5D2E4099C04AA75EE0785823519286@ex2k10.fuho.com.tw>
Dear Sir:
We are an IP camera manufacturer using the TI DM36x solution, which uses live555 as its RTSP server.
We tried to update to the latest live555 source code. We can build the executable (wis-streamer) successfully, but once we run wis-streamer on our platform, we get the runtime error shown below.
Can you kindly check it out for us to find the reason? I appreciate your help!
====================================
accept()ed connection from 192.168.1.90
Liveness indication from client at 192.168.1.90
Liveness indication from client at 192.168.1.90
RTSPClientSession[0x1bfb58]::handleRequestBytes() read 142 new bytes:DESCRIBE rtsp://192.168.1.100:554/video0.sdp RTSP/1.0
CSeq: 1
Accept: application/sdp
User-Agent: CmRtspClient 19110
Bandwidth: 384000
parseRTSPRequestString() succeeded, returning cmdName "DESCRIBE", urlPreSuffix "", urlSuffix "video0.sdp", CSeq "1", Content-Length 0, with 0 bytes following the message.
terminate called after throwing an instance of 'int' terminate called recursively ==================================== Have a good day! Regards, Josan -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Jan 21 00:55:18 2013 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 21 Jan 2013 00:55:18 -0800 Subject: [Live-devel] Some strange runtime error come from our TI IPCam~ In-Reply-To: <66BA2E7AFB5D2E4099C04AA75EE0785823519286@ex2k10.fuho.com.tw> References: <66BA2E7AFB5D2E4099C04AA75EE078582351926F@ex2k10.fuho.com.tw> <66BA2E7AFB5D2E4099C04AA75EE0785823519286@ex2k10.fuho.com.tw> Message-ID: <23780C3E-F7B8-4B09-8E28-70293C545030@live555.com> > We are IP Camera manufacturer and use TI DM36x solution, and it use live555 as rtsp server! > We try to update to latest live555 source code! We can build the execution file (wis-streamer) successfully. > But once we run the wis-streamer on our platform, it got runtime error as follows: [...] > terminate called after throwing an instance of 'int' > terminate called recursively [...] > Can you kindly check it out for us to find the reason!? Unfortunately I have no idea what might be causing that error. You're going to have to track this down yourself. Sorry. (However, you might find it useful to run a simple RTSP server application - such as "testOnDemandRTSPServer" - on your IP camera first, before trying to run "wis-streamer".) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL:
From michel.promonet at thalesgroup.com Mon Jan 21 04:08:33 2013
From: michel.promonet at thalesgroup.com (PROMONET Michel)
Date: Mon, 21 Jan 2013 13:08:33 +0100
Subject: [Live-devel] RTP header extension
In-Reply-To: <8D621190-17B3-42EF-A2FC-227D355FF585@live555.com>
References: <12191_1358252478_50F549BE_12191_2520_1_1BE8971B6CFF3A4F97AF4011882AA255015607AAEDDB@THSONEA01CMS01P.one.grp> <8A4A85CF-DEE3-4CE6-BAE5-FC02B9C58ABD@live555.com> <28364_1358331320_50F67DB8_28364_17971_1_1BE8971B6CFF3A4F97AF4011882AA255015607AE98BF@THSONEA01CMS01P.one.grp> <8D621190-17B3-42EF-A2FC-227D355FF585@live555.com>
Message-ID: <7773_1358770148_50FD2FE4_7773_3282_1_1BE8971B6CFF3A4F97AF4011882AA2550156097EF359@THSONEA01CMS01P.one.grp>
Hi Ross,
Thanks for your analysis of the appropriate channel to use depending on the kind of data.
I would like to use an RTP header extension in order to signal whether a frame is a synchronisation point, and to send timestamps (recording time).
Do you think it's possible to give a callback to the RTSPClient?
For the server side, it seems necessary to modify MultiFramedRTPSink to allow setting the X bit and filling the extension buffer, isn't it?
Thanks & Regards,
Michel.
From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson
Sent: Wednesday, January 16, 2013 16:05
To: LIVE555 Streaming Media - development & use
Subject: Re: [Live-devel] RTP header extension
I guess this could be interesting to carry information inside the stream independently of the codec used.
That might be "interesting", but not necessarily appropriate. It depends on what sort of 'information' this is.
The use of a RTP header extension is appropriate ***only if*** the information is directly related to the RTP packets (not just the stream as a whole).
For example, one can imagine some RTP packets carrying an extra timestamp (e.g., a 'decoding timestamp'), in addition to the usual RTP timestamp (from which a 'presentation timestamp' is derived). If the 'information' is static, and unchanging, then it could be put in the stream's SDP description (e.g., the 'info' or 'description' SDP lines). There are (optional) parameters to "ServerMediaSession::createNew()" to provide this information, and also - at the receiving end - member functions of "MediaSession" to get this information: sessionName(); sessionDescription(); Another way to get information that's static (or doesn't change much) is to use the RTSP "GET_PARAMETER" command, as you've done. For information that is time-based - i.e., changes over time - but is not directly related to an existing media stream (i.e., the audio or video stream) - then the information could itself be its own RTP media stream - e.g., using the "text" media type. Note, for example, that we support time-varying T.140 text streams over RTP, using the class "T140TextRTPSink". (That's used for transmitting text over RTP; for receiving such streams, we just use "SimpleRTPSource".) We use such streams to transmit the 'subtitle' tracks from Matroska files (and VLC, when used as a RTP receiver, will also display these as subtitles). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL:
From finlayson at live555.com Mon Jan 21 06:27:49 2013
From: finlayson at live555.com (Ross Finlayson)
Date: Mon, 21 Jan 2013 06:27:49 -0800
Subject: [Live-devel] RTP header extension
In-Reply-To: <7773_1358770148_50FD2FE4_7773_3282_1_1BE8971B6CFF3A4F97AF4011882AA2550156097EF359@THSONEA01CMS01P.one.grp>
References: <12191_1358252478_50F549BE_12191_2520_1_1BE8971B6CFF3A4F97AF4011882AA255015607AAEDDB@THSONEA01CMS01P.one.grp> <8A4A85CF-DEE3-4CE6-BAE5-FC02B9C58ABD@live555.com> <28364_1358331320_50F67DB8_28364_17971_1_1BE8971B6CFF3A4F97AF4011882AA255015607AE98BF@THSONEA01CMS01P.one.grp> <8D621190-17B3-42EF-A2FC-227D355FF585@live555.com> <7773_1358770148_50FD2FE4_7773_3282_1_1BE8971B6CFF3A4F97AF4011882AA2550156097EF359@THSONEA01CMS01P.one.grp>
Message-ID: <95AA7B33-3D4D-4920-9348-430C2E45E8B9@live555.com>
> I would like to use an RTP header extension in order to signal whether a frame is a synchronisation point, and to send timestamps (recording time).
>
> Do you think it's possible to give a callback to the RTSPClient?
No, because RTP header extensions have nothing to do with RTSP.
Support - for both transmitters and receivers - for RTP header extensions will likely happen someday, but this is not currently a high-priority feature for free support. Sorry.
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
-------------- next part --------------
An HTML attachment was scrubbed...
URL:
From rafael.gdm at vaelsys.com Mon Jan 21 10:21:50 2013
From: rafael.gdm at vaelsys.com (Rafael Gil)
Date: Mon, 21 Jan 2013 19:21:50 +0100
Subject: [Live-devel] ffmpeg not working anymore with last version
Message-ID:
Hi
I am aware this is not an ffmpeg mailing list, but ffmpeg no longer works with the latest live555 release. I had no problems with previous releases, so I thought it was worth mentioning.
Here is the log from the proxy server when I tried to execute 'ffprobe "rtsp://192.168.0.27:8554/proxyStream"':
$ ./live555ProxyServer -V "rtsp://root:root at 192.168.0.74/axis-media/media.amp"
LIVE555 Proxy Server
(LIVE555 Streaming Media library version 2013.01.19)
Opening connection to 192.168.0.74, port 554...
RTSP stream, proxying the stream "rtsp://root:root at 192.168.0.74/axis-media/media.amp"
Play this stream using the URL: rtsp://192.168.0.27:8554/proxyStream
(We use port 8000 for optional RTSP-over-HTTP tunneling.)
...remote connection opened
Sending request: DESCRIBE rtsp://root:root at 192.168.0.74/axis-media/media.amp RTSP/1.0
CSeq: 2
User-Agent: ProxyRTSPClient (LIVE555 Streaming Media v2013.01.19)
Accept: application/sdp
Received 247 new bytes of response data.
Received a complete DESCRIBE response: RTSP/1.0 401 Unauthorized
CSeq: 2
WWW-Authenticate: Digest realm="AXIS_00408CA14A12", nonce="000fbe78Y61172682ff34c4ccd049593d107545d6851fb", stale=FALSE
WWW-Authenticate: Basic realm="AXIS_00408CA14A12"
Date: Mon, 21 Jan 2013 18:45:58 GMT
Resending...
Sending request: DESCRIBE rtsp://root:root at 192.168.0.74/axis-media/media.amp RTSP/1.0
CSeq: 3
Authorization: Digest username="root", realm="AXIS_00408CA14A12", nonce="000fbe78Y61172682ff34c4ccd049593d107545d6851fb", uri="rtsp://root:root at 192.168.0.74/axis-media/media.amp", response="4b4967e78a67ce86f01bf5611b971793"
User-Agent: ProxyRTSPClient (LIVE555 Streaming Media v2013.01.19)
Accept: application/sdp
Received 654 new bytes of response data.
Received a complete DESCRIBE response: RTSP/1.0 200 OK CSeq: 3 Content-Type: application/sdp Content-Base: rtsp://192.168.0.74/axis-media/media.amp/ Date: Mon, 21 Jan 2013 18:45:58 GMT Content-Length: 480 v=0 o=- 1358793958649265 1358793958649265 IN IP4 192.168.0.74 s=Media Presentation e=NONE b=AS:50064 t=0 0 a=control:* a=range:npt=0.000000- m=video 0 RTP/AVP 96 c=IN IP4 0.0.0.0 b=AS:50000 a=framerate:6.0 a=transform:1,0,0;0,1,0;0,0,1 a=control:trackID=1 a=rtpmap:96 H264/90000 a=fmtp:96 packetization-mode=1; profile-level-id=420029; sprop-parameter-sets=Z0IAKeKQGQJvy4C3AQEBpB4kRUA=,aM48gA== m=audio 0 RTP/AVP 0 c=IN IP4 0.0.0.0 b=AS:64 a=control:trackID=2 ProxyServerMediaSession["rtsp://192.168.0.74/axis-media/media.amp/"] added new "ProxyServerMediaSubsession" for RTP/video/H264 track ProxyServerMediaSession["rtsp://192.168.0.74/axis-media/media.amp/"] added new "ProxyServerMediaSubsession" for RTP/audio/PCMU track accept()ed connection from 192.168.0.27 RTSPClientConnection[0x8837348]::handleRequestBytes() read 66 new bytes:OPTIONS rtsp://192.168.0.27:8554/proxyStream RTSP/1.0 CSeq: 1 parseRTSPRequestString() failed; checking now for HTTP commands (for RTSP-over-HTTP tunneling)... parseHTTPRequestString() failed! sending response: RTSP/1.0 400 Bad Request Date: Mon, Jan 21 2013 18:18:06 GMT Allow: OPTIONS, DESCRIBE, SETUP, TEARDOWN, PLAY, PAUSE, GET_PARAMETER, SET_PARAMETER RTSPClientConnection[0x8837348]::handleRequestBytes() read -1 new bytes (of 10000); terminating connection! Regards -- Rafael Gil Vaelsys -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From markuss at sonicfoundry.com Mon Jan 21 10:25:06 2013 From: markuss at sonicfoundry.com (Markus Schumann) Date: Mon, 21 Jan 2013 18:25:06 +0000 Subject: [Live-devel] stream descriptor only reachable by HTTP In-Reply-To: References: <1ED2F9A76678E0428E90FB2B6F93672D0108AE4F@postal.sonicfoundry.net> Message-ID: <1ED2F9A76678E0428E90FB2B6F93672D0108AFA6@postal.sonicfoundry.net> Ross, Thanks for your input. I reported the absence of RTSP for the unicast case to the server maker. Luckily I can configure the device to multicast - here is the SDP http://10.0.70.25/stream.sdp outputs: v=0 o=- 8500488 8500488 IN IP4 10.0.70.25 s=ESP H264 STREAM e=NONE t=0 0 m=video 8800 RTP/AVP 96 c=IN IP4 225.0.0.0/1 a=rtpmap:96 H264/90000 a=fmtp:96 media=video; clock-rate=90000; encoding-name=H264; sprop-parameter-sets=Z0IAH6aAUAW5AA==,aM44gAAAAAA= Do I have to use a lightweight HTTP client to get the SDP and then side load your RTPClient with the string? Thanks Markus. From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Thursday, January 17, 2013 6:37 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] stream descriptor only reachable by HTTP I have an RTP source where it's stream descriptor is only available via HTTP - any advice on how to go about it? [...] Browser output: v=0 o=- 8161451 8161451 IN IP4 10.0.70.25 s=ESP H264 STREAM e=NONE t=0 0 m=video 8800 RTP/AVP 96 c=IN IP4 10.0.71.24 a=rtpmap:96 H264/90000 a=fmtp:96 media=video; clock-rate=90000; encoding-name=H264; sprop-parameter-sets=Z0IAH6aAUAW5AA==,aM44gAAAAAA= Grumble.... I wish people (in this case, the developers of that server) would stop thinking that they can avoid implementing RTSP. The RTSP protocol was designed and standardized for a good reason! 
Generally speaking, trying to specify a stream using an SDP description only - without using the RTSP protocol - works only for multicast streams, not unicast streams like this. But what *might* work here is the following: 1/ Call "MediaSession::createNew()" with this SDP description (string) as parameter, to create a new "MediaSession" object. 2/ Call "MediaSession::initiate()" on this "MediaSession" object. This will create "RTPSource" (and corresponding "RTCPInstance") objects, to receive the stream. 3/ Then, iterate through the "MediaSubsession" objects (there should be just one) of the "MediaSession" object, and call yourMediaSinkObject->startPlaying(*(subsession->readSource()), yourAfterPlayingFunc, yourClientData); One potential issue here is that - using this approach - RTCP "RR" packets from you (the receiver) will not get sent back to the server (because the receiver - because it didn't use RTSP - has no way of knowing the server's IP address). This means that if the server uses "RR" packets coming from the receiver to indicate 'liveness', then it will eventually time out the connection. If you have any control over the server, it would be better if you told it to stream via multicast instead of unicast, or - even better - to implement RTSP! Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL:

From finlayson at live555.com Mon Jan 21 12:22:24 2013 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 21 Jan 2013 12:22:24 -0800 Subject: [Live-devel] stream descriptor only reachable by HTTP In-Reply-To: <1ED2F9A76678E0428E90FB2B6F93672D0108AFA6@postal.sonicfoundry.net> References: <1ED2F9A76678E0428E90FB2B6F93672D0108AE4F@postal.sonicfoundry.net> <1ED2F9A76678E0428E90FB2B6F93672D0108AFA6@postal.sonicfoundry.net> Message-ID: > Do I have to use a lightweight HTTP client to get the SDP and then side load your RTPClient with the string? However you get the SDP description is up to you.
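In code, steps 1/ to 3/ might look roughly like the following sketch (the sink, the output filename, and the per-subsession "initiate()" call follow the "testRTSPClient" conventions; these names are placeholders, not code from the library's demo apps):

```cpp
#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"

// Sketch: receive a stream described only by a raw SDP string, without RTSP.
// "sdpDescription" is the string that was fetched over HTTP.
void playFromSDP(UsageEnvironment& env, char const* sdpDescription) {
  MediaSession* session = MediaSession::createNew(env, sdpDescription); // 1/
  if (session == NULL) return;

  MediaSubsessionIterator iter(*session);
  MediaSubsession* subsession;
  while ((subsession = iter.next()) != NULL) {
    if (!subsession->initiate()) continue; // 2/ creates RTPSource + RTCPInstance

    // 3/ attach a sink to the subsession's source:
    MediaSink* sink = FileSink::createNew(env, "received-stream");
    sink->startPlaying(*(subsession->readSource()), NULL, NULL);
  }

  env.taskScheduler().doEventLoop(); // does not return
}
```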
(However, since you already have it - because you included it in your last email - you shouldn't need to get it again :-) Once you have the SDP description, you can just follow steps 1/ to 3/ in my previous email. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL:

From finlayson at live555.com Mon Jan 21 14:17:19 2013 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 21 Jan 2013 14:17:19 -0800 Subject: [Live-devel] ffmpeg not working anymore with last version In-Reply-To: References: Message-ID: Thanks for the note. I've just installed a new version (2013.01.21) of the code that should fix this. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL:

From tayeb.dotnet at gmail.com Mon Jan 21 03:36:56 2013 From: tayeb.dotnet at gmail.com (Tayeb Meftah) Date: Mon, 21 Jan 2013 12:36:56 +0100 Subject: [Live-devel] Vob streaming using Live555 Media Server Message-ID: <84C9BCCFFD90415388D234B1EFA4705A@worksc08f920f1> Hello, is it possible to use the Live555 Media Server to stream VOB files? VobStreamer does only multicast, but I thought I could use the Live555 Media Server to have my DVDs as a VOD. Thanks, Tayeb Meftah Voice of the blind T Broadcast Freedom http://www.vobradio.org Phone: 447559762242 -------------- next part -------------- An HTML attachment was scrubbed... URL:

From liluotao526 at qq.com Mon Jan 21 17:38:04 2013 From: liluotao526 at qq.com (=?utf-8?B?5LqM5rOJ5pig5pyI?=) Date: Tue, 22 Jan 2013 09:38:04 +0800 Subject: [Live-devel] Variational fps problem Message-ID: An HTML attachment was scrubbed...
URL: From finlayson at live555.com Mon Jan 21 18:06:42 2013 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 21 Jan 2013 18:06:42 -0800 Subject: [Live-devel] Variational fps problem In-Reply-To: References: Message-ID: > I use an Android phone to record an mp4 file, including audio with AAC and video with H264, and upload the file in real time, while recording, via the RTSP protocol. I use openRTSP to receive the data stream, but the fps is variational; I don't know how to set the fps. It is the server - i.e., the software that *transmits* the data - that chooses the video frame rate. This is not something that can be controlled by an RTSP/RTP client (like "openRTSP", "testRTSPClient", or "VLC"), which just receives the data. So, your question should be sent to whoever developed your *server* software; not to us. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL:

From liluotao526 at qq.com Mon Jan 21 19:02:22 2013 From: liluotao526 at qq.com (=?utf-8?B?5LqM5rOJ5pig5pyI?=) Date: Tue, 22 Jan 2013 11:02:22 +0800 Subject: [Live-devel] Variational fps problem Message-ID: An HTML attachment was scrubbed... URL:

From josan.chiao at fuho.com.tw Mon Jan 21 18:40:25 2013 From: josan.chiao at fuho.com.tw (Josan) Date: Tue, 22 Jan 2013 02:40:25 +0000 (UTC) Subject: [Live-devel] error (ARM) : terminate called after throwing an instance of 'int' References: Message-ID: Sambhav writes: > > > Hi, > > I cross compiled the live555 media server for ARM linux using the montavista toolchain. The compilation was successful. When I run the test programs the following error occurs and the program gets aborted. > > > terminate called after throwing an instance of 'int' > terminate called recursively > Aborted > > > Any solutions for the error ?
> > Regards, > Sambhav > > > _______________________________________________ > live-devel mailing list > live-devel at ... > http://lists.live555.com/mailman/listinfo/live-devel > Hello Sambhav: We got the same situation like yours! Did you get the solution yet? Could you tell me how to get rid of this error? Thanks in advance! Regards, Josan From finlayson at live555.com Mon Jan 21 19:22:49 2013 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 21 Jan 2013 19:22:49 -0800 Subject: [Live-devel] Variational fps problem In-Reply-To: References: Message-ID: <4FEE6D7C-47DB-496B-85E4-93D71E78190E@live555.com> > But i wonder whether live555 can stream variational fps file. I'm not sure I understand your question, but because you say that you have a "mp4" file - the answer is probably NO, because our servers currently cannot transmit mp4 files. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Jan 21 21:26:54 2013 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 21 Jan 2013 21:26:54 -0800 Subject: [Live-devel] error (ARM) : terminate called after throwing an instance of 'int' In-Reply-To: References: Message-ID: > We got the same situation like yours! > Did you get the solution yet? Could you tell me how to get rid of this error? Remember, You Have Complete Source Code. As long as you have a console, you should be able to find out (by inserting appropriate "printf()"s in the code) exactly where/why it's crashing. Also, as I noted to the previous questioner, you'll probably find it easier to debug the problem using "testOnDemandRTSPServer", rather than (the more complicated) "wis-streamer". Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From josan.chiao at fuho.com.tw Mon Jan 21 22:24:47 2013 From: josan.chiao at fuho.com.tw (Josan) Date: Tue, 22 Jan 2013 06:24:47 +0000 (UTC) Subject: [Live-devel] error (ARM) : terminate called after throwing an instance of 'int' References: Message-ID: Ross Finlayson writes: > > > > We got the same situation as yours! Did you get the solution yet? Could you tell me how to get rid of this error? > > > Remember, You Have Complete Source Code. > As long as you have a console, you should be able to find out (by inserting appropriate "printf()"s in the code) exactly where/why it's crashing. > > Also, as I noted to the previous questioner, you'll probably find it easier to debug the problem using "testOnDemandRTSPServer", rather than (the more complicated) "wis-streamer". > > Ross Finlayson Live Networks, Inc. http://www.live555.com/ > > > > > _______________________________________________ > live-devel mailing list > live-devel at ... > http://lists.live555.com/mailman/listinfo/live-devel >

Thanks for your reply, Ross! I have tried to print out some messages to debug! But it crashed in the handleCmd_DESCRIBE() function of RTSPServer.cpp! Once it calls sdpDescription = session->generateSDPDescription(); it crashes! We use the montavista (mv_pro_5.0) toolchain as our cross-compiler. I just wonder if something is wrong in our Makefile? Because we could build live555 and run it successfully with an earlier version. But when I try to update live555 to the latest source code, I just cannot run it! Is there something I need to pay attention to? Please give me some hint or any suggestion! I appreciate your help! --------This is our Makefile------- MVTOOL_DIR=/opt/dm36x/mv_pro_5.0/montavista/pro/devkit/arm/v5t_le MVTOOL_PREFIX=$(MVTOOL_DIR)/bin/arm_v5t_le- CROSS_COMPILE= $(MVTOOL_PREFIX) COMPILE_OPTS = $(INCLUDES) -I.
-O3 -DSOCKLEN_T=socklen_t -DNO_STRSTREAM=1 -D_LARGEFILE_SOURCE=1 -D_FILE_OFFSET_BITS=64 C = c C_COMPILER = $(CROSS_COMPILE)gcc C_FLAGS = $(COMPILE_OPTS) CPP = cpp CPLUSPLUS_COMPILER = $(CROSS_COMPILE)c++ CPLUSPLUS_FLAGS = $(COMPILE_OPTS) -Wall -DBSD=1 OBJ = o LINK = $(CROSS_COMPILE)c++ -o LINK_OPTS = -L. -s CONSOLE_LINK_OPTS = $(LINK_OPTS) LIBRARY_LINK = $(CROSS_COMPILE)ld -o LIBRARY_LINK_OPTS = $(LINK_OPTS) -r -Bstatic LIB_SUFFIX = a LIBS_FOR_CONSOLE_APPLICATION = LIBS_FOR_GUI_APPLICATION = EXE = ##### End of variables to change Regards, Josan

From conchi.ap at vaelsys.com Tue Jan 22 00:51:47 2013 From: conchi.ap at vaelsys.com (Conchi Abasolo) Date: Tue, 22 Jan 2013 09:51:47 +0100 Subject: [Live-devel] ProxyServerMediaSubsession Segmentation fault In-Reply-To: <50FD3A63.2080402@vaelsys.com> References: <50FD3A63.2080402@vaelsys.com> Message-ID: <50FE5323.6040301@vaelsys.com> Hello Ross Many thanks!!! I've been testing the new release and it fixes the bug that I reported last week. When a BYE is received from the backend, the proxy doesn't crash. But, once the connection with the frontend client is closed, if you try to connect again to the stream source, a new error appears. You can see below a complete log of a test connecting against a Mpeg4 stream source (through TCP): ./live555ProxyServer -V -t rtsp://root:qazwsx123 at 192.168.0.82:554/mpeg4/media.amp LIVE555 Proxy Server (LIVE555 Streaming Media library version 2013.01.19) Opening connection to 192.168.0.82, port 554... RTSP stream, proxying the stream "rtsp://root:qazwsx123 at 192.168.0.82:554/mpeg4/media.amp" Play this stream using the URL: rtsp://192.168.0.18/proxyStream (We use port 80 for optional RTSP-over-HTTP tunneling.) ...remote connection opened Sending request: DESCRIBE rtsp://root:qazwsx123 at 192.168.0.82:554/mpeg4/media.amp RTSP/1.0 CSeq: 2 User-Agent: ProxyRTSPClient (LIVE555 Streaming Media v2013.01.19) Accept: application/sdp Received 89 new bytes of response data.
Received a complete DESCRIBE response: RTSP/1.0 401 Unauthorized CSeq: 2 WWW-Authenticate: Basic realm="AXIS_00408C7CC50E" Resending... Sending request: DESCRIBE rtsp://root:qazwsx123 at 192.168.0.82:554/mpeg4/media.amp RTSP/1.0 CSeq: 3 Authorization: Basic cm9vdDpxYXp3c3gxMjM= User-Agent: ProxyRTSPClient (LIVE555 Streaming Media v2013.01.19) Accept: application/sdp Received 856 new bytes of response data. Received a complete DESCRIBE response: RTSP/1.0 200 OK CSeq: 3 Content-Base: rtsp://192.168.0.82:554/mpeg4/media.amp/ Content-Type: application/sdp Content-Length: 720 v=0 o=- 1358774696384221 1358774696384230 IN IP4 192.168.0.82 s=Media Presentation e=NONE c=IN IP4 0.0.0.0 b=AS:8000 t=0 0 a=control:* a=range:npt=now- a=mpeg4-iod: "data:application/mpeg4-iod;base64,AoFSAE8BAf/1AQNuAAFAUGRhdGE6YXBwbGljYXRpb24vbXBlZzQtb2QtYXU7YmFzZTY0LEFSMEJHd1VmQXhjQXlTUUFaUVFOSUJFRWsrQUFlaElBQUhvU0FBWUJCQT09BA0BBQAEAAAAAAAAAAAABgkBAAAAAAAAAAADOgACQDZkYXRhOmFwcGxpY2F0aW9uL21wZWc0LWJpZnMtYXU7YmFzZTY0LHdCQVNZUVNJVUVVRlB3QT0EEgINAAACAAAAAAAAAAAFAwAAQAYJAQAAAAAAAAAA" a=isma-compliance:1,1.0,1 m=video 0 RTP/AVP 96 b=AS:8000 a=framerate:5.0 a=control:trackID=1 a=rtpmap:96 MP4V-ES/90000 a=fmtp:96 profile-level-id=245; config=000001B0F5000001B5891300000100000001200086C40FA285020F0A21 a=mpeg4-esid:201 ProxyServerMediaSession["rtsp://192.168.0.82:554/mpeg4/media.amp/"] added new "ProxyServerMediaSubsession" for RTP/video/MP4V-ES track ProxyServerMediaSubsession["MP4V-ES"]::createNewStreamSource(session id 0) Initiated: ProxyServerMediaSubsession["MP4V-ES"] ProxyServerMediaSubsession["MP4V-ES"]::createNewRTPSink() ProxyServerMediaSubsession["MP4V-ES"]::closeStreamSource() ProxyServerMediaSubsession["MP4V-ES"]::createNewStreamSource(session id 3856461167) Sending request: SETUP rtsp://192.168.0.82:554/mpeg4/media.amp/trackID=1 RTSP/1.0 CSeq: 4 Authorization: Basic cm9vdDpxYXp3c3gxMjM= User-Agent: ProxyRTSPClient (LIVE555 Streaming Media v2013.01.19) Transport: 
RTP/AVP/TCP;unicast;interleaved=0-1 ProxyServerMediaSubsession["MP4V-ES"]::createNewRTPSink() Received 124 new bytes of response data. Received a complete SETUP response: RTSP/1.0 200 OK CSeq: 4 Session: 0525353949;timeout=60 Transport: RTP/AVP/TCP;unicast;interleaved=206-207;mode="PLAY" ProxyRTSPClient["rtsp://192.168.0.82:554/mpeg4/media.amp/"]::continueAfterSETUP(): head codec: MP4V-ES; numSubsessions 1 queue: MP4V-ES Sending request: PLAY rtsp://192.168.0.82:554/mpeg4/media.amp/ RTSP/1.0 CSeq: 5 Authorization: Basic cm9vdDpxYXp3c3gxMjM= User-Agent: ProxyRTSPClient (LIVE555 Streaming Media v2013.01.19) Session: 0525353949 Received a complete PLAY response: RTSP/1.0 200 OK CSeq: 5 Session: 0525353949 Range: npt=now- RTP-Info: url=trackID=1;seq=39937;rtptime=848048528 Sending request: OPTIONS rtsp://192.168.0.82:554/mpeg4/media.amp/ RTSP/1.0 CSeq: 6 Authorization: Basic cm9vdDpxYXp3c3gxMjM= User-Agent: ProxyRTSPClient (LIVE555 Streaming Media v2013.01.19) Received a complete OPTIONS response: RTSP/1.0 200 OK CSeq: 6 Public: DESCRIBE, GET_PARAMETER, PAUSE, PLAY, SETUP, SET_PARAMETER, TEARDOWN ProxyServerMediaSubsession["MP4V-ES"]: received RTCP "BYE" ProxyServerMediaSubsession["MP4V-ES"]::closeStreamSource() Sending request: PAUSE rtsp://192.168.0.82:554/mpeg4/media.amp/ RTSP/1.0 CSeq: 7 Authorization: Basic cm9vdDpxYXp3c3gxMjM= User-Agent: ProxyRTSPClient (LIVE555 Streaming Media v2013.01.19) Session: 0525353949 ProxyServerMediaSubsession["MP4V-ES"]::closeStreamSource() ProxyServerMediaSubsession["MP4V-ES"]::createNewStreamSource(session id 976923673) Opening connection to 192.168.0.82, port 554... ProxyServerMediaSubsession["MP4V-ES"]::createNewRTPSink() FramedSource[0x9651f00]::getNextFrame(): attempting to read more than once at the same time! Abortado Thank you in advance. Conchi Abasolo Vaelsys -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Tue Jan 22 01:30:13 2013 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 22 Jan 2013 01:30:13 -0800 Subject: [Live-devel] ProxyServerMediaSubsession Segmentation fault In-Reply-To: <50FE5323.6040301@vaelsys.com> References: <50FD3A63.2080402@vaelsys.com> <50FE5323.6040301@vaelsys.com> Message-ID: <52B942B3-8E5C-4B41-BE9B-F84AFDAE887C@live555.com> > I've been testing the new release and it fixes the bug that I reported last week. When a BYE is received from the backend, the proxy doesn't crash. > But, once the connection with the frontend client is closed, if you try to connect again to the stream source, a new error appears. Thanks for the report. I've just installed a new version (2013.01.22) of the code that should fix this. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL:

From finlayson at live555.com Tue Jan 22 02:32:41 2013 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 22 Jan 2013 02:32:41 -0800 Subject: [Live-devel] Vob streaming using Live555 Media Server In-Reply-To: <84C9BCCFFD90415388D234B1EFA4705A@worksc08f920f1> References: <84C9BCCFFD90415388D234B1EFA4705A@worksc08f920f1> Message-ID: <51ECE4BE-AA02-47B7-8489-EE059E21F5D1@live555.com> > Is it possible to use the Live555 Media Server to stream VOB files? As of the most recent release of the software, the "LIVE555 Media Server" (the source code version; not yet the pre-built binary versions) will support streaming VOB files (with filename extension ".vob"). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed...
URL: From conchi.ap at vaelsys.com Tue Jan 22 02:56:22 2013 From: conchi.ap at vaelsys.com (Conchi Abasolo) Date: Tue, 22 Jan 2013 11:56:22 +0100 Subject: [Live-devel] ProxyServerMediaSubsession Segmentation fault In-Reply-To: <52B942B3-8E5C-4B41-BE9B-F84AFDAE887C@live555.com> References: <50FD3A63.2080402@vaelsys.com> <50FE5323.6040301@vaelsys.com> <52B942B3-8E5C-4B41-BE9B-F84AFDAE887C@live555.com> Message-ID: <50FE7056.7000509@vaelsys.com> Thanks again! The new release seems to handle the RTCP BYE properly, and when you try to connect to the stream source again, the FramedSource error doesn't appear, but the reconnection doesn't complete successfully because of a "Session Not Found" error: ProxyServerMediaSubsession["MP4V-ES"]: received RTCP "BYE" ProxyServerMediaSubsession["MP4V-ES"]::closeStreamSource() Sending request: PAUSE rtsp://192.168.0.82:554/mpeg4/media.amp/ RTSP/1.0 CSeq: 7 Authorization: Basic cm9vdDpxYXp3c3gxMjM= User-Agent: ProxyRTSPClient (LIVE555 Streaming Media v2013.01.22) Session: 0703690848 ProxyServerMediaSubsession["MP4V-ES"]::closeStreamSource() Opening connection to 192.168.0.82, port 554... ...remote connection opened Sending request: GET_PARAMETER rtsp://192.168.0.82:554/mpeg4/media.amp/ RTSP/1.0 CSeq: 8 Authorization: Basic cm9vdDpxYXp3c3gxMjM= User-Agent: ProxyRTSPClient (LIVE555 Streaming Media v2013.01.22) Session: 0703690848 Content-Length: 2 Received 49 new bytes of response data. Received a complete GET_PARAMETER response: RTSP/1.0 200 OK CSeq: 8 Session: 0703690848 ProxyServerMediaSubsession["MP4V-ES"]::createNewStreamSource(session id 2141600421) Initiated: ProxyServerMediaSubsession["MP4V-ES"] Sending request: PLAY rtsp://192.168.0.82:554/mpeg4/media.amp/ RTSP/1.0 CSeq: 9 Authorization: Basic cm9vdDpxYXp3c3gxMjM= User-Agent: ProxyRTSPClient (LIVE555 Streaming Media v2013.01.22) Session: 0703690848 ProxyServerMediaSubsession["MP4V-ES"]::createNewRTPSink() Received 64 new bytes of response data.
Received a complete PLAY response: RTSP/1.0 454 Session Not Found CSeq: 9 Session: 0703690848 ProxyServerMediaSubsession["MP4V-ES"]::closeStreamSource() Sending request: PAUSE rtsp://192.168.0.82:554/mpeg4/media.amp/ RTSP/1.0 CSeq: 10 Authorization: Basic cm9vdDpxYXp3c3gxMjM= User-Agent: ProxyRTSPClient (LIVE555 Streaming Media v2013.01.22) Session: 0703690848 ProxyServerMediaSubsession["MP4V-ES"]::createNewStreamSource(session id 2029778874) Sending request: PLAY rtsp://192.168.0.82:554/mpeg4/media.amp/ RTSP/1.0 CSeq: 11 Authorization: Basic cm9vdDpxYXp3c3gxMjM= User-Agent: ProxyRTSPClient (LIVE555 Streaming Media v2013.01.22) Session: 0703690848 ProxyServerMediaSubsession["MP4V-ES"]::createNewRTPSink() Received 65 new bytes of response data. Received a complete PAUSE response: RTSP/1.0 454 Session Not Found CSeq: 10 Session: 0703690848 Conchi Abasolo Vaelsys -------------- next part -------------- An HTML attachment was scrubbed... URL: From bruno.abreu at livingdata.pt Tue Jan 22 07:59:33 2013 From: bruno.abreu at livingdata.pt (Bruno Abreu) Date: Tue, 22 Jan 2013 15:59:33 +0000 Subject: [Live-devel] StreamReplicator with FileSink problem In-Reply-To: References: <20130110132339.M35692@livingdata.pt> Message-ID: <20130121164802.M92973@livingdata.pt> Hi Ross, thank you for your quick reply. And sorry for taking so long to get back to you. We've been dealing with some other issues that arose regarding this problem (see below). On Thu, 10 Jan 2013 11:00:43 -0800, Ross Finlayson wrote > Instead, try making the following change to > "FileSink::afterGettingFrame()" (lines 130-132 of > "liveMedia/FileSink.cpp"): Change the order of the calls to > onSourceClosure(this); and stopPlaying(); so that "stopPlaying()" > is called first, before "onSourceClosure(this)". > > Let us know if this works for you. (If so, I'll make the change in > the next release of the software.) 1. Yes and no. It did work on the testReplicator app, but on our server a new issue surfaced. 
We started getting "Internal Error" messages from the StreamReplicator:

StreamReplicator::deliverReceivedFrame() Internal Error 2(2,2)! ...
StreamReplicator::deliverReceivedFrame() Internal Error 2(3,2)! ...
StreamReplicator::deliverReceivedFrame() Internal Error 2(4,2)! ...

And, soon enough, our DeviceSource would stop working, making all sinks stop as well. The main difference - regarding this issue - between our server and the testReplicator app is that, after an error occurs trying to write a video file, we schedule the writing to start again at a later time. We do that by overriding the MediaSink::stopPlaying method in our own FileSink class:

virtual void stopPlaying() {
  MediaSink::stopPlaying();

  // schedule to restart in 10 secs
  envir() << "OurFileSink stopped! Scheduling to restart in 10 secs.\n";
  envir().taskScheduler().scheduleDelayedTask(10000000,
      (TaskFunc *) OurFileSink::startPlaying, this);
}

We added this functionality to the testReplicator app and tested again. Oddly enough, we didn't get any error messages, nor did the multicast streaming stop playing. So, we delved in a little deeper: by examining the StreamReplicator error messages it soon became clear that with each deactivation/reactivation of the FileSink replica the fNumDeliveriesMadeSoFar variable would increase beyond the number of active replicas. We then added an ID to the StreamReplica class in order to make it easier to tell which replica was doing what, and populated the StreamReplicator with debug messages. We found the StreamReplica ID particularly useful. Something you may wish to add at a later time. By examining the behavior of the replicas we were able to determine that our FileSink replica was being deactivated while executing StreamReplicator::deliverReceivedFrame(), because, when FramedSource::afterGetting(replica), on line 259, was called, the write error would occur again.
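In "StreamReplicator::deliverReceivedFrame()", a frame counts as fully delivered once fNumDeliveriesMadeSoFar == fNumActiveReplicas - 1 (the master replica is served last). The bookkeeping failure described here can be reproduced in a small self-contained model (a simplified toy model, not LIVE555 code; the "applyFix" flag stands for the patch that also decrements the delivery counter when a replica deactivates itself mid-delivery):

```cpp
// Toy model of StreamReplicator's per-frame counters.
// A frame is delivered to each non-master replica; the master is served
// once delivered == active - 1. If a replica deactivates itself while
// handling the frame (e.g. its FileSink write fails), 'active' drops, and
// without also dropping 'delivered' the completion test can never be met.
bool masterGetsFrame(int numReplicas, int failingReplica, bool applyFix) {
  int active = numReplicas;  // fNumActiveReplicas
  int delivered = 0;         // fNumDeliveriesMadeSoFar
  for (int r = 0; r < numReplicas - 1; ++r) { // non-master replicas
    ++delivered;
    if (r == failingReplica) {
      --active;                  // replica deactivates mid-delivery
      if (applyFix) --delivered; // the patch: undo its delivery count
    }
  }
  return delivered == active - 1; // condition guarding master delivery
}
```

With three replicas and replica 0 failing mid-frame, the unmodified counters never satisfy the condition (the master replica is starved); with the decrement applied, the master is served again.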
Upon returning to StreamReplicator::deliverReceivedFrame(), fNumActiveReplicas would have been decreased, and so the condition (fNumDeliveriesMadeSoFar == fNumActiveReplicas - 1 && fMasterReplica != NULL) on line 262 would never be met. We applied a fix to the StreamReplicator to decrease the fNumDeliveriesMadeSoFar variable if the replica being deactivated was also receiving a frame. I'm attaching the StreamReplicator class with our patches and the newest version of the testReplicator app with a FileSink extension. With these changes our server is now working as expected. 2. Also, we needed to change the visibility of the FileSink::continuePlaying() method to protected, because, upon restarting to write, we are also bypassing some frames on H264 ES video file output by calling it until the metadata frames come up. Could you please apply this change? Thank you for all your attention Bruno Abreu -- Living Data - Sistemas de Informação e Apoio à Decisão, Lda. LxFactory - Rua Rodrigues de Faria, 103, edifício I - 4º piso Phone: +351 213622163 1300-501 LISBOA Fax: +351 213622165 Portugal URL: www.livingdata.pt

From bruno.abreu at livingdata.pt Tue Jan 22 08:03:55 2013 From: bruno.abreu at livingdata.pt (Bruno Abreu) Date: Tue, 22 Jan 2013 16:03:55 +0000 Subject: [Live-devel] Problem with fix in MPEG2TransportStreamFromESSource stops frame delivery Message-ID: <20130122155959.M8965@livingdata.pt> Hello Ross, In http://lists.live555.com/pipermail/live-devel/2013-January/016347.html I described the following scenario: "We have been running for some time an RTSPServer that acquires video from a live source (with our own DeviceSource) from which we create 2 or 3 replicas: one for multicast streaming and another to feed a FileSink for file storage. The third replica is created on demand for unicast streaming." Our stream consists of an H264 ES video which we encapsulate in MPEG2TS.
In recent versions we detected the following problem: when a unicast session is created, the whole pipeline, including replicas, stops sending video. We didn't investigate this in much detail but, since it was working before, we compared with older versions and found out that it was working until the live.2012.08.31 release. So, we experimented with reverting some changes in recent versions, and one specifically worked for us:

--- live.2013.01.15/liveMedia/MPEG2TransportStreamFromESSource.cpp 2012-12-19 17:53:50.000000000 +0000
+++ new/liveMedia/MPEG2TransportStreamFromESSource.cpp 2012-12-19 17:53:58.000000000 +0000
@@ -99,7 +99,7 @@ MPEG2TransportStreamFromESSource
 }

 MPEG2TransportStreamFromESSource::~MPEG2TransportStreamFromESSource() {
-  doStopGettingFrames();
   delete fInputSources;
 }

We think this change was introduced in the sequence of: 2012.09.11: - Fixed a bug in "MPEG2TransportStreamFromESSource": Its destructor wasn't stopping the delivery from upstream objects. (Thanks to Jing Li for reporting this.) We don't know the impact of our change on the original fix. Thank you, Bruno Abreu -- Living Data - Sistemas de Informação e Apoio à Decisão, Lda. LxFactory - Rua Rodrigues de Faria, 103, edifício I - 4º piso Phone: +351 213622163 1300-501 LISBOA Fax: +351 213622165 Portugal URL: www.livingdata.pt

From Pablo.Gomez at scch.at Tue Jan 22 06:57:34 2013 From: Pablo.Gomez at scch.at (Pablo Gomez) Date: Tue, 22 Jan 2013 14:57:34 +0000 Subject: [Live-devel] unicast onDemand from live source NAL Units NVidia Message-ID: <905DBF721640BD47A97C0AE82D85BDF420948508@exmore.scch.at> Hi, I'm trying to implement unicast on-demand streaming from a live source. The live source comes from the Nvidia encoder NVEnc: http://docs.nvidia.com/cuda/samples/3_Imaging/cudaEncode/doc/nvcuvenc.pdf This encoder produces NAL units ready to send over the network. I have a lot of problems with artifacts. I have tried two ways of sending the packets:
A stream ring buffer: with this approach I'm sending as many bytes as possible, i.e. Min(availableBytesInBuffer, fMaxSize) bytes. And a queue where I hold the NAL packets and their sizes: with this approach I tried to send the entire NAL, i.e. Min(nal.size, fMaxSize) bytes. With the first approach I can see in the log that I'm getting many errors about the SPS/PPS being incorrect; also, when I display the stream in the video player, I see artifacts and blinking. With the second approach I also see artifacts and blinking, and the main problem is that there are many NAL units that are bigger than fMaxSize. Thus, those frames are truncated. I do not know how to ensure that fMaxSize is always bigger than fFrameSize. I tried to set up, in the streamer code, enough size in the OutputPacketBuffer, but this does not seem to work:

{
  OutPacketBuffer::maxSize = 10000000;
  char const* streamName = "testStream";
  ServerMediaSession* sms = ServerMediaSession::createNew(*env, streamName, streamName, descriptionString);
  sms->addSubsession(H264LiveServerMediaSubsession::createNew(*env, h264LiveBuffer, reuseFirstSource));
  rtspServer->addServerMediaSession(sms);
  announceStream(rtspServer, sms, streamName, inputFileName);
}

At this point I wonder: first, is it possible to stream the NAL units coming from the NVidia encoder with live555? If so, any suggestions on how I can try to solve this? I'm literally stuck at this point. Thank you Best Pablo -------------- next part -------------- An HTML attachment was scrubbed...
URL: From finlayson at live555.com Tue Jan 22 09:35:42 2013 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 22 Jan 2013 09:35:42 -0800 Subject: [Live-devel] ProxyServerMediaSubsession Segmentation fault In-Reply-To: <50FE7056.7000509@vaelsys.com> References: <50FD3A63.2080402@vaelsys.com> <50FE5323.6040301@vaelsys.com> <52B942B3-8E5C-4B41-BE9B-F84AFDAE887C@live555.com> <50FE7056.7000509@vaelsys.com> Message-ID: <6DD99527-DD3D-4304-A3EA-14165044BADB@live555.com> > The new release seems to handle the rtcp BYE properly, and when you try to connect to the stream source again, the FramedSource error doesn't appear, but the reconnection doesn't complete succesfully because of a "not session found" error: That sounds reasonable. The "session not found" error is coming from the back-end server. Why do you think that it sent the RTCP "BYE" packet in the first place? Because the stream had ended! Thus, you really should not be proxying streams like this (that can end). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Jan 22 10:08:49 2013 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 22 Jan 2013 10:08:49 -0800 Subject: [Live-devel] StreamReplicator with FileSink problem In-Reply-To: <20130121164802.M92973@livingdata.pt> References: <20130110132339.M35692@livingdata.pt> <20130121164802.M92973@livingdata.pt> Message-ID: > On Thu, 10 Jan 2013 11:00:43 -0800, Ross Finlayson wrote >> Instead, try making the following change to >> "FileSink::afterGettingFrame()" (lines 130-132 of >> "liveMedia/FileSink.cpp"): Change the order of the calls to >> onSourceClosure(this); and stopPlaying(); so that "stopPlaying()" >> is called first, before "onSourceClosure(this)". >> >> Let us know if this works for you. (If so, I'll make the change in >> the next release of the software.) > > 1. > Yes and no. 
It did work on the testReplicator app OK, so I'll make that change (changing the order of the calls to "onSourceClosure(this);" and "stopPlaying();" in "FileSink::afterGettingFrame()") in the next release of the software. > The main difference - regarding this issue - between our server and the > testReplicator app is that, after an error occurs trying to write a video file, > we schedule the writing to start again at a later time. We do that by overriding > the MediaSink::stopPlaying method in our own FileSink class: > > virtual void stopPlaying() { > MediaSink::stopPlaying(); > > // schedule to restart in 10 secs > envir() << "OurFileSink stopped! Scheduling to restart in 10 secs.\n"; > envir().taskScheduler().scheduleDelayedTask(10000000, > (TaskFunc *) OurFileSink::startPlaying, this); > } No - that's a bad idea. You shouldn't be redefining a virtual function to do something completely unrelated to what the rest of the code - that calls this virtual function - expects. (That's why you keep running into trouble :-) The right place to be scheduling a new 'playing' task is in your 'after playing' function - i.e., the function that you passed as a parameter when you called "startPlaying()" on your "FileSink" subclass. That 'after playing' function will get called, automatically, when writes to the file fail (or if the input stream closes) - as a result of the call to "onSourceClosure(this);". So that's where you should be scheduling your new task. > I'm attaching the StreamReplicator class with our patches I didn't see any attachment! > 2. > Also, we needed to change the visibility of the FileSink::continuePlaying() > method to protected OK, I'll make that change (along with the change to the implementation of "FileSink::afterGettingFrame()" - noted earlier) in the next release of the software. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
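A sketch of that recommended pattern, scheduling the restart from the 'after playing' callback passed to "startPlaying()" rather than from an overridden "stopPlaying()" (the class and variable names here are hypothetical, and the replica source pointer is assumed to be kept by the application):

```cpp
// Sketch with hypothetical names: the 'after playing' callback is invoked
// automatically (via onSourceClosure()) when a write fails or the input
// source closes, so it is the right place to reschedule recording.
FramedSource* replicaSource; // the replica being recorded, kept elsewhere

void restartWriting(void* clientData); // forward declaration

void afterPlaying(void* clientData) {
  OurFileSink* sink = (OurFileSink*)clientData; // your FileSink subclass
  sink->envir() << "OurFileSink stopped! Scheduling to restart in 10 secs.\n";
  sink->envir().taskScheduler()
      .scheduleDelayedTask(10000000 /*microseconds*/, restartWriting, sink);
}

void restartWriting(void* clientData) {
  OurFileSink* sink = (OurFileSink*)clientData;
  sink->startPlaying(*replicaSource, afterPlaying, sink); // same callback again
}
```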
URL: From finlayson at live555.com Tue Jan 22 10:32:27 2013 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 22 Jan 2013 10:32:27 -0800 Subject: [Live-devel] Problem with fix in MPEG2TransportStreamFromESSource stops frame delivery In-Reply-To: <20130122155959.M8965@livingdata.pt> References: <20130122155959.M8965@livingdata.pt> Message-ID: <800BA280-51C9-4382-B7FA-F99530D182D2@live555.com> > We think this change was introduced in the sequence of: > 2012.09.11: > - Fixed a bug in "MPEG2TransportStreamFromESSource": Its destructor wasn't > stopping the delivery from upstream objects. That's correct. That line doStopGettingFrames(); was added to the "MPEG2TransportStreamFromESSource" destructor for a good reason. You should NOT be deleting that line. If you have a "MPEG2TransportStreamFromESSource" as a replica, then when the "MPEG2TransportStreamFromESSource" destructor gets called - for whatever reason - then that will cause "StreamReplica::doStopGettingFrames()" to be called first, then "StreamReplica::~StreamReplica()". Neither of which should be causing any problems for you. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Jan 22 10:46:08 2013 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 22 Jan 2013 10:46:08 -0800 Subject: [Live-devel] unicast onDemand from live source NAL Units NVidia In-Reply-To: <905DBF721640BD47A97C0AE82D85BDF420948508@exmore.scch.at> References: <905DBF721640BD47A97C0AE82D85BDF420948508@exmore.scch.at> Message-ID: First, I assume that you are feeding your input source object (i.e., the object that delivers H.264 NAL units) into a "H264VideoStreamDiscreteFramer" object (and from there to a "H264VideoRTPSink"). > I tried to set up in the Streamer code enough size in the OutputPacketBuffer but this does not seem to work....
> { > OutPacketBuffer::maxSize=10000000; Setting "OutPacketBuffer::maxSize" to some value larger than the largest expected NAL unit is correct - and should work. However, setting this value to 10 million is insane. You can't possibly expect to be generating NAL units this large, can you?? If possible, you should configure your encoder to generate a sequence of NAL unit 'slices', rather than single large key-frame NAL units. Streaming very large NAL units is a bad idea, because - although our code will fragment them correctly when they get packed into RTP packets - the loss of just one of these fragments will cause the whole NAL unit to get discarded by receivers. Nonetheless, if you set "OutPacketBuffer::maxSize" to a value larger than the largest expected NAL unit, then this should work (i.e., you should find that "fMaxSize" will always be large enough for you to copy a whole NAL unit). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From bruno.abreu at livingdata.pt Tue Jan 22 11:09:56 2013 From: bruno.abreu at livingdata.pt (Bruno Abreu) Date: Tue, 22 Jan 2013 19:09:56 +0000 Subject: [Live-devel] StreamReplicator with FileSink problem In-Reply-To: References: <20130110132339.M35692@livingdata.pt> <20130121164802.M92973@livingdata.pt> Message-ID: <20130122185511.M20536@livingdata.pt> On Tue, 22 Jan 2013 10:08:49 -0800, Ross Finlayson wrote > OK, so I'll make that change (changing the order of the calls to > "onSourceClosure(this);" and "stopPlaying();" in > "FileSink::afterGettingFrame()") in the next release of the software. > > > The main difference - regarding this issue - between our server and the > > testReplicator app is that, after an error occurs trying to write a video file, > > we schedule the writing to start again at a later time. 
We do that by overriding > > the MediaSink::stopPlaying method in our own FileSink class: > > > > virtual void stopPlaying() { > > MediaSink::stopPlaying(); > > > > // schedule to restart in 10 secs > > envir() << "OurFileSink stopped! Scheduling to restart in 10 secs.\n"; > > envir().taskScheduler().scheduleDelayedTask(10000000, > > (TaskFunc *) OurFileSink::startPlaying, this); > > } > > No - that's a bad idea. You shouldn't be redefining a virtual > function to do something completely unrelated to what the rest of the > code - that calls this virtual function - expects. (That's why you > keep running into trouble :-) > > The right place to be scheduling a new 'playing' task is in your > 'after playing' function - i.e., the function that you passed as a > parameter when you called "startPlaying()" on your "FileSink" > subclass. That 'after playing' function will get called, > automatically, when writes to the file fail (or if the input stream > closes) - as a result of the call to "onSourceClosure(this);". So > that's where you should be scheduling your new task. Initially we were scheduling in the afterPlaying function, but your suggested reordering sets fAfterFunc to NULL in MediaSink::stopPlaying() before it gets called in MediaSink::onSourceClosure(). We think the correct behavior would be to call stopGettingFrames on fSource, which will call doStopGettingFrames() on the specific replica and also let the afterPlaying() function get called to do the scheduling. > > I'm attaching the StreamReplicator class with our patches > > I didn't see any attachment! You're right, I'm sorry. Sending the files now. > > 2. > > Also, we needed to change the visibility of the FileSink::continuePlaying() > > method to protected > > OK, I'll make that change (along with the change to the implementation > of "FileSink::afterGettingFrame()" - noted earlier) in the next > release of the software. Thank you. -- Living Data - Sistemas de Informação e Apoio à Decisão, Lda.
LxFactory - Rua Rodrigues de Faria, 103, edifício I - 4º piso Phone: +351 213622163 1300-501 LISBOA Fax: +351 213622165 Portugal URL: www.livingdata.pt -------------- next part -------------- A non-text attachment was scrubbed... Name: StreamReplicator-patched.cpp Type: text/x-c++src Size: 16684 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: testReplicator-modified.cpp Type: text/x-c++src Size: 6758 bytes Desc: not available URL: From finlayson at live555.com Tue Jan 22 13:51:29 2013 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 22 Jan 2013 13:51:29 -0800 Subject: [Live-devel] StreamReplicator with FileSink problem In-Reply-To: <20130122185511.M20536@livingdata.pt> References: <20130110132339.M35692@livingdata.pt> <20130121164802.M92973@livingdata.pt> <20130122185511.M20536@livingdata.pt> Message-ID: >> The right place to be scheduling a new 'playing' task is in your >> 'after playing' function - i.e., the function that you passed as a >> parameter when you called "startPlaying()" on your "FileSink" >> subclass. That 'after playing' function will get called, >> automatically, when writes to the file fail (or if the input stream >> closes) - as a result of the call to "onSourceClosure(this);". So >> that's where you should be scheduling your new task. > > Initially we were scheduling in the afterPlaying function, but your suggested > reordering sets fAfterFunc to NULL in MediaSink::stopPlaying() before it gets > called in MediaSink::onSourceClosure(). > We think the correct behavior would be to call stopGettingFrames on fSource Yes, I agree. In the next version of the software, this piece of code - in "FileSink::afterGettingFrame()" - will become: if (fOutFid == NULL || fflush(fOutFid) == EOF) { // The output file has closed.
Handle this the same way as if the input source had closed: if (fSource != NULL) fSource->stopGettingFrames(); onSourceClosure(this); return; } Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Jan 22 17:27:30 2013 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 22 Jan 2013 17:27:30 -0800 Subject: [Live-devel] StreamReplicator with FileSink problem In-Reply-To: <20130121164802.M92973@livingdata.pt> References: <20130110132339.M35692@livingdata.pt> <20130121164802.M92973@livingdata.pt> Message-ID: > By examining the behavior of the replicas we were able to determine that our > FileSink replica was being deactivated while executing > StreamReplicator::deliverReceivedFrame(), because, when > FramedSource::afterGetting(replica), on line 259, was called, the write error > would occur again. Upon returning to StreamReplicator::deliverReceivedFrame(), > the fNumActiveReplicas would have been decreased and so, the condition > (fNumDeliveriesMadeSoFar == fNumActiveReplicas - 1 && fMasterReplica != NULL) on > line 262 would never be met. > > We applied a fix to the StreamReplicator to decrease the fNumDeliveriesMadeSoFar > variable if the replica being deactivated was also receiving a frame. Ugh. I'm not thrilled by this hack, but right now I don't see a better solution, so I've gone ahead and included it in the latest release (2013.01.23) of the software. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed...
URL: From bruno.abreu at livingdata.pt Wed Jan 23 00:12:27 2013 From: bruno.abreu at livingdata.pt (Bruno Abreu) Date: Wed, 23 Jan 2013 08:12:27 +0000 Subject: [Live-devel] StreamReplicator with FileSink problem In-Reply-To: References: <20130110132339.M35692@livingdata.pt> <20130121164802.M92973@livingdata.pt> <20130122185511.M20536@livingdata.pt> Message-ID: <50FF9B6B.2010007@livingdata.pt> On 01/22/2013 09:51 PM, Ross Finlayson wrote: > Yes, I agree. In the next version of the software, this piece of code - > in "FileSink::afterGettingFrame()" - will become: > > if (fOutFid == NULL || fflush(fOutFid) == EOF) { > // The output file has closed. Handle this the same way as if > the input source had closed: > if (fSource != NULL) fSource->stopGettingFrames(); > onSourceClosure(this); > return; > } Great! Thanks. We'll test this today. -- Living Data - Sistemas de Informação e Apoio à Decisão, Lda. LxFactory - Rua Rodrigues de Faria, 103, edifício I - 4º piso Phone: +351 213622163 1300-501 LISBOA Fax: +351 213622165 Portugal URL: www.livingdata.pt From bruno.abreu at livingdata.pt Wed Jan 23 00:21:54 2013 From: bruno.abreu at livingdata.pt (Bruno Abreu) Date: Wed, 23 Jan 2013 08:21:54 +0000 Subject: [Live-devel] StreamReplicator with FileSink problem In-Reply-To: References: <20130110132339.M35692@livingdata.pt> <20130121164802.M92973@livingdata.pt> Message-ID: <50FF9DA2.1010701@livingdata.pt> On 01/23/2013 01:27 AM, Ross Finlayson wrote: >> We applied a fix to the StreamReplicator to decrease the >> fNumDeliveriesMadeSoFar >> variable if the replica being deactivated was also receiving a frame. > > Ugh. I'm not thrilled by this hack, but right now I don't see a better > solution, so I've gone ahead and included it in the latest release > (2013.01.23) of the software. It's an ugly one, for sure. Our immediate concern was just to keep all other replicas running if one of them ran into trouble downstream.
We'll gladly integrate a more definite solution or provide you with feedback if we come up with something more elegant. Testing the new release today. Thank you. Bruno Abreu -- Living Data - Sistemas de Informação e Apoio à Decisão, Lda. LxFactory - Rua Rodrigues de Faria, 103, edifício I - 4º piso Phone: +351 213622163 1300-501 LISBOA Fax: +351 213622165 Portugal URL: www.livingdata.pt From zenereyes at gmail.com Wed Jan 23 01:42:33 2013 From: zenereyes at gmail.com (lorenzo franci) Date: Wed, 23 Jan 2013 10:42:33 +0100 Subject: [Live-devel] (no subject) Message-ID: Is it possible to have some example C++ code for live streaming H.264? I don't understand the FAQ entry about this question. My source is a grabber that converts the input video to H.264 format. Thanks zenereyes -------------- next part -------------- An HTML attachment was scrubbed... URL: From Pablo.Gomez at scch.at Wed Jan 23 02:27:37 2013 From: Pablo.Gomez at scch.at (Pablo Gomez) Date: Wed, 23 Jan 2013 10:27:37 +0000 Subject: [Live-devel] unicast onDemand from live source NAL Units Message-ID: <905DBF721640BD47A97C0AE82D85BDF4209486B5@exmore.scch.at> >First, I assume that you are feeding your input source object (i.e., the object that delivers H.264 NAL units) into a >"H264VideoStreamDiscreteFramer" object (and from there to a "H264VideoRTPSink"). I wrote the H264LiveServerMediaSubsession based on the H264FileServerMediaSubsession.
I'm using the H264VideoRTPSink.cpp, H264VideoStreamDiscreteFramer.cpp and the object that inherits FramedSource where I'm reading the NAL units. This is how it is connected in the media subsession: FramedSource* H264LiveServerMediaSubsession::createNewStreamSource(unsigned /*clientSessionId*/, unsigned& estBitrate) { estBitrate = 10000; // kbps, estimate // Create the video source: H264LiveStreamFramedSource* liveFramer = H264LiveStreamFramedSource::createNew(envir(),liveBuffer); H264VideoStreamDiscreteFramer* discFramer = H264VideoStreamDiscreteFramer::createNew(envir(),liveFramer); // Create a framer for the Video Elementary Stream: return H264VideoStreamFramer::createNew(envir(), discFramer); } RTPSink* H264LiveServerMediaSubsession ::createNewRTPSink(Groupsock* rtpGroupsock, unsigned char rtpPayloadTypeIfDynamic, FramedSource* /*inputSource*/) { return H264VideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic); } This is the doGetNextFrame in the H264LiveStreamFramedSource I'm using: void H264LiveStreamFramedSource::doGetNextFrame() { // Try to read as many bytes as will fit in the buffer provided (or "fPreferredFrameSize" if less) fFrameSize=fBuffer->read(fTo,fMaxSize,&fNumTruncatedBytes); // We don't know a specific play time duration for this data, // so just record the current time as being the 'presentation time': gettimeofday(&fPresentationTime, NULL); // Inform the downstream object that it has data: FramedSource::afterGetting(this); } The call fBuffer->read(fTo,fMaxSize,&fNumTruncatedBytes); is basically a read from the object that contains the NAL units. For this object I have two implementations: one tries to copy the whole NAL unit and sets fNumTruncatedBytes to the truncated bytes in the read operation. It returns the number of bytes copied to fTo. The second implementation I have of this buffer is a Ring Buffer.
When I write to the ring buffer I write all bytes, and when I read from it I read the minimum of the bytes available in the buffer and fMaxSize. I start reading from the last read position+1. Thus, in this approach I do not truncate anything. But I guess somehow the NAL units are broken, because if the last read position is in the middle of a NAL unit, the next read will not have any SPS/PPS. >Setting "OutPacketBuffer::maxSize" to some value larger than the largest expected NAL unit is correct - and should work. However, setting >this value to 10 million is insane. You can't possibly expect to be generating NAL units this large, can you?? Yes, 10 million is insane; there are no units with that size. I just wrote it to test. Now I set it to 250000, which is big enough, but it does not matter: the fMaxSize is always smaller than that and I'm getting truncated frames quite often. >If possible, you should configure your encoder to generate a sequence of NAL unit 'slices', rather than single large key-frame NAL units. >Streaming very large NAL units is a bad idea, because - although our code will fragment them correctly when they get packed into RTP packets -> the loss of just one of these fragments will cause the whole NAL unit to get discarded by receivers. I have checked the Nvidia encoder parameters and it has one parameter to set up the number of slices. I set it to 4 and 10. I also tested the default mode, which lets the encoder decide the slice count. Nevertheless, I'm testing on a LAN, so it is basically lossless. Thus, I guess this parameter should not be a problem.
Best Pablo ---------------------------------------------------------------------- Message: 1 Date: Tue, 22 Jan 2013 10:46:08 -0800 From: Ross Finlayson To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] unicast onDemand from live source NAL Units NVidia Message-ID: Content-Type: text/plain; charset="iso-8859-1" First, I assume that you have are feeding your input source object (i.e., the object that delivers H.264 NAL units) into a "H264VideoStreamDiscreteFramer" object (and from there to a "H264VideoRTPSink"). > I tried to set up in the Streamer code enough size in the OutputPacketBuffer but this does not seem to work.... > { > OutPacketBuffer::maxSize=10000000; If possible, you should configure your encoder to generate a sequence of NAL unit 'slices', rather than single large key-frame NAL units. Streaming very large NAL units is a bad idea, because - although our code will fragment them correctly when they get packed into RTP packets - the loss of just one of these fragments will cause the whole NAL unit to get discarded by receivers. Setting "OutPacketBuffer::maxSize" to some value larger than the largest expected NAL unit is correct - and should work. However, setting this value to 10 million is insane. You can't possibly expect to be generating NAL units this large, can you?? If possible, you should configure your encoder to generate a sequence of NAL unit 'slices', rather than single large key-frame NAL units. Streaming very large NAL units is a bad idea, because - although our code will fragment them correctly when they get packed into RTP packets - the loss of just one of these fragments will cause the whole NAL unit to get discarded by receivers. Nonetheless, if you set "OutPacketBuffer::maxSize" to a value larger than the largest expected NAL unit, then this should work (i.e., you should find that "fMaxSize" will always be large enough for you to copy a whole NAL unit). Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ ---------------------------------------------------------------------- From Pablo.Gomez at scch.at Wed Jan 23 04:58:20 2013 From: Pablo.Gomez at scch.at (Pablo Gomez) Date: Wed, 23 Jan 2013 12:58:20 +0000 Subject: [Live-devel] unicast onDemand from live source NAL Units In-Reply-To: <905DBF721640BD47A97C0AE82D85BDF4209486B0@exmore.scch.at> References: <905DBF721640BD47A97C0AE82D85BDF4209486B0@exmore.scch.at> Message-ID: <905DBF721640BD47A97C0AE82D85BDF420948825@exmore.scch.at> If I write the read operations to one file and the write operations to another file, in order to play them with a video player such as ffplay, I get the following outputs: When I'm using a Ring buffer and I read fMaxSize bytes: the player doesn't show any error. That is expected because I'm not losing a single byte. When I'm using a buffer of NAL units and I try to send the whole NAL I get: [h264 @ 00000000007d1fa0] corrupted macroblock 3 12 (total_coeff=-1) [h264 @ 00000000007d1fa0] error while decoding MB 3 12 [h264 @ 00000000007d1fa0] concealing 2304 DC, 2304 AC, 2304 MV errors in I frame [h264 @ 0000000003c407c0] top block unavailable for requested intra mode at 32 12 [h264 @ 0000000003c407c0] error while decoding MB 32 12 [h264 @ 0000000003c407c0] concealing 2304 DC, 2304 AC, 2304 MV errors in I frame [h264 @ 0000000003dbc020] corrupted macroblock 19 24 (total_coeff=-1) [h264 @ 0000000003dbc020] error while decoding MB 19 24 [h264 @ 0000000003dbc020] concealing 1536 DC, 1536 AC, 1536 MV errors in I frame [h264 @ 00000000007d1fa0] Invalid level prefix [h264 @ 00000000007d1fa0] error while decoding MB 19 22 [h264 @ 00000000007d1fa0] concealing 1694 DC, 1694 AC, 1694 MV errors in I frame [h264 @ 0000000003c407c0] Invalid level prefix [h264 @ 0000000003c407c0] error while decoding MB 8 20 [h264 @ 0000000003c407c0] concealing 1833 DC, 1833 AC, 1833 MV errors in I frame [h264 @ 0000000003dbc020] concealing 2013 DC, 2013 AC, 2013 MV errors in I frame [h264 @
00000000007d1fa0] corrupted macroblock 20 18 (total_coeff=16) [h264 @ 00000000007d1fa0] error while decoding MB 20 18 [h264 @ 00000000007d1fa0] concealing 1949 DC, 1949 AC, 1949 MV errors in I frame [h264 @ 0000000003c407c0] Invalid level prefix [h264 @ 0000000003c407c0] error while decoding MB 50 20 [h264 @ 0000000003c407c0] concealing 1791 DC, 1791 AC, 1791 MV errors in I frame [h264 @ 0000000003dbc020] corrupted macroblock 19 19 (total_coeff=-1) [h264 @ 0000000003dbc020] error while decoding MB 19 19 [h264 @ 0000000003dbc020] concealing 1886 DC, 1886 AC, 1886 MV errors in I frame [h264 @ 00000000007d1fa0] concealing 1950 DC, 1950 AC, 1950 MV errors in I frame [h264 @ 0000000003c407c0] Invalid level prefix [h264 @ 0000000003c407c0] error while decoding MB 38 17 [h264 @ 0000000003c407c0] concealing 1995 DC, 1995 AC, 1995 MV errors in I frame [h264 @ 0000000003dbc020] Invalid level prefix [h264 @ 0000000003dbc020] error while decoding MB 14 17 [h264 @ 0000000003dbc020] concealing 2019 DC, 2019 AC, 2019 MV errors in I frame [h264 @ 00000000007d1fa0] concealing 2047 DC, 2047 AC, 2047 MV errors in I frame [h264 @ 0000000003c407c0] corrupted macroblock 12 15 (total_coeff=-1) [h264 @ 0000000003c407c0] error while decoding MB 12 15 [h264 @ 0000000003c407c0] concealing 2149 DC, 2149 AC, 2149 MV errors in I frame [h264 @ 0000000003dbc020] Invalid level prefix [h264 @ 0000000003dbc020] error while decoding MB 9 16 [h264 @ 0000000003dbc020] concealing 2088 DC, 2088 AC, 2088 MV errors in I frame Which is somewhat expected, since some bytes get truncated quite often... If I play the file containing what I'm writing from the encoder, everything is correct.
Best, Pablo -----Original Message----- From: Pablo Gomez Sent: Wednesday, January 23, 2013 11:28 AM To: 'live-devel at ns.live555.com' Subject: Re:Re: [Live-devel] unicast onDemand from live source NAL Units >First, I assume that you have are feeding your input source object (i.e., the object that delivers H.264 NAL units) into a >"H264VideoStreamDiscreteFramer" object (and from there to a "H264VideoRTPSink"). I did the H264LiveServerMediaSubsession based on the H264FileServerMediaSubssesion. I'm using the H264VideoRTPSink.cpp, H264VideoStreamDiscreteFramer.cpp and the object that inherits FramedSource where I'm reading the NAL units This is how it is connected in the media subsession: FramedSource* H264LiveServerMediaSubsession::createNewStreamSource(unsigned /*clientSessionId*/, unsigned& estBitrate) { estBitrate = 10000; // kbps, estimate // Create the video source: H264LiveStreamFramedSource* liveFramer = H264LiveStreamFramedSource::createNew(envir(),liveBuffer); H264VideoStreamDiscreteFramer* discFramer = H264VideoStreamDiscreteFramer::createNew(envir(),liveFramer); // Create a framer for the Video Elementary Stream: return H264VideoStreamFramer::createNew(envir(), discFramer); } RTPSink* H264LiveServerMediaSubsession ::createNewRTPSink(Groupsock* rtpGroupsock, unsigned char rtpPayloadTypeIfDynamic, FramedSource* /*inputSource*/) { return H264VideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic); } This is the doGetNextFrame in the H264LiveStreamFramedSource I'm using: void H264LiveStreamFramedSource::doGetNextFrame() { // Try to read as many bytes as will fit in the buffer provided (or "fPreferredFrameSize" if less) fFrameSize=fBuffer->read(fTo,fMaxSize,&fNumTruncatedBytes); // We don't know a specific play time duration for this data, // so just record the current time as being the 'presentation time': gettimeofday(&fPresentationTime, NULL); // Inform the downstream object that it has data: FramedSource::afterGetting(this); } About the call 
fBuffer.read fBuffer->read(fTo,fMaxSize,&fNumTruncatedBytes); is basically to the object that contains the NAL units. This object I have two implementations one tries to copy the whole NAL unit and sets the fNumTruncatedBytes to the truncatedBytes in the read operation.. It returns the number of bytes copied to fTo. The second implementation I have of this buffer is a Ring Buffer. When I write to the ring buffer I write all bytes and when I read from it I read the minimum between availableBytes in buffer and the fMaxSize. I start reading from the last read position+1. Thus, in this approach I do not truncate anything. But, I guess somehow the NAL units are broken. Because if the last read position is in the middle of a NAL unit, the next Read will not have any SPS/PPS. >Setting "OutPacketBuffer::maxSize" to some value larger than the largest expected NAL unit is correct - and should work. However, setting >this value to 10 million is insane. You can't possibly expect to be generating NAL units this large, can you?? Yes, 10 million is insane there are no units with that size. Just wrote it to test. Now I set it up to 250000 which is big enough but it does not matter, the fMaxSize is always smaller than that and I'm getting truncated frames quiet often. >If possible, you should configure your encoder to generate a sequence of NAL unit 'slices', rather than single large key-frame NAL units. >Streaming very large NAL units is a bad idea, because - although our code will fragment them correctly when they get packed into RTP packets -> the loss of just one of these fragments will cause the whole NAL unit to get discarded by receivers. I have checked the Nvidia encoder parameters and it has one parameter to set up the number of slices. I set it up to 4 and 10. I also test it the default mode which lets the encoder decide the slice number. Nevertheless, I'm testing on a lan network so it is basically lossless. Thus, I guess this parameter should not be a problem. 
Best Pablo ---------------------------------------------------------------------- Message: 1 Date: Tue, 22 Jan 2013 10:46:08 -0800 From: Ross Finlayson To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] unicast onDemand from live source NAL Units NVidia Message-ID: Content-Type: text/plain; charset="iso-8859-1" First, I assume that you have are feeding your input source object (i.e., the object that delivers H.264 NAL units) into a "H264VideoStreamDiscreteFramer" object (and from there to a "H264VideoRTPSink"). > I tried to set up in the Streamer code enough size in the OutputPacketBuffer but this does not seem to work.... > { > OutPacketBuffer::maxSize=10000000; If possible, you should configure your encoder to generate a sequence of NAL unit 'slices', rather than single large key-frame NAL units. Streaming very large NAL units is a bad idea, because - although our code will fragment them correctly when they get packed into RTP packets - the loss of just one of these fragments will cause the whole NAL unit to get discarded by receivers. Setting "OutPacketBuffer::maxSize" to some value larger than the largest expected NAL unit is correct - and should work. However, setting this value to 10 million is insane. You can't possibly expect to be generating NAL units this large, can you?? If possible, you should configure your encoder to generate a sequence of NAL unit 'slices', rather than single large key-frame NAL units. Streaming very large NAL units is a bad idea, because - although our code will fragment them correctly when they get packed into RTP packets - the loss of just one of these fragments will cause the whole NAL unit to get discarded by receivers. Nonetheless, if you set "OutPacketBuffer::maxSize" to a value larger than the largest expected NAL unit, then this should work (i.e., you should find that "fMaxSize" will always be large enough for you to copy a whole NAL unit). Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ ---------------------------------------------------------------------- From finlayson at live555.com Wed Jan 23 05:55:28 2013 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 23 Jan 2013 05:55:28 -0800 Subject: [Live-devel] unicast onDemand from live source NAL Units In-Reply-To: <905DBF721640BD47A97C0AE82D85BDF4209486B5@exmore.scch.at> References: <905DBF721640BD47A97C0AE82D85BDF4209486B5@exmore.scch.at> Message-ID: > FramedSource* H264LiveServerMediaSubsession::createNewStreamSource(unsigned /*clientSessionId*/, unsigned& estBitrate) { > estBitrate = 10000; // kbps, estimate > // Create the video source: > H264LiveStreamFramedSource* liveFramer = H264LiveStreamFramedSource::createNew(envir(),liveBuffer); > H264VideoStreamDiscreteFramer* discFramer = H264VideoStreamDiscreteFramer::createNew(envir(),liveFramer); > // Create a framer for the Video Elementary Stream: > return H264VideoStreamFramer::createNew(envir(), discFramer); No, this is wrong! You should not be creating/using a "H264VideoStreamFramer" at all. That class should be used *only* when the input is a byte stream (e.g., from a file). If - as in your case - the input is a discrete sequence of NAL units (i.e., one NAL unit at a time), then you should use a "H264VideoStreamDiscreteFramer" only. So, you should replace the line return H264VideoStreamFramer::createNew(envir(), discFramer); with return discFramer; That should also fix the problem that you're seeing with "fMaxSize" not being large enough in your "H264LiveStreamFramedSource" implementation.
> This is the doGetNextFrame in the H264LiveStreamFramedSource I'm using: > > void H264LiveStreamFramedSource::doGetNextFrame() { > > // Try to read as many bytes as will fit in the buffer provided (or "fPreferredFrameSize" if less) > fFrameSize=fBuffer->read(fTo,fMaxSize,&fNumTruncatedBytes); This should work, provided that your "read()" function always delivers (to "*fTo") a single NAL unit, and nothing else - and blocks until one becomes available. In other words, after "read()" is called, the first bytes of *fTo must be the start of a single NAL unit, with *no* 'start code'. This is not ideal, though, because, ideally, 'read' functions called from a LIVE555-based application should not block (because LIVE555-based applications run in a single-threaded event loop). Instead, if "doGetNextFrame()" gets called when no new NAL unit is currently available, it should return immediately. I suggest that you review the sample code that we have provided in "liveMedia/DeviceSource.cpp". You can use this class as a model for how to write your "H264LiveStreamFramedSource" class. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From conchi.ap at vaelsys.com Wed Jan 23 10:52:23 2013 From: conchi.ap at vaelsys.com (Conchi Abasolo) Date: Wed, 23 Jan 2013 19:52:23 +0100 Subject: [Live-devel] ProxyRTSPClient::sendLivenessCommand Timeout Message-ID: <51003167.1060305@vaelsys.com> Hello, I've been studying the behaviour of the backend liveness mechanism, implemented in ProxyRTSPClient::sendLivenessCommand, and it seems that there is no "timeout" to check if the liveness command response is received. Sometimes the liveness command is sent but no response is received, and the ProxyRTSPClient doesn't notice it. However, in the frontend part (RTSPClientSession) there is a method to control the timeout (livenessTimeoutTask), and this method is launched as a delayed task.
I would like to know if there really isn't any liveness timeout control in the backend part. Otherwise, it would be very helpful if you could describe which control method you are using Thanks Conchi Abasolo From finlayson at live555.com Wed Jan 23 11:15:45 2013 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 23 Jan 2013 11:15:45 -0800 Subject: [Live-devel] ProxyRTSPClient::sendLivenessCommand Timeout In-Reply-To: <51003167.1060305@vaelsys.com> References: <51003167.1060305@vaelsys.com> Message-ID: > I would like to know if there really isn't any liveness timeout control in the backend part. Otherwise, it would be very helpful if you could describe which control method you are using Once the 'back-end' connection to the proxied stream has been established (as a result of a successful "DESCRIBE" command), the proxy server periodically (at an interval randomly chosen between 30 and 60 seconds (usually)) sends a command to the 'back-end' server, to test whether it's still alive. (This command will be "OPTIONS", unless the server has specifically reported (in response to a previous "OPTIONS" command) that it supports "GET_PARAMETER", in which case the command will be "GET_PARAMETER".) The proxy server acts based upon the response to each 'liveness' command. If the response is "OK", it goes ahead and sends another 'liveness' command later. If the response is not "OK", or if it detects that the RTSP connection with the back-end server has failed, then it assumes that the back-end server is down, and it will then close the back-end connection and start again by sending another "DESCRIBE". (These "DESCRIBE" commands are also repeated - at increasing random intervals - until the server responds.) What we don't do, however, is test whether the back-end server actually responds to each 'liveness' command ("OPTIONS" or "GET_PARAMETER").
So far, I've assumed that if the back-end server fails, it will do so by closing the TCP connection, which we (the proxy) will eventually detect. The code currently does not allow for the case where the TCP connection stays alive, but the back-end server simply fails to respond at all to a 'liveness' command. Is this something that you are actually seeing happen? Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From CERLANDS at arinc.com Wed Jan 23 12:22:00 2013 From: CERLANDS at arinc.com (Erlandsson, Claes P (CERLANDS)) Date: Wed, 23 Jan 2013 20:22:00 +0000 Subject: [Live-devel] Unhandled exception during TEARDOWN In-Reply-To: <7BB9F6CA-38D7-4EA3-8EFE-1722A7130C0A@live555.com> References: <7BB9F6CA-38D7-4EA3-8EFE-1722A7130C0A@live555.com> Message-ID: When trying to get to the bottom of this "TEARDOWN exception" I started by testing different streams. Some noteworthy observations:
- The frequency of the exceptions varies greatly. Sometimes I can get one exception every 10min, while other times I can run multiple clients for a few days before seeing one.
- The exception occurs on all computers I've tested it on, although I've only been running Windows clients.
- The exception has occurred both for live streams and archive, i.e. non-live, streams. It also occurs for different cameras and isn't limited to a specific stream. I've confirmed that the exception happens when streaming from both Cisco VSM 6 and VSM 7.
- I've however not seen the exception while streaming from an Axis 243q encoder. Since the crashes are so random, you can't draw any confident conclusions, but I've had multiple clients streaming from 3 Axis encoders for multiple days, so everything points towards the exceptions not occurring with the Axis encoders. I.e. the exception has (so far) only occurred using Cisco VSM.

I then started adding debug printouts to the code.
After a while I noticed it always happens in DummySink::afterGettingFrame(), although that makes no sense. In the testRTSPClient example that function only calls continuePlaying(). The debug message (602) right before the call to continuePlaying() is always printed when the exception occurs, but not any message in continuePlaying(). The crashes are very consistent. Not the frequency, but the location. When they occur, 602 is always the last message printed. I've attached an output example. Judging by the callstack it almost looks to me like the printf would be the cause, but the same thing happens if I remove the debug output, i.e. 602, and 601 etc. This however makes no sense at all. What is causing the sudden app crash? I see no explanation at all in the code. As can be seen, I added a try-catch, but that didn't do anything. With strange errors like this I would almost suspect some odd linking problem or strange Windows issue, but that would not explain why it works most of the time, and why it never tends to crash with some streams (i.e. when not streaming via Cisco VSM). I would also suspect threads going havoc, but as liveMedia is single-threaded that shouldn't be the case. It definitely seems like the server matters. How can that be? Any ideas at all what might be going on? Included files in the attached zip-archive:
- testRTSPClient.cpp: Modified example that takes 2 arguments: two RTSP URLs to cycle between. The dwell time, i.e. the number of seconds to wait between switching streams, is set to 5s. E.g.: testRTSPClient.exe rtsp://192.168.1.103/archive/TVF-04Ba rtsp://192.168.1.103/archive/TVF-16a
- Output.txt: Typical example of program output; always crashes after debug message 602. Note that the output shows some additional "debug numbers" that I added to the library while hunting this down.
- Callstack.png: Snapshot example of the callstack while running the debug version within Visual Studio.
/Claes From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Thursday, January 10, 2013 6:43 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Unhandled exception during TEARDOWN I ran your modified application for several hours (on FreeBSD, under GDB), and also on Linux using "valgrind", but unfortunately was unable to find any problem. I suspect that whatever bug is causing this is something that (for some reason) is causing an exception only in your environment. I'm still interested in learning the cause, of course (especially if it is - as it appears to be - a bug in our code), but it looks like you're probably going to have to track this down yourself. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: RTSPexception.zip Type: application/x-zip-compressed Size: 54057 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: smime.p7s Type: application/x-pkcs7-signature Size: 5740 bytes Desc: not available URL: From finlayson at live555.com Wed Jan 23 14:05:04 2013 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 23 Jan 2013 14:05:04 -0800 Subject: [Live-devel] Unhandled exception during TEARDOWN In-Reply-To: References: <7BB9F6CA-38D7-4EA3-8EFE-1722A7130C0A@live555.com> Message-ID: <248B66F9-29DE-43FD-8AA3-921BC043803A@live555.com> > The crashes are very consistent. Not the frequency, but the location. When they occur, 602 is always the last message printed. I've attached an output example. Judging by the callstack it almost looks to me like the printf would be the cause, but the same thing happens if I remove the debug output, i.e. 602, and 601 etc. > > This however makes no sense at all. What is causing the sudden app crash? 
I see no explanation at all in the code. I suspect that a 'memory smash' - i.e., a write through a bad pointer (caused by a bug in the code) - is to blame. If that happens, then a pointer somewhere else might be getting corrupted, which could lead to an error like this that occurs in an unexpected place in the code. I suggest that you run a 'memory debugger' on your application. See http://en.wikipedia.org/wiki/Memory_debugger Some tools that I've seen recommended are - "Dr. Memory": http://code.google.com/p/drmemory/ - "OllyDbg": http://ollydbg.de/ > I would also suspect threads going havoc, but as liveMedia is single-threaded that shouldn't be the case. Correct - provided, of course, that your *application* uses only a single thread (that calls LIVE555 code). > It definitely seems like the server matters. How can that be? Perhaps it's because the different servers (streams) use different codecs (and thus our RTSP client code uses different classes to receive/process the incoming packets)? I see (from the SDP descriptions returned in response to "DESCRIBE") that the stream(s) that are causing your crash are using motion JPEG. What about the "Axis 243q" streams (the ones that you think do not cause the crash)? What codec do they use? (Please post a SDP description from those streams.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From CERLANDS at arinc.com Wed Jan 23 14:39:38 2013 From: CERLANDS at arinc.com (Erlandsson, Claes P (CERLANDS)) Date: Wed, 23 Jan 2013 22:39:38 +0000 Subject: [Live-devel] Unhandled exception during TEARDOWN In-Reply-To: <248B66F9-29DE-43FD-8AA3-921BC043803A@live555.com> References: <7BB9F6CA-38D7-4EA3-8EFE-1722A7130C0A@live555.com> <248B66F9-29DE-43FD-8AA3-921BC043803A@live555.com> Message-ID: The crashes are very consistent. Not the frequency, but the location. When they occur, 602 is always the last message printed. 
I've attached an output example. Judging by the callstack it almost looks to me like the printf would be the cause, but the same thing happens if I remove the debug output, i.e. 602, and 601 etc. This however makes no sense at all. What is causing the sudden app crash? I see no explanation at all in the code. I suspect that a 'memory smash' - i.e., a write through a bad pointer (caused by a bug in the code) - is to blame. If that happens, then a pointer somewhere else might be getting corrupted, which could lead to an error like this that occurs in an unexpected place in the code. I suggest that you run a 'memory debugger' on your application. See http://en.wikipedia.org/wiki/Memory_debugger Some tools that I've seen recommended are - "Dr. Memory": http://code.google.com/p/drmemory/ - "OllyDbg": http://ollydbg.de/ Thanks. Never heard of those two, but will look into. I would also suspect threads going havoc, but as liveMedia is single-threaded that shouldn't be the case. Correct - provided, of course, that your *application* uses only a single thread (that calls LIVE555 code). Well, "our app" uses multiple threads outside the liveMedia library, but I've lately only been testing with the modified testRTSPClient that I attached. It definitely seems like the server matters. How can that be? Perhaps it's because the different servers (streams) use different codecs (and thus our RTSP client code uses different classes to receive/process the incoming packets)? I see (from the SDP descriptions returned in response to "DESCRIBE") that the stream(s) that are causing your crash are using motion JPEG. What about the "Axis 243q" streams (the ones that you think do not cause the crash)? What codec do they use? (Please post a SDP description from those streams.) That makes sense. I was planning on setting up some Cisco VMS streams using mpeg4, but wasn't sure how/if that affected the client. The Axis 243q always outputs mpeg4 while using rtsp. 
I've attached output from Axis 243q mpeg4-streams which includes the SDP. Now that you point it out, it seems fairly safe to say jpeg is the source of the issue. btw. I know you discourage mjpeg, and I can agree on that, but there are legacy reasons for us still using it, and bandwidth isn't an issue in our case. Thanks! /Claes Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: Output-mpeg4.txt URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: smime.p7s Type: application/x-pkcs7-signature Size: 5740 bytes Desc: not available URL: From pmepme6 at gmail.com Thu Jan 24 00:15:13 2013 From: pmepme6 at gmail.com (Pme) Date: Thu, 24 Jan 2013 09:15:13 +0100 Subject: [Live-devel] Question About PAT/PMT in MPEG2 Message-ID: Hi, I'm building an iOS application, where I'm using an RTSP client (based on testRTSPClient) to receive an MPEG2 stream, and then I'm trying to decode the frames with the help of FFMPEG. My question is: is it possible to obtain the PAT and PMT information before I pass the frames to FFMPEG, with the help of the live555 library, or does the live555 library have nothing to do with this task, so that it should be handled by FFMPEG?
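For readers who do need the PAT contents on the client side, here is a minimal sketch of pulling the (program_number, PMT PID) pairs out of a single 188-byte transport packet that carries a complete PAT. The field layout follows ISO/IEC 13818-1; the helper name is illustrative, and the sketch assumes no adaptation field and a PAT that fits in one packet.

```cpp
#include <cstddef>
#include <cstdint>
#include <utility>
#include <vector>

// Hypothetical helper: extract (program_number, PMT PID) pairs from a
// 188-byte TS packet carrying a complete PAT (PID 0x0000,
// payload_unit_start_indicator set, no adaptation field).
std::vector<std::pair<uint16_t, uint16_t>> parsePAT(const uint8_t* pkt, size_t len) {
  std::vector<std::pair<uint16_t, uint16_t>> programs;
  if (len < 188 || pkt[0] != 0x47) return programs;   // 0x47 is the TS sync byte
  uint16_t pid = ((pkt[1] & 0x1F) << 8) | pkt[2];
  if (pid != 0x0000) return programs;                 // the PAT lives on PID 0
  size_t p = 4;                                       // TS header is 4 bytes
  p += 1 + pkt[p];                                    // skip pointer_field
  uint16_t sectionLen = ((pkt[p + 1] & 0x0F) << 8) | pkt[p + 2];
  size_t end = p + 3 + sectionLen - 4;                // stop before the CRC_32
  p += 8;                                             // fixed section header
  while (p + 4 <= end) {
    uint16_t program = (pkt[p] << 8) | pkt[p + 1];
    uint16_t pmtPid = ((pkt[p + 2] & 0x1F) << 8) | pkt[p + 3];
    if (program != 0) programs.emplace_back(program, pmtPid); // 0 = network PID
    p += 4;
  }
  return programs;
}
```

A production demuxer would also check the CRC, handle multi-packet sections, and track version_number changes; FFmpeg's demuxer does all of this for you.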
Thanks and regards From conchi.ap at vaelsys.com Thu Jan 24 01:30:03 2013 From: conchi.ap at vaelsys.com (Conchi Abasolo) Date: Thu, 24 Jan 2013 10:30:03 +0100 Subject: [Live-devel] ProxyRTSPClient::sendLivenessCommand Timeout In-Reply-To: References: <51003167.1060305@vaelsys.com> Message-ID: <5100FF1B.9060701@vaelsys.com> Hello Ross: > Once the 'back-end' connection to the proxied stream has been > established (as a result of a successful "DESCRIBE" command), the > proxy server periodically (at an interval randomly chosen between 30 > and 60 seconds (usually)) sends a command to the 'back-end' server, to > test whether it's still alive. (This command will be "OPTIONS", > unless the server has specifically reported (in response to a previous > "OPTIONS" command) that it supports "GET_PARAMETER", in which case the > command will be "GET_PARAMETER".) > > The proxy server acts, based upon the response to each 'liveness' > command. If the response is "OK", it goes ahead and sends another > 'liveness' command later. If the response is not "OK", or if it > detects that the RTSP connection with the back-end server has failed, > then it assumes that the back-end server is down, and it will then > closes the back-end connection, and start again by sending another > "DESCRIBE". (These "DESCRIBE" commands are also repeated - at > increasing random intervals - until the server responds.) Yes, that's what I understood about the liveness control mechanism, and it works properly when the backend connection is alive and working. > > What we don't do, however, is test whether the back-end server > actually responds to each 'liveness' command ("OPTIONS" or > "GET_PARAMETER"). So far, I've assumed that if the back-end server > fails, it will do so by closing the TCP connection, which we (the > proxy) will eventually detect. The code currently does not allow for > the TCP connection staying alive, but the back-end server simply > failing to respond at all to a 'liveness' command.
Is this something > that you are actually seeing happen? Yes, it is actually happening. I'm not sure if the assumption "if the back-end server fails, it will do so by closing the TCP connection" is strong enough. What I'm actually seeing is that sometimes the liveness command is sent but no response is received, and the backend connection seems to be still alive, because the proxy doesn't detect any closure, but the connection is not really working. This is a problem, because you can't stream the video through this ProxyRTSPClient anymore, so I think it would be useful to have a timeout task that controls how long the proxy should wait for the liveness response, in order to reset such connections. Conchi Abasolo Vaelsys -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Jan 24 01:42:29 2013 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 24 Jan 2013 01:42:29 -0800 Subject: [Live-devel] ProxyRTSPClient::sendLivenessCommand Timeout In-Reply-To: <5100FF1B.9060701@vaelsys.com> References: <51003167.1060305@vaelsys.com> <5100FF1B.9060701@vaelsys.com> Message-ID: <51D94EC6-DA7B-47C7-8052-6079E0319912@live555.com> > I'm not sure if the assumption "if the back-end server fails, it will do so by closing the TCP connection" is strong enough. What I'm actually seeing is that sometimes the liveness command is sent but no response is received, and the backend connection seems to be still alive, because the proxy doesn't detect any closure, but the connection is not really working. This is a problem, because you can't stream the video through this ProxyRTSPClient anymore, so I think it would be useful to have a timeout task that controls how long the proxy should wait for the liveness response, in order to reset such connections. Yes, I will probably make such a change sometime in the future.
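A minimal sketch of the control logic such a timeout needs - arm a deadline when the liveness command is sent, clear it on a response, and treat an expired deadline as "reset the back-end connection". None of this is LIVE555 API; times are plain tick counts, and the class name is illustrative.

```cpp
#include <cstdint>

// Toy model of the missing backend liveness check: if no response
// arrives before the deadline, the caller should tear down and
// re-establish the back-end connection.
class LivenessWatchdog {
public:
  explicit LivenessWatchdog(uint64_t timeoutTicks) : timeout_(timeoutTicks) {}
  void commandSent(uint64_t now) { deadline_ = now + timeout_; armed_ = true; }
  void responseReceived() { armed_ = false; }
  // Polled from the event loop: true means "reset the connection".
  bool expired(uint64_t now) const { return armed_ && now >= deadline_; }
private:
  uint64_t timeout_;
  uint64_t deadline_ = 0;
  bool armed_ = false;
};
```

In a LIVE555-style single-threaded event loop, the "poll" would instead be a delayed task scheduled alongside the liveness command itself.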
You should note, though, that if a back-end server fails in this way (by not responding to RTSP commands at all, even though the TCP connection remains open), then chances are that any further connection attempt (from the proxy server) will also fail. So you really can't expect a client to be able to receive the stream anymore. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From Pablo.Gomez at scch.at Thu Jan 24 05:06:42 2013 From: Pablo.Gomez at scch.at (Pablo Gomez) Date: Thu, 24 Jan 2013 13:06:42 +0000 Subject: [Live-devel] unicast onDemand from live source NAL Units Message-ID: <905DBF721640BD47A97C0AE82D85BDF420948ADE@exmore.scch.at> >No, this is wrong! You should not be creating/using a "H264VideoStreamFramer" at all. That class should be used *only* when the input is a >byte stream (e.g., from a file). If - as in your case - the input is a discrete sequence of NAL units (i.e., one NAL unit at a time), then you should >use a "H264VideoStreamDiscreteFramer" only. So, you should replace the line > return H264VideoStreamFramer::createNew(envir(), discFramer); with > return discFramer; Ok, doing that, the problem with fMaxSize is fixed, and its value is the one I specified in OutPacketBuffer::maxSize. However, in the player I don't see anything, just the 'loading screen'.
Because I should not include start codes in the NAL units, I deactivated them in the encoder. According to my encoder specification http://docs.nvidia.com/cuda/samples/3_Imaging/cudaEncode/doc/nvcuvenc.pdf p.28 I have a few options for this:
- 0 implies that the encoder will add the start codes
- 1, 2, 4: length-prefixed NAL units, with a prefix of 1, 2, or 4 bytes

If I set the parameter to 0, the discrete framer complains with the following message: "H264VideoStreamDiscreteFramer error: MPEG 'start code' seen in the input". I guess that's expected, because I should not include start codes; at this point all is clear. However, with the parameter in the encoder set to 1, 2 or 4 it didn't complain at all, but I still do not see anything in the player. If I keep using the H264VideoStreamFramer as I was using before - I know it is wrong - with the encoder parameter set to 0 (start codes), I see video in the player with artifacts, as I already explained in previous posts. Meanwhile, with the parameter set to 1, 2 or 4, I do not see anything at all, which is similar to what I get when using just the discrete framer. I wonder what the implications of start codes or length-prefixed NAL units are for the discrete framer. Pablo -------------- next part -------------- An HTML attachment was scrubbed...
URL: From finlayson at live555.com Thu Jan 24 06:59:32 2013 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 24 Jan 2013 06:59:32 -0800 Subject: [Live-devel] unicast onDemand from live source NAL Units In-Reply-To: <905DBF721640BD47A97C0AE82D85BDF420948ADE@exmore.scch.at> References: <905DBF721640BD47A97C0AE82D85BDF420948ADE@exmore.scch.at> Message-ID: <7F2C29DB-7D78-4C9B-B211-E6F3C2EC405B@live555.com> > I have few options for this: > 0 implies that the encoder will add the start codes > 1, 2, 4: length prefixed NAL units of size 1, 2, or 4 bytes > > If I set up the parameter to 0 the Discreteframer complains with the following message 'H264VideoStreamDiscreteFramer error: MPEG 'start code' seen in the input\n";' I guess that's expected because I should not include start codes at this point all clear. However, with the parameter in the encoder set to 1, 2 or 4 it didn't complain at all but I still do not visualize anything in the player. Remember that the data that you copy to *fTo should be a NAL unit, and nothing else. That means no start code at the front. But it also means nothing else at the front - including your 'length prefix'. In other words - you need to omit the 'length prefix' when you copy the NAL unit to *fTo. (Of course, you will use this 'length prefix' value to tell you how much data to copy, and you'll also set "fFrameSize" to this value.) > I wonder why are the implications with start codes or prefixed NAL units size and the discreteframer.. You don't need to speculate about this. Remember, You Have Complete Source Code. Just look at the code in "liveMedia/H264VideoStreamFramer.cpp", starting at line 62. This code expects the delivered data to be a NAL unit - i.e., beginning with a byte that contains the "nal_unit_type" - and nothing else. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed...
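Ross's advice - strip the length prefix, deliver the bare NAL unit, and use the prefix only to size the copy - can be sketched independently of LIVE555. The sketch below assumes 4-byte big-endian prefixes (the "4" encoder setting); the function name is illustrative.

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Split a buffer of length-prefixed NAL units into bare NAL units.
// The 4-byte prefix must NOT be part of what reaches *fTo; it only
// tells us how many bytes to copy (what fFrameSize would be set to).
std::vector<std::vector<uint8_t>> splitLengthPrefixedNALUs(const uint8_t* buf, size_t len) {
  std::vector<std::vector<uint8_t>> nalUnits;
  size_t p = 0;
  while (p + 4 <= len) {
    uint32_t n = (uint32_t(buf[p]) << 24) | (uint32_t(buf[p + 1]) << 16)
               | (uint32_t(buf[p + 2]) << 8) | uint32_t(buf[p + 3]);
    p += 4;                 // skip the prefix itself
    if (p + n > len) break; // truncated input; stop cleanly
    nalUnits.emplace_back(buf + p, buf + p + n); // the NAL unit, nothing else
    p += n;
  }
  return nalUnits;
}
```

The first byte of each extracted unit is then the one containing the nal_unit_type, which is exactly what a discrete framer expects to see.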
URL: From togba.liberty at ipconfigure.com Thu Jan 24 13:43:05 2013 From: togba.liberty at ipconfigure.com (Togba Liberty) Date: Thu, 24 Jan 2013 21:43:05 +0000 Subject: [Live-devel] About AAC Streaming Message-ID: <390D2E6DB4DD854B805F9A0E0E21B6010A17BE6B@BL2PRD0411MB410.namprd04.prod.outlook.com> We are building a server to transcode live JPEG video and AAC audio streams to clients from IP cameras. We have used the testRTSPClient program as a guide to get the JPEG and AAC audio from the cameras, and copied the MediaSink buffer into our own shared buffer, with the aim of encoding the JPEG video in the near future. We then pull from the shared buffer using tbb::tasks to stream the JPEG and audio. The video works fine; however, we are having issues getting VLC to play the audio, regardless of whether we stream it or save it to a file. Can you please provide some insight into the frames generated by MPEG4GenericRTPSource, so that we can have an idea of how to restream the AAC audio? Alternatively, can you give further insight into how to format the raw AAC frames for live555? -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Jan 24 16:48:46 2013 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 24 Jan 2013 16:48:46 -0800 Subject: [Live-devel] About AAC Streaming In-Reply-To: <390D2E6DB4DD854B805F9A0E0E21B6010A17BE6B@BL2PRD0411MB410.namprd04.prod.outlook.com> References: <390D2E6DB4DD854B805F9A0E0E21B6010A17BE6B@BL2PRD0411MB410.namprd04.prod.outlook.com> Message-ID: <8299E348-F7E0-4764-8F76-D31AA49110A3@live555.com> > Can you please provide some insight as to the frames generated by MPEG4GenericRTPSource so that we can have an idea of how to restream the AAC audio. The frames are simply AAC audio frames, delivered one at a time. However, to decode (or restream) these frames, you also need extra 'configuration' information.
This is carried 'out of band' in the stream's SDP description (that the client received in response to its initial RTSP "DESCRIBE" command). You can get this information by calling the following functions on the stream's "MediaSubsession" object:
- MediaSubsession::fmtp_mode() This will return a string like "AAC-hbr", describing which particular AAC 'mode' this audio stream uses.
- MediaSubsession::fmtp_config() This returns a string that contains 'configuration' information. Depending upon your decoder, you can pass this string to your decoder 'as is', or you can translate it into binary form by calling our function "parseGeneralConfigStr()". However, for restreaming the frames, the configuration information stays in string form (see below).

To restream these frames, you need to create a "MPEG4GenericRTPSink" object, as follows:

RTPSink* audioRTPSink = MPEG4GenericRTPSink::createNew(envir(), rtpGroupsock,
    rtpPayloadType, mediaSubsession->rtpTimestampFrequency(), "audio",
    mediaSubsession->fmtp_mode(), mediaSubsession->fmtp_config(),
    mediaSubsession->numChannels());

and, as always, an associated "RTCPInstance" object (to implement RTCP). You can then do the restreaming by calling "startPlaying()" on this "MPEG4GenericRTPSink", taking its input from "mediaSubsession->readSource()" - i.e.

audioRTPSink->startPlaying(*(mediaSubsession->readSource()), afterPlaying, NULL);

(where "afterPlaying" is your completion callback). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed...
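For the decoder-side path, the hex 'config' string can be turned into the binary AudioSpecificConfig and its leading fields read out. This sketch mirrors what parseGeneralConfigStr() does for the hex decoding; the helper names and the AacConfig struct are illustrative, and the bit layout follows ISO/IEC 14496-3.

```cpp
#include <cstddef>
#include <cstdint>
#include <string>
#include <vector>

// Decode the SDP 'config' hex string (as returned by fmtp_config())
// into AudioSpecificConfig bytes.
std::vector<uint8_t> configStrToBytes(const std::string& hex) {
  std::vector<uint8_t> out;
  auto nibble = [](char c) -> int {
    if (c >= '0' && c <= '9') return c - '0';
    if (c >= 'a' && c <= 'f') return c - 'a' + 10;
    if (c >= 'A' && c <= 'F') return c - 'A' + 10;
    return -1;
  };
  for (size_t i = 0; i + 1 < hex.size(); i += 2) {
    int hi = nibble(hex[i]), lo = nibble(hex[i + 1]);
    if (hi < 0 || lo < 0) break;
    out.push_back(uint8_t((hi << 4) | lo));
  }
  return out;
}

// Leading AudioSpecificConfig fields: 5 bits audio object type,
// 4 bits sampling frequency index, 4 bits channel configuration.
struct AacConfig { int objectType, freqIndex, channels; };
AacConfig parseAudioSpecificConfig(const std::vector<uint8_t>& c) {
  AacConfig a{0, 0, 0};
  if (c.size() < 2) return a;
  a.objectType = c[0] >> 3;
  a.freqIndex = ((c[0] & 0x07) << 1) | (c[1] >> 7);
  a.channels = (c[1] >> 3) & 0x0F;
  return a;
}
```

For example, the common config string "1210" decodes to object type 2 (AAC LC), frequency index 4 (44.1 kHz), 2 channels.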
URL: From finlayson at live555.com Thu Jan 24 17:00:40 2013 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 24 Jan 2013 17:00:40 -0800 Subject: [Live-devel] Question About PAT/PMT in MPEG2 In-Reply-To: References: Message-ID: <954B9205-26F7-4946-94BB-44FF84E52ECA@live555.com> > I'm building and iOS application, where I'm using a RTSP Client (based on testRTSPClient) to receive a MPEG2 stream Because you refer to "PAT" and "PMT" information, I assume you're referring to an MPEG-2 *Transport* stream. (There are other kinds of MPEG-2 streams as well.) > and then I'm trying to decode the frames with help of FFMPEG. My question is: Is it possible to obtain the PAT and PMT information before I pass the frames to FFMPEG with help of the live555 library, or does the live555 library has nothing to do with this task and it should be handle by FFMPEG? The latter. When the RTSP/RTP client receives the MPEG Transport Stream data, it doesn't inspect its contents at all. So your decoder will need to do that. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From ferielbenghorbel at gmail.com Thu Jan 24 04:30:00 2013 From: ferielbenghorbel at gmail.com (feriel ben ghorbel) Date: Thu, 24 Jan 2013 13:30:00 +0100 Subject: [Live-devel] Questions about live555: Message-ID: Hi all, Ross, I have some questions about live555. I'm using it on an Ubuntu platform. First, is there a possibility to put "*.ts" streams in a directory other than the directory of the "live555MediaServer" application? I tried to do it, but it refuses to launch the stream. (NB: when I run the server, I simply type "live555MediaServer".) Second, I am looking for information on the performance of live555, in terms of throughput and the number of streams it can serve at the same time.
Thanks and regards _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From bruno.abreu at livingdata.pt Thu Jan 24 17:25:02 2013 From: bruno.abreu at livingdata.pt (Bruno Abreu) Date: Fri, 25 Jan 2013 01:25:02 +0000 Subject: [Live-devel] StreamReplicator with FileSink problem In-Reply-To: References: <20130110132339.M35692@livingdata.pt> <20130121164802.M92973@livingdata.pt> Message-ID: <5101DEEE.80704@livingdata.pt> On 01/22/2013 09:51 PM, Ross Finlayson wrote:
> Yes, I agree. In the next version of the software, this piece of code -
> in "FileSink::afterGettingFrame()" - will become:
>
> if (fOutFid == NULL || fflush(fOutFid) == EOF) {
>   // The output file has closed. Handle this the same way as if the input source had closed:
>   if (fSource != NULL) fSource->stopGettingFrames();
>   onSourceClosure(this);
>   return;
> }
On 01/23/2013 01:27 AM, Ross Finlayson wrote:
> Ugh. I'm not thrilled by this hack, but right now I don't see a better
> solution, so I've gone ahead and included it in the latest release
> (2013.01.23) of the software.
Hello Ross, just reporting that everything is working as expected in our code with the latest release (2013.01.23) of LIVE555. Thank you very much for your support. Bruno Abreu -- Living Data - Sistemas de Informação e Apoio à Decisão, Lda. LxFactory - Rua Rodrigues de Faria, 103, edifício I - 4º piso Phone: +351 213622163 1300-501 LISBOA Fax: +351 213622165 Portugal URL: www.livingdata.pt From finlayson at live555.com Fri Jan 25 01:12:51 2013 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 25 Jan 2013 01:12:51 -0800 Subject: [Live-devel] Questions about live555: In-Reply-To: References: Message-ID: > at first, is there a possibility to put "*.ts" streams in a directory other than the directory of "live555MediaServer" application?
Yes, as noted in the online documentation - http://www.live555.com/mediaServer/ - the files to be streamed must be either in the same directory as the "live555MediaServer" application, or in a *subdirectory* of this. (If you put your file in a subdirectory, then (obviously) the subdirectory path will need to be included in the "rtsp://" URL.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From saravanan.s at fossilshale.com Fri Jan 25 03:42:33 2013 From: saravanan.s at fossilshale.com (saravanan) Date: Fri, 25 Jan 2013 17:12:33 +0530 Subject: [Live-devel] Questions about live555: In-Reply-To: References: Message-ID: <00ac01cdfaf1$123ad400$36b07c00$@s@fossilshale.com> Hi, I am using the Live555 server to stream MJPEG data captured from the device. We derived a class JPEGDeviceSource from JPEGVideoSource, and we read the data from a shared buffer (sent by the device) in doGetNextFrame() of JPEGDeviceSource. Everything works fine for a single RTSP client session; if we open a second RTSP client session, the frame rate is halved. The GetBuffer call to the device always returns a new frame, so a single session works fine at the full frame rate (30fps). If we open one more session, the frame rate is halved (15fps). Once I call GetBuffer(), I get the complete frame. Now I want to make sure that this frame data is available to all the sessions opened with this server. How do I achieve this? Thanks, Saravanan S -------------- next part -------------- An HTML attachment was scrubbed...
URL: From finlayson at live555.com Fri Jan 25 05:47:49 2013 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 25 Jan 2013 05:47:49 -0800 Subject: [Live-devel] Questions about live555: In-Reply-To: <00ac01cdfaf1$123ad400$36b07c00$@s@fossilshale.com> References: <00ac01cdfaf1$123ad400$36b07c00$@s@fossilshale.com> Message-ID: <4156BD39-E4AD-4B16-97A5-8A63DF5753E7@live555.com> > I am using the Live555 server to stream MJPEG data captured from the device. We derived a class JPEGDeviceSource from JPEGVideSource, and reading the data from sharedbuffer(sent by device) in doGetNextFrame() of JPEGDeviceSource. > > Everything works fine for a single RTSP client session, if we open a second RTSP client session the frame rate is reduced by 2 . The GetBuffer call to the Device returns always new frame, so single session is working fine with full frame rate(30fps). If we open one more session then the frame rate is reduced by 2 (15fps). > > Once I call the GetBuffer(), I will get the complete frame. Now I want to make sure that this frame data should be available for all the sessions opened with this server. How to achieve this ? You haven't said anything about your server implementation, but I presume you have implemented your own subclass of "OnDemandServerMediaSubsession". Your subclass's constructor - when it calls the "OnDemandServerMediaSubsession" constructor - should make sure that the "reuseFirstSource" parameter is "True". You do this because you are streaming from a live source, rather than from a file. Setting "reuseFirstSource" to "True" tells the server to use a single input stream as a data source, regardless of how many clients are currently accessing the server. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
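The effect of "reuseFirstSource" can be modeled in a few lines. This is an illustrative model of the behaviour, not the LIVE555 implementation: with reuse enabled, one live source is created and shared by every client session; without it, each session opens its own capture source, which for a fixed-rate device splits the frame rate.

```cpp
#include <memory>

// Stand-in for a live capture device; counts how many times it is opened.
struct LiveSource {
  static int instancesCreated;
  LiveSource() { ++instancesCreated; }
};
int LiveSource::instancesCreated = 0;

// Toy subsession: decides whether a new client gets the shared live
// source or a freshly created one.
class Subsession {
public:
  explicit Subsession(bool reuseFirstSource) : reuse_(reuseFirstSource) {}
  std::shared_ptr<LiveSource> getSourceForNewClient() {
    if (reuse_ && shared_) return shared_;  // hand back the one live feed
    auto s = std::make_shared<LiveSource>();
    if (reuse_) shared_ = s;
    return s;
  }
private:
  bool reuse_;
  std::shared_ptr<LiveSource> shared_;
};
```

With reuse enabled, two clients share one LiveSource; with it disabled, two clients open two, matching the halved-frame-rate symptom described above.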
URL: From bruno.abreu at livingdata.pt Fri Jan 25 07:22:14 2013 From: bruno.abreu at livingdata.pt (Bruno Abreu) Date: Fri, 25 Jan 2013 15:22:14 +0000 Subject: [Live-devel] Problem with fix in MPEG2TransportStreamFromESSource stops frame delivery Message-ID: <20130125151154.M77748@livingdata.pt> On 01/22/2013 06:32 PM, Ross Finlayson wrote: >> We think this change was introduced in the sequence of: >> 2012.09.11: >> - Fixed a bug in "MPEG2TransportStreamFromESSource": Its destructor wasn't >> stopping the delivery from upstream objects. > > That's correct. That line > doStopGettingFrames(); > was added to the "MPEG2TransportStreamFromESSource" destructor for a > good reason. You should NOT be deleting that line. > > If you have a "MPEG2TransportStreamFromESSource" as a replica, then when > the "MPEG2TransportStreamFromESSource" destructor gets called - for > whatever reason - then that will cause > "StreamReplica::doStopGettingFrames()" to be called first, then > "StreamReplica::~StreamReplica()". Neither of which should be causing > any problems for you. Of course. I didn't mean that fix was wrong, just that it exposed a problem that must lie elsewhere and, for us, the immediate solution to keep things running was to remove that line. And, in fact, we just found out where the real problem is. It's in StreamReplicator. In order to create the stream's SDP description, the createNewStreamSource() method from an OnDemandServerMediaSubsession is called twice. The first returned FramedSource is immediately destructed and then a new one is created. We've been observing this behavior for a long time, and it's fine. In our case this means that a StreamReplica (of our live DeviceSource) will be created and immediately destructed, without (and this is what causes the problem) ever having been activated. 
Since our replica feeds a MPEG2TransportStreamFromESSource, the introduction of a call to doStopGettingFrames() in MPEG2TransportStreamFromESSource's destructor causes StreamReplica::doStopGettingFrames() to be called as well. And the problem is that the replica was never activated (getNextFrame() was never called), but deactivateStreamReplica() will be called (line 318) and will decrease the fNumActiveReplicas counter. From then on we see the "Internal Error 2" message (which we hadn't noticed before, sorry!) and everything stops. The following patch did solve this problem and, this time, I think this is a correct solution:

@@ -316,4 +316,6 @@
 void StreamReplica::doStopGettingFrames() {
-  fFrameIndex = -1; // When we start reading again, this will tell the replicator that we were previously inactive.
-  fOurReplicator.deactivateStreamReplica(this);
+  if (fFrameIndex != -1) { // this might not have been activated at all
+    fFrameIndex = -1; // When we start reading again, this will tell the replicator that we were previously inactive.
+    fOurReplicator.deactivateStreamReplica(this);
+  }
 }

Can this be fixed in the next release? Thank you, Bruno Abreu PS - I'm having mail server problems so this is a new message, not a reply to the previous one on this topic. Please accept my apologies if it creates a new thread. -- Living Data - Sistemas de Informação e Apoio à Decisão, Lda. LxFactory - Rua Rodrigues de Faria, 103, edifício I - 4º 
piso Phone: +351 213622163 1300-501 LISBOA Fax: +351 213622165 Portugal URL: www.livingdata.pt From saravanan.s at fossilshale.com Fri Jan 25 07:43:12 2013 From: saravanan.s at fossilshale.com (saravanan) Date: Fri, 25 Jan 2013 21:13:12 +0530 Subject: [Live-devel] Questions about live555: In-Reply-To: <4156BD39-E4AD-4B16-97A5-8A63DF5753E7@live555.com> References: <00ac01cdfaf1$123ad400$36b07c00$@s@fossilshale.com> <4156BD39-E4AD-4B16-97A5-8A63DF5753E7@live555.com> Message-ID: <000001cdfb12$b09cedb0$11d6c910$@s@fossilshale.com> Dear Ross Finlayson, Thanks, it now works fine with your input to set the flag to True. Regards, Saravanan S From: Ross Finlayson [mailto:finlayson at live555.com] Sent: Friday, January 25, 2013 7:18 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Questions about live555: I am using the Live555 server to stream MJPEG data captured from the device. We derived a class JPEGDeviceSource from JPEGVideoSource, and read the data from a shared buffer (sent by the device) in doGetNextFrame() of JPEGDeviceSource. Everything works fine for a single RTSP client session; if we open a second RTSP client session, the frame rate is halved. The GetBuffer call to the device always returns a new frame, so a single session works fine at the full frame rate (30 fps). If we open one more session, the frame rate is halved (15 fps). Once I call GetBuffer(), I get the complete frame. Now I want to make sure that this frame data is available to all the sessions opened with this server. How can I achieve this? You haven't said anything about your server implementation, but I presume you have implemented your own subclass of "OnDemandServerMediaSubsession". Your subclass's constructor - when it calls the "OnDemandServerMediaSubsession" constructor - should make sure that the "reuseFirstSource" parameter is "True". You do this because you are streaming from a live source, rather than from a file. 
Setting "reuseFirstSource" to "True" tells the server to use a single input stream as a data source, regardless of how many clients are currently accessing the server. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jshanab at smartwire.com Fri Jan 25 10:02:20 2013 From: jshanab at smartwire.com (Jeff Shanab) Date: Fri, 25 Jan 2013 18:02:20 +0000 Subject: [Live-devel] Question About PAT/PMT in MPEG2 In-Reply-To: <954B9205-26F7-4946-94BB-44FF84E52ECA@live555.com> References: , <954B9205-26F7-4946-94BB-44FF84E52ECA@live555.com> Message-ID: <615FD77639372542BF647F5EBAA2DBC22524C66A@IL-BOL-EXCH01.smartwire.com> If you are using HTTP Live Streaming, which depends on the MPEG-2 Transport Stream, I implemented this on live cameras by guerrilla subclassing (cut-n-paste and modify) the MPEG2TransportStreamFromESSource into a MPEG2TransportStreamFromESSource4iOS class. I changed the insertion of the PAT and PMT to be relative to a chosen number of keyframes instead of a random time, so I could easily segment it on the fly downstream. HLS on iOS is extremely picky about the segment size and what it starts with. While the PAT/PMT must have been received at least once in the past, I start each segment with it and a keyframe to allow connections at any time. ________________________________ From: live-devel-bounces at ns.live555.com [live-devel-bounces at ns.live555.com] on behalf of Ross Finlayson [finlayson at live555.com] Sent: Thursday, January 24, 2013 7:00 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Question About PAT/PMT in MPEG2 I'm building an iOS application, where I'm using an RTSP client (based on testRTSPClient) to receive a MPEG2 stream Because you refer to "PAT" and "PMT" information, I assume you're referring to a MPEG-2 *Transport* stream. (There are other kinds of MPEG-2 streams as well.) and then I'm trying to decode the frames with help of FFMPEG. 
My question is: Is it possible to obtain the PAT and PMT information before I pass the frames to FFMPEG with help of the live555 library, or does the live555 library have nothing to do with this task, and should it be handled by FFMPEG? The latter. When the RTSP/RTP client receives the MPEG Transport Stream data, it doesn't inspect its contents at all. So your decoder will need to do that. Ross Finlayson Live Networks, Inc. http://www.live555.com/ This message and any attachments contain confidential and proprietary information, and may contain privileged information, belonging to one or more affiliates of Windy City Wire Cable & Technology Products, LLC. No privilege is waived by this transmission. Unauthorized use, copying or disclosure of such information is prohibited and may be unlawful. If you receive this message in error, please delete it from your system, destroy any printouts or copies of it, and notify the sender immediately by e-mail or phone. -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Jan 25 10:49:57 2013 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 25 Jan 2013 10:49:57 -0800 Subject: [Live-devel] Problem with fix in MPEG2TransportStreamFromESSource stops frame delivery In-Reply-To: <20130125151154.M77748@livingdata.pt> References: <20130125151154.M77748@livingdata.pt> Message-ID: > Can this be fixed in the next release? Yes, I have included it in a new version - 2013.01.25 - released just now. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From togba.liberty at ipconfigure.com Fri Jan 25 11:13:46 2013 From: togba.liberty at ipconfigure.com (Togba Liberty) Date: Fri, 25 Jan 2013 19:13:46 +0000 Subject: [Live-devel] About AAC Streaming In-Reply-To: <8299E348-F7E0-4764-8F76-D31AA49110A3@live555.com> References: <390D2E6DB4DD854B805F9A0E0E21B6010A17BE6B@BL2PRD0411MB410.namprd04.prod.outlook.com> <8299E348-F7E0-4764-8F76-D31AA49110A3@live555.com> Message-ID: <390D2E6DB4DD854B805F9A0E0E21B6010A17C0AF@BL2PRD0411MB410.namprd04.prod.outlook.com> Hi All, Thanks for the expedient response. What I failed to mention in my original email was that we are developing a transcoding server that will encode an MJPEG stream to an H264 stream in real time and send out both streams through RTSP to multiple clients. We were able to build the pipeline, grab the MJPEG stream from the camera through the Live555 testRTSPClient program, encode the resulting stream, and send it out to multiple clients. The issue we have now is that, in retransmitting the audio stream, we don't understand whether Live555 adds an extra header to the AAC frames. This is because we are able to use the RTSPClient subsession to generate the initial settings for our server's RTPSink and all works fine. Also, we have different threads to spool up multiple streams from different cameras and, as a result, our clients have different UsageEnvironments and TaskSchedulers from that of the RTSPServer (this was mentioned as a way to implement transferring streams from live sources). In order to transfer the streams, we copy the Live555 buffers into our shared buffers and call the event trigger in the OnDemandServerMediaSubsession createNewStreamSource() function. 
So in our project the pipeline looks like: testRTSPClient (jpeg & aac):TaskScheduler1 --> shared buffer (jpeg & aac) --> H.264 encoder (h.264) --> RTSPServer (jpeg & h.264 & aac):TaskScheduler2 --> Client We have the jpeg and h.264 pipeline going, we just need to know what headers the MPEG4GenericRTPSource is adding to the aac frames to be able to retransmit it so VLC can play it. Thanks, Togba Liberty C++ Developer | ipConfigure, Inc. www.ipconfigure.com From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Thursday, January 24, 2013 7:49 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] About AAC Streaming Can you please provide some insight as to the frames generated by MPEG4GenericRTPSource so that we can have an idea of how to restream the AAC audio. The frames are simply AAC audio frames, delivered one at a time. However, to decode (or restream) these frames, you also need extra 'configuration' information. This is carried 'out of band' in the stream's SDP description (that the client received in response to its initial RTSP "DESCRIBE" command). You can get this information by calling the following functions on the stream's "MediaSubsession" object: MediaSubsession::fmtp_mode() This will return a string like "AAC-hbr" - describing which particular AAC 'mode' this audio stream is MediaSubsession::fmtp_config() This returns a string that contains 'configuration' information. Depending upon your decoder, you can pass this string to your decoder 'as is', or you can translate it into binary form by calling our function "parseGeneralConfigStr()". However, for restreaming the frames, the configuration information stays in string form (see below). 
To restream these frames, you need to create a "MPEG4GenericRTPSink" object, as follows:

RTPSink* audioRTPSink = MPEG4GenericRTPSink::createNew(envir(), rtpGroupsock,
    rtpPayloadType, mediaSubsession->rtpTimestampFrequency(), "audio",
    mediaSubsession->fmtp_mode(), mediaSubsession->fmtp_config(),
    mediaSubsession->numChannels());

and, as always, an associated "RTCPInstance" object (to implement RTCP). You can then do the restreaming by calling "startPlaying()" on this "MPEG4GenericRTPSink", taking its input from "mediaSubsession->readSource()" - i.e.

audioRTPSink->startPlaying(*(mediaSubsession->readSource()), afterPlaying, NULL);

(passing your own "afterPlaying" completion function and its client data). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: image001.png Type: image/png Size: 25313 bytes Desc: image001.png URL: From finlayson at live555.com Fri Jan 25 14:55:38 2013 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 25 Jan 2013 14:55:38 -0800 Subject: [Live-devel] About AAC Streaming In-Reply-To: <390D2E6DB4DD854B805F9A0E0E21B6010A17C0AF@BL2PRD0411MB410.namprd04.prod.outlook.com> References: <390D2E6DB4DD854B805F9A0E0E21B6010A17BE6B@BL2PRD0411MB410.namprd04.prod.outlook.com> <8299E348-F7E0-4764-8F76-D31AA49110A3@live555.com> <390D2E6DB4DD854B805F9A0E0E21B6010A17C0AF@BL2PRD0411MB410.namprd04.prod.outlook.com> Message-ID: <1273BCC7-F0C9-4502-9982-3763616C2B56@live555.com> > The issue we have now is that in retransmitting the audio stream we don't understand if Live555 adds an extra header to the AAC frames. As I said before - it doesn't. The frames that come from the "MPEG4GenericRTPSource" are simply AAC frames, with no extra header added. > We have the jpeg and h.264 pipeline going, we just need to know what headers the MPEG4GenericRTPSource is adding to the aac frames to be able to retransmit it so VLC can play it. 
VLC should be able to play the stream, provided that you set up the "MPEG4GenericRTPSink" correctly (as I noted in my previous email), and also provided that you are setting "fPresentationTime" correctly on each outgoing frame. The best way to test this is *not* to use VLC at first, but instead to use our "testRTSPClient" application. You should make sure that the "a=config" value in the SDP description is the same as it was in the original ('back-end') stream. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From pmepme6 at gmail.com Sat Jan 26 02:35:58 2013 From: pmepme6 at gmail.com (Pme) Date: Sat, 26 Jan 2013 11:35:58 +0100 Subject: [Live-devel] Question About PAT/PMT in MPEG2 In-Reply-To: <954B9205-26F7-4946-94BB-44FF84E52ECA@live555.com> References: <954B9205-26F7-4946-94BB-44FF84E52ECA@live555.com> Message-ID: <97F24C32-EF91-40D8-A9BA-8D6A15BBA345@gmail.com> Thanks for the quick response. One other question: I implemented the DummySink, but I think I need to modify the data I'm getting before passing it to FFMPEG (header files and other stuff; I've read that's necessary for H.264, but not sure for MPEG-TS). Is that right? If that is the case, would you mind pointing me to an example, MPEG1or2VideoRTPSink maybe? Thanks a lot and regards. On Jan 25, 2013, at 2:00 AM, Ross Finlayson wrote: >> I'm building an iOS application, where I'm using an RTSP client (based on testRTSPClient) to receive a MPEG2 stream > > Because you refer to "PAT" and "PMT" information, I assume you're referring to a MPEG-2 *Transport* stream. (There are other kinds of MPEG-2 streams as well.) > > >> and then I'm trying to decode the frames with help of FFMPEG. My question is: Is it possible to obtain the PAT and PMT information before I pass the frames to FFMPEG with help of the live555 library, or does the live555 library have nothing to do with this task, and should it be handled by FFMPEG? 
> > The latter. When the RTSP/RTP client receives the MPEG Transport Stream data, it doesn't inspect its contents at all. So your decoder will need to do that. > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sat Jan 26 20:30:47 2013 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 26 Jan 2013 20:30:47 -0800 Subject: [Live-devel] Question About PAT/PMT in MPEG2 In-Reply-To: <97F24C32-EF91-40D8-A9BA-8D6A15BBA345@gmail.com> References: <954B9205-26F7-4946-94BB-44FF84E52ECA@live555.com> <97F24C32-EF91-40D8-A9BA-8D6A15BBA345@gmail.com> Message-ID: <0DCAB3BF-E9B2-4F18-8AAF-EB9514A7A3A9@live555.com> > I implemented the DummySink, but I think I need to modify the data I'm getting before passing it to FFMPEG No you don't - not for MPEG Transport Stream data. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From Pablo.Gomez at scch.at Mon Jan 28 00:50:37 2013 From: Pablo.Gomez at scch.at (Pablo Gomez) Date: Mon, 28 Jan 2013 08:50:37 +0000 Subject: [Live-devel] unicast onDemand from live source NAL Units Message-ID: <905DBF721640BD47A97C0AE82D85BDF420949105@exmore.scch.at> Hi Ross, >Remember that the data that you copy to *fTo should be a NAL unit, and nothing else. That means no start >code at the front. But it also means nothing else at the front - including your >'length prefix'. >In other words - you need to omit the 'length prefix' when you copy the NAL unit to *fTo. (Of course, you >will use this 'length prefix' value to tell you how much data to copy, and you'll also set "fFrameSize" to >this value.) 
Ok, so now I have the problem that I'm not sure if I'm writing something else at the front. I did a few tests: If I set the framing type parameter to '0' -start codes- the output looks like this: '00 00 00 01 09 10 00 00 00 01 67 42 C0 1F F4 02 00 30 D8 08 80 00 01 F4 ...' With this NAL unit and the DiscreteFramer, as expected, it is not working. If I set the framing type parameter to '1' -prefix length- the output looks like this: '02 09 10 20 67 42 C0 1F F4 02 00 30 D8 08 80 00 75 30 .... ' With this NAL unit I cannot see anything either - as expected, because there is some prefix at the beginning. If I set the framing type parameter to '2', also prefix length, the output looks like this: '00 02 09 10 00 20 67 42 C0 1F F4 02 00 30 D8 08 80 00 01 F4 80 00 75 30 70 00 00 0B ....' Again I cannot see anything. If I set the framing type parameter to '4', also prefix length, the output looks like this: '00 00 00 02 09 10 00 00 00 20 67 42 C0 1F F4 02 00 30 D8 08 80 00 01 F4 80 00 75 30 ...' Same results. Every time the encoder has a NAL unit ready, a callback function is called: void nalUnitReady(unsigned char *nal, size_t size); From that function - which is called from a different thread than the one that does the doGetNextFrame() at the streaming server - I also signal the H264LiveStreamFramedSource object, based now on the DeviceSource template. I have tried to omit the front of the NAL units from there by increasing the pointer a few bytes. If I increase the pointer 10 bytes and the encoder parameter is set to '4', the output looks like this: '67 42 C0 1F F4 02 00 30 D8 08 80 00 01 F4 80 75 30 ...' But still, I don't see anything either. Right now I'm not sure if the problem I have is at the front of the NAL units, or whether I did something wrong with live555. Is there any special start pattern? It looks to me like '67 42 C0 1F' somehow is one, but I'm not sure. 
Also, I'm not sure about the meaning of the length prefix, because NAL units seem to have that '02 09 10' at the beginning, so it looks like it is not the size of the NAL. Therefore, regarding the size of the NAL, I'm using the size provided in the callback function. I also did a test reducing the size by the same amount of bytes I'm skipping at the front, but it doesn't work either. Any clue? Thanks! Best Pablo -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Jan 28 01:07:01 2013 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 28 Jan 2013 01:07:01 -0800 Subject: [Live-devel] unicast onDemand from live source NAL Units In-Reply-To: <905DBF721640BD47A97C0AE82D85BDF420949105@exmore.scch.at> References: <905DBF721640BD47A97C0AE82D85BDF420949105@exmore.scch.at> Message-ID: <57F0D8D9-40D1-470D-AB29-78F946453008@live555.com> Look, I don't know how much clearer I can be about this. The data that you copy to *fTo should be a single NAL unit, AND NOTHING ELSE! That means that there should not be ANY 'start code' or 'length prefix' or anything else at the start of the data. (I thought I made this clear in my last email!) To use the example data that you used in your last email, this means that the data that you should copy to *fTo should be 09 10 00 00 00 01 67 42 C0 1F F4 02 00 30 D8 08 80 00 01 F4 ... up until the end of the NAL unit. Don't forget that you must also set "fFrameSize" to the amount of data that you copied - i.e., to the size of the NAL unit that you copied. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From michel.promonet at thalesgroup.com Mon Jan 28 01:07:27 2013 From: michel.promonet at thalesgroup.com (PROMONET Michel) Date: Mon, 28 Jan 2013 10:07:27 +0100 Subject: [Live-devel] RTP header extension In-Reply-To: <95AA7B33-3D4D-4920-9348-430C2E45E8B9@live555.com> References: <12191_1358252478_50F549BE_12191_2520_1_1BE8971B6CFF3A4F97AF4011882AA255015607AAEDDB@THSONEA01CMS01P.one.grp> <8A4A85CF-DEE3-4CE6-BAE5-FC02B9C58ABD@live555.com> <28364_1358331320_50F67DB8_28364_17971_1_1BE8971B6CFF3A4F97AF4011882AA255015607AE98BF@THSONEA01CMS01P.one.grp> <8D621190-17B3-42EF-A2FC-227D355FF585@live555.com> <7773_1358770148_50FD2FE4_7773_3282_1_1BE8971B6CFF3A4F97AF4011882AA2550156097EF359@THSONEA01CMS01P.one.grp> <95AA7B33-3D4D-4920-9348-430C2E45E8B9@live555.com> Message-ID: <9044_1359364093_51063FFD_9044_5167_2_1BE8971B6CFF3A4F97AF4011882AA255015609AFF83C@THSONEA01CMS01P.one.grp> Hi Ross, In order to bring RTP header extension support into live555, are you asking for commercial support or coding support? Best Regards, Michel. [@@THALES GROUP RESTRICTED@@] From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On behalf of Ross Finlayson Sent: Monday, 21 January 2013 15:28 To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] RTP header extension I would like to use the RTP header extension in order to send whether a frame is a synchronisation point, and timestamps (recording time). Do you think it's possible to give a callback to the RTSPClient? No, because RTP header extensions have nothing to do with RTSP. Support - for both transmitters and receivers - for RTP header extensions will likely happen someday, but this is not currently a high-priority feature for free support. Sorry. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Mon Jan 28 01:20:24 2013 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 28 Jan 2013 01:20:24 -0800 Subject: [Live-devel] unicast onDemand from live source NAL Units In-Reply-To: <57F0D8D9-40D1-470D-AB29-78F946453008@live555.com> References: <905DBF721640BD47A97C0AE82D85BDF420949105@exmore.scch.at> <57F0D8D9-40D1-470D-AB29-78F946453008@live555.com> Message-ID: > To use the example data that you used in your last email, this means that the data that you should copy to *fTo should be > 09 10 00 00 00 01 67 42 C0 1F F4 02 00 30 D8 08 80 00 01 F4 ... Oops, it turns out that this wasn't correct. The "00 00 00 01" in the data is the 'start code', that we shouldn't be including. In this example, the first NAL unit is just two bytes long: 09 10 (FYI, it's an "access unit delimiter" NAL unit) That's ALL that you should be copying to *fTo at first. The second NAL unit is 0x20 (i.e., 32) bytes long, and begins 67 42 C0 1F F4 02 00 30 D8 08 80 00 01 F4 80 00 75 30 70 00 00 0B ... (FYI, it's a "sequence parameter set" (i.e., SPS) NAL unit) It's important that you copy only one NAL unit at a time (and, of course, set "fFrameSize" correctly for each). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Mon Jan 28 01:45:21 2013 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 28 Jan 2013 01:45:21 -0800 Subject: [Live-devel] RTP header extension In-Reply-To: <9044_1359364093_51063FFD_9044_5167_2_1BE8971B6CFF3A4F97AF4011882AA255015609AFF83C@THSONEA01CMS01P.one.grp> References: <12191_1358252478_50F549BE_12191_2520_1_1BE8971B6CFF3A4F97AF4011882AA255015607AAEDDB@THSONEA01CMS01P.one.grp> <8A4A85CF-DEE3-4CE6-BAE5-FC02B9C58ABD@live555.com> <28364_1358331320_50F67DB8_28364_17971_1_1BE8971B6CFF3A4F97AF4011882AA255015607AE98BF@THSONEA01CMS01P.one.grp> <8D621190-17B3-42EF-A2FC-227D355FF585@live555.com> <7773_1358770148_50FD2FE4_7773_3282_1_1BE8971B6CFF3A4F97AF4011882AA2550156097EF359@THSONEA01CMS01P.one.grp> <95AA7B33-3D4D-4920-9348-430C2E45E8B9@live555.com> <9044_1359364093_51063FFD_9044_5167_2_1BE8971B6CFF3A4F97AF4011882AA255015609AFF83C@THSONEA01CMS01P.one.grp> Message-ID: > In order to bring RTP header extension support into live555, are you asking for commercial support or coding support? Right now all I'm saying is that it's not at the top of my 'to do' list. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From saravanan.s at fossilshale.com Mon Jan 28 05:11:58 2013 From: saravanan.s at fossilshale.com (saravanan) Date: Mon, 28 Jan 2013 18:41:58 +0530 Subject: [Live-devel] Questions about live555: In-Reply-To: <000001cdfb12$b09cedb0$11d6c910$@s@fossilshale.com> References: <00ac01cdfaf1$123ad400$36b07c00$@s@fossilshale.com> <4156BD39-E4AD-4B16-97A5-8A63DF5753E7@live555.com> <000001cdfb12$b09cedb0$11d6c910$@s@fossilshale.com> Message-ID: <00b001cdfd59$0ee70430$2cb50c90$@s@fossilshale.com> Hi, I was able to play MJPEG (640x480) and AAC streams fine in separate RTSP server media sessions. But if I try to play both streams through a single RTSP server media session, then I get the following results: 
- Sometimes video is very slow but audio is OK
- Sometimes video does not play at all but audio is OK
- Sometimes neither audio nor video plays

I am using OnDemandServerMediaSubsession for my testing. Your help would be appreciated. Thanks, Saravanan S From: saravanan [mailto:saravanan.s at fossilshale.com] Sent: Friday, January 25, 2013 9:13 PM To: 'LIVE555 Streaming Media - development & use' Subject: Re: [Live-devel] Questions about live555: Dear Ross Finlayson, Thanks, it now works fine with your input to set the flag to True. Regards, Saravanan S From: Ross Finlayson [mailto:finlayson at live555.com] Sent: Friday, January 25, 2013 7:18 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Questions about live555: I am using the Live555 server to stream MJPEG data captured from the device. We derived a class JPEGDeviceSource from JPEGVideoSource, and read the data from a shared buffer (sent by the device) in doGetNextFrame() of JPEGDeviceSource. Everything works fine for a single RTSP client session; if we open a second RTSP client session, the frame rate is halved. The GetBuffer call to the device always returns a new frame, so a single session works fine at the full frame rate (30 fps). If we open one more session, the frame rate is halved (15 fps). Once I call GetBuffer(), I get the complete frame. Now I want to make sure that this frame data is available to all the sessions opened with this server. How can I achieve this? You haven't said anything about your server implementation, but I presume you have implemented your own subclass of "OnDemandServerMediaSubsession". Your subclass's constructor - when it calls the "OnDemandServerMediaSubsession" constructor - should make sure that the "reuseFirstSource" parameter is "True". You do this because you are streaming from a live source, rather than from a file. 
Setting "reuseFirstSource" to "True" tells the server to use a single input stream as a data source, regardless of how many clients are currently accessing the server. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Jan 28 06:45:45 2013 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 28 Jan 2013 06:45:45 -0800 Subject: [Live-devel] Questions about live555: In-Reply-To: <00b001cdfd59$0ee70430$2cb50c90$@s@fossilshale.com> References: <00ac01cdfaf1$123ad400$36b07c00$@s@fossilshale.com> <4156BD39-E4AD-4B16-97A5-8A63DF5753E7@live555.com> <000001cdfb12$b09cedb0$11d6c910$@s@fossilshale.com> <00b001cdfd59$0ee70430$2cb50c90$@s@fossilshale.com> Message-ID: <23F188D0-A730-4E28-9C3E-7854B7B9D81F@live555.com> > I was able to play MJPEG (640x480) and AAC streams fine in separate RTSP server media sessions. But if I try to play both streams through a single RTSP server media session, then I get the following results:
> - Sometimes video is very slow but audio is OK
> - Sometimes video does not play at all but audio is OK
> - Sometimes neither audio nor video plays
Problems like this - where media players can play audio and video streams separately, but not together - are often caused by the 'presentation times' (i.e., the "fPresentationTime" values) of the two streams not being properly synchronized. You should make sure that the "fPresentationTime" values for each stream are accurate, in sync, and are aligned with 'wall clock' time (i.e., the time that you would get by calling "gettimeofday()"). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From yogesh_marathe at ti.com Mon Jan 28 07:07:03 2013 From: yogesh_marathe at ti.com (Marathe, Yogesh) Date: Mon, 28 Jan 2013 15:07:03 +0000 Subject: [Live-devel] Parsing frames received by testRTSPClient In-Reply-To: <00b001cdfd59$0ee70430$2cb50c90$@s@fossilshale.com> References: <00ac01cdfaf1$123ad400$36b07c00$@s@fossilshale.com> <4156BD39-E4AD-4B16-97A5-8A63DF5753E7@live555.com> <000001cdfb12$b09cedb0$11d6c910$@s@fossilshale.com> <00b001cdfd59$0ee70430$2cb50c90$@s@fossilshale.com> Message-ID: <6EFDC7CD849764409289BF330AF9704A3E9F5C47@DBDE01.ent.ti.com> Hi, I want to know if testRTSPClient allows me to specify the codec type, and if it can give me a single encoded frame every time afterGettingFrame() is invoked. If not, what would be my options for supporting H264 frame parsing in the received bit stream? I want to identify frame boundaries in the encoded bit stream returned by testRTSPClient. Currently, if I submit the received bit stream to the decoder as-is, it fails to decode. Any help would be appreciated. Thanks in advance. Regards, Yogesh. -------------- next part -------------- An HTML attachment was scrubbed... URL: From CERLANDS at arinc.com Mon Jan 28 12:34:28 2013 From: CERLANDS at arinc.com (Erlandsson, Claes P (CERLANDS)) Date: Mon, 28 Jan 2013 20:34:28 +0000 Subject: [Live-devel] Unhandled exception during TEARDOWN In-Reply-To: <248B66F9-29DE-43FD-8AA3-921BC043803A@live555.com> References: <7BB9F6CA-38D7-4EA3-8EFE-1722A7130C0A@live555.com> <248B66F9-29DE-43FD-8AA3-921BC043803A@live555.com> Message-ID: The crashes are very consistent. Not the frequency, but the location. When they occur, 602 is always the last message printed. I've attached an output example. Judging by the call stack, it almost looks to me like the printf is the cause, but the same thing happens if I remove the debug output, i.e. 602, 601, etc. This, however, makes no sense at all. What is causing the sudden app crash? I see no explanation at all in the code. 
I suspect that a 'memory smash' - i.e., a write through a bad pointer (caused by a bug in the code) - is to blame. If that happens, then a pointer somewhere else might be getting corrupted, which could lead to an error like this that occurs in an unexpected place in the code. I suggest that you run a 'memory debugger' on your application. See http://en.wikipedia.org/wiki/Memory_debugger Some tools that I've seen recommended are - "Dr. Memory": http://code.google.com/p/drmemory/ - "OllyDbg": http://ollydbg.de/ I updated to the latest code (live.2013.01.25.tar) and have been running the modified testRTSPClient using DrMemory. The exception occurs at DrMemory Error #10 and #11. The Windows app crash dialog is then displayed and after closing that the remaining DrMemory info is printed (as can be seen in the CmdOutput and results.txt files). I've so far only glanced at this and am not sure how much it helps me, but passing it on in hope you might spot something useful. Included files: - CmdOutput-DrMemory4416.txt : Last part of command prompt output. Includes the debug messages. - testRTSPClient.cpp: The modified testRTSPClient source. DrMemory-testRTSPClient.exe.4416.000\ - global.4416.log : - missing_symbols.txt - results.txt - suppress.txt Note: I shortened global.4416.log (marked with ...SNIP...) as it contained MB's of the following warning: WARNING: unreadable or invalid AFD_POLL_INFO /Claes -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: DrMemory-testRTSPClient.zip Type: application/x-zip-compressed Size: 21451 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... 
Name: smime.p7s Type: application/x-pkcs7-signature Size: 5740 bytes Desc: not available URL: From Jesse.Hemingway at nerdery.com Mon Jan 28 10:09:49 2013 From: Jesse.Hemingway at nerdery.com (Jesse Hemingway) Date: Mon, 28 Jan 2013 12:09:49 -0600 Subject: [Live-devel] Noob wandering Message-ID: Apologies in advance for the following deluge. I'm new to Live555, and to RTP and RTSP in general, and am trying to gather resources to understand how to consume video+audio streams. Based on the testRTSPClient example, I've gotten a stable RTSP connection working on my target platform; in this case to a video+audio RTSP source. But, now I'm struggling to figure out the next step. My custom MediaSink classes do not receive any frame data via afterGettingFrame(), but I'm guessing there is significantly more logic required in these classes than shown in testRTSPClient. What's a 'frame', exactly, anyway? My research indicates these are RTP-specific, but a concise definition is elusive. E.g. for a frame of audio, it seems inefficient to call a function for the typical definition of 'frame', e.g. a multi-channel sample. Without deep study, rfc4571's definition is too generic to help. The best FAQ I've found is http://www.cs.columbia.edu/~hgs/rtp/faq.html, but it still assumes more understanding than I have so far (plus the color scheme makes my eyeballs implode). And how about these higher-level MediaSinks - do those work right out of the box? Seems too good to be true. Let's say I had AAC+VP8 streams coming in. Would I conditionally create a MPEG4LATMAudioRTPSink (what if it's non-LATM MPEG4?) and a VP8VideoRTPSink in continueAfterSETUP() based on inspecting the subsession? I suppose I will have to try this out myself ;) I know we're encouraged to study the examples and browse the library source, but there is a LOT to scan without having a roadmap! A lot of these examples seem to have more to do with serving rather than consuming streams, can someone point out a good one for study?
In the class graph I also see some specific MediaSink classes, e.g. MPEG4ESVideoRTPSink. Would it make more sense to study these implementations? I guess I'm struggling to understand the high-level roadmap of information. Many thanks! Jesse -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Jan 28 13:51:56 2013 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 28 Jan 2013 13:51:56 -0800 Subject: [Live-devel] Parsing frames received by testRTSPClient In-Reply-To: <6EFDC7CD849764409289BF330AF9704A3E9F5C47@DBDE01.ent.ti.com> References: <00ac01cdfaf1$123ad400$36b07c00$@s@fossilshale.com> <4156BD39-E4AD-4B16-97A5-8A63DF5753E7@live555.com> <000001cdfb12$b09cedb0$11d6c910$@s@fossilshale.com> <00b001cdfd59$0ee70430$2cb50c90$@s@fossilshale.com> <6EFDC7CD849764409289BF330AF9704A3E9F5C47@DBDE01.ent.ti.com> Message-ID: <9186CFD4-1A03-4CEC-8976-F2FEF95560FD@live555.com> > I want to know if testRTSPClient allows me to specify codec type and if it can give me a single encoded frame every time afterGettingFrame() is invoked? Yes. Look at the "DummySink" class that the "testRTSPClient" demo application uses. Note, in particular, the (non-static) "DummySink::afterGettingFrame()" function ("testRTSPClient.cpp", lines 479-500). Note that when this function is called, a complete 'frame' (for H.264, this will be a "NAL unit") will have already been delivered into "fReceiveBuffer". Note that our "DummySink" implementation doesn't actually do anything with this data; that's why it's called a 'dummy' sink. If you wanted to decode these frames, you would replace "DummySink" with your own "MediaSink" subclass. Its "afterGettingFrame()" function would pass the data (at "fReceiveBuffer", of length "frameSize") to a decoder. Because you are receiving H.264 video data, there is one more thing that you have to do before you start feeding frames to your decoder.
H.264 streams have out-of-band configuration information (SPS and PPS NAL units) that you may need to feed to the decoder to initialize it. To get this information, call "MediaSubsession::fmtp_spropparametersets()" (on the video 'subsession' object). This will give you an (ASCII) character string. You can then pass this to "parseSPropParameterSets()", to generate binary NAL units for your decoder. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Jan 28 13:59:37 2013 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 28 Jan 2013 13:59:37 -0800 Subject: [Live-devel] Noob wandering In-Reply-To: References: Message-ID: > Based on the testRTSPClient example, I've gotten a stable RTSP connection working on my target platform; in this case to a video+audio RTSP source. But, now I'm struggling to figure out the next step. My custom MediaSink classes do not receive any frame data via afterGettingFrame() That is the first thing that you should fix. Note that when the (non-static) "DummySink::afterGettingFrame()" function ("testRTSPClient.cpp", lines 479-500) is called, a complete frame will have already been delivered into "fReceiveBuffer". Note that our "DummySink" implementation doesn't actually do anything with this data; that's why it's called a 'dummy' sink. If you wanted to decode these frames, you would replace "DummySink" with your own "MediaSink" subclass. Its "afterGettingFrame()" function would pass the data (at "fReceiveBuffer", of length "frameSize") to a decoder. > , but I'm guessing there is significantly more logic required in these classes than shown in testRTSPClient. Not really... > What's a 'frame', exactly It's a complete unit of data that can be passed to a decoder. The specific details depend on the specific (audio and/or video) codecs that you're receiving.
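[Editor's note: for the H.264 case described above, the "sprop-parameter-sets" handling can be sketched without linking against liveMedia. In real code you would simply call live555's parseSPropParameterSets(); the helper below is an illustrative, self-contained stand-in showing roughly what that step does — the names base64Decode and decodeSPropParameterSets are invented for this sketch, and the sample sprop string used below is a typical example, not taken from any stream discussed in this thread.]

```cpp
#include <cstdint>
#include <string>
#include <vector>

// Decode a base64 string into raw bytes. Padding ('=') ends the input;
// characters outside the base64 alphabet are skipped.
static std::vector<uint8_t> base64Decode(const std::string& in) {
  static const std::string kAlphabet =
      "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";
  std::vector<uint8_t> out;
  unsigned val = 0;
  int bits = 0;
  for (char c : in) {
    if (c == '=') break;
    std::string::size_type pos = kAlphabet.find(c);
    if (pos == std::string::npos) continue;
    val = (val << 6) | static_cast<unsigned>(pos);
    bits += 6;
    if (bits >= 8) {
      bits -= 8;
      out.push_back(static_cast<uint8_t>((val >> bits) & 0xFF));
    }
  }
  return out;
}

// "sprop-parameter-sets" is a comma-separated list of base64-encoded NAL
// units (typically the SPS followed by the PPS). Return each one as binary,
// ready to be fed to the decoder before the first coded frame.
std::vector<std::vector<uint8_t>> decodeSPropParameterSets(const std::string& sprop) {
  std::vector<std::vector<uint8_t>> nalUnits;
  std::string::size_type start = 0;
  for (;;) {
    std::string::size_type comma = sprop.find(',', start);
    std::string piece = sprop.substr(
        start, comma == std::string::npos ? std::string::npos : comma - start);
    if (!piece.empty()) nalUnits.push_back(base64Decode(piece));
    if (comma == std::string::npos) break;
    start = comma + 1;
  }
  return nalUnits;
}
```

For example, a string such as "Z0IACpZTBYmI,aMljiA==" decodes to two NAL units whose first bytes are 0x67 (NAL type 7, an SPS) and 0x68 (NAL type 8, a PPS).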
> And how about these higher-level MediaSinks - do those work right out of the box? Seems too good to be true. Let's say I had AAC+VP8 streams coming in. Would I conditionally create a MPEG4LATMAudioRTPSink (what if it's non-LATM MPEG4?) and a VP8VideoRTPSink in continueAfterSETUP() based on inspecting the subsession? I suppose I will have to try this out myself ;) No, you're confused here. The "*RTPSink" classes are used only for *transmitting* RTP packets. I.e., they're used by servers, not clients, and are therefore not classes that you would use. If you want to decode (and then play) the received data, then you would need a decoder for each media type. Note, however, that our code *does not* include any decoders. For that, you would use a separate software library - or decoding hardware. > I know we're encouraged to study the examples and browse the library source, but there is a LOT to scan without having a roadmap! A lot of these examples seem to have more to do with serving rather than consuming streams, can someone point out a good one for study? In the class graph I also see some specific MediaSink classes, e.g. MPEG4ESVideoRTPSink. Would it make more sense to study these implementations? I guess I'm struggling to understand the high-level roadmap of information. Maybe you should begin by explaining what it is specifically that you're trying to do... Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Mon Jan 28 14:23:49 2013 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 28 Jan 2013 14:23:49 -0800 Subject: [Live-devel] Unhandled exception during TEARDOWN In-Reply-To: References: <7BB9F6CA-38D7-4EA3-8EFE-1722A7130C0A@live555.com> <248B66F9-29DE-43FD-8AA3-921BC043803A@live555.com> Message-ID: <16C3C15D-4D72-4293-B9A1-9B47DB9056BE@live555.com> > I updated to the latest code (live.2013.01.25.tar) and have been running the modified testRTSPClient using DrMemory. The exception occurs at DrMemory Error #10 and #11. The Windows app crash dialog is then displayed and after closing that the remaining DrMemory info is printed (as can be seen in the CmdOutput and results.txt files). > > I've so far only glanced at this and am not sure how much it helps me, but passing it on in hope you might spot something useful. It gives me a little information. The problem is happening on line 540 when "DummySink::afterGettingFrame()" calls the virtual function "continuePlaying()". The fact that an invalid memory access is happening here suggests that the ("DummySink") object's virtual function table has been smashed. I.e., it appears that something has been (improperly, of course) writing over the "DummySink" object's memory. Now, the question becomes: What is writing over the "DummySink" object's memory, and where, and why? Because you reported that the error occurs only when you're receiving a JPEG/RTP stream, I suspect that the culprit is the "JPEGVideoRTPSource" class, but I don't (yet) know of any bug there that might be causing this. But "JPEGVideoRTPSource" is where I'd focus attention now. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed...
URL: From Jesse.Hemingway at nerdery.com Mon Jan 28 15:07:40 2013 From: Jesse.Hemingway at nerdery.com (Jesse Hemingway) Date: Mon, 28 Jan 2013 17:07:40 -0600 Subject: [Live-devel] Noob wandering In-Reply-To: References: Message-ID: Thanks Ross, > Based on the testRTSPClient example, I've gotten a stable RTSP connection > working on my target platform; in this case to a video+audio RTSP source. > But, now I'm struggling to figure out the next step. My custom MediaSink > classes do not receive any frame data via afterGettingFrame() > > > That is the first thing that you should fix. Note that when the > (non-static) "DummySink::afterGettingFrame()" function > ("testRTSPClient.cpp", lines 479-500) is called, a complete frame will have > already been delivered into "fReceiveBuffer". > That's my first problem then - I should have been more clear, in my case, that function is not getting called at all. I'm not debugging the original project, as that appears to exit immediately after proving that it can set up a connection (maybe I've chosen a bad test source? "rtsp://media1.law.harvard.edu/Media/policy_a/2012/02/02_unger.mov"). I have a CustomMediaSink based more or less on copy-pasting your DummyMediaSink. I do these things in continueAfterSETUP: scs.subsession->sink = CustomRTSPMediaSink::createNew(env, *scs.subsession, rtspClient->url()); and if all is well, scs.subsession->sink->startPlaying(*(scs.subsession->readSource()), subsessionAfterPlaying, scs.subsession); continueAfterPLAY then occurs without error, and I see some low-level RTSP logs happen that give me the sense the session is happily running, but no calls to my sad, little CustomRTSPMediaSink::afterGettingFrame(...) implementation. And how about these higher-level MediaSinks - do those work right out of > the box? Seems too good to be true. Let's say I had AAC+VP8 streams > coming in. Would I conditionally create a MPEG4LATMAudioRTPSink (what if > it's non-LATM MPEG4?)
and a VP8VideoRTPSink in continueAfterSETUP() based > on inspecting the subsession? I suppose I will have to try this out myself > ;) > > > No, you're confused here. The "*RTPSink" classes are used only for > *transmitting* RTP packets. I.e., they're used by servers, not clients, > and are therefore not classes that you would use. > OK, thank you for that info! I also understand that your library does not offer the codecs. Maybe you should begin by explaining what it is specifically that you're > trying to do... > I'd love to say more, but I'm prohibited from going into too much detail to a public mailing list. I really appreciate your quick feedback. Thanks, Jesse -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Jan 28 15:31:40 2013 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 28 Jan 2013 15:31:40 -0800 Subject: [Live-devel] Noob wandering In-Reply-To: References: Message-ID: <397B04E2-9D96-44DD-A0DB-A7BEF4924CAA@live555.com> >> Based on the testRTSPClient example, I've gotten a stable RTSP connection working on my target platform; in this case to a video+audio RTSP source. But, now I'm struggling to figure out the next step. My custom MediaSink classes do not receive any frame data via afterGettingFrame() > > That is the first thing that you should fix. Note that when the (non-static) "DummySink::afterGettingFrame()" function ("testRTSPClient.cpp", lines 479-500) is called, a complete frame will have already been delivered into "fReceiveBuffer". > > That's my first problem then - I should have been more clear, in my case, that function is not getting called at all. I'm not debugging the original project, as that appears to exit immediately after proving that it can set up a connection (maybe I've chosen a bad test source? "rtsp://media1.law.harvard.edu/Media/policy_a/2012/02/02_unger.mov"). 
That stream is OK; however, it works only if you request RTP-over-TCP streaming (in your call to "RTSPClient::sendSetupCommand()"). (That server is apparently behind a firewall that blocks UDP packets.) Instead, try the following (audio-only) stream, which works: rtsp://64.202.98.91:554/sog.sdp > Maybe you should begin by explaining what it is specifically that you're trying to do... > > I'd love to say more, but I'm prohibited from going into too much detail to a public mailing list. OK, fair enough. However, if your questions are deliberately vague, then don't be surprised if you don't get a response. (Also, I hope your employer is OK with you using Open Source, LGPL'd software in your project :-) Note also that if you want to ask 'private' questions (i.e., not on this mailing list), then I am available for consulting (e.g., in an advisory role). But if you want to ask questions about this software 'for free', then you'll need to do so using this mailing list. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From ferielbenghorbel at gmail.com Mon Jan 28 05:10:41 2013 From: ferielbenghorbel at gmail.com (feriel ben ghorbel) Date: Mon, 28 Jan 2013 14:10:41 +0100 Subject: [Live-devel] Questions about Performance of live555 Message-ID: Hi all, How bombard live555 by streams "*.ts" to test the performance of live555 in terms of throughput and number of streams that can diffuse it in the same time. Thanks and regards _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed...
URL: From live555 at csmiller.demon.co.uk Mon Jan 28 05:50:32 2013 From: live555 at csmiller.demon.co.uk (live555 at csmiller.demon.co.uk) Date: Mon, 28 Jan 2013 13:50:32 +0000 Subject: [Live-devel] RTCPInstance SRHandler determine sender's port Message-ID: Hi, With the RTCPInstance SRHandler callback, is there any method to determine which UDP port the sender is using for RTCP? I'm writing a receiver for a point-to-point RTP stream. The stream is initiated by using a method other than RTSP. The code does not know which RTP and RTCP ports the sender will pick. However, I am required to send back RR packets. I can't see any obvious way to get this from the RTCPInstance (or the RTPInstance). TIA, Colin S. Miller From finlayson at live555.com Mon Jan 28 18:31:03 2013 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 28 Jan 2013 18:31:03 -0800 Subject: [Live-devel] Unhandled exception during TEARDOWN In-Reply-To: References: <7BB9F6CA-38D7-4EA3-8EFE-1722A7130C0A@live555.com> <248B66F9-29DE-43FD-8AA3-921BC043803A@live555.com> Message-ID: <9F06331A-B727-4FC8-B018-838D9F5B1476@live555.com> Please replace your version of "JPEGVideoRTPSource.cpp" with this attached version - that adds debugging output - and recompile. Then, when you get the crash again, please send us the diagnostic output again. With luck, this will help track down the problem some more. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: JPEGVideoRTPSource.cpp Type: application/octet-stream Size: 17922 bytes Desc: not available URL: -------------- next part -------------- An HTML attachment was scrubbed...
URL: From finlayson at live555.com Mon Jan 28 18:38:14 2013 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 28 Jan 2013 18:38:14 -0800 Subject: [Live-devel] Questions about Performance of live555 In-Reply-To: References: Message-ID: <7699A6D7-B5E9-49F7-8BBD-50595C3BF123@live555.com> > How bombard live555 by streams "*.ts" to test the performance of live555 in terms of throughput and number of streams > that can diffuse it in the same time. (I'm assuming that the above sentence was intended to be a question...) You can do this easily just by running multiple RTSP client applications (e.g., "testRTSPClient" or "openRTSP") concurrently. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Jan 28 18:39:55 2013 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 28 Jan 2013 18:39:55 -0800 Subject: [Live-devel] RTCPInstance SRHandler determine sender's port In-Reply-To: References: Message-ID: > With the RTCPInstance SRHandler callback, is there any method > to determine which UDP port the sender is using for RTCP? No, and this is why you should support RTSP - in both your sender and receiver. We spent a lot of effort standardizing this protocol, and I spent a lot of time implementing it - in the "LIVE555 Streaming Media" software. This was for a good reason. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From saravanan.s at fossilshale.com Tue Jan 29 02:52:44 2013 From: saravanan.s at fossilshale.com (saravanan) Date: Tue, 29 Jan 2013 16:22:44 +0530 Subject: [Live-devel] Add/Remove SubSessions dynamically In-Reply-To: <23F188D0-A730-4E28-9C3E-7854B7B9D81F@live555.com> References: <00ac01cdfaf1$123ad400$36b07c00$@s@fossilshale.com> <4156BD39-E4AD-4B16-97A5-8A63DF5753E7@live555.com> <000001cdfb12$b09cedb0$11d6c910$@s@fossilshale.com> <00b001cdfd59$0ee70430$2cb50c90$@s@fossilshale.com> <23F188D0-A730-4E28-9C3E-7854B7B9D81F@live555.com> Message-ID: <00c801cdfe0e$c6800b90$538022b0$@s@fossilshale.com> Hi, I have created one ServerMediaSession with two sub sessions, one for MJPEG and one for AAC. When I played this from the client it works fine. Now I want to remove the MJPEG sub session and add the H264 sub session to the existing ServerMediaSession. I found there is a function removeServerMediaSession() to remove a server media session from the RTSPServer; similarly, are there any functions to dynamically remove and add the sub sessions to the server media session? Regards, Saravanan S -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Jan 29 06:49:42 2013 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 29 Jan 2013 06:49:42 -0800 Subject: [Live-devel] Add/Remove SubSessions dynamically In-Reply-To: <00c801cdfe0e$c6800b90$538022b0$@s@fossilshale.com> References: <00ac01cdfaf1$123ad400$36b07c00$@s@fossilshale.com> <4156BD39-E4AD-4B16-97A5-8A63DF5753E7@live555.com> <000001cdfb12$b09cedb0$11d6c910$@s@fossilshale.com> <00b001cdfd59$0ee70430$2cb50c90$@s@fossilshale.com> <23F188D0-A730-4E28-9C3E-7854B7B9D81F@live555.com> <00c801cdfe0e$c6800b90$538022b0$@s@fossilshale.com> Message-ID: <8CD2D011-C599-4A51-A87B-FFDB20FEEC1E@live555.com> > I have created one ServerMediaSession with two sub sessions, one for MJPEG and one for AAC. When I played this from the client it works fine.
Now I want to remove the MJPEG sub session and add the H264 sub session to the existing ServerMediaSession. I found there is a function removeServerMediaSession() to remove a server media session from the RTSPServer; similarly, are there any functions to dynamically remove and add the sub sessions to the server media session? No - you can get the same effect by removing the whole "ServerMediaSession" object, then creating a new "ServerMediaSession", adding whatever "ServerMediaSubsession"s you want to it, then adding this new "ServerMediaSession" to the server. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From CERLANDS at arinc.com Tue Jan 29 06:55:55 2013 From: CERLANDS at arinc.com (Erlandsson, Claes P (CERLANDS)) Date: Tue, 29 Jan 2013 14:55:55 +0000 Subject: [Live-devel] Unhandled exception during TEARDOWN In-Reply-To: <9F06331A-B727-4FC8-B018-838D9F5B1476@live555.com> References: <7BB9F6CA-38D7-4EA3-8EFE-1722A7130C0A@live555.com> <248B66F9-29DE-43FD-8AA3-921BC043803A@live555.com> <9F06331A-B727-4FC8-B018-838D9F5B1476@live555.com> Message-ID: Please replace your version of "JPEGVideoRTPSource.cpp" with this attached version - that adds debugging output - and recompile. Then, when you get the crash again, please send us the diagnostic output again. With luck, this will help track down the problem some more. Ross Finlayson Live Networks, Inc. http://www.live555.com/ Thanks. I've attached two files from two different exceptions. Note that in one case I re-directed stderr, so it doesn't include the printf messages I added to the testRTSPClient code. I did run it using DrMemory, and I only included the last 1000 or so lines of the output due to size. Please let me know if you want me to execute anything else. Tnx! /Claes -------------- next part -------------- An HTML attachment was scrubbed...
URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: testRTSPClientOutput-jpegExample.zip Type: application/x-zip-compressed Size: 30100 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: smime.p7s Type: application/x-pkcs7-signature Size: 5740 bytes Desc: not available URL: From yogesh_marathe at ti.com Tue Jan 29 07:29:20 2013 From: yogesh_marathe at ti.com (Marathe, Yogesh) Date: Tue, 29 Jan 2013 15:29:20 +0000 Subject: [Live-devel] Parsing frames received by testRTSPClient In-Reply-To: <9186CFD4-1A03-4CEC-8976-F2FEF95560FD@live555.com> References: <00ac01cdfaf1$123ad400$36b07c00$@s@fossilshale.com> <4156BD39-E4AD-4B16-97A5-8A63DF5753E7@live555.com> <000001cdfb12$b09cedb0$11d6c910$@s@fossilshale.com> <00b001cdfd59$0ee70430$2cb50c90$@s@fossilshale.com> <6EFDC7CD849764409289BF330AF9704A3E9F5C47@DBDE01.ent.ti.com> <9186CFD4-1A03-4CEC-8976-F2FEF95560FD@live555.com> Message-ID: <6EFDC7CD849764409289BF330AF9704A3E9F607B@DBDE01.ent.ti.com> Thanks Ross. Instead of adding a MediaSink, I decided to add an implementation for identifying NAL units in afterGettingFrame() of DummySink. I added some parameters to DummySink for received-bitrate calculation; that seems to be working for me, and my decoder is now able to decode the streams. I was actually interested in how much CPU load is consumed while doing this, so I didn't go for the MediaSink implementation that you suggested. Since I just want to receive the stream and send it to a decoder, do you think using testRTSPClient over OpenRTSP would be an advantage, or should there be no difference from a CPU-cycles perspective? I'm about to profile both on my system. I had read in one of the FAQs that Live555 is not thread-safe. Does that mean that if a multithreaded application wants to use it, the application has to synchronize between the objects it is using from Live555? Regards, Yogesh.
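[Editor's note: the "fails to decode" symptom discussed in this thread is commonly because the NAL units delivered into fReceiveBuffer carry no Annex B start codes, which many decoders expect, so a receiving sink typically has to re-frame the data before submission. A minimal, self-contained sketch of that step; the helper names are illustrative, not library API.]

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Prepend the 4-byte Annex B start code (00 00 00 01) to one NAL unit,
// e.g. the data an afterGettingFrame() implementation finds in
// fReceiveBuffer, before handing it to a decoder.
std::vector<uint8_t> toAnnexB(const uint8_t* nal, std::size_t nalSize) {
  static const uint8_t kStartCode[4] = {0x00, 0x00, 0x00, 0x01};
  std::vector<uint8_t> framed;
  framed.reserve(nalSize + sizeof kStartCode);
  framed.insert(framed.end(), kStartCode, kStartCode + sizeof kStartCode);
  framed.insert(framed.end(), nal, nal + nalSize);
  return framed;
}

// The NAL unit type is the low 5 bits of the first byte: 7 = SPS, 8 = PPS,
// 5 = IDR slice, 1 = non-IDR slice. Useful for spotting parameter sets and
// access-unit boundaries in the received stream.
unsigned nalUnitType(const uint8_t* nal) { return nal[0] & 0x1F; }
```

The SPS and PPS obtained from "sprop-parameter-sets" would be framed the same way and submitted before the first coded slice.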
From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Tuesday, January 29, 2013 3:22 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Parsing frames received by testRTSPClient I want to know if testRTSPClient allows me to specify codec type and if it can give me a single encoded frame every time afterGettingFrame() is invoked? Yes. Look at the "DummySink" class that the "testRTSPClient" demo application uses. Note, in particular, the (non-static) "DummySink::afterGettingFrame()" function ("testRTSPClient.cpp", lines 479-500). Note that when this function is called, a complete 'frame' (for H.264, this will be a "NAL unit") will have already been delivered into "fReceiveBuffer". Note that our "DummySink" implementation doesn't actually do anything with this data; that's why it's called a 'dummy' sink. If you wanted to decode these frames, you would replace "DummySink" with your own "MediaSink" subclass. Its "afterGettingFrame()" function would pass the data (at "fReceiveBuffer", of length "frameSize") to a decoder. Because you are receiving H.264 video data, there is one more thing that you have to do before you start feeding frames to your decoder. H.264 streams have out-of-band configuration information (SPS and PPS NAL units) that you may need to feed to the decoder to initialize it. To get this information, call "MediaSubsession::fmtp_spropparametersets()" (on the video 'subsession' object). This will give you an (ASCII) character string. You can then pass this to "parseSPropParameterSets()", to generate binary NAL units for your decoder. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed...
URL: From togba.liberty at ipconfigure.com Tue Jan 29 12:06:00 2013 From: togba.liberty at ipconfigure.com (Togba Liberty) Date: Tue, 29 Jan 2013 20:06:00 +0000 Subject: [Live-devel] About AAC Streaming In-Reply-To: <1273BCC7-F0C9-4502-9982-3763616C2B56@live555.com> References: <390D2E6DB4DD854B805F9A0E0E21B6010A17BE6B@BL2PRD0411MB410.namprd04.prod.outlook.com> <8299E348-F7E0-4764-8F76-D31AA49110A3@live555.com> <390D2E6DB4DD854B805F9A0E0E21B6010A17C0AF@BL2PRD0411MB410.namprd04.prod.outlook.com> <1273BCC7-F0C9-4502-9982-3763616C2B56@live555.com> Message-ID: <390D2E6DB4DD854B805F9A0E0E21B6010A17C92E@BL2PRD0411MB410.namprd04.prod.outlook.com> Hi Ross, Thanks for the quick replies. We are now receiving JPEG, H.264, and Audio at the client end of our pipeline; however, the audio stream has a lot of noise. We suspect that this might be down to audio synchronization. After going through Live555 library, we would like to know if there is a way that Live555 hardcopies one FramedSource into another. The idea is to be able to directly transfer the audio and video RTP packets from the camera, through the transcoding server, to our clients without having to remove the payload of the JPEG and audio streams. Note that the client and server ends of our transcoding server have two different TaskSchedulers. Thanks. Togba Liberty C++ Developer | ipConfigure, Inc. www.ipconfigure.com From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Friday, January 25, 2013 5:56 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] About AAC Streaming The issue we have now is that in retransmitting the audio stream we don't understand if Live555 adds an extra header to the AAC frames. As I said before - it doesn't. The frames that come from the "MPEG4GenericRTPSource" are simply AAC frames, with no extra header added. 
We have the jpeg and h.264 pipeline going, we just need to know what headers the MPEG4GenericRTPSource is adding to the aac frames to be able to retransmit it so VLC can play it. VLC should be able to play the stream, provided that you set up the "MPEG4GenericRTPSink" correctly (as I noted in my previous email), and also provided that you are setting "fPresentationTime" correctly on each outgoing frame. The best way to test this is *not* to use VLC at first, but instead to use our "testRTSPClient" application. You should make sure that the "a=config" value in the SDP description is the same as it was in the original ('back-end') stream. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From togba.liberty at ipconfigure.com Tue Jan 29 12:26:41 2013 From: togba.liberty at ipconfigure.com (Togba Liberty) Date: Tue, 29 Jan 2013 20:26:41 +0000 Subject: [Live-devel] About AAC Streaming In-Reply-To: <390D2E6DB4DD854B805F9A0E0E21B6010A17C92E@BL2PRD0411MB410.namprd04.prod.outlook.com> References: <390D2E6DB4DD854B805F9A0E0E21B6010A17BE6B@BL2PRD0411MB410.namprd04.prod.outlook.com> <8299E348-F7E0-4764-8F76-D31AA49110A3@live555.com> <390D2E6DB4DD854B805F9A0E0E21B6010A17C0AF@BL2PRD0411MB410.namprd04.prod.outlook.com> <1273BCC7-F0C9-4502-9982-3763616C2B56@live555.com> <390D2E6DB4DD854B805F9A0E0E21B6010A17C92E@BL2PRD0411MB410.namprd04.prod.outlook.com> Message-ID: <390D2E6DB4DD854B805F9A0E0E21B6010A17C953@BL2PRD0411MB410.namprd04.prod.outlook.com> I believe I may have found what I've been looking for. The StreamReplicator class may just solve my problem. Please advise. Togba Liberty C++ Developer | ipConfigure, Inc. 
www.ipconfigure.com From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Togba Liberty Sent: Tuesday, January 29, 2013 3:06 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] About AAC Streaming Hi Ross, Thanks for the quick replies. We are now receiving JPEG, H.264, and Audio at the client end of our pipeline; however, the audio stream has a lot of noise. We suspect that this might be down to audio synchronization. After going through Live555 library, we would like to know if there is a way that Live555 hardcopies one FramedSource into another. The idea is to be able to directly transfer the audio and video RTP packets from the camera, through the transcoding server, to our clients without having to remove the payload of the JPEG and audio streams. Note that the client and server ends of our transcoding server have two different TaskSchedulers. Thanks. Togba Liberty C++ Developer | ipConfigure, Inc. www.ipconfigure.com From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Friday, January 25, 2013 5:56 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] About AAC Streaming The issue we have now is that in retransmitting the audio stream we don't understand if Live555 adds an extra header to the AAC frames. As I said before - it doesn't. The frames that come from the "MPEG4GenericRTPSource" are simply AAC frames, with no extra header added. We have the jpeg and h.264 pipeline going, we just need to know what headers the MPEG4GenericRTPSource is adding to the aac frames to be able to retransmit it so VLC can play it. VLC should be able to play the stream, provided that you set up the "MPEG4GenericRTPSink" correctly (as I noted in my previous email), and also provided that you are setting "fPresentationTime" correctly on each outgoing frame.
The best way to test this is *not* to use VLC at first, but instead to use our "testRTSPClient" application. You should make sure that the "a=config" value in the SDP description is the same as it was in the original ('back-end') stream. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: image001.png Type: image/png Size: 25313 bytes Desc: image001.png URL: From CERLANDS at arinc.com Tue Jan 29 13:01:59 2013 From: CERLANDS at arinc.com (Erlandsson, Claes P (CERLANDS)) Date: Tue, 29 Jan 2013 21:01:59 +0000 Subject: [Live-devel] Unhandled exception during TEARDOWN In-Reply-To: <9F06331A-B727-4FC8-B018-838D9F5B1476@live555.com> References: <7BB9F6CA-38D7-4EA3-8EFE-1722A7130C0A@live555.com> <248B66F9-29DE-43FD-8AA3-921BC043803A@live555.com> <9F06331A-B727-4FC8-B018-838D9F5B1476@live555.com> Message-ID: I was almost certain mjpeg was the cause, as I've been running multiple testRTSPClients towards 3 Axis 243q encoders for 2+ weeks without any issues (using mpeg4). However, I just put up two mpeg4 proxies/streams on the Cisco VSM. I didn't expect a crash, but I got one within an hour. I've attached the console output where you can confirm that the streams are identified as mp4. Noteworthy is that 603 is printed before the crash, i.e. it doesn't stop right after 602 as with mjpeg. Also included all files generated by DrMemory. The 1500 number is just something DrMemory comes up with to distinguish executions. Included the source as well, although it's the same as before, except that I experimented with increasing the buffer to ensure that wasn't the issue. As always, please let me know what I can do to assist. /Claes -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed...
Name: DrMemory-testRTSPClient-1500.zip Type: application/x-zip-compressed Size: 49291 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: smime.p7s Type: application/x-pkcs7-signature Size: 5740 bytes Desc: not available URL: From finlayson at live555.com Tue Jan 29 13:35:28 2013 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 29 Jan 2013 13:35:28 -0800 Subject: [Live-devel] Parsing frames received by testRTSPClient In-Reply-To: <6EFDC7CD849764409289BF330AF9704A3E9F607B@DBDE01.ent.ti.com> References: <00ac01cdfaf1$123ad400$36b07c00$@s@fossilshale.com> <4156BD39-E4AD-4B16-97A5-8A63DF5753E7@live555.com> <000001cdfb12$b09cedb0$11d6c910$@s@fossilshale.com> <00b001cdfd59$0ee70430$2cb50c90$@s@fossilshale.com> <6EFDC7CD849764409289BF330AF9704A3E9F5C47@DBDE01.ent.ti.com> <9186CFD4-1A03-4CEC-8976-F2FEF95560FD@live555.com> <6EFDC7CD849764409289BF330AF9704A3E9F607B@DBDE01.ent.ti.com> Message-ID: <40219BCD-2630-44ED-82BD-0A81E854128C@live555.com> > Since I just want to receive the stream and send it to a decoder, do you think using testRTSPClient over openRTSP would be an advantage, or should there be no difference from a CPU-cycles perspective? I'm about to profile both on my system. Both applications use about the same amount of CPU. However, "testRTSPClient" is much cleaner and easier to understand (because, in contrast, "openRTSP" has many optional features). So "testRTSPClient" is what you should use as a model. > I had read in one of the FAQs that Live555 is not thread safe. That's true, but somewhat misleading. It's like saying that a high-speed rail carriage is not 'airworthy'. > Does that mean that if a multithreaded application wants to use it, the application has to synchronize between the objects it is using from Live555? This is all explained very clearly in the FAQ - that everyone was asked to read before posting to this mailing list.
See: http://www.live555.com/liveMedia/faq.html#threads Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Jan 29 13:47:46 2013 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 29 Jan 2013 13:47:46 -0800 Subject: [Live-devel] Unhandled exception during TEARDOWN In-Reply-To: References: <7BB9F6CA-38D7-4EA3-8EFE-1722A7130C0A@live555.com> <248B66F9-29DE-43FD-8AA3-921BC043803A@live555.com> <9F06331A-B727-4FC8-B018-838D9F5B1476@live555.com> Message-ID: > I was almost certain mjpeg was the cause, as I've been running multiple testRTSPClients against 3 Axis 243q encoders for 2+ weeks without any issues (using mpeg4). However, I just put up two mpeg4 proxies/streams on the Cisco VSM. I didn't expect a crash, but I got one within an hour. OK, so this tells us that "JPEGVideoRTPSource" is probably not to blame. > I've attached the console output, where you can confirm that the streams are identified as mp4. Noteworthy is that 603 is printed before the crash, i.e. it doesn't stop right after 602 as with mjpeg. I've also included all files generated by DrMemory. The 1500 number is just something DrMemory comes up with to distinguish executions. I've included the source as well, although it's the same as before, except that I experimented with increasing the buffer to ensure that wasn't the issue. > > As always, please let me know what I can do to assist. At this point, I don't think there's much else I can do to help. You're going to have to figure out yourself just exactly where/why the memory for the "DummySink" object is getting overwritten. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed...
URL: From zenereyes at gmail.com Tue Jan 29 07:12:02 2013 From: zenereyes at gmail.com (lorenzo franci) Date: Tue, 29 Jan 2013 16:12:02 +0100 Subject: [Live-devel] Features H264 Message-ID: I read the FAQ about testH264VideoToTransportStream, and I saw that I must modify FramedSource. I have an H.264 flow from a grabber, and I can access the encode buffer and the set of features of the flow. Where must I set the features of my H.264 stream? Thanks -------------- next part -------------- An HTML attachment was scrubbed... URL: From yogesh_marathe at ti.com Tue Jan 29 20:12:57 2013 From: yogesh_marathe at ti.com (Marathe, Yogesh) Date: Wed, 30 Jan 2013 04:12:57 +0000 Subject: [Live-devel] Parsing frames received by testRTSPClient In-Reply-To: <40219BCD-2630-44ED-82BD-0A81E854128C@live555.com> References: <00ac01cdfaf1$123ad400$36b07c00$@s@fossilshale.com> <4156BD39-E4AD-4B16-97A5-8A63DF5753E7@live555.com> <000001cdfb12$b09cedb0$11d6c910$@s@fossilshale.com> <00b001cdfd59$0ee70430$2cb50c90$@s@fossilshale.com> <6EFDC7CD849764409289BF330AF9704A3E9F5C47@DBDE01.ent.ti.com> <9186CFD4-1A03-4CEC-8976-F2FEF95560FD@live555.com> <6EFDC7CD849764409289BF330AF9704A3E9F607B@DBDE01.ent.ti.com> <40219BCD-2630-44ED-82BD-0A81E854128C@live555.com> Message-ID: <6EFDC7CD849764409289BF330AF9704A3E9F616F@DBDE01.ent.ti.com> Thanks for the answers. From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Wednesday, January 30, 2013 3:05 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Parsing frames received by testRTSPClient Since I just want to receive the stream and send it to a decoder, do you think using testRTSPClient over openRTSP would be an advantage, or should there be no difference from a CPU-cycles perspective? I'm about to profile both on my system. Both applications use about the same amount of CPU.
However, "testRTSPClient" is much cleaner and easier to understand (because, in contrast, "openRTSP" has many optional features). So "testRTSPClient" is what you should use as a model. I had read in one of the FAQs that Live555 is not thread safe. That's true, but somewhat misleading. It's like saying that a high-speed rail carriage is not 'airworthy'. Does that mean that if a multithreaded application wants to use it, the application has to synchronize between the objects it is using from Live555? This is all explained very clearly in the FAQ - that everyone was asked to read before posting to this mailing list. See: http://www.live555.com/liveMedia/faq.html#threads Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From bruno.abreu at livingdata.pt Wed Jan 30 10:17:31 2013 From: bruno.abreu at livingdata.pt (Bruno Abreu) Date: Wed, 30 Jan 2013 18:17:31 +0000 Subject: [Live-devel] Scaling a H264 elementary stream Message-ID: <20130130180211.M81070@livingdata.pt> Hello everyone, we've just developed an H264 video stream filter that allows scaling. Fast-forward and slow-motion only; reverse play is not contemplated. It works by adjusting the presentation time and frame duration of each frame based on the current scale factor, thus speeding up or slowing down frame pulling from the upstream source and delivery to the downstream sink. We set up our pipeline like this: ByteFileFileSource -> H264VideoStreamFramer -> ScalableH264VideoStreamDiscreteFramer -> H264VideoRTPSink Our ByteFileFileSource reads an H264 elementary stream and feeds it to the H264VideoStreamFramer. After that we use our scalable filter, which extends H264VideoStreamDiscreteFramer. We know a discrete framer is supposed to be used for discrete sources, not when reading from a file. But we tried using a FramedFilter extension and that doesn't work, because H264VideoRTPSink requires its source to be a H264VideoStreamFramer.
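The timing adjustment described above can be sketched roughly as follows. The names here are made up for the example (they come neither from the attached filter nor from the LIVE555 API); the idea is just that, for a scale factor s, each frame's duration becomes duration/s and its presentation time is re-derived so that frames stay contiguous at the faster (or slower) rate:

```cpp
#include <cassert>

// Illustrative per-frame timing, in seconds.
struct FrameTiming {
  double ptsSecs;       // presentation time
  double durationSecs;  // frame duration
};

// Produce the timing of the frame that follows 'prev', given the frame's
// nominal (unscaled) duration and the current scale factor: the scaled
// duration is nominal/scale, and the presentation time follows directly
// after the previous (already scaled) frame.
FrameTiming scaleNextFrame(FrameTiming const& prev, double nominalDurationSecs,
                           double scale) {
  FrameTiming next;
  next.durationSecs = nominalDurationSecs / scale;
  next.ptsSecs = prev.ptsSecs + prev.durationSecs;
  return next;
}
```

At scale 1.0 this degenerates to passing the nominal timing through unchanged, which is one way to sanity-check such a filter.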
And, since our source is already split into frames by the previous element, the H264VideoStreamDiscreteFramer suited this task just fine. This might not be a "correct way" of implementing scale on H264 video streams, but it is working very well for us. We are aware that other, maybe better, techniques are available, namely those in which only the I-frames are sent. That is exactly the downside to this method: every frame is sent to the client and, for high scale factors, the client has to work harder in decoding them at higher rates. On the other hand, it doesn't require index files, nor do we need to encapsulate our elementary stream in a container. We implement the mandatory - virtual void testScaleFactor(float& scale); - virtual void setStreamSourceScale(FramedSource* inputSource, float scale); methods in our H264VideoFileServerMediaSubsession and, when setStreamSourceScale(FramedSource* inputSource, float scale) is called, we call setScaleFactor(float scale) on our filter. We've been using VLC as the client for testing. Any suggestions for improving this filter are welcome. Thank you, Bruno Abreu -- Living Data - Sistemas de Informação e Apoio à Decisão, Lda. LxFactory - Rua Rodrigues de Faria, 103, edifício I - 4º piso Phone: +351 213622163 1300-501 LISBOA Fax: +351 213622165 Portugal URL: www.livingdata.pt -------------- next part -------------- A non-text attachment was scrubbed... Name: ScalableH264VideoStreamDiscreteFramer.hh Type: text/x-c++hdr Size: 2229 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed...
Name: ScalableH264VideoStreamDiscreteFramer.cpp Type: text/x-c++src Size: 5090 bytes Desc: not available URL: From markuss at sonicfoundry.com Wed Jan 30 10:31:59 2013 From: markuss at sonicfoundry.com (Markus Schumann) Date: Wed, 30 Jan 2013 18:31:59 +0000 Subject: [Live-devel] stream descriptor only reachable by HTTP In-Reply-To: References: <1ED2F9A76678E0428E90FB2B6F93672D0108AE4F@postal.sonicfoundry.net> <1ED2F9A76678E0428E90FB2B6F93672D0108AFA6@postal.sonicfoundry.net> Message-ID: <1ED2F9A76678E0428E90FB2B6F93672D0108B5FD@postal.sonicfoundry.net> Works like a charm - thanks. Markus. From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Monday, January 21, 2013 2:22 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] stream descriptor only reachable by HTTP Do I have to use a lightweight HTTP client to get the SDP and then side load your RTPClient with the string? However you get the SDP description is up to you. (However, since you already have it - because you included it in your last email - you shouldn't need to get it again :-) Once you have the SDP description, you can just follow steps 1/ to 3/ in my previous email. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Wed Jan 30 12:25:21 2013 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 30 Jan 2013 12:25:21 -0800 Subject: [Live-devel] Scaling a H264 elementary stream In-Reply-To: <20130130180211.M81070@livingdata.pt> References: <20130130180211.M81070@livingdata.pt> Message-ID: <99C816E7-1229-4F0A-8FF1-1BBD66E22E16@live555.com> Yes, this seems like a good way to implement fast-forward and slow-motion, provided that (1) the network and receiver can handle the increased bitrate when you increase the scale factor - as you noted, and (2) the receiver can properly handle frames that arrive at an 'unusual' frame rate. By that, I mean that H.264 "SPS" NAL units often contain parameters (specifically, 'num_units_in_tick' and 'time_scale') that imply a particular frame rate. Some receivers might get confused if frames arrive at a different frame rate. (However, you have apparently found that VLC handles these streams OK.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From zvika.meiseles at gmail.com Wed Jan 30 10:46:01 2013 From: zvika.meiseles at gmail.com (Zvika Meiseles) Date: Wed, 30 Jan 2013 20:46:01 +0200 Subject: [Live-devel] RTP stream retransmit Message-ID: I've written some code using the live555 library that receives an RTP MPEG2 stream, converts it to TS and transmits it using RTP to another address. The incoming RTP stream is a live feed coming from a camera. In order not to be vulnerable to networking delays, I've implemented a FramedFilter that takes the TS packets and puts them in a queue. The "client" of this queue is a SimpleRTPSink, which reads them through a MPEG2TransportStreamFramer. The frames are "released" from the queue after a slight delay. The idea was for the "Framer" to read the frames at a near-live rate, and for the RTPSource to put frames into the queue at a near-live rate.
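The delayed-release queue described above might be sketched like this (illustrative only, with made-up names; it is not part of LIVE555, and real code would use the event loop's clock rather than a caller-supplied time):

```cpp
#include <cassert>
#include <deque>

// A frame stamped with its arrival time (seconds); 'id' stands in for
// the actual TS packet payload.
struct QueuedFrame { double arrivalTime; int id; };

// FIFO that releases frames only once they have aged at least 'delaySecs',
// absorbing small differences between the input and output rates.
class DelayQueue {
public:
  explicit DelayQueue(double delaySecs) : fDelay(delaySecs) {}
  void push(double now, int id) { fQueue.push_back(QueuedFrame{now, id}); }
  // Pops the oldest frame into 'idOut' if it has aged enough; otherwise
  // returns false and the caller should retry later.
  bool pop(double now, int& idOut) {
    if (fQueue.empty() || now - fQueue.front().arrivalTime < fDelay) return false;
    idOut = fQueue.front().id;
    fQueue.pop_front();
    return true;
  }
private:
  double fDelay;
  std::deque<QueuedFrame> fQueue;
};
```

The release check is what provides the jitter cushion: a frame that arrived late simply becomes eligible a little later, instead of starving the sink.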
The initial amount of "queued frames" should allow for slight differences between the in/out rates. My problem is that the "Framer" reports most of the frames' duration as "0.0", and as a result 'uSecondsToGo' in the RTPSink is always 0, and the frames are sent out too fast. Any idea how to get this to work? -------------- next part -------------- An HTML attachment was scrubbed... URL: From jshanab at smartwire.com Wed Jan 30 13:39:53 2013 From: jshanab at smartwire.com (Jeff Shanab) Date: Wed, 30 Jan 2013 21:39:53 +0000 Subject: [Live-devel] Scaling a H264 elementary stream In-Reply-To: <99C816E7-1229-4F0A-8FF1-1BBD66E22E16@live555.com> References: <20130130180211.M81070@livingdata.pt> <99C816E7-1229-4F0A-8FF1-1BBD66E22E16@live555.com> Message-ID: <615FD77639372542BF647F5EBAA2DBC22525708E@IL-BOL-EXCH01.smartwire.com> I implement fast forward, reverse play, and stepping forward and backward, but I do it from a buffer on the client. I thought reverse play required a buffer. Can RTSP stream backwards? By GOP, obviously, since diff-frames depend on a keyframe. This message and any attachments contain confidential and proprietary information, and may contain privileged information, belonging to one or more affiliates of Windy City Wire Cable & Technology Products, LLC. No privilege is waived by this transmission. Unauthorized use, copying or disclosure of such information is prohibited and may be unlawful. If you receive this message in error, please delete it from your system, destroy any printouts or copies of it, and notify the sender immediately by e-mail or phone.
From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Wednesday, January 30, 2013 2:25 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Scaling a H264 elementary stream Yes, this seems like a good way to implement fast-forward and slow-motion, provided that (1) the network and receiver can handle the increased bitrate when you increase the scale factor - as you noted, and (2) the receiver can properly handle frames that arrive at an 'unusual' frame rate. By that, I mean that H.264 "SPS" NAL units often contain parameters (specifically, 'num_units_in_tick' and 'time_scale') that imply a particular frame rate. Some receivers might get confused if frames arrive at a different frame rate. (However, you have apparently found that VLC handles these streams OK.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Jan 31 00:40:32 2013 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 31 Jan 2013 00:40:32 -0800 Subject: [Live-devel] RTP stream retransmit In-Reply-To: References: Message-ID: > My problem is that the "Framer" reports most of the frames' duration as "0.0", and as a result 'uSecondsToGo' in the RTPSink is always 0, and the frames are sent out too fast. That's strange. The "MPEG2TransportStreamFramer" class should be scanning the "PTS" (timestamps) in the incoming MPEG Transport Stream packets, and using this to compute an estimated 'duration' for each. So I don't know why this is not working for you. But Remember, You Have Complete Source Code! Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From simon.roehrl.ext at siemens.com Thu Jan 31 08:29:27 2013 From: simon.roehrl.ext at siemens.com (Roehrl, Simon) Date: Thu, 31 Jan 2013 17:29:27 +0100 Subject: [Live-devel] READ_FROM_FILES_SYNCHRONOUSLY applies for WinCE as well Message-ID: <76EB6D54AFDD1349B9D48319372F7BA70A3EDACBF7@DEMCHP99E84MSX.ww902.siemens.net> Hey, the problem with sockets applies for Windows CE as well. Find attached a patch that defines READ_FROM_FILES_SYNCHRONOUSLY when compiled for WinCE. Regards Simon -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: InputFile.hh.patch Type: application/octet-stream Size: 696 bytes Desc: InputFile.hh.patch URL: From finlayson at live555.com Thu Jan 31 10:54:41 2013 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 31 Jan 2013 10:54:41 -0800 Subject: [Live-devel] READ_FROM_FILES_SYNCHRONOUSLY applies for WinCE as well In-Reply-To: <76EB6D54AFDD1349B9D48319372F7BA70A3EDACBF7@DEMCHP99E84MSX.ww902.siemens.net> References: <76EB6D54AFDD1349B9D48319372F7BA70A3EDACBF7@DEMCHP99E84MSX.ww902.siemens.net> Message-ID: > the problem with sockets applies for Windows CE as well. Find attached a patch that defines READ_FROM_FILES_SYNCHRONOUSLY when compiled for WinCE. I'm puzzled why nobody has mentioned this before. But unless someone objects, this change will be included in the next release of the software. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From temp2010 at forren.org Thu Jan 31 14:42:17 2013 From: temp2010 at forren.org (temp2010 at forren.org) Date: Thu, 31 Jan 2013 17:42:17 -0500 Subject: [Live-devel] Am I accidentally H.264 encoding twice??? Message-ID: Am I accidentally H.264 encoding twice? I have a monstrosity combination of media foundation and live555. It manages to send IP video out, and VLC receives it. 
But the quality is bad. I have code that was a 555 test program, which I modified to make my own sub-thread. I have an MF_H264_DeviceSource class that I derived from DeviceSource per the 555 FAQ advice. I have an H264VideoDeviceStreamer class that I derived from the testH264VideoStreamer main program. At the bottom of H264VideoDeviceStreamer (from testH264VideoStreamer), I changed a reference to ByteStreamFileSource to my MF_H264_DeviceSource. This then feeds H264VideoStreamFramer, which goes into a play function. WHAT? Just a while ago I realized I'm passing H.264-encoded buffers to H264VideoStreamFramer, which is perhaps doubly encoding them to H.264 again. Am I accidentally H.264 encoding twice? Does the original ByteStreamFileSource fed to H264VideoStreamFramer feed raw buffers to H264VideoStreamFramer? If so, I need to feed raw buffers to H264VideoStreamFramer as well. And then, does H264VideoStreamFramer do the H.264 encoding for me? Well, I tried removing the potentially doubled H.264 encoding from my side, but then VLC seems to receive nothing. Do note that VLC seems very forgiving, so I was thinking the received-but-poor quality was because VLC was forgiving and adaptable enough that it managed to play my doubly-encoded H.264. ?? Any help is greatly appreciated. -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Jan 31 16:52:59 2013 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 31 Jan 2013 16:52:59 -0800 Subject: [Live-devel] Am I accidentally H.264 encoding twice??? In-Reply-To: References: Message-ID: <2471B04D-1F02-45B1-BCEA-FBDA2509A591@live555.com> > WHAT? Just a while ago I realized I'm passing H.264-encoded buffers to H264VideoStreamFramer, which is perhaps doubly encoding them to H.264 again. > > Am I accidentally H.264 encoding twice? Does the original ByteStreamFileSource fed to H264VideoStreamFramer feed raw buffers to H264VideoStreamFramer?
I think you're confused about what our software does. *None* of our software does *any* encoding. In particular, the "H264VideoStreamFramer" and "H264VideoStreamDiscreteFramer" classes each take - as input - already-encoded H.264 video data. They don't do any 'encoding' (because the input data is already encoded). All they do is parse the input H.264 video data, and output a sequence of H.264 'NAL units', with proper 'presentation time' and 'duration' values. The difference between these two classes is that "H264VideoStreamFramer" takes - as input - H.264 video data that appears in a byte stream (e.g. a file or pipe). "H264VideoStreamDiscreteFramer", on the other hand, takes as input discrete NAL units (i.e., one NAL unit at a time), *without* any preceding 'start code'. So, the choice of which of these 'framer' classes to use depends on what kind of data comes out of your "MF_H264_DeviceSource" class. If this class outputs an unstructured byte stream (that contains H.264 video data, with 'start codes' preceding each NAL unit), then use a "H264VideoStreamFramer". If, however, your "MF_H264_DeviceSource" class outputs a sequence of NAL units (one at a time, without a preceding 'start code'), then use a "H264VideoStreamDiscreteFramer" instead. In either case, your 'framer' object should then be fed into a "H264VideoRTPSink" object, for streaming. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From zvika.meiseles at gmail.com Thu Jan 31 22:55:14 2013 From: zvika.meiseles at gmail.com (Zvika Meiseles) Date: Fri, 1 Feb 2013 08:55:14 +0200 Subject: [Live-devel] RTP stream retransmit Message-ID: That's strange. The "MPEG2TransportStreamFramer" class should be scanning the "PTS" (timestamps) in the incoming MPEG Transport Stream packets, and using this to compute an estimated 'duration' for each. So I don't know why this is not working for you.
I think I figured out the cause - many of the incoming RTP frames "get" the same presentation time (via 'receptionStatsDB().noteIncomingPacket(...)'). I have not yet determined the cause for that. Here is my object chain, maybe you can see something wrong: MPEG1or2VideoRTPSource --> MPEG2TransportStreamFromESSource --> FrameQueue(this is mine) --> MPEG2TransportStreamFramer --> SimpleRTPSink Maybe I need to modify the presentation time (somehow) to account for my artificial delay. I was hoping I wouldn't need that... -------------- next part -------------- An HTML attachment was scrubbed... URL: