From goelli at goelli.de Wed Jan 1 06:59:15 2014 From: goelli at goelli.de (=?iso-8859-1?Q?Thomas_G=F6llner?=) Date: Wed, 1 Jan 2014 15:59:15 +0100 Subject: [Live-devel] derived RTSPServer class - get RTSPClientSession by SessionId Message-ID: <000701cf0702$0a2acd60$1e806820$@goelli.de> Happy New Year to everyone! I'm trying to write my own special RTSPServer which extends the functionality of the library's RTSPServer for some special protocol. My RTSPServer is supposed to make little changes to the RTSP Messages before they are sent. I'm now stuck at the point where I want to implement a DESCRIBE-Response for Requests with only rtsp://:/ as Content-Base but with a correct SessionID. For the special protocol I have to save some parameters in the myRTSPClientSession class. To access them while handling DESCRIBE I must use the private fOurServer.fClientSessions->Lookup(sessionId) call to get the correct ClientSession. I cannot get this to work, because fOurServer is only a RTSPServer&. I had some trouble before that with access to the protected RTSPServer::RTSPClientConnection::fResponseBuffer from a ClientSession. I was able to do that with some casting. That solution does not seem to work here. I also thought about putting the modifications into the ServerMediaSession and ServerMediaSubsession classes, but I want to be able to access the streams via "normal" clients and "mySpecial" clients, and the changes to the description would confuse the "normal" clients. To clarify what I mean and what I've done so far, I post a shortened version of the source code of my myRTSPServer class. Please tell me how I can access the fClientSessions hash table, which is declared as private. Or am I supposed to do this in another way? Best regards, Thomas Göllner

#include "RTSPServer.hh"
#include "RTSPCommon.hh"
#include <sstream>

using namespace std;

class myRTSPServer: public RTSPServer {
public:
  class myRTSPClientSession;    // forward
  class myRTSPClientConnection; // forward
private:
  friend class myRTSPClientSession;
  friend class myRTSPClientConnection;
public:
  // static create new
protected:
  // Constructor
  // virtual Destructor

  virtual RTSPClientConnection* createNewClientConnection(int clientSocket, struct sockaddr_in clientAddr) {
    return new myRTSPClientConnection(*this, clientSocket, clientAddr);
  }
  virtual RTSPClientSession* createNewClientSession(u_int32_t sessionId) {
    return new myRTSPClientSession(*this, sessionId);
  }

public:
  class myRTSPClientConnection: public RTSPServer::RTSPClientConnection {
  public:
    // Constructor and Destructor
  protected:
    friend class myRTSPClientSession;

    virtual void handleCmd_DESCRIBE(char const* urlPreSuffix, char const* urlSuffix, char const* fullRequestStr) {
      char sessionId[RTSP_PARAM_STRING_MAX];

      // some test whether the request is one I want to change
      Boolean result = testFunction(urlSuffix, fullRequestStr, sessionId);

      if (!result) {
        // call the standard DESCRIBE handler
        RTSPServer::RTSPClientConnection::handleCmd_DESCRIBE(urlPreSuffix, urlSuffix, fullRequestStr);
      } else {
        // change urlSuffix to create the DESCRIBE reply that I want to change
        RTSPServer::RTSPClientConnection::handleCmd_DESCRIBE(urlPreSuffix, "DummyStream", fullRequestStr);

        RTSPServer::RTSPClientSession* clientSession
          = (RTSPServer::RTSPClientSession*)(fOurServer.fClientSessions->Lookup(sessionId));
        if (clientSession == NULL) { // check for a wrong SessionId
          strcpy((char*)fResponseBuffer, "");
          handleCmd_bad();
          return;
        }

        // get the special parameters
        float mySpecialParameter1 = clientSession->fmySpecialParameter1;
        float mySpecialParameter2 = clientSession->fmySpecialParameter2;
        char* mySpecialParameter3 = clientSession->fmySpecialParameter3;

        // create the special line
        char* mySpecialLine = new char[RTSP_PARAM_STRING_MAX];
        sprintf(mySpecialLine,
                "This is my special message.\r\n"
                "It contains the three special parameters.\r\n"
                " mySpecialParameter1 = %.3f\r\n"
                " mySpecialParameter2 = %.3f\r\n"
                " mySpecialParameter3 = %s\r\n",
                mySpecialParameter1, mySpecialParameter2, mySpecialParameter3);

        stringstream oldResponseBuffer;
        oldResponseBuffer << fResponseBuffer;
        string newResponseBuffer;
        // do my changes

        // write them back
        snprintf((char*)fResponseBuffer, sizeof fResponseBuffer,
                 "%s"
                 "%s\r\n\r\n",
                 newResponseBuffer.c_str(), mySpecialLine);
      }
    }
  };

  class myRTSPClientSession: public RTSPServer::RTSPClientSession {
  public:
    // Constructor and virtual Destructor
  protected:
    friend class myRTSPClientConnection;

    float fmySpecialParameter1;
    float fmySpecialParameter2;
    char fmySpecialParameter3[5];

    virtual void handleCmd_SETUP(RTSPServer::RTSPClientConnection* ourClientConnection,
                                 char const* urlPreSuffix, char const* urlSuffix, char const* fullRequestStr) {
      char sessionId[RTSP_PARAM_STRING_MAX];
      char mySpecialParameter1[RTSP_PARAM_STRING_MAX];
      char mySpecialParameter2[RTSP_PARAM_STRING_MAX];
      char mySpecialParameter3[RTSP_PARAM_STRING_MAX];

      // some test whether the request is one I want to change
      Boolean result = testFunction(urlSuffix, fullRequestStr, sessionId,
                                    mySpecialParameter1, mySpecialParameter2, mySpecialParameter3);

      if (!result) {
        // call the standard SETUP handler
        RTSPServer::RTSPClientSession::handleCmd_SETUP(ourClientConnection, urlPreSuffix, urlSuffix, fullRequestStr);
      } else {
        // change urlSuffix to create the DummyStream
        RTSPServer::RTSPClientSession::handleCmd_SETUP(ourClientConnection, urlPreSuffix, "DummyStream", fullRequestStr);

        // set the special parameters for this SETUP request
        fmySpecialParameter1 = atof(mySpecialParameter1);
        fmySpecialParameter2 = atof(mySpecialParameter2);
        snprintf(fmySpecialParameter3, sizeof fmySpecialParameter3, "%s", mySpecialParameter3);

        // I had the same problem, but in this case I resolved it this way:
        myRTSPServer::myRTSPClientConnection* myOurClientConnection
          = (myRTSPServer::myRTSPClientConnection*)ourClientConnection;

        // cut the last \r\n from fResponseBuffer
        char* ptr = strrchr((char*)myOurClientConnection->fResponseBuffer, '\r');
        *ptr = '\0';

        // add one special line for my protocol
        strcat((char*)myOurClientConnection->fResponseBuffer, "This is my special line\r\n\r\n");
      }
    }
  };
};

From finlayson at live555.com Wed Jan 1 13:14:26 2014 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 1 Jan 2014 13:14:26 -0800 Subject: [Live-devel] derived RTSPServer class - get RTSPClientSession by SessionId In-Reply-To: <000701cf0702$0a2acd60$1e806820$@goelli.de> References: <000701cf0702$0a2acd60$1e806820$@goelli.de> Message-ID: <67DE7FBA-BEF4-4D28-9718-3F345598CE90@live555.com> > I'm trying to write my own special RTSPServer which extends the > functionality of the library's RTSPServer for some special protocol. > My RTSPServer is supposed to make little changes to the RTSP Messages before > they are sent. I'm now stuck at the point where I want to implement a > DESCRIBE-Response for Requests with only rtsp://:/ as > Content-Base but with a correct SessionID. It's unusual for a "DESCRIBE" request to contain a "Session:" header. I think it's allowed in the protocol, but our RTSP client implementation does not support it.
(So, if you wanted to do this, for your client(s) you'd either need someone else's RTSP client implementation, or else hack ours :-( So, I'm curious, why do you want to do this? I.e., why do you want the SDP description resulting from the "DESCRIBE" command to depend upon the particular (already existing) session in which the command was sent, rather than simply upon the object that's being described? Are you sure that you can't do what you want using the existing "SET_PARAMETER" and/or "GET_PARAMETER" commands (which are already supported by both our server *and* client implementations)? Generally speaking, "special protocols" are a bad idea, if there are existing (and standardized!) protocols that do the same thing... Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From goelli at goelli.de Thu Jan 2 02:00:47 2014 From: goelli at goelli.de (=?utf-8?Q?Thomas_G=C3=B6llner?=) Date: Thu, 2 Jan 2014 11:00:47 +0100 Subject: [Live-devel] derived RTSPServer class - get RTSPClientSession by SessionId Message-ID: > It's unusual for a "DESCRIBE" request to contain a "Session:" header. I think it's allowed in the protocol, but our RTSP client implementation does not support it. (So, if you wanted to do this, for your client(s) you'd either need someone else's RTSP client implementation, or else hack ours :-( > > So, I'm curious, why do you want to do this? I.e., why do you want the SDP description resulting from the "DESCRIBE" command to depend upon the particular (already existing) session in which the command was sent, rather than simply upon the object that's being described? > > Are you sure that you can't do what you want using the existing "SET_PARAMETER" and/or "GET_PARAMETER" commands (which are already supported by both our server *and* client implementations)? > > Generally speaking, "special protocols" are a bad idea, if there are existing (and standardized!) protocols that do the same thing... Hi Ross, thanks for your quick reply. I did not want to mention this, but exchange "SpecialProtocol" with SAT>IP and you have your answer. There are many clients for this protocol so I have to make a Server that speaks their language. I saw in the code that DESCRIBE with sessionID is not supported but some clients rely on that. Some clients even do the channel change only in PLAY commands, so I have to change the stream during a session, which is also not supported by the library. That's another point where I'm stuck at the moment: I don't know how to stop playing an RTSP unicast stream during a PLAY command. But I'm still searching... Another thing on the horizon is playing only specific PIDs from a TS. I hope at this point the MPEG Demux and Mux Classes will help. But perhaps I have to start a new topic when I am at that point. I have to do this that way because of GLGPL. The code I write unfortunately has to be closed-source by now. So I have to derive classes. It would be an easy way if I just copy most of the code of the handle-Functions, but I don't think that this would be reconcilable with the license. I will have a second look at the SET/GET Parameter commands. But I know the clients don't use them.
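For reference, a minimal sketch (not from the original messages) of how a LIVE555-based client could carry such per-session values over the standard SET_PARAMETER/GET_PARAMETER commands that Ross mentions; the parameter names and the surrounding client/session objects are invented for illustration, and whether existing SAT>IP clients could be changed to use this is exactly the open question here:

// Sketch only: assumes an already-created RTSPClient and an already-SETUP
// MediaSession; the parameter names are hypothetical.
#include "liveMedia.hh"

static void afterParameterCommand(RTSPClient* client, int resultCode, char* resultString) {
  if (resultCode != 0) {
    client->envir() << "SET/GET_PARAMETER failed: " << resultString << "\n";
  }
  delete[] resultString;
}

void sendTuningParameters(RTSPClient* client, MediaSession* session) {
  // Each call sends one RTSP SET_PARAMETER request whose body carries
  // the "name: value" pair, within the existing session:
  client->sendSetParameterCommand(*session, afterParameterCommand,
                                  "mySpecialParameter1", "1.500");
  client->sendSetParameterCommand(*session, afterParameterCommand,
                                  "mySpecialParameter2", "2.750");
  // A value can later be queried back (GET_PARAMETER also doubles as a keep-alive):
  client->sendGetParameterCommand(*session, afterParameterCommand, "mySpecialParameter1");
}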
Best regards, Thomas From finlayson at live555.com Thu Jan 2 02:30:35 2014 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 2 Jan 2014 02:30:35 -0800 Subject: [Live-devel] derived RTSPServer class - get RTSPClientSession by SessionId In-Reply-To: References: Message-ID: <9D3686ED-FC50-4571-8AE1-FEADF569AF10@live555.com> > I did not want to mention this, but exchange "SpecialProtocol" with SAT>IP and you have your answer. There are many clients for this protocol so I have to make a Server that speaks their language. Please tell the developer(s) of these clients to contact me directly about this, so I can fully understand - from them - the apparent non-standard (or at least unorthodox) way in which they're using the RTSP protocol. In any case, because this would apparently involve changes or extensions to the LIVE555 software, this is beyond the scope of this mailing list - i.e., simple help that I provide 'for free'. > I have to do this that way because of GLGPL. It's "LGPL", not "GLGPL". FYI. > The code I write unfortunately has to be closed-source by now. So I have to derive classes. > It would be an easy way if I just copy most of the code of the handle-Functions, but I don't think that this would be reconcilable with the license. That's correct - it wouldn't be. > I will have a second look at the SET/GET Parameter commands. But I know the clients don't use them. Again, please tell the developers of these clients to get in touch with me, so I can discuss their needs with them. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jawahar456 at gmail.com Thu Jan 2 02:58:32 2014 From: jawahar456 at gmail.com (Jawahar Venugopal) Date: Thu, 2 Jan 2014 16:28:32 +0530 Subject: [Live-devel] Difficulty in streaming more than one videos using live555proxyserver program. Message-ID: Hi, I downloaded the live555proxyserver.cpp program(from http://www.live555.com/liveMedia/public/live.2013.12.05.tar.gz) to stream multiple backend rtsp streams, but when I tried to play the proxied URL's using vlc the videos are getting intermixed. I am using Ubuntu 12.04 LTS 64 bit machine and Spydroid android application to stream the video. I read FAQ's carefully but couldn't solve the problem. Let me know your findings. Thanks & Regards, Jawahar V -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Jan 2 08:43:29 2014 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 2 Jan 2014 08:43:29 -0800 Subject: [Live-devel] Difficulty in streaming more than one videos using live555proxyserver program. In-Reply-To: References: Message-ID: <7ACAFFA6-AC89-476A-B92B-E1A24005D665@live555.com> > I downloaded the live555proxyserver.cpp program(from http://www.live555.com/liveMedia/public/live.2013.12.05.tar.gz) FYI, the latest version of the code (the only version that we support) can be downloaded from http://www.live555.com/liveMedia/public/live555-latest.tar.gz > to stream multiple backend rtsp streams, but when I tried to play the proxied URL's using vlc the videos are getting intermixed. > > I am using Ubuntu 12.04 LTS 64 bit machine and Spydroid android application to stream the video. > > I read FAQ's carefully but couldn't solve the problem. Let me know your findings. I can't "let you know my findings" because you haven't given me any details about your problem!
In particular, you need to run the "LIVE555 Proxy Server" with the "-V" option; see http://www.live555.com/proxyServer/#verbose-output Set up a simple example - e.g., with just two back-end RTSP streams - that illustrates your problem, and include the diagnostic output in your next email. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From mail at chernowii.com Thu Jan 2 09:03:30 2014 From: mail at chernowii.com (Konrad) Date: Thu, 2 Jan 2014 18:03:30 +0100 Subject: [Live-devel] Get a newer version of Live Message-ID: <714703AB-3140-480D-81FD-B10E14C322C3@chernowii.com> Hello. Looking at the gopro open source page, I found they use "live.2012.02.04.tar" (two years old) Where can I download the most recent version of live.XXXX.XX.XX.tar? Thanks and best regards, Konrad Iturbe http://chernowii.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Jan 2 11:52:18 2014 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 2 Jan 2014 11:52:18 -0800 Subject: [Live-devel] Get a newer version of Live In-Reply-To: <714703AB-3140-480D-81FD-B10E14C322C3@chernowii.com> References: <714703AB-3140-480D-81FD-B10E14C322C3@chernowii.com> Message-ID: <849B0901-D00C-4C5F-8B9F-BE154F32C1DC@live555.com> > Looking at the gopro open source page, I found they use "live.2012.02.04.tar" (two years old) > Where can I download the most recent version of live.XXXX.XX.XX.tar? See http://www.live555.com/liveMedia/faq.html#latest-version (It's not clear to me how GoPro is using the "LIVE555 Streaming Media" software, but if they are using it as embedded software in one of their products, then please let them know that - under the terms of the LGPL - they must update their software to use the latest version of the "LIVE555 Streaming Media" software, or else tell you how you can perform this upgrade yourself.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From mail at chernowii.com Thu Jan 2 12:10:08 2014 From: mail at chernowii.com (Konrad) Date: Thu, 2 Jan 2014 21:10:08 +0100 Subject: [Live-devel] Get a newer version of Live In-Reply-To: <849B0901-D00C-4C5F-8B9F-BE154F32C1DC@live555.com> References: <714703AB-3140-480D-81FD-B10E14C322C3@chernowii.com> <849B0901-D00C-4C5F-8B9F-BE154F32C1DC@live555.com> Message-ID: <6576A5C1-DE5A-4C0A-B151-9834DE43E221@chernowii.com> I told them already. I am trying to replace the old version with the new version (2013 31 12) in the firmware: https://github.com/KonradIT/UnofficialGoProFW/tree/master/NoLagLivePreviewFW It seems that the new gopro, the HERO3+, is using the latest live, but the old one is not. Anyway, thanks for the link. El 02/01/2014, a las 20:52, Ross Finlayson escribió: >> Looking at the gopro open source page, I found they use "live.2012.02.04.tar" (two years old) >> Where can I download the most recent version of live.XXXX.XX.XX.tar? > > See > http://www.live555.com/liveMedia/faq.html#latest-version > > (It's not clear to me how GoPro is using the "LIVE555 Streaming Media" software, but if they are using it as embedded software in one of their products, then please let them know that - under the terms of the LGPL - they must update their software to use the latest version of the "LIVE555 Streaming Media" software, or else tell you how you can perform this upgrade yourself.) > > > Ross Finlayson > Live Networks, Inc.
> http://www.live555.com/ > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel Konrad Iturbe http://chernowii.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From eadeli at gmail.com Sun Jan 5 04:41:56 2014 From: eadeli at gmail.com (Ehsan Adeli) Date: Sun, 5 Jan 2014 16:11:56 +0330 Subject: [Live-devel] To use openRTSP in a multi-thread program Message-ID: Hello all, I am trying to use the openRTSP code in my program and use it to record from multiple sources at the same time. Therefore, I need to run each instance in a separate thread. But openRTSP code is not thread safe (many global variables). I will be left with three options: 1- To run openRTSP as a stand-alone process for each client stream. This would not be efficient, and I would lose control over each openRTSP running process. 2- To re-implement openRTSP with a same scheme as in testRTSPClient test program. Creating my own StreamClientState, ourRTSPClient and DummySink classes. But I would lose all the functionality implemented in openRTSP. Implementing every one of them would be hard. 3- Try to make the current openRTSP implementation thread-safe, by defining StreamClientState and putting all global variables in playcomm.cpp in that class and trying to over ride RTSPClient used there. But the RTSPClient is initiated in the HandlerServerForREGISTERCommand and finding the correspondences is a bit hard. Can any body help? Do you have any ideas how this could be done? Is it possible to handle multi-threading in an easier way? Does the 'applicationName' variable here play a role? Thanks for your help, in advance. Regards, Ehsan -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Jan 6 02:21:53 2014 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 6 Jan 2014 02:21:53 -0800 Subject: [Live-devel] To use openRTSP in a multi-thread program In-Reply-To: References: Message-ID: > I am trying to use the openRTSP code in my program and use it to record from multiple sources at the same time. Therefore, I need to run each instance in a separate thread. No, the second sentence does not follow logically from the first. (See below.) > But openRTSP code is not thread safe (many global variables). I will be left with three options: > 1- To run openRTSP as a stand-alone process for each client stream. Yes, that is what you should do. E.g., use a shell script to exec multiple "openRTSP" commands concurrently. > This would not be efficient, and I would lose control over each openRTSP running process. No, this would still be efficient, and you could easily control each "openRTSP" process, from your shell script. E.g., your shell script could note the pid of each "openRTSP" process, so it can kill them, if necessary. I continue to be surprised how - in this century - so many programmers have become afraid of (or unaware of) structuring applications as multiple processes, resorting instead to using multiple threads within a single process - which is *much* more difficult, and *much* more error-prone (and is something that's required only when you need shared memory). > 2- To re-implement openRTSP with a same scheme as in testRTSPClient test program. Creating my own StreamClientState, ourRTSPClient and DummySink classes. But I would lose all the functionality implemented in openRTSP. Implementing every one of them would be hard. 
You wouldn't need 'all' the functionality of "openRTSP" - just the parts that you want. But in any case, I don't recommend that you do this, unless you need only very basic functionality of "openRTSP". (Especially as this is a hobby for you.) > 3- Try to make the current openRTSP implementation thread-safe, by defining StreamClientState and putting all global variables in playcomm.cpp in that class and trying to over ride RTSPClient used there. But the RTSPClient is initiated in the HandlerServerForREGISTERCommand and finding the correspondences is a bit hard. No, I don't recommend that you do this. The "openRTSP" code is very complex (in part because it's also used for "playSIP" - a SIP client). It's not something to mess with. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From eadeli at gmail.com Mon Jan 6 03:37:29 2014 From: eadeli at gmail.com (Ehsan Adeli) Date: Mon, 6 Jan 2014 15:07:29 +0330 Subject: [Live-devel] To use openRTSP in a multi-thread program In-Reply-To: References: Message-ID: Dear Ross, Thanks for your reply. Well, I do understand your point. That is why I tried to explain every solution that came to my mind, and why none of them was right for my case. About the first solution, running each one in a separate process: the control I need is not just killing a process or the like. I am recording files from incoming RTSP connections, and whenever I am done with a file I want the parent program to know about that, so I call a callback function. Or whenever there is a connection error or any type of error, I want to handle the error by letting the main program know about that and resolve the issue. I also need to stop the RTSP connection neatly, and not to just KILL the process. I ask the connection to stop (anytime the user asks me to) and it will wrap everything up and close any open connections and files. And many more. If I use multiple processes, I would need to use inter-process communication or something like that, and that is not right. The right solution is to do something like what is done in the testRTSPClient test project. I am doing that and it seems to be working fine. BTW, I am not afraid of using multiple processes. I use such a solution when I need to. But sometimes you need to have control over your own program and interact with your program, rather than giving everything to the OS. I know the idea for openRTSP was to be used as a stand-alone program and to be used in a shell-script. I also think that it is the best solution for such applications. I just simply asked a question here, to see if anybody has thought about it before or not (although I have seen old emails about the same issue, with the answer to use shell-scripting). Again, thank you all very much, for this great library and all your great efforts. Regards, Ehsan On Mon, Jan 6, 2014 at 1:51 PM, Ross Finlayson wrote: > I am trying to use the openRTSP code in my program and use it to record > from multiple sources at the same time. Therefore, I need to run each > instance in a separate thread. > > > No, the second sentence does not follow logically from the first. (See > below.) > > > But openRTSP code is not thread safe (many global variables). I will be > left with three options: > 1- To run openRTSP as a stand-alone process for each client stream. > > > Yes, that is what you should do. E.g., use a shell script to exec > multiple "openRTSP" commands concurrently.
> > This would not be efficient, and I would lose control over each openRTSP > running process. > > > No, this would still be efficient, and you could easily control each > "openRTSP" process, from your shell script. E.g., your shell script could > note the pid of each "openRTSP" process, so it can kill them, if necessary. > > I continue to be surprised how - in this century - so many programmers > have become afraid of (or unaware of) structuring applications as multiple > processes, resorting instead to using multiple threads within a single > process - which is *much* more difficult, and *much* more error-prone (and > is something that's required only when you need shared memory). > > > 2- To re-implement openRTSP with a same scheme as in testRTSPClient test > program. Creating my own StreamClientState, ourRTSPClient and DummySink > classes. But I would lose all the functionality implemented in openRTSP. > Implementing every one of them would be hard. > > > You wouldn't need 'all' the functionality of "openRTSP" - just the parts > that you want. But in any case, I don't recommend that you do this, unless > you need only very basic functionality of "openRTSP". (Especially as this > is a hobby for you.) > > > 3- Try to make the current openRTSP implementation thread-safe, by > defining StreamClientState and putting all global variables in playcomm.cpp > in that class and trying to over ride RTSPClient used there. But the > RTSPClient is initiated in the HandlerServerForREGISTERCommand and finding > the correspondences is a bit hard. > > > No, I don't recommend that you do this. The "openRTSP" code is very > complex (in part because it's also used for "playSIP" - a SIP client). > It's not something to mess with. > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From vgottardi at hotmail.com Mon Jan 6 19:24:32 2014 From: vgottardi at hotmail.com (Victor Gottardi) Date: Mon, 6 Jan 2014 22:24:32 -0500 Subject: [Live-devel] Problem with HTTP/1.1 in RTSP-over-HTTP In-Reply-To: References: Message-ID: Hi all, RTSP client stopped working after HTTP version was upgraded to 1.1. The problem is that HTTP/1.1 requires the Host field (see http://tools.ietf.org/search/rfc2616#section-14.23). I use a web server with a RTSP tunneling mod that "proxies" RTSP-over-HTTP to a regular RTSP server. The web server drops the request with 400 (Bad-Request) because of the missing Host field. This happens before the request can reach the mod. Regards,Victor -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Jan 6 21:12:04 2014 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 6 Jan 2014 21:12:04 -0800 Subject: [Live-devel] Problem with HTTP/1.1 in RTSP-over-HTTP In-Reply-To: References: Message-ID: <39E8E5F1-F12D-44FC-B1EB-940B374F45EB@live555.com> > RTSP client stopped working after HTTP version was upgraded to 1.1. The problem is that HTTP/1.1 requires the Host field (see http://tools.ietf.org/search/rfc2616#section-14.23). > > I use a web server with a RTSP tunneling mod that "proxies" RTSP-over-HTTP to a regular RTSP server. The web server drops the request with 400 (Bad-Request) because of the missing Host field. Victor, Thanks for the report. 
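(For context, a sketch, not the library's actual code, of the shape such a tunneling request needs once the client speaks HTTP/1.1; the "Host:" line is the point here, and the remaining header names follow the usual QuickTime-style RTSP-over-HTTP convention, so treat them as illustrative:)

#include <cstdio>

// Illustration only: building the initial RTSP-over-HTTP "GET" with the
// "Host:" header that RFC 2616 (section 14.23) makes mandatory for HTTP/1.1.
// Without that line, a strict web server may answer 400 Bad Request before
// any tunneling module ever sees the request.
void formatTunnelingGET(char* buf, unsigned bufSize,
                        char const* urlPath, char const* host, unsigned port,
                        char const* sessionCookie) {
  snprintf(buf, bufSize,
           "GET %s HTTP/1.1\r\n"
           "Host: %s:%u\r\n"
           "x-sessioncookie: %s\r\n"
           "Accept: application/x-rtsp-tunnelled\r\n"
           "Pragma: no-cache\r\n"
           "Cache-Control: no-cache\r\n"
           "\r\n",
           urlPath, host, port, sessionCookie);
}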
I've just installed a new version (2014.01.07) of the "LIVE555 Streaming Media" software that should fix this. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jawahar456 at gmail.com Mon Jan 6 23:33:51 2014 From: jawahar456 at gmail.com (Jawahar Venugopal) Date: Tue, 7 Jan 2014 13:03:51 +0530 Subject: [Live-devel] Difficulty in streaming more than one videos using live555proxyserver program. Message-ID: Hi, Thanks for your reply, As you suggested, -I have downloaded the latest source code http://www.live555.com/liveMedia/public/live.2014.01.07.tar.gz , compiled on my UBUNTU 12.04 LTS 64-bit system. -I set up a simple scenario, with two back end RTSP streams from "SPYDROID" android application (https://code.google.com/p/spydroid-ipcamera/) on 2 samsung mobile phones. -When the experiment is conducted with only one stream, live555proxyserver creates a proxy stream that plays on VLC (v2.0.5) perfectly. -The problem arises when we try the experiment with two back-end streams, -The live555proxyserver creates two proxied URL's for the two input RTSP links. When only one proxied URL is played on VLC it works correctly and streams the video from the android application. -When the second proxied URL is played concurrently in another instance of VLC. The earlier VLC becomes blank and the second VLC plays an intermixed video containing frames from both the phones. -I have attached a text file containing the Verbose output and a couple of snapshots. -Kindly let me know on your insights. Thanks & Regards, Jawahar V -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- ./live555ProxyServer -v rtsp://192.168.1.104:8086 rtsp://192.168.1.102:8086 LIVE555 Proxy Server (LIVE555 Streaming Media library version 2014.01.07) RTSP stream, proxying the stream "rtsp://192.168.1.104:8086" Play this stream using the URL: rtsp://192.168.1.137:8554/proxyStream-1 RTSP stream, proxying the stream "rtsp://192.168.1.102:8086" Play this stream using the URL: rtsp://192.168.1.137:8554/proxyStream-2 (We use port 8000 for optional RTSP-over-HTTP tunneling.) ProxyServerMediaSession["192.168.1.102:8086/"] added new "ProxyServerMediaSubsession" for RTP/video/H263-1998 track ProxyServerMediaSession["192.168.1.104:8086/"] added new "ProxyServerMediaSubsession" for RTP/video/H263-1998 track ProxyServerMediaSubsession["H263-1998"]::createNewStreamSource(session id 0) Initiated: ProxyServerMediaSubsession["H263-1998"] ProxyServerMediaSubsession["H263-1998"]::createNewRTPSink() ProxyServerMediaSubsession["H263-1998"]::closeStreamSource() ProxyServerMediaSubsession["H263-1998"]::createNewStreamSource(session id 2978686081) ProxyServerMediaSubsession["H263-1998"]::createNewRTPSink() ProxyServerMediaSubsession["H263-1998"]::createNewStreamSource(session id 0) Initiated: ProxyServerMediaSubsession["H263-1998"] ProxyServerMediaSubsession["H263-1998"]::createNewRTPSink() ProxyServerMediaSubsession["H263-1998"]::closeStreamSource() ProxyServerMediaSubsession["H263-1998"]::createNewStreamSource(session id 1489423258) ProxyServerMediaSubsession["H263-1998"]::createNewRTPSink() ProxyServerMediaSubsession["H263-1998"]::closeStreamSource() -------------- next part -------------- A non-text attachment was scrubbed... 
Name: Screenshot from 2014-01-07 12_16_20.png Type: image/png Size: 389983 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: Screenshot from 2014-01-07 12_17_42.png Type: image/png Size: 265300 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: Screenshot from 2014-01-07 12_18_03.png Type: image/png Size: 218458 bytes Desc: not available URL: From finlayson at live555.com Tue Jan 7 02:13:40 2014 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 7 Jan 2014 02:13:40 -0800 Subject: [Live-devel] Difficulty in streaming more than one videos using live555proxyserver program. In-Reply-To: References: Message-ID: <8C9651CA-8098-481F-838B-BE460C9314C6@live555.com> > -I have attached a text file containing the Verbose output Thanks, but please do this again, using the "-V" (i.e., upper-case V) option, rather than "-v" (lower case). This will produce more verbose (and, I hope, more informative) output. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From kari.kallioinen at sasken.com Tue Jan 7 06:01:44 2014 From: kari.kallioinen at sasken.com (Kari Kallioinen) Date: Tue, 7 Jan 2014 14:01:44 +0000 Subject: [Live-devel] Compiling livemedia for Axis Cris Message-ID: <7F763CFFF78CA443BD136F74E16179E5029926@exgmbxfz01.sasken.com> Hi, I am trying to compile livemedia for Axis Cris platform. Firstly I made some modifications to config.cris-axis-linux-gnu -file: --- a/external/live/config.cris-axis-linux-gnu +++ b/external/live/config.cris-axis-linux-gnu @@ -5,17 +5,17 @@ AXIS_DIR = $(AXIS_TOP_DIR)/target/cris-axis-linux-gnu COMPILE_OPTS = $(INCLUDES) -I. -mlinux -isystem $(AXIS_DIR)/include -Wall -O2 -DSOCKLEN_T=socklen_t -DCRIS -D_LARGEFILE_SOURCE=1 -D_FILE_OFFSET_BITS=64 C = c -C_COMPILER = gcc-cris +C_COMPILER = crisv32-axis-linux-gnu-gcc C_FLAGS = $(COMPILE_OPTS) CPP = cpp -CPLUSPLUS_COMPILER = c++-cris +CPLUSPLUS_COMPILER = crisv32-axis-linux-gnu-g++ CPLUSPLUS_FLAGS = $(COMPILE_OPTS) -Wno-ctor-dtor-privacy -ansi -pipe OBJ = o -LINK = c++-cris -static -o +LINK = crisv32-axis-linux-gnu-g++ -static -o AXIS_LINK_OPTS = -L$(AXIS_DIR)/lib LINK_OPTS = -L. CONSOLE_LINK_OPTS = $(LINK_OPTS) -L$(AXIS_DIR)/lib -mlinux -LIBRARY_LINK = ld-cris -mcrislinux -o +LIBRARY_LINK = crisv32-axis-linux-gnu-ld -mcrislinux -o LIBRARY_LINK_OPTS = $(LINK_OPTS) -r -Bstatic LIB_SUFFIX = a LIBS_FOR_CONSOLE_APPLICATION = And my compilation fails: make[1]: Entering directory `/home/karika/compiletest/live/testProgs' crisv32-axis-linux-gnu-g++ -static -otestMP3Streamer -L. -L/home/karika/axis/emb-app-sdk_1_3/target/cris-axis-linux-gnu/lib -mlinux testMP3Streamer.o ../liveMedia/libliveMedia.a ../groupsock/libgroupsock.a ../BasicUsageEnvironment/libBasicUsageEnvironment.a ../UsageEnvironment/libUsageEnvironment.a ../liveMedia/libliveMedia.a: could not read symbols: Bad value collect2: ld returned 1 exit status And I have no Idea why libliveMedia.a is broken. libgroupsock.a is working one and it is compiled and linked with same flags as libliveMedia.a. Any ideas? Kari Kallioinen ________________________________ SASKEN BUSINESS DISCLAIMER: This message may contain confidential, proprietary or legally privileged information. 
In case you are not the original intended Recipient of the message, you must not, directly or indirectly, use, disclose, distribute, print, or copy any part of this message and you are requested to delete it and inform the sender. Any views expressed in this message are those of the individual sender unless otherwise stated. Nothing contained in this message shall be construed as an offer or acceptance of any offer by Sasken Communication Technologies Limited ("Sasken") unless sent with that express intent and with due authority of Sasken. Sasken has taken enough precautions to prevent the spread of viruses. However the company accepts no liability for any damage caused by any virus transmitted by this email. Read Disclaimer at http://www.sasken.com/extras/mail_disclaimer.html -------------- next part -------------- An HTML attachment was scrubbed... URL: From warren at etr-usa.com Tue Jan 7 10:16:57 2014 From: warren at etr-usa.com (Warren Young) Date: Tue, 07 Jan 2014 11:16:57 -0700 Subject: [Live-devel] Compiling livemedia for Axis Cris In-Reply-To: <7F763CFFF78CA443BD136F74E16179E5029926@exgmbxfz01.sasken.com> References: <7F763CFFF78CA443BD136F74E16179E5029926@exgmbxfz01.sasken.com> Message-ID: <52CC4499.1070102@etr-usa.com> On 1/7/2014 07:01, Kari Kallioinen wrote: > > And I have no Idea why libliveMedia.a is broken. libgroupsock.a is > working one and it is compiled and linked with same flags as libliveMedia.a. With cross-compilation involved, you must have at least two toolchains installed. You might even have three, what with the configure file diffs you show. Your build tree may contain build outputs from multiple incompatible toolchains. Have you tried "make clean && make"? If that doesn't help, how about a clean unpack of the tarball, followed by a reconfigure with your custom config.cris-axis-linux-gnu file? If that also doesn't help, I'd carefully check the make(1) output to ensure that the right tools are being called at *every* step. From lucian.orasanu at gmail.com Tue Jan 7 02:27:40 2014 From: lucian.orasanu at gmail.com (Orasanu Lucian) Date: Tue, 7 Jan 2014 10:27:40 +0000 Subject: [Live-devel] openRtsp problems!! Message-ID: Hello, I'am trayng to play an stream from an ip camera, but is not working until I use with -p option to specify the port. this is the log without -p option: the error is: Unable to create receiver for "video/H264" subsession: getsockname() error: Bad file descriptor and this is with -p option and it works: http://pastebin.com/nkuW3yPp how to solve this problem??? and i'am sure that something is wrong with my system, because on my laptop is working perfect. I use gentoo on both systems. -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Jan 7 23:18:11 2014 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 7 Jan 2014 23:18:11 -0800 Subject: [Live-devel] openRtsp problems!! In-Reply-To: References: Message-ID: <798994A8-A21D-4CDE-9AEB-CAADB1A5FEED@live555.com> > I'am trayng to play an stream from an ip camera, but is not working until I use with -p option to specify the port. > > this is the log without -p option: There wasn't any log here. > the error is: > Unable to create receiver for "video/H264" subsession: getsockname() error: Bad file descriptor > > and this is with -p option and it works: > > http://pastebin.com/nkuW3yPp I was able to access your stream just fine. 
It appears that you have a firewall somewhere that is blocking UDP packets, except those sent to port 554 (the client port number that you specified using the "-p" option). > how to solve this problem??? Turn off your firewall. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From vgottardi at hotmail.com Tue Jan 7 18:29:22 2014 From: vgottardi at hotmail.com (Victor Gottardi) Date: Tue, 7 Jan 2014 21:29:22 -0500 Subject: [Live-devel] Problem with HTTP/1.1 in RTSP-over-HTTP In-Reply-To: References: , Message-ID: Thank you Ross for the very quick fix. It works fine with the new version. Victor -------------- next part -------------- An HTML attachment was scrubbed... URL: From kari.kallioinen at sasken.com Wed Jan 8 01:37:12 2014 From: kari.kallioinen at sasken.com (Kari Kallioinen) Date: Wed, 8 Jan 2014 09:37:12 +0000 Subject: [Live-devel] Compiling livemedia for Axis Cris In-Reply-To: <52CC4499.1070102@etr-usa.com> References: <7F763CFFF78CA443BD136F74E16179E5029926@exgmbxfz01.sasken.com>, <52CC4499.1070102@etr-usa.com> Message-ID: <7F763CFFF78CA443BD136F74E16179E502B9EC@exgmbxfz01.sasken.com> ________________________________________ From: live-devel-bounces at ns.live555.com [live-devel-bounces at ns.live555.com] on behalf of Warren Young [warren at etr-usa.com] Sent: Tuesday, January 07, 2014 20:16 To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Compiling livemedia for Axis Cris > With cross-compilation involved, you must have at least two toolchains > installed. You might even have three, what with the configure file > diffs you show. Your build tree may contain build outputs from multiple > incompatible toolchains. > > Have you tried "make clean && make"? This was first toolchain I tried. I have been doing make clean between every try. > If that doesn't help, how about a clean unpack of the tarball, followed > by a reconfigure with your custom config.cris-axis-linux-gnu fle? > > If that also doesn't help, I'd carefully check the make(1) output to > ensure that the right tools are being called at *every* step. I downloaded live555-latest.tar.gz and modified config.cris-axis-linux-gnu file with that diff. I tried with fresh sources and I read trough the output but didn't recognise that anything would be wrong. Output is attached make.gz I also tried to find how the libliveMedia.a is broken: $ crisv32-axis-linux-gnu-ld libliveMedia.a libliveMedia.a: could not read symbols: Bad value $ crisv32-axis-linux-gnu-nm libliveMedia.a (output attached nm.gz) Any ideas or haven't I noticed something? Kari _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel ________________________________ SASKEN BUSINESS DISCLAIMER: This message may contain confidential, proprietary or legally privileged information. In case you are not the original intended Recipient of the message, you must not, directly or indirectly, use, disclose, distribute, print, or copy any part of this message and you are requested to delete it and inform the sender. Any views expressed in this message are those of the individual sender unless otherwise stated. Nothing contained in this message shall be construed as an offer or acceptance of any offer by Sasken Communication Technologies Limited ("Sasken") unless sent with that express intent and with due authority of Sasken. 
Sasken has taken enough precautions to prevent the spread of viruses. However the company accepts no liability for any damage caused by any virus transmitted by this email. Read Disclaimer at http://www.sasken.com/extras/mail_disclaimer.html -------------- next part -------------- A non-text attachment was scrubbed... Name: make.gz Type: application/gzip Size: 2897 bytes Desc: make.gz URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: nm.gz Type: application/gzip Size: 39216 bytes Desc: nm.gz URL: From lucian.orasanu at gmail.com Wed Jan 8 02:15:48 2014 From: lucian.orasanu at gmail.com (Orasanu Lucian) Date: Wed, 8 Jan 2014 10:15:48 +0000 Subject: [Live-devel] openRtsp problems!! In-Reply-To: <798994A8-A21D-4CDE-9AEB-CAADB1A5FEED@live555.com> References: <798994A8-A21D-4CDE-9AEB-CAADB1A5FEED@live555.com> Message-ID: thank you for help and sorry for not posting the link, but this camera with this log is behind an router http://pastebin.com/im1Td2zi with -p 554 http://pastebin.com/i8u4TaHv and this camera is not, this camera and the pc has routing ip's http://pastebin.com/pNkGut6V with -p 554 http://pastebin.com/2p2Ftm6Z both of them cameras on this pc have the same problem when use without -p option. on my laptop they work perfect: here is other camera with my laptop, and my laptop is behind router with one camera. http://pastebin.com/kazpKnCM Something eles must be, something else is the problem, but what?? On Wed, Jan 8, 2014 at 7:18 AM, Ross Finlayson wrote: > I'am trayng to play an stream from an ip camera, but is not working > until I use with -p option to specify the port. > > this is the log without -p option: > > > There wasn't any log here. > > > the error is: > Unable to create receiver for "video/H264" subsession: getsockname() > error: Bad file descriptor > > and this is with -p option and it works: > > http://pastebin.com/nkuW3yPp > > > I was able to access your stream just fine. It appears that you have a > firewall somewhere that is blocking UDP packets, except those sent to port > 554 (the client port number that you specified using the "-p" option). > > > how to solve this problem??? > > > Turn off your firewall. > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Jan 8 06:39:43 2014 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 8 Jan 2014 06:39:43 -0800 Subject: [Live-devel] openRtsp problems!! In-Reply-To: References: <798994A8-A21D-4CDE-9AEB-CAADB1A5FEED@live555.com> Message-ID: I can't explain the "Bad file descriptor" error; I've never seen that happen before. Something is wrong either with the operating system on that computer, or the way in which you built the "LIVE555 Streaming Media" code for that operating system. (If it's Windows, then perhaps you used the wrong version of the 'winsock' headers and/or library??) The "Unsupported transport" error with the second stream - rtsp://37.43.41.142:554/live/ch00_0 - is a different problem; it's a problem with the server (i.e., camera). The server is telling the client ("openRTSP") that it doesn't support regular RTP-over-UDP streaming. Instead, you'll need to request RTP-over-TCP streaming, by running "openRTSP" with the "-t" option. Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From marcin at speed666.info Wed Jan 8 09:34:06 2014 From: marcin at speed666.info (Marcin) Date: Wed, 08 Jan 2014 18:34:06 +0100 Subject: [Live-devel] openRtsp problems!! In-Reply-To: References: <798994A8-A21D-4CDE-9AEB-CAADB1A5FEED@live555.com> Message-ID: <52CD8C0E.8090606@speed666.info> Hi, What i can say it that you try to connect to AirCam camera which has very limited RTSP implementation and dont count that something will change in the future. There is even no beta FW available above version 1.2 that you already have. Marcin W dniu 2014-01-08 15:39, Ross Finlayson pisze: > I can't explain the "Bad file descriptor" error; I've never seen that > happen before. Something is wrong either with the operating system on > that computer, or the way in which you built the "LIVE555 Streaming > Media" code for that operating system. (If it's Windows, then perhaps > you used the wrong version of the 'winsock' headers and/or library??) > > The "Unsupported transport" error with the second stream - > rtsp://37.43.41.142:554/live/ch00_0 - is a different problem; it's a > problem with the server (i.e., camera). The server is telling the > client ("openRTSP") that it doesn't support regular RTP-over-UDP > streaming. Instead, you'll need to request RTP-over-TCP streaming, by > running "openRTSP" with the "-t" option. > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From bhawesh at vizexperts.com Wed Jan 8 07:23:52 2014 From: bhawesh at vizexperts.com (Bhawesh Kumar Choudhary) Date: Wed, 8 Jan 2014 20:53:52 +0530 Subject: [Live-devel] Subclassing OnDemandServerMediaSubsession for Live Video Sources to stream h264 Message-ID: <00ef01cf0c85$a3e21480$eba63d80$@vizexperts.com> Hi, I am using Live555 library to create rtsp stream for live device source. I read the FAQ and mailing list and sub classed all the required classes :: 1. Framed source (using the deviceSource.cpp model) 2. OnDemandServerMediaSubsession (using H264VideoFileServerMediaSubsession model) But few on the questions are I am not able to figure out: 1. In each call of my doGetNextFrame() of my device source I am assuming that no frame data is available and I am returning. Instead whenever I receive data I trigger a event which schedule getNextFrame() of my device source. Is there is status check (waiting for event or running) required for Live555 run loop to schedule the task of the new data arrival in the function signalNewFrameData() of device source? 2. In my OnDemandServerMediaSubsession subclass I have implemented both required function createNewStreamSource() and createNewRTPSink(). Since I set the reuseFirstSource to true in my OnDemandServerMediaSunsession I keep a reference of my device source in my dataMember variable. But the following code /// Function in OnDemandServerMediaSunsession causing application crash char const* OnDemandServerMediaSubsession::sdpLines() { if (fSDPLines == NULL) { // We need to construct a set of SDP lines that describe this // subsession (as a unicast stream). 
To do so, we first create // dummy (unused) source and "RTPSink" objects, // whose parameters we use for the SDP lines: unsigned estBitrate; FramedSource* inputSource = createNewStreamSource(0, estBitrate); // will return only device source if (inputSource == NULL) return NULL; // file not found struct in_addr dummyAddr; dummyAddr.s_addr = 0; Groupsock dummyGroupsock(envir(), dummyAddr, 0, 0); unsigned char rtpPayloadType = 96 + trackNumber()-1; // if dynamic RTPSink* dummyRTPSink = createNewRTPSink(&dummyGroupsock, rtpPayloadType, inputSource); setSDPLinesFromRTPSink(dummyRTPSink, inputSource, estBitrate); Medium::close(dummyRTPSink); closeStreamSource(inputSource); // will deallocate my device source and make my application crash. Commenting out this line and above line makes my application running } return fSDPLines; } Commenting out those line is the right solution or I am doing something wrong elsewhere? Also is it required to store the rtpSink returned by the function createNewRTPSink() in my subclass of onDemandServerMediaSubSession? Thanks and Regards, Bhawesh Kumar Choudhary Graphics Engineer VizExperts India Pvt. Ltd. -------------- next part -------------- An HTML attachment was scrubbed... URL: From warren at etr-usa.com Wed Jan 8 11:45:10 2014 From: warren at etr-usa.com (Warren Young) Date: Wed, 08 Jan 2014 12:45:10 -0700 Subject: [Live-devel] Compiling livemedia for Axis Cris In-Reply-To: <7F763CFFF78CA443BD136F74E16179E502B9EC@exgmbxfz01.sasken.com> References: <7F763CFFF78CA443BD136F74E16179E5029926@exgmbxfz01.sasken.com>, <52CC4499.1070102@etr-usa.com> <7F763CFFF78CA443BD136F74E16179E502B9EC@exgmbxfz01.sasken.com> Message-ID: <52CDAAC6.5010703@etr-usa.com> On 1/8/2014 02:37, Kari Kallioinen wrote: > Any ideas or haven't I noticed something? Does it work if you build a shared library? If so, try adding -fPIC to the command line. Some embedded systems require relocatable code. From finlayson at live555.com Wed Jan 8 12:28:43 2014 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 8 Jan 2014 12:28:43 -0800 Subject: [Live-devel] Subclassing OnDemandServerMediaSubsession for Live Video Sources to stream h264 In-Reply-To: <00ef01cf0c85$a3e21480$eba63d80$@vizexperts.com> References: <00ef01cf0c85$a3e21480$eba63d80$@vizexperts.com> Message-ID: <600D66D6-F513-42D2-8CBF-4567102D30A9@live555.com> > I am using Live555 library to create rtsp stream for live device source. I read the FAQ and mailing list and sub classed all the required classes :: > 1. Framed source (using the deviceSource.cpp model) > 2. OnDemandServerMediaSubsession (using H264VideoFileServerMediaSubsession model) > But few on the questions are I am not able to figure out: > 1. In each call of my doGetNextFrame() of my device source I am assuming that no frame data is available and I am returning. Instead whenever I receive data I trigger a event which schedule getNextFrame() of my device source. Is there is status check (waiting for event or running) required for Live555 run loop to schedule the task of the new data arrival in the function signalNewFrameData() of device source? The LIVE555 event loop (which you entered when you ran "doEventLoop()") automatically figures out when "TaskScheduler::triggerEvent()" has been called (from another thread), and calls the appropriate handler function (which you registered when you called "createEventTrigger()"; i.e., the "deliverFrame0()" function in the "DeviceSource" example code). > 2. 
In my OnDemandServerMediaSubsession subclass I have implemented both required function createNewStreamSource() and createNewRTPSink(). Since I set the reuseFirstSource to true in my OnDemandServerMediaSunsession I keep a reference of my device source in my dataMember variable. That's your problem. Setting "reuseFirstSource" to True simply means that only one instance of the source class will be created *at a time*, regardless of the number of concurrent RTSP clients that have requested the stream. It does *not* mean that only one instance of the source class will be created *ever*. In fact, as you noticed, the "OnDemandServerMediaSubsession" code creates an initial instance of the source class (it uses this to generate the SDP description in response to the first RTSP "DESCRIBE"). It then closes this object. Later, when the first RTSP client does a RTSP "SETUP", another instance of the source class will be created. (That instance will not get closed again until the last concurrent client does a "TEARDOWN".) So, your code should allow for the possibility of more than one instance of your data source class being instantiated (and later closed) - but sequentially, not concurrently. DO NOT modify the supplied "OnDemandServerMediaSubsession" source code. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From lucian.orasanu at gmail.com Thu Jan 9 01:34:22 2014 From: lucian.orasanu at gmail.com (Orasanu Lucian) Date: Thu, 9 Jan 2014 11:34:22 +0200 Subject: [Live-devel] openRtsp problems!! In-Reply-To: <52CD8C0E.8090606@speed666.info> References: <798994A8-A21D-4CDE-9AEB-CAADB1A5FEED@live555.com> <52CD8C0E.8090606@speed666.info> Message-ID: Hello, I know that this lists are useful for others, when they have similar problems. My system is Gentoo linux x64, afther an emerge -e system, emerge -e world and upgrade kernel from 3.7.1-gentoo to 3.12.1-gentoo all is perfect now!!! Is working like an charm!!, dont know for shur but I belive that emerge -e world, fixed the problem. Thank you for attention!! and help!! On Wed, Jan 8, 2014 at 7:34 PM, Marcin wrote: > Hi, > What i can say it that you try to connect to AirCam camera which has very > limited RTSP implementation and dont count that something will change in > the future. > There is even no beta FW available above version 1.2 that you already have. > Marcin > > > W dniu 2014-01-08 15:39, Ross Finlayson pisze: > > I can't explain the "Bad file descriptor" error; I've never seen that > happen before. Something is wrong either with the operating system on that > computer, or the way in which you built the "LIVE555 Streaming Media" code > for that operating system. (If it's Windows, then perhaps you used the > wrong version of the 'winsock' headers and/or library??) > > The "Unsupported transport" error with the second stream - > rtsp://37.43.41.142:554/live/ch00_0 - is a different problem; it's a > problem with the server (i.e., camera). The server is telling the client > ("openRTSP") that it doesn't support regular RTP-over-UDP streaming. > Instead, you'll need to request RTP-over-TCP streaming, by running > "openRTSP" with the "-t" option. > > > Ross Finlayson > Live Networks, Inc. 
> http://www.live555.com/ > > > > _______________________________________________ > live-devel mailing listlive-devel at lists.live555.comhttp://lists.live555.com/mailman/listinfo/live-devel > > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From bhawesh at vizexperts.com Fri Jan 10 04:38:44 2014 From: bhawesh at vizexperts.com (Bhawesh Kumar Choudhary) Date: Fri, 10 Jan 2014 18:08:44 +0530 Subject: [Live-devel] Subclassing OnDemandServerMediaSubsession for Live Video Sources to stream h264 In-Reply-To: <600D66D6-F513-42D2-8CBF-4567102D30A9@live555.com> References: <00ef01cf0c85$a3e21480$eba63d80$@vizexperts.com> <600D66D6-F513-42D2-8CBF-4567102D30A9@live555.com> Message-ID: <009001cf0e00$e75fd540$b61f7fc0$@vizexperts.com> Thanks, the fix does work, I am able to stream my live source. But the video quality I am getting on other side is quite glitch. Basically I am streaming encoded data from ffmpeg's output packet (AVPacket) but I am suspecting that since FFmpeg Gives more than one nal unit in a single AVPacket there might be data loss in live media while streaming because of which playing in client side producing wrong images. I have increased the outPacketBuffer::maxSize to 160000 but it doesn't seem to fix the problem. Does live media do the parsing of Nal unit which are inside FFmpeg's AVPacket or I have to copy single nal unit at a time in my device source? Thanks Bhawesh Kumar VizExperts India Pvt. Ltd. From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: 09 January 2014 01:59 To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Subclassing OnDemandServerMediaSubsession for Live Video Sources to stream h264 I am using Live555 library to create rtsp stream for live device source. I read the FAQ and mailing list and sub classed all the required classes :: 1. Framed source (using the deviceSource.cpp model) 2. OnDemandServerMediaSubsession (using H264VideoFileServerMediaSubsession model) But few on the questions are I am not able to figure out: 1. In each call of my doGetNextFrame() of my device source I am assuming that no frame data is available and I am returning. Instead whenever I receive data I trigger a event which schedule getNextFrame() of my device source. Is there is status check (waiting for event or running) required for Live555 run loop to schedule the task of the new data arrival in the function signalNewFrameData() of device source? The LIVE555 event loop (which you entered when you ran "doEventLoop()") automatically figures out when "TaskScheduler::triggerEvent()" has been called (from another thread), and calls the appropriate handler function (which you registered when you called "createEventTrigger()"; i.e., the "deliverFrame0()" function in the "DeviceSource" example code). 2. In my OnDemandServerMediaSubsession subclass I have implemented both required function createNewStreamSource() and createNewRTPSink(). Since I set the reuseFirstSource to true in my OnDemandServerMediaSunsession I keep a reference of my device source in my dataMember variable. That's your problem. Setting "reuseFirstSource" to True simply means that only one instance of the source class will be created *at a time*, regardless of the number of concurrent RTSP clients that have requested the stream. 
It does *not* mean that only one instance of the source class will be created *ever*. In fact, as you noticed, the "OnDemandServerMediaSubsession" code creates an initial instance of the source class (it uses this to generate the SDP description in response to the first RTSP "DESCRIBE"). It then closes this object. Later, when the first RTSP client does a RTSP "SETUP", another instance of the source class will be created. (That instance will not get closed again until the last concurrent client does a "TEARDOWN".) So, your code should allow for the possibility of more than one instance of your data source class being instantiated (and later closed) - but sequentially, not concurrently. DO NOT modify the supplied "OnDemandServerMediaSubsession" source code. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Jan 10 07:51:40 2014 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 10 Jan 2014 07:51:40 -0800 Subject: [Live-devel] Subclassing OnDemandServerMediaSubsession for Live Video Sources to stream h264 In-Reply-To: <009001cf0e00$e75fd540$b61f7fc0$@vizexperts.com> References: <00ef01cf0c85$a3e21480$eba63d80$@vizexperts.com> <600D66D6-F513-42D2-8CBF-4567102D30A9@live555.com> <009001cf0e00$e75fd540$b61f7fc0$@vizexperts.com> Message-ID: > Thanks, the fix does work, I am able to stream my live source. But the video quality I am getting on other side is quite glitch. Basically I am streaming encoded data from ffmpeg?s output packet (AVPacket) but I am suspecting that since FFmpeg Gives more than one nal unit in a single AVPacket there might be data loss in live media while streaming because of which playing in client side producing wrong images. I have increased the outPacketBuffer::maxSize to 160000 but it doesn?t seem to fix the problem. > Does live media do the parsing of Nal unit which are inside FFmpeg?s AVPacket or I have to copy single nal unit at a time in my device source? The latter - because your input source consists of discrete NAL units, rather than a byte stream. Specifically, your input source must deliver NAL units, one at a time, *without* 0x00000001 start codes, into a "H264VideoStreamDiscreteFramer" (*not* a "H264VideoStreamFramer"; that class is used only when streaming a H.264 video byte stream - i.e., from a file or a pipe). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From vasudevan.madubhushi at siemens.com Fri Jan 10 11:34:25 2014 From: vasudevan.madubhushi at siemens.com (Madubhushi, Vasudevan) Date: Fri, 10 Jan 2014 19:34:25 +0000 Subject: [Live-devel] Building live Media on Windows as a Dynamically linked library Message-ID: <0E446C12F0F2D647B0F74D1F55AF6F942375E58C@USLZUA0EM24MSX.ww017.siemens.net> Hi Ross, After trawling through the posts relate to this topic , I found one which expresses your opinion on using Live555 as a shared library. (http://lists.live555.com/pipermail/live-devel/2004-July/000955.html) . Certainly understand the reasoning behind it but then the conundrum we have here is that statically linking violates the LGPL terms and to enable dynamic linking the LGPL must be violated to add the DLL export tags to classes and API's of live Media. So the question I have is, did you guys get a chance since then to provide makefile options where live media code could be deployed as a DLL on Windows? 
If not is this planned in the near future? Thanks, Vasu This message and any attachments are solely for the use of intended recipients. The information contained herein may include trade secrets, protected health or personal information, privileged or otherwise confidential information. Unauthorized review, forwarding, printing, copying, distributing, or using such information is strictly prohibited and may be unlawful. If you are not an intended recipient, you are hereby notified that you received this email in error, and that any review, dissemination, distribution or copying of this email and any attachment is strictly prohibited. If you have received this email in error, please contact the sender and delete the message and any attachment from your system. Thank you for your cooperation -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Jan 10 12:04:40 2014 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 10 Jan 2014 12:04:40 -0800 Subject: [Live-devel] Building live Media on Windows as a Dynamically linked library In-Reply-To: <0E446C12F0F2D647B0F74D1F55AF6F942375E58C@USLZUA0EM24MSX.ww017.siemens.net> References: <0E446C12F0F2D647B0F74D1F55AF6F942375E58C@USLZUA0EM24MSX.ww017.siemens.net> Message-ID: I'm not planning on making any non-standard, Windows-specific additions to the code. Other operating systems have managed to use the "LIVE555 Streaming Media" as dynamically-linked libraries in their applications - note, for example, the supplied "config.linux-with-shared-libraries" configuration file (for generating Makefiles) for Linux - without requiring any modification to the code. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From warren at etr-usa.com Fri Jan 10 12:55:56 2014 From: warren at etr-usa.com (Warren Young) Date: Fri, 10 Jan 2014 13:55:56 -0700 Subject: [Live-devel] Building live Media on Windows as a Dynamically linked library In-Reply-To: References: <0E446C12F0F2D647B0F74D1F55AF6F942375E58C@USLZUA0EM24MSX.ww017.siemens.net> Message-ID: <52D05E5C.9050808@etr-usa.com> On 1/10/2014 13:04, Ross Finlayson wrote: > I'm not planning on making any non-standard, Windows-specific additions > to the code. If you're immovable on that point, then someone needs to write a .def file for the DLL, which is *painful* for a large C++ library like live555: http://msdn.microsoft.com/en-us/library/d91k01sh.aspx > Other operating systems have managed to use the "LIVE555 > Streaming Media" as dynamically-linked libraries in their applications - Other linkers on other operating systems behave differently. No big revelation, that. The Visual C++ linker will automatically do the right thing if you do something like this: // In a common header file somewhere #if MAKING_WINDOWS_DLL # define EXPORT __declspec(dllexport) #elif _WIN32 # define EXPORT __declspec(dllimport) #else # define EXPORT #endif // In a .h file for a procedural module: EXPORT void myfunction(...)... // In a .h file for an OO module: class EXPORT MyClass : public MyBase ... The dllimport clause is necessary to allow the same set of headers to be used while building the DLL and for building *against* the DLL. The final clause makes EXPORT a no-op on non-Windows systems. You'd obviously need to add -DMAKING_WINDOWS_DLL to the Windows config file, or to the VC++ project files. 
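A minimal, self-contained sketch of the header pattern described above might look as follows; the file and symbol names (myExport.hh, LIVE_EXPORT, MyStreamer) are hypothetical illustrations, not part of the LIVE555 distribution:

// myExport.hh - hypothetical common header holding the export/import macro
#ifndef MY_EXPORT_HH
#define MY_EXPORT_HH
#if defined(MAKING_WINDOWS_DLL)   // defined (e.g. -DMAKING_WINDOWS_DLL) only while building the DLL itself
#  define LIVE_EXPORT __declspec(dllexport)
#elif defined(_WIN32)             // a client building *against* the DLL imports the symbols instead
#  define LIVE_EXPORT __declspec(dllimport)
#else                             // no-op on non-Windows platforms
#  define LIVE_EXPORT
#endif
#endif

// myStreamer.hh - a decorated class and a decorated free function
#include "myExport.hh"

class LIVE_EXPORT MyStreamer {
public:
  MyStreamer();
  void start();
};

LIVE_EXPORT unsigned myPacketCount();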
It is a lot of work to decorate all of the public symbols this way, but a lot less than to create a .def file. From parkchan1960 at gmail.com Fri Jan 10 00:12:47 2014 From: parkchan1960 at gmail.com (Pak Man Chan) Date: Fri, 10 Jan 2014 16:12:47 +0800 Subject: [Live-devel] HTTP live streaming crash Message-ID: Hi, The latest release(0107) will fail with a SIGSEGV on HTTP live streaming. It faults at MediaSink::onSourceClosure. In TCPStreamSink::processBuffer, onSourceClosure is being called twice, one being called inside the fSource->getNextFrame call, and the other being called when the source is done. Would this be a problem? Thanks. Park -------------- next part -------------- An HTML attachment was scrubbed... URL: From bhawesh at vizexperts.com Sat Jan 11 00:41:20 2014 From: bhawesh at vizexperts.com (Bhawesh Kumar Choudhary) Date: Sat, 11 Jan 2014 14:11:20 +0530 Subject: [Live-devel] Subclassing OnDemandServerMediaSubsession for Live Video Sources to stream h264 In-Reply-To: References: <00ef01cf0c85$a3e21480$eba63d80$@vizexperts.com> <600D66D6-F513-42D2-8CBF-4567102D30A9@live555.com> <009001cf0e00$e75fd540$b61f7fc0$@vizexperts.com> Message-ID: <001d01cf0ea8$e7c6e520$b754af60$@vizexperts.com> OK, I have tested this approach before as well. The server streams the data happily, but clients are not able to play it properly. My approach is to copy a single NAL unit into the buffer in each getNextFrame() call of my device source. Obviously the NAL units must not contain the start code; I removed it by stripping the first four bytes, and Live555 seems to be happy with the NAL units and streams the data without complaining. My H264VideoStreamDiscreteFramer does get the SPS and PPS NAL units before every key frame. But clients (openRTSP.exe) are not able to play: they get the first PLAY response from the server and then wait for data forever. I am not able to figure out what I am doing wrong. Thanks Bhawesh Kumar VizExperts India Pvt. Ltd. From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: 10 January 2014 21:22 To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Subclassing OnDemandServerMediaSubsession for Live Video Sources to stream h264 Thanks, the fix does work, I am able to stream my live source. But the video quality I am getting on other side is quite glitch. Basically I am streaming encoded data from ffmpeg's output packet (AVPacket) but I am suspecting that since FFmpeg Gives more than one nal unit in a single AVPacket there might be data loss in live media while streaming because of which playing in client side producing wrong images. I have increased the outPacketBuffer::maxSize to 160000 but it doesn't seem to fix the problem. Does live media do the parsing of Nal unit which are inside FFmpeg's AVPacket or I have to copy single nal unit at a time in my device source? The latter - because your input source consists of discrete NAL units, rather than a byte stream. Specifically, your input source must deliver NAL units, one at a time, *without* 0x00000001 start codes, into a "H264VideoStreamDiscreteFramer" (*not* a "H264VideoStreamFramer"; that class is used only when streaming a H.264 video byte stream - i.e., from a file or a pipe). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed...
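A minimal sketch of an "OnDemandServerMediaSubsession" subclass wired up this way might look as follows. "MyNALUnitSource" is a hypothetical device source (standing in for the FFmpeg-fed source discussed above, modeled on "DeviceSource.cpp") that delivers exactly one NAL unit per delivery, with the 4-byte start code already stripped; this is an illustrative sketch, not code from the LIVE555 distribution:

#include "liveMedia.hh"

// Hypothetical device source: delivers one NAL unit per doGetNextFrame(), without start codes.
class MyNALUnitSource: public FramedSource {
public:
  static MyNALUnitSource* createNew(UsageEnvironment& env); // definition not shown here
protected:
  MyNALUnitSource(UsageEnvironment& env): FramedSource(env) {}
  virtual void doGetNextFrame(); // copies the next NAL unit into fTo when it becomes available
};

class MyH264LiveServerMediaSubsession: public OnDemandServerMediaSubsession {
public:
  static MyH264LiveServerMediaSubsession* createNew(UsageEnvironment& env, Boolean reuseFirstSource) {
    return new MyH264LiveServerMediaSubsession(env, reuseFirstSource);
  }
protected:
  MyH264LiveServerMediaSubsession(UsageEnvironment& env, Boolean reuseFirstSource)
    : OnDemandServerMediaSubsession(env, reuseFirstSource) {}

  // May be called more than once (sequentially), as explained earlier in this thread:
  virtual FramedSource* createNewStreamSource(unsigned /*clientSessionId*/, unsigned& estBitrate) {
    estBitrate = 500; // kbps; a rough placeholder estimate
    FramedSource* nalSource = MyNALUnitSource::createNew(envir());
    // A *discrete* framer, because the input is one NAL unit at a time, without start codes:
    return H264VideoStreamDiscreteFramer::createNew(envir(), nalSource);
  }

  virtual RTPSink* createNewRTPSink(Groupsock* rtpGroupsock, unsigned char rtpPayloadTypeIfDynamic,
                                    FramedSource* /*inputSource*/) {
    return H264VideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic);
  }
};

In the server's main program, this subsession would then be added to a "ServerMediaSession" with addSubsession(), exactly as Ross describes elsewhere in this thread for the separate video and audio subsessions.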
URL: From jawahar456 at gmail.com Fri Jan 10 05:10:11 2014 From: jawahar456 at gmail.com (Jawahar Venugopal) Date: Fri, 10 Jan 2014 18:40:11 +0530 Subject: [Live-devel] Difficulty in streaming more than one videos using live555proxyserver program. (Jawahar Venugopal) Message-ID: Hi, Thanks you Same problem.. Given diagnostic output with -V.. Find attachments.. Regards, Jawahar Venugopal -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: typescript Type: application/octet-stream Size: 11588 bytes Desc: not available URL: From torben.schmidt at gmail.com Fri Jan 10 10:06:11 2014 From: torben.schmidt at gmail.com (Torben Schmidt) Date: Fri, 10 Jan 2014 19:06:11 +0100 Subject: [Live-devel] MediaServer stream sent over wrong interface even after SendingInterfaceAddr and ReceivingInterfaceAddr were changed Message-ID: Hi, I have a Server with multiple network interfaces and I want to run an instance of live555mediaServer on each interface. I am setting SendingInterfaceAddr and ReceivingInterfaceAddr to the IP of the desired interface. This will create the RTSP Server on the right interface, but when I start a stream the RTP Packets will always be sent by eth0. Is there a way to change this behavior, so the RTP Packets will be sent from the right interface? (Additional info: All interfaces have Multicast enabled, System is running debian 64Bit, Latest live555 release) Regards -------------- next part -------------- An HTML attachment was scrubbed... URL: From tboonefisher at clear.net Fri Jan 10 12:44:15 2014 From: tboonefisher at clear.net (TBooneFisher) Date: Fri, 10 Jan 2014 14:44:15 -0600 Subject: [Live-devel] Is RealAudio Or SaveFrom.net Using Live555 ? Message-ID: <84B3D20C1153442BBEBD5ABC3F657F2D@TBNotebook> Ross, do you happen to know if either http://www.real.com/realdownloader or http://en.savefrom.net/ are using Live555? The reason that I ask is that Realdownloader is having a problem with YouTube streams that SaveFrom is not. The .mp4 files that Realdownloader creates from YouTube are missing audio but will play with VLC. They will not play at all with Win7 MediaPlayer. SaveFrom downloads work fine with MP & VLC. Of course, Real blames the problem on YouTube and states that only their RealPlayer will play the files;-) Your thoughts? Tom Fisher From john95018 at gmail.com Sun Jan 12 10:20:09 2014 From: john95018 at gmail.com (john dicostanzo) Date: Sun, 12 Jan 2014 23:50:09 +0530 Subject: [Live-devel] problem in MPEG2TransportStreamIndexer app.. Message-ID: HI, I am using test app MPEG2TransportStreamIndexer. I built windows solution for it and built it in debug mode but when i am running it, it is getting crashed in file "filesink.cpp" at line if (fOutFid == NULL || fflush(fOutFid) == EOF) in function afterGettingFrame. Regards, John -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sun Jan 12 13:39:57 2014 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 12 Jan 2014 13:39:57 -0800 Subject: [Live-devel] problem in MPEG2TransportStreamIndexer app.. In-Reply-To: References: Message-ID: <905AA7B6-96F3-4FBE-8FBD-81871C3D1C80@live555.com> > I am using test app MPEG2TransportStreamIndexer. > I built windows solution for it and built it in debug mode but when i am running it, > it is getting crashed in file "filesink.cpp" at line > > if (fOutFid == NULL || fflush(fOutFid) == EOF) in function afterGettingFrame. 
Are you using the latest version of the software. The "FileSink.cpp" code *did* change in the latest version, but not in any way that should have affected "MPEG2TransportStreamIndexer". If you upgraded from the previous version, make sure that you don't have any old binaries lying around. Make sure that you build the software from scratch. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sun Jan 12 16:27:58 2014 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 12 Jan 2014 16:27:58 -0800 Subject: [Live-devel] HTTP live streaming crash In-Reply-To: References: Message-ID: <2A9286C6-B9B1-43F7-9B09-8435B22E173E@live555.com> > The latest release(0107) will fail with a SIGSEGV on HTTP live streaming. That's strange. The code for HTTP Live Streaming has not been changed in several months. > It faults at MediaSink::onSourceClosure. It would be nice to know exactly why the fault is occurring. > In TCPStreamSink::processBuffer, onSourceClosure is being called twice, one being called inside the fSource->getNextFrame call, and the other being called when the source is done. No, the function that's being called (as a 'input source closure' handler) as a result of "getNextFrame()" is "TCPStreamSink::ourOnSourceClosure()" (note the "our" in the name). It's a different function from the "onSourceClosure()" function that's called later. There might well still be a bug in the code, but that's not it. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sun Jan 12 16:33:16 2014 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 12 Jan 2014 16:33:16 -0800 Subject: [Live-devel] Subclassing OnDemandServerMediaSubsession for Live Video Sources to stream h264 In-Reply-To: <001d01cf0ea8$e7c6e520$b754af60$@vizexperts.com> References: <00ef01cf0c85$a3e21480$eba63d80$@vizexperts.com> <600D66D6-F513-42D2-8CBF-4567102D30A9@live555.com> <009001cf0e00$e75fd540$b61f7fc0$@vizexperts.com> <001d01cf0ea8$e7c6e520$b754af60$@vizexperts.com> Message-ID: <80756C27-E887-4958-9F10-D7CDEAB5844F@live555.com> > But clients (openRTSP.exe) is not able to play. It will get the first play response from server and then waiting for data forever. I am able to figure out what I am doing wrong?? See http://www.live555.com/liveMedia/faq.html#openRTSP-empty-files Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From peter.mccarthy at bibbol.ltd.uk Mon Jan 13 05:28:06 2014 From: peter.mccarthy at bibbol.ltd.uk (Peter McCarthy) Date: Mon, 13 Jan 2014 13:28:06 +0000 Subject: [Live-devel] SIGSEGV Exception After sendDescribeCommand() Called Message-ID: <52D3E9E6.8080504@bibbol.ltd.uk> Hi Ross, I'm developing a prototype Android app (Nexus 7, Jelly Bean 4.3) which uses some source code based on the testRTSPClient.cpp sample. The app currently uses the Live555 release dated 2013-10-03 and it works very reliably. However, if I then use any release after that date (including 2014-01-11), the app immediately throws a SIGSEGV exception. Now, this happens sometime after sendDescribeCommand(), but before continueAfterDESCRIBE() , is called. I've included an abbreviated dump of the exception below: 01-11 10:37:09.158 2797-2826/com._._ A/libc? 
Fatal signal 11 (SIGSEGV) at 0x00000000 (code=1), thread 2826 (m._._) 01-11 10:37:09.208 2797-2797/com._._ I/Choreographer? Skipped 561 frames! The application may be doing too much work on its main thread. 01-11 10:37:09.258 120-120/? I/DEBUG? *** *** *** *** *** *** *** *** *** *** *** *** *** *** *** *** 01-11 10:37:09.258 120-120/? I/DEBUG? Build fingerprint: 'google/nakasi/grouper:4.3/JWR66Y/776638:user/release-keys' 01-11 10:37:09.258 120-120/? I/DEBUG? Revision: '0' 01-11 10:37:09.258 120-120/? I/DEBUG? pid: 2797, tid: 2826, name: m._._ >>> com._._ <<< 01-11 10:37:09.258 120-120/? I/DEBUG? signal 11 (SIGSEGV), code 1 (SEGV_MAPERR), fault addr 00000000 . . . 01-11 10:37:09.358 120-120/? I/DEBUG? #01 pc 00089331 /data/app-lib/com._._-2/lib_RtspH264.so (RTSPClient::sendRequest(RTSPClient::RequestRecord*)+684) 01-11 10:37:09.358 120-120/? I/DEBUG? #02 pc 00089483 /data/app-lib/com._._-2/lib_RtspH264.so (RTSPClient::connectionHandler1()+186) . . . 01-11 10:37:09.358 120-120/? I/DEBUG? 6af14c54 69843435 /data/app-lib/com._._-2/lib_RtspH264.so (BasicTaskScheduler::setBackgroundHandling(int, int, void (*)(void*, int), void*)) . . . 01-11 10:37:09.368 120-120/? I/DEBUG? 6af14ccc 69843791 /data/app-lib/com._._-2/lib_RtspH264.so (BasicTaskScheduler::SingleStep(unsigned int)+668) . . . Have you got any ideas as to why this is happening? Thanks in advance for your help. Regards, Peter From finlayson at live555.com Mon Jan 13 07:14:59 2014 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 13 Jan 2014 07:14:59 -0800 Subject: [Live-devel] SIGSEGV Exception After sendDescribeCommand() Called In-Reply-To: <52D3E9E6.8080504@bibbol.ltd.uk> References: <52D3E9E6.8080504@bibbol.ltd.uk> Message-ID: <8FA04F5D-736B-4CE7-8D10-1F1A6AAF55A3@live555.com> > The app currently uses the Live555 release dated 2013-10-03 and it works very reliably. However, if I then use any release after that date (including 2014-01-11), the app immediately throws a SIGSEGV exception. > > Now, this happens sometime after sendDescribeCommand(), but before continueAfterDESCRIBE() , is called. Unfortunately, because the problem seems to happen only with your custom application, I'm not sure I can help you with it. I'm also unaware of any recent changes to our RTSP client-related code that may be causing your problem. I assume that - whenever you upgrade to a new version of our code - you're doing a clean replacement - i.e., completely removing all old binaries (and all LIVE555 ".cpp" and ".hh" files) before you recompile. > I've included an abbreviated dump of the exception below: Abbreviating the stack trace doesn't help; you should post the whole thing. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From peter.mccarthy at bibbol.ltd.uk Mon Jan 13 08:32:26 2014 From: peter.mccarthy at bibbol.ltd.uk (Peter McCarthy) Date: Mon, 13 Jan 2014 16:32:26 +0000 Subject: [Live-devel] SIGSEGV Exception After sendDescribeCommand() Called In-Reply-To: <8FA04F5D-736B-4CE7-8D10-1F1A6AAF55A3@live555.com> References: <52D3E9E6.8080504@bibbol.ltd.uk> <8FA04F5D-736B-4CE7-8D10-1F1A6AAF55A3@live555.com> Message-ID: <52D4151A.6050700@bibbol.ltd.uk> Oh dear, it's a very embarrassing error by me. I've been compiling with old Live555 headers. This has been causing the exception. It all works fine now. Thanks for your help, Ross. 
On 13/01/14 15:14, Ross Finlayson wrote: >> The app currently uses the Live555 release dated 2013-10-03 and it >> works very reliably. However, if I then use any release after that >> date (including 2014-01-11), the app immediately throws a SIGSEGV >> exception. >> >> Now, this happens sometime after sendDescribeCommand(), but before >> continueAfterDESCRIBE() , is called. > > Unfortunately, because the problem seems to happen only with your > custom application, I'm not sure I can help you with it. I'm also > unaware of any recent changes to our RTSP client-related code that may > be causing your problem. I assume that - whenever you upgrade to a > new version of our code - you're doing a clean replacement - i.e., > completely removing all old binaries (and all LIVE555 ".cpp" and ".hh" > files) before you recompile. > > >> I've included an abbreviated dump of the exception below: > > Abbreviating the stack trace doesn't help; you should post the whole > thing. > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From michel.promonet at thalesgroup.com Mon Jan 13 09:03:11 2014 From: michel.promonet at thalesgroup.com (PROMONET Michel) Date: Mon, 13 Jan 2014 18:03:11 +0100 Subject: [Live-devel] our_MD5Data mismatched free Message-ID: <20052_1389632592_52D41C50_20052_5060_1_1BE8971B6CFF3A4F97AF4011882AA2550156423E7385@THSONEA01CMS01P.one.grp> Hi Ross, valgrind reports a mismatch between allocation and free: - Authenticator::computeDigestResponse calls our_MD5Data, which allocates using new char[] - Authenticator::reclaimDigestResponse calls free on the buffer Don't you think it would be better to replace the "free" with "delete []"? Best Regards, Michel. [@@ THALES GROUP INTERNAL @@] -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Jan 13 12:00:27 2014 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 13 Jan 2014 12:00:27 -0800 Subject: [Live-devel] our_MD5Data mismatched free In-Reply-To: <20052_1389632592_52D41C50_20052_5060_1_1BE8971B6CFF3A4F97AF4011882AA2550156423E7385@THSONEA01CMS01P.one.grp> References: <20052_1389632592_52D41C50_20052_5060_1_1BE8971B6CFF3A4F97AF4011882AA2550156423E7385@THSONEA01CMS01P.one.grp> Message-ID: <9F601266-AB99-4E01-9037-70AF05C78E21@live555.com> > valgrind reports a mismatch between allocation and free: > - Authenticator::computeDigestResponse calls our_MD5Data, which allocates using new char[] > - Authenticator::reclaimDigestResponse calls free on the buffer > > Don't you think it would be better to replace the "free" with "delete []"? Yes, that was my mistake. I recently changed our MD5 code from C to C++ (and changed a "malloc()" to a "new()"), but forgot to change the corresponding "free()" to a "delete[]". Thanks for the report. I've just installed a new version (2014.01.13) that fixes this. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Jan 13 14:51:51 2014 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 13 Jan 2014 14:51:51 -0800 Subject: [Live-devel] Difficulty in streaming more than one videos using live555proxyserver program.
(Jawahar Venugopal) In-Reply-To: References: Message-ID: <150BCDCC-E8CC-4184-A5B1-B40A4CF91C3E@live555.com> > Same problem.. Given diagnostic output with -V.. > Find attachments.. The problem is your server (i.e., network camera). Each instance of the camera is sending its stream to the same UDP port (5006 in this case). Therefore, the proxy server receives both streams on the same UDP port, and can't distinguish them. Your server (i.e., network camera) is unusual in that it includes a specific IP destination address and UDP port (5006) in the SDP description that it returns in response to "DESCRIBE". It is more common (for unicast streams like this) for the UDP port number (and IP destination address) in the SDP description to be set to 0 (and 0.0.0.0). The subsequent "SETUP" command (and response) will then be used to set the specific IP destination address and port for the stream. In summary: You need to fix (or replace) your network cameras. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Jan 13 14:54:46 2014 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 13 Jan 2014 14:54:46 -0800 Subject: [Live-devel] MediaServer stream sent over wrong interface even after SendingInterfaceAddr and ReceivingInterfaceAddr were changed In-Reply-To: References: Message-ID: > I have a Server with multiple network interfaces and I want to run an instance of live555mediaServer on each interface. > > I am setting SendingInterfaceAddr and ReceivingInterfaceAddr to the IP of the desired interface. > This will create the RTSP Server on the right interface, but when I start a stream the RTP Packets will always be sent by eth0. > > Is there a way to change this behavior, so the RTP Packets will be sent from the right interface? Try upgrading to the most recent release of the code. This new version will use "ReceivingInterfaceAddr" (if set to something other than INADDR_ANY) as 'our IP address'. It may work for you. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Jan 13 15:00:06 2014 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 13 Jan 2014 15:00:06 -0800 Subject: [Live-devel] Is RealAudio Or SaveFrom.net Using Live555 ? In-Reply-To: <84B3D20C1153442BBEBD5ABC3F657F2D@TBNotebook> References: <84B3D20C1153442BBEBD5ABC3F657F2D@TBNotebook> Message-ID: > Ross, do you happen to know if either http://www.real.com/realdownloader > or http://en.savefrom.net/ are using Live555? I don't know (but have no reason to believe that they would be using our software). Why don't you ask them. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From eadeli at gmail.com Mon Jan 13 07:27:31 2014 From: eadeli at gmail.com (Ehsan Adeli) Date: Mon, 13 Jan 2014 18:57:31 +0330 Subject: [Live-devel] Problem with recording incoming RTSP streams Message-ID: Hello, I am using live555 and the openRTSP program to record videos from IP cameras. But some cameras, dependent to the network condition and their settings, sometimes stream videos with less frame-rates as specified in their profile settings (They drop frames or quality to meet network requirements). 
But when I am recording and saving the incoming stream to mp4 files, openRTSP sets the frame-rate as specified by the SDP, or the default value, or as set by the -f option. That causes the recorded video not to play properly. I have tried recording with VLC without any transcoding, and it sets the frame-rate correctly. Do you have any ideas how this could be solved? Can I set the mp4 file to be VFR? Or can I reset the frame-rate in the mp4 file after the recording is finished? BTW, I know one naive solution would be re-encoding the file, but can it be done without decoding/encoding? Thank you all. Regards, Ehsan -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Jan 13 15:13:43 2014 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 13 Jan 2014 15:13:43 -0800 Subject: [Live-devel] Problem with recording incoming RTSP streams In-Reply-To: References: Message-ID: <6C734BAF-D5E5-4EA8-A482-7B7230C68D45@live555.com> The basic problem here is that the ".mov"/".mp4" file format is ill-suited for recording incoming RTP streams, because, in those file formats, it is difficult to record media frames with their accurate presentation times. (VLC doesn't have this problem because it is decoding, then re-encoding the stream, even if it doesn't 'transcode' to a different codec or frame rate.) The best solution is to use a better media file format - like Matroska - that allows us to record each frame with its precise presentation time. We have a pending 'funded project' to add support for writing Matroska files: http://live555.com/funded-projects/live555_mkv.html Feel free to contribute to this project if you would like to see this happen. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From phuocpham at ssu.ac.kr Tue Jan 14 05:14:17 2014 From: phuocpham at ssu.ac.kr (daniel) Date: Tue, 14 Jan 2014 22:14:17 +0900 (KST) Subject: [Live-devel] Problem with mpegts muxer Message-ID: <52d53b493fe5_@_imoxion.com> Hi Ross, I'm working on an embedded device, an IP network camera. I am using MPEG2TransportStreamFromESSource to mux H.264 and AAC into a transport stream. I succeeded in the muxing. But at the client side, the bottom right corner always has some broken parts, as shown in the image at the link below. The broken part is shown in the red square of the image. http://s1208.photobucket.com/user/phuocpham09t/media/mpegts.png.html Do you have any ideas why? I really appreciate your help. -------------- next part -------------- An HTML attachment was scrubbed... URL: From Santosh.Basawaraj at lnties.com Tue Jan 14 04:41:16 2014 From: Santosh.Basawaraj at lnties.com (SantoshBasawaraj) Date: Tue, 14 Jan 2014 12:41:16 +0000 Subject: [Live-devel] Live555 mpeg4(.mp4) streaming Message-ID: Hello, I've tried the Live555 server for streaming videos between client and server. I used VLC as the client tool to open the videos. I am able to stream videos with the extensions .ts, .mpg, .webm, .aac and also mp3 files, but I am not able to stream .mp4 (MPEG-4) files. I've also tried the enhanced version of live555, which supports the MPEG-4 format. When I try to stream the mp4 format, VLC shows the following error: [0x877da80] dummy interface: using the dummy interface module...
[0xb5000970] live555 demux error: Failed to connect with rtsp://ipaddress:8554/123.mp4 [0xb5400700] main input error: open of `rtsp://ipaddress:8554/123.mp4' failed [0xb5400700] main input error: Your input can't be opened [0xb5400700] main input error: VLC is unable to open the MRL 'rtsp://ipaddress:8554/123.mp4'. Check the log for details. can u please help me with this issue. thanking u in advance. Larsen & Toubro Limited www.larsentoubro.com This Email may contain confidential or privileged information for the intended recipient (s). If you are not the intended recipient, please do not use or disseminate the information, notify the sender and delete it from your system. -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Jan 14 07:08:42 2014 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 14 Jan 2014 07:08:42 -0800 Subject: [Live-devel] Problem with mpegts muxer In-Reply-To: <52d53b493fe5_@_imoxion.com> References: <52d53b493fe5_@_imoxion.com> Message-ID: <50D5ACC4-D22F-4304-9FDD-BD87853CD6BC@live555.com> > I'm working on an embedded device, the IP network camera. I am using MPEG2TransportStreamFromESSource to mux H.264 and AAC into transport stream. I succedded the muxing. But at the client side, the bottom right corner always has some broken part as shown in the image of the link below. The broken part is shown in the red square of the image. > http://s1208.photobucket.com/user/phuocpham09t/media/mpegts.png.html > > Do you have any ideas why? No, unfortunately I don't. However, why are you multiplexing your video and audio into a Transport Stream, and then transmitting the Transport Stream? It is *much* more efficient (and robust) to stream the H.264 video and AAC audio separately, i.e., as separate RTP streams - without dealing with Transport Streams at all. The way to do this is to create two different "ServerMediaSubsession"s (each one a subclass of "OnDemandServerMediaSubsession", assuming that your server is streaming unicast), and add each one to your server's "ServerMediaSession" (using two calls to "RTSPServer::addSubsession()"). One of your "ServerMediaSubsession" subclasses (for video) would create (in its "createNewRTPSink()" virtual function implementation) a "H264VideoRTPSink", fed from a "H264VideoStreamDiscreteFramer". Your second "ServerMediaSubsession" subclass (for audio) would create (in its "createNewRTPSink()" virtual function implementation) a "MPEG4GenericRTPSink". (See "ADTSAudioFileServerMediaSubsession.cpp" for an example of how to create a "MPEG4GenericRTPSink" for streaming AAC audio.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Jan 14 07:22:02 2014 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 14 Jan 2014 07:22:02 -0800 Subject: [Live-devel] Live555 mpeg4(.mp4) streaming In-Reply-To: References: Message-ID: > I've tried Live555 server for streaming videos between client and server. i used VLC as a client tool to open videos.i can able to stream videos of extension .ts, .mpg, .webm .aac and also mp3 files. but i couldnt able to stream .mp4(mpeg 4) files. That's because our supplied server code doesn't support this. > ive tried the enhanced version of live555 also which supports the mpeg4 format. Sorry, but we can't offer any support for software that was not downloaded from "http://www.live555.com/liveMedia/public". 
Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From michel.promonet at thalesgroup.com Tue Jan 14 09:19:43 2014 From: michel.promonet at thalesgroup.com (PROMONET Michel) Date: Tue, 14 Jan 2014 18:19:43 +0100 Subject: [Live-devel] _H264_VIDEO_STREAM_DISCRETE_FRAMER_HH definition Message-ID: <15630_1389719984_52D571B0_15630_590_1_1BE8971B6CFF3A4F97AF4011882AA25501564245C444@THSONEA01CMS01P.one.grp> Hi Ross, I am just trying to try to play with H265, but definition of _H264_VIDEO_STREAM_DISCRETE_FRAMER_HH is done in both include file : - http://www.live555.com/liveMedia/doxygen/html/H265VideoStreamDiscreteFramer_8hh-source.html - http://www.live555.com/liveMedia/doxygen/html/H264VideoStreamDiscreteFramer_8hh-source.html The nit is not possible to get definition of H264VideoStreamDiscreteFramer and H265VideoStreamDiscreteFramer: Best Regards, Michel. [@@ THALES GROUP INTERNAL @@] -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Jan 14 11:24:00 2014 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 14 Jan 2014 11:24:00 -0800 Subject: [Live-devel] _H264_VIDEO_STREAM_DISCRETE_FRAMER_HH definition In-Reply-To: <15630_1389719984_52D571B0_15630_590_1_1BE8971B6CFF3A4F97AF4011882AA25501564245C444@THSONEA01CMS01P.one.grp> References: <15630_1389719984_52D571B0_15630_590_1_1BE8971B6CFF3A4F97AF4011882AA25501564245C444@THSONEA01CMS01P.one.grp> Message-ID: > I am just trying to try to play with H265, but definition of _H264_VIDEO_STREAM_DISCRETE_FRAMER_HH is done in both include file : > - http://www.live555.com/liveMedia/doxygen/html/H265VideoStreamDiscreteFramer_8hh-source.html > - http://www.live555.com/liveMedia/doxygen/html/H264VideoStreamDiscreteFramer_8hh-source.html Thanks; this will be fixed in the next release. Be warned, however, that H.265 support is still a 'work in progress'. In particular, there's no "H265VideoRTPSource" just yet. (So, you can transmit H.265/RTP streams, but you can't yet receive them.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From parkchan1960 at gmail.com Mon Jan 13 18:44:32 2014 From: parkchan1960 at gmail.com (Pak Man Chan) Date: Tue, 14 Jan 2014 10:44:32 +0800 Subject: [Live-devel] HTTP live streaming crash In-Reply-To: References: <2A9286C6-B9B1-43F7-9B09-8435B22E173E@live555.com> Message-ID: On Tue, Jan 14, 2014 at 10:23 AM, Pak Man Chan wrote: > On Mon, Jan 13, 2014 at 8:27 AM, Ross Finlayson wrote: > >> In TCPStreamSink::processBuffer, onSourceClosure is being called >> twice, one being called inside the fSource->getNextFrame call, and the >> other being called when the source is done. >> >> >> No, the function that's being called (as a 'input source closure' >> handler) as a result of "getNextFrame()" is >> "TCPStreamSink::ourOnSourceClosure()" (note the "our" in the name). It's a >> different function from the "onSourceClosure()" function that's called >> later. >> >> >> This is a stack trace on hitting MediaSink::onSourceClosure(), > "getNextFrame()" will indeed call onSourceClosure through > "TCPStreamSink::ourOnSourceClosure()". 
> > #0 MediaSink::onSourceClosure (this=0x6b0a20) at MediaSink.cpp:99 > #1 0x0000000000449d4a in TCPStreamSink::processBuffer (this=0x6b0a20) > at TCPStreamSink.cpp:75 > #2 0x0000000000449ed5 in TCPStreamSink::ourOnSourceClosure1 > (this=0x6b0a20) > at TCPStreamSink.cpp:115 > #3 0x0000000000449eb0 in TCPStreamSink::ourOnSourceClosure > (clientData=0x6b0a20) > at TCPStreamSink.cpp:109 > #4 0x000000000042ab52 in FramedSource::handleClosure (clientData=0x6af750) > at FramedSource.cpp:99 > #5 0x000000000044145d in MPEG2TransportStreamFramer::doGetNextFrame > (this=0x6af750) > at MPEG2TransportStreamFramer.cpp:107 > #6 0x000000000042aaaa in FramedSource::getNextFrame (this=0x6af750, > to=0x6b0a70 "G\001", maxSize=10000, > afterGettingFunc=0x449dc6 unsigned int, unsigned int, timeval, unsigned int)>, > afterGettingClientData=0x6b0a20, > onCloseFunc=0x449e90 , > onCloseClientData=0x6b0a20) at FramedSource.cpp:78 > #7 0x0000000000449d0f in TCPStreamSink::processBuffer (this=0x6b0a20) > at TCPStreamSink.cpp:70 > #8 0x0000000000449e8d in TCPStreamSink::afterGettingFrame (this=0x6b0a20, > frameSize=1128, numTruncatedBytes=0) at TCPStreamSink.cpp:104 > #9 0x0000000000449e06 in TCPStreamSink::afterGettingFrame > (clientData=0x6b0a20, > frameSize=1128, numTruncatedBytes=0) at TCPStreamSink.cpp:94 > #10 0x000000000042ab0b in FramedSource::afterGetting (source=0x6af750) > at FramedSource.cpp:91 > #11 0x00000000004417e8 in MPEG2TransportStreamFramer::afterGettingFrame1 ( > this=0x6af750, frameSize=1128, presentationTime=...) > at MPEG2TransportStreamFramer.cpp:192 > #12 0x0000000000441567 in MPEG2TransportStreamFramer::afterGettingFrame ( > clientData=0x6af750, frameSize=1128, presentationTime=...) > at MPEG2TransportStreamFramer.cpp:136 > #13 0x000000000042ab0b in FramedSource::afterGetting (source=0x6af660) > at FramedSource.cpp:91 > #14 0x000000000042b5f6 in ByteStreamFileSource::doReadFromFile > (this=0x6af660) > at ByteStreamFileSource.cpp:179 > #15 0x000000000042b38f in ByteStreamFileSource::fileReadableHandler > (source=0x6af660) > at ByteStreamFileSource.cpp:123 > #16 0x000000000046ad90 in BasicTaskScheduler::SingleStep (this=0x6a8010, > maxDelayTime=0) at BasicTaskScheduler.cpp:163 > #17 0x000000000046d4aa in BasicTaskScheduler0::doEventLoop (this=0x6a8010, > watchVariable=0x0) at BasicTaskScheduler0.cpp:80 > #18 0x0000000000402556 in main (argc=1, argv=0x7fffffffe4a8) > at live555MediaServer.cpp:88 > > after this, MediaSink::onSourceClosure() will be called again > on TCPStreamSink.cpp:75. > > Following is the stack trace when the crash happens. > > Program received signal SIGSEGV, Segmentation fault. > 0x0000000000000000 in ?? () > (gdb) bt > #0 0x0000000000000000 in ?? () > #1 0x0000000000403c33 in MediaSink::onSourceClosure (this=0x6b5850) > at MediaSink.cpp:99 > #2 0x0000000000449d4a in TCPStreamSink::processBuffer (this=0x6b5850) > at TCPStreamSink.cpp:75 > #3 0x0000000000449e8d in TCPStreamSink::afterGettingFrame (this=0x6b5850, > frameSize=1316, numTruncatedBytes=0) at TCPStreamSink.cpp:104 > #4 0x0000000000449e06 in TCPStreamSink::afterGettingFrame > (clientData=0x6b5850, > frameSize=1316, numTruncatedBytes=0) at TCPStreamSink.cpp:94 > #5 0x000000000042ab0b in FramedSource::afterGetting (source=0x6b3f10) > at FramedSource.cpp:91 > #6 0x00000000004417e8 in MPEG2TransportStreamFramer::afterGettingFrame1 ( > this=0x6b3f10, frameSize=1316, presentationTime=...) 
> at MPEG2TransportStreamFramer.cpp:192 > #7 0x0000000000441567 in MPEG2TransportStreamFramer::afterGettingFrame ( > clientData=0x6b3f10, frameSize=1316, presentationTime=...) > at MPEG2TransportStreamFramer.cpp:136 > #8 0x000000000042ab0b in FramedSource::afterGetting (source=0x6b3e40) > at FramedSource.cpp:91 > #9 0x000000000042b5f6 in ByteStreamFileSource::doReadFromFile > (this=0x6b3e40) > at ByteStreamFileSource.cpp:179 > #10 0x000000000042b38f in ByteStreamFileSource::fileReadableHandler > (source=0x6b3e40) > at ByteStreamFileSource.cpp:123 > #11 0x000000000046ad90 in BasicTaskScheduler::SingleStep (this=0x6a8010, > maxDelayTime=0) at BasicTaskScheduler.cpp:163 > #12 0x000000000046d4aa in BasicTaskScheduler0::doEventLoop (this=0x6a8010, > watchVariable=0x0) at BasicTaskScheduler0.cpp:80 > #13 0x0000000000402556 in main (argc=1, argv=0x7fffffffe4a8) > at live555MediaServer.cpp:88 > > The crash happens on the 0113 release too. Here is a valgrind report with HLS streaming running, so it seems the crashing problem is not on MediaSink::onSourceClosure() being called twice in getNextFrame. ==31388== Invalid read of size 8 ==31388== at 0x4031C0: Medium::envir() const (Media.hh:59) ==31388== by 0x403C08: MediaSink::onSourceClosure() (MediaSink.cpp:99) ==31388== by 0x449D49: TCPStreamSink::processBuffer() (TCPStreamSink.cpp:75) ==31388== by 0x449E8C: TCPStreamSink::afterGettingFrame(unsigned int, unsigned int) (TCPStreamSink.cpp:104) ==31388== by 0x449E05: TCPStreamSink::afterGettingFrame(void*, unsigned int, unsigned int, timeval, unsigned int) (TCPStreamSink.cpp:94) ==31388== by 0x42AB0A: FramedSource::afterGetting(FramedSource*) (FramedSource.cpp:91) ==31388== by 0x4417E7: MPEG2TransportStreamFramer::afterGettingFrame1(unsigned int, timeval) (MPEG2TransportStreamFramer.cpp:192) ==31388== by 0x441566: MPEG2TransportStreamFramer::afterGettingFrame(void*, unsigned int, unsigned int, timeval, unsigned int) (MPEG2TransportStreamFramer.cpp:136) ==31388== by 0x42AB0A: FramedSource::afterGetting(FramedSource*) (FramedSource.cpp:91) ==31388== by 0x42B5F5: ByteStreamFileSource::doReadFromFile() (ByteStreamFileSource.cpp:179) ==31388== by 0x42B38E: ByteStreamFileSource::fileReadableHandler(ByteStreamFileSource*, int) (ByteStreamFileSource.cpp:123) ==31388== by 0x46AD8F: BasicTaskScheduler::SingleStep(unsigned int) (BasicTaskScheduler.cpp:163) ==31388== Address 0x5a82278 is 8 bytes inside a block of size 10,096 free'd ==31388== at 0x4C2836C: operator delete(void*) (vg_replace_malloc.c:480) ==31388== by 0x449A7D: TCPStreamSink::~TCPStreamSink() (TCPStreamSink.cpp:35) ==31388== by 0x40376F: MediaLookupTable::remove(char const*) (Media.cpp:151) ==31388== by 0x403392: Medium::close(UsageEnvironment&, char const*) (Media.cpp:53) ==31388== by 0x4033CF: Medium::close(Medium*) (Media.cpp:59) ==31388== by 0x414761: RTSPServerSupportingHTTPStreaming::RTSPClientConnectionSupportingHTTPStreaming::~RTSPClientConnectionSupportingHTTPStreaming() (RTSPServerSupportingHTTPStreaming.cpp:59) ==31388== by 0x4147C1: RTSPServerSupportingHTTPStreaming::RTSPClientConnectionSupportingHTTPStreaming::~RTSPClientConnectionSupportingHTTPStreaming() (RTSPServerSupportingHTTPStreaming.cpp:60) ==31388== by 0x415211: RTSPServerSupportingHTTPStreaming::RTSPClientConnectionSupportingHTTPStreaming::afterStreaming(void*) (RTSPServerSupportingHTTPStreaming.cpp:260) ==31388== by 0x403C60: MediaSink::onSourceClosure() (MediaSink.cpp:103) ==31388== by 0x449D49: TCPStreamSink::processBuffer() (TCPStreamSink.cpp:75) ==31388== by 0x449ED4: 
TCPStreamSink::ourOnSourceClosure1() (TCPStreamSink.cpp:115) ==31388== by 0x449EAF: TCPStreamSink::ourOnSourceClosure(void*) (TCPStreamSink.cpp:109) Thanks. Park -------------- next part -------------- An HTML attachment was scrubbed... URL: From phuocpham at ssu.ac.kr Wed Jan 15 21:16:53 2014 From: phuocpham at ssu.ac.kr (=?EUC-KR?B?xsq53ceq?=) Date: Thu, 16 Jan 2014 14:16:53 +0900 (KST) Subject: [Live-devel] =?euc-kr?q?Problem_with_mpegts_muxer?= Message-ID: <52d76ccd3fa0_@_imoxion.com> Hi Ross, Thank you for your response. I know how to use ServerMediaSession and ServerMediaSubsession to stream H.264 and AAC as separate RTP streams. And I also know how to mux those two into a transport stream. But it seems that my problem happens with the last TS packets of each H.264 frame. I tried to increase MAX_INPUT_ES_FRAME_SIZE and it works on my PC for the testH264VideoToTransportStream.cpp case, but it does't work on my IP camera. ----------?????---------- ????: Ross Finlayson ????: LIVE555 Streaming Media - development & use ????: 2014-01-15 00:08:42 ? ?: Re: [Live-devel] Problem with mpegts muxer I'm working on an embedded device, the IP network camera. I am using MPEG2TransportStreamFromESSource to mux H.264 and AAC into transport stream. I succedded the muxing. But at the client side, the bottom right corner always has some broken part as shown in the image of the link below. The broken part is shown in the red square of the image. http://s1208.photobucket.com/user/phuocpham09t/media/mpegts.png.html Do you have any ideas why? No, unfortunately I don't. However, why are you multiplexing your video and audio into a Transport Stream, and then transmitting the Transport Stream? It is *much* more efficient (and robust) to stream the H.264 video and AAC audio separately, i.e., as separate RTP streams - without dealing with Transport Streams at all. The way to do this is to create two different "ServerMediaSubsession"s (each one a subclass of "OnDemandServerMediaSubsession", assuming that your server is streaming unicast), and add each one to your server's "ServerMediaSession" (using two calls to "RTSPServer::addSubsession()"). One of your "ServerMediaSubsession" subclasses (for video) would create (in its "createNewRTPSink()" virtual function implementation) a "H264VideoRTPSink", fed from a "H264VideoStreamDiscreteFramer". Your second "ServerMediaSubsession" subclass (for audio) would create (in its "createNewRTPSink()" virtual function implementation) a "MPEG4GenericRTPSink". (See "ADTSAudioFileServerMediaSubsession.cpp" for an example of how to create a "MPEG4GenericRTPSink" for streaming AAC audio.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From phuocpham at ssu.ac.kr Wed Jan 15 21:20:14 2014 From: phuocpham at ssu.ac.kr (=?EUC-KR?B?xsq53ceq?=) Date: Thu, 16 Jan 2014 14:20:14 +0900 (KST) Subject: [Live-devel] =?euc-kr?q?Problem_with_mpegts_muxer?= Message-ID: <52d76dc63fe6_@_imoxion.com> FYI, the version of Live555 used on my IP camera is the version (1986-2010). ----------?????---------- ????: Ross Finlayson ????: LIVE555 Streaming Media - development & use ????: 2014-01-15 00:08:42 ? ?: Re: [Live-devel] Problem with mpegts muxer I'm working on an embedded device, the IP network camera. I am using MPEG2TransportStreamFromESSource to mux H.264 and AAC into transport stream. I succedded the muxing. 
But at the client side, the bottom right corner always has some broken part as shown in the image of the link below. The broken part is shown in the red square of the image. http://s1208.photobucket.com/user/phuocpham09t/media/mpegts.png.html Do you have any ideas why? No, unfortunately I don't. However, why are you multiplexing your video and audio into a Transport Stream, and then transmitting the Transport Stream? It is *much* more efficient (and robust) to stream the H.264 video and AAC audio separately, i.e., as separate RTP streams - without dealing with Transport Streams at all. The way to do this is to create two different "ServerMediaSubsession"s (each one a subclass of "OnDemandServerMediaSubsession", assuming that your server is streaming unicast), and add each one to your server's "ServerMediaSession" (using two calls to "RTSPServer::addSubsession()"). One of your "ServerMediaSubsession" subclasses (for video) would create (in its "createNewRTPSink()" virtual function implementation) a "H264VideoRTPSink", fed from a "H264VideoStreamDiscreteFramer". Your second "ServerMediaSubsession" subclass (for audio) would create (in its "createNewRTPSink()" virtual function implementation) a "MPEG4GenericRTPSink". (See "ADTSAudioFileServerMediaSubsession.cpp" for an example of how to create a "MPEG4GenericRTPSink" for streaming AAC audio.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Jan 16 01:37:46 2014 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 16 Jan 2014 01:37:46 -0800 Subject: [Live-devel] HTTP live streaming crash In-Reply-To: References: <2A9286C6-B9B1-43F7-9B09-8435B22E173E@live555.com> Message-ID: <2B72C2F8-8A9F-4474-9818-1093750C87D5@live555.com> > The crash happens on the 0113 release too. Try upgrading to the most recent version (currently, 2014.01.16). It fixes a bug that might have caused your problem. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Jan 16 16:45:04 2014 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 16 Jan 2014 16:45:04 -0800 Subject: [Live-devel] LIVE555 now supports streaming (transmitting and receiving) H.265 RTP streams Message-ID: <36E23D06-D0E1-4D6D-ACAA-67CB7A8CC6A1@live555.com> FYI, the latest version (2014.01.17) of the "LIVE555 Streaming Media" software now supports streaming (transmitting and receiving) H.265 RTP streams. H.265 (also called "HEVC") is MPEG's next generation video codec, after H.264. (It gives comparable video quality to H.264, but with about half the bitrate.) H.265 transmission support has been added to the "testOnDemandRTSPServer" demo application, and to the "LIVE555 Media Server" (currently just the source-code; not the pre-built binary versions). There's also a new multicast demo application "testH265VideoStreamer" (similar to the existing "testH264VideoStreamer"), and a new "testH265VideoToTransportStream" demo application (similar to the existing "testH264VideoToTransportStream"). H.265 reception support has also been added to the "testRTSPClient" and "openRTSP" applications. And the "LIVE555 Proxy Server" now supports proxying of H.265 RTSP/RTP streams. (Google has developed a comparable-quality codec called "VP9" - which (unlike H.265) is likely to be made available royalty-free. We will also be supporting VP9 sometime in the future.) 
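As a rough sketch only (assuming the new "H265VideoFileServerMediaSubsession" class parallels the existing "H264VideoFileServerMediaSubsession", and using "test.hevc" as a placeholder file name), serving an H.265 Elementary Stream file from a "testOnDemandRTSPServer"-style program might look like this:

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"

int main() {
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  RTSPServer* rtspServer = RTSPServer::createNew(*env, 8554);
  if (rtspServer == NULL) { *env << "Failed to create RTSP server\n"; return 1; }

  // Serve a raw H.265 Elementary Stream file ("test.hevc" is just an example name):
  ServerMediaSession* sms
    = ServerMediaSession::createNew(*env, "h265ESVideoTest", "h265ESVideoTest", "H.265 test stream");
  sms->addSubsession(H265VideoFileServerMediaSubsession::createNew(*env, "test.hevc", False));
  rtspServer->addServerMediaSession(sms);

  env->taskScheduler().doEventLoop(); // does not return
  return 0;
}

Receiving the resulting stream should then work with the updated "openRTSP" and "testRTSPClient" applications mentioned in the announcement.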
Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From john95018 at gmail.com Wed Jan 15 09:46:52 2014 From: john95018 at gmail.com (john dicostanzo) Date: Wed, 15 Jan 2014 23:16:52 +0530 Subject: [Live-devel] TS parsing in testMPEG2TransportReceiver Message-ID: HI, I am using testMPEG2TransportReceiver over rpt . At receiver end,i want to receive and parse each packet of 188 byte of TS stream. please suggest what function and classes i can use for this purpose. need to subclass any classes??? John -------------- next part -------------- An HTML attachment was scrubbed... URL: From michel.promonet at thalesgroup.com Thu Jan 16 06:31:09 2014 From: michel.promonet at thalesgroup.com (PROMONET Michel) Date: Thu, 16 Jan 2014 15:31:09 +0100 Subject: [Live-devel] problem processing RTSP TEARDOWN Message-ID: <13568_1389882672_52D7ED30_13568_4318_1_1BE8971B6CFF3A4F97AF4011882AA25501564250CCC6@THSONEA01CMS01P.one.grp> Hi Ross, Recently appears a case of use of memory that is no more allocated. It is possible to reproduce this using live555MediaServer and openRTSP -d 5 The valgrind lokks like : ==9860== Invalid read of size 1 ==9860== at 0x4AD5B5: RTSPServer::RTSPClientConnection::handleRequestBytes(int) (RTSPServer.cpp:1096) ==9860== by 0x4AC3C8: RTSPServer::RTSPClientConnection::incomingRequestHandler1() (RTSPServer.cpp:787) ==9860== by 0x4AC344: RTSPServer::RTSPClientConnection::incomingRequestHandler(void*, int) (RTSPServer.cpp:780) ==9860== by 0x504E0A: BasicTaskScheduler::SingleStep(unsigned int) (BasicTaskScheduler.cpp:159) ==9860== by 0x50369B: BasicTaskScheduler0::doEventLoop(char*) (BasicTaskScheduler0.cpp:92) ==9860== by 0x46212B: main (live555MediaServer.cpp:88) ==9860== Address 0x5bed261 is 33 bytes inside a block of size 64 free'd ==9860== at 0x4C26DCF: operator delete(void*) (vg_replace_malloc.c:387) ==9860== by 0x4AE791: RTSPServer::RTSPClientSession::~RTSPClientSession() (RTSPServer.cpp:1380) ==9860== by 0x4B00C8: RTSPServer::RTSPClientSession::handleCmd_TEARDOWN(RTSPServer::RTSPClientConnection*, ServerMediaSubsession*) (RTSPServer.cpp:1817) ==9860== by 0x4AFE50: RTSPServer::RTSPClientSession::handleCmd_withinSession(RTSPServer::RTSPClientConnection*, char const*, char const*, char const*, char const*) (RTSPServer.cpp:1780) ==9860== by 0x4AD127: RTSPServer::RTSPClientConnection::handleRequestBytes(int) (RTSPServer.cpp:1012) ==9860== by 0x4AC3C8: RTSPServer::RTSPClientConnection::incomingRequestHandler1() (RTSPServer.cpp:787) ==9860== by 0x4AC344: RTSPServer::RTSPClientConnection::incomingRequestHandler(void*, int) (RTSPServer.cpp:780) ==9860== by 0x504E0A: BasicTaskScheduler::SingleStep(unsigned int) (BasicTaskScheduler.cpp:159) ==9860== by 0x50369B: BasicTaskScheduler0::doEventLoop(char*) (BasicTaskScheduler0.cpp:92) ==9860== by 0x46212B: main (live555MediaServer.cpp:88) ==986 The code that cause the problem trig an internal PLAY during SETUP processing, so perhaps clientSession could be set to NULL after calling handleCmd_withinSession (RTSPServer:1012). Thanks for your support, Michel. [@@ THALES GROUP INTERNAL @@] -------------- next part -------------- An HTML attachment was scrubbed... 
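The hazard being reported can be illustrated with a minimal, self-contained sketch (hypothetical names, not the LIVE555 code): a command handler may destroy the session object it was handed, so the caller must not dereference the saved pointer afterwards without re-checking. The guard shown is along the lines Michel suggests (clearing the caller's pointer when the session is destroyed):

#include <cstring>
#include <map>
#include <string>

// Hypothetical stand-in for the per-client session object:
struct ClientSession {
  bool streamAfterSETUP = false;
};

static std::map<std::string, ClientSession*> sessionTable;

// Handling "TEARDOWN" deletes the session and removes it from the table:
static void handleCommandWithinSession(ClientSession*& s, char const* cmdName, std::string const& id) {
  if (std::strcmp(cmdName, "TEARDOWN") == 0) {
    sessionTable.erase(id);
    delete s;
    s = nullptr; // guard: clear the caller's pointer as well
  }
}

static void handleRequest(char const* cmdName, std::string const& sessionId) {
  std::map<std::string, ClientSession*>::iterator it = sessionTable.find(sessionId);
  ClientSession* s = (it == sessionTable.end()) ? nullptr : it->second;
  if (s != nullptr) handleCommandWithinSession(s, cmdName, sessionId);

  // Without the guard above, this read would be the use-after-free that valgrind
  // flags: 's' could be non-NULL yet dangling after a TEARDOWN.
  if (s != nullptr && s->streamAfterSETUP) {
    // ... start streaming immediately after "SETUP" ...
  }
}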
URL: From finlayson at live555.com Fri Jan 17 06:50:50 2014 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 17 Jan 2014 06:50:50 -0800 Subject: [Live-devel] problem processing RTSP TEARDOWN In-Reply-To: <13568_1389882672_52D7ED30_13568_4318_1_1BE8971B6CFF3A4F97AF4011882AA25501564250CCC6@THSONEA01CMS01P.one.grp> References: <13568_1389882672_52D7ED30_13568_4318_1_1BE8971B6CFF3A4F97AF4011882AA25501564250CCC6@THSONEA01CMS01P.one.grp> Message-ID: "valgrind" frequently reports 'false positives' - i.e., 'errors' that aren't really errors. Therefore, I don't pay attention to "valgrind" reports, unless they're accompanied by a report of a real problem (such as a crash), or an identification of a specific bug in the code. In other words, a "valgrind" report, is not, per se, a "problem". In this case, I don't see any problem in the code. The "RTSPClientSession" object is looked up - for each request - by looking up the 'session id string' in a hash table. The "RTSPClientSession" destructor (which is called when handling a "TEARDOWN") also removes the object from the hash table. So I don't see any way that a "RTSPClientSession" object can get deleted twice. So, right now I don't see this "valgrind" report as indicating a real problem. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From michel.promonet at thalesgroup.com Fri Jan 17 07:48:46 2014 From: michel.promonet at thalesgroup.com (PROMONET Michel) Date: Fri, 17 Jan 2014 16:48:46 +0100 Subject: [Live-devel] problem processing RTSP TEARDOWN In-Reply-To: References: <13568_1389882672_52D7ED30_13568_4318_1_1BE8971B6CFF3A4F97AF4011882AA25501564250CCC6@THSONEA01CMS01P.one.grp> Message-ID: <29895_1389973727_52D950DF_29895_688_2_1BE8971B6CFF3A4F97AF4011882AA25501564258EF8F@THSONEA01CMS01P.one.grp> Hi Ross, I did not say RTSPClientSession will deleted twice, I just report an access to an object after its deletion. The problem report by valgrind occurs processing TEARDOWN because : - RTSPServer:1016 clientSession->handleCmd_withinSession(this, cmdName, urlPreSuffix, urlSuffix, (char const*)fRequestBuffer); => delete the RTSPClientSession the is pointed by clientSession. - RTSPServer:1098 if (clientSession != NULL && clientSession->fStreamAfterSETUP && strcmp(cmdName, "SETUP") == 0) { ? The deleted clientSession is not NULL, point on a unallocated memory and clientSession->fStreamAfterSETUP is evaluated. Running under debugger show that clientSession is no more allocated when clientSession->fStreamAfterSETUP is evaluated. I guess that till the memory is mapped on something, it will work. But I am not sure that in certain situation the memory could become no more valid and perhaps raise a segment violation . Are you sure that accessing to an memory that was freed could not raise any problem ? Best Regards, Michel. [@@ THALES GROUP INTERNAL @@] De : live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] De la part de Ross Finlayson Envoy? : vendredi 17 janvier 2014 15:51 ? : LIVE555 Streaming Media - development & use Objet : Re: [Live-devel] problem processing RTSP TEARDOWN "valgrind" frequently reports 'false positives' - i.e., 'errors' that aren't really errors. Therefore, I don't pay attention to "valgrind" reports, unless they're accompanied by a report of a real problem (such as a crash), or an identification of a specific bug in the code. In other words, a "valgrind" report, is not, per se, a "problem". 
In this case, I don't see any problem in the code. The "RTSPClientSession" object is looked up - for each request - by looking up the 'session id string' in a hash table. The "RTSPClientSession" destructor (which is called when handling a "TEARDOWN") also removes the object from the hash table. So I don't see any way that a "RTSPClientSession" object can get deleted twice. So, right now I don't see this "valgrind" report as indicating a real problem. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Jan 17 11:08:27 2014 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 17 Jan 2014 11:08:27 -0800 Subject: [Live-devel] problem processing RTSP TEARDOWN In-Reply-To: <29895_1389973727_52D950DF_29895_688_2_1BE8971B6CFF3A4F97AF4011882AA25501564258EF8F@THSONEA01CMS01P.one.grp> References: <13568_1389882672_52D7ED30_13568_4318_1_1BE8971B6CFF3A4F97AF4011882AA25501564250CCC6@THSONEA01CMS01P.one.grp> <29895_1389973727_52D950DF_29895_688_2_1BE8971B6CFF3A4F97AF4011882AA25501564258EF8F@THSONEA01CMS01P.one.grp> Message-ID: <32DE978D-E9AE-46E5-BCDD-406A64FFEDAE@live555.com> > The problem reported by valgrind occurs while processing TEARDOWN because: > - RTSPServer:1016 clientSession->handleCmd_withinSession(this, cmdName, urlPreSuffix, urlSuffix, (char const*)fRequestBuffer); > => deletes the RTSPClientSession that is pointed to by clientSession. > - RTSPServer:1098 if (clientSession != NULL && clientSession->fStreamAfterSETUP && strcmp(cmdName, "SETUP") == 0) { > => the deleted clientSession is not NULL, it points to unallocated memory, and clientSession->fStreamAfterSETUP is evaluated. Yes, you're right; I misinterpreted the valgrind report. Changing line 1098 to if (strcmp(cmdName, "SETUP") == 0 && clientSession != NULL && clientSession->fStreamAfterSETUP) { or just if (strcmp(cmdName, "SETUP") == 0 && clientSession->fStreamAfterSETUP) { will avoid the problem. The next release of the software will include this fix (actually, a better fix that eliminates the call to "strcmp()"). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jawahar456 at gmail.com Fri Jan 17 04:13:48 2014 From: jawahar456 at gmail.com (Jawahar Venugopal) Date: Fri, 17 Jan 2014 17:43:48 +0530 Subject: [Live-devel] Difficulty in streaming more than one video using the live555proxyserver program. Message-ID: Hi, Thank you for your inputs, Ross. I looked into the source code of Spydroid (my network IP camera); as you hinted, the UDP port was hard-coded as 5006, and hence the inter-mixing of videos. I changed it to 0 and everything seems to work well now. Thanks a lot for your help. Regards, Jawahar Venugopal -------------- next part -------------- An HTML attachment was scrubbed... URL: From pallela.venkatkarthik at vvdntech.com Fri Jan 17 20:00:00 2014 From: pallela.venkatkarthik at vvdntech.com (Pallela Venkat Karthik) Date: Sat, 18 Jan 2014 09:30:00 +0530 Subject: [Live-devel] testOnDemandRTSPServer UDP only Message-ID: Hi, I am using testOnDemandRTSPServer. I can stream using TCP or UDP from QT player(with client side settings changed). What changes do I need to make so that only UDP streaming is allowed. -- *With Warm Regards,* Venkat Karthik VVDN Technologies Pvt Ltd *Cell : *+91 9884292064 | *Skype :* venk.kart -------------- next part -------------- An HTML attachment was scrubbed...
URL: From finlayson at live555.com Fri Jan 17 20:24:26 2014 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 17 Jan 2014 20:24:26 -0800 Subject: [Live-devel] testOnDemandRTSPServer UDP only In-Reply-To: References: Message-ID: > I am using testOnDemandRTSPServer. I can stream using TCP or UDP from QT player(with client side settings changed). What changes do I need to make so that only UDP streaming is allowed. Venkat, I'm a bit surprised by your question, because most people consider RTP/RTCP-over-TCP streaming to be a feature. (If the client is behind a firewall, then sometimes this is the only way that it will be able to access the stream.) However, if you really want to modify your server so that it refuses all requests to stream via TCP, you can do so by changing line 1650 of "liveMedia/RTSPServer.cpp" (assuming the latest version of the code: 2014.01.18) from if (fIsMulticast) { to if (fIsMulticast || streamingMode == RTP_TCP) { (Of course, if you make this modification, it - like all modifications to the code - will be subject to the conditions of the LGPL.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From parkchan1960 at gmail.com Thu Jan 16 17:47:31 2014 From: parkchan1960 at gmail.com (Pak Man Chan) Date: Fri, 17 Jan 2014 09:47:31 +0800 Subject: [Live-devel] HTTP live streaming crash In-Reply-To: <2B72C2F8-8A9F-4474-9818-1093750C87D5@live555.com> References: <2A9286C6-B9B1-43F7-9B09-8435B22E173E@live555.com> <2B72C2F8-8A9F-4474-9818-1093750C87D5@live555.com> Message-ID: On Thu, Jan 16, 2014 at 5:37 PM, Ross Finlayson wrote: > The crash happens on the 0113 release too. > > > Try upgrading to the most recent version (currently, 2014.01.16). It > fixes a bug that might have caused your problem. > > > Hi Ross, The latest(0117) release still crashes on HLS. Here is the stack trace, #0 0x0000000000000000 in ?? () #1 0x000000000040d003 in MediaSink::onSourceClosure (this=0x6be880) at MediaSink.cpp:99 #2 0x0000000000454b11 in TCPStreamSink::processBuffer (this=0x6be880) at TCPStreamSink.cpp:77 #3 0x0000000000454c59 in TCPStreamSink::afterGettingFrame (this=0x6be880, frameSize=376, numTruncatedBytes=0) at TCPStreamSink.cpp:106 #4 0x0000000000454bce in TCPStreamSink::afterGettingFrame (clientData=0x6be880, frameSize=376, numTruncatedBytes=0) at TCPStreamSink.cpp:96 #5 0x0000000000434def in FramedSource::afterGetting (source=0x6bd8b0) at FramedSource.cpp:91 #6 0x000000000044c0f0 in MPEG2TransportStreamFramer::afterGettingFrame1 ( this=0x6bd8b0, frameSize=376, presentationTime=...) at MPEG2TransportStreamFramer.cpp:192 #7 0x000000000044be67 in MPEG2TransportStreamFramer::afterGettingFrame ( clientData=0x6bd8b0, frameSize=376, presentationTime=...) 
at MPEG2TransportStreamFramer.cpp:136 #8 0x0000000000434def in FramedSource::afterGetting (source=0x6bd7e0) at FramedSource.cpp:91 #9 0x00000000004358fa in ByteStreamFileSource::doReadFromFile (this=0x6bd7e0) at ByteStreamFileSource.cpp:179 #10 0x0000000000435693 in ByteStreamFileSource::fileReadableHandler (source=0x6bd7e0) at ByteStreamFileSource.cpp:123 #11 0x0000000000476e21 in BasicTaskScheduler::SingleStep (this=0x6b5010, maxDelayTime=0) at BasicTaskScheduler.cpp:163 #12 0x000000000047962e in BasicTaskScheduler0::doEventLoop (this=0x6b5010, watchVariable=0x0) at BasicTaskScheduler0.cpp:80 #13 0x000000000040b862 in main (argc=1, argv=0x7fffffffe4a8) at live555MediaServer.cpp:88 When running under valgrind, it will not crash but valgrind detected some invalid memory accesses. Not sure if these invalid memory accesses are related to the crash. Attached is the valgrind report when running HLS. Thanks. Park -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- ==509== Invalid read of size 1 ==509== at 0x454ADA: TCPStreamSink::processBuffer() (TCPStreamSink.cpp:75) ==509== by 0x454C58: TCPStreamSink::afterGettingFrame(unsigned int, unsigned int) (TCPStreamSink.cpp:106) ==509== by 0x454BCD: TCPStreamSink::afterGettingFrame(void*, unsigned int, unsigned int, timeval, unsigned int) (TCPStreamSink.cpp:96) ==509== by 0x434DEE: FramedSource::afterGetting(FramedSource*) (FramedSource.cpp:91) ==509== by 0x44C0EF: MPEG2TransportStreamFramer::afterGettingFrame1(unsigned int, timeval) (MPEG2TransportStreamFramer.cpp:192) ==509== by 0x44BE66: MPEG2TransportStreamFramer::afterGettingFrame(void*, unsigned int, unsigned int, timeval, unsigned int) (MPEG2TransportStreamFramer.cpp:136) ==509== by 0x434DEE: FramedSource::afterGetting(FramedSource*) (FramedSource.cpp:91) ==509== by 0x4358F9: ByteStreamFileSource::doReadFromFile() (ByteStreamFileSource.cpp:179) ==509== by 0x435692: ByteStreamFileSource::fileReadableHandler(ByteStreamFileSource*, int) (ByteStreamFileSource.cpp:123) ==509== by 0x476E20: BasicTaskScheduler::SingleStep(unsigned int) (BasicTaskScheduler.cpp:163) ==509== by 0x47962D: BasicTaskScheduler0::doEventLoop(char*) (BasicTaskScheduler0.cpp:80) ==509== by 0x40B861: main (live555MediaServer.cpp:88) ==509== Address 0x5a2ed48 is 10,088 bytes inside a block of size 10,096 free'd ==509== at 0x4C2836C: operator delete(void*) (vg_replace_malloc.c:480) ==509== by 0x454839: TCPStreamSink::~TCPStreamSink() (TCPStreamSink.cpp:37) ==509== by 0x40CB1F: MediaLookupTable::remove(char const*) (Media.cpp:151) ==509== by 0x40C71A: Medium::close(UsageEnvironment&, char const*) (Media.cpp:53) ==509== by 0x40C757: Medium::close(Medium*) (Media.cpp:59) ==509== by 0x41DBB8: RTSPServerSupportingHTTPStreaming::RTSPClientConnectionSupportingHTTPStreaming::~RTSPClientConnectionSupportingHTTPStreaming() (RTSPServerSupportingHTTPStreaming.cpp:59) ==509== by 0x41DC17: RTSPServerSupportingHTTPStreaming::RTSPClientConnectionSupportingHTTPStreaming::~RTSPClientConnectionSupportingHTTPStreaming() (RTSPServerSupportingHTTPStreaming.cpp:60) ==509== by 0x41E685: RTSPServerSupportingHTTPStreaming::RTSPClientConnectionSupportingHTTPStreaming::afterStreaming(void*) (RTSPServerSupportingHTTPStreaming.cpp:260) ==509== by 0x40D030: MediaSink::onSourceClosure() (MediaSink.cpp:103) ==509== by 0x454B10: TCPStreamSink::processBuffer() (TCPStreamSink.cpp:77) ==509== by 0x454CA0: TCPStreamSink::ourOnSourceClosure1() (TCPStreamSink.cpp:117) ==509== by 0x454C7B: 
TCPStreamSink::ourOnSourceClosure(void*) (TCPStreamSink.cpp:111) ==509== ==509== Invalid read of size 4 ==509== at 0x454CB0: TCPStreamSink::numUnwrittenBytes() const (TCPStreamSink.hh:57) ==509== by 0x454AF0: TCPStreamSink::processBuffer() (TCPStreamSink.cpp:75) ==509== by 0x454C58: TCPStreamSink::afterGettingFrame(unsigned int, unsigned int) (TCPStreamSink.cpp:106) ==509== by 0x454BCD: TCPStreamSink::afterGettingFrame(void*, unsigned int, unsigned int, timeval, unsigned int) (TCPStreamSink.cpp:96) ==509== by 0x434DEE: FramedSource::afterGetting(FramedSource*) (FramedSource.cpp:91) ==509== by 0x44C0EF: MPEG2TransportStreamFramer::afterGettingFrame1(unsigned int, timeval) (MPEG2TransportStreamFramer.cpp:192) ==509== by 0x44BE66: MPEG2TransportStreamFramer::afterGettingFrame(void*, unsigned int, unsigned int, timeval, unsigned int) (MPEG2TransportStreamFramer.cpp:136) ==509== by 0x434DEE: FramedSource::afterGetting(FramedSource*) (FramedSource.cpp:91) ==509== by 0x4358F9: ByteStreamFileSource::doReadFromFile() (ByteStreamFileSource.cpp:179) ==509== by 0x435692: ByteStreamFileSource::fileReadableHandler(ByteStreamFileSource*, int) (ByteStreamFileSource.cpp:123) ==509== by 0x476E20: BasicTaskScheduler::SingleStep(unsigned int) (BasicTaskScheduler.cpp:163) ==509== by 0x47962D: BasicTaskScheduler0::doEventLoop(char*) (BasicTaskScheduler0.cpp:80) ==509== Address 0x5a2ed44 is 10,084 bytes inside a block of size 10,096 free'd ==509== at 0x4C2836C: operator delete(void*) (vg_replace_malloc.c:480) ==509== by 0x454839: TCPStreamSink::~TCPStreamSink() (TCPStreamSink.cpp:37) ==509== by 0x40CB1F: MediaLookupTable::remove(char const*) (Media.cpp:151) ==509== by 0x40C71A: Medium::close(UsageEnvironment&, char const*) (Media.cpp:53) ==509== by 0x40C757: Medium::close(Medium*) (Media.cpp:59) ==509== by 0x41DBB8: RTSPServerSupportingHTTPStreaming::RTSPClientConnectionSupportingHTTPStreaming::~RTSPClientConnectionSupportingHTTPStreaming() (RTSPServerSupportingHTTPStreaming.cpp:59) ==509== by 0x41DC17: RTSPServerSupportingHTTPStreaming::RTSPClientConnectionSupportingHTTPStreaming::~RTSPClientConnectionSupportingHTTPStreaming() (RTSPServerSupportingHTTPStreaming.cpp:60) ==509== by 0x41E685: RTSPServerSupportingHTTPStreaming::RTSPClientConnectionSupportingHTTPStreaming::afterStreaming(void*) (RTSPServerSupportingHTTPStreaming.cpp:260) ==509== by 0x40D030: MediaSink::onSourceClosure() (MediaSink.cpp:103) ==509== by 0x454B10: TCPStreamSink::processBuffer() (TCPStreamSink.cpp:77) ==509== by 0x454CA0: TCPStreamSink::ourOnSourceClosure1() (TCPStreamSink.cpp:117) ==509== by 0x454C7B: TCPStreamSink::ourOnSourceClosure(void*) (TCPStreamSink.cpp:111) ==509== ==509== Invalid read of size 4 ==509== at 0x454CBA: TCPStreamSink::numUnwrittenBytes() const (TCPStreamSink.hh:57) ==509== by 0x454AF0: TCPStreamSink::processBuffer() (TCPStreamSink.cpp:75) ==509== by 0x454C58: TCPStreamSink::afterGettingFrame(unsigned int, unsigned int) (TCPStreamSink.cpp:106) ==509== by 0x454BCD: TCPStreamSink::afterGettingFrame(void*, unsigned int, unsigned int, timeval, unsigned int) (TCPStreamSink.cpp:96) ==509== by 0x434DEE: FramedSource::afterGetting(FramedSource*) (FramedSource.cpp:91) ==509== by 0x44C0EF: MPEG2TransportStreamFramer::afterGettingFrame1(unsigned int, timeval) (MPEG2TransportStreamFramer.cpp:192) ==509== by 0x44BE66: MPEG2TransportStreamFramer::afterGettingFrame(void*, unsigned int, unsigned int, timeval, unsigned int) (MPEG2TransportStreamFramer.cpp:136) ==509== by 0x434DEE: FramedSource::afterGetting(FramedSource*) 
(FramedSource.cpp:91) ==509== by 0x4358F9: ByteStreamFileSource::doReadFromFile() (ByteStreamFileSource.cpp:179) ==509== by 0x435692: ByteStreamFileSource::fileReadableHandler(ByteStreamFileSource*, int) (ByteStreamFileSource.cpp:123) ==509== by 0x476E20: BasicTaskScheduler::SingleStep(unsigned int) (BasicTaskScheduler.cpp:163) ==509== by 0x47962D: BasicTaskScheduler0::doEventLoop(char*) (BasicTaskScheduler0.cpp:80) ==509== Address 0x5a2ed40 is 10,080 bytes inside a block of size 10,096 free'd ==509== at 0x4C2836C: operator delete(void*) (vg_replace_malloc.c:480) ==509== by 0x454839: TCPStreamSink::~TCPStreamSink() (TCPStreamSink.cpp:37) ==509== by 0x40CB1F: MediaLookupTable::remove(char const*) (Media.cpp:151) ==509== by 0x40C71A: Medium::close(UsageEnvironment&, char const*) (Media.cpp:53) ==509== by 0x40C757: Medium::close(Medium*) (Media.cpp:59) ==509== by 0x41DBB8: RTSPServerSupportingHTTPStreaming::RTSPClientConnectionSupportingHTTPStreaming::~RTSPClientConnectionSupportingHTTPStreaming() (RTSPServerSupportingHTTPStreaming.cpp:59) ==509== by 0x41DC17: RTSPServerSupportingHTTPStreaming::RTSPClientConnectionSupportingHTTPStreaming::~RTSPClientConnectionSupportingHTTPStreaming() (RTSPServerSupportingHTTPStreaming.cpp:60) ==509== by 0x41E685: RTSPServerSupportingHTTPStreaming::RTSPClientConnectionSupportingHTTPStreaming::afterStreaming(void*) (RTSPServerSupportingHTTPStreaming.cpp:260) ==509== by 0x40D030: MediaSink::onSourceClosure() (MediaSink.cpp:103) ==509== by 0x454B10: TCPStreamSink::processBuffer() (TCPStreamSink.cpp:77) ==509== by 0x454CA0: TCPStreamSink::ourOnSourceClosure1() (TCPStreamSink.cpp:117) ==509== by 0x454C7B: TCPStreamSink::ourOnSourceClosure(void*) (TCPStreamSink.cpp:111) ==509== ==509== Invalid read of size 8 ==509== at 0x40C538: Medium::envir() const (Media.hh:59) ==509== by 0x40CFD8: MediaSink::onSourceClosure() (MediaSink.cpp:99) ==509== by 0x454B10: TCPStreamSink::processBuffer() (TCPStreamSink.cpp:77) ==509== by 0x454C58: TCPStreamSink::afterGettingFrame(unsigned int, unsigned int) (TCPStreamSink.cpp:106) ==509== by 0x454BCD: TCPStreamSink::afterGettingFrame(void*, unsigned int, unsigned int, timeval, unsigned int) (TCPStreamSink.cpp:96) ==509== by 0x434DEE: FramedSource::afterGetting(FramedSource*) (FramedSource.cpp:91) ==509== by 0x44C0EF: MPEG2TransportStreamFramer::afterGettingFrame1(unsigned int, timeval) (MPEG2TransportStreamFramer.cpp:192) ==509== by 0x44BE66: MPEG2TransportStreamFramer::afterGettingFrame(void*, unsigned int, unsigned int, timeval, unsigned int) (MPEG2TransportStreamFramer.cpp:136) ==509== by 0x434DEE: FramedSource::afterGetting(FramedSource*) (FramedSource.cpp:91) ==509== by 0x4358F9: ByteStreamFileSource::doReadFromFile() (ByteStreamFileSource.cpp:179) ==509== by 0x435692: ByteStreamFileSource::fileReadableHandler(ByteStreamFileSource*, int) (ByteStreamFileSource.cpp:123) ==509== by 0x476E20: BasicTaskScheduler::SingleStep(unsigned int) (BasicTaskScheduler.cpp:163) ==509== Address 0x5a2c5e8 is 8 bytes inside a block of size 10,096 free'd ==509== at 0x4C2836C: operator delete(void*) (vg_replace_malloc.c:480) ==509== by 0x454839: TCPStreamSink::~TCPStreamSink() (TCPStreamSink.cpp:37) ==509== by 0x40CB1F: MediaLookupTable::remove(char const*) (Media.cpp:151) ==509== by 0x40C71A: Medium::close(UsageEnvironment&, char const*) (Media.cpp:53) ==509== by 0x40C757: Medium::close(Medium*) (Media.cpp:59) ==509== by 0x41DBB8: 
RTSPServerSupportingHTTPStreaming::RTSPClientConnectionSupportingHTTPStreaming::~RTSPClientConnectionSupportingHTTPStreaming() (RTSPServerSupportingHTTPStreaming.cpp:59) ==509== by 0x41DC17: RTSPServerSupportingHTTPStreaming::RTSPClientConnectionSupportingHTTPStreaming::~RTSPClientConnectionSupportingHTTPStreaming() (RTSPServerSupportingHTTPStreaming.cpp:60) ==509== by 0x41E685: RTSPServerSupportingHTTPStreaming::RTSPClientConnectionSupportingHTTPStreaming::afterStreaming(void*) (RTSPServerSupportingHTTPStreaming.cpp:260) ==509== by 0x40D030: MediaSink::onSourceClosure() (MediaSink.cpp:103) ==509== by 0x454B10: TCPStreamSink::processBuffer() (TCPStreamSink.cpp:77) ==509== by 0x454CA0: TCPStreamSink::ourOnSourceClosure1() (TCPStreamSink.cpp:117) ==509== by 0x454C7B: TCPStreamSink::ourOnSourceClosure(void*) (TCPStreamSink.cpp:111) ==509== ==509== Invalid read of size 8 ==509== at 0x4795AA: BasicTaskScheduler0::unscheduleDelayedTask(void*&) (BasicTaskScheduler0.cpp:71) ==509== by 0x40D002: MediaSink::onSourceClosure() (MediaSink.cpp:99) ==509== by 0x454B10: TCPStreamSink::processBuffer() (TCPStreamSink.cpp:77) ==509== by 0x454C58: TCPStreamSink::afterGettingFrame(unsigned int, unsigned int) (TCPStreamSink.cpp:106) ==509== by 0x454BCD: TCPStreamSink::afterGettingFrame(void*, unsigned int, unsigned int, timeval, unsigned int) (TCPStreamSink.cpp:96) ==509== by 0x434DEE: FramedSource::afterGetting(FramedSource*) (FramedSource.cpp:91) ==509== by 0x44C0EF: MPEG2TransportStreamFramer::afterGettingFrame1(unsigned int, timeval) (MPEG2TransportStreamFramer.cpp:192) ==509== by 0x44BE66: MPEG2TransportStreamFramer::afterGettingFrame(void*, unsigned int, unsigned int, timeval, unsigned int) (MPEG2TransportStreamFramer.cpp:136) ==509== by 0x434DEE: FramedSource::afterGetting(FramedSource*) (FramedSource.cpp:91) ==509== by 0x4358F9: ByteStreamFileSource::doReadFromFile() (ByteStreamFileSource.cpp:179) ==509== by 0x435692: ByteStreamFileSource::fileReadableHandler(ByteStreamFileSource*, int) (ByteStreamFileSource.cpp:123) ==509== by 0x476E20: BasicTaskScheduler::SingleStep(unsigned int) (BasicTaskScheduler.cpp:163) ==509== Address 0x5a2c610 is 48 bytes inside a block of size 10,096 free'd ==509== at 0x4C2836C: operator delete(void*) (vg_replace_malloc.c:480) ==509== by 0x454839: TCPStreamSink::~TCPStreamSink() (TCPStreamSink.cpp:37) ==509== by 0x40CB1F: MediaLookupTable::remove(char const*) (Media.cpp:151) ==509== by 0x40C71A: Medium::close(UsageEnvironment&, char const*) (Media.cpp:53) ==509== by 0x40C757: Medium::close(Medium*) (Media.cpp:59) ==509== by 0x41DBB8: RTSPServerSupportingHTTPStreaming::RTSPClientConnectionSupportingHTTPStreaming::~RTSPClientConnectionSupportingHTTPStreaming() (RTSPServerSupportingHTTPStreaming.cpp:59) ==509== by 0x41DC17: RTSPServerSupportingHTTPStreaming::RTSPClientConnectionSupportingHTTPStreaming::~RTSPClientConnectionSupportingHTTPStreaming() (RTSPServerSupportingHTTPStreaming.cpp:60) ==509== by 0x41E685: RTSPServerSupportingHTTPStreaming::RTSPClientConnectionSupportingHTTPStreaming::afterStreaming(void*) (RTSPServerSupportingHTTPStreaming.cpp:260) ==509== by 0x40D030: MediaSink::onSourceClosure() (MediaSink.cpp:103) ==509== by 0x454B10: TCPStreamSink::processBuffer() (TCPStreamSink.cpp:77) ==509== by 0x454CA0: TCPStreamSink::ourOnSourceClosure1() (TCPStreamSink.cpp:117) ==509== by 0x454C7B: TCPStreamSink::ourOnSourceClosure(void*) (TCPStreamSink.cpp:111) ==509== ==509== Invalid write of size 8 ==509== at 0x4795C8: BasicTaskScheduler0::unscheduleDelayedTask(void*&) 
(BasicTaskScheduler0.cpp:72) ==509== by 0x40D002: MediaSink::onSourceClosure() (MediaSink.cpp:99) ==509== by 0x454B10: TCPStreamSink::processBuffer() (TCPStreamSink.cpp:77) ==509== by 0x454C58: TCPStreamSink::afterGettingFrame(unsigned int, unsigned int) (TCPStreamSink.cpp:106) ==509== by 0x454BCD: TCPStreamSink::afterGettingFrame(void*, unsigned int, unsigned int, timeval, unsigned int) (TCPStreamSink.cpp:96) ==509== by 0x434DEE: FramedSource::afterGetting(FramedSource*) (FramedSource.cpp:91) ==509== by 0x44C0EF: MPEG2TransportStreamFramer::afterGettingFrame1(unsigned int, timeval) (MPEG2TransportStreamFramer.cpp:192) ==509== by 0x44BE66: MPEG2TransportStreamFramer::afterGettingFrame(void*, unsigned int, unsigned int, timeval, unsigned int) (MPEG2TransportStreamFramer.cpp:136) ==509== by 0x434DEE: FramedSource::afterGetting(FramedSource*) (FramedSource.cpp:91) ==509== by 0x4358F9: ByteStreamFileSource::doReadFromFile() (ByteStreamFileSource.cpp:179) ==509== by 0x435692: ByteStreamFileSource::fileReadableHandler(ByteStreamFileSource*, int) (ByteStreamFileSource.cpp:123) ==509== by 0x476E20: BasicTaskScheduler::SingleStep(unsigned int) (BasicTaskScheduler.cpp:163) ==509== Address 0x5a2c610 is 48 bytes inside a block of size 10,096 free'd ==509== at 0x4C2836C: operator delete(void*) (vg_replace_malloc.c:480) ==509== by 0x454839: TCPStreamSink::~TCPStreamSink() (TCPStreamSink.cpp:37) ==509== by 0x40CB1F: MediaLookupTable::remove(char const*) (Media.cpp:151) ==509== by 0x40C71A: Medium::close(UsageEnvironment&, char const*) (Media.cpp:53) ==509== by 0x40C757: Medium::close(Medium*) (Media.cpp:59) ==509== by 0x41DBB8: RTSPServerSupportingHTTPStreaming::RTSPClientConnectionSupportingHTTPStreaming::~RTSPClientConnectionSupportingHTTPStreaming() (RTSPServerSupportingHTTPStreaming.cpp:59) ==509== by 0x41DC17: RTSPServerSupportingHTTPStreaming::RTSPClientConnectionSupportingHTTPStreaming::~RTSPClientConnectionSupportingHTTPStreaming() (RTSPServerSupportingHTTPStreaming.cpp:60) ==509== by 0x41E685: RTSPServerSupportingHTTPStreaming::RTSPClientConnectionSupportingHTTPStreaming::afterStreaming(void*) (RTSPServerSupportingHTTPStreaming.cpp:260) ==509== by 0x40D030: MediaSink::onSourceClosure() (MediaSink.cpp:103) ==509== by 0x454B10: TCPStreamSink::processBuffer() (TCPStreamSink.cpp:77) ==509== by 0x454CA0: TCPStreamSink::ourOnSourceClosure1() (TCPStreamSink.cpp:117) ==509== by 0x454C7B: TCPStreamSink::ourOnSourceClosure(void*) (TCPStreamSink.cpp:111) ==509== ==509== Invalid write of size 8 ==509== at 0x40D007: MediaSink::onSourceClosure() (MediaSink.cpp:101) ==509== by 0x454B10: TCPStreamSink::processBuffer() (TCPStreamSink.cpp:77) ==509== by 0x454C58: TCPStreamSink::afterGettingFrame(unsigned int, unsigned int) (TCPStreamSink.cpp:106) ==509== by 0x454BCD: TCPStreamSink::afterGettingFrame(void*, unsigned int, unsigned int, timeval, unsigned int) (TCPStreamSink.cpp:96) ==509== by 0x434DEE: FramedSource::afterGetting(FramedSource*) (FramedSource.cpp:91) ==509== by 0x44C0EF: MPEG2TransportStreamFramer::afterGettingFrame1(unsigned int, timeval) (MPEG2TransportStreamFramer.cpp:192) ==509== by 0x44BE66: MPEG2TransportStreamFramer::afterGettingFrame(void*, unsigned int, unsigned int, timeval, unsigned int) (MPEG2TransportStreamFramer.cpp:136) ==509== by 0x434DEE: FramedSource::afterGetting(FramedSource*) (FramedSource.cpp:91) ==509== by 0x4358F9: ByteStreamFileSource::doReadFromFile() (ByteStreamFileSource.cpp:179) ==509== by 0x435692: ByteStreamFileSource::fileReadableHandler(ByteStreamFileSource*, int) 
(ByteStreamFileSource.cpp:123) ==509== by 0x476E20: BasicTaskScheduler::SingleStep(unsigned int) (BasicTaskScheduler.cpp:163) ==509== by 0x47962D: BasicTaskScheduler0::doEventLoop(char*) (BasicTaskScheduler0.cpp:80) ==509== Address 0x5a2c618 is 56 bytes inside a block of size 10,096 free'd ==509== at 0x4C2836C: operator delete(void*) (vg_replace_malloc.c:480) ==509== by 0x454839: TCPStreamSink::~TCPStreamSink() (TCPStreamSink.cpp:37) ==509== by 0x40CB1F: MediaLookupTable::remove(char const*) (Media.cpp:151) ==509== by 0x40C71A: Medium::close(UsageEnvironment&, char const*) (Media.cpp:53) ==509== by 0x40C757: Medium::close(Medium*) (Media.cpp:59) ==509== by 0x41DBB8: RTSPServerSupportingHTTPStreaming::RTSPClientConnectionSupportingHTTPStreaming::~RTSPClientConnectionSupportingHTTPStreaming() (RTSPServerSupportingHTTPStreaming.cpp:59) ==509== by 0x41DC17: RTSPServerSupportingHTTPStreaming::RTSPClientConnectionSupportingHTTPStreaming::~RTSPClientConnectionSupportingHTTPStreaming() (RTSPServerSupportingHTTPStreaming.cpp:60) ==509== by 0x41E685: RTSPServerSupportingHTTPStreaming::RTSPClientConnectionSupportingHTTPStreaming::afterStreaming(void*) (RTSPServerSupportingHTTPStreaming.cpp:260) ==509== by 0x40D030: MediaSink::onSourceClosure() (MediaSink.cpp:103) ==509== by 0x454B10: TCPStreamSink::processBuffer() (TCPStreamSink.cpp:77) ==509== by 0x454CA0: TCPStreamSink::ourOnSourceClosure1() (TCPStreamSink.cpp:117) ==509== by 0x454C7B: TCPStreamSink::ourOnSourceClosure(void*) (TCPStreamSink.cpp:111) ==509== ==509== Invalid read of size 8 ==509== at 0x40D013: MediaSink::onSourceClosure() (MediaSink.cpp:102) ==509== by 0x454B10: TCPStreamSink::processBuffer() (TCPStreamSink.cpp:77) ==509== by 0x454C58: TCPStreamSink::afterGettingFrame(unsigned int, unsigned int) (TCPStreamSink.cpp:106) ==509== by 0x454BCD: TCPStreamSink::afterGettingFrame(void*, unsigned int, unsigned int, timeval, unsigned int) (TCPStreamSink.cpp:96) ==509== by 0x434DEE: FramedSource::afterGetting(FramedSource*) (FramedSource.cpp:91) ==509== by 0x44C0EF: MPEG2TransportStreamFramer::afterGettingFrame1(unsigned int, timeval) (MPEG2TransportStreamFramer.cpp:192) ==509== by 0x44BE66: MPEG2TransportStreamFramer::afterGettingFrame(void*, unsigned int, unsigned int, timeval, unsigned int) (MPEG2TransportStreamFramer.cpp:136) ==509== by 0x434DEE: FramedSource::afterGetting(FramedSource*) (FramedSource.cpp:91) ==509== by 0x4358F9: ByteStreamFileSource::doReadFromFile() (ByteStreamFileSource.cpp:179) ==509== by 0x435692: ByteStreamFileSource::fileReadableHandler(ByteStreamFileSource*, int) (ByteStreamFileSource.cpp:123) ==509== by 0x476E20: BasicTaskScheduler::SingleStep(unsigned int) (BasicTaskScheduler.cpp:163) ==509== by 0x47962D: BasicTaskScheduler0::doEventLoop(char*) (BasicTaskScheduler0.cpp:80) ==509== Address 0x5a2c620 is 64 bytes inside a block of size 10,096 free'd ==509== at 0x4C2836C: operator delete(void*) (vg_replace_malloc.c:480) ==509== by 0x454839: TCPStreamSink::~TCPStreamSink() (TCPStreamSink.cpp:37) ==509== by 0x40CB1F: MediaLookupTable::remove(char const*) (Media.cpp:151) ==509== by 0x40C71A: Medium::close(UsageEnvironment&, char const*) (Media.cpp:53) ==509== by 0x40C757: Medium::close(Medium*) (Media.cpp:59) ==509== by 0x41DBB8: RTSPServerSupportingHTTPStreaming::RTSPClientConnectionSupportingHTTPStreaming::~RTSPClientConnectionSupportingHTTPStreaming() (RTSPServerSupportingHTTPStreaming.cpp:59) ==509== by 0x41DC17: 
RTSPServerSupportingHTTPStreaming::RTSPClientConnectionSupportingHTTPStreaming::~RTSPClientConnectionSupportingHTTPStreaming() (RTSPServerSupportingHTTPStreaming.cpp:60) ==509== by 0x41E685: RTSPServerSupportingHTTPStreaming::RTSPClientConnectionSupportingHTTPStreaming::afterStreaming(void*) (RTSPServerSupportingHTTPStreaming.cpp:260) ==509== by 0x40D030: MediaSink::onSourceClosure() (MediaSink.cpp:103) ==509== by 0x454B10: TCPStreamSink::processBuffer() (TCPStreamSink.cpp:77) ==509== by 0x454CA0: TCPStreamSink::ourOnSourceClosure1() (TCPStreamSink.cpp:117) ==509== by 0x454C7B: TCPStreamSink::ourOnSourceClosure(void*) (TCPStreamSink.cpp:111) ==509== From finlayson at live555.com Fri Jan 17 23:53:34 2014 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 17 Jan 2014 23:53:34 -0800 Subject: [Live-devel] TS parsing in testMPEG2TransportReceiver In-Reply-To: References: Message-ID: <42D758BA-C231-4D51-AFA1-53B74BB4EABB@live555.com> > I am using testMPEG2TransportReceiver over rpt . > At receiver end,i want to receive and parse each packet of 188 byte of TS stream. > please suggest what function and classes i can use for this purpose. We don't provide any general-purpose Transport Stream parsing code - but I'm sure you can find code for doing this somewhere else, and then use it to write your own "MediaSink" subclass. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sat Jan 18 17:02:13 2014 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 18 Jan 2014 17:02:13 -0800 Subject: [Live-devel] HTTP live streaming crash In-Reply-To: References: <2A9286C6-B9B1-43F7-9B09-8435B22E173E@live555.com> <2B72C2F8-8A9F-4474-9818-1093750C87D5@live555.com> Message-ID: > The latest(0117) release still crashes on HLS. \ OK, thanks for the report. I believe I have fixed the bug in the most recent version (2014.01.19). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From parkchan1960 at gmail.com Sun Jan 19 18:02:06 2014 From: parkchan1960 at gmail.com (Pak Man Chan) Date: Mon, 20 Jan 2014 10:02:06 +0800 Subject: [Live-devel] HTTP live streaming crash In-Reply-To: References: <2A9286C6-B9B1-43F7-9B09-8435B22E173E@live555.com> <2B72C2F8-8A9F-4474-9818-1093750C87D5@live555.com> Message-ID: On Sun, Jan 19, 2014 at 9:02 AM, Ross Finlayson wrote: > The latest(0117) release still crashes on HLS. \ > > > OK, thanks for the report. > > I believe I have fixed the bug in the most recent version (2014.01.19). > > > Hi Ross, 2014.01.19 release fails at the same location as the 0117 release when HLS runs. Here is the stack trace at crash. #0 0x0000000000000000 in ?? () #1 0x000000000040d003 in MediaSink::onSourceClosure (this=0x6be830) at MediaSink.cpp:99 #2 0x0000000000454b4d in TCPStreamSink::processBuffer (this=0x6be830) at TCPStreamSink.cpp:77 #3 0x0000000000454c95 in TCPStreamSink::afterGettingFrame (this=0x6be830, frameSize=376, numTruncatedBytes=0) at TCPStreamSink.cpp:106 #4 0x0000000000454c0a in TCPStreamSink::afterGettingFrame (clientData=0x6be830, frameSize=376, numTruncatedBytes=0) at TCPStreamSink.cpp:96 #5 0x0000000000434dd7 in FramedSource::afterGetting (source=0x6bd8a0) at FramedSource.cpp:91 #6 0x000000000044c12c in MPEG2TransportStreamFramer::afterGettingFrame1 ( this=0x6bd8a0, frameSize=376, presentationTime=...) 
at MPEG2TransportStreamFramer.cpp:192 #7 0x000000000044bea3 in MPEG2TransportStreamFramer::afterGettingFrame ( clientData=0x6bd8a0, frameSize=376, presentationTime=...) at MPEG2TransportStreamFramer.cpp:136 #8 0x0000000000434dd7 in FramedSource::afterGetting (source=0x6bd7d0) at FramedSource.cpp:91 #9 0x00000000004358fe in ByteStreamFileSource::doReadFromFile (this=0x6bd7d0) at ByteStreamFileSource.cpp:179 #10 0x0000000000435697 in ByteStreamFileSource::fileReadableHandler (source=0x6bd7d0) at ByteStreamFileSource.cpp:123 #11 0x0000000000476e5d in BasicTaskScheduler::SingleStep (this=0x6b5010, maxDelayTime=0) at BasicTaskScheduler.cpp:163 #12 0x000000000047966a in BasicTaskScheduler0::doEventLoop (this=0x6b5010, watchVariable=0x0) at BasicTaskScheduler0.cpp:80 #13 0x000000000040b862 in main (argc=1, argv=0x7fffffffe4a8) at live555MediaServer.cpp:88 Thanks. Park -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Jan 20 00:50:49 2014 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 20 Jan 2014 00:50:49 -0800 Subject: [Live-devel] HTTP live streaming crash In-Reply-To: References: <2A9286C6-B9B1-43F7-9B09-8435B22E173E@live555.com> <2B72C2F8-8A9F-4474-9818-1093750C87D5@live555.com> Message-ID: > 2014.01.19 release fails at the same location as the 0117 release when HLS runs. Arggh! Tracking down and fixing bugs in this code has been frustrating. Try the next release: 2014.01.20 - available now. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From phuocpham at ssu.ac.kr Mon Jan 20 04:39:18 2014 From: phuocpham at ssu.ac.kr (=?EUC-KR?B?ZGFuaWVs?=) Date: Mon, 20 Jan 2014 21:39:18 +0900 (KST) Subject: [Live-devel] =?euc-kr?q?Problem=5Fwith=5Fmpegts=5Fmuxer?= Message-ID: <52dd1c773fe1_@_imoxion.com> Hi Ross, Thank you so much for your response! I already updated the lasted version of Live555. Now I got another problem. I use this kind of chain to make the transport stream: InputVideo(Live H.264)-> H264VideoStreamFramer->MPEG2TransportStreamFromESSource. With that chain I got the :Frame truncated -> would you tell me where I can increase the buffer ? The second problem is that after I use VLC to request data and play back. For the first VLC client It found the SPS and PPS and it plays back ok. But for the second VLC client, it cannot play back the stream because it doesnot receive the SPS and PPS and keeps waiting for those parameters. (packetizer_h264 warning: waiting for SPS/PPS) ----------?????---------- ????: ??? ????: Ross Finlayson ,LIVE555 Streaming Media - development & use ????: 2014-01-16 14:20:14 ? ?: Re: [Live-devel]Problem_with_mpegts_muxer FYI, the version of Live555 used on my IP camera is the version (1986-2010). ----------?????---------- ????: Ross Finlayson ????: LIVE555 Streaming Media - development & use ????: 2014-01-15 00:08:42 ? ?: Re: [Live-devel] Problem with mpegts muxer I'm working on an embedded device, the IP network camera. I am using MPEG2TransportStreamFromESSource to mux H.264 and AAC into transport stream. I succedded the muxing. But at the client side, the bottom right corner always has some broken part as shown in the image of the link below. The broken part is shown in the red square of the image. http://s1208.photobucket.com/user/phuocpham09t/media/mpegts.png.html Do you have any ideas why? No, unfortunately I don't. 
However, why are you multiplexing your video and audio into a Transport Stream, and then transmitting the Transport Stream? It is *much* more efficient (and robust) to stream the H.264 video and AAC audio separately, i.e., as separate RTP streams - without dealing with Transport Streams at all. The way to do this is to create two different "ServerMediaSubsession"s (each one a subclass of "OnDemandServerMediaSubsession", assuming that your server is streaming unicast), and add each one to your server's "ServerMediaSession" (using two calls to "RTSPServer::addSubsession()"). One of your "ServerMediaSubsession" subclasses (for video) would create (in its "createNewRTPSink()" virtual function implementation) a "H264VideoRTPSink", fed from a "H264VideoStreamDiscreteFramer". Your second "ServerMediaSubsession" subclass (for audio) would create (in its "createNewRTPSink()" virtual function implementation) a "MPEG4GenericRTPSink". (See "ADTSAudioFileServerMediaSubsession.cpp" for an example of how to create a "MPEG4GenericRTPSink" for streaming AAC audio.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From phuocpham at ssu.ac.kr Mon Jan 20 05:44:27 2014 From: phuocpham at ssu.ac.kr (=?EUC-KR?B?xsq53ceq?=) Date: Mon, 20 Jan 2014 22:44:27 +0900 (KST) Subject: [Live-devel] =?euc-kr?q?Problem=5Fwith=5Fmpegts=5Fmuxer?= Message-ID: <52dd29523fb0_@_imoxion.com> Hi Ross, I just found out what's going on here. I feed the Live H.264 directly to the MPEG2TransportStreamFromESSource to make the TS. But right here I have two cases: + If I keep the startcode of the H.264 stream then the broken part at the right bottom corner appears but instead I can play back with multiples VLC clients. + If I skip the startcode of the H.264 stream then there's no broken part at the right bottom corner but instead I just can play it back with only one VLC client. Do you have any ideas? ----------?????---------- ????: daniel ????: Ross Finlayson ,LIVE555 Streaming Media - development & use ????: 2014-01-20 21:39:18 ? ?: Re: [Live-devel]Problem_with_mpegts_muxer Hi Ross, Thank you so much for your response! I already updated the lasted version of Live555. Now I got another problem. I use this kind of chain to make the transport stream: InputVideo(Live H.264)-> H264VideoStreamFramer->MPEG2TransportStreamFromESSource. With that chain I got the :Frame truncated -> would you tell me where I can increase the buffer ? The second problem is that after I use VLC to request data and play back. For the first VLC client It found the SPS and PPS and it plays back ok. But for the second VLC client, it cannot play back the stream because it doesnot receive the SPS and PPS and keeps waiting for those parameters. (packetizer_h264 warning: waiting for SPS/PPS) ----------?????---------- ????: ??? ????: Ross Finlayson ,LIVE555 Streaming Media - development & use ????: 2014-01-16 14:20:14 ? ?: Re: [Live-devel]Problem_with_mpegts_muxer FYI, the version of Live555 used on my IP camera is the version (1986-2010). ----------?????---------- ????: Ross Finlayson ????: LIVE555 Streaming Media - development & use ????: 2014-01-15 00:08:42 ? ?: Re: [Live-devel] Problem with mpegts muxer I'm working on an embedded device, the IP network camera. I am using MPEG2TransportStreamFromESSource to mux H.264 and AAC into transport stream. I succedded the muxing. 
But at the client side, the bottom right corner always has some broken part as shown in the image of the link below. The broken part is shown in the red square of the image. http://s1208.photobucket.com/user/phuocpham09t/media/mpegts.png.html Do you have any ideas why? No, unfortunately I don't. However, why are you multiplexing your video and audio into a Transport Stream, and then transmitting the Transport Stream? It is *much* more efficient (and robust) to stream the H.264 video and AAC audio separately, i.e., as separate RTP streams - without dealing with Transport Streams at all. The way to do this is to create two different "ServerMediaSubsession"s (each one a subclass of "OnDemandServerMediaSubsession", assuming that your server is streaming unicast), and add each one to your server's "ServerMediaSession" (using two calls to "RTSPServer::addSubsession()"). One of your "ServerMediaSubsession" subclasses (for video) would create (in its "createNewRTPSink()" virtual function implementation) a "H264VideoRTPSink", fed from a "H264VideoStreamDiscreteFramer". Your second "ServerMediaSubsession" subclass (for audio) would create (in its "createNewRTPSink()" virtual function implementation) a "MPEG4GenericRTPSink". (See "ADTSAudioFileServerMediaSubsession.cpp" for an example of how to create a "MPEG4GenericRTPSink" for streaming AAC audio.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Jan 20 09:58:55 2014 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 20 Jan 2014 09:58:55 -0800 Subject: [Live-devel] Problem_with_mpegts_muxer In-Reply-To: <52dd29523fb0_@_imoxion.com> References: <52dd29523fb0_@_imoxion.com> Message-ID: <4E4804C6-90AD-47C4-9E18-6DA906FB3B94@live555.com> First, please trim your messages when responding to email. On Jan 20, 2014, at 5:44 AM, ??? wrote: > I just found out what's going on here. > I feed the Live H.264 directly to the MPEG2TransportStreamFromESSource to make the TS. But right here I have two cases: > + If I keep the startcode of the H.264 stream then the broken part at the right bottom corner appears but instead I can play back with multiples VLC clients. > + If I skip the startcode of the H.264 stream then there's no broken part at the right bottom corner but instead I just can play it back with only one VLC client. > > Do you have any ideas? No, because VLC is not our software. In any case, I've already explained to you that multiplexing your video (& audio) into a Transport Stream - and then streaming the Transport Stream - is a bad idea. It's much better to stream the video (and audio) directly - as individual RTP streams. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From dgarri02 at harris.com Mon Jan 20 17:21:00 2014 From: dgarri02 at harris.com (Garrison, David) Date: Tue, 21 Jan 2014 01:21:00 +0000 Subject: [Live-devel] streaming h264 dynamic payload from rtp source Message-ID: Hello - I've been looking at the examples and have tried to accomplish setting up an RTSP server that can stream RTP H264 dynamic payload from a live RTP source carrying MPEG2 Transport with H264 PES payload and keep running into problems. Any guidance would be greatly appreciated. My latest attempt includes the following: 1. Instantiate an RTSPServer using my own ProxyServer derived from ServerMediaSession. 2. 
The ProxyServer owns a SimpleRTPSource which it passes into my own Subsession that extends OnDemandServerMediaaSubsession 3. Subsession overrides createNewStreamSource and does the following: a. Creates MPEG1or2Demux using the SimpleRTPSource as its FrameSource input b. Creates an MPEG2IFrameIndexFromTransportStream using the demuxed newVideoStream() as its FrameSource input c. Return a new H264VideoStreamFramer using object from prior step as its frame source 4. Subsession overrides createNewRTPSinnk and returns a new H264VideoRTPSink After the client sets up and requests the play, nothing happens and it tries again over and over. I at least know the sockets are setup correctly for the SimpleRTPSource. Dave Garrison -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Jan 20 17:47:21 2014 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 20 Jan 2014 17:47:21 -0800 Subject: [Live-devel] streaming h264 dynamic payload from rtp source In-Reply-To: References: Message-ID: <930145E1-1CB2-4F54-B2DE-FEACC094C039@live555.com> > I?ve been looking at the examples and have tried to accomplish setting up an RTSP server that can stream RTP H264 dynamic payload from a live RTP source carrying MPEG2 Transport with H264 PES payload and keep running into problems. Any guidance would be greatly appreciated. Dave, The bad news is that you're on completely the wrong track :-) The good news, however, is that it will probably be fairly easy to do what you want. The first thing to note is that the "LIVE555 Streaming Media" code does not include any mechanism for demultiplexing one (or more) stream(s) from a Transport Stream. (Note that the "MPEG1or2Demux" class is for demultiplexing a MPEG *Program Stream*, which is completely different.) Also, the "IndexFromTransportStream" code is for creating a separate index file from a Transport Stream file - to later use for 'trick play' access to the original file. It has nothing to with demultiplexing from a Transport Stream. Therefore, I suggest that you find (or write) some other software that will convert an input Transport Stream into an output H.264 video stream (consisting of a byte-stream that contains a sequence of H.264 NAL units, each beginning with a 0x00 0x00 0x00 0x01 'start code'). There is probably some other Open Source software out there that will do this. Then, write your own "OnDemandServerMediaSubsession" subclass that: 1/ Reimplements "createNewStreamSource()" to create a new instance of your 'Transport Stream-to-H.264 Video conversion software', and feed this into a "H264VideoStreamFramer" (*not* a "H264VideoStreamDiscreteFramer", unless your conversion software delivers discrete H.264 NAL units (i.e., one-at-a-time), without start codes). 2/ Reimplements "createNewRTPSink()" to return a new "H264VideoRTPSink". (You got that part right :-) Even simpler, though, you can probably get what you want by doing the following: - Change the "testOnDemandRTSPServer" code by changing line 92 of "testProgs/testOnDemandRTSPServer.cpp" from char const* inputFileName = "test.264"; to char const* inputFileName = "stdin"; Then, run Your-Transport-Stream-to-H264-Video-Conversion-Application | testOnDemandRTSPServer (assuming that 'Your-Transport-Stream-to-H264-Video-Conversion-Application' outputs to 'stdout') This should work, provided that the input stream contains frequently-occurring H.264 SPS and PPS NAL units. Ross Finlayson Live Networks, Inc. 
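(For concreteness, a minimal sketch of the kind of subclass described above, under the 'even simpler' assumption that the already-converted H.264 byte stream is read from stdin. "MyH264StdinSubsession" is an illustrative name, not a library class; the two overridden virtual functions, and the classes used inside them, are the standard LIVE555 API. A production subclass would normally also override "getAuxSDPLine()" - as "H264VideoFileServerMediaSubsession" does - so that the SDP can carry the stream's SPS/PPS.)

#include "liveMedia.hh"

class MyH264StdinSubsession : public OnDemandServerMediaSubsession {
public:
  static MyH264StdinSubsession* createNew(UsageEnvironment& env, Boolean reuseFirstSource) {
    return new MyH264StdinSubsession(env, reuseFirstSource);
  }
protected:
  MyH264StdinSubsession(UsageEnvironment& env, Boolean reuseFirstSource)
    : OnDemandServerMediaSubsession(env, reuseFirstSource) {}

  // Build the source chain for each client session:
  virtual FramedSource* createNewStreamSource(unsigned /*clientSessionId*/,
                                              unsigned& estBitrate) {
    estBitrate = 2000; // kbps; a rough estimate, used for RTCP bandwidth

    // The converted H.264 byte stream (NAL units, each preceded by a
    // 0x00 0x00 0x00 0x01 start code) is read from stdin:
    ByteStreamFileSource* byteStream = ByteStreamFileSource::createNew(envir(), "stdin");
    if (byteStream == NULL) return NULL;

    // The framer re-discovers NAL unit boundaries for the RTP sink:
    return H264VideoStreamFramer::createNew(envir(), byteStream);
  }

  // Build the RTP sink for each client session:
  virtual RTPSink* createNewRTPSink(Groupsock* rtpGroupsock,
                                    unsigned char rtpPayloadTypeIfDynamic,
                                    FramedSource* /*inputSource*/) {
    return H264VideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic);
  }
};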
http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From parkchan1960 at gmail.com Mon Jan 20 17:40:26 2014 From: parkchan1960 at gmail.com (Pak Man Chan) Date: Tue, 21 Jan 2014 09:40:26 +0800 Subject: [Live-devel] HTTP live streaming crash In-Reply-To: References: <2A9286C6-B9B1-43F7-9B09-8435B22E173E@live555.com> <2B72C2F8-8A9F-4474-9818-1093750C87D5@live555.com> Message-ID: > > > Arggh! Tracking down and fixing bugs in this code has been frustrating. > > Try the next release: 2014.01.20 - available now. > > > Hi Ross, Thanks, this fixes the crash. Regards, Park -------------- next part -------------- An HTML attachment was scrubbed... URL: From parkchan1960 at gmail.com Mon Jan 20 17:47:21 2014 From: parkchan1960 at gmail.com (Pak Man Chan) Date: Tue, 21 Jan 2014 09:47:21 +0800 Subject: [Live-devel] UDP ports in HTTP Live Streaming Message-ID: Hi Ross, Could you advise the roles of the UDP ports in HLS? The problem with them is that they are not being closed when streaming ends. Sooner or later, new sockets run out and streaming will stop as the server will try repeatedly creating the sockets in OnDemandServerMediaSubsession::getStreamParameters. I've tried simply skipping the socket creation, but this will stop streaming too. Regards, Park -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Jan 20 23:05:35 2014 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 20 Jan 2014 23:05:35 -0800 Subject: [Live-devel] UDP ports in HTTP Live Streaming In-Reply-To: References: Message-ID: <7D1A47BF-6870-459F-9AD4-3A84FA84CF33@live555.com> > Could you advise the roles of the UDP ports in HLS? > > The problem with them is that they are not being closed when streaming ends. Sooner or later, new sockets run out and streaming will stop as the server will try repeatedly creating the sockets in OnDemandServerMediaSubsession::getStreamParameters. OK, I've just installed a new version (2014.01.21) of the code that will no longer create these (unneeded) UDP ports when HTTP Live Streaming is being set up. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From dgarri02 at harris.com Tue Jan 21 10:52:58 2014 From: dgarri02 at harris.com (Garrison, David) Date: Tue, 21 Jan 2014 18:52:58 +0000 Subject: [Live-devel] streaming h264 dynamic payload from rtp source In-Reply-To: <930145E1-1CB2-4F54-B2DE-FEACC094C039@live555.com> References: <930145E1-1CB2-4F54-B2DE-FEACC094C039@live555.com> Message-ID: Ross, Thanks for the quick response. Ok, So I've started my own MPEG2PesParser class that extends FramedFilter to do the parsing. I started the implementation for doGetNextFrame() and understand how to parse the PES from the datagram payload but I still do not see how to get at the data nor how to pass that data along to the sink. I set some breakpoints in RTPInterface and see the startNetworkReading being called in response to the PLAY request but the incomingReportHandler is not called. Seems my MPEG2PesParser::doGetNextFrame() is being called before the RTPInterface::incomingReportHandler is called. What am I missing? 
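(For reference, a "FramedFilter" such as the "MPEG2PesParser" mentioned above usually follows the pattern sketched below: doGetNextFrame() asks the upstream source for data, and the completion callback fills in the frame bookkeeping and then calls afterGetting(). The class name is illustrative and the actual PES parsing is left out; only the plumbing shown here is the standard LIVE555 idiom.)

#include "liveMedia.hh"

class MyPESParser : public FramedFilter {
public:
  static MyPESParser* createNew(UsageEnvironment& env, FramedSource* inputSource) {
    return new MyPESParser(env, inputSource);
  }
protected:
  MyPESParser(UsageEnvironment& env, FramedSource* inputSource)
    : FramedFilter(env, inputSource) {}

private:
  // Called by our downstream object (framer, sink, ...) when it wants data:
  virtual void doGetNextFrame() {
    // Ask the upstream source (fInputSource, e.g. a SimpleRTPSource) for its
    // next frame, delivered directly into our caller's buffer (fTo, fMaxSize):
    fInputSource->getNextFrame(fTo, fMaxSize,
                               afterGettingFrame, this,
                               FramedSource::handleClosure, this);
  }

  static void afterGettingFrame(void* clientData, unsigned frameSize,
                                unsigned numTruncatedBytes,
                                struct timeval presentationTime,
                                unsigned durationInMicroseconds) {
    ((MyPESParser*)clientData)->afterGettingFrame1(frameSize, numTruncatedBytes,
                                                   presentationTime,
                                                   durationInMicroseconds);
  }

  void afterGettingFrame1(unsigned frameSize, unsigned numTruncatedBytes,
                          struct timeval presentationTime,
                          unsigned durationInMicroseconds) {
    // (The real PES/TS parsing would go here, rewriting the data in fTo and
    //  adjusting frameSize accordingly; this sketch just passes data through.)
    fFrameSize = frameSize;
    fNumTruncatedBytes = numTruncatedBytes;
    fPresentationTime = presentationTime;
    fDurationInMicroseconds = durationInMicroseconds;
    afterGetting(this); // tell our downstream object that data is ready
  }
};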
Dave Garrison From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Monday, January 20, 2014 8:47 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] streaming h264 dynamic payload from rtp source I've been looking at the examples and have tried to accomplish setting up an RTSP server that can stream RTP H264 dynamic payload from a live RTP source carrying MPEG2 Transport with H264 PES payload and keep running into problems. Any guidance would be greatly appreciated. Dave, The bad news is that you're on completely the wrong track :-) The good news, however, is that it will probably be fairly easy to do what you want. The first thing to note is that the "LIVE555 Streaming Media" code does not include any mechanism for demultiplexing one (or more) stream(s) from a Transport Stream. (Note that the "MPEG1or2Demux" class is for demultiplexing a MPEG *Program Stream*, which is completely different.) Also, the "IndexFromTransportStream" code is for creating a separate index file from a Transport Stream file - to later use for 'trick play' access to the original file. It has nothing to with demultiplexing from a Transport Stream. Therefore, I suggest that you find (or write) some other software that will convert an input Transport Stream into an output H.264 video stream (consisting of a byte-stream that contains a sequence of H.264 NAL units, each beginning with a 0x00 0x00 0x00 0x01 'start code'). There is probably some other Open Source software out there that will do this. Then, write your own "OnDemandServerMediaSubsession" subclass that: 1/ Reimplements "createNewStreamSource()" to create a new instance of your 'Transport Stream-to-H.264 Video conversion software', and feed this into a "H264VideoStreamFramer" (*not* a "H264VideoStreamDiscreteFramer", unless your conversion software delivers discrete H.264 NAL units (i.e., one-at-a-time), without start codes). 2/ Reimplements "createNewRTPSink()" to return a new "H264VideoRTPSink". (You got that part right :-) Even simpler, though, you can probably get what you want by doing the following: - Change the "testOnDemandRTSPServer" code by changing line 92 of "testProgs/testOnDemandRTSPServer.cpp" from char const* inputFileName = "test.264"; to char const* inputFileName = "stdin"; Then, run Your-Transport-Stream-to-H264-Video-Conversion-Application | testOnDemandRTSPServer (assuming that 'Your-Transport-Stream-to-H264-Video-Conversion-Application' outputs to 'stdout') This should work, provided that the input stream contains frequently-occurring H.264 SPS and PPS NAL units. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Jan 21 11:11:01 2014 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 21 Jan 2014 11:11:01 -0800 Subject: [Live-devel] streaming h264 dynamic payload from rtp source In-Reply-To: References: <930145E1-1CB2-4F54-B2DE-FEACC094C039@live555.com> Message-ID: <5E4B98BD-A1D8-41CF-B9BC-25B6C8657D38@live555.com> > Ok, So I?ve started my own MPEG2PesParser class that extends FramedFilter to do the parsing. I started the implementation for doGetNextFrame() and understand how to parse the PES from the datagram payload but I still do not see how to get at the data nor how to pass that data along to the sink. 
> > I set some breakpoints in RTPInterface and see the startNetworkReading being called in response to the PLAY request but the incomingReportHandler is not called. > > Seems my MPEG2PesParser::doGetNextFrame() is being called before the RTPInterface::incomingReportHandler is called. > > What am I missing? I don't know, because unfortunately I don't understand what you are trying to do. In your last email, you said you wanted to write a server - i.e., something that transmits RTP packets. But now, you seem to be talking about something that *receives* RTP packets. In any case, I suspect that your project may be too complex for me to give you much advice 'for free' on this mailing list. If your company is interested in having me consult with you to help you with your project, then please contact me by separate email. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From D.Wilder at f5.com Tue Jan 21 12:03:58 2014 From: D.Wilder at f5.com (Dave Wilder) Date: Tue, 21 Jan 2014 20:03:58 +0000 Subject: [Live-devel] =?windows-1252?q?=93Unknown_RTP_version_3=94_RTSP_is?= =?windows-1252?q?sue?= Message-ID: Hi Guys, First of all, thank you for the service you provide to the community. I wonder if you can tell me if the live555 Linux Media Server ?Unknown RTP version 3? RTSP issue has been fixed. I had heard that newer versions have addressed this issue. I went to the FAQs page but did not find this question/answer. So I may not even have the correct assumption here. I am current running version 0.80 (LIVE555 Streaming Media library version 2013.12.18) I use the openRTSP command to send the thread (e.g. openRTSP rtsp://10.2.1.1:554//HostShare/test.aac) Thanks. *** Packet Capture *** No.???? Time?????????? Source??????????????? Destination?????????? Protocol Length Info ???? 46 0.773136000??? 10.145.192.1????????? 10.145.192.2????????? RTSP???? 262??? SETUP rtsp://10.145.192.2//HostShare/Alarm_Antelope.aac/track1 RTSP/1.0 Frame 46: 262 bytes on wire (2096 bits), 262 bytes captured (2096 bits) on interface 0 Ethernet II, Src: Vmware_01:06:75 (00:50:56:01:06:75), Dst: Vmware_01:06:ae (00:50:56:01:06:ae) Internet Protocol Version 4, Src: 10.145.192.1 (10.145.192.1), Dst: 10.145.192.2 (10.145.192.2) Transmission Control Protocol, Src Port: 38137 (38137), Dst Port: rtsp (554), Seq: 313, Ack: 894, Len: 196 Real Time Streaming Protocol No.???? Time?????????? Source??????????????? Destination?????????? Protocol Length Info ???? 47 0.773795000??? 10.145.192.2????????? 10.145.192.1????????? RTSP???? 280??? Reply: RTSP/1.0 200 OK Frame 47: 280 bytes on wire (2240 bits), 280 bytes captured (2240 bits) on interface 0 Ethernet II, Src: Vmware_01:06:ae (00:50:56:01:06:ae), Dst: Vmware_01:06:75 (00:50:56:01:06:75) Internet Protocol Version 4, Src: 10.145.192.2 (10.145.192.2), Dst: 10.145.192.1 (10.145.192.1) Transmission Control Protocol, Src Port: rtsp (554), Dst Port: 38137 (38137), Seq: 894, Ack: 509, Len: 214 Real Time Streaming Protocol No.???? Time?????????? Source??????????????? Destination?????????? Protocol Length Info ???? 48 0.774059000??? 10.145.192.1????????? 10.145.192.2????????? RTP????? 46???? 
Unknown RTP version 3 Frame 48: 46 bytes on wire (368 bits), 46 bytes captured (368 bits) on interface 0 Ethernet II, Src: Vmware_01:06:75 (00:50:56:01:06:75), Dst: Vmware_01:06:ae (00:50:56:01:06:ae) Internet Protocol Version 4, Src: 10.145.192.1 (10.145.192.1), Dst: 10.145.192.2 (10.145.192.2) User Datagram Protocol, Src Port: 47824 (47824), Dst Port: 6970 (6970) Real-Time Transport Protocol ??? 11.. .... = Version: Unknown (3) From finlayson at live555.com Tue Jan 21 12:17:42 2014 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 21 Jan 2014 13:17:42 -0700 Subject: [Live-devel] =?windows-1252?q?=93Unknown_RTP_version_3=94_RTSP_is?= =?windows-1252?q?sue?= In-Reply-To: References: Message-ID: <5FDD1E9E-A08F-4871-A543-3BFA980EAFF5@live555.com> Dave, Your message implied that this is a known problem; however, I've never seen it before - and unfortunately can't explain it. Are you running a pre-built binary version of the "LIVE555 Media Server"? If so, then please instead try a version that you've built yourself - from source code. (You'll find it in the "mediaServer" directory.) Does "openRTSP" (and "testRTSPClient") successfully receive data? If so, then your 'packet capture' software must be in error, and the RTP packets' version number must, in fact, be correct. But if "openRTSP" and "testRTSPClient" doesn't receive any data, then it seems that the "LIVE555 Media Server" may, indeed, be sending RTP packets with a bad 'version number' - but I have no idea how that could possibly be happening. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From demthedj at gmail.com Wed Jan 22 05:37:46 2014 From: demthedj at gmail.com (Sergey Kuprienko) Date: Wed, 22 Jan 2014 15:37:46 +0200 Subject: [Live-devel] Group Sockets select to poll() port Message-ID: I've faced problems using live555 to capture many streams per process. The source is select() calls. It can't accept fd index more than FD_SETSIZE ( 1024 on most distros). I've made some patches to code and i believe it would be useful Sorry, if i've choosed wrong way to send a patch, but cannot found right way to post it on site. 1) GroupsockHelper.cpp : // Block until the socket is readable (with a 5-second timeout): #define GROUPSOCK_USES_POLL #ifndef GROUPSOCK_USES_POLL fd_set rd_set; FD_ZERO(&rd_set); FD_SET((unsigned)sock, &rd_set); const unsigned numFds = sock+1; struct timeval timeout; timeout.tv_sec = 5; timeout.tv_usec = 0; int result = select(numFds, &rd_set, NULL, NULL, &timeout); if (result <= 0) break; #else struct pollfd pollFd; memset(&pollFd,0,sizeof(pollFd)); pollFd.fd = sock; pollFd.events = POLLIN | POLLERR; int result = poll(&pollFd, 1, 5000); if (result < 0 ) break; #endif 2) I've made poll()-based task scheduler - how can I post it the best way ? -- ?????? ????????? ????? ?????????? ??, "??-??" Sergey Kuprienko Head of software development dpt. -------------- next part -------------- An HTML attachment was scrubbed... URL: From iouannouk80 at gmail.com Wed Jan 22 08:37:34 2014 From: iouannouk80 at gmail.com (Kostas Ioannou) Date: Wed, 22 Jan 2014 18:37:34 +0200 Subject: [Live-devel] catchup tv Message-ID: I would like to record mpeg-ts with mpeg-2, index it and then making it available for catchup tv. I thought of storing the incoming udp transport stream into chunks and then running the MPEG2TransportStreamIndexer. Files are playable by STB (Motorola) also with trickplay support. 
However, there are two things to address: 1 - how to continue playback, both forward and in reverse (during rewind), into another chunk; and 2 - how to extend the RTSP request parser to understand timecode parameters, e.g. rtsp:///myStream/?start=2014-01-22-10:00:00&end=2014-01-22-20:00:00
It seems that there is a similar implementation, but does anyone have the following source? https://forum.videolan.org/viewtopic.php?t=76912
-------------- next part -------------- An HTML attachment was scrubbed... URL:

From fantasyvideo at 126.com Wed Jan 22 20:20:57 2014
From: fantasyvideo at 126.com (Tony) Date: Thu, 23 Jan 2014 12:20:57 +0800 (CST) Subject: [Live-devel] Regarding mp4 in live555MediaServer Message-ID: <5fe44587.34c9.143bd5457db.Coremail.fantasyvideo@126.com>
Hi Ross, Does live555MediaServer have any plans to support mp4 files?
-------------- next part -------------- An HTML attachment was scrubbed... URL:

From finlayson at live555.com Thu Jan 23 18:07:37 2014
From: finlayson at live555.com (Ross Finlayson) Date: Thu, 23 Jan 2014 19:07:37 -0700 Subject: [Live-devel] Group Sockets select to poll() port In-Reply-To: References: Message-ID: <6B94B39D-9EB1-45D2-B481-C4AD42DE2448@live555.com>
On Jan 22, 2014, at 6:37 AM, Sergey Kuprienko wrote:
> I've faced problems using live555 to capture many streams per process. The source of the problem is the select() calls: they can't accept a file descriptor index larger than FD_SETSIZE (1024 on most distros).
>
> I've made some patches to the code and I believe they would be useful. Sorry if I've chosen the wrong way to send a patch, but I could not find the right way to post it on the site.
>
> 1) GroupsockHelper.cpp :
This change probably doesn't need to be made, because this code (to find the application's local IP address) is executed only once, when the process starts running. At that time you'll almost certainly have a low socket number, so the "select()" call should always work.
> 2) I've made a poll()-based task scheduler - how can I best post it?
If you've written your own task scheduler simply by defining your own new subclass of "TaskScheduler" - rather than by modifying the existing "BasicTaskScheduler" code - then you can, if you wish, simply post your code. There shouldn't be any 'patches' here, because you should not have modified the existing code.
Ross Finlayson Live Networks, Inc. http://www.live555.com/
-------------- next part -------------- An HTML attachment was scrubbed... URL:

From pagan81 at gmail.com Thu Jan 23 09:45:00 2014
From: pagan81 at gmail.com (pagan81 at gmail.com) Date: Thu, 23 Jan 2014 21:45:00 +0400 Subject: [Live-devel] I always get disconnect after 1, 5 minutes work with using RTSP-over-HTTP, but VLC work correct. Message-ID: <1844958432.20140123214500@gmail.com>
Hi live-devel. I use Live555 in my application as a client. I have a set of links for testing.
rtsp://hdn.octoshape.net/er-rt/ch1_720p
rtsp://hdn.octoshape.net/er-rt/ch1_480p
rtsp://hdn.octoshape.net/er-rt/ch1_320p
rtsp://hdn.octoshape.net/er-rt/ch1_240p
rtsp://hdn.octoshape.net/er-rt/ch2_720p
rtsp://hdn.octoshape.net/er-rt/ch2_480p
rtsp://hdn.octoshape.net/er-rt/ch2_320p
rtsp://hdn.octoshape.net/er-rt/ch2_240p
rtsp://hdn.octoshape.net/er-rt/ch5_720p
rtsp://hdn.octoshape.net/er-rt/ch5_480p
rtsp://hdn.octoshape.net/er-rt/ch5_320p
rtsp://hdn.octoshape.net/er-rt/ch5_240p
All of these links need to use RTSP-over-HTTP; I use port 554 for this. I am testing using openRTSP.exe. I used the latest version, from 2014.01.21.
Example: openRTSP.exe -T 554 rtsp://hdn.octoshape.net/er-rt/ch1_320p Launch is fine, I am receiving data from server, but after 1.5 minutes server closes connection, and I do not know why. But VLC works fine with these links. What am I doing wrong ? Thanks. -- Pagan81 mailto:Pagan81 at gmail.com From pallela.venkatkarthik at vvdntech.com Fri Jan 24 02:47:21 2014 From: pallela.venkatkarthik at vvdntech.com (Pallela Venkat Karthik) Date: Fri, 24 Jan 2014 16:17:21 +0530 Subject: [Live-devel] Slow frame rate at streaming Message-ID: Hi, I have a sensor which will write to a buffer memory, I am unable to get the correct framerate om streaming.If I stream a wall clock on Quick Time only about 50 seconds are going for true 60 seconds and eventually my frame buffer (circular) is getting full I am using testOnDemandRTSPServer of latest live555 and using ByteStreamFileSource.cpp reading frames inside ByteStreamFileSource::doReadFromFile() from buffer into memory pointed by fTo On my linux machine, clock is also working fine, gettimeofday increment speed is correct Help me in fixing the issue -- *With Warm Regards,* Venkat Karthik VVDN Technologies Pvt Ltd *Cell : *+91 9884292064 | *Skype :* venk.kart -------------- next part -------------- An HTML attachment was scrubbed... URL: From kcchao0921 at gmail.com Fri Jan 24 02:53:06 2014 From: kcchao0921 at gmail.com (=?UTF-8?B?6LaZ5ZyL56ug?=) Date: Fri, 24 Jan 2014 18:53:06 +0800 Subject: [Live-devel] Fwd: RTP over RTSP(tcp) doesn't work in version 2014.01.21 In-Reply-To: References: Message-ID: Hi Ross, I use liveMedia library which version is 2014.01.21 to do streaming via RTP over RTSP. I found that modification in OnDemandServerMediaSubsession::getStreamParameters() let streaming via RTP over RTSP not work. After I traced, I found clientRTPPortNum is 0 after parseTransportHeader() is called in RTSPClientSession::handleCmd_SETUP (line 1588 in RTSPServer.cpp). This will let getStreamParameters() do things wrong. I modified line 166 in OnDemandServerMediaSubsession.cpp: if (clientRTPPort.num() != 0) -> if ((clientRTPPort.num() != 0) || (rtpChannelId != 0xFF)) and it seems work. I don't know if I do it the right way. P.S. I use vlc 2.0.8 and configure it to RTP over RTSP(TCP). It can let this problem happen. Many thanks, KC Chao -- ??????? ??????? ??????? ??????? -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Jan 24 03:37:44 2014 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 24 Jan 2014 04:37:44 -0700 Subject: [Live-devel] RTP over RTSP(tcp) doesn't work in version 2014.01.21 In-Reply-To: References: Message-ID: Thanks for the report! I've just released a new version (2014.01.24) that fixes this. Sorry for the inconvenience. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Jan 24 03:41:39 2014 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 24 Jan 2014 04:41:39 -0700 Subject: [Live-devel] Slow frame rate at streaming In-Reply-To: References: Message-ID: <534C3792-13F6-4FD0-B63E-A14E574244DD@live555.com> > I have a sensor which will write to a buffer memory, I am unable to get the correct framerate om streaming.If I stream a wall clock on Quick Time only about 50 seconds are going for true 60 seconds and eventually my frame buffer (circular) is getting full This is very vague. 
It might help if you tell us what specifically you are doing with this buffer. I.e., what kind of data is it, and how are you using it? Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From demthedj at gmail.com Thu Jan 23 22:38:48 2014 From: demthedj at gmail.com (Sergey Kuprienko) Date: Fri, 24 Jan 2014 08:38:48 +0200 Subject: [Live-devel] Group Sockets select to poll() port In-Reply-To: <6B94B39D-9EB1-45D2-B481-C4AD42DE2448@live555.com> References: <6B94B39D-9EB1-45D2-B481-C4AD42DE2448@live555.com> Message-ID: 1) In high-load app it is not. For example, I'm starting capturing after storage, IPC, streaming servers and many other stuff initialized - so i'm running out of FD_SETSIZE very fast. I'm recommed you to implement patch - it's quite simple and local. 2) Ok, posting a link to code at blog would be acceptable ? 2014/1/24 Ross Finlayson > > On Jan 22, 2014, at 6:37 AM, Sergey Kuprienko wrote: > > I've faced problems using live555 to capture many streams per process. > The source is select() calls. It can't accept fd index more than > FD_SETSIZE ( 1024 on most distros). > > I've made some patches to code and i believe it would be useful > Sorry, if i've choosed wrong way to send a patch, but cannot found right > way to post it on site. > > 1) GroupsockHelper.cpp : > > > This change probably doesn't need to be made, because this code (to find > the application's local IP address) is executed only once, when the process > starts running. At that time you'll almost certainly have a low socket > number, so the "select()" call should always work. > > > 2) I've made poll()-based task scheduler - how can I post it the best way ? > > > If you've written your own task scheduler simply by defining your own new > subclass of "TaskScheduler" - rather than by modifying the existing > "BasicTaskScheduler" code, then you can - if you wish - simply post your > code. There should be any 'patches' here, because you should not have > modified the existing code. > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -- ?????? ????????? ????? ?????????? ??, "??-??" Sergey Kuprienko Head of software development dpt. http://www.f-f.kiev.ua +38 097 985 15 69 -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Jan 24 03:43:57 2014 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 24 Jan 2014 04:43:57 -0700 Subject: [Live-devel] Regarding mp4 in live555MediaServer In-Reply-To: <5fe44587.34c9.143bd5457db.Coremail.fantasyvideo@126.com> References: <5fe44587.34c9.143bd5457db.Coremail.fantasyvideo@126.com> Message-ID: > Does live555MediaServer have plan to support mp4 file? Are you asking if we plan to support *demultiplexing* from MP4 files? If so, then the answer is no. See http://lists.live555.com/pipermail/live-devel/2013-September/017424.html Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Jan 24 03:50:57 2014 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 24 Jan 2014 04:50:57 -0700 Subject: [Live-devel] I always get disconnect after 1, 5 minutes work with using RTSP-over-HTTP, but VLC work correct. 
In-Reply-To: <1844958432.20140123214500@gmail.com> References: <1844958432.20140123214500@gmail.com> Message-ID: The problem here is that the QuickTime/Darwin Streaming Server code that 'Wowza' uses is buggy - it does not recognize incoming RTCP "RR" reports (from the client) as indicating that the client is still alive. To fix this, do one of the following: - Get 'Wowza' to fix the bug in their server, or - Update your copy of the "openRTSP" application to periodically send "OPTIONS" or "GET_PARAMETER" commands to the server (this is what VLC does to overcome this problem), or - Use a different RTSP server (e.g., the "LIVE555 Media Server", which doesn't have this problem). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From tayeb.dotnet at gmail.com Fri Jan 24 06:25:16 2014 From: tayeb.dotnet at gmail.com (Tayeb Meftah) Date: Fri, 24 Jan 2014 15:25:16 +0100 Subject: [Live-devel] SDP File streaming Message-ID: <435C854D0737428AA270A119BCDD806D@worksc08f920f1> Hello Ross, i have a multicast streams (24) it's pocible to receive the SDP file that i created from sap (using sapWatch) using live555MediaServer? i tryed, but got 404 not found i think this can be easy for creating playlist & simplifying Multicast access thank, Tayeb Meftah Voice of the blind T Broadcast Freedom http://www.vobradio.org Phone:447559762242 -------------- next part -------------- An HTML attachment was scrubbed... URL: From ga at ed4u.com Fri Jan 24 17:58:35 2014 From: ga at ed4u.com (Gordon Apple) Date: Fri, 24 Jan 2014 19:58:35 -0600 Subject: [Live-devel] Streaming from a Mac Message-ID: Newbie here. Are there any examples available for setting up live555 to stream from frame buffers (h.264) produced by AVCaptureVideoDataOutput? We have screen capture capability and can currently produce a file using AVCaptureVideoFileOutput, but we also want live low-latency streaming. We want to multicast to remote students and eventually add streaming back from students (laptop, iPad, whatever). Initially we would be happy with a demo streaming from one Mac to another on a local network (maybe receiving with VLC). If we can get that working, then we will progress to the next stage, maybe a proxy server, then the other capabilities. -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri Jan 24 18:50:56 2014 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 24 Jan 2014 18:50:56 -0800 Subject: [Live-devel] Streaming from a Mac In-Reply-To: References: Message-ID: > \Newbie here. Are there any examples available for setting up live555 to stream from frame buffers (h.264) produced by AVCaptureVideoDataOutput? We have screen capture capability and can currently produce a file using AVCaptureVideoFileOutput, but we also want live low-latency streaming. See http://www.live555.com/liveMedia/faq.html#liveInput Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL:

From finlayson at live555.com Fri Jan 24 20:24:11 2014
From: finlayson at live555.com (Ross Finlayson) Date: Fri, 24 Jan 2014 20:24:11 -0800 Subject: [Live-devel] SDP File streaming In-Reply-To: <435C854D0737428AA270A119BCDD806D@worksc08f920f1> References: <435C854D0737428AA270A119BCDD806D@worksc08f920f1> Message-ID: <71A8A893-0445-401A-A80D-1CF3E3659E8A@live555.com>
I don't understand your question - but I'm quite sure that the answer is "no".
Ross Finlayson Live Networks, Inc. http://www.live555.com/
-------------- next part -------------- An HTML attachment was scrubbed... URL:

From tayeb.dotnet at gmail.com Sat Jan 25 03:11:20 2014
From: tayeb.dotnet at gmail.com (Tayeb Meftah) Date: Sat, 25 Jan 2014 12:11:20 +0100 Subject: [Live-devel] SDP File streaming References: <435C854D0737428AA270A119BCDD806D@worksc08f920f1> <71A8A893-0445-401A-A80D-1CF3E3659E8A@live555.com> Message-ID: <961ED2DFEF3D45F0A290BD2BF9AF74A6@worksc08f920f1>
OK, let me explain. Let's say I have an H.264 (MKV) movie that I want to multicast using ffmpeg, but in RTP format. After the multicasting has started, ffmpeg will provide an SDP output like the one I posted before. We save this SDP file and give it to VLC, and VLC will play it. What I want is to put the SDP file in the root folder of live555MediaServer, like I used to do with Darwin, and just fetch it using: rtsp://localhost:8554/movie.sdp Thoughts?
Thanks, Tayeb Meftah Voice of the blind T Broadcast Freedom http://www.vobradio.org Phone:447559762242
----- Original Message ----- From: Ross Finlayson To: LIVE555 Streaming Media - development & use Sent: Saturday, January 25, 2014 5:24 AM Subject: Re: [Live-devel] SDP File streaming
I don't understand your question - but I'm quite sure that the answer is "no".
Ross Finlayson Live Networks, Inc. http://www.live555.com/
------------------------------------------------------------------------------
_______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel
-------------- next part -------------- An HTML attachment was scrubbed... URL:

From goelli at goelli.de Sun Jan 26 12:48:36 2014
From: goelli at goelli.de (=?iso-8859-1?Q?Thomas_G=F6llner?=) Date: Sun, 26 Jan 2014 21:48:36 +0100 Subject: [Live-devel] Adding some DVB Tables to a Stream Message-ID: <000001cf1ad7$fc0d61e0$f42825a0$@goelli.de>
Hi, I want to add some DVB tables (e.g. SDT, NIT) to a stream that I'm streaming over the LIVE555 RTSPServer. Can someone tell me where the best point for this injection is? I'm receiving a TS via RTPSource or UDPSource.
Best regards, Thomas

From finlayson at live555.com Sun Jan 26 13:12:12 2014
From: finlayson at live555.com (Ross Finlayson) Date: Sun, 26 Jan 2014 13:12:12 -0800 Subject: [Live-devel] Adding some DVB Tables to a Stream In-Reply-To: <000001cf1ad7$fc0d61e0$f42825a0$@goelli.de> References: <000001cf1ad7$fc0d61e0$f42825a0$@goelli.de> Message-ID:
> I want to add some DVB tables (e.g. SDT, NIT) to a stream that I'm streaming
> over the LIVE555 RTSPServer. Can someone tell me where the best point for this
> injection is? I'm receiving a TS via RTPSource or UDPSource.
Because you are using our RTSP server implementation, I presume that you have written your own "OnDemandServerMediaSubsession" subclass, and thus have reimplemented the "createNewStreamSource()" virtual function. If so, then the best place to add this 'injection' would be in your "createNewStreamSource()" function.
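[Editorial note: below is a minimal sketch of the filter-chain approach described here (and spelled out in the rest of this reply, which continues below): a pass-through "FramedFilter" that periodically substitutes a locally built 188-byte table packet for one read from the upstream source, inserted into the chain created by "createNewStreamSource()". The names "TablePacketInserter", "makeSDTPacket()" and "MyTSSubsession" are illustrative only, and the 1-in-100 injection period and the placeholder packet contents are arbitrary.]

// A pass-through filter that injects a DVB table packet every N upstream frames.
#include "liveMedia.hh"
#include <string.h>

class TablePacketInserter : public FramedFilter {
public:
  TablePacketInserter(UsageEnvironment& env, FramedSource* inputSource)
    : FramedFilter(env, inputSource), fFrameCounter(0) {
    fLastPresentationTime.tv_sec = 0; fLastPresentationTime.tv_usec = 0;
  }

protected:
  virtual void doGetNextFrame() {
    if (++fFrameCounter % 100 == 0 && fMaxSize >= 188) {
      // Inject: hand the downstream reader a table packet instead of upstream data.
      makeSDTPacket(fTo);
      fFrameSize = 188;
      fNumTruncatedBytes = 0;
      fPresentationTime = fLastPresentationTime; // reuse the last upstream timestamp
      fDurationInMicroseconds = 0;
      // Don't call afterGetting() directly from doGetNextFrame(); schedule it instead,
      // to avoid unbounded recursion if the downstream reader immediately asks again:
      envir().taskScheduler().scheduleDelayedTask(0, (TaskFunc*)FramedSource::afterGetting, this);
    } else {
      // Normal case: pass the next upstream frame straight through:
      fInputSource->getNextFrame(fTo, fMaxSize,
                                 afterGettingFrame, this,
                                 FramedSource::handleClosure, this);
    }
  }

private:
  static void afterGettingFrame(void* clientData, unsigned frameSize,
                                unsigned numTruncatedBytes,
                                struct timeval presentationTime,
                                unsigned durationInMicroseconds) {
    TablePacketInserter* self = (TablePacketInserter*)clientData;
    // Copy the parameters that describe the delivered frame (forgetting this is
    // exactly the problem reported, and then solved, later in this thread):
    self->fFrameSize = frameSize;
    self->fNumTruncatedBytes = numTruncatedBytes;
    self->fPresentationTime = self->fLastPresentationTime = presentationTime;
    self->fDurationInMicroseconds = durationInMicroseconds;
    FramedSource::afterGetting(self); // tell the downstream reader the data is ready
  }

  void makeSDTPacket(unsigned char* pkt) {
    // Placeholder: a null/stuffing packet (PID 0x1FFF). A real implementation would
    // serialize the SDT/NIT section, set its PID and continuity counter, and add the CRC.
    memset(pkt, 0xFF, 188);
    pkt[0] = 0x47; pkt[1] = 0x1F; pkt[2] = 0xFF; pkt[3] = 0x10;
  }

  unsigned fFrameCounter;
  struct timeval fLastPresentationTime;
};

// Wiring inside a (hypothetical) OnDemandServerMediaSubsession subclass:
// FramedSource* MyTSSubsession::createNewStreamSource(unsigned /*clientSessionId*/, unsigned& estBitrate) {
//   estBitrate = 5000; // kbps, a rough guess
//   FramedSource* ts = BasicUDPSource::createNew(envir(), fInputGroupsock); // or an RTPSource
//   ts = new TablePacketInserter(envir(), ts);                  // inject the tables here
//   return MPEG2TransportStreamFramer::createNew(envir(), ts);  // framer stays last in the chain
// }

The copying step in afterGettingFrame() is what makes the pass-through case work, and the scheduled call to FramedSource::afterGetting() is the usual way to deliver data that is available immediately (compare DeviceSource.cpp in the liveMedia sources).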
Specifically, you would define and implement your own subclass of "FramedFilter" that does the 'injection'. Your "createNewStreamSource()" function would then create a "RTPSource" or "BasicUDPSource" - as it presumably does now - but then feed this into a new object of your "FramedFilter" subclass. The function would then return a pointer to your new "FramedFilter" subclass object. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From goelli at goelli.de Mon Jan 27 02:23:13 2014 From: goelli at goelli.de (=?iso-8859-1?Q?Thomas_G=F6llner?=) Date: Mon, 27 Jan 2014 11:23:13 +0100 Subject: [Live-devel] Adding some DVB Tables to a Stream Message-ID: <000a01cf1b49$c90a4610$5b1ed230$@goelli.de> Thanks Ross for your quick reply. I have made a new subclass of MPEG2TransportUDPServerMediaSubsession and added my "SDTinserter" subclass of FramedFilter into the filter chain. The createNewStreamSource() function ends like that: ... transportStreamSource = SDTInserter::createNew(envir(), transportStreamSource); transportStreamSource = MPEG2TransportStreamFramer::createNew(envir(), transportStreamSource); return transportStreamSource; } The SDTInserter is only a dummy now, that is expected to only pass all packets, because I do not fully understand this filter chain. I've implementet a doGetNextFrame() function, that calls fInputSource->getNextFrame(fTo, fMaxSize, afterGettingFrame, this, FramedSource::handleClosure, this); If the dummy works, I want to insert the tables here. Periodically I will write a table frame to fTo and call the getNextFrame function with fTo[newStart] and lower fMaxSize. The static afterGettingFrame function of my SDTinserter dummy simply calls the dummyobjects afterGettingFrame function and that calls FramedSource::afterGetting(this); But I think I have overseen something, because that doesn't work. Do you have another hint for me how the data is carried from RTP/UDP Source to the filter and especially from the filter to... to what - the Sink I guess ;-) ? Best regards, Thomas G?llner From goelli at goelli.de Mon Jan 27 06:34:25 2014 From: goelli at goelli.de (=?iso-8859-1?Q?Thomas_G=F6llner?=) Date: Mon, 27 Jan 2014 15:34:25 +0100 Subject: [Live-devel] Adding some DVB Tables to a Stream Message-ID: <000b01cf1b6c$e0811e90$a1835bb0$@goelli.de> I've got it. The objects afterGettingFrame() function copys the parameters. That's, what I forgot. So I did not tell the next Filter in queue, that there is any data - because of frameSize. I'll try now to insert the tables as I mentioned and report the result. Best regards, Thomas G?llner From jshanab at jfs-tech.com Sun Jan 26 04:49:20 2014 From: jshanab at jfs-tech.com (Jeff Shanab) Date: Sun, 26 Jan 2014 07:49:20 -0500 Subject: [Live-devel] Memoryleak only when connecting memory tool Message-ID: I have a strange problem showing up while trying to solve a memory leak elsewhere in my code. When I attach a memory tool it somehow messes timing or something and live555 begins to create BufferPacket structures of 68 bytes with a payload of 10K. Somehow they are not released so the code to get a free one ends up allocating a new one. (Reorder buffer in multiframedrtpsource) If I let the app run for over a day a slow growth of memory usage occurs. This is what I am trying to solve. When I connect the Memory Tool, there are no BufferPackets at all. After a few hours, there are a few hundred or a few thousand based on how many streams are connected. 
We quickly get into over a gigabyte memory usage on a 32 bit process and it gets worse from there. These are a live-leak or memory growth issue not a dead-leak. Running valgrind or IPP memory leak tools do not show a leak. The original leak problem is identical on linux and windows. The version of live555 is not current, but i have been given reasons we cannot update at the moment. However, if this rings any bells and points to a specific fix in newer code, I can patch or at least prove the need to update!) We have two versions of the code, one with the problem and one without. Swapping in the live555.dll from the version that does not show the issue into the version that does, does not solve the original problem. But the issue of BufferPacket growth when connecting the memory tool is hampering our efforts to find the real leak. The memory tool in question only runs in windows. Any suggestions would be greatly appreciated. -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Jan 28 11:26:28 2014 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 28 Jan 2014 11:26:28 -0800 Subject: [Live-devel] SDP File streaming In-Reply-To: <961ED2DFEF3D45F0A290BD2BF9AF74A6@worksc08f920f1> References: <435C854D0737428AA270A119BCDD806D@worksc08f920f1> <71A8A893-0445-401A-A80D-1CF3E3659E8A@live555.com> <961ED2DFEF3D45F0A290BD2BF9AF74A6@worksc08f920f1> Message-ID: > Ok, let me explain: > let's say, i have a H.264 (MKV) movie > i want to multicast it using ffmpeg > but in rtp format > after the multicasting started, the FFMpeg will provide a sdp output like i posted befaure > we save this sdp file and give it to vlc, it'lle play it > what i want is to put the sdp file in the root folder of Live555MediaServer, like i used to do with Darwine and just fetch it using: > rtsp://localhost:8554/movie.sdp No, this won't work, because the "LIVE555 Media Server" serves only its own streams; not streams that are being delivered by another server. In any case, you don't need a RTSP server to deliver a ".sdp" file to VLC; instead, you could just use a HTTP server, and have VLC access the file using a "http://" URL. But the *real* question that you should have asked is: Why do I have to multicast a ".mkv" file using 'ffmpeg'? Why can't I do this using a LIVE555 demo application? Well now you can. The most recent "LIVE555 Streaming Media" software release - 2014.01.28 - includes a new demo application (in "testProgs") called "testMKVStreamer", that streams a Matroska file (named "test.mkv") via IP multicast. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Jan 28 13:35:06 2014 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 28 Jan 2014 13:35:06 -0800 Subject: [Live-devel] Memoryleak only when connecting memory tool In-Reply-To: References: Message-ID: > If I let the app run for over a day a slow growth of memory usage occurs. This is what I am trying to solve. When I connect the Memory Tool, there are no BufferPackets at all. After a few hours, there are a few hundred or a few thousand based on how many streams are connected. We quickly get into over a gigabyte memory usage on a 32 bit process and it gets worse from there. If this is happening, it must be because the incoming data is being consumed at a slower rate than the rate at which it is arriving. 
I.e., your 'Memory Tool' is causing your system to slow down so much that it can't keep up with processing the incoming streams. To avoid this, you'll need to reduce the number and/or bitrate of your incoming streams. > The version of live555 is not current We support only the current release of the code! Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From nambirajan.manickam at i-velozity.com Tue Jan 28 23:51:08 2014 From: nambirajan.manickam at i-velozity.com (Nambirajan) Date: Wed, 29 Jan 2014 13:21:08 +0530 Subject: [Live-devel] Play response setting in Live555 Server Message-ID: <004701cf1cc6$e0c5b770$a2512650$@manickam@i-velozity.com> Hi Ross, We are facing some difficulty in setting the current play time in the Play response from Live555 Media Server. The scenario is as below. We are using the handleCmd_PLAY function to send the response to the player for play request as per the design. Unfortunately the player sends the range as : -7868. It is not giving the start time, only end time is sent from the player to the Server. In that scenario, when we try to use the getCurrentNPT() method to get the current play time. But we are not getting the current play time. We are getting the time as 0.000. Can you please give some info regarding this. Thanks and regards, M. Nambirajan -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Jan 29 00:15:24 2014 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 29 Jan 2014 00:15:24 -0800 Subject: [Live-devel] Play response setting in Live555 Server In-Reply-To: <004701cf1cc6$e0c5b770$a2512650$@manickam@i-velozity.com> References: <004701cf1cc6$e0c5b770$a2512650$@manickam@i-velozity.com> Message-ID: > We are facing some difficulty in setting the current play time in the Play response from Live555 Media Server. The scenario is as below. > > We are using the handleCmd_PLAY function to send the response to the player for play request as per the design. Unfortunately the player sends the range as : -7868. It is not giving the start time, only end time is sent from the player to the Server. In that scenario, when we try to use the getCurrentNPT() method to get the current play time. But we are not getting the current play time. We are getting the time as 0.000. Can you please give some info regarding this. I'm sorry, but I don't really understand what you're asking. Are you using the supplied, unmodified "RTSPServer" code, or have you written your own subclass that redefines "handleCmd_Play()"? If it's the latter, then I'm not sure that I can help you. (Generally speaking, I can help you only if you're using the original, unmodified code.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From nambirajan.manickam at i-velozity.com Wed Jan 29 01:40:33 2014 From: nambirajan.manickam at i-velozity.com (Nambirajan) Date: Wed, 29 Jan 2014 15:10:33 +0530 Subject: [Live-devel] Play response setting in Live555 Server In-Reply-To: References: <004701cf1cc6$e0c5b770$a2512650$@manickam@i-velozity.com> Message-ID: <006001cf1cd6$2bc036b0$8340a410$@manickam@i-velozity.com> Hi Ross, Thanks for your reply. We are using unmodified code. We want to send the current play time in the play response to the player when the player sends only the range end without specifying the range start in the Range Header. Thanks and regards, M. 
Nambirajan From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Wednesday, January 29, 2014 1:45 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Play response setting in Live555 Server We are facing some difficulty in setting the current play time in the Play response from Live555 Media Server. The scenario is as below. We are using the handleCmd_PLAY function to send the response to the player for play request as per the design. Unfortunately the player sends the range as : -7868. It is not giving the start time, only end time is sent from the player to the Server. In that scenario, when we try to use the getCurrentNPT() method to get the current play time. But we are not getting the current play time. We are getting the time as 0.000. Can you please give some info regarding this. I'm sorry, but I don't really understand what you're asking. Are you using the supplied, unmodified "RTSPServer" code, or have you written your own subclass that redefines "handleCmd_Play()"? If it's the latter, then I'm not sure that I can help you. (Generally speaking, I can help you only if you're using the original, unmodified code.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Jan 29 13:33:52 2014 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 29 Jan 2014 13:33:52 -0800 Subject: [Live-devel] Play response setting in Live555 Server In-Reply-To: <006001cf1cd6$2bc036b0$8340a410$@manickam@i-velozity.com> References: <004701cf1cc6$e0c5b770$a2512650$@manickam@i-velozity.com> <006001cf1cd6$2bc036b0$8340a410$@manickam@i-velozity.com> Message-ID: <1466CC06-04E7-4D7E-9735-26A32993C5B4@live555.com> > Thanks for your reply. We are using unmodified code. We want to send the current play time in the play response to the player when the player sends only the range end without specifying the range start in the Range Header. Thanks for the note. It turns out that our RTSP server implementation wasn't handling this kind of "Range:" header properly. I've just installed a new version - 2014.01.29 - of the "LIVE555 Streaming Media" code that fixes this. It will set the 'current play time' as the start time in the "PLAY" response. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jshanab at jfs-tech.com Tue Jan 28 15:09:22 2014 From: jshanab at jfs-tech.com (Jeff Shanab) Date: Tue, 28 Jan 2014 18:09:22 -0500 Subject: [Live-devel] Memoryleak only when connecting memory tool In-Reply-To: References: Message-ID: Because of this suspicion, I ran without the tool at very low resolutions and used windows' right-click mini-dump feature. I then post processed these dumps and found the same exact objects, the 68 byte BufferPacket structure and the 10k BufferPackets growing. Surprisingly they are not in lock step, the number of structs and buffers are not the same. I ran this test on 2 windows 7 machines and have the same leak on Linux. During the test one one machine I am using a total of 2% CPU. 5 of the 8 cores are idle and the 64bit OS is at around 5/12 GB. The process is 32 bit and these tests are discontinued when memory usage approaches 1GB. It does seems to be timing sensitive, like we do not return soon enough so it creates a BufferPacket. 
The problem is it isn't releasing them. Almost like there are more buffer packets than there are scheduled or direct afterGetting calls. Not a performance issue, just somehow missed consumer tasks. More streams, or attaching the memory validator tool, increases the frequency of the allocations. Because this software usually runs 50+ high-res streams on lesser hardware, there is, during this test, a surplus of consumer capacity. Tests of the newest version of live555 are scheduled; a bit of migration may be needed. Our code goes on Windows, Linux and embedded targets, so getting all architecture builds is not trivial. I did compare the current live555 MultiFramedRTPSource with ours and there have been a few changes. Thanks
On Tue, Jan 28, 2014 at 4:35 PM, Ross Finlayson wrote:
> If I let the app run for over a day a slow growth of memory usage
> occurs. This is what I am trying to solve. When I connect the Memory Tool,
> there are no BufferPackets at all. After a few hours, there are a few
> hundred or a few thousand based on how many streams are connected. We
> quickly get into over a gigabyte memory usage on a 32 bit process and it
> gets worse from there.
>
>
> If this is happening, it must be because the incoming data is being
> consumed at a slower rate than the rate at which it is arriving. I.e.,
> your 'Memory Tool' is causing your system to slow down so much that it
> can't keep up with processing the incoming streams. To avoid this, you'll
> need to reduce the number and/or bitrate of your incoming streams.
>
>
> The version of live555 is not current
>
>
> We support only the current release of the code!
>
> Ross Finlayson
> Live Networks, Inc.
> http://www.live555.com/
>
>
> _______________________________________________
> live-devel mailing list
> live-devel at lists.live555.com
> http://lists.live555.com/mailman/listinfo/live-devel
>
>
-------------- next part -------------- An HTML attachment was scrubbed... URL:

From goelli at goelli.de Thu Jan 30 03:36:43 2014
From: goelli at goelli.de (=?iso-8859-1?Q?Thomas_G=F6llner?=) Date: Thu, 30 Jan 2014 12:36:43 +0100 Subject: [Live-devel] Adding some DVB Tables to a Stream Message-ID: <001a01cf1daf$8caeec30$a60cc490$@goelli.de>
It worked perfectly. I'm just adding a table packet periodically instead of calling fInputSource->getNextFrame(...)
-----Original Message----- Sent: Monday, 27 January 2014 15:34 To: 'live-devel at lists.live555.com' Subject: RE: Adding some DVB Tables to a Stream
I've got it. The object's afterGettingFrame() function copies the parameters. That's what I forgot, so I did not tell the next filter in the queue that there is any data - because of frameSize. I'll try now to insert the tables as I mentioned and report the result. Best regards, Thomas Göllner

From david.cassany at i2cat.net Thu Jan 30 07:15:38 2014
From: david.cassany at i2cat.net (David Cassany Viladomat) Date: Thu, 30 Jan 2014 16:15:38 +0100 Subject: [Live-devel] decoupling MediaSubSession from RTP stack Message-ID:
Hi all, First of all I would like to program an RTSPClient that only handles the RTSP session: just the session negotiation and session keep-alive. In our application we are already capable of managing incoming RTP flows; we just need an RTSP manager to interact with our RTP manager. On the server side we already achieved this successfully by implementing our own ServerMediaSubsession class. We would like to achieve the same on the client side.
We have been having a look on the testRTSPClient.cpp code and as far as I understood the sdp parsing is done in MediaSubsession class which at the same time depends on RTPSource. Does anyone have an idea if could be possible to such MediaSubsession features without having to link our application to Live555 RTP stack? (livemedia rtp stack and our rtp stack have some incompatibilities, many common enum types defined) I am not sure could explain myself. Thanks in advance for any suggestion. Kind regards, David -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Jan 30 10:49:11 2014 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 30 Jan 2014 10:49:11 -0800 Subject: [Live-devel] decoupling MediaSubSession from RTP stack In-Reply-To: References: Message-ID: <9776308B-77EC-4C3F-8E82-838B4D80F99C@live555.com> > First of all I would like to program an RTSPClient that only handles RTSP session, just the sesion negotiation and session keep alive. In our application we are already capable to manage incomping RTP flows, we just need an RTSP manager to interact with our RTP manager. On server side we already successfully achieved it implementing our ServerMediaSubsession class. We would like to achieve the same on the client side. Yes, you can do this. I suggest looking at how our "openRTSP" client application implements the "-r" option, which tells the command to start playing the stream (using RTSP), but not actually receive it. See http://www.live555.com/openRTSP/#no-receive You should be able to use "openRTSP" (with the "-r" option, and also "-p ", if the stream is unicast) to demonstrate streaming to your application using RTSP only, not RTP/RTCP. The secret is to *not* call "initiate()" on each of your "MediaSubsession" objects; that function is (normally) used to create "RTPSource" and "RTCPInstance" objects for receiving the RTP/RTCP packets. And obviously, you won't want to create any "MediaSink" objects (e.g., "DummySink" in the "testRTSPClient" code), and not call "startPlaying()". Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From warren at etr-usa.com Thu Jan 30 16:06:49 2014 From: warren at etr-usa.com (Warren Young) Date: Thu, 30 Jan 2014 17:06:49 -0700 Subject: [Live-devel] File handle leak in slightly modified testMPEG2TransportStreamer Message-ID: <52EAE919.5020103@etr-usa.com> In an attempt to reduce the number of global variables in testMPEG2TransportStreamer.cpp, I made videoSource a local variable in play(), then said Medium::close(videoSink->source()); in afterPlaying(). This appears to work, but if you watch the FD count, it goes up by one per pass through play(). On investigating, I discovered that videoSink->source() does not return the same pointer as was passed to videoSink->startPlaying()! This apparently results in the ByteStreamFileSource instance not being cleaned up, which explains the handle leak. This leak is mostly harmless...until you try to play a 6-second video in a continuous loop for 14 hours. :) The default FD limit of 1024 on Linux gets exhausted in about 2 hours. Is this all as expected, by design, or is Medium::close() not as smart as it should be about chasing down unused resources? 
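[Editorial note: returning to the "decoupling MediaSubSession from RTP stack" thread above - a rough sketch of an RTSP-session-only client along the lines Ross suggests: the structure of "testRTSPClient", minus "initiate()", RTP/RTCP objects and sinks, plus an application-level keep-alive. "setClientPortNum()" is assumed to be available for advertising externally managed ports (it is what openRTSP's "-p" option uses); verify it, the minimal error handling, and the 30-second keep-alive interval against your own library version and requirements. Names such as "nextClientPort" are illustrative.]

// Sketch: RTSP negotiation and keep-alive only; RTP/RTCP is handled by an external stack.
#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"

static void continueAfterDESCRIBE(RTSPClient* client, int resultCode, char* resultString);
static void continueAfterSETUP(RTSPClient* client, int resultCode, char* resultString);
static void continueAfterPLAY(RTSPClient* client, int resultCode, char* resultString);
static void ignoreResponse(RTSPClient*, int, char* resultString) { delete[] resultString; }
static void setupNextSubsession(RTSPClient* client);
static void sendKeepAlive(void* clientData);

static MediaSession* ourSession = NULL;
static MediaSubsessionIterator* ourIter = NULL;
static unsigned short nextClientPort = 20000; // first RTP port our own stack listens on (example)

int main(int argc, char** argv) {
  if (argc < 2) return 1;
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  RTSPClient* client = RTSPClient::createNew(*env, argv[1], 1, "rtsp-only-client");
  client->sendDescribeCommand(continueAfterDESCRIBE);
  env->taskScheduler().doEventLoop(); // does not return
  return 0;
}

static void continueAfterDESCRIBE(RTSPClient* client, int resultCode, char* resultString) {
  if (resultCode != 0) { client->envir() << "DESCRIBE failed: " << resultString << "\n"; delete[] resultString; return; }
  ourSession = MediaSession::createNew(client->envir(), resultString); // parse the SDP
  delete[] resultString;
  if (ourSession == NULL) return;
  ourIter = new MediaSubsessionIterator(*ourSession);
  setupNextSubsession(client);
}

static void setupNextSubsession(RTSPClient* client) {
  MediaSubsession* sub = ourIter->next();
  if (sub != NULL) {
    // Deliberately do NOT call sub->initiate(): no RTPSource/RTCPInstance is wanted.
    // Just tell the server which client port our external RTP stack is bound to:
    sub->setClientPortNum(nextClientPort); nextClientPort += 2; // RTCP would be port+1
    client->sendSetupCommand(*sub, continueAfterSETUP);
  } else {
    client->sendPlayCommand(*ourSession, continueAfterPLAY);
  }
}

static void continueAfterSETUP(RTSPClient* client, int resultCode, char* resultString) {
  if (resultCode != 0) client->envir() << "SETUP failed: " << resultString << "\n";
  delete[] resultString;
  setupNextSubsession(client); // SETUP the next subsession, or PLAY when all are done
}

static void continueAfterPLAY(RTSPClient* client, int resultCode, char* resultString) {
  delete[] resultString;
  // With no RTCPInstance sending receiver reports, keep the server's session alive
  // ourselves (the same workaround discussed for the server timeout earlier in this digest):
  sendKeepAlive(client);
}

static void sendKeepAlive(void* clientData) {
  RTSPClient* client = (RTSPClient*)clientData;
  client->sendOptionsCommand(ignoreResponse); // or sendGetParameterCommand(*ourSession, ...)
  client->envir().taskScheduler().scheduleDelayedTask(30*1000000, sendKeepAlive, clientData);
}

The chaining in setupNextSubsession() mirrors testRTSPClient: each SETUP is sent only after the previous one completes, and PLAY is sent once every subsession has been set up.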
From finlayson at live555.com Thu Jan 30 16:44:18 2014 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 30 Jan 2014 16:44:18 -0800 Subject: [Live-devel] File handle leak in slightly modified testMPEG2TransportStreamer In-Reply-To: <52EAE919.5020103@etr-usa.com> References: <52EAE919.5020103@etr-usa.com> Message-ID: <0B0AE1F0-F256-4C57-AA3A-DC7E7F44F26A@live555.com> I haven't seen this at all with the unmodified "testMPEGTransportStreamer" code - which is what you should be using to test this. The call to Medium::close(videoSource); in the "afterPlaying()" function (line 133) does, indeed, cause the "ByteStreamFileSource" object (and its underlying FID) to be closed. > On investigating, I discovered that videoSink->source() does not return the same pointer as was passed to videoSink->startPlaying() "videoSink->source()" should be the same as "videoSource", from the time that "startPlaying()" was called, up until the time that "afterPlaying()" is called. At that time, "videoSink->source()" gets set to NULL. That's expected, and OK, because the "afterPlaying()" function doesn't close "videoSink->source()"; it closes "videoSource". I see no bug here. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Jan 30 18:36:17 2014 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 30 Jan 2014 18:36:17 -0800 Subject: [Live-devel] Memoryleak only when connecting memory tool In-Reply-To: References: Message-ID: <72832CE2-22D9-4F79-BEA4-068B76EE04A0@live555.com> The "ReorderingPacketBuffer" is not a 'memory leak'. *Every* incoming packet that gets added to this queue will later get consumed, and its memory will then get reclaimed (perhaps reused for the next incoming packet). If an incoming packet weren't added to the "ReorderingPacketBuffer", then it would, instead remain buffered inside the OS - but without the benefit of being able to reorder out-of-order incoming packets. And as I said before, if the length of the "ReorderingPacketBuffer" increases over time, it must be because you're not consuming the incoming packets fast enough. Anyway, I'm not going to accept any more emails from you - unless they happen to describe a very specific bug, with a very specific cause. Please remember that every message that's posted to this mailing list is read by almost 2000 people - many of whom are experienced software professionals who are making quite extensive use of this software. Whenever you feel the need to post prolifically to this mailing list with random 'software conspiracy theories', then please check yourself, and ask: Do you really think that you have some special insight that the other ~1999 people on this list don't? In any case, please remember that Live Networks Inc. is not a charity, and I can't spend an unlimited amount of time helping out people who are using this software for free. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From david.cassany at i2cat.net Thu Jan 30 23:42:49 2014 From: david.cassany at i2cat.net (David Cassany Viladomat) Date: Fri, 31 Jan 2014 08:42:49 +0100 Subject: [Live-devel] decoupling MediaSubSession from RTP stack In-Reply-To: <9776308B-77EC-4C3F-8E82-838B4D80F99C@live555.com> References: <9776308B-77EC-4C3F-8E82-838B4D80F99C@live555.com> Message-ID: Hi Ross, Thank you ver much for your quick and concrete response. 
We will definitelycheck this! Regards, David 2014-01-30 Ross Finlayson : > First of all I would like to program an RTSPClient that only handles RTSP > session, just the sesion negotiation and session keep alive. In our > application we are already capable to manage incomping RTP flows, we just > need an RTSP manager to interact with our RTP manager. On server side we > already successfully achieved it implementing our ServerMediaSubsession > class. We would like to achieve the same on the client side. > > > Yes, you can do this. I suggest looking at how our "openRTSP" client > application implements the "-r" option, which tells the command to start > playing the stream (using RTSP), but not actually receive it. See > http://www.live555.com/openRTSP/#no-receive > > You should be able to use "openRTSP" (with the "-r" option, and also "-p > ", if the stream is unicast) to demonstrate streaming to your > application using RTSP only, not RTP/RTCP. > > The secret is to *not* call "initiate()" on each of your "MediaSubsession" > objects; that function is (normally) used to create "RTPSource" and > "RTCPInstance" objects for receiving the RTP/RTCP packets. > > And obviously, you won't want to create any "MediaSink" objects (e.g., > "DummySink" in the "testRTSPClient" code), and not call "startPlaying()". > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From nambirajan.manickam at i-velozity.com Fri Jan 31 00:03:17 2014 From: nambirajan.manickam at i-velozity.com (Nambirajan) Date: Fri, 31 Jan 2014 13:33:17 +0530 Subject: [Live-devel] Play response setting in Live555 Server In-Reply-To: <1466CC06-04E7-4D7E-9735-26A32993C5B4@live555.com> References: <004701cf1cc6$e0c5b770$a2512650$@manickam@i-velozity.com> <006001cf1cd6$2bc036b0$8340a410$@manickam@i-velozity.com> <1466CC06-04E7-4D7E-9735-26A32993C5B4@live555.com> Message-ID: <003101cf1e5a$e955fbd0$bc01f370$@manickam@i-velozity.com> Hi Ross, Thanks for update and the fix. It is now working fine. Best Regards, M. Nambirajan From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Thursday, January 30, 2014 3:04 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Play response setting in Live555 Server Thanks for your reply. We are using unmodified code. We want to send the current play time in the play response to the player when the player sends only the range end without specifying the range start in the Range Header. Thanks for the note. It turns out that our RTSP server implementation wasn't handling this kind of "Range:" header properly. I've just installed a new version - 2014.01.29 - of the "LIVE555 Streaming Media" code that fixes this. It will set the 'current play time' as the start time in the "PLAY" response. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From warren at etr-usa.com Fri Jan 31 12:09:53 2014 From: warren at etr-usa.com (Warren Young) Date: Fri, 31 Jan 2014 13:09:53 -0700 Subject: [Live-devel] File handle leak in slightly modified testMPEG2TransportStreamer In-Reply-To: <0B0AE1F0-F256-4C57-AA3A-DC7E7F44F26A@live555.com> References: <52EAE919.5020103@etr-usa.com> <0B0AE1F0-F256-4C57-AA3A-DC7E7F44F26A@live555.com> Message-ID: <52EC0311.3040505@etr-usa.com> On 1/30/2014 17:44, Ross Finlayson wrote: > I haven't seen this at all with the unmodified > "testMPEGTransportStreamer" code - which is what you should be using to > test this. My point in asking is to learn why behavior changes when you change that one aspect of the existing code. > At that time, "videoSink->source()" gets > set to NULL. I see. You're zeroing a pointer to memory that is still in use, without deleting it. This in turn is why there has to be a separate pointer: because you can't currently depend on the sink to keep it through the call to afterPlaying(). Given that, why doesn't the sink take full ownership of the source object once you pass it to startPlaying()? Then it will be free to delete it before zeroing the pointer, so afterPlaying() doesn't need to call Medium::close() on it at all. Then there is no need for this extra global pointer. Obviously a few globals aren't a big problem in a tiny little example program. I care because the design constraints that require a program to keep track of this videoSource object in a 158 line program also apply in a 158,000 line program. I hope I don't need to defend state space reduction and encapsulation in 2014. From finlayson at live555.com Fri Jan 31 13:47:26 2014 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 31 Jan 2014 13:47:26 -0800 Subject: [Live-devel] File handle leak in slightly modified testMPEG2TransportStreamer In-Reply-To: <52EC0311.3040505@etr-usa.com> References: <52EAE919.5020103@etr-usa.com> <0B0AE1F0-F256-4C57-AA3A-DC7E7F44F26A@live555.com> <52EC0311.3040505@etr-usa.com> Message-ID: > Given that, why doesn't the sink take full ownership of the source object once you pass it to startPlaying()? Because it doesn't :-) That design decision was made (I think; it's been well over a decade now) because 'sources' and 'sinks' are different kinds of object. In principle, many different objects - not just a 'sink' - may want to perform some operation (other than 'playing') on a 'source'. And because the 'source' and 'sink' were each created by a third party (the "testMPEG2TransportStreamer" application code, in this case), it makes sense for the same third party to take responsibility for reclaiming the 'source' and 'sink' objects, when it wants to do so. (OTOH, "FramedFilter" objects (a subclass of "FramedSource") *do* take ownership of their upstream input source object; that's why you can delete an entire filter chain by deleting just the tail "FramedFilter" object ("videoSource" in this case).) Anyway, the way to avoid global variables in (your modified) "testMPEG2TransportStreamer" application is to use the 'clientData' parameter in the 'afterPlaying' callback. 
E.g., you can do something like:

struct streamParameters {
  FramedSource* source;
  MediaSink* sink;
  // any other stream-specific parameters that you want
};

and then:

streamParameters* myStream = new streamParameters;
myStream->source = videoSource;
myStream->sink = videoSink;
videoSink->startPlaying(*videoSource, afterPlaying, myStream);
// note: The last parameter to "startPlaying()" is different than before

and then, in "afterPlaying()":

void afterPlaying(void* clientData) {
  streamParameters* myStream = (streamParameters*)clientData;
  myStream->sink->stopPlaying();
  Medium::close(myStream->source);
  play();
}

Voila! No global variables.
Ross Finlayson Live Networks, Inc. http://www.live555.com/
-------------- next part -------------- An HTML attachment was scrubbed... URL:

From ksilver986 at gmail.com Fri Jan 31 01:25:02 2014
From: ksilver986 at gmail.com (Kevin Silver) Date: Fri, 31 Jan 2014 09:25:02 +0000 Subject: [Live-devel] Stream WAV audio over RTP Message-ID:
Hello, I apologise in advance if this is a basic question. I am writing an application with which I would like to stream 8- and 16-bit PCM audio from WAV files over RTP to a receiver elsewhere on my network. The control mechanism for this, however, will eventually be SNMP (with SAP messages for the receiver configuration) rather than RTCP, so I would like the stream to simply be RTP. I have seen the example program 'testWAVAudioStreamer', which is clearly very close to my needs but uses RTSP. I was wondering whether there is an example available that uses the libraries to send out RTP without the control protocol? Thank you in advance for any help Kevin
-------------- next part -------------- An HTML attachment was scrubbed... URL:
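[Editorial note: a minimal sketch answering the question above - essentially "testWAVAudioStreamer" with the RTSP server and the RTCPInstance left out, so that only RTP packets are sent. The destination address, port, TTL and dynamic payload type 96 are example values to be replaced by whatever SNMP/SAP advertises; "EndianSwap16" is the filter the testWAVAudioStreamer demo uses to put 16-bit samples into network byte order (check uLawAudioFilter.hh in your copy of the library).]

// Sketch: send a WAV file (8- or 16-bit PCM) as plain RTP, with no RTSP and no RTCP.
#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"
#include "GroupsockHelper.hh"
#include <stdlib.h>

static void afterPlaying(void* /*clientData*/) { exit(0); }

int main() {
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  WAVAudioFileSource* wavSource = WAVAudioFileSource::createNew(*env, "test.wav");
  if (wavSource == NULL) { *env << "Unable to open \"test.wav\"\n"; return 1; }

  unsigned char bitsPerSample = wavSource->bitsPerSample();   // 8 or 16
  unsigned samplingFrequency  = wavSource->samplingFrequency();
  unsigned char numChannels   = wavSource->numChannels();

  FramedSource* source = wavSource;
  char const* payloadFormatName = "L8";   // 8-bit WAV PCM maps directly onto RTP "L8"
  if (bitsPerSample == 16) {
    // RTP carries 16-bit linear PCM ("L16") in network (big-endian) byte order:
    source = EndianSwap16::createNew(*env, wavSource);
    payloadFormatName = "L16";
  }

  // Send to a fixed destination; the receiver learns these parameters out of band
  // (SNMP/SAP in this case) instead of via RTSP:
  struct in_addr destAddress;
  destAddress.s_addr = our_inet_addr("192.168.1.50"); // example receiver (or a multicast group)
  const Port rtpPort(6666);
  const unsigned char ttl = 255;
  Groupsock rtpGroupsock(*env, destAddress, rtpPort, ttl);

  RTPSink* rtpSink = SimpleRTPSink::createNew(*env, &rtpGroupsock,
                                              96 /*dynamic RTP payload type*/,
                                              samplingFrequency, "audio",
                                              payloadFormatName, numChannels);

  // No RTCPInstance and no RTSPServer are created - just start sending:
  rtpSink->startPlaying(*source, afterPlaying, NULL);
  env->taskScheduler().doEventLoop(); // does not return
  return 0;
}

For the receiver side, the SDP carried in the SAP announcement for this example stream would contain lines along the lines of "m=audio 6666 RTP/AVP 96" and "a=rtpmap:96 L16/44100/2" (for a 44.1 kHz stereo 16-bit file).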