From fantasyvideo at 126.com Tue Apr 1 07:22:29 2014 From: fantasyvideo at 126.com (Tony) Date: Tue, 1 Apr 2014 22:22:29 +0800 (CST) Subject: [Live-devel] Regarding the crash if the rtsp client counts is over than 10 In-Reply-To: <3C1CC4FE-CEDD-449C-947E-A54D1FDC1065@live555.com> References: <2a0c9316.e456.14511c7a4e5.Coremail.fantasyvideo@126.com> <3C1CC4FE-CEDD-449C-947E-A54D1FDC1065@live555.com> Message-ID: <27c0cfd3.c9e8.1451dab7d70.Coremail.fantasyvideo@126.com> Oh, I downloaded the latest version. It seems more stable and faster. THANK YOU FOR YOUR GREAT WORK!! At 2014-03-31 04:46:58,"Ross Finlayson" wrote: Are you using the latest version of the "LIVE555 Streaming Media" software?? Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Apr 1 14:12:23 2014 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 1 Apr 2014 14:12:23 -0700 Subject: [Live-devel] Regarding the crash if the rtsp client counts is over than 10 In-Reply-To: <27c0cfd3.c9e8.1451dab7d70.Coremail.fantasyvideo@126.com> References: <2a0c9316.e456.14511c7a4e5.Coremail.fantasyvideo@126.com> <3C1CC4FE-CEDD-449C-947E-A54D1FDC1065@live555.com> <27c0cfd3.c9e8.1451dab7d70.Coremail.fantasyvideo@126.com> Message-ID: <9CE58378-A8EC-4497-89E6-8241E64B1E24@live555.com> > Oh, I downloaded the latest version. It seems more stable and faster. Please everybody: If you're having problems with the software, the *first* thing you should check is whether or not you have an up-to-date version. Otherwise you risk wasting lots of people's time (especially yours). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From michel.promonet at thalesgroup.com Wed Apr 2 04:12:44 2014 From: michel.promonet at thalesgroup.com (PROMONET Michel) Date: Wed, 2 Apr 2014 13:12:44 +0200 Subject: [Live-devel] MJPEG support for MKV parser Message-ID: <27032_1396437165_533BF0AD_27032_6587_1_1BE8971B6CFF3A4F97AF4011882AA2550156486822EE@THSONEA01CMS01P.one.grp> Hi Ross, I am using the live555 code in order to parse MKV file and it works nicely. However it fails when MKV contains a MJPEG stream. I tried the following modification of the parser : diff -rup live.ref/liveMedia/MatroskaFileParser.cpp live/liveMedia/MatroskaFileParser.cpp --- live/liveMedia/MatroskaFileParser.cpp 2014-03-18 03:46:32.000000000 +0100 +++ live.new/liveMedia/MatroskaFileParser.cpp 2014-04-01 18:50:01.396098411 +0200 @@ -456,6 +456,8 @@ Boolean MatroskaFileParser::parseTrack() track->mimeType = "video/THEORA"; } else if (strncmp(codecID, "S_TEXT", 6) == 0) { track->mimeType = "text/T140"; + } else if (strncmp(codecID, "V_MJPEG", 6) == 0) { + track->mimeType = "video/JPEG"; } } else { delete[] codecID; Doing this allow to get the track using "MatroskaDemux::newDemuxedTrack" as you suggest me in a previous post. Do you think it could be an interesting evolution in next live555 release ? Best Regards, Michel. [@@ THALES GROUP INTERNAL @@] -------------- next part -------------- An HTML attachment was scrubbed... 
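One small detail in the patch above: the length passed to strncmp() covers only 6 of the 7 characters of "V_MJPEG" (the 6 looks carried over from the "S_TEXT" case, where it is the full length), so the test also matches any codec ID that merely begins with "V_MJPE". A corrected form of the added branch, assuming only codec IDs beginning with the full "V_MJPEG" string should map to "video/JPEG":

  } else if (strncmp(codecID, "V_MJPEG", 7) == 0) {
    track->mimeType = "video/JPEG";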
URL: From finlayson at live555.com Wed Apr 2 06:18:49 2014 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 2 Apr 2014 06:18:49 -0700 Subject: [Live-devel] MJPEG support for MKV parser In-Reply-To: <27032_1396437165_533BF0AD_27032_6587_1_1BE8971B6CFF3A4F97AF4011882AA2550156486822EE@THSONEA01CMS01P.one.grp> References: <27032_1396437165_533BF0AD_27032_6587_1_1BE8971B6CFF3A4F97AF4011882AA2550156486822EE@THSONEA01CMS01P.one.grp> Message-ID: <162C647D-F605-4CE4-89CA-E02E5B3A3FD6@live555.com> > I am using the live555 code in order to parse MKV file and it works nicely. > > However it fails when MKV contains a MJPEG stream. [...] > Do you think it could be an interesting evolution in next live555 release ? Yes. However, I'd need an example of such a file to test the implementation. Please put such a file on a (publicly-accessible) web server, and send us the URL, so I can download it and use it for testing. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From michel.promonet at thalesgroup.com Wed Apr 2 06:56:41 2014 From: michel.promonet at thalesgroup.com (PROMONET Michel) Date: Wed, 2 Apr 2014 15:56:41 +0200 Subject: [Live-devel] MJPEG support for MKV parser In-Reply-To: <162C647D-F605-4CE4-89CA-E02E5B3A3FD6@live555.com> References: <27032_1396437165_533BF0AD_27032_6587_1_1BE8971B6CFF3A4F97AF4011882AA2550156486822EE@THSONEA01CMS01P.one.grp> <162C647D-F605-4CE4-89CA-E02E5B3A3FD6@live555.com> Message-ID: <18420_1396447003_533C171B_18420_4324_1_1BE8971B6CFF3A4F97AF4011882AA25501564868285C@THSONEA01CMS01P.one.grp> Hi Ross, I just post an MJPEG stream inside a MKV container at http://dl.free.fr/j6ya9ZwlW Web interface is in French for me, I hope it's in English for you. Otherwise I will send using an other way. Thanks again for your support, Michel. [@@ THALES GROUP INTERNAL @@] De : live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] De la part de Ross Finlayson Envoy? : mercredi 2 avril 2014 15:19 ? : LIVE555 Streaming Media - development & use Objet : Re: [Live-devel] MJPEG support for MKV parser I am using the live555 code in order to parse MKV file and it works nicely. However it fails when MKV contains a MJPEG stream. [...] Do you think it could be an interesting evolution in next live555 release ? Yes. However, I'd need an example of such a file to test the implementation. Please put such a file on a (publicly-accessible) web server, and send us the URL, so I can download it and use it for testing. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Apr 3 01:27:10 2014 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 3 Apr 2014 01:27:10 -0700 Subject: [Live-devel] MJPEG support for MKV parser In-Reply-To: <18420_1396447003_533C171B_18420_4324_1_1BE8971B6CFF3A4F97AF4011882AA25501564868285C@THSONEA01CMS01P.one.grp> References: <27032_1396437165_533BF0AD_27032_6587_1_1BE8971B6CFF3A4F97AF4011882AA2550156486822EE@THSONEA01CMS01P.one.grp> <162C647D-F605-4CE4-89CA-E02E5B3A3FD6@live555.com> <18420_1396447003_533C171B_18420_4324_1_1BE8971B6CFF3A4F97AF4011882AA25501564868285C@THSONEA01CMS01P.one.grp> Message-ID: <4EAB0B97-A2AC-45F7-A1E0-9F689B981DD2@live555.com> I took a look at your example MKV file. 
It turns out that this file (and apparently all MKV files with JPEG video tracks) stores each JPEG frame as a complete JPEG image, including a JPEG header. For RTP streaming, however, we don't include the JPEG header. Therefore, to stream a JPEG video track from a MKV file, we'd need to add an additional filter (not currently implemented) that parses the JPEG header of each frame, to derive/extract out the 'type', 'qfactor', and (if non-standard) the quantization table - to put in the RTP packet. This is something that might get done sometime in the future, but since JPEG video streaming is a bad idea (that nobody should really be doing in 2014), it's not a high priority for me. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From fantasyvideo at 126.com Thu Apr 3 01:08:30 2014 From: fantasyvideo at 126.com (Tony) Date: Thu, 3 Apr 2014 16:08:30 +0800 (CST) Subject: [Live-devel] Regarding the resource destroy in live555 Message-ID: <310a69f9.8857.14526a1d21c.Coremail.fantasyvideo@126.com> Hi Ross, I have some questions that need your help. 1. I found that when I call removeServerMediaSession, only the audio and video OnDemandMediaSubsessions are destroyed. The audio framed source, video framed source, simple RTP sink, and H.264 video RTP sink are not destroyed. Should I destroy them manually? 2. If the source is a media file, sometimes a user requests it; afterwards, how can I know that the client session has disconnected, so that I can remove it? 3. At the end of the media file, no more data is produced, but live555 still tries to get frames. Should I call doStopGettingFrames? 4. I don't understand why RealPlayer, QuickTime, and Media Player don't work with the live555 media server, while only VLC does. Should I do any additional work? Best Regards! Tony -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Apr 3 02:42:49 2014 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 3 Apr 2014 02:42:49 -0700 Subject: [Live-devel] Regarding the resource destroy in live555 In-Reply-To: <310a69f9.8857.14526a1d21c.Coremail.fantasyvideo@126.com> References: <310a69f9.8857.14526a1d21c.Coremail.fantasyvideo@126.com> Message-ID: > 1. I found that when I call removeServerMediaSession, only the audio and video OnDemandMediaSubsessions are destroyed. Note that there is also a member function RTSPServer::deleteServerMediaSession() which closes all current client sessions, then removes the "ServerMediaSession". > 3. At the end of the media file, no more data is produced, but live555 still tries to get frames. Should I call doStopGettingFrames? If your input source object (a "FramedSource" subclass) reaches the end of its input, it should call "FramedSource::handleClosure()". > 4. I don't understand why RealPlayer, QuickTime, and Media Player don't work with the live555 media server Why don't you ask them? We're the ones who support the IETF standards. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From ashvyrkin at gosniias.ru Thu Apr 3 04:08:51 2014 From: ashvyrkin at gosniias.ru (Andrey Shvyrkin) Date: Thu, 03 Apr 2014 15:08:51 +0400 Subject: [Live-devel] VLC RTSP Server and live555 testRTSPClient app Message-ID: <533D4143.7080507@gosniias.ru> Hi, Ross. I'm using VLC for creating rtsp stream.
The problem is that the built testRTSPClient app starts playing the stream, but about every 40 seconds, the client loses its connection and calls subsessionByeHandler function. From andrea.beoldo at technoaware.com Thu Apr 3 08:25:39 2014 From: andrea.beoldo at technoaware.com (Andrea Beoldo) Date: Thu, 03 Apr 2014 17:25:39 +0200 Subject: [Live-devel] Problem with serverMediaSubsession::createNewRTSPink method Message-ID: <533D7D73.90201@technoaware.com> Dear support, I have implemented a RTSP live streaming server with multiple stream that is used in our ONVIF server. In this server we create several profiles, every profile set up a live555 streaming server on different port (h264). I use latest Live555 (25 march 2014). The problem occurs in the release version when from the client switch from a stream to another. I don't identify exactly where the system crash, but it seem block after a call of serverMediaSubsession::createNewRTPSink. I add some log to understand better and I find to sometimes the system crash in the doGetNextFrame method. This problem is not present in the debug version. I tried to insert different passive wait (before calling createNewStreamSource(), createNewRTPSink() and deleteStream()) and the problem occurs far less times. Do you have any idea of the problem? Thanks in advance for your help. Bets regards Andrea Here there is the code of my serverMediaSubsession and of the functions doGetNextFrame() and deliverFrame() of our framedSource : class serverMediaSubsession: public OnDemandServerMediaSubsession { public : static serverMediaSubsession* createNew(UsageEnvironment &env, Boolean reuseFirstSource, rtspVideoStreamer::parameters params, rtspVideoStreamer* parent); void checkForAuxSDPLine1(); void afterPlayingDummy1(); protected: serverMediaSubsession(UsageEnvironment &env, Boolean reuseFirstSource, rtspVideoStreamer::parameters params, rtspVideoStreamer* parent); virtual ~serverMediaSubsession(); void setDoneFlag() { fDoneFlag = ~0; } protected: //redefined virtual functions virtual char const* getAuxSDPLine(RTPSink* rtpSink, FramedSource* inputSource); virtual FramedSource* createNewStreamSource(unsigned clientSessionId, unsigned& estBitrate); // "estBitrate" is the stream's estimated bitrate, in kbps virtual RTPSink* createNewRTPSink(Groupsock* rtpGroupsock, unsigned char rtpPayloadTypeIfDynamic, FramedSource* inputSource); virtual void deleteStream(unsigned clientSessionId, void*& streamToken); private: UsageEnvironment* m_pEnvironment; rtspVideoStreamer * m_pParent; rtspVideoStreamer::parameters fParam; char* fAuxSDPLine; char fDoneFlag; // used when setting up "fAuxSDPLine" RTPSink* fDummyRTPSink; }; /*--------------------------------------------------------------------*/ rtspVideoStreamer::serverMediaSubsession* rtspVideoStreamer::serverMediaSubsession::createNew(UsageEnvironment& env, Boolean reuseFirstSource, rtspVideoStreamer::parameters params, rtspVideoStreamer* parent) { return new serverMediaSubsession(env, reuseFirstSource,params,parent); } /*--------------------------------------------------------------------*/ rtspVideoStreamer::serverMediaSubsession ::serverMediaSubsession(UsageEnvironment& env, Boolean reuseFirstSource, rtspVideoStreamer::parameters params, rtspVideoStreamer* parent) : OnDemandServerMediaSubsession(env, reuseFirstSource), fParam(params), m_pParent(parent) { m_pEnvironment = &env; fAuxSDPLine = NULL; fDoneFlag = 0; fDummyRTPSink = NULL; } /*--------------------------------------------------------------------*/ 
rtspVideoStreamer::serverMediaSubsession::~serverMediaSubsession() { envir().taskScheduler().unscheduleDelayedTask(nextTask()); delete[] fAuxSDPLine; } /*--------------------------------------------------------------------*/ static void afterPlayingDummy(void* clientData) { rtspVideoStreamer::serverMediaSubsession* subsess = (rtspVideoStreamer::serverMediaSubsession*)clientData; subsess->afterPlayingDummy1(); } /*--------------------------------------------------------------------*/ void rtspVideoStreamer::serverMediaSubsession::afterPlayingDummy1() { envir().taskScheduler().unscheduleDelayedTask(nextTask()); setDoneFlag(); } /*--------------------------------------------------------------------*/ static void checkForAuxSDPLine(void* clientData) { rtspVideoStreamer::serverMediaSubsession* subsess = (rtspVideoStreamer::serverMediaSubsession*)clientData; subsess->checkForAuxSDPLine1(); } /*--------------------------------------------------------------------*/ void rtspVideoStreamer::serverMediaSubsession::checkForAuxSDPLine1() { char const* dasl; if (fAuxSDPLine != NULL) { setDoneFlag(); } else if (fDummyRTPSink != NULL && (dasl = fDummyRTPSink->auxSDPLine()) != NULL) { fAuxSDPLine = strDup(dasl); fDummyRTPSink = NULL; setDoneFlag(); } else { int uSecsToDelay = 100000; // 100 ms nextTask() = envir().taskScheduler().scheduleDelayedTask(uSecsToDelay, (TaskFunc*)checkForAuxSDPLine, this); } } /*--------------------------------------------------------------------*/ char const* rtspVideoStreamer::serverMediaSubsession::getAuxSDPLine(RTPSink* rtpSink, FramedSource* inputSource) { if (fAuxSDPLine != NULL) return fAuxSDPLine; if (fDummyRTPSink == NULL) { fDummyRTPSink = rtpSink; fDummyRTPSink->startPlaying(*inputSource, afterPlayingDummy, this); checkForAuxSDPLine(this); } envir().taskScheduler().doEventLoop(&fDoneFlag); return fAuxSDPLine; } /*--------------------------------------------------------------------*/ FramedSource* rtspVideoStreamer::serverMediaSubsession::createNewStreamSource(unsigned clientSessionId, unsigned& estBitrate) { passiveWait(1000*1000); estBitrate = 500; m_pParent->lockSource(); FramedSource* mysource = NULL; if (fParam.m_encoderParams.m_videoCodec != mjpeg) { mysource = genericSource::createNew(*m_pEnvironment, fParam, m_pParent); } else { mysource = new myJPEGVideoSource(*m_pEnvironment, fParam, m_pParent); } m_pParent->m_bClientConnected = true; m_pParent->unlockSource(); FramedSource* videoES = mysource; switch(fParam.m_encoderParams.m_videoCodec) { case mpeg4: return MPEG4VideoStreamFramer::createNew(*m_pEnvironment, videoES); break; case h263p: return H263plusVideoStreamFramer::createNew(*m_pEnvironment, videoES); break; case h264: return H264VideoStreamFramer::createNew(*m_pEnvironment, videoES); break; case mpeg2: case mpeg1: return MPEG1or2VideoStreamFramer::createNew(*m_pEnvironment, videoES); break; case mjpeg: return mysource; break; default: return NULL; break; } } /*--------------------------------------------------------------------*/ RTPSink* rtspVideoStreamer::serverMediaSubsession::createNewRTPSink(Groupsock* rtpGroupsock, unsigned char rtpPayloadTypeIfDynamic, FramedSource* inputSource) { passiveWait(1000*1000); switch(fParam.m_encoderParams.m_videoCodec) { case mpeg4: return MPEG4ESVideoRTPSink::createNew(*m_pEnvironment, rtpGroupsock, 96); break; case h263p: return H263plusVideoRTPSink::createNew(*m_pEnvironment, rtpGroupsock, 96); break; case h264: return H264VideoRTPSink::createNew(*m_pEnvironment, rtpGroupsock, rtpPayloadTypeIfDynamic/*96*/); break; 
case mpeg2: case mpeg1: return MPEG1or2VideoRTPSink::createNew(*m_pEnvironment, rtpGroupsock); break; case mjpeg: return JPEGVideoRTPSink::createNew(*m_pEnvironment, rtpGroupsock); break; default: return NULL; break; } } /*--------------------------------------------------------------------*/ void rtspVideoStreamer::serverMediaSubsession::deleteStream(unsigned clientSessionId, void*& streamToken) { OnDemandServerMediaSubsession::deleteStream(clientSessionId, streamToken); passiveWait(1500*1000); StreamState* streamState = (StreamState*)streamToken; if (streamState != NULL) m_pParent->m_bClientConnected = true; else m_pParent->m_bClientConnected = false; } /*--------------------------------------------------------------------*/ void rtspVideoStreamer::genericSource::doGetNextFrame() { bool ret; ucImage img; m_pParent->getImage(img); if (img.isValid()) { if ((m_imgSize.width()!=img.getWidth()) || (m_imgSize.height()!=img.getHeight())) { imgResizer imgres; imgResizer::parameters resizeparam; resizeparam.m_width = m_imgSize.width(); resizeparam.m_height = m_imgSize.height(); imgres.setParameters(resizeparam); imgres.apply(img); } try { ret = m_enc.apply(img, m_pBufferCompressed, m_iCompressedSize); } catch (...) { } } try { deliverFrame(); } catch (...) { LOG_DBG("rtspVideoStreamer::genericSource::doGetNextFrame()::deliverFrame() ecception"); } if (0 /* the source stops being readable */) { handleClosure(this); return; } } /*--------------------------------------------------------------------*/ void rtspVideoStreamer::genericSource::deliverFrame() { if ((unsigned)m_iCompressedSize > fMaxSize) { fNumTruncatedBytes = m_iCompressedSize - fMaxSize; fFrameSize = fMaxSize; memcpy(fTo, m_pBufferCompressed, fMaxSize); } else { memcpy(fTo, m_pBufferCompressed, m_iCompressedSize); fFrameSize = m_iCompressedSize; } gettimeofday(&fPresentationTime, NULL); if (!isCurrentlyAwaitingData()) return; // we're not ready for the data yet FramedSource::afterGetting(this); } -- *Andrea Beoldo* Project Manager/R&D Technoaware Srl Corso Buenos Aires 18/11, 16129 Genova (GE) Ph. +39 010 5539239 Fax. +39 0105539240 Email: andrea.beoldo at technoaware.com Web: www.technoaware.com ------------------------------------------------------------------------ *Privacy* Le informazioni contenute in questo messaggio sono riservate e confidenziali. Il loro utilizzo ? consentito esclusivamente al destinatario del messaggio, per le finalit? indicate nel messaggio stesso. Qualora Lei non fosse la persona a cui il presente messaggio ? destinato, La invitiamo ad eliminarlo dal Suo sistema ed a distruggere le varie copie o stampe, dandocene gentilmente comunicazione. Ogni utilizzo improprio ? contrario ai principi del D.lgs 196/03 e alla legislazione europea (Direttiva 2002/58/CE). TechnoAware opera in conformit? al D.lgs 196/2003 e alla legislazione europea. The information contained in this message as well as the attached file(s)is confidential/privileged and is only intended for the person to whom it is addressed. If the reader of this message is not the intended recipient or the employee or agent responsible for delivering the message to the intended recipient, or you have received this communication in error, please be aware that any dissemination, distribution or duplication is strictly prohibited and can be illegal. Please notify us immediately and delete all copies from your mailbox and other archives. Thank you. -------------- next part -------------- An HTML attachment was scrubbed... 
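One pattern in the genericSource code above is worth flagging while hunting the crash: deliverFrame() copies into fTo and sets fFrameSize before it tests isCurrentlyAwaitingData(), whereas the DeviceSource model in the library checks that flag before touching fTo at all (fTo is only valid while the downstream object is actually waiting for data). A minimal sketch of the reordered delivery, keeping the member names used above and assuming m_pBufferCompressed/m_iCompressedSize were filled in by doGetNextFrame():

void rtspVideoStreamer::genericSource::deliverFrame()
{
    if (!isCurrentlyAwaitingData()) return; // don't touch fTo unless the sink has asked for data

    // Truncate if the encoded frame is larger than the buffer the sink handed us:
    if ((unsigned)m_iCompressedSize > fMaxSize) {
        fFrameSize = fMaxSize;
        fNumTruncatedBytes = m_iCompressedSize - fMaxSize;
    } else {
        fFrameSize = m_iCompressedSize;
        fNumTruncatedBytes = 0;
    }
    memcpy(fTo, m_pBufferCompressed, fFrameSize);
    gettimeofday(&fPresentationTime, NULL);

    // Hand the frame to the downstream object (framer / RTP sink):
    FramedSource::afterGetting(this);
}

Note also that the LIVE555 event loop is single-threaded: if another thread produces the images behind m_pParent, the documented way to wake the loop is TaskScheduler::triggerEvent(), rather than the passiveWait() calls sprinkled into createNewStreamSource()/createNewRTPSink()/deleteStream(), which only change the timing of the race.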
URL: From finlayson at live555.com Thu Apr 3 09:02:06 2014 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 3 Apr 2014 09:02:06 -0700 Subject: [Live-devel] VLC RTSP Server and live555 testRTSPClient app In-Reply-To: <533D4143.7080507@gosniias.ru> References: <533D4143.7080507@gosniias.ru> Message-ID: <13ABBA01-E987-4E8E-840A-E8CEE8B2D191@live555.com> > Hi, Ross. I'm using VLC for creating rtsp stream. The problem is that the built testRTSPClient app starts playing the stream, but about every 40 seconds, the client loses its connection and calls subsessionByeHandler function. If the client is calling "subsessionByeHandler", it must be because the server has sent a RTCP "BYE" packet, indicating that its input stream has ended. To verify this, please run "openRTSP" as your client, and send us the diagnostic output (i.e., from stderr). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Apr 3 09:42:38 2014 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 3 Apr 2014 09:42:38 -0700 Subject: [Live-devel] Problem with serverMediaSubsession::createNewRTSPink method In-Reply-To: <533D7D73.90201@technoaware.com> References: <533D7D73.90201@technoaware.com> Message-ID: <66BC70A8-AB55-4D04-BA15-FEEF3BB6FBC0@live555.com> > The problem occurs in the release version when from the client switch from a stream to another. I don't identify exactly where the system crash That should be your first task: To find out exactly where your crash occurs, and why. A stack trace would be useful. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From warren at etr-usa.com Thu Apr 3 12:46:30 2014 From: warren at etr-usa.com (Warren Young) Date: Thu, 03 Apr 2014 13:46:30 -0600 Subject: [Live-devel] MJPEG support for MKV parser In-Reply-To: <4EAB0B97-A2AC-45F7-A1E0-9F689B981DD2@live555.com> References: <27032_1396437165_533BF0AD_27032_6587_1_1BE8971B6CFF3A4F97AF4011882AA2550156486822EE@THSONEA01CMS01P.one.grp> <162C647D-F605-4CE4-89CA-E02E5B3A3FD6@live555.com> <18420_1396447003_533C171B_18420_4324_1_1BE8971B6CFF3A4F97AF4011882AA25501564868285C@THSONEA01CMS01P.one.grp> <4EAB0B97-A2AC-45F7-A1E0-9F689B981DD2@live555.com> Message-ID: <533DBA96.3000407@etr-usa.com> On 4/3/2014 02:27, Ross Finlayson wrote: > > JPEG video streaming is a bad idea (that nobody should really be doing > in 2014) Intra-frame compression is perfectly sensible for IP cameras on private LANs, especially when you're doing things like motion detection on the server side, rather than on-camera. You don't have to account for motion estimation error buildup, for one thing. For another, the inexpensive high-efficiency codec ICs found in IP cameras typically crush a lot of detail out of the scene relative to JPEG. Even if you're going to use something like H.264 for video storage after motion estimation, a software encoder will typically give better results than compressing on-camera. (At the expense of more watts per megabit, of course.) If intra-frame compression were perfect for everything, the industry wouldn't still be coming up with new I-only codecs like ProRes and CineForm. 
From finlayson at live555.com Thu Apr 3 13:24:08 2014 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 3 Apr 2014 13:24:08 -0700 Subject: [Live-devel] MJPEG support for MKV parser In-Reply-To: <533DBA96.3000407@etr-usa.com> References: <27032_1396437165_533BF0AD_27032_6587_1_1BE8971B6CFF3A4F97AF4011882AA2550156486822EE@THSONEA01CMS01P.one.grp> <162C647D-F605-4CE4-89CA-E02E5B3A3FD6@live555.com> <18420_1396447003_533C171B_18420_4324_1_1BE8971B6CFF3A4F97AF4011882AA25501564868285C@THSONEA01CMS01P.one.grp> <4EAB0B97-A2AC-45F7-A1E0-9F689B981DD2@live555.com> <533DBA96.3000407@etr-usa.com> Message-ID: <3834E24A-73A3-482E-847E-F4BF52F8922E@live555.com> I have no doubt that MJPEG is useful for many things. However, streaming is not one of them. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From michel.promonet at thalesgroup.com Thu Apr 3 04:52:08 2014 From: michel.promonet at thalesgroup.com (PROMONET Michel) Date: Thu, 3 Apr 2014 13:52:08 +0200 Subject: [Live-devel] MJPEG support for MKV parser In-Reply-To: <4EAB0B97-A2AC-45F7-A1E0-9F689B981DD2@live555.com> References: <27032_1396437165_533BF0AD_27032_6587_1_1BE8971B6CFF3A4F97AF4011882AA2550156486822EE@THSONEA01CMS01P.one.grp> <162C647D-F605-4CE4-89CA-E02E5B3A3FD6@live555.com> <18420_1396447003_533C171B_18420_4324_1_1BE8971B6CFF3A4F97AF4011882AA25501564868285C@THSONEA01CMS01P.one.grp> <4EAB0B97-A2AC-45F7-A1E0-9F689B981DD2@live555.com> Message-ID: <10689_1396525930_533D4B6A_10689_3307_1_1BE8971B6CFF3A4F97AF4011882AA2550156486EA556@THSONEA01CMS01P.one.grp> Hi Ross, I know this, however I was just wondering if it is possible to manage the case in the switch in order to parse it. About MJPEG streaming I remember you consider this is a bad idea. I read many times http://www.live555.com/liveMedia/faq.html#jpeg-streaming I also posted a possible solution to stream video/JPEG http://lists.live555.com/pipermail/live-devel/2012-February/014672.html. I probably modify it but I am still using it and it works quite well. There is some use cases where MJPEG streaming has still some meaning : - It needs less CPU usage, so VCA still use it - It could easily be played backward, as H264 need to decode a full GOP - It is simple to subsample the stream. Sorry for disturbing. Best Regards, Michel. [@@ THALES GROUP INTERNAL @@] De : live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] De la part de Ross Finlayson Envoy? : jeudi 3 avril 2014 10:27 ? : LIVE555 Streaming Media - development & use Objet : Re: [Live-devel] MJPEG support for MKV parser I took a look at your example MKV file. It turns out that this file (and apparently all MKV files with JPEG video tracks) stores each JPEG frame as a complete JPEG image, including a JPEG header. For RTP streaming, however, we don't include the JPEG header. Therefore, to stream a JPEG video track from a MKV file, we'd need to add an additional filter (not currently implemented) that parses the JPEG header of each frame, to derive/extract out the 'type', 'qfactor', and (if non-standard) the quantization table - to put in the RTP packet. This is something that might get done sometime in a future, but since JPEG video streaming is a bad idea (that nobody should really be doing in 2014), it's not a high priority for me. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
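To make the missing piece described earlier concrete: the filter Ross mentions (recovering, from each complete JPEG image, the values that the RTP/JPEG payload format carries instead of the JPEG header) mostly amounts to walking the JFIF markers of every frame. A rough sketch, with hypothetical names (JpegFrameInfo, parseJpegHeader) and no claim to match either the library's eventual implementation or the solution posted above; mapping the extracted tables to an RFC 2435 Q value (or sending them in-band with Q >= 128) would be the remaining step:

#include <stdint.h>
#include <string.h>

struct JpegFrameInfo {
    unsigned width, height;   // from the SOF0 segment, in pixels
    uint8_t qTables[2][64];   // 8-bit quantization tables from DQT segments
    unsigned numQTables;
};

// Scan the marker segments of one complete JPEG image (SOI .. SOS) and pull out
// the frame dimensions and quantization tables. Returns false on malformed input.
static bool parseJpegHeader(uint8_t const* data, unsigned size, JpegFrameInfo& info)
{
    if (size < 4 || data[0] != 0xFF || data[1] != 0xD8) return false; // no SOI marker
    info.numQTables = 0;
    unsigned i = 2;
    while (i + 4 <= size) {
        if (data[i] != 0xFF) return false;                  // expected a marker
        uint8_t marker = data[i + 1];
        unsigned segLen = (data[i + 2] << 8) | data[i + 3]; // includes the two length bytes
        if (marker == 0xC0) {                               // SOF0: baseline frame header
            info.height = (data[i + 5] << 8) | data[i + 6];
            info.width  = (data[i + 7] << 8) | data[i + 8];
        } else if (marker == 0xDB) {                        // DQT: 1-byte id + 64-byte table(s)
            unsigned j = i + 4;
            while (j + 65 <= i + 2 + segLen && info.numQTables < 2) {
                memcpy(info.qTables[info.numQTables++], &data[j + 1], 64);
                j += 65;
            }
        } else if (marker == 0xDA) {                        // SOS: entropy-coded data follows
            return true;
        }
        i += 2 + segLen;
    }
    return false;
}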
URL: From warren at etr-usa.com Thu Apr 3 20:34:12 2014 From: warren at etr-usa.com (Warren Young) Date: Thu, 03 Apr 2014 21:34:12 -0600 Subject: [Live-devel] MJPEG support for MKV parser In-Reply-To: <3834E24A-73A3-482E-847E-F4BF52F8922E@live555.com> References: <27032_1396437165_533BF0AD_27032_6587_1_1BE8971B6CFF3A4F97AF4011882AA2550156486822EE@THSONEA01CMS01P.one.grp> <162C647D-F605-4CE4-89CA-E02E5B3A3FD6@live555.com> <18420_1396447003_533C171B_18420_4324_1_1BE8971B6CFF3A4F97AF4011882AA25501564868285C@THSONEA01CMS01P.one.grp> <4EAB0B97-A2AC-45F7-A1E0-9F689B981DD2@live555.com> <533DBA96.3000407@etr-usa.com> <3834E24A-73A3-482E-847E-F4BF52F8922E@live555.com> Message-ID: <533E2834.3000002@etr-usa.com> On 4/3/2014 14:24, Ross Finlayson wrote: > I have no doubt that MJPEG is useful for many things. However, > streaming is not one of them. What if the streaming is done over HTTP or RTP-over-TCP? That solves your packet loss concern: http://www.live555.com/liveMedia/faq.html#jpeg-streaming ...leaving only the inefficiency, which is unavoidable if you need one of the benefits of I-only compression. From ashvyrkin at gosniias.ru Thu Apr 3 20:49:05 2014 From: ashvyrkin at gosniias.ru (Andrey Shvyrkin) Date: Fri, 04 Apr 2014 07:49:05 +0400 Subject: [Live-devel] VLC RTSP Server and live555 testRTSPClient app In-Reply-To: <13ABBA01-E987-4E8E-840A-E8CEE8B2D191@live555.com> References: <533D4143.7080507@gosniias.ru> <13ABBA01-E987-4E8E-840A-E8CEE8B2D191@live555.com> Message-ID: <533E2BB1.5000108@gosniias.ru> 03.04.2014 20:02, Ross Finlayson ?????: >> Hi, Ross. I'm using VLC for creating rtsp stream. The problem is that >> the built testRTSPClient app starts playing the stream, but about >> every 40 seconds, the client loses its connection and calls >> subsessionByeHandler function. > > If the client is calling "subsessionByeHandler", it must be because > the server has sent a RTCP "BYE" packet, indicating that its input > stream has ended. > > To verify this, please run "openRTSP" as your client, and send us the > diagnostic output (i.e., from stderr). > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel Ross, it's log from "openRTSP" app. In SETUP response timeout = 60 seconds and RTCP "BYE" packet received after 60 seconds. If VLC sent a RTCP "BYE" packet, why VLC is playing its own stream without teardown? Opening connection to 192.168.33.77, port 8554... ...remote connection opened Sending request: OPTIONS rtsp://192.168.33.77:8554/media0 RTSP/1.0 CSeq: 2 User-Agent: C:\dev\Live555Viewer\Release\Live555Viewer.exe (LIVE555 Streaming Me dia v2014.03.25) Received 124 new bytes of response data. Received a complete OPTIONS response: RTSP/1.0 200 OK Server: VLC/2.1.3 Content-Length: 0 Cseq: 2 Public: DESCRIBE,SETUP,TEARDOWN,PLAY,PAUSE,GET_PARAMETER Sending request: DESCRIBE rtsp://192.168.33.77:8554/media0 RTSP/1.0 CSeq: 3 User-Agent: C:\dev\Live555Viewer\Release\Live555Viewer.exe (LIVE555 Streaming Me dia v2014.03.25) Accept: application/sdp Received 662 new bytes of response data. 
Received a complete DESCRIBE response: RTSP/1.0 200 OK Server: VLC/2.1.3 Date: Fri, 04 Apr 2014 03:26:41 GMT Content-Type: application/sdp Content-Base: rtsp://192.168.33.77:8554/media0 Content-Length: 453 Cache-Control: no-cache Cseq: 3 v=0 o=- 15485808675063797458 15485808675063797458 IN IP4 Jocker-PC s=Unnamed i=N/A c=IN IP4 0.0.0.0 t=0 0 a=tool:vlc 2.1.3 a=recvonly a=type:broadcast a=charset:UTF-8 a=control:rtsp://192.168.33.77:8554/media0 m=video 0 RTP/AVP 96 b=RR:0 a=rtpmap:96 H264/90000 a=fmtp:96 packetization-mode=1;profile-level-id=640028;sprop-parameter-sets=Z2QA KKzZQHgCJ+XARAAAAwAEAAADAMo8YMZY,aOvjyyLA; a=control:rtsp://192.168.33.77:8554/media0/trackID=2 Opened URL "rtsp://192.168.33.77:8554/media0", returning a SDP description: v=0 o=- 15485808675063797458 15485808675063797458 IN IP4 Jocker-PC s=Unnamed i=N/A c=IN IP4 0.0.0.0 t=0 0 a=tool:vlc 2.1.3 a=recvonly a=type:broadcast a=charset:UTF-8 a=control:rtsp://192.168.33.77:8554/media0 m=video 0 RTP/AVP 96 b=RR:0 a=rtpmap:96 H264/90000 a=fmtp:96 packetization-mode=1;profile-level-id=640028;sprop-parameter-sets=Z2QA KKzZQHgCJ+XARAAAAwAEAAADAMo8YMZY,aOvjyyLA; a=control:rtsp://192.168.33.77:8554/media0/trackID=2 Created receiver for "video/H264" subsession (client ports 64166-64167) Sending request: SETUP rtsp://192.168.33.77:8554/media0/trackID=2 RTSP/1.0 CSeq: 4 User-Agent: C:\dev\Live555Viewer\Release\Live555Viewer.exe (LIVE555 Streaming Me dia v2014.03.25) Transport: RTP/AVP;unicast;client_port=64166-64167 Received 270 new bytes of response data. Received a complete SETUP response: RTSP/1.0 200 OK Server: VLC/2.1.3 Date: Fri, 04 Apr 2014 03:26:41 GMT Transport: RTP/AVP/UDP;unicast;client_port=64166-64167;server_port=64168-64169;s src=4D36B604;mode=play Session: 766278e0bcac1ed7;timeout=60 Content-Length: 0 Cache-Control: no-cache Cseq: 4 Setup "video/H264" subsession (client ports 64166-64167) Created output file: "video-H264-1" Sending request: PLAY rtsp://192.168.33.77:8554/media0 RTSP/1.0 CSeq: 5 User-Agent: C:\dev\Live555Viewer\Release\Live555Viewer.exe (LIVE555 Streaming Me dia v2014.03.25) Session: 766278e0bcac1ed7 Range: npt=0.000- Received 277 new bytes of response data. Received a complete PLAY response: RTSP/1.0 200 OK Server: VLC/2.1.3 Date: Fri, 04 Apr 2014 03:26:41 GMT RTP-Info: url=rtsp://192.168.33.77:8554/media0/trackID=2;seq=12802;rtptime=31960 74254 Range: npt=215.623000- Session: 766278e0bcac1ed7;timeout=60 Content-Length: 0 Cache-Control: no-cache Cseq: 5 Started playing session Receiving streamed data... MultiFramedRTPSource::doGetNextFrame1(): The total received frame size exceeds t he client's buffer size (100000). 4237 bytes of trailing data will be dropped! FileSink::afterGettingFrame(): The input frame data was too large for our buffer size (100000). 4237 bytes of trailing data was dropped! Correct this by incre asing the "bufferSize" parameter in the "createNew()" call to at least 104237 Received RTCP "BYE" on "video/H264" subsession (after 60 seconds) Sending request: TEARDOWN rtsp://192.168.33.77:8554/media0 RTSP/1.0 CSeq: 6 User-Agent: C:\dev\Live555Viewer\Release\Live555Viewer.exe (LIVE555 Streaming Me dia v2014.03.25) Session: 766278e0bcac1ed7 Received 166 new bytes of response data. Received a complete TEARDOWN response: RTSP/1.0 200 OK Server: VLC/2.1.3 Date: Fri, 04 Apr 2014 03:27:41 GMT Session: 766278e0bcac1ed7;timeout=60 Content-Length: 0 Cache-Control: no-cache Cseq: 6 -------------- next part -------------- An HTML attachment was scrubbed... 
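A side note on the two buffer warnings in the log above: they are unrelated to the disconnect, but easy to silence. In a client built from testRTSPClient (which this one appears to be, judging by the User-Agent), the receive buffer is a compile-time constant; when running openRTSP itself, the -b option plays the same role. A sketch:

// testRTSPClient.cpp sizes each subsession's receive buffer with this constant; raising it
// above the largest access unit (>= 104237 bytes, per the warning) stops the truncation:
#define DUMMY_SINK_RECEIVE_BUFFER_SIZE 200000   // default is 100000

// The rough equivalent when running openRTSP directly (output-file buffer size):
//   openRTSP -b 200000 -v rtsp://192.168.33.77:8554/media0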
URL: From finlayson at live555.com Thu Apr 3 21:09:23 2014 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 3 Apr 2014 21:09:23 -0700 Subject: [Live-devel] VLC RTSP Server and live555 testRTSPClient app In-Reply-To: <533E2BB1.5000108@gosniias.ru> References: <533D4143.7080507@gosniias.ru> <13ABBA01-E987-4E8E-840A-E8CEE8B2D191@live555.com> <533E2BB1.5000108@gosniias.ru> Message-ID: <31AA2E67-B099-42AD-9593-E076B624C69C@live555.com> The problem here is that VLC - when run as a RTSP server - is not standards compliant. It should be listening to incoming RTCP "RR" packets from the client, and using them to tell it that the client is still alive. (Note that VLC's RTSP server implementation - unlike its RTSP client implementation - does not use our software.) VLC - when run as a client - works around this bug by explicitly sending a 'dummy' RTSP command ("GET_PARAMETER", I think) periodically. But it shouldn't have to, because RTCP "RR" packets (which all compliant RTSP/RTP clients are required to send) should be enough. Please tell the developers of VLC to fix this bug. Alternatively, use some other RTSP server (such as ours), instead of VLC. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From ashvyrkin at gosniias.ru Thu Apr 3 23:00:35 2014 From: ashvyrkin at gosniias.ru (Andrey Shvyrkin) Date: Fri, 04 Apr 2014 10:00:35 +0400 Subject: [Live-devel] VLC RTSP Server and live555 testRTSPClient app In-Reply-To: <31AA2E67-B099-42AD-9593-E076B624C69C@live555.com> References: <533D4143.7080507@gosniias.ru> <13ABBA01-E987-4E8E-840A-E8CEE8B2D191@live555.com> <533E2BB1.5000108@gosniias.ru> <31AA2E67-B099-42AD-9593-E076B624C69C@live555.com> Message-ID: <533E4A83.2030808@gosniias.ru> 04.04.2014 8:09, Ross Finlayson ?????: > The problem here is that VLC - when run as a RTSP server - is not > standards compliant. It should be listening to incoming RTCP "RR" > packets from the client, and using them to tell it that the client is > still alive. (Note that VLC's RTSP server implementation - unlike its > RTSP client implementation - does not use our software.) > > VLC - when run as a client - works around this bug by explicitly > sending a 'dummy' RTSP command ("GET_PARAMETER", I think) > periodically. But it shouldn't have to, because RTCP "RR" packets > (which all compliant RTSP/RTP clients are required to send) should be > enough. Please tell the developers of VLC to fix this bug. > > Alternatively, use some other RTSP server (such as ours), instead of VLC. > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel I modified "testRTSPClient" code and sent GET_PARAMETER command every 30 seconds, but VLC respond error. [URL:"rtsp://192.168.33.77:8554/media0"]: Started playing session... Sending request: GET_PARAMETER rtsp://192.168.33.77:8554/media0 RTSP/1.0 CSeq: 5 User-Agent: C:\dev\Live555Viewer\Release\Live555Viewer.exe (LIVE555 Streaming Me dia v2014.03.25) Session: 12d0ee350d27aa85 Content-Length: 2 Received 138 new bytes of response data. Received a complete GET_PARAMETER response: RTSP/1.0 451 Client error Server: VLC/2.1.3 Date: Fri, 04 Apr 2014 05:55:04 GMT Content-Length: 0 Cache-Control: no-cache Cseq: 5 -------------- next part -------------- An HTML attachment was scrubbed... 
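(The 451 above is explained in the next message: an empty string, rather than NULL, was passed as the parameter name, so the GET_PARAMETER request carried a body that VLC rejected.) For reference, a keep-alive along the lines of what VLC's own client does might look like this in a testRTSPClient-style client; the function name sendKeepAlive and the 30-second period are illustrative choices, not library names:

// Periodically send a no-op GET_PARAMETER so a non-compliant server sees the client as alive.
// Assumes the testRTSPClient classes ourRTSPClient/StreamClientState and a set-up scs.session.
void sendKeepAlive(void* clientData) {
    RTSPClient* rtspClient = (RTSPClient*)clientData;
    StreamClientState& scs = ((ourRTSPClient*)rtspClient)->scs;

    // Pass NULL (not "") as the parameter name, so the request goes out with an empty body:
    rtspClient->sendGetParameterCommand(*scs.session, NULL, NULL);

    // Re-arm well inside the server's advertised timeout (60 s in the SETUP response above):
    unsigned const keepAlivePeriodUs = 30 * 1000000;
    rtspClient->envir().taskScheduler().scheduleDelayedTask(keepAlivePeriodUs,
                                                            (TaskFunc*)sendKeepAlive, rtspClient);
}

Calling sendKeepAlive(rtspClient) once after the PLAY response succeeds starts the cycle; in a real client you would keep the TaskToken returned by scheduleDelayedTask() so the pending task can be unscheduled at teardown.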
URL: From ashvyrkin at gosniias.ru Thu Apr 3 23:16:30 2014 From: ashvyrkin at gosniias.ru (Andrey Shvyrkin) Date: Fri, 04 Apr 2014 10:16:30 +0400 Subject: [Live-devel] VLC RTSP Server and live555 testRTSPClient app In-Reply-To: <533E4A83.2030808@gosniias.ru> References: <533D4143.7080507@gosniias.ru> <13ABBA01-E987-4E8E-840A-E8CEE8B2D191@live555.com> <533E2BB1.5000108@gosniias.ru> <31AA2E67-B099-42AD-9593-E076B624C69C@live555.com> <533E4A83.2030808@gosniias.ru> Message-ID: <533E4E3E.5000804@gosniias.ru> Sorry, it's my issue: I passed an empty string to "sendGetParameterCommand" as "parameterName"; now I pass a NULL parameter, and VLC no longer disconnects. From mahesh122001 at gmail.com Sat Apr 5 23:49:17 2014 From: mahesh122001 at gmail.com (Mahesh Babu) Date: Sun, 6 Apr 2014 12:19:17 +0530 Subject: [Live-devel] how to stream h264 file from buffer Message-ID: I have an H.264 video file stored in a buffer (RAM), and I need to stream the video from that buffer. Please provide the essential instructions that I should follow. -------------- next part -------------- An HTML attachment was scrubbed... URL: From jan.ekholm at d-pointer.com Sun Apr 6 08:37:34 2014 From: jan.ekholm at d-pointer.com (Jan Ekholm) Date: Sun, 6 Apr 2014 18:37:34 +0300 Subject: [Live-devel] Making ProxyServerMediaSubsession stream to multicast addresses Message-ID: <9D564B6A-DE0F-4418-8A6A-CE8203BAD2EE@d-pointer.com> Hi, I'm a fairly new user of Live555, having worked on a small application for about a week or so. So far I've more or less successfully managed to stream out video from connected local USB cameras using OpenCV. It was definitely not a trivial task. Now I'd like to proxy some remote surveillance cameras. As first I started working on some own proxying code, but that didn't work too well and then I found ProxyServerMediaSubsession which seems to do exactly what I want to do for now. However, I'd like that the streams my application serves are multicasted, and ProxyServerMediaSubsession has an explicit mention that it uses unicast only. From what I understand, to get it to use multicast, the Groupsock that gets passed to: RTPSink* ProxyServerMediaSubsession::createNewRTPSink(Groupsock* rtpGroupsock, unsigned char rtpPayloadTypeIfDynamic, FramedSource* inputSource); should be set up to use multicast, right? Is there any nice way I can do that? Also I probably want to also be able to decode the incoming stream in order to grab still images from it and do some manipulation on it before re-encoding it again. Is there some good way to "duplicate" the incoming stream into my custom code for doing the image manipulation while at the same time keeping the normal proxied stream working? I'm sorry if these questions are dumb, but I didn't find anything in the docs or examples. Best regards, Jan Ekholm From finlayson at live555.com Sun Apr 6 12:54:21 2014 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 6 Apr 2014 12:54:21 -0700 Subject: [Live-devel] Making ProxyServerMediaSubsession stream to multicast addresses In-Reply-To: <9D564B6A-DE0F-4418-8A6A-CE8203BAD2EE@d-pointer.com> References: <9D564B6A-DE0F-4418-8A6A-CE8203BAD2EE@d-pointer.com> Message-ID: > Now I'd like to proxy some remote surveillance cameras. As first I started working on some > own proxying code, but that didn't work too well and then I found ProxyServerMediaSubsession > which seems to do exactly what I want to do for now.
However, I'd like that the streams my > application serves are multicasted, and ProxyServerMediaSubsession has an explicit > mention that it uses unicast only. That's correct. That code is used to allow a single 'back-end' RTSP/RTP stream (which can be either unicast or multicast) to be proxied - *unicast* - to one or more 'front-end' clients. It is not intended to be used for multicast front-end streaming, because would be based upon completely different code ("PassiveServerMediaSubsession" rather than "OnDemandServerMediaSubsession"). Fortunately, however, you can proxy to a multicast 'front-end' very simply, using just a modification to the existing tools: (Assuming that your 'back-end' camera streams H.264 video.) 1/ In the "testH264VideoStreamer.cpp" code (in "testProgs"), change "inputFileName" from "test.264" to "stdin". 2/ Then run "openRTSP" on the 'back-end' stream, piping its output to your modified "testH264VideoStreamer" application. I.e.: openRTSP -v rtsp://back-end-rtsp-url | your_modified_testH264VideoStreamer > Also I probably want to also be able to decode the incoming stream in order to grab > still images from it and do some manipulation on it before re-encoding it again. Is there > some good way to "duplicate" the incoming stream into my custom code for doing the > image manipulation while at the same time keeping the normal proxied stream working? Yes, you can use the "StreamReplicator" class. (Note the "testReplicator" demo application - in "testProgs" - that illustrates how to use this.) If you use the mechanism that I suggested above (piping "openRTSP" into a modified "testH264VideoStreamer"), then you could update the "testH264VideoStreamer" some more, by feeding the "StreamReplicator" from the input "ByteStreamFileSource" (from "stdin"), create two replicas, then feed one replica into the "H264VideoStreamFramer" (and thus the "H264VideoRTPSink", for streaming), and feed another replica into your decoder. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jan.ekholm at d-pointer.com Sun Apr 6 14:32:01 2014 From: jan.ekholm at d-pointer.com (Jan Ekholm) Date: Mon, 7 Apr 2014 00:32:01 +0300 Subject: [Live-devel] Making ProxyServerMediaSubsession stream to multicast addresses In-Reply-To: References: <9D564B6A-DE0F-4418-8A6A-CE8203BAD2EE@d-pointer.com> Message-ID: <3D54E143-3E3B-42FA-915D-81F4DE25827A@d-pointer.com> Thank you for the quick answer, Ross! On 6 apr 2014, at 22:54, Ross Finlayson wrote: >> Now I'd like to proxy some remote surveillance cameras. As first I started working on some >> own proxying code, but that didn't work too well and then I found ProxyServerMediaSubsession >> which seems to do exactly what I want to do for now. However, I'd like that the streams my >> application serves are multicasted, and ProxyServerMediaSubsession has an explicit >> mention that it uses unicast only. > > That's correct. That code is used to allow a single 'back-end' RTSP/RTP stream (which can be either unicast or multicast) to be proxied - *unicast* - to one or more 'front-end' clients. It is not intended to be used for multicast front-end streaming, because would be based upon completely different code ("PassiveServerMediaSubsession" rather than "OnDemandServerMediaSubsession"). So OnDemandServerMediaSubsession and its subclasses are always for unicast streaming while PassiveServerMediaSubsession can perform multicast streaming? 
I understood the latter as a class that simply wrapped an existing sink, not that there would be any difference in the way the data is streamed. I guess it's not as simple as making a version of ProxyServerMediaSubsession that is based on PassiveServerMediaSubsession instead? > Fortunately, however, you can proxy to a multicast 'front-end' very simply, using just a modification to the existing tools: > (Assuming that your 'back-end' camera streams H.264 video.) > 1/ In the "testH264VideoStreamer.cpp" code (in "testProgs"), change "inputFileName" from "test.264" to "stdin". > 2/ Then run "openRTSP" on the 'back-end' stream, piping its output to your modified "testH264VideoStreamer" application. I.e.: > openRTSP -v rtsp://back-end-rtsp-url | your_modified_testH264VideoStreamer The backend cameras are quite normal and output H.264 or mjpeg. But as my application is supposed to be able to proxy several of those cameras, as well as do optional scaling, framerate dropping as well as grab still images, I do need to have it all as a more or less monolithic application. Based on your description above it's perfectly doable, as long as I can figure out how to properly create the source -> sink chain. I haven't understood yet how to take data that one example application sends to a dummy sink and make that a real source to a part of the application that could do the multicasting. > >> Also I probably want to also be able to decode the incoming stream in order to grab >> still images from it and do some manipulation on it before re-encoding it again. Is there >> some good way to "duplicate" the incoming stream into my custom code for doing the >> image manipulation while at the same time keeping the normal proxied stream working? > > Yes, you can use the "StreamReplicator" class. (Note the "testReplicator" demo application - in "testProgs" - that illustrates how to use this.) Aha, I was looking for things like "proxying" and "forwarding". That example looks indeed very promising and the StreamReplicator class seems easy to use. > If you use the mechanism that I suggested above (piping "openRTSP" into a modified "testH264VideoStreamer"), then you could update the "testH264VideoStreamer" some more, by feeding the "StreamReplicator" from the input "ByteStreamFileSource" (from "stdin"), create two replicas, then feed one replica into the "H264VideoStreamFramer" (and thus the "H264VideoRTPSink", for streaming), and feed another replica into your decoder. I think I have to do it all in one application in this case. But it doesn't seem to be impossible at all. Thank you again for the help, I'm learning more every day. :) Best regards, Jan Ekholm From finlayson at live555.com Sun Apr 6 14:37:37 2014 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 6 Apr 2014 14:37:37 -0700 Subject: [Live-devel] Making ProxyServerMediaSubsession stream to multicast addresses In-Reply-To: <3D54E143-3E3B-42FA-915D-81F4DE25827A@d-pointer.com> References: <9D564B6A-DE0F-4418-8A6A-CE8203BAD2EE@d-pointer.com> <3D54E143-3E3B-42FA-915D-81F4DE25827A@d-pointer.com> Message-ID: <167C76C7-BB39-4C1B-9C4D-63E9B888A47E@live555.com> > So OnDemandServerMediaSubsession and its subclasses are always for unicast streaming while PassiveServerMediaSubsession can perform > multicast streaming? Yes. > I guess it's not as simple as making a version of ProxyServerMediaSubsession that is based on > PassiveServerMediaSubsession instead? 
You could certainly do that, but then that would be your own code, and therefore probably not something that I could help you with (at least, not for free) on this mailing list. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jan.ekholm at d-pointer.com Sun Apr 6 23:21:03 2014 From: jan.ekholm at d-pointer.com (Jan Ekholm) Date: Mon, 7 Apr 2014 09:21:03 +0300 Subject: [Live-devel] Making ProxyServerMediaSubsession stream to multicast addresses In-Reply-To: <167C76C7-BB39-4C1B-9C4D-63E9B888A47E@live555.com> References: <9D564B6A-DE0F-4418-8A6A-CE8203BAD2EE@d-pointer.com> <3D54E143-3E3B-42FA-915D-81F4DE25827A@d-pointer.com> <167C76C7-BB39-4C1B-9C4D-63E9B888A47E@live555.com> Message-ID: On 7 apr 2014, at 00:37, Ross Finlayson wrote: >> So OnDemandServerMediaSubsession and its subclasses are always for unicast streaming while PassiveServerMediaSubsession can perform >> multicast streaming? > > Yes. Ok, that clears it up a bit. And as PassiveServerMediaSubsession would multicast it would do it all the time, regardless of if anyone actually receives the stream, right? Not all of the backend cameras are interesting all the time though, so this also requires some consideration as network bandwidth is finite after all. > >> I guess it's not as simple as making a version of ProxyServerMediaSubsession that is based on >> PassiveServerMediaSubsession instead? > > You could certainly do that, but then that would be your own code, and therefore probably not something that I could help you with (at least, not for free) on this mailing list. Indeed, I understand that. -- Jan Ekholm jan.ekholm at d-pointer.com From finlayson at live555.com Mon Apr 7 00:27:51 2014 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 7 Apr 2014 00:27:51 -0700 Subject: [Live-devel] Making ProxyServerMediaSubsession stream to multicast addresses In-Reply-To: References: <9D564B6A-DE0F-4418-8A6A-CE8203BAD2EE@d-pointer.com> <3D54E143-3E3B-42FA-915D-81F4DE25827A@d-pointer.com> <167C76C7-BB39-4C1B-9C4D-63E9B888A47E@live555.com> Message-ID: > Ok, that clears it up a bit. And as PassiveServerMediaSubsession would multicast it would do it all the time, regardless of > if anyone actually receives the stream, right? Not all of the backend cameras are interesting all the time though, so this also > requires some consideration as network bandwidth is finite after all. Yes, although the way that IP multicast routing is implemented, if nobody else is currently subscribed to the multicast address, then the multicast packets will not leave the sender's local area network. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From karnaukhov at physics.msu.ru Mon Apr 7 10:45:44 2014 From: karnaukhov at physics.msu.ru (=?KOI8-R?B?4czFy9PFyiDhzMXL08XF18neIOvB0s7B1cjP1w==?=) Date: Mon, 7 Apr 2014 21:45:44 +0400 Subject: [Live-devel] One-frame delay in h264 streaming app Message-ID: Hello. I`m implementing low-latency h264 rtsp source. Based on this code http://stackoverflow.com/questions/13863673/how-to-write-a-live555-framedsource-to-allow-me-to-stream-h-264-live I wrote subclass of FramedSource and able to watch this stream in my client app. But even if I put 1 frame per second setting in x264 codec, I have exactly one frame-delayed data in my client. 
This delay is 100% on server side, because MultiFramedRTPSink::sendPacketIfNecessary() sending compressed data of previous frame (i just compare first several bytes of data in fOutBuf with those I wrote in fTo). What can I do to send rtp packets immediately after filling fTo buffer? Here is my deliverFrame implementation: void H264FramedSource::deliverFrame() { if(nals_count==0) { clr=clr+1; if(clr==25) clr = 0; envir() << "color=\t"< 0 && fPreferredFrameSize > 0) { if (fPresentationTime.tv_sec == 0 && fPresentationTime.tv_usec == 0) { // This is the first frame, so use the current time: gettimeofday(&fPresentationTime, NULL); } else { // Increment by the play time of the previous data: unsigned uSeconds = fPresentationTime.tv_usec + fLastPlayTime; fPresentationTime.tv_sec += uSeconds/1000000; fPresentationTime.tv_usec = uSeconds%1000000; } // Remember the play time of this data: fLastPlayTime = (fPlayTimePerFrame*fFrameSize)/fPreferredFrameSize; fDurationInMicroseconds = fLastPlayTime; } else { // We don't know a specific play time duration for this data, // so just record the current time as being the 'presentation time': gettimeofday(&fPresentationTime, NULL); } fPresentationTime.tv_usec; if(nals_count>0)//!m_queue.empty()) { nalToDeliver = nals_queue[nal_num]; uint8_t* newFrameDataStart = (uint8_t*)0xD15EA5E; newFrameDataStart = (uint8_t*)(nalToDeliver.p_payload); unsigned newFrameSize = nalToDeliver.i_payload; nal_num++; nals_count--; // Deliver the data here: if (newFrameSize > fMaxSize) { fFrameSize = fMaxSize; fNumTruncatedBytes = newFrameSize - fMaxSize; envir() << "truncate!!!\r\n" << fNumTruncatedBytes << "\r\n"; } else { fFrameSize = newFrameSize; } memcpy(fTo, nalToDeliver.p_payload, fFrameSize); if(nals_count==0) nal_num=0; nextTask() = envir().taskScheduler().scheduleDelayedTask(0, (TaskFunc*)FramedSource::afterGetting, this); }else{ nal_num = 0; } } -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Apr 7 11:11:56 2014 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 7 Apr 2014 11:11:56 -0700 Subject: [Live-devel] One-frame delay in h264 streaming app In-Reply-To: References: Message-ID: <53CED704-1A62-4C63-A658-704E3A1C00A7@live555.com> > I`m implementing low-latency h264 rtsp source. Based on this code > http://stackoverflow.com/questions/13863673/how-to-write-a-live555-framedsource-to-allow-me-to-stream-h-264-live Oh please! Only second-rate programmers use 'stackoverflow.com'. The proper web site for documentation on the "LIVE555 Streaming Media" code is . In particular, you should read http://www.live555.com/liveMedia/faq.html#liveInput and http://www.live555.com/liveMedia/faq.html#liveInput-unicast > I wrote subclass of FramedSource and able to watch this stream in my client app. But even if I put 1 frame per second setting in x264 codec, I have exactly one frame-delayed data in my client. This delay is 100% on server side, because MultiFramedRTPSink::sendPacketIfNecessary() sending compressed data of previous frame (i just compare first several bytes of data in fOutBuf with those I wrote in fTo). I haven't looked at your code in detail, but you should note that once a H.264 NAL unit is delivered to the downstream "H264VideoStreamDiscreteFramer" object (using "FramedSource::afterGetting()"), this delivery will take place immediately. Similarly, the delivery from the "H264VideoStreamDiscreteFramer" to *its* downstream object - a "H264VideoRTPSink" - will also take place immediately. 
There is no '1 frame (NAL unit) delay'. (Note also that for H.264 video streaming, because you are delivering discrete NAL units (i.e., one at a time), your NAL unit source class *must* feed into a "H264VideoStreamDiscreteFramer" object (*not* a "H264VideoStreamFramer"), and each NAL unit that's delivered *must not* begin with a 0x00000001 'start code'.) > nextTask() = envir().taskScheduler().scheduleDelayedTask(0, > (TaskFunc*)FramedSource::afterGetting, this); Because your code will already be returning to the event loop immediately afterwards, note that you can simplify this to FramedSource::afterGetting(this); Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From goran.ambardziev at gmail.com Mon Apr 7 14:58:02 2014 From: goran.ambardziev at gmail.com (Goran Ambardziev) Date: Mon, 07 Apr 2014 23:58:02 +0200 Subject: [Live-devel] AAC/H264 RTP Multicast timestamps Message-ID: <53431F6A.8060908@gmail.com> Hello, and thanks for the help. > - We see these types of messages all the time in our platform (which > is implemented using live555 on the server.) A different media player > we've tried seems to play the stream without these "glitches". > - I see the same messages using my own RTSP/RTP stack written from scratch. Hmm, it seems so. I've tried it with MPlayer for windows and it is working fine. The streams are practically unplayable in VLC. I event get errors saying: "more than 5 seconds of late video" mainwarning: playback too late (61251): up-sampling mainwarning: playback too late (60915): up-sampling avcodecerror: more than 5 seconds of late video -> dropping frame (computer too slow ?) mainwarning: picture is too late to be displayed (missing 1418 ms) mainwarning: picture is too late to be displayed (missing 1378 ms) mainwarning: picture is too late to be displayed (missing 1338 ms) mainwarning: picture is too late to be displayed (missing 1172 ms) mainwarning: picture is too late to be displayed (missing 1152 ms) So the stream cannot be played back in VLC. > I also suggest making sure that: > - streaming H.264 video only works OK (I think you've already done this) > - streaming AAC audio only works OK > before trying to stream both audio and video together. yes, I've tried that: both streams work OK when streamed separately. The problem only appears when they are streamed together. Btw, what is the correct sequence of closing the RTSP server and other classes from live555. I do it like this: I signal m_doneFlag. Everything is done on a dedicated thread and my class isn't exiting until it gets the close event (m_hCloseEvt, at the end), but still something is not cleaned up I can see it in the output window. m_pUsageEnv->taskScheduler().doEventLoop( &m_doneFlag ); // does not return // Close everything // if(m_pRtpVideoSink != NULL) { m_pRtpVideoSink->stopPlaying(); } m_pRtpVideoSink = NULL; Medium::close(m_pH264FramedSource); if(m_pRtpAudioSink != NULL) { m_pRtpAudioSink->stopPlaying(); } m_pRtpAudioSink = NULL; Medium::close(m_pAacFrameedSource); Medium::close(rtspServer); Medium::close(rtcpVideo); Medium::close(rtcpAudio); rtpVideoGroupsock.removeAllDestinations(); rtcpVideoGroupsock.removeAllDestinations(); rtpAudioGroupsock.removeAllDestinations(); rtcpAudioGroupsock.removeAllDestinations(); m_pUsageEnv->reclaim(); SetEvent( m_hCloseEvt ); -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Tue Apr 8 01:33:32 2014 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 8 Apr 2014 01:33:32 -0700 Subject: [Live-devel] AAC/H264 RTP Multicast timestamps In-Reply-To: <53431F6A.8060908@gmail.com> References: <53431F6A.8060908@gmail.com> Message-ID: <89794FEE-3219-4501-97BF-E4BA029ADCEE@live555.com> > Hmm, it seems so. I've tried it with MPlayer for windows and it is working fine. The streams are practically unplayable in VLC. Sorry, but we can't help you with problems with VLC; VLC is not our software. > Btw, what is the correct sequence of closing the RTSP server and other classes from live555. The easiest way is just to call: exit(0); However, if (for some reason) you want to keep the process around, even after you've destroyed the server, you can do so by: - Calling "Medium::close()" on all of your 'source', 'sink', and 'RTCPInstance' objects. - Calling "Medium::close()" on your RTSP server - Deleting your 'Groupsock' objects. - Calling env->reclaim(); - Doing delete scheduler; Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From goran.ambardziev at gmail.com Tue Apr 8 14:43:33 2014 From: goran.ambardziev at gmail.com (Goran Ambardziev) Date: Tue, 08 Apr 2014 23:43:33 +0200 Subject: [Live-devel] AAC/H264 Closing Down Message-ID: <53446D85.3040501@gmail.com> > >>/ Btw, what is the correct sequence of closing the RTSP server and other classes from live555. > / > The easiest way is just to call: > exit(0); > > However, if (for some reason) you want to keep the process around, even after you've destroyed the server, you can do so by: > - Calling "Medium::close()" on all of your 'source', 'sink', and 'RTCPInstance' objects. > - Calling "Medium::close()" on your RTSP server > - Deleting your 'Groupsock' objects. > - Calling env->reclaim(); > - Doing delete scheduler; I need the process to stay alive after I'm done with streaming, so I cannot call exit(0) I do closing like this. Signal done flag. Call stop playing on Sinks. Close sources, sinks, RTCPInstance objects etc. I get an access violation error when closing VideoSink (m_pRtpVideoSink). The error is in method "void FramedSource::stopGettingFrames()" when calling "doStopGettingFrames*()*". Why could this be happening? m_pUsageEnv->taskScheduler().doEventLoop( &m_doneFlag ); // does not return // Close everything // if(m_pRtpVideoSink != NULL) { m_pRtpVideoSink->stopPlaying(); } if(m_pRtpAudioSink != NULL) { m_pRtpAudioSink->stopPlaying(); } Medium::close( m_pH264FramedSource ); Medium::close( m_pAacFrameedSource ); Medium::close( m_pRtpVideoSink ); Medium::close( m_pRtpAudioSink ); Medium::close(rtcp); Medium::close(rtcpAudio); Medium::close(rtspServer); rtpVideoGroupsock->removeAllDestinations(); rtcpVideoGroupsock->removeAllDestinations(); delete rtpVideoGroupsock; delete rtcpVideoGroupsock; rtpAudioGroupsock->removeAllDestinations(); rtcpAudioGroupsock->removeAllDestinations(); delete rtpAudioGroupsock; delete rtcpAudioGroupsock; m_pUsageEnv->reclaim(); delete scheduler; -------------- next part -------------- An HTML attachment was scrubbed... 
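For comparison, here is a minimal teardown sketch that keeps the same member names as the code above but reverses the ordering, stopping and closing the 'sink' objects before their 'source' objects (see the reply that follows); whether this alone resolves a given crash of course depends on the rest of the application:

    // Sketch only: shut down after doEventLoop() has returned.
    if (m_pRtpVideoSink != NULL) m_pRtpVideoSink->stopPlaying();
    if (m_pRtpAudioSink != NULL) m_pRtpAudioSink->stopPlaying();

    Medium::close(m_pRtpVideoSink);        // sinks first...
    Medium::close(m_pRtpAudioSink);
    Medium::close(m_pH264FramedSource);    // ...then their sources
    Medium::close(m_pAacFrameedSource);

    Medium::close(rtcp);                   // RTCP instances
    Medium::close(rtcpAudio);
    Medium::close(rtspServer);             // then the RTSP server itself

    delete rtpVideoGroupsock;              // Groupsock objects are deleted, not 'closed'
    delete rtcpVideoGroupsock;
    delete rtpAudioGroupsock;
    delete rtcpAudioGroupsock;

    m_pUsageEnv->reclaim();                // finally the environment...
    delete scheduler;                      // ...and its task scheduler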
URL: From finlayson at live555.com Tue Apr 8 16:46:57 2014 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 8 Apr 2014 16:46:57 -0700 Subject: [Live-devel] AAC/H264 Closing Down In-Reply-To: <53446D85.3040501@gmail.com> References: <53446D85.3040501@gmail.com> Message-ID: <363A306C-3DCB-48C0-BAF4-67246B4001E8@live555.com> > I get an access violation error when closing VideoSink (m_pRtpVideoSink). Try closing the 'sinks' before the 'sources'. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jan.ekholm at d-pointer.com Wed Apr 9 07:37:38 2014 From: jan.ekholm at d-pointer.com (Jan Ekholm) Date: Wed, 9 Apr 2014 17:37:38 +0300 Subject: [Live-devel] Tracking down latency Message-ID: Hi, I've with some effort managed to get a solution that uses Live555 to handle the streaming of video from a USB webcam for a simple application. It all works quite well but I see a lot of latency and need to start tracking it down. My pipeline currently is: camera -> OpenCV -> x264 -> Live555 -> network (localhost) -> VLC. For this I see a latency of about 1.5s, which is quite a lot. So far I've managed to time the grabbing and encoding part and it seems to be ~50ms per frame. The network part should be really minimal as it's all running on one machine right now. VLC here is an unknown beast, as it's really hard to know what it's doing and how much it actually buffers and adds to the latency that way. What I'm now interested in is if there is some way to debug latency within Live555 to make sure the latency is not introduced there. What I've done so far is trying to compare the presentation times of the encoded frames with what a slightly modified testRTSPClient shows. My grabbing and encoding is done in a FramedSource subclass in doGetNextFrame(). There the NAL:s for the captured frame are all given the same presentation time based on: gettimeofday( &m_currentTime, NULL ); I print this time along with some other data: H264FramedSource::doGetNextFrame: frame done in 30 ms, queue size: 9, time: 1397053387.439 H264FramedSource::doGetNextFrame: frame done in 0 ms, queue size: 8, time: 1397053387.439 H264FramedSource::doGetNextFrame: frame done in 0 ms, queue size: 7, time: 1397053387.439 ... The first one did the grabbing and encoding, creating 9 NAL units into a queue, the following doGetNextFrame() simply feed the already created NAL:s and are thus much faster. The time is the presentation time as retrieved above. Then I use a slightly modified testRTSPClient to receive the stream, and the first frames that arrive looks like: Stream "rtsp://192.168.1.12:8554/camera0/"; video/H264: Received 5 bytes. Presentation time: 1397053387.439622 now: 1397053387.439772 Stream "rtsp://192.168.1.12:8554/camera0/"; video/H264: Received 604 bytes. Presentation time: 1397053387.439622 now: 1397053387.439794 Stream "rtsp://192.168.1.12:8554/camera0/"; video/H264: Received 8224 bytes. Presentation time: 1397053387.439622 now: 1397053387.439924 ... The presentation time is the same as I set, the "now" is simply an added gettimeofday() call for when the DummySink::afterGettingFrame() got called. Seems testRTSPClient gets the frames pretty fast (less than 1ms) after they've been passed to Live555 in the server end. I was not expecting it to be that fast, and that makes me doubt my logic. Is it this simple, do the frame arrive this fast at the client? 
If so then the latency from the camera to a testRTSPClient is not too big, far from the 1.5s I see with VLC (Mplayer on OSX just breaks apart but it seems that the broken image is just as late there). The testRTSPClient of course does not include any decoding and similar that needs to be done, but that can't be too many milliseconds per frame. Am I thus right in assuming almost all of my latency comes from the VLC side? My next test would be to extend testRTSPClient to provide a real decoding sink in place of the DummySink, but I'd like to avoid doing that if possible. Any ideas? I'm more than happy to be lectured on my bad ways of measuring time and latency. :) Best regards, Jan Ekholm From karnaukhov at physics.msu.ru Wed Apr 9 08:18:36 2014 From: karnaukhov at physics.msu.ru (=?KOI8-R?B?4czFy9PFyiDhzMXL08XF18neIOvB0s7B1cjP1w==?=) Date: Wed, 9 Apr 2014 19:18:36 +0400 Subject: [Live-devel] One-frame delay in h264 streaming app In-Reply-To: <53CED704-1A62-4C63-A658-704E3A1C00A7@live555.com> References: <53CED704-1A62-4C63-A658-704E3A1C00A7@live555.com> Message-ID: Thanks, I follow your recommendation and now I get almost zero delay. Problem was when I tried to use ..DiscreteFramer, I`ve got a lot of those errors about "MPEG start code found...", even if I use annex_b=0 flag in x264 settings. Only when I truncate first 4 bytes of every NAL unit all begins to work. And again, stakoverflow saved my time... I dont pretend on first rate skills) http://stackoverflow.com/questions/19427576/live555-x264-stream-live-source-based-on-testondemandrtspserver 2014-04-07 22:11 GMT+04:00 Ross Finlayson : > I`m implementing low-latency h264 rtsp source. Based on this code > > http://stackoverflow.com/questions/13863673/how-to-write-a-live555-framedsource-to-allow-me-to-stream-h-264-live > > > Oh please! Only second-rate programmers use 'stackoverflow.com'. > > The proper web site for documentation on the "LIVE555 Streaming Media" > code is . In particular, you should read > http://www.live555.com/liveMedia/faq.html#liveInput > and > http://www.live555.com/liveMedia/faq.html#liveInput-unicast > > > I wrote subclass of FramedSource and able to watch this stream in my > client app. But even if I put 1 frame per second setting in x264 codec, I > have exactly one frame-delayed data in my client. This delay is 100% on > server side, because MultiFramedRTPSink::sendPacketIfNecessary() sending > compressed data of previous frame (i just compare first several bytes of > data in fOutBuf with those I wrote in fTo). > > > I haven't looked at your code in detail, but you should note that once a > H.264 NAL unit is delivered to the downstream > "H264VideoStreamDiscreteFramer" object (using > "FramedSource::afterGetting()"), this delivery will take place immediately. > Similarly, the delivery from the "H264VideoStreamDiscreteFramer" to *its* > downstream object - a "H264VideoRTPSink" - will also take place > immediately. There is no '1 frame (NAL unit) delay'. > > (Note also that for H.264 video streaming, because you are delivering > discrete NAL units (i.e., one at a time), your NAL unit source class *must* > feed into a "H264VideoStreamDiscreteFramer" object (*not* a > "H264VideoStreamFramer"), and each NAL unit that's delivered *must not* > begin with a 0x00000001 'start code'.) 
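To make that concrete, here is a minimal sketch of the delivery step for a discrete-NAL-unit source (the names are hypothetical - 'nal.p_payload'/'nal.i_payload' follow the x264_nal_t style used earlier in this thread - and the start-code check assumes Annex B style 3- or 4-byte prefixes, which is what x264 emits when annex_b is enabled):

    // Sketch only: deliver one encoded NAL unit to the downstream
    // H264VideoStreamDiscreteFramer, stripping any Annex B start code first.
    uint8_t* data = nal.p_payload;   // hypothetical: one NAL unit from the encoder
    unsigned size = nal.i_payload;

    if (size >= 4 && data[0] == 0 && data[1] == 0 && data[2] == 0 && data[3] == 1) {
      data += 4; size -= 4;          // 0x00000001 prefix
    } else if (size >= 3 && data[0] == 0 && data[1] == 0 && data[2] == 1) {
      data += 3; size -= 3;          // 0x000001 prefix
    }

    if (size > fMaxSize) {           // never write more than fMaxSize bytes into fTo
      fNumTruncatedBytes = size - fMaxSize;
      fFrameSize = fMaxSize;
    } else {
      fNumTruncatedBytes = 0;
      fFrameSize = size;
    }
    memcpy(fTo, data, fFrameSize);
    gettimeofday(&fPresentationTime, NULL);

    // Because this runs from within doGetNextFrame() and returns straight to the
    // event loop, a zero-delay scheduleDelayedTask() is unnecessary here:
    FramedSource::afterGetting(this);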
> > > nextTask() = envir().taskScheduler().scheduleDelayedTask(0, > (TaskFunc*)FramedSource::afterGetting, this); > > > Because your code will already be returning to the event loop immediately > afterwards, note that you can simplify this to > FramedSource::afterGetting(this); > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From chris at gotowti.com Wed Apr 9 08:35:43 2014 From: chris at gotowti.com (Chris Richardson (WTI)) Date: Wed, 9 Apr 2014 08:35:43 -0700 Subject: [Live-devel] Tracking down latency In-Reply-To: References: Message-ID: <001201cf5409$60859600$2190c200$@com> Hello, >VLC here is an unknown beast, as it's really hard to know what it's doing >and how much it actually buffers and adds to the latency that way. By default, VLC is using a buffer of 1 second, so it could be that most of your latency comes from there. To change this, open the Preferences dialog, select All under the Show Settings group at the bottom left, then select Input/Codecs from the tree view on the left. Then on the right hand side, scroll way down near the bottom and edit the field "Network Caching (ms)". You can experiment with lower settings; eventually VLC will stop playing any frames altogether because its buffer will be too small. Regards, Chris Richardson WTI From jan.ekholm at d-pointer.com Wed Apr 9 09:00:34 2014 From: jan.ekholm at d-pointer.com (Jan Ekholm) Date: Wed, 9 Apr 2014 19:00:34 +0300 Subject: [Live-devel] Tracking down latency In-Reply-To: <001201cf5409$60859600$2190c200$@com> References: <001201cf5409$60859600$2190c200$@com> Message-ID: <459BBCFF-D153-486B-A8A6-167488D16EA4@d-pointer.com> On 9 apr 2014, at 18:35, Chris Richardson (WTI) wrote: > Hello, > >> VLC here is an unknown beast, as it's really hard to know what it's doing >> and how much it actually buffers and adds to the latency that way. > > By default, VLC is using a buffer of 1 second, so it could be that most of > your latency comes from there. To change this, open the Preferences dialog, > select All under the Show Settings group at the bottom left, then select > Input/Codecs from the tree view on the left. Then on the right hand side, > scroll way down near the bottom and edit the field "Network Caching (ms)". > You can experiment with lower settings; eventually VLC will stop playing any > frames altogether because its buffer will be too small. Argh, so that's the setting to tweak. I didn't find that one and "tweaked" some other buffer settings that likely then did nothing. Yes, the value is 1000 ms by default and lowering it makes things better. Going down to 200 ms causes the playback to not really work at all and with 400 ms there are a lot of "picture is too late to be displayed" errors. With 500 ms caching it seems to work without any errors. That's really too much delay for my use case though, but good to know where it comes from. Thank you for pointing it out to me! 
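(For reference, the same buffer can also be set when launching VLC from the command line, with the value in milliseconds - e.g. "vlc --network-caching=500 rtsp://192.168.1.12:8554/camera0/" - which makes it easier to experiment with different values.)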
-- Jan Ekholm jan.ekholm at d-pointer.com From rglobisch at csir.co.za Wed Apr 9 09:10:04 2014 From: rglobisch at csir.co.za (Ralf Globisch) Date: Wed, 09 Apr 2014 18:10:04 +0200 Subject: [Live-devel] Tracking down latency In-Reply-To: <001201cf5409$60859600$2190c200$@com> References: <001201cf5409$60859600$2190c200$@com> Message-ID: <53458CFC0200004D000A034E@pta-emo.csir.co.za> In addition to what Chris says, you need to configure x264 to be in low latency (zerolatency) mode. However as this is a live555 mailing list, you should enquire further on the appropriate mailing lists. >>> "Chris Richardson (WTI)" 04/09/14 5:40 PM >>> Hello, >VLC here is an unknown beast, as it's really hard to know what it's doing >and how much it actually buffers and adds to the latency that way. By default, VLC is using a buffer of 1 second, so it could be that most of your latency comes from there. To change this, open the Preferences dialog, select All under the Show Settings group at the bottom left, then select Input/Codecs from the tree view on the left. Then on the right hand side, scroll way down near the bottom and edit the field "Network Caching (ms)". You can experiment with lower settings; eventually VLC will stop playing any frames altogether because its buffer will be too small. Regards, Chris Richardson WTI _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel -- This message is subject to the CSIR's copyright terms and conditions, e-mail legal notice, and implemented Open Document Format (ODF) standard. The full disclaimer details can be found at http://www.csir.co.za/disclaimer.html. This message has been scanned for viruses and dangerous content by MailScanner, and is believed to be clean. Please consider the environment before printing this email. -- This message is subject to the CSIR's copyright terms and conditions, e-mail legal notice, and implemented Open Document Format (ODF) standard. The full disclaimer details can be found at http://www.csir.co.za/disclaimer.html. This message has been scanned for viruses and dangerous content by MailScanner, and is believed to be clean. Please consider the environment before printing this email. From amir.lasmar.marzouk at gmail.com Thu Apr 10 08:53:15 2014 From: amir.lasmar.marzouk at gmail.com (Amir Marzouk) Date: Thu, 10 Apr 2014 17:53:15 +0200 Subject: [Live-devel] Question Message-ID: hello, I'm looking for the function that makes the call to control play with a definite time eg play from one minute cordially, -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Apr 10 18:25:46 2014 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 10 Apr 2014 18:25:46 -0700 Subject: [Live-devel] Question In-Reply-To: References: Message-ID: > I'm looking for the function that makes the call to control play with a definite time > eg play from one minute Your question wasn't clear, but I presume that you're asking about a *RTSP client* function. (Remember that the code also implements a RTSP server, and a RTSP proxy.) The function is RTSPClient::sendPlayCommand() Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
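To make the answer above concrete, a minimal sketch (the handler and variable names follow the "testRTSPClient" convention and are otherwise hypothetical); the 'start' parameter of "RTSPClient::sendPlayCommand()" is a position in seconds, so 'play from one minute' looks like:

    // After SETUP has succeeded on the session's subsessions:
    rtspClient->sendPlayCommand(*session, continueAfterPLAY, 60.0 /* start at 1 minute */);

where "continueAfterPLAY" is the application's asynchronous response handler, as in "testRTSPClient".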
URL: From amir.lasmar.marzouk at gmail.com Mon Apr 14 07:24:38 2014 From: amir.lasmar.marzouk at gmail.com (Amir Marzouk) Date: Mon, 14 Apr 2014 16:24:38 +0200 Subject: [Live-devel] How to redefine "FD_SETSIZE" Message-ID: hello, I'm looking for how to increase the number of concurrent clients. I found in this link that i need to redefine "FD_SETSIZE". but I have not found how and where? http://www.live555.com/liveMedia/faq.html#scalability thank you for help cordially, -------------- next part -------------- An HTML attachment was scrubbed... URL: From warren at etr-usa.com Mon Apr 14 13:17:37 2014 From: warren at etr-usa.com (Warren Young) Date: Mon, 14 Apr 2014 14:17:37 -0600 Subject: [Live-devel] How to redefine "FD_SETSIZE" In-Reply-To: References: Message-ID: <534C4261.3020505@etr-usa.com> On 4/14/2014 08:24, Amir Marzouk wrote: > > I'm looking for how to increase the number of concurrent clients. I > found in this link that i need to redefine "FD_SETSIZE". but I have not > found how and where? This is pretty basic stuff, a thing you should know as a programmer regardless of your use of Live555. If you're using Makefiles: CFLAGS=$(CFLAGS) -DFD_SETSIZE=2048 That is, just append it to the value of CFLAGS or CXXFLAGS. If you're using some other build system, you just need to find out where global #defines are normally set. You can do this in a header, too: #define FD_SETSIZE 2048 ...but you have to make certain it's #included before things like netinet/in.h, arpa/inet.h, winsock.h, etc. Doing it at the build system level is more certain to override the platform default. From demthedj at gmail.com Mon Apr 14 13:37:48 2014 From: demthedj at gmail.com (Sergey Kuprienko) Date: Mon, 14 Apr 2014 23:37:48 +0300 Subject: [Live-devel] How to redefine "FD_SETSIZE" In-Reply-To: <534C4261.3020505@etr-usa.com> References: <534C4261.3020505@etr-usa.com> Message-ID: Anyway, it does not affect kernel, if you're using linux. 15.04.2014 4:22 ???????????? "Warren Young" ???????: > On 4/14/2014 08:24, Amir Marzouk wrote: > >> >> I'm looking for how to increase the number of concurrent clients. I >> found in this link that i need to redefine "FD_SETSIZE". but I have not >> found how and where? >> > > This is pretty basic stuff, a thing you should know as a programmer > regardless of your use of Live555. > > If you're using Makefiles: > > CFLAGS=$(CFLAGS) -DFD_SETSIZE=2048 > > That is, just append it to the value of CFLAGS or CXXFLAGS. > > If you're using some other build system, you just need to find out where > global #defines are normally set. > > You can do this in a header, too: > > #define FD_SETSIZE 2048 > > ...but you have to make certain it's #included before things like > netinet/in.h, arpa/inet.h, winsock.h, etc. Doing it at the build system > level is more certain to override the platform default. > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > -------------- next part -------------- An HTML attachment was scrubbed... URL: From vikram at vizexperts.com Tue Apr 15 04:17:52 2014 From: vikram at vizexperts.com (Vikram Singh) Date: Tue, 15 Apr 2014 16:47:52 +0530 Subject: [Live-devel] Frames are corrupted Message-ID: <000901cf589c$59000630$0b001290$@vizexperts.com> Hi Everyone, I am trying to stream the frames I capture from OpenGL frame buffer. I am using LIVE555 library code for that and I am trying to stream it using RTSP. I encode the frames to H264 format. 
I have written a custom class "LiveSourceWithx264" which derives from "FramedSource". This class has the function "doGetNextFrame()" and it has the following code. gettimeofday(¤tTime,NULL); if(!isCurrentlyAwaitingData()) return; void* buf = NULL; buf = RTSPstreamQueue->getBuf( fFrameSize, false ); if ( fFrameSize > fMaxSize ) { fNumTruncatedBytes = fFrameSize - fMaxSize; fFrameSize = fMaxSize; } fPresentationTime = currentTime; memmove(fTo, buf, fFrameSize); RTSPstreamQueue->releaseBuf(); FramedSource::afterGetting(this); In the above code RTSPstreamQueue function returns a pointer to the frame and the sets the size of the frame in the variable fFrameSize. But if fFrameSize is greater than fMaxSize then I have to truncate the data. I think this is what that is causing the corrupted frames. Please take a look at the attachments to get an idea how the frames are getting corrupted. Is there any way to get around this problem. Any pointer would be appretiated. Thanks Vikram Singh. -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: Client.png Type: image/png Size: 324526 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: Server.png Type: image/png Size: 119080 bytes Desc: not available URL: From david.bueno at scati.com Tue Apr 15 06:33:48 2014 From: david.bueno at scati.com (David Bueno Monge) Date: Tue, 15 Apr 2014 15:33:48 +0200 Subject: [Live-devel] low perfomance in "onDemandRTSPServer" when connecting with multiple clients Message-ID: Hi, I have developed a RTSP server based on the example found in testOnDemandRTSPServer.cpp. The application can stream data from files and also from different live sources (to do that i'm using an RTSP client which i developed some time ago, also based on the Live555 library). I have implemented my own "FramedSource" subclass and differents subclasses of "OnDemandServerMediaSubsession" depending on the video encoding. The application works fine when just one client is connected to the server, even with high resolution videos. However, when more clients connect to the different sessions added to the RTSPServer instance of my server, the performance decreases a lot. I am talking that just with two clients connected, the rate at which the server stream the input of two different sources is really low. There is no network problem, this happens also in a local scenario (server and clients running in the same machine). The %CPU and memory used by the server are very low also, so, the problem does not come from the machine running the server. Is there anything that can be done to increase the performance?? Some kind of configuration i could be missing?? Thanks. -- *David Bueno Monge*Software Engineer Skype dbueno_scati *------------------------------*[image: http://www.scati.com] T +34 902 116 095 F +34 976 466 580 *------------------------------* Bari, 23 Plataforma Log?stica PLAZA 50.197 Zaragoza (Spain) *www.scati.com * *------------------------------* *Disclaimer:* This e-mail (including any attached documents) is proprietary and confidential and may contain legally privileged information. It is intended for the named recipient(s) only. If you are not the intended recipient, you may not review, retain, copy or distribute this message, and we kindly ask you to notify the sender by e-mail immediately and delete this message from your system. 
*Please consider your environmental responsibility before printing this e-mail.* -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: image/png Size: 9971 bytes Desc: not available URL: From finlayson at live555.com Tue Apr 15 07:56:10 2014 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 15 Apr 2014 07:56:10 -0700 Subject: [Live-devel] Frames are corrupted In-Reply-To: <000901cf589c$59000630$0b001290$@vizexperts.com> References: <000901cf589c$59000630$0b001290$@vizexperts.com> Message-ID: <84AAAB6A-2131-4E61-A6B2-872D9B52E2F2@live555.com> > But if fFrameSize is greater than fMaxSize then I have to truncate the data. > I think this is what that is causing the corrupted frames. Yes, because if a frame has to be truncated, then the truncated data will be lost (i.e., not sent to the client). > Is there any way to get around this problem. Yes, there are two possible solutions. 1/ The best solution is to not send such large NAL units. Reconfigure your encoder to break up 'key frames' into multiple (therefore much smaller) 'slice' NAL units. 2/ Alternatively (though not as good), you can increase the size of the server's output buffer. Try adding the following line to your application - at the start of "main()": OutPacketBuffer::maxSize = 100000; and recompile. If that doesn't work, try increasing to 150000, 200000, etc., depending on the size of your frames. It's important to understand, though, that this is a bad solution. See: http://lists.live555.com/pipermail/live-devel/2013-April/016805.html 1/ is a *much* better solution - i.e., decrease the size of the NAL units that you're streaming. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Apr 15 08:09:43 2014 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 15 Apr 2014 08:09:43 -0700 Subject: [Live-devel] low perfomance in "onDemandRTSPServer" when connecting with multiple clients In-Reply-To: References: Message-ID: > I have developed a RTSP server based on the example found in testOnDemandRTSPServer.cpp. The application can stream data from files and also from different live sources (to do that i'm using an RTSP client which i developed some time ago, also based on the Live555 library). FYI, you could also use our "testRTSPClient" demo application for this. It can stream from multiple "rtsp://" URLs concurrently. > I have implemented my own "FramedSource" subclass and differents subclasses of "OnDemandServerMediaSubsession" depending on the video encoding. > > The application works fine when just one client is connected to the server, even with high resolution videos. However, when more clients connect to the different sessions added to the RTSPServer instance of my server, the performance decreases a lot. > > I am talking that just with two clients connected, the rate at which the server stream the input of two different sources is really low. > > There is no network problem, this happens also in a local scenario (server and clients running in the same machine). > > The %CPU and memory used by the server are very low also, so, the problem does not come from the machine running the server. That's strange. I suspect that the problem is (somehow) related to your 'data source' implementation - i.e., your "FramedSource" subclass. 
What happens when you don't just 'base' your server on "testOnDemandRTSPServer", but actually use the (original, unmodified) "testOnDemandRTSPServer" code? What happens when two clients request the same stream (from a file)? And do things change at all when you change your server to use a single input source for all concurrent clients - i.e., if you change line 29 of "testOnDemandRTSPServer.cpp" to Boolean reuseFirstSource = True; ? Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jshanab at jfs-tech.com Tue Apr 15 09:23:13 2014 From: jshanab at jfs-tech.com (Jeff Shanab) Date: Tue, 15 Apr 2014 12:23:13 -0400 Subject: [Live-devel] Frames are corrupted In-Reply-To: <000901cf589c$59000630$0b001290$@vizexperts.com> References: <000901cf589c$59000630$0b001290$@vizexperts.com> Message-ID: Check Byte Alignment,Pixel format, and encoder slices. How are you encodeing them? If the frames are large, then the encoder may spit out more than one frame with the same timestamp and differnt sequence number. Nal units may be [7][8][5][5][5][1][1][1]... instead of simply [7][8][5][1][1][1].... There is also the original sampling pixle buffer stuff Is it YUV 4:2:2 or 4:4:1 http://www.fourcc.org/yuv.php Since this determines the layout in memory for each frame it could show up this way. Just looking at the images. it almost looks like the byte alignment is not being honoured and the frame is cut off. I do not see loss of color levels which happens when you get all the Y and miss the end of the frame that is packed [ydata][udata][vdata] It looks instead like first the lines are not an exact power of two in length and the pading is missed causeing the image to shear to the right a bit each line. It can cause you to run out of pixels if they are off in some byte-alignment off-screen area and then the rest of the image just shows tha last known pixels over and over. On Tue, Apr 15, 2014 at 7:17 AM, Vikram Singh wrote: > Hi Everyone, > > > > I am trying to stream the frames I capture from OpenGL frame buffer. > > I am using LIVE555 library code for that and I am trying to stream it > using RTSP. > > I encode the frames to H264 format. > > I have written a custom class "*LiveSourceWithx264*" which derives from " > *FramedSource*". > > This class has the function "*doGetNextFrame()*" and it has the > following code. > > > > *gettimeofday(¤tTime,NULL);* > > * if(!isCurrentlyAwaitingData()) return;* > > > > * void* buf = NULL;* > > * buf = RTSPstreamQueue->getBuf( fFrameSize, false );* > > > > *if ( fFrameSize > fMaxSize )* > > * {* > > * fNumTruncatedBytes = fFrameSize - fMaxSize;* > > * fFrameSize = fMaxSize;* > > * }* > > > > * fPresentationTime = currentTime;* > > * memmove(fTo, buf, fFrameSize);* > > > > * RTSPstreamQueue->releaseBuf();* > > * FramedSource::afterGetting(this);* > > > > In the above code *RTSPstreamQueue *function returns a pointer to the > frame and the sets the size of the frame in the variable fFrameSize. > > > > But if fFrameSize is greater than fMaxSize then I have to truncate the > data. > > I think this is what that is causing the corrupted frames. > > Please take a look at the attachments to get an idea how the frames are > getting corrupted. > > > > Is there any way to get around this problem. > > Any pointer would be appretiated. > > > > Thanks > > Vikram Singh. 
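On the 'encoder slices' point above (and the earlier suggestion in this thread to break 'key frames' into multiple smaller 'slice' NAL units): with x264 this is usually a one-line configuration change. A hedged sketch - the exact cap is application-dependent and this is not taken from the original poster's code:

    #include <x264.h>

    void configureSlicing(x264_param_t& param) {
      // Cap each slice NAL unit at roughly one RTP payload's worth of bytes, so that
      // even large key frames are emitted as several small NAL units rather than one
      // very large one that would overflow the server's output buffer.
      param.i_slice_max_size = 1300;   // bytes per slice NAL unit (0 = unlimited)
    }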
> > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From vikram at vizexperts.com Fri Apr 18 08:59:09 2014 From: vikram at vizexperts.com (Vikram Singh) Date: Fri, 18 Apr 2014 21:29:09 +0530 Subject: [Live-devel] Frames are corrupted In-Reply-To: <84AAAB6A-2131-4E61-A6B2-872D9B52E2F2@live555.com> References: <000901cf589c$59000630$0b001290$@vizexperts.com> <84AAAB6A-2131-4E61-A6B2-872D9B52E2F2@live555.com> Message-ID: <000301cf5b1f$23855c50$6a9014f0$@vizexperts.com> Hi Ross, - I am still not able to resolve the issue. - I would like to elaborate on my issue. Basically I want to stream frames I grab from an OpenGL window, and I get these frames in encoded H.264 format. - The following is my code for the thread that starts RTSP streaming. int RTSPStreamer::Main() { TaskScheduler* taskSchedular = BasicTaskScheduler::createNew(); BasicUsageEnvironment* usageEnvironment = BasicUsageEnvironment::createNew(*taskSchedular); RTSPServer* rtspServer = RTSPServer::createNew(*usageEnvironment, 8554, NULL); if(rtspServer == NULL) { *usageEnvironment << "Failed to create rtsp server ::" << usageEnvironment->getResultMsg() <<"\n"; exit(1); } std::string streamName = "usb1.mkv"; ServerMediaSession* sms = ServerMediaSession::createNew(*usageEnvironment, streamName.c_str(), streamName.c_str(), "Live H264 Stream"); H264LiveServerMediaSession *liveSubSession = H264LiveServerMediaSession::createNew(*usageEnvironment, true, rtspStreamQueue); sms->addSubsession(liveSubSession); rtspServer->addServerMediaSession(sms); char* url = rtspServer->rtspURL(sms); *usageEnvironment << "Play the stream using url " << url << "\n"; taskSchedular->doEventLoop(); return 0; } - I have written a custom class H264LiveServerMediaSession which gets frames from rtspStreamQueue. - In H264LiveServerMediaSession the function createNewStreamSource() creates an instance of LiveSourceWithx264 which is basically responsible for getting the frame from rtspStreamQueue. - In the function deliverFrame(), fMaxSize will start at 15000. - In the next call to deliverFrame(), fMaxSize will be 15000 minus the size of the frame in the previous call. - I think this should not happen. - Can you suggest what I did wrong? I have attached the files H264LiveServerMediaSession and LiveSourceWithx264. - I am new to live555, so please help me out. Thanks and Regards, Vikram Singh. From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Tuesday, April 15, 2014 8:26 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Frames are corrupted But if fFrameSize is greater than fMaxSize then I have to truncate the data. I think this is what that is causing the corrupted frames. Yes, because if a frame has to be truncated, then the truncated data will be lost (i.e., not sent to the client). Is there any way to get around this problem. Yes, there are two possible solutions. 1/ The best solution is to not send such large NAL units. Reconfigure your encoder to break up 'key frames' into multiple (therefore much smaller) 'slice' NAL units. 2/ Alternatively (though not as good), you can increase the size of the server's output buffer. Try adding the following line to your application - at the start of "main()": OutPacketBuffer::maxSize = 100000; and recompile.
If that doesn't work, try increasing to 150000, 200000, etc., depending on the size of your frames. It's important to understand, though, that this is a bad solution. See: http://lists.live555.com/pipermail/live-devel/2013-April/016805.html 1/ is a *much* better solution - i.e., decrease the size of the NAL units that you're streaming. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: LiveSourceWithx264.cpp URL: -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: H264LiveServerMediaSession.cpp URL: From finlayson at live555.com Fri Apr 18 12:11:29 2014 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 18 Apr 2014 12:11:29 -0700 Subject: [Live-devel] Frames are corrupted In-Reply-To: <000301cf5b1f$23855c50$6a9014f0$@vizexperts.com> References: <000901cf589c$59000630$0b001290$@vizexperts.com> <84AAAB6A-2131-4E61-A6B2-872D9B52E2F2@live555.com> <000301cf5b1f$23855c50$6a9014f0$@vizexperts.com> Message-ID: <14A19130-FF1B-4D97-871D-915B6A88AA52@live555.com> > - I am still not able to resolve the issue. That's because you didn't apply *either* of the two possible solutions that I gave you in my earlier response! Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From vikram at vizexperts.com Fri Apr 18 22:49:50 2014 From: vikram at vizexperts.com (Vikram Singh) Date: Sat, 19 Apr 2014 11:19:50 +0530 Subject: [Live-devel] Frames are corrupted In-Reply-To: <14A19130-FF1B-4D97-871D-915B6A88AA52@live555.com> References: <000901cf589c$59000630$0b001290$@vizexperts.com> <84AAAB6A-2131-4E61-A6B2-872D9B52E2F2@live555.com> <000301cf5b1f$23855c50$6a9014f0$@vizexperts.com> <14A19130-FF1B-4D97-871D-915B6A88AA52@live555.com> Message-ID: <000c01cf5b93$2eb333d0$8c199b70$@vizexperts.com> Hi Ross, I tried your approach. But I still have the same issue. - I made OutPacketBuffer::maxSize = 900000; - I return an object of H264VideoStreamFramer in the function createNewStreamSource(). The comments in the header says "A filter that breaks up a H.264 Video Elementary Stream into NAL units". - The size of these NAL units range upto 5000 bytes in my case. - I think the issue is with fMaxSize in deliverFrame() which starts from 150000 and goes to zero which should not happen. Please take a look at the function doGetNextFrame() in which I call deliverFrame(). - If this happen then there will be loss of data each time the fMaxSize goes to zero, which does happen consistently and frequently. From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Saturday, April 19, 2014 12:41 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Frames are corrupted - I am still not able to resolve the issue. That's because you didn't apply *either* of the two possible solutions that I gave you in my earlier response! Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Fri Apr 18 23:07:36 2014 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 18 Apr 2014 23:07:36 -0700 Subject: [Live-devel] Frames are corrupted In-Reply-To: <000c01cf5b93$2eb333d0$8c199b70$@vizexperts.com> References: <000901cf589c$59000630$0b001290$@vizexperts.com> <84AAAB6A-2131-4E61-A6B2-872D9B52E2F2@live555.com> <000301cf5b1f$23855c50$6a9014f0$@vizexperts.com> <14A19130-FF1B-4D97-871D-915B6A88AA52@live555.com> <000c01cf5b93$2eb333d0$8c199b70$@vizexperts.com> Message-ID: <6BA71742-B5FE-4086-8240-B781BAECDFD3@live555.com> > - I think the issue is with fMaxSize in deliverFrame() which starts from 150000 and goes to zero which should not happen. OK, your problem is that - in your implementation of the "createNewStreamSource()" virtual function - you are feeding a "LiveSourceWithx264" object into a "H264VideoStreamFramer". This is wrong, because "H264VideoStreamFramer" is used only when parsing a H.264 *byte stream*, not a discrete sequence of NAL units. Instead, you should be feeding a "LiveSourceWithx264" into a "H264VideoStreamDiscreteFramer". Also, it's important that the H.264 NAL units that you deliver - from the "LiveSourceWithx264" - into the "H264VideoStreamDiscreteFramer" *not* begin with a 0x00 0x00 0x00 0x01 'start code'. I.e., you need to remove these 4 bytes (and adjust "fFrameSize" accordingly) when you do the delivery. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From nambirajan.manickam at i-velozity.com Mon Apr 21 18:04:25 2014 From: nambirajan.manickam at i-velozity.com (nambirajan.manickam at i-velozity.com) Date: Mon, 21 Apr 2014 18:04:25 -0700 Subject: [Live-devel] Regarding trick mode files Message-ID: <20140421180425.fe3cde39dea1cf6284ae78ac84c5299d.7f023c91c7.wbe@email05.secureserver.net> An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Apr 21 21:09:07 2014 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 21 Apr 2014 21:09:07 -0700 Subject: [Live-devel] Regarding trick mode files In-Reply-To: <20140421180425.fe3cde39dea1cf6284ae78ac84c5299d.7f023c91c7.wbe@email05.secureserver.net> References: <20140421180425.fe3cde39dea1cf6284ae78ac84c5299d.7f023c91c7.wbe@email05.secureserver.net> Message-ID: > The output ( x4, x15, x60 and x300 ) of the MPEG4 HD file plays out ok in VLC. By this I presume you meant that you used the "testMPEG2TransportStreamTrickPlay" utility to generate Transport Stream files (corresponding to various fast-forward speeds of the original), and played these files locally using VLC. The Transport Stream data in these files are exactly the same as the data that the "LIVE555 Media Server" streams when asked to 'fast-forward' at the same speed. Therefore, the problem must be in either your STB clients, or the network. I suspect that at least part of the problem is that the 'faster' streams are being sent at a higher bitrate (because they consist only of large 'key frames'). Network congestion and/or packet loss may be the cause of some of the problems that you're seeing with your network clients. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
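Pulling the advice in the 'Frames are corrupted' thread together, a minimal sketch of the two "OnDemandServerMediaSubsession" virtual functions for a discrete-NAL-unit H.264 source (the class and variable names are hypothetical, not the original poster's actual code, and the bitrate estimate is arbitrary):

    FramedSource* H264LiveServerMediaSession::createNewStreamSource(
        unsigned /*clientSessionId*/, unsigned& estBitrate) {
      estBitrate = 500; // kbps; a rough estimate, used for RTCP
      // The live source delivers one NAL unit at a time, *without* a start code,
      // so it is wrapped in a H264VideoStreamDiscreteFramer (not a H264VideoStreamFramer):
      LiveSourceWithx264* liveSource = LiveSourceWithx264::createNew(envir(), rtspStreamQueue);
      return H264VideoStreamDiscreteFramer::createNew(envir(), liveSource);
    }

    RTPSink* H264LiveServerMediaSession::createNewRTPSink(
        Groupsock* rtpGroupsock, unsigned char rtpPayloadTypeIfDynamic,
        FramedSource* /*inputSource*/) {
      return H264VideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic);
    }

(If the source itself never emits SPS and PPS NAL units, a later message in this thread describes passing them, known in advance, to an "H264VideoRTPSink::createNew()" overload instead.)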
URL: From sajo at saranyu.in Mon Apr 21 22:00:33 2014 From: sajo at saranyu.in (Sajo John) Date: Tue, 22 Apr 2014 10:30:33 +0530 Subject: [Live-devel] Loop Streaming File to Live Message-ID: <5355F771.1010600@saranyu.in> Hi, I'm trying to figure out a method to loop stream a file continuously as a Live Stream over RTMP/RTSP. I'm a newbie to Live555 libraries and I'm not able to do figure how to go about this. Please help me with this. -- Thanks and Regards, Sajo John From finlayson at live555.com Mon Apr 21 22:11:03 2014 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 21 Apr 2014 22:11:03 -0700 Subject: [Live-devel] Loop Streaming File to Live In-Reply-To: <5355F771.1010600@saranyu.in> References: <5355F771.1010600@saranyu.in> Message-ID: > I'm trying to figure out a method to loop stream a file continuously as a Live Stream over RTMP/RTSP. We don't have anything to do with RTMP (that's a proprietary protocol). Perhaps you meant RTP But anyway, 'loop streaming' a file is not something that we support directly. You would need to write your own data source class (a subclass of "FramedSource") that did this. Note, however, that you didn't say what kind of file you want to stream. You couldn't do this for a Transport Stream file, for example, because it contains its own timestamp (PCR) information, which would be incorrect when you read the file a second (etc.) time - unless you explicitly changed the PCR timestamps each time. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From sajo at saranyu.in Mon Apr 21 22:53:14 2014 From: sajo at saranyu.in (Sajo John) Date: Tue, 22 Apr 2014 11:23:14 +0530 Subject: [Live-devel] Loop Streaming File to Live In-Reply-To: References: <5355F771.1010600@saranyu.in> Message-ID: <535603CA.7060703@saranyu.in> Hi, Thanks for your quick reply. I'm basically trying to do a file to live implementation that streams the video file continuously to RTP and once the EOF is reached it should continue streaming from the beginning of the video. It should be a continuous stream without any breaks/gaps while looping. I'm okay with using any suitable Video File Format. Perhaps you could suggest the right format and encoding (H264 etc.) that would be ideal for this case. So once I receive the file I will just convert it first to this ideal format and then use it in the implementation. Also, do you know about any sample code available that implements this? Thanks and Regards, Sajo On Tuesday 22 April 2014 10:41 AM, Ross Finlayson wrote: >> I'm trying to figure out a method to loop stream a file continuously >> as a Live Stream over RTMP/RTSP. > > We don't have anything to do with RTMP (that's a proprietary > protocol). Perhaps you meant RTP > > But anyway, 'loop streaming' a file is not something that we support > directly. You would need to write your own data source class (a > subclass of "FramedSource") that did this. Note, however, that you > didn't say what kind of file you want to stream. You couldn't do this > for a Transport Stream file, for example, because it contains its own > timestamp (PCR) information, which would be incorrect when you read > the file a second (etc.) time - unless you explicitly changed the PCR > timestamps each time. > > > Ross Finlayson > Live Networks, Inc. 
> http://www.live555.com/ > > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -- Thanks and Regards, Sajo John Software Developer Saranyu Technologies Pvt. Ltd. http://www.saranyu.in/ Disclaimer: This message contains confidential information and is intended only for the individual named. If you are not the named addressee you should not disseminate, distribute or copy this e-mail. Please notify the sender immediately by e-mail if you have received this e-mail by mistake and delete this e-mail from your system. -------------- next part -------------- An HTML attachment was scrubbed... URL: From jan.ekholm at d-pointer.com Tue Apr 22 04:11:02 2014 From: jan.ekholm at d-pointer.com (Jan Ekholm) Date: Tue, 22 Apr 2014 14:11:02 +0300 Subject: [Live-devel] Limiting frame rate for a source Message-ID: <977DC93D-F54D-499F-A02D-0B6854579280@d-pointer.com> Hi, I have a USB camera that I can stream as MJPEG using a JPEGVideoSource subclass. It all works nicely and the frames are streamed and received ok. I would however like to be able to limit the frame rate of the stream, as it now seems to be 20+ fps. In this case it's way too high as the use case is a surveillance camera that grabs a big overview image, not video conferencing. I tried looking at the FAQ and the examples but didn't see anything. Also, is there some way to know when all clients have disconnected from a RTSP source so that I could stop grabbing and encoding frames? It seems to work automatically for my H264 stream, but the MJPEG stream continues encoding forever. Both are handled using a OnDemandServerMediaSubsession subclass to create the source and the sink. I hope these questions make any sense and I'd be happy for any hints. Best regards, Jan Ekholm -- Jan Ekholm jan.ekholm at d-pointer.com From vikram at vizexperts.com Tue Apr 22 07:12:58 2014 From: vikram at vizexperts.com (Vikram Singh) Date: Tue, 22 Apr 2014 19:42:58 +0530 Subject: [Live-devel] Frames are corrupted In-Reply-To: <6BA71742-B5FE-4086-8240-B781BAECDFD3@live555.com> References: <000901cf589c$59000630$0b001290$@vizexperts.com> <84AAAB6A-2131-4E61-A6B2-872D9B52E2F2@live555.com> <000301cf5b1f$23855c50$6a9014f0$@vizexperts.com> <14A19130-FF1B-4D97-871D-915B6A88AA52@live555.com> <000c01cf5b93$2eb333d0$8c199b70$@vizexperts.com> <6BA71742-B5FE-4086-8240-B781BAECDFD3@live555.com> Message-ID: <003201cf5e34$f77257d0$e6570770$@vizexperts.com> Hi Ross, Thanks for the correction. I fed "LiveSourceWithx264" into "H264VideoStreamDiscreteFramer" and now it works if I comment out the virtual function "getAuxSDPLine" in my implementation. In a previous question posted by user, http://lists.live555.com/pipermail/live-devel/2011-December/014276.html you said this function is needed "only for when you're streaming codecs like H.264 or MPEG-4 video that require special 'configuration' parameters" and I am streaming h264 video. My getAuxSDPLine() is char const* H264LiveServerMediaSession::getAuxSDPLine(RTPSink* rtpSink, FramedSource* inputSource) { if(fAuxSDPLine != NULL) return fAuxSDPLine; if(fDummySink == NULL) { fDummySink = rtpSink; fDummySink->startPlaying(*inputSource, afterPlayingDummy, this); checkForAuxSDPLine(this); } envir().taskScheduler().doEventLoop(&fDoneFlag); return fAuxSDPLine; } After entering into the doEventLoop() my function is going into infinite loop because it is not getting SPS and PPS nal units. 
If I comment out the function it is working on brower and vlc player on desktop. But it is not working on iPad brower ie it is rendering one frame and getting struck. Does this happen because I have not received SPS and PPS nal units ? Thanks Vikram Singh From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Saturday, April 19, 2014 11:38 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Frames are corrupted - I think the issue is with fMaxSize in deliverFrame() which starts from 150000 and goes to zero which should not happen. OK, your problem is that - in your implementation of the "createNewStreamSource()" virtual function - you are feeding a "LiveSourceWithx264" object into a "H264VideoStreamFramer". This is wrong, because "H264VideoStreamFramer" is used only when parsing a H.264 *byte stream*, not a discrete sequence of NAL units. Instead, you should be feeding a "LiveSourceWithx264" into a "H264VideoStreamDiscreteFramer". Also, it's important that the H.264 NAL units that you deliver - from the "LiveSourceWithx264" - into the "H264VideoStreamDiscreteFramer" *not* begin with a 0x00 0x00 0x00 0x01 'start code'. I.e., you need to remove these 4 bytes (and adjust "fFrameSize" accordingly) when you do the delivery. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Apr 22 08:15:39 2014 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 22 Apr 2014 08:15:39 -0700 Subject: [Live-devel] Frames are corrupted In-Reply-To: <003201cf5e34$f77257d0$e6570770$@vizexperts.com> References: <000901cf589c$59000630$0b001290$@vizexperts.com> <84AAAB6A-2131-4E61-A6B2-872D9B52E2F2@live555.com> <000301cf5b1f$23855c50$6a9014f0$@vizexperts.com> <14A19130-FF1B-4D97-871D-915B6A88AA52@live555.com> <000c01cf5b93$2eb333d0$8c199b70$@vizexperts.com> <6BA71742-B5FE-4086-8240-B781BAECDFD3@live555.com> <003201cf5e34$f77257d0$e6570770$@vizexperts.com> Message-ID: <89371423-1FA7-4963-AF87-1FFD983B8181@live555.com> > Does this happen because I have not received SPS and PPS nal units ? Yes. If you are streaming H.264 video, then you *must* have SPS and PPS NAL units. Either 1/ Your H.264 video source contains SPS and PPS NAL units, occurring frequently. In this case, you *should not* modify "getAuxSDPLine()". Or: 2/ Your H.264 video source does not contain SPS and PPS NAL units, but you know them some other way, in advance. In this case, you should not implement "getAuxSDPLine()", but you *must* then pass these NAL units to "H264VideoRTPSink::createNew()", in your implementation of the "createNewRTPSink()" virtual function. If neither 1/ nor 2/ is true - i.e., if your video source does not contain SPS and PPS NAL units, nor do you know these in advance - then you will not be able to successfully stream H.264 video. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Apr 22 08:28:24 2014 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 22 Apr 2014 08:28:24 -0700 Subject: [Live-devel] Limiting frame rate for a source In-Reply-To: <977DC93D-F54D-499F-A02D-0B6854579280@d-pointer.com> References: <977DC93D-F54D-499F-A02D-0B6854579280@d-pointer.com> Message-ID: > I have a USB camera that I can stream as MJPEG using a JPEGVideoSource subclass. 
> It all works nicely and the frames are streamed and received ok. I would however like > to be able to limit the frame rate of the stream, as it now seems to be 20+ fps. In this > case it's way too high as the use case is a surveillance camera that grabs a big overview > image, not video conferencing. I tried looking at the FAQ and the examples but didn't > see anything. I'm not sure I understand your question. You have written your own media source class (in this case, a subclass of "JPEGVideoSource") that delivers encoded (JPEG) frames. You want to reduce its frame rate - i.e., how soon it calls "FramedSource::afterGetting()" in response to each call to "doGetNextFrame()". So just do it. This is your code :-) > Also, is there some way to know when all clients have disconnected from a RTSP source > so that I could stop grabbing and encoding frames? This should happen automatically. I.e., when the last concurrent RTSP client has disconnected (or timed out), then your media source class's destructor will get called. Therefore, you should write your media source class's destructor so that it stops grabbing/encoding. Don't forget to have your subclass of "OnDemandServerMediaSubsession" set the "reuseFirstSource" parameter to True when it calls the "OnDemandServerMediaSubsession" constructor. This will ensure that no more than one instance of your media source class will ever be created concurrently, regardless of the number of concurrent RTSP clients. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jan.ekholm at d-pointer.com Tue Apr 22 09:01:20 2014 From: jan.ekholm at d-pointer.com (Jan Ekholm) Date: Tue, 22 Apr 2014 19:01:20 +0300 Subject: [Live-devel] Limiting frame rate for a source In-Reply-To: References: <977DC93D-F54D-499F-A02D-0B6854579280@d-pointer.com> Message-ID: <58058D2C-813E-4784-86CF-82D1AF689772@d-pointer.com> On 22 apr 2014, at 18:28, Ross Finlayson wrote: >> I have a USB camera that I can stream as MJPEG using a JPEGVideoSource subclass. >> It all works nicely and the frames are streamed and received ok. I would however like >> to be able to limit the frame rate of the stream, as it now seems to be 20+ fps. In this >> case it's way too high as the use case is a surveillance camera that grabs a big overview >> image, not video conferencing. I tried looking at the FAQ and the examples but didn't >> see anything. > > I'm not sure I understand your question. You have written your own media source class (in this case, a subclass of "JPEGVideoSource") that delivers encoded (JPEG) frames. You want to reduce its frame rate - i.e., how soon it calls "FramedSource::afterGetting()" in response to each call to "doGetNextFrame()". So just do it. This is your code :-) That I can do, of course. But isn't Live555 single threaded and if I decide to limit the frame rate to, say, 1 fps I will basically block the entire application? I have several sources that the same application will need to handle. Ideally I would like to multithread my application, but I've already once rewritten the core loop to be based on Live555's main loop mechanism. I was hoping that there was some built in rate limitation that only caused doGetNextFrame() to get called when a new frame was needed. Or can I perhaps manipulate the presentation time and Live555 would based on that determine that it has enough frames for now and throttle a bit? 
>> Also, is there some way to know when all clients have disconnected from a RTSP source >> so that I could stop grabbing and encoding frames? > > This should happen automatically. I.e., when the last concurrent RTSP client has disconnected (or timed out), then your media source class's destructor will get called. Therefore, you should write your media source class's destructor so that it stops grabbing/encoding. > > Don't forget to have your subclass of "OnDemandServerMediaSubsession" set the "reuseFirstSource" parameter to True when it calls the "OnDemandServerMediaSubsession" constructor. This will ensure that no more than one instance of your media source class will ever be created concurrently, regardless of the number of concurrent RTSP clients. Yes, that happens for the H264 source, but not for the MJPEG one. I have reuseFirstSource set to true as the camera can only be opened once. I will have to dig deeper so see why the MJPEG source does not stop. Both OnDemandServerMediaSubsession are very similar, apart from the H264 one having the extra dummy sink to get the aux data. -- Jan Ekholm jan.ekholm at d-pointer.com From jan.ekholm at d-pointer.com Tue Apr 22 11:30:40 2014 From: jan.ekholm at d-pointer.com (Jan Ekholm) Date: Tue, 22 Apr 2014 21:30:40 +0300 Subject: [Live-devel] Limiting frame rate for a source In-Reply-To: <58058D2C-813E-4784-86CF-82D1AF689772@d-pointer.com> References: <977DC93D-F54D-499F-A02D-0B6854579280@d-pointer.com> <58058D2C-813E-4784-86CF-82D1AF689772@d-pointer.com> Message-ID: <37C1FC81-61DB-42A3-AD16-9C79BABB8606@d-pointer.com> On 22 apr 2014, at 19:01, Jan Ekholm wrote: > >>> Also, is there some way to know when all clients have disconnected from a RTSP source >>> so that I could stop grabbing and encoding frames? >> >> This should happen automatically. I.e., when the last concurrent RTSP client has disconnected (or timed out), then your media source class's destructor will get called. Therefore, you should write your media source class's destructor so that it stops grabbing/encoding. >> >> Don't forget to have your subclass of "OnDemandServerMediaSubsession" set the "reuseFirstSource" parameter to True when it calls the "OnDemandServerMediaSubsession" constructor. This will ensure that no more than one instance of your media source class will ever be created concurrently, regardless of the number of concurrent RTSP clients. > > Yes, that happens for the H264 source, but not for the MJPEG one. I have reuseFirstSource set to true as > the camera can only be opened once. I will have to dig deeper so see why the MJPEG source does not stop. > Both OnDemandServerMediaSubsession are very similar, apart from the H264 one having the extra dummy > sink to get the aux data. I don't see any of my destructors called for any of my subclasses when I disconnect my client. However, when the client initially connects there is a somewhat strange sequence of calls. First there are the expected calls: createNewStreamSource() createNewRTPSink() to my OnDemandServerMediaSubsession subclass. These set up a JPEGVideoSource subclass and a standard JPEGVideoRTPSink. Immediately after this the destructor for my JPEGVideoSource is called and the source is cleaned up. After this I again get: createNewStreamSource() createNewRTPSink() and now the streaming starts and never stops. Is it possible to enable some kind of debugging output for Live555? I suspect that the somewhat interesting reference counting is somehow to blame, i.e. 
that I perhaps don't understand it at all and cause my objects to be immortal. -- Jan Ekholm jan.ekholm at d-pointer.com From finlayson at live555.com Tue Apr 22 13:20:11 2014 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 22 Apr 2014 13:20:11 -0700 Subject: [Live-devel] Limiting frame rate for a source In-Reply-To: <58058D2C-813E-4784-86CF-82D1AF689772@d-pointer.com> References: <977DC93D-F54D-499F-A02D-0B6854579280@d-pointer.com> <58058D2C-813E-4784-86CF-82D1AF689772@d-pointer.com> Message-ID: <7B408285-B83D-4DF4-8E7B-C9038EE18319@live555.com> > I was hoping that there was some built in rate limitation that only caused doGetNextFrame() > to get called when a new frame was needed. No, it's the data source that decides when it delivers data. There are two possible ways to do this: If your data source class encodes and delivers a frame immediately when "doGetNextFrame()" is called, then presumably you are also setting "fDurationInMicroseconds" before you complete delivery to the downstream object (using "FramedSource::afterGetting()"). (If you are encoding/delivering a frame immediately, but *not* setting "fDurationInMicroseconds", then "fDurationInMicroseconds" is staying at its default value of 0, and you will be delivering frames as fast as you can generate them, which is bad :-) In this case, to change the frame rate, just change the value of "fDurationInMicroseconds". If, however, you want your data source class to not encode/deliver a frame immediately, but instead wait for the encoder to make a new frame available, then - as you noted - the encoder will need to be a separate thread of control (so that you don't delay, waiting for the encoded frame to come available). In this case you would implement your 'data source' class similar to the demo code that's in "DeviceSource.cpp". Note that in this code's "doGetNextFrame()" implementation, if no frame is immediately available to be delivered, we return immediately (i.e., do not block). Instead, we rely upon a separate thread of control (the encoder thread) later 'signaling' the availability of a new frame - e.g., using "TaskScheduler::triggerEvent()". (Note that "triggerEvent()" is the *only* LIVE555 function that can be called from a non-LIVE555 event loop thread.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Apr 22 13:38:30 2014 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 22 Apr 2014 13:38:30 -0700 Subject: [Live-devel] Limiting frame rate for a source In-Reply-To: <37C1FC81-61DB-42A3-AD16-9C79BABB8606@d-pointer.com> References: <977DC93D-F54D-499F-A02D-0B6854579280@d-pointer.com> <58058D2C-813E-4784-86CF-82D1AF689772@d-pointer.com> <37C1FC81-61DB-42A3-AD16-9C79BABB8606@d-pointer.com> Message-ID: > However, > when the client initially connects there is a somewhat strange sequence of calls. First there are the > expected calls: > > createNewStreamSource() > createNewRTPSink() > > to my OnDemandServerMediaSubsession subclass. These set up a JPEGVideoSource subclass > and a standard JPEGVideoRTPSink. Immediately after this the destructor for my JPEGVideoSource > is called and the source is cleaned up. After this I again get: > > createNewStreamSource() > createNewRTPSink() > > and now the streaming starts and never stops. Is it possible to enable some kind of debugging > output for Live555? You don't need 'debugging', because in this case I can tell you exactly what is happening. 
"createNewStreamSource()"+"createNewRTPSink()" is called the first time to create a frame source object and a 'dummy' RTP sink object, so that the server can figure out the stream's SDP description. (This is the code "OnDemandServerMediaSubsession::sdpLines()".) Then, the server code closes these source and sink objects. This is when your source class's destructor gets called the first time. Then, later - when the first RTSP client connects - "createNewStreamSource()"+"createNewRTPSink()" will get called again. If you have set "reuseFirstSource" to True, then these will not get called again, until all clients have disconnected. At this time, your source class's destructor will get called again, and then later - when another RTSP client connects - "createNewStreamSource()"+"createNewRTPSink()" will get called once more. Etc. The bottom line is that your 'frame source' class must allow for the possibility of the object being destroyed, then recreated - perhaps several times. However, if you have set "reuseFirstSource" to True, then there will never be more than one of these objects created concurrently. I.e., the sequence will always be "constructor", "destructor", "constructor", "destructor" ..., never "constructor", "constructor". When the last RTSP client disconnects, then your 'frame source' class destructor *will* get called. (If the RTSP client did a "TEARDOWN", then your destructor will get called then. If the RTSP client disappears without doing a "TEARDOWN", then the client connection will get timed out - and your destructor will get called - 65 seconds later (assuming that you didn't change the server's "reclamationTestSeconds" parameter).) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jan.ekholm at d-pointer.com Tue Apr 22 14:16:12 2014 From: jan.ekholm at d-pointer.com (Jan Ekholm) Date: Wed, 23 Apr 2014 00:16:12 +0300 Subject: [Live-devel] Limiting frame rate for a source In-Reply-To: <7B408285-B83D-4DF4-8E7B-C9038EE18319@live555.com> References: <977DC93D-F54D-499F-A02D-0B6854579280@d-pointer.com> <58058D2C-813E-4784-86CF-82D1AF689772@d-pointer.com> <7B408285-B83D-4DF4-8E7B-C9038EE18319@live555.com> Message-ID: <8E3FA700-F6EB-4D8C-A933-FBA5A8FB6AC8@d-pointer.com> On 22 apr 2014, at 23:20, Ross Finlayson wrote: >> I was hoping that there was some built in rate limitation that only caused doGetNextFrame() >> to get called when a new frame was needed. > > No, it's the data source that decides when it delivers data. There are two possible ways to do this: > > If your data source class encodes and delivers a frame immediately when "doGetNextFrame()" is called, then presumably you are also setting "fDurationInMicroseconds" before you complete delivery to the downstream object (using "FramedSource::afterGetting()"). (If you are encoding/delivering a frame immediately, but *not* setting "fDurationInMicroseconds", then "fDurationInMicroseconds" is staying at its default value of 0, and you will be delivering frames as fast as you can generate them, which is bad :-) In this case, to change the frame rate, just change the value of "fDurationInMicroseconds". No, I do not set fDurationInMicroseconds, I did not even know about it. Based on my timing it does look like the frames are sent as fast as they can be grabbed, scaled and encoded to JPEG. 
I assume that fDurationInMicroseconds means the duration of that single frame and if set to the equivalent of, say, 1 second then doGetNextFrame() would be called once per second or so? > If, however, you want your data source class to not encode/deliver a frame immediately, but instead wait for the encoder to make a new frame available, then - as you noted - the encoder will need to be a separate thread of control (so that you don't delay, waiting for the encoded frame to come available). In this case you would implement your 'data source' class similar to the demo code that's in "DeviceSource.cpp". Note that in this code's "doGetNextFrame()" implementation, if no frame is immediately available to be delivered, we return immediately (i.e., do not block). Instead, we rely upon a separate thread of control (the encoder thread) later 'signaling' the availability of a new frame - e.g., using "TaskScheduler::triggerEvent()". (Note that "triggerEvent()" is the *only* LIVE555 function that can be called from a non-LIVE555 event loop thread.) I have used triggerEvent() successfully when I did try to run my own Boost based event loop. It did get too messy though, but the events worked perfectly. I think I prefer to handle the rate limitation using the first method and only complicate things if necessary. Thank you for your explanation. I enjoy reading your answers on this mailing list, they are teaching me a lot. -- Jan Ekholm jan.ekholm at d-pointer.com From nambirajan.manickam at i-velozity.com Tue Apr 22 17:42:34 2014 From: nambirajan.manickam at i-velozity.com (Nambirajan) Date: Wed, 23 Apr 2014 06:12:34 +0530 Subject: [Live-devel] Regarding trick mode files In-Reply-To: References: <20140421180425.fe3cde39dea1cf6284ae78ac84c5299d.7f023c91c7.wbe@email05.secureserver.net> Message-ID: <00e101cf5e8c$f117b990$d3472cb0$@manickam@i-velozity.com> Hi Ross, Thanks for your input and sorry for the late reply. Infact the problem was related to the client STB's. We have one more clarification. The sequence of testing is as below. We took a clear MPEG2 TS content and encrypted it using Verimatrix Encryption Manager ( DRM ). Copied the encrypted content in the Live555 Media Server. Then we played the encrypted content using STB with a Verimatrix Client software in it. The contents were decrypted and playing fine from the Live555 Media Server. Step 1: Then we created a .tsx file from the same clear content and used the same file to do the trick play. Now FFW and REW was not working. But normal play was working fine. Step2: Created a .tsx file from the encrypted content. Then we used this .tsx file to do the trick play. Now also FFW and REW was not working. But normal play was working fine. We presume the .tsx file contains only I Frames. Can you please suggest us / help us understand why .tsx file was not supporting Trick mode in both scenarios ( Clear as well as encrypted .tsx file ) even though we have the Verimatrix decryption client available in the client STB. Thanks and regards, M. Nambirajan From: live-devel [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Tuesday, April 22, 2014 9:39 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Regarding trick mode files The output ( x4, x15, x60 and x300 ) of the MPEG4 HD file plays out ok in VLC. 
By this I presume you meant that you used the "testMPEG2TransportStreamTrickPlay" utility to generate Transport Stream files (corresponding to various fast-forward speeds of the original), and played these files locally using VLC. The Transport Stream data in these files are exactly the same as the data that the "LIVE555 Media Server" streams when asked to 'fast-forward' at the same speed. Therefore, the problem must be in either your STB clients, or the network. I suspect that at least part of the problem is that the 'faster' streams are being sent at a higher bitrate (because they consist only of large 'key frames'). Network congestion and/or packet loss may be the cause of some of the problems that you're seeing with your network clients. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Apr 22 18:00:41 2014 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 22 Apr 2014 18:00:41 -0700 Subject: [Live-devel] Regarding trick mode files In-Reply-To: <00e101cf5e8c$f117b990$d3472cb0$@manickam@i-velozity.com> References: <20140421180425.fe3cde39dea1cf6284ae78ac84c5299d.7f023c91c7.wbe@email05.secureserver.net> <00e101cf5e8c$f117b990$d3472cb0$@manickam@i-velozity.com> Message-ID: <0B0487D6-0B4E-4320-AA03-9BABD978BC09@live555.com> > Thanks for your input and sorry for the late reply. Infact the problem was related to the client STB?s. If that's the case, then we can't help you. Only the manufacturer of the STB can do help you with problems with their product (unless they want to contact me directly about this, in which case I'd be happy to work with them). However, as I suggested in my previous message, I suggest that you first use the "testMPEG2TransportStreamTrickPlay" utility to generate Transport Stream files that represent the effect of the server performing a fast-forward or reverse-play operation. You can then try playing these files using a media player (like VLC), or stream them to the STB (without any 'trick play'), to see if they're OK. Note, however, that the Transport Stream indexing (and thus 'trick play') operation might not work on an encrypted Transport Stream file (depending on how what parts of the MPEG data are encrypted). But again, that's something that you can test using "testMPEG2TransportStreamTrickPlay". Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From nambirajan.manickam at i-velozity.com Wed Apr 23 03:54:19 2014 From: nambirajan.manickam at i-velozity.com (Nambirajan) Date: Wed, 23 Apr 2014 16:24:19 +0530 Subject: [Live-devel] Regarding trick mode files In-Reply-To: <0B0487D6-0B4E-4320-AA03-9BABD978BC09@live555.com> References: <20140421180425.fe3cde39dea1cf6284ae78ac84c5299d.7f023c91c7.wbe@email05.secureserver.net> <00e101cf5e8c$f117b990$d3472cb0$@manickam@i-velozity.com> <0B0487D6-0B4E-4320-AA03-9BABD978BC09@live555.com> Message-ID: <003901cf5ee2$6d6b6b90$484242b0$@manickam@i-velozity.com> Hi Ross, Thanks for your immediate response. We are using testMPEG2TransportStreamTrickPlay utility to generate Transport stream files that represent the FFW and REW operation. If we Must encrypt the transport stream then what parts of the stream should we leave it in the clear? Should we specifically leave the I-frames? Let us check with our customer ( Manufacturer of the STB ) for any direct contact and we will suggest your name to our customer. 
Thanks and regards, M. Nambirajan From: live-devel [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Wednesday, April 23, 2014 6:31 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Regarding trick mode files Thanks for your input and in fact the problem was related to the client STB's. If that's the case, then we can't help you. Only the manufacturer of the STB can do help you with problems with their product (unless they want to contact me directly about this, in which case I'd be happy to work with them). However, as I suggested in my previous message, I suggest that you first use the "testMPEG2TransportStreamTrickPlay" utility to generate Transport Stream files that represent the effect of the server performing a fast-forward or reverse-play operation. You can then try playing these files using a media player (like VLC), or stream them to the STB (without any 'trick play'), to see if they're OK. Note, however, that the Transport Stream indexing (and thus 'trick play') operation might not work on an encrypted Transport Stream file (depending on how what parts of the MPEG data are encrypted). But again, that's something that you can test using "testMPEG2TransportStreamTrickPlay". Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From Bo.Yu at drs.com Wed Apr 23 07:07:37 2014 From: Bo.Yu at drs.com (Yu, Bo) Date: Wed, 23 Apr 2014 14:07:37 +0000 Subject: [Live-devel] socket blocking mode Message-ID: We found a problem that when we use two clients (VLC) on two PC to stream the live555 based camera (H264 video). One uses TCP mode and another uses UDP mode. When we disconnect the network cable on TCP mode PC, the UDP stream on another PC stopped at the same time. We found the problem is due to making TCP in blocking mode in sendDataOverTCP(). We reverted to older live555 (2012-05-17), this problem does not exist because there is no calling makeSocketBlocking() in sendRTPOverTCP(). I guess there is no perfect solution as we found making socket blocking mode helps H264 type video in low bandwidth, unreliable and intermittent network environment. Thanks. Bo -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed Apr 23 08:32:22 2014 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 23 Apr 2014 08:32:22 -0700 Subject: [Live-devel] Regarding trick mode files In-Reply-To: <003901cf5ee2$6d6b6b90$484242b0$@manickam@i-velozity.com> References: <20140421180425.fe3cde39dea1cf6284ae78ac84c5299d.7f023c91c7.wbe@email05.secureserver.net> <00e101cf5e8c$f117b990$d3472cb0$@manickam@i-velozity.com> <0B0487D6-0B4E-4320-AA03-9BABD978BC09@live555.com> <003901cf5ee2$6d6b6b90$484242b0$@manickam@i-velozity.com> Message-ID: > If we Must encrypt the transport stream then what parts of the stream should we leave it in the clear? Should we specifically leave the I-frames? The Transport Stream fields (including the PCR and PID) at the start of each 188-byte Transport Stream 'packet' must all remain unencrypted. Also, the entire contents of Transport Stream PAT and PMT 'packets' must remain unencrypted. Also, the start of video I-frames - including at least the MPEG 'system code' (or 'nal_unit_type' for H.264/H.265) - must remain unencrypted. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Wed Apr 23 08:50:57 2014 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 23 Apr 2014 08:50:57 -0700 Subject: [Live-devel] socket blocking mode In-Reply-To: References: Message-ID: > We found a problem that when we use two clients (VLC) on two PC to stream the live555 based camera (H264 video). One uses TCP mode and another uses UDP mode. > When we disconnect the network cable on TCP mode PC, the UDP stream on another PC stopped at the same time. > > We found the problem is due to making TCP in blocking mode in sendDataOverTCP(). This is important, because it fixed a serious bug. See http://lists.live555.com/pipermail/live-devel/2013-June/017137.html The blocking write to the TCP socket - which is done only in rare circumstances (when a non-blocking write of packet data succeeded in writing only part of the data) - should be very brief. If it's not, then your network is seriously congested, and your data stream is exceeding the capacity of your network. > We reverted to older live555 (2012-05-17) Old versions of the "LIVE555 Streaming Media" software are never supported. Also, a reminder that under the terms of the LGPL, you must, upon request, upgrade your product to use the latest version of the software (or else provide a way for customers to perform this upgrade themselves). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From Bo.Yu at drs.com Wed Apr 23 14:47:46 2014 From: Bo.Yu at drs.com (Yu, Bo) Date: Wed, 23 Apr 2014 21:47:46 +0000 Subject: [Live-devel] socket blocking mode In-Reply-To: References: Message-ID: The problem is user can remove the cable of TCP stream laptop and it stop another client steaming UDP. Bo From: live-devel [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Wednesday, April 23, 2014 10:51 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] socket blocking mode We found a problem that when we use two clients (VLC) on two PC to stream the live555 based camera (H264 video). One uses TCP mode and another uses UDP mode. When we disconnect the network cable on TCP mode PC, the UDP stream on another PC stopped at the same time. We found the problem is due to making TCP in blocking mode in sendDataOverTCP(). This is important, because it fixed a serious bug. See http://lists.live555.com/pipermail/live-devel/2013-June/017137.html The blocking write to the TCP socket - which is done only in rare circumstances (when a non-blocking write of packet data succeeded in writing only part of the data) - should be very brief. If it's not, then your network is seriously congested, and your data stream is exceeding the capacity of your network. We reverted to older live555 (2012-05-17) Old versions of the "LIVE555 Streaming Media" software are never supported. Also, a reminder that under the terms of the LGPL, you must, upon request, upgrade your product to use the latest version of the software (or else provide a way for customers to perform this upgrade themselves). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Wed Apr 23 16:16:59 2014 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 23 Apr 2014 16:16:59 -0700 Subject: [Live-devel] socket blocking mode In-Reply-To: References: Message-ID: <10E70E84-4398-4774-9610-7CBFDEC8013F@live555.com> > The problem is user can remove the cable of TCP stream laptop and it stop another client steaming UDP. Too bad! As I said before, the write to TCP *has to* be atomic. If the write partly succeeds (i.e., if the initial, non-blocking call to "send()" returned a data count less than that provided (but >0)), then the remaining write *has to* succeed - otherwise the receiver will get inconsistent data; something that it doesn't expect to see over a TCP connection. That's why the blocking write is done. (Once again, this blocking write is done *only* in rare circumstances: when the initial, non-blocking write of packet data succeeded in writing only part of the data. ) Unfortunately there's no way for the code (or the OS, probably) to distinguish between the case where the write would block for only a short period of time (due to temporary network congestion), or indefinitely (e.g., due to someone unplugging a cable). The only way to prevent this is to reconfigure your server so that it does not support requests to stream RTP/RTCP-over-TCP. I have just released a new version - 2014.04.23 - that adds a new function "RTSPServer::disableStreamingRTPOverTCP()" that you can call on your newly-created "RTSPServer" object to reject client requests to stream RTP/RTCP-over-TCP. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From goran.ambardziev at gmail.com Tue Apr 22 15:03:08 2014 From: goran.ambardziev at gmail.com (Goran Ambardziev) Date: Wed, 23 Apr 2014 00:03:08 +0200 Subject: [Live-devel] AAC/H264 Closing Down Message-ID: <5356E71C.5030500@gmail.com> I still have problems with the closing sequence. You suggested that I should: > Try closing the 'sinks' before the 'sources'. I did that. I still have problems. I close it like this: Sinks -> StopPlaying Close Sinks Close Sources Close RTCP Inctances... and this is where now I get an error. After I successfully close 'video' rtcp instance, the application breaks when closing 'audio' rtcp instance, and it breaks in RTCPInstance Destructor, RTCPInstance::~RTCPInstance() on the line: delete fKnownMembers; On this line VS says: "HEAP: Free Heap block 3043560 modified at 304361c after it was freed" To eliminate any possible leaks that I might have in my code, I even turned off generating frames and samples and just tried it to start and stop the live555 objects, and I still have the problem. Can you help me on this one? Thanks. From Richard.CK.Chiu at hkcsl.com Thu Apr 24 00:45:40 2014 From: Richard.CK.Chiu at hkcsl.com (Chiu, Richard CK) Date: Thu, 24 Apr 2014 07:45:40 +0000 Subject: [Live-devel] streaming H.265 Message-ID: <03B818FD871B0849985EF787DADF69D7630E1798@wsmbs05> Dear Live555 support We would like to streaming the .265 (surfing.265), using the live555 and VLC. We compile the latest source of live555, but the streaming failed However, it work fine using .264 sample, or VLC can play the file directly, i.e. surfing.265 Any idea? 
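(For context, the server side being discussed here is essentially the H.265 case of the "testOnDemandRTSPServer" demo. A condensed sketch, assuming the "H265VideoFileServerMediaSubsession" class present in then-current versions of the library; as the reply below explains, the playback problem turns out to be on the player side, not in server code like this:

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"

int main() {
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  RTSPServer* rtspServer = RTSPServer::createNew(*env, 8554);
  if (rtspServer == NULL) {
    *env << "Failed to create RTSP server: " << env->getResultMsg() << "\n";
    return 1;
  }

  // Serve a raw H.265 elementary-stream file (e.g. "surfing.265") on demand:
  ServerMediaSession* sms = ServerMediaSession::createNew(*env, "h265Test", "surfing.265", "H.265 test stream");
  sms->addSubsession(H265VideoFileServerMediaSubsession::createNew(*env, "surfing.265", False /*reuseFirstSource*/));
  rtspServer->addServerMediaSession(sms);

  char* url = rtspServer->rtspURL(sms);
  *env << "Play this stream using the URL \"" << url << "\"\n";
  delete[] url;

  env->taskScheduler().doEventLoop(); // does not return
  return 0;
}

A client would then request rtsp://<server-address>:8554/h265Test.)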
Richard Chiu Senior Engineer | Mobile Internet & Video Services Mobile: +852 9214 1695 | Fax: +852 29625504 20/F, Manhattan Place, 23 Wang Tai Road, Kowloon Bay, Kowloon www.hkcsl.com

From finlayson at live555.com Thu Apr 24 01:37:28 2014 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 24 Apr 2014 01:37:28 -0700 Subject: [Live-devel] streaming H.265 In-Reply-To: <03B818FD871B0849985EF787DADF69D7630E1798@wsmbs05> References: <03B818FD871B0849985EF787DADF69D7630E1798@wsmbs05> Message-ID: <47DBCA57-6822-4954-9175-2DE4D3186BF0@live555.com>

> We would like to streaming the .265 (surfing.265), using the live555 and VLC.
> We compile the latest source of live555, but the streaming failed
> However, it work fine using .264 sample, or VLC can play the file directly, i.e. surfing.265
> Any idea?

Yes, to play H.265 RTP streams, VLC needs to be upgraded (recompiled) to use a recent version of the "LIVE555 Streaming Media" code. (VLC uses our code when accessing RTSP/RTP streams.) Unfortunately the prebuilt binary versions of VLC - available from videolan.org - still use an old version of our code, so you'll need to build your own version of VLC - with a recent version of the LIVE555 libraries - in order for it to be able to play H.265 video RTSP/RTP streams.

Ross Finlayson Live Networks, Inc. http://www.live555.com/

From Richard.CK.Chiu at hkcsl.com Thu Apr 24 02:37:18 2014 From: Richard.CK.Chiu at hkcsl.com (Chiu, Richard CK) Date: Thu, 24 Apr 2014 09:37:18 +0000 Subject: [Live-devel] streaming H.265 In-Reply-To: <47DBCA57-6822-4954-9175-2DE4D3186BF0@live555.com> References: <03B818FD871B0849985EF787DADF69D7630E1798@wsmbs05> <47DBCA57-6822-4954-9175-2DE4D3186BF0@live555.com> Message-ID: <03B818FD871B0849985EF787DADF69D7630E1818@wsmbs05>

Dear Ross, Thanks for the prompt reply. Could you suggest another pre-built player with which we can test the .265 streaming?

Richard Chiu Senior Engineer | Mobile Internet & Video Services Mobile: +852 9214 1695 | Fax: +852 29625504 20/F, Manhattan Place, 23 Wang Tai Road, Kowloon Bay, Kowloon www.hkcsl.com

From: live-devel [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Thursday, April 24, 2014 4:37 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] streaming H.265

We would like to streaming the .265 (surfing.265), using the live555 and VLC.
We compile the latest source of live555, but the streaming failed However, it work fine using .264 sample, or VLC can play the file directly, i.e. surfing.265 Any idea? Yes, to play H.265 RTP streams, VLC needs to be upgraded (recompiled) to use a recent version of the "LIVE555 Streaming Media" code. (VLC uses our code when accessing RTSP/RTP streams.) Unfortunately the prebuilt binary versions of VLC - available from videolan.org - still use an old version of our code, so you'll need to build your own version of VLC - with a recent version of the LIVE555 libraries - in order for it to be able to play H.265 video RTSP/RTP streams. Ross Finlayson Live Networks, Inc. http://www.live555.com/ This communication (and any attachment) may contain information confidential to CSL Limited or information that is legally privileged, proprietary or otherwise protected by law. Privilege is not waived by the transmission to one or more parties in error. If you are not the intended recipient, you must not keep, forward, copy, use, save or rely on this communication and any such action is unauthorised and prohibited. If you have received this communication in error, please reply to this e-mail to notify the sender of its incorrect delivery, and then delete both it and your reply. Thank you. -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: image001.jpg Type: image/jpeg Size: 9201 bytes Desc: image001.jpg URL: From warren at etr-usa.com Thu Apr 24 06:02:35 2014 From: warren at etr-usa.com (Warren Young) Date: Thu, 24 Apr 2014 07:02:35 -0600 Subject: [Live-devel] AAC/H264 Closing Down In-Reply-To: <5356E71C.5030500@gmail.com> References: <5356E71C.5030500@gmail.com> Message-ID: <53590B6B.4030800@etr-usa.com> On 4/22/2014 16:03, Goran Ambardziev wrote: > On this line VS says: "HEAP: Free Heap block 3043560 modified at 304361c > after it was freed" What does the stack trace look like when this happens? > To eliminate any possible leaks that I might have in my code, It sounds like the opposite of a leak, where you're either double-free'ing something or freeing something too early. Can you run the app on Linux? If your app has a GUI, you should be able to strip out the core of the app so that it will build and run on Linux without the GUI. (If not, it means you have application logic mixed in with your display code, which is bad design from the start.) I ask because Valgrind will probably tell you what's going wrong. From david.bueno at scati.com Thu Apr 24 07:24:22 2014 From: david.bueno at scati.com (David Bueno Monge) Date: Thu, 24 Apr 2014 16:24:22 +0200 Subject: [Live-devel] low perfomance in "onDemandRTSPServer" when connecting with multiple clients In-Reply-To: References: Message-ID: Thanks for your reply, and sorry it took me so long to write back... Probably my data source implementation has some problems, but I have tested the testOnDemandRTSPServer program with different kind of videos and this is what i got: With large resolution videos (1920x1080) the server starts showing critical problems when about 4 or 5 clients request the same video (Boolean reuseFirstSource = True). These critical problems mean the streaming gets freezed. With less resolution (1280x960), the number of clients connected can be higher, but never more than 10 clients asking for the same video. 
If I set reuseFirstSource = False, the performance rises a little, but meaning only that few more clients can connect to the server before performance troubles begin. I am using VLC as client, and stremaing H264. The %CPU and memory used still are very low. Is there anything that could increase the performance?? I cannot use sources with less resolution... I have worked for some time with the client side of the Live555 library, connecting with large numbers of live sources wihtout these kind of problems... Thanks. 2014-04-15 17:09 GMT+02:00 Ross Finlayson : > I have developed a RTSP server based on the example found > in testOnDemandRTSPServer.cpp. The application can stream data from files > and also from different live sources (to do that i'm using an RTSP client > which i developed some time ago, also based on the Live555 library). > > > FYI, you could also use our "testRTSPClient" demo application for this. > It can stream from multiple "rtsp://" URLs concurrently. > > > I have implemented my own "FramedSource" subclass and differents > subclasses of "OnDemandServerMediaSubsession" depending on the video > encoding. > > The application works fine when just one client is connected to the > server, even with high resolution videos. However, when more clients > connect to the different sessions added to the RTSPServer instance of my > server, the performance decreases a lot. > > I am talking that just with two clients connected, the rate at which the > server stream the input of two different sources is really low. > > There is no network problem, this happens also in a local scenario (server > and clients running in the same machine). > > The %CPU and memory used by the server are very low also, so, the problem > does not come from the machine running the server. > > > That's strange. I suspect that the problem is (somehow) related to your > 'data source' implementation - i.e., your "FramedSource" subclass. > > What happens when you don't just 'base' your server on > "testOnDemandRTSPServer", but actually use the (original, unmodified) > "testOnDemandRTSPServer" code? What happens when two clients request the > same stream (from a file)? And do things change at all when you change > your server to use a single input source for all concurrent clients - i.e., > if you change line 29 of "testOnDemandRTSPServer.cpp" to > Boolean reuseFirstSource = True; > ? > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -- *David Bueno Monge*Software Engineer Skype dbueno_scati *------------------------------*[image: http://www.scati.com] T +34 902 116 095 F +34 976 466 580 *------------------------------* Bari, 23 Plataforma Log?stica PLAZA 50.197 Zaragoza (Spain) *www.scati.com * *------------------------------* *Disclaimer:* This e-mail (including any attached documents) is proprietary and confidential and may contain legally privileged information. It is intended for the named recipient(s) only. If you are not the intended recipient, you may not review, retain, copy or distribute this message, and we kindly ask you to notify the sender by e-mail immediately and delete this message from your system. *Please consider your environmental responsibility before printing this e-mail.* -------------- next part -------------- An HTML attachment was scrubbed... 
URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: image/png Size: 9971 bytes Desc: not available URL: From finlayson at live555.com Thu Apr 24 07:34:59 2014 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 24 Apr 2014 07:34:59 -0700 Subject: [Live-devel] low perfomance in "onDemandRTSPServer" when connecting with multiple clients In-Reply-To: References: Message-ID: <1F76733C-4F96-447D-93CB-B5EA3A34933A@live555.com> Because your server's CPU utilization remains low, there seems to be only two possibilities: 1/ You are approaching the capacity of your network. But in your earlier message, you said that the problem also happens when you run all clients on the same computer as the server, which suggests instead: 2/ You are running into a limit - imposed by your OS - in the number of sockets that can be open (by the server application) at any one time. See http://www.live555.com/liveMedia/faq.html#scalability Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Apr 24 07:50:35 2014 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 24 Apr 2014 07:50:35 -0700 Subject: [Live-devel] low perfomance in "onDemandRTSPServer" when connecting with multiple clients In-Reply-To: References: Message-ID: Also, I hope your computer's OS is Unix-based (e.g., Linux or BSD), and not Windows. Windows is notorious for scalability problems. It is simply not a serious OS for running servers (especially high-performance servers). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From david.bueno at scati.com Thu Apr 24 08:41:26 2014 From: david.bueno at scati.com (David Bueno Monge) Date: Thu, 24 Apr 2014 17:41:26 +0200 Subject: [Live-devel] low perfomance in "onDemandRTSPServer" when connecting with multiple clients In-Reply-To: <1F76733C-4F96-447D-93CB-B5EA3A34933A@live555.com> References: <1F76733C-4F96-447D-93CB-B5EA3A34933A@live555.com> Message-ID: None of these is the problem, as you have said, the CPU utilization remains low, and there is no network problem as i am running server and cliente on the same machine. The number of sockets cannot be a problem, because with high resolution videos, only with 5 or 6 clients connected, the performance decreases a lot. You can continue adding clients, but without a good performance... Nevertheless, I have checked what you said, and the problem continues... 2014-04-24 16:34 GMT+02:00 Ross Finlayson : > Because your server's CPU utilization remains low, there seems to be only > two possibilities: > 1/ You are approaching the capacity of your network. But in your earlier > message, you said that the problem also happens when you run all clients on > the same computer as the server, which suggests instead: > 2/ You are running into a limit - imposed by your OS - in the number of > sockets that can be open (by the server application) at any one time. See > http://www.live555.com/liveMedia/faq.html#scalability > > > Ross Finlayson > Live Networks, Inc. 
> http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -- *David Bueno Monge*Software Engineer Skype dbueno_scati *------------------------------*[image: http://www.scati.com] T +34 902 116 095 F +34 976 466 580 *------------------------------* Bari, 23 Plataforma Log?stica PLAZA 50.197 Zaragoza (Spain) *www.scati.com * *------------------------------* *Disclaimer:* This e-mail (including any attached documents) is proprietary and confidential and may contain legally privileged information. It is intended for the named recipient(s) only. If you are not the intended recipient, you may not review, retain, copy or distribute this message, and we kindly ask you to notify the sender by e-mail immediately and delete this message from your system. *Please consider your environmental responsibility before printing this e-mail.* -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: image/png Size: 9971 bytes Desc: not available URL: From finlayson at live555.com Thu Apr 24 12:02:06 2014 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 24 Apr 2014 12:02:06 -0700 Subject: [Live-devel] low perfomance in "onDemandRTSPServer" when connecting with multiple clients In-Reply-To: References: <1F76733C-4F96-447D-93CB-B5EA3A34933A@live555.com> Message-ID: <3CACBFAA-A9A8-4C7D-B10E-6AD98497A211@live555.com> > None of these is the problem, as you have said, the CPU utilization remains low, and there is no network problem as i am running server and cliente on the same machine. > > The number of sockets cannot be a problem, because with high resolution videos, only with 5 or 6 clients connected, the performance decreases a lot. You can continue adding clients, but without a good performance... Are these clients actually decoding and displaying the video (rather than just receiving it)? If so, then perhaps your bottleneck is there - in whatever is doing the decoding? E.g., do you have a separate GPU or something doing the decoding? One way to test this is to use "openRTSP" as your client. "openRTSP" just receives data (and outputs it to a file), but does not decode it. Try running "openRTSP" multiple times (you may wish to use the "-F " option to give each output file a different filename prefix, to distinguish them). Then try playing the received files using a media player. (If your video is H.264, and you're using VLC as your media player, then you'll need to rename each video file to have a ".h264" filename suffix.) This will tell you whether or not data loss is happening. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From elpot_15 at hotmail.com Wed Apr 23 17:16:45 2014 From: elpot_15 at hotmail.com (Luis Lorenzo Bill Clark) Date: Wed, 23 Apr 2014 17:16:45 -0700 Subject: [Live-devel] Can't record video + audio from h.264 IP camera In-Reply-To: <10E70E84-4398-4774-9610-7CBFDEC8013F@live555.com> References: , , , <10E70E84-4398-4774-9610-7CBFDEC8013F@live555.com> Message-ID: Hi, I am trying to record audio/video (mainly audio) from 4 IP cameras that output h264. But I can get the audio part of the recording, and only the video shows up in the files. 
Actually, the video is not as important as the audio.Here is one of the commands I am using: ./openRTSP -d 5 -4 -f 30 -w 640 -h 480 -4 rtsp://192.168.1.6/MediaInput/h264 > hello.mp4 I've tried this with .mov and .avi and nothing good happens. What am I missing? Am I using the wrong format? Missing some decoding arguments? I need the audio from all 4 cameras at the same time, so I'm trying to put his into a python script to do it.If you have another suggestion/program/strategy for recording more than 1 audio/video file at the same time, then let me know. Thanks in advanced!!!!!! -------------- next part -------------- An HTML attachment was scrubbed... URL: From goran.ambardziev at gmail.com Thu Apr 24 09:14:15 2014 From: goran.ambardziev at gmail.com (Goran Ambardziev) Date: Thu, 24 Apr 2014 18:14:15 +0200 Subject: [Live-devel] AAC/H264 Closing Down Message-ID: <53593857.5090104@gmail.com> Hello Warren and thanks. > >/ On this line VS says: "HEAP: Free Heap block 3043560 modified at 304361c > />/ after it was freed" > / > What does the stack trace look like when this happens? Here's the call stack: MFVideoGenerator Test.exe!_free_base(void * pBlock=0x039505f8) Line 109 + 0x12 bytes C MFVideoGenerator Test.exe!_free_dbg_nolock(void * pUserData=0x03950618, int nBlockUse=0x00000001) Line 1426 + 0x9 bytes C++ MFVideoGenerator Test.exe!_free_dbg(void * pUserData=0x03950618, int nBlockUse=0x00000001) Line 1258 + 0xd bytes C++ MFVideoGenerator Test.exe!operator delete(void * p=0x03950618) Line 373 + 0xb bytes C++ MFVideoGenerator Test.exe!BasicHashTable::`scalar deleting destructor'() + 0x3c bytes C++ MFVideoGenerator Test.exe!RTCPMemberDatabase::~RTCPMemberDatabase() Line 35 + 0x37 bytes C++ MFVideoGenerator Test.exe!RTCPMemberDatabase::`scalar deleting destructor'() + 0x2b bytes C++ MFVideoGenerator Test.exe!RTCPInstance::~RTCPInstance() Line 194 + 0x3a bytes C++ MFVideoGenerator Test.exe!RTCPInstance::`scalar deleting destructor'() + 0x2b bytes C++ MFVideoGenerator Test.exe!MediaLookupTable::remove(const char * name=0x0398bcb8) Line 151 + 0x35 bytes C++ MFVideoGenerator Test.exe!Medium::close(UsageEnvironment & env={...}, const char * name=0x0398bcb8) Line 54 C++ MFVideoGenerator Test.exe!Medium::close(Medium * medium=0x0398bcb0) Line 59 + 0x17 bytes C++ MFVideoGenerator Test.exe!CMulticastH264Streamer::StreamerThread() Line 232 + 0xc bytes C++ > It sounds like the opposite of a leak, where you're either > double-free'ing something or freeing something too early. I posted the code in the first post in this thread. Basically, everything is in one thread (StreamerThread from the stack) and for testing I disabled any methods that are called from other threads from outside (basically, threads that add H264/AAC data to the sources). > Can you run the app on Linux? If your app has a GUI, you should be able > to strip out the core of the app so that it will build and run on Linux > without the GUI. (If not, it means you have application logic mixed in > with your display code, which is bad design from the start.) I have couple of abstraction layers in separate libs until the code is used by GUI (which is there (the GUI) for testing purposes only). live555 is wrapped in separate lib, then one MediaFoundation component that uses it, then MediaFoundation component(lib) that creates and run the stream flow etc. Unfortunately, I cannot run the app on linux, because I do not have any developer experience with it, nor can I run MediaFoundation component for encoding H264/AAC in linux. 
Anyway, if you could look, here's my code from StreamerThread // Begin by setting up our usage environment: // Create 'groupsocks' for RTP and RTCP: // Create RTSP server // Create server media session // Video stream groupsock's RTCPInstance* rtcp = NULL; // Add H264 stream if( m_settings.VideoStream ) { // Create a 'H264 Video RTP' sink from the RTP 'groupsock': // Create (and start) a 'RTCP instance' for this RTP sink: sms->addSubsession( PassiveServerMediaSubsession::createNew( *m_pRtpVideoSink, rtcp ) ); } // Audio stream groupsock's RTCPInstance* rtcpAudio = NULL; // Add AAC Stream if( m_settings.AudioStream ) { // Create (and start) a 'RTCP instance' for this RTP sink: sms->addSubsession( PassiveServerMediaSubsession::createNew( *m_pRtpAudioSink, rtcpAudio ) ); } rtspServer->addServerMediaSession( sms ); // Get the streaming URL m_streamingURL = rtspServer->rtspURL( sms ); *m_pUsageEnv << "Play this stream using the URL \"" << m_streamingURL << "\"\n"; if( m_settings.VideoStream ) { m_pH264FramedSource = CH264FramedSource::createNew( *m_pUsageEnv, 0, 40000 ); FramedSource* framedSource = m_pH264FramedSource; m_pH264DiscreteFramer = H264VideoStreamDiscreteFramer::createNew( *m_pUsageEnv, framedSource ); m_pRtpVideoSink->startPlaying( *m_pH264DiscreteFramer, afterPlayback, this ); } if( m_settings.AudioStream ) { timeval pTime; if( m_settings.VideoStream ) { m_pH264FramedSource->GetPresentationTime(pTime); } m_pAacFrameedSource = AACFramedSource::createNew( *m_pUsageEnv, 0, 40000, pTime ); m_pRtpAudioSink->startPlaying( *m_pAacFrameedSource, afterPlayback, this ); } m_pUsageEnv->taskScheduler().doEventLoop( &m_doneFlag ); // does not return // Close everything // if(m_pRtpVideoSink != NULL) { m_pRtpVideoSink->stopPlaying(); } if(m_pRtpAudioSink != NULL) { m_pRtpAudioSink->stopPlaying(); } Medium::close( m_pRtpVideoSink ); Medium::close( m_pRtpAudioSink ); Medium::close( m_pH264FramedSource ); Medium::close( m_pAacFrameedSource ); Medium::close(rtcp); Medium::close(rtcpAudio); rtpVideoGroupsock->removeAllDestinations(); rtcpVideoGroupsock->removeAllDestinations(); delete rtpVideoGroupsock; delete rtcpVideoGroupsock; rtpAudioGroupsock->removeAllDestinations(); rtcpAudioGroupsock->removeAllDestinations(); delete rtpAudioGroupsock; delete rtcpAudioGroupsock; Medium::close(rtspServer); m_pUsageEnv->reclaim(); delete scheduler; SetEvent( m_hCloseEvt ); Thanks. -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Apr 24 17:52:00 2014 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 24 Apr 2014 17:52:00 -0700 Subject: [Live-devel] Can't record video + audio from h.264 IP camera In-Reply-To: References: , , , <10E70E84-4398-4774-9610-7CBFDEC8013F@live555.com> Message-ID: > I am trying to record audio/video (mainly audio) from 4 IP cameras that output h264. But I can get the audio part of the recording, and only the video shows up in the files. Actually, the video is not as important as the audio. > Here is one of the commands I am using: > > ./openRTSP -d 5 -4 -f 30 -w 640 -h 480 -4 rtsp://192.168.1.6/MediaInput/h264 > hello.mp4 > > > I've tried this with .mov and .avi and nothing good happens. You didn't say what kind of audio (i.e., what audio codec) this is. That might have affected my answer. But the only thing I can suggest right now is to try receiving/recording the audio stream only - i.e., without the video stream. I.e., try adding the "-a" option. Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From Richard.CK.Chiu at hkcsl.com Thu Apr 24 18:23:54 2014 From: Richard.CK.Chiu at hkcsl.com (Chiu, Richard CK) Date: Fri, 25 Apr 2014 01:23:54 +0000 Subject: [Live-devel] streaming H.265 In-Reply-To: <47DBCA57-6822-4954-9175-2DE4D3186BF0@live555.com> References: <03B818FD871B0849985EF787DADF69D7630E1798@wsmbs05> <47DBCA57-6822-4954-9175-2DE4D3186BF0@live555.com> Message-ID: <03B818FD871B0849985EF787DADF69D7630E18C0@wsmbs05> Dear Ross Thank for prompt reply. Would u help advise other pre-built player (either in window or android) which we can test the .265 streaming? Richard Chiu Senior Engineer | Mobile Internet & Video Services Mobile: +852 9214 1695 | Fax: +852 29625504 20/F, Manhattan Place, 23 Wang Tai Road, Kowloon Bay, Kowloon www.hkcsl.com [cid:clogo.jpg] From: live-devel [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Thursday, April 24, 2014 4:37 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] streaming H.265 We would like to streaming the .265 (surfing.265), using the live555 and VLC. We compile the latest source of live555, but the streaming failed However, it work fine using .264 sample, or VLC can play the file directly, i.e. surfing.265 Any idea? Yes, to play H.265 RTP streams, VLC needs to be upgraded (recompiled) to use a recent version of the "LIVE555 Streaming Media" code. (VLC uses our code when accessing RTSP/RTP streams.) Unfortunately the prebuilt binary versions of VLC - available from videolan.org - still use an old version of our code, so you'll need to build your own version of VLC - with a recent version of the LIVE555 libraries - in order for it to be able to play H.265 video RTSP/RTP streams. Ross Finlayson Live Networks, Inc. http://www.live555.com/ This communication (and any attachment) may contain information confidential to CSL Limited or information that is legally privileged, proprietary or otherwise protected by law. Privilege is not waived by the transmission to one or more parties in error. If you are not the intended recipient, you must not keep, forward, copy, use, save or rely on this communication and any such action is unauthorised and prohibited. If you have received this communication in error, please reply to this e-mail to notify the sender of its incorrect delivery, and then delete both it and your reply. Thank you. -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: image001.jpg Type: image/jpeg Size: 9201 bytes Desc: image001.jpg URL: From david.bueno at scati.com Fri Apr 25 04:28:55 2014 From: david.bueno at scati.com (David Bueno Monge) Date: Fri, 25 Apr 2014 13:28:55 +0200 Subject: [Live-devel] low perfomance in "onDemandRTSPServer" when connecting with multiple clients In-Reply-To: <3CACBFAA-A9A8-4C7D-B10E-6AD98497A211@live555.com> References: <1F76733C-4F96-447D-93CB-B5EA3A34933A@live555.com> <3CACBFAA-A9A8-4C7D-B10E-6AD98497A211@live555.com> Message-ID: I have tested the server using openRTSP as client instead of VLC, and that was the problem. VLC get stuck when visualizing multiple videos at the same time. The perfomance is quite good if no decoding is done. 
I have also tested the live555 server with our corporative app and it can manage lots of connexions wihtout problem, even with our app decoding and visualizing the sources in the same machine, basically what i could not do whit VLC. I will continue testing the server and also trying to develop a better "FramedSource" implementation in order to stream data from live sources. Thanks you very much for your help 2014-04-24 21:02 GMT+02:00 Ross Finlayson : > None of these is the problem, as you have said, the CPU utilization > remains low, and there is no network problem as i am running server and > cliente on the same machine. > > The number of sockets cannot be a problem, because with high resolution > videos, only with 5 or 6 clients connected, the performance decreases a > lot. You can continue adding clients, but without a good performance... > > > Are these clients actually decoding and displaying the video (rather than > just receiving it)? If so, then perhaps your bottleneck is there - in > whatever is doing the decoding? E.g., do you have a separate GPU or > something doing the decoding? > > One way to test this is to use "openRTSP" as your client. "openRTSP" just > receives data (and outputs it to a file), but does not decode it. Try > running "openRTSP" multiple times (you may wish to use the "-F > " option to give each output file a different filename > prefix, to distinguish them). Then try playing the received files using a > media player. (If your video is H.264, and you're using VLC as your media > player, then you'll need to rename each video file to have a ".h264" > filename suffix.) This will tell you whether or not data loss is happening. > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -- *David Bueno Monge*Software Engineer Skype dbueno_scati *------------------------------*[image: http://www.scati.com] T +34 902 116 095 F +34 976 466 580 *------------------------------* Bari, 23 Plataforma Log?stica PLAZA 50.197 Zaragoza (Spain) *www.scati.com * *------------------------------* *Disclaimer:* This e-mail (including any attached documents) is proprietary and confidential and may contain legally privileged information. It is intended for the named recipient(s) only. If you are not the intended recipient, you may not review, retain, copy or distribute this message, and we kindly ask you to notify the sender by e-mail immediately and delete this message from your system. *Please consider your environmental responsibility before printing this e-mail.* -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: image/png Size: 9971 bytes Desc: not available URL: From Martin.Bene at icomedias.com Sat Apr 26 00:47:15 2014 From: Martin.Bene at icomedias.com (Martin Bene) Date: Sat, 26 Apr 2014 07:47:15 +0000 Subject: [Live-devel] implementation of digest authentication in RTSPClient.cpp / DigestAuthentication.cpp Message-ID: <4b32cb16ee224695bd1c0718e72a8c48@exchange13.ad.icomedias.com> Hi, I?m running into issues trying to receive streams from IP Cameras requiring digest authentication. It looks like the current implementation of digest authentication is based on RFC 2069 which was obsoleted 2617 in June 1999. 
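(For reference, the RFC 2069-style response that the library's "Authenticator::computeDigestResponse()" produces can be written out as

    response = MD5( MD5(username ":" realm ":" password) ":" nonce ":" MD5(command ":" url) )

where "command" is the RTSP method name, e.g. DESCRIBE. The qop, opaque and algorithm attributes quoted below play no part in that calculation.)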
Here's an example authentication header returned by one of my cameras: WWW-Authenticate: Digest realm="NC-336PW-HD-1080P", qop="auth", nonce="21a6feb2994257caa17adcf5aa80dd5f", opaque="5ccc069c403ebaf9f0171e9517f40e41", algorithm="MD5", stale="FALSE" This header fails to match the statement used to extract the digest authentication parameters from the WWW-Authenticate header: if (sscanf(paramsStr, "Digest realm=\"%[^\"]\", nonce=\"%[^\"]\"", realm, nonce) == 2) Changing the parsing so that realm and nonce get extracted leads to successful authentication with my camera. Would a patch to the extraction mechanism be acceptable? As a second step it would probably be a good idea to add support for opaque, client nonce, nonce-count and qop as well. Thanks for any feedback.

From finlayson at live555.com Sat Apr 26 01:51:22 2014 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 26 Apr 2014 01:51:22 -0700 Subject: [Live-devel] implementation of digest authentication in RTSPClient.cpp / DigestAuthentication.cpp In-Reply-To: <4b32cb16ee224695bd1c0718e72a8c48@exchange13.ad.icomedias.com> References: <4b32cb16ee224695bd1c0718e72a8c48@exchange13.ad.icomedias.com> Message-ID: <45A6C580-CD0C-44FF-B49E-97538DCE55B7@live555.com>

This issue has come up a couple of times in the past - usually with 'Vivotek' IP cameras. It turns out that it's the IP camera - not the "LIVE555 Streaming Media" software - that's non-standard. See http://lists.live555.com/pipermail/live-devel/2012-March/014839.html Please tell your IP camera manufacturer to remove the qop="auth", string from their "WWW-Authenticate:" strings, and thereby properly conform to the RTSP standard.

Ross Finlayson Live Networks, Inc. http://www.live555.com/
URL:

From vanessa.chodaton at etu.univ-nantes.fr Sat Apr 26 04:42:31 2014
From: vanessa.chodaton at etu.univ-nantes.fr (Vanessa CHODATON)
Date: Sat, 26 Apr 2014 13:42:31 +0200 (CEST)
Subject: [Live-devel] RTP source filter
Message-ID:

Good morning, I am a student at Polytech Nantes (France), and for my university project I have to design a source filter for the RTP protocol only, with the LIVE555 library. I am having trouble finding the right functions to achieve this, since I have not been able to find documentation on LIVE555's functions. Could you please help me find the useful functions and give me some ideas on how to proceed? I already have a source filter code for the UDP protocol, which I understand well, and I will adapt it for RTP.

Thank you for your attention.
Regards,
Vanessa Chodaton

From Martin.Bene at icomedias.com Sat Apr 26 07:01:30 2014
From: Martin.Bene at icomedias.com (Martin Bene)
Date: Sat, 26 Apr 2014 14:01:30 +0000
Subject: [Live-devel] implementation of digest authentication in RTSPClient.cpp / DigestAuthentication.cpp
In-Reply-To: <45A6C580-CD0C-44FF-B49E-97538DCE55B7@live555.com>
References: <4b32cb16ee224695bd1c0718e72a8c48@exchange13.ad.icomedias.com> <45A6C580-CD0C-44FF-B49E-97538DCE55B7@live555.com>
Message-ID: <83aa722b808a4848aca3bfe2063ded93@exchange13.ad.icomedias.com>

> It turns out that it's the IP camera - not the "LIVE555 Streaming Media" software - that's non-standard. See
> http://lists.live555.com/pipermail/live-devel/2012-March/014839.html
>
> Please tell your IP camera manufacturer to remove the
> qop="auth",
> string from their "WWW-Authenticate:" strings, and thereby properly conform to the RTSP standard.

Thanks for pointing that out; so qop should indeed not be present in the WWW-Authenticate header, as per RFC 2069. Given the statement in RFC 2617 that it is intended to serve as a replacement for 2069, and that RFC 2069 states "Obsoleted by: 2617", I can certainly see how camera implementations of RTSP digest authentication end up implementing the newer standard.

A way of parsing the WWW-Authenticate header that can handle any order of attributes and can ignore the presence of additional attributes would seem to be preferable. Let me know if such a modification would be acceptable.

Thanks, Martin
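For illustration only - this is not LIVE555 code - an order-independent parse of the attributes in the camera header quoted earlier in this thread could scan key="value" pairs one at a time (the leading "Digest " scheme token is assumed to have been stripped already):

// Sketch of order-independent parsing of Digest "WWW-Authenticate:" attributes.
// Illustrative only; parseDigestParams() is a made-up helper, not part of liveMedia.
#include <cstdio>
#include <string>
#include <map>

static std::map<std::string, std::string> parseDigestParams(char const* paramsStr) {
  std::map<std::string, std::string> params;
  char const* p = paramsStr;
  while (*p != '\0') {
    while (*p == ' ' || *p == ',') ++p;          // skip separators between attributes
    char const* keyStart = p;
    while (*p != '\0' && *p != '=') ++p;         // attribute name runs up to '='
    if (*p == '\0') break;
    std::string key(keyStart, p - keyStart);
    ++p;                                          // skip '='
    std::string value;
    if (*p == '"') {                              // quoted value, e.g. realm="..."
      ++p;
      char const* valStart = p;
      while (*p != '\0' && *p != '"') ++p;
      value.assign(valStart, p - valStart);
      if (*p == '"') ++p;
    } else {                                      // unquoted value, e.g. algorithm=MD5
      char const* valStart = p;
      while (*p != '\0' && *p != ',') ++p;
      value.assign(valStart, p - valStart);
    }
    params[key] = value;
  }
  return params;
}

int main() {
  char const* hdr =
    "realm=\"NC-336PW-HD-1080P\", qop=\"auth\", "
    "nonce=\"21a6feb2994257caa17adcf5aa80dd5f\", "
    "opaque=\"5ccc069c403ebaf9f0171e9517f40e41\", algorithm=\"MD5\", stale=\"FALSE\"";
  std::map<std::string, std::string> p = parseDigestParams(hdr);
  std::printf("realm=%s nonce=%s\n", p["realm"].c_str(), p["nonce"].c_str());
  return 0;
}

With this style of parsing, realm and nonce are still extracted even when the camera inserts qop, opaque or algorithm between them.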
From finlayson at live555.com Sat Apr 26 07:09:36 2014
From: finlayson at live555.com (Ross Finlayson)
Date: Sat, 26 Apr 2014 07:09:36 -0700
Subject: [Live-devel] RTP source filter
In-Reply-To: References: Message-ID: <6C318046-B64B-4C68-9062-9825D02E9489@live555.com>

> I am a student at Polytech Nantes (France) and for my university project,
> I have to design a source filter for the RTP protocol only, with the LIVE555
> library.

By "source filter", do you mean "a way to accept or reject incoming RTP packets, based on their IP source (i.e., 'from') address"? If so, then you cannot do this with the "LIVE555 Streaming Media" software, because the IP source address of incoming RTP packets is not made available to receivers. Sorry.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From finlayson at live555.com Sat Apr 26 07:19:58 2014
From: finlayson at live555.com (Ross Finlayson)
Date: Sat, 26 Apr 2014 07:19:58 -0700
Subject: [Live-devel] implementation of digest authentication in RTSPClient.cpp / DigestAuthentication.cpp
In-Reply-To: <83aa722b808a4848aca3bfe2063ded93@exchange13.ad.icomedias.com>
References: <4b32cb16ee224695bd1c0718e72a8c48@exchange13.ad.icomedias.com> <45A6C580-CD0C-44FF-B49E-97538DCE55B7@live555.com> <83aa722b808a4848aca3bfe2063ded93@exchange13.ad.icomedias.com>
Message-ID:

> Let me know if such a modification would be acceptable.

No - because our code already complies with the standard, but your IP camera does not. You need to contact the manufacturer of your IP camera, telling them to fix their cameras (e.g., via a firmware upgrade) to comply with the standard - and therefore be compatible with the large installed base (a much larger installed base than theirs!) of LIVE555-based receivers. (If the manufacturer of your camera wishes to discuss this with me, then I'd be glad to hear from them - directly.)

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From mark.theunissen at gmail.com Sat Apr 26 04:02:25 2014
From: mark.theunissen at gmail.com (Mark Theunissen)
Date: Sat, 26 Apr 2014 13:02:25 +0200
Subject: [Live-devel] Stream h264 - bitrate's effect on latency
Message-ID:

I have three Raspi cameras pushing out h264 elementary streams, which I'm reading from stdin and sending out using live555's RTSP server. All works great.

The only problem is that the three camera feeds can sometimes have different bitrates depending on what they're looking at - I want them synchronized. It seems that the cameras with the higher bitrates are closer to real-time, and the lower bitrates lag more. I expect this is because there is some fixed-size buffer somewhere, and the higher the bitrate, the faster the buffer is filled and the more realtime the stream is.

If I manually limit the bitrate so that all cameras are the same, they synchronize perfectly. However, I'd rather allow them to have different bitrates, but still be synchronized.

Can anyone help with this? Is there a setting in live555 somewhere? Perhaps a variable buffer size, or rather no buffer, so that video is sent ASAP? Gstreamer manages extremely low latency, but uses too much CPU for my application.
I'm just using the test app modified to read from stdin.

Thanks
Mark

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From vanessa.chodaton at etu.univ-nantes.fr Sat Apr 26 23:00:22 2014
From: vanessa.chodaton at etu.univ-nantes.fr (Vanessa CHODATON)
Date: Sun, 27 Apr 2014 08:00:22 +0200 (CEST)
Subject: [Live-devel] RTP source filter
Message-ID: <66b3b8767d31d5fa56ca8c9050b5d9b7.squirrel@webmail-etu.univ-nantes.fr>

>> I am a student at Polytech Nantes (France) and for my university project,
>> I have to design a source filter for the RTP protocol only, with the LIVE555
>> library.
>
> By "source filter", do you mean "a way to accept or reject incoming RTP
> packets, based on their IP source (i.e., 'from') address"? If so, then
> you cannot do this with the "LIVE555 Streaming Media" software, because
> the IP source address of incoming RTP packets is not made available to
> receivers. Sorry.

Thank you for your response. Actually, I want to do something like the testMPEG2TransportReceiver that I found in the "LIVE555 Streaming Media" testprogs. This code reads an MPEG Transport/RTP stream (from the same multicast group/port) and outputs the reconstituted MPEG Transport Stream to "stdout". I tested it and it works very well. My source filter will receive the MPEG Transport Stream, but instead of writing it to "stdout", I want to put it on the output pin of my source filter. I am having trouble finding the appropriate functions to do this. Could you please give me some pointers?

Thank you for your attention.
Regards,
Vanessa Chodaton

From finlayson at live555.com Sun Apr 27 00:53:03 2014
From: finlayson at live555.com (Ross Finlayson)
Date: Sun, 27 Apr 2014 00:53:03 -0700
Subject: [Live-devel] RTP source filter
In-Reply-To: <66b3b8767d31d5fa56ca8c9050b5d9b7.squirrel@webmail-etu.univ-nantes.fr>
References: <66b3b8767d31d5fa56ca8c9050b5d9b7.squirrel@webmail-etu.univ-nantes.fr>
Message-ID: <51CE4844-EAF6-46F6-BF2D-A8FECA497EC8@live555.com>

> Actually, I want to do something like the testMPEG2TransportReceiver that I found
> in the "LIVE555 Streaming Media" testprogs. This code reads an MPEG
> Transport/RTP stream (from the same multicast group/port) and outputs the
> reconstituted MPEG Transport Stream to "stdout". I tested it and it works
> very well. My source filter will receive the MPEG Transport Stream, but instead
> of writing it to "stdout", I want to put it on the output pin of my source
> filter.

You can do this without making any changes to the "testMPEG2TransportReceiver". Just write your 'filter' application to read from 'stdin', and run
	testMPEG2TransportReceiver | your_filter_application

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

-------------- next part --------------
An HTML attachment was scrubbed...
URL:
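To make that suggestion concrete, here is a sketch (plain C++, not LIVE555 code; processPacket() is a placeholder for whatever should sit behind the "output pin") of a filter program that reads the reconstituted Transport Stream from stdin, 188 bytes at a time:

// Stand-alone filter consuming the TS produced by "testMPEG2TransportReceiver" on stdin.
#include <cstdio>
#include <cstddef>

static const size_t TS_PACKET_SIZE = 188;

static void processPacket(unsigned char const* pkt, size_t len) {
  // Placeholder: push "len" bytes to the downstream output pin here.
  if (pkt[0] != 0x47) std::fprintf(stderr, "warning: lost TS sync\n"); // 0x47 is the TS sync byte
  (void)len;
}

int main() {
  unsigned char buf[TS_PACKET_SIZE];
  size_t have = 0;
  while (true) {
    size_t n = std::fread(buf + have, 1, TS_PACKET_SIZE - have, stdin);
    if (n == 0) break;              // EOF or error: the upstream receiver exited
    have += n;
    if (have == TS_PACKET_SIZE) {   // one complete 188-byte TS packet assembled
      processPacket(buf, have);
      have = 0;
    }
  }
  return 0;
}

It would then be run exactly as described above, e.g. "testMPEG2TransportReceiver | ts_filter" (the binary name is arbitrary).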
From jshanab at jfs-tech.com Sat Apr 26 20:03:10 2014
From: jshanab at jfs-tech.com (Jeff Shanab)
Date: Sat, 26 Apr 2014 23:03:10 -0400
Subject: [Live-devel] Stream h264 - bitrate's effect on latency
In-Reply-To: References: Message-ID:

What is the resolution? At higher resolutions and/or encoding qualities the encoder/decoder may require multiple calls to pump out the frame. I assume that you are sticking with a simple base profile specifically to avoid B frames (bidirectional predictive B frames would obviously add latency). To synchronize, you have to be at the mercy of the slowest stream. Of course MJPEG, with every frame standing on its own, would avoid the problem but have much higher bandwidth.

Other little things:

The smaller the packet, the lower the latency - which is why transport streams have such tiny 188 (or 204) byte packets. This allows mixing streams at a finer granularity and assembling them as soon as you get all the info from multiple streams at once. (Each network packet of MTU 1400 can contain parts of multiple streams, so it appears more parallel to the next level up in the code.)

The transport UDP/TCP choice can affect latency: if data is lost it needs to be re-sent with TCP, whereas UDP can drop it and move on, and therefore has less latency.

How is it encoded - software or hardware/firmware? Software can be improved with GPU encoding (the Raspi may have this available). Software may also be improved by periodic intra-refresh; stock H264 can have large key frames that require a chunk of work to handle.

Whoa - did I hear you correctly? stdin? I cannot even speculate how you can quantify stdin timing. (Try sockets or shared memory.)

Make sure you create your timestamps at the end of image capture / beginning of encoding time. Since you are the source, the timestamps can all be based off the same clock.

Other idea: while normally the first substream of a video stream is the audio, perhaps you could create a stream with n substreams, each a video. Then they are synced?

On Sat, Apr 26, 2014 at 7:02 AM, Mark Theunissen wrote:
> I have three Raspi cameras pushing out h264 elementary streams, which I'm
> reading from stdin and sending out using live555's RTSP server. All works
> great.
>
> The only problem is that the three camera feeds can sometimes have
> different bitrates depending on what they're looking at - I want them
> synchronized. It seems that the cameras with the higher bitrates are closer
> to real-time, and the lower bitrates lag more. I expect this is because
> there is some fixed-size buffer somewhere, and the higher the bitrate, the
> faster the buffer is filled and the more realtime the stream is.
>
> If I manually limit the bitrate so that all cameras are the same, they
> synchronize perfectly.
>
> However, I'd rather allow them to have different bitrates, but
> still be synchronized.
>
> Can anyone help with this? Is there a setting in live555 somewhere?
> Perhaps a variable buffer size, or rather no buffer, so that video is sent
> ASAP? Gstreamer manages extremely low latency, but uses too much CPU for my
> application.
>
> I'm just using the test app modified to read from stdin.
>
> Thanks
> Mark
>
> _______________________________________________
> live-devel mailing list
> live-devel at lists.live555.com
> http://lists.live555.com/mailman/listinfo/live-devel
>

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From finlayson at live555.com Sun Apr 27 13:57:23 2014
From: finlayson at live555.com (Ross Finlayson)
Date: Sun, 27 Apr 2014 13:57:23 -0700
Subject: [Live-devel] Stream h264 - bitrate's effect on latency
In-Reply-To: References: Message-ID: <41F17643-BC44-4C7A-A929-3D56BF9C9407@live555.com>

> I have three Raspi cameras pushing out h264 elementary streams, which I'm reading from stdin and sending out using live555's RTSP server. All works great.
>
> The only problem is that the three camera feeds can sometimes have different bitrates depending on what they're looking at - I want them synchronized. It seems that the cameras with the higher bitrates are closer to real-time, and the lower bitrates lag more. I expect this is because there is some fixed-size buffer somewhere, and the higher the bitrate, the faster the buffer is filled and the more realtime the stream is.

This buffer will be the buffer that's in your OS - between your camera (that's writing to 'stdout'), and the LIVE555 server (that's reading from 'stdin'). I.e., it'll be in whatever pipe you have between the camera and the server. It may be possible (using some ioctl() or something) to change this buffer size.

Alternatively, you may find it easier to have your camera's encoder use a (virtual) file that's named in the file system (e.g., somewhere in /dev), and then have the LIVE555 server read directly from this 'file', rather than from stdin. That will avoid having an intermediate pipe set up by the OS.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

-------------- next part --------------
An HTML attachment was scrubbed...
URL:
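On Linux, the "some ioctl() or something" mentioned above is actually an fcntl: F_SETPIPE_SZ (available since kernel 2.6.35). A hedged sketch - the 64 KB target is purely an example, and this assumes stdin really is a pipe from the encoder process - of shrinking the pipe so less encoded video can queue up between the camera and the server:

// Shrink the stdin pipe's kernel buffer to bound the latency that can accumulate
// between the camera process and the LIVE555 server. Linux-specific (F_SETPIPE_SZ,
// kernel >= 2.6.35); the 64 KB figure is only an example value.
#define _GNU_SOURCE
#include <fcntl.h>
#include <unistd.h>
#include <cstdio>

int main() {
  int newSize = fcntl(STDIN_FILENO, F_SETPIPE_SZ, 64 * 1024);
  if (newSize < 0) {
    std::perror("F_SETPIPE_SZ");   // e.g., stdin is not a pipe
    return 1;
  }
  std::printf("pipe buffer is now %d bytes\n", newSize);
  // ... continue with the normal server setup that reads from stdin ...
  return 0;
}

The kernel rounds the requested size up to a whole number of pages, so the value actually set is returned by the fcntl() call.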
From lcheng at grandstream.com Mon Apr 28 01:34:45 2014
From: lcheng at grandstream.com (Cheng Lei)
Date: Mon, 28 Apr 2014 16:34:45 +0800
Subject: [Live-devel] lead to a high CPU percentage without setting the gateway ip address
Message-ID:

Hi guys, I have run into a problem using live555 as a client on a Linux platform. If I delete the default gateway IP address and then run the test program testRTSPClient, the CPU percentage is 6%; when I add the default gateway IP back (using a command such as "route add default gw 192.168.122.1"), the CPU decreases to 1%. If you guys know the issue, please help, thanks.

best wishes, lei.

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From finlayson at live555.com Mon Apr 28 02:00:36 2014
From: finlayson at live555.com (Ross Finlayson)
Date: Mon, 28 Apr 2014 02:00:36 -0700
Subject: [Live-devel] lead to a high CPU percentage without setting the gateway ip address
In-Reply-To: References: Message-ID: <3911F587-B1B4-49B6-9C28-B743362AF17D@live555.com>

> I have run into a problem using live555 as a client on a Linux platform. If I delete the default gateway IP address and then run the test program testRTSPClient, the CPU percentage is 6%; when I add the default gateway IP back (using a command such as "route add default gw 192.168.122.1"), the CPU decreases to 1%. If you guys know the issue, please help, thanks.

I'm not convinced that 6% CPU is a 'problem', nor am I convinced that a slight increase in CPU usage when a host does not have a gateway address (something that all hosts should have anyway) is a 'problem'.

In any case, if you still consider this worth looking into, you're probably going to have to do so yourself - by running a profiling tool (e.g., "gprof").

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From lcheng at grandstream.com Mon Apr 28 02:33:00 2014
From: lcheng at grandstream.com (Cheng Lei)
Date: Mon, 28 Apr 2014 17:33:00 +0800
Subject: [Live-devel] lead to a high CPU percentage without setting the gateway ip address
In-Reply-To: <3911F587-B1B4-49B6-9C28-B743362AF17D@live555.com>
References: <3911F587-B1B4-49B6-9C28-B743362AF17D@live555.com>
Message-ID:

Thanks for your reply. But this may really be a problem for me: if I run more than ten clients like this, the CPU usage will be 60% or more. Even if someone has simply not set the gateway IP, I still want my app to work fine. I tried but failed to find out the reason.

best wishes, lei.

On Mon, Apr 28, 2014 at 5:00 PM, Ross Finlayson wrote:
> > I have run into a problem using live555 as a client on a Linux platform. If I
> > delete the default gateway IP address and then run the test program testRTSPClient,
> > the CPU percentage is 6%; when I add the default gateway IP back (using a command
> > such as "route add default gw 192.168.122.1"), the CPU decreases to 1%. If you guys
> > know the issue, please help, thanks.
>
> I'm not convinced that 6% CPU is a 'problem', nor am I convinced that a
> slight increase in CPU usage when a host does not have a gateway address
> (something that all hosts should have anyway) is a 'problem'.
>
> In any case, if you still consider this worth looking into, you're
> probably going to have to do so yourself - by running a profiling tool
> (e.g., "gprof").
>
> Ross Finlayson
> Live Networks, Inc.
> http://www.live555.com/
>
> _______________________________________________
> live-devel mailing list
> live-devel at lists.live555.com
> http://lists.live555.com/mailman/listinfo/live-devel
>

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From warren at etr-usa.com Mon Apr 28 13:15:14 2014
From: warren at etr-usa.com (Warren Young)
Date: Mon, 28 Apr 2014 14:15:14 -0600
Subject: [Live-devel] lead to a high CPU percentage without setting the gateway ip address
In-Reply-To: References: Message-ID: <535EB6D2.7060200@etr-usa.com>

On 4/28/2014 02:34, Cheng Lei wrote:
> If I delete the default gateway IP address and then run the test program
> testRTSPClient, the CPU percentage is 6%; when I add the default gateway IP
> back (using a command such as "route add default gw 192.168.122.1"), the CPU
> decreases to 1%. If you guys know the issue, please help, thanks.

Are the streams you're setting up via RTSP multicast, by chance? I have noticed that the Linux network stack seems to require a gateway to do multicasting correctly, even when the packets never leave the LAN.

If you absolutely cannot have a real gateway IP, try using the machine's own IP as the "gateway" address. That is usually enough to placate the routing layer of the stack.

From lcheng at grandstream.com Mon Apr 28 18:17:51 2014
From: lcheng at grandstream.com (Cheng Lei)
Date: Tue, 29 Apr 2014 09:17:51 +0800
Subject: [Live-devel] lead to a high CPU percentage without setting the gateway ip address
In-Reply-To: <535EB6D2.7060200@etr-usa.com>
References: <535EB6D2.7060200@etr-usa.com>
Message-ID:

I just use the command "testRTSPClient rtsp://192.168.122.199" to receive an H.264 stream from an IP camera, not using multicast. I've noticed that the client can't get the local IP address using the "ourIPAddress" function when the default gateway IP is missing; "ourIPAddress" sends a multicast packet to get the local IP. Using gprof as Ross suggested, I got this:

*without gateway ip:*

Each sample counts as 0.01 seconds.
  %   cumulative   self              self     total
 time   seconds   seconds    calls  us/call  us/call  name
 23.53      0.08     0.08                             write
 11.76      0.12     0.04                             sendto
  5.88      0.14     0.02                             close
  5.88      0.16     0.02                             recvmsg
  5.88      0.18     0.02                             vfprintf
  2.94      0.19     0.01      203    49.26    49.26  DummySink::afterGettingFrame(unsigned int, unsigned int, timeval, unsigned int)
  2.94      0.20     0.01                             _IO_default_xsputn
  2.94      0.21     0.01                             ourIPAddress(UsageEnvironment&)
  2.94      0.22     0.01                             DelayQueue::synchronize()
  2.94      0.23     0.01                             FramedSource::afterGetting(FramedSource*)
  2.94      0.24     0.01                             ___xstat64
  2.94      0.25     0.01                             __tzstring
  2.94      0.26     0.01                             _int_malloc
  2.94      0.27     0.01                             _xstat
  2.94      0.28     0.01                             make_request
  2.94      0.29     0.01                             mempcpy
  2.94      0.30     0.01                             our_random
  2.94      0.31     0.01                             puts
  2.94      0.32     0.01                             select
  2.94      0.33     0.01                             setsockopt
  2.94      0.34     0.01                             socket
  0.00      0.34     0.00      205     0.00     0.00  DummySink::continuePlaying()
  0.00      0.34     0.00      203     0.00    49.26  DummySink::afterGettingFrame(void*, unsigned int, unsigned int, timeval, unsigned int)
  0.00      0.34     0.00       11     0.00     0.00  operator<<(UsageEnvironment&, RTSPClient const&)
  0.00      0.34     0.00        9     0.00     0.00  operator<<(UsageEnvironment&, MediaSubsession const&)
  0.00      0.34     0.00        4     0.00     0.00  setupNextSubsession(RTSPClient*)
  0.00      0.34     0.00        3     0.00     0.00  continueAfterSETUP(RTSPClient*, int, char*)
  0.00      0.34     0.00        3     0.00     0.00  DummySink::createNew(UsageEnvironment&, MediaSubsession&, char const*)
  0.00      0.34     0.00        3     0.00     0.00  DummySink::DummySink(UsageEnvironment&, MediaSubsession&, char const*)
  0.00      0.34     0.00        1     0.00     0.00  continueAfterPLAY(RTSPClient*, int, char*)
  0.00      0.34     0.00        1     0.00     0.00  continueAfterDESCRIBE(RTSPClient*, int, char*)
  0.00      0.34     0.00        1     0.00     0.00  openURL(UsageEnvironment&, char const*, char const*)
  0.00      0.34     0.00        1     0.00     0.00  ourRTSPClient::createNew(UsageEnvironment&, char const*, int, char const*, unsigned short)
  0.00      0.34     0.00        1     0.00     0.00  ourRTSPClient::ourRTSPClient(UsageEnvironment&, char const*, int, char const*, unsigned short)
  0.00      0.34     0.00        1     0.00     0.00  main

*with gateway ip:*

Each sample counts as 0.01 seconds.
  %   cumulative   self              self     total
 time   seconds   seconds    calls  Ts/call  Ts/call  name
 53.33      0.08     0.08                             write
 13.33      0.10     0.02                             _int_malloc
  6.67      0.11     0.01                             BasicUsageEnvironment::operator<<(char const*)
  6.67      0.12     0.01                             cfree
  6.67      0.13     0.01                             new_do_write
  6.67      0.14     0.01                             recvfrom
  6.67      0.15     0.01                             vfprintf
  0.00      0.15     0.00      337     0.00     0.00  DummySink::continuePlaying()
  0.00      0.15     0.00      335     0.00     0.00  DummySink::afterGettingFrame(void*, unsigned int, unsigned int, timeval, unsigned int)
  0.00      0.15     0.00      335     0.00     0.00  DummySink::afterGettingFrame(unsigned int, unsigned int, timeval, unsigned int)
  0.00      0.15     0.00       11     0.00     0.00  operator<<(UsageEnvironment&, RTSPClient const&)
  0.00      0.15     0.00        9     0.00     0.00  operator<<(UsageEnvironment&, MediaSubsession const&)
  0.00      0.15     0.00        4     0.00     0.00  setupNextSubsession(RTSPClient*)
  0.00      0.15     0.00        3     0.00     0.00  continueAfterSETUP(RTSPClient*, int, char*)
  0.00      0.15     0.00        3     0.00     0.00  DummySink::createNew(UsageEnvironment&, MediaSubsession&, char const*)
  0.00      0.15     0.00        3     0.00     0.00  DummySink::DummySink(UsageEnvironment&, MediaSubsession&, char const*)
  0.00      0.15     0.00        1     0.00     0.00  continueAfterPLAY(RTSPClient*, int, char*)
  0.00      0.15     0.00        1     0.00     0.00  continueAfterDESCRIBE(RTSPClient*, int, char*)
  0.00      0.15     0.00        1     0.00     0.00  openURL(UsageEnvironment&, char const*, char const*)
  0.00      0.15     0.00        1     0.00     0.00  ourRTSPClient::createNew(UsageEnvironment&, char const*, int, char const*, unsigned short)
  0.00      0.15     0.00        1     0.00     0.00  ourRTSPClient::ourRTSPClient(UsageEnvironment&, char const*, int, char const*, unsigned short)
  0.00      0.15     0.00        1     0.00     0.00  main

I still don't know where the problem is. Please help.

On Tue, Apr 29, 2014 at 4:15 AM, Warren Young wrote:
> On 4/28/2014 02:34, Cheng Lei wrote:
>> If I delete the default gateway IP address and then run the test program
>> testRTSPClient, the CPU percentage is 6%; when I add the default gateway IP
>> back (using a command such as "route add default gw 192.168.122.1"), the CPU
>> decreases to 1%. If you guys know the issue, please help, thanks.
>
> Are the streams you're setting up via RTSP multicast, by chance? I have
> noticed that the Linux network stack seems to require a gateway to do
> multicasting correctly, even when the packets never leave the LAN.
>
> If you absolutely cannot have a real gateway IP, try using the machine's
> own IP as the "gateway" address. That is usually enough to placate the
> routing layer of the stack.
>
> _______________________________________________
> live-devel mailing list
> live-devel at lists.live555.com
> http://lists.live555.com/mailman/listinfo/live-devel

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From finlayson at live555.com Mon Apr 28 20:25:08 2014
From: finlayson at live555.com (Ross Finlayson)
Date: Mon, 28 Apr 2014 20:25:08 -0700
Subject: [Live-devel] lead to a high CPU percentage without setting the gateway ip address
In-Reply-To: References: <535EB6D2.7060200@etr-usa.com>
Message-ID: <3E46848F-EE87-4058-BA28-B05EFD75FA20@live555.com>

> I've noticed that the client can't get the local IP address using the "ourIPAddress" function when the default gateway IP is missing.

That's easy to fix. Just enable IP multicast routing on your network interface - i.e., add a route for 224.0.0.0/4

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

-------------- next part --------------
An HTML attachment was scrubbed...
URL:
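For anyone who cannot add that 224.0.0.0/4 route, one common workaround - generic POSIX sockets code, not part of the LIVE555 API - is to discover the local address with a connected UDP socket (no packets are sent; connect() only selects a route) and then supply that address to the application however it is configured. The camera IP below is simply the one used earlier in this thread, as an example:

// Determine the local IP address the kernel would use to reach a given peer.
#include <arpa/inet.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cstdio>
#include <cstring>

int main() {
  int fd = socket(AF_INET, SOCK_DGRAM, 0);
  if (fd < 0) { std::perror("socket"); return 1; }

  struct sockaddr_in peer;
  std::memset(&peer, 0, sizeof peer);
  peer.sin_family = AF_INET;
  peer.sin_port = htons(554);                      // RTSP port; the value is irrelevant here
  inet_pton(AF_INET, "192.168.122.199", &peer.sin_addr);

  if (connect(fd, (struct sockaddr*)&peer, sizeof peer) < 0) {
    std::perror("connect"); close(fd); return 1;
  }

  struct sockaddr_in local;
  socklen_t len = sizeof local;
  getsockname(fd, (struct sockaddr*)&local, &len); // the source address chosen for that route

  char buf[INET_ADDRSTRLEN];
  std::printf("local address: %s\n", inet_ntop(AF_INET, &local.sin_addr, buf, sizeof buf));
  close(fd);
  return 0;
}

Because the peer here is on-link, this works even when no default gateway (and no multicast route) is configured.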
From amir.lasmar.marzouk at gmail.com Mon Apr 28 03:00:36 2014
From: amir.lasmar.marzouk at gmail.com (Amir Marzouk)
Date: Mon, 28 Apr 2014 12:00:36 +0200
Subject: [Live-devel] list of audio stream
Message-ID:

Hello, I want to create a list of media that can be played successively and automatically, without changing the URL.

Thanks for your help,
cordially,

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From lcheng at grandstream.com Mon Apr 28 21:51:38 2014
From: lcheng at grandstream.com (Cheng Lei)
Date: Tue, 29 Apr 2014 12:51:38 +0800
Subject: [Live-devel] list of audio stream
In-Reply-To: References: Message-ID:

If your list is of the same media type, maybe you can try to modify the file ByteStreamFileSource.cpp.

On Mon, Apr 28, 2014 at 6:00 PM, Amir Marzouk wrote:
> Hello,
>
> I want to create a list of media that can be
> played successively and automatically without changing the URL.
>
> Thanks for help,
> cordially,
>
> _______________________________________________
> live-devel mailing list
> live-devel at lists.live555.com
> http://lists.live555.com/mailman/listinfo/live-devel
>

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From finlayson at live555.com Mon Apr 28 22:00:33 2014
From: finlayson at live555.com (Ross Finlayson)
Date: Mon, 28 Apr 2014 22:00:33 -0700
Subject: [Live-devel] list of audio stream
In-Reply-To: References: Message-ID: <5B99D2E1-46CC-4FD1-978E-DC3AF799B917@live555.com>

> I want to create a list of media that can be played successively and automatically without changing the URL.
>
> If your list is of the same media type, maybe you can try to modify the file ByteStreamFileSource.cpp.

If possible, you shouldn't modify the supplied source code, because that makes it harder for you to comply with the LGPL, and makes it much less likely that you'll get support on this mailing list. See http://www.live555.com/liveMedia/faq.html#modifying-and-extending

Instead, I suggest using the supplied class "ByteStreamMultiFileSource", which was designed just for this purpose. You would create a new subclass of "OnDemandServerMediaSubsession", and have its "createNewStreamSource()" virtual function call "ByteStreamMultiFileSource::createNew()" (instead of "ByteStreamFileSource::createNew()").

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From mark.theunissen at gmail.com Mon Apr 28 07:57:34 2014
From: mark.theunissen at gmail.com (Mark Theunissen)
Date: Mon, 28 Apr 2014 16:57:34 +0200
Subject: [Live-devel] Stream h264 - bitrate's effect on latency
In-Reply-To: References: Message-ID:

They're in close proximity - for a live sporting event. So all on the same network. Thanks, I'll take a look at some of your suggestions!

On Mon, Apr 28, 2014 at 2:00 PM, Jeff Shanab wrote:
> That is a reasonable resolution, so you should not have multi-slice
> keyframes. Stick to main or baseline; High would probably be 2-pass with B
> frames. H264 is of course comprised of key frames and difference frames,
> but the decoder should pump out one frame per iteration, at one frame of delay.
>
> Synchronizing multiple sources is an interesting issue. You will have to
> buck the tradition of ignoring the absolute time and using the timestamps
> only as a relative indicator. Instead, sync the clocks on the Pis and then
> rely on this time. Probably pick one Pi as a time server and have the other
> two use that to keep the clocks synced. Then control the two encoders
> relative to this clock so you even start the keyframe generation in sync.
Then control the two encoders > relative to this clock so you even start the Keyframe generation in sync. > > Perhaps the master Pi could run live555 and create a single Transport > stream with multiple video streams from the other Pi's. The transport > stream is defined as a container for streams sharing the same clock base. > > Is this to sync multiple cameras in close proximity? or distant? > > > On Mon, Apr 28, 2014 at 7:23 AM, Mark Theunissen < > mark.theunissen at gmail.com> wrote: > >> > What is the resolution / profile? >> >> I'm at 854x480. I've tried baseline, high and main. Same result. >> >> I'm using UDP. >> >> Encoded in hardware on the Pi's Broadcom chip. >> >> > Whoa? did I hear you correctly? stdin? I cannot even speculate how you >> can >> > quantify stdin timing. (try sockets, shared memory) >> >> So should I try using perhaps v4l2 driver (/dev/video0) instead of piping >> from one program to live555's stdin? >> >> > Make sure you create your timestamps at end of image capture/beginning >> of >> > encoding time. Since you are the source, the timestamps can be based all >> > off the same clock. >> >> I am the source, but it's 3 different computers (raspberry pis) - not >> sure how to sync in that case? >> >> Thanks for your replies and help. >> >> - Mark >> > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From eadeli at gmail.com Tue Apr 29 21:44:46 2014 From: eadeli at gmail.com (Ehsan Adeli) Date: Wed, 30 Apr 2014 09:14:46 +0430 Subject: [Live-devel] Problems with MPEG2 Indexer / Trickplay Message-ID: Hi, I have used the two "MPEG2TransportStreamIndexer" and "testMPEG2TransportStreamTrickPlay" test programs to index a TS file ("testrec.ts"), and then trickplay it with 4x speed: MPEG2TransportStreamIndexer testrec.ts The index file "testrec.tsx" is created (This file is uploaded herefor more info). Then testMPEG2TransportStreamTrickPlay testrec.ts 0 4 testrec4.ts is run to create the trickplayed file. But the output file could not be played and is a very small file in size. The output file is uploaded here. The original "testrec.ts" file is a 1:18 minute, 25 fps video, with 27.5 MBs in size. Any ideas? Thank you very much, in advance. Regards, Ehsan -------------- next part -------------- An HTML attachment was scrubbed... URL: From amir.lasmar.marzouk at gmail.com Wed Apr 30 05:52:29 2014 From: amir.lasmar.marzouk at gmail.com (Amir Marzouk) Date: Wed, 30 Apr 2014 14:52:29 +0200 Subject: [Live-devel] ByteStreamMultiFileSource Message-ID: hello, I can't find any example to use "ByteStreamMultiFileSource", can you help me please! Thanks, cordially, -------------- next part -------------- An HTML attachment was scrubbed... URL: