From Chen.Li2 at boeing.com Wed Aug 1 08:36:29 2007
From: Chen.Li2 at boeing.com (Li, Chen)
Date: Wed, 1 Aug 2007 08:36:29 -0700
Subject: [Live-devel] Two Concerns
Message-ID:

The first question: using LabVIEW, I am able to do an OPTIONS, DESCRIBE, and SETUP against the server, but I get 405s for PLAY, PAUSE, and TEARDOWN, and the server seems to start a new session every time a command is entered. I also could not set my session id using the Session: parameter.

-I am able to set my session id in Live555 by using the Session: parameter, right?
-Why is the server incrementing my session id even though I keep specifying my session id?
-I have tried sending the same requests as VLC does (shown by looking at VLC's console view) over TCP and still get 405s. I am sending the commands in TCP packets through LabVIEW. Even a TEARDOWN returns a 405.

The second question: I want to be able to play and then switch to fast forward or rewind. I would prefer not to use a teardown, since performance is key here. I see from the standards that I can issue a PAUSE command, which flushes the play queue but must maintain the play position.

-Would a pause->play->play sequence (where the middle play tells the server to play a short clip) be better or worse than teardown->setup->play? I currently do not have a player capable of sending RTSP commands while already playing, so any help would be appreciated.
-If there is a player that allows sending hand-written RTSP commands even during playback, please let me know.
-Is there an even better alternative that would allow switching between various play modes?
-Will Live555 support true RTSP and not just trick play?

Thank you,
--Chen

From Chen.Li2 at boeing.com Wed Aug 1 09:35:28 2007
From: Chen.Li2 at boeing.com (Li, Chen)
Date: Wed, 1 Aug 2007 09:35:28 -0700
Subject: [Live-devel] (no subject)
Message-ID:

Hello,

Does the server support the optional GET_PARAMETERS and SET_PARAMETERS?
--Chen Li

From fant0m4s at gmail.com Wed Aug 1 09:41:05 2007
From: fant0m4s at gmail.com (Fantomas)
Date: Wed, 1 Aug 2007 13:41:05 -0300
Subject: [Live-devel] Recording stream in multiple files
Message-ID: <99fed6ae0708010941p949ac24l65a9ea3f3a5e1bc5@mail.gmail.com>

Hi,

I'm recording a stream to a .mov file using a QuickTimeFileSink, with code based on the openRTSP example. I need to record the stream in multiple videos of arbitrary duration, so every 30 seconds I close the sink and open a new one with another file name. I modified the QuickTimeFileSink and added the method stopPlaying, to prevent a segmentation fault when I open the new sink. I can record without problems, but when I try to watch the videos with mplayer, all of them (except the first) start with a few seconds of black screen and the message "[mpeg4 @ 0x87ff79c]warning: first frame is no keyframe." What could be the problem?

The code of stopPlaying is:

void stopPlaying() {
  MediaSubsessionIterator iter(fInputSession);
  MediaSubsession* subsession;
  while ((subsession = iter.next()) != NULL) {
    // Ignore subsessions without a data source:
    FramedSource* subsessionSource = subsession->readSource();
    if (subsessionSource == NULL) continue;

    // First, tell the source that we're no longer interested:
    subsessionSource->stopGettingFrames();

    // Cancel any pending tasks:
    envir().taskScheduler().unscheduleDelayedTask(nextTask());
    nextTask() = NULL;

    subsessionSource = NULL;

    // indicates that we can be played again
    fAfterFunc = NULL;
  }
}

Thanks in advance, and sorry for my English; it's not my native language.

Manuel Carrizo

From finlayson at live555.com Wed Aug 1 17:39:31 2007
From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 1 Aug 2007 17:39:31 -0700
Subject: [Live-devel] (no subject)
In-Reply-To:
References:
Message-ID:

>Does the server support the optional GET_PARAMETERS and SET_PARAMETERS?

GET_PARAMETER - yes (as a noop keep-alive). SET_PARAMETER - no.
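[Editor's note: since GET_PARAMETER is accepted as a no-op, a client can use a parameter-less GET_PARAMETER request as a session keep-alive. A sketch of such an exchange follows; the URL, CSeq, and session id are made-up placeholders, not values from this thread.]

```
C->S:  GET_PARAMETER rtsp://server.example.com/stream/ RTSP/1.0
       CSeq: 7
       Session: 1234ABCD

S->C:  RTSP/1.0 200 OK
       CSeq: 7
       Session: 1234ABCD
```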
(Remember that you have complete source code. You could have figured this out by looking at the code.) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Thu Aug 2 22:53:51 2007 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 2 Aug 2007 22:53:51 -0700 Subject: [Live-devel] issue with a streaming server having multiple network interfaces In-Reply-To: <900003.99079.qm@web53305.mail.re2.yahoo.com> References: <900003.99079.qm@web53305.mail.re2.yahoo.com> Message-ID: Noam, Thanks for the report. It turns out that one of clients also reported the same problem. I have now installed a new version (2007.08.03) of the "LIVE555 Streaming Media" software that should resolve this issue. Could you please try out this new version, and let us know if it solves your problem? (I don't have a multiple-interface computer handy to test this on.) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From zmax.linkedin at gmail.com Fri Aug 3 09:18:38 2007 From: zmax.linkedin at gmail.com (Massimo Zito) Date: Fri, 3 Aug 2007 18:18:38 +0200 Subject: [Live-devel] OnDemandServerMediaSubsession, FileServerMediaSubsession and MPEG2TransportFileServerMediaSubsession ... Message-ID: <92a42b330708030918o52c7fc09nfb7e12cb4c5f5613@mail.gmail.com> Hi Ross, I have modified OnDemandServerMediaSubsession, FileServerMediaSubsession and MPEG2TransportFileServerMediaSubsession to handle the session closure on source end. *** include/OnDemandServerMediaSubsession.hh 2007-08-03 06:44: 32.000000000 +0200 --- include/OnDemandServerMediaSubsession.hh.new 2007-08-03 17:46: 37.000000000 +0200 *************** *** 32,38 **** class OnDemandServerMediaSubsession: public ServerMediaSubsession { protected: // we're a virtual base class OnDemandServerMediaSubsession(UsageEnvironment& env, Boolean reuseFirstSource, ! 
portNumBits initialPortNum = 6970); virtual ~OnDemandServerMediaSubsession(); protected: // redefined virtual functions --- 32,38 ---- class OnDemandServerMediaSubsession: public ServerMediaSubsession { protected: // we're a virtual base class OnDemandServerMediaSubsession(UsageEnvironment& env, Boolean reuseFirstSource, ! Boolean tearDownOnSourceEnd = False, portNumBits initialPortNum = 6970); virtual ~OnDemandServerMediaSubsession(); protected: // redefined virtual functions *************** *** 75,80 **** --- 75,83 ---- unsigned char rtpPayloadTypeIfDynamic, FramedSource* inputSource) = 0; + protected: + Boolean fTearDownOnSourceEnd; + private: void setSDPLinesFromRTPSink(RTPSink* rtpSink, FramedSource* inputSource); // used to implement "sdpLines()" *** OnDemandServerMediaSubsession.cpp 2007-08-03 06:44:32.000000000 +0200 --- OnDemandServerMediaSubsession.cpp.new 2007-08-03 18:02:49.000000000+0200 *************** *** 27,35 **** OnDemandServerMediaSubsession ::OnDemandServerMediaSubsession(UsageEnvironment& env, Boolean reuseFirstSource, portNumBits initialPortNum) : ServerMediaSubsession(env), ! fReuseFirstSource(reuseFirstSource), fInitialPortNum(initialPortNum), fLastStreamToken(NULL), fSDPLines(NULL) { fDestinationsHashTable = HashTable::create(ONE_WORD_HASH_KEYS); gethostname(fCNAME, sizeof fCNAME); --- 27,37 ---- OnDemandServerMediaSubsession ::OnDemandServerMediaSubsession(UsageEnvironment& env, Boolean reuseFirstSource, + Boolean tearDownOnSourceEnd, portNumBits initialPortNum) : ServerMediaSubsession(env), ! fTearDownOnSourceEnd(tearDownOnSourceEnd), fReuseFirstSource(reuseFirstSource), ! 
fInitialPortNum(initialPortNum), fLastStreamToken(NULL), fSDPLines(NULL) { fDestinationsHashTable = HashTable::create(ONE_WORD_HASH_KEYS); gethostname(fCNAME, sizeof fCNAME); *************** *** 123,128 **** --- 125,132 ---- FramedSource* mediaSource() const { return fMediaSource; } + Boolean tearDownOnSourceEnd() const { return fMaster.fTearDownOnSourceEnd; } + private: OnDemandServerMediaSubsession& fMaster; Boolean fAreCurrentlyPlaying; *************** *** 393,399 **** static void afterPlayingStreamState(void* clientData) { StreamState* streamState = (StreamState*)clientData; ! if (streamState->streamDuration() == 0.0) { // When the input stream ends, tear it down. This will cause a RTCP "BYE" // to be sent to each client, teling it that the stream has ended. // (Because the stream didn't have a known duration, there was no other --- 397,403 ---- static void afterPlayingStreamState(void* clientData) { StreamState* streamState = (StreamState*)clientData; ! if ((streamState->tearDownOnSourceEnd() == True) || (streamState->streamDuration() == 0.0)) { // When the input stream ends, tear it down. This will cause a RTCP "BYE" // to be sent to each client, teling it that the stream has ended. // (Because the stream didn't have a known duration, there was no other *** include/FileServerMediaSubsession.hh 2007-08-03 06:44: 32.000000000 +0200 --- include/FileServerMediaSubsession.hh.new 2007-08-03 18:24: 09.000000000 +0200 *************** *** 29,35 **** class FileServerMediaSubsession: public OnDemandServerMediaSubsession { protected: // we're a virtual base class FileServerMediaSubsession(UsageEnvironment& env, char const* fileName, ! Boolean reuseFirstSource); virtual ~FileServerMediaSubsession(); protected: --- 29,35 ---- class FileServerMediaSubsession: public OnDemandServerMediaSubsession { protected: // we're a virtual base class FileServerMediaSubsession(UsageEnvironment& env, char const* fileName, ! 
Boolean reuseFirstSource, Boolean tearDownOnSourceEnd = False); virtual ~FileServerMediaSubsession(); protected: *** FileServerMediaSubsession.cpp 2007-08-03 06:44:32.000000000 +0200 --- FileServerMediaSubsession.cpp.new 2007-08-03 18:27:21.000000000+0200 *************** *** 23,30 **** FileServerMediaSubsession ::FileServerMediaSubsession(UsageEnvironment& env, char const* fileName, ! Boolean reuseFirstSource) ! : OnDemandServerMediaSubsession(env, reuseFirstSource), fFileSize(0) { fFileName = strDup(fileName); } --- 23,30 ---- FileServerMediaSubsession ::FileServerMediaSubsession(UsageEnvironment& env, char const* fileName, ! Boolean reuseFirstSource, Boolean tearDownOnSourceEnd) ! : OnDemandServerMediaSubsession(env, reuseFirstSource, tearDownOnSourceEnd), fFileSize(0) { fFileName = strDup(fileName); } *** include/MPEG2TransportFileServerMediaSubsession.hh 2007-08-03 06:44: 32.000000000 +0200 --- include/MPEG2TransportFileServerMediaSubsession.hh.new 2007-08-03 18:29:56.000000000 +0200 *************** *** 36,48 **** static MPEG2TransportFileServerMediaSubsession* createNew(UsageEnvironment& env, char const* dataFileName, char const* indexFileName, ! Boolean reuseFirstSource); protected: MPEG2TransportFileServerMediaSubsession(UsageEnvironment& env, char const* fileName, MPEG2TransportStreamIndexFile* indexFile, ! Boolean reuseFirstSource); // called only by createNew(); virtual ~MPEG2TransportFileServerMediaSubsession(); --- 36,50 ---- static MPEG2TransportFileServerMediaSubsession* createNew(UsageEnvironment& env, char const* dataFileName, char const* indexFileName, ! Boolean reuseFirstSource, ! Boolean tearDownOnSourceEnd = False); protected: MPEG2TransportFileServerMediaSubsession(UsageEnvironment& env, char const* fileName, MPEG2TransportStreamIndexFile* indexFile, ! Boolean reuseFirstSource, ! 
Boolean tearDownOnSourceEnd); // called only by createNew(); virtual ~MPEG2TransportFileServerMediaSubsession(); *** MPEG2TransportFileServerMediaSubsession.cpp 2007-08-03 06:44: 32.000000000 +0200 --- MPEG2TransportFileServerMediaSubsession.cpp.new 2007-08-03 18:32: 31.000000000 +0200 *************** *** 71,77 **** MPEG2TransportFileServerMediaSubsession::createNew(UsageEnvironment& env, char const* fileName, char const* indexFileName, ! Boolean reuseFirstSource) { if (indexFileName != NULL && reuseFirstSource) { // It makes no sense to support trick play if all clients use the same source. Fix this: env << "MPEG2TransportFileServerMediaSubsession::createNew(): ignoring the index file name, because \"reuseFirstSource\" is set\n"; --- 71,78 ---- MPEG2TransportFileServerMediaSubsession::createNew(UsageEnvironment& env, char const* fileName, char const* indexFileName, ! Boolean reuseFirstSource, ! Boolean tearDownOnSourceEnd) { if (indexFileName != NULL && reuseFirstSource) { // It makes no sense to support trick play if all clients use the same source. Fix this: env << "MPEG2TransportFileServerMediaSubsession::createNew(): ignoring the index file name, because \"reuseFirstSource\" is set\n"; *************** *** 79,93 **** } MPEG2TransportStreamIndexFile* indexFile = MPEG2TransportStreamIndexFile::createNew(env, indexFileName); return new MPEG2TransportFileServerMediaSubsession(env, fileName, indexFile, ! reuseFirstSource); } MPEG2TransportFileServerMediaSubsession ::MPEG2TransportFileServerMediaSubsession(UsageEnvironment& env, char const* fileName, MPEG2TransportStreamIndexFile* indexFile, ! Boolean reuseFirstSource) ! 
: FileServerMediaSubsession(env, fileName, reuseFirstSource), fIndexFile(indexFile), fDuration(0.0), fClientSessionHashTable(NULL) { if (fIndexFile != NULL) { // we support 'trick play' fDuration = fIndexFile->getPlayingDuration(); --- 80,95 ---- } MPEG2TransportStreamIndexFile* indexFile = MPEG2TransportStreamIndexFile::createNew(env, indexFileName); return new MPEG2TransportFileServerMediaSubsession(env, fileName, indexFile, ! reuseFirstSource, tearDownOnSourceEnd); } MPEG2TransportFileServerMediaSubsession ::MPEG2TransportFileServerMediaSubsession(UsageEnvironment& env, char const* fileName, MPEG2TransportStreamIndexFile* indexFile, ! Boolean reuseFirstSource, ! Boolean tearDownOnSourceEnd) ! : FileServerMediaSubsession(env, fileName, reuseFirstSource, tearDownOnSourceEnd), fIndexFile(indexFile), fDuration(0.0), fClientSessionHashTable(NULL) { if (fIndexFile != NULL) { // we support 'trick play' fDuration = fIndexFile->getPlayingDuration(); Is this work correct ? Thank you for your attention Massimo Zito -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070803/64a8124c/attachment-0001.html From zmax.linkedin at gmail.com Fri Aug 3 09:26:23 2007 From: zmax.linkedin at gmail.com (Massimo Zito) Date: Fri, 3 Aug 2007 18:26:23 +0200 Subject: [Live-devel] Debug string in RTSPServer ... Message-ID: <92a42b330708030926t431e0ae2i8af407249c30341a@mail.gmail.com> Hi Ross, I have realized there is a debug string in RTSPServer that is printed all times ... *** RTSPServer.cpp 2007-08-03 06:44:32.000000000 +0200 --- RTSPServer.cpp.new 2007-08-03 18:41:15.000000000 +0200 *************** *** 100,106 **** --- 100,108 ---- } char* RTSPServer::rtspURLPrefix(int clientSocket) const { + #ifdef DEBUG fprintf(stderr, "rtspURLPrefix(%d)\n", clientSocket); + #endif struct sockaddr_in ourAddress; if (clientSocket < 0) { // Use our default IP address in the URL: Is this ok ? 
Thank you Massimo Zito -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070803/5218c8de/attachment.html From finlayson at live555.com Fri Aug 3 09:33:32 2007 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 3 Aug 2007 09:33:32 -0700 Subject: [Live-devel] OnDemandServerMediaSubsession, FileServerMediaSubsession and MPEG2TransportFileServerMediaSubsession ... In-Reply-To: <92a42b330708030918o52c7fc09nfb7e12cb4c5f5613@mail.gmail.com> References: <92a42b330708030918o52c7fc09nfb7e12cb4c5f5613@mail.gmail.com> Message-ID: >Is this work correct ? Probably. However, I have no plans to add this to the installed library code. (If a stream has a known duration (which is the case when you're streaming from a file), then I always want the client to have the option - when it knows that it has reached the end of the stream - to either seek backwards in the stream and replay (at least part of) it, or tear down the stream itself. I don't want the server to take that choice away from the client.) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Fri Aug 3 09:44:17 2007 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 3 Aug 2007 09:44:17 -0700 Subject: [Live-devel] Debug string in RTSPServer ... In-Reply-To: <92a42b330708030926t431e0ae2i8af407249c30341a@mail.gmail.com> References: <92a42b330708030926t431e0ae2i8af407249c30341a@mail.gmail.com> Message-ID: Yes, you're right - I had left this fprintf() call in by mistake. Thanks for letting me know. I have now installed a new version (2007.08.03) of the code that removes this. -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ From zmax.linkedin at gmail.com Fri Aug 3 10:09:15 2007 From: zmax.linkedin at gmail.com (Massimo Zito) Date: Fri, 3 Aug 2007 19:09:15 +0200 Subject: [Live-devel] OnDemandServerMediaSubsession, FileServerMediaSubsession and MPEG2TransportFileServerMediaSubsession ... In-Reply-To: References: <92a42b330708030918o52c7fc09nfb7e12cb4c5f5613@mail.gmail.com> Message-ID: <92a42b330708031009l5be21349vb3ad9ad7b32122f8@mail.gmail.com> Ok Ross, you're right ... I have realized that VLC hangs when a RTSP session ends .... ( tried on MPEG2 TS session ) I have made changes to respect code calls and disabled by default ... I will wait for a VLC fix .... Massimo 2007/8/3, Ross Finlayson : > > >Is this work correct ? > > Probably. However, I have no plans to add this to the installed > library code. (If a stream has a known duration (which is the case > when you're streaming from a file), then I always want the client to > have the option - when it knows that it has reached the end of the > stream - to either seek backwards in the stream and replay (at least > part of) it, or tear down the stream itself. I don't want the server > to take that choice away from the client.) > -- > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070803/e4560d2f/attachment.html From finlayson at live555.com Fri Aug 3 10:08:44 2007 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 3 Aug 2007 10:08:44 -0700 Subject: [Live-devel] Debug string in RTSPServer ... Message-ID: Correction: the new version of the code is called 2007.08.03a -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ From finlayson at live555.com Sat Aug 4 00:01:41 2007 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 4 Aug 2007 00:01:41 -0700 Subject: [Live-devel] Two Concerns In-Reply-To: References: Message-ID: I couldn't follow all of this (especially the bizarre final question), but one thing you should note - that may be relevant - is that, in our current RTSP server implementation, each "PLAY", "PAUSE" and/or "TEARDOWN" command must use the same TCP connection as the original "SETUP" command. (The earlier "DESCRIBE" command can use a different TCP connection; however, "SETUP", "PLAY", "PAUSE", "TEARDOWN" for each session must use a single TCP connection.) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From jylipekk at ee.oulu.fi Sat Aug 4 02:41:21 2007 From: jylipekk at ee.oulu.fi (Juha Ylipekkala) Date: Sat, 4 Aug 2007 12:41:21 +0300 Subject: [Live-devel] Implicit RTSP session timeout Message-ID: <20070804094121.GA12407@ee.oulu.fi> Hello everyone, I spotted a small problem with timeouts using live555 and vlc: when an RTSP server does not give an explicit timeout for the session, live555 RTSP client does not use the implicit 60 second timeout specified in the RFC. Would things break seriously if you changed the constructor of RTSPClient to set the session timeout to 60 seconds? VLC counts on a non-zero timeout parameter, and it won't start a keepalive thread for a zero value. It is possible to go around this in vlc with minor code changes, but since it's in the rfc... 
:) Cheers, -- Juha Ylipekkala From finlayson at live555.com Sat Aug 4 02:58:37 2007 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 4 Aug 2007 02:58:37 -0700 Subject: [Live-devel] Implicit RTSP session timeout In-Reply-To: <20070804094121.GA12407@ee.oulu.fi> References: <20070804094121.GA12407@ee.oulu.fi> Message-ID: The timeout interval (i.e., how long the server waits before timing out a session due to client inactivity) is something that is determined by the *server*, not by the client. The default timeout is 45 seconds; however, you can change this (in the "reclamationTestSeconds" parameter to "RTSPServer::createNew()"; see "liveMedia/include/RTSPServer.hh"). Our server implementation currently does not announce this timeout parameter in its "Session:" responses (as specified in section 12.37 of RFC 2326). However, it probably should; this might get fixed in a future release of the software. However, I don't think a client could really do much that is useful should it see this parameter. Client 'liveness' is indicated by RTCP "RR" packets, which the client sends frequently already. In other words, I don't think there is a problem here that needs fixing. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From jylipekk at ee.oulu.fi Sat Aug 4 03:31:49 2007 From: jylipekk at ee.oulu.fi (Juha Ylipekkala) Date: Sat, 4 Aug 2007 13:31:49 +0300 Subject: [Live-devel] Implicit RTSP session timeout In-Reply-To: References: <20070804094121.GA12407@ee.oulu.fi> Message-ID: <20070804103149.GA14792@ee.oulu.fi> Hi again, On Sat, Aug 04, 2007 at 02:58:37AM -0700, Ross Finlayson wrote: > However, I don't think a client could really do much that is useful > should it see this parameter. Client 'liveness' is indicated by RTCP > "RR" packets, which the client sends frequently already. > > In other words, I don't think there is a problem here that needs fixing. 
Well, the RR packets are sent through one of the UDP channels, which leaves the TCP connection to timeout on its own after 120 seconds, causing the server in the other end to close the RTP streams. That's what I could deduce from the traces with wireshark. The server is not using live555, you can try it out at rtsp://stream2.asm.fi/tv512.sdp. Am I barking up the wrong tree here? Is the server side acting funny? Because that's where the TCP FIN comes from. Cheers, -- Juha Ylipekkala From finlayson at live555.com Sat Aug 4 06:51:47 2007 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 4 Aug 2007 06:51:47 -0700 Subject: [Live-devel] Implicit RTSP session timeout In-Reply-To: <20070804103149.GA14792@ee.oulu.fi> References: <20070804094121.GA12407@ee.oulu.fi> <20070804103149.GA14792@ee.oulu.fi> Message-ID: >On Sat, Aug 04, 2007 at 02:58:37AM -0700, Ross Finlayson wrote: >> However, I don't think a client could really do much that is useful >> should it see this parameter. Client 'liveness' is indicated by RTCP >> "RR" packets, which the client sends frequently already. >> >> In other words, I don't think there is a problem here that needs fixing. > >Well, the RR packets are sent through one of the UDP channels, which >leaves the >TCP connection to timeout on its own after 120 seconds, causing the server in >the other end to close the RTP streams. That's what I could deduce from the >traces with wireshark. > >The server is not using live555, you can try it out at >rtsp://stream2.asm.fi/tv512.sdp. > >Am I barking up the wrong tree here? Is the server side acting funny? Because >that's where the TCP FIN comes from. I ran the "openRTSP", "VLC" (v 0.8.6c) and QuickTime Player clients against this stream, and - in each case - saw no timeout of the stream, even after several (>2) minutes. However, I see that your server is a Darwin Streaming server, which is known to time out its sessions if it does not receive RTCP "RR" packets from the client after about 2 minutes. 
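[Editor's note: as a rough sanity check on how often a receiver's RTCP "RR" packets should go out, RFC 3550 budgets about 5% of the session bandwidth for RTCP and floors the report interval at 5 seconds. The sketch below is a deliberate simplification — it ignores the sender/receiver bandwidth split and the randomization of the interval, and the 120-byte average compound-packet size is an assumed figure. Session bandwidth is in kbps, the same units as live555's "totSessionBW" parameter.]

```python
def rtcp_interval(session_bw_kbps, members,
                  avg_rtcp_size=120.0,   # assumed avg compound RTCP packet, bytes
                  rtcp_fraction=0.05,    # RFC 3550: RTCP gets ~5% of session bandwidth
                  min_interval=5.0):     # RFC 3550 minimum interval, seconds
    """Deterministic (un-randomized) estimate of the RTCP report interval."""
    rtcp_bw_bytes_per_sec = rtcp_fraction * session_bw_kbps * 1000.0 / 8.0
    t = members * avg_rtcp_size / rtcp_bw_bytes_per_sec
    return max(t, min_interval)

# With any realistic session bandwidth the interval clamps to the 5 s floor,
# so a server that times out after ~2 minutes should see many RRs first:
print(rtcp_interval(500, 2))   # -> 5.0
# A grossly understated session bandwidth stretches the interval:
print(rtcp_interval(1, 2))     # -> 38.4
```

If reports at roughly this cadence never reach the server, something between the endpoints (typically a firewall dropping RTCP/UDP) is the usual suspect.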
However, our RTSP/RTP/RTCP client implementation (used in "openRTSP" and "VLC") does send RTCP "RR" packets frequently (as does QuickTime Player). This suggests, therefore, that your problem is that you have a firewall - somewhere between your client and server - that is blocking RTCP/UDP packets that come from your client. You will need to fix this (or else specify RTP-over-TCP streaming in your clients). -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From noamatari at yahoo.com Mon Aug 6 08:36:44 2007 From: noamatari at yahoo.com (Noam Camiel) Date: Mon, 6 Aug 2007 08:36:44 -0700 (PDT) Subject: [Live-devel] issue with a streaming server having multiple network interfaces Message-ID: <570228.47592.qm@web53304.mail.re2.yahoo.com> Ross, I will try your version later on (when I get back from my vacation) but I believe that a simple test can be made to check your correction. If you connect a client from the internet to your LIVE555 server that is behind a NAT/firewall, you would be reproducing the same situation. You will need to "punch" the server port in the firewall to be able to connect from the internet into the LIVE555 server. In this scenario the keepalive packets should be sent from the client to the real (Internet) IP address of the LIVE555 server, as opposed to the internal (behind the firewall) IP address of the LIVE555 server. Please let me know if the test was successful. Noam Camiel ----- Original Message ---- From: Ross Finlayson To: LIVE555 Streaming Media - development & use Sent: Friday, August 3, 2007 8:53:51 AM Subject: Re: [Live-devel] issue with a streaming server having multiple network interfaces Noam, Thanks for the report. It turns out that one of clients also reported the same problem. I have now installed a new version (2007.08.03) of the "LIVE555 Streaming Media" software that should resolve this issue. Could you please try out this new version, and let us know if it solves your problem? 
(I don't have a multiple-interface computer handy to test this on.)
--
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
_______________________________________________
live-devel mailing list
live-devel at lists.live555.com
http://lists.live555.com/mailman/listinfo/live-devel

From finlayson at live555.com Mon Aug 6 12:07:54 2007
From: finlayson at live555.com (Ross Finlayson)
Date: Mon, 6 Aug 2007 12:07:54 -0700
Subject: [Live-devel] issue with a streaming server having multiple network interfaces
In-Reply-To: <570228.47592.qm@web53304.mail.re2.yahoo.com>
References: <570228.47592.qm@web53304.mail.re2.yahoo.com>
Message-ID:

>I will try your version later on (when I get back from my vacation)
>but I believe that a simple test can be made to check your
>correction.
>
>If you connect a client from the internet to your LIVE555 server
>that is behind a NAT/firewall, you would be reproducing the same
>situation.

No, that's a completely different situation. A LIVE555 server behind a NAT won't work until we implement ICE.
--
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From cwein at hotmail.com Mon Aug 6 16:24:32 2007
From: cwein at hotmail.com (Chris Wein)
Date: Mon, 6 Aug 2007 16:24:32 -0700
Subject: [Live-devel] Time to RTCP sync
Message-ID:

I have a client and server running on an embedded system, one with an AVC encoder and one with an AVC decoder. All is working well except for the time it takes for the streams to be synchronized using RTCP - once the connection is initiated it takes several seconds for the sync to occur.
Our server also interoperates with VLC and I have observed that their RTP client starts presentation much sooner than our embedded client (the time to get first sequence and picture parameter set is equal here). Can anyone give me some insight as to what parameters can be adjusted to make this synchronization occur more quickly and explain the various tradeoffs. Thanks, Chris -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070806/4699e35a/attachment.html From finlayson at live555.com Mon Aug 6 16:34:48 2007 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 6 Aug 2007 16:34:48 -0700 Subject: [Live-devel] Time to RTCP sync In-Reply-To: References: Message-ID: >I have a client and server running on an embedded system, one with >an AVC encoder and one with an AVC decoder. All is working well >except for the time it takes for the streams to be synchronized >using RTCP - once the connection is initiated it takes several >seconds for the sync to occur. Our server also interoperates with >VLC and I have observed that their RTP client starts presentation >much sooner than our embedded client (the time to get first sequence >and picture parameter set is equal here). Can anyone give me some >insight as to what parameters can be adjusted to make this >synchronization occur more quickly and explain the various tradeoffs. Make sure that the "totSessionBW" parameter to "RTCPInstance::createNew()" is reasonably accurate. It should be the bandwidth of the (audio or video) stream, in kilobits-per-second. If you set this parameter too low, then RTCP reports from the server will be sent less frequently than they should be. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://lists.live555.com/pipermail/live-devel/attachments/20070806/4d8f49bc/attachment.html From aviadr1 at gmail.com Tue Aug 7 04:51:21 2007 From: aviadr1 at gmail.com (aviad rozenhek) Date: Tue, 7 Aug 2007 14:51:21 +0300 Subject: [Live-devel] improvements to h264VideoFilesink - making it playable by VLC & quicktime Message-ID: Hi, when I used openrtsp to stream h264 (from wirecast a.k.a. darwin 555), it created a "raw" h264 file. sadly, however, this file would not play in VLC or quicktime. it appears that to fix the issue, It didn't require much more work, basically what was missing was: 1) parse the SDP sprop parameters into NALs (using code already available in H264videoRTPSource.hh) 2) analyze the arriving h264 packets, to understand which packets are IDR frames. 3) before every IDR frame, output the SPS/PPS NALs that were decoded in step #1 now I am able to stream to raw 264 file perfectly! I would be happy to supply the modified source to anyone if there is any interest. I used the open source project h264bitstream https://sourceforge.net/projects/h264bitstream/ to accomplish step #2 h264bitstream is very light weight (only 2 .c files and headers), and it is LGPL just like live555. Aviad Rozenhek From dgf3208 at gmail.com Tue Aug 7 15:50:58 2007 From: dgf3208 at gmail.com (Dan Franke) Date: Tue, 7 Aug 2007 16:50:58 -0600 Subject: [Live-devel] Destroying RTSPClientSession objects Message-ID: <62ca8cdb0708071550t34efd660y53efce694bc7170a@mail.gmail.com> Hello All, Is there a way to destroy the RTSPClientSession objects that the RTSPServer creates? It appears that they only get destroyed when the client terminates the session. I'd like to be able shutdown my server code cleanly (ie: not leaking memory) independent of what the client application is doing. Dan Franke -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://lists.live555.com/pipermail/live-devel/attachments/20070807/ac33f6fe/attachment.html

From finlayson at live555.com Tue Aug 7 16:49:30 2007
From: finlayson at live555.com (Ross Finlayson)
Date: Tue, 7 Aug 2007 16:49:30 -0700
Subject: [Live-devel] Destroying RTSPClientSession objects
In-Reply-To: <62ca8cdb0708071550t34efd660y53efce694bc7170a@mail.gmail.com>
References: <62ca8cdb0708071550t34efd660y53efce694bc7170a@mail.gmail.com>
Message-ID:

>Is there a way to destroy the RTSPClientSession objects that the
>RTSPServer creates? It appears that they only get destroyed when
>the client terminates the session.

A "RTSPClientSession" object will also get terminated if the client times out (due to inactivity - i.e., no RTSP commands or RTCP "RR" packets received within "reclamationTestSeconds" seconds). It will also terminate if its read from the TCP socket fails (for whatever reason).

> I'd like to be able shutdown my server code cleanly (ie: not
>leaking memory) independent of what the client application is doing.

If your server code is running in its own process - i.e., as an application - then you can just kill the process, and the OS will automatically reclaim all memory, sockets etc. that the server was using.
--
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From zcjgeorge at gmail.com Tue Aug 7 18:44:47 2007
From: zcjgeorge at gmail.com (George Zhu)
Date: Wed, 8 Aug 2007 09:44:47 +0800
Subject: [Live-devel] streaming download for H263+AMR
Message-ID: <58a233710708071844k74ca7309hbaf67ed73fbba950@mail.gmail.com>

Dear All,

I want to implement streaming download for H263+AMR. I have read the QuickTimeFileSink code and am trying to follow that idea. Can anyone give me some tips?
-- George Zhu From xqma at marvell.com Tue Aug 7 22:49:43 2007 From: xqma at marvell.com (Xianqing Ma) Date: Tue, 7 Aug 2007 22:49:43 -0700 Subject: [Live-devel] ask for m4e files Message-ID: <7F6B283111E9EB4BA570D4C0322C8EEE0F676B@sc-exch04.marvell.com> Hi: Live media server only supports m4e files. I`ve read the FAQs and tried to find the m4e files on the web. But these files are scarce. If you have these files, no matter how small they are, please send them to me or give me a link to the web page where I can download them. Thank you! -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070807/7a5dcaa9/attachment.html From finlayson at live555.com Wed Aug 8 00:43:12 2007 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 8 Aug 2007 00:43:12 -0700 Subject: [Live-devel] ask for m4e files In-Reply-To: <7F6B283111E9EB4BA570D4C0322C8EEE0F676B@sc-exch04.marvell.com> References: <7F6B283111E9EB4BA570D4C0322C8EEE0F676B@sc-exch04.marvell.com> Message-ID: > Live media server only supports m4e files. I`ve read the FAQs >and tried to find the m4e files on the web. But these files are >scarce. If you have these files, no matter how small they are, >please send them to me or give me a link to the web page where I can >download them. I don't really understand why someone would want to stream MPEG-4 Elementary Stream files, but not already have any. But anyway, I've left a few online at http://www.live555.com/liveMedia/public/m4e/ -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://lists.live555.com/pipermail/live-devel/attachments/20070808/fe067cbd/attachment.html From brian.vdsouza at gmail.com Wed Aug 8 14:31:11 2007 From: brian.vdsouza at gmail.com (Brian D'Souza) Date: Wed, 8 Aug 2007 14:31:11 -0700 Subject: [Live-devel] Couple of questions Message-ID: <5770ac1b0708081431k67e6d4f8x5ac56172e97ce1eb@mail.gmail.com> Hi, I have been studying the live media code and its corresponding utilities for use in my module. Let me give an overview of what I am looking to develop. ================================================================================================================ Basically my input is a transport stream from a transport stream multiplexer. This input will be fed to one of the modules of live555. From there I need it to be converted into RTP packets (not the transport stream packets over RTP, but rather the individual components (audio/video) as separate sessions). This stream would then need to be fed into DSS, and then the client typically connects to a stream (deployed on DSS). The thing is, the client here is a mobile, and my client on the mobile only has support for RTP and mp4 codecs as well as AAC codecs (i.e., the decoder). One more thing: my transport stream is mp4 content carried over MPEG2TS. =============================================================================================================== Following are my observations from the modules I studied, in terms of what I could reuse. I need your help to understand if there is anything I could use or could get help with. I had an approach to proceed, and when I looked at the code this is what I followed. 1. First of all, I went through the flow of the following test programs -- testMPEG2TransportStreamer.cpp -- testMPEG4VideoToDarwin.cpp -- testMPEG4VideoStreamer.cpp and what methods they call internally. Since the encoded content I have is mp4 (carried in MPEG2TS), I can reuse testMPEG2TransportStreamer.cpp to read from the encoder and work on the transport stream. But while looking at the flow I realized that it basically carried TS over RTP (since its encoded content profile is 33: MP2T). This might not work for me since the cellphone does not have the corresponding decoder. *a) So I was looking for something which demultiplexes this TS into audio and video streams.* Currently when I look at the code, we do have modules to demultiplex program streams, multiplex into transport streams, and also modules to convert from program stream to transport stream. *But what I would really need is either a demultiplexer for the transport stream, or rather a converter from transport to program stream. Do we have either support, which I might have missed?* *b) Another observation I made is that even if I got the above support to demultiplex into 2 elementary streams (one would be MP4-V-ES and the other would be an AAC-supported audio elementary stream).* *If my understanding is correct we don't have support for AAC ES, right? How hard would it be for me to incorporate this support?* Alternatively the SimpleRTPSink might be useful for any type of information, but we have no support for a framer class for the AAC audio type, right? *---> *I know I have asked a lot of questions (but I made sure to read a lot of previous posts) to make sure I wasn't asking anything already posted!! Would really appreciate any response. Thanks a ton Brian -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070808/a121497d/attachment-0001.html From brian.vdsouza at gmail.com Wed Aug 8 14:58:58 2007 From: brian.vdsouza at gmail.com (Brian D'Souza) Date: Wed, 8 Aug 2007 14:58:58 -0700 Subject: [Live-devel] Clarification reg.
FAQ Message-ID: <5770ac1b0708081458v993e18en1a97b0905f5387f0@mail.gmail.com> Hi, In one of the FAQs, as well as earlier discussions, there was something asked about streaming MPEG4 content and the need for enabling the RTSP server in those applications for unicast streaming. *My scenario:* What would be needed in this scenario? I want unicast streaming, but basically I use DSS as the contact point for the client. Wouldn't it be true that DSS handles the RTSP part of the work (DESCRIBE, etc.), and could I then still offer unicast streaming without enabling the livemedia RTSP server with unicast support (and just inject the stream onto DSS)? Would appreciate any inputs, Thanks Brian -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070808/b149465e/attachment.html From finlayson at live555.com Wed Aug 8 15:52:05 2007 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 8 Aug 2007 15:52:05 -0700 Subject: [Live-devel] Couple of questions In-Reply-To: <5770ac1b0708081431k67e6d4f8x5ac56172e97ce1eb@mail.gmail.com> References: <5770ac1b0708081431k67e6d4f8x5ac56172e97ce1eb@mail.gmail.com> Message-ID: > a) So I was looking for something which demultiplexes this TS into >audio and video streams. >Currently when I look at the code, we do have modules to demultiplex >program streams, multiplex into transport stream and also modules to >convert from program stream to transport stream. >But what I would really need is to either have a demultiplexer for >the transport stream, or rather converter from transport to program >stream. DO we have either support which I might have missed out? No, our library currently has no support for *demultiplexing* Transport Streams. Sorry. > >b) Another observation I made is that even if I got the above >support to demultiplex into 2 elementary streams( one would be >MP4-V-ES ) and other would be AAC supported audio stream(elementary).
>If my understanding is correct we dont have support for AAC ES >right? How hard would it be for me to incorporate this support. > Alternatively the SimpleRTPSink might be useful for any type of >information, but we have no support for a framer class for the AAC >audio type right? AAC audio should be streamed using the "MPEG4GenericRTPSink" class. (See the "ADTSAudioFileServerMediaSubsession" for an example.) Note also that we have a perfectly good RTSP server implementation. You don't need to use a separate 'Darwin Streaming Server'. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070808/8f5db146/attachment.html From brian.vdsouza at gmail.com Wed Aug 8 16:44:38 2007 From: brian.vdsouza at gmail.com (Brian D'Souza) Date: Wed, 8 Aug 2007 16:44:38 -0700 Subject: [Live-devel] Couple of questions In-Reply-To: References: <5770ac1b0708081431k67e6d4f8x5ac56172e97ce1eb@mail.gmail.com> Message-ID: <5770ac1b0708081644j56e4585ft1ac9f1aac8397933@mail.gmail.com> Hi , Enclosed replies and more questions below. Brian On 8/8/07, Ross Finlayson wrote: > > *a) So I was looking for something which demultiplexes this TS into > audio and video streams.* > > Currently when I look at the code, we do have modules to demultiplex > program streams, multiplex into transport stream and also modules to convert > from program stream to transport stream. > > *But what I would really need is to either have a demultiplexer for the > transport stream, or rather converter from transport to program stream. DO > we have either support which I might have missed out?* > > > > No, our library currently has no support for *demultiplexing* Transport > Streams. Sorry. > > > >>>>Thanks, will have to look at something which does that and hooks it > onto livemedia. 
I looked at an earlier post of June where you had suggested the below for piping streams from the encoder, or in my case the audio (AAC) and video streams. *"run your MPEG-2 TS grabbing code in a separate process (that writes to stdout), and just pipe it to "testMPEG2TransportStreamer" using the command line - i.e., yourGrabberApplication | testMPEG2TransportStreamer. Alternatively, if your Transport Stream source is available as a med file (e.g., device), then just run testMPEG2TransportStreamer < yourTransportStreamDeviceFileName".* Could I do something similar to pipe 2 outputs? Since whatever demux I ultimately decide to use outputs audio (AMR) and video (MP4-V-ES). I know it is slightly out of scope but would appreciate your feedback. Would it be better to read input from a socket in this case and make sure the demux sends data onto a socket? Or would it be better to integrate some sort of demux with live555? And if so, currently the ByteStreamFileSource only reads from files (do you think it would be an issue to read a stream of bytes, or would it be better to make some sort of temp files?) > > > *b) Another observation I made is that even if I got the above support to > demultiplex into 2 elementary streams( one would be MP4-V-ES ) and other > would be AAC supported audio stream(elementary). * > > *If my understanding is correct we dont have support for AAC ES right? How > hard would it be for me to incorporate this support.* > > Alternatively the SimpleRTPSink might be useful for any type of > information, but we have no support for a framer class for the AAC audio > type right? > > > > AAC audio should be streamed using the "MPEG4GenericRTPSink" class. (See > the "ADTSAudioFileServerMediaSubsession" for an example.) > > > Note also that we have a perfectly good RTSP server implementation. You > don't need to use a separate 'Darwin Streaming Server'. > Thanks.
I could use something like this, right? I can create the audio sink as suggested by you, apart from a few parameters like no. of channels etc., which I can look into. I have enclosed a snippet of code: I have a doubt on what type of audio (AAC ES) framer class I would use? void play() { // Open the input file as a 'byte-stream file source': ByteStreamFileSource* fileSource1 = ByteStreamFileSource::createNew(*env, inputFileName1); // First one for the video ES // Open the input file as a 'byte-stream file source': ByteStreamFileSource* fileSource2 = ByteStreamFileSource::createNew(*env, inputFileName2); // Second one for the AAC ES FramedSource* videoES = fileSource1; FramedSource* audioES = fileSource2; // Create a framer for the Video Elementary Stream: videoSource = MPEG4VideoStreamFramer::createNew(*env, videoES); audioSource = ?::createNew(*env, audioES); videoSink->startPlaying(*videoSource, afterPlaying, videoSink); audioSink->startPlaying(*audioSource, afterPlaying, audioSink); } Thanks for your prompt response. Appreciate your advice Brian -- > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070808/7c24dc08/attachment.html From brian.vdsouza at gmail.com Wed Aug 8 16:47:24 2007 From: brian.vdsouza at gmail.com (Brian D'Souza) Date: Wed, 8 Aug 2007 16:47:24 -0700 Subject: [Live-devel] Couple of questions In-Reply-To: <5770ac1b0708081644j56e4585ft1ac9f1aac8397933@mail.gmail.com> References: <5770ac1b0708081431k67e6d4f8x5ac56172e97ce1eb@mail.gmail.com> <5770ac1b0708081644j56e4585ft1ac9f1aac8397933@mail.gmail.com> Message-ID: <5770ac1b0708081647n2694850am437980b7f192297d@mail.gmail.com> Regarding not using Darwin, the idea is to allow support for playlists as well (in terms of scalability).
Else Live555 was enough. One question here: it is fine to do unicast streaming and inject those streams (without enabling the RTSP server) and let the DSS take care of RTSP, right? Thanks Brian -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070808/c0220654/attachment.html From cwein at hotmail.com Wed Aug 8 19:00:34 2007 From: cwein at hotmail.com (Chris Wein) Date: Wed, 8 Aug 2007 19:00:34 -0700 Subject: [Live-devel] Couple of questions Message-ID: Note that ADTS is not quite an elementary AAC stream, so if you demux the AAC you might just get a stream of raw data blocks with no headers at all, which cannot be decoded unless the decoder knows the sampling frequency and channel config. -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070808/deb3ff85/attachment-0001.html From lurie.eli at gmail.com Wed Aug 8 20:10:24 2007 From: lurie.eli at gmail.com (Eli Lurie) Date: Thu, 9 Aug 2007 05:10:24 +0200 Subject: [Live-devel] ASF payload using openRTSP exe Message-ID: <3818ecca0708082010l37de9a89p2d7a8ecfc5768bb@mail.gmail.com> Hi, I am using the openRTSP executable to extract the ASF audio payload from the RTP packets. What is the offset that I should enter for the -s option, so the accepted file will be readable by Windows Media Player 9.0? Eli -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070808/721670b7/attachment.html From finlayson at live555.com Wed Aug 8 21:57:57 2007 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 8 Aug 2007 21:57:57 -0700 Subject: [Live-devel] ASF payload using openRTSP exe In-Reply-To: <3818ecca0708082010l37de9a89p2d7a8ecfc5768bb@mail.gmail.com> References: <3818ecca0708082010l37de9a89p2d7a8ecfc5768bb@mail.gmail.com> Message-ID: >I am using openRTSP executable to extract ASF audio payload from the >RTP packet. >What is the offset option that I should enter for -s option, so the >acccepted file will be readable by >Windows media Player 9.0 I have no idea. (I don't even know if what you're suggesting is possible.)
The "X-ASF" payload format is a closed, Microsoft-proprietary payload format. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From kan at pisem.net Thu Aug 9 23:46:41 2007 From: kan at pisem.net (kan at pisem.net) Date: Fri, 10 Aug 2007 10:46:41 +0400 Subject: [Live-devel] how can I stream multiple files like one big file Message-ID: <20070810104641.c8tbwlf5hyckw80c@www.pochta.ru> I want to stream multiple video clips like one continuous file. In addition, I want to stream these files in an endless loop. It should look like one continuous transport stream for streaming using UDP on an Amino STB. For this purpose I use ByteStreamMultiFileSource. The video clips for streaming are all in the same MPEG-2-TS format (same bitrate, framerate, etc.). I tried streaming with testMPEG2TransportStreamer; the Amino receives the stream, but playback is very unstable, and sometimes the picture on the TV hangs. I think this is because the transport stream is discontinuous. That's why I tried streaming with testMPEG2ProgramStreamToTransportStream, where the transport headers are created at run time. But I again have different artifacts during playback. How can I create a continuous transport stream? I'm very thankful for your help! Sorry for my awful English =) Karlov Andrey. -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070809/c49e766b/attachment.html From finlayson at live555.com Fri Aug 10 01:58:27 2007 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 10 Aug 2007 01:58:27 -0700 Subject: [Live-devel] how can I stream multiple files like one big file In-Reply-To: <20070810104641.c8tbwlf5hyckw80c@www.pochta.ru> References: <20070810104641.c8tbwlf5hyckw80c@www.pochta.ru> Message-ID: Using "ByteStreamMultiFileSource" is correct.
However, because you are joining together multiple, separate Transport Stream files, you should make sure that the "discontinuity_indicator" bit is set in the first header of each Transport Stream file. This tells both our server, and the receiving client, that the PCRs in the Transport Stream have reset. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From mailing.list.recipient at gmail.com Fri Aug 10 02:27:20 2007 From: mailing.list.recipient at gmail.com (John Gunnarsson) Date: Fri, 10 Aug 2007 11:27:20 +0200 Subject: [Live-devel] Is this possible with live-devel lib? Message-ID: <6b332d3d0708100227j3b1adf38jfec84990fe42991c@mail.gmail.com> I would like to build an mp3 streaming server which in many ways behaves like VLC etc. (build playlist, pause, seek, etc.), but instead of audio out, I would like to stream the output to an mp3 hardware decoder that I have. My question is if it's possible to seek in a song while streaming it (like in VLC, seek to 2:50 into the song playing)? Is it possible to pause and resume as well? I saw the extensive documentation, but I'm not really sure where to start; could you give me some hints on which classes/data structures I could start with that best fit my needs for the scenario above? //John From qinghai.gan2 at mail.dcu.ie Fri Aug 10 03:57:27 2007 From: qinghai.gan2 at mail.dcu.ie (qinghai.gan2 at mail.dcu.ie) Date: Fri, 10 Aug 2007 11:57:27 +0100 Subject: [Live-devel] How to output one file for MPEG1or2 streaming Message-ID: <46B81B04000034ED@hawk.dcu.ie> Hi all, I use playCommon to receive the MPEG1or2 stream; it outputs two files: video-MPV-1 and audio-MPA-2. How can I output just one file, like a .mpg file? Another question: does testOnDemandRTSPServer send one stream or two streams for an MPEG1or2 source (*.mpg file)? If it's two streams, can I send one stream? Thanks.
Best Regards, Qing Hai From mailing.list.recipient at gmail.com Fri Aug 10 06:50:13 2007 From: mailing.list.recipient at gmail.com (John Gunnarsson) Date: Fri, 10 Aug 2007 15:50:13 +0200 Subject: [Live-devel] Anyone used Barix Extreamer as a client? Message-ID: <6b332d3d0708100650g197c4145pc1eeb3f20f2ad8e8@mail.gmail.com> Hi, Has anyone used a Barix Extreamer (http://www.barix.com/content/view/57/183/) as a streaming client with liveMedia library as a streaming server? //John From finlayson at live555.com Fri Aug 10 07:56:13 2007 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 10 Aug 2007 07:56:13 -0700 Subject: [Live-devel] How to output one file for MPEG1or2 streaming In-Reply-To: <46B81B04000034ED@hawk.dcu.ie> References: <46B81B04000034ED@hawk.dcu.ie> Message-ID: >I use playCommon to receive the MPEG1or2 stream, it outputs two >files: video-MPV-1 >and audio-MPA-2. How can I output just one file, like .mpg file? There's currently no support in the library for this - i.e, there's no MPEG-1 or 2 *multiplexor* class. You would need to write your own multiplexor class to do this. >Another question, does testOnDemandRTSPServer send one stream or two streams >for MPEG1or2 source (*.mpg file)? Two streams. > If its two streams, can I send one stream? No. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Fri Aug 10 08:01:41 2007 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 10 Aug 2007 08:01:41 -0700 Subject: [Live-devel] Is this possible with live-devel lib? In-Reply-To: <6b332d3d0708100227j3b1adf38jfec84990fe42991c@mail.gmail.com> References: <6b332d3d0708100227j3b1adf38jfec84990fe42991c@mail.gmail.com> Message-ID: It's not clear from your message exactly what you're trying to do. >I would like to build a mp3 streaming server Here you talk about wanting to build a server. 
> which in many ways >behaves as vlc etc, >(build playlist, pause, seek, etc) >but instead of audio out, I would like to stream the output to a mp3 >hardware decoder that I have. But here you describe functionality that sounds like that of a client. Can you explain exactly what you want to do? >My question is if it's possible to seek in a song while streaming it >(like in vlc, seek to 2:50 into the song playing)? >Is it possible to pause, resume aswell? Yes, our RTSP server implementation supports seeking and pause/play for MP3/RTP streams. (Our RTSP client implementation also supports these operations, of course.) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From kan at pisem.net Fri Aug 10 08:06:10 2007 From: kan at pisem.net (kan at pisem.net) Date: Fri, 10 Aug 2007 19:06:10 +0400 Subject: [Live-devel] how can I stream multiple files like one big file Message-ID: <20070810190610.jsjcje92as44ws0w@www.pochta.ru> I have added the "discontinuity_indicator" bit at the beginning of each Transport Stream file. But I still have artifacts during playback of multiple files. I read the MPEG-2 spec (ISO 13818) and, if I understand it all correctly, I must set discontinuity_indicator=1 in each TS packet whose continuity_counter has a discontinuity. I have 4 different PIDs in each file: - PAT - PMT - audio - video and I must set discontinuity_indicator='1' in the first packet of each of these PIDs? If that's right, I don't know why the Amino 110 plays this stream back very unstably and with artifacts. In addition, VLC plays only the first file of the sequence and then hangs and shows a black screen. And another question. In testMPEG1or2ProgramStreamToTransportStream, TS packets are created at run time. What must I do if I want to create a continuous transport stream? I think I need to correct the PCR, PTS, DTS; what else must I do, and where? I'm very thankful for your help! Sorry for my awful English =) Karlov Andrey.
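[As background to the discontinuity_indicator discussion above: the bit lives in the optional adaptation field of a TS packet, per ISO/IEC 13818-1. The following is a minimal standalone sketch in plain C++, not live555 code; the payload-only branch shifts the payload down and discards its last two bytes, which is acceptable only for illustration, since a real remuxer would re-packetize instead.]

```cpp
#include <cassert>
#include <cstdint>
#include <cstring>

// Set the discontinuity_indicator bit in a single 188-byte TS packet.
// Returns false if the packet cannot be modified in place.
bool setDiscontinuityIndicator(uint8_t pkt[188]) {
  if (pkt[0] != 0x47) return false;            // not a TS sync byte
  uint8_t afc = (pkt[3] >> 4) & 0x03;          // adaptation_field_control
  if ((afc & 0x2) && pkt[4] >= 1) {
    // Adaptation field present, with at least the flags byte:
    // discontinuity_indicator is the top bit of the flags byte.
    pkt[5] |= 0x80;
  } else if (afc == 0x1) {
    // Payload only: insert a minimal 2-byte adaptation field.
    // (Illustration only -- this drops the last 2 payload bytes.)
    std::memmove(pkt + 6, pkt + 4, 188 - 6);
    pkt[3] = (uint8_t)((pkt[3] & 0xCF) | 0x30); // afc = '11' (af + payload)
    pkt[4] = 1;                                 // adaptation_field_length
    pkt[5] = 0x80;                              // discontinuity_indicator only
  } else {
    return false;  // reserved afc value, or empty adaptation field
  }
  return true;
}
```

In the context of the thread, one would apply this to the first packet of each joined file (and, per Karlov's reading of the spec, potentially to the first packet of each PID whose continuity_counter jumps).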
From mailing.list.recipient at gmail.com Fri Aug 10 08:50:39 2007 From: mailing.list.recipient at gmail.com (John Gunnarsson) Date: Fri, 10 Aug 2007 17:50:39 +0200 Subject: [Live-devel] Is this possible with live-devel lib? In-Reply-To: References: <6b332d3d0708100227j3b1adf38jfec84990fe42991c@mail.gmail.com> Message-ID: <6b332d3d0708100850r10fb6db6t4d730052bb7b47a2@mail.gmail.com> Sorry if I was a bit unclear. What I want to accomplish is to build a streamserver. That streamserver is going to stream music to a hardware device (a extreamer from barix). Instead of having a "dumb" streaming server that for example continously stream songs from a playlist to the client, I want the streamserver to be remotely controlled via a webpage (for example) where I can seek, pause, play etc. Much like having a VLC/Winamp player locally with the difference that I'm controlling the streamserver instead. Does it make any sense to you? //John On 8/10/07, Ross Finlayson wrote: > It's not clear from your message exactly what you're trying to do. > > >I would like to build a mp3 streaming server > > Here you talk about wanting to build a server. > > > which in many ways > >behaves as vlc etc, > >(build playlist, pause, seek, etc) > >but instead of audio out, I would like to stream the output to a mp3 > >hardware decoder that I have. > > But here you describe functionality that sounds like that of a client. > > Can you explain exactly what you want to do? > > >My question is if it's possible to seek in a song while streaming it > >(like in vlc, seek to 2:50 into the song playing)? > >Is it possible to pause, resume aswell? > > Yes, our RTSP server implementation supports seeking and pause/play > for MP3/RTP streams. (Our RTSP client implementation also supports > these operations, of course.) > -- > > Ross Finlayson > Live Networks, Inc. 
> http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > From brian.vdsouza at gmail.com Fri Aug 10 09:57:50 2007 From: brian.vdsouza at gmail.com (Brian D'Souza) Date: Fri, 10 Aug 2007 09:57:50 -0700 Subject: [Live-devel] Queries Message-ID: <5770ac1b0708100957s6b6bdecs72dbc69ed0ee4efd@mail.gmail.com> Would appreciate feedback. *a) So I was looking for something which demultiplexes this TS into audio and video streams.* Currently when I look at the code, we do have modules to demultiplex program streams, multiplex into transport streams, and also modules to convert from program stream to transport stream. *But what I would really need is either a demultiplexer for the transport stream, or rather a converter from transport to program stream. Do we have either support, which I might have missed?* No, our library currently has no support for *demultiplexing* Transport Streams. Sorry. >>>>Thanks, will have to look at something which does that and hooks it onto livemedia. I looked at an earlier post of June where you had suggested the below for piping streams from the encoder, or in my case the audio (AAC) and video streams. *"run your MPEG-2 TS grabbing code in a separate process (that writes to stdout), and just pipe it to "testMPEG2TransportStreamer" using the command line - i.e., yourGrabberApplication | testMPEG2TransportStreamer. Alternatively, if your Transport Stream source is available as a med file (e.g., device), then just run testMPEG2TransportStreamer < yourTransportStreamDeviceFileName".* Could I do something similar to pipe 2 outputs? Since whatever demux I ultimately decide to use outputs audio (AMR) and video (MP4-V-ES). I know it is slightly out of scope but would appreciate your feedback. Would it be better to read input from a socket in this case and make sure the demux sends data onto a socket? Or would it be better to integrate some sort of demux with live555? And if so, currently the ByteStreamFileSource only reads from files (do you think it would be an issue to read a stream of bytes, or would it be better to make some sort of temp files?) > > > *b) Another observation I made is that even if I got the above support to > demultiplex into 2 elementary streams( one would be MP4-V-ES ) and other > would be AAC supported audio stream(elementary). * > > *If my understanding is correct we dont have support for AAC ES right? How > hard would it be for me to incorporate this support.* > > Alternatively the SimpleRTPSink might be useful for any type of > information, but we have no support for a framer class for the AAC audio > type right? > > > > AAC audio should be streamed using the "MPEG4GenericRTPSink" class. (See > the "ADTSAudioFileServerMediaSubsession" for an example.) > > > Note also that we have a perfectly good RTSP server implementation. You > don't need to use a separate 'Darwin Streaming Server'. > Thanks. I could use something like this, right? I can create the audio sink as suggested by you, apart from a few parameters like no. of channels etc., which I can look into. I have enclosed a snippet of code: I have a doubt on what type of audio (AAC ES) framer class I would use?
void play() { // Open the input file as a 'byte-stream file source': ByteStreamFileSource* fileSource1 = ByteStreamFileSource::createNew(*env, inputFileName1); // First one for the video ES // Open the input file as a 'byte-stream file source': ByteStreamFileSource* fileSource2 = ByteStreamFileSource::createNew(*env, inputFileName2); // Second one for the AAC ES FramedSource* videoES = fileSource1; FramedSource* audioES = fileSource2; // Create a framer for the Video Elementary Stream: videoSource = MPEG4VideoStreamFramer::createNew(*env, videoES); audioSource = ?::createNew(*env, audioES); videoSink->startPlaying(*videoSource, afterPlaying, videoSink); audioSink->startPlaying(*audioSource, afterPlaying, audioSink); } Thanks for your prompt response. Appreciate your advice -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070810/bbd95043/attachment-0001.html From finlayson at live555.com Fri Aug 10 11:58:55 2007 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 10 Aug 2007 11:58:55 -0700 Subject: [Live-devel] how can I stream multiple files like one big file In-Reply-To: <20070810190610.jsjcje92as44ws0w@www.pochta.ru> References: <20070810190610.jsjcje92as44ws0w@www.pochta.ru> Message-ID: >If that's right, I don't know why Amino 110 playback this stream very >unstable and with artifact. We can't help you with that - the Amino 110 is not our product. You'll need to ask Amino. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Fri Aug 10 12:06:15 2007 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 10 Aug 2007 12:06:15 -0700 Subject: [Live-devel] Is this possible with live-devel lib?
In-Reply-To: <6b332d3d0708100850r10fb6db6t4d730052bb7b47a2@mail.gmail.com> References: <6b332d3d0708100227j3b1adf38jfec84990fe42991c@mail.gmail.com> <6b332d3d0708100850r10fb6db6t4d730052bb7b47a2@mail.gmail.com> Message-ID: >Instead of having a "dumb" streaming server that for example >continously stream songs from a playlist to the client, I want the >streamserver to be remotely controlled via a webpage (for example) >where I can seek, pause, play etc. Because you want to control the stream from a HTTP server, then perhaps you would be better off just streaming the MP3 via HTTP (like most MP3 streaming these days). If, however, you want to stream via RTP, then you should control the stream using RTSP (the standard control protocol for RTP streams). -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From xcsmith at rockwellcollins.com Fri Aug 10 15:11:06 2007 From: xcsmith at rockwellcollins.com (xcsmith at rockwellcollins.com) Date: Fri, 10 Aug 2007 17:11:06 -0500 Subject: [Live-devel] LIVE releases / Products using testProg piping Message-ID: Ross, 1. I hope it is not unbecoming for me to inquire, but are you expecting to release any significant trick mode changes in the next 2 weeks? I ask because I must now specify the version of LIVE555 which will be delivered with our baseline product. I should not like to miss out on important, planned improvements which are scheduled to be released imminently. Appologies if my question is inappropriate. 2. I have noticed your solutions which suggest piping between different LIVE555 test applications. Do you believe or know for certain that software products often use the LIVE555 in this way, rather than implementing an application linked against the LIVE? Thank you! Xochitl Smith -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://lists.live555.com/pipermail/live-devel/attachments/20070810/83722c2f/attachment.html From finlayson at live555.com Fri Aug 10 15:58:59 2007 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 10 Aug 2007 15:58:59 -0700 Subject: [Live-devel] LIVE releases / Products using testProg piping In-Reply-To: References: Message-ID: >1. I hope it is not unbecoming for me to inquire, but are you >expecting to release any significant trick mode changes in the next >2 weeks? No, probably not. (At least, not to the server code. I might add new options to "openRTSP" to support client 'trick mode' operations, because several people have been asking for this.) >2. I have noticed your solutions which suggest piping between >different LIVE555 test applications. Do you believe or know for >certain that software products often use the LIVE555 in this way, >rather than implementing an application linked against the LIVE? If a data source is available as a file, or as a pipe, then it's generally better (i.e., much simpler) to use it as such (using "ByteStreamFileSource"), rather than writing your own new "FramedSource" subclass that encapsulates the data. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070810/30b0d473/attachment.html From lurie.eli at gmail.com Sat Aug 11 14:57:09 2007 From: lurie.eli at gmail.com (Eli Lurie) Date: Sun, 12 Aug 2007 00:57:09 +0300 Subject: [Live-devel] Stopping getting frames Message-ID: <3818ecca0708111457s62d80fe6ib864cad588b7a650@mail.gmail.com> Hi 1.I am using openRTSP -s 4 -t -V -c rtsp://stream.msn.co.il/glz-stream 2.After ~500Kb the program stops recording frames 3. I am sniffing with the Wireshark and notice that RTP protocol packets are not arriving anymore Any idea? -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://lists.live555.com/pipermail/live-devel/attachments/20070811/d2c73221/attachment.html From finlayson at live555.com Sat Aug 11 23:48:24 2007 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 11 Aug 2007 23:48:24 -0700 Subject: [Live-devel] Stopping getting frames In-Reply-To: <3818ecca0708111457s62d80fe6ib864cad588b7a650@mail.gmail.com> References: <3818ecca0708111457s62d80fe6ib864cad588b7a650@mail.gmail.com> Message-ID: >Hi >1.I am using openRTSP -s 4 -t -V -c rtsp://stream.msn.co.il/glz-stream >2.After ~500Kb the program stops recording frames Yes, I see the same thing. >3. I am sniffing with the Wireshark and notice that RTP protocol >packets are not arriving anymore >Any idea? Sorry, no. For some unknown reason the server stops streaming. Because the server is a closed-source Microsoft product, your only hope may be to ask Microsoft for help (good luck :-) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From rajeshkumar.r at imimobile.com Sun Aug 12 23:53:44 2007 From: rajeshkumar.r at imimobile.com (rajesh) Date: Mon, 13 Aug 2007 12:23:44 +0530 Subject: [Live-devel] Consecutive Audio and Video Ports Message-ID: <003501c7dd76$b0ccf510$6902000a@imidomain.com> Hi Ross, when I set the port number via desiredPortNum, the audio and video RTP ports are consecutive. For example, with desiredPortNum = 57346, then Setup "video/H263-2000" subsession (client ports 57346-57347) Setup "audio/AMR" subsession (client ports 57348-57349) are assigned as the client audio and video ports. All these ports are consecutive. My requirement is to set different audio and video ports, like audio --- 49154-49155, video --- 57348-57349; the audio and video RTP ports should not be consecutive. Is there any provision in LIVE555 so that we can set the audio and video ports as per our requirement? Thanks and Regards. Rajesh Kumar Software Engineer +91 40 23555945 - 235 +91 99084 00027 -------------- next part -------------- An HTML attachment was scrubbed...
URL: http://lists.live555.com/pipermail/live-devel/attachments/20070812/d00f56a6/attachment.html From kan at pisem.net Mon Aug 13 01:56:00 2007 From: kan at pisem.net (Karlov Andrey) Date: Mon, 13 Aug 2007 12:56:00 +0400 Subject: [Live-devel] using testMPEG2ProgramStreamToTransportStream for streaming mpeg2 PS files Message-ID: <46C01CA0.50704@pisem.net> I want to stream MPEG-2 Program Stream files by converting them to a Transport Stream at run time. I have tried to use "testMPEG1or2ProgramToTransportStream", replacing the "FileSink" with an "MPEG2TransportStreamFramer" filter + "BasicUDPSink". It works, but sometimes the picture begins to shake and block artifacts ('cubes') become visible. What could the problem be? From samuelz_83 at hotmail.com Mon Aug 13 02:05:40 2007 From: samuelz_83 at hotmail.com (ZengWenfeng) Date: Mon, 13 Aug 2007 09:05:40 +0000 Subject: [Live-devel] How to use livemedia library in python? Message-ID: hi! I'm new to this mailing list, and I am working on a project; part of it is to implement a media streamer and receiver using RTP payloads. I found livemedia useful, but it's a C++ library, and the other part of the project is written in Python. I wonder how I can use the library with Python? I guess maybe someone has already done this before, so please help me. Thanks very much!! -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070813/d5d3e549/attachment.html From zcjgeorge at gmail.com Mon Aug 13 06:17:45 2007 From: zcjgeorge at gmail.com (George Zhu) Date: Mon, 13 Aug 2007 21:17:45 +0800 Subject: [Live-devel] How to use livemedia library in python?
In-Reply-To: References: Message-ID: <58a233710708130617x209aab9k414c9362e6c186f3@mail.gmail.com> I think you can write a console application with live555, then control it from your Python code. On 8/13/07, ZengWenfeng wrote: > hi! I'm new to this discussing list, and I am working with a project, part > of it is to implement a media streamer and receiver using RTP payload. I > found livemedia is useful, but it's a c++ library, and the other part of the > project is written by python. I wonder how can I use the library with > python? I guess maybe someone already has it done before, so please help me. > thanks very much!! > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -- George Zhu at Zhejiang University From finlayson at live555.com Mon Aug 13 07:30:15 2007 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 13 Aug 2007 07:30:15 -0700 Subject: [Live-devel] using testMPEG2ProgramStreamToTransportStream for streaming mpeg2 PS files In-Reply-To: <46C01CA0.50704@pisem.net> References: <46C01CA0.50704@pisem.net> Message-ID: >I want to stream mpeg2 Program Stream files by converting in a >transport stream in run time >I have tried to use "testMPEG1or2ProgramToTransportStream". >I have replaced "FileSink" with "MPEG2TransportStreamFramer" filter + >"BasicUDPSink". >It works, but a sometimes picture began to shake and it's possible to >see cubes. >In what there can be a problem? The problem may be that your outgoing UDP packets each contain only one (188-byte) Transport Stream packet. This is legal, but some clients might not handle many very small UDP packets as well as they would a smaller number of larger UDP packets. You can fix this by accumulating several 188-byte Transport Stream packets into each outgoing UDP packet.
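Ross's suggestion here - accumulating several 188-byte Transport Stream packets into each outgoing UDP packet - can be sketched roughly as follows. This is a simplified stand-in for illustration only, not the actual live555 implementation, and the function name accumulateTsPackets is invented:

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

static const std::size_t TS_PACKET_SIZE = 188;
static const std::size_t TS_PACKETS_PER_UDP = 7; // 7*188 = 1316 bytes, below a 1500-byte MTU

// Split a buffer of complete 188-byte TS packets into UDP-sized payload
// chunks, each holding up to TS_PACKETS_PER_UDP packets.
std::vector<std::vector<unsigned char> >
accumulateTsPackets(const std::vector<unsigned char>& tsData) {
    std::vector<std::vector<unsigned char> > payloads;
    const std::size_t maxChunk = TS_PACKET_SIZE * TS_PACKETS_PER_UDP;
    for (std::size_t off = 0; off < tsData.size(); off += maxChunk) {
        const std::size_t len = std::min(maxChunk, tsData.size() - off);
        payloads.push_back(std::vector<unsigned char>(
            tsData.begin() + off, tsData.begin() + off + len));
    }
    return payloads;
}
```

Each resulting chunk would then be handed to the UDP sink as a single datagram.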
People usually pack 7 Transport Stream packets into each UDP packet (because 7*188 = 1316 bytes, which is less than the 1500-byte MTU of most networks). I suggest reusing the code for the "MPEG2TransportStreamAccumulator" class; this code is bundled with the "wis-streamer" source code. Insert an object of this class between your "MPEG2TransportStreamFramer" and your "BasicUDPSink", and you will get larger outgoing UDP packets. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From yang-m-h at 126.com Mon Aug 13 07:30:22 2007 From: yang-m-h at 126.com (yang-m-h at 126.com) Date: Mon, 13 Aug 2007 22:30:22 +0800 (CST) Subject: [Live-devel] Why the testMPEG4VideoStreamer does not work? Message-ID: <46C06AFE.00005C.04201@bj126app17.126.com> Hi, I want testMPEG4VideoStreamer to transport a real-time encoded stream, so I modified the program: I simply replaced the appropriate "test.*" file name with the file name of my input device. But when the program runs and opens the device, it closes the device immediately. So I think stream devices are different from file input; perhaps, for example, a file has an end and its size can be obtained. So if I want the program to read from the device continually, and not stop and close the device, what should I do? Thank you! -------------- next part -------------- An HTML attachment was scrubbed...
URL: http://lists.live555.com/pipermail/live-devel/attachments/20070813/1b26d3ba/attachment.html From jeffw at ftl.tvrc.com Mon Aug 13 07:36:47 2007 From: jeffw at ftl.tvrc.com (Jeff Wisgo) Date: Mon, 13 Aug 2007 10:36:47 -0400 Subject: [Live-devel] Cannot play scaled file properly using QAM video card: file does not conform to DVB standards Message-ID: <50B4262F4266D811B94D00508B76B76A5EDF22@Exchange> I have been tasked with writing trick play logic to work with the Video Propulsion Q4LP QAM card, so as a proof-of-concept I am trying to use the "testMPEG2TransportStreamTrickPlay" test program to generate a 8x scaled file to simulate Fast-Forward. The file I generated plays fine on VLC and WMP, however it does not play at all on my QAM card. When I asked the VideoPropulsion technical support, they said the 8x .ts file did not conform to 'DVB standards', and more specifically: "There are repeated PTS & PCR timestamps having the same value. They look to be creating a PES packet for every slice of the video elementary stream, which is also unusual to say the least." I am curious if this is a bug with the live-555 logic, or if this working as designed. Would modifying the live-555 code to generate 'DVB standards'-compliant code be a major effort? > Jeffrey Wisgo > -------------- next part -------------- A non-text attachment was scrubbed... 
Name: not available Type: application/ms-tnef Size: 2638 bytes Desc: not available Url : http://lists.live555.com/pipermail/live-devel/attachments/20070813/8a00d3da/attachment.bin From finlayson at live555.com Mon Aug 13 12:49:05 2007 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 13 Aug 2007 12:49:05 -0700 Subject: [Live-devel] Cannot play scaled file properly using QAM video card: file does not conform to DVB standards In-Reply-To: <50B4262F4266D811B94D00508B76B76A5EDF22@Exchange> References: <50B4262F4266D811B94D00508B76B76A5EDF22@Exchange> Message-ID: Jeff, I don't know what exactly is meant by "conforming to DVB standards" (if there's a specific document describing such standards, I'd be interested to see it), but - to my knowledge - our generated 'trick play' Transport Streams comply with the MPEG Transport Stream specification (ISO/IEC 13818-1). Because the problems with these streams seem to occur with just one specific client, your best bet right now would be to try to persuade the manufacturers of this client to make it more tolerant of different types of input stream (perhaps via a firmware upgrade?), or else develop some sort of filter (hardware or software) that transforms our streams into a form that your client can handle. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Mon Aug 13 12:56:21 2007 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 13 Aug 2007 12:56:21 -0700 Subject: [Live-devel] Why the testMPEG4VideoStreamer does not work? In-Reply-To: <46C06AFE.00005C.04201@bj126app17.126.com> References: <46C06AFE.00005C.04201@bj126app17.126.com> Message-ID: >I want the testMPEG4VideoStreamer to transport the real time encoded >stream, so I modified the program, just simply replace the >appropriate "test.*" file name with the file name of my input >device. But when run the program to open the device, the program >close the device immediately.
When streaming MPEG-4 video, the code has to do an initial open,read,close on the input file - to get stream-specific configuration information - before it does the subsequent open,read* to actually stream the file. Therefore, your input device needs to be able to handle the close operation, followed by an open. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From dmaljur at elma.hr Tue Aug 14 00:04:07 2007 From: dmaljur at elma.hr (Dario maljur) Date: Tue, 14 Aug 2007 09:04:07 +0200 Subject: [Live-devel] TaskScheduler::DoEventLoop() question. Message-ID: <001a01c7de41$52edb8a0$1201000a@DARIO> Hi. I'm using MFC to display streamed media, and I have to implement step-by-step execution, but DoEventLoop() just takes control. Is there something in live555 I can use instead of DoEventLoop which executes events step by step, or do I have to modify the existing TaskScheduler code to add that myself? ELMA Kurtalj d.o.o. (ELMA Kurtalj ltd.) Vitezićeva 1a, 10000 Zagreb, Hrvatska (Viteziceva 1a, 10000 Zagreb, Croatia) Tel: 01/3035555, Faks: 01/3035599 (Tel: ++385-1-3035555, Fax: ++385-1-3035599 ) Www: www.elma.hr; shop.elma.hr E-mail: elma at elma.hr (elma at elma.hr) pitanje at elma.hr (questions at elma.hr) primjedbe at elma.hr (complaints at elma.hr) prodaja at elma.hr (sales at elma.hr) servis at elma.hr (servicing at elma.hr) shop at elma.hr (shop at elma.hr) skladiste at elma.hr (warehouse at elma.hr) -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070814/e7d15a2f/attachment.html From finlayson at live555.com Tue Aug 14 01:47:47 2007 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 14 Aug 2007 01:47:47 -0700 Subject: [Live-devel] TaskScheduler::DoEventLoop() question. In-Reply-To: <001a01c7de41$52edb8a0$1201000a@DARIO> References: <001a01c7de41$52edb8a0$1201000a@DARIO> Message-ID: >Hi. >I'm using MFC to display streamed media.
And I have to implement >step by step execution, >But DoEventLoop() just takes control. Is there something in live555 >I can use instead DoEventLoop >Witch executes events step by step, or do I have to modify existing >TaskScheduler code to add that myself? If you really want to, you can write your own subclass of "TaskScheduler" and reimplement "doEventLoop()", or write your own subclass of "BasicTaskScheduler0" and reimplement "SingleStep". I.e., if you wish, you can implement your own event loop (provided that it implements the interface defined in "UsageEnvironment/include/UsageEnvironment.hh"). The event loop implementation - "BasicTaskScheduler" - that's included with the supplied code is just that: 'basic'. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070814/9ed032de/attachment.html From dmaljur at elma.hr Tue Aug 14 02:07:57 2007 From: dmaljur at elma.hr (Dario maljur) Date: Tue, 14 Aug 2007 11:07:57 +0200 Subject: [Live-devel] TaskScheduler::DoEventLoop() question. In-Reply-To: Message-ID: <003101c7de52$9eb009d0$1201000a@DARIO> >If you really want to, you can write your own subclass of "TaskScheduler" and reimplement "doEventLoop()", or write your own subclass of >"BasicTaskScheduler0" and reimplement "SingleStep". I.e., if you wish, you can implement your own event loop (provided that it implements >the interface defined in "UsageEnvironment/include/UsageEnvironment.hh"). >The event loop implementation - "BasicTaskScheduler" - that's included with the supplied code is just that: 'basic'. -- >Ross Finlayson >Live Networks, Inc. >http://www.live555.com/ And if I mess with:

TaskScheduler::DoEventLoop() {
  while (1) {
    if (watchVariable != NULL && *watchVariable != 0) break;
    SingleStep();
  }
}

Say, to comment out the while loop and the if statement?
And implement it in my OnTimer function. What effect could that have on the operation of the library, considering that I saw many places in the lib that use doEventLoop? -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070814/d85e8e96/attachment-0001.html From rjbrennn at gmail.com Tue Aug 14 05:30:22 2007 From: rjbrennn at gmail.com (Russell Brennan) Date: Tue, 14 Aug 2007 08:30:22 -0400 Subject: [Live-devel] TaskScheduler::DoEventLoop() question. In-Reply-To: <001a01c7de41$52edb8a0$1201000a@DARIO> References: <001a01c7de41$52edb8a0$1201000a@DARIO> Message-ID: <46C1A05E.3020206@gmail.com> An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070814/9606a4e3/attachment.html From mailing.list.recipient at gmail.com Tue Aug 14 13:58:48 2007 From: mailing.list.recipient at gmail.com (John Gunnarsson) Date: Tue, 14 Aug 2007 22:58:48 +0200 Subject: [Live-devel] Is this possible with live-devel lib? In-Reply-To: References: <6b332d3d0708100227j3b1adf38jfec84990fe42991c@mail.gmail.com> <6b332d3d0708100850r10fb6db6t4d730052bb7b47a2@mail.gmail.com> Message-ID: <6b332d3d0708141358v14e3ff7ch426fc97613dab03b@mail.gmail.com> Okay, My streaming receiver hardware cannot handle the RTSP protocol (only RTP), but I would like to use functions like seek, pause, resume, etc. on the server side. To utilize functions like seek, is that what's called "trick play"?
Is that possible with HTTP streaming? Sorry to bother you with stupid questions, but I would really appreciate any help I can get. To set things straight, this is my setup: [Hardware streaming client] <---- TCP/IP -----> [Streaming Server] <--- TCP/IP----> [Remote control] So via the remote control I would like to seek, pause, play, etc., and therefore the RTSP protocol is not an option (since it's not the streaming client that would request operations such as seek). //John On 8/10/07, Ross Finlayson wrote: > >Instead of having a "dumb" streaming server that for example > >continously stream songs from a playlist to the client, I want the > >streamserver to be remotely controlled via a webpage (for example) > >where I can seek, pause, play etc. > > Because you want to control the stream from a HTTP server, then > perhaps you would be better off just streaming the MP3 via HTTP (like > most MP3 streaming these days). > > If, however, you want to stream via RTP, then you should control the > stream using RTSP (the standard control protocol for RTP streams). > -- > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > From finlayson at live555.com Tue Aug 14 14:11:31 2007 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 14 Aug 2007 14:11:31 -0700 Subject: [Live-devel] Is this possible with live-devel lib? In-Reply-To: <6b332d3d0708141358v14e3ff7ch426fc97613dab03b@mail.gmail.com> References: <6b332d3d0708100227j3b1adf38jfec84990fe42991c@mail.gmail.com> <6b332d3d0708100850r10fb6db6t4d730052bb7b47a2@mail.gmail.com> <6b332d3d0708141358v14e3ff7ch426fc97613dab03b@mail.gmail.com> Message-ID: >Okay, >My streaming receiver hardware cannot handle the rtsp protocoll (only rtp), >but i would like to use the functions like seek,pause,resume etc on >the serverside.
>To utilize functions like seek etc, is that what's kalled "trick play"? >Is that possible with http streaming? > >Sorry to bother you with stupid questions, but I would really >appreciate any help I can get. > >To set things straight, this is my setup: > >[Hardware streaming client] <---- TCP/IP -----> [Streaming Server] ><--- TCP/IP----> [Remote control] > >So via the remote-controll I would like to seek, pause, play etc, and >therefor the RTSP protocol is not an option (since it's not the >streaming client that would request operations such as seek) That's not correct. You can use the RTSP protocol to control streams that are sent to a third-party RTP receiver (i.e., separate from the RTSP client). To support this, our RTSP server implementation must be modified slightly - search for "RTSP_ALLOW_CLIENT_DESTINATION_SETTING" in "RTSPServer.cpp". You would also need to modify our RTSP client implementation to add a "destination=" field in the RTSP "SETUP" command. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From Camillo.Ferraris at mediaset.it Thu Aug 16 03:17:33 2007 From: Camillo.Ferraris at mediaset.it (Camillo Francesco Ferraris) Date: Thu, 16 Aug 2007 12:17:33 +0200 Subject: [Live-devel] Select media folder Message-ID: Hi! Is it possible to specify (when the application starts) a different folder to search for media? Thanks! Regards, Camillo Ing. Camillo Ferraris Department of Research & Engineering VIDEOTIME spa Gruppo Mediaset Viale Europa 44 20093 Cologno Monzese (MI) Italy camillo.ferraris at mediaset.it tel +39 (0)2 25149121 fax +39 (0)2 25149141 -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070816/16c62b68/attachment.html From jps330 at psu.edu Thu Aug 16 08:55:39 2007 From: jps330 at psu.edu (Jordan Shelley) Date: Thu, 16 Aug 2007 11:55:39 -0400 Subject: [Live-devel] VLC vs. mplayer Message-ID: <01MK816W5YGA91WAPT@ponyexpress.arl.psu.edu> Ross, Quick question. I am using your library to stream an encoder output (set to MPEG1) wirelessly from one computer to another using RTP. Real-time video is very important in this project. When the stream is viewed with VLC the picture quality is excellent, but the video latency is subpar (about a 1-second delay from real time). I checked the runtime messages on VLC while the stream is being played and it is not dropping many frames or missing packets. MPlayer, on the other hand, has a very low video latency (about a quarter second) but the picture quality is horrible (lots of artifacts and pixelation), and the video latency seems to "age" sometimes. Have you heard of this before? Can you offer me any guidance on how to get VLC's latency down (it seems as though it is buffering the video too much)? Thanks a lot. -Jordan -------------- next part -------------- An HTML attachment was scrubbed...
URL: http://lists.live555.com/pipermail/live-devel/attachments/20070816/f193d507/attachment.html From finlayson at live555.com Thu Aug 16 11:07:34 2007 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 16 Aug 2007 11:07:34 -0700 Subject: [Live-devel] VLC vs. mplayer In-Reply-To: <01MK816W5YGA91WAPT@ponyexpress.arl.psu.edu> References: <01MK816W5YGA91WAPT@ponyexpress.arl.psu.edu> Message-ID: >Can you offer me any guidance on how to get VLCs latency down (it >seems as though it is buffering the video too much)? See http://lists.live555.com/pipermail/live-devel/2007-June/006903.html -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070816/25b40480/attachment-0001.html From cwein at hotmail.com Thu Aug 16 11:29:22 2007 From: cwein at hotmail.com (Chris Wein) Date: Thu, 16 Aug 2007 11:29:22 -0700 Subject: [Live-devel] VLC vs. mplayer Message-ID: Out of the box VLC has a 1-second delay. I have seen it work at about 200ms end to end over a LAN, which is not too bad. Check Settings -> Preferences -> Demuxers -> RTP with advanced options enabled, and reduce the caching value as low as you can without observing underflow. -------------- next part -------------- An HTML attachment was scrubbed...
URL: http://lists.live555.com/pipermail/live-devel/attachments/20070816/5c22f916/attachment.html From brian.vdsouza at gmail.com Thu Aug 16 11:52:26 2007 From: brian.vdsouza at gmail.com (Brian D'Souza) Date: Thu, 16 Aug 2007 11:52:26 -0700 Subject: [Live-devel] Queries: regarding AAC Message-ID: <5770ac1b0708161152u34598f9asab727e5f1b582417@mail.gmail.com> Hi, I have been tracking posts regarding streaming MPEG4 audio (encoded in AAC), and in an earlier post I was instructed to have a look at MPEG4GenericRTPSink and ADTSAudioFileServerMediaSubsession. My queries: 1. I am unable to find any framer class for audio (as was used for MPEG4ESVideoStreamFramer, along with a corresponding parser). The issue is that I am using Live555 as an RTP packetizer for AAC content and relaying it to Darwin (which does the necessary RTSP part), so I cannot just use live555 as an RTSP server. I want to know whether just creating an RTP sink using MPEG4GenericRTPSink, then using the source ADTSAudioFileSource (which works from a file/encoder), and then invoking startPlaying() is enough. Wouldn't I need a framer, which decides what is in an audio frame, has methods like continuePlaying(), getNextFrame(), etc., and processes it appropriately? 2. In the MPEG4ESVideo elementary streaming (relay to Darwin) test program, there is a hack to read configuration information from the stream, to generate an SDP description; and an ANNOUNCE RTSP operation is done. Does the hack need to be there for MPEG4 audio as well? Similarly, if I were to stream h264 video content, is this hack needed? Thanks for the help, -------------- next part -------------- An HTML attachment was scrubbed...
URL: http://lists.live555.com/pipermail/live-devel/attachments/20070816/14cc6c44/attachment.html From finlayson at live555.com Thu Aug 16 13:18:35 2007 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 16 Aug 2007 13:18:35 -0700 Subject: [Live-devel] Queries: regarding AAC In-Reply-To: <5770ac1b0708161152u34598f9asab727e5f1b582417@mail.gmail.com> References: <5770ac1b0708161152u34598f9asab727e5f1b582417@mail.gmail.com> Message-ID: >1. I am unable to find any framer class ( as was used for >MPEG4ESVideoStreamFramer ; as well as a corresponding parser) for >the same. The issue is that I am using Live555 as an RTP packetizer >for aac content and relaying it to Darwin which does the necessary >RTSP part). So I cannot just use live555 as an RTSP server. Yes you can. You've just chosen not to. :-) > I want to know that just creating an RTP sink using the >MPEG4GenericRTPSink ; then jusing the source >ADTSAudioFileSource(which works from file/encoder) and then invoking >startPlaying() is enough? Yes. See the "ADTSAudioFileServerMediaSubsession" for an illustration. >Wouldn't I need a framer; which decides what is in an audio frame; >has methods like continuePlaying(); getNextFrame() etc and processes >it appropriately. No, because "ADTSAudioFileSource" delivers discrete frames (and with correct presentation times), so you don't need a separate 'framer' class. >2. In the MPEG4ESVideo Elementary streaming (relay to darwin)test >program , there is a hack to read configuration information from the >stream ; to generate an SDP description; and an announce RTSP >operation is done. Does the hack need to be there for MPEG4 audio as >well.. Similarily if I were to stream h264 video content, is this >hack needed? Probably, because both of those codecs have stream-specific 'configuration' information that needs to be added to the SDP description, before it gets used. -- Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ From aviadr1 at gmail.com Thu Aug 16 13:52:37 2007 From: aviadr1 at gmail.com (aviad rozenhek) Date: Thu, 16 Aug 2007 23:52:37 +0300 Subject: [Live-devel] Live555 : Queries In-Reply-To: <5770ac1b0708161127i46d27b80p57029bb0458afa47@mail.gmail.com> References: <5770ac1b0708160938p6aafff6er104f968acc1cc720@mail.gmail.com> <5770ac1b0708161127i46d27b80p57029bb0458afa47@mail.gmail.com> Message-ID: Hi Brian, welcome to the field, I was a newbie myself 3 months ago. first thing - the code I wrote is client-side code, so it is less relevant to you. what you need to do is the following:
- read the file, one NAL at a time. use the aforementioned H264bitstream to do this
- skim through RFC 3984, RTP Payload Format for H.264 Video
- aggregation packets (STAP) are packets designed to hold together many small NALs, to increase network efficiency
- fragmentation packets (FU) are packets designed to break huge NALs (like IDR/key frames) into network-size packets
- you should write a framer class that aggregates small NALs into an STAP packet and breaks big NALs into several FUs. I don't think this functionality is available in live555. when you implement it, I suggest you submit it to the forum.
- an H264 encoder has some preferences or options, which it must convey to the decoder, otherwise the decoder can't work.
- this information can be signalled in NALs called SPS (sequence parameter set) and PPS (picture parameter set)
- a good way to give the decoder this information is by putting the SPS/PPS NALs before every IDR frame; this is especially true for streaming, when a client can start decoding in the middle of the stream
- in RTSP settings, this information is often sent in the SDP from the server side to the client, in the fmtp field (again, see RFC 3984), and the NALs are presented as BASE64 text
I hope this helps Aviad On 8/16/07, Brian D'Souza wrote: > Hi Aviad, > > Well I haven't really executed any of the test programs; but I traced the > flow for testMPEG4ESVideoToDarwin.cpp: This involved understanding the > MultiFramedSink, the Framer, parser classes for MPEG4 video, etc. So I can > say I have managed a reasonable if not extensive understanding of the same. > > Let me outline in more detail what I am looking to do. > > I have a Motorola set top box which has an audio and video encoder and > streams live content (mp4: codec used for video: h264 and for audio: aac). > I have to design a module which streams this content in RTP. > > For aac, I think live media has some code to do the streaming using ADTS. > Not sure if that will work though. > > For h264, I have been scanning through some posts, but stumbled upon yours > and thought it was relevant: > > What I need: > > Ideally, take input from an encoder (a video (h264) file for now, which contains > the output of a video encoder, containing NAL units). Then I need a framer > class or something which reads frame by frame from the source and generates RTP > packets, just like the MultiFramedRTPSink class (I have come to the conclusion > that this is the class used by all media to create RTP packets and send them > on using packet buffers, etc.) > > These RTP packets are then fed into the Darwin streaming server using > DarwinInjector.cpp (of live media) and then to my client, a mobile > phone (which connects using RTSP) to the DSS.
My client has support for decoding > mp4 and support for RTP/RTSP > > Darwin has capabilities on waiting for relay; and Live555 has the > functionality of basically injecting packets from RTPSink onto the server( > it does this by creating an SDP description and using RTSP announce > protocol. > > I havent looked at the H264BitStream project , but on first look at the > code you sent me; it looks like streaming from a file. Does it actually > create RTP Packets and send them. > > > I know I am a bit vague; but the thing is I am new to this domain and a > summer intern. So have just picked up bits and peices and trying to design > module which packetizes elementary streams(h264,aac) and does the relaying. > > Sorry for the inconvenience, but would really appreciate your advice. > > Brian > > p.s: You can post my queries to the group, if you thing they might be > useful to others. > > > On 8/16/07, aviad rozenhek wrote: > > > > Hi Brian, > > > > I have not understood from your mail whether you are familiar with > > live555 or not. > > also I assume the mobile clients want to have audio in addition to > > video, so in order to understand your problem better I would need the > > complete scenario > > (live source VS file audio video etc). > > > > basically, if you are only doing server side then your biggest challenge > > is to understand live555 good enough to integrate it into your server. > > > > Let me rephrase my post on live555: > > > > - the SDP information that is received in the initial RTSP > > handshake has some crucial information without which it is impossible to > > decode h264 > > - this information can be used in two ways: > > - initialize the decoder with the information in some > > decoder-proprietary manner > > - put this information in the stream itself (increases size > > of stream slightly) - this is what I did and what my post describes. 
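A minimal sketch of the second option above (putting the parameter sets in the stream itself). This is a hypothetical helper, not a LIVE555 API; it assumes the raw SPS/PPS NAL units have already been obtained, e.g. by Base64-decoding the SDP's "sprop-parameter-sets" fmtp attribute:

```cpp
#include <cstdint>
#include <initializer_list>
#include <vector>

// Hypothetical helper (not part of LIVE555): given raw SPS, PPS and IDR NAL
// units, emit one Annex-B chunk "startcode SPS, startcode PPS, startcode IDR",
// so the parameter sets travel in-band ahead of every key frame.
std::vector<uint8_t> prependParameterSets(const std::vector<uint8_t>& sps,
                                          const std::vector<uint8_t>& pps,
                                          const std::vector<uint8_t>& idr) {
  static const uint8_t startCode[4] = {0x00, 0x00, 0x00, 0x01};
  std::vector<uint8_t> out;
  for (const std::vector<uint8_t>* nal : {&sps, &pps, &idr}) {
    out.insert(out.end(), startCode, startCode + 4);  // Annex-B start code
    out.insert(out.end(), nal->begin(), nal->end());  // the NAL unit itself
  }
  return out;
}
```

Writing such a chunk before every IDR frame is what lets a client that joins mid-stream initialize its decoder without any out-of-band signalling.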
> > > > you might need to do this (add the information directly to the stream) > > if the eventual decoder (either darwin if it does transcoding, otherwise the > > mobile client) doesn't know the information beforehand. > > > > I will send you the modified source in a seperate email. > > > > in the interest of clarity, since you are the 2nd person to ask me on > > that post outside of the forum, I would be happy if you would allow me to > > post this conversation back to the mailing list for future reference. > > > > Cheers > > Aviad Rozenhek > > > > > > On 8/16/07, Brian D'Souza wrote: > > > > > > Hi Aviad, > > > > > > I had read one of your posts from the Live555 discussion forums. I am > > > looking to design/re-use a module which takes input from file/encoder, > > > reading h264 content( NAL units i think) and generating RTP packets which is > > > then relayed to the Darwin server and then streamed by the Darwin Server to > > > be viewed by clients on the mobile. > > > > > > As per a post of yours, I understood that you did something on the > > > similar lines. Did you modify the live555 code to establish the same. > > > I would really appreciate your guidance/advice and if you have done > > > something on the similar lines I would be grateful if you send me the > > > modified source > > > > > > > > > Thanks in advance > > > > > > Regards > > > Brian > > > > > > > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://lists.live555.com/pipermail/live-devel/attachments/20070816/a7f4a5fc/attachment.html From finlayson at live555.com Thu Aug 16 14:42:22 2007 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 16 Aug 2007 14:42:22 -0700 Subject: [Live-devel] Live555 : Queries In-Reply-To: References: <5770ac1b0708160938p6aafff6er104f968acc1cc720@mail.gmail.com> <5770ac1b0708161127i46d27b80p57029bb0458afa47@mail.gmail.com> Message-ID: >skim through RFC 3984 RTP Payload Format for H.264 Video You really shouldn't have to do this, because the library already implements this RTP payload format (using the "H264VideoRTPSink" class). All you need to do is feed NAL units to it. However... >Aggregation packets (STAP) are packets designed to hold together many small NALs, to increase network efficiency Yes, we currently don't create these 'aggregation packets' (a special kind of NAL unit) ourselves, so if you want to use these, you'll need to create them yourself. However, it's usually uncommon to have sequences of small NAL units. >fragmentation packets (FU) are packets designed to break huge NALs (like IDR/key frames) into network-sized packets No, you don't have to implement these yourself. Our "H264VideoRTPSink" class automatically does this for you. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From pan_xiaolei at sina.com Thu Aug 16 20:02:29 2007 From: pan_xiaolei at sina.com (pan_xiaolei) Date: Fri, 17 Aug 2007 11:02:29 +0800 Subject: [Live-devel] How can I do user-based authentication? Message-ID: <200708171102285260514@sina.com> Hi, I want to construct a video site. When someone needs to watch a video, he or she must log in through a web site; after that, he will get a list in which each item links to an rtsp server, e.g. rtsp://123.45.67.89/test.mp3. My question: how can I deny a user who has not logged in yet access to this file? How does the rtsp server know that the client has logged in, given that it may run on another machine rather than on the web server? Thanks!! David.Pan 2007-08-17 From finlayson at live555.com Thu Aug 16 22:01:49 2007 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 16 Aug 2007 22:01:49 -0700 Subject: [Live-devel] How can I do user-based authentication? In-Reply-To: <200708171102285260514@sina.com> References: <200708171102285260514@sina.com> Message-ID: >I want to construct a video site. When someone needs to watch a video, he or she must log in through a web site; after that, he will get a list in which each item links to an rtsp server, e.g. rtsp://123.45.67.89/test.mp3. > >My question: how can I deny a user who has not logged in yet access to this file? How does the rtsp server know that the client has logged in, given that it may run on another machine rather than on the web server? For an illustration of how to add username/password authentication to a RTSP server, see the code bracketed #ifdef ACCESS_CONTROL #endif in "mediaServer/live555MediaServer.cpp" or "testProgs/testOnDemandRTSPServer.cpp". The RTSP client will then prompt each user for a username and password, before the stream can be played. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From rajeshkumar.r at imimobile.com Fri Aug 17 03:20:21 2007 From: rajeshkumar.r at imimobile.com (rajesh) Date: Fri, 17 Aug 2007 15:50:21 +0530 Subject: [Live-devel] use of streamId Message-ID: <013e01c7e0b8$3785ac20$6902000a@imidomain.com> Hi, what is the use of a=control:streamid=0?
I am getting audio streamId 0 and video streamId 1. What is the use of the stream Id? Thanks and Regards Rajesh Kumar Software Engineer +91 40 23555945 - 235 +91 99084 00027 From finlayson at live555.com Fri Aug 17 11:37:28 2007 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 17 Aug 2007 11:37:28 -0700 Subject: [Live-devel] use of streamId In-Reply-To: <013e01c7e0b8$3785ac20$6902000a@imidomain.com> References: <013e01c7e0b8$3785ac20$6902000a@imidomain.com> Message-ID: >Hi, what is the use of a=control:streamid=0? >I am getting audio streamId 0 and video streamId 1. > >What is the use of the stream Id? The string "streamid=" is something that the server adds to the "rtsp://" URL to identify each individual RTP stream (audio or video) within the session. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From brian.vdsouza at gmail.com Sun Aug 19 13:40:16 2007 From: brian.vdsouza at gmail.com (Brian D'Souza) Date: Sun, 19 Aug 2007 13:40:16 -0700 Subject: [Live-devel] Live555 : Queries In-Reply-To: References: <5770ac1b0708160938p6aafff6er104f968acc1cc720@mail.gmail.com> <5770ac1b0708161127i46d27b80p57029bb0458afa47@mail.gmail.com> Message-ID: <5770ac1b0708191340t7969d179y9fbdc045efecf0f9@mail.gmail.com> Thanks for the inputs. I needed your suggestions on the steps I would need to follow to feed these NAL units into the "H264VideoRTPSink". For starters: a) I am only going to be concerned with one NAL unit at a time. As you mentioned, I could discard working on aggregation packets and just feed a single NAL unit into a packet.
Would this involve just making a H264 framer class and read a NAL unit and package it into a frame which i give as input to the H264RTPSink. Please correct my understanding. b) What would be drawbacks of not using aggregation packets? c) Is there anything else I would need to implement? Thanks, Brian On 8/16/07, Ross Finlayson wrote: > > > - skim through RFC 3984 RTP Payload Format for H.264 Video > > > > You really shouldn't have to do this, because the library already > implements this RTP payload format (using the "H264VideoRTPSink" class). > All you need to do is feed NAL units to it. However... > > > > > - Aggregation packets (STAP) are packets designed to hold together > many small NALs, to increase network efficiency > > > > Yes, we currently don't create these 'aggregation packets' (a special kind > of NAL unit) ourselves, so if you want to use these, you'll need to create > them yourself. However, it's usually uncommon to have sequences of small > NAL units > > > > > - fragmentation packets (FU) are packets designed to break huge NALs > (like IDR/key frames) into network size packets > > > > No, you don't have to implement these yourself. Our "H264VideoRTPSink" > class automatically does this for you. > > -- > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070819/116fe64b/attachment.html From mail at beim-beme.de Mon Aug 20 04:28:17 2007 From: mail at beim-beme.de (Benjamin Meier) Date: Mon, 20 Aug 2007 13:28:17 +0200 Subject: [Live-devel] Difference between H264VideoRTPSink an H264VideoFileSink? Message-ID: <46C97AD1.9000405@beim-beme.de> Hi, we are implementing a Transcoder, who: 1. 
receives several demuxed H264/SVC streams via RTSP from another machine 2. muxes the several streams into one big stream 3. should send this big stream into an RTPSink to stream it to the clients via multicast. But here we are in trouble. When using an H264VideoFileSink, everything's working like a charm; the created file is like the one the whole streaming started with. So I'm supposing everything about our code is ok. Using the intended H264VideoRTPSink to stream the muxed stream to a multicast address leads us to the error "attempting to read more than once at the same time". Can you imagine why the FileSink is working, contrary to the RTPSink? What might be the basic difference between RTPSink and FileSink? Thanks! Benjamin From Chen.Li2 at boeing.com Mon Aug 20 09:39:21 2007 From: Chen.Li2 at boeing.com (Li, Chen) Date: Mon, 20 Aug 2007 09:39:21 -0700 Subject: [Live-devel] Open Questions Regarding Directory Setup Message-ID: Hello, This question is a little basic, but I see an ability to set up a session for an entire folder. I want to be able to set up one session for all the files in one folder, but from what I see it does not seem to work, as I get a 404 stream not found. Are there limits to what can be recognized as a valid rtsp command that does not specify a file, but instead specifies a directory? Is it a matter of a config or setup file needed in the folder? Or does it have to do with compression methods instead, as I currently have a bunch of transport streams in the folder? Thank you! --Chen Li From finlayson at live555.com Mon Aug 20 12:26:12 2007 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 20 Aug 2007 12:26:12 -0700 Subject: [Live-devel] Difference between H264VideoRTPSink an H264VideoFileSink? In-Reply-To: <46C97AD1.9000405@beim-beme.de> References: <46C97AD1.9000405@beim-beme.de> Message-ID: >But here we are in trouble.
When using an H264VideoFileSink, everything's working like a charm; the created file is like the one the whole streaming started with. So I'm supposing everything about our code is ok. Using the intended H264VideoRTPSink to stream the muxed stream to a multicast address leads us to the error "attempting to read more than once at the same time". >Can you imagine why the FileSink is working, contrary to the RTPSink? I don't know, but I think this is a 'red herring'. Your real problem is the error message about attempting to read the same object more than once at the same time. You need to fix the bug in your code that's causing this to happen. To prevent this error, you should make sure that: 1/ After you call "getNextFrame()" on the upstream object, you don't call it again until after you've handled the 'after getting' function (that you passed as a parameter to the first call to "getNextFrame()"), and 2/ You call "FramedSource::afterGetting()" exactly once (and no more!) for each call to your "doGetNextFrame()" function (by your downstream object). -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From finlayson at live555.com Mon Aug 20 12:37:21 2007 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 20 Aug 2007 12:37:21 -0700 Subject: [Live-devel] Open Questions Regarding Directory Setup In-Reply-To: References: Message-ID: >This question is a little basic, but I see an ability to set up a session for an entire folder. I want to be able to set up one session for all the files in one folder, but from what I see it does not seem to work, as I get a 404 stream not found. Are there limits to what can be recognized as a valid rtsp command that does not specify a file, but instead specifies a directory? A RTSP "DESCRIBE" command (and a subsequent "SETUP", "PLAY") applies only to a single, specific stream. There is nothing in the RTSP protocol that applies to a set of streams, e.g., multiple files in a directory.
(In fact, the RTSP protocol has no concept of a 'directory' at all.) If you want to - for example - list the files in a directory, then you will need to use some other protocol, such as HTTP. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From Chen.Li2 at boeing.com Mon Aug 20 13:02:24 2007 From: Chen.Li2 at boeing.com (Li, Chen) Date: Mon, 20 Aug 2007 13:02:24 -0700 Subject: [Live-devel] Open Questions Regarding Directory Setup In-Reply-To: References: Message-ID: I only ask because I see this example and it makes me think an entire directory is being setup.

C->V: SETUP rtsp://video.example.com/twister/video RTSP/1.0
      CSeq: 1
      Transport: RTP/AVP/UDP;unicast;client_port=3058-3059

Section 14.1 of the RTSP spec. Any help would be appreciated. Full section:

C->W: GET /twister.sdp HTTP/1.1
      Host: www.example.com
      Accept: application/sdp

W->C: HTTP/1.0 200 OK
      Content-Type: application/sdp

Schulzrinne, et. al. Standards Track [Page 63] RFC 2326 Real Time Streaming Protocol April 1998

      v=0
      o=- 2890844526 2890842807 IN IP4 192.16.24.202
      s=RTSP Session
      m=audio 0 RTP/AVP 0
      a=control:rtsp://audio.example.com/twister/audio.en
      m=video 0 RTP/AVP 31
      a=control:rtsp://video.example.com/twister/video

C->A: SETUP rtsp://audio.example.com/twister/audio.en RTSP/1.0
      CSeq: 1
      Transport: RTP/AVP/UDP;unicast;client_port=3056-3057

A->C: RTSP/1.0 200 OK
      CSeq: 1
      Session: 12345678
      Transport: RTP/AVP/UDP;unicast;client_port=3056-3057; server_port=5000-5001

C->V: SETUP rtsp://video.example.com/twister/video RTSP/1.0
      CSeq: 1
      Transport: RTP/AVP/UDP;unicast;client_port=3058-3059

V->C: RTSP/1.0 200 OK
      CSeq: 1
      Session: 23456789
      Transport: RTP/AVP/UDP;unicast;client_port=3058-3059; server_port=5002-5003

C->V: PLAY rtsp://video.example.com/twister/video RTSP/1.0
      CSeq: 2
      Session: 23456789
      Range: smpte=0:10:00-

V->C: RTSP/1.0 200 OK
      CSeq: 2
      Session: 23456789
      Range: smpte=0:10:00-0:20:00
      RTP-Info: url=rtsp://video.example.com/twister/video; seq=12312232;rtptime=78712811

C->A: PLAY rtsp://audio.example.com/twister/audio.en RTSP/1.0
      CSeq: 2
      Session: 12345678
      Range: smpte=0:10:00-

A->C: RTSP/1.0 200 OK
      CSeq: 2
      Session: 12345678
      Range: smpte=0:10:00-0:20:00
      RTP-Info: url=rtsp://audio.example.com/twister/audio.en; seq=876655;rtptime=1032181

C->A: TEARDOWN rtsp://audio.example.com/twister/audio.en RTSP/1.0
      CSeq: 3
      Session: 12345678

A->C: RTSP/1.0 200 OK
      CSeq: 3

C->V: TEARDOWN rtsp://video.example.com/twister/video RTSP/1.0
      CSeq: 3
      Session: 23456789

V->C: RTSP/1.0 200 OK
      CSeq: 3

Even though the audio and video track are on two different servers, and may start at slightly different times and may drift with respect to each other, the client can synchronize the two using standard RTP methods, in particular the time scale contained in the RTCP sender reports. -----Original Message----- From: Ross Finlayson [mailto:finlayson at live555.com] Sent: Monday, August 20, 2007 12:37 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Open Questions Regarding Directory Setup >This question is a little basic, but I see an ability to setup a session for an entire folder. I want to be able to set up one session for all the files in one folder, but from what I see it does not seem to work as I get a 404 stream not found. Are there limits to what can be recognized as a valid rtsp command that does not specify a file, but instead specifies a directory? A RTSP "DESCRIBE" command (and a subsequent "SETUP","PLAY") applies only to a single, specific stream. There is nothing in the RTSP protocol that applies to a set of streams, e.g., multiple files in a directory. (In fact, the RTSP protocol has no concept of a 'directory' at all.) If you want to - for example - list the files in a directory, then you will need to use some other protocol, such as HTTP. -- Ross Finlayson Live Networks, Inc.
http://www.live555.com/ _______________________________________________ live-devel mailing list live-devel at lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel From l1436636 at yahoo.com Mon Aug 20 13:19:41 2007 From: l1436636 at yahoo.com (hong liu) Date: Mon, 20 Aug 2007 13:19:41 -0700 (PDT) Subject: [Live-devel] about the library of liblivemedia.a Message-ID: <105198.83328.qm@web51408.mail.re2.yahoo.com> Hello Ross, I found that the library liblivemedia.a is created by using the command "ld -r -Bstatic ...". What is the difference from using ar? Now my mplayer uses the live library to receive streaming videos. If I want to change some function in the library liblivemedia.a, for example a class constructor or member function, and I don't want to recompile mplayer again, what is the best way for me to achieve it? Thank you so much for your continuous help! From finlayson at live555.com Mon Aug 20 16:03:49 2007 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 20 Aug 2007 16:03:49 -0700 Subject: [Live-devel] Open Questions Regarding Directory Setup In-Reply-To: References: Message-ID: >I only ask because I see this example and it makes me think an entire >directory is being setup. > > C->V: SETUP rtsp://video.example.com/twister/video RTSP/1.0 No, in this example there is just a single audio+video stream, named "twister". However, this stream has separate audio and video substreams (each one being a RTP stream) that are "SETUP" and "PLAY"ed separately.
The server - in this case - identifies these substreams using the strings "video" and "audio.en", but these don't correspond to file names within a directory. (In our RTSP server implementation, we use the strings "track1", "track2", etc.) -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ From rajeshkumar.r at imimobile.com Mon Aug 20 22:09:14 2007 From: rajeshkumar.r at imimobile.com (rajesh) Date: Tue, 21 Aug 2007 10:39:14 +0530 Subject: [Live-devel] can Live Media Server stream Live Content ? Message-ID: <003901c7e3b1$6b071010$6902000a@imidomain.com> Hi Ross, can we use Live media Library as RTSP streaming Server to stream Live content like any TV channel. Currently we are using Helix Mobile Producer to stream TV Channel. Thanks in advance. Thanks and Regards Rajesh Kumar Software Engineer +91 40 23555945 - 235 +91 99084 00027 -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070820/7a976a2e/attachment.html From xqma at marvell.com Tue Aug 21 00:10:21 2007 From: xqma at marvell.com (Xianqing Ma) Date: Tue, 21 Aug 2007 00:10:21 -0700 Subject: [Live-devel] About the details of the live555 Message-ID: <7F6B283111E9EB4BA570D4C0322C8EEE130AEE@sc-exch04.marvell.com> Hi all: I am reading the source code of the live555 now. What puzzles me a lot is the background principle of the live555. Take testMPEG4VideoStreamer.cpp as an example: I traced the program by inserting printf statement in some function and saw the below information play()-> RTPSink::startPlaying-> MultiFramedRTPSink::continuePlaying-> MultiFramedRTPSink::buildAndSendPacket-> MultiFramedRTPSink::packFrame-> FramedSource::getNextFrame-> MPEGVideoStreamFramer::doGetNextFramer-> ByteStreamFileSource::doGetNextFrame That's it. I don't know why ByteStreamFileSource::doGetNextFrame appears here suddenly. Who calls it? I guess there may be some background jobs. 
As I cannot find any code about this calling, I am not sure about that. Any help would be appreciated. Thank you. Best Regards From samuelz_83 at hotmail.com Tue Aug 21 01:51:44 2007 From: samuelz_83 at hotmail.com (ZengWenfeng) Date: Tue, 21 Aug 2007 08:51:44 +0000 Subject: [Live-devel] Is livemedia a shared library or a static library? Message-ID: As the title. Thanks in advance! From finlayson at live555.com Tue Aug 21 01:53:56 2007 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 21 Aug 2007 01:53:56 -0700 Subject: [Live-devel] About the details of the live555 In-Reply-To: <7F6B283111E9EB4BA570D4C0322C8EEE130AEE@sc-exch04.marvell.com> References: <7F6B283111E9EB4BA570D4C0322C8EEE130AEE@sc-exch04.marvell.com> Message-ID: >That's it. I don't know why ByteStreamFileSource::doGetNextFrame >appears here suddenly. Who calls it? It's called by "FramedSource::getNextFrame()", which in turn is called by the downstream object - in this case an "MPEG4VideoStreamFramer". For the "testMPEG4VideoStreamer" application, the data flows through the following chain of objects (classes): ByteStreamFileSource -> MPEG4VideoStreamFramer -> MPEG4ESVideoRTPSink -- Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed...
URL: http://lists.live555.com/pipermail/live-devel/attachments/20070821/1bd98978/attachment.html From jps330 at psu.edu Tue Aug 21 05:42:44 2007 From: jps330 at psu.edu (Jordan Shelley) Date: Tue, 21 Aug 2007 08:42:44 -0400 Subject: [Live-devel] can Live Media Server stream Live Content ? In-Reply-To: <003901c7e3b1$6b071010$6902000a@imidomain.com> Message-ID: <01MKETWK71AQ91WLCS@ponyexpress.arl.psu.edu> Ross, correct me if I am wrong, As long as you can get the stream encoded into a format that can be used by LiveMedia and your hardware/software encoder supports grabbing single frames, (or can be read as a socket or file) you can modify this library to stream it. I am currently using the library to stream a camera feed from a robotic platform that very much depends on real-time video. I am doing so wirelessly and the delay on the picture is only about 200-250 ms, which is very impressive in my opinion. Hope that helps! -Jordan _____ From: live-devel-bounces at ns.live555.com [mailto:live-devel-bounces at ns.live555.com] On Behalf Of rajesh Sent: Tuesday, August 21, 2007 1:09 AM To: LIVE555 Streaming Media - development & use Subject: [Live-devel] can Live Media Server stream Live Content ? Hi Ross, can we use Live media Library as RTSP streaming Server to stream Live content like any TV channel. Currently we are using Helix Mobile Producer to stream TV Channel. Thanks in advance. Thanks and Regards Rajesh Kumar Software Engineer +91 40 23555945 - 235 +91 99084 00027 -------------- next part -------------- An HTML attachment was scrubbed... URL: http://lists.live555.com/pipermail/live-devel/attachments/20070821/b0370ed7/attachment.html From pan_xiaolei at sina.com Tue Aug 21 07:55:33 2007 From: pan_xiaolei at sina.com (pan_xiaolei) Date: Tue, 21 Aug 2007 22:55:33 +0800 Subject: [Live-devel] How can I do use-based authentication? References: Message-ID: <200708212255324575959@sina.com> Thank you for your reply! 
I've tried your advice of defining the macro ACCESS_CONTROL. Wonderful - Windows Media Player prompted me to enter the username and the password. (BTW, Windows Media Player didn't give a realm to the live555 server; as a result, the authorization failed.) Unfortunately, RealPlayer and VLC didn't prompt me for anything. In fact, what I need is to build a stream server where the authorization is completed through a web server. I don't know how the rtsp server can get the authorization information that only the web server can get easily. Maybe each of them is running on a separate machine. I got an idea: when a user wishes to access a video file, he must log in on a web server. Then, the web server will generate a page in which there's an rtsp link, e.g.: rtsp://test.livenetworks.com/test.mpg?card=1234567890. Well, the client player will use this link to access the video file and the rtsp server MUST authorize the card. Though any player can be authorized in this way, I think it sounds weird. Will you please give me a tip? What's the best way to solve this problem? David.Pan 2007-08-21 From: live-devel-request at ns.live555.com Sent: 2007-08-21 03:52:31 To: live-devel at ns.live555.com Subject: live-devel Digest, Vol 46, Issue 10 Send live-devel mailing list submissions to live-devel at lists.live555.com To subscribe or unsubscribe via the World Wide Web, visit http://lists.live555.com/mailman/listinfo/live-devel or, via email, send a message with subject or body 'help' to live-devel-request at lists.live555.com You can reach the person managing the list at live-devel-owner at lists.live555.com When replying, please edit your Subject line so it is more specific than "Re: Contents of live-devel digest..." Today's Topics: 1. How can I do user-based authentication? (pan_xiaolei) 2. Re: How can I do user-based authentication? (Ross Finlayson) 3. use of streamId (rajesh) 4. Re: use of streamId (Ross Finlayson) 5. Re: Live555 : Queries (Brian D'Souza) 6.
Difference between H264VideoRTPSink an H264VideoFileSink? (Benjamin Meier) 7. Open Questions Regarding Directory Setup (Li, Chen) 8. Re: Difference between H264VideoRTPSink an H264VideoFileSink? (Ross Finlayson) From tchristensen at nordija.com Tue Aug 21 07:56:45 2007 From: tchristensen at nordija.com (Thomas Christensen) Date: Tue, 21 Aug 2007 16:56:45 +0200 Subject: [Live-devel] afterGetting(this) In-Reply-To: <466958A8.9010601@mytum.de> References: <466958A8.9010601@mytum.de> Message-ID: <85AEB948-365C-427D-9DF6-ECD80A2FB4F8@nordija.com> Hi I'm curious. How did you solve the "request more data" at the bottom? You are calling getNextFrame() to solve this - Correct? Thomas On 8 Jun 2007, at 15:24, Julian Lamberty wrote: > Hi! > > I would like to know how I can > > 1. Request more data from a source (MPEG1or2VideoRTPSource) > 2. Tell a sink (MPEG4VideoStreamDiscreteFramer) that data is > completely delivered to buffer (this should be afterGetting(this), > right?) > > Right now I have a code structure like: > > void Transcoder::doGetNextFrame() > { > fInputSource->getNextFrame(...); > } > > void Transcoder::afterGettingFrame(...) > { > Transcoder* transcoder = (Transcoder*)clientData; > transcoder->afterGettingFrame1(...); > } > > void Transcoder::afterGettingFrame1(...) > { > decode_from_inbuf_to_frame > if(successfully decoded) > { > encode_frame_to_fTo > if(successfully encoded) > { > deliver (<- afterGetting(this)?)
> }
> }
> else
> {
> request more data (<- I don't know how to do this; if I
> simply call doGetNextFrame() once again it complains about being
> read more than once)
> }
> }
>
> Thanks for your help!
>
> Julian
> _______________________________________________
> live-devel mailing list
> live-devel at lists.live555.com
> http://lists.live555.com/mailman/listinfo/live-devel

From finlayson at live555.com Tue Aug 21 09:27:05 2007 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 21 Aug 2007 09:27:05 -0700 Subject: [Live-devel] can Live Media Server stream Live Content ? In-Reply-To: <003901c7e3b1$6b071010$6902000a@imidomain.com> References: <003901c7e3b1$6b071010$6902000a@imidomain.com> Message-ID:

>Hi Ross,
>can we use the Live Media library as an RTSP streaming server to stream
>live content, like any TV channel?
>Currently we are using Helix Mobile Producer to stream a TV channel.

Please read the FAQ! -- Ross Finlayson Live Networks, Inc. http://www.live555.com/

From tchristensen at nordija.com Tue Aug 21 15:50:52 2007 From: tchristensen at nordija.com (Thomas Christensen) Date: Wed, 22 Aug 2007 00:50:52 +0200 Subject: [Live-devel] MPEG-TS demuxer Message-ID:

Hi, it has been mentioned before on this mailing list that live555 contains a demuxer for program streams but not for transport streams, and I was wondering why. Has someone written such a demuxer and wishes to share it? I would like to write a program that receives MPEG2-TS through multicast, decodes the video into a frame and then saves the image. I know that decoding the MPEG-2 encoded frame can be done using libavcodec, and that libavformat offers to convert the raw YUV frame into an image; however, collecting MPEG2-TS packets requires some work.
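[For readers following this thread: collecting transport-stream packets mostly means locating the 0x47 sync byte every 188 bytes and filtering on the 13-bit PID in the 4-byte header. A minimal self-contained sketch - not LIVE555 code, names are illustrative only:]

```cpp
#include <cstdint>
#include <cstddef>

// Constants from the MPEG-2 Systems spec (ISO/IEC 13818-1).
const std::size_t TS_PACKET_SIZE = 188;
const uint8_t TS_SYNC_BYTE = 0x47;

// Extract the 13-bit PID from a 188-byte TS packet header.
// Returns -1 if the packet does not begin with the 0x47 sync byte.
int tsPacketPid(const uint8_t* pkt) {
  if (pkt[0] != TS_SYNC_BYTE) return -1;
  return ((pkt[1] & 0x1F) << 8) | pkt[2];
}

// True if the payload_unit_start_indicator bit is set, i.e. a new
// PES packet (or PSI section) begins in this TS packet's payload.
bool tsPayloadUnitStart(const uint8_t* pkt) {
  return (pkt[1] & 0x40) != 0;
}

// Keep only packets whose PID matches a caller-supplied filter_pid,
// as proposed for the hypothetical MPEG2TransportParser below.
bool tsWantPacket(const uint8_t* pkt, int filterPid) {
  return tsPacketPid(pkt) == filterPid;
}
```

The filtered packets' payloads would then be concatenated (per PID, starting at a payload_unit_start packet) to reassemble each PES packet.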
My guess would be to implement the following: MPEG2TransportParser - should look for the MPEG2-TS sync byte in the buffer and decode the header. It should locate and keep an in-memory version of the PAT and PMT, mostly for debugging purposes, since I need to specify a filter_pid up front. Since the MPEG2-TS stream can contain more than one Packetized Elementary Stream, this filter_pid should be used to filter out unwanted packets, leaving me with a complete PES with video and sound, which I then pass to a new MPEG1and2PESParser. MPEG1and2PESParser parses the PES header fields and collects the payload into a Program Stream before that is passed to MPEG1and2Demux? Any input on this? Thomas

From finlayson at live555.com Tue Aug 21 16:42:19 2007 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 21 Aug 2007 16:42:19 -0700 Subject: [Live-devel] MPEG-TS demuxer In-Reply-To: References: Message-ID:

>It has been mentioned before on this mailing list that live555
>contains a demuxer for program streams but not for transport streams,
>and I was wondering why.

Because a Transport Stream demultiplexor was not needed for streaming. MPEG Program Streams are streamed - via RTP - as separate RTP streams (one for audio; one for video), and so need to be demultiplexed into Elementary Streams before streaming. However, MPEG Transport Streams, being more suited for overcoming data loss, are streamed 'as is', without demultiplexing. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/

From marete at edgenet.co.ke Wed Aug 22 00:18:12 2007 From: marete at edgenet.co.ke (Brian Gitonga Marete) Date: Wed, 22 Aug 2007 10:18:12 +0300 Subject: [Live-devel] Is livemedia a shared library or a static library? In-Reply-To: References: Message-ID: <1187767092.2119.14.camel@delta.local>

On Tue, 2007-08-21 at 08:51 +0000, ZengWenfeng wrote:
> As the title. Thanks in advance!

Static, as compiled by the makefiles it comes with. But you can always compile it as a shared library/libraries.
The details of how to do this depend on your operating system platform. Regards, Brian G. Marete

From tchristensen at nordija.com Wed Aug 22 00:52:05 2007 From: tchristensen at nordija.com (Thomas Christensen) Date: Wed, 22 Aug 2007 09:52:05 +0200 Subject: [Live-devel] MPEG-TS demuxer In-Reply-To: References: Message-ID: <862F6F86-0739-4EB9-906E-A84FA3202116@nordija.com>

On 22 Aug 2007, at 01:42, Ross Finlayson wrote:
>> It has been mentioned before on this mailing list that live555
>> contains a demuxer for program streams but not for transport streams,
>> and I was wondering why.
>
> Because a Transport Stream demultiplexor was not needed for
> streaming. MPEG Program Streams are streamed - via RTP - as separate
> RTP streams (one for audio; one for video), and so need to be
> demultiplexed into Elementary Streams before streaming. However,
> MPEG Transport Streams, being more suited for overcoming data loss,
> are streamed 'as is', without demultiplexing.

I see. In testMPEG2TransportStreamer the MPEG-TS is streamed using RTP. Doesn't RTP cover the same issues as MPEG-TS when it comes to synchronisation and overcoming data loss? I know it still doesn't make a good case for a transport stream demultiplexor in a streaming library. Thanks for your input. Thomas

From aviadr1 at gmail.com Wed Aug 22 08:19:07 2007 From: aviadr1 at gmail.com (aviad rozenhek) Date: Wed, 22 Aug 2007 18:19:07 +0300 Subject: [Live-devel] directshow mediatype for SDP Message-ID:

Hi everyone. I am receiving MPEG video and MPEG audio on two RTP ports from live555MediaServer.exe. The SDP doesn't specify any parameters other than the width and height of the video. What I would really like is to create two media types (one for video, one for audio) that capture this (rather limited) information, putting in guesses where the information is missing. What should be the subtype?
Can you post pseudo code that shows which media type fields I should fill in, and what are good guesses for the missing information? This is the SDP I have for the media:

v=0
o=- 4042797588 1 IN IP4 192.168.64.24
s=MPEG-1 or 2 Program Stream, streamed by the LIVE555 Media Server
i=jackson.mpg
t=0 0
a=tool:LIVE555 Streaming Media v2007.07.01
a=type:broadcast
a=control:*
a=range:npt=0-827.682
a=x-qt-text-nam:MPEG-1 or 2 Program Stream, streamed by the LIVE555 Media Server
a=x-qt-text-inf:jackson.mpg
m=video 0 RTP/AVP 32
c=IN IP4 0.0.0.0
a=control:track1
m=audio 0 RTP/AVP 14
c=IN IP4 0.0.0.0
a=control:track2

From panospcx at gmail.com Thu Aug 23 05:03:01 2007 From: panospcx at gmail.com (Panos Christodoulou) Date: Thu, 23 Aug 2007 15:03:01 +0300 Subject: [Live-devel] build the live555 libraries in VC++ 2005 express edition Message-ID:

Hi all, Is there a complete tutorial on how to build the live555 libraries in VC++ 2005 Express? I tried to open the makefiles, but this version does not use makefiles. When I try to convert, an error message appears informing me that the makefile cannot be converted. No additional info is given. Thanks

From mp_ca2005 at yahoo.ca Thu Aug 23 19:58:12 2007 From: mp_ca2005 at yahoo.ca (mp lt) Date: Thu, 23 Aug 2007 22:58:12 -0400 (EDT) Subject: [Live-devel] rtsp with a/v synchronization Message-ID: <980907.32577.qm@web35911.mail.mud.yahoo.com>

Hi Ross, Thank you for offering us this nice library to learn more about streaming. I am writing a test application for H.264 and AAC multicast streaming with the help of the FAQ and previous discussions.
After some tests, I found a problem with A/V synchronization that I cannot understand. I use the frame rate and sampling rate to set the right timestamps and durations for every video and audio frame (I can stream them separately and receive them normally). However, when I stream them together (which includes an RTSP server) and use the RTSP protocol to get the A/V stream with VLC, I find the A/V is a bit out of sync. The audio is always a little behind the video. I am not sure whether this is related to my setting different timestamps for the first frames of video and audio, since I use gettimeofday for the first time. Another interesting thing I found is that if VLC can receive the first audio and video frames, the sync becomes much better (I did this by opening the client VLC first, then starting the stream server), so I am guessing there is some place in the RTSP protocol that helps the client synchronize the audio and video. I checked the RTSP protocol; it said that in the PLAY method, the Range header may help synchronize media from different sources. I cannot understand how this works. I would like to hear some ideas about this issue. Thanks a lot.

From finlayson at live555.com Fri Aug 24 03:52:38 2007 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 24 Aug 2007 03:52:38 -0700 Subject: Re: [Live-devel] rtsp with a/v synchronization In-Reply-To: <980907.32577.qm@web35911.mail.mud.yahoo.com> References: <980907.32577.qm@web35911.mail.mud.yahoo.com> Message-ID:

>Hi Ross,
>
>Thank you for offering us this nice library to learn more about streaming.
>
>I am writing a test application for H.264 and AAC multicast streaming
>with the help of the FAQ and previous discussions. After some tests, I found a
>problem with A/V synchronization that I cannot understand.
>I use the frame rate and sampling rate to set the right timestamps
>and durations for every video and audio frame (I can stream
>them separately and receive them normally). However, when I stream
>them together (which includes an RTSP server) and use the RTSP protocol to
>get the A/V stream with VLC, I find the A/V is a bit out of sync.
>The audio is always a little behind the video.
>I am not sure whether this is related to my setting different
>timestamps for the first frames of video and audio, since I use
>gettimeofday for the first time.

Yes, if you give accurate - wall-clock synchronized - presentation times to the data that you feed to your "RTPSink" (subclass), *and* if you have a "RTCPInstance" for each "RTPSink" (subclass), then these presentation times will be returned - at the client end - when it reads from the corresponding "RTPSource" (subclass).

>Another interesting thing I found is that if VLC can receive the
>first audio and video frames, the sync becomes much better (I
>did this by opening the client VLC first, then starting the stream server), so
>I am guessing there is some place in the RTSP protocol that helps the
>client synchronize the audio and video. I checked the RTSP protocol;
>it said that in the PLAY method, the Range header may help synchronize
>media from different sources. I cannot understand how this works.

No, A/V synchronization is ensured by using RTCP. Please read the FAQ. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/

From mp_ca2005 at yahoo.ca Fri Aug 24 06:21:59 2007 From: mp_ca2005 at yahoo.ca (mp lt) Date: Fri, 24 Aug 2007 09:21:59 -0400 (EDT) Subject: Re: [Live-devel] rtsp with a/v synchronization In-Reply-To: Message-ID: <835803.42969.qm@web35910.mail.mud.yahoo.com>

Thanks for your fast reply, Ross.
Yes, I created an "RTCPInstance" for each of the video and audio "RTPSink" (subclass) objects. However, the first timestamps for video and audio are a little different, since each of the video and audio frame sources uses gettimeofday() to get the time for the first time. In my situation, the video RTPSink starts first, so the timestamp of the first video frame is a little earlier than the timestamp of the first audio frame. Will this affect sync?

Ross Finlayson wrote:
> Yes, if you give accurate - wall-clock synchronized - presentation
> times to the data that you feed to your "RTPSink" (subclass), *and* if
> you have a "RTCPInstance" for each "RTPSink" (subclass), then these
> presentation times will be returned - at the client end - when it reads
> from the corresponding "RTPSource" (subclass).

From finlayson at live555.com Fri Aug 24 13:16:49 2007 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 24 Aug 2007 13:16:49 -0700 Subject: Re: [Live-devel] rtsp with a/v synchronization In-Reply-To: <835803.42969.qm@web35910.mail.mud.yahoo.com> References: <835803.42969.qm@web35910.mail.mud.yahoo.com> Message-ID:

>Yes, I created an "RTCPInstance" for each of the video and audio
>"RTPSink" (subclass) objects. However, the first timestamps for video
>and audio are a little different, since each of the video and audio
>frame sources uses gettimeofday() to get the time for the first time.
>In my situation, the video RTPSink starts first, so the timestamp of
>the first video frame is a little earlier than the timestamp of the
>first audio frame. Will this affect sync?

Because you're using RTCP, the presentation times that come out of the "RTPSource" object will (after a few seconds) be *exactly the same* as those that went into the "RTPSink" object. Therefore, if A/V sync is not working for you, it must be because the presentation timestamps of the original audio and/or video stream were incorrect. This is a problem that you're going to have to figure out for yourself. -- Ross Finlayson Live Networks, Inc.
http://www.live555.com/

From yinglcs at gmail.com Sat Aug 25 21:33:07 2007 From: yinglcs at gmail.com (ying lcs) Date: Sat, 25 Aug 2007 23:33:07 -0500 Subject: [Live-devel] Need help in understanding openRTSP QOS output Message-ID: <568e62a40708252133y581fa09dsb20613881791ffc7@mail.gmail.com>

Hi, I get the following output from openRTSP. I would like to know: how are the 'packet_loss_percentage' min, ave, and max being calculated? How are they related to 'num_packets_received' and 'num_packets_lost'? What is the meaning of 'inter_packet_gap_ms' min, ave, and max? And why are ave and max 0?

subsession audio/MPA
num_packets_received 1405
num_packets_lost 9
elapsed_measurement_time 82.300077
kBytes_received_total 1180.084000
measurement_sampling_interval_ms 100
kbits_per_second_min 0.000000
kbits_per_second_ave 114.710367
kbits_per_second_max 941.039954
packet_loss_percentage_min 0.000000
packet_loss_percentage_ave 0.636492
packet_loss_percentage_max 75.000000
inter_packet_gap_ms_min 2147483.647000
inter_packet_gap_ms_ave 0.000000
inter_packet_gap_ms_max 0.000000
end_QOS_statistics

Thank you.

From fant0m4s at gmail.com Mon Aug 27 11:48:57 2007 From: fant0m4s at gmail.com (Fantomas) Date: Mon, 27 Aug 2007 15:48:57 -0300 Subject: [Live-devel] Different client and server fps Message-ID: <99fed6ae0708271148v12df0e05ufeff8b0da8b82a74@mail.gmail.com>

Hi, I have a camera that streams video at 30 fps, but I need my client (based on openRTSP) to record video at 10 fps without changing the camera configuration. I've changed fDurationInMicroseconds in FramedSource and in MultiFramedRTPSource to 100000, but I'm still receiving the video at 30 fps. Am I missing something? Manuel.
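[For readers following this thread: one way to go from a received 30 fps stream to a 10 fps recording, outside of the LIVE555 code itself, is plain frame decimation - keep every third frame. A minimal self-contained sketch with illustrative names only; note that with encoded video you generally must decode first, because inter-frame dependencies mean most frames cannot simply be dropped:]

```cpp
#include <vector>

// Indices of the frames to keep when reducing a stream of frameCount
// frames by an integer factor (e.g. 30 fps -> 10 fps uses factor 3).
std::vector<int> decimateFrameIndices(int frameCount, int factor) {
  std::vector<int> kept;
  for (int i = 0; i < frameCount; ++i) {
    if (i % factor == 0) kept.push_back(i);  // keep frames 0, factor, 2*factor, ...
  }
  return kept;
}
```

The kept frames would then be re-encoded (or re-timestamped, for intra-only formats) at the new frame rate.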
From finlayson at live555.com Mon Aug 27 14:06:22 2007 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 27 Aug 2007 14:06:22 -0700 Subject: Re: [Live-devel] Different client and server fps In-Reply-To: <99fed6ae0708271148v12df0e05ufeff8b0da8b82a74@mail.gmail.com> References: <99fed6ae0708271148v12df0e05ufeff8b0da8b82a74@mail.gmail.com> Message-ID:

>Hi, I have a camera that streams video at 30 fps, but I need my
>client (based on openRTSP) to record video at 10 fps without changing
>the camera configuration.

"openRTSP" will record whatever it sees coming in over the net - which in this case will be 30 fps video. If you want to change the frame rate of the video, you will need to reprocess it after it arrives. This will require knowledge of the video format, and is not something that "openRTSP" (or the "LIVE555 Streaming Media" code in general) can do. You will need to use new software to do this. -- Ross Finlayson Live Networks, Inc. http://www.live555.com/

From JWerwath at Novarra.com Tue Aug 28 08:26:01 2007 From: JWerwath at Novarra.com (Jim Werwath) Date: Tue, 28 Aug 2007 10:26:01 -0500 Subject: [Live-devel] Forgive this newbie... Message-ID: <24889886D84B794A9259323D7354CF330487B068@novarrainet2.internalnt.novarra.com>

1. Is there an easy way for newbies to keyword search the email list archive?
2. Has anyone posted any Windows binaries for openRTSP?

Thank you. Sorry if these have been answered before... James Werwath

From l1436636 at yahoo.com Tue Aug 28 11:24:33 2007 From: l1436636 at yahoo.com (hong liu) Date: Tue, 28 Aug 2007 11:24:33 -0700 (PDT) Subject: [Live-devel] sdp file format Message-ID: <892376.30382.qm@web51406.mail.re2.yahoo.com>

I normally use MP4Box from GPAC to generate an sdp file.
Now I have hinted an mp4 file from an h264 video file and created an sdp file from it. Here is the sdp file content:

----------------------
* File SDP content *
b=AS:136
a=x-copyright: MP4/3GP File hinted with GPAC 0.4.3-DEV (C)2000-2005 - http://gpac.sourceforge.net
c=IN IP4 0.0.0.0
m=video 1234 RTP/AVP 96
b=AS:136
a=rtpmap:96 H264/90000
a=control:trackID=65536
a=fmtp:96 profile-level-id=640028; packetization-mode=1; sprop-parameter-sets=Z2QAKKzOYLE5,aOlKOLA=
a=framesize:96 176-144
--------------------------

Now I tried to use mplayer to play it with the command "mplayer sdp://test.sdp -v". It didn't work. The output message is as follows:

----------------------------
[hong at gx745-2 ~]$ mplayer sdp://foreman.sdp -v
MPlayer dev-SVN-r24045-4.1.1 (C) 2000-2007 MPlayer Team
CPU: Intel(R) Core(TM)2 CPU 6400 @ 2.13GHz (Family: 6, Model: 15, Stepping: 6)
CPUflags: MMX: 1 MMX2: 1 3DNow: 0 3DNow2: 0 SSE: 1 SSE2: 1
Compiled for x86 CPU with extensions: MMX MMX2 SSE SSE2
get_path('codecs.conf') -> '/home/hong/.mplayer/codecs.conf'
Reading /home/hong/.mplayer/codecs.conf: Can't open '/home/hong/.mplayer/codecs.conf': No such file or directory
Reading /usr/local/etc/mplayer/codecs.conf: Can't open '/usr/local/etc/mplayer/codecs.conf': No such file or directory
Using built-in default codecs.conf.
Configuration: --enable-gui --enable-largefiles --enable-menu --enable-debug --realcodecsdir=/usr/local/RealPlayer
CommandLine: 'sdp://foreman.sdp' '-v'
init_freetype
get_path('font/font.desc') -> '/home/hong/.mplayer/font/font.desc'
font: can't open file: /home/hong/.mplayer/font/font.desc
font: can't open file: /usr/local/share/mplayer/font/font.desc
Using MMX (with tiny bit MMX2) Optimized OnScreenDisplay
get_path('fonts') -> '/home/hong/.mplayer/fonts'
Using nanosleep() timing
get_path('input.conf') -> '/home/hong/.mplayer/input.conf'
Can't open input config file /home/hong/.mplayer/input.conf: No such file or directory
Can't open input config file /usr/local/etc/mplayer/input.conf: No such file or directory
Falling back on default (hardcoded) input config
get_path('foreman.sdp.conf') -> '/home/hong/.mplayer/foreman.sdp.conf'
Playing sdp://foreman.sdp.
get_path('sub/') -> '/home/hong/.mplayer/sub/'
STREAM: [SDP] sdp://foreman.sdp
STREAM: Description: SDP stream descriptor
STREAM: Author: Ross Finlayson
STREAM: Comment: Uses LIVE555 Streaming Media library.
file format detected.
vo: x11 uninit called but X11 not inited.. Exiting... (End of file)
--------------------------------------------------------------------

Then I tried to use mplayer to play your testMPEG1or2AudioVideo.sdp, and it worked well. So I think it might be caused by a format mismatch between your sdp file and the MP4Box-created one. I am wondering whether that is true or not. If yes, could you tell me how to generate your sdp file, and with what kind of software? If no, could you give me a clue to fix it up? Thank you for your help!
From mail at beim-beme.de Tue Aug 28 11:29:19 2007 From: mail at beim-beme.de (Benjamin Meier) Date: Tue, 28 Aug 2007 20:29:19 +0200 Subject: Re: [Live-devel] Forgive this newbie... In-Reply-To: <24889886D84B794A9259323D7354CF330487B068@novarrainet2.internalnt.novarra.com> References: <24889886D84B794A9259323D7354CF330487B068@novarrainet2.internalnt.novarra.com> Message-ID: <46D4697F.7030402@beim-beme.de>

> 1. Is there an easy way for newbies to keyword search the email list
> archive?

Simply type into the Google search field: "thisisakeyword" site:http://lists.live555.com/pipermail/live-devel/

> 2. Has anyone posted any Windows binaries for openRTSP ?
>
> Thank you. Sorry if these have been answered before...
>
> James Werwath

Benjamin Meier

From finlayson at live555.com Tue Aug 28 11:39:55 2007 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 28 Aug 2007 11:39:55 -0700 Subject: Re: [Live-devel] sdp file format In-Reply-To: <892376.30382.qm@web51406.mail.re2.yahoo.com> References: <892376.30382.qm@web51406.mail.re2.yahoo.com> Message-ID:

It wasn't clear to me from your description exactly how you are transmitting the stream that you are trying to receive/play, but if you want to receive/play a stream using only an SDP description, then the stream must be multicast, not unicast. I.e., the following line in the SDP description is the problem:

>c=IN IP4 0.0.0.0

Does your server support RTSP? If so, then you should play the stream using a "rtsp://" URL, rather than by reading an SDP description directly. If your server doesn't support RTSP, then you will need to change it to stream via multicast, and generate a new SDP description that contains the IP multicast address in the "c=" line. -- Ross Finlayson Live Networks, Inc.
http://www.live555.com/

From l1436636 at yahoo.com Tue Aug 28 15:09:51 2007 From: l1436636 at yahoo.com (hong liu) Date: Tue, 28 Aug 2007 15:09:51 -0700 (PDT) Subject: [Live-devel] sdp file format Message-ID: <939155.35198.qm@web51406.mail.re2.yahoo.com>

Thanks Ross. The line of "c=IN IP4 0.0.0.0"

From l1436636 at yahoo.com Tue Aug 28 15:26:12 2007 From: l1436636 at yahoo.com (hong liu) Date: Tue, 28 Aug 2007 15:26:12 -0700 (PDT) Subject: [Live-devel] SDP file format Message-ID: <71520.42642.qm@web51406.mail.re2.yahoo.com>

Thanks Ross for your prompt reply!

> It wasn't clear to me from your description exactly how you are transmitting the stream that you are trying
> to receive/play, but if you want to receive/play a stream using only an SDP description, then the stream must
> be multicast, not unicast. I.e., the following line in the SDP description is the problem:
> >c=IN IP4 0.0.0.0

The line "c=IN IP4 0.0.0.0" in the sdp file was manually added by me to the original sdp file created by MP4Box, since the original one didn't make mplayer work. Sorry for the confusion. From your description, if I understand correctly, the stream must be multicast, not unicast; otherwise your program stops proceeding. But my sdp file doesn't specify whether it is via multicast or unicast, just UDP and its port number. Do I have to specify it in my sdp file?

> Does your server support RTSP? If so, then you should play the stream using a "rtsp://" URL, rather than
> by reading an SDP description directly.
My application scenario is that one server sends a stream via multicast to clients that use mplayer to receive the streaming video. I have a simple video server that sends an mp4 video file out on a multicast address. I only have MP4Box for generating sdp files. Do you think the one it creates is compatible with your program?

> If your server doesn't support RTSP, then you will need to change it to stream via multicast, and generate
> a new SDP description that contains the IP multicast address in the "c=" line.

Why is the "c=" line required again? Thank you again!

From finlayson at live555.com Tue Aug 28 15:41:26 2007 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 28 Aug 2007 15:41:26 -0700 Subject: Re: [Live-devel] SDP file format In-Reply-To: <71520.42642.qm@web51406.mail.re2.yahoo.com> References: <71520.42642.qm@web51406.mail.re2.yahoo.com> Message-ID:

>> If your server doesn't support RTSP, then you will need to change
>>it to stream via multicast, and generate
>> a new SDP description that contains the IP multicast address in
>>the "c=" line.
>
>Why is the "c=" line required again?

Because otherwise the receiver(s) have no way of knowing which IP multicast address to subscribe to! -- Ross Finlayson Live Networks, Inc.
http://www.live555.com/

From harmeeksingh at gmail.com Wed Aug 29 03:14:23 2007 From: harmeeksingh at gmail.com (HarmeekSingh Bedi) Date: Wed, 29 Aug 2007 03:14:23 -0700 Subject: [Live-devel] Question Message-ID: <98de83330708290314m5f2f03bcr882707b3adb95939@mail.gmail.com>

Hi, I had a question on how asynchronous RTCP receiver reports are sent in the openRTSP client. a) Is openRTSP threaded in order to check for RTSP commands and also take care of asynchronous RTCP report sends once the transmission interval expires? b) How is the RTCP transmission interval implemented? Does a thread wake up every 5 secs and send reports? Info appreciated. Thanks in advance.

From k.sivareddy at gmail.com Thu Aug 30 05:33:59 2007 From: k.sivareddy at gmail.com (sivareddy kallam) Date: Thu, 30 Aug 2007 18:03:59 +0530 Subject: [Live-devel] Reuse of RTSP Session Message-ID:

Hi All, I want to reuse the same session for different URLs. Can I use the same session for a new URL even though the existing session is still streaming? If yes, can you please let me know the steps that need to be done. Thanks in advance. --Siva
From gb.bg.gb at gmail.com Thu Aug 30 05:55:17 2007 From: gb.bg.gb at gmail.com (gian luca) Date: Thu, 30 Aug 2007 14:55:17 +0200 Subject: [Live-devel] Undefined reference to `DELAY_SECOND' Message-ID: <72f171020708300555m7180f7dax3a7c00ceb54373ae@mail.gmail.com>

Hi, I'm new to Linux programming and I'm trying to compile a very simple demo program using the RTSP capabilities provided by liveMedia. The first step I took was to set up a project in my IDE, KDevelop, running on Debian Linux. I created my project using the template provided by KDevelop ("Simple Hello World program"), added references to the liveMedia libraries and the necessary includes, executed "Automake & friends" and built successfully; then I tried to use a basic class from the library (copy&paste from "playCommon.cpp").... my simple source file looks like the following:

#ifdef HAVE_CONFIG_H
#include "config.h"
#endif
#include <iostream>
#include <cstdlib>
#include "BasicUsageEnvironment.hh"

using namespace std;

UsageEnvironment* env;

int main(int argc, char *argv[])
{
  cout << "Begin program ...." << endl;

  // Begin by setting up our usage environment:
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  env = BasicUsageEnvironment::createNew(*scheduler);

  return EXIT_SUCCESS;
}

The problem is that when I try to build the project I get the following error:

linking provalive2 (libtool)
linking provalive2 (g++)
provalive2.o: In function `__static_initialization_and_destruction_0':
/usr/include/BasicUsageEnvironment/DelayQueue.hh:114: undefined reference to `DELAY_SECOND'
/usr/include/BasicUsageEnvironment/DelayQueue.hh:114: undefined reference to `operator*(short, DelayInterval const&)'
/usr/include/BasicUsageEnvironment/DelayQueue.hh:115: undefined reference to `operator*(short, DelayInterval const&)'
/usr/include/BasicUsageEnvironment/DelayQueue.hh:116: undefined reference to `operator*(short, DelayInterval const&)'
provalive2.o: In function `main':
/home/veris/Projects/provalive2/src/provalive2.cpp:40: undefined reference to `BasicTaskScheduler::createNew()'
/home/veris/Projects/provalive2/src/provalive2.cpp:41: undefined reference to `BasicUsageEnvironment::createNew(TaskScheduler&)'
collect2: ld returned 1 exit status
make[2]: *** [provalive2] Error 1
make[1]: *** [all-recursive] Error 1
make: *** [all] Error 2
*** Exited with status: 2 ***

Where am I going wrong? Thanks

From samuelz_83 at hotmail.com Thu Aug 30 05:59:36 2007 From: samuelz_83 at hotmail.com (ZengWenfeng) Date: Thu, 30 Aug 2007 12:59:36 +0000 Subject: [Live-devel] How to play RTP stream with MPlayer? Message-ID:

Hi all! I am currently working on an RTP-based streaming application, and intend to use mplayer as the embedded player for users.
I tried to call MPlayer from the command line like this:

mplayer rtp://localhost:8888

but it doesn't work (no movie window); the output looks like this:

MPlayer 1.0rc1-3.4.2 (C) 2000-2006 MPlayer Team
CPU: Intel(R) Pentium(R) D CPU 3.20GHz (Family: 15, Model: 6, Stepping: 5)
CPUflags: MMX: 1 MMX2: 1 3DNow: 0 3DNow2: 0 SSE: 0 SSE2: 0
Compiled with runtime CPU detection.
Playing rtp://localhost:8888.
STREAM_RTP, URL: rtp://localhost:8888
Stream not seekable!
Stream not seekable!
Stream not seekable!
Stream not seekable!
Stream not seekable!

The stream is not seekable because a control layer such as RTSP hasn't been implemented yet, and the media stream is a video-only MPEG stream on an RTP port. The QuickTime player can correctly receive the video stream. The manual and the liveMedia homepage say that an RTP stream can be played by passing an SDP file, but that also failed:

mplayer sdp://1.sdp

MPlayer 1.0rc1-3.4.2 (C) 2000-2006 MPlayer Team
CPU: Intel(R) Pentium(R) D CPU 3.20GHz (Family: 15, Model: 6, Stepping: 5)
CPUflags: MMX: 1 MMX2: 1 3DNow: 0 3DNow2: 0 SSE: 0 SSE2: 0
Compiled with runtime CPU detection.
Playing sdp://1.sdp.
Exiting... (End of file)

The streamer is running, but MPlayer seems to do nothing with the SDP file and quits immediately after displaying "Playing sdp:...". The SDP file works fine with the QuickTime player. Can someone please tell me how to play a raw RTP stream with MPlayer? Thanks very much!

Best Regards,
Zeng

-------------- next part --------------
An HTML attachment was scrubbed...
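[For reference, a minimal SDP description for a video-only MPEG stream like the one above might look like the following. The addresses and port are placeholders; 32 is the static RTP payload type for MPEG video ("MPV") from the RFC 3551 table, which is an assumption about what the streamer sends.]

```
v=0
o=- 0 0 IN IP4 127.0.0.1
s=MPEG video test
c=IN IP4 127.0.0.1
t=0 0
m=video 8888 RTP/AVP 32
```

[Note also that "Playing sdp://... Exiting... (End of file)" with no demuxer messages at all is a common symptom of an MPlayer binary built without LIVE555 Streaming Media support, which MPlayer needs for sdp:// and rtsp:// URLs; running with the verbose flag (mplayer -v sdp://1.sdp) should show whether the LIVE555 demuxer is even being tried.]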
URL: http://lists.live555.com/pipermail/live-devel/attachments/20070830/5d42e2db/attachment-0001.html

From finlayson at live555.com Thu Aug 30 09:22:59 2007
From: finlayson at live555.com (Ross Finlayson)
Date: Thu, 30 Aug 2007 09:22:59 -0700
Subject: [Live-devel] Reuse of RTSP Session
In-Reply-To:
References:
Message-ID:

>I want to reuse the same session for different URLs.
>Can I use the same session for a new URL even though the existing session is
>continuing streaming?

No, not with our current implementation.
--
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From Ambra.Cristaldi at elsagdatamat.com Fri Aug 31 03:27:53 2007
From: Ambra.Cristaldi at elsagdatamat.com (Cristaldi Ambra)
Date: Fri, 31 Aug 2007 12:27:53 +0200
Subject: [Live-devel] stream of audio/video separated files
Message-ID: <268315E5844A704AB431589863FE8EEFBEDE23@els00wmx04.elsag.it>

Hi Ross, I've got a couple of questions about streaming .m4e files:

a) I've read all the FAQ and, as described there (http://www.live555.com/liveMedia/faq.html#separate-rtp-streams, http://www.live555.com/liveMedia/faq.html#m4e-file), I've used openRTSP to receive an MPEG-4 file via RTSP from a Darwin Streaming Server and record it. I obtained two files, an audio one and a video one, but I can't open them. I've tried both VLC and QuickTime. Can I play them locally? How? What is the correct extension for the audio file?

b) I tried the "xxxTEST" applications to stream the video file (after renaming it with the .m4e extension); I viewed it with VLC, and it works fine. How can I stream the audio too? I tried the other test applications, but they don't seem to work.

c) Is it possible to merge the audio and video files into a single MPEG-4 file and play it locally (not stream it)?

d) I've read your opinion on streaming m4e files in this thread (http://lists.live555.com/pipermail/live-devel/2007-August/007304.html), but I think this is the only way to stream MPEG-4 files, isn't it?
I mean: I stream from a device with openRTSP, I obtain the two separate files (audio and video), and, if I wish to stream them again, I can do so only by using the ".m4e" extension. Is that right?

Thanks in advance,
Ambra

-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://lists.live555.com/pipermail/live-devel/attachments/20070831/bf8b2466/attachment.html

From Ambra.Cristaldi at elsagdatamat.com Fri Aug 31 04:19:24 2007
From: Ambra.Cristaldi at elsagdatamat.com (Cristaldi Ambra)
Date: Fri, 31 Aug 2007 13:19:24 +0200
Subject: [Live-devel] stream of audio/video separated files (corrected version)
Message-ID: <268315E5844A704AB431589863FE8EEFBEDE43@els00wmx04.elsag.it>

Hi Ross, I've got a couple of questions about streaming .m4e files:

a) I've read all the FAQ and, as described there (http://www.live555.com/liveMedia/faq.html#separate-rtp-streams, http://www.live555.com/liveMedia/faq.html#m4e-file), I've used openRTSP to receive an MPEG-4 file via RTSP from a Darwin Streaming Server and record it. I obtained two files, an audio one and a video one, but I can't open them. I've tried both VLC and QuickTime. Can I play them locally? How? What is the correct extension for the audio file?

b) I tried the "testOnDemandRTSPServer.exe" and "testVideoStreamMPEG4.exe" applications to stream the video file (after renaming it with the .m4e extension); I viewed it with VLC, and it works fine. How can I stream the audio too? I tried the other test applications, but they don't seem to work.

c) Is it possible to merge the audio and video files into a single MPEG-4 file and play it locally (not stream it)?

d) I've read your opinion on streaming m4e files in this thread (http://lists.live555.com/pipermail/live-devel/2007-August/007304.html), but I think this is the only way to stream MPEG-4 files, isn't it?
I mean: I stream from a device with openRTSP, I obtain the two separate files (audio and video), and, if I wish to stream them again, I can do so only by using the ".m4e" extension. Is that right?

Thanks in advance,
Ambra
--------------------
Ambra Cristaldi

-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://lists.live555.com/pipermail/live-devel/attachments/20070831/be6635d1/attachment-0001.html

From finlayson at live555.com Fri Aug 31 05:40:42 2007
From: finlayson at live555.com (Ross Finlayson)
Date: Fri, 31 Aug 2007 05:40:42 -0700
Subject: [Live-devel] stream of audio/video separated files (corrected version)
In-Reply-To: <268315E5844A704AB431589863FE8EEFBEDE43@els00wmx04.elsag.it>
References: <268315E5844A704AB431589863FE8EEFBEDE43@els00wmx04.elsag.it>
Message-ID:

>c) Is it possible to merge the audio and video files into a single
>MPEG-4 file and play it locally (not stream it)?

Yes, using the "-4" option to "openRTSP". Read the "openRTSP" online documentation.
--
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://lists.live555.com/pipermail/live-devel/attachments/20070831/3d769576/attachment.html
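[A hedged sketch of that "-4" suggestion follows. The URL and the width/height/frame-rate values are placeholders, not taken from this thread; check the "openRTSP" online documentation for your version before relying on the exact options.]

```shell
# "-4" makes openRTSP write a combined ".mp4"-format file (all received
# audio and video tracks) to stdout, so redirect it into a file.
# "-w"/"-h"/"-f" supply the video width, height, and frame rate for the
# file header, since openRTSP cannot always deduce them from the stream.
openRTSP -4 -w 320 -h 240 -f 15 "rtsp://example.com/stream" > merged.mp4
```

[The resulting merged.mp4 should then be playable locally, e.g. in QuickTime or VLC, which addresses question c) above.]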