From vanessa.chodaton at etu.univ-nantes.fr Thu May 1 09:10:56 2014 From: vanessa.chodaton at etu.univ-nantes.fr (Vanessa CHODATON) Date: Thu, 1 May 2014 18:10:56 +0200 (CEST) Subject: [Live-devel] RTP source filter In-Reply-To: References: Message-ID: <6a7472e0a8f45ce5873dcf6fdef9071b.squirrel@webmail-etu.univ-nantes.fr> > >> Actually, I want to do something like the testMPEG2TransportReceiver >> that >> I find >> in the "LIVE555 Streaming Media" testprogs. This code reads a MPEG >> Transport/RTP stream (from the same multicast group/port), and outputs >> the >> reconstituted MPEG Transport Stream to "stdout".I test it and it works >> very well. >> My source filter will receive MPEG Transport Stream but instead >> of putting it to "stdout", I want put it to the output pin of my >> source >> filter. > > You can do this without making any changes to the > "testMPEG2TransportReceiver". Just write your 'filter' application to > read from 'stdin', and run > testMPEG2TransportReceiver | your_filter_application Thank you. In the testMPEG2TransportReceiver when we do : sessionState.sink = FileSink::createNew(*env, "stdout"); We get the stream in stdout or a real file name could have been used instead. //According to what I understand this function allows to receive all the stream until the end: sessionState.sink->startPlaying(*sessionState.source, afterPlaying, NULL); I don't really unsterstand the role of this function :env->taskScheduler().doEventLoop(); ? So, to put the stream in my filter at the same time it is received, I make a pointer to stdout like this : byte *pDataBuff=(byte*)"stdout";//I also tried stdin pms->GetPointer(&pDataBuff); //After this operation, the output buffer pms point at the same location as the buffer pDataBuff. pms is used to put the stream in the filter But It don't work. I thought that if I used the function read() I will have to wait until all the stream is received before read it because according to what i understand startplaying() allows to receive the stream until the last packet. Am I wrong? Please can you give me some ideas to put the stream in my filter at the same time it is received. Thank you for your attention Regards, Vanessa Chodaton From finlayson at live555.com Thu May 1 09:50:37 2014 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 1 May 2014 09:50:37 -0700 Subject: [Live-devel] RTP source filter In-Reply-To: <6a7472e0a8f45ce5873dcf6fdef9071b.squirrel@webmail-etu.univ-nantes.fr> References: <6a7472e0a8f45ce5873dcf6fdef9071b.squirrel@webmail-etu.univ-nantes.fr> Message-ID: > I don't really unsterstand the role of this function > :env->taskScheduler().doEventLoop(); ? This is explained in the FAQ http://www.live555.com/liveMedia/faq.html#control-flow LIVE555-based applications are event-driven, using an event loop. > So, to put the stream in my filter at the same time it is received, I make > a pointer to stdout like this : > byte *pDataBuff=(byte*)"stdout";//I also tried stdin No! Just do what I said! Leave the "testMPEG2TransportStreamer" code as it is. DO NOT change it. Write a separate 'filter' application (using your own code) that reads from 'stdin'. Then run, from the command line: testMPEG2TransportReceiver | your_filter_application (To use the LIVE555 software, you need to understand what "stdout" and "stdin" mean, and understand what "|" (i.e., piping) means.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
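(As a worked illustration of the suggestion above: a minimal sketch of a stand-alone 'filter' application that reads the Transport Stream from its stdin, one 188-byte TS packet at a time. The "handleTSPacket()" routine is a hypothetical placeholder for whatever the filter actually does with the data, e.g. handing it to a DirectShow output pin.)

#include <cstdio>
#include <cstddef>

// Hypothetical placeholder: deliver one received TS packet to the rest of
// the filter (output pin, decoder, file, ...).
static void handleTSPacket(unsigned char const* packet, size_t size) {
  // ... application-specific processing ...
}

int main() {
  // (On Windows, stdin may first need to be switched to binary mode.)
  unsigned char packet[188]; // MPEG Transport Stream packets are 188 bytes
  for (;;) {
    size_t n = fread(packet, 1, sizeof packet, stdin);
    if (n == 0) break;       // EOF (the receiver exited) or a read error
    handleTSPacket(packet, n);
  }
  return 0;
}

It would then be run exactly as described above:

  testMPEG2TransportReceiver | your_filter_application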
URL: From vikram at vizexperts.com Thu May 1 09:58:18 2014 From: vikram at vizexperts.com (Vikram Singh) Date: Thu, 1 May 2014 22:28:18 +0530 Subject: [Live-devel] Frames are corrupted In-Reply-To: <89371423-1FA7-4963-AF87-1FFD983B8181@live555.com> References: <000901cf589c$59000630$0b001290$@vizexperts.com> <84AAAB6A-2131-4E61-A6B2-872D9B52E2F2@live555.com> <000301cf5b1f$23855c50$6a9014f0$@vizexperts.com> <14A19130-FF1B-4D97-871D-915B6A88AA52@live555.com> <000c01cf5b93$2eb333d0$8c199b70$@vizexperts.com> <6BA71742-B5FE-4086-8240-B781BAECDFD3@live555.com> <003201cf5e34$f77257d0$e6570770$@vizexperts.com> <89371423-1FA7-4963-AF87-1FFD983B8181@live555.com> Message-ID: <001d01cf655e$8e153740$aa3fa5c0$@vizexperts.com> Hi ross, I am not able to get SPS and PPS units from the encoder. I am using CUDA Video Encode library which has a function NVGetSPSPPS(). NVGetSPSPPS() returns a buffer to SPS and PPS. The problem is that I don't have the formatting for this buffer so that I could separate SPS and PPS units from each other. In my case this buffer is 00 24 67 4d 40 1e f6 04 00 83 7f e0 00 80 00 62 00 00 07 d2 00 01 d4 c1 c0 00 00 27 a1 20 00 02 62 5a 17 79 70 50 00 04 68 ee 3c 80 Total of 44 bytes. In the link http://www.cardinalpeak.com/blog/the-h-264-sequence-parameter-set/ 0x00, 0x00, 0x00, 0x01, 0x67, 0x42, 0x00, 0x0a, 0xf8, 0x41, 0xa2 ==> sps 0x00, 0x00, 0x00, 0x01, 0x68, 0xce, 0x38, 0x80 ==> pps According the webpage, 7 in 0x67 for nal_unit_type sps And 8 in 0x68 is for pps nal_unit_type. I have the same in my buffer. Does this mean 4d 40 1e f6 04 00 83 7f e0 00 80 00 62 00 00 07 d2 00 01 d4 c1 c0 00 00 27 a1 20 00 02 62 5a 17 79 70 50 00 04 is sps and ee 3c 80 is pps leaving out 00 24 at the starting. According to my assumption 00 00 00 01 67 4d 40 1e f6 04 00 83 7f e0 00 80 00 62 00 00 07 d2 00 01 d4 c1 c0 00 00 27 a1 20 00 02 62 5a 17 79 70 50 00 04 ==> sps And 00 00 00 01 68 ee 3c 80 ==> pps Sorry I am not getting the detailed documentation for the function NVGetSPSPPS(). Please help me. From: live-devel [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Tuesday, April 22, 2014 8:46 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Frames are corrupted Does this happen because I have not received SPS and PPS nal units ? Yes. If you are streaming H.264 video, then you *must* have SPS and PPS NAL units. Either 1/ Your H.264 video source contains SPS and PPS NAL units, occurring frequently. In this case, you *should not* modify "getAuxSDPLine()". Or: 2/ Your H.264 video source does not contain SPS and PPS NAL units, but you know them some other way, in advance. In this case, you should not implement "getAuxSDPLine()", but you *must* then pass these NAL units to "H264VideoRTPSink::createNew()", in your implementation of the "createNewRTPSink()" virtual function. If neither 1/ nor 2/ is true - i.e., if your video source does not contain SPS and PPS NAL units, nor do you know these in advance - then you will not be able to successfully stream H.264 video. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
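(The two replies that follow identify the layout of this buffer. As a worked illustration, a minimal sketch that splits a buffer of this form - each NAL unit preceded by a 2-byte big-endian length, with no start codes - into its individual NAL units. The "onNALUnit()" callback is a hypothetical placeholder; the buffer contents are the 44 bytes quoted above.)

#include <cstdio>

// Hypothetical callback: receives one NAL unit (without a start code).
static void onNALUnit(unsigned char const* nal, unsigned len) {
  // nal[0] & 0x1F == 7 -> SPS, == 8 -> PPS
  printf("NAL unit of %u bytes, type %u\n", len, (unsigned)(nal[0] & 0x1F));
}

int main() {
  // The 44-byte buffer returned by NVGetSPSPPS(), as quoted above:
  unsigned char buf[] = {
    0x00,0x24,0x67,0x4d,0x40,0x1e,0xf6,0x04,0x00,0x83,0x7f,0xe0,0x00,0x80,
    0x00,0x62,0x00,0x00,0x07,0xd2,0x00,0x01,0xd4,0xc1,0xc0,0x00,0x00,0x27,
    0xa1,0x20,0x00,0x02,0x62,0x5a,0x17,0x79,0x70,0x50,0x00,0x04,0x68,0xee,
    0x3c,0x80
  };
  unsigned i = 0, total = sizeof buf;
  while (i + 2 <= total) {
    unsigned len = (buf[i] << 8) | buf[i+1];   // 2-byte big-endian length
    i += 2;
    if (len == 0 || i + len > total) break;    // malformed buffer
    onNALUnit(&buf[i], len);                   // 1st pass: SPS (0x67...); 2nd pass: PPS (0x68...)
    i += len;
  }
  return 0;
}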
URL: From chris at gotowti.com Thu May 1 10:19:59 2014 From: chris at gotowti.com (Chris Richardson (WTI)) Date: Thu, 1 May 2014 10:19:59 -0700 Subject: [Live-devel] Frames are corrupted In-Reply-To: <001d01cf655e$8e153740$aa3fa5c0$@vizexperts.com> References: <000901cf589c$59000630$0b001290$@vizexperts.com> <84AAAB6A-2131-4E61-A6B2-872D9B52E2F2@live555.com> <000301cf5b1f$23855c50$6a9014f0$@vizexperts.com> <14A19130-FF1B-4D97-871D-915B6A88AA52@live555.com> <000c01cf5b93$2eb333d0$8c199b70$@vizexperts.com> <6BA71742-B5FE-4086-8240-B781BAECDFD3@live555.com> <003201cf5e34$f77257d0$e6570770$@vizexperts.com> <89371423-1FA7-4963-AF87-1FFD983B8181@live555.com> <001d01cf655e$8e153740$aa3fa5c0$@vizexperts.com> Message-ID: <022b01cf6561$954083a0$bfc18ae0$@com> Hello, > 00 24 67 4d 40 1e f6 04 00 83 7f e0 00 80 00 62 00 00 07 d2 00 01 d4 c1 c0 00 00 27 a1 20 00 02 62 5a 17 79 70 50 00 04 68 ee 3c 80 >Total of 44 bytes. It looks like the buffer is formatted with the 16-bit length of the NAL unit, then the NAL unit data. So you can split that buffer by reading the first two bytes to determine the NAL unit length, then read the NAL unit data, then read two more bytes to determine the length of the next NAL unit and so on, until you reach the end of the buffer. In your example, you would read 0x0024 (36), then read 36 bytes for the SPS NAL. This would place you immediately before the 00 04, and you would then read 0x0004 (4), then read 4 bytes for the PPS NAL. At this point you would be at the end of the buffer and could stop reading. Chris Richardson WTI -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu May 1 10:27:07 2014 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 1 May 2014 10:27:07 -0700 Subject: [Live-devel] Frames are corrupted In-Reply-To: <001d01cf655e$8e153740$aa3fa5c0$@vizexperts.com> References: <000901cf589c$59000630$0b001290$@vizexperts.com> <84AAAB6A-2131-4E61-A6B2-872D9B52E2F2@live555.com> <000301cf5b1f$23855c50$6a9014f0$@vizexperts.com> <14A19130-FF1B-4D97-871D-915B6A88AA52@live555.com> <000c01cf5b93$2eb333d0$8c199b70$@vizexperts.com> <6BA71742-B5FE-4086-8240-B781BAECDFD3@live555.com> <003201cf5e34$f77257d0$e6570770$@vizexperts.com> <89371423-1FA7-4963-AF87-1FFD983B8181@live555.com> <001d01cf655e$8e153740$aa3fa5c0$@vizexperts.com> Message-ID: <1DB89542-B6E9-41FA-BC3D-F89CB84358CD@live555.com> On May 1, 2014, at 9:58 AM, Vikram Singh wrote: > Hi ross, > I am not able to get SPS and PPS units from the encoder. > I am using CUDA Video Encode library which has a function NVGetSPSPPS(). [...] >> >> Sorry I am not getting the detailed documentation for the function NVGetSPSPPS(). Please help me. "NVGetSPSPPS()" is not a function in our code, so I don't know why you're asking here. Why don't you ask whoever provided you the "CUDA Video Encode library"? However... > NVGetSPSPPS() returns a buffer to SPS and PPS. > The problem is that I don?t have the formatting for this buffer so that I could separate SPS and PPS units from each other. 
> > In my case this buffer is > > 00 24 67 4d 40 1e f6 04 00 83 7f e0 00 80 00 62 00 00 07 d2 00 01 d4 c1 c0 00 00 27 a1 20 00 02 62 5a 17 79 70 50 00 04 68 ee 3c 80 It seems quite clear that the encoding is: Sequence of: 2-byte 'length' (in big-endian order) 'length' bytes, containing the NAL unit (without a preceding 'start code') > According to my assumption > > 00 00 00 01 67 4d 40 1e f6 04 00 83 7f e0 00 80 00 62 00 00 07 d2 00 01 d4 c1 c0 00 00 27 a1 20 00 02 62 5a 17 79 70 50 00 04 ==> sps > And 00 00 00 01 68 ee 3c 80 ==> pps Not quite, because it seems clear that the "00 24" is a length field (i.e., 0x0024 == 36 decimal). Therefore SPS is 67 4d 40 1e f6 04 00 83 7f e0 00 80 00 62 00 00 07 d2 00 01 d4 c1 c0 00 00 27 a1 20 00 02 62 5a 17 79 70 50 then the next two bytes - 00 04 - are also a length field (therefore, *not* part of the NAL unti), so PPS is 68 ee 3c 80 You should therefore pass these two NAL units - without start codes - as parameters to H264VideoRTPSink::createNew() This will be the second form of "H264VideoRTPSink::createNew()" - i.e., the form that has the signature static H264VideoRTPSink* createNew(UsageEnvironment& env, Groupsock* RTPgs, unsigned char rtpPayloadFormat, u_int8_t const* sps, unsigned spsSize, u_int8_t const* pps, unsigned ppsSize, unsigned profile_level_id); Note that you also need to pass a "profile_level_id" parameter. This is actually a flaw in the code, because you can generate this parameter from the 2nd through 4th bytes of the SPS NAL unit. (I.e., this parameter shouldn't be needed.) So, in your case, the "profile_level_id" parameter would be 0x4d401e Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From rglobisch at csir.co.za Thu May 1 11:21:48 2014 From: rglobisch at csir.co.za (Ralf Globisch) Date: Thu, 01 May 2014 20:21:48 +0200 Subject: [Live-devel] RTP source filter In-Reply-To: References: <6a7472e0a8f45ce5873dcf6fdef9071b.squirrel@webmail-etu.univ-nantes.fr> Message-ID: <5362ACDC0200004D000A1CE7@pta-emo.csir.co.za> It sounds to me like the OP is trying to write a DirectShow source filter, which she failed to mention. If that is the case I'm not sure if reading from stdin would work. Rather she should run the live555 code in its own thread and then pass on incoming media samples to the DirectShow source filter media processing thread. An example that shows how this can be done with live555 is available at http://sourceforge.net/projects/videoprocessing/. >>> Ross Finlayson 05/01/14 6:55 PM >>> > I don't really unsterstand the role of this function > :env->taskScheduler().doEventLoop(); ? This is explained in the FAQ http://www.live555.com/liveMedia/faq.html#control-flow LIVE555-based applications are event-driven, using an event loop. > So, to put the stream in my filter at the same time it is received, I make > a pointer to stdout like this : > byte *pDataBuff=(byte*)"stdout";//I also tried stdin No! Just do what I said! Leave the "testMPEG2TransportStreamer" code as it is. DO NOT change it. Write a separate 'filter' application (using your own code) that reads from 'stdin'. Then run, from the command line: testMPEG2TransportReceiver | your_filter_application (To use the LIVE555 software, you need to understand what "stdout" and "stdin" mean, and understand what "|" (i.e., piping) means.) Ross Finlayson Live Networks, Inc. 
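(Returning to the SPS/PPS question answered above: once the two NAL units have been split out of the NVGetSPSPPS() buffer, they can be handed to the RTP sink from the server's "createNewRTPSink()" implementation, using the H264VideoRTPSink::createNew() overload quoted above. A sketch only - the class name "MyH264Subsession" and the members fSPS/fSPSSize/fPPS/fPPSSize are hypothetical, holding the two NAL units without start codes.)

RTPSink* MyH264Subsession::createNewRTPSink(Groupsock* rtpGroupsock,
                                            unsigned char rtpPayloadTypeIfDynamic,
                                            FramedSource* /*inputSource*/) {
  // profile_level_id is taken from the 2nd through 4th bytes of the SPS
  // (0x4d401e for the SPS shown above):
  unsigned profileLevelId = (fSPS[1] << 16) | (fSPS[2] << 8) | fSPS[3];
  return H264VideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic,
                                     fSPS, fSPSSize, fPPS, fPPSSize, profileLevelId);
}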
http://www.live555.com/ -- This message is subject to the CSIR's copyright terms and conditions, e-mail legal notice, and implemented Open Document Format (ODF) standard. The full disclaimer details can be found at http://www.csir.co.za/disclaimer.html. This message has been scanned for viruses and dangerous content by MailScanner, and is believed to be clean. Please consider the environment before printing this email. -- This message is subject to the CSIR's copyright terms and conditions, e-mail legal notice, and implemented Open Document Format (ODF) standard. The full disclaimer details can be found at http://www.csir.co.za/disclaimer.html. This message has been scanned for viruses and dangerous content by MailScanner, and is believed to be clean. Please consider the environment before printing this email. From vikram at vizexperts.com Thu May 1 13:21:05 2014 From: vikram at vizexperts.com (Vikram Singh) Date: Fri, 2 May 2014 01:51:05 +0530 Subject: [Live-devel] Frames are corrupted In-Reply-To: <1DB89542-B6E9-41FA-BC3D-F89CB84358CD@live555.com> References: <000901cf589c$59000630$0b001290$@vizexperts.com> <84AAAB6A-2131-4E61-A6B2-872D9B52E2F2@live555.com> <000301cf5b1f$23855c50$6a9014f0$@vizexperts.com> <14A19130-FF1B-4D97-871D-915B6A88AA52@live555.com> <000c01cf5b93$2eb333d0$8c199b70$@vizexperts.com> <6BA71742-B5FE-4086-8240-B781BAECDFD3@live555.com> <003201cf5e34$f77257d0$e6570770$@vizexperts.com> <89371423-1FA7-4963-AF87-1FFD983B8181@live555.com> <001d01cf655e$8e153740$aa3fa5c0$@vizexperts.com> <1DB89542-B6E9-41FA-BC3D-F89CB84358CD@live555.com> Message-ID: <003a01cf657a$e242ae30$a6c80a90$@vizexperts.com> Thank you ross and chris. It works !! J From: live-devel [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Thursday, May 01, 2014 10:57 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Frames are corrupted On May 1, 2014, at 9:58 AM, Vikram Singh wrote: Hi ross, I am not able to get SPS and PPS units from the encoder. I am using CUDA Video Encode library which has a function NVGetSPSPPS(). [...] Sorry I am not getting the detailed documentation for the function NVGetSPSPPS(). Please help me. "NVGetSPSPPS()" is not a function in our code, so I don't know why you're asking here. Why don't you ask whoever provided you the "CUDA Video Encode library"? However... NVGetSPSPPS() returns a buffer to SPS and PPS. The problem is that I don't have the formatting for this buffer so that I could separate SPS and PPS units from each other. In my case this buffer is 00 24 67 4d 40 1e f6 04 00 83 7f e0 00 80 00 62 00 00 07 d2 00 01 d4 c1 c0 00 00 27 a1 20 00 02 62 5a 17 79 70 50 00 04 68 ee 3c 80 It seems quite clear that the encoding is: Sequence of: 2-byte 'length' (in big-endian order) 'length' bytes, containing the NAL unit (without a preceding 'start code') According to my assumption 00 00 00 01 67 4d 40 1e f6 04 00 83 7f e0 00 80 00 62 00 00 07 d2 00 01 d4 c1 c0 00 00 27 a1 20 00 02 62 5a 17 79 70 50 00 04 ==> sps And 00 00 00 01 68 ee 3c 80 ==> pps Not quite, because it seems clear that the "00 24" is a length field (i.e., 0x0024 == 36 decimal). 
Therefore SPS is 67 4d 40 1e f6 04 00 83 7f e0 00 80 00 62 00 00 07 d2 00 01 d4 c1 c0 00 00 27 a1 20 00 02 62 5a 17 79 70 50 then the next two bytes - 00 04 - are also a length field (therefore, *not* part of the NAL unti), so PPS is 68 ee 3c 80 You should therefore pass these two NAL units - without start codes - as parameters to H264VideoRTPSink::createNew() This will be the second form of "H264VideoRTPSink::createNew()" - i.e., the form that has the signature static H264VideoRTPSink* createNew(UsageEnvironment& env, Groupsock* RTPgs, unsigned char rtpPayloadFormat, u_int8_t const* sps, unsigned spsSize, u_int8_t const* pps, unsigned ppsSize, unsigned profile_level_id); Note that you also need to pass a "profile_level_id" parameter. This is actually a flaw in the code, because you can generate this parameter from the 2nd through 4th bytes of the SPS NAL unit. (I.e., this parameter shouldn't be needed.) So, in your case, the "profile_level_id" parameter would be 0x4d401e Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu May 1 14:07:59 2014 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 1 May 2014 14:07:59 -0700 Subject: [Live-devel] Problems with MPEG2 Indexer / Trickplay In-Reply-To: References: Message-ID: <66027E32-98C0-4F53-9372-05C036F3469E@live555.com> > I have used the two "MPEG2TransportStreamIndexer" and "testMPEG2TransportStreamTrickPlay" test programs to index a TS file ("testrec.ts"), and then trickplay it with 4x speed: > MPEG2TransportStreamIndexer testrec.ts > The index file "testrec.tsx" is created (This file is uploaded here for more info). Then > testMPEG2TransportStreamTrickPlay testrec.ts 0 4 testrec4.ts > is run to create the trickplayed file. But the output file could not be played and is a very small file in size. The output file is uploaded here. The original "testrec.ts" file is a 1:18 minute, 25 fps video, with 27.5 MBs in size. Any ideas? To figure out what's happening here, I'll need to see the original Transport Stream file ("testrec.ts"). Please upload this, so we can see it. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu May 1 14:14:15 2014 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 1 May 2014 14:14:15 -0700 Subject: [Live-devel] ByteStreamMultiFileSource In-Reply-To: References: Message-ID: > I can't find any example to use "ByteStreamMultiFileSource", can you help me please! "ByteStreamMultiFileSource::createNew()" takes a NULL-terminated array of file names. E.g., char* ourFileNames = new char*[3+1]; ourFileNames[0] = "ourFile0.ts"; ourFileNames[1] = "ourFile1.ts"; ourFileNames[2] = "ourFile2.ts"; ourFileNames[3] = NULL; ByteStreamMultiFileSource* source = ByteStreamMultiFileSource::createNew(envir(), ourFileNames); Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jan.ekholm at d-pointer.com Fri May 2 13:31:01 2014 From: jan.ekholm at d-pointer.com (Jan Ekholm) Date: Fri, 2 May 2014 23:31:01 +0300 Subject: [Live-devel] Logging or debug info Message-ID: <55C525BC-D86D-45F8-A2B9-09F77F270BBD@d-pointer.com> Hi, >From what I can see there is no way to get some form of logging or debug info out from Live555? 
I have a case where I develop both the RTSP server (based on Live555) and the client (based on libav). I'm trying to track down a problem where my client application randomly after 5-120 seconds of valid H264 or MJPEG video suddenly doesn't seem to receive any frames anymore. It does look like my server based code is still grabbing images from its source (local USB camera or remote stream) and sending the frames to the core Live555 components, but I'm not sure they actually are sent out to the network. I'd like to rule out the server side in this equation by getting a bit more information about what it's doing and statistics of sent data. I don't really suspect the server code here, but I have to rule out things one by one, and I do not look forward to diving deep into the libav code. So, are there any callbacks, defines or similar to get a bit more info as to what is going on? Kind regards, Jan Ekholm From finlayson at live555.com Fri May 2 14:35:57 2014 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 2 May 2014 14:35:57 -0700 Subject: [Live-devel] Logging or debug info In-Reply-To: <55C525BC-D86D-45F8-A2B9-09F77F270BBD@d-pointer.com> References: <55C525BC-D86D-45F8-A2B9-09F77F270BBD@d-pointer.com> Message-ID: > I have a case where I develop both the RTSP server (based on Live555) > and the client (based on libav). Why not use our library for your client as well (and use "libav" just for the decoding)? I wouldn't be surprised if "libav's" implementation of RTSP/RTP/RTCP were imperfect. (E.g., if it were to (incorrectly) fail to send RTCP "RR" reports, then that could cause your server to time out the connection.) > I'd like to rule out the server side in this equation by > getting a bit more information about what it's doing You could add #define DEBUG 1 to the start of "liveMedia/RTSPServer.cpp", and recompile. Alternatively, I suggest using "testRTSPClient" (and/or "openRTSP") as your client. That should tell you if the problem is with your server, or with your client. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From vikram at vizexperts.com Fri May 2 23:42:12 2014 From: vikram at vizexperts.com (Vikram Singh) Date: Sat, 3 May 2014 12:12:12 +0530 Subject: [Live-devel] HTTP Live streaming Message-ID: <000001cf669a$d1caf690$7560e3b0$@vizexperts.com> Hi I have three question. 1) Can we stream h.264 encoded data using HLS in live555. If so, can you please point me out the example which does this in live555 testProgs folder. 2) I was going through a example testMPEG2TransportStreamer.cpp which stream a test.ts. I have placed a valid test.ts file in the path and executed the testMPEG2TransportStreamer.exe. But what is the url so that it will play in browser. I tested with : http://192.168.0.90:80/test.ts http://192.168.0.90:8080/test.ts http://192.168.0.90:554/test.ts all of them are not playing in browser and vlc. 3) Can we stream a h264 data simultaneously to both RTSP and HLS clients. Thanks, Vikram singh -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Fri May 2 23:55:24 2014 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 2 May 2014 23:55:24 -0700 Subject: [Live-devel] HTTP Live streaming In-Reply-To: <000001cf669a$d1caf690$7560e3b0$@vizexperts.com> References: <000001cf669a$d1caf690$7560e3b0$@vizexperts.com> Message-ID: <7B5BAF89-C191-414D-A4CD-B7B5D5CD2FD5@live555.com> > 1) Can we stream h.264 encoded data using HLS in live555. If so, can you please point me out the example which does this in live555 testProgs folder. It's not an application in the "testProgs" directory. Instead, it's the "LIVE555 Media Server" application (in the "mediaServer" directory). See http://www.live555.com/mediaServer/#http-live-streaming > > 2) I was going through a example testMPEG2TransportStreamer.cpp which stream a test.ts. > I have placed a valid test.ts file in the path and executed the testMPEG2TransportStreamer.exe. > But what is the url so that it will play in browser. By default (unless you make a small modification to the code), the "testMPEG2TransportStreamer" application does not have a built-in server, and therefore is not accessed via a URL. Instead, run the "LIVE555 Media Server": http://www.live555.com/mediaServer/ > 3) Can we stream a h264 data simultaneously to both RTSP and HLS clients. As noted in the first link above, the "LIVE555 Media Server" can stream a MPEG Transport Stream file - containing H.264 video - to clients using either RTSP or HLS. However, the data source *must* be a file (and must have a corresponding index file), and is not streamed 'simultaneously' to multiple clients. (Instead, multiple clients can each choose to stream the file at whatever time they choose.) Note that we do not support HLS streaming from a live source - only from a pre-recorded (and pre-indexed) file. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From vikram at vizexperts.com Sat May 3 01:44:34 2014 From: vikram at vizexperts.com (Vikram Singh) Date: Sat, 3 May 2014 14:14:34 +0530 Subject: [Live-devel] HTTP Live streaming In-Reply-To: <7B5BAF89-C191-414D-A4CD-B7B5D5CD2FD5@live555.com> References: <000001cf669a$d1caf690$7560e3b0$@vizexperts.com> <7B5BAF89-C191-414D-A4CD-B7B5D5CD2FD5@live555.com> Message-ID: <001401cf66ab$e9ae0390$bd0a0ab0$@vizexperts.com> I get the H.264 encoded frames in a queue. I implemented RTSP live streaming by extending the class OnDemandServerMediaSubsession and reading from the queue. Can the same be done on HLS. My requirement is to stream OpenGL frames and visualize on a iPad. I can do the same using RTSP but iOS doen't support native decoding for RTSP. It supports only for HLS. From: live-devel [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Saturday, May 03, 2014 12:25 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] HTTP Live streaming 1) Can we stream h.264 encoded data using HLS in live555. If so, can you please point me out the example which does this in live555 testProgs folder. It's not an application in the "testProgs" directory. Instead, it's the "LIVE555 Media Server" application (in the "mediaServer" directory). See http://www.live555.com/mediaServer/#http-live-streaming 2) I was going through a example testMPEG2TransportStreamer.cpp which stream a test.ts. I have placed a valid test.ts file in the path and executed the testMPEG2TransportStreamer.exe. But what is the url so that it will play in browser. 
By default (unless you make a small modification to the code), the "testMPEG2TransportStreamer" application does not have a built-in server, and therefore is not accessed via a URL. Instead, run the "LIVE555 Media Server": http://www.live555.com/mediaServer/ 3) Can we stream a h264 data simultaneously to both RTSP and HLS clients. As noted in the first link above, the "LIVE555 Media Server" can stream a MPEG Transport Stream file - containing H.264 video - to clients using either RTSP or HLS. However, the data source *must* be a file (and must have a corresponding index file), and is not streamed 'simultaneously' to multiple clients. (Instead, multiple clients can each choose to stream the file at whatever time they choose.) Note that we do not support HLS streaming from a live source - only from a pre-recorded (and pre-indexed) file. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sat May 3 01:52:20 2014 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 3 May 2014 01:52:20 -0700 Subject: [Live-devel] HTTP Live streaming In-Reply-To: <001401cf66ab$e9ae0390$bd0a0ab0$@vizexperts.com> References: <000001cf669a$d1caf690$7560e3b0$@vizexperts.com> <7B5BAF89-C191-414D-A4CD-B7B5D5CD2FD5@live555.com> <001401cf66ab$e9ae0390$bd0a0ab0$@vizexperts.com> Message-ID: <2BEB76E9-9A8B-4BCF-B24D-5808E5D76054@live555.com> > I get the H.264 encoded frames in a queue. > I implemented RTSP live streaming by extending the class OnDemandServerMediaSubsession and reading from the queue. > Can the same be done on HLS. No. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jan.ekholm at d-pointer.com Sat May 3 03:20:46 2014 From: jan.ekholm at d-pointer.com (Jan Ekholm) Date: Sat, 3 May 2014 13:20:46 +0300 Subject: [Live-devel] Logging or debug info In-Reply-To: References: <55C525BC-D86D-45F8-A2B9-09F77F270BBD@d-pointer.com> Message-ID: On 3 maj 2014, at 00:35, Ross Finlayson wrote: >> I have a case where I develop both the RTSP server (based on Live555) >> and the client (based on libav). > > Why not use our library for your client as well (and use "libav" just for the decoding)? I wouldn't be surprised if "libav's" implementation of RTSP/RTP/RTCP were imperfect. (E.g., if it were to (incorrectly) fail to send RTCP "RR" reports, then that could cause your server to time out the connection.) The client was started and more or less completed way before I had ever looked in any detail at Live555. I would more or less have to recreate it from scratch if I were to base it on Live555. Not to say that isn't an option though if I can't solve the issues. The server does not time out the stream, as then I'd see in my server code that the stream and all my components are destroyed. The server side component destruction happens when a client closes a stream normally or after the standard timeout, but that does not happen now. > >> I'd like to rule out the server side in this equation by >> getting a bit more information about what it's doing > > You could add > > #define DEBUG 1 > > to the start of "liveMedia/RTSPServer.cpp", and recompile. > > Alternatively, I suggest using "testRTSPClient" (and/or "openRTSP") as your client. That should tell you if the problem is with your server, or with your client. Great, thank you for those debugging tips! 
Especially the test applications will quickly show me if there is any data delivered at all. -- Jan Ekholm jan.ekholm at d-pointer.com From jshanab at jfs-tech.com Sat May 3 05:17:12 2014 From: jshanab at jfs-tech.com (Jeff Shanab) Date: Sat, 3 May 2014 08:17:12 -0400 Subject: [Live-devel] HTTP Live streaming In-Reply-To: <7B5BAF89-C191-414D-A4CD-B7B5D5CD2FD5@live555.com> References: <000001cf669a$d1caf690$7560e3b0$@vizexperts.com> <7B5BAF89-C191-414D-A4CD-B7B5D5CD2FD5@live555.com> Message-ID: I wrote a HLS streamer for live stream in my last job that used live555 to pull from security cameras. I only had to make a small modification to the MPEG2TransportStream class to make deterministic PAT packet (or was that PES, sorry this is from memory). These packets are currently inserted on a timer basis and not related to the keyframe. In order for different people people to connect at different times This was changed to every so many Keyframes. Keyframes are already regularly spaced so I then could use the PAT/PES packets to know where to split the file. I used the Mongoose embedded web server and made a complete in memory system. But... It was not widely used becasue this was security video and HLS has a latency of At Least 1.5 segments and the segment is supposedly minimum 5 seconds. I was able to go to 2 seconds. This was unacceptable. It was easier to port the browser plugin to an android and iphone app and just stream rtsp to the devices. I had another propritary protocol serving frames over HTTP and one of the newer standards is very similar. WEB-M + VP8 On Sat, May 3, 2014 at 2:55 AM, Ross Finlayson wrote: > 1) Can we stream h.264 encoded data using HLS in live555. If so, can > you please point me out the example which does this in live555 testProgs > folder. > > > It's not an application in the "testProgs" directory. Instead, it's the > "LIVE555 Media Server" application (in the "mediaServer" directory). See > http://www.live555.com/mediaServer/#http-live-streaming > > > > 2) I was going through a example testMPEG2TransportStreamer.cpp > which stream a test.ts. > I have placed a valid test.ts file in the path and executed the > testMPEG2TransportStreamer.exe. > But what is the url so that it will play in browser. > > > By default (unless you make a small modification to the code), the > "testMPEG2TransportStreamer" application does not have a built-in server, > and therefore is not accessed via a URL. Instead, run the "LIVE555 Media > Server": > http://www.live555.com/mediaServer/ > > > > 3) Can we stream a h264 data simultaneously to both RTSP and HLS > clients. > > > As noted in the first link above, the "LIVE555 Media Server" can stream a > MPEG Transport Stream file - containing H.264 video - to clients using > either RTSP or HLS. However, the data source *must* be a file (and must > have a corresponding index file), and is not streamed 'simultaneously' to > multiple clients. (Instead, multiple clients can each choose to stream the > file at whatever time they choose.) Note that we do not support HLS > streaming from a live source - only from a pre-recorded (and pre-indexed) > file. > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -------------- next part -------------- An HTML attachment was scrubbed... 
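(To make the file-based workflow described above concrete, the basic steps for serving an indexed Transport Stream file with the "LIVE555 Media Server". The host address below is just the example address from the question; the RTSP port is 554 by default (falling back to 8554 if 554 cannot be used), and the exact HTTP URL for HTTP Live Streaming is whatever the server reports when it starts up.)

  MPEG2TransportStreamIndexer test.ts     (creates the index file "test.tsx")
  live555MediaServer                      (run in the directory containing test.ts and test.tsx)

The stream can then be requested via RTSP, e.g.:

  openRTSP rtsp://192.168.0.90/test.ts

and, because the ".tsx" index file exists, the same file can also be requested over HTTP for HTTP Live Streaming, using the HTTP port that the server prints at startup.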
URL: From jan.ekholm at d-pointer.com Tue May 6 05:33:02 2014 From: jan.ekholm at d-pointer.com (Jan Ekholm) Date: Tue, 6 May 2014 15:33:02 +0300 Subject: [Live-devel] Logging or debug info In-Reply-To: References: <55C525BC-D86D-45F8-A2B9-09F77F270BBD@d-pointer.com> Message-ID: <6BE6CB07-D7D3-4363-9050-0A0FD5F29ABD@d-pointer.com> On 3 maj 2014, at 00:35, Ross Finlayson wrote: >> I have a case where I develop both the RTSP server (based on Live555) >> and the client (based on libav). > > Why not use our library for your client as well (and use "libav" just for the decoding)? I wouldn't be surprised if "libav's" implementation of RTSP/RTP/RTCP were imperfect. (E.g., if it were to (incorrectly) fail to send RTCP "RR" reports, then that could cause your server to time out the connection.) > > >> I'd like to rule out the server side in this equation by >> getting a bit more information about what it's doing > > You could add > > #define DEBUG 1 > > to the start of "liveMedia/RTSPServer.cpp", and recompile. > > Alternatively, I suggest using "testRTSPClient" (and/or "openRTSP") as your client. That should tell you if the problem is with your server, or with your client. Testing with: ./testRTSPClient rtsp://192.168.1.12:8554/camera0 it seems to work ok until the 65 second timeout occurs on the server side. Perhaps it does not handle the needed RTSP conversation? Same happens with ./openRTSP. I see from my server logs that the camera I stream gets deallocated. I have done nothing custom on the Live555 side for the low level RTSP handling, so I assume the clients don't do the needed talking. Live555 says as much too after enabling the DEBUG variable: RTSP client session (id "2FBE732F", stream name "camera0") has timed out (due to inactivity) Other applications like VLC or avconc work fine for longer periods of time (hours), even my own libav based client often works fine for long periods of time. It seems to be something related to when several clients connect to the server and some clients time out. My own client is controlled in a somewhat silly manner, so it gets killed and restarted when there are changes in the video window size. This seems to leave the old sockets open and these time out after a while. When these time outs occur there seems to be a bigger risk for my client to freeze. I haven't found out why, but it only happens when the server is Live555-based. I also now see that the server always streams over TCP, not UDP as I would have expected. Is there any way to control what transport is actually used for the video streams? The control stream is TCP of course. -- Jan Ekholm jan.ekholm at d-pointer.com From finlayson at live555.com Tue May 6 10:40:35 2014 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 6 May 2014 10:40:35 -0700 Subject: [Live-devel] Logging or debug info In-Reply-To: <6BE6CB07-D7D3-4363-9050-0A0FD5F29ABD@d-pointer.com> References: <55C525BC-D86D-45F8-A2B9-09F77F270BBD@d-pointer.com> <6BE6CB07-D7D3-4363-9050-0A0FD5F29ABD@d-pointer.com> Message-ID: > Testing with: > > ./testRTSPClient rtsp://192.168.1.12:8554/camera0 > > it seems to work ok until the 65 second timeout occurs on the server side. Perhaps it does not handle the > needed RTSP conversation? Same happens with ./openRTSP. I see from my server logs that the camera > I stream gets deallocated. I have done nothing custom on the Live555 side for the low level RTSP handling, > so I assume the clients don't do the needed talking. 
Live555 says as much too after enabling the DEBUG > variable: > > RTSP client session (id "2FBE732F", stream name "camera0") has timed out (due to inactivity) The problem here is that your server is not receiving the frequent RTCP "RR" reports that the client (both "testRTSPClient" and "openRTSP") is sending. Because you say that your server is built using the LIVE555 library, this surprises me. Does your server stream via multicast - i.e., using a "PassiveServerMediaSubsession", like the "test*Streamer" demo applications in the "testProgs" directory? If so, then your server *must* create a "RTCPInstance" object along with each "RTPSink" object (as you can see in the "test*Streamer" example code). Or does your server stream via unicast - i.e., using a (subclass of) "OnDemandServerMediaSubsession" - as demonstrated by the "testOnDemandRTSPServer" demo application (or by the "LIVE555 Media Server")? If so, then I can't see how you could be having this problem, because "OnDemandServerMediaSubsession" automatically creates "RTCPInstance" objects. > I also now see that the server always streams over TCP No, you're mistaken about this (unless you've modified the "RTSPServer.cpp" code, in which case you can't expect any support on this mailing list). "testRTSPClient" (unless modified) always requests RTP/RTCP-over-UDP streaming, as does "openRTSP" (unless you explicitly give it the "-t" option). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue May 6 10:42:48 2014 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 6 May 2014 10:42:48 -0700 Subject: [Live-devel] Logging or debug info In-Reply-To: References: <55C525BC-D86D-45F8-A2B9-09F77F270BBD@d-pointer.com> <6BE6CB07-D7D3-4363-9050-0A0FD5F29ABD@d-pointer.com> Message-ID: <7CF11758-9F50-4C92-A014-153B97CA4E64@live555.com> >> I also now see that the server always streams over TCP > > No, you're mistaken about this (unless you've modified the "RTSPServer.cpp" code, in which case you can't expect any support on this mailing list). "testRTSPClient" (unless modified) always requests RTP/RTCP-over-UDP streaming, as does "openRTSP" (unless you explicitly give it the "-t" option). And the server (unless improperly modified) always delivers whatever type of streaming (RTP/RTCP-over-UDP, or RTP/RTCP-over-TCP) is requested by the client. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jan.ekholm at d-pointer.com Tue May 6 11:47:01 2014 From: jan.ekholm at d-pointer.com (Jan Ekholm) Date: Tue, 6 May 2014 21:47:01 +0300 Subject: [Live-devel] Logging or debug info In-Reply-To: References: <55C525BC-D86D-45F8-A2B9-09F77F270BBD@d-pointer.com> <6BE6CB07-D7D3-4363-9050-0A0FD5F29ABD@d-pointer.com> Message-ID: On 6 maj 2014, at 20:40, Ross Finlayson wrote: >> Testing with: >> >> ./testRTSPClient rtsp://192.168.1.12:8554/camera0 >> >> it seems to work ok until the 65 second timeout occurs on the server side. Perhaps it does not handle the >> needed RTSP conversation? Same happens with ./openRTSP. I see from my server logs that the camera >> I stream gets deallocated. I have done nothing custom on the Live555 side for the low level RTSP handling, >> so I assume the clients don't do the needed talking. 
Live555 says as much too after enabling the DEBUG >> variable: >> >> RTSP client session (id "2FBE732F", stream name "camera0") has timed out (due to inactivity) > > The problem here is that your server is not receiving the frequent RTCP "RR" reports that the client (both "testRTSPClient" and "openRTSP") is sending. Because you say that your server is built using the LIVE555 library, this surprises me. It's all a bit random, as some clients can run hours, like VLC. It all seems to depend on how many clients are connected. A single client and there are no problems ever. As soon as there are more clients or the same client crashes/killed/exits and reconnects the problems start. At that point own libav based client will freeze sooner or later. > Does your server stream via multicast - i.e., using a "PassiveServerMediaSubsession", like the "test*Streamer" demo applications in the "testProgs" directory? If so, then your server *must* create a "RTCPInstance" object along with each "RTPSink" object (as you can see in the "test*Streamer" example code). No multicast yet, but that is planned. I have seen and initially used code based on those examples, but then I at least temporarily migrated to use OnDemandServerMediaSubsession. > Or does your server stream via unicast - i.e., using a (subclass of) "OnDemandServerMediaSubsession" - as demonstrated by the "testOnDemandRTSPServer" demo application (or by the "LIVE555 Media Server")? If so, then I can't see how you could be having this problem, because "OnDemandServerMediaSubsession" automatically creates "RTCPInstance" objects. Yes, I use that. My subclass mostly does the getAuxSDPLine() handling for H264 as well as trivial versions of createNewStreamSource() and createNewRTPSink(). > >> I also now see that the server always streams over TCP > > No, you're mistaken about this (unless you've modified the "RTSPServer.cpp" code, in which case you can't expect any support on this mailing list). "testRTSPClient" (unless modified) always requests RTP/RTCP-over-UDP streaming, as does "openRTSP" (unless you explicitly give it the "-t" option). In my code I use an unmodified instance of RTSPServer (apart from the now added DEBUG flag). It gets a few different sessions based on subclassed OnDemandServerMediaSubsession and ProxyServerMediaSession. They are not modified either. The current stream I test with is streaming H264, but also another using MJPEG has the same issues This is what my server says (or the Live555 code through DEBUG=1) when I run testRTSPClient when there are no other clients and no old sessions timing out: accept()ed connection from 192.168.1.12 RTSPClientConnection[0x10300e000]::handleRequestBytes() read 156 new bytes:DESCRIBE rtsp://192.168.1.12:8554/camera0 RTSP/1.0 CSeq: 2 User-Agent: ./testRTSPClient (LIVE555 Streaming Media v2014.03.25) Accept: application/sdp parseRTSPRequestString() succeeded, returning cmdName "DESCRIBE", urlPreSuffix "", urlSuffix "camera0", CSeq "2", Content-Length 0, with 0 bytes following the message. 
sending response: RTSP/1.0 200 OK CSeq: 2 Date: Tue, May 06 2014 18:13:57 GMT Content-Base: rtsp://192.168.1.12:8554/camera0/ Content-Type: application/sdp Content-Length: 521 v=0 o=- 1399371554452606 1 IN IP4 192.168.1.12 s=Local H264 camera i=Camera t=0 0 a=tool:LIVE555 Streaming Media v2014.03.25 a=type:broadcast a=control:* a=source-filter: incl IN IP4 * 192.168.1.12 a=rtcp-unicast: reflection a=range:npt=0- a=x-qt-text-nam:Local H264 camera a=x-qt-text-inf:Camera m=video 0 RTP/AVP 96 c=IN IP4 0.0.0.0 b=AS:200 a=rtpmap:96 H264/90000 a=fmtp:96 packetization-mode=1;profile-level-id=42C01F;sprop-parameter-sets=Z0LAH9kAUAW6EAAAAwAQAAADAyjxgySA,aMuDyyA= a=control:track1 RTSPClientConnection[0x10300e000]::handleRequestBytes() read 187 new bytes:SETUP rtsp://192.168.1.12:8554/camera0/track1 RTSP/1.0 CSeq: 3 User-Agent: ./testRTSPClient (LIVE555 Streaming Media v2014.03.25) Transport: RTP/AVP;unicast;client_port=65142-65143 parseRTSPRequestString() succeeded, returning cmdName "SETUP", urlPreSuffix "camera0", urlSuffix "track1", CSeq "3", Content-Length 0, with 0 bytes following the message. x264 [info]: using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX x264 [info]: profile Constrained Baseline, level 3.1 sending response: RTSP/1.0 200 OK CSeq: 3 Date: Tue, May 06 2014 18:14:00 GMT Transport: RTP/AVP;unicast;destination=192.168.1.12;source=192.168.1.12;client_port=65142-65143;server_port=6970-6971 Session: 90DD8B79;timeout=65 RTSPClientConnection[0x10300e000]::handleRequestBytes() read 166 new bytes:PLAY rtsp://192.168.1.12:8554/camera0/ RTSP/1.0 CSeq: 4 User-Agent: ./testRTSPClient (LIVE555 Streaming Media v2014.03.25) Session: 90DD8B79 Range: npt=0.000- parseRTSPRequestString() succeeded, returning cmdName "PLAY", urlPreSuffix "camera0", urlSuffix "", CSeq "4", Content-Length 0, with 0 bytes following the message. sending response: RTSP/1.0 200 OK CSeq: 4 Date: Tue, May 06 2014 18:14:01 GMT Range: npt=0.000- Session: 90DD8B79 RTP-Info: url=rtsp://192.168.1.12:8554/camera0/track1;seq=5372;rtptime=2132557593 RTSP client session (id "90DD8B79", stream name "camera0") has timed out (due to inactivity) When that happens, testRTSP has shown a long line of: Stream "rtsp://192.168.1.12:8554/camera0/"; video/H264: Received 2385 bytes. Presentation time: 1399400105.931199 and then just freezes. UPDATE: I looked through my own code again after writing this looking for clues. What does the "isSSM" flag really do here: class ServerMediaSession: public Medium { public: static ServerMediaSession* createNew(UsageEnvironment& env, char const* streamName = NULL, char const* info = NULL, char const* description = NULL, Boolean isSSM = False, char const* miscSDPLines = NULL); ... The code I based my code on was for PassiveServerMediaSubsession which uses SSM. I set that flag to False now and that makes testRTSPClient work longer than 65 s. Still my own client gets issues as soon as there are more clients connected. Now I also seem to get from the testRTSPClient: RTSP client session (id "133E19DE", stream name "camera0"): Liveness indication RTSP client session (id "133E19DE", stream name "camera0"): Liveness indication after few seconds, which I did not get when isSSM=True. My own client does not seem to send those, but instead I get these every 10 seconds or so: parseRTSPRequestString() succeeded, returning cmdName "GET_PARAMETER", urlPreSuffix "camera0", urlSuffix "", CSeq "10", Content-Length 0, with 0 bytes following the message. 
sending response: RTSP/1.0 200 OK CSeq: 10 Date: Tue, May 06 2014 18:44:53 GMT Session: 6A8077BD Content-Length: 10 This is quite complicated... I'm sorry for troubling you with this, but I'm grasping at straws trying to get this to work. Best regards, Jan Ekholm From finlayson at live555.com Tue May 6 12:28:27 2014 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 6 May 2014 12:28:27 -0700 Subject: [Live-devel] Logging or debug info In-Reply-To: References: <55C525BC-D86D-45F8-A2B9-09F77F270BBD@d-pointer.com> <6BE6CB07-D7D3-4363-9050-0A0FD5F29ABD@d-pointer.com> Message-ID: <8AFC4EEC-0370-40C1-B7E2-A190D8F0271E@live555.com> > The code I based my code on was for PassiveServerMediaSubsession which uses SSM. That's your problem. If you are streaming via unicast, then you shouldn't be using any of the "test*Streamer.cpp" applications as a model. You need to start again from the beginning, using "testOnDemandRTSPServer" as a model. In fact, I suggest that you first run the (unmodified!) "testOnDemandRTSPServer" application, using "testRTSPClient" or "openRTSP" as a client. Verify that this combination works OK. Then, rewrite your server, which (because you are streaming via unicast) will involve little more than implementing your own subclass of "OnDemandServerMediaSubsession". See: http://www.live555.com/liveMedia/faq.html#liveInput-unicast Make sure that your new server works OK with "testRTSPClient" or "openRTSP". Then - and only then - try running your own (non-LIVE555-based) RTSP client with your new server. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jan.ekholm at d-pointer.com Tue May 6 13:00:38 2014 From: jan.ekholm at d-pointer.com (Jan Ekholm) Date: Tue, 6 May 2014 23:00:38 +0300 Subject: [Live-devel] Logging or debug info In-Reply-To: <8AFC4EEC-0370-40C1-B7E2-A190D8F0271E@live555.com> References: <55C525BC-D86D-45F8-A2B9-09F77F270BBD@d-pointer.com> <6BE6CB07-D7D3-4363-9050-0A0FD5F29ABD@d-pointer.com> <8AFC4EEC-0370-40C1-B7E2-A190D8F0271E@live555.com> Message-ID: On 6 maj 2014, at 22:28, Ross Finlayson wrote: >> The code I based my code on was for PassiveServerMediaSubsession which uses SSM. > > That's your problem. If you are streaming via unicast, then you shouldn't be using any of the "test*Streamer.cpp" applications as a model. You need to start again from the beginning, using "testOnDemandRTSPServer" as a model. > > In fact, I suggest that you first run the (unmodified!) "testOnDemandRTSPServer" application, using "testRTSPClient" or "openRTSP" as a client. Verify that this combination works OK. > > Then, rewrite your server, which (because you are streaming via unicast) will involve little more than implementing your own subclass of "OnDemandServerMediaSubsession". See: > http://www.live555.com/liveMedia/faq.html#liveInput-unicast > Make sure that your new server works OK with "testRTSPClient" or "openRTSP". > > Then - and only then - try running your own (non-LIVE555-based) RTSP client with your new server. It was based on "test*Streamer.cpp", but it's all been rewritten when moving to OnDemandServerMediaSubsession and as I learned how to do H264. The only line of code left from that was the one that did the ServerMediaSession::createNew() call. Everything else is similar to testOnDemandRTSPServer. 
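(A minimal sketch of that "testOnDemandRTSPServer"-style setup for a live unicast camera stream. "MyH264CameraSubsession" is a hypothetical placeholder for the application's own subsession class; the "isSSM" parameter of ServerMediaSession::createNew() is simply left at its default of False, since SSM applies only to multicast streaming.)

void createCameraSession(UsageEnvironment& env, RTSPServer* rtspServer) {
  // Unicast: "isSSM" is left at its default (False)
  ServerMediaSession* sms
    = ServerMediaSession::createNew(env, "camera0", "camera0", "Local H264 camera");
  // reuseFirstSource == True: one camera source shared by all concurrent clients
  sms->addSubsession(MyH264CameraSubsession::createNew(env, True));
  rtspServer->addServerMediaSession(sms);
}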
-- Jan Ekholm jan.ekholm at d-pointer.com From finlayson at live555.com Tue May 6 13:07:59 2014 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 6 May 2014 13:07:59 -0700 Subject: [Live-devel] Logging or debug info In-Reply-To: References: <55C525BC-D86D-45F8-A2B9-09F77F270BBD@d-pointer.com> <6BE6CB07-D7D3-4363-9050-0A0FD5F29ABD@d-pointer.com> <8AFC4EEC-0370-40C1-B7E2-A190D8F0271E@live555.com> Message-ID: <990ED3C1-7707-429D-89EB-BB8BDB1064BC@live555.com> Once again: >> In fact, I suggest that you first run the (unmodified!) "testOnDemandRTSPServer" application, using "testRTSPClient" or "openRTSP" as a client. Verify that this combination works OK. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jan.ekholm at d-pointer.com Tue May 6 13:35:26 2014 From: jan.ekholm at d-pointer.com (Jan Ekholm) Date: Tue, 6 May 2014 23:35:26 +0300 Subject: [Live-devel] Logging or debug info In-Reply-To: <990ED3C1-7707-429D-89EB-BB8BDB1064BC@live555.com> References: <55C525BC-D86D-45F8-A2B9-09F77F270BBD@d-pointer.com> <6BE6CB07-D7D3-4363-9050-0A0FD5F29ABD@d-pointer.com> <8AFC4EEC-0370-40C1-B7E2-A190D8F0271E@live555.com> <990ED3C1-7707-429D-89EB-BB8BDB1064BC@live555.com> Message-ID: <59CFB309-BA62-42A2-8444-065A7A5619CC@d-pointer.com> On 6 maj 2014, at 23:07, Ross Finlayson wrote: > Once again: > >>> In fact, I suggest that you first run the (unmodified!) "testOnDemandRTSPServer" application, using "testRTSPClient" or "openRTSP" as a client. Verify that this combination works OK. It does, once I got rid of isSSM. Now testRTSPClient runs forever, at least for a definition of forever that goes up to 30 minutes. So that at least is not an issue anymore and was my mistake. -- Jan Ekholm jan.ekholm at d-pointer.com From goran.ambardziev at gmail.com Tue May 6 16:30:25 2014 From: goran.ambardziev at gmail.com (Goran Ambardziev) Date: Wed, 07 May 2014 01:30:25 +0200 Subject: [Live-devel] AAC/H264 Closing Down Message-ID: <53697091.2090809@gmail.com> To whom it may concern, this is what worked for me: signal done flag Close RTCP Instances Close Sinks Close Sources Close RTSP Server Delete video and audio rtp and rtcp group socks Reclaim UsageEnvironment Delete Scheduler From jan.ekholm at d-pointer.com Wed May 7 07:23:14 2014 From: jan.ekholm at d-pointer.com (Jan Ekholm) Date: Wed, 7 May 2014 17:23:14 +0300 Subject: [Live-devel] Logging or debug info In-Reply-To: <59CFB309-BA62-42A2-8444-065A7A5619CC@d-pointer.com> References: <55C525BC-D86D-45F8-A2B9-09F77F270BBD@d-pointer.com> <6BE6CB07-D7D3-4363-9050-0A0FD5F29ABD@d-pointer.com> <8AFC4EEC-0370-40C1-B7E2-A190D8F0271E@live555.com> <990ED3C1-7707-429D-89EB-BB8BDB1064BC@live555.com> <59CFB309-BA62-42A2-8444-065A7A5619CC@d-pointer.com> Message-ID: <594DC213-C182-44DB-90C0-C8CE4CDEAAE7@d-pointer.com> On 6 maj 2014, at 23:35, Jan Ekholm wrote: > > On 6 maj 2014, at 23:07, Ross Finlayson wrote: > >> Once again: >> >>>> In fact, I suggest that you first run the (unmodified!) "testOnDemandRTSPServer" application, using "testRTSPClient" or "openRTSP" as a client. Verify that this combination works OK. > > It does, once I got rid of isSSM. Now testRTSPClient runs forever, at least for a definition of forever that > goes up to 30 minutes. So that at least is not an issue anymore and was my mistake. Ok, after some Wiresharking and lots of testing I have a case that seems to explain why my client freezes. 
I may have misunderstood something, but as far as I can see this behavior is not correct. I have: * a server based on the Live555 libraries that serve local camera data to clients. * a libav based client that can connect to one stream and decode it. To reproduce the problem I do: 1. start server, make sure there are no clients connected to it. 2. start one client that uses TCP as the transport. The client successfully connects and negotiates the RTSP setup for the stream. The client starts getting RTP data over TCP and decodes the video fine. The server outputs when the stream starts: parseRTSPRequestString() succeeded, returning cmdName "PLAY", urlPreSuffix "camera0", urlSuffix "", CSeq "4", Content-Length 0, with 0 bytes following the message. sending response: RTSP/1.0 200 OK CSeq: 4 Date: Wed, May 07 2014 14:02:46 GMT Range: npt=0.000- Session: 61AA5DE4 RTP-Info: url=rtsp://192.168.1.12:8554/camera0/track1;seq=1220;rtptime=2615924584 I see from my own logs that the camera has been opened and Live555 requests frames which are delivered. 3. kill the client. This gives the output: RTSPClientConnection[0x10c008000]::handleRequestBytes() read -1 new bytes (of 10000); terminating connection! I assume this means that the server noticed that the client was killed and closes the connection. So far so good. 4. start the same client again, this gives: accept()ed connection from 192.168.1.124 RTSPClientConnection[0x10c008000]::handleRequestBytes() read 87 new bytes:OPTIONS rtsp://192.168.1.12:8554/camera0 RTSP/1.0 CSeq: 1 User-Agent: Lavf55.16.0 ... There is more related to DESCRIBE etc and then comes: parseRTSPRequestString() succeeded, returning cmdName "PLAY", urlPreSuffix "camera0", urlSuffix "", CSeq "4", Content-Length 0, with 0 bytes following the message. sending response: RTSP/1.0 200 OK CSeq: 4 Date: Wed, May 07 2014 14:03:02 GMT Range: npt=0.000- Session: 5CEA1D88 RTP-Info: url=rtsp://192.168.1.12:8554/camera0/track1;seq=4242;rtptime=2617382418 This client now also receives the stream and decodes it fine. 5. the client sends a GET_PARAMETER as a keep alive. parseRTSPRequestString() succeeded, returning cmdName "GET_PARAMETER", urlPreSuffix "camera0", urlSuffix "", CSeq "5", Content-Length 0, with 0 bytes following the message. sending response: RTSP/1.0 200 OK CSeq: 5 Date: Wed, May 07 2014 14:03:34 GMT Session: 5CEA1D88 Content-Length: 10 6. some time passes, 65 seconds from that the first client was killed, then this is shown by the server: 2014.03.25RTSP client session (id "61AA5DE4", stream name "camera0") has timed out (due to inactivity) Note that the session id is the same as the one for the first client. So it times out 65 seconds after Live555 noticed that the client disconnected. At this point there is no more data delivered to the second client. This is verified using Wireshark. As soon as the time out is printed, all network data to the second client stops and the client blocks as it expects more data. 7. check the camera logging. I see that Live555 still requests frames from it and they get delivered normally. 8. some time passes, enough for the second client to time out. Now I see: RTSP client session (id "5CEA1D88", stream name "camera0") has timed out (due to inactivity) The second client never sends and more keepalives as it's expecting data from Live555. It should perhaps not block for so long and should perhaps abort the reading and send the keep alive, but that is another matter. 
When this second session times out then the camera source is released and no more frames are requested. Based on what I see it seems as if Live555 does not properly clean up the first session when the connection breaks. Instead it seems to close the wrong connection when the time out occurs. Perhaps there is a 65 s timer that does not get cleaned up or there is some array index that refers to the wrong session? Whatever the cause, the stream to a still valid client gets killed. As the camera source is not closed it's also an indication that Live555 thinks that there is at least one valid stream left. Note that if UDP is used there doesn't seem to be any problems like these. I can start and kill clients at will and all streams are delivered as they should. If more details are needed please let me know. Best regards, Jan Ekholm -- Jan Ekholm jan.ekholm at d-pointer.com From finlayson at live555.com Wed May 7 11:28:01 2014 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 7 May 2014 11:28:01 -0700 Subject: [Live-devel] Logging or debug info In-Reply-To: <594DC213-C182-44DB-90C0-C8CE4CDEAAE7@d-pointer.com> References: <55C525BC-D86D-45F8-A2B9-09F77F270BBD@d-pointer.com> <6BE6CB07-D7D3-4363-9050-0A0FD5F29ABD@d-pointer.com> <8AFC4EEC-0370-40C1-B7E2-A190D8F0271E@live555.com> <990ED3C1-7707-429D-89EB-BB8BDB1064BC@live555.com> <59CFB309-BA62-42A2-8444-065A7A5619CC@d-pointer.com> <594DC213-C182-44DB-90C0-C8CE4CDEAAE7@d-pointer.com> Message-ID: > Based on what I see it seems as if Live555 does not properly clean up the first session when the > connection breaks. Instead it seems to close the wrong connection when the time out occurs. No, based on your description I'm not seeing any incorrect server behavior. The server is timing out the first connection (and deleting the "RTSPClientConnection" object) 65 seconds after you kill the first client (without having it send a "TEARDOWN"). The second client then causes a new "RTSPClientConnection" object to get created. (It happens to have the same address - 0x10c008000 - as the old "RTSPClientConnection" object, but it's a new object.) The reason why the server stops streaming to the second client after the first client has timed out is most likely due to a problem with your own additions to the server - i.e., either your own "OnDemandServerMediaMediaSubsession" subclass, and/or your own "FramedSource" subclass. First, when your "OnDemandServerMediaMediaSubsession" subclass's constructor calls the "OnDemandServerMediaMediaSubsession" constructor, then make sure that the "reuseFirstSource" parameter is "True". This ensures that - if two or more clients request the stream concurrently - then your "FramedSource" subclass (i.e., your data source class) will be created only once. Second, you must allow for the possibility of your "FramedSource" subclass object being closed, then recreated - perhaps multiple times. If you set "reuseFirstSource" to "True" (as noted above), then you will never see your "FramedSource" subclass constructor being called more than once before the destructor is called - but you may see "constructor", "destructor", "constructor", "destructor", etc. Your "FramedSource" subclass's constructors and destructors must allow for this to happen. Finally, if you want to test the behavior of your server, then you should first do so using our "openRTSP" client (note the "-t" option to request RTP/RTCP-over-TCP streaming), not with some random other client (that might use a different - perhaps broken - RTSP client implementation). 
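(To make Ross's first point above concrete, a skeleton of such a subsession subclass might look as follows. The class names "CameraMediaSubsession" and "CameraFramedSource" and the bitrate estimate are made up for the example; this is a sketch under the assumption that the camera source delivers one H.264 NAL unit at a time, not Jan's actual code:)

    class CameraMediaSubsession: public OnDemandServerMediaSubsession {
    public:
      static CameraMediaSubsession* createNew(UsageEnvironment& env) {
        return new CameraMediaSubsession(env);
      }
    protected:
      CameraMediaSubsession(UsageEnvironment& env)
        : OnDemandServerMediaSubsession(env, True /*reuseFirstSource*/) {}

      virtual FramedSource* createNewStreamSource(unsigned /*clientSessionId*/,
                                                  unsigned& estBitrate) {
        estBitrate = 2000; // kbps; just a guess
        // Wrap the camera source in a discrete framer, one NAL unit at a time:
        return H264VideoStreamDiscreteFramer::createNew(envir(),
                                                         CameraFramedSource::createNew(envir()));
      }

      virtual RTPSink* createNewRTPSink(Groupsock* rtpGroupsock,
                                        unsigned char rtpPayloadTypeIfDynamic,
                                        FramedSource* /*inputSource*/) {
        return H264VideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic);
      }
    };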
Similarly, if you want to test the behavior of your own client, then you should first do so using "testOnDemandRTSPServer" or the "LIVE555 Media Server" as the server. This is common sense. If you want to test a client-server combination, then start with one combination that you know works OK ("testOnDemandRTSPServer" with "openRTSP"), then change only one end at a time as you continue your testing. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jan.ekholm at d-pointer.com Wed May 7 12:07:29 2014 From: jan.ekholm at d-pointer.com (Jan Ekholm) Date: Wed, 7 May 2014 22:07:29 +0300 Subject: [Live-devel] Logging or debug info In-Reply-To: References: <55C525BC-D86D-45F8-A2B9-09F77F270BBD@d-pointer.com> <6BE6CB07-D7D3-4363-9050-0A0FD5F29ABD@d-pointer.com> <8AFC4EEC-0370-40C1-B7E2-A190D8F0271E@live555.com> <990ED3C1-7707-429D-89EB-BB8BDB1064BC@live555.com> <59CFB309-BA62-42A2-8444-065A7A5619CC@d-pointer.com> <594DC213-C182-44DB-90C0-C8CE4CDEAAE7@d-pointer.com> Message-ID: On 7 maj 2014, at 21:28, Ross Finlayson wrote: >> Based on what I see it seems as if Live555 does not properly clean up the first session when the >> connection breaks. Instead it seems to close the wrong connection when the time out occurs. > > No, based on your description I'm not seeing any incorrect server behavior. The server is timing out the first connection (and deleting the "RTSPClientConnection" object) 65 seconds after you kill the first client (without having it send a "TEARDOWN"). The second client then causes a new "RTSPClientConnection" object to get created. (It happens to have the same address - 0x10c008000 - as the old "RTSPClientConnection" object, but it's a new object.) > > The reason why the server stops streaming to the second client after the first client has timed out is most likely due to a problem with your own additions to the server - i.e., either your own "OnDemandServerMediaMediaSubsession" subclass, and/or your own "FramedSource" subclass. My own subclasses are very shallow and I can not see what else (apart from isSSM) that could have any effect on this. Nothing deadlocks in my code. What happens is that the wrong stream gets cut off. The connection to the second client is not closed cleanly, as otherwise it would get EOF at some point. > > First, when your "OnDemandServerMediaMediaSubsession" subclass's constructor calls the "OnDemandServerMediaMediaSubsession" constructor, then make sure that the "reuseFirstSource" parameter is "True". This ensures that - if two or more clients request the stream concurrently - then your "FramedSource" subclass (i.e., your data source class) will be created only once. reuseFirstSource is set to True, and it works ok with several clients streaming the same source: only one instance is ever created. > Second, you must allow for the possibility of your "FramedSource" subclass object being closed, then recreated - perhaps multiple times. If you set "reuseFirstSource" to "True" (as noted above), then you will never see your "FramedSource" subclass constructor being called more than once before the destructor is called - but you may see "constructor", "destructor", "constructor", "destructor", etc. Your "FramedSource" subclass's constructors and destructors must allow for this to happen. Yes, this also works. During the stream startup one instance is created so that the H264 parameters can be probed. 
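(And a matching data-source skeleton, written so that it tolerates the construct/destruct/construct cycle described above; the camera calls are placeholders:)

    class CameraFramedSource: public FramedSource {
    public:
      static CameraFramedSource* createNew(UsageEnvironment& env) {
        return new CameraFramedSource(env);
      }
    protected:
      CameraFramedSource(UsageEnvironment& env): FramedSource(env) {
        openCamera();   // must work again after a previous instance was destroyed
      }
      virtual ~CameraFramedSource() {
        closeCamera();  // release everything; a new instance may be created later
      }
    private:
      virtual void doGetNextFrame() {
        // Copy (at most fMaxSize bytes of) the next NAL unit into fTo, set
        // fFrameSize and fPresentationTime, then tell the downstream reader:
        FramedSource::afterGetting(this);
      }
      void openCamera()  { /* acquire the camera device - placeholder */ }
      void closeCamera() { /* release the camera device - placeholder */ }
    };

With reuseFirstSource set to True, a second concurrent client simply reuses the existing instance; only after the last client goes away is the destructor run, and a later client triggers the constructor again.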
That instance is then destroyed and a new one created for the real streaming task. That part works ok. I even see the USB camera light light up for the probe, go dark and then light up again when the real stream starts. > Finally, if you want to test the behavior of your server, then you should first do so using our "openRTSP" client (note the "-t" option to request RTP/RTCP-over-TCP streaming), not with some random other client (that might use a different - perhaps broken - RTSP client implementation). Similarly, if you want to test the behavior of your own client, then you should first do so using "testOnDemandRTSPServer" or the "LIVE555 Media Server" as the server. This is common sense. If you want to test a client-server combination, then start with one combination that you know works OK ("testOnDemandRTSPServer" with "openRTSP"), then change only one end at a time as you continue your testing. I can test tomorrow with one of the test servers against my client. But even a badly working client should not be able to cause problems (cut stream) for another unrelated client. -- Jan Ekholm jan.ekholm at d-pointer.com From finlayson at live555.com Wed May 7 13:57:35 2014 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 7 May 2014 13:57:35 -0700 Subject: [Live-devel] Logging or debug info In-Reply-To: References: <55C525BC-D86D-45F8-A2B9-09F77F270BBD@d-pointer.com> <6BE6CB07-D7D3-4363-9050-0A0FD5F29ABD@d-pointer.com> <8AFC4EEC-0370-40C1-B7E2-A190D8F0271E@live555.com> <990ED3C1-7707-429D-89EB-BB8BDB1064BC@live555.com> <59CFB309-BA62-42A2-8444-065A7A5619CC@d-pointer.com> <594DC213-C182-44DB-90C0-C8CE4CDEAAE7@d-pointer.com> Message-ID: > My own subclasses are very shallow and I can not see what else (apart from isSSM) that could have any effect > on this. Nothing deadlocks in my code. What happens is that the wrong stream gets cut off. If that's the case, then you should be able to verify this using openRTSP -t and testOnDemandRTSPServer with just one minor change. In "testOnDemandRTSPServer.cpp", change: Boolean reuseFirstSource = False; to Boolean reuseFirstSource = True; Please don't post to the mailing list again until you've done this. (I assume that you're using the latest version of the LIVE555 code (the only version that we support). Several RTP/RTCP-over-TCP-related bugs were fixed over the past year or so.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From gordu at dvr2010.com Wed May 7 17:58:30 2014 From: gordu at dvr2010.com (Gord Umphrey) Date: Wed, 7 May 2014 20:58:30 -0400 Subject: [Live-devel] Event Loop Not Terminating Message-ID: Hi Ross; We use the event loop ?watch variable? mechanism to stop the event loop and terminate our application (which is based on the testRTSPClient sample). This works about 99.9% of the time. However our systems run 24/7 and occasionally we are finding that the event loop does not terminate. After some painful investigation we figured out why. The problem is in RTPInterface.cpp, member function sendDataOverTCP(). The event loop is hung in the socket send function ? because the socket is in blocking mode. We have modified the live555 source as follows. Instead of making the socket blocking we have set a send timeout with: int milliseconds = 1000; setsockopt(socket, SOL_SOCKET, SO_SNDTIMEO, (char *)&milliseconds, sizeof(int)); This prevents the hanging and allows the event loop to exit. 
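(One caveat about the snippet above: passing an int of milliseconds to SO_SNDTIMEO is what Windows expects, but on POSIX systems the option takes a struct timeval, so a portable version of the same hack would presumably look more like the following, "sock" being the TCP socket in question:)

    #ifdef _WIN32
      DWORD sendTimeout = 1000;      // Windows: milliseconds
      setsockopt(sock, SOL_SOCKET, SO_SNDTIMEO,
                 (const char*)&sendTimeout, sizeof sendTimeout);
    #else
      struct timeval sendTimeout;    // POSIX: a struct timeval
      sendTimeout.tv_sec = 1;
      sendTimeout.tv_usec = 0;
      setsockopt(sock, SOL_SOCKET, SO_SNDTIMEO,
                 (const char*)&sendTimeout, sizeof sendTimeout);
    #endif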
Another option would be to close the socket handle from a different thread which would cause the socket send function to return. (This is how we break out of a blocking socket call in other environments). However there does not seem to be a way to get the socket handle since these variables are private, as is RTSPClient::resetTCPSockets. We would prefer not to modify live555 source, so if you have any suggestions we would greatly appreciate it. Thank you, Gord. -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Wed May 7 21:29:13 2014 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 7 May 2014 21:29:13 -0700 Subject: [Live-devel] Event Loop Not Terminating In-Reply-To: References: Message-ID: <1F4327AD-7642-4E0E-A50F-9691476F12E7@live555.com> > We use the event loop ?watch variable? mechanism to stop the event loop and terminate our application (which is based on the testRTSPClient sample). This works about 99.9% of the time. However our systems run 24/7 and occasionally we are finding that the event loop does not terminate. After some painful investigation we figured out why. The problem is in RTPInterface.cpp, member function sendDataOverTCP(). The event loop is hung in the socket send function ? because the socket is in blocking mode. Note that the blocking "send()" is done *only if* the immediately-previous, non-blocking "send()" succeeded in writing only part of the data. That's why it's done. And it has to be done (and therefore, your timeout hack couldn't be applied, unless - at the end of the timeout - the socket were always closed). See http://lists.live555.com/pipermail/live-devel/2014-April/018270.html I don't know why this "send()" is blocking for a long period of time for you, but it suggests that you may have severe network congestion (and insufficient network bandwidth), for which RTP/RTCP-over-TCP delivery is not appropriate. As noted in the link above, you can now disable RTP/RTCP-over-TCP delivery in your server. > Another option would be to close the socket handle from a different thread which would cause the socket send function to return. (This is how we break out of a blocking socket call in other environments). However there does not seem to be a way to get the socket handle since these variables are private No, they're not. They're "protected". Note the fields "fClientInputSocket" and "fClientOutputSocket" in "RTSPServer::RTSPClientConnection". You could subclass "RTSPServer::RTSPClientConnection" (and therefore also subclass "RTSPServer" and reimplement "createNewClientConnection()"). Then your subclass would have access to these two sockets (which are usually the same, unless you're doing RTSP/RTP/RTCP-over-HTTP). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu May 8 13:53:18 2014 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 8 May 2014 13:53:18 -0700 Subject: [Live-devel] Event Loop Not Terminating In-Reply-To: References: Message-ID: <4C854790-35E3-465D-BBD6-CA9EFE714659@live555.com> I've decided to address this situation by implementing Gord Umphrey's 'send timeout' hack. A difference, however, is that if the blocking "send()" does time out, we close the socket. Because the socket is used for both RTSP and RTP/RTP-over-TCP, this means that all remaining network activity on this connection will cease. 
(If this is a server, then the "RTSPClientConnection" state will eventually get reclaimed normally.) This change is implemented in the latest release: 2014.05.08 By default, the timeout interval is 500 ms. If you wish, you can change this by compiling "RTPInterface.cpp" with RTPINTERFACE_BLOCKING_WRITE_TIMEOUT_MS defined to be some other value. (A value of 0 means: Don't timeout - i.e., the previous behavior.) (A reminder, once again, that this applies *only* to RTP/RTCP-over-TCP connections where the initial, non-blocking write succeeded in writing only part of the data.) Alternatively, if you're running a RTSP server, you can call "RTSPServer::disableStreamingRTPOverTCP()" to tell the server to reject clients that request RTP/RTCP-over-TCP streaming. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From perera.amila at gmail.com Thu May 8 04:11:45 2014 From: perera.amila at gmail.com (Amila Perera) Date: Thu, 8 May 2014 16:41:45 +0530 Subject: [Live-devel] sreaming while progressively trascode .MOV to H.264 Message-ID: Hi all, This is my first mail on this thread. I have a requirement to implement a streaming server that streams a .MOV which contains H.264 encoded video data. When the client sends a request to play the media, the server should 'progressively transcode' the .MOV to H.264 and stream it. (example: transcode 500 frames from the .MOV file and stream that portion, transcode the next 500 frames and stream that portion and so on) I am quite new to this domain (encoding, rtsp/rtp server implementation etc.). I am just thinking of extracting few frames from .MOV at a time, convert to H.264 by using ffmpeg and stream that portion using live555 library and to continue the same procedure until I stream the entire media or the client presses the stop button. Following are the things that bother me. 1. Can my requirement be accomplished by using live555 library alone (without using ffmpeg) ? 2. Or am I completely going towards a different direction from my original requirement. Thank you in advance. -- *Amila Perera.* -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sun May 11 13:15:52 2014 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 11 May 2014 13:15:52 -0700 Subject: [Live-devel] sreaming while progressively trascode .MOV to H.264 In-Reply-To: References: Message-ID: > Can my requirement be accomplished by using live555 library alone (without using ffmpeg) ? No, because the "LIVE555 Streaming Media" libraries don't include any mechanism for demultiplexing from a ".mov" (or ".mp4") file. Instead, I suggest that you use a Matroska (i.e., ".mkv") file as your input source. You can already stream from a ".mkv" file that contains H.264 video, using the existing "LIVE555 Media Server" (or "testMKVStreamer", to stream via multicast) application, without modification. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
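(For reference - this relates to the 2014.05.08 timeout change described above, not to the .MOV question - the two server-side knobs Ross mentions would be used roughly like this; the port number and the NULL authentication database are placeholders:)

    // Option 1: rebuild liveMedia with a different blocking-write timeout, e.g.
    // compile RTPInterface.cpp with -DRTPINTERFACE_BLOCKING_WRITE_TIMEOUT_MS=2000
    // (a value of 0 disables the timeout, i.e. the old behavior).

    // Option 2: refuse RTP/RTCP-over-TCP clients entirely:
    RTSPServer* rtspServer = RTSPServer::createNew(*env, 8554, NULL);
    if (rtspServer != NULL) rtspServer->disableStreamingRTPOverTCP();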
URL:

From s.decarpentieri at abcimpianti.com Mon May 12 05:19:27 2014 From: s.decarpentieri at abcimpianti.com (Saverio ABC) Date: Mon, 12 May 2014 14:19:27 +0200 Subject: [Live-devel] Problem with RTSP streaming of HD Camera with Live555Proxyserver In-Reply-To: <5370AE21.2010705@abcimpianti.com> References: <5370AE21.2010705@abcimpianti.com> Message-ID: <5370BC4F.30706@abcimpianti.com>

Dear Live555, We are an Italian company that uses live555ProxyServer to duplicate RTSP streams from cameras to our server, which runs Linux Ubuntu 12.04. We have a problem with streaming from an HD-resolution camera (e.g. 1.3 Mpixel, 1280*980): When we proxy the RTSP stream of the camera in D1 resolution, everything is fine. When we proxy the RTSP stream of the same camera in HD resolution, the video comes out corrupted, particularly towards the end of the image; but if I instead take the RTSP stream directly from the HD camera, without going through live555ProxyServer, the video is fine. I have attached screenshots of the two cases. I tried increasing "OutPacketBuffer::maxSize" to 100000 or 200000, but the problem is the same. How can I resolve this problem? Thanks for your support. Best regards, -- Ing. Saverio De Carpentieri ABC Impianti S.r.l.

-------------- next part -------------- A non-text attachment was scrubbed... Name: direct from cam HD.png Type: image/png Size: 707618 bytes Desc: not available URL:
-------------- next part -------------- A non-text attachment was scrubbed... Name: with LIVE555 Proxy Server.png Type: image/png Size: 791587 bytes Desc: not available URL:

From finlayson at live555.com Mon May 12 07:42:31 2014 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 12 May 2014 07:42:31 -0700 Subject: [Live-devel] Problem with RTSP streaming of HD Camera with Live555Proxyserver In-Reply-To: <5370BC4F.30706@abcimpianti.com> References: <5370AE21.2010705@abcimpianti.com> <5370BC4F.30706@abcimpianti.com> Message-ID:

> How can I resolve this problem?

The best solution is to not send such large NAL units. Reconfigure your encoder to break up 'key frames' into multiple (therefore much smaller) 'slice' NAL units.

It's important to understand that each outgoing NAL unit - if it is larger than the RTP/UDP packet size (about 1500 bytes on most networks) - will be broken up into multiple outgoing RTP packets, and the receiver - both the proxy server and the final receiving client(s) - must receive *all* of these packets in order to be able to reconstruct the frame. In other words, if even one of these packets is lost, then the receiver will lose the *entire* NAL unit. That's why the NAL units - generated by your encoder - should be as small as is reasonable.

Alternatively (though not as good, for the reason noted above), you can increase the buffer size ("OutPacketBuffer::maxSize") that the "LIVE555 Proxy Server" uses. I.e., change line 59 of "proxyServer/live555ProxyServer.cpp".

> I tried increasing "OutPacketBuffer::maxSize" to 100000 or 200000, but the problem is the same. >

I'm not sure why; if you make this buffer size large enough, it should work (provided, of course, you don't lose any network packets, as noted above). You should also make sure that you're using an up-to-date version of the "LIVE555 Streaming Media" code; see http://www.live555.com/liveMedia/faq.html#latest-version

Ross Finlayson Live Networks, Inc. http://www.live555.com/

-------------- next part -------------- An HTML attachment was scrubbed...
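(For anyone looking for where this is set: "OutPacketBuffer::maxSize" is a static variable, and it has to be assigned before the relevant RTP sinks are created - in the proxy it is the assignment near the top of "live555ProxyServer.cpp" that Ross points at. The value below is just an example, not a recommendation:)

    // Must be large enough to hold the largest NAL unit the camera will send:
    OutPacketBuffer::maxSize = 2000000;   // bytes; example value only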
URL: From CERLANDS at arinc.com Mon May 12 09:08:55 2014 From: CERLANDS at arinc.com (Erlandsson, Claes P (CERLANDS)) Date: Mon, 12 May 2014 16:08:55 +0000 Subject: [Live-devel] Problem with RTSP streaming of HD Camera with Live555Proxyserver In-Reply-To: <5370BC4F.30706@abcimpianti.com> References: <5370AE21.2010705@abcimpianti.com> <5370BC4F.30706@abcimpianti.com> Message-ID: >I try to increase "OutPacketBuffer::maxSize" to 100000 or 200000 but the >problem it's the same. How big are your frames? I typically see larger frames than that from high resolution cameras and have the buffer set to 500000 or 1000000. /Claes -------------- next part -------------- A non-text attachment was scrubbed... Name: smime.p7s Type: application/x-pkcs7-signature Size: 5796 bytes Desc: not available URL: From s.decarpentieri at abcimpianti.com Mon May 12 08:37:45 2014 From: s.decarpentieri at abcimpianti.com (Saverio ABC) Date: Mon, 12 May 2014 17:37:45 +0200 Subject: [Live-devel] Problem with RTSP streaming of HD Camera with Live555Proxyserver In-Reply-To: References: <5370AE21.2010705@abcimpianti.com> <5370BC4F.30706@abcimpianti.com> Message-ID: <5370EAC9.8060803@abcimpianti.com> Dear Ross, Thanks for your response, i analyzed the problem better. First of all I update today live555 and now we use ProxyRTSPClient (LIVE555 Streaming Media v2014.05.08) that it's the last version. In my Encoder we only can set Encode, Resolution, Frame-rate(FPS), I-Frame interval and bit-rate, and now it's set to: Encode: h264 Resolution: 1,3M (1280*960) Frame-rate(FPS): 15 (max value) I-Frame interval: 15 (min value) bit-rate: 2048 Kb/s or 1024 Kb/s To send smaller NAL units i reduce I-Frame interval to 15 but i don't want reduce Resolution! How can reduce NAL unit without change resolution and FPS I also try to increase "OutPacketBuffer::maxSize" in "proxyServer/live555ProxyServer.cpp" to 2000000 and more but the problem it's the same. Finally I find this behaviour: I start my software that use live555ProxyServer and run code "live555ProxyServer -V -p 9002 -R rtsp://admin:369000 at 192.168.100.20:554/cam/realmonitor?channel=1&subtype=0" now we have the same situation: --> [RTSP client1] [back-end RTSP/RTP stream] --> [LIVE555 Proxy Server] --> [RTSP client2] ... --> [RTSP clientN] If I use only one RTSP client all it's fine: the video HD streaming result fine and not corrupt but, as soon as, i start another RTSP client (es. vlc) the video HD streaming result corrupt as in attached picture, WHY? I'am in local network (LAN) so we can't have network problem! Thanks so much for your support. Best Regards, Ing. Saverio De Carpentieri ABC Impianti S.r.l. Il 12/05/2014 16:42, Ross Finlayson ha scritto: >> How can I resolve this problem? > > The best solution is to not send such large NAL units. Reconfigure > your encoder to break up 'key frames' into multiple (therefore much > smaller) 'slice' NAL units. > > It's important to understand that each outgoing NAL unit - if it is > larger than the RTP/UDP packet size (about 1500 bytes on most > networks) - will be broken up into multiple outgoing RTP packets, and > the receiver - both the proxy server and the final receiving client(s) > - must receive *all* of these packets in order to be able to > reconstruct the frame. In other words, if even one of these packets > is lost, then the receiver will lose the *entire* NAL unit. That's > why the NAL units - generated by your encoder - should be as small as > is reasonable. 
> > > Alternatively (though not as good, for the reason noted above), you > can increase the buffer size ("OutPacketBuffer::maxSize") that the > "LIVE555 Proxy Server" uses. I.e., change line 59 of > "proxyServer/live555ProxyServer.cpp". > >> I try to increase "OutPacketBuffer::maxSize" to 100000 or 200000 but >> the problem it's the same. >> > > I'm not sure why; if you make this buffer size large enough, it should > work (provided, of course, you don't lose any network packets, as > noted above). > > You should also make sure that you're using an up-to-date version of > the "LIVE555 Streaming Media" code; see > http://www.live555.com/liveMedia/faq.html#latest-version > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: Bad streaming.png Type: image/png Size: 353418 bytes Desc: not available URL: From finlayson at live555.com Mon May 12 11:46:47 2014 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 12 May 2014 11:46:47 -0700 Subject: [Live-devel] Problem with RTSP streaming of HD Camera with Live555Proxyserver In-Reply-To: <5370EAC9.8060803@abcimpianti.com> References: <5370AE21.2010705@abcimpianti.com> <5370BC4F.30706@abcimpianti.com> <5370EAC9.8060803@abcimpianti.com> Message-ID: > How can reduce NAL unit without change resolution and FPS Ask the manufacturer of your encoder to upgrade it so that it can break up I-frames into multiple 'slice' NAL units. > I start my software that use live555ProxyServer and run code "live555ProxyServer -V -p 9002 -R rtsp://admin:369000 at 192.168.100.20:554/cam/realmonitor?channel=1&subtype=0" The "LIVE555 Proxy Server" has no "-p" option. > > now we have the same situation: > --> [RTSP client1] > [back-end RTSP/RTP stream] --> [LIVE555 Proxy Server] --> [RTSP client2] > ... > --> [RTSP clientN] > If I use only one RTSP client all it's fine: the video HD streaming result fine and not corrupt but, as soon as, i start another RTSP client (es. vlc) the video HD streaming result corrupt as in attached picture, WHY? Is your first client VLC also, or only your second client? If you're using VLC as a client, then be aware that - if your NAL units are excessively large - the first few received frames *will* be truncated. VLC notices this and increases its buffer size, so after a few seconds should have increased its buffer size large enough. (However, VLC is not our software, so we can't help you with VLC-related problems.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From s.decarpentieri at abcimpianti.com Mon May 12 23:42:58 2014 From: s.decarpentieri at abcimpianti.com (Saverio ABC) Date: Tue, 13 May 2014 08:42:58 +0200 Subject: [Live-devel] Problem with RTSP streaming of HD Camera with Live555Proxyserver In-Reply-To: References: <5370AE21.2010705@abcimpianti.com> <5370BC4F.30706@abcimpianti.com> Message-ID: <5371BEF2.5060302@abcimpianti.com> Dear Claes, We use this encoder details: Encode: h264 Resolution: 1,3M (1280*960) Frame-rate(FPS): 15 (max value) I-Frame interval: 15 (min value) bit-rate: 2048 Kb/s or 1024 Kb/s I also try to increase "OutPacketBuffer::maxSize" in "proxyServer/live555ProxyServer.cpp" to 1000000 and more but the problem it's the same. Saverio Il 12/05/2014 18:08, Erlandsson, Claes P (CERLANDS) ha scritto: >> I try to increase "OutPacketBuffer::maxSize" to 100000 or 200000 but the >> problem it's the same. > > How big are your frames? I typically see larger frames than that from high > resolution cameras and have the buffer set to 500000 or 1000000. > > /Claes > > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From s.decarpentieri at abcimpianti.com Mon May 12 23:50:54 2014 From: s.decarpentieri at abcimpianti.com (Saverio ABC) Date: Tue, 13 May 2014 08:50:54 +0200 Subject: [Live-devel] Problem with RTSP streaming of HD Camera with Live555Proxyserver In-Reply-To: References: <5370AE21.2010705@abcimpianti.com> <5370BC4F.30706@abcimpianti.com> <5370EAC9.8060803@abcimpianti.com> Message-ID: <5371C0CE.1010509@abcimpianti.com> Dear Ross, I Try to ask Dahua for this problem! "-p" option is our implementation in "proxyServer/live555ProxyServer.cpp" tu use a specific port UDP, it's work great ;)! VLC is not our first client, we use DirectShow as first Client. we use VLC for rapid test of RTSP Client as secondary RTSP Client Saverio Il 12/05/2014 20:46, Ross Finlayson ha scritto: >> How can reduce NAL unit without change resolution and FPS > > Ask the manufacturer of your encoder to upgrade it so that it can > break up I-frames into multiple 'slice' NAL units. > > >> I start my software that use live555ProxyServer and run code >> "live555ProxyServer -V -p 9002 -R >> rtsp://admin:369000 at 192.168.100.20:554/cam/realmonitor?channel=1&subtype=0" > > The "LIVE555 Proxy Server" has no "-p" option. > > >> >> now we have the same situation: >> --> [RTSP client1] >> [back-end RTSP/RTP stream] --> [LIVE555 Proxy Server] --> [RTSP client2] >> ... >> --> [RTSP clientN] >> If I use only one RTSP client all it's fine: the video HD streaming >> result fine and not corrupt but, as soon as, i start another RTSP >> client (es. vlc) the video HD streaming result corrupt as in attached >> picture, WHY? > > Is your first client VLC also, or only your second client? > > If you're using VLC as a client, then be aware that - if your NAL > units are excessively large - the first few received frames *will* be > truncated. VLC notices this and increases its buffer size, so after a > few seconds should have increased its buffer size large enough. > (However, VLC is not our software, so we can't help you with > VLC-related problems.) > > > Ross Finlayson > Live Networks, Inc. 
> http://www.live555.com/ > > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From s.decarpentieri at abcimpianti.com Tue May 13 07:41:03 2014 From: s.decarpentieri at abcimpianti.com (Saverio ABC) Date: Tue, 13 May 2014 16:41:03 +0200 Subject: [Live-devel] Problem with RTSP streaming of HD Camera with Live555Proxyserver In-Reply-To: References: <5370AE21.2010705@abcimpianti.com> <5370BC4F.30706@abcimpianti.com> <5370EAC9.8060803@abcimpianti.com> Message-ID: <53722EFF.8070808@abcimpianti.com> Dear Ross, Why the first RTSP client works well and playback HD cam streaming very well while the second RTSP client can't playback video, the video result corrupt. it's the same NAL units for both RTSP client. Proxy server reads HD Megapixel Camera stream only once. In LAN i can view more than one connection at the same time...I i should have limit only on 100Mbps network.... S. > > now we have the same situation: > --> [RTSP client1] > [back-end RTSP/RTP stream] --> [LIVE555 Proxy Server] --> [RTSP client2] > ... > --> [RTSP clientN] > If I use only one RTSP client all it's fine: the video HD streaming > result fine and not corrupt but, as soon as, i start another RTSP > client (es. vlc) the video HD streaming result corrupt as in attached > picture, WHY? Is your first client VLC also, or only your second client? > > If you're using VLC as a client, then be aware that - if your NAL > units are excessively large - the first few received frames *will* be > truncated. VLC notices this and increases its buffer size, so after a > few seconds should have increased its buffer size large enough. > (However, VLC is not our software, so we can't help you with > VLC-related problems.) > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue May 13 10:24:13 2014 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 13 May 2014 10:24:13 -0700 Subject: [Live-devel] Problem with RTSP streaming of HD Camera with Live555Proxyserver In-Reply-To: <53722EFF.8070808@abcimpianti.com> References: <5370AE21.2010705@abcimpianti.com> <5370BC4F.30706@abcimpianti.com> <5370EAC9.8060803@abcimpianti.com> <53722EFF.8070808@abcimpianti.com> Message-ID: <9A6482FB-9E6D-4FFF-90F7-888164ACF7A3@live555.com> > Why the first RTSP client works well and playback HD cam streaming very well while the second RTSP client can't playback video, the video result corrupt. it's the same NAL units for both RTSP client. I've already told you why - it's because your two RTSP clients are different. Your non-VLC RTSP client is apparently able to handle excessively large NAL units, but VLC initially does not. It will initially truncate the (excessively large) I-frame NAL units, but should eventually resize its buffer to be able to handle these. This will be my last posting on this topic! Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From jan.ekholm at d-pointer.com Wed May 14 06:14:02 2014 From: jan.ekholm at d-pointer.com (Jan Ekholm) Date: Wed, 14 May 2014 16:14:02 +0300 Subject: [Live-devel] Logging or debug info In-Reply-To: References: <55C525BC-D86D-45F8-A2B9-09F77F270BBD@d-pointer.com> <6BE6CB07-D7D3-4363-9050-0A0FD5F29ABD@d-pointer.com> <8AFC4EEC-0370-40C1-B7E2-A190D8F0271E@live555.com> <990ED3C1-7707-429D-89EB-BB8BDB1064BC@live555.com> <59CFB309-BA62-42A2-8444-065A7A5619CC@d-pointer.com> <594DC213-C182-44DB-90C0-C8CE4CDEAAE7@d-pointer.com> Message-ID: <1A198611-C49D-4A64-AAD8-96E7D77CAB9D@d-pointer.com> On 7 maj 2014, at 21:28, Ross Finlayson wrote: >> Based on what I see it seems as if Live555 does not properly clean up the first session when the >> connection breaks. Instead it seems to close the wrong connection when the time out occurs. > > No, based on your description I'm not seeing any incorrect server behavior. The server is timing out the first connection (and deleting the "RTSPClientConnection" object) 65 seconds after you kill the first client (without having it send a "TEARDOWN"). The second client then causes a new "RTSPClientConnection" object to get created. (It happens to have the same address - 0x10c008000 - as the old "RTSPClientConnection" object, but it's a new object.) > > The reason why the server stops streaming to the second client after the first client has timed out is most likely due to a problem with your own additions to the server - i.e., either your own "OnDemandServerMediaMediaSubsession" subclass, and/or your own "FramedSource" subclass. > > First, when your "OnDemandServerMediaMediaSubsession" subclass's constructor calls the "OnDemandServerMediaMediaSubsession" constructor, then make sure that the "reuseFirstSource" parameter is "True". This ensures that - if two or more clients request the stream concurrently - then your "FramedSource" subclass (i.e., your data source class) will be created only once. > > Second, you must allow for the possibility of your "FramedSource" subclass object being closed, then recreated - perhaps multiple times. If you set "reuseFirstSource" to "True" (as noted above), then you will never see your "FramedSource" subclass constructor being called more than once before the destructor is called - but you may see "constructor", "destructor", "constructor", "destructor", etc. Your "FramedSource" subclass's constructors and destructors must allow for this to happen. > > Finally, if you want to test the behavior of your server, then you should first do so using our "openRTSP" client (note the "-t" option to request RTP/RTCP-over-TCP streaming), not with some random other client (that might use a different - perhaps broken - RTSP client implementation). Similarly, if you want to test the behavior of your own client, then you should first do so using "testOnDemandRTSPServer" or the "LIVE555 Media Server" as the server. This is common sense. If you want to test a client-server combination, then start with one combination that you know works OK ("testOnDemandRTSPServer" with "openRTSP"), then change only one end at a time as you continue your testing. I can not reproduce the same problem when using openRTSP or testOnDemandRTSPServer. My own client also behaves ok. I'm at a total loss as to what in my server code could cause only TCP to misbehave. I've spent enough time trying to debug that and must move on for now. I hope to find the problem later when I get more familiar with Live555, it's not an easy library to use. 
I'm sorry for the time I've wasted for all readers. One thing I did notice however that causes testOnDemandRTSPServer to freeze is to do like this: 1. start testOnDemandRTSPServer and stream something (in my case a H264 file) 2. start openRTSP #1 against the server 3. start openRTSP #2 against the server 4. suspend #1 through C-z to simulate a bad network connection 5. #2 will now not get any more data and the server is totally frozen until #1 is resumed -- Jan Ekholm jan.ekholm at d-pointer.com From finlayson at live555.com Wed May 14 09:05:39 2014 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 14 May 2014 09:05:39 -0700 Subject: [Live-devel] Logging or debug info In-Reply-To: <1A198611-C49D-4A64-AAD8-96E7D77CAB9D@d-pointer.com> References: <55C525BC-D86D-45F8-A2B9-09F77F270BBD@d-pointer.com> <6BE6CB07-D7D3-4363-9050-0A0FD5F29ABD@d-pointer.com> <8AFC4EEC-0370-40C1-B7E2-A190D8F0271E@live555.com> <990ED3C1-7707-429D-89EB-BB8BDB1064BC@live555.com> <59CFB309-BA62-42A2-8444-065A7A5619CC@d-pointer.com> <594DC213-C182-44DB-90C0-C8CE4CDEAAE7@d-pointer.com> <1A198611-C49D-4A64-AAD8-96E7D77CAB9D@d-pointer.com> Message-ID: <14943636-2A7A-419C-BEA8-91A2DEB50406@live555.com> > One thing I did notice however that causes testOnDemandRTSPServer to freeze is to do like this: > > 1. start testOnDemandRTSPServer and stream something (in my case a H264 file) > 2. start openRTSP #1 against the server > 3. start openRTSP #2 against the server > 4. suspend #1 through C-z to simulate a bad network connection > 5. #2 will now not get any more data and the server is totally frozen until #1 is resumed It turned out there was a bug in the way that the 'blocking TCP send() with timeout' hack was implemented in the previous revision. Please upgrade to the latest version: 2014.05.14 Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jan.ekholm at d-pointer.com Wed May 14 13:44:34 2014 From: jan.ekholm at d-pointer.com (Jan Ekholm) Date: Wed, 14 May 2014 23:44:34 +0300 Subject: [Live-devel] Logging or debug info In-Reply-To: <14943636-2A7A-419C-BEA8-91A2DEB50406@live555.com> References: <55C525BC-D86D-45F8-A2B9-09F77F270BBD@d-pointer.com> <6BE6CB07-D7D3-4363-9050-0A0FD5F29ABD@d-pointer.com> <8AFC4EEC-0370-40C1-B7E2-A190D8F0271E@live555.com> <990ED3C1-7707-429D-89EB-BB8BDB1064BC@live555.com> <59CFB309-BA62-42A2-8444-065A7A5619CC@d-pointer.com> <594DC213-C182-44DB-90C0-C8CE4CDEAAE7@d-pointer.com> <1A198611-C49D-4A64-AAD8-96E7D77CAB9D@d-pointer.com> <14943636-2A7A-419C-BEA8-91A2DEB50406@live555.com> Message-ID: <131D83A2-74E2-4E3D-8A80-240F1C5373A4@d-pointer.com> On 14 maj 2014, at 19:05, Ross Finlayson wrote: >> One thing I did notice however that causes testOnDemandRTSPServer to freeze is to do like this: >> >> 1. start testOnDemandRTSPServer and stream something (in my case a H264 file) >> 2. start openRTSP #1 against the server >> 3. start openRTSP #2 against the server >> 4. suspend #1 through C-z to simulate a bad network connection >> 5. #2 will now not get any more data and the server is totally frozen until #1 is resumed > > It turned out there was a bug in the way that the 'blocking TCP send() with timeout' hack was implemented in the previous revision. > > Please upgrade to the latest version: 2014.05.14 I think the same bug was discussed here in another thread last week, so I actually guessed it was already fixed. But you never know if it's something different. I will update, thanks! 
-- Jan Ekholm jan.ekholm at d-pointer.com From nambirajan.manickam at i-velozity.com Fri May 16 05:21:01 2014 From: nambirajan.manickam at i-velozity.com (Nambirajan) Date: Fri, 16 May 2014 17:51:01 +0530 Subject: [Live-devel] Trick play Message-ID: <004401cf7101$50e059a0$f2a10ce0$@manickam@i-velozity.com> Hi Ross, We have been checking the trick play with the Live555 Server. We found that when doing the FFW or REW, The PID values of the stream id, PMT and Video Elementary Stream are changed to different values. Do we need to change the same. Can we revert it to the original PID values. Can you please suggest as. Thanks and regards, M. Nambirajan -------------- next part -------------- An HTML attachment was scrubbed... URL: From jan.ekholm at d-pointer.com Fri May 16 13:25:17 2014 From: jan.ekholm at d-pointer.com (Jan Ekholm) Date: Fri, 16 May 2014 23:25:17 +0300 Subject: [Live-devel] A lesson learned: Groupsock Message-ID: <6939495B-C1A8-45DD-AE5E-09CEE284F7EF@d-pointer.com> Hi, Just wanted to share a small caveat that's been biting me for several days. It's not a Live555 bug, but something that's easy to do wrong in any kind of application that is a bit more complex than the sample apps. Most of the examples that do multicast streaming have this code: const unsigned short rtpPortNum = 18888; const Port rtpPort(rtpPortNum); Groupsock rtpGroupsock(*env, destinationAddress, rtpPort, ttl); rtpGroupsock.multicastSendOnly(); // we're a SSM source This sets up the Groupsock, there's usually some more, but that's irrelevant. Then the rtpGroupsock instance is used like this: videoSink = H264VideoRTPSink::createNew(*env, &rtpGroupsock, 96); This is all fine as long as control stays in the same method/function until the streaming is done! In my case I was setting up the sinks in a method and then returning to my main loop which handled the Live555 loop. What happens here is that rtpGroupsock is a local variable passed as a pointer to createNew(). We all know what happens when it goes out of scope. In my case I didn't see any crashes, only "bad file descriptor" errors for days that had me debugging all kinds of code. It's not particularly clear when Live555 takes ownership of something or gives it to your code. Smart pointers would be the way to go here, of course. Just in case someone googles for "bad file descriptor", this could help. -- Jan Ekholm jan.ekholm at d-pointer.com From jan.ekholm at d-pointer.com Fri May 16 13:36:26 2014 From: jan.ekholm at d-pointer.com (Jan Ekholm) Date: Fri, 16 May 2014 23:36:26 +0300 Subject: [Live-devel] Fixing "The total received frame size exceeds the client's buffer size" Message-ID: Hi, So I see the error below in my code that works like a proxy. It receives a stream from a remote camera, optionally does some magic with it and then multicasts it onwards. The multicasting is done using a PassiveServerMediaSubSession, H264VideoRTPSink and H264VideoStreamFramer in addition to what MediaSubsession->initiate() sets up. I see a lot of: MultiFramedRTPSource::doGetNextFrame1(): The total received frame size exceeds the client's buffer size (693). 1023 bytes of trailing data will be dropped! MultiFramedRTPSource::doGetNextFrame1(): The total received frame size exceeds the client's buffer size (794). 959 bytes of trailing data will be dropped! MultiFramedRTPSource::doGetNextFrame1(): The total received frame size exceeds the client's buffer size (490). 1329 bytes of trailing data will be dropped! 
MultiFramedRTPSource::doGetNextFrame1(): The total received frame size exceeds the client's buffer size (1269). 685 bytes of trailing data will be dropped! The final recipient does not get any data (tested with testRTSPClient). If I change the H264VideoStreamFramer to a H264VideoStreamDiscreteFramer then I see none of the above errors and then testRTSPClient gets data, but it is not playable by, say, VLC. If I have openRTSP save the stream and then stream the saved file with testH264VideoStreamer, then I get nothing streamed at all. So the data seems corrupt somehow. Anyway, how are the buffers *really* set? I've set OutPacketBuffer::maxSize to all kinds of huge values, even in the RTPSink code, but that has no effect. The buffer handling is a little bit clumsy, if I may be honest. Lots of email threads say to set the buffer size using some bufferSize parameter to the sink (the error is from a source) or use OutPacketBuffer::maxSize, but none works. -- Jan Ekholm jan.ekholm at d-pointer.com From finlayson at live555.com Fri May 16 14:10:37 2014 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 16 May 2014 14:10:37 -0700 Subject: [Live-devel] Trick play In-Reply-To: <004401cf7101$50e059a0$f2a10ce0$@manickam@i-velozity.com> References: <004401cf7101$50e059a0$f2a10ce0$@manickam@i-velozity.com> Message-ID: <30E8A0DB-0BD0-443F-87F1-BDDACE9C8C2C@live555.com> > We have been checking the trick play with the Live555 Server. We found that when doing the FFW or REW, The PID values of the stream id, PMT and Video Elementary Stream are changed to different values. Strictly speaking, the trick play mechanism isn't 'changing' the PID, because it's not reusing the original Transport Stream. Instead, it is creating a whole new Transport Stream, using only the video content that appeared in the original Transport Stream. Therefore - as a whole new Transport Stream - the 'trick play' stream uses its own PIDs, which have no relationship to whatever PIDs happened to have been in the original. This should not be a problem, because the new 'trick play' transport stream has proper PAT and PMT tables, and therefore compliant receivers should have no problem playing it. > Can we revert it to the original PID values. No. This would be non-trivial to do, and there are no plans to do so. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri May 16 14:23:20 2014 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 16 May 2014 14:23:20 -0700 Subject: [Live-devel] Fixing "The total received frame size exceeds the client's buffer size" In-Reply-To: References: Message-ID: > If I change the H264VideoStreamFramer to a H264VideoStreamDiscreteFramer Yes, you MUST do this, because your input source (from the RTSP client's "H264VideoRTPSource") consists of a sequence of discrete NAL units, rather than an unstructured byte stream. I.e., you MUST feed your RTSP client "MediaSubsession"s "readSource()" into a "H264VideoStreamDiscreteFramer", NOT a "H264VideoStreamFramer". > then I see none > of the above errors and then testRTSPClient gets data, but it is not playable by, say, VLC. That's because your "H264VideoRTPSink" does not know the H.264 SPS and PPS NAL units for the stream. To fix this, create your "H264VideoRTPSink" using one of the "createNew()" versions that take the SPS/PPS NAL units as parameter. E.g. 
H264VideoRTPSink::createNew(envir(), rtpGroupsock, 96, clientMediaSubsession.fmtp_spropparametersets()); I.e., in this case you're taking the encoded SPS/PPS NAL unit information that your RTSP client already got from the downstream server (camera). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From nambirajan.manickam at i-velozity.com Fri May 16 18:13:40 2014 From: nambirajan.manickam at i-velozity.com (Nambirajan) Date: Sat, 17 May 2014 06:43:40 +0530 Subject: [Live-devel] Trick play In-Reply-To: <30E8A0DB-0BD0-443F-87F1-BDDACE9C8C2C@live555.com> References: <004401cf7101$50e059a0$f2a10ce0$@manickam@i-velozity.com> <30E8A0DB-0BD0-443F-87F1-BDDACE9C8C2C@live555.com> Message-ID: <003601cf716d$3f0db2b0$bd291810$@manickam@i-velozity.com> Hi Ross, Thanks for your reply. If I need to change back to original PID's in the FFW /REW streamed TS, in which file I should make changes. Can you please help me out in this. Thanks and regards, M. Nambirajan From: live-devel [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Saturday, May 17, 2014 2:41 AM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] Trick play We have been checking the trick play with the Live555 Server. We found that when doing the FFW or REW, The PID values of the stream id, PMT and Video Elementary Stream are changed to different values. Strictly speaking, the trick play mechanism isn't 'changing' the PID, because it's not reusing the original Transport Stream. Instead, it is creating a whole new Transport Stream, using only the video content that appeared in the original Transport Stream. Therefore - as a whole new Transport Stream - the 'trick play' stream uses its own PIDs, which have no relationship to whatever PIDs happened to have been in the original. This should not be a problem, because the new 'trick play' transport stream has proper PAT and PMT tables, and therefore compliant receivers should have no problem playing it. Can we revert it to the original PID values. No. This would be non-trivial to do, and there are no plans to do so. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jan.ekholm at d-pointer.com Sat May 17 00:35:46 2014 From: jan.ekholm at d-pointer.com (Jan Ekholm) Date: Sat, 17 May 2014 10:35:46 +0300 Subject: [Live-devel] Fixing "The total received frame size exceeds the client's buffer size" In-Reply-To: References: Message-ID: <94A27AC7-529A-4F4D-84BC-BFCF02A9D97F@d-pointer.com> On 17 maj 2014, at 00:23, Ross Finlayson wrote: >> If I change the H264VideoStreamFramer to a H264VideoStreamDiscreteFramer > > Yes, you MUST do this, because your input source (from the RTSP client's "H264VideoRTPSource") consists of a sequence of discrete NAL units, rather than an unstructured byte stream. I.e., you MUST feed your RTSP client "MediaSubsession"s "readSource()" into a "H264VideoStreamDiscreteFramer", NOT a "H264VideoStreamFramer". > > >> then I see none >> of the above errors and then testRTSPClient gets data, but it is not playable by, say, VLC. > > That's because your "H264VideoRTPSink" does not know the H.264 SPS and PPS NAL units for the stream. > > To fix this, create your "H264VideoRTPSink" using one of the "createNew()" versions that take the SPS/PPS NAL units as parameter. E.g. 
> H264VideoRTPSink::createNew(envir(), rtpGroupsock, 96, clientMediaSubsession.fmtp_spropparametersets());
> I.e., in this case you're taking the encoded SPS/PPS NAL unit information that your RTSP client already got from the downstream server (camera).

With those two changes it works. Thank you for the help. There is btw no H264VideoRTPSink::createNew() version that takes the arguments you mention, you need to add the profile-level-id too, but it seems to work ok by using the value from: unsigned int profileLevelId = m_subsession->attrVal_int( "profile-level-id" );

-- Jan Ekholm jan.ekholm at d-pointer.com

From finlayson at live555.com Sat May 17 03:09:38 2014 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 17 May 2014 03:09:38 -0700 Subject: Re: [Live-devel] Fixing "The total received frame size exceeds the client's buffer size" In-Reply-To: <94A27AC7-529A-4F4D-84BC-BFCF02A9D97F@d-pointer.com> References: <94A27AC7-529A-4F4D-84BC-BFCF02A9D97F@d-pointer.com> Message-ID: <099A94E8-733E-4A96-B482-A1392D14805F@live555.com>

> With those two changes it works. Thank you for the help. There is btw no H264VideoRTPSink::createNew() version that takes > the arguments you mention, you need to add the profile-level-id too

No, you're using an old version of the code. You should upgrade!

Ross Finlayson Live Networks, Inc. http://www.live555.com/

-------------- next part -------------- An HTML attachment was scrubbed... URL:

From newsletters at thetoolwiz.com Mon May 19 12:11:47 2014 From: newsletters at thetoolwiz.com (David Schwartz) Date: Mon, 19 May 2014 12:11:47 -0700 Subject: [Live-devel] channel mapping question Message-ID:

I'm trying to figure out track channel mappings in SDP payloads. The DESCRIBE command to an RTSP server is giving me the following SDP data (shortened a bit for brevity):

. . .
m=video 0 RTP/AVP 96
a=3GPP-Adaptation-Support:1
a=rtpmap:96 H264/90000
a=fmtp:96 profile-level-id=42E00C; sprop-parameter-sets=Z0LgDJZSAoP2AiAAAXcAACvyEIA=,aM44gA==; packetization-mode=1
a=control:trackID=1
m=audio 0 RTP/AVP 97
a=3GPP-Adaptation-Support:1
a=rtpmap:97 MP4A-LATM/16000/1
a=fmtp:97 profile-level-id=15; object=2; cpresent=0; config=400028103FC0; SBR-enabled=1
a=control:trackID=2

So I send two SETUP commands to the server:

SETUP rtsp://some.url/trackID=1 RTSP/1.0
CSeq: 4
User-Agent: LibVLC/2.1.3 (LIVE555 Streaming Media v2014.01.21)
Transport: RTP/AVP/TCP;unicast;interleaved=0-1

SETUP rtsp://some.url/trackID=2 RTSP/1.0
CSeq: 5
User-Agent: LibVLC/2.1.3 (LIVE555 Streaming Media v2014.01.21)
Transport: RTP/AVP/TCP;unicast;interleaved=2-3
Session: 8571189255046566924

So, let's say I have an array (or indexed list) of items that contain the "m=xxx" sections of the SDP data. The first entry is item[0] and the second is item[1]. We can see that item[0]'s value for trackID = 1, while item[1]'s value for trackID = 2. When I start receiving the data, and the code examines what follows the '$' in the payload, I'm seeing two values:

for trackID=1 -> channel# = 0
for trackID=2 -> channel# = 2

When I was playing around with just the FIRST (video) SETUP command (without the audio SETUP), I observed messages in the RTP payload for BOTH channel# = 0 AND channel# = 1, although I don't know what the latter data was for. I'm trying to figure out if I'm always going to see channel# = 0 / 2 for both video + audio, or if I'll ever have to deal with the odd channel numbers. Also, how do I map the trackID# to the channel# seen in the SDP data blocks?
For now, I?m doing it manually, just mapping trackID=1 ?> channel# 0 and trackID=2 ?> channel# 2. Is this safe to do in the long-run? And ? what about the odd channel#?s? -David From finlayson at live555.com Mon May 19 12:32:19 2014 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 19 May 2014 12:32:19 -0700 Subject: [Live-devel] channel mapping question In-Reply-To: References: Message-ID: <635C7D7E-D46E-4118-8FF9-2124AABA996A@live555.com> Note that this mailing list is specifically for questions about/discussion of the "LIVE555 Streaming Media" software. It is not intended for general questions about RTSP, RTP or RTCP. If your organization is interested in hiring me as a consultant/technical advisor for general architectural issues regarding streaming media, RTSP/RTP/RTCP etc., then please contact me via separate email. Nonetheless... > So I send two SETUP commands to the server: > > SETUP rtsp://some.url/trackID=1 RTSP/1.0 > CSeq: 4 > User-Agent: LibVLC/2.1.3 (LIVE555 Streaming Media v2014.01.21) > Transport: RTP/AVP/TCP;unicast;interleaved=0-1 If you are using VLC (or 'LibVLC') as your RTSP client, then the RTSP protocol exchange, and RTP/RTCP packet reception and media rendering is taken care of automatically. You do not need to know or care about the specific details of how this all works. But if instead you are developing your own RTSP client application - that doesn't use 'LibVLC' - then you should not include a false "User-Agent:" header that claims otherwise. (If you're not using the "LIVE555 Streaming Media" software - or VLC/LibVLC, which uses the "LIVE555 Streaming Media" software, then you definitely should not be claiming to be using it, because "LIVE555" is a trademark of "Live Networks, Inc.".) > When I was playing around with just the FIRST (video) SETUP command (without the audio SETUP), I observed messages in RTP payload for BOTH channel# = 0 AND channel# = 1, although I don?t know what the latter data was for. The 'odd-numbered' channels are for RTCP packets. Again, if you're using the "LIVE555 Streaming Media" software (or VLC/'LibVLC'), then RTCP is implemented automatically, and you don't need to care about it. But again, if you're *not* using the "LIVE555 Streaming Media" software, then this mailing list is not appropriate. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jan.ekholm at d-pointer.com Mon May 19 01:46:33 2014 From: jan.ekholm at d-pointer.com (Jan Ekholm) Date: Mon, 19 May 2014 11:46:33 +0300 Subject: [Live-devel] Fixing "The total received frame size exceeds the client's buffer size" In-Reply-To: <099A94E8-733E-4A96-B482-A1392D14805F@live555.com> References: <94A27AC7-529A-4F4D-84BC-BFCF02A9D97F@d-pointer.com> <099A94E8-733E-4A96-B482-A1392D14805F@live555.com> Message-ID: <8584B7E3-15EC-4FDA-8078-8B0FA7FBF3F7@d-pointer.com> On 17 maj 2014, at 13:09, Ross Finlayson wrote: >> With those two changes it works. Thank you for the help. There is btw no H264VideoRTPSink::createNew() version that takes >> the arguments you mention, you need to add the profile-level-id too > > No, you're using an old version of the code. You should upgrade! Aha, I see that the new version has removed the parameter. It was a bit redundant. Are there any changelogs for changes like these? 
-- Jan Ekholm jan.ekholm at d-pointer.com From jan.ekholm at d-pointer.com Mon May 19 06:11:01 2014 From: jan.ekholm at d-pointer.com (Jan Ekholm) Date: Mon, 19 May 2014 16:11:01 +0300 Subject: [Live-devel] Making ProxyServerMediaSubsession stream to multicast addresses In-Reply-To: References: <9D564B6A-DE0F-4418-8A6A-CE8203BAD2EE@d-pointer.com> Message-ID: <4E0677A5-3CB6-4952-ADF6-EFB121BA9B81@d-pointer.com> On 6 apr 2014, at 22:54, Ross Finlayson wrote: > >> Also I probably want to also be able to decode the incoming stream in order to grab >> still images from it and do some manipulation on it before re-encoding it again. Is there >> some good way to "duplicate" the incoming stream into my custom code for doing the >> image manipulation while at the same time keeping the normal proxied stream working? > > Yes, you can use the "StreamReplicator" class. (Note the "testReplicator" demo application - in "testProgs" - that illustrates how to use this.) > > If you use the mechanism that I suggested above (piping "openRTSP" into a modified "testH264VideoStreamer"), then you could update the "testH264VideoStreamer" some more, by feeding the "StreamReplicator" from the input "ByteStreamFileSource" (from "stdin"), create two replicas, then feed one replica into the "H264VideoStreamFramer" (and thus the "H264VideoRTPSink", for streaming), and feed another replica into your decoder. Finally got to the point where I'm trying to save a backup of the received stream using StreamReplicator. It's not going too well, as expected. :) I receive a stream from a remote camera with code based on testRTSPClient. Internally I use a RTSPClient instance and have a very similar layout to the original code. void RemoteRTSPClient::setupCallback () { ... // set up the sink used to multicast the stream further H264VideoRTPSink * sink = H264VideoRTPSink::createNew( envir(), m_rtpGroupsock, 96, m_subsession->fmtp_spropparametersets() ); m_subsession->sink = sink; // create the server media session with suitable meta data ServerMediaSession* sms = ServerMediaSession::createNew( ... ); sms->addSubsession( PassiveServerMediaSubsession::createNew( *sink, 0 ) ); rtpsServer->addSession( sms ); // set up a framer as the input consists of a sequence of discrete NAL units and not a byte stream H264VideoStreamDiscreteFramer * framer = H264VideoStreamDiscreteFramer::createNew( envir(), m_subsession->readSource() ); // create a replicator to allow us to copy the stream m_replicator = StreamReplicator::createNew( envir(), framer, False ); if ( ! m_subsession->sink->startPlaying( *m_replicator->createStreamReplica(), ... ) ) { // error .... } // start the stream sendPlayCommand( *m_session, RemoteRTSPClient::playCallback ); } Here I only try to get the original stream to work through the replicator. Based on the testReplicator example all streams must be through a replicator, it's not possible to have one stream use the replicator and one stream use the normal way. That results in errors here: void FramedSource::getNextFrame(... if (fIsCurrentlyAwaitingData) { envir() << "FramedSource[" << this << "]::getNextFrame(): attempting to read more than once at the same time!\n"; envir().internalError(); } If I change it to be: if ( ! m_subsession->sink->startPlaying( *framer, ... ) ) { then it works ok, but the replicator can then not be used at all due to the above check in FramedSource. Probably there's some reorganization that needs to be done, but I've tried quite a few permutations of the code and nothing works. 
The one that almost works is: m_replicator = StreamReplicator::createNew( envir(), m_subsession->readSource(), False ); H264VideoStreamDiscreteFramer * discreteFramer = H264VideoStreamDiscreteFramer::createNew( envir(), m_replicator->createStreamReplica() ); if ( ! m_subsession->sink->startPlaying( * discreteFramer, ... ) ) { .... } This does deliver a video stream, but it's unwatchable. The individual images are ok, but the video is stuttering back and forth, as if it was zig-zagging over the time line. There is not too much documentation for StreamReplicator and the provided example is quite limited. -- Jan Ekholm jan.ekholm at d-pointer.com From finlayson at live555.com Mon May 19 13:29:44 2014 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 19 May 2014 13:29:44 -0700 Subject: [Live-devel] Fixing "The total received frame size exceeds the client's buffer size" In-Reply-To: <8584B7E3-15EC-4FDA-8078-8B0FA7FBF3F7@d-pointer.com> References: <94A27AC7-529A-4F4D-84BC-BFCF02A9D97F@d-pointer.com> <099A94E8-733E-4A96-B482-A1392D14805F@live555.com> <8584B7E3-15EC-4FDA-8078-8B0FA7FBF3F7@d-pointer.com> Message-ID: > Are there any changelogs for changes like these? http://www.live555.com/liveMedia/public/changelog.txt Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon May 19 13:50:09 2014 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 19 May 2014 13:50:09 -0700 Subject: [Live-devel] Making ProxyServerMediaSubsession stream to multicast addresses In-Reply-To: <4E0677A5-3CB6-4952-ADF6-EFB121BA9B81@d-pointer.com> References: <9D564B6A-DE0F-4418-8A6A-CE8203BAD2EE@d-pointer.com> <4E0677A5-3CB6-4952-ADF6-EFB121BA9B81@d-pointer.com> Message-ID: <0A723E50-A748-4C57-B396-1CBD3700467C@live555.com> > Here I only try to get the original stream to work through the replicator. Based on the testReplicator example > all streams must be through a replicator, it's not possible to have one stream use the replicator and one > stream use the normal way. That's correct. Only one object can be reading from a "FramedSource". (Otherwise, you'll see the "attempting to read more than once at the same time!" error message.) That's the whole reason why I developed the "StreamReplicator" class: To make it possible for data from a "FramedSource" to be delivered to more than one object. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From newsletters at thetoolwiz.com Mon May 19 14:49:13 2014 From: newsletters at thetoolwiz.com (David Schwartz) Date: Mon, 19 May 2014 14:49:13 -0700 Subject: [Live-devel] channel mapping question In-Reply-To: <635C7D7E-D46E-4118-8FF9-2124AABA996A@live555.com> References: <635C7D7E-D46E-4118-8FF9-2124AABA996A@live555.com> Message-ID: My software isn?t working properly yet. These traces were intended to illustrate the problem, not the solution, and did not come from my software, which is an incorrect assumption on your part. Regardless, my question was not about headers, but SDP data that IS being returned from a LIVE555 server. Being that this is a discussion group for LIVE555 server users, I figured it?s an appropriate question. I?m sorry if I was mistaken. Any recommendations where I can post such questions where I?ll get more helpful replies? 
-David On May 19, 2014, at 12:32 PM, Ross Finlayson wrote: > Note that this mailing list is specifically for questions about/discussion of the "LIVE555 Streaming Media" software. It is not intended for general questions about RTSP, RTP or RTCP. If your organization is interested in hiring me as a consultant/technical advisor for general architectural issues regarding streaming media, RTSP/RTP/RTCP etc., then please contact me via separate email. > > > Nonetheless... >> So I send two SETUP commands to the server: >> >> SETUP rtsp://some.url/trackID=1 RTSP/1.0 >> CSeq: 4 >> User-Agent: LibVLC/2.1.3 (LIVE555 Streaming Media v2014.01.21) >> Transport: RTP/AVP/TCP;unicast;interleaved=0-1 > > If you are using VLC (or 'LibVLC') as your RTSP client, then the RTSP protocol exchange, and RTP/RTCP packet reception and media rendering is taken care of automatically. You do not need to know or care about the specific details of how this all works. > > But if instead you are developing your own RTSP client application - that doesn't use 'LibVLC' - then you should not include a false "User-Agent:" header that claims otherwise. (If you're not using the "LIVE555 Streaming Media" software - or VLC/LibVLC, which uses the "LIVE555 Streaming Media" software, then you definitely should not be claiming to be using it, because "LIVE555" is a trademark of "Live Networks, Inc.".) > > >> When I was playing around with just the FIRST (video) SETUP command (without the audio SETUP), I observed messages in RTP payload for BOTH channel# = 0 AND channel# = 1, although I don?t know what the latter data was for. > > The 'odd-numbered' channels are for RTCP packets. Again, if you're using the "LIVE555 Streaming Media" software (or VLC/'LibVLC'), then RTCP is implemented automatically, and you don't need to care about it. But again, if you're *not* using the "LIVE555 Streaming Media" software, then this mailing list is not appropriate. > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon May 19 20:03:53 2014 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 19 May 2014 20:03:53 -0700 Subject: [Live-devel] channel mapping question In-Reply-To: References: <635C7D7E-D46E-4118-8FF9-2124AABA996A@live555.com> Message-ID: > Regardless, my question was not about headers, but SDP data that IS being returned from a LIVE555 server. > > Being that this is a discussion group for LIVE555 server users, I figured it?s an appropriate question. No, your question was a general question about RTSP server behavior; not a question specifically about our implementation. General questions about RTSP should be posted to the mailing list for the IETF's "MMUSIC" working group (the group that standardized RTSP). But first, you should read RFC 2326 (which describes the RTSP standard). That will probably answer your question(s). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
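For readers trying to follow the channel-number discussion in this thread: the trackID-to-channel mapping is simply whatever each SETUP request asked for in its "Transport: ... interleaved=" parameter (0-1 for the first track and 2-3 for the second in the SETUP requests quoted above), and RFC 2326 (section 10.12) defines how those channels are framed on the TCP connection. The sketch below is illustration only - it is not LIVE555 code - and assumes a plain, already-connected TCP socket:

// Sketch (not LIVE555 code): reading one interleaved data frame from an
// RTSP-over-TCP connection, per RFC 2326 section 10.12.
// Framing: '$' (0x24), a 1-byte channel id, a 2-byte big-endian length,
// then the payload. The channel id is one of the numbers requested in the
// SETUP's "interleaved=" pair: the even number carries RTP, the odd one RTCP.
#include <cstdint>
#include <vector>
#include <sys/types.h>
#include <sys/socket.h>

static bool readExactly(int sock, uint8_t* buf, size_t len) {
  size_t got = 0;
  while (got < len) {
    ssize_t n = recv(sock, buf + got, len - got, 0);
    if (n <= 0) return false; // connection closed or error
    got += (size_t)n;
  }
  return true;
}

bool readInterleavedFrame(int sock, uint8_t& channelId, std::vector<uint8_t>& payload) {
  uint8_t header[4];
  if (!readExactly(sock, header, 4)) return false;
  if (header[0] != '$') return false; // not interleaved data (e.g. an RTSP response line)
  channelId = header[1];              // e.g. 0 or 2 -> RTP, 1 or 3 -> RTCP
  uint16_t length = (uint16_t)((header[2] << 8) | header[3]);
  payload.resize(length);
  return readExactly(sock, payload.data(), length);
}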
URL: From newsletters at thetoolwiz.com Mon May 19 21:49:33 2014 From: newsletters at thetoolwiz.com (David Schwartz) Date: Mon, 19 May 2014 21:49:33 -0700 Subject: [Live-devel] channel mapping question In-Reply-To: References: <635C7D7E-D46E-4118-8FF9-2124AABA996A@live555.com> Message-ID: <345BDFB1-A069-4D0A-ADC4-5738FB131E7C@thetoolwiz.com> I?ve been through every RFC I can find that seems relevant, some several times. In particular, what I?m NOT finding is any discussion about all of the codec-specific and so-called ?out-of-band? details that these specs DELIBERATELY avoid ? which happens to be exactly what I?m having trouble with. Since you generally need to test client software against SOME server software, I?d imagine that server venders like you guys would have some passing desire to support such development. It seems I?m mistaken. I?ll remember that when customers ask for recommendations for a server to work with my software. Sorry to be such a bother. I?ll unsubscribe from this list as it seems my kind aren?t welcome. -David On May 19, 2014, at 8:03 PM, Ross Finlayson wrote: >> Regardless, my question was not about headers, but SDP data that IS being returned from a LIVE555 server. >> >> Being that this is a discussion group for LIVE555 server users, I figured it?s an appropriate question. > > No, your question was a general question about RTSP server behavior; not a question specifically about our implementation. > > General questions about RTSP should be posted to the mailing list for the IETF's "MMUSIC" working group (the group that standardized RTSP). > > But first, you should read RFC 2326 (which describes the RTSP standard). That will probably answer your question(s). > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From jan.ekholm at d-pointer.com Tue May 20 00:01:09 2014 From: jan.ekholm at d-pointer.com (Jan Ekholm) Date: Tue, 20 May 2014 10:01:09 +0300 Subject: [Live-devel] Making ProxyServerMediaSubsession stream to multicast addresses In-Reply-To: <0A723E50-A748-4C57-B396-1CBD3700467C@live555.com> References: <9D564B6A-DE0F-4418-8A6A-CE8203BAD2EE@d-pointer.com> <4E0677A5-3CB6-4952-ADF6-EFB121BA9B81@d-pointer.com> <0A723E50-A748-4C57-B396-1CBD3700467C@live555.com> Message-ID: <04441E25-DFEF-4284-A6B5-AFAB982DD388@d-pointer.com> On 19 maj 2014, at 23:50, Ross Finlayson wrote: >> Here I only try to get the original stream to work through the replicator. Based on the testReplicator example >> all streams must be through a replicator, it's not possible to have one stream use the replicator and one >> stream use the normal way. > > That's correct. Only one object can be reading from a "FramedSource". (Otherwise, you'll see the "attempting to read more than once at the same time!" error message.) That's the whole reason why I developed the "StreamReplicator" class: To make it possible for data from a "FramedSource" to be delivered to more than one object. It was a bit unclear that the StreamReplicator must be a central part of the source-filter-sink chain, instead of just "hooking in" to some part. However, as soon as I add the StreamReplicator the video starts to stutter. I leave it out, the stream is fine. I don't actually do anything secondary yet, just trying to get the main stream working through it. 
To me it looks like frames get rendered like: 1 2 3 4 5 3 4 5 6 7 5 6 7 8 ... The stutter is not consistent though and does not happen every time I test, but it's there. The stutter happens with: m_replicator = StreamReplicator::createNew( envir(), m_subsession->readSource(), False ); H264VideoStreamDiscreteFramer * framer = H264VideoStreamDiscreteFramer::createNew( envir(), m_replicator->createStreamReplica() ); if ( ! m_subsession->sink->startPlaying( *framer, ... ) { Simply changing that to: H264VideoStreamDiscreteFramer * framer = H264VideoStreamDiscreteFramer::createNew( envir(), m_subsession->readSource() ); if ( ! m_subsession->sink->startPlaying( *framer, ... ) { to remove StreamReplicator from the equation and there is no stutter anymore. Switching where the replicator is like this: H264VideoStreamDiscreteFramer * framer = H264VideoStreamDiscreteFramer::createNew( envir(), m_subsession->readSource() ); m_replicator = StreamReplicator::createNew( envir(), framer, False ); if ( ! m_subsession->sink->startPlaying( *m_replicator->createStreamReplica(), ... ) { means no video is delivered at all anymore. So the replicator does something fishy, it's not a 1:1 pass through and duplicator. -- Jan Ekholm jan.ekholm at d-pointer.com From john95018 at gmail.com Wed May 21 09:59:29 2014 From: john95018 at gmail.com (john dicostanzo) Date: Wed, 21 May 2014 22:29:29 +0530 Subject: [Live-devel] valgrind showing memory leaks in Indexer Message-ID: Hi, I am using Live555 library for my vod server, vod server create index file and announce transport stream. For Indexing, I am using code as reference from MPEG2TransportStreamIndexer.cpp but when i close and delete all the dynamic objects,valgrind shows memory leaks on it. Leaks: ==29177== 80(16 direct,64 indirect) bytes in 1 block are definitely lost in loss record 3 of 3 ==29177== at 0x402BA13: operator new(unsigned int)(vg_replace_malloc.c:313) ==29177== by 0x8055B67: _Tables::getOurTables(UsageEnvironment &,unsigned char) ==29177== by 0x6524B67: MediaLookupTables::ourMedia(UsageEnvironment) ==29177== by 0x5874C6A: Medium::close(UsageEnvironment &,char const *) ==29177== by 0x424CCB7: Medium::close(Medium*) ==29177== by 0x5345C67: FramedFilter::~FramedFilter() ==29177== by 0x5345C67: MPEG2IFrameIndexFromTransportStream::~MPEG2IFrameIndexFromTransportStream() ==29177== by 0x5345C67: Medium::close(Medium*) I don't know, why its creating new dynamic object in Medium::close(operator new(unsigned int)(vg_replace_malloc.c:313)) Thanks, John -------------- next part -------------- An HTML attachment was scrubbed... URL: From frombach002 at gmail.com Thu May 22 09:11:12 2014 From: frombach002 at gmail.com (Alix Frombach) Date: Thu, 22 May 2014 12:11:12 -0400 Subject: [Live-devel] Issues streaming H.264 video from camera to a file Message-ID: I am experiencing issues when saving H.264 video directly from a camera to a file sink (H264VideoFileSink) involving some corrupt frames where the forbidden bit is being set to 1. I have tested with other formats (MP4, AVI) and there is no issue, I have also used wireshark to confirm the data coming from the camera does not have any errors. Has anyone else experienced this? Is this a known issue? -------------- next part -------------- An HTML attachment was scrubbed... 
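One way to narrow down where the corruption described above is introduced is to check the forbidden_zero_bit of every NAL unit at the point where it is handed to the file sink. A minimal sketch follows; the function name is made up for illustration, and it assumes it is given one discrete NAL unit (no start code), which is how LIVE555 delivers received H.264 data:

// Sketch: sanity-check one received H.264 NAL unit before writing it to the file.
// For a discrete NAL unit (no start code) the first byte is the NAL header:
//   forbidden_zero_bit (1 bit) | nal_ref_idc (2 bits) | nal_unit_type (5 bits)
// A forbidden_zero_bit of 1 means the unit was damaged somewhere upstream.
#include <cstdio>

void checkNalUnit(const unsigned char* nal, unsigned size) {
  if (size == 0) return;
  unsigned forbiddenBit = (nal[0] & 0x80) >> 7;
  unsigned nalType = nal[0] & 0x1F; // 7 = SPS, 8 = PPS, 5 = IDR slice
  if (forbiddenBit != 0) {
    fprintf(stderr, "Corrupt NAL unit (type %u, %u bytes): forbidden_zero_bit is set\n",
            nalType, size);
  }
}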
URL: From finlayson at live555.com Thu May 22 21:25:19 2014 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 22 May 2014 21:25:19 -0700 Subject: [Live-devel] Issues streaming H.264 video from camera to a file In-Reply-To: References: Message-ID: <3C16E73B-6584-4904-B72E-81E421108D46@live555.com> > I am experiencing issues when saving H.264 video directly from a camera to a file sink (H264VideoFileSink) What do you mean by saving "directly from a camera"? Are you recording from a RTSP/RTP stream - e.g., using our "openRTSP" demo application? In this case, "H264VideoFileSink" does not inspect the incoming H.264 NAL units; it just writes them to a file (with the "0x00 0x00 0x00 0x01" 'start code' in front). Or are you (somehow) reading H.264 NAL units from the camera without using RTSP/RTP? If you're doing this, note that the data that you feed to "H264VideoFileSink" *must* consist of discrete NAL units (i.e., one at a time), *without* any preceding 'start code'. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From john95018 at gmail.com Fri May 23 10:51:36 2014 From: john95018 at gmail.com (john dicostanzo) Date: Fri, 23 May 2014 23:21:36 +0530 Subject: [Live-devel] valgrind showing memory leaks in Indexer In-Reply-To: References: Message-ID: Hi Ross, I have modified the Indexer utility (MPEG2TransportStreamIndexer.cpp) to delete all the dynamic objects. Code:

#include <liveMedia.hh>
#include <BasicUsageEnvironment.hh>

void afterPlaying(void* clientData); // forward

UsageEnvironment* env;
char const* programName;

void usage() {
  *env << "usage: " << programName << " <transport-stream-file-name>\n";
  *env << "\twhere <transport-stream-file-name> ends with \".ts\"\n";
  exit(1);
}

TaskScheduler* scheduler;
FramedSource* input;
FramedSource* indexer;
MediaSink* output;

int main(int argc, char const** argv) {
  // Begin by setting up our usage environment:
  scheduler = BasicTaskScheduler::createNew();
  env = BasicUsageEnvironment::createNew(*scheduler);

  // Parse the command line:
  programName = argv[0];
  if (argc != 2) usage();
  char const* inputFileName = argv[1];

  // Check whether the input file name ends with ".ts":
  int len = strlen(inputFileName);
  if (len < 4 || strcmp(&inputFileName[len-3], ".ts") != 0) {
    *env << "ERROR: input file name \"" << inputFileName << "\" does not end with \".ts\"\n";
    usage();
  }

  // Open the input file (as a 'byte stream file source'):
  input = ByteStreamFileSource::createNew(*env, inputFileName, TRANSPORT_PACKET_SIZE);
  if (input == NULL) {
    *env << "Failed to open input file \"" << inputFileName << "\" (does it exist?)\n";
    exit(1);
  }

  // Create a filter that indexes the input Transport Stream data:
  indexer = MPEG2IFrameIndexFromTransportStream::createNew(*env, input);

  // The output file name is the same as the input file name, except with suffix ".tsx":
  char* outputFileName = new char[len+2]; // allow for trailing x\0
  sprintf(outputFileName, "%sx", inputFileName);

  // Open the output file (for writing), as a 'file sink':
  output = FileSink::createNew(*env, outputFileName);
  if (output == NULL) {
    *env << "Failed to open output file \"" << outputFileName << "\"\n";
    exit(1);
  }

  // Start playing, to generate the output index file:
  *env << "Writing index file \"" << outputFileName << "\"...";
  output->startPlaying(*indexer, afterPlaying, NULL);

  env->taskScheduler().doEventLoop(); // does not return
  return 0; // only to prevent compiler warning
}

void afterPlaying(void* /*clientData*/) {
  *env << "...done\n";
  Medium::close(output);
  Medium::close(input);
  Medium::close(indexer);
  delete
scheduler; env->reclaim(); exit(0); } As you can see agterPlaying function, is it right way to delete all objects because valgrind tool showing memory leaks at closing of "indexer" object. Leaks: ==29177== 80(16 direct,64 indirect) bytes in 1 block are definitely lost in loss record 3 of 3 ==29177== at 0x402BA13: operator new(unsigned int)(vg_replace_malloc.c:313) ==29177== by 0x8055B67: _Tables::getOurTables( UsageEnvironment &,unsigned char) ==29177== by 0x6524B67: MediaLookupTables::ourMedia(UsageEnvironment) ==29177== by 0x5874C6A: Medium::close(UsageEnvironment &,char const *) ==29177== by 0x424CCB7: Medium::close(Medium*) ==29177== by 0x5345C67: FramedFilter::~FramedFilter() ==29177== by 0x5345C67: MPEG2IFrameIndexFromTransportStream::~MPEG2IFrameIndexFromTransportStream() ==29177== by 0x5345C67: Medium::close(Medium*) Thanks & Regards, John On Wed, May 21, 2014 at 10:29 PM, john dicostanzo wrote: > Hi, > I am using Live555 library for my vod server, vod server create index file > and announce transport stream. > For Indexing, I am using code as reference from > MPEG2TransportStreamIndexer.cpp > but when i close and delete all the dynamic objects,valgrind shows memory > leaks on it. > > Leaks: > ==29177== 80(16 direct,64 indirect) bytes in 1 block are definitely lost > in loss record 3 of 3 > ==29177== at 0x402BA13: operator new(unsigned int)(vg_replace_malloc.c:313) > ==29177== by 0x8055B67: _Tables::getOurTables(UsageEnvironment &,unsigned > char) > ==29177== by 0x6524B67: MediaLookupTables::ourMedia(UsageEnvironment) > ==29177== by 0x5874C6A: Medium::close(UsageEnvironment &,char const *) > ==29177== by 0x424CCB7: Medium::close(Medium*) > ==29177== by 0x5345C67: FramedFilter::~FramedFilter() > ==29177== by 0x5345C67: > MPEG2IFrameIndexFromTransportStream::~MPEG2IFrameIndexFromTransportStream() > ==29177== by 0x5345C67: Medium::close(Medium*) > > I don't know, why its creating new dynamic object in > Medium::close(operator new(unsigned int)(vg_replace_malloc.c:313)) > > Thanks, > John > -------------- next part -------------- An HTML attachment was scrubbed... URL: From pr.jwei at gmail.com Sun May 25 04:07:43 2014 From: pr.jwei at gmail.com (JW Liao) Date: Sun, 25 May 2014 19:07:43 +0800 Subject: [Live-devel] [ Propose ] Fix multi-thread issue when RTCPInstance new OutPacketBuffer Message-ID: *Environment* 1. Create multiple RTSP server in one process and every server is different thread ( multi-thread ). 2. Create 100 session to play video from the multiple RTSP server at the same time, so every server will be handle some sessions. *Warning Message :* - MultiFramedRTPSink::afterGettingFrame1(): The input frame data was too large for our buffer size (2884). 1731 bytes of trailing data was dropped! Correct this by increasing "OutPacketBuffer::maxSize" to at least 3181, *before* creating this 'RTPSink'. (Current value is 1450.) - MultiFramedRTPSink::afterGettingFrame1(): The input frame data was too large for our buffer size (2884). 9449 bytes of trailing data was dropped! Correct this by increasing "OutPacketBuffer::maxSize" to at least 10899, *before* creating this 'RTPSink'. (Current value is 1450.) - MultiFramedRTPSink::afterGettingFrame1(): The input frame data was too large for our buffer size (2884). 350 bytes of trailing data was dropped! Correct this by increasing "OutPacketBuffer::maxSize" to at least 1800, *before* creating this 'RTPSink'. (Current value is 1450.) - MultiFramedRTPSink::afterGettingFrame1(): The input frame data was too large for our buffer size (2884). 
926 bytes of trailing data was dropped! Correct this by increasing "OutPacketBuffer::maxSize" to at least 2376, *before* creating this 'RTPSink'. (Current value is 1450.) *Root Cause :* - *OutPacketBuffer::maxSize* type is *static*, when multi-thread running, the value will be modified at class *RTCPInstance* constructor, show source code below: // A hack to save buffer space, because RTCP packets are always small: unsigned savedMaxSize = OutPacketBuffer::maxSize; *OutPacketBuffer::maxSize* = maxRTCPPacketSize; fOutBuf = new OutPacketBuffer(preferredPacketSize, maxRTCPPacketSize); OutPacketBuffer::maxSize = savedMaxSize; if (fOutBuf == NULL) return; *Analysis :* - *RTCPInstance* constructor modify *OutPacketBuffer::maxSize* just want to help class *OutPacketBuffer* constructor compute *maxNumPackets*, show source below : *MediaSink.cpp* OutPacketBuffer::OutPacketBuffer(unsigned preferredPacketSize,unsigned maxPacketSize) : fPreferred(preferredPacketSize), fMax(maxPacketSize), fOverflowDataSize(0) { unsigned *maxNumPackets* = (*maxSize* + (maxPacketSize-1))/maxPacketSize; fLimit = maxNumPackets*maxPacketSize; fBuf = new unsigned char[fLimit]; resetPacketStart(); resetOffset(); resetOverflowData(); } *Resolve :* - Add new constructor for OutPacketBuffer and pre-compute the maxNumPackets value then pass it. *RTCP.cpp* #if 0 // A hack to save buffer space, because RTCP packets are always small: unsigned savedMaxSize = OutPacketBuffer::maxSize; OutPacketBuffer::maxSize = maxRTCPPacketSize; fOutBuf = new OutPacketBuffer(preferredPacketSize, maxRTCPPacketSize); OutPacketBuffer::maxSize = savedMaxSize; if (fOutBuf == NULL) return; #else *unsigned maxNumPackets = (maxRTCPPacketSize + (maxRTCPPacketSize-1))/maxRTCPPacketSize;* fOutBuf = new OutPacketBuffer(preferredPacketSize, maxRTCPPacketSize, *maxNumPackets*); if (fOutBuf == NULL) return; #endif *MediaSink.cpp* OutPacketBuffer::OutPacketBuffer(unsigned preferredPacketSize,unsigned maxPacketSize, *unsigned maxNumPackets*) : fPreferred(preferredPacketSize), fMax(maxPacketSize), fOverflowDataSize(0) { fLimit = maxNumPackets*maxPacketSize; fBuf = new unsigned char[fLimit]; resetPacketStart(); resetOffset(); resetOverflowData(); } BR ChunWei -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sun May 25 05:07:50 2014 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 25 May 2014 05:07:50 -0700 Subject: [Live-devel] [ Propose ] Fix multi-thread issue when RTCPInstance new OutPacketBuffer In-Reply-To: References: Message-ID: <2458DA45-54A3-4930-85BB-450D689BAF7C@live555.com> > Create multiple RTSP server in one process and every server is different thread ( multi-thread ). If you do this, be sure that each thread uses its own "UsageEnvironment" and "TaskScheduler" objects; see http://www.live555.com/liveMedia/faq.html#threads However, there's generally little or no benefit in having multiple RTSP server threads, because RTSP servers are not 'CPU bound', and do non-blocking I/O. (But if you insist on having multiple, concurrent RTSP servers, it's better to run them in separate processes (not separate threads within a single process).) Nonetheless, the problem you noted (with the code in "RTCP.cpp") is real. I have just installed a new version - 2014.05.25 - of the "LIVE555 Streaming Media" code that fixes this (using a slightly different solution to the one that you proposed). Ross Finlayson Live Networks, Inc. 
http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From michel.promonet at thalesgroup.com Mon May 26 05:34:55 2014 From: michel.promonet at thalesgroup.com (PROMONET Michel) Date: Mon, 26 May 2014 14:34:55 +0200 Subject: [Live-devel] filedescriptor leak on windows plateform Message-ID: <12708_1401107695_538334EF_12708_1994_1_1BE8971B6CFF3A4F97AF4011882AA255015651627200@THSONEA01CMS01P.one.grp> Hi Ross, With a colleage we saw a filedescriptor leak in somethink that looks like a workaround for windows. In BasicTaskScheduler.cpp there is : #if defined(__WIN32__) || defined(_WIN32) int err = WSAGetLastError(); // For some unknown reason, select() in Windoze sometimes fails with WSAEINVAL if // it was called with no entries set in "readSet". If this happens, ignore it: if (err == WSAEINVAL && readSet.fd_count == 0) { err = EINTR; // To stop this from happening again, create a dummy socket: int dummySocketNum = socket(AF_INET, SOCK_DGRAM, 0); FD_SET((unsigned)dummySocketNum, &fReadSet); } if (err != EINTR) { #else But the dummySocket is never closed. As during the lifetime of the process we can create/destroy many TaskScheduler, this small leak becomes important. Do you think it could be safe to add a close(dummySocketNum) ? Best Regards, Michel. From finlayson at live555.com Mon May 26 11:13:56 2014 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 26 May 2014 11:13:56 -0700 Subject: [Live-devel] filedescriptor leak on windows plateform In-Reply-To: <12708_1401107695_538334EF_12708_1994_1_1BE8971B6CFF3A4F97AF4011882AA255015651627200@THSONEA01CMS01P.one.grp> References: <12708_1401107695_538334EF_12708_1994_1_1BE8971B6CFF3A4F97AF4011882AA255015651627200@THSONEA01CMS01P.one.grp> Message-ID: > // For some unknown reason, select() in Windoze sometimes fails with WSAEINVAL if > // it was called with no entries set in "readSet". If this happens, ignore it: > if (err == WSAEINVAL && readSet.fd_count == 0) { > err = EINTR; > // To stop this from happening again, create a dummy socket: > int dummySocketNum = socket(AF_INET, SOCK_DGRAM, 0); > FD_SET((unsigned)dummySocketNum, &fReadSet); > } > if (err != EINTR) { > #else > > But the dummySocket is never closed. Yeah, OK, in some future release I'll close "dummySocket" in the "BasicTaskScheduler" destructor. Alternatively, because this code is Windows-specific (to work around a bug in Windows), you could use another OS. > As during the lifetime of the process we can create/destroy many TaskScheduler, this small leak becomes important. This (using a single, permanent process in which you repeatedly create objects, then destroy them all) is poor system design. (Maybe there's something in your OS that requires you to design your system this way? Even if so, that doesn't escape the fact that this is poor system design. Helping people who design their systems this way is not a high priority for me.) > Do you think it could be safe to add a close(dummySocketNum) ? Definitely not (unless this is done only in the "BasicTaskScheduler"s destructor). Having a closed socket in the select() 'read set' will lead to an 'invalid file descriptor' error. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
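For reference, the kind of cleanup discussed above could look roughly like the following sketch. It is not the actual library change - just an illustration of remembering the dummy socket in a member variable (called fDummySocketNum here for illustration, initialized to -1 in the constructor) and closing it when the scheduler is destroyed:

// Sketch only - not the actual LIVE555 fix.
// 1) In BasicTaskScheduler::SingleStep(), where the Windows workaround creates
//    the dummy socket, remember the descriptor instead of discarding it:
//      if (fDummySocketNum < 0) {
//        fDummySocketNum = socket(AF_INET, SOCK_DGRAM, 0);
//        FD_SET((unsigned)fDummySocketNum, &fReadSet);
//      }
// 2) Close it when the scheduler goes away, so repeated create/destroy cycles
//    no longer leak a descriptor per scheduler:
BasicTaskScheduler::~BasicTaskScheduler() {
  if (fDummySocketNum >= 0) closeSocket(fDummySocketNum); // closeSocket(): LIVE555's portable close()/closesocket() wrapper
}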
URL: From jichang at coretrust.com Mon May 26 21:40:09 2014 From: jichang at coretrust.com (=?ks_c_5601-1987?B?wOXB1sDP?=) Date: Tue, 27 May 2014 13:40:09 +0900 Subject: [Live-devel] to stream MP4 container Message-ID: <002101cf7965$c0ac5150$4204f3f0$@com> Hi, I'm trying to stream MP4 with live555. To do this I googled and found the mp4-enhanced live555 project with live555 ver4.x. It uses MPEG4IP, and extracts ES from the mp4 source file and streams each ES file, like this: ...(parsing the mp4 file information using MPEG4IP)... sms->addSubsession(ADTSAudioFileServerMediaSubsession::createNew(env, command, reuseSource)); sms->addSubsession(H264VideoFileServerMediaSubsession::createNew(env, command, reuseSource)); It works well. But I don't want to use this approach for some reasons (storage and initial extracting time, etc). I'd like to demux the mp4 file the way live555 demuxes MPEG1or2. Can I do this easily? Should I use mp4 demux code from another project like ffmpeg? Any advice will be appreciated. -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon May 26 23:03:56 2014 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 26 May 2014 23:03:56 -0700 Subject: [Live-devel] to stream MP4 container In-Reply-To: <002101cf7965$c0ac5150$4204f3f0$@com> References: <002101cf7965$c0ac5150$4204f3f0$@com> Message-ID: <256318DB-5D3E-4E12-AEBE-634A378602E7@live555.com> > I'd like to demux the mp4 file the way live555 demuxes MPEG1or2. > Can I do this easily? No. Instead, you should use the Matroska (i.e., ".mkv") file format for your input file. See http://lists.live555.com/pipermail/live-devel/2014-May/018343.html Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From fantasyvideo at 126.com Tue May 27 07:34:52 2014 From: fantasyvideo at 126.com (Tony) Date: Tue, 27 May 2014 22:34:52 +0800 (CST) Subject: [Live-devel] Regarding the start code handling in h264 Message-ID: <1cd9bedd.c3df.1463e1af3ea.Coremail.fantasyvideo@126.com> Hi Ross, My system has two types of rtsp stream. One is living video, other device transfer their h264 data to my system, I copied them to one queue, and the framed source would get one sample every time, these sample is h264 nalu, and isn't add start code (0x00,0x00,0x00,0x01). The other is VOD: every time the video framed source would read one sample from the mp4 file. This sample contains more than one nalu, and each nalu would be started with (0x00,0x00,0x00,0x01), but in such case, the h264 data the client receives doesn't have start codes, so I guess live555 has removed them. But what's the difference with these two types of stream? If I add the start code in the first stream, live555 wouldn't remove them. And if I don't add the start code in the second type of stream, the client can't play the received data. So could you explain it? Thanks! -------------- next part -------------- An HTML attachment was scrubbed...
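For the second case described above (samples read from an mp4 file, each containing several start-code-prefixed NAL units), delivering them as discrete NAL units means splitting each sample at the start codes and dropping those start codes. A rough sketch, not from the LIVE555 sources (the function name is made up; it handles both 4-byte and 3-byte start codes):

// Sketch (not from the LIVE555 sources): split a sample of the form
// "00 00 00 01 <NAL> 00 00 00 01 <NAL> ..." into discrete NAL units with the
// start codes removed. Scanning is safe because H.264 emulation prevention
// guarantees that a start code cannot occur inside a NAL unit's payload.
#include <utility>
#include <vector>

std::vector<std::pair<const unsigned char*, unsigned> >
splitNalUnits(const unsigned char* data, unsigned size) {
  std::vector<std::pair<const unsigned char*, unsigned> > nals;
  unsigned i = 0, start = 0;
  bool inNal = false;
  while (i + 3 <= size) {
    bool is4 = (i + 4 <= size && data[i] == 0 && data[i+1] == 0 && data[i+2] == 0 && data[i+3] == 1);
    bool is3 = (!is4 && data[i] == 0 && data[i+1] == 0 && data[i+2] == 1);
    if (is4 || is3) {
      if (inNal) nals.push_back(std::make_pair(data + start, i - start)); // close the previous NAL unit
      i += is4 ? 4 : 3;
      start = i;
      inNal = true;
    } else {
      ++i;
    }
  }
  if (inNal && start < size) nals.push_back(std::make_pair(data + start, size - start)); // the last NAL unit
  return nals;
}

Each resulting (pointer, length) pair could then be handed on one at a time, which is the "one NAL unit at a time, without a start code" delivery that the reply below describes.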
URL: From finlayson at live555.com Wed May 28 18:42:15 2014 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 28 May 2014 18:42:15 -0700 Subject: [Live-devel] Regarding the start code handling in h264 In-Reply-To: <1cd9bedd.c3df.1463e1af3ea.Coremail.fantasyvideo@126.com> References: <1cd9bedd.c3df.1463e1af3ea.Coremail.fantasyvideo@126.com> Message-ID: <2469175B-E76A-4312-B76C-7E86D49F054C@live555.com> In each case - because your input source is H.264 video - your input source object (i.e., your subclass of "FramedSource") must deliver NAL units - *without* any 'start code' - one at a time to a "H264VideoStreamDiscreteFramer" (*not* a "H264VideoStreamFramer"). > One is living video, other device transfer their h264 data to my system, I copied them to one queue, and the framed source would get one sample every time, these sample is h264 nalu, and isn't add start code (0x00,0x00,0x00,0x01). As noted above, you must deliver one NAL unit at a time - without a start code - to a "H264VideoStreamDiscreteFramer". > Other is VOID? every time video framed source would read one sample in mp4 file. This sample contains more than one nalu, and each nalu would be started with (0x00,0x00,0x00,0x01) In this case, you need to parse the input data, so that (as above) you are delivering one NAL unit at a time - without a start code - to a "H264VideoStreamDiscreteFramer". The data that gets delivered to a "H264VideoRTPSink" - and thus across the network - must *never* contain a start code! Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From frombach002 at gmail.com Wed May 28 05:14:03 2014 From: frombach002 at gmail.com (Alix Frombach) Date: Wed, 28 May 2014 08:14:03 -0400 Subject: [Live-devel] Issues streaming H.264 video from camera to a file In-Reply-To: <3C16E73B-6584-4904-B72E-81E421108D46@live555.com> References: <3C16E73B-6584-4904-B72E-81E421108D46@live555.com> Message-ID: Yes I am recording from RTSP/RTP stream through my own application using the library. The camera itself is not providing any preceding "start code" and is only being added by the sink. The majority of the frames are fine, but a couple of times throughout the file there is an issue with an SPS or PPS unit (forbidden bit set to 1) and will cause the video to display a "greyed out" screen for a partial second until the next good frame comes through. Again just to clarify these frames are not being corrupted until they reach the application, I am able to take the wireshark data and manually add the start code to achieve playback. On Fri, May 23, 2014 at 12:25 AM, Ross Finlayson wrote: > I am experiencing issues when saving H.264 video directly from a camera to > a file sink (H264VideoFileSink) > > > What do you mean by saving "directly from a camera"? > > Are you recording from a RTSP/RTP stream - e.g., using our "openRTSP" demo > application? In this case, "H264VideoFileSink" does not inspect the > incoming H.2164 NAL units; it just writes them to a file (with the "0x00 > 0x00 0x00 0x01" 'start code' in front). > > Or are you (somehow) reading H.264 NAL units from the camera without using > RTSP/RTP? If you're doing this, note that the data that you feed to > "H264VideoFileSink" *must* consist of discrete NAL units (i.e., one at a > time), *without* any preceding 'start code'. > > > Ross Finlayson > Live Networks, Inc. 
> http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From fantasyvideo at 126.com Wed May 28 23:49:54 2014 From: fantasyvideo at 126.com (Tony) Date: Thu, 29 May 2014 14:49:54 +0800 (CST) Subject: [Live-devel] Regarding the start code handling in h264 In-Reply-To: <2469175B-E76A-4312-B76C-7E86D49F054C@live555.com> References: <1cd9bedd.c3df.1463e1af3ea.Coremail.fantasyvideo@126.com> <2469175B-E76A-4312-B76C-7E86D49F054C@live555.com> Message-ID: <33511cc0.d3ac.14646bdfcd7.Coremail.fantasyvideo@126.com> But in the second case, even if I din't parse the data, and every nalu is started with start code, but live555 worked correctly. The h264 data the client received doesn't have a start code. So live555 has removed them? At 2014-05-29 09:42:15, "Ross Finlayson" wrote: In each case - because your input source is H.264 video - your input source object (i.e., your subclass of "FramedSource") must deliver NAL units - *without* any 'start code' - one at a time to a "H264VideoStreamDiscreteFramer" (*not* a "H264VideoStreamFramer"). One is living video, other device transfer their h264 data to my system, I copied them to one queue, and the framed source would get one sample every time, these sample is h264 nalu, and isn't add start code (0x00,0x00,0x00,0x01). As noted above, you must deliver one NAL unit at a time - without a start code - to a "H264VideoStreamDiscreteFramer". Other is VOID? every time video framed source would read one sample in mp4 file. This sample contains more than one nalu, and each nalu would be started with (0x00,0x00,0x00,0x01) In this case, you need to parse the input data, so that (as above) you are delivering one NAL unit at a time - without a start code - to a "H264VideoStreamDiscreteFramer". The data that gets delivered to a "H264VideoRTPSink" - and thus across the network - must *never* contain a start code! Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From laimaoli at 126.com Thu May 29 01:38:03 2014 From: laimaoli at 126.com (laimaoli at 126.com) Date: Thu, 29 May 2014 16:38:03 +0800 Subject: [Live-devel] Problem in live555 DelayQueue::synchronize() Message-ID: <2014052916380206803427@126.com> Hi, I have a problem, described below. I changed the code that "reads the file" and replaced it with "get a frame of data from the camera". Could you help me, or give me some advice? This question has been troubling me for about 2 weeks.
code as void H264FramedLiveSource::getNextFrame1() { int pos = 0; fFrameSize = 0; bool bflag =false; while(1) { while(1) { if(pEncodeOperater != NULL && pEncodeOperater->nVedioID == 1) { pEncodeOperater->encode_getframe(); vbuf = pEncodeOperater->enc->virt_bsbuf_addr + pEncodeOperater->outinfo->bitstreamBuffer - pEncodeOperater->enc->phy_bsbuf_addr; fFrameSize = pEncodeOperater->outinfo->bitstreamSize; if(nCount == 0) { // sps first frame memcpy(fTo,(char *)pEncHead[0]+sizeof(int),*(int*)pEncHead[0]); memcpy(fTo+(*(int*)pEncHead[0]),(char *)pEncHead[1]+sizeof(int),*(int*)pEncHead[1]); fFrameSize += (*(int*)pEncHead[0]) + (*(int*)pEncHead[1]); bflag = true; } if( fFrameSize > fMaxSize) { fNumTruncatedBytes = fFrameSize - fMaxSize; fFrameSize = fMaxSize; } else { fNumTruncatedBytes = 0; } if(bflag == true) { memmove(fTo +(*(int*)pEncHead[0]) + (*(int*)pEncHead[1]),(void *)vbuf,fFrameSize - (*(int*)pEncHead[0]) + (*(int*)pEncHead[1])); bflag = false; } else { memmove(fTo ,(void *)vbuf,fFrameSize); } gettimeofday(&fPresentationTime, NULL); afterGetting(this); } } } _Debug as following: 70 fFrameSize 2083-- -fMaxSize 116501 fNumTruncatedBytes:0 71 fFrameSize 2102-- -fMaxSize 116501 fNumTruncatedBytes:0 72 fFrameSize 2334-- -fMaxSize 116501 fNumTruncatedBytes:0 73 fFrameSize 1944-- -fMaxSize 109982 fNumTruncatedBytes:0 74 fFrameSize 2115-- -fMaxSize 109982 fNumTruncatedBytes:0 75 fFrameSize 2220-- -fMaxSize 109982 fNumTruncatedBytes:0 76 fFrameSize 2668-- -fMaxSize 103703 fNumTruncatedBytes:0 77 fFrameSize 2334-- -fMaxSize 103703 fNumTruncatedBytes:0 78 fFrameSize 1690-- -fMaxSize 149997 fNumTruncatedBytes:0 79 fFrameSize 1835-- -fMaxSize 149997 fNumTruncatedBytes:0 80 fFrameSize 2810-- -fMaxSize 149997 fNumTruncatedBytes:0 81 fFrameSize 2254-- -fMaxSize 143662 fNumTruncatedBytes:0 Program received signal SIGSEGV, Segmentation fault. 0x0005a81c in DelayQueue::synchronize() () (gdb) p 0x0005a81c $1 = 370716 Email:laimaoli at fosiao.com Iphone:18823320170 Name:Maoli -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Fri May 30 05:43:37 2014 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 30 May 2014 05:43:37 -0700 Subject: [Live-devel] Issues streaming H.264 video from camera to a file In-Reply-To: References: <3C16E73B-6584-4904-B72E-81E421108D46@live555.com> Message-ID: <6463C130-1B54-4AE3-8ECE-FD6B47FF185C@live555.com> > Yes I am recording from RTSP/RTP stream through my own application using the library. Does the problem occur if you use the provided "openRTSP" demo application (see ), instead of your own application? (If the problem occurs only with your application, then we likely won't be able help you on this mailing list.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From xxiao8 at fosiao.com Fri May 30 05:23:56 2014 From: xxiao8 at fosiao.com (Xh Xiao) Date: Fri, 30 May 2014 07:23:56 -0500 Subject: [Live-devel] H264 stream from camera shows black screen Message-ID: I am capturing from a camera(interlaced NTSC analog camera with BT656), I can save all the captured video to a file test.h264 and can play it via RTSP as expected, however if I play it directly(i.e. live) without saving to a file first, the play window is black with progressing time displayed, but all we got is a black screen. what could be the problem? Thanks, xxiao -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From finlayson at live555.com Fri May 30 06:39:15 2014 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 30 May 2014 06:39:15 -0700 Subject: [Live-devel] H264 stream from camera shows black screen In-Reply-To: References: Message-ID: <0AF421FC-0AD9-423E-933C-6B14E7CA9E70@live555.com> > I am capturing from a camera(interlaced NTSC analog camera with BT656), I can save all the captured video to a file test.h264 and can play it via RTSP as expected, however if I play it directly(i.e. live) without saving to a file first, the play window is black with progressing time displayed, but all we got is a black screen. what could be the problem? I don't know. The problem may be with your media player application - which would have nothing to do with us. Try running "openRTSP" as your RTSP client. Rename the output file to have a ".h264" filename, and try playing this file using VLC (as your media player). If this works, then you should also be able to use VLC as a RTSP client, to play the stream directly. If this doesn't work, then send us the diagnostic output from "openRTSP"; that might tell us what's going wrong. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sat May 31 19:46:40 2014 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 31 May 2014 19:46:40 -0700 Subject: [Live-devel] Regarding the start code handling in h264 In-Reply-To: <33511cc0.d3ac.14646bdfcd7.Coremail.fantasyvideo@126.com> References: <1cd9bedd.c3df.1463e1af3ea.Coremail.fantasyvideo@126.com> <2469175B-E76A-4312-B76C-7E86D49F054C@live555.com> <33511cc0.d3ac.14646bdfcd7.Coremail.fantasyvideo@126.com> Message-ID: <2CC744BF-75E2-4497-AEE6-255EE3CF9149@live555.com> > But in the second case, even if I din't parse the data, and every nalu is started with start code, but live555 worked correctly. You *might* be able to stream the second type of data by passing it to a "H264VideoStreamFramer" (rather than a "H264VideoStreamDiscreteFramer"). I don't recommend this, however, because a "H264VideoStreamFramer" is intended for H.264 video data that's in a file (or pipe). (And yes, it removes 'start codes' from the data.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sat May 31 19:48:55 2014 From: finlayson at live555.com (Ross Finlayson) Date: Sat, 31 May 2014 19:48:55 -0700 Subject: [Live-devel] Problem in live555 DelayQueue::synchronize() In-Reply-To: <2014052916380206803427@126.com> References: <2014052916380206803427@126.com> Message-ID: You should be checking the frame size against "fMaxSize" *before* you copy any bytes to (the address pointed to by) "fTo". "fMaxSize" tells you the maximum amount of data that you're allowed to be copying. If you copy more than this, you'll overflow the receiver's buffer. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
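To make that last point concrete, here is a minimal sketch of a delivery routine for a FramedSource subclass such as the H264FramedLiveSource quoted above, checking fMaxSize before any copy. Only fTo, fMaxSize, fFrameSize, fNumTruncatedBytes, fPresentationTime, isCurrentlyAwaitingData() and FramedSource::afterGetting() are LIVE555 members; the function name and its frame/frameSize parameters are illustrative:

// Sketch: deliver one encoded frame to the downstream reader, truncating
// (and reporting the truncation via fNumTruncatedBytes) rather than
// overflowing the buffer that fTo points to. Assumes H264FramedLiveSource
// derives from FramedSource, as in the code quoted earlier in this thread.
#include <cstring>
#include <sys/time.h>

void H264FramedLiveSource::deliverFrame(const unsigned char* frame, unsigned frameSize) {
  if (!isCurrentlyAwaitingData()) return; // the downstream object hasn't asked for data yet

  if (frameSize > fMaxSize) {
    fNumTruncatedBytes = frameSize - fMaxSize; // tell the reader how much was dropped
    fFrameSize = fMaxSize;
  } else {
    fNumTruncatedBytes = 0;
    fFrameSize = frameSize;
  }

  memmove(fTo, frame, fFrameSize); // copy at most fMaxSize bytes - never more
  gettimeofday(&fPresentationTime, NULL);

  FramedSource::afterGetting(this); // hand the frame off; call this exactly once per doGetNextFrame()
}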
URL: From frombach002 at gmail.com Fri May 30 08:56:49 2014 From: frombach002 at gmail.com (Alix Frombach) Date: Fri, 30 May 2014 11:56:49 -0400 Subject: [Live-devel] Issues streaming H.264 video from camera to a file In-Reply-To: <6463C130-1B54-4AE3-8ECE-FD6B47FF185C@live555.com> References: <3C16E73B-6584-4904-B72E-81E421108D46@live555.com> <6463C130-1B54-4AE3-8ECE-FD6B47FF185C@live555.com> Message-ID: Did not realize I could stream H264 using openRTSP (did not see any documentation, but looking through the code I have found it) After further testing the issue does not exist in openRTSP and only my application. I apologize for the "non-issue" and appreciate all the help you provided. Thanks again. On Fri, May 30, 2014 at 8:43 AM, Ross Finlayson wrote: > Yes I am recording from RTSP/RTP stream through my own application using > the library. > > > Does the problem occur if you use the provided "openRTSP" demo application > (see ), instead of your own application? > > (If the problem occurs only with your application, then we likely won't be > able help you on this mailing list.) > > > Ross Finlayson > Live Networks, Inc. > http://www.live555.com/ > > > _______________________________________________ > live-devel mailing list > live-devel at lists.live555.com > http://lists.live555.com/mailman/listinfo/live-devel > > -------------- next part -------------- An HTML attachment was scrubbed... URL: