From: dnyanesh.gate at intelli-vision.com (Dnyanesh Gate)
Date: Fri, 1 Aug 2014 17:11:34 +0530
Subject: [Live-devel] Live555 proxy server set/control streaming fps

Hello,

I am new to live555 development. I am using the live555 proxy server for streaming live and recorded video. I am able to stream recorded video, but am facing an issue with it: I have a 15fps recorded video file, but live555 is streaming it at 30fps (VLC stats), and VLC shows fast-forwarded video. I am referring to the testOnDemandServer.cpp file for streaming video and audio files. How can I set/control the streaming fps? Please help me with controlling the streaming fps.

-- 
Thanks & Regards,
DnyaneshG.

From: Sudhir_Potnuru at mindtree.com (Sudhir Potnuru)
Date: Fri, 1 Aug 2014 13:18:04 +0000
Subject: [Live-devel] Live555 libraries for Macosx - 32 bit Issue

Hi All,

We are working on developing an RTSP client application on the Qt platform which can be ported to both Windows and Mac OS 32-bit. We have successfully built the live555 libraries for the Windows platform and integrated them into our application. Now we are working on building the live555 libraries for Mac OS 32-bit (OS X 10.9) to integrate them into our application on Mac OS. We followed the steps specified at http://www.live555.com/liveMedia/#testProgs under "How to configure and build the code on Unix (including Linux, Mac OS X, QNX, and other Posix-compliant systems)", and generated the libraries for Mac OS using the config.macosx-32bit configuration file.

We are facing an issue after including the 4 libraries (libliveMedia.a, libgroupsock.a, libBasicUsageEnvironment.a, libUsageEnvironment.a) in our application (.pro file). We get the following error when building our project:

Undefined symbols for architecture i386:
  "RTSPClient::RTSPClient(UsageEnvironment&, char const*, int, char const*, unsigned short)", referenced from:
      OURRTSPClient::OURRTSPClient(UsageEnvironment&, char const*, int, char const*, unsigned short) in OURRTSPClient.o
     (maybe you meant: __ZN10RTSPClientC2ER16UsageEnvironmentPKciS3_ti)
  "BasicTaskScheduler::createNew()", referenced from:
      CRTPDataReceiver::CreateTaskScheduler() in RTPDataReceiver.o
  "MediaSubsessionIterator::MediaSubsessionIterator(MediaSession&)", referenced from:
      CRTPDataReceiver::ContinueAfterDescribe(RTSPClient*, int, char*) in RTPDataReceiver.o
      CRTPDataReceiver::ShutdownStream(RTSPClient*, int) in RTPDataReceiver.o
      CRTPDataReceiver::SubsessionAfterPlaying(void*) in RTPDataReceiver.o
ld: symbol(s) not found for architecture i386
clang: error: linker command failed with exit code 1 (use -v to see invocation)
make: ***

We have cross-checked that the libraries built are for architecture i386, but these files are non-fat libraries. We would like to know whether there is any difference between non-fat and fat files. We would like to know the actual reason for this issue, and it would be very helpful if we could get a solution for it.

Thanks in Advance,
Sudhir
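As a first step in isolating a link error like the one above, it can help to take Qt out of the picture and link a minimal program directly against the four static libraries, using the same 32-bit flags (e.g. -m32 / -arch i386) as the Qt project. The sketch below is not from the thread and only uses standard LIVE555 calls; if it links cleanly, the libraries are fine and the problem is in the .pro configuration (architecture mismatch or library order), and if it fails the same way, the libraries were built for a different architecture than the one being linked.

    // link_check.cpp - a minimal, hypothetical link test (not part of the thread).
    // Build it with the same compiler and the same -m32/-arch i386 flags as the
    // Qt project, and link against libliveMedia.a, libgroupsock.a,
    // libBasicUsageEnvironment.a and libUsageEnvironment.a.
    #include "liveMedia.hh"
    #include "BasicUsageEnvironment.hh"

    int main() {
      TaskScheduler* scheduler = BasicTaskScheduler::createNew();
      UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);
      *env << "LIVE555 libraries linked and initialized successfully\n";
      env->reclaim();
      delete scheduler;
      return 0;
    }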
From: finlayson at live555.com (Ross Finlayson)
Date: Fri, 1 Aug 2014 09:37:44 -0400
Subject: Re: [Live-devel] Live555 proxy server set/control streaming fps

> I have a 15fps recorded video file, but live555 is streaming it at 30fps
> (VLC stats). VLC shows fast-forwarded video.

http://www.live555.com/liveMedia/faq.html#my-file-doesnt-work

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From: finlayson at live555.com (Ross Finlayson)
Date: Sat, 2 Aug 2014 00:21:10 -0500
Subject: Re: [Live-devel] What to do if data not coming while streaming from memory

> Maybe the speed of streaming frames is much faster than the speed of generating frames. So when the function "doGetNextFrame()" is called, no frame has been put into memory. If I use sleep() to wait for frames, the thread blocks and streaming of AAC frames slows down.

No, you shouldn't call "sleep()" at all, because that will stop events from being handled during that period of time. Remember that LIVE555-based applications are event driven, using a single thread of control - with asynchronous I/O - to handle events. Therefore, the LIVE555 event loop should not block.

> So what should I do to ensure that both AAC and H.264 are streamed well when no frames come?

If your "doGetNextFrame()" is called when no frames are immediately available to be delivered, then it should return immediately (i.e., without calling "sleep()"), and should be called again when a new frame becomes available. One way to do this is to use a separate thread that monitors your frame source and 'triggers' the LIVE555 event loop thread - by calling "triggerEvent()". I suggest that you use the "DeviceSource" code (see "liveMedia/DeviceSource.cpp") as a model for your 'frame source' class.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
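For readers looking for the shape of the "DeviceSource" pattern Ross recommends above, here is a condensed sketch. Only the LIVE555 calls (createEventTrigger(), triggerEvent(), isCurrentlyAwaitingData(), afterGetting(), and the fTo/fMaxSize/fFrameSize/fNumTruncatedBytes/fPresentationTime members) are real; the class name, the frame queue and the encoder thread are assumptions standing in for your own capture code.

    // A condensed sketch of liveMedia/DeviceSource.cpp, adapted for an
    // AAC-style live source.  "AACFrame" and "gFrameQueue" are hypothetical.
    #include "FramedSource.hh"
    #include <sys/time.h>
    #include <string.h>
    #include <deque>
    #include <mutex>
    #include <vector>

    struct AACFrame { std::vector<unsigned char> data; };
    static std::deque<AACFrame> gFrameQueue;   // filled by the encoder thread
    static std::mutex gQueueMutex;

    class LiveAACSource: public FramedSource {
    public:
      static LiveAACSource* createNew(UsageEnvironment& env) { return new LiveAACSource(env); }
      static EventTriggerId eventTriggerId;

      // The encoder thread, after pushing a frame onto gFrameQueue, calls:
      //   env->taskScheduler().triggerEvent(LiveAACSource::eventTriggerId, source);
      static void deliverFrame0(void* clientData) { ((LiveAACSource*)clientData)->deliverFrame(); }

    protected:
      LiveAACSource(UsageEnvironment& env): FramedSource(env) {
        if (eventTriggerId == 0)
          eventTriggerId = envir().taskScheduler().createEventTrigger(deliverFrame0);
      }

      virtual void doGetNextFrame() {
        deliverFrame();  // deliver immediately if a frame is queued; otherwise just return
      }

    private:
      void deliverFrame() {
        if (!isCurrentlyAwaitingData()) return;  // the sink hasn't asked for data yet

        AACFrame frame;
        {
          std::lock_guard<std::mutex> lock(gQueueMutex);
          if (gFrameQueue.empty()) return;       // nothing to deliver yet; do NOT block
          frame = gFrameQueue.front();
          gFrameQueue.pop_front();
        }

        unsigned newFrameSize = (unsigned)frame.data.size();
        if (newFrameSize > fMaxSize) {           // never write past the sink's buffer
          fFrameSize = fMaxSize;
          fNumTruncatedBytes = newFrameSize - fMaxSize;
        } else {
          fFrameSize = newFrameSize;
          fNumTruncatedBytes = 0;
        }
        gettimeofday(&fPresentationTime, NULL);  // or use a timestamp from the encoder
        memmove(fTo, &frame.data[0], fFrameSize);

        FramedSource::afterGetting(this);        // hand the frame to the downstream object
      }
    };

    EventTriggerId LiveAACSource::eventTriggerId = 0;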
From: grom86 at mail.ru (minus)
Date: Sat, 02 Aug 2014 22:35:12 +0400
Subject: [Live-devel] HOW REALIZED NAT TRAVERSAL IN LIVE555

How does the Live555 client-side library make UDP ports reachable from the global network for incoming UDP traffic when both client and server are behind NAT and none of the UDP ports are open to the network? Does it use STUN or something else?

-- 
Vadim

From: finlayson at live555.com (Ross Finlayson)
Date: Sun, 3 Aug 2014 21:50:39 -0700
Subject: Re: [Live-devel] HOW REALIZED NAT TRAVERSAL IN LIVE555

> How does the Live555 client-side library make UDP ports reachable from the global network for incoming UDP traffic when both client and server are behind NAT

In short, it can't. RTSP works by having a RTSP client contact a RTSP server, to request a specific stream. If the RTSP client can't contact the RTSP server, either by name or IP address (because the server is behind a NAT and has only a private IP address), then RTSP won't work. (On the other hand, if just the RTSP client - but not the RTSP server - is hidden behind a NAT, then RTSP will often work OK.)

What people usually do to overcome this is put a "LIVE555 Proxy Server" on a public-facing computer (i.e., one that has a public IP address), and have that proxy server stream from a 'back-end' RTSP server that can be on a private network. (If the proxy server computer can't contact the 'back-end' server, then instead, the 'back-end' server can contact the proxy server, using our custom RTSP "REGISTER" command, as described in our proxy server documentation: ) Then, clients (including those that are themselves behind a NAT) can access the stream via the (public) proxy server.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From: Sudhir_Potnuru at mindtree.com (Sudhir Potnuru)
Date: Mon, 4 Aug 2014 05:00:17 +0000
Subject: [Live-devel] Live555 libraries for Macosx - 32 bit Issue

Hi All,

We are working on developing an RTSP client application on the Qt platform which can be ported to both Windows and Mac OS 32-bit. We have successfully built the live555 libraries for the Windows platform and integrated them into our application. Now we are working on building the live555 libraries for Mac OS 32-bit (OS X 10.9) to integrate them into our application on Mac OS. We followed the steps specified at http://www.live555.com/liveMedia/#testProgs under "How to configure and build the code on Unix (including Linux, Mac OS X, QNX, and other Posix-compliant systems)", and generated the libraries for Mac OS using the config.macosx-32bit configuration file.

We are facing an issue after including the 4 libraries (libliveMedia.a, libgroupsock.a, libBasicUsageEnvironment.a, libUsageEnvironment.a) in our application (.pro file). We get the following error when building our project:

Undefined symbols for architecture i386:
  "RTSPClient::RTSPClient(UsageEnvironment&, char const*, int, char const*, unsigned short)", referenced from:
      OURRTSPClient::OURRTSPClient(UsageEnvironment&, char const*, int, char const*, unsigned short) in OURRTSPClient.o
     (maybe you meant: __ZN10RTSPClientC2ER16UsageEnvironmentPKciS3_ti)
  "BasicTaskScheduler::createNew()", referenced from:
      CRTPDataReceiver::CreateTaskScheduler() in RTPDataReceiver.o
  "MediaSubsessionIterator::MediaSubsessionIterator(MediaSession&)", referenced from:
      CRTPDataReceiver::ContinueAfterDescribe(RTSPClient*, int, char*) in RTPDataReceiver.o
      CRTPDataReceiver::ShutdownStream(RTSPClient*, int) in RTPDataReceiver.o
      CRTPDataReceiver::SubsessionAfterPlaying(void*) in RTPDataReceiver.o
ld: symbol(s) not found for architecture i386
clang: error: linker command failed with exit code 1 (use -v to see invocation)
make: ***

We have cross-checked that the libraries built are for architecture i386, but these files are non-fat libraries. We would like to know whether there is any difference between non-fat and fat files. We would like to know the actual reason for this issue, and it would be very helpful if we could get a solution for it.

Thanks in Advance,
Sudhir
From: finlayson at live555.com (Ross Finlayson)
Date: Sun, 3 Aug 2014 22:11:51 -0700
Subject: Re: [Live-devel] Live555 libraries for Macosx - 32 bit Issue

Please DO NOT send the same question to the mailing list more than once. This is basic email 'netiquette', and is also covered in the FAQ that everyone was asked to read before posting to the mailing list.

Because of this, all future postings from anyone "@mindtree.com" will be moderated.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From: michel.promonet at thalesgroup.com (PROMONET Michel)
Date: Mon, 4 Aug 2014 10:08:28 +0200
Subject: [Live-devel] TS muxer with H264VideoStreamDiscreteFramer ?

Hi Ross,

I am trying to see what can be done with the Transport Stream muxer provided by live555. Starting from the sample testH264VideoToTransportStream.cpp, it seems simple: add an H264VideoStreamFramer in front of a MPEG2TransportStreamFromESSource, and connect the MPEG2TransportStreamFromESSource to a sink. Well.

But our custom FramedSource provides data NAL unit by NAL unit, and then:
- the implementation of H264VideoStreamFramer, which uses its internal buffer of BANK_SIZE, overflows.
- the implementation of H264VideoStreamDiscreteFramer works, but the H264 start code is not added to the muxed transport stream.

So I don't see many solutions: either I override H264VideoStreamDiscreteFramer in order to add the H264 start code, or I need to make our custom FramedSource split frames.

Do you think what I am trying to do makes sense?

Thanks for your help.

Michel.

From: finlayson at live555.com (Ross Finlayson)
Date: Mon, 4 Aug 2014 01:17:34 -0700
Subject: Re: [Live-devel] TS muxer with H264VideoStreamDiscreteFramer ?

> I am trying to see what can be done with the Transport Stream muxer provided by live555. Starting from the sample testH264VideoToTransportStream.cpp, it seems simple: add an H264VideoStreamFramer in front of a MPEG2TransportStreamFromESSource, and connect the MPEG2TransportStreamFromESSource to a sink. Well.
>
> But our custom FramedSource provides data NAL unit by NAL unit, and then:
> - the implementation of H264VideoStreamFramer, which uses its internal buffer of BANK_SIZE, overflows.

If your "FramedSource" subclass delivers discrete NAL units (i.e., one NAL unit at a time), then you *must* use a "H264VideoStreamDiscreteFramer", not a "H264VideoStreamFramer"!

> - the implementation of H264VideoStreamDiscreteFramer works, but the H264 start code is not added to the muxed transport stream.

"H264VideoStreamFramer::createNew()" includes an optional parameter "includeStartCodeInOutput" (default value: False). This parameter is currently not in "H264VideoStreamDiscreteFramer::createNew()". That was an oversight.
In the next release of the software, I'll add that (optional) parameter to "H264VideoStreamDiscreteFramer::createNew()". Then you'll be able to feed your "FramedSource" subclass into a "H264VideoStreamDiscreteFramer" (with "includeStartCodeInOutput": True), and then into a "MPEG2TransportStreamFromESSource", etc.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
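A sketch of the pipeline being discussed, based on testH264VideoToTransportStream.cpp. "MyNALUnitSource" is a hypothetical stand-in for Michel's FramedSource subclass that delivers one NAL unit at a time, and the "includeStartCodeInOutput" parameter on H264VideoStreamDiscreteFramer::createNew() is the one Ross says will be added in a future release, so it appears here only as a comment.

    // Discrete NAL units -> H.264 discrete framer -> Transport Stream -> sink.
    #include "liveMedia.hh"
    #include "BasicUsageEnvironment.hh"

    static void afterPlaying(void* /*clientData*/) { /* close sources/sinks, stop the loop */ }

    void buildTransportStreamChain(UsageEnvironment& env, FramedSource* nalSource) {
      // 1) Frame the discrete NAL units.  Once the new parameter exists, ask the
      //    framer to prepend start codes here (assumed, per the message above):
      H264VideoStreamDiscreteFramer* framer =
          H264VideoStreamDiscreteFramer::createNew(env, nalSource
                                                   /*, includeStartCodeInOutput = True */);

      // 2) Multiplex the H.264 elementary stream into a Transport Stream:
      MPEG2TransportStreamFromESSource* tsFrames =
          MPEG2TransportStreamFromESSource::createNew(env);
      tsFrames->addNewVideoSource(framer, 5 /*mpegVersion: 5 == H.264*/);

      // 3) Send the Transport Stream somewhere.  A file sink is used here, as in
      //    testH264VideoToTransportStream.cpp; an RTP sink works the same way:
      MediaSink* outSink = FileSink::createNew(env, "out.ts");
      outSink->startPlaying(*tsFrames, afterPlaying, NULL);
    }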
From: jshanab at jfs-tech.com (Jeff Shanab)
Date: Mon, 4 Aug 2014 06:42:38 -0400
Subject: Re: [Live-devel] HOW REALIZED NAT TRAVERSAL IN LIVE555

If I am understanding the question correctly, I did something similar using live555 in a previous project. Originally this was only because the device (a Sercomm security camera) had some special code on it. I later wrote a small app that allowed any RTSP camera inside the network to reach out from behind the firewall.

** Reach out via HTTP to a server on the net. This works through a NAT firewall. Abscond with the socket on the server end and pass it to live555 instead of letting live555 create the socket. Start with the OPTIONS command back down the socket and proceed normally. A socket is a socket at the lowest level; the rest is sort of a gentleman's agreement between the two endpoints. RTSP is similar to HTTP. As long as you start inside the NAT, the return path is authorized through the firewall. There are high-end firewalls that can block even this, but they are usually found in medium to large businesses.

** The pusher app connects to the HTTP server once per camera and maintains a bidirectional persistent connection with a 15-second keep-alive. This allows the server to send commands to the app behind the firewall at any time - kinda "long polling". A client connects to the server via HTTP and says "I want to watch camera #1"; the HTTP server sends the request for camera #1 down the persistent HTTP connection. The pusher app connects to the camera and opens a new port to the server. The pusher app is a live555 restreamer. The client absconds with the port and passes it to live555 on the client side, which now connects to the restreamer that is behind the NAT firewall.

I ran this on a Raspberry Pi with 4 cameras at 30FPS @ D1. It used 9% CPU.

On Mon, Aug 4, 2014 at 12:50 AM, Ross Finlayson wrote:
> How does the Live555 client-side library make UDP ports reachable from the global network for incoming UDP traffic when both client and server are behind NAT
>
> In short, it can't. RTSP works by having a RTSP client contact a RTSP server, to request a specific stream. If the RTSP client can't contact the RTSP server, either by name or IP address (because the server is behind a NAT and has only a private IP address), then RTSP won't work. (On the other hand, if just the RTSP client - but not the RTSP server - is hidden behind a NAT, then RTSP will often work OK.)
>
> What people usually do to overcome this is put a "LIVE555 Proxy Server" on a public-facing computer (i.e., one that has a public IP address), and have that proxy server stream from a 'back-end' RTSP server that can be on a private network. (If the proxy server computer can't contact the 'back-end' server, then instead, the 'back-end' server can contact the proxy server, using our custom RTSP "REGISTER" command, as described in our proxy server documentation: ) Then, clients (including those that are themselves behind a NAT) can access the stream via the (public) proxy server.
>
> Ross Finlayson
> Live Networks, Inc.
> http://www.live555.com/

From: lkxkfl at hotmail.com (Xuan)
Date: Tue, 5 Aug 2014 16:46:08 +0800
Subject: [Live-devel] Only VLC Player can open my url and play

Hi,

Now I can stream a live H.264 and AAC source with your suggestion. VLC can open the URL and play the received video and audio; however, other players like RealPlayer and QuickTime cannot. Why can't RealPlayer and QuickTime play it?

From: finlayson at live555.com (Ross Finlayson)
Date: Tue, 5 Aug 2014 08:34:10 -0700
Subject: Re: [Live-devel] Only VLC Player can open my url and play

> Now I can stream a live H.264 and AAC source with your suggestion. VLC can open the URL and play the received video and audio; however, other players like RealPlayer and QuickTime cannot. Why can't RealPlayer and QuickTime play it?

Perhaps those media players don't properly handle the RTP payload format for H.264 video and/or AAC audio? I suggest asking on one of their mailing lists.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From: costantino.cerbo at gebit.de (costantino.cerbo at gebit.de)
Date: Wed, 6 Aug 2014 16:10:01 +0200
Subject: [Live-devel] Empty data file with the "-S " option

Hello,

I'd like to use openRTSP to receive the RTSP stream of my IP cam.

The first try failed because of the following error:
"Unable to create receiver for "video/MJPEG" subsession: RTP payload format unknown or not supported"

Then I added the option '-S 5' and the connection was successfully created:

Setup "video/MJPEG" subsession (client ports 55546-55547)
Created output file: "video-MJPEG-1"
Sending request: PLAY rtsp://192.168.100.1:1554/profile/1/local.sdp/ RTSP/1.0

but the output file "video-MJPEG-1" is empty, while on standard out the message "Received xyz new bytes of response data." is printed continuously.

What am I doing wrong?

Thanks in advance for your help!

Costantino from Germany

From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 6 Aug 2014 07:27:38 -0700
Subject: Re: [Live-devel] Empty data file with the "-S " option

> I'd like to use openRTSP to receive the RTSP stream of my IP cam.
>
> The first try failed because of the following error:
> "Unable to create receiver for "video/MJPEG" subsession: RTP payload
> format unknown or not supported"

This appears to be a bug in your IP camera. The IETF has not defined a RTP payload format named "MPEG". Perhaps they meant "JPEG"?

> Then I added the option '-S 5' and the connection was successfully created:
> Setup "video/MJPEG" subsession (client ports 55546-55547)
> Created output file: "video-MJPEG-1"
> Sending request: PLAY rtsp://192.168.100.1:1554/profile/1/local.sdp/ RTSP/1.0
>
> but the output file "video-MJPEG-1" is empty

See http://www.live555.com/liveMedia/faq.html#openRTSP-empty-files

Try adding the "-t" option to "openRTSP". If this still doesn't work, then please send us the complete RTSP protocol exchange (i.e., printed by "openRTSP" to 'stderr'), and we'll take a look at it, to see if it tells us anything.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 6 Aug 2014 07:43:49 -0700
Subject: Re: [Live-devel] Empty data file with the "-S " option

On Aug 6, 2014, at 7:27 AM, Ross Finlayson wrote:
>> I'd like to use openRTSP to receive the RTSP stream of my IP cam.
>>
>> The first try failed because of the following error:
>> "Unable to create receiver for "video/MJPEG" subsession: RTP payload
>> format unknown or not supported"
>
> This appears to be a bug in your IP camera. The IETF has not defined a RTP payload format named "MPEG"

Oops, I meant "The IETF has not defined a RTP payload format named "MJPEG"". Sorry.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From: costantino.cerbo at gebit.de (costantino.cerbo at gebit.de)
Date: Wed, 6 Aug 2014 17:11:48 +0200
Subject: [Live-devel] Antwort: Re: Empty data file with the "-S " option

Thanks for the prompt answer! Unfortunately even with the "-t" option the problem persists.

Here is the RTSP protocol exchange (without the "-t" option):

$ openRTSP -S 10 rtsp://192.168.100.1:1554/profile/1/local.sdp
Opening connection to 192.168.100.1, port 1554...
...remote connection opened
Sending request: OPTIONS rtsp://192.168.100.1:1554/profile/1/local.sdp RTSP/1.0
CSeq: 2
User-Agent: /home/c.cerbo/dev/live/testProgs/openRTSP (LIVE555 Streaming Media v2014.07.25)
EasyPassword: 123456

Received 100 new bytes of response data.
Received a complete OPTIONS response:
RTSP/1.0 200 OK
CSeq: 2
Public: OPTIONS,DESCRIBE,SETUP,PLAY,PAUSE,TEARDOWN
ForwardRequired: 1

Sending request: DESCRIBE rtsp://192.168.100.1:1554/profile/1/local.sdp RTSP/1.0
CSeq: 3
User-Agent: /home/c.cerbo/dev/live/testProgs/openRTSP (LIVE555 Streaming Media v2014.07.25)
EasyPassword: 123456
Accept: application/sdp

Received 435 new bytes of response data.
Received a complete DESCRIBE response:
RTSP/1.0 200 OK
CSeq: 3
Server: FaramCam/1.0.0
Date: Tue, 18 Jan 2005 22:56:43 GMT
Expires: Tue, 18 Jan 2005 22:56:43 GMT
Content-Base: rtsp://192.168.100.1:1554/profile/1/local.sdp/
Content-Type: application/sdp
Content-Length: 192

v=0
o=- 1 1 IN IP4 192.168.100.1
s=RTSP Session
c=IN IP4 192.168.100.1
t=0 0
a=range:npt=now-
a=control:*
m=video 0 RTP/AVP 125
b=AS:80
a=rtpmap:125 MJPEG/90000
a=control:trackID=0

Opened URL "rtsp://192.168.100.1:1554/profile/1/local.sdp", returning a SDP description:
v=0
o=- 1 1 IN IP4 192.168.100.1
s=RTSP Session
c=IN IP4 192.168.100.1
t=0 0
a=range:npt=now-
a=control:*
m=video 0 RTP/AVP 125
b=AS:80
a=rtpmap:125 MJPEG/90000
a=control:trackID=0

Created receiver for "video/MJPEG" subsession (client ports 36242-36243)
Sending request: SETUP rtsp://192.168.100.1:1554/profile/1/local.sdp/trackID=0 RTSP/1.0
CSeq: 4
User-Agent: /home/c.cerbo/dev/live/testProgs/openRTSP (LIVE555 Streaming Media v2014.07.25)
EasyPassword: 123456
Transport: RTP/AVP;unicast;client_port=36242-36243

Received 193 new bytes of response data.
Received a complete SETUP response:
RTSP/1.0 200 OK
CSeq: 4
Server: FaramCam/1.0.0
Date: Tue, 18 Jan 2005 22:56:53 GMT
Expires: Tue, 18 Jan 2005 22:56:53 GMT
Session: 70706
Transport: RTP/AVP/TCP;unicast;interleaved=0-1

Setup "video/MJPEG" subsession (client ports 36242-36243)
Created output file: "video-MJPEG-1"
Sending request: PLAY rtsp://192.168.100.1:1554/profile/1/local.sdp/ RTSP/1.0
CSeq: 5
User-Agent: /home/c.cerbo/dev/live/testProgs/openRTSP (LIVE555 Streaming Media v2014.07.25)
EasyPassword: 123456
Session: 70706
Range: npt=0.000-

Received 232 new bytes of response data.
Received a complete PLAY response:
RTSP/1.0 200 OK
CSeq: 5
Server: FaramCam/1.0.0
Date: Tue, 18 Jan 2005 22:56:53 GMT
Expires: Tue, 18 Jan 2005 22:56:53 GMT
Session: 70706
RTP-info: url=rtsp://192.168.100.1:1554/profile/1/local.sdp/trackID=0;seq=0;rtptime=0

Started playing session
Receiving streamed data (signal with "kill -HUP 29043" or "kill -USR1 29043" to terminate)...
Received 1448 new bytes of response data.
Received 915 new bytes of response data.
Received 1448 new bytes of response data.
Received 907 new bytes of response data.
Received 40 new bytes of response data.
Received 2371 new bytes of response data.
Received 111 new bytes of response data.
[...]

Von: Ross Finlayson
An: LIVE555 Streaming Media - development & use
Datum: 06.08.2014 16:35
Betreff: Re: [Live-devel] Empty data file with the "-S " option

I'd like to use openRTPS to receive the RTSP stream of my IP cam.
The first try failed becouse of the following error:
"Unable to create receiver for "video/MJPEG" subsession: RTP payload format unknown or not supported"
This appears to be a bug in your IP camera. The IETF has not defined a RTP payload format named "MPEG". Perhaps they meant "JPEG"?
Then I put the option '-S 5' and the connection was succesfully created:
Setup "video/MJPEG" subsession (client ports 55546-55547)
Created output file: "video-MJPEG-1"
Sending request: PLAY rtsp://192.168.100.1:1554/profile/1/local.sdp/ RTSP/1.0
but the output file: "video-MJPEG-1" is empty
See http://www.live555.com/liveMedia/faq.html#openRTSP-empty-files
Try adding the "-t" option to RTSP. If this still doesn't work, then please send us the complete RTSP protocol exchange (i.e., printed by "openRTSP" to 'stderr'), and we'll take a look at it, to see if it tells us anything.
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 6 Aug 2014 10:33:14 -0700
Subject: Re: [Live-devel] Antwort: Re: Empty data file with the "-S " option

I suspect that the problem here is that your IP camera is delivering RTP/RTCP packets over the RTSP TCP connection in a non-standard way. This (along with the fact that they announce a non-standard RTP payload format name) suggests that you probably won't be able to receive this data using our software.

I suggest first checking whether the camera has a firmware upgrade available that fixes this. If not, then I recommend contacting the manufacturer of your IP camera, telling them about this problem, and suggesting that they contact me to help work around this problem. (If they're interested, I'd be happy to work with them to help make their products standards-compliant.) Otherwise, I don't have anything to suggest, other than use a different IP camera.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From: james.heliker at gmail.com (James Heliker)
Date: Tue, 5 Aug 2014 21:34:45 -0700
Subject: [Live-devel] receiving live LPCM 16 stream / decode

Hello - I am new to the Live555 code, sorry for the newbie question! I need to decode an LPCM 16bit / 44.1 stream on my local area network (from a hardware encoder). Is there an example of a sink receiving / decoding PCM that anyone can share with me? Thanks!
- James Heliker
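No reply to this LPCM question appears later in this digest, so as a starting point, here is a minimal sketch of a sink that receives raw PCM frames from a MediaSubsession, modeled on the "DummySink" class in testRTSPClient.cpp. The class name, buffer size and output handling are my own additions, and the byte-swap assumes the stream is RTP "L16" audio, which is sent big-endian (RFC 3551), so little-endian hosts would swap each 16-bit sample before further processing.

    // A minimal PCM-receiving sink, following the DummySink pattern.
    #include "liveMedia.hh"
    #include <stdio.h>

    class PCMSink: public MediaSink {
    public:
      static PCMSink* createNew(UsageEnvironment& env, MediaSubsession& sub, FILE* out) {
        return new PCMSink(env, sub, out);
      }

    private:
      PCMSink(UsageEnvironment& env, MediaSubsession& sub, FILE* out)
        : MediaSink(env), fSubsession(sub), fOut(out) {
        fBuffer = new unsigned char[100000];
      }
      virtual ~PCMSink() { delete[] fBuffer; }

      static void afterGettingFrame(void* clientData, unsigned frameSize,
                                    unsigned /*numTruncatedBytes*/,
                                    struct timeval /*presentationTime*/,
                                    unsigned /*durationInMicroseconds*/) {
        PCMSink* sink = (PCMSink*)clientData;
        // Swap each 16-bit sample from network (big-endian) to host order,
        // then hand the samples to the decoder/writer of your choice:
        for (unsigned i = 0; i + 1 < frameSize; i += 2) {
          unsigned char tmp = sink->fBuffer[i];
          sink->fBuffer[i] = sink->fBuffer[i+1];
          sink->fBuffer[i+1] = tmp;
        }
        fwrite(sink->fBuffer, 1, frameSize, sink->fOut);
        sink->continuePlaying();               // ask for the next frame
      }

      virtual Boolean continuePlaying() {
        if (fSource == NULL) return False;
        fSource->getNextFrame(fBuffer, 100000, afterGettingFrame, this,
                              onSourceClosure, this);
        return True;
      }

      MediaSubsession& fSubsession;
      FILE* fOut;
      unsigned char* fBuffer;
    };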
From: filu005 at gmail.com (Filip Koperski)
Date: Wed, 6 Aug 2014 12:32:20 +0200
Subject: [Live-devel] Creating receiver-streamer combined H264 videoconference server

Hi,

I'm trying to create a server application for video conferencing, working in the following way: get incoming H264 data over RTP from multiple clients, decode the data to get separate frames, manipulate the frames, encode the frames to H264 and finally stream the data to multiple clients.

For now I have two separate parts working together:
H264 live streamer - gets frames from a camera (using opencv), encodes them to H264 (with the x264 library) and streams them as RTP multicast.
H264 receiver - collects incoming H264 data from multicast RTP, decodes it (with ffmpeg) and displays it in a window (using opencv).

My current idea for stitching these two parts together and creating my server is to create a sequence of sources (filters) with a single sink at the end:

inputRTPSource = H264VideoRTPSource(rtpGroupsockIn) <- collects incoming data from clients
computingSource = H264ComputingSource(inputRTPSource); <- this is where decoding, manipulating frames and encoding takes place; it takes inputRTPSource as its input source
videoSource = H264VideoStreamDiscreteFramer(computingSource);
sink = H264VideoRTPSink(rtpGroupsockOut); <- streams data to clients

start two RTCP sessions (for the incoming and outgoing streams):

rtcpInstanceIn = RTCPInstance(rtcpGroupsockIn, inputRTPSource);
rtcpInstanceOut = RTCPInstance(rtcpGroupsockOut, sink);

and play the whole thing like this:

sink->startPlaying(*videoSource, afterPlaying, NULL);

Is this the way to go? Am I on the right track? I'd be grateful for any insight!
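Written out with the actual LIVE555 calls, the chain outlined above might look like the sketch below. The multicast addresses, ports, bandwidth figure and CNAME are arbitrary, and "H264ComputingSource" is Filip's own decode/manipulate/re-encode filter (presumably a FramedFilter subclass), not part of the library.

    #include "liveMedia.hh"
    #include "BasicUsageEnvironment.hh"
    #include "GroupsockHelper.hh"

    // Filip's own filter (NOT part of LIVE555); only its assumed interface is shown:
    class H264ComputingSource: public FramedFilter {
    public:
      static H264ComputingSource* createNew(UsageEnvironment& env, FramedSource* inputSource);
      // ...decode with ffmpeg, manipulate, re-encode with x264, deliver discrete NAL units
    };

    static void afterPlaying(void* /*clientData*/) { /* tear down the chain */ }

    void runRelay(UsageEnvironment& env) {
      struct in_addr inAddr;  inAddr.s_addr  = our_inet_addr("232.255.42.42");
      struct in_addr outAddr; outAddr.s_addr = our_inet_addr("232.255.43.43");
      Groupsock rtpGroupsockIn  (env, inAddr,  Port(18888), 255);
      Groupsock rtcpGroupsockIn (env, inAddr,  Port(18889), 255);
      Groupsock rtpGroupsockOut (env, outAddr, Port(28888), 255);
      Groupsock rtcpGroupsockOut(env, outAddr, Port(28889), 255);

      // Incoming H.264/RTP from the clients, plus RTCP for that stream:
      RTPSource* inputRTPSource =
          H264VideoRTPSource::createNew(env, &rtpGroupsockIn, 96, 90000);
      RTCPInstance::createNew(env, &rtcpGroupsockIn, 500 /*kbps*/,
                              (unsigned char const*)"conf-server",
                              NULL /*no sink*/, inputRTPSource /*source*/);

      // Decode / manipulate / re-encode:
      FramedSource* computingSource = H264ComputingSource::createNew(env, inputRTPSource);

      // Re-frame the discrete NAL units and stream them back out, with RTCP:
      H264VideoStreamDiscreteFramer* videoSource =
          H264VideoStreamDiscreteFramer::createNew(env, computingSource);
      RTPSink* sink = H264VideoRTPSink::createNew(env, &rtpGroupsockOut, 96);
      RTCPInstance::createNew(env, &rtcpGroupsockOut, 500 /*kbps*/,
                              (unsigned char const*)"conf-server",
                              sink /*sink*/, NULL /*no source*/);

      sink->startPlaying(*videoSource, afterPlaying, NULL);
      env.taskScheduler().doEventLoop();  // does not return, so the stack objects stay alive
    }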
From: yueyanbin at vimicro.com (yueyanbin)
Date: Thu, 7 Aug 2014 10:02:56 +0800
Subject: [Live-devel] Why vlc could not get frame data?

Hello everyone,

Why can VLC process the RTP payload but not get the streamed frames? I do give the frame data to fTo. Could anyone give me any suggestions?

accept()ed connection from 10.140.70.20
RTSPClientConnection[0x16cb90]::handleRequestBytes() read 119 new bytes:
OPTIONS rtsp://10.140.70.251/h264 RTSP/1.0
CSeq: 2
User-Agent: LibVLC/2.1.5 (LIVE555 Streaming Media v2014.05.27)

parseRTSPRequestString() succeeded, returning cmdName "OPTIONS", urlPreSuffix "", urlSuffix "h264", CSeq "2", Content-Length 0, with 0 bytes following the message.
sending response:
RTSP/1.0 200 OK
CSeq: 2
Date: Tue, Jun 22 2010 10:29:05 GMT
Public: OPTIONS, DESCRIBE, SETUP, TEARDOWN, PLAY, PAUSE, GET_PARAMETER, SET_PARAMETER

RTSPClientConnection[0x16cb90]::handleRequestBytes() read 145 new bytes:
DESCRIBE rtsp://10.140.70.251/h264 RTSP/1.0
CSeq: 3
User-Agent: LibVLC/2.1.5 (LIVE555 Streaming Media v2014.05.27)
Accept: application/sdp

parseRTSPRequestString() succeeded, returning cmdName "DESCRIBE", urlPreSuffix "", urlSuffix "h264", CSeq "3", Content-Length 0, with 0 bytes following the message.
sending response:
RTSP/1.0 200 OK
CSeq: 3
Date: Tue, Jun 22 2010 10:29:05 GMT
Content-Base: rtsp://10.140.70.251/h264/
Content-Type: application/sdp
Content-Length: 343

v=0
o=- 1277200652988984 1 IN IP4 10.140.70.251
s=RTSP/RTP stream from IPNC
i=h264
t=0 0
a=tool:LIVE555 Streaming Media v2014.07.18
a=type:broadcast
a=control:*
a=range:npt=0-
a=x-qt-text-nam:RTSP/RTP stream from IPNC
a=x-qt-text-inf:h264
m=video 0 RTP/AVP 96
c=IN IP4 0.0.0.0
b=AS:1500
a=rtpmap:96 H264/90000
a=control:track1

RTSPClientConnection[0x16cb90]::handleRequestBytes() read 176 new bytes:
SETUP rtsp://10.140.70.251/h264/track1 RTSP/1.0
CSeq: 4
User-Agent: LibVLC/2.1.5 (LIVE555 Streaming Media v2014.05.27)
Transport: RTP/AVP;unicast;client_port=62298-62299

parseRTSPRequestString() succeeded, returning cmdName "SETUP", urlPreSuffix "h264", urlSuffix "track1", CSeq "4", Content-Length 0, with 0 bytes following the message.
sending response:
RTSP/1.0 200 OK
CSeq: 4
Date: Tue, Jun 22 2010 10:29:05 GMT
Transport: RTP/AVP;unicast;destination=10.140.70.20;source=10.140.70.251;client_port=62298-62299;server_port=6970-6971
Session: FB75C5A3;timeout=65

RTSPClientConnection[0x16cb90]::handleRequestBytes() read 155 new bytes:
PLAY rtsp://10.140.70.251/h264/ RTSP/1.0
CSeq: 5
User-Agent: LibVLC/2.1.5 (LIVE555 Streaming Media v2014.05.27)
Session: FB75C5A3
Range: npt=0.000-

parseRTSPRequestString() succeeded, returning cmdName "PLAY", urlPreSuffix "h264", urlSuffix "", CSeq "5", Content-Length 0, with 0 bytes following the message.
offset 106
sending response:
RTSP/1.0 200 OK
CSeq: 5
Date: Tue, Jun 22 2010 10:29:05 GMT
Range: npt=0.000-
Session: FB75C5A3
RTP-Info: url=rtsp://10.140.70.251/h264/track1;seq=49303;rtptime=4099823860

offset 114
framecount:1725, framesize:8, fTo:1720CA
offset 338
framecount:1726, framesize:224, fTo:1720D2
offset 580
framecount:1727, framesize:242, fTo:1721B2

RTSPClientConnection[0x16cb90]::handleRequestBytes() read 145 new bytes:
GET_PARAMETER rtsp://10.140.70.251/h264/ RTSP/1.0
CSeq: 6
User-Agent: LibVLC/2.1.5 (LIVE555 Streaming Media v2014.05.27)
Session: FB75C5A3

parseRTSPRequestString() succeeded, returning cmdName "GET_PARAMETER", urlPreSuffix "h264", urlSuffix "", CSeq "6", Content-Length 0, with 0 bytes following the message.
sending response:
RTSP/1.0 200 OK
CSeq: 6
Date: Tue, Jun 22 2010 10:29:05 GMT
Session: FB75C5A3
Content-Length: 10

2014.07.18

offset 800
framecount:1728, framesize:220, fTo:1722A4
offset 1026
....[my debug log]
framecount:1781, framesize:20706, fTo:1A50A5
RTSP client session (id "FB75C5A3", stream name "h264"): Liveness indication

Thanks
Yanbin Yue

From: finlayson at live555.com (Ross Finlayson)
Date: Wed, 6 Aug 2014 20:00:47 -0700
Subject: Re: [Live-devel] Why vlc could not get frame data?

> Why can VLC process the RTP payload but not get the streamed frames? I do give the frame data to fTo. Could anyone give me any suggestions?

Yes, here are some suggestions:
1/ Try to make your question clearer. You didn't explain specifically how you're using the "LIVE555 Streaming Media" software, and what specific problem you're having.
2/ Use "openRTSP" as a client to test your server, before using VLC. VLC is not our software.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From: m.porsch at intenta.de (Marco Porsch)
Date: Thu, 7 Aug 2014 13:04:08 +0000
Subject: [Live-devel] RTSP client 'openURL()' while event loop is already running

Hello list, hello Ross,

based on testRTSPClient.cpp I wrote a wrapper class that has an interface with the public functions

void connect(const char* URL);
void startAll();

connect() basically performs the same tasks as testRTSPClient's openURL(), while startAll() starts the event loop. The class's user has to issue the connect() calls before calling the blocking startAll() function. - This interface works just fine.
Now I would like to add the ability to connect() and disconnect() streams while the event loop is already running, i.e. startAll() has been called. For this case I have already set up an EventTriggerId to call into the running event loop, and a handler which triggers a deferred handling of openURL(). This works fine initially, but the newly connected streams always stall completely after a short time (a few seconds or just a few frames). The streams connected before startAll() continue to play just fine.

What could be going wrong here? Or is there any general issue that prevents starting new sessions from a running event loop?

Thanks and best regards,
Marco Porsch

From: finlayson at live555.com (Ross Finlayson)
Date: Thu, 7 Aug 2014 08:23:48 -0700
Subject: Re: [Live-devel] RTSP client 'openURL()' while event loop is already running

> Now I would like to add the ability to connect() and disconnect() streams while the event loop is already running, i.e. startAll() has been called. For this case I have already set up an EventTriggerId to call into the running event loop, and a handler which triggers a deferred handling of openURL(). This works fine initially, but the newly connected streams always stall completely after a short time (a few seconds or just a few frames). The streams connected before startAll() continue to play just fine.
>
> What could be going wrong here? Or is there any general issue that prevents starting new sessions from a running event loop?

No. It shouldn't matter whether you call "openURL()" from outside or inside the event loop (provided that - if you called it from outside the event loop - you enter the event loop next, as we do in the "testRTSPClient" code). I suspect that there's some bug in your code - having nothing to do with the event loop. I'd test this by (in order):
1/ Starting just a single stream, starting outside the event loop (as in the current "testRTSPClient" code)
2/ Starting just a single stream, starting inside the event loop
3/ Starting two streams, both starting outside the event loop (as in the current "testRTSPClient" code)
4/ Starting two streams, both starting inside the event loop
This will likely help you track down your bug.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
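For reference, a sketch of the "call openURL() from inside the running event loop" mechanism that Marco describes and Ross confirms should work. triggerEvent() is the one LIVE555 call that is documented as safe to invoke from another thread; openURL() is the function from testRTSPClient.cpp; the pending-URL queue and the surrounding function names are assumptions.

    #include "liveMedia.hh"
    #include "BasicUsageEnvironment.hh"
    #include <deque>
    #include <mutex>
    #include <string>

    // From testRTSPClient.cpp (declared there with this signature):
    void openURL(UsageEnvironment& env, char const* progName, char const* rtspURL);

    static std::deque<std::string> gPendingURLs;
    static std::mutex gPendingMutex;
    static EventTriggerId gOpenURLTrigger = 0;
    static UsageEnvironment* gEnv = NULL;

    // Runs inside the event-loop thread, so it may safely create RTSPClients:
    static void openPendingURLs(void* /*clientData*/) {
      for (;;) {
        std::string url;
        {
          std::lock_guard<std::mutex> lock(gPendingMutex);
          if (gPendingURLs.empty()) return;
          url = gPendingURLs.front();
          gPendingURLs.pop_front();
        }
        openURL(*gEnv, "deferred-connect", url.c_str());
      }
    }

    // Called once, before the event loop starts:
    void setupDeferredConnects(UsageEnvironment& env) {
      gEnv = &env;
      gOpenURLTrigger = env.taskScheduler().createEventTrigger(openPendingURLs);
    }

    // May be called from any other thread while doEventLoop() is running:
    void connectLater(char const* url) {
      {
        std::lock_guard<std::mutex> lock(gPendingMutex);
        gPendingURLs.push_back(url);
      }
      gEnv->taskScheduler().triggerEvent(gOpenURLTrigger, NULL);
    }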
From: costantino.cerbo at gebit.de (costantino.cerbo at gebit.de)
Date: Fri, 8 Aug 2014 09:59:49 +0200
Subject: [Live-devel] Antwort: Re: Antwort: Re: Empty data file with the "-S " option

Okay, thanks! I wrote to the camera manufacturer and I'm now waiting for their answer.

Von: Ross Finlayson
An: LIVE555 Streaming Media - development & use
Datum: 06.08.2014 19:40
Betreff: Re: [Live-devel] Antwort: Re: Empty data file with the "-S " option

I suspect that the problem here is that your IP camera is delivering RTP/RTCP packets over the RTSP TCP connection in a non-standard way. This (along with the fact that they announce a non-standard RTP payload format name) suggests that you probably won't be able to receive this data using our software.
I suggest first checking whether the camera has a firmware upgrade available that fixes this. If not, then I recommend contacting the manufacturer of your IP camera, telling them about this problem, and suggesting that they contact me to help work around this problem. (If they're interested, I'd be happy to work with them to help make their products standards-compliant.) Otherwise, I don't have anything to suggest, other than use a different IP camera.
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From: frombach002 at gmail.com (Alix Frombach)
Date: Fri, 8 Aug 2014 13:36:41 -0400
Subject: [Live-devel] OpenRTSP Frame Loss When Streaming Multiple Cameras

Hello,

I am seeing an issue when running multiple instances of openRTSP to capture H.264 data from multiple IP cameras. I am using command-line options that have openRTSP save the data into 12-second periodic files. I have openRTSP compiled and running on QNX 6.5.0 and can stream data from one and two cameras with no issues. When I stream from 3 or more cameras, however, I notice the library is not receiving all the frames. This grows worse the more cameras (and subsequently instances of openRTSP) I am streaming from.

Here is a rundown of the testing I have done so far which led me to believe this is a library issue. The issue is not network related, as running simultaneous instances of wireshark both on and off the board shows identical data. The issue is not CPU related, as CPU monitoring shows less than 25% usage for all applications running on the system. The issue is not a file-write-speed limitation, as the writes are nowhere near the maximum for the system.

I have placed some debugging within openRTSP to allow me to keep a counter of how many frames are received. A loss of received frames is observable as we start streaming from more cameras, as well as a delay in receiving frames (rather than receiving them constantly, as when only streaming from 1-2 cameras).

Is this something that has been observed before? Any reason separate instances of openRTSP would cause an issue when running simultaneously?

Thanks in advance for the help

From: neeravpatel at hotmail.com (Neerav Patel)
Date: Sat, 9 Aug 2014 01:40:57 +0000
Subject: [Live-devel] live555 onDemand RTSP server

Hi There

I have managed to write code to stream MJPEG over RTSP using the onDemandServer, and everything is working, but I want to be able to allow multiple viewers; when I try to set up another connection, everything begins to run super slow. Does the OnDemandServer allow this, or do I have to do some multicast stuff? Is there an example of how to do multicast using the onDemandServer?

From unicast mode, are there a lot of changes required to move to multicast mode?

Thanks in advance.
From: finlayson at live555.com (Ross Finlayson)
Date: Sun, 10 Aug 2014 02:53:47 -0700
Subject: Re: [Live-devel] OpenRTSP Frame Loss When Streaming Multiple Cameras

> I am seeing an issue when running multiple instances of openRTSP to capture H.264 data from multiple IP cameras. I am using command-line options that have openRTSP save the data into 12-second periodic files. I have openRTSP compiled and running on QNX 6.5.0 and can stream data from one and two cameras with no issues. When I stream from 3 or more cameras, however, I notice the library is not receiving all the frames. This grows worse the more cameras (and subsequently instances of openRTSP) I am streaming from.
>
> Here is a rundown of the testing I have done so far which led me to believe this is a library issue.

No. The problem is insufficiently large buffering in the receiving computer's OS. See:
http://www.live555.com/liveMedia/faq.html#packet-loss
and note, especially, the last paragraph.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From: finlayson at live555.com (Ross Finlayson)
Date: Sun, 10 Aug 2014 16:09:56 -0700
Subject: Re: [Live-devel] live555 onDemand RTSP server

> I have managed to write code to stream MJPEG over RTSP using the onDemandServer, and everything is working, but I want to be able to allow multiple viewers; when I try to set up another connection, everything begins to run super slow. Does the OnDemandServer allow this

Yes; however, be sure that your "OnDemandServerMediaSubsession" subclass sets the "reuseFirstSource" parameter (in the "OnDemandServerMediaSubsession" constructor) to True, so that the input source will get read only once, even if there's more than one concurrent client.

You should note, though, that JPEG is a very bandwidth-inefficient codec for streaming, and therefore you risk running into bandwidth limits, especially if you transmit more than one stream, as you are doing.

> or do I have to do some multicast stuff?

Yes, multicast would be better (provided, of course, that your receivers can all be reached - from the source - by IP multicast routing). To stream via multicast, you *don't* use a "OnDemandServerMediaSubsession"; instead, you use a "PassiveServerMediaSubsession". To see examples of multicast servers that use "PassiveServerMediaSubsession"s, note the various "test*Streamer" demo applications.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
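A condensed sketch of the multicast pattern Ross points to (the "test*Streamer" demos, e.g. testH264VideoStreamer.cpp): a single RTP sink sends to a multicast group, and a PassiveServerMediaSubsession merely describes that stream to RTSP clients. The stream name, ports and bandwidth figure are arbitrary; for Neerav's JPEG case the sink would be a JPEGVideoRTPSink fed by a JPEGVideoSource subclass rather than the H.264 classes used here.

    #include "liveMedia.hh"
    #include "BasicUsageEnvironment.hh"
    #include "GroupsockHelper.hh"

    static void afterPlaying(void* /*clientData*/) { /* e.g. restart the source */ }

    // "videoSource" is expected to be a framed H.264 source
    // (e.g. the output of an H264VideoStreamFramer or ...DiscreteFramer).
    void startMulticastServer(UsageEnvironment& env, FramedSource* videoSource) {
      struct in_addr destAddr;
      destAddr.s_addr = chooseRandomIPv4SSMAddress(env);  // an SSM multicast address

      Groupsock rtpGroupsock (env, destAddr, Port(18888), 255 /*ttl*/);
      Groupsock rtcpGroupsock(env, destAddr, Port(18889), 255 /*ttl*/);
      rtpGroupsock.multicastSendOnly();                    // we're a SSM source
      rtcpGroupsock.multicastSendOnly();

      RTPSink* videoSink = H264VideoRTPSink::createNew(env, &rtpGroupsock, 96);
      RTCPInstance* rtcp =
          RTCPInstance::createNew(env, &rtcpGroupsock, 500 /*kbps*/,
                                  (unsigned char const*)"multicast-server",
                                  videoSink, NULL /*no source*/, True /*SSM*/);

      RTSPServer* rtspServer = RTSPServer::createNew(env, 8554);
      if (rtspServer == NULL) return;
      ServerMediaSession* sms =
          ServerMediaSession::createNew(env, "live", "live", "multicast stream", True /*SSM*/);
      sms->addSubsession(PassiveServerMediaSubsession::createNew(*videoSink, rtcp));
      rtspServer->addServerMediaSession(sms);

      // Start sending now; each RTSP client that connects simply joins the group:
      videoSink->startPlaying(*videoSource, afterPlaying, videoSink);
      env.taskScheduler().doEventLoop();  // does not return
    }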
From: dnyanesh.gate at intelli-vision.com (Dnyanesh Gate)
Date: Mon, 11 Aug 2014 18:39:46 +0530
Subject: [Live-devel] File streaming session getting closed before eof received on client side

Hello All,

I am referring to the testVideoOnDemand.cpp source for streaming an h264 file. I am able to open the stream from VLC player, but the RTSP server closes the session before EOF is received on the client side. I have a raw h264 file of length 30 seconds, but the client plays video only up to 25 seconds.

When I checked the source for ByteStreamFileSource, I found that the server closes the session as soon as it finds feof(), but the client is still receiving frames and hasn't reached EOF. For a 30-second video it skips the last 3-4 seconds, but for larger video files more frames will be skipped; this is my concern. This happens because the (read-from-file FPS) is greater than the (streaming FPS). How can I know that all frames have been sent to the client successfully, so that I can then call handleClosure()? What would you suggest to overcome this problem?

Here is a link to the file I am using for testing:
https://www.dropbox.com/s/qwip5p9ij68fpm3/stream1.h264

-- 
Thanks & Regards,
DnyaneshG.

From: finlayson at live555.com (Ross Finlayson)
Date: Mon, 11 Aug 2014 07:07:03 -0700
Subject: Re: [Live-devel] File streaming session getting closed before eof received on client side

> I am referring to the testVideoOnDemand.cpp source

We do not have a file by that name. You should begin your testing with our (unmodified) "testOnDemandRTSPServer" code (as your server), and "openRTSP" (as your client).

> Here is a link to the file I am using for testing:
> https://www.dropbox.com/s/qwip5p9ij68fpm3/stream1.h264

The problem with this file is that it has been encoded incorrectly. It contains SPS ("Sequence Parameter Set") NAL units that say that the stream's frame rate is 29.97 frames-per-second (specifically: num_units_in_tick=1001; time_scale=60000). Therefore, our server streams the file at that frame rate. However, the video shows the clock on the wall running twice as fast as it should (i.e., 30 seconds elapse in about 15 seconds), which suggests that the frame rate should probably really be 15 frames-per-second. Therefore, you need to fix whatever encoder you used to generate this file.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From: lkxkfl at hotmail.com (Xuan)
Date: Mon, 11 Aug 2014 22:07:21 +0800
Subject: [Live-devel] Lots of Packets Lost When Receiving With VLC

Hi,

I have implemented the class H264LiveSource so that I can stream real-time H.264 video frames. My program captures images with the webcam of my notebook and encodes them. If the size is 640x480, VLC player can display it well. However, if the size is larger, e.g. 1280x720, VLC player can't display it well, and it seems lots of packets are lost.

Why are so many packets lost? Is it because the speed of sending packets is too fast? How do I deal with it? Thanks.

From: finlayson at live555.com (Ross Finlayson)
Date: Tue, 12 Aug 2014 17:10:58 -0700
Subject: Re: [Live-devel] Lots of Packets Lost When Receiving With VLC

> I have implemented the class H264LiveSource so that I can stream real-time H.264 video frames. My program captures images with the webcam of my notebook and encodes them. If the size is 640x480, VLC player can display it well. However, if the size is larger, e.g. 1280x720, VLC player can't display it well, and it seems lots of packets are lost.
> Why are so many packets lost?

It's difficult to say, because we know so little about your application. However, I suggest making sure that your "FramedSource" subclass (for delivering H.264 NAL units) is not truncating some frames. I.e., check that "fMaxSize" is never < your NAL unit size. (If you are, you can increase your RTP output buffer size by calling "OutPacketBuffer::increaseMaxSizeTo()".)

Also, you should use "openRTSP" as a client to test your server, before using VLC. (VLC is not our software.)

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
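The two checks Ross suggests above, written out. The helper function is an illustration, not library API; the fTo/fMaxSize/fFrameSize/fNumTruncatedBytes names are the real FramedSource members that a delivery routine would pass in, and the value given to OutPacketBuffer::increaseMaxSizeTo() is a guess to be tuned for 720p NAL units.

    #include <string.h>

    // Call from a FramedSource subclass's delivery code with its own members:
    //   copyNALUnitChecked(fTo, fMaxSize, nalData, nalSize, fFrameSize, fNumTruncatedBytes);
    void copyNALUnitChecked(unsigned char* fTo, unsigned fMaxSize,
                            unsigned char const* nalData, unsigned nalSize,
                            unsigned& fFrameSize, unsigned& fNumTruncatedBytes) {
      if (nalSize > fMaxSize) {                  // the sink's buffer is too small:
        fFrameSize = fMaxSize;                   // deliver what fits...
        fNumTruncatedBytes = nalSize - fMaxSize; // ...and report the rest as truncated
      } else {
        fFrameSize = nalSize;
        fNumTruncatedBytes = 0;
      }
      memmove(fTo, nalData, fFrameSize);
    }

    // In addition, before any RTPSink is created (e.g. first thing in main()):
    //   OutPacketBuffer::increaseMaxSizeTo(500000);  // let a whole 720p NAL unit fit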
From: john_q61 at yahoo.com (John Q)
Date: Wed, 13 Aug 2014 05:45:50 -0700
Subject: testOnDemandRTSPServer timing issues

I am sending an RTP + MPEG-TS live stream from a Windows machine over the WAN to a server machine which is running "testMPEG2TransportReceiver" and piping its output to "testOnDemandRTSPServer". I then connect a client to receive the RTSP stream. I noticed that "testOnDemandRTSPServer" buffers data if no client is connected to it. So if I connect a client a minute after starting the stream, I see the stream from a minute back.

I want to turn off this buffering in "testOnDemandRTSPServer" when no client is connected to it. I want to see the live stream in the client from the instant it connects to the server. How can I do this?

Regards
John

From: office at acident.ro (iulian baciu)
Date: Thu, 14 Aug 2014 11:26:23 +0300
Subject: [Live-devel] ubuntu 14.04 not working

Hello!

I just came across your software. We need to restream RTSP with authentication from security DVRs. The VLC implementation on our VPS is not working stably. So, the problem:

- We tried to install live555 with this method: http://ubuntuforums.org/showthread.php?t=1324290 but it did not even compile, because of a ./genM... error.
- So we ran "sudo apt-get install livemedia-utils" and that was OK; live555MediaServer starts, and live555ProxyServer too. But where is the location to put files to be streamed by live555MediaServer?
- live555ProxyServer does not actually restream the RTSP, but on the CLI it shows that it does!

Thank you

From: finlayson at live555.com (Ross Finlayson)
Date: Thu, 14 Aug 2014 01:44:43 -0700
Subject: Re: [Live-devel] ubuntu 14.04 not working

First, make sure that you're using the latest version of the "LIVE555 Streaming Media" code; see http://www.live555.com/liveMedia/faq.html#latest-version

Instructions for building and installing the software on Linux (and other Unix-like systems) can be found at http://www.live555.com/liveMedia/#config-unix

If you're still having problems, you're going to have to explain them *much* more clearly than you did in your previous message. Sorry.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
From: finlayson at live555.com (Ross Finlayson)
Date: Fri, 15 Aug 2014 22:59:33 -0700
Subject: Re: [Live-devel] testOnDemandRTSPServer timing issues

> I am sending an RTP + MPEG-TS live stream from a Windows machine over the WAN to a server machine which is running "testMPEG2TransportReceiver" and piping its output to "testOnDemandRTSPServer". I then connect a client to receive the RTSP stream. I noticed that "testOnDemandRTSPServer" buffers data if no client is connected to it.

Sort of. Data is being buffered, but it's not happening in the "LIVE555 Streaming Media" library or application code. The buffering is happening in the receiver's OS (in the OS kernel, and the pipe).

A better solution would be to have your server not start receiving data until a client asks for it. A way to do this is to run a "LIVE555 Proxy Server" (see ), rather than "testOnDemandRTSPServer". For this to work, your Windows machine will also need to run a RTSP server (i.e., a 'back end' server) delivering your MPEG TS/RTP stream.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From: liveking130 at gmail.com (live king)
Date: Sat, 16 Aug 2014 17:20:18 +0530
Subject: [Live-devel] Streaming JPEG over RTP

Hi all,

I want to use the test program testOnDemandRTSPServer to stream JPEG over RTP, and use testRTSPClient to save the frame received in fReceiveBuffer into a .jpg file. Can anyone please suggest how I can go about doing this?

Thanks,
King

From: finlayson at live555.com (Ross Finlayson)
Date: Sun, 17 Aug 2014 01:37:26 -0700
Subject: Re: [Live-devel] Streaming JPEG over RTP

> I want to use the test program testOnDemandRTSPServer to stream JPEG over RTP

See http://www.live555.com/liveMedia/faq.html#jpeg-streaming

> and use testRTSPClient to save the frame received in fReceiveBuffer into a .jpg file.

You can use the "-m" option to "openRTSP" for this; see http://www.live555.com/openRTSP/

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

From: liveking130 at gmail.com (live king)
Date: Sun, 17 Aug 2014 18:45:49 +0530
Subject: Re: [Live-devel] Streaming JPEG over RTP

Hi Ross,

Thanks for the timely reply. I have taken a look at the Elphel code you directed me to. Currently I have used the Elphel model to create my own class MyJPEGVideoSource that inherits from JPEGVideoSource. I have copied the exact same code of the ElphelJPEGDeviceSource classes and just changed the file read (initially it was /dev/ccam_dma.raw) to a jpg file on my hard disk.
ElphelJPEGDeviceSource* ElphelJPEGDeviceSource::createNew(UsageEnvironment& env, unsigned timePerFrame) {
  FILE* fid = fopen("/dev/ccam_dma.raw", "rb");
  if (fid == NULL) {
    env.setResultErrMsg("Failed to open input device file");
    return NULL;
  }
  return new ElphelJPEGDeviceSource(env, fid, timePerFrame);
}

to this:

MyJPEGVideoSource* MyJPEGVideoSource::createNew(UsageEnvironment& env, unsigned timePerFrame) {
  FILE* fid = fopen("/path/to/my/jpgFile", "rb");
  if (fid == NULL) {
    env.setResultErrMsg("Failed to open input device file");
    return NULL;
  }
  return new MyJPEGVideoSource(env, fid, timePerFrame);
}

Now the RTSP server to stream JPEG/RTP is ready, and I am able to stream this to a player like VLC. The problem I am facing now is that if I use a client such as testRTSPClient and try to store fReceiveBuffer to a .jpg file, the jpg file is unreadable in an image viewer. I need testRTSPClient to store the jpg (as it is of high priority to me). My question is whether the problem is with the RTSP server, or whether testRTSPClient is incapable of receiving JPEG images. Or is there some code that I should add to MyJPEGVideoSource? Please do help. Any kind of help is appreciated. Thanks in Advance, King On Sat, Aug 16, 2014 at 5:20 PM, live king wrote: > Hi all, > > I want to use the test program testOnDemandRTSPServer to stream JPEG over > RTP and use the testRTSPClient to save the frame received in > fReceiveBuffer into a jpg file. > Can anyone please suggest how I can go about doing it? > > > Thanks, > King > -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Sun Aug 17 13:43:11 2014 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 17 Aug 2014 13:43:11 -0700 Subject: [Live-devel] Streaming JPEG over RTP In-Reply-To: References: Message-ID: > Thanks for the timely reply. No, it wasn't a "timely reply". My reply was deliberately delayed by about a day because you use an unprofessional email address. (As explained in the FAQ, if you want to be taken seriously on this mailing list, you should use a professional email address. This software is not intended for casual hobbyists.) > Now the RTSP server to stream JPEG/RTP is ready, and I am able to stream this to a player like VLC. > The problem I am facing now is that if I use a client such as testRTSPClient and try to store fReceiveBuffer to a .jpg file, the jpg file is unreadable in an image viewer. If you're receiving a JPEG/RTP stream using "testRTSPClient", then when you enter the "DummySink::afterGettingFrame()" function ("testRTSPClient.cpp", line 502), then "fReceiveBuffer" should point to a complete JPEG image, of size "frameSize". (You should make sure, though, that "numTruncatedBytes" is zero; if it's not, then you'll need to increase "DUMMY_SINK_RECEIVE_BUFFER_SIZE" accordingly.) Alternatively, as I explained before, you can just run "openRTSP" with the "-m" option to record each incoming frame as a separate file (in this case, a JPEG image). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL:
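For illustration, here is one minimal way to dump each received frame to a file from "DummySink::afterGettingFrame()" in "testRTSPClient.cpp". Only "fReceiveBuffer", "frameSize", "numTruncatedBytes" and "DUMMY_SINK_RECEIVE_BUFFER_SIZE" come from testRTSPClient itself; the helper name and the file-naming scheme are made up for this sketch:

    #include <stdio.h>

    // Hypothetical helper: write one received JPEG frame to "frame-NNNNN.jpg".
    static void saveFrameAsJPEG(unsigned char const* data, unsigned size) {
      static unsigned counter = 0;
      char fileName[64];
      sprintf(fileName, "frame-%05u.jpg", counter++);
      FILE* f = fopen(fileName, "wb");
      if (f != NULL) {
        fwrite(data, 1, size, f);
        fclose(f);
      }
    }

    // Then, inside DummySink::afterGettingFrame(), before continuePlaying():
    //   if (numTruncatedBytes == 0) saveFrameAsJPEG(fReceiveBuffer, frameSize);
    // (If numTruncatedBytes is ever non-zero, increase DUMMY_SINK_RECEIVE_BUFFER_SIZE,
    //  as noted in the reply above.)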
From pragneshkumar.patel at einfochips.com Sun Aug 17 22:48:34 2014 From: pragneshkumar.patel at einfochips.com (Pragneshkumar Patel) Date: Mon, 18 Aug 2014 05:48:34 +0000 Subject: [Live-devel] Supporting Axis camera like RTSP URLs Message-ID: <1408340908439.27526@einfochips.com> Hello, This is Pragnesh and I recently joined the live-devel mailing list. I have two questions regarding the RTSPServer class of liveMedia. 1. Is it possible to have the live555 RTSP server support typical Axis-camera-like URLs, "rtsp://:/media.sdp?id=&vcodec=&audio=<1/0>&...". Here the query can be an arbitrarily long list of name-value pairs. If this is possible, what kind of changes do I need to make, keeping the LGPL licensing terms and conditions in mind? 2. Is it possible to have callback-like functionality? My application wants a callback from Live555 as and when a new RTSP request comes in. Thank you in advance. Regards, Pragnesh -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Mon Aug 18 02:23:33 2014 From: finlayson at live555.com (Ross Finlayson) Date: Mon, 18 Aug 2014 02:23:33 -0700 Subject: [Live-devel] Supporting Axis camera like RTSP URLs In-Reply-To: <1408340908439.27526@einfochips.com> References: <1408340908439.27526@einfochips.com> Message-ID: <89DA1E9E-F3F0-41D2-806F-DD3CC08EAE91@live555.com> > 1. Is it possible to have the live555 RTSP server support typical Axis-camera-like URLs, > "rtsp://:/media.sdp?id=&vcodec=&audio=<1/0>&...". > Here the query can be an arbitrarily long list of name-value pairs. If this is possible, what kind of changes do I need to make We don't support this kind of URL by default, because such URLs are not defined in the RTSP standard. However, you could support this by subclassing "RTSPServer" and reimplementing the virtual member function "lookupServerMediaSession()". > keeping the LGPL licensing terms and conditions in mind? If you subclass the existing classes, but make no change to the existing code (.hh or .cpp files), then - according to the LGPL - you can distribute your product without having to distribute your subclass code. Note, however, other implications of the LGPL, including the fact that you must, upon request, upgrade your product(s) to use the latest version of the "LIVE555 Streaming Media" code. See http://www.live555.com/liveMedia/faq.html#copyright-and-license > > 2. Is it possible to have callback-like functionality? My application wants a callback from Live555 as and when a new RTSP request comes in. You can do this by subclassing "RTSPServer::RTSPClientConnection", and then reimplementing the virtual member function "handleCmd_DESCRIBE". (Ditto for any other RTSP request that you want to have a 'callback' on.) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL:
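As a starting point for the "lookupServerMediaSession()" route described above, here is a small, self-contained helper for splitting such a query string into name/value pairs. Wiring it into an "RTSPServer" subclass is deliberately left out, because the exact signature of the virtual "lookupServerMediaSession()" has changed across LIVE555 versions; everything below is an illustrative sketch, not library code:

    #include <map>
    #include <string>

    // Split e.g. "media.sdp?id=1&vcodec=h264&audio=1" into a base stream name
    // ("media.sdp") and a map of name/value pairs from the query part.
    static std::map<std::string, std::string>
    parseStreamNameQuery(std::string const& streamName, std::string& baseName) {
      std::map<std::string, std::string> params;
      std::string::size_type q = streamName.find('?');
      baseName = streamName.substr(0, q); // whole name if there is no '?'
      if (q == std::string::npos) return params;

      std::string query = streamName.substr(q + 1);
      std::string::size_type pos = 0;
      while (pos <= query.size()) {
        std::string::size_type amp = query.find('&', pos);
        std::string pair = (amp == std::string::npos)
                             ? query.substr(pos) : query.substr(pos, amp - pos);
        std::string::size_type eq = pair.find('=');
        if (!pair.empty()) {
          if (eq == std::string::npos) params[pair] = "";
          else params[pair.substr(0, eq)] = pair.substr(eq + 1);
        }
        if (amp == std::string::npos) break;
        pos = amp + 1;
      }
      return params;
    }

Inside a subclassed server, the returned map could then drive which codecs/tracks get added to the "ServerMediaSession" that the override creates or returns.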
From neeravpatel at hotmail.com Mon Aug 18 19:48:57 2014 From: neeravpatel at hotmail.com (Neerav Patel) Date: Tue, 19 Aug 2014 02:48:57 +0000 Subject: [Live-devel] Live Audio RTSP Streaming Message-ID: Hi, I am trying to send MP2-encoded frames via RTSP (live555) from a live source (a microphone). I am using ffmpeg to encode the audio stream, and I am sending it by overriding FramedSource; for the OnDemandServer I am using MPEG1or2AudioRTPSink.hh and MPEG1or2AudioStreamFramer.hh. Now in VLC I get to hear maybe half a second worth of sound and then it just stops... I don't know what I am doing wrong. Has anyone experienced this problem before? Am I doing this right, or should I be overriding a different class? Thanks in advance. Neerav -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Aug 19 13:29:05 2014 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 19 Aug 2014 13:29:05 -0700 Subject: [Live-devel] Live Audio RTSP Streaming In-Reply-To: References: Message-ID: > I am trying to send MP2-encoded frames via RTSP (live555) from a live source (a microphone). I am using ffmpeg to encode the audio stream, and I am sending it by overriding FramedSource; for the OnDemandServer I am using MPEG1or2AudioRTPSink.hh and MPEG1or2AudioStreamFramer.hh. "MPEG1or2AudioStreamFramer" is used only when the input source is a *byte stream* (e.g., an MP3 file). If, instead, your "FramedSource" subclass is delivering discrete MPEG audio frames (i.e., delivering one frame at a time), then you should *not* use a "MPEG1or2AudioStreamFramer". Instead, your input source should be fed directly into a "MPEG1or2AudioRTPSink", with no intermediate 'framer' object. Note, however, that if your "FramedSource" subclass delivers discrete frames, then you must set "fPresentationTime" correctly for each frame, before completing delivery. > Now in VLC I get to hear maybe half a second worth of sound and then it just stops... VLC is not our software. You should first use "testRTSPClient" and "openRTSP" as RTSP clients, before using VLC. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL:
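To make the 'no framer for discrete frames' point above concrete, here is a hedged sketch of an "OnDemandServerMediaSubsession" for a source that already delivers one complete MPEG audio frame per delivery. "MyMP2DeviceSource" is a hypothetical stand-in for the poster's microphone/ffmpeg source; the LIVE555 class and method names are real, but treat the whole thing as an illustration:

    #include "OnDemandServerMediaSubsession.hh"
    #include "MPEG1or2AudioRTPSink.hh"
    // "MyMP2DeviceSource.hh" is hypothetical - a FramedSource subclass that
    // delivers one complete, discrete MP2 frame per doGetNextFrame() delivery,
    // and sets fPresentationTime (and fDurationInMicroseconds) for each frame.
    #include "MyMP2DeviceSource.hh"

    class LiveMP2AudioSubsession: public OnDemandServerMediaSubsession {
    public:
      static LiveMP2AudioSubsession* createNew(UsageEnvironment& env, Boolean reuseFirstSource) {
        return new LiveMP2AudioSubsession(env, reuseFirstSource);
      }

    protected:
      LiveMP2AudioSubsession(UsageEnvironment& env, Boolean reuseFirstSource)
        : OnDemandServerMediaSubsession(env, reuseFirstSource) {}

      virtual FramedSource* createNewStreamSource(unsigned /*clientSessionId*/,
                                                  unsigned& estBitrate) {
        estBitrate = 128; // kbps (an estimate, not the 44100 sampling rate)
        // Return the device source directly - no MPEG1or2AudioStreamFramer -
        // because it already delivers discrete frames.
        return MyMP2DeviceSource::createNew(envir());
      }

      virtual RTPSink* createNewRTPSink(Groupsock* rtpGroupsock,
                                        unsigned char /*rtpPayloadTypeIfDynamic*/,
                                        FramedSource* /*inputSource*/) {
        return MPEG1or2AudioRTPSink::createNew(envir(), rtpGroupsock);
      }
    };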
From dnyanesh.gate at intelli-vision.com Wed Aug 20 04:46:00 2014 From: dnyanesh.gate at intelli-vision.com (Dnyanesh Gate) Date: Wed, 20 Aug 2014 17:16:00 +0530 Subject: [Live-devel] RTCPInstance error: Hit limit when reading incoming packet over TCP. Increase "maxRTCPPacketSize" Message-ID: Hello, I am working on the Live555 proxy server. I have one camera connected to the proxy server, and I am able to open its live stream from the proxy server on the first attempt. But as soon as I disconnect the client (openRTSP/VLC), the proxy server starts giving this error: RTCPInstance error: Hit limit when reading incoming packet over TCP. Increase "maxRTCPPacketSize". When I checked, the last command sent by the proxy server was a PAUSE command, but PAUSE is not supported by my camera, so I changed it to TEARDOWN just as a cross-check. It still gives the above error on client disconnection. I also tried to open a live stream ("rtsp://streaming1.osu.edu/media2/ufsap/ufsap.mov") which does support PAUSE, but still no success; after 2-3 attempts it again started giving the above error. Can you please suggest anything on this? -- Thanks & Regards, DnyaneshG. From finlayson at live555.com Wed Aug 20 08:15:43 2014 From: finlayson at live555.com (Ross Finlayson) Date: Wed, 20 Aug 2014 08:15:43 -0700 Subject: [Live-devel] RTCPInstance error: Hit limit when reading incoming packet over TCP. Increase "maxRTCPPacketSize" In-Reply-To: References: Message-ID: <2666A686-4158-4928-9E57-C9EFFC1941C5@live555.com> On Aug 20, 2014, at 4:46 AM, Dnyanesh Gate wrote: > Hello, > > I am working on the Live555 proxy server. I have one camera connected to > the proxy server, and I am able to open its live stream from the proxy server > on the first attempt. > But as soon as I disconnect the client (openRTSP/VLC), the proxy server > starts giving this error: RTCPInstance error: Hit limit when reading > incoming packet over TCP. Increase "maxRTCPPacketSize". Were you using the latest version of the "LIVE555 Streaming Media" code when you built the "LIVE555 Proxy Server"? Also, what is the make and model of your 'back-end' server (i.e., network camera)? Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From Kenneth.Forsythe at activu.com Wed Aug 20 11:58:06 2014 From: Kenneth.Forsythe at activu.com (Kenneth Forsythe) Date: Wed, 20 Aug 2014 14:58:06 -0400 Subject: [Live-devel] Live555 liveMedia wrapped as into a DirectShow filter Message-ID: Hello, We are currently developing a streaming player application that utilizes DirectShow filters. We were interested in wrapping liveMedia into a filter to be able to plug it in as a possible RTSP client source filter. Have you done this, or could you help us with that? We would also be very appreciative of a quick response. Thank You -------------- next part -------------- An HTML attachment was scrubbed... URL: From neeravpatel at hotmail.com Wed Aug 20 06:30:40 2014 From: neeravpatel at hotmail.com (Neerav Patel) Date: Wed, 20 Aug 2014 13:30:40 +0000 Subject: [Live-devel] live audio source with onDemandServer Message-ID: Hi, I am trying to set up live555 to stream RTSP with an OnDemandServer from a microphone, but I am not sure how to do so. I have attempted to do this by overriding OnDemandServerMediaSubsession and FramedSource, but I am running into an issue where I hear a bit of sound for half a second and then silence; in VLC the Messages say "buffer arrived way too early"... I am using ffmpeg to encode the audio as MP2.
I have attached what I am doing here:

#ifndef _FRAMED_SOURCE_HH
#include "FramedSource.hh"
#include "AudioTransfer.h"
#endif

class MP2DeviceSource : public FramedSource {
public:
  static MP2DeviceSource* createNew(UsageEnvironment& env, unsigned int stream_id, AudioTransfer* audioTransfer);
public:
  EventTriggerId eventTriggerId;
protected:
  MP2DeviceSource(UsageEnvironment& env, AudioTransfer* audioTransfer);
  virtual ~MP2DeviceSource();
private:
  virtual void doGetNextFrame();
private:
  static void deliverFrame0(void* clientData);
  void deliverFrame();
private:
  AudioTransfer* audioTx;
};

#include "MP2DeviceSource.h"

MP2DeviceSource* MP2DeviceSource::createNew(UsageEnvironment& env, unsigned int stream_id, AudioTransfer* audioTransfer) {
  return new MP2DeviceSource(env, audioTransfer);
}

MP2DeviceSource::MP2DeviceSource(UsageEnvironment& env, AudioTransfer* audioTransfer)
  : FramedSource(env), audioTx(audioTransfer) {
  if (eventTriggerId == 0)
    eventTriggerId = envir().taskScheduler().createEventTrigger(deliverFrame0);
}

MP2DeviceSource::~MP2DeviceSource() {
  envir().taskScheduler().deleteEventTrigger(eventTriggerId);
  eventTriggerId = 0;
}

void MP2DeviceSource::doGetNextFrame() {
  deliverFrame();
}

void MP2DeviceSource::deliverFrame0(void* clientData) {
  ((MP2DeviceSource*)clientData)->deliverFrame();
}

static const unsigned __int64 epoch = 116444736000000000;

int gettimeofday(struct timeval* tp, struct timezone* tzp) {
  FILETIME file_time;
  SYSTEMTIME system_time;
  ULARGE_INTEGER ularge;
  GetSystemTime(&system_time);
  SystemTimeToFileTime(&system_time, &file_time);
  ularge.LowPart = file_time.dwLowDateTime;
  ularge.HighPart = file_time.dwHighDateTime;
  tp->tv_sec = (long) ((ularge.QuadPart - epoch) / 10000000L);
  tp->tv_usec = (long) (system_time.wMilliseconds * 1000);
  return 0;
}

void MP2DeviceSource::deliverFrame() {
  gettimeofday(&fPresentationTime, NULL);
  audioTx->GetMP2Image(&fTo, &fFrameSize);
  fDurationInMicroseconds = 26000;
  FramedSource::afterGetting(this);
}

#ifndef _ON_DEMAND_SERVER_MEDIA_SUBSESSION_HH
#include "OnDemandServerMediaSubsession.hh"
#endif

class MP2AudioMediaSubsession: public OnDemandServerMediaSubsession {
public:
  static MP2AudioMediaSubsession* createNew(UsageEnvironment& env, Boolean reuseFirstSource, AudioTransfer* audioTransfer);
protected:
  MP2AudioMediaSubsession(UsageEnvironment& env, Boolean reuseFirstSource, AudioTransfer* audioTransfer);
  virtual ~MP2AudioMediaSubsession();
protected:
  virtual FramedSource* createNewStreamSource(unsigned clientSessionId, unsigned& estBitrate);
  virtual RTPSink* createNewRTPSink(Groupsock* rtpGroupSock, unsigned char rtpPayloadTypeIfDynamic, FramedSource* inputSource);
protected:
  unsigned int id;
  AudioTransfer* audioTx;
};

#include "MP2MediaSubsession.h"
#include "MP2DeviceSource.h"
#include "MPEG1or2AudioRTPSink.hh"
#include "MPEG1or2AudioStreamFramer.hh"

MP2AudioMediaSubsession* MP2AudioMediaSubsession::createNew(UsageEnvironment& env, Boolean reuseFirstSource, AudioTransfer* audioTransfer) {
  return new MP2AudioMediaSubsession(env, reuseFirstSource, audioTransfer);
}

MP2AudioMediaSubsession::MP2AudioMediaSubsession(UsageEnvironment& env, Boolean reuseFirstSource, AudioTransfer* audioTransfer)
  : OnDemandServerMediaSubsession(env, reuseFirstSource), audioTx(audioTransfer) {
}

FramedSource* MP2AudioMediaSubsession::createNewStreamSource(unsigned clientSessionId, unsigned& estBitrate) {
  estBitrate = 44100;
  MP2DeviceSource* source = MP2DeviceSource::createNew(envir(), id, audioTx);
  return MPEG1or2AudioStreamFramer::createNew(envir(), source);
}

RTPSink* MP2AudioMediaSubsession::createNewRTPSink(Groupsock* rtpGroupSock, unsigned char rtpPayloadTypeIfDynamic, FramedSource* inputSource) {
  return MPEG1or2AudioRTPSink::createNew(envir(), rtpGroupSock);
}

MP2AudioMediaSubsession::~MP2AudioMediaSubsession() {
}

-------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Aug 21 08:19:01 2014 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 21 Aug 2014 08:19:01 -0700 Subject: [Live-devel] Live555 liveMedia wrapped as into a DirectShow filter In-Reply-To: References: Message-ID: <4A5ED6E2-DF62-4B49-8ECB-7FE95BA9253A@live555.com> > We are currently developing a streaming player application that utilizes DirectShow filters. We were interested in wrapping liveMedia into a filter to be able to plug it in as a possible RTSP client source filter. Have you done this, or could you help us with that? We don't provide any explicit support for DirectShow filters, because they are specific to just one OS platform (and a waning one at that). However, several people on this mailing list have developed their own DirectShow filters that use our RTSP client code. Perhaps some of them would like to contribute tips on how to do this? Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From Jonathan.Anderson at wallawalla.edu Thu Aug 21 22:07:59 2014 From: Jonathan.Anderson at wallawalla.edu (Jonathan Anderson) Date: Fri, 22 Aug 2014 05:07:59 +0000 Subject: [Live-devel] Live555 Buffer Latency Message-ID: Hello, My name is Jon Anderson. I am working on an application using Live555 to stream L16 data for real-time audio playback at the lowest latency possible. I have stripped down many buffers on the OS side and have gotten my stream to play back with a latency of about 60 - 70 ms. My attention is now focused on Live555. I have noticed several buffers built into Live555, but the only one that I expect to cause latency is the ReorderingPacketBuffer. For now, my plan is to get the latency down as low as possible and then work it back up for the sake of audio quality. With that in mind, I set the ReorderingPacketBuffer threshold time to 0ms. What other buffers might be causing audio playback delay? -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Aug 21 22:35:24 2014 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 21 Aug 2014 22:35:24 -0700 Subject: [Live-devel] Live555 Buffer Latency In-Reply-To: References: Message-ID: > My name is Jon Anderson. I am working on an application using Live555 to stream L16 data for real-time audio playback at the lowest latency possible. I have stripped down many buffers on the OS side and have gotten my stream to play back with a latency of about 60 - 70 ms. My attention is now focused on Live555. I have noticed several buffers built into Live555, but the only one that I expect to cause latency is the ReorderingPacketBuffer. For now, my plan is to get the latency down as low as possible and then work it back up for the sake of audio quality. With that in mind, I set the ReorderingPacketBuffer threshold time to 0ms. Note that the 'packet reordering threshold time' (default value: 100ms) will occur as latency *only* when a packet gets lost, or if packets arrive out of order. (In the latter case, the actual latency will be <= the threshold value.) If there's no packet loss or reordering on your network (which should be the common case if you're streaming over a LAN), then the 'packet reordering threshold time' will have NO effect on latency. But if packets ever arrive out of order, then the 'packet reordering threshold time' is important, otherwise the out-of-order packet(s) will get discarded. Therefore, I don't recommend that you change this value, unless you're *certain* that packets will never get reordered on your network, yet you expect packet loss to be quite common. > What other buffers might be causing audio playback delay? Nothing in the LIVE555 code. Note, however, that if you're using a media player (on the client side) to render and play the audio, then the media player will typically have a 'jitter buffer' that smooths out the incoming audio samples, before playing them. There's going to be some latency there - and it's unavoidable, because network jitter is unavoidable. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL:
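For anyone looking for where this threshold is adjusted programmatically, client code in the style of "openRTSP" sets it per subsession after SETUP, roughly as below. The setter name is taken from the LIVE555 RTP-source code, but double-check it against the headers of the version you build with; the surrounding variable is an assumption from a typical testRTSPClient-style setup:

    // After a subsession has been set up (e.g. in continueAfterSETUP()):
    MediaSubsession* subsession = /* the subsession that was just SETUP'd */;
    if (subsession->rtpSource() != NULL) {
      // Value is in microseconds. 0 trades robustness to reordering for latency,
      // while a larger value (openRTSP uses up to 1 second when recording)
      // tolerates more reordering at the cost of added delay after a loss.
      subsession->rtpSource()->setPacketReorderingThresholdTime(100000); // 100 ms (the default)
    }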
From finlayson at live555.com Fri Aug 22 01:15:16 2014 From: finlayson at live555.com (Ross Finlayson) Date: Fri, 22 Aug 2014 01:15:16 -0700 Subject: [Live-devel] live audio source with onDemandServer In-Reply-To: References: Message-ID: <35262480-DC9C-43C3-8B0D-A69BC2780532@live555.com> Please read my response to your previous question. It tells you what you're doing wrong. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From barry.folse at gmail.com Fri Aug 22 06:34:44 2014 From: barry.folse at gmail.com (Barry Folse) Date: Fri, 22 Aug 2014 08:34:44 -0500 Subject: [Live-devel] Trouble with multiple streams Message-ID: The purpose of my app is to generate 2 video streams from video generated by the same app. The video is from 2 different eyepoints, using OpenGL to create the views. The views are generated and updated concurrently. The problem is that only one stream is valid when I use VLC to view them. In the app, I have set up a Stream class which encapsulates an input view, a thread, an RTSP server, an environment, and a task scheduler / event loop. When a Stream object is created/configured, it starts the thread, which creates the environment / task scheduler / event loop. When the Stream's input view is finished rendering, it sends a triggerEvent() to the Stream's task scheduler. Each Stream has a different destination address, RTP/RTCP port, and RTSP port/URL. The second Stream's event handler never gets called by its event loop. Is it possible that one event loop is handling the triggerEvents from both Streams? If anyone has successfully created an app like this, I would really appreciate any assistance you can give. thanks, Barry -------------- next part -------------- An HTML attachment was scrubbed... URL: From seamxr at gmail.com Fri Aug 22 18:10:55 2014 From: seamxr at gmail.com (James Huang) Date: Fri, 22 Aug 2014 18:10:55 -0700 Subject: [Live-devel] How to wrap a AAC-HBR RTP source into Transport stream correctly Message-ID: Hi, I'm working on wrapping two RTP stream sources (H264 and AAC-HBR) into a MPEG Transport Stream and then forwarding it to another 3rd-party server. The source and the target are both 3rd party, so I don't have control over their expected input and output formats. The video is working after studying the mailing list.
But I'm not able to find the correct way to convert and feed a source into the method MPEG2TransportStreamFromESSource::addNewAudioSource(). My code looks like:

MPEG2TransportStreamFromESSource *pTS = MPEG2TransportStreamFromESSource::createNew(env);
InsertH264StartCodeFilter *pVdoFilter = InsertH264StartCodeFilter::createNew(env, H264VideoRTPSource);
pTS->addNewVideoSource(pVdoFilter, 5);
pTS->addNewAudioSource(MPEG4GenericRTPSource, 4);

MPEG4GenericRTPSource does not seem to be the proper source for addNewAudioSource() here, but I couldn't figure out how to convert it correctly. The SDP of the audio part looks like:

m=audio 0 RTP/AVP 97
b=AS:1000
a=rtpmap:97 MPEG4-GENERIC/48000/2
a=fmtp:97 streamtype=5;profile-level-id=1;mode=AAC-hbr;sizelength=13;indexlength=3;indexdeltalength=3;config=1190
a=control:track2

Could you provide some hint on how I can convert the MPEG4GenericRTPSource output into data that is suitable to feed into MPEG2TransportStreamFromESSource::addNewAudioSource()? Thank you, seamxr. -------------- next part -------------- An HTML attachment was scrubbed... URL: From sungjoo.byun at gmail.com Sat Aug 23 03:54:57 2014 From: sungjoo.byun at gmail.com (SungJoo Byun) Date: Sat, 23 Aug 2014 19:54:57 +0900 Subject: [Live-devel] patch for qmake build Message-ID: Hi, I have added some files for a qmake build of live555. It is not perfect for all platforms, but it is helpful for win32 and Linux qmake users (especially win32 MSVC users). Regards, -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: live.add-qmake.patch Type: application/octet-stream Size: 15751 bytes Desc: not available URL: From finlayson at live555.com Sun Aug 24 13:47:56 2014 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 24 Aug 2014 13:47:56 -0700 Subject: [Live-devel] Trouble with multiple streams In-Reply-To: References: Message-ID: On Aug 22, 2014, at 6:34 AM, Barry Folse wrote: > The purpose of my app is to generate 2 video streams from video generated by the same app. The video is from 2 different eyepoints, using OpenGL to create the views. The views are generated and updated concurrently. > > The problem is that only one stream is valid when I use VLC to view them. Before using VLC to test your server(s), you should use "testRTSPClient" or "openRTSP". (VLC is not our software.) > In the app, I have set up a Stream class which encapsulates an input view, a thread, an RTSP server, an environment, and a task scheduler / event loop. > > When a Stream object is created/configured, it starts the thread, which creates the environment / task scheduler / event loop. When the Stream's input view is finished rendering, it sends a triggerEvent() to the Stream's task scheduler. > > Each Stream has a different destination address, RTP/RTCP port, and RTSP port/URL. > > The second Stream's event handler never gets called by its event loop. > > Is it possible that one event loop is handling the triggerEvents from both Streams? No, not if each thread has its own "UsageEnvironment" and "TaskScheduler" objects (and thus also its own event loop). I assume you've read http://www.live555.com/liveMedia/faq.html#threads In any case, there's probably no reason for you to use separate threads for each of your servers. You can run multiple servers (with different ports, of course) within the same (single-threaded) event loop. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL:
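To make the 'multiple servers, one event loop' suggestion above concrete, here is a minimal, hedged sketch of a single-threaded program that creates two RTSP servers on different ports sharing one TaskScheduler and UsageEnvironment. The port numbers are arbitrary, and adding the per-eyepoint ServerMediaSession/subsession objects is deliberately left as a comment, since that part is application-specific:

    #include "liveMedia.hh"
    #include "BasicUsageEnvironment.hh"

    int main() {
      // One scheduler + environment + event loop, shared by both servers:
      TaskScheduler* scheduler = BasicTaskScheduler::createNew();
      UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

      RTSPServer* serverA = RTSPServer::createNew(*env, 8554);
      RTSPServer* serverB = RTSPServer::createNew(*env, 8555);
      if (serverA == NULL || serverB == NULL) {
        *env << "Failed to create RTSP server: " << env->getResultMsg() << "\n";
        return 1;
      }

      // ... create one ServerMediaSession per eyepoint (each wrapping its own
      //     OnDemandServerMediaSubsession) and add it to the matching server ...

      env->taskScheduler().doEventLoop(); // does not return
      return 0; // never reached
    }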
From finlayson at live555.com Sun Aug 24 13:59:15 2014 From: finlayson at live555.com (Ross Finlayson) Date: Sun, 24 Aug 2014 13:59:15 -0700 Subject: [Live-devel] How to wrap a AAC-HBR RTP source into Transport stream correctly In-Reply-To: References: Message-ID: > pTS->addNewAudioSource(MPEG4GenericRTPSource, 4); Yes, that's what you should be doing. Unfortunately I don't know why it's not working for you... Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From heuze.seb at gmail.com Mon Aug 25 02:15:36 2014 From: heuze.seb at gmail.com (=?UTF-8?Q?S=C3=A9bastien_HEUZE?=) Date: Mon, 25 Aug 2014 11:15:36 +0200 Subject: [Live-devel] AAC RTP problem (crackling sound) Message-ID: Hello, I'm using live555 in a Swift (iOS) application, and I would like to stream AAC from this phone using RTP. I have done a lot of tests using MP3 and it works fine. (I'm using VLC and/or Wireshark to capture the stream.) But I can't get AAC to work. I tried to use ADTSAudioFileSource and MPEG4GenericRTPSink, but I get a crackling sound (noise) instead when I try to capture it. sessionStateAAC.source = ADTSAudioFileSource::createNew(*env, inputFileNameAAC); sessionStateAAC.sink = MPEG4GenericRTPSink::createNew(*env, sessionStateAAC.rtpGroupsock, 96, 44100, "audio", "AAC-hbr", sessionStateAAC.source->configStr(), 2); How can I get rid of this problem? Am I doing it wrong? Thanks; we have been trying to get it to work for 3 (working) days now. -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Aug 26 01:28:30 2014 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 26 Aug 2014 01:28:30 -0700 Subject: [Live-devel] AAC RTP problem (crackling sound) In-Reply-To: References: Message-ID: <89AB7D4C-AAE3-4238-983E-3B86A90FBBFE@live555.com> > But I can't get AAC to work. > > I tried to use ADTSAudioFileSource and MPEG4GenericRTPSink, but I get a crackling sound (noise) instead when I try to capture it. > > sessionStateAAC.source = ADTSAudioFileSource::createNew(*env, inputFileNameAAC); > Is "inputFileNameAAC" really an ADTS-format AAC file? To test this, copy it to a computer that's running the "LIVE555 Media Server", and rename it to have a ".aac" filename extension (e.g., "test.aac"). Then, try streaming it from the "LIVE555 Media Server". If this doesn't work, then your file is not an ADTS-format AAC file (and that's your problem). Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From chris at gotowti.com Tue Aug 26 09:55:20 2014 From: chris at gotowti.com (Chris Richardson (WTI)) Date: Tue, 26 Aug 2014 09:55:20 -0700 Subject: [Live-devel] High CPU usage streaming TCP and UDP with OnDemandServerMediaSubsession Message-ID: <009301cfc14e$85ade650$9109b2f0$@com> Hi Ross, I have recently updated the LIVE555 libraries I am using, ultimately to version 2014.08.23, though I originally updated to 2014.07.25 and the same problem happens. The problem is that the library is using a large amount of CPU when streaming both UDP and TCP from the same OnDemandServerMediaSubsession. I can reproduce this problem both on my embedded system (with my own setup code) and on an Ubuntu Linux machine running an almost un-modified version of testOnDemandRTSPServer (the only change is setting reuseFirstSource to 'True').
To test this yourself:
1. Modify testOnDemandRTSPServer by setting reuseFirstSource to True.
2. Start an instance of 'top' to watch the CPU usage.
3. Start one copy of openRTSP, streaming using TCP via -t. CPU usage should be very low. For me it is < 1%.
4. Start another copy of openRTSP, streaming using UDP. CPU usage should now be very high. For me it is > 90%.
For my testing I was reading pre-recorded H.264 data via 'h264ESVideoTest'. I think the core problem is that RTPInterface (via StreamState->fRTCPInstance->fRTCPInterface) is not properly handling receiving RTCP over both the TCP socket and the UDP socket at the same time:
1. Some RTCP data comes in over the UDP socket.
2. The code eventually gets into RTPInterface::handleRead, which tries to read the data from the TCP socket, since fNextTCPReadStreamSocketNum is valid. This read fails with EAGAIN almost all the time, and no data is processed.
3. On the next iteration through the event loop, the UDP socket is still readable, since no data was read from it, and once again the code gets into RTPInterface::handleRead, which tries to read the data from the TCP socket.
This results in the UDP RTCP socket being kept in a continuously readable state, which then causes the event loop to spin. I dug through the changes for the previous months and discovered that a change to RTPInterface::handleRead in the 2014.03.25 version causes this problem to occur. The attached patch undoes the change and fixes this problem. Applying this patch might cause other problems though, since I think you removed it for the following reason given in the change log for 2014.03.25: "- Fixed an issue in the "RTPInterface" code that could cause "SetSpecificRRHandler()" to not work properly when RTP/RTCP is being carried over TCP." Thanks for your time, and please let me know if you would like me to test anything else or send additional data. Thanks, Chris Richardson WTI -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: RTPInterface.patch Type: application/octet-stream Size: 361 bytes Desc: not available URL: From Kenneth.Forsythe at activu.com Tue Aug 26 13:33:35 2014 From: Kenneth.Forsythe at activu.com (Kenneth Forsythe) Date: Tue, 26 Aug 2014 16:33:35 -0400 Subject: [Live-devel] DelayQueue infinite loop Message-ID: Hi, DelayQueue::synchronize appears to be stuck in that while loop. It appears curEntry->fDeltaTimeRemaining is always (0,0) and therefore timeSinceLastSync is always higher. This only happens when I am hosting the libraries within a COM DLL. What is recommended? Can I just detect this scenario and break the loop? This is in DelayQueue.cpp, at DelayQueue::synchronize; the loop starts at line 214:

DelayQueueEntry* curEntry = head();
while (timeSinceLastSync >= curEntry->fDeltaTimeRemaining) {
  timeSinceLastSync -= curEntry->fDeltaTimeRemaining;
  curEntry->fDeltaTimeRemaining = DELAY_ZERO;
  curEntry = curEntry->fNext;
}

Thanks, From finlayson at live555.com Tue Aug 26 14:22:31 2014 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 26 Aug 2014 14:22:31 -0700 Subject: [Live-devel] DelayQueue infinite loop In-Reply-To: References: Message-ID: <855CEA60-2C66-4272-9628-722C8D8FFFAE@live555.com> > DelayQueue::synchronize appears to be stuck in that while loop. It appears curEntry->fDeltaTimeRemaining is always (0,0) No, that shouldn't happen. See below. > This is in DelayQueue.cpp at DelayQueue::synchronize > Loop starts at line 214.
> > > DelayQueueEntry* curEntry = head(); > > while (timeSinceLastSync >= curEntry->fDeltaTimeRemaining) { > > timeSinceLastSync -= curEntry->fDeltaTimeRemaining; > > curEntry->fDeltaTimeRemaining = DELAY_ZERO; > > curEntry = curEntry->fNext; > > } Note that "curEntry" is set to the next entry in the queue, before continuing the loop. Also, each delay queue is set up so that the last entry in the queue has a delay ("fDeltaTimeRemaining") of 'eternity' (in reality, 68 years). So the condition "timeSinceLastSync >= curEntry->fDeltaTimeRemaining" is never true for the last entry in the queue, and so the "while()" loop will always terminate. So, something else must be wrong in your system. (I hope you don't have more than one thread trying to use the same UsageEnvironment & TaskScheduler?) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Tue Aug 26 14:37:39 2014 From: finlayson at live555.com (Ross Finlayson) Date: Tue, 26 Aug 2014 14:37:39 -0700 Subject: [Live-devel] High CPU usage streaming TCP and UDP with OnDemandServerMediaSubsession In-Reply-To: <009301cfc14e$85ade650$9109b2f0$@com> References: <009301cfc14e$85ade650$9109b2f0$@com> Message-ID: <137AD59F-E927-49E6-9FD2-AFE940E42D29@live555.com> Chris, Thanks for the note. Yes, your analysis of the problem was correct. I've just installed a new version - 2014.08.26 - of the "LIVE555 Streaming Media" code that should fix this. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From chris at gotowti.com Tue Aug 26 17:00:03 2014 From: chris at gotowti.com (Chris Richardson (WTI)) Date: Tue, 26 Aug 2014 17:00:03 -0700 Subject: [Live-devel] High CPU usage streaming TCP and UDP with OnDemandServerMediaSubsession In-Reply-To: <137AD59F-E927-49E6-9FD2-AFE940E42D29@live555.com> References: <009301cfc14e$85ade650$9109b2f0$@com> <137AD59F-E927-49E6-9FD2-AFE940E42D29@live555.com> Message-ID: <00c801cfc189$dafc1e70$90f45b50$@com> Hi Ross, > Thanks for the note. Yes, your analysis of the problem was correct. I've just installed a new version - 2014.08.26 - of the "LIVE555 Streaming Media" code that should fix this. Thanks for fixing this so quickly. I have tested the new version and everything is working now. Thanks, Chris Richardson WTI From Kenneth.Forsythe at activu.com Thu Aug 28 05:01:03 2014 From: Kenneth.Forsythe at activu.com (Kenneth Forsythe) Date: Thu, 28 Aug 2014 08:01:03 -0400 Subject: [Live-devel] DelayQueue infinite loop In-Reply-To: <855CEA60-2C66-4272-9628-722C8D8FFFAE@live555.com> References: <855CEA60-2C66-4272-9628-722C8D8FFFAE@live555.com> Message-ID: Hi Ross, No sir, not using any threading yet. Because it is stuck in the while loop, the call BasicTaskScheduler::createNew() is not even returning. It works in the testRTSPClient program and in my own test program, but not when in this COM library that is being hosted in another program. I'll have to dig deeper to see if I can get more info. Thanks, From: live-devel [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson Sent: Tuesday, August 26, 2014 5:23 PM To: LIVE555 Streaming Media - development & use Subject: Re: [Live-devel] DelayQueue infinite loop DelayQueue::synchronize appears to be stuck in that while loop. It appears curEntry->fDeltaTimeRemaining is always (0,0) No, that shouldn't happen. See below.
This is in DelayQueue.cpp at DelayQueue::synchronize; the loop starts at line 214. DelayQueueEntry* curEntry = head(); while (timeSinceLastSync >= curEntry->fDeltaTimeRemaining) { timeSinceLastSync -= curEntry->fDeltaTimeRemaining; curEntry->fDeltaTimeRemaining = DELAY_ZERO; curEntry = curEntry->fNext; } Note that "curEntry" is set to the next entry in the queue, before continuing the loop. Also, each delay queue is set up so that the last entry in the queue has a delay ("fDeltaTimeRemaining") of 'eternity' (in reality, 68 years). So the condition "timeSinceLastSync >= curEntry->fDeltaTimeRemaining" is never true for the last entry in the queue, and so the "while()" loop will always terminate. So, something else must be wrong in your system. (I hope you don't have more than one thread trying to use the same UsageEnvironment & TaskScheduler?) Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From finlayson at live555.com Thu Aug 28 06:54:36 2014 From: finlayson at live555.com (Ross Finlayson) Date: Thu, 28 Aug 2014 06:54:36 -0700 Subject: [Live-devel] DelayQueue infinite loop In-Reply-To: References: <855CEA60-2C66-4272-9628-722C8D8FFFAE@live555.com> Message-ID: <5C794145-EBB1-4451-AE1A-264EF29C8DCE@live555.com> > No sir, not using any threading yet. Because it is stuck in the while loop, the call BasicTaskScheduler::createNew() is not even returning. > It works in the testRTSPClient program and in my own test program, but not when in this COM library that is being hosted in another program. I suspect that what's happening is that, when the code is used in your 'COM' library, various global variables are not being initialized properly - i.e., their class constructors are not being called. If that's the case, then it's definitely a bug in your system somewhere - perhaps in the way that you're compiling or linking the code for use in a 'COM' library. Ross Finlayson Live Networks, Inc. http://www.live555.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From rglobisch at csir.co.za Fri Aug 29 00:01:26 2014 From: rglobisch at csir.co.za (Ralf Globisch) Date: Fri, 29 Aug 2014 09:01:26 +0200 Subject: [Live-devel] Live555 liveMedia wrapped as into a DirectShow filter In-Reply-To: <4A5ED6E2-DF62-4B49-8ECB-7FE95BA9253A@live555.com> References: <4A5ED6E2-DF62-4B49-8ECB-7FE95BA9253A@live555.com> Message-ID: <540041660200004D000AEDBB@pta-emo.csir.co.za> We have released a DirectShow RTSP source filter based on live555 at http://sourceforge.net/projects/videoprocessing/. It has support for limited media types (H.264, AMR, AAC, LATM, MP3, PCM 8/16-bit), and you're welcome to make contributions to support other ones. >>> Ross Finlayson 08/21/14 5:25 PM >>> > We are currently developing a streaming player application that utilizes DirectShow filters. We were interested in wrapping liveMedia into a filter to be able to plug it in as a possible RTSP client source filter. Have you done this, or could you help us with that? We don't provide any explicit support for DirectShow filters, because they are specific to just one OS platform (and a waning one at that). However, several people on this mailing list have developed their own DirectShow filters that use our RTSP client code. Perhaps some of them would like to contribute tips on how to do this? Ross Finlayson Live Networks, Inc.
http://www.live555.com/ From seamxr at gmail.com Fri Aug 29 15:11:12 2014 From: seamxr at gmail.com (James Huang) Date: Fri, 29 Aug 2014 15:11:12 -0700 Subject: [Live-devel] The way to muxing H264 and AAC RTP into a MPEG Transport stream Message-ID: Hi, After weeks of work, my module can finally mux H264 and AAC RTP streams into a MPEG-TS. Here is the "big picture", written down to get someone new to this up to speed quickly.
Objective: Create a relay program which acts as an RTSP client, consumes the video and audio RTP sessions, muxes them into a MPEG-TS, and then pushes it to wherever on the network you need.
To-do:
1. The implementation can start from testRTSPClient.
2. For the incoming H264 RTP session, you will first need to implement a filter class that inserts the H264 start code at the beginning of each frame. The work is simply to insert 4 bytes (0x00, 0x00, 0x00, 0x01) at the beginning of each frame (a sketch of such a filter appears below).
3. For the incoming AAC RTP session, you will first need to implement a filter class that inserts an ADTS header in front of each AAC packet. The ADTS header is just several bytes, and its format can be found on the internet easily; it is mostly bit-wise operations according to your configuration.
4. In continueAfterSETUP(), feed the video session's FramedSource (this will be an H264VideoRTPSource) into your H264 start-code filter, and then feed the H264 start-code filter into a MPEG2TransportStreamFromESSource via its addNewVideoSource(), using 5 as the mpegVersion.
5. In continueAfterSETUP(), feed the audio session's FramedSource (this will be a MPEG4GenericRTPSource) into your ADTS header filter, and then feed the ADTS header filter into the same MPEG2TransportStreamFromESSource via its addNewAudioSource(), using 4 as the mpegVersion.
6. After both sessions are set up, call startPlaying() on a SimpleRTPSink with the MPEG2TransportStreamFromESSource as its source.
Hope this helps. James. -------------- next part -------------- An HTML attachment was scrubbed... URL:
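Following on from step 2 above, here is a hedged sketch of what such a start-code-inserting filter could look like as a "FramedFilter" subclass. The class name is made up, the LIVE555 base-class members (fInputSource, fTo, fMaxSize, fFrameSize, etc.) are real, and error handling is kept minimal (it assumes fMaxSize is comfortably larger than 4); treat it as a starting point rather than tested code:

    #include "FramedFilter.hh"
    #include <string.h>

    // Hypothetical filter that prepends the 4-byte Annex-B start code
    // (0x00 0x00 0x00 0x01) to each NAL unit received from an upstream source
    // (e.g. an H264VideoRTPSource), so the output can be fed to
    // MPEG2TransportStreamFromESSource::addNewVideoSource().
    class H264StartCodeInserter: public FramedFilter {
    public:
      static H264StartCodeInserter* createNew(UsageEnvironment& env, FramedSource* inputSource) {
        return new H264StartCodeInserter(env, inputSource);
      }

    protected:
      H264StartCodeInserter(UsageEnvironment& env, FramedSource* inputSource)
        : FramedFilter(env, inputSource) {}

    private:
      virtual void doGetNextFrame() {
        // Ask upstream for the next NAL unit, leaving 4 bytes of room for the start code:
        fInputSource->getNextFrame(fTo + 4, fMaxSize - 4,
                                   afterGettingFrame, this,
                                   FramedSource::handleClosure, this);
      }

      static void afterGettingFrame(void* clientData, unsigned frameSize,
                                    unsigned numTruncatedBytes,
                                    struct timeval presentationTime,
                                    unsigned durationInMicroseconds) {
        H264StartCodeInserter* f = (H264StartCodeInserter*)clientData;
        static unsigned char const startCode[4] = { 0x00, 0x00, 0x00, 0x01 };
        memcpy(f->fTo, startCode, 4);          // prepend the start code
        f->fFrameSize = frameSize + 4;
        f->fNumTruncatedBytes = numTruncatedBytes;
        f->fPresentationTime = presentationTime;
        f->fDurationInMicroseconds = durationInMicroseconds;
        FramedSource::afterGetting(f);         // deliver the NAL unit downstream
      }
    };

The ADTS filter for step 3 follows the same pattern, except that instead of a fixed 4-byte start code it prepends a 7-byte ADTS header whose fields are derived from the stream's sampling frequency, channel configuration, and the frame length (AAC frame size + 7).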