[Live-devel] Need help: RTSP Stream -> video render
Brad O'Hearne
brado at bighillsoftware.com
Mon Feb 27 15:25:53 PST 2012
Barry,
Thank you very much for your reply. What you have described is exactly the model I have followed, and the crux of the issue I am trying to solve is exactly what you touched on. In short, what I have is:
H.264 video -> RTSP -> RTSPClient -> MediaSink subclass
exactly as you have described. My present challenge is properly grabbing the received data in the afterGettingFrame() function and feeding it to the ffmpeg avcodec. Specifically, I am trying to make sure that the data received from Live555 and handed to ffmpeg is of the proper format and quantity. The ffmpeg avcodec expects a packet (an "AVPacket", in their parlance). I know this isn't an ffmpeg discussion list, so my purpose isn't to discuss ffmpeg, but I do need to gather the received data and, if necessary, construct this packet from it.
My questions are:
1. Is such a packet the payload delivered to the afterGettingFrame() function, and if so, how do I grab it?
2. If the answer to 1 is no, then any pointers on how to construct it would be appreciated (a sketch of what I am attempting follows below).
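To make the question concrete, here is roughly the shape of what I am attempting in my MediaSink subclass. This is only a sketch, modeled on testRTSPClient's DummySink; fReceiveBuffer and continuePlaying() come from that example, while fCodecContext, fDecodedFrame, and the start-code handling are my own assumptions about what avcodec needs:

    extern "C" {
    #include <libavcodec/avcodec.h>
    }
    #include <cstring>

    // Called (via the static trampoline) each time Live555 delivers a frame.
    // fReceiveBuffer was handed to getNextFrame() at an offset of 4 bytes,
    // leaving room to prepend the 4-byte Annex B start code.
    void MySink::afterGettingFrame(unsigned frameSize, unsigned numTruncatedBytes,
                                   struct timeval presentationTime,
                                   unsigned durationInMicroseconds) {
      static uint8_t const startCode[4] = { 0x00, 0x00, 0x00, 0x01 };
      memcpy(fReceiveBuffer, startCode, 4);

      AVPacket packet;
      av_init_packet(&packet);
      packet.data = fReceiveBuffer;  // start code + NAL unit
      packet.size = frameSize + 4;

      int gotPicture = 0;
      avcodec_decode_video2(fCodecContext, fDecodedFrame, &gotPicture, &packet);
      if (gotPicture) {
        // fDecodedFrame now holds a decoded YUV picture, ready to render
      }

      continuePlaying();  // ask Live555 for the next frame
    }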
I do indeed currently have code adapted from the testRTSPClient example, and I can receive RTSP data successfully. However, what I'm seeing when afterGettingFrame() is called is a repeating cycle of data: two calls delivering 9 and 5 bytes respectively, followed by the receipt of roughly 23K of data (see the attached log). I suspect that these aren't all full packets.
Any insight you can give would be greatly appreciated, thx!
Brad
Brad O'Hearne
Founder / Lead Developer
Big Hill Software LLC
http://www.bighillsoftware.com
On Feb 27, 2012, at 3:41 PM, Barry Stump wrote:
> I am working on an iOS project similar to what you describe: H.264 video + AAC audio, with Live555 and FFmpeg handling RTSP and video decoding respectively. I recommend basing your work on testRTSPClient.cpp, with an Objective-C++ wrapper class as the interface between the rest of your code and the C++ Live555 library. You will need to create one or more custom subclasses of MediaSink to handle the media streams, with the data payload being delivered in the afterGettingFrame() functions. For AAC, an RTSP frame of data is equivalent to Core Audio's concept of an audio "packet", not an audio "frame". On the video side, at a minimum you will need to prepend the MPEG start code 0x00000001 to each frame before you hand it off to FFmpeg to decode; see H264VideoFileSink.cpp for an example of this.
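> For illustration, the per-frame request/callback cycle looks roughly like this (a sketch based on DummySink in testRTSPClient; MySink, RECEIVE_BUFFER_SIZE, and the 4-byte offset reserved for the start code are placeholders for your own subclass):
>
>     Boolean MySink::continuePlaying() {
>       if (fSource == NULL) return False;
>       // Request the next frame; the static afterGettingFrame() trampoline
>       // fires when it arrives. The offset leaves room for the start code.
>       fSource->getNextFrame(fReceiveBuffer + 4, RECEIVE_BUFFER_SIZE - 4,
>                             afterGettingFrame, this,
>                             onSourceClosure, this);
>       return True;
>     }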
>
> The details of using FFmpeg inside iOS are beyond the scope of this list, but the Dropcam source code may be of help to you. Note that it also uses Live555 for RTSP handling, but uses the deprecated synchronous methods. You are better off following the testRTSPClient example for Live555.
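> In the asynchronous model everything is driven from a single event loop, as in testRTSPClient (env here is the usual UsageEnvironment):
>
>     // All Live555 callbacks, including afterGettingFrame(), fire from here.
>     char eventLoopWatchVariable = 0;
>     env->taskScheduler().doEventLoop(&eventLoopWatchVariable);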
>
> https://github.com/dropcam/dropcam_for_iphone
>
> -Barry
>
>
> On Fri, Feb 24, 2012 at 4:38 PM, Brad O'Hearne <brado at bighillsoftware.com> wrote:
> Hello,
>
> I am reaching out to anyone out there who would be willing to give me some guidance with some issues I'm running into getting the Live555 library integrated into an app I'm writing. My use case is pretty simple to understand:
>
> I am writing a mobile app (on both iOS and Android, but for the purposes of discussion here I am addressing iOS/Objective-C first), and I need to consume an RTSP stream over the network and render it to the device's screen. The stream is H.264. I am developing on and targeting only iOS 5. I cannot use the built-in video-playback capabilities in iOS, because they support HTTP Live Streaming but not RTSP; in addition, it would seem that AVFoundation controls the source endpoints, so injecting there is not an option. In this use case I need real-time video: minimizing latency is paramount (i.e., I do not want latency due to buffering; I'd rather drop frames than buffer).
>
> I seem to have gotten Live555 compiled properly on iOS 5 (to the best of my knowledge) and linked into an iOS 5 app. I can even run one of the sample clients and pull this RTSP stream, which outputs info to the console using a DummySink. I have also compiled ffmpeg on iOS 5 and have it linked into my project in Xcode. So I now have both the Live555 and ffmpeg libraries in my Xcode project.
>
> What I need to do is take the data received over RTSP, decode the H.264 video, and then output it to the screen. It would seem that this wouldn't be too utterly terrible. However, referencing some of these libraries/headers inside Xcode, and trying to move some of this code around into a more Objective-C-friendly fashion, are giving me fits.
>
> If there is anyone out there familiar with using Live555 on iOS, or anyone who can give guidance here, I would very much appreciate it. Please feel free to reply here, or contact me offline as well at brado at bighillsoftware.com.
>
> Regards,
>
> Brad
>
> Brad O'Hearne
> Founder / Lead Developer
> Big Hill Software LLC
> http://www.bighillsoftware.com
>
>
> _______________________________________________
> live-devel mailing list
> live-devel at lists.live555.com
> http://lists.live555.com/mailman/listinfo/live-devel
-------------- next part --------------
Name: data.txt
URL: <http://lists.live555.com/pipermail/live-devel/attachments/20120227/edfa72e3/attachment-0001.txt>