[Live-devel] Need help: RTSP Stream -> video render

Barry Stump bstump at codemass.com
Mon Feb 27 14:41:11 PST 2012


I am working on an iOS project similar to what you describe: H.264 video +
AAC audio, with Live555 handling RTSP and FFmpeg handling video decoding.
I recommend basing your work on testRTSPClient.cpp, with an Objective-C++
wrapper class as the interface between the rest of your Objective-C code
and the C++ Live555 library.
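
As a rough sketch of that wrapper idea (the class name, method names, and
threading note below are mine, not from any existing project): the header
exposes only plain Objective-C, so ordinary .m files can import it, while
the .mm file is free to use C++:

    // RTSPClientWrapper.h -- no C++ types here.
    #import <Foundation/Foundation.h>

    @interface RTSPClientWrapper : NSObject
    - (void)connectToURL:(NSString *)rtspURL;
    - (void)disconnect;
    @end

    // RTSPClientWrapper.mm -- compiled as Objective-C++, so it can
    // include the Live555 headers and drive the event loop.
    #import "RTSPClientWrapper.h"
    #include "liveMedia.hh"
    #include "BasicUsageEnvironment.hh"

    @implementation RTSPClientWrapper {
        TaskScheduler* _scheduler;
        UsageEnvironment* _env;
        char _eventLoopWatchVariable;
    }

    - (void)connectToURL:(NSString *)rtspURL {
        _scheduler = BasicTaskScheduler::createNew();
        _env = BasicUsageEnvironment::createNew(*_scheduler);
        _eventLoopWatchVariable = 0;
        // Kick off the asynchronous RTSP setup the same way
        // testRTSPClient.cpp does in its openURL() function, then run
        // the event loop -- on a background thread in a real app,
        // because doEventLoop() blocks:
        //   openURL(*_env, "app", [rtspURL UTF8String]);
        //   _env->taskScheduler().doEventLoop(&_eventLoopWatchVariable);
    }

    - (void)disconnect {
        _eventLoopWatchVariable = 1;  // makes doEventLoop() return
    }
    @end
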
You will need to create one or more custom subclasses of MediaSink to
handle the media streams, with the data payload delivered to you in the
afterGettingFrame() callback.
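
Such a sink follows the shape of the DummySink class in
testRTSPClient.cpp; roughly like this (the class name and buffer size are
arbitrary, and the actual decode call is left as a comment):

    #include "liveMedia.hh"

    #define RECEIVE_BUFFER_SIZE 100000

    class DecoderSink: public MediaSink {
    public:
        static DecoderSink* createNew(UsageEnvironment& env,
                                      MediaSubsession& subsession) {
            return new DecoderSink(env, subsession);
        }

    private:
        DecoderSink(UsageEnvironment& env, MediaSubsession& subsession)
            : MediaSink(env), fSubsession(subsession) {
            fReceiveBuffer = new u_int8_t[RECEIVE_BUFFER_SIZE];
        }
        virtual ~DecoderSink() { delete[] fReceiveBuffer; }

        // Static trampoline that Live555 invokes once a complete frame
        // has been copied into fReceiveBuffer.
        static void afterGettingFrame(void* clientData, unsigned frameSize,
                                      unsigned numTruncatedBytes,
                                      struct timeval presentationTime,
                                      unsigned durationInMicroseconds) {
            DecoderSink* sink = (DecoderSink*)clientData;
            // sink->fReceiveBuffer now holds one H.264 NAL unit or one
            // AAC packet of frameSize bytes; hand it to the decoder
            // here, then ask for the next frame.
            sink->continuePlaying();
        }

        virtual Boolean continuePlaying() {
            if (fSource == NULL) return False;
            fSource->getNextFrame(fReceiveBuffer, RECEIVE_BUFFER_SIZE,
                                  afterGettingFrame, this,
                                  onSourceClosure, this);
            return True;
        }

        MediaSubsession& fSubsession;
        u_int8_t* fReceiveBuffer;
    };

You then attach one such sink per subsession after the SETUP step and call
startPlaying() on it, exactly as testRTSPClient.cpp does with DummySink.
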
For AAC, one Live555 frame of data corresponds to Core Audio's concept of
an audio "packet", not an audio "frame".
On the video side, at a minimum, you will need to prepend the start code
0x00000001 to each frame (NAL unit) before you hand it off to FFmpeg to
decode. See the H264VideoFileSink.cpp file for an example of this.
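
In code the idea is simply the following (a sketch, not the actual
H264VideoFileSink code; it assumes 'codecCtx' was opened earlier with
avcodec_open2() for the H.264 decoder and 'picture' came from
avcodec_alloc_frame()):

    #include <stdlib.h>
    #include <string.h>
    extern "C" {
    #include "libavcodec/avcodec.h"
    }

    void decodeNALUnit(AVCodecContext* codecCtx, AVFrame* picture,
                       const uint8_t* nal, unsigned nalSize) {
        static const uint8_t startCode[4] = { 0x00, 0x00, 0x00, 0x01 };

        // FFmpeg's H.264 decoder expects Annex B byte-stream format:
        // a start code in front of every NAL unit.
        uint8_t* annexB = (uint8_t*)malloc(4 + nalSize);
        memcpy(annexB, startCode, 4);
        memcpy(annexB + 4, nal, nalSize);

        AVPacket pkt;
        av_init_packet(&pkt);
        pkt.data = annexB;
        pkt.size = 4 + nalSize;

        int gotPicture = 0;
        avcodec_decode_video2(codecCtx, picture, &gotPicture, &pkt);
        if (gotPicture) {
            // picture->data[0..2] hold the decoded YUV planes, ready
            // to be color-converted and rendered.
        }
        free(annexB);
    }

In practice you can avoid the per-frame copy by keeping the first four
bytes of your receive buffer permanently set to the start code and
pointing getNextFrame() at buffer + 4.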

The details of using FFmpeg inside iOS are beyond the scope of this list,
but the Dropcam source code may be of help to you. Note that it also uses
Live555 for RTSP handling, but through the deprecated synchronous
interface; you are better off following the asynchronous testRTSPClient
example.

https://github.com/dropcam/dropcam_for_iphone

-Barry

On Fri, Feb 24, 2012 at 4:38 PM, Brad O'Hearne <brado at bighillsoftware.com> wrote:

> Hello,
>
> I am reaching out to anyone out there who would be willing to give me
> some guidance with some issues I'm running into getting the Live555
> library integrated into an app I'm writing. My use case is pretty
> simple to understand:
>
> I am writing a mobile app (on both iOS and Android, but for the
> purposes of discussion here I am addressing iOS/Objective-C first), and
> I need to consume an RTSP stream over the network and render the stream
> to the device's screen. The stream is H.264. I am developing on and
> targeting only iOS 5. I cannot use the built-in video-playback
> capabilities in iOS because they support HTTP Live Streaming but not
> RTSP; in addition, it would seem that the iOS API controls the source
> endpoints in AVFoundation, so injecting there is not an option. In this
> use case I need real-time video: minimizing latency is paramount (i.e.
> I would rather drop frames than buffer).
>
> I seem to have gotten Live555 compiled properly on iOS 5 (to the best
> of my knowledge) and linked into an iOS 5 app. I can even run one of
> the sample clients and pull this RTSP stream, which outputs info to the
> console using a DummySink. I have also compiled FFmpeg on iOS 5 and
> have it linked into my project in Xcode, so I now have both the Live555
> and FFmpeg libraries in my Xcode project.
>
> What I need to do is take the data received over RTSP, decode the H.264
> video, and then output it to the screen. It would seem that this
> wouldn't be too utterly terrible. However, referencing some of these
> libraries and headers inside Xcode, and trying to move some of this
> code around into a more Objective-C-friendly fashion, is giving me
> fits.
>
> If there is anyone out there familiar with using Live555 on iOS, or
> anyone who can give guidance here, I would very much appreciate it.
> Please feel free to reply here, or contact me offline at
> brado at bighillsoftware.com.
>
> Regards,
>
> Brad
>
> Brad O'Hearne
> Founder / Lead Developer
> Big Hill Software LLC
> http://www.bighillsoftware.com