[Live-devel] displaying RTSP stream directly

Diez B. Roggisch deets at web.de
Fri Aug 20 12:08:57 PDT 2010


Hi all,

this is probably a stupid question, but I've been banging my head
against it for several days now, to no avail.

My general scenario is this:

  - grab an MP4V-ES/90000 stream from a FLIR infrared camera (the
camera works; grabbing the RTSP stream with VLC poses no problems)
  - decode the data
  - display it together with additional information

All this takes place under OS X.

Now what I have so far is this:

  - A Python/Twisted-based RTSP client that receives RTP frames from
the camera. Concatenating their payloads yields a file that the
"file" command describes as

  /tmp/camera.output: MPEG sequence, v4, video, simple @ L3
  - Essentially the same program, modeled after openRTSP, written in
Objective-C/Cocoa/C++ using live555 (hence my asking here), which
produces the same data stream; a stripped-down sketch of its sink
follows below.
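The sink follows the usual openRTSP/FileSink pattern; simplified, it
looks roughly like this (the class name "FrameSink" is just my
placeholder):

  #include "liveMedia.hh"

  // Simplified sketch of an openRTSP-style sink. Each complete
  // MP4V-ES frame is delivered to afterGettingFrame().
  class FrameSink: public MediaSink {
  public:
    static FrameSink* createNew(UsageEnvironment& env) {
      return new FrameSink(env);
    }

  private:
    FrameSink(UsageEnvironment& env)
      : MediaSink(env), fBufferSize(100000) {
      fBuffer = new unsigned char[fBufferSize];
    }
    virtual ~FrameSink() { delete[] fBuffer; }

    static void afterGettingFrame(void* clientData, unsigned frameSize,
                                  unsigned numTruncatedBytes,
                                  struct timeval presentationTime,
                                  unsigned durationInMicroseconds) {
      FrameSink* sink = (FrameSink*)clientData;
      // Currently I append fBuffer[0..frameSize) to the output file;
      // eventually this should go straight to a decoder instead.
      sink->continuePlaying();
    }

    virtual Boolean continuePlaying() {
      if (fSource == NULL) return False;
      fSource->getNextFrame(fBuffer, fBufferSize,
                            afterGettingFrame, this,
                            onSourceClosure, this);
      return True;
    }

    unsigned char* fBuffer;
    unsigned fBufferSize;
  };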

This data is unfortunately *not* decodable by VLC (which I want to
use for testing purposes only anyway). But what I'm really after is
decoding the data to raw pixels and then showing those inside an
NSView.
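The only lead I have so far (untested): perhaps the raw MP4V-ES
elementary stream needs the decoder-config bytes that the camera
advertises in the SDP ("config=" in the a=fmtp line) prepended before
the first frame. live555 seems to expose these; something along these
lines, where "subsession" is the MediaSubsession I'm receiving on:

  // Untested: decode the SDP's "config=" hex string and feed the
  // resulting bytes to the decoder before the first video frame.
  char const* configHex = subsession->fmtp_config();
  unsigned configSize = 0;
  unsigned char* config = parseGeneralConfigStr(configHex, configSize);
  // ... hand (config, configSize) to the decoder first ...
  delete[] config;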

Beyond that, I'm totally lost. In my dream world, I'd feed the
received frames to an MPEG-4 decoder, which in turn would yield a raw
image every now and then. Apparently, things aren't that easy.
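Concretely, I picture something like the following, using ffmpeg's
libavcodec (an untested sketch; I haven't verified that this builds
on OS X):

  extern "C" {
  #include <libavcodec/avcodec.h>
  }

  static AVCodecContext* ctx;
  static AVFrame* picture;

  void decoderInit() {
    avcodec_register_all();
    AVCodec* codec = avcodec_find_decoder(CODEC_ID_MPEG4);
    ctx = avcodec_alloc_context();
    avcodec_open(ctx, codec);
    picture = avcodec_alloc_frame();
  }

  // Called from the sink with each complete frame:
  void decodeFrame(unsigned char* data, unsigned size) {
    AVPacket pkt;
    av_init_packet(&pkt);
    pkt.data = data;
    pkt.size = (int)size;
    int gotPicture = 0;
    avcodec_decode_video2(ctx, picture, &gotPicture, &pkt);
    if (gotPicture) {
      // picture->data[0..2] hold the decoded YUV planes; convert to
      // RGB here and hand the pixels to the NSView.
    }
  }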

I've scratched my head over QuickTimeFileSink, but I guess much of
its >2K lines of complexity comes from producing a QuickTime .mov
container file or some such. If I absolutely *have* to use it, I'll
find a way. But of course I want the data piped/filtered in memory,
*not* dumped to disk and then read back again.

Any suggestions?

Thanks,

Diez

