[Live-devel] Live stream from network camera to streaming server object
Vadim
vadimp at pochta.ru
Thu May 15 08:32:33 PDT 2008
Hi,
I want to develop a small application that performs a kind of "advanced relay".
I have a network camera with a hardware MPEG-4 ES encoder.
Running openRTSP against it works perfectly.
The final idea is to perform a kind of stream relay from the remote
network camera device to a local network.
After such "relaying", the stream will be accessible from the local
network through a multicast server object.
I use the openRTSP sample as the "input module" of the application, and
testMPEG4VideoStreamer as the "output module".
The final idea is to be able to connect to a multicast RTSP source
located on the local network by reusing the stream already received from
the remote camera.
<---------- WEB ----------><------------------- Local Network, Multicast ------------------->

CAMERA ------> RTSPClient ------?????------> RTSPServer (Multicast) ------> Local Network Client Player
                                                                    ------> Local Network Client Player
                                                                    ------> Local Network Client Player
Q1: Is this the right way to achieve this? I mean, am I using the right
liveMedia objects and flow?
After a deep scan of the mailing list, I've found a number of explanations
close to the testMPEG4VideoStreamer sample, where the actual stream comes
from some kind of FramedSource (device or file).
How can I get this object (or, more exactly, the MediaSource type -
*videoSource) in order to perform the following call, as described in the
testMPEG4VideoStreamer sample:

videoSink->startPlaying(*videoSource, afterPlaying, videoSink);

so that the streaming source (input) is added to the server object (output)?
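
To make the question concrete, here is a rough sketch of my "output module",
condensed from testMPEG4VideoStreamer (the stream name "cameraRelay", the port
numbers, and the startMulticastRelay() helper are placeholders of mine, not
anything from the library). The videoSource parameter is exactly the object I
don't know how to obtain:

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"
#include "GroupsockHelper.hh"

static void afterPlaying(void* /*clientData*/) {
  // the relay stopped, e.g. the camera connection was closed
}

// Output side: multicast MPEG-4 ES RTP sink, announced by an RTSPServer,
// fed from whatever FramedSource the RTSPClient side gives me.
RTPSink* startMulticastRelay(UsageEnvironment& env, FramedSource* videoSource) {
  struct in_addr destinationAddress;
  destinationAddress.s_addr = chooseRandomIPv4SSMAddress(env);
  const Port rtpPort(18888), rtcpPort(18889);
  const unsigned char ttl = 255;

  Groupsock* rtpGroupsock  = new Groupsock(env, destinationAddress, rtpPort, ttl);
  Groupsock* rtcpGroupsock = new Groupsock(env, destinationAddress, rtcpPort, ttl);
  rtpGroupsock->multicastSendOnly();  // SSM, as in testMPEG4VideoStreamer
  rtcpGroupsock->multicastSendOnly();

  RTPSink* videoSink = MPEG4ESVideoRTPSink::createNew(env, rtpGroupsock, 96);

  // RTCP instance, needed for the PassiveServerMediaSubsession below
  const unsigned estimatedSessionBandwidth = 500; // kbps
  unsigned char CNAME[101];
  gethostname((char*)CNAME, 100); CNAME[100] = '\0';
  RTCPInstance* rtcp = RTCPInstance::createNew(env, rtcpGroupsock,
      estimatedSessionBandwidth, CNAME, videoSink,
      NULL /*we're a server*/, True /*SSM*/);

  // RTSP server that local-network players connect to; it only announces
  // the multicast session that videoSink is transmitting.
  RTSPServer* rtspServer = RTSPServer::createNew(env, 8554);
  ServerMediaSession* sms = ServerMediaSession::createNew(env, "cameraRelay",
      "camera relay", "MPEG-4 ES relayed from a remote network camera", True /*SSM*/);
  sms->addSubsession(PassiveServerMediaSubsession::createNew(*videoSink, rtcp));
  rtspServer->addServerMediaSession(sms);

  // The call from my question: "videoSource" must be the FramedSource that
  // carries the camera's data on the RTSPClient side (see Q2 below).
  videoSink->startPlaying(*videoSource, afterPlaying, videoSink);
  return videoSink;
}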
Q2 (assuming Q1 is OK):
What is the right way to get the MediaSource/FramedSource object from an
existing RTSPClient connection to the remote network camera (the openRTSP
code flow)?
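
My own guess (please correct me if I'm wrong) is that, once openRTSP's
DESCRIBE/SETUP/PLAY sequence has initiated the video subsession, the received
data is exposed as a FramedSource via MediaSubsession::readSource(), roughly
like this (getCameraVideoSource() is just a placeholder name of mine):

#include "liveMedia.hh"
#include <string.h>

// After the RTSP handshake has been done as in openRTSP (playCommon.cpp),
// pull the FramedSource out of the MPEG-4 video subsession.
FramedSource* getCameraVideoSource(UsageEnvironment& env, MediaSession& session) {
  MediaSubsessionIterator iter(session);
  MediaSubsession* subsession;
  while ((subsession = iter.next()) != NULL) {
    if (strcmp(subsession->mediumName(), "video") == 0 &&
        strcmp(subsession->codecName(), "MP4V-ES") == 0) {
      // Is this the right object to pass to videoSink->startPlaying(),
      // or does it first need to be wrapped (e.g. in a
      // MPEG4VideoStreamDiscreteFramer) before MPEG4ESVideoRTPSink
      // will accept it?
      return subsession->readSource();
    }
  }
  env << "No MPEG-4 video subsession found in the camera's session\n";
  return NULL;
}

The result of getCameraVideoSource() would then be handed to
startMulticastRelay() from the sketch above.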
Best Regards,
Vadim Punski