[Live-devel] Audio+Video streams OpenRTSP to FFMPEG
Marcin
marcin at speed666.info
Mon Oct 13 10:04:03 PDT 2014
Hi,
You cannot do this using a pipe. You need to pass both streams to the
libavcodec libraries as separate, synchronized streams. You can then encode
them, mux them into an FLV container, and pass that to the RTMP server.
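(A common alternative, not using openRTSP at all, is to let FFmpeg pull the RTSP streams itself and handle the encode/FLV-mux/RTMP push in one step. A minimal sketch; the camera URL, stream key, and encoder settings are placeholders you would adapt:)

```shell
# Pull audio+video over RTSP (TCP interleaving avoids UDP packet loss),
# encode video to H.264 and audio to AAC, mux into FLV, push to RTMP.
ffmpeg -rtsp_transport tcp \
       -i "rtsp://camera.example/stream" \
       -c:v libx264 -preset veryfast -tune zerolatency \
       -c:a aac -ar 44100 \
       -f flv "rtmp://server.example/live/streamkey"
```

FFmpeg keeps the audio and video synchronized from their RTP timestamps, which is the part that a raw stdout pipe from openRTSP cannot carry.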
Marcin
W dniu 2014-10-13 18:29, Muhammad Ali pisze:
> My objective is to use OpenRTSP to receive audio + video stream from
> IP camera and pass it on to FFMPEG which can then stream it to an RTMP
> server.
>
> I've been using a pipe to send stdout to ffmpeg (pipe input source), but
> that was only a video stream (openRTSP -v flag). Now the requirement
> has come to stream both the audio and video streams. So of course I
> tried to replace -v with -4, and it failed, as they are two
> separate streams rather than a single elementary stream. Am I correct?
>
> So what would be the correct way to achieve my objective? I am a
> developer myself and not shy about writing code, but I prefer to use
> something that already exists (if it does).
>
> --
> Muhammad Ali
> And Or Logic
> www.andorlogic.com
>
>
> _______________________________________________
> live-devel mailing list
> live-devel at lists.live555.com
> http://lists.live555.com/mailman/listinfo/live-devel