[Live-devel] Once again, SDP support for Live555 & interaction with FFMpeg ...
Ross Finlayson
finlayson at live555.com
Wed Sep 2 19:07:51 PDT 2009
>We have a multicast stream, initiated from another process, which
>generates an .SDP file for the stream
>
>v=0
>o=MangoDSP 126 14843616424497153183 IN IP4 192.168.0.62
>s=Mango DSP Audio/Video
>m=video 7170 RTP/AVP 96
>a=rtpmap:96 H264/90000
>c=IN IP4 232.0.0.11
>a=fmtp:96 packetization-mode=1;profile-level-id=42001E;sprop-parameter-sets=Z0KAHkLaAtD0QA==,aEjgGody
>a=control:__StreamID=270385256
>
>We've searched through the Live555 archives and there is mention of
>passing this to MediaSession::createNew(), but we're sort of stumped
>on making this a reality.
First, create a "MediaSession" object, by calling
"MediaSession::createNew()", with the SDP description (a string) as a
parameter.
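For example (a minimal sketch, assuming that "env" is your
"UsageEnvironment" object, and that "sdpDescription" is a string
containing the contents of your .sdp file):

    MediaSession* session = MediaSession::createNew(env, sdpDescription);
    if (session == NULL) {
      // the SDP description could not be parsed
    }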
Then, go through each of this object's 'subsessions' (in this case,
there'll be just one, for "video"), and call
"MediaSubsession::initiate()" on it.
Then, you can create an appropriate 'sink' object (e.g., one that
encapsulates your decoder), and call "startPlaying()" on it,
passing the subsession's "readSource()" as a parameter.
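For example (again just a sketch; "YourDecoderSink" is a stand-in for
whatever "MediaSink" subclass wraps your decoder, and "afterPlaying" is
a completion callback - of type "void afterPlaying(void*)" - that you
define yourself):

    MediaSink* sink = YourDecoderSink::createNew(env, *subsession); // hypothetical sink class
    sink->startPlaying(*(subsession->readSource()), afterPlaying, NULL);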
See the "openRTSP" code (specifically, "testProgs/playCommon.cpp")
for an example of how this is done.
>Ultimately our plan is to pass the stream onto FFMpeg for
>decoding/storage. When we try to open this stream directly with
>FFMpeg it takes forever (and often never) for the program to lock
>onto the stream, so perhaps there is a problem with ffmpeg and this
>h264 stream. But, when we open a unicast stream with openRTSP and
>pipe the output to ffmpeg it works fine.
The problem here is probably that - when you tune into an ongoing
multicast stream - you're missing the special PPS and SPS NAL units
that normally appear at the start of the stream. (In contrast, when
you receive a unicast stream, you get the whole stream, starting from
the beginning, and so will get the special PPS and SPS NAL units.)
To overcome this, you will need to decode the PPS and SPS NAL unit
data from the SDP description, and insert these at the front of the
stream that you pass to your decoder. Specifically, you call
"MediaSubsession:: fmtp_spropparametersets()" on your 'subsession'
object, to get the appropriate configuration string, and then decode
this string by calling "parseSPropParameterSets()". (See
"liveMedia/include/H264VideoRTPSource.hh".)
--
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/