[Live-devel] Add an audio sub-session makes the video stop
Fabrice Triboix
ftriboix at falcon-one.com
Mon Dec 12 13:49:16 PST 2011
Hello everyone,
I hope somebody can point me in the right direction for investigating this.
I am trying to modify an existing RTSP server based on live555. It
streams live video without problems, and I need to add a live audio
sub-stream to each video stream.
As a first step, I wanted to stream an MP3 file, so I created an audio
source class based on the "DeviceSource" template. Every time
doGetNextFrame() is called, I read 10000 bytes from the file, update
the class's data members accordingly, and then call the static
FramedSource::afterGetting() method.
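To make the description above concrete, here is a minimal sketch of what my doGetNextFrame() does, following the "DeviceSource" template; the member names fFile and kReadSize are just illustrative, not the actual names in my code:

```cpp
// Hypothetical sketch of the doGetNextFrame() described above.
// fFile (a FILE*) and kReadSize are assumed names for illustration.
void CMySource::doGetNextFrame() {
  unsigned const kReadSize = 10000; // bytes read per call, as described
  size_t n = fread(fTo, 1,
                   kReadSize > fMaxSize ? fMaxSize : kReadSize, fFile);
  if (n == 0) {
    handleClosure(this); // end of file: signal closure downstream
    return;
  }
  fFrameSize = (unsigned)n;
  gettimeofday(&fPresentationTime, NULL);
  // Tell the downstream object (the framer) that data is available:
  FramedSource::afterGetting(this);
}
```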
I then connected the output of my class to an
"MPEG1or2AudioStreamFramer" by calling:
CMySource* src = CMySource::createNew(...);
MPEG1or2AudioStreamFramer::createNew(envir(), src);
In the existing code, there is also a class derived from
OnDemandServerMediaSubsession; let's call it CMySubsession. It implements:
- createNewStreamSource(): which returns an MPEG1or2AudioStreamFramer*
created as above
- createNewRTPSink(): which returns an MPEG1or2AudioRTPSink
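For reference, here is roughly what those two overrides look like in my subsession class; this is a sketch assuming the standard OnDemandServerMediaSubsession virtual-function signatures, and the estBitrate value and the arguments to CMySource::createNew() are placeholders, not my real parameters:

```cpp
// Sketch of the two CMySubsession overrides described above.
FramedSource* CMySubsession::createNewStreamSource(
    unsigned /*clientSessionId*/, unsigned& estBitrate) {
  estBitrate = 128; // kbps; illustrative value only
  CMySource* src = CMySource::createNew(envir() /*, ...*/);
  return MPEG1or2AudioStreamFramer::createNew(envir(), src);
}

RTPSink* CMySubsession::createNewRTPSink(
    Groupsock* rtpGroupsock,
    unsigned char /*rtpPayloadTypeIfDynamic*/,
    FramedSource* /*inputSource*/) {
  return MPEG1or2AudioRTPSink::createNew(envir(), rtpGroupsock);
}
```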
Now, if I don't add the audio sub-session, the video plays fine in
VLC. If I do add the audio sub-session, the destination address in the
"groupsock" for the video stream stays at 0.0.0.0, and thus nothing is
sent. On the VLC side, I also noticed that no SDP description is
received, and the server closes the RTSP TCP connection after about 10
seconds.
The MP3 file itself looks OK: I can stream it using testMP3Streamer.
Thanks a lot for any help!
Best regards,
Fabrice