[Live-devel] Streaming Relay
Wissam
wissam.l at free.fr
Sat Oct 8 13:37:22 PDT 2005
Hi all,
Could anyone help with this, please?
Thanks & Regards,
Wissam
> -----Original Message-----
> From: live-devel-bounces at ns.live555.com
> [mailto:live-devel-bounces at ns.live555.com] On behalf of wissam.l at free.fr
> Sent: Friday, 7 October 2005 15:15
> To: LIVE555 Streaming Media - development & use
> Subject: Re: [Live-devel] Streaming Relay
>
> Hi Ross,
>
> First of all, thank you for your answer.
>
> > > I would like to create a unicast streaming relay called Relay that
> > > launches 2 threads
> >
> > No you don't. See <http://www.live555.com/liveMedia/faq.html#threads>
>
> I removed all the threads from my program and now use a single class, called
> Relay, in which I define the same UsageEnvironment and TaskScheduler for all
> my objects.
>
> I defined a global variable:
> FramedSource* fVideoSource;
> In Relay I do the following:
>
> setupStreams();
> iter.reset();
> madeProgress = False;
> while ((subsession = iter.next()) != NULL) {
>   if (subsession->readSource() == NULL) continue;
>   setVideoSource(subsession->readSource());
>   madeProgress = True;
>   break;
> }
> startPlayingStreams();
> launchServer(getVideoSource());
>
> return 0;
> *****************************************************************
> void Relay::setVideoSource(FramedSource* source) {
>   fVideoSource = source;
> }
>
> FramedSource* Relay::getVideoSource() {
>   return fVideoSource;
> }
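>
> (For completeness, the single shared environment is created once at start-up
> in the usual way; a minimal sketch, assuming the standard BasicTaskScheduler
> and BasicUsageEnvironment classes:)
>
> TaskScheduler* scheduler = BasicTaskScheduler::createNew();
> UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);
> // Every liveMedia object in Relay (client side and server side) is created
> // with this same *env, so everything runs in one event loop.
>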
> Then, as soon as I have the reference to that video source, I reuse the code
> from testOnDemandRTSPServer on it, as described in my previous email.
>
> That is what the method launchServer(getVideoSource()), defined in the same
> class Relay, is supposed to do.
>
> void launchServer(FramedSource* source) {
>   {
>     char const* streamName = "x";
>     ServerMediaSession* sms
>       = ServerMediaSession::createNew(*env, streamName, streamName,
>                                       descriptionString);
>     *env << sms->addSubsession(MPEG4ESVideoServerMediaSubsession
>                                ::createNew(*env, reuseFirstSource, source));
>     rtspServer->addServerMediaSession(sms);
>     *env << "running...";
>     announceStream(rtspServer, sms, streamName);
>   }
>
>   env->taskScheduler().doEventLoop(); // does not return
> }
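>
> (The rtspServer used above was created earlier in the usual way; roughly this
> sketch, where 8556 is the port that shows up in the trace below:)
>
> RTSPServer* rtspServer = RTSPServer::createNew(*env, 8556);
> if (rtspServer == NULL) {
>   *env << "Failed to create RTSP server: " << env->getResultMsg() << "\n";
>   exit(1);
> }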
>
> I am able to set up an RTSP session with the encoder, and I succeed in
> creating a media session and a media subsession. When I request the relayed
> stream in question using VLC, I get the SDP back correctly.
>
> It is still giving me exactly the same error, though:
>
> parseRequestString() returned cmdName "PLAY", urlPreSuffix "", urlSuffix "x"
>
> >> fOurServerMediaSession(0x80caca0)
> PLAY rtsp://192.165.2.234:8556/x RTSP/1.0
> CSeq: 4
> Session: 1
> Range: npt=0.000-
> User-Agent: VLC Media Player (LIVE.COM Streaming Media v2005.07.19)
>
> [ 260 ] startStream()- OnDemandServerMediaSubsession.cpp : here we start the stream
> [ 72 ] startPlaying()- MediaSink.cpp : here
> [ 83 ] startPlaying()- MediaSink.cpp : here
> [ 68 ] getNextFrame()- FramedSource.cpp : fIsCurrentlyAwaitingData (nil)
> [ 69 ] getNextFrame()- FramedSource.cpp : attempting to get the next frame
> [ 68 ] getNextFrame()- FramedSource.cpp : fIsCurrentlyAwaitingData (nil)
> [ 69 ] getNextFrame()- FramedSource.cpp : attempting to get the next frame
> [ 68 ] getNextFrame()- FramedSource.cpp : fIsCurrentlyAwaitingData 0x1
> FramedSource[0x80c9c78]::getNextFrame(): attempting to read more than once at the same time!
> [ 69 ] getNextFrame()- FramedSource.cpp : attempting to get the next frame.
>
>
> So it definitely seems that the threads I was using were not the origin of
> the problem.
>
> - Could you please see why I am not able to read my source?
> - Any alternative solution for doing a unicast relay of live streams (rather
>   than files) with RTSP signalling would be a great help.
>
> Thanks a lot for your attention.
>
>
>
>
> Quoting wissam.l at free.fr:
>
> > Hi,
> >
> > I would like to create a unicast streaming relay called Relay that launches
> > 2 threads: one thread for a relayClient and another one for a relayServer.
> >
> > I have a streamer (a testOnDemandRTSPServer) that streams m4v files on
> > demand. My relayClient would be the equivalent of the openRTSP client and
> > playCommon merged, and my relayServer would be the testOnDemandRTSPServer
> > modified to support live streaming. My purpose is to get the live stream
> > from the streamer with my relayClient and stream it out again with my
> > relayServer.
> >
> > So basically, what I do in my client is get a reference to fReadSource, as
> > defined in the MediaSession object, which is set up in this way:
> >
> > ***********************MediaSession**********************************
> > } else if (strcmp(fCodecName, "MP4V-ES") == 0) {
> >   fReadSource = fRTPSource
> >     = MPEG4ESVideoRTPSource::createNew(env(), fRTPSocket,
> >                                        fRTPPayloadFormat,
> >                                        fRTPTimestampFrequency);
> > ***********************MediaSession**********************************
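> >
> > (For context, that branch is reached through the usual client-side setup in
> > setupStreams(), roughly sketched below; sdpDescription is assumed to come
> > from the server's DESCRIBE response:)
> >
> > MediaSession* session = MediaSession::createNew(*env, sdpDescription);
> > MediaSubsessionIterator iter(*session);
> > MediaSubsession* subsession;
> > while ((subsession = iter.next()) != NULL) {
> >   if (subsession->initiate()) {
> >     // After initiate(), subsession->readSource() is the
> >     // MPEG4ESVideoRTPSource created in the branch quoted above.
> >   }
> > }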
> >
> > ***********************relayClient start**********************************
> > setupStreams();
> > iter.reset();
> > madeProgress = False;
> > while ((subsession = iter.next()) != NULL) {
> >   if (subsession->readSource() == NULL) continue;
> >
> >   setVideoSource(subsession->readSource());
> >   // to pass the reference to the relayServer
> >
> >   madeProgress = True;
> >   break;
> > }
> > if (!madeProgress) shutdown();
> > // Note that for the moment I have only one video subsession.
> >
> > void relayClient::setVideoSource(FramedSource* source) {
> >   fVideoSource = source;
> > }
> >
> > FramedSource* relayClient::getVideoSource() {
> >   return fVideoSource;
> > }
> > ***********************relayClient end**********************************
> >
> >
> > In Relay:
> > - Client is relayClient.
> > - Server is relayServer.
> >
> > I launch the relayClient thread, then I pass the fReadSource reference to
> > the relayServer:
> >
> > server->setVideoSource(shr->client->getVideoSource());
> > server->launchServer(shr->env_server);
> >
> > And I launch the server thread later on.
> >
> > So I definitely do have the fVideoSource reference in my relayServer, which
> > does the following:
> >
> > {
> >   char const* streamName = "x";
> >
> >   ServerMediaSession* sms
> >     = ServerMediaSession::createNew(*env1, streamName, streamName,
> >                                     descriptionString);
> >
> >   *env1 << sms->addSubsession(MPEG4ESVideoServerMediaSubsession
> >                               ::createNew(*env1, reuseFirstSource, fVideoSource));
> >   rtspServer->addServerMediaSession(sms);
> >   *env1 << "running...";
> >
> >   announceStream(rtspServer, sms, streamName);
> > }
> >
> > env1->taskScheduler().doEventLoop(); // does not return
> > }
> >
> > So then I defined my own server subsession class, called
> > MPEG4ESVideoServerMediaSubsession, into which I can inject fVideoSource
> > instead of a file name... (I was inspired by what has been done in
> > MPEG4ESVideoFileServerMediaSubsession.)
> >
> > Later on, in the method createNewStreamSource, I use that same fVideoSource
> > reference instead of creating a ByteStreamFileSource* fileSource:
> >
> >
> > FramedSource* MPEG4ESVideoServerMediaSubsession
> > ::createNewStreamSource(unsigned /*clientSessionId*/, unsigned& estBitrate) {
> >   estBitrate = 500;
> >   return MPEG4VideoStreamFramer::createNew(envir(), fVideoSource);
> > }
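> >
> > (The matching createNewRTPSink in that subclass follows
> > MPEG4ESVideoFileServerMediaSubsession; roughly, as a sketch:)
> >
> > RTPSink* MPEG4ESVideoServerMediaSubsession
> > ::createNewRTPSink(Groupsock* rtpGroupsock,
> >                    unsigned char rtpPayloadTypeIfDynamic,
> >                    FramedSource* /*inputSource*/) {
> >   // Same sink type as the file-based subsession: MPEG-4 ES video over RTP.
> >   return MPEG4ESVideoRTPSink::createNew(envir(), rtpGroupsock,
> >                                         rtpPayloadTypeIfDynamic);
> > }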
> >
> > In the end, when I request the stream in question at the announced address
> > and port, the framed source blocks and says that I am trying to read twice
> > from the same source. I guess this comes from the method getNextFrame() in
> > FramedSource:
> >
> > if (fIsCurrentlyAwaitingData) {
> >   envir() << "FramedSource[" << this << "]::getNextFrame(): attempting to read more than once at the same time!\n";
> >   exit(1);
> > }
> >
> > But what would that mean?
> >
> > The curious thing is that the dummy source is played by the dummy sink, and
> > I can get the SDP lines correctly.
> >
> > Note that if I pass the same fVideoSource reference to the code of
> > testMPEG4VideoStreamer, used as a relayServer, in the method play():
> >
> > void play() {
> >   debug(">> relayServer::setVideoSource(%p)\n", fVideoSource);
> >   videoSource = MPEG4VideoStreamFramer::createNew(*env1, fVideoSource);
> >   *env1 << "Beginning to play...\n";
> >   videoSink->startPlaying(*videoSource, afterPlaying, videoSink);
> > }
> >
> > then I don't have any problem playing it, except that the
> > PassiveServerMediaSubsession approach is not a good solution for me.
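> >
> > (For reference, the passive setup from testMPEG4VideoStreamer that I would
> > rather avoid looks roughly like this, with videoSink and rtcp as in that
> > demo; just a sketch:)
> >
> > ServerMediaSession* sms
> >   = ServerMediaSession::createNew(*env1, "testStream", "relay",
> >                                   "Session streamed by relayServer");
> > // The subsession only describes a stream that play() is already sending:
> > sms->addSubsession(PassiveServerMediaSubsession::createNew(*videoSink, rtcp));
> > rtspServer->addServerMediaSession(sms);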
> >
> > I would like to know whether, in this case, I really need to write a
> > DeviceSource-style class.
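> >
> > (If it turns out to be needed, I imagine such a pass-through source would be
> > structured roughly like the DeviceSource pattern in liveMedia; the sketch
> > below is illustrative and untested, and the class name is made up:)
> >
> > class RelayPassThroughSource: public FramedSource {
> > public:
> >   static RelayPassThroughSource* createNew(UsageEnvironment& env,
> >                                            FramedSource* upstream) {
> >     return new RelayPassThroughSource(env, upstream);
> >   }
> >
> > protected:
> >   RelayPassThroughSource(UsageEnvironment& env, FramedSource* upstream)
> >     : FramedSource(env), fUpstream(upstream) {}
> >
> > private:
> >   // Called by our downstream reader (e.g. the MPEG4VideoStreamFramer):
> >   virtual void doGetNextFrame() {
> >     // Ask the upstream (RTP) source to deliver its next frame directly into
> >     // our reader's buffer; afterGettingFrame() runs when it arrives.
> >     fUpstream->getNextFrame(fTo, fMaxSize,
> >                             afterGettingFrame, this,
> >                             FramedSource::handleClosure, this);
> >   }
> >
> >   static void afterGettingFrame(void* clientData, unsigned frameSize,
> >                                 unsigned numTruncatedBytes,
> >                                 struct timeval presentationTime,
> >                                 unsigned durationInMicroseconds) {
> >     RelayPassThroughSource* src = (RelayPassThroughSource*)clientData;
> >     // Copy the frame bookkeeping, then tell our own reader a frame is ready:
> >     src->fFrameSize = frameSize;
> >     src->fNumTruncatedBytes = numTruncatedBytes;
> >     src->fPresentationTime = presentationTime;
> >     src->fDurationInMicroseconds = durationInMicroseconds;
> >     FramedSource::afterGetting(src);
> >   }
> >
> >   FramedSource* fUpstream;
> > };
> >
> > createNewStreamSource() would then wrap such an object in the framer instead
> > of using fVideoSource directly.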
> >
> > Do I miss something in the method createNewStreamSource?
> >
> > Any help is welcome.
> >
> > Best regards.
> >
>
>
> _______________________________________________
> live-devel mailing list
> live-devel at lists.live555.com
> http://lists.live555.com/mailman/listinfo/live-devel