Re: [Live-devel] Some questions need your help!

<blockquote type="cite" cite><font face="Arial">1. I want to write an
application. I hope it can receive and send video streaming at the
same time. In that case, should I open two BasicTaskSchedulers and two
UsageEnvironments? Or, just only one BasicTaskScheduler and one
UsageEnvironment is enough?</font></blockquote>

You can probably do what you want using just a single event loop -
i.e., one "UsageEnvironment" and one "BasicTaskScheduler".

> 2. I have a device which acts as a real-time MPEG4 video streaming
> server. It resides at rtsp://adm:12345@177.0.0.1:5000/udpstream. I can
> use the openRTSP example program to communicate with it. But I have
> read the openRTSP example code many times, and I still cannot
> understand one thing: where has the continuously received video stream
> gone? Where can I find it?

The output files - named "video-<something>" and "audio-<something>" -
should be stored on your computer, in the same directory from which you
ran the "openRTSP" application.

> 3. This is an extension to question 2. I have objects of
> MPEG4ESVideoRTPSink and MPEG4GenericRTPSink. Which one can be used to
> contain the video stream coming from the video streaming server?

Neither of these. "RTPSink"s are used to *send* RTP data. To receive
RTP data, you use an "RTPSource". The received data is then written to
a "MediaSink" subclass - e.g., a "FileSink" (which is what "openRTSP"
uses).

--
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/