I have figured out how to do this, so here is the basis of my code, plus some lingering unanswered questions... the code is mostly lifted from testMPEG2TransportStreamer.cpp.

#include <fstream>
#include <math.h>
#include <vector>

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"
#include "GroupsockHelper.hh"

// To stream using "source-specific multicast" (SSM), uncomment the following:
//#define USE_SSM 1
#ifdef USE_SSM
Boolean const isSSM = True;
#else
Boolean const isSSM = False;
#endif

// To set up an internal RTSP server, uncomment the following:
//#define IMPLEMENT_RTSP_SERVER 1
// (Note that this RTSP server works for multicast only)

#define TRANSPORT_PACKET_SIZE 188
#define TRANSPORT_PACKETS_PER_NETWORK_PACKET 7


class pipeToRTP {
 public:
  // Constructor and destructor
  pipeToRTP();

  // Primitive execution method
  void execute();

 protected:
  //-----------------------------------------------------------------------
  // Variable Declarations
  //-----------------------------------------------------------------------

  char rtpFileName_[L_tmpnam];
  string destAddressStr_;
  unsigned short rtpPortNum_;

  //-----------------------------------------------------------------------
  // Method Declarations
  //-----------------------------------------------------------------------
  void preamble();
  void setupRTP();
  void compute();
  void play();
  void postamble();

  // Buffers
  char cBuffer_[MAX_GRAB]; // Output of FEC is a type 1000 SB

  // RTP objects
  UsageEnvironment* env;
  FramedSource* videoSource;
  RTPSink* videoSink;
  BasicTaskScheduler* scheduler;
  ByteStreamFileSource* Source;
  ofstream outPipe_;
};


/**
 * pipeToRTP Constructor
 */
pipeToRTP::pipeToRTP() {
  tmpnam(rtpFileName_); // make a random filename
}


/**
 * Class Execution
 */
void pipeToRTP::execute() {
  preamble();
  m_sync();
  setupRTP();
  //compute(); // Called via setupRTP()
  postamble();
}


/**
 * Preamble
 * Initializes file headers, variables, and buffers
 */
void pipeToRTP::preamble() {}


/**
 * Main loop
 * This is where the data packets are pushed to a file stream
 */
void pipeToRTP::compute() {

  while (m_do(lper_, hin_.xfer_len)) {
    // Grab to our unix pipe
    m_grabx(hin_, cBuffer_, ngot_);
    if (ngot_ > 0) {
      // Send out the packet to the Unix pipe
      outPipe_.write(cBuffer_, ngot_); // write a block of data
      scheduler->SingleStep(0); // Made this guy public
    }
  }

}


/**
 * Postamble
 * Performs post-processing tasks such as closing file headers and freeing
 * memory.
 */
void pipeToRTP::postamble() {
#ifdef DEBUG
  *env << "...done reading from file\n";
#endif

  Medium::close(videoSource);
  // Note that this also closes the input file that this source read from.

  outPipe_.close();

  exit(1);
}


/**
 * Main Routine
 * Instantiates an instance of the class and calls the execution method
 */
void mainroutine() {
  pipeToRTP p;

  try {
    p.execute();
  }
  catch (...) {
    m_error("Primitive execution failed (pipeToRTP)");
  }
}


/**
 * setupRTP
 * Initializes RTP environment.
 */
void pipeToRTP::setupRTP() {
  // Begin by setting up our usage environment:
  scheduler = BasicTaskScheduler::createNew();
  env = BasicUsageEnvironment::createNew(*scheduler);

  ...

#ifdef DEBUG
  *env << "Beginning streaming...\n";
#endif

  outPipe_.open(rtpFileName_); // Open the file
  if (!outPipe_) {
    cerr << "Could not open fifo for output" << endl;
    exit(1);
  }

  play();
  compute();
}


void afterPlaying(void* /*clientData*/) {
  // Just for the compiler
}


void pipeToRTP::play() {
  unsigned const inputDataChunkSize
    = TRANSPORT_PACKETS_PER_NETWORK_PACKET*TRANSPORT_PACKET_SIZE;

  // Open the input as a 'byte-stream file source':
  Source = ByteStreamFileSource::createNew(*env, rtpFileName_, inputDataChunkSize);
  if (Source == NULL) {
    *env << "Unable to open file \"" << rtpFileName_
         << "\" as a byte-stream file source\n";
    exit(1);
  }

  // Create a 'framer' for the input source (to give us proper inter-packet gaps):
  videoSource = MPEG2TransportStreamFramer::createNew(*env, Source);

  // Finally, start playing:
#ifdef DEBUG
  *env << "Beginning to read from file...\n";
#endif
  videoSink->startPlaying(*videoSource, afterPlaying, videoSink);
}
Right, so the only way I could get this working was to use an ofstream object to write my data to, under a random filename which is then passed to ByteStreamFileSource::createNew. Also, the calls to play() and compute() HAD to be made from within setupRTP(), or else I would get a bad file descriptor error from select()... this probably has something to do with scope, but I have not investigated it further. Performance seems to be good; this uses minimal CPU cycles on my machine.
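One alternative I have been toying with (untested, so take it as a sketch only) is to cut out the temp file and ofstream entirely by writing a small FramedSource subclass along the lines of the DeviceSource skeleton that ships with liveMedia, and feeding it straight from cBuffer_ in compute(). The class and member names below (BufferSource, addData, fPending) are my own invention, not part of live555:

class BufferSource: public FramedSource {
public:
  static BufferSource* createNew(UsageEnvironment& env) {
    return new BufferSource(env);
  }

  // compute() would call this instead of outPipe_.write():
  void addData(char const* data, unsigned size) {
    // (A real version needs a bounded ring buffer, and locking if the
    // producer runs in a separate thread.)
    fPending.insert(fPending.end(), data, data + size);
  }

protected:
  BufferSource(UsageEnvironment& env): FramedSource(env) {}

private:
  virtual void doGetNextFrame() {
    if (fPending.empty()) {
      // Nothing buffered yet; poll again shortly.
      nextTask() = envir().taskScheduler().scheduleDelayedTask(
          10000 /*us*/, (TaskFunc*)retry, this);
      return;
    }

    // Deliver as much as the downstream object will take:
    fFrameSize = fPending.size();
    if (fFrameSize > fMaxSize) fFrameSize = fMaxSize;
    memcpy(fTo, &fPending[0], fFrameSize); // needs <string.h>
    fPending.erase(fPending.begin(), fPending.begin() + fFrameSize);
    gettimeofday(&fPresentationTime, NULL);

    FramedSource::afterGetting(this); // tell the downstream object we're done
  }

  static void retry(void* clientData) {
    ((BufferSource*)clientData)->doGetNextFrame();
  }

  vector<char> fPending;
};

play() would then wrap a BufferSource in MPEG2TransportStreamFramer just as it wraps the ByteStreamFileSource above, and compute() would call addData() followed by SingleStep(0) instead of writing to the pipe. Again, I have not tried this, so I may be missing something about how the scheduler wants to be driven.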
If anyone has a better alternative to ofstream, I would be happy to hear it!

Russell

On 6/19/07, Russell Brennan <rjbrennn@gmail.com> wrote:

> Hello,
>
> I am trying to take a buffer of MPEG2-TS data which is constantly written to, and hook it up to live555 for output via RTP. I am currently basing my work on the testMPEG2TransportStreamer code, but obviously that program was designed to stream a file.
>
> My current idea is not really panning out, because I am apparently confused... ByteStreamFileSource is used in the test program, but it seems that I will need a different type of source. Is there an existing MediaSource that will work, or will I need to create my own?
>
> Also, MPEG2TransportStreamFramer is used as the videoSource passed to videoSink->startPlaying. That would still seem to be valid for what I am doing, correct?
>
> Lastly, doEventLoop seems to call SingleStep() continuously, and from what I can make of that function, it appears to be where the outgoing data is packaged and sent. Assuming that I get an acknowledgement each time my buffer is ready to be sent out, can I simply call SingleStep() each time I have data to send?
>
> I hope I was clear enough... Thanks in advance!
>
> --
> Russell Brennan

--
Russell Brennan
RJBrennn@gmail.com
(708) 699-7314