<span class="q"><span>Would appreciate feedback
<blockquote type="cite"> <b>a) So I was looking for something which demultiplexes this TS into audio and video streams.</b></blockquote>
<blockquote type="cite">Currently when I look at the code, we do have modules to demultiplex program streams, multiplex into transport stream and also modules to convert from program stream to transport stream.</blockquote>
<blockquote type="cite"><i>But what I would really need is either a demultiplexer for the transport stream, or a converter from transport stream to program stream. Do we have either of these? I might have missed it.
</i></blockquote>
<div><br> </div></span>
<div>No, our library currently has no support for *demultiplexing* Transport Streams. Sorry.</div></span><span></span><span>
<div><br> </div>
<div><font color="#33cc00">>>>>Thanks; I will have to look for something that does that and hook it into liveMedia. </font></div></span>
<div><font color="#33cc00"></font> </div>
<div><font color="#33cc00">I looked at an earlier post from June where you suggested the following for piping streams from an encoder (or, in my case, the audio (AAC) and video streams).</font></div><br>
<div><font color="#33ff33"><b><span style="COLOR: red"><font size="2">"run your MPEG-2 TS grabbing code in a separate process (that writes to stdout), and just pipe it to "testMPEG2TransportStreamer" using the command line - i.e., yourGrabberApplication | testMPEG2TransportStreamer. Alternatively, if your Transport Stream source is available as a file (e.g., device), then just run testMPEG2TransportStreamer < yourTransportStreamDeviceFileName".</font></span></b></font></div>
<div><font color="#33ff33"><b><span style="COLOR: red"><font size="2"></font></span></b></font> </div>
<div><font color="#33cc00">Could I do something similar to pipe two outputs? Whatever demultiplexer I ultimately decide to use outputs audio (AMR) and video (MP4V-ES). </font> <br>I know this is slightly out of scope, but I would appreciate your feedback. Would it be better in this case to read input from a socket, and make sure the demultiplexer sends its data onto a socket? Or would it be better to integrate some sort of demultiplexer with live555? If so: currently ByteStreamFileSource only reads from files - do you think it would be an issue to read a stream of bytes, or would it be better to create some sort of temporary files?
</div><span></span><span class="q">
<blockquote class="gmail_quote" style="PADDING-LEFT: 1ex; MARGIN: 0px 0px 0px 0.8ex; BORDER-LEFT: #ccc 1px solid">
<div><span>
<blockquote type="cite"> </blockquote>
<blockquote type="cite"><i><b>b) Another observation: even if I had the above support to demultiplex into two elementary streams (one would be MP4V-ES, the other an AAC audio elementary stream).
</b></i></blockquote>
<blockquote type="cite"><i>If my understanding is correct, we don't have support for an AAC elementary stream, right? How hard would it be for me to add this support?</i></blockquote>
<blockquote type="cite"> Alternatively, SimpleRTPSink might be usable for any type of data, but we have no framer class for the AAC audio type, right?</blockquote>
<div><br> </div></span>
<div>AAC audio should be streamed using the "MPEG4GenericRTPSink" class. (See the "ADTSAudioFileServerMediaSubsession" for an example.)</div>
<div><br> </div>
<div>Note also that we have a perfectly good RTSP server implementation. You don't need to use a separate 'Darwin Streaming Server'.</div></div></blockquote>
<div> </div></span><span></span>
<div><font color="#33cc00">Thanks. I could use something like this, right?</font></div>
<div><font color="#33cc00">I can create the audio sink as you suggested, apart from a few parameters (such as the number of channels) which I can look into. I have enclosed a snippet of code:</font></div>
<div><font color="#33cc00"></font> </div>
<div><font color="#33cc00">My remaining doubt is which framer class I would use for the AAC elementary stream?</font></div>
<div><font color="#000000">
<p>void play() {<br> // Open the first input file (the video elementary stream) as a 'byte-stream file source':<br> ByteStreamFileSource* fileSource1<br> = ByteStreamFileSource::createNew(*env, inputFileName1);</p>
<p> // Open the second input file (the AAC audio elementary stream) likewise:<br> ByteStreamFileSource* fileSource2<br> = ByteStreamFileSource::createNew(*env, inputFileName2);</p>
<p> FramedSource* videoES = fileSource1;<br> FramedSource* audioES = fileSource2;</p>
<p> // Create a framer for the video elementary stream:<br> videoSource = MPEG4VideoStreamFramer::createNew(*env, videoES);<br> <font color="#ff0000">audioSource = ?::createNew(*env</font>, audioES);</p>
<p> videoSink->startPlaying(*videoSource, afterPlaying, videoSink);<br> audioSink->startPlaying(*audioSource, afterPlaying, audioSink);<br>}<br></p></font></div>
<div>Thanks for your prompt response; I appreciate your advice.</div><span></span><span class="sg">
<div> </div></span>