[Live-devel] Queries

Brian D'Souza brian.vdsouza at gmail.com
Fri Aug 10 09:57:50 PDT 2007


Would appreciate feedback

 *a) So I was looking for something which demultiplexes this TS into audio
and video streams.*

Currently, when I look at the code, we have modules to demultiplex program
streams, multiplex into a transport stream, and also modules to convert from
a program stream to a transport stream.
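
For reference, if I read the test programs correctly, the existing program
stream path is wired up roughly like this (just a sketch; "env" and
"inputFileName" are placeholders):

  // Existing Program Stream demultiplexing path (not Transport Stream):
  ByteStreamFileSource* psSource
    = ByteStreamFileSource::createNew(*env, inputFileName);
  MPEG1or2Demux* psDemux = MPEG1or2Demux::createNew(*env, psSource);
  FramedSource* psVideoES = psDemux->newVideoElementaryStream();
  FramedSource* psAudioES = psDemux->newAudioElementaryStream();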

*But what I would really need is either a demultiplexer for the transport
stream, or rather a converter from transport stream to program stream. Do we
already have support for either, which I might have missed?*



No, our library currently has no support for *demultiplexing* Transport
Streams.  Sorry.


>>>>Thanks, I will have to look for something that does that and hook it into
liveMedia.

I looked at an earlier post from June where you had suggested the following for
piping streams from an encoder, or in my case the audio (AAC) and video streams:

*"run your MPEG-2 TS grabbing code in a separate process (that **writes to
stdout), and just pipe it to "testMPEG2TransportStreamer" **using the
command line - i.e.,** yourGrabberApplication | testMPEG2TransportStreamer.
**Alternatively, if your Transport Stream source is available as a **med
file (e.g., device), then just run **    testMPEG2TransportStreamer <
yourTransportStreamDeviceFileName". *
**
Could I do something similar to pipe 2 outputs, since whatever demux I
ultimately decide to use outputs audio (AMR) and video (MP4V-ES)?
I know it is slightly out of scope, but I would appreciate your feedback. Would
it be better in this case to read input from a socket and make sure the demux
sends its data onto a socket? Or would it be better to integrate some sort of
demux with live555? If so, the ByteStreamFileSource currently only reads from
files (do you think it would be an issue to read a stream of bytes directly, or
would it be better to make some sort of temp files?).
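
For what it's worth, one arrangement I could imagine (only a sketch, assuming
the demux can write each elementary stream to a named pipe; the FIFO paths are
hypothetical) is to create two FIFOs, e.g. with
"mkfifo /tmp/video.m4e /tmp/audio.aac", have the demux write into them, and let
live555 open them as if they were ordinary files:

  // A FIFO looks like a regular file, so ByteStreamFileSource can read from it
  // directly, without any temp files (hypothetical paths):
  char const* videoFifoName = "/tmp/video.m4e";
  char const* audioFifoName = "/tmp/audio.aac";

  ByteStreamFileSource* videoFifoSource
    = ByteStreamFileSource::createNew(*env, videoFifoName);
  ByteStreamFileSource* audioFifoSource
    = ByteStreamFileSource::createNew(*env, audioFifoName);

Would that be a reasonable alternative to sockets or temp files?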

>
>
> *b) Another observation I made is that even if I got the above support to
> demultiplex into 2 elementary streams (one would be MP4V-ES and the other
> would be an AAC audio elementary stream):*
>
> *If my understanding is correct, we don't have support for an AAC ES, right?
> How hard would it be for me to incorporate that support?*
>
>  Alternatively, the SimpleRTPSink might be useful for any type of data, but
> we have no support for a framer class for the AAC audio type, right?
>
>
>
> AAC audio should be streamed using the "MPEG4GenericRTPSink" class.  (See
> the "ADTSAudioFileServerMediaSubsession" for an example.)
>
>
> Note also that we have a perfectly good RTSP server implementation.  You
> don't need to use a separate 'Darwin Streaming Server'.
>

Thanks. I could use something like this, right?
I can create the audio sink as you suggested, apart from a few parameters
like the number of channels etc., which I can look into. I have enclosed a
snippet of code:

I'm not sure what type of framer class I would use for the audio (AAC ES) -
hence the '?' in the code below.

void play() {
  // Open the first input file as a 'byte-stream file source' (the video ES):
  ByteStreamFileSource* fileSource1
    = ByteStreamFileSource::createNew(*env, inputFileName1);

  // Open the second input file as a 'byte-stream file source' (the AAC ES):
  ByteStreamFileSource* fileSource2
    = ByteStreamFileSource::createNew(*env, inputFileName2);

  FramedSource* videoES = fileSource1;
  FramedSource* audioES = fileSource2;

  // Create a framer for the video elementary stream:
  videoSource = MPEG4VideoStreamFramer::createNew(*env, videoES);
  // Which framer class goes here for the AAC elementary stream?
  audioSource = ?::createNew(*env, audioES);

  // (videoSink and audioSink are created elsewhere, not shown in this snippet.)
  videoSink->startPlaying(*videoSource, afterPlaying, videoSink);
  audioSink->startPlaying(*audioSource, afterPlaying, audioSink);
}
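
Following your pointer to "ADTSAudioFileServerMediaSubsession", I am guessing
the audio side could look something like the sketch below, assuming my AAC
elementary stream carries ADTS headers (the group socket "rtpGroupsockAudio"
and the payload type 96 are just placeholders):

  // If the AAC ES is ADTS-framed, "ADTSAudioFileSource" can serve as the
  // source/framer, and it exposes the parameters the sink needs:
  ADTSAudioFileSource* adtsSource
    = ADTSAudioFileSource::createNew(*env, inputFileName2);

  // Create the matching sink, as in "ADTSAudioFileServerMediaSubsession":
  audioSink = MPEG4GenericRTPSink::createNew(*env, &rtpGroupsockAudio,
                                             96, // dynamic payload type (placeholder)
                                             adtsSource->samplingFrequency(),
                                             "audio", "AAC-hbr",
                                             adtsSource->configStr(),
                                             adtsSource->numChannels());

  audioSink->startPlaying(*adtsSource, afterPlaying, audioSink);

Does that look like the right direction, or is a separate framer class still
needed on top of the byte-stream source?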
Thanks for your prompt response. I appreciate your advice.

