[Live-devel] Framed Source Issue

Mukherjee, Debargha debargha.mukherjee at hp.com
Fri Mar 27 15:22:11 PDT 2009


Thanks. How about the MPEG-4 video? I am currently encoding video frames into MPEG-4 and then using the MPEG4VideoStreamFramer class before feeding it into MPEG4ESVideoRTPSink. Is that correct?


> -----Original Message-----
> From: live-devel-bounces at ns.live555.com
> [mailto:live-devel-bounces at ns.live555.com] On Behalf Of Ross Finlayson
> Sent: Friday, March 27, 2009 2:42 PM
> To: LIVE555 Streaming Media - development & use
> Subject: Re: [Live-devel] Framed Source Issue
>
> On Mar 27, 2009, at 9:28 AM, "Mukherjee, Debargha"
> <debargha.mukherjee at hp.com> wrote:
>
> > Hi Ross,
> >
> > I am still somewhat confused.
> > The parameter fDurationInMicroseconds is being set correctly by me
> > in the deliverFrame() function of my AudioEncSource class before the
> > call to FramedSource::afterGetting(this). Could you point me to
> > where in your code it is actually used to decide when to make the
> > next call?
> >
> > In line 144 of MPEG1or2AudioStreamFramer::continueReadProcessing(),
> > as you mentioned, it is *set* anyway by your code, so however I set
> > it in my class does not seem to matter at all.
>
> If your audio source object delivers discrete audio frames (one at a
> time), then you should *not* feed it into a
> "MPEG1or2AudioStreamFramer". (That class is for parsing MPEG audio
> from an unstructured byte stream.) Instead, feed your input source
> directly to your "MPEG1or2AudioRTPSink".
> _______________________________________________
> live-devel mailing list
> live-devel at lists.live555.com
> http://lists.live555.com/mailman/listinfo/live-devel
>


More information about the live-devel mailing list