[Live-devel] Streaming live AMR and H263P

salick at videocells.com
Wed Mar 29 04:33:26 PST 2006


> Because your encoding application delivers discrete (H.263 video and
> AMR audio) frames, your "FramedSource" subclasses - which you
> will write - should also deliver discrete frames.  (Don't just
> concatenate the encoded frames into a buffer; otherwise you'll have
> to parse the data back into frames before they can be streamed.)
>
> Your H.263 video and AMR audio "FramedSource" subclasses will feed
> into an "H263plusVideoRTPSink" and an "AMRAudioRTPSink" (not a
> "*Source"), respectively.

For the AMR source I more or less understand what I need to implement, from
looking at AMRAudioFileSource: its doGetNextFrame() copies the frame into the
fTo buffer, sets fFrameSize, fPresentationTime and fDurationInMicroseconds,
and then schedules the hand-off via nextTask().
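
If I read that file source correctly, the hand-off at the end of
doGetNextFrame() is essentially this (my paraphrase, not a verbatim quote of
the live555 code):

    // fTo, fFrameSize, fPresentationTime and fDurationInMicroseconds are
    // already filled in; hand the frame to the downstream object by
    // scheduling afterGetting() on the event loop:
    nextTask() = envir().taskScheduler().scheduleDelayedTask(
        0, (TaskFunc*)FramedSource::afterGetting, this);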

But there is no file-source implementation for H.263+, so how should I
implement doGetNextFrame(), assuming I have one encoded H.263 frame in a byte
array (let's call it frameBuffer)?
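
Here is roughly what I have in mind - does this look right? Everything except
the live555 names is mine: getNextEncodedFrame() stands in for however my
encoder hands over a frame, and the 30 fps duration is just an example.

    #include "FramedSource.hh"
    #include "GroupsockHelper.hh" // for gettimeofday()
    #include <string.h>

    // Sketch of a source that delivers one encoded H.263+ frame per call.
    class MyH263plusVideoSource: public FramedSource {
    public:
      static MyH263plusVideoSource* createNew(UsageEnvironment& env) {
        return new MyH263plusVideoSource(env);
      }

    protected:
      MyH263plusVideoSource(UsageEnvironment& env): FramedSource(env) {}

    private:
      virtual void doGetNextFrame() {
        // Placeholder for however the encoder actually delivers a frame:
        unsigned char* frameBuffer = NULL;
        unsigned frameSize = 0;
        getNextEncodedFrame(&frameBuffer, &frameSize);

        // Copy the frame into the downstream object's buffer, truncating
        // if it doesn't fit:
        if (frameSize > fMaxSize) {
          fNumTruncatedBytes = frameSize - fMaxSize;
          fFrameSize = fMaxSize;
        } else {
          fNumTruncatedBytes = 0;
          fFrameSize = frameSize;
        }
        memmove(fTo, frameBuffer, fFrameSize);

        // Presentation time and nominal frame duration (assuming ~30 fps):
        gettimeofday(&fPresentationTime, NULL);
        fDurationInMicroseconds = 1000000 / 30;

        // Hand the completed frame to the downstream object via the
        // event loop, the same way the file sources do:
        nextTask() = envir().taskScheduler().scheduleDelayedTask(
            0, (TaskFunc*)FramedSource::afterGetting, this);
      }

      // Placeholder for my encoder interface; not part of live555.
      void getNextEncodedFrame(unsigned char** buf, unsigned* size);
    };

I also suspect that a real live source should deliver asynchronously from the
encoder's callback (along the lines of DeviceSource.cpp) instead of blocking
inside doGetNextFrame() - is that correct?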





