[Live-devel] Streaming live AMR and H263P

salick at videocells.com
Wed Mar 29 01:06:29 PST 2006


Thanks, I'll try doing what you said.
Two more questions:
1. I see that RTCPInstance::createNew() takes a sink and a source.
   Which sink should I give it if I have two sinks (one video, one audio)?
   And what is the source parameter for? Should I just pass it NULL,
   as in the example? (My current guess is sketched below.)
2. When I'm delivering discrete frames from the AMR source, how exactly
   do I do that? What is a "frame" in audio?
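For reference, here's my current guess for the RTCP setup, copied loosely
from the testMPEG1or2AudioVideoStreamer demo: one RTCPInstance per medium,
each paired with its own RTP sink. The groupsock variables and bandwidth
figures below are placeholders from my own code:

    // One RTCP instance per medium; "videoSink"/"audioSink" are my
    // "H263plusVideoRTPSink" and "AMRAudioRTPSink" objects, and the
    // groupsocks and bandwidth estimates are my own placeholders.
    const unsigned maxCNAMElen = 100;
    unsigned char CNAME[maxCNAMElen + 1];
    gethostname((char*)CNAME, maxCNAMElen);
    CNAME[maxCNAMElen] = '\0';

    RTCPInstance* videoRTCP = RTCPInstance::createNew(*env,
        &rtcpGroupsockVideo, 500 /* est. video bandwidth, kbps */, CNAME,
        videoSink, NULL /* source: NULL, since we only send */);
    RTCPInstance* audioRTCP = RTCPInstance::createNew(*env,
        &rtcpGroupsockAudio, 12 /* est. audio bandwidth, kbps */, CNAME,
        audioSink, NULL);

Is one RTCPInstance per sink the right approach?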


> At 06:01 AM 3/28/2006, you wrote:
>>Hello,
>>
>>I have an application that captures video frames and mic audio and then
>>converts them to H.263+ and AMR, respectively. How can I stream from a
>>buffer that is being written to in real time (one buffer for video
>>frames and one for audio)?
>
> See <http://www.live555.com/liveMedia/faq.html#liveInput>
>
> Because your encoding application delivers discrete (H.263 video and
> AMR audio) frames, the "FramedSource" subclasses that you will write
> should also deliver discrete frames.  (Don't just concatenate the
> encoded frames into a buffer; otherwise you'll have to parse the data
> back into frames before they can be streamed.)
>
> Your H.263 video and AMR audio "FramedSource" subclasses will feed
> into a "H263plusVideoRTPSink" and an "AMRAudioRTPSink" (not a
> "*Source"), respectively.
>
>
> 	Ross Finlayson
> 	Live Networks, Inc. (LIVE555.COM)
> 	<http://www.live555.com/>
>
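Meanwhile, here's the skeleton I've started for the video side, modeled on
liveMedia/DeviceSource.cpp from the FAQ (the myEncoderReadFrame() helper is
a placeholder for however my code pulls one complete encoded frame out of
the real-time buffer; I assume the AMR side gets a similar subclass):

    #include "FramedSource.hh"
    #include <sys/time.h>

    class H263plusLiveSource: public FramedSource {
    public:
      static H263plusLiveSource* createNew(UsageEnvironment& env) {
        return new H263plusLiveSource(env);
      }

    private:
      H263plusLiveSource(UsageEnvironment& env): FramedSource(env) {}

      virtual void doGetNextFrame() {
        // Copy exactly one encoded frame into the buffer the downstream
        // "H263plusVideoRTPSink" handed us (never more than fMaxSize):
        fFrameSize = myEncoderReadFrame(fTo, fMaxSize);
        gettimeofday(&fPresentationTime, NULL);
        // Tell the sink that the frame is ready:
        FramedSource::afterGetting(this);
      }
    };

I realize a real version of doGetNextFrame() mustn't block when no frame is
available yet (DeviceSource.cpp shows how to return and deliver the frame
later, from the event loop), but does this look like the right structure?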