[Live-devel] Streaming live AMR and H263P

Ross Finlayson finlayson at live555.com
Wed Mar 29 01:46:31 PST 2006


>1. I see that RTCPInstance::createNew() gets a sink and a source.
>    what sink should i give it if I have two sinks (video and audio)?

You must create a separate "RTCPInstance" for each "RTPSink".

>    what is the source parameter that it gets? should I just give it NULL
>    like in the example?

Yes (because you are a sender, not a receiver, and so don't use an "RTPSource").
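
To illustrate both points, here is a sketch (not compilable on its own) of creating one "RTCPInstance" per "RTPSink", with NULL as the "RTPSource" parameter.  It assumes the live555 headers are included and that "env", the two RTCP "Groupsock"s, and the "audioSink"/"videoSink" objects (both "RTPSink*") already exist; the bandwidth numbers are placeholders you would tune for your session:

```cpp
// Assumed to exist already: env (UsageEnvironment*), rtcpGroupsockAudio,
// rtcpGroupsockVideo (Groupsock*), audioSink, videoSink (RTPSink*).
unsigned const estSessionBandwidthAudio = 16;   // kbps; used to size RTCP traffic
unsigned const estSessionBandwidthVideo = 500;  // kbps; placeholder value
unsigned char const CNAME[] = "myhost";         // normally set from gethostname()

// One "RTCPInstance" per "RTPSink"; the RTPSource* argument is NULL
// because this application only sends:
RTCPInstance* audioRTCP = RTCPInstance::createNew(
    *env, rtcpGroupsockAudio, estSessionBandwidthAudio, CNAME,
    audioSink /*sink*/, NULL /*source*/);
RTCPInstance* videoRTCP = RTCPInstance::createNew(
    *env, rtcpGroupsockVideo, estSessionBandwidthVideo, CNAME,
    videoSink /*sink*/, NULL /*source*/);
```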

>2. When I'm delivering discrete frames for the AMR source, how am I doing
>this exactly? What is a "frame" in audio?

Basically the same as in video.  Look at what gets output from your 
AMR encoder.  This output will be in discrete units of data - each 
representing the encoded audio from one particular time 
interval.  (For AMR, I think each frame is 20 ms of audio.)


	Ross Finlayson
	Live Networks, Inc. (LIVE555.COM)
	<http://www.live555.com/>
