[Live-devel] Problems with Streaming of AMR
Ross Finlayson
finlayson at live555.com
Tue Oct 31 10:34:58 PST 2006
>So as you suggested I looked into the wis-streamer code as well and
>included the part for the AMR-header:
>
>enum Mode ourAMRMode = MR122; // the only mode that we support
>fLastFrameHeader = toc_byte[ourAMRMode];
>fNumTruncatedBytes = 0;
>
>But on the receiver side the frame is still too short:
>
>[amr_nb @ 0xb45fc008]amr frame too short (31, should be 32)
>
>Maybe I'm still missing something?
You're probably forgetting - in your encoder - to adjust the returned
frame size to allow for the extra 1-byte header. Search for
"EncoderIncludeHeaderByte" in "AMREncoder/interf_enc.c" in the
"wis-streamer" code, to see where I had to modify the existing FFMPEG
code to do this.
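To illustrate the point - this is just a sketch of mine, not the actual
wis-streamer/FFmpEG code, and it assumes 12.2 kb/s (MR122), non-interleaved
framing - the frame that you hand downstream has to be
<1-byte ToC header> + <encoded speech>, and the size that you report has to
count that header byte.  For MR122 the speech data is 31 bytes, so the
complete frame is 32 bytes - exactly the discrepancy your decoder reports:

#include <string.h>

// Sketch only: prepend the ToC byte, and count it in the reported size.
unsigned addAMRHeaderByte(unsigned char const* speech, unsigned speechSize,
                          unsigned char* outFrame) {
  outFrame[0] = 0x3C;                       // MR122 ToC byte: F=0, FT=7, Q=1
  memcpy(&outFrame[1], speech, speechSize); // 31 speech bytes at 12.2 kb/s
  return speechSize + 1;                    // 32, not 31
}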
>Another thing that could be the problem is the following: As the
>transmitted AMR frames are not interleaved (because interleaving is not
>supported), I used the "RawAMRRTPSource" rtp_source instead of the AMRAudioSource.
>
>RTPSource* rtp_source = 0;
>AMRAudioSource* audio_source = AMRAudioRTPSource::createNew(env,
>rtp_sock, rtp_source, 97);
>// audio_source is not used further; instead, rtp_source is used.
No, your receiving application must read from the object that's
returned by "AMRAudioRTPSource::createNew()" - in your case,
"audio_source".
>
>Is it correct that I can use the rtp_source?
No (see above).
>What is the difference between
>these two? If I were to use the audio_source, what would be the purpose of
>rtp_source in this example?
"rtp_source" would then be used only for RTP-specific operations,
*not* for reading from the incoming stream (you would use
"audio_source" for this). For example, you would use "rtp_source"
when creating your "RTCPInstance" object, which you should be doing,
because you should be using RTCP (at both ends) as well as RTP.
See, for example, the code in "MediaSession.cpp" that creates and
uses an "AMRAudioRTPSource".
--
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/