<p>From: Ross Finlayson <<a href="mailto:finlayson@live555.com" target="_blank">finlayson@live555.com</a>><br>To: LIVE555 Streaming Media - development & use <<a href="mailto:live-devel@ns.live555.com" target="_blank">live-devel@ns.live555.com</a>><br>
Date: Wed, 19 Mar 2008 23:18:49 -0700<br>Subject: Re: [Live-devel] live 555 rtsp library for mplayer<br>>>When you stream audio over live-555's RTSP plugin on mplayer, what<br>>>decides the<br>>>audio data to be big endian vs little endian?</p>
<p>>16-bit PCM audio data - when streamed using RTP using an<br>>IETF-standard RTP payload format - is *always* in Big Endian order<br>>(called "Network Byte Order" in the RFCs). That's the standard[*].</p>
<div>I am developing the server for an embedded device; I was hoping I could avoid that, but alas!</div>
<div>It sucks that the mplayer code actually supports a whole slew of combinations (any of the </div>
<div>types specified in the file ad_pcm.c, function static int init(sh_audio_t *sh_audio)). The code<br>checks sh_audio->format to match a data format; for the data to be 16-bit, little-endian, </div>
<div>it has to be one of these formats: 'sowt' (0x74776F73), 0x0, 0x1, 0xfffe.</div>
<div>If I could specify a format string that would fall into one of these cases, I would have been</div>
<div>in good shape.</div>
<div><br>>>I am developing a server and I am<br>>>specifying the format as 97 (dynamic format) with rtpmap being : 96 L16/8000<br>>>for single channel, 16 bit/sample, 8 khz sample rate audio. But<br>>>mplayer somehow<br>
>>thinks audio data is in big endian format and attaches a BE -> LE<br>>>conversion filter</div>
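<div> </div>
<div>(For reference, the SDP lines for mono L16 at 8 kHz with dynamic payload type 96 would look something like the fragment below; note that the payload type number in the m= line and in the rtpmap attribute must match:)</div>
<pre>
m=audio 0 RTP/AVP 96
a=rtpmap:96 L16/8000/1
</pre>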
<div> </div>
<div>>It is correct; your server is not. If your server's PCM data is<br>>originally in Little Endian order, then you need to insert a LE->BE<br>>filter in your server. If you are using our libraries to build your<br>
>server, then this involves just inserting a "EndianSwap16" filter<br>>object in front of your "SimpleRTPSink". (If you are streaming from<br>>WAV audio files, then you could just use the<br>
>"WAVAudioFileServerMediaSubsession" class, which does this for you.<br>>Note also our "LIVE555 Media Server", which can stream ".wav" audio<br>>files.)</div>
<div> </div>
<div>I am not using the live555 libraries because of the lightweight requirement of </div>
<div>this project, so I need to come up with an optimized version of the byte </div>
<div>swapper. I wonder if there would be a way to read the data backwards from </div>
<div>the A/D converter that we use for audio.</div>
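<div> </div>
<div>(For what it's worth, a minimal byte-swapping sketch with no live555 dependency; the function name is made up, and on most compilers this loop - or a builtin such as __builtin_bswap16 where available - is already close to optimal:)</div>
<pre>
#include <stddef.h>
#include <stdint.h>

// Swap each 16-bit PCM sample from little-endian (host) order to
// big-endian (network) order, in place. numSamples is a count of
// 16-bit samples, not bytes.
static void swapSamples16(uint16_t* samples, size_t numSamples) {
  for (size_t i = 0; i < numSamples; ++i) {
    uint16_t s = samples[i];
    samples[i] = (uint16_t)((s << 8) | (s >> 8));
  }
}
</pre>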
<div><br>>[*] The reason for this is that IETF protocol standards began in an<br>>era when most computers on the Internet were Big Endian computers<br>>like Sun workstations (which originally used the Motorola 68xxx<br>
>architecture). Back then, computers that used the (Little Endian)<br>>Intel 8086 architecture were (generally speaking) too underpowered to<br>>be used as Internet nodes. If we had known back then that the x86<br>
>architecture would come to dominate the industry, then perhaps things<br>>would have been done differently....</div>
<div> </div>
<div>The standards definitely lag behind today's computing and multimedia needs.</div>
<div>I was looking at the RFC for media specification - the fourcc and character-</div>
<div>based stream identification all seems like a joke to me ... they need new</div>
<div>revisions, like many other RFCs.</div>
<div> </div>
<div>Ratin </div>
<div> </div>
<div>Ross Finlayson<br>Live Networks, Inc.<br><a href="http://www.live555.com/" target="_blank">http://www.live555.com/</a></div>