[Live-devel] Send and receive wav (streaming)

Ross Finlayson finlayson at live.com
Wed Apr 14 09:55:12 PDT 2004


At 05:23 AM 4/14/04, you wrote:
>We still have trouble with receiving wav files...

Note that if you're simply writing the received PCM audio data (taken from 
RTP packets) into a file, then you're not "receiving WAV files".  Instead, 
you're just "receiving raw PCM audio files".  Such files will not play in 
any media player that I'm aware of.  (Just giving the files a ".wav" file 
extension won't work :-)

Instead, you will need to add a WAV file header to the front of the 
data.  (See "WAVAudioFileSource.cpp" for help in figuring out what a WAV 
file header should look like.)
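For reference, a canonical 44-byte PCM WAV header can be built as in the following self-contained sketch. (The helper name "makeWavHeader" is illustrative, not part of the library; "WAVAudioFileSource.cpp" remains the authoritative reference for the field layout.)

```cpp
#include <cstdint>
#include <cstring>
#include <vector>

// Build a canonical 44-byte WAV (RIFF) header for raw PCM data.
// All multi-byte fields in a WAV file are little-endian.
std::vector<uint8_t> makeWavHeader(uint32_t numDataBytes,
                                   uint16_t numChannels,
                                   uint32_t sampleRate,
                                   uint16_t bitsPerSample) {
    std::vector<uint8_t> h(44);
    uint32_t byteRate   = sampleRate * numChannels * bitsPerSample / 8;
    uint16_t blockAlign = numChannels * bitsPerSample / 8;
    uint32_t riffSize   = 36 + numDataBytes; // file size minus "RIFF"+size

    auto put16 = [&](size_t off, uint16_t v) { // little-endian write
        h[off] = v & 0xFF; h[off + 1] = v >> 8;
    };
    auto put32 = [&](size_t off, uint32_t v) {
        for (int i = 0; i < 4; ++i) h[off + i] = (v >> (8 * i)) & 0xFF;
    };

    std::memcpy(&h[0],  "RIFF", 4); put32(4, riffSize);
    std::memcpy(&h[8],  "WAVE", 4);
    std::memcpy(&h[12], "fmt ", 4); put32(16, 16); // "fmt " chunk size
    put16(20, 1);                                  // audio format 1 = PCM
    put16(22, numChannels);
    put32(24, sampleRate);
    put32(28, byteRate);
    put16(32, blockAlign);
    put16(34, bitsPerSample);
    std::memcpy(&h[36], "data", 4); put32(40, numDataBytes);
    return h;
}
```

Write this header to the file first, then append the raw PCM data, and the result should open in ordinary media players.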

>It seems like we are missing some headers or something like
>that

Exactly.  (BTW, you could have figured this out by comparing the received 
file data with the original file data that you sent.)

>In testWAVStreamer the function NetworkFromHostOrder16() is called

Note the modified code in the newest version of the "LIVE.COM Streaming 
Media" release.  This should now be "EndianSwap16::createNew()" (*not* 
"NetworkFromHostOrder16::createNew()").  (The old code did not work 
correctly on big-endian architectures, like Macs.)
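What this filter does to the PCM payload is simply a per-sample byte swap. In isolation (with an illustrative function name, not the library's code) it amounts to:

```cpp
#include <cstddef>
#include <cstdint>

// Swap each 16-bit sample's two bytes in place, converting between
// big-endian (network order) and little-endian. Applying it twice
// returns the original data, which is why the same operation works
// in both directions.
void swap16InPlace(uint8_t* data, size_t numBytes) {
    for (size_t i = 0; i + 1 < numBytes; i += 2) {
        uint8_t tmp = data[i];
        data[i] = data[i + 1];
        data[i + 1] = tmp;
    }
}
```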

>do we have
>to call the corresponding function on the receiver end?

Yes, you need to insert a filter (another "EndianSwap16") that converts 
back from network byte order (i.e., big-endian) to little-endian byte order 
(which is how data is stored in WAV files).

>  If so, where?

The "EndianSwap16" filter needs to go between the RTP source (a 
"SimpleRTPSource" in your case) and the "FileSink" that you are writing to.
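The wiring on the receiver side might look roughly like the following sketch. (Untested; it assumes a configured "UsageEnvironment" and "Groupsock", and the payload format, timestamp frequency, and MIME string are placeholders that must match your sender.)

```cpp
// Sketch only: RTP source -> EndianSwap16 filter -> file sink.
SimpleRTPSource* rtpSource
  = SimpleRTPSource::createNew(*env, &rtpGroupsock,
                               96 /*payload format*/, 8000 /*frequency*/,
                               "audio/L16");

// Insert the byte-order filter between the source and the sink:
FramedSource* swappedSource = EndianSwap16::createNew(*env, rtpSource);

FileSink* fileSink = FileSink::createNew(*env, "received.pcm");
fileSink->startPlaying(*swappedSource, afterPlaying, NULL);
env->taskScheduler().doEventLoop();
```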

>Another problem is that once we've started receiving the stream we want to 
>play
>it back in real-time. Any idea how to do this in redhat?

You should be able to play the stream using "VLC" 
<http://www.videolan.org/vlc/>, by giving it an SDP file that describes the 
stream.  (You can use "testMP3.sdp" as a model, but you will need to modify 
the "m=" and "c=" lines to change the multicast address, port number, and 
RTP payload format, as appropriate.)
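For example, an SDP file for a 16-bit mono PCM stream might look something like this. (A sketch only; the multicast address, port number, and payload format shown here are placeholders that must match what your sender actually uses. Payload type 11 is the static RTP/AVP assignment for L16 mono at 44.1 kHz.)

```
v=0
o=- 0 0 IN IP4 0.0.0.0
s=PCM audio session
t=0 0
c=IN IP4 239.255.42.42/255
m=audio 6666 RTP/AVP 11
a=rtpmap:11 L16/44100/1
```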

>Related to this we want
>to stream from the microphone on the sender side, how do we do this?

Sometime in the future the code will contain a "LinuxAudioInputDevice" 
source class that will deliver audio data for you.  But for now you will 
need to write your own source class that does this.  Sorry I can't help you 
more with this.


	Ross Finlayson
	LIVE.COM
	<http://www.live.com/>
