[Live-devel] issue with streaming ulaw from memory buffer

Ke Yu jingke2000 at gmail.com
Wed Apr 6 21:59:07 PDT 2011


Hi!

I'm writing some code to stream live audio in ulaw format over RTP
multicast. I derived a class from the FramedSource class to read the
data from a memory buffer; an audio recorder thread feeds ulaw audio
data into this buffer. In the derived class's "doGetNextFrame"
function I set the frame size to 128 bytes, the duration to 16000 us,
the presentation time, etc. The network sink is an instance of
"SimpleRTPSink". This seems straightforward, but the audio was very
broken when played back in VLC. I then found that each outgoing RTP
packet has a 1024-byte payload and that the packets arrive 128 ms
apart. I intended a 16-ms arrival interval with a 128-byte RTP payload
(which is the audio recorder's output frame size). Does this make
sense, and how should I achieve that?

Thanks!
