[Live-devel] CPU usage abnormalities

Michael Kostukov the-bishop at rogers.com
Fri Feb 17 16:31:24 PST 2006


Hello to all fellow developers:

I've developed server and client applications using the live555 library.
Recently I noticed the following VERY strange problems:

1) doEventLoop() on the client side consumes 100% CPU, and that's on a
3.0 GHz Pentium 4 (!). If I insert a 1 ms Sleep delay into the loop
(after the SingleStep() call), CPU usage drops to nearly 0%, but certain
complex sessions (those containing multiple audio streams, for example)
"starve", i.e. they can't read packets as fast as needed. Tweaking the
delay a little let me reduce CPU usage while maintaining smooth
playback, but this seems more of a hack than a proper solution.
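For what it's worth, the workaround I'm describing looks roughly like the sketch below. This is a self-contained simulation, not live555 code: FakeScheduler and doEventLoopThrottled are hypothetical stand-ins for TaskScheduler::SingleStep() and doEventLoop(), and the only twist over my current hack is that it sleeps only when the last step did no work, so busy sessions are still serviced at full speed:

```cpp
#include <chrono>
#include <thread>

// Hypothetical stand-in for live555's TaskScheduler::SingleStep();
// here it just counts calls and pretends an "event" is ready on
// every 4th call.
struct FakeScheduler {
    int steps = 0;
    bool SingleStep() { ++steps; return steps % 4 == 0; }
};

// Throttled event loop: sleep 1 ms only when SingleStep() had
// nothing to do, instead of sleeping unconditionally every pass.
// Returns how many idle sleeps occurred (for demonstration).
int doEventLoopThrottled(FakeScheduler& sched, volatile char& watchVariable) {
    int idleSleeps = 0;
    while (watchVariable == 0) {
        if (!sched.SingleStep()) {
            std::this_thread::sleep_for(std::chrono::milliseconds(1));
            ++idleSleeps;
        }
        if (sched.steps >= 100) watchVariable = 1; // demo stop condition
    }
    return idleSleeps;
}
```

With an event ready on every 4th of 100 steps, the loop sleeps on the other 75 passes but never delays a step that has work pending. I realize the real fix is probably to find out why SingleStep() returns immediately with nothing to do, which is what I'm asking about.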

I also noticed that the sample app openRTSP doesn't use any significant
CPU in its doEventLoop() (< 1% on average), even without any delay. Any
idea what might cause doEventLoop() to use all of the available CPU in
my client?

- The client receives 7 streams: 1 video, 3 audio, and 3 subtitle
- All streams use custom RTP payload formats I made up (X-MY-VID,
X-MY-AUD, etc.). I customized MediaSession.cpp to create a
SimpleRTPSource for each of these formats, and I set doNormalMBitRule
to True for each of them (so that large frames are split across
packets)
- Video and audio streams use a 90000 Hz timestamp frequency; subtitles
use 1000 Hz
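To be clear about what I mean by the normal M-bit rule: the receiver accumulates packet payloads until one arrives with the RTP marker bit set, which closes the current frame. A minimal self-contained sketch of that reassembly logic (RtpPacket and reassembleFrames are hypothetical types of mine, not live555 classes):

```cpp
#include <string>
#include <vector>

// Hypothetical RTP packet: payload bytes plus the RTP marker (M) bit.
// Under the "normal M-bit rule", the marker is set on the LAST packet
// of a frame.
struct RtpPacket {
    std::string payload;
    bool markerBit;
};

// Accumulate payloads until a packet with the marker bit arrives,
// which completes the current frame.
std::vector<std::string> reassembleFrames(const std::vector<RtpPacket>& packets) {
    std::vector<std::string> frames;
    std::string current;
    for (const auto& p : packets) {
        current += p.payload;
        if (p.markerBit) {
            frames.push_back(current);
            current.clear();
        }
    }
    return frames;
}
```

So a frame split as {"he", M=0} then {"llo", M=1} comes back out as the single frame "hello". That's the behavior I expect from setting doNormalMBitRule to True.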

2) Even weirder: as soon as the client starts receiving multicast
packets, SERVER CPU usage jumps to 100%! How is that possible? As I
understand it, the server ALWAYS sends out packets in multicast mode;
the client doesn't really "pull" anything from it, since it gets the
packets from the switch...

Anyway, I've been battling these "supernatural" problems for quite a
while - any feedback would be greatly appreciated.

Regards,

Michael
