[Live-devel] Re: CPU Usage Abnormalities

Michael Kostukov the-bishop at rogers.com
Fri Feb 17 18:49:22 PST 2006


Hello Ross:

Some extra info regarding my previous post. I enabled debug output in
Live555 (the DEBUG macro), and it seems that the server is receiving a
HUGE amount of data from the client, more than it sends! I see messages
all over the place in the server's debug output: "saw incoming RTCP
packet from...". The client is constantly sending RTCP packets to the
server, and it does so often enough that both server and client are
"swamped". In fact, one of my client machines was sending as much as
1 MB/sec of data (!!!) to the server. Do you have any idea why the
client might be doing that?
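
For reference, RFC 3550 budgets RTCP at roughly 5% of the session
bandwidth, and Live555's RTCPInstance::createNew() takes the total
session bandwidth (in kbps) as a parameter, so a wildly overstated
bandwidth figure shrinks the report interval accordingly. Here is a
minimal standalone sketch of the expected interval, assuming the 5%
rule, an average compound RTCP packet of ~120 bytes, and two
participants (all placeholder figures, not measurements from my setup):

    #include <cstdio>

    int main() {
        const double sessionBandwidthKbps = 500.0; // placeholder bandwidth
        const double rtcpFraction  = 0.05;  // RFC 3550: RTCP <= 5% of session BW
        const double avgRtcpBytes  = 120.0; // assumed avg compound RTCP packet
        const int    participants  = 2;     // one sender, one receiver

        double rtcpBytesPerSec = sessionBandwidthKbps * 1000.0 / 8.0 * rtcpFraction;
        double intervalSec = participants * avgRtcpBytes / rtcpBytesPerSec;
        if (intervalSec < 5.0) intervalSec = 5.0; // RFC 3550 minimum interval

        std::printf("expected RTCP interval: ~%.2f s per participant\n",
                    intervalSec);
        return 0;
    }

A healthy session should therefore see a handful of RTCP packets per
participant every few seconds, nowhere near 1 MB/sec.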

Regards,

Michael


    Hello to all fellow developers:

    I've developed server and client applications using the live555
    library. Recently I noticed the following VERY strange problems:

    1) doEventLoop() on the client side consumes 100% CPU, and that is
    on a 3.0 GHz Pentium 4 (!). If I insert a 1 ms Sleep delay into the
    loop (after the SingleStep() call), CPU usage goes down almost to
    0%, but certain types of complex sessions (containing multiple
    audio streams, for example) "starve", i.e. they are not able to
    read packets as fast as needed. Tweaking the delay a little allowed
    me to reduce CPU usage (while maintaining smooth playback), but
    this seems more of a hack than a proper solution.

    I also noticed that the sample app openRTSP does not use any
    significant CPU in its doEventLoop() (< 1% on average), even
    without any delay. Any idea what might cause doEventLoop() to use
    all of the available CPU in my client?
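
    For comparison, here is a minimal sketch of the two loop styles.
    The watch-variable form is what openRTSP uses; the hand-rolled form
    assumes a TaskScheduler subclass that makes SingleStep()
    accessible (BasicTaskScheduler normally keeps it protected), so
    treat those names as assumptions about my code, not library API:

        // Standard usage (openRTSP): doEventLoop() blocks in select()
        // until a socket is readable or a delayed task is due, so it
        // should sit near 0% CPU when the session is idle.
        char eventLoopWatchVariable = 0;
        env->taskScheduler().doEventLoop(&eventLoopWatchVariable);

        // Hand-rolled loop as described above. Without the Sleep(),
        // it busy-spins whenever SingleStep() returns immediately:
        for (;;) {
            scheduler->SingleStep(1000000); // max delay, microseconds
            Sleep(1); // the 1 ms Win32 workaround mentioned above
        }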

    - The client is receiving 7 streams: 1 video, 3 audio, and 3
    subtitle.
    - All streams use custom RTP payload formats I made up (X-MY-VID,
    X-MY-AUD, etc.). I customized the MediaSession.cpp file to properly
    create a SimpleRTPSource for each of these formats, and I set
    doNormalMBitRule to True for each of them (so that large frames are
    split across packets); see the sketch after this list.
    - Video and audio streams use a 90000 Hz timestamp frequency;
    subtitles use 1000 Hz.
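
    Here is a minimal sketch of that MediaSession.cpp customization,
    using the standard SimpleRTPSource::createNew() parameters. The
    "X-MY-SUB" name and the MIME strings are placeholders for my
    actual formats, and the member names (fCodecName, fRTPSocket,
    etc.) are the ones MediaSubsession::initiate() already uses:

        // Inside MediaSubsession::initiate(), alongside the existing
        // codec-name checks:
        if (strcmp(fCodecName, "X-MY-VID") == 0) {
            fReadSource = fRTPSource = SimpleRTPSource::createNew(
                env(), fRTPSocket,
                fRTPPayloadFormat,  // dynamic payload type from the SDP
                90000,              // video timestamp frequency
                "video/X-MY-VID",   // MIME type string
                0,                  // no extra header offset
                True);              // doNormalMBitRule: M bit ends a frame
        } else if (strcmp(fCodecName, "X-MY-SUB") == 0) {
            fReadSource = fRTPSource = SimpleRTPSource::createNew(
                env(), fRTPSocket, fRTPPayloadFormat,
                1000,               // subtitle timestamp frequency
                "text/X-MY-SUB", 0, True);
        }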

    2) Even weirder: as soon as the client starts receiving multicast
    packets, SERVER CPU usage jumps to 100%! How is that possible? As
    I understand it, in multicast mode the server ALWAYS sends out
    packets; the client doesn't really "pull" anything from it, since
    it gets the packets from the switch...
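
    One possibly connected detail: in a multicast session the only
    traffic flowing from client to server is RTCP (receiver reports),
    and the server's RTCPInstance has to parse every one of them, so
    an RTCP flood would burn CPU on both ends. My server-side setup is
    modeled on the live555 test programs, roughly like this (the
    bandwidth figure and CNAME handling are placeholders; env,
    rtcpGroupsock, and videoSink are set up as in those programs):

        // The RTCPInstance both sends our sender reports and processes
        // every incoming receiver report from clients.
        const unsigned estimatedSessionBandwidth = 500; // kbps; placeholder
        unsigned char CNAME[101];
        gethostname((char*)CNAME, 100);
        CNAME[100] = '\0';

        RTCPInstance* rtcp = RTCPInstance::createNew(
            *env, &rtcpGroupsock,
            estimatedSessionBandwidth, // also scales the RTCP interval
            CNAME,
            videoSink, // our RTPSink
            NULL,      // no RTPSource: we're a server
            True);     // source-specific multicast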

    Anyway, I've been battling these "supernatural" problems for quite
    a while; any feedback would be greatly appreciated.

    Regards,

    Michael


