I have a daemon application written in C that currently runs without any known problems on a Solaris 10 machine. I am in the process of porting it to Linux, and I only had to make minimal changes. During testing it passes all test cases, and there are no problems with its functionality. However, when I look at its CPU usage while idle, on the Solaris machine it uses about 0.3% of the CPU. On a virtual machine running Red Hat Enterprise Linux 4.8, the same process uses all available CPU (usually somewhere in the 90%+ range).
My first thought was that something must be wrong with the event loop. The event loop is an infinite loop (while(1)) with a call to select(). The timeval is set so that timeval.tv_sec = 0 and timeval.tv_usec = 1000. This seems reasonable enough for what this process does. As a test, I bumped timeval.tv_sec up to 1. Even after that, I saw the same problem.
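For reference, the shape of the loop is roughly the following. This is a heavily simplified sketch, not the actual daemon code; the descriptor name listen_fd and the single-entry fd set are placeholders I'm using for illustration.

    #include <stdio.h>
    #include <sys/select.h>

    /* Simplified sketch of the event loop described above.
     * "listen_fd" is a placeholder descriptor, not the real daemon's fd set. */
    void event_loop(int listen_fd)
    {
        while (1) {
            fd_set readfds;
            struct timeval timeout;

            FD_ZERO(&readfds);
            FD_SET(listen_fd, &readfds);

            /* Reset each iteration here, since select() on Linux may
             * modify the timeval it is passed. */
            timeout.tv_sec = 0;
            timeout.tv_usec = 1000;   /* 1 ms */

            int ready = select(listen_fd + 1, &readfds, NULL, NULL, &timeout);
            if (ready > 0) {
                /* ... handle the descriptors that became ready ... */
            } else if (ready < 0) {
                perror("select");
            }
            /* ready == 0: timeout expired, fall through and loop again */
        }
    }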
Is there something I am missing about how select() works on Linux versus Solaris? Or does it work differently when the OS is running on a virtual machine? Or maybe there is something else I'm just missing?
One more thing: I'm not sure which version of VMware Server is being used, but it was updated about a month ago.