Bear with me, this can be a little difficult to explain. I'm trying to figure out how to write a program so that it uses only the amount of CPU it actually needs. Since it's hard to describe in the abstract, I'll use a real example.
I made a Tetris game with an endless main game loop. I capped it at 40 frames per second, but the loop itself still spins thousands or even millions of times per second; it only updates and renders once enough time has passed to keep the game at 40 FPS.
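To make that concrete, here is a stripped-down sketch of the kind of loop I mean (in C++, with placeholder updateGame()/render() functions standing in for my actual Tetris code):

```cpp
#include <chrono>

// Placeholders for the real game logic and drawing code.
void updateGame() { /* move/rotate pieces, clear lines, etc. */ }
void render()     { /* draw the board */ }

int main() {
    using clock = std::chrono::steady_clock;
    const auto frameDuration = std::chrono::milliseconds(25); // 1000 ms / 40 = 25 ms per frame
    auto nextFrame = clock::now();

    while (true) {
        // The loop spins as fast as the CPU allows...
        if (clock::now() >= nextFrame) {
            // ...but the game only updates and draws when a frame is due.
            updateGame();
            render();
            nextFrame += frameDuration;
        }
        // No sleep here, so the loop busy-waits and keeps one core fully busy.
    }
}
```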
I have a 4-core processor, and when I start the game everything works fine, but CPU usage sits at 25% for the whole time I'm playing (one core fully busy). That's expected, since the loop never stops spinning.
Then I read online that I should add a 1 ms sleep to the main loop. That immediately dropped usage to 1% or less. This is good, but now I'm deliberately waiting 1 ms on every iteration. It only works because my loop body takes far less than 1 ms to complete, so the delay doesn't affect the game.
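With the sleep added, the loop looks roughly like this (same simplified sketch and placeholders as above):

```cpp
#include <chrono>
#include <thread>

void updateGame() { /* move/rotate pieces, clear lines, etc. */ }
void render()     { /* draw the board */ }

int main() {
    using clock = std::chrono::steady_clock;
    const auto frameDuration = std::chrono::milliseconds(25); // 40 FPS

    auto nextFrame = clock::now();
    while (true) {
        if (clock::now() >= nextFrame) {
            updateGame();
            render();
            nextFrame += frameDuration;
        }
        // Yield the CPU for 1 ms on every iteration; this is what drops
        // usage from ~25% to ~1% on my machine.
        std::this_thread::sleep_for(std::chrono::milliseconds(1));
    }
}
```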
But what if I make a bigger game, one with longer and more CPU-intensive iterations? What if I actually need that 1 ms slice for the game to run correctly? Then if I remove the delay, CPU usage jumps back to 25%, and if I keep it, the game slows down and may lag.
What is the ideal solution in this case? How are real games/applications written to avoid this problem?