I have an application (game) that runs on the JVM.
The game's update logic (which runs 60 times per second) normally finishes within about 25% of its time slice (1/60 s) and then idles for the remaining 75%. But when the garbage collector is about to run, the update time jumps to 75-200% of the budget and stays there for the rest of the session.
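Roughly, that percentage is computed like this (a simplified sketch, not my actual game code; `updateGame` is just a placeholder for the per-tick logic):

```java
// Minimal sketch of how the "% of the frame budget" number is obtained each tick.
public class FrameBudgetProbe {
    static final double FRAME_BUDGET_NANOS = 1_000_000_000.0 / 60.0; // 1/60 s

    static void updateGame() {
        // placeholder for the real per-tick game logic
    }

    public static void main(String[] args) {
        long start = System.nanoTime();
        updateGame();
        long elapsed = System.nanoTime() - start;
        double percentOfBudget = 100.0 * elapsed / FRAME_BUDGET_NANOS;
        System.out.printf("update used %.1f%% of the 1/60 s budget%n", percentOfBudget);
    }
}
```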
The game uses about 70 MB of heap and grows by about 1-2 MB/s. When the GC runs, usage drops back to 70 MB, so there is no true memory leak. I will try to reduce the allocation rate later, but I don't think it should be the problem here.
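The heap numbers come from watching the runtime's memory stats once per second, roughly like this (illustrative sketch; a profiler shows the same thing):

```java
// Prints used heap and the per-second delta, which is where the
// "~70 MB, growing 1-2 MB/s" figures come from.
public class HeapProbe {
    public static void main(String[] args) throws InterruptedException {
        Runtime rt = Runtime.getRuntime();
        long previous = rt.totalMemory() - rt.freeMemory();
        while (true) {
            Thread.sleep(1000);
            long used = rt.totalMemory() - rt.freeMemory();
            System.out.printf("heap used: %d MB (delta %+d KB/s)%n",
                    used / (1024 * 1024), (used - previous) / 1024);
            previous = used;
        }
    }
}
```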
I run JVM 8 without any runtime arguments or flags, so I don't know which GC that gives me.
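I suppose I could check which collectors the JVM actually picked with something like this (or with `-verbose:gc` / `-XX:+PrintGCDetails` on the command line); on Java 8 the default is usually the Parallel collector ("PS Scavenge" / "PS MarkSweep"):

```java
import java.lang.management.GarbageCollectorMXBean;
import java.lang.management.ManagementFactory;

// Lists the garbage collectors the running JVM selected, with collection
// counts and accumulated pause times.
public class GcInfo {
    public static void main(String[] args) {
        for (GarbageCollectorMXBean gc : ManagementFactory.getGarbageCollectorMXBeans()) {
            System.out.printf("%s: %d collections, %d ms total%n",
                    gc.getName(), gc.getCollectionCount(), gc.getCollectionTime());
        }
    }
}
```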
I have tried setting the heap to different sizes, but it has no effect on this behavior.
I have two theories about why this might happen:
The GC inadvertently rearranges my heap in a way that causes cache misses in the update loop. My update logic benefits greatly from data locality as it iterates over the data and updates it. Maybe the GC promotes some of that data to the old generation while leaving the rest in the young generation (nursery)?
The sudden GC activity signals to my OS that my main update thread doesn't need as much CPU as it has been getting, so the OS lowers its priority. (This persists even if I skip Thread.sleep() so that no CPU time goes unused; a sketch of both loop variants is below.)
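For clarity, here is a simplified sketch of the two loop variants I mean in theory 2 (names and structure are illustrative, not my exact loop): one sleeps off the unused part of the frame, the other busy-spins so the thread never looks idle to the scheduler.

```java
// Two ways of spending the unused part of each 1/60 s frame.
public class LoopVariants {
    static final long FRAME_NANOS = 1_000_000_000L / 60;

    static void updateGame() { /* placeholder for per-tick game logic */ }

    // Normal variant: sleep away whatever is left of the frame budget.
    static void runWithSleep() throws InterruptedException {
        while (true) {
            long start = System.nanoTime();
            updateGame();
            long remaining = FRAME_NANOS - (System.nanoTime() - start);
            if (remaining > 0) {
                Thread.sleep(remaining / 1_000_000L, (int) (remaining % 1_000_000L));
            }
        }
    }

    // Test variant: burn the rest of the frame so the thread stays "hot".
    static void runBusySpin() {
        while (true) {
            long start = System.nanoTime();
            updateGame();
            while (System.nanoTime() - start < FRAME_NANOS) { /* spin */ }
        }
    }

    public static void main(String[] args) throws InterruptedException {
        runWithSleep(); // or runBusySpin() to test the scheduler theory
    }
}
```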
What do you think? Are my theories plausible, and can anything be done about them, or do I need to switch to a language like C? My knowledge of GC is limited.
P.S. The ~75% figure is what I see after the GC kicks in; with VSync it reaches 200%.