Why do so many applications allocate an incredibly large amount of virtual memory even though they never use it?

For some time I have been observing a strange phenomenon, made possible by the fact that overcommit is enabled by default on Linux systems.

It seems to me that almost all high-level applications (for example, programs written in a high-level language such as Java, Python, or C#, including some desktop applications written in C++ that use large libraries such as Qt) use an insane amount of virtual memory. For example, a web browser typically reserves 20 GB of virtual memory while actually using only 300 MB. The same goes for desktop environments, the MySQL server, almost every Java or Mono application, and so on: they all allocate tens of gigabytes.

Why does this happen? What is the point? Is there any benefit to it?

I noticed that when I disable overcommit on Linux, on a desktop system that actually runs many of these applications, the system becomes unusable: it does not even boot properly.
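To illustrate what I mean, here is a minimal C sketch (assuming Linux with overcommit enabled, which is the default; under strict accounting with vm.overcommit_memory = 2 the malloc below would likely just fail). It asks for 20 GiB, which succeeds instantly, yet the resident set stays tiny until pages are actually touched:

```c
/* Minimal sketch: reserve a huge virtual region without touching it,
 * then compare virtual size (VmSize) against resident size (VmRSS).
 * Assumes Linux with overcommit enabled (the default). */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

static void print_mem_status(void) {
    FILE *f = fopen("/proc/self/status", "r");
    char line[256];
    while (f && fgets(line, sizeof line, f))
        if (!strncmp(line, "VmSize:", 7) || !strncmp(line, "VmRSS:", 6))
            fputs(line, stdout);
    if (f) fclose(f);
}

int main(void) {
    size_t size = 20UL << 30;      /* ask for 20 GiB */
    char *p = malloc(size);        /* succeeds instantly under overcommit */
    if (!p) { perror("malloc"); return 1; }

    print_mem_status();            /* VmSize ~20 GiB, VmRSS a few MiB */

    for (size_t i = 0; i < size; i += 1UL << 30)
        p[i] = 1;                  /* touch one byte per GiB */

    print_mem_status();            /* VmRSS grows only for touched pages */
    free(p);
    return 0;
}
```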

+5
2 answers

Languages whose code runs inside a virtual machine (for example, Java(*), C#, or Python) usually reserve large amounts of (virtual) memory right at startup. Part of this is needed by the virtual machine itself, and part is pre-allocated so it can be handed out to the application running inside the VM.
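To make that concrete, here is a minimal C sketch of the reserve-then-commit pattern such runtimes typically use (assuming Linux/POSIX; the sizes are invented for illustration, not taken from any particular VM). A large address range is mapped with no access rights, and pages are made usable only as the heap actually grows:

```c
/* A minimal sketch of "reserve address space now, commit pages later":
 * grab a large virtual range with no access rights, then enable pages
 * only as they are needed. Assumes Linux/POSIX; sizes are illustrative. */
#include <stdio.h>
#include <sys/mman.h>

int main(void) {
    size_t reserve = 4UL << 30;   /* reserve 4 GiB of address space */
    size_t commit  = 16UL << 20;  /* but make only 16 MiB usable */

    /* PROT_NONE: the range counts toward virtual size, but no page in it
     * is readable or writable yet, so nothing real is allocated. */
    char *base = mmap(NULL, reserve, PROT_NONE,
                      MAP_PRIVATE | MAP_ANONYMOUS | MAP_NORESERVE, -1, 0);
    if (base == MAP_FAILED) { perror("mmap"); return 1; }

    /* "Commit" the first part by making it accessible; the kernel still
     * supplies physical pages lazily, as they are first touched. */
    if (mprotect(base, commit, PROT_READ | PROT_WRITE) != 0) {
        perror("mprotect");
        return 1;
    }
    base[0] = 42;  /* only now does a physical page get faulted in */

    printf("reserved %zu MiB, usable %zu MiB\n",
           reserve >> 20, commit >> 20);
    munmap(base, reserve);
    return 0;
}
```

Only the touched pages consume physical memory; the rest of the reservation merely inflates the virtual size that tools like top report, which is exactly the effect described in the question.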

With languages ​​running under the direct control of the OS (for example, C or C ++), this is optional. You can write applications that dynamically use only the amount of memory they need. However, some applications / frameworks are still designed in such a way that they request more memory from the operating system once, and then manage the memory itself in the hope that it will be more efficient than the OS.

There are problems with this:

  • It is not necessarily faster. Most operating systems are already quite smart about how they manage memory. Rule number one of optimization: measure, optimize, measure again.

  • Not all operating systems have virtual memory. There are some quite capable ones that cannot run applications this "sloppy", which assume you can cheaply allocate lots of "not real" memory.

  • You have already discovered yourself that if you switch your OS from "generous" to "strict", these memory hogs fall flat on their faces. ;-)


(*) Java, for example, cannot grow its virtual machine after it starts. You have to specify the maximum heap size up front as a parameter (-Xmx<n>, e.g. java -Xmx2g MyApp). "Better safe than sorry" thinking leads certain people/applications to seriously overprovision it.

+3

These applications usually have their own memory management, optimized for their own usage patterns and more efficient than the general-purpose memory management the system provides. So they allocate a huge block of memory up front in order to skip or minimize the effect of the memory management provided by the system or libc.
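As a minimal sketch of that idea (the arena type and names here are illustrative, not taken from any particular application): one big block is requested once, and every subsequent allocation is just a pointer bump inside it:

```c
/* Sketch of a trivial bump/arena allocator: one big request to the
 * system, then allocations are handled internally without malloc. */
#include <stdio.h>
#include <stdlib.h>

typedef struct {
    char  *base;   /* start of the big block */
    size_t size;   /* total capacity */
    size_t used;   /* bump-pointer offset */
} Arena;

static int arena_init(Arena *a, size_t size) {
    a->base = malloc(size);   /* one big request to the system/libc */
    a->size = size;
    a->used = 0;
    return a->base != NULL;
}

/* Allocation is a single addition: no locks, no free lists,
 * no per-allocation bookkeeping. */
static void *arena_alloc(Arena *a, size_t n) {
    n = (n + 15) & ~(size_t)15;            /* keep 16-byte alignment */
    if (a->used + n > a->size) return NULL;
    void *p = a->base + a->used;
    a->used += n;
    return p;
}

/* Everything is released at once; individual frees do not exist. */
static void arena_release(Arena *a) {
    free(a->base);
    a->base = NULL;
}

int main(void) {
    Arena a;
    if (!arena_init(&a, 1 << 20)) return 1;  /* 1 MiB arena */
    int  *xs = arena_alloc(&a, 100 * sizeof *xs);
    char *s  = arena_alloc(&a, 32);
    printf("xs=%p s=%p used=%zu\n", (void *)xs, (void *)s, a.used);
    arena_release(&a);
    return 0;
}
```

The trade-off is the one discussed above: the big upfront request inflates the process's virtual memory footprint whether or not the arena ever fills up, but for workloads with many small, short-lived allocations a scheme like this can beat general-purpose malloc.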

+1

Source: https://habr.com/ru/post/1234777/

