Using Java memory with native processes

What is the best way to configure a server application written in Java that uses a native C++ library?

The environment is a 32-bit Windows machine with 4 GB of RAM, running Sun JDK 1.5.0_12.

At startup the Java process is given 1024 MB of heap (-Xmx), but I often see OutOfMemoryErrors due to lack of heap space. If the heap is increased to 1200 MB, OutOfMemoryErrors occur due to lack of swap space instead. How is memory allocated between the JVM heap and native code in the same process?
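For context, the ceiling that -Xmx sets can be inspected from inside the process; a minimal sketch using only the standard java.lang.management API (class name is illustrative):

```java
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;

// Minimal sketch: what -Xmx actually bounds. Runtime.maxMemory() reports the
// Java heap ceiling; native allocations made by the JVM itself or by a C++
// library loaded via JNI are not counted here and come on top of it.
public class HeapCeiling {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        System.out.println("Heap ceiling (-Xmx):   " + rt.maxMemory() / (1024 * 1024) + " MB");
        System.out.println("Heap currently mapped: " + rt.totalMemory() / (1024 * 1024) + " MB");

        MemoryMXBean mem = ManagementFactory.getMemoryMXBean();
        System.out.println("Non-heap (JVM internal): " + mem.getNonHeapMemoryUsage());
    }
}
```

Running it with the production -Xmx value (for example `java -Xmx1024m HeapCeiling`) shows how much of the address space is already committed to the heap alone.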

Does the Windows /3GB switch have any effect with native processes and the Sun JVM?

+4
4 answers

I have had a lot of problems with this kind of setup (Java on 32-bit systems, Windows among them), and all of them were solved by reserving only *under* 1 GB of RAM for the JVM.

Otherwise, as others have noted, the actual memory footprint of the process will exceed 2 GB; at that point I saw "silent deaths" of the process: no errors, no warnings, the process just ends very quietly.

I got better stability and performance by running multiple JVMs (each with under 1 GB of RAM) on the same system.
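A minimal sketch of what running several capped JVMs side by side might look like; `com.example.Worker` and the heap size are placeholders for your own entry point and limits:

```java
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

// Sketch only: splitting the load across several JVMs, each capped well under
// 1 GB, instead of one large process. "com.example.Worker" is a hypothetical
// entry point standing in for your server code.
public class MultiJvmLauncher {
    public static void main(String[] args) throws IOException {
        List<Process> workers = new ArrayList<Process>();
        for (int i = 0; i < 3; i++) {
            ProcessBuilder pb = new ProcessBuilder(
                    "java", "-Xmx768m", "-cp", System.getProperty("java.class.path"),
                    "com.example.Worker", String.valueOf(i));
            pb.redirectErrorStream(true);
            workers.add(pb.start());
        }
        System.out.println("Started " + workers.size() + " worker JVMs");
    }
}
```

How the work is actually partitioned between the workers is, of course, application-specific.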

+2

I found the JNI memory management information here, and here is the JNI/JVM section on memory management.

Having 3 GB of user address space instead of 2 GB should help, but if you are already running into swap space problems at 2 GB, I think 3 GB will just make things worse. How big is your page file? Is it already exhausted?

You can get a better picture of heap allocation by connecting jconsole to your JVM.
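If attaching jconsole is awkward, the same per-pool breakdown it displays can also be dumped from inside the process; a small sketch using the standard java.lang.management API:

```java
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryPoolMXBean;
import java.util.List;

// Sketch: prints the same per-pool view jconsole shows (eden, survivor,
// tenured, perm gen), which helps when a remote jconsole connection to a
// production box is not possible.
public class PoolDump {
    public static void main(String[] args) {
        List<MemoryPoolMXBean> pools = ManagementFactory.getMemoryPoolMXBeans();
        for (MemoryPoolMXBean pool : pools) {
            System.out.println(pool.getName() + " (" + pool.getType() + "): "
                    + pool.getUsage());
        }
    }
}
```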

+1

How is memory allocated between the JVM heap and native code in the same process?

The Sun JVM uses a mark-and-sweep garbage collector, with options to enable parallel and incremental collection.

More precisely, it is generational, and the above applies only to long-lived objects. Young objects are still collected with a stop-and-copy collector, which is much better at dealing with short-lived objects (and all typical Java programs create many short-lived objects).

The copying collector walks the heap, copying every referenced object into a new heap, and then discards the old one. So 1M of live objects needs up to 2M of real memory: if every object is live, there are two copies of everything during the collection.

As a result, the JVM needs considerably more system memory than is visible to the code running on the VM, because of the overhead of heap management and garbage collection.
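A back-of-the-envelope sketch of where the address space of such a process goes; every figure here is an assumption for illustration, not a measurement of your setup:

```java
// Rough estimate of the whole process footprint on a 32-bit JVM.
// All numbers below are assumptions chosen for illustration.
public class FootprintEstimate {
    public static void main(String[] args) {
        int heapMb      = 1024; // -Xmx
        int permGenMb   = 64;   // perm gen for classes (-XX:MaxPermSize)
        int threadCount = 100;  // server threads
        int stackKb     = 256;  // -Xss per thread, often a few hundred KB on 32-bit
        int jvmCodeMb   = 50;   // JIT code cache, GC structures, JVM DLLs
        int nativeLibMb = 200;  // whatever the C++ library mallocs

        int totalMb = heapMb + permGenMb + (threadCount * stackKb) / 1024
                + jvmCodeMb + nativeLibMb;
        // All of this has to fit into the roughly 2 GB user address space of a
        // 32-bit Windows process (3 GB with /3GB, if java.exe supported it).
        System.out.println("Estimated process footprint: ~" + totalMb + " MB");
    }
}
```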

Does the Windows /3GB switch have any effect with native processes and the Sun JVM?

/3GB raises the user virtual address space to 3 GB, but only for executables whose headers are marked IMAGE_FILE_LARGE_ADDRESS_AWARE. As far as I know, Sun's java.exe is not. I don't have a Windows system at hand, so I cannot check.
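If you want to check for yourself, the flag lives in the COFF header of the executable and can be read directly; a sketch (the JDK path is only an example, adjust it to your install):

```java
import java.io.IOException;
import java.io.RandomAccessFile;

// Sketch: checks whether an executable (e.g. java.exe) has the
// IMAGE_FILE_LARGE_ADDRESS_AWARE bit set, i.e. whether /3GB would give it a
// 3 GB user address space. Reads the PE/COFF header fields directly.
public class LargeAddressAwareCheck {
    private static final int IMAGE_FILE_LARGE_ADDRESS_AWARE = 0x0020;

    public static void main(String[] args) throws IOException {
        String path = args.length > 0 ? args[0] : "C:\\jdk1.5.0_12\\bin\\java.exe";
        RandomAccessFile f = new RandomAccessFile(path, "r");
        try {
            f.seek(0x3C);                  // e_lfanew: offset of the "PE\0\0" signature
            int peOffset = readLeInt(f);
            f.seek(peOffset + 4 + 18);     // Characteristics field of the COFF header
            int characteristics = readLeShort(f);
            boolean aware = (characteristics & IMAGE_FILE_LARGE_ADDRESS_AWARE) != 0;
            System.out.println(path + " large-address-aware: " + aware);
        } finally {
            f.close();
        }
    }

    private static int readLeInt(RandomAccessFile f) throws IOException {
        return f.read() | (f.read() << 8) | (f.read() << 16) | (f.read() << 24);
    }

    private static int readLeShort(RandomAccessFile f) throws IOException {
        return f.read() | (f.read() << 8);
    }
}
```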

0

Unfortunately, you haven't explained your problem in enough detail. The real question is: why is the Java process growing so much? Do you have a memory leak? Is there a real reason to keep that much data in the JVM?

Is the C++ library allocating native memory from the C heap, is it allocating from the Java object heap, or is it doing something else entirely?
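To illustrate the distinction: memory allocated natively (by a C++ library through JNI, or by a direct ByteBuffer) lives outside the -Xmx heap, while ordinary Java objects live inside it. A small sketch (run with something like -Xmx128m so the heap array fits):

```java
import java.nio.ByteBuffer;

// Sketch: a byte[] is counted against the -Xmx heap, while a direct ByteBuffer
// (like malloc calls made by native code) is allocated outside the heap, so it
// does not show up in heap usage but still consumes the 2 GB address space.
public class HeapVsNative {
    public static void main(String[] args) {
        byte[] onHeap = new byte[32 * 1024 * 1024];                        // inside -Xmx
        ByteBuffer offHeap = ByteBuffer.allocateDirect(32 * 1024 * 1024); // native memory

        Runtime rt = Runtime.getRuntime();
        System.out.println("Heap used: "
                + (rt.totalMemory() - rt.freeMemory()) / (1024 * 1024) + " MB"
                + " (includes the " + onHeap.length / (1024 * 1024) + " MB array)");
        System.out.println("Direct buffer capacity (not in heap): "
                + offHeap.capacity() / (1024 * 1024) + " MB");
    }
}
```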

0

Source: https://habr.com/ru/post/1286445/

