Java: Is the memory limit hidden somewhere?

I have a fairly computationally intensive test program in Java that I am running on a Linux server. It works fine on my MacBook.

If I run it on the server, the following happens: as soon as memory usage reaches 324 MB, the program stops. Apparently some restriction prevents it from using more memory, and the garbage collector has to work harder and harder to stay below this limit. At some point the JVM aborts with a "GC overhead limit exceeded" error.

I am running java with -Xmx16000m, so the heap size itself cannot be the limit. What other limits could there be?
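One quick sanity check (a minimal sketch of my own, not part of the original question): ask the running JVM what heap limit it actually ended up with. If something other than your own java command launches the process, the -Xmx16000m you pass may never reach it.

// HeapLimitCheck.java -- hypothetical helper, not part of the original program.
// Prints the heap limit the running JVM actually uses, so you can see whether
// the -Xmx16000m you passed really took effect.
public class HeapLimitCheck {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long mb = 1024L * 1024L;
        System.out.println("max heap   (MB): " + rt.maxMemory() / mb);   // effective -Xmx
        System.out.println("total heap (MB): " + rt.totalMemory() / mb); // currently committed
        System.out.println("free heap  (MB): " + rt.freeMemory() / mb);  // free within committed
    }
}

If this prints something close to 324 MB instead of roughly 16000 MB, the -Xmx setting is being overridden somewhere upstream.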

These are the versions of Java that I use:

Macbook:

java version "1.5.0_16"

Java (TM) 2 runtime, standard version (build 1.5.0_16-b06-284)

HotSpot (TM) Java client virtual machine (build 1.5.0_16-133, mixed mode, sharing)

Linux server:

java version "1.6.0_12"

Java (TM) SE Runtime Environment (build 1.6.0_12-b04)

HotSpot (TM) 64-bit Java Server Virtual Machine (build 11.2-b01, mixed mode)

+3
4 answers

The parallel collector will throw an OutOfMemoryError if too much time is being spent on garbage collection: if more than 98% of the total time is spent on garbage collection and less than 2% of the heap is recovered, an OutOfMemoryError will be thrown.

This behavior is meant to prevent an application from running for an extended period while making little or no progress because the heap is too small. If necessary, it can be disabled by adding the option -XX:-UseGCOverheadLimit.
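To see how close a run gets to that 98% threshold, the GC MXBeans report cumulative collection time; here is a rough sketch (class and variable names are mine, not from the answer):

import java.lang.management.GarbageCollectorMXBean;
import java.lang.management.ManagementFactory;

// GcOverheadProbe.java -- illustrative only. Compares cumulative GC time
// against JVM uptime to show roughly how much of the run is spent collecting.
public class GcOverheadProbe {
    public static void main(String[] args) {
        long gcMillis = 0;
        for (GarbageCollectorMXBean gc : ManagementFactory.getGarbageCollectorMXBeans()) {
            gcMillis += Math.max(0, gc.getCollectionTime()); // -1 means "not available"
        }
        long uptimeMillis = ManagementFactory.getRuntimeMXBean().getUptime();
        System.out.printf("GC time: %d ms of %d ms uptime (%.1f%%)%n",
                gcMillis, uptimeMillis, 100.0 * gcMillis / uptimeMillis);
    }
}

Calling something like this periodically from the test program (or from a small monitoring thread) shows whether GC time is climbing toward the limit as the heap approaches 324 MB.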

+3

Attach JConsole or JVisualVM to the JVM to see what is going on inside it. JVisualVM also includes a profiler, so you can see which objects are taking up the memory.
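If attaching a GUI tool to a remote server is inconvenient, a programmatic snapshot of the heap gives similar numbers (again a sketch of my own, not from the answer):

import java.lang.management.ManagementFactory;
import java.lang.management.MemoryUsage;

// HeapSnapshot.java -- illustrative helper: prints current heap usage,
// roughly what the JConsole/JVisualVM memory graph would show.
public class HeapSnapshot {
    public static void main(String[] args) {
        MemoryUsage heap = ManagementFactory.getMemoryMXBean().getHeapMemoryUsage();
        long mb = 1024L * 1024L;
        System.out.println("used      (MB): " + heap.getUsed() / mb);
        System.out.println("committed (MB): " + heap.getCommitted() / mb);
        System.out.println("max       (MB): " + heap.getMax() / mb);
    }
}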

+2

Thanks for the tip. My problem seems to be more specific: there is nothing wrong with the Java program itself. I am calling Java from Mathematica via JLink, and I now realize that the 324 MB limit must come from there (somewhere).

+1

It could be the amount of physical memory available on the machine. If you run out of free physical memory, your program will grind to a halt no matter how much virtual memory you have.
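On a Sun/Oracle JDK the JVM can report physical memory directly through the non-standard com.sun.management API (a sketch; this interface is not part of standard java.lang.management, so treat it as JDK-specific):

import java.lang.management.ManagementFactory;

// PhysicalMemoryCheck.java -- illustrative; relies on the non-standard
// com.sun.management.OperatingSystemMXBean available in Sun/Oracle JDKs.
public class PhysicalMemoryCheck {
    public static void main(String[] args) {
        com.sun.management.OperatingSystemMXBean os =
                (com.sun.management.OperatingSystemMXBean) ManagementFactory.getOperatingSystemMXBean();
        long mb = 1024L * 1024L;
        System.out.println("total physical memory (MB): " + os.getTotalPhysicalMemorySize() / mb);
        System.out.println("free physical memory  (MB): " + os.getFreePhysicalMemorySize() / mb);
    }
}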

-1
