HBase java.lang.OutOfMemoryError

I have a problem with HBase.

I have a script that starts the HBase shell and inserts many rows into a table with a single column. I tried to insert 10,000 rows, but after about 1,700 I get the dreaded "java.lang.OutOfMemoryError: unable to create new native thread" error. I tried increasing the Java heap size from the default 1000 MB to 1800 MB, but this does not let me insert any more rows.

However, I noticed that I can insert 1,000 rows, exit the shell, restart the shell, insert another 1,000 into the same table, exit again, and so on. I am not versed enough in the JVM to understand why it lets me do this across multiple sessions but not within a single session.

Can someone explain what is happening here and what I can do about it?

EDIT:

I am on a 64-bit machine running Red Hat Enterprise Linux 5 with Java 1.6. I give HBase a heap of 20 GB (the machine has ~32 GB of RAM). For the thread stack size I use 8 MB; the default on 64-bit is 2 MB. With 2 MB I got the same error, and increasing it to 8 MB did not help at all: I could insert the same number of rows (~1,700) regardless of the stack size.

I read that reducing the heap size can resolve this error, but that did not help either. Below are the JVM options I set (all defaults except the stack size).

HBASE_OPTS="$HBASE_OPTS -ea -Xss8M -XX:+HeapDumpOnOutOfMemoryError -XX:+UseConcMarkSweepGC -XX:+CMSIncrementalMode"
3 answers

Most likely you are creating a new HTable (and a new Configuration) for every put. HBase caches connections in a map keyed by the Configuration object, so if you create a fresh Configuration each time, every HTable opens its own connection and spawns its own threads, and those threads are not released within the session. Create one Configuration and one HTable, and reuse them for all of your puts.
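A minimal sketch of this pattern, assuming the pre-1.0 HBase client API (HTable, Put); the table, column family, and qualifier names are hypothetical:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.util.Bytes;

public class BulkInsert {
    public static void main(String[] args) throws Exception {
        // Create ONE Configuration and ONE HTable, and reuse them for every put.
        Configuration conf = HBaseConfiguration.create();
        HTable table = new HTable(conf, "mytable"); // hypothetical table name
        try {
            for (int i = 0; i < 10000; i++) {
                Put put = new Put(Bytes.toBytes("row-" + i));
                // hypothetical column family "cf" and qualifier "col"
                put.add(Bytes.toBytes("cf"), Bytes.toBytes("col"),
                        Bytes.toBytes("value-" + i));
                table.put(put);
            }
        } finally {
            table.close(); // releases the connection and its threads
        }
    }
}
```

Because the same Configuration and HTable are reused, only one connection's worth of threads is created, no matter how many rows are inserted.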


Alternatively, use an HTablePool to obtain HTableInterface instances, and call close() on them when you are done.


As mentioned above, obtain an HTableInterface from an HTablePool and close it when you are finished, which returns the table to the pool:

HTableInterface table = tablePool.getTable(tableName);
try {
    // Do the work
    ...
} finally {
    table.close(); // returns the table to the pool
}

Source: https://habr.com/ru/post/1787532/
