Bulk loading data into an H2 database

I have a task to load some data from MySQL into H2. We need to display a tree in the user interface, and the tree has more than 50,000 nodes, so the idea is that reading from H2 could reduce latency.

To do this, I have to load about 1M records from MySQL into H2 (each client has its own tree). The loading is done by an application component that reads from MySQL and writes the records to H2 in batches. It gets through about 40,000 records, but after that the migration keeps running without finishing in a reasonable time.

I tried using "SET LOG 0", "SET LOCK_MODE 0" and "SET UNDO_LOG 0" to make the load more efficient, but the server still runs out of memory.
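
For reference, a minimal sketch of how these settings are typically applied before the bulk insert starts (all names below are placeholders, and SET LOG / SET UNDO_LOG only exist in H2 1.x):

    -- Run on the loading connection before the inserts begin (H2 1.x).
    SET LOG 0;        -- disable the transaction log
    SET LOCK_MODE 0;  -- no locking; safe only while a single connection loads
    SET UNDO_LOG 0;   -- disable the per-session undo log

    -- The same settings can also be appended to the JDBC URL, e.g.:
    -- jdbc:h2:~/treedb;LOG=0;LOCK_MODE=0;UNDO_LOG=0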

The heap size is set to 512 MB.

The H2 documentation says that for faster data loading you should use "CREATE TABLE ... AS SELECT ...", but I don't think this will reduce the load time, since the application would still have to read the 1M records and create a CSV file first.
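
(For context, the approach the H2 documentation describes combines CREATE TABLE ... AS SELECT with CSVREAD, roughly as below; the table, column and file names are only placeholders.)

    -- CSV-based fast load as described in the H2 tutorial; names are examples.
    CREATE TABLE TREE_NODES(ID BIGINT PRIMARY KEY, PARENT_ID BIGINT, NAME VARCHAR(255))
        AS SELECT * FROM CSVREAD('tree_nodes.csv');

Which is exactly why it does not help here: the 1M rows would still have to be exported to a CSV file first.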

Can anyone suggest a way?

1 answer

To avoid creating CSV files, you could create linked tables using CREATE LINKED TABLE or CALL LINK_SCHEMA, and then copy the data using CREATE TABLE ... AS SELECT.
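
A sketch of what that could look like, assuming a source table named TREE_NODE in MySQL and placeholder connection details:

    -- Link the source table that lives in MySQL (driver class, URL,
    -- credentials and table name are placeholders).
    CREATE LINKED TABLE TREE_NODE_SRC(
        'com.mysql.jdbc.Driver',
        'jdbc:mysql://localhost:3306/sourcedb',
        'user', 'password', 'TREE_NODE');

    -- Copy the rows across in a single statement, with no intermediate CSV file.
    CREATE TABLE TREE_NODE AS SELECT * FROM TREE_NODE_SRC;

    -- Or link a whole schema at once and copy from its tables:
    -- CALL LINK_SCHEMA('SRC', '', 'jdbc:mysql://localhost:3306/sourcedb',
    --                  'user', 'password', 'sourcedb');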

Regarding the memory issue: are you creating an in-memory database? If so, you can save memory (at some cost in speed) by using an in-memory file system or a compressed in-memory file system: jdbc:h2:memFS:test or jdbc:h2:memLZF:test instead of jdbc:h2:mem:test. If you are not using an in-memory database, I am not sure what the problem is.


Source: https://habr.com/ru/post/1446594/

