How can I keep a 2-3 GB tree in memory and make it available to Node.js?

I have a large data tree that I need efficient access to the leaves of, and I also need to efficiently serialize large chunks of it (10-20 MB at a time) as JSON.

Right now I store it as plain JavaScript objects, but I'm seeing garbage-collection pauses of 4-5 seconds, which is clearly not normal.

I tried using an embedded database (both SQLite and LMDB), but the overhead of converting rows back into a tree when I access the data is pretty high - it takes me 6 seconds to serialize 5 MB of JSON.

Ideally, I would like to tell V8 "please don't try to garbage-collect this tree!" (I tried disabling GC for the whole process, but a lightweight TCP server runs in front of it, and the process quickly started running out of memory.)

Or maybe there is a built-in (or third-party?) database that handles this out of the box that I don't know about. (I know about MongoDB - it has a maximum document size limit of 16 MB.)

I am thinking about packing the tree into a Node Buffer object (i.e. basically managing the heap layout myself instead of V8), but before I get that desperate, I thought I'd ask Stack Overflow :-)

2 answers

Storing huge object graphs in a garbage-collected language is bad practice; it is a well-known problem in the Java world too. There are two common solutions:

  • Use an in-memory database such as Redis. See if you can map your tree onto the data-structure primitives Redis provides.
  • Go native - Node.js supports FFI and native addons, so you can allocate and manage the tree's memory yourself, outside the V8 heap.

If going native is too much work, consider moving the tree into a separate process and talking to it from Node through a message broker such as Beanstalkd/ZeroMQ/RabbitMQ.

Since a TCP server already sits in front of the data, a dedicated tree-serving process behind it would fit that architecture.

As for MongoDB, the 16 MB limit applies per document, so you would have to split the tree across many documents anyway.


Have you considered a graph database? Neo4j is built for this kind of linked data, and it has node.js drivers.


Source: https://habr.com/ru/post/1614872/
