Reading a 200 MB JSON file takes 1.5 GB of memory

I am using the json_spirit library in C++ to parse a 200 MB JSON file. What surprises me is that after reading it into memory, my program uses 1.5 GB of RAM. Is this expected when JSON is deserialized?

This is how I load the JSON file:

std::ifstream istream(path.c_str());
json_spirit::mValue val;
json_spirit::read(istream, val);
+4
2 answers

You can try RapidJSON.

It is optimized for memory usage and performance.

Using the in situ parsing option (i.e., parsing that modifies the original string in place), it takes only 16 bytes per JSON value to store the DOM on a 32-bit architecture. String values become pointers into the modified source string.

I expect memory usage to be much less.

RapidJSON, on the other hand, also supports SAX-style parsing. If the application only needs to walk through the JSON file from start to finish (for example, to compute some statistics), the SAX API is even faster and memory consumption stays very small (the parser's stack plus the maximum length of a string value).

+1

I don't think this depends on JSON as such; it is more a matter of data-structure overhead. When you have many small objects, the per-object bookkeeping becomes more and more significant.

That said, a factor of more than seven does seem excessive.

0

Source: https://habr.com/ru/post/1442551/

