Using an STL map for 16 GB of data on a 64-bit machine

This may sound odd (but it is a real requirement): I need to load a very large database into memory (roughly 12 to 16 GB). The file will be loaded into memory once a day and then used for the rest of that day. Is it OK to use an STL map for this use case? Does std::map actually work well at this data size on a 64-bit machine (if anyone has experience with this kind of problem)? Also, the number of requests to this map will be about 1000 per second. Let me know if anyone has worked on similar issues, or whether I should go for a different data structure (or any third-party tool that can do this reliably)?

My main concern is keeping I/O time close to real time. I also have MySQL as my persistent database, where I need to save this data. Would it be reasonable to use SQLite as an in-memory DB and then persist the data to MySQL (on disk)? I believe MySQL also offers "MySQL Cluster" for something similar, but I don't know how useful it is in practice.
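For concreteness, a minimal sketch of the "load once per day, query all day" pattern described above, using the std::map the question proposes. The Record layout, key type, and the flat binary file snapshot.bin of (key, record) pairs are assumptions for illustration, not part of the original question:

```cpp
#include <cstdint>
#include <fstream>
#include <iostream>
#include <map>
#include <string>

struct Record {
    char payload[64];  // hypothetical fixed-width payload
};

// Load a flat binary file of (key, record) pairs into the map.
bool load(const std::string& path, std::map<std::uint64_t, Record>& db) {
    std::ifstream in(path, std::ios::binary);
    if (!in) return false;
    std::uint64_t key;
    Record rec;
    while (in.read(reinterpret_cast<char*>(&key), sizeof key) &&
           in.read(reinterpret_cast<char*>(&rec), sizeof rec)) {
        db.emplace(key, rec);
    }
    return true;
}

int main() {
    std::map<std::uint64_t, Record> db;
    if (!load("snapshot.bin", db)) {
        std::cerr << "cannot open snapshot\n";
        return 1;
    }
    // ~1000 lookups per second is easy once the data is resident:
    if (auto it = db.find(42); it != db.end()) {
        std::cout << "found key 42\n";
    }
}
```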

+3
4 answers

I do not think it's a good idea. To manage such a large amount of data efficiently, you will need a lot of optimization. std::map is probably not optimized for your scenario, and I'm afraid the processing algorithms you would write on top of it will not be as efficient as they could be.

Besides, a map is a node-based tree, so holding the data (all 16 GB of it, plus per-node bookkeeping) will require considerably more RAM than the raw data itself.
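To put rough numbers on the RAM point, a back-of-envelope estimate; the record count and the ~32-byte per-node figure (three pointers plus color/padding in a typical red-black-tree implementation) are assumptions, not measurements:

```cpp
#include <cstddef>
#include <iostream>

int main() {
    const std::size_t n          = 250'000'000;  // assumed record count
    const std::size_t payload    = 8 + 64;       // key + value per entry
    const std::size_t node_extra = 32;           // tree bookkeeping per node
    const double gib = 1024.0 * 1024.0 * 1024.0;

    std::cout << "raw data:      " << n * payload / gib << " GiB\n"
              << "tree overhead: " << n * node_extra / gib << " GiB\n";
}
```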

+7

Consider std::unordered_map instead: it provides O(1) average-time access, i.e. lookups stay roughly constant-time regardless of the number of elements, whereas std::map lookups are O(log n).
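A sketch of what that swap looks like. The reserve() call pre-allocates buckets so the daily bulk load does not trigger repeated rehashing; the element count and load factor are assumptions:

```cpp
#include <cstdint>
#include <iostream>
#include <unordered_map>

int main() {
    std::unordered_map<std::uint64_t, std::uint64_t> db;
    db.max_load_factor(0.7f);  // fewer collisions at the cost of more buckets
    db.reserve(250'000'000);   // pre-size for the expected daily load

    db.emplace(42, 1);
    if (auto it = db.find(42); it != db.end()) {
        std::cout << it->second << '\n';  // O(1) average lookup
    }
}
```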

+4

I would use an embedded DB for this instead of rolling my own in-memory store; see www.sqlite.org.
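A minimal sketch of that route using SQLite's C API with an in-memory database (":memory:"), which also matches the SQLite idea in the question; the table name and schema are illustrative assumptions:

```cpp
#include <iostream>
#include <sqlite3.h>

int main() {
    sqlite3* db = nullptr;
    if (sqlite3_open(":memory:", &db) != SQLITE_OK) {
        std::cerr << sqlite3_errmsg(db) << '\n';
        return 1;
    }
    char* err = nullptr;
    if (sqlite3_exec(db,
            "CREATE TABLE records(key INTEGER PRIMARY KEY, payload BLOB);",
            nullptr, nullptr, &err) != SQLITE_OK) {
        std::cerr << err << '\n';
        sqlite3_free(err);
    }

    // Prepared statement for the hot lookup path (~1000 req/s).
    sqlite3_stmt* stmt = nullptr;
    sqlite3_prepare_v2(db, "SELECT payload FROM records WHERE key = ?;",
                       -1, &stmt, nullptr);
    sqlite3_bind_int64(stmt, 1, 42);
    if (sqlite3_step(stmt) == SQLITE_ROW) {
        std::cout << "found key 42\n";
    }
    sqlite3_finalize(stmt);
    sqlite3_close(db);
}
```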

+4

So, if I understand you correctly, you want to:

  • replace an RDBMS with std::map;
  • load 16 GB into memory every day;
  • keep doing this FOREVER;
  • serve about 1000 requests per second against it;
  • keep the I/O time real-time.

I just think this is a bad idea.

-1

Source: https://habr.com/ru/post/1788482/

