Reading Big Data in C++

I'm using C++ to read large files with over 30,000 lines and 3,000 columns (30000 x 3000). I'm reading the data into a 2D vector, but I need to repeat this process several times. Is there a way to optimize the reading process?

2 answers

I will give you some ideas rather than an exact solution, because I don't know the full details of your system.

  • If the file is this large but only some of the data changes between reads, consider storing it in a database instead of re-parsing the file.
  • For performance, read the file concurrently: split it into parts and read each part with a separate thread.
  • If you also need to process the data, use separate threads for processing and connect the reading and processing stages with queues (or concurrent queues).
  • If your records have a fixed length (e.g. fixed-width numbers) and you know which locations changed, read only the changed data instead of reading and processing the whole file again and again.
  • If none of the above helps, use memory mapping. If you need portability, Boost Memory-Mapped Files can reduce your work.

The memory-mapping approach works well here, since there are only read operations.


Source: https://habr.com/ru/post/1482076/
