I can offer some ideas rather than an exact solution, because I don't know the full details of your system.
- If you have a large data file and only some of the data changes between reads, consider moving it into a database, so that each pass only has to query or update the rows that actually changed.
- For raw read performance, you can read the file concurrently: split it into chunks and have several threads read different parts in parallel (a minimal sketch follows this list).
- If you also need to process the data, use separate threads for processing and connect the reader and the workers with a queue or a concurrent queue (see the blocking-queue sketch below).
- If your records have a fixed length (e.g. fixed-width numbers) and you know which positions changed, seek to those offsets and read only the changed data instead of reading and processing the whole file again and again (see the seek example below).
- If none of the above helps, try memory-mapping the file. If you also want portability, Boost's memory-mapped files can cut down on the work (see the last sketch below).
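
A minimal sketch of the chunked, multi-threaded read from the second bullet, assuming a hypothetical file `data.bin` and a simple even split into chunks; error handling and a tuned chunk-size policy are left out.

```cpp
#include <algorithm>
#include <cstddef>
#include <fstream>
#include <string>
#include <thread>
#include <vector>

// Read one slice of the file into the buffer region reserved for it.
// Each thread opens its own stream, so no locking is needed.
void read_chunk(const std::string& path, std::size_t offset,
                std::size_t length, char* dest)
{
    std::ifstream in(path, std::ios::binary);
    in.seekg(static_cast<std::streamoff>(offset));
    in.read(dest, static_cast<std::streamsize>(length));
}

int main()
{
    const std::string path = "data.bin";            // placeholder input file
    unsigned num_threads = std::thread::hardware_concurrency();
    if (num_threads == 0) num_threads = 4;          // fallback if unknown

    std::ifstream probe(path, std::ios::binary | std::ios::ate);
    const std::size_t file_size = static_cast<std::size_t>(probe.tellg());

    std::vector<char> buffer(file_size);            // whole file in memory
    std::vector<std::thread> workers;

    const std::size_t chunk = (file_size + num_threads - 1) / num_threads;
    for (unsigned i = 0; i < num_threads; ++i) {
        const std::size_t offset = i * chunk;
        if (offset >= file_size) break;
        const std::size_t length = std::min(chunk, file_size - offset);
        workers.emplace_back(read_chunk, path, offset, length,
                             buffer.data() + offset);
    }
    for (auto& t : workers) t.join();
    // buffer now holds the file contents, read in parallel chunks.
}
```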
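
For the reader/worker split in the third bullet, here is a sketch of a small blocking queue. The string chunks and the empty-string end marker are placeholders for whatever unit of work your reader actually produces.

```cpp
#include <condition_variable>
#include <iostream>
#include <mutex>
#include <queue>
#include <string>
#include <thread>

// A minimal thread-safe queue linking a reader thread to a worker thread.
template <typename T>
class BlockingQueue {
public:
    void push(T item)
    {
        {
            std::lock_guard<std::mutex> lock(mutex_);
            queue_.push(std::move(item));
        }
        cv_.notify_one();
    }

    T pop()
    {
        std::unique_lock<std::mutex> lock(mutex_);
        cv_.wait(lock, [this] { return !queue_.empty(); });
        T item = std::move(queue_.front());
        queue_.pop();
        return item;
    }

private:
    std::queue<T>           queue_;
    std::mutex              mutex_;
    std::condition_variable cv_;
};

int main()
{
    BlockingQueue<std::string> queue;

    // Reader thread: pushes chunks of data as they are read.
    std::thread reader([&] {
        for (int i = 0; i < 5; ++i)
            queue.push("chunk " + std::to_string(i));
        queue.push("");                      // empty string marks end of data
    });

    // Worker thread: processes chunks as they arrive.
    std::thread worker([&] {
        for (;;) {
            std::string chunk = queue.pop();
            if (chunk.empty()) break;        // stop at the end marker
            std::cout << "processing " << chunk << '\n';
        }
    });

    reader.join();
    worker.join();
}
```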
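
For the fixed-length case in the fourth bullet, a sketch that seeks straight to the changed records. The `Record` layout is hypothetical; it assumes the records are trivially copyable and stored on disk with the same binary layout.

```cpp
#include <cstddef>
#include <cstdint>
#include <fstream>
#include <string>
#include <vector>

// Hypothetical fixed-length record; the on-disk layout is assumed to
// match the in-memory layout (no padding surprises).
struct Record {
    std::int32_t id;
    double       value;
};

// Re-read only the records whose indices are known to have changed,
// instead of scanning the whole file.
std::vector<Record> read_changed(const std::string& path,
                                 const std::vector<std::size_t>& changed)
{
    std::ifstream in(path, std::ios::binary);
    std::vector<Record> result;
    result.reserve(changed.size());

    for (std::size_t index : changed) {
        Record r{};
        in.seekg(static_cast<std::streamoff>(index * sizeof(Record)));
        in.read(reinterpret_cast<char*>(&r), sizeof(Record));
        result.push_back(r);
    }
    return result;
}
```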
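
Finally, a short memory-mapping sketch with Boost.Iostreams for the last bullet, again using `data.bin` as a placeholder path.

```cpp
#include <boost/iostreams/device/mapped_file.hpp>
#include <cstddef>
#include <iostream>
#include <string>

int main()
{
    // Map the whole file read-only; the OS pages data in on demand,
    // so unchanged regions are not copied around again and again.
    const std::string path = "data.bin";            // placeholder path
    boost::iostreams::mapped_file_source file(path);

    const char* data = file.data();                 // pointer to the mapped bytes
    std::size_t size = file.size();

    // Example use: count newlines without an explicit read loop.
    std::size_t lines = 0;
    for (std::size_t i = 0; i < size; ++i)
        if (data[i] == '\n') ++lines;

    std::cout << "lines: " << lines << '\n';
}
```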