I have large raw data CT files that can exceed 20-30 GB in size. Most of the modern machines in our department have at most 3 GB of RAM, but for data processing we need to go through all of the available data. Of course, we could do this by streaming through the data sequentially with read and write functions, but sometimes it is necessary to keep some of the data in memory.
Currently I have my own memory management, for which I created a so-called MappableObject. Each raw data file contains, say, 20,000 structures, each holding different data, and each MappableObject refers to a location in the file.
In C#, I built a partially working mechanism that automatically updates the mappings and, if necessary, discards data. I already knew about MemoryMappedFiles back then, but under .NET 3.5 I decided against using it because I knew it would be available natively in .NET 4.0.
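To give a rough idea of the pattern (this is only a simplified sketch with made-up member names, not my actual implementation): each MappableObject knows its offset in the file, loads its bytes on first access, and can be unloaded again when memory gets tight.

```csharp
using System;
using System.IO;

// Simplified sketch of the MappableObject idea; member names and the
// loading strategy are placeholders for illustration only.
class MappableObject
{
    private readonly string _filePath; // path to the raw data file
    private readonly long _offset;     // position of this structure in the file
    private readonly int _length;      // size of the structure in bytes
    private byte[] _data;              // null while the data is paged out

    public MappableObject(string filePath, long offset, int length)
    {
        _filePath = filePath;
        _offset = offset;
        _length = length;
    }

    // Load the raw bytes from disk the first time they are needed.
    public byte[] Data
    {
        get
        {
            if (_data == null)
            {
                using (var fs = new FileStream(_filePath, FileMode.Open, FileAccess.Read))
                {
                    fs.Seek(_offset, SeekOrigin.Begin);
                    _data = new byte[_length];
                    int read = 0;
                    while (read < _length)
                        read += fs.Read(_data, read, _length - read);
                }
            }
            return _data;
        }
    }

    // Discard the in-memory copy when memory pressure gets too high;
    // it can always be re-read from the file later.
    public void Unload()
    {
        _data = null;
    }
}
```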
So today I tried MemoryMappedFiles and found that I cannot map that much memory. On a 32-bit system, mapping 20 GB does not work because it exceeds the logical address space. That part is clear to me.
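If I understand the API correctly, I could at least map small windows of the file one at a time instead of a view over the whole file, roughly like this (file path and record size are just placeholders I made up):

```csharp
using System;
using System.IO;
using System.IO.MemoryMappedFiles;

class MappedWindowExample
{
    static void Main()
    {
        // Placeholder file name and record size for illustration only.
        const string path = @"C:\data\scan.raw";
        const int recordSize = 1024;   // size of one structure in bytes
        long recordIndex = 12345;      // which of the ~20,000 structures to read

        // Creating the mapping itself does not map the whole file into the
        // 32-bit address space; only the views consume address space.
        using (var mmf = MemoryMappedFile.CreateFromFile(path, FileMode.Open))
        {
            // Map only a small window around the record that is needed
            // instead of asking for a view over the entire 20-30 GB file.
            long offset = recordIndex * recordSize;
            using (var accessor = mmf.CreateViewAccessor(offset, recordSize,
                                                         MemoryMappedFileAccess.Read))
            {
                // Read the record's bytes out of the mapped window.
                var buffer = new byte[recordSize];
                accessor.ReadArray(0, buffer, 0, recordSize);
                Console.WriteLine("First byte of record: " + buffer[0]);
            }
        }
    }
}
```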
But is there a way to handle large files like mine? What other options do I have? How do you solve problems like this?
Thanks Martin
msedi