Usually you would process a large file using a streaming API, either raw binary (`Stream`) or via some protocol reader (`XmlReader`, `StreamReader`, etc.). In some cases this can also be done with memory-mapped files. The key point is that you only look at a small portion of the file at a time (a moderately sized data buffer, a logical "row" or "node", etc., depending on the scenario).
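As a minimal sketch of the raw-binary case (the file path and the 64 KB buffer size are placeholder choices, not anything prescribed):

```csharp
using System;
using System.IO;

class StreamingExample
{
    static void Main()
    {
        // Hypothetical input path; pick a buffer size that suits your workload.
        const string path = "huge-file.bin";
        var buffer = new byte[64 * 1024];

        using (var stream = File.OpenRead(path))
        {
            int bytesRead;
            // Only one buffer's worth of the file is in memory at a time,
            // regardless of how large the file itself is.
            while ((bytesRead = stream.Read(buffer, 0, buffer.Length)) > 0)
            {
                // Process buffer[0..bytesRead) here: hash it, scan it,
                // copy it to another stream, etc.
            }
        }
    }
}
```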
Where this gets odd is your desire to compare the file directly to some vast number. Frankly, I don't know how we can help with that without more information, but if you really are dealing with a number of this size, I think you are going to struggle unless the binary protocol happens to make it convenient. And "performing arithmetic such as division and multiplication" is meaningless on raw data; it only makes sense on data that has been parsed into something with those operations defined.
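To illustrate the parse-then-operate point, here is a sketch using `System.Numerics.BigInteger`. It assumes the number is stored as base-10 text and is small enough to fit in a string, which is far short of the file sizes discussed above; the file name is hypothetical.

```csharp
using System;
using System.IO;
using System.Numerics;

class ParsedArithmetic
{
    static void Main()
    {
        // Hypothetical file holding one decimal integer as text. It is the
        // parsing step, not the raw bytes, that makes arithmetic meaningful.
        string digits = File.ReadAllText("big-number.txt").Trim();
        BigInteger value = BigInteger.Parse(digits);

        BigInteger half = value / 2;     // division
        BigInteger doubled = value * 2;  // multiplication

        Console.WriteLine(half);
        Console.WriteLine(doubled);
    }
}
```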
Also: note that in .NET 4.5 you can flip a configuration switch to raise the maximum size of arrays above the 2 GB limit. There is still a limit, but it is a bit bigger. Unfortunately, the maximum number of elements stays the same, so if you are using a `byte[]` array this will not help. But if you are using a `SomeCompositeStruct[]`, you should be able to get more out of it. See `gcAllowVeryLargeObjects`.
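The switch is set via `<gcAllowVeryLargeObjects enabled="true" />` under `<runtime>` in app.config, and it only applies to 64-bit processes. A sketch of what it enables, reusing the placeholder struct name from above (the element count is illustrative):

```csharp
using System;

// A 16-byte element, so the array's total size passes 2 GB long before the
// per-dimension element limit is reached.
struct SomeCompositeStruct
{
    public long A;
    public long B;
}

class LargeArrayDemo
{
    static void Main()
    {
        // 200 million elements * 16 bytes = ~3.2 GB: over the normal 2 GB
        // object cap, so without the config switch this allocation throws
        // OutOfMemoryException.
        var data = new SomeCompositeStruct[200000000];
        Console.WriteLine(data.LongLength);
    }
}
```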