I know that this answer is too late to help the original poster, but I hope it can help others who have encountered a similar problem.
First, the problem is with the DataTable, not the DataAdapter.
The problem may be that you really do have insufficient memory (in which case my answer will not help). You can do the math to figure out whether that could be the case: number of records x an estimate of bytes per record. If that approaches 2 GB on a 32-bit platform, or your available RAM on a 64-bit platform, then the only options are to reduce the number of records or fields, or to come up with an approach that uses a DataReader instead of a DataTable (see the sketch after the next paragraph).
In your case, you have 150K records; let's assume each one needs 1 KB of memory, which gives a round figure of 150 MB. Even on a 32-bit machine with 2 GB of RAM that should be fine (provided not much other allocation like this is going on), and you have a 64-bit machine with 128 GB of RAM (nice). So, by that logic, there should be no memory errors.
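If you do end up needing the DataReader approach mentioned above, a minimal sketch might look like the following, assuming SQL Server; the connection string, query, and ProcessRecord handler are hypothetical placeholders:

    using System.Data.SqlClient;

    string connectionString = "...";  // placeholder: your actual connection string
    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand("SELECT * FROM MyTable", connection))  // hypothetical query
    {
        connection.Open();
        using (SqlDataReader reader = command.ExecuteReader())
        {
            while (reader.Read())
            {
                // Process one record at a time instead of buffering them all
                // in a DataTable, so memory use stays flat regardless of count.
                ProcessRecord(reader);  // hypothetical per-record handler
            }
        }
    }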
So what is the cause of the problem? The Large Object Heap (LOH). Why? The DataTable creates an array to hold the records. As far as I understand, it creates an array of 50 and then grows it as records are added. Any single allocation of more than 85,000 bytes comes from the Large Object Heap. (You were on a 64-bit platform, where an object reference takes 8 bytes, so once you reach 85,000 / 8 = 10,625 records, the array allocation starts coming from the LOH.) The problem with the Large Object Heap is that it is not compacted. So there may be plenty of free space in it, yet no single contiguous block large enough for the next allocation. With .NET 4.5, Microsoft improved it so that adjacent free fragments are coalesced, but it still does not reorganize the heap to create large blocks of free space. As a result, in my experience, once you start hitting the LOH, an "Out of memory" exception is only a matter of time.
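For the curious, the arithmetic behind that 10,625 figure, assuming 8-byte object references on a 64-bit platform:

    // Any single allocation of roughly 85,000 bytes or more goes to the LOH.
    const int lohThresholdBytes = 85000;
    const int bytesPerReference = 8;  // size of an object reference on 64-bit
    Console.WriteLine(lohThresholdBytes / bytesPerReference);  // prints 10625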
The solution?
What worked for me was setting the initial capacity of the DataTable. When retrieving records from the database, this means doing a count first, so it comes at the cost of an extra database query, and then:
    // ...
    // MinimumCapacity is the DataTable property that pre-sizes its internal row
    // storage (there is no InitialCapacity), and it must be set on the table
    // itself, so create the "Query" table up front and let Fill populate it.
    DataTable table = dsGrid.Tables.Add("Query");
    table.MinimumCapacity = count;
    daGrid.Fill(dsGrid, "Query");
    // ...
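For completeness, the count itself can come from the extra query before the fill; a sketch assuming SQL Server, a hypothetical table name, and an already-open connection:

    int count;
    using (var countCommand = new System.Data.SqlClient.SqlCommand(
        "SELECT COUNT(*) FROM MyTable", connection))  // hypothetical query
    {
        count = (int)countCommand.ExecuteScalar();
    }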
Although this does not avoid allocating from the LOH, it means only one allocation happens instead of several. So, in addition to eliminating the out-of-memory exceptions, you should also get a performance gain (offset by the need for the extra database query).
You can also force the .NET garbage collector to compact the Large Object Heap, but you can only tell it to do so during its next collection. I tend to use this when I know I have been hitting the LOH. It may be overkill, but consider this amendment to my suggestion:
    // ...
    DataTable table = dsGrid.Tables.Add("Query");
    table.MinimumCapacity = count;
    if (count > 10625)
    {
        // The row array will be big enough to land on the LOH, so ask the GC
        // to compact the LOH during its next blocking collection.
        System.Runtime.GCSettings.LargeObjectHeapCompactionMode =
            System.Runtime.GCLargeObjectHeapCompactionMode.CompactOnce;
    }
    daGrid.Fill(dsGrid, "Query");
    // ...
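Note that CompactOnce only takes effect during the next blocking (generation 2) collection; if you want the compaction to happen right away, before the fill, you can induce one yourself at the cost of a pause:

    System.Runtime.GCSettings.LargeObjectHeapCompactionMode =
        System.Runtime.GCLargeObjectHeapCompactionMode.CompactOnce;
    GC.Collect();  // a blocking collection, during which the LOH gets compacted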