System.OutOfMemoryException when populating a DataAdapter?

I need to pull 150K records from the database. When I call da.Fill(ds, "Query") it throws a System.OutOfMemoryException.

    Dim daGrid As New SqlDataAdapter(sqlcmd_q)
    daGrid.Fill(dsGrid, "Query")
    daGrid.Dispose()

I need the data in this form; I cannot use XML, because I need to bind the result to an MSChart control to display a scatter plot.

Any suggestions?

+4
3 answers

The first thing I would check is how many columns you are returning and what their data types are. Although 150K records is a lot, it should not give you an OOM exception unless each record is around 13 KB long (on a 32-bit machine). That tells me you are either returning more fields than you need, or that some of the fields contain very large strings or binary data. Try trimming the SELECT statement down to only those fields that are absolutely necessary for display.

If that does not work, you may need to move from a DataTable to a list of a user-defined type (a class with the corresponding fields).
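As a rough sketch of that idea (the class, column names, and connection string below are illustrative, not from the question), you could fill a pre-sized list of a small user-defined type with a SqlDataReader instead of a DataTable:

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;

// Illustrative record type holding only the two fields the scatter plot needs.
public class ChartPoint
{
    public double X;
    public double Y;
}

public static class PointLoader
{
    public static List<ChartPoint> LoadPoints(string connectionString)
    {
        // Pre-size the list so it does not have to grow repeatedly.
        var points = new List<ChartPoint>(150000);
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand("SELECT X, Y FROM MyTable", conn)) // hypothetical query
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    points.Add(new ChartPoint
                    {
                        X = reader.GetDouble(0),
                        Y = reader.GetDouble(1)
                    });
                }
            }
        }
        return points;
    }
}
```

Each list element here is a small managed object rather than a DataRow, which cuts the per-record overhead considerably.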

+7

You did not show your query. Make sure it returns only the columns you need.

If you still have problems, you could try moving to 64-bit (if your hardware supports it and you have more than 2 GB of RAM).

If that does not help, you need to reduce the amount of memory you use. One option would be to build the chart without keeping all of the underlying data in memory: read the records one at a time, compute the coordinates, and store only those, discarding the source record. You might even be able to do that computation in the query itself.
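A sketch of that streaming approach (the series name and column names are hypothetical): read rows one at a time with a SqlDataReader and push only the computed (x, y) pair into the chart, so no full record is ever retained:

```csharp
using System.Data.SqlClient;
using System.Windows.Forms.DataVisualization.Charting;

public static class ScatterLoader
{
    // Streams rows from the database; only the computed point survives each iteration.
    public static void FillScatter(Chart chart, string connectionString)
    {
        var series = chart.Series["Query"];       // hypothetical series name
        series.ChartType = SeriesChartType.Point; // scatter plot
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand("SELECT A, B FROM MyTable", conn)) // hypothetical columns
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    double x = reader.GetDouble(0); // compute/transform coordinates here if needed
                    double y = reader.GetDouble(1);
                    series.Points.AddXY(x, y);      // keep the point, not the record
                }
            }
        }
    }
}
```

With this approach peak memory is bounded by the chart's point collection rather than by 150K full database rows.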

+4

I know this answer is too late to help the original poster, but I hope it can help others who run into a similar problem.

First, the problem lies with the DataTable, not the DataAdapter.

The problem may be that you really are out of memory (in which case my answer will not help). You can do the math to figure out whether that could be the case: number of records × estimated bytes per record. If this approaches 2 GB on a 32-bit platform, or your available RAM on a 64-bit platform, then the only options are to reduce the number of records, reduce the number of fields, or switch to an approach that uses a DataReader instead of a DataTable.

In your case, you have 150K records; let's assume each one needs 1 KB of memory, which gives a round figure of 150 MB. Even on a 32-bit machine with 2 GB of RAM this should be fine (provided not much other allocation of that kind is going on). In your case, you have a 64-bit machine with 128 GB of RAM (nice). So by that logic there should be no out-of-memory errors.

So what is causing the problem? The Large Object Heap (LOH). Why? The DataTable creates an internal array to hold its rows. As far as I understand, it starts with an array of 50 entries and grows it as rows are added. Any allocation larger than 85,000 bytes comes from the Large Object Heap. (You were on a 64-bit platform, where each array element is an 8-byte reference, so once you reach 85,000 / 8 = 10,625 records the row array starts being allocated on the LOH.) The problem with the Large Object Heap is that it is not compacted. So there may be plenty of free space in total, yet no single contiguous block large enough for the next allocation. With .NET 4.5, Microsoft improved this by coalescing adjacent free fragments, but the runtime still does not relocate objects to create large blocks of free space. As a result, in my experience, once you start churning the LOH it is only a matter of time before you get an "Out of memory" exception.

The solution?

What worked for me was to set the initial capacity of the DataTable. When retrieving records from the database this means getting a count first, so it costs an extra query, and then:

    ...
    // The capacity property on a DataTable is MinimumCapacity,
    // and it must be set on the table being filled, not on the DataSet.
    dtGrid.MinimumCapacity = count;
    daGrid.Fill(dtGrid, "Query");
    ...

While this does not stop the allocation coming from the LOH, it should mean a single allocation instead of several. So besides eliminating the out-of-memory exceptions, you should also get a performance gain (offset by the cost of the extra count query).

You can force the .NET garbage collector to compact the Large Object Heap, but you can only tell it to do so the next time it runs. I tend to use this when I know I have been churning the LOH. It may be overkill, but consider this amendment to my suggestion:

    ...
    dtGrid.MinimumCapacity = count;
    if (count > 10625) // past this row count, the row array is LOH-allocated on 64-bit
    {
        System.Runtime.GCSettings.LargeObjectHeapCompactionMode =
            System.Runtime.GCLargeObjectHeapCompactionMode.CompactOnce;
    }
    daGrid.Fill(dtGrid, "Query");
    ...
0

Source: https://habr.com/ru/post/1340981/
