Memory problem when using SqlBulkCopy

We use the SqlBulkCopy class in C# to bulk insert data into SQL Server. We have a table with 10 million records.

We insert the data in batches of 10,000 rows in a loop.

We are facing a physical memory problem: memory keeps growing and is never released.

Below is our code. How can we free memory when SqlBulkCopy is used, or is there another way to do bulk inserts?

    using (System.Data.SqlClient.SqlBulkCopy bulkCopy = new System.Data.SqlClient.SqlBulkCopy(SQlConn, SqlBulkCopyOptions.TableLock, null))
    {
        //bulkCopy = new System.Data.SqlClient.SqlBulkCopy(SQlConn);
        bulkCopy.DestinationTableName = DestinationTable;
        bulkCopy.BulkCopyTimeout = 0;
        bulkCopy.BatchSize = dt1.Rows.Count;
        Logger.Log("DATATABLE FINAL :" + dt1.Rows.Count.ToString(), Logger.LogType.Info);

        if (SQlConn.State == ConnectionState.Closed || SQlConn.State == ConnectionState.Broken)
            SQlConn.Open();

        bulkCopy.WriteToServer(dt1); // DataTable

        SQlConn.Close();
        SQlConn.Dispose();

        bulkCopy.Close();
        if (bulkCopy != null)
        {
            ((IDisposable)bulkCopy).Dispose();
        }
    }

Update: the full code is below.

    try
    {
        using (SqlConnection SQlConn = new SqlConnection(Common.SQLConnectionString))
        {
            DataTable dt1 = FillEmptyDateFields(dtDestination);
            //SqlTableCreator ObjTbl = new SqlTableCreator(SQlConn);
            //ObjTbl.DestinationTableName = DestinationTable;

            using (System.Data.SqlClient.SqlBulkCopy bulkCopy = new System.Data.SqlClient.SqlBulkCopy(SQlConn, SqlBulkCopyOptions.TableLock, null))
            {
                //bulkCopy = new System.Data.SqlClient.SqlBulkCopy(SQlConn);
                bulkCopy.DestinationTableName = DestinationTable;
                bulkCopy.BulkCopyTimeout = 0;
                bulkCopy.BatchSize = dt1.Rows.Count;
                Logger.Log("DATATABLE FINAL :" + dt1.Rows.Count.ToString(), Logger.LogType.Info);

                if (SQlConn.State == ConnectionState.Closed || SQlConn.State == ConnectionState.Broken)
                    SQlConn.Open();

                bulkCopy.WriteToServer(dt1);

                SQlConn.Close();
                SQlConn.Dispose();

                bulkCopy.Close();
                if (bulkCopy != null)
                {
                    ((IDisposable)bulkCopy).Dispose();
                }
            }
        }

        dtDestination.Dispose();
        System.GC.Collect();
        dtDestination = null;
    }
    catch (Exception ex)
    {
        Logger.Log(ex, Logger.LogType.Error);
        throw ex;
    }
2 answers

The key question here is: what is dt1, where did it come from, and how are you releasing it? DataTable is actually quite hard to clean up, and honestly I would not recommend a DataTable source here. However, if you must use a DataTable, then make sure you use a completely separate DataSet / DataTable per iteration, and release the old one so it can be reclaimed.
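For illustration, a minimal sketch of that per-iteration pattern. LoadNextBatch() is a hypothetical helper standing in for however you build each 10,000-row chunk; the rest of the names are taken from the question:

    // assumes: using System.Data; using System.Data.SqlClient;
    using (SqlConnection conn = new SqlConnection(Common.SQLConnectionString))
    {
        conn.Open();
        DataTable batch;
        // LoadNextBatch() is a placeholder: returns a fresh 10,000-row DataTable, or null when done.
        while ((batch = LoadNextBatch()) != null)
        {
            using (SqlBulkCopy bulkCopy = new SqlBulkCopy(conn, SqlBulkCopyOptions.TableLock, null))
            {
                bulkCopy.DestinationTableName = DestinationTable;
                bulkCopy.BulkCopyTimeout = 0;
                bulkCopy.WriteToServer(batch);
            }
            batch.Dispose(); // release this chunk before building the next one
            batch = null;
        }
    }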

More efficient, however, is to use WriteToServer(IDataReader), which lets you process rows in a streaming fashion. If you are copying between two SQL systems, you can simply use ExecuteReader() on a separate command / connection, but IDataReader is pretty simple, and you can write a basic IDataReader for most sources (or find libraries that do it for you, for example CsvReader for delimited files such as csv / tsv).
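A rough sketch of the streaming approach when copying between two SQL Server databases. The connection strings and the SELECT are placeholders, and EnableStreaming requires .NET 4.5 or later:

    // assumes: using System.Data; using System.Data.SqlClient;
    using (SqlConnection source = new SqlConnection(sourceConnectionString))
    using (SqlConnection destination = new SqlConnection(destinationConnectionString))
    using (SqlCommand cmd = new SqlCommand("SELECT * FROM SourceTable", source))
    {
        source.Open();
        destination.Open();

        using (SqlDataReader reader = cmd.ExecuteReader())
        using (SqlBulkCopy bulkCopy = new SqlBulkCopy(destination))
        {
            bulkCopy.DestinationTableName = DestinationTable;
            bulkCopy.BulkCopyTimeout = 0;
            bulkCopy.BatchSize = 10000;       // rows per batch sent to the server
            bulkCopy.EnableStreaming = true;  // .NET 4.5+: stream rows instead of buffering them
            bulkCopy.WriteToServer(reader);   // rows flow straight through; the full table is never held in memory
        }
    }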


I think the problem is with this line:

 bulkCopy.BatchSize = dt1.Rows.Count; 

The BatchSize property determines how many rows are sent to the server in each batch. Here the batch size is the entire row count of dt1, which is potentially unbounded.

http://msdn.microsoft.com/en-us/library/system.data.sqlclient.sqlbulkcopy.batchsize.aspx

Try setting it to a small, fixed number instead:

 bulkCopy.BatchSize = 1000; 

It is up to you to determine the optimal batch size.
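For example, applied to the code in the question (a sketch: only the BatchSize line changes, everything else stays as you have it):

    using (System.Data.SqlClient.SqlBulkCopy bulkCopy = new System.Data.SqlClient.SqlBulkCopy(SQlConn, SqlBulkCopyOptions.TableLock, null))
    {
        bulkCopy.DestinationTableName = DestinationTable;
        bulkCopy.BulkCopyTimeout = 0;
        bulkCopy.BatchSize = 1000; // fixed batch size instead of dt1.Rows.Count
        bulkCopy.WriteToServer(dt1);
    }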

