SQL Server 2008 - performance tuning features for inserting large amounts of data

I need to insert a large amount of data into a table. Does SQL Server 2008 (compared to 2005) have any new features to improve performance in this case?

+3
2 answers

I don't know whether this is an option for your problem, but it can help if you are doing the load in code.

I had a similar task on a large project in the past that had to import 15 years of production data into a new schema (on SQL Server 2005). System.Data.SqlClient.SqlBulkCopy was the fastest option.

One caveat: load the data in batches, about 1 million rows at a time in my case, so the .NET GC can reclaim memory between batches; otherwise you can run out of memory (especially in a 32-bit process, where the usable address space is limited).

The code looked something like this:

using System;
using System.Data;
using System.Data.SqlClient;

// The connection strings and the table/column names below are placeholders;
// substitute your own. The columns must match the destination table's schema.
DataTable dataToInsert = new DataTable();
dataToInsert.Columns.Add("Id", typeof(int));
dataToInsert.Columns.Add("Timestamp", typeof(DateTime));
dataToInsert.Columns.Add("Value", typeof(int));

using (var oldConnection = new SqlConnection(oldDatabaseConnectionString))
using (var sqlCommand = new SqlCommand("select * from OldTable", oldConnection))
{
    oldConnection.Open();
    using (SqlDataReader dataFromOldSystem = sqlCommand.ExecuteReader())
    {
        while (dataFromOldSystem.Read())
        {
            // I had to modify/transpose the row from the old table:
            // one source row becomes three rows in the new schema.
            dataToInsert.Rows.Add(dataFromOldSystem.GetInt32(0), dataFromOldSystem.GetDateTime(1), dataFromOldSystem.GetInt32(2));
            dataToInsert.Rows.Add(dataFromOldSystem.GetInt32(0), dataFromOldSystem.GetDateTime(1), dataFromOldSystem.GetInt32(3));
            dataToInsert.Rows.Add(dataFromOldSystem.GetInt32(0), dataFromOldSystem.GetDateTime(1), dataFromOldSystem.GetInt32(4));

            // Check if the number of rows is over some magic number that is
            // below the memory limit; watching the private bytes in use by
            // your app helps you guess this number.
            if (dataToInsert.Rows.Count > 1000000)
            {
                using (var bulkCopier = new SqlBulkCopy(newDatabaseConnectionString))
                {
                    bulkCopier.DestinationTableName = "NewTable";
                    bulkCopier.WriteToServer(dataToInsert);
                }
                // Drop the batch so the GC can reclaim the memory.
                dataToInsert.Clear();
            }
        }
    }
}

// Flush the partial batch left over after the loop.
if (dataToInsert.Rows.Count > 0)
{
    using (var bulkCopier = new SqlBulkCopy(newDatabaseConnectionString))
    {
        bulkCopier.DestinationTableName = "NewTable";
        bulkCopier.WriteToServer(dataToInsert);
    }
}
+1

SQL Server 2008 adds the MERGE statement to T-SQL, which lets you combine INSERT, UPDATE and DELETE logic against a target table in a single statement.
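
As a minimal sketch of what that looks like (the table and column names dbo.Target, dbo.Source, Id and Value are invented for illustration):

MERGE dbo.Target AS t
USING dbo.Source AS s
    ON t.Id = s.Id
WHEN MATCHED THEN
    UPDATE SET t.Value = s.Value
WHEN NOT MATCHED BY TARGET THEN
    INSERT (Id, Value) VALUES (s.Id, s.Value)
WHEN NOT MATCHED BY SOURCE THEN
    DELETE;

Each WHEN clause is optional, so for a pure insert/update ("upsert") load you can simply leave out the DELETE branch.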

Otherwise, System.Data.SqlClient.SqlBulkCopy remains the fastest way to push rows in from code (it was already available for SQL Server 2005).

+4

Source: https://habr.com/ru/post/1770446/

