What is the fastest way to read from and write to SQL Server with a large data set?

I have about 60 million records in the database, and I have to process them all. The idea is to use C# code to read the data, process it, and then write it back to the db. The data does not come from and go back to the same table; several tables are involved.

What is the best way to do this? Should I read 100K records at a time into a DataSet, process each record, bulk insert the results into the database, and then read the next set?

+3
2 answers

If at all possible, do the processing in SQL.

A set-based statement running on the server avoids shipping 60 million rows to the client and back over the wire. Only fall back to C# for logic you genuinely cannot express in SQL.
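
For example, if the transformation can be phrased as a set operation, a single statement fired from C# does all the work server-side. A minimal sketch, where the table names, columns, and the Value * 2 transformation are invented stand-ins:

```
using System.Data.SqlClient;

class SetBasedJob
{
    static void Run(string connectionString)
    {
        // Stand-in for the real processing; if it fits in one set-based
        // statement, the rows never leave the server.
        const string sql =
            @"INSERT INTO dbo.ProcessedRecords (Id, Value)
              SELECT Id, Value * 2
              FROM dbo.SourceRecords;";

        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(sql, conn))
        {
            cmd.CommandTimeout = 0;   // a 60M-row statement can run a long time
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}
```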

EDIT: to expand on this.

If it cannot be done in sql, then pipeline the work. Do not try to hold everything in memory at once; read in batches of, say, 1 million records. The right thread counts depend on where the time goes: you might end up with 10 processing threads per 1 db reader, or 2 readers feeding 10 workers.

Start simple: 1 thread reading, 1 thread processing, 1 thread writing, and time it against the first 1 million records. Then profile: whichever stage turns out to be the bottleneck, be it the reads, the processing, or the writes, is the one to scale out.

And do not pull the rows through a DataAdapter; it materializes the whole result set before you see the first record. Use a DataReader, which streams rows to you as you iterate, so there is no need to page through the data 100K records at a time. See the sketch below.
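
A rough sketch of that pipeline, using BlockingCollection for the hand-off between stages. Record, Transform, and WriteBatches are hypothetical placeholders for your row type, the per-record logic, and batched writes; the table and column names are invented too:

```
using System;
using System.Collections.Concurrent;
using System.Data.SqlClient;
using System.Threading.Tasks;

class Pipeline
{
    static void Run(string connectionString)
    {
        var toProcess = new BlockingCollection<Record>(boundedCapacity: 100000);
        var toWrite   = new BlockingCollection<Record>(boundedCapacity: 100000);

        var reader = Task.Factory.StartNew(() =>
        {
            using (var conn = new SqlConnection(connectionString))
            using (var cmd = new SqlCommand(
                "SELECT Id, Value FROM dbo.SourceRecords", conn))
            {
                conn.Open();
                using (var rdr = cmd.ExecuteReader())
                    while (rdr.Read())   // streams: one row in memory at a time
                        toProcess.Add(new Record(rdr.GetInt32(0), rdr.GetString(1)));
            }
            toProcess.CompleteAdding();
        }, TaskCreationOptions.LongRunning);

        var processor = Task.Factory.StartNew(() =>
        {
            foreach (var rec in toProcess.GetConsumingEnumerable())
                toWrite.Add(Transform(rec));   // your per-record logic
            toWrite.CompleteAdding();
        }, TaskCreationOptions.LongRunning);

        var writer = Task.Factory.StartNew(() =>
        {
            // e.g. buffer into batches and SqlBulkCopy them
            WriteBatches(toWrite.GetConsumingEnumerable());
        }, TaskCreationOptions.LongRunning);

        Task.WaitAll(reader, processor, writer);
    }
}
```

The bounded capacities keep the reader from racing ahead of the slower stages and filling up memory.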

+2

Forget the DataSet and DataAdapter!

For reading, use a DataReader: run your SQL query or Stored Proc through a SqlCommand and call ExecuteReader on it. The DataReader hands you one row at a time instead of buffering the entire result set the way a DataSet does, and it carries none of the mapping overhead of Entity Framework, Linq to SQL, or NHibernate; for raw read speed it is hard to beat.
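
A minimal sketch of that read path (the connection string, table, and column names are assumptions):

```
using System.Data.SqlClient;

string connectionString = "<your connection string>";

// Streams rows one at a time; the 60M-row result set is never
// materialized in memory.
using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("SELECT Id, Payload FROM dbo.SourceRecords", conn))
{
    conn.Open();
    using (var reader = cmd.ExecuteReader())
    {
        while (reader.Read())
        {
            int id = reader.GetInt32(0);
            string payload = reader.GetString(1);
            // per-record processing goes here
        }
    }
}
```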

For writing, use SqlBulkCopy with the TableLock option so the load can be minimally logged, and make sure the target database's recovery model is "Simple" or "Bulk-logged" rather than "Full". Under the "Full" model every row is written to the transaction log, which slows the load down considerably (and balloons the log).

SqlBulkCopy streams the data into SQL Server, and its BatchSize property controls how many rows go over per round trip (experiment to find a good value). Also look at the UseInternalTransaction option of SqlBulkCopy, which commits each batch in its own transaction.
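
Putting those options together; the destination table, the source of the processed rows, and the numbers are assumptions to tune:

```
using System.Data;
using System.Data.SqlClient;

string connectionString = "<your connection string>";
DataTable processedRows = BuildProcessedRows();   // hypothetical helper

var options = SqlBulkCopyOptions.TableLock                // allows minimal logging
            | SqlBulkCopyOptions.UseInternalTransaction;  // one commit per batch

using (var bulk = new SqlBulkCopy(connectionString, options))
{
    bulk.DestinationTableName = "dbo.ProcessedRecords";
    bulk.BatchSize = 10000;     // rows per round trip and per commit; experiment
    bulk.BulkCopyTimeout = 0;   // do not time out a long-running load
    bulk.WriteToServer(processedRows);   // also accepts an IDataReader
}
```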

That way, if something goes wrong partway through, only the current batch is rolled back rather than everything loaded so far; handy when a single bad "poison" record would otherwise abort the entire copy.

It can also pay to drop or disable the indexes on the target tables before the load and rebuild them afterwards (whether it does depends on the table sizes and how long a rebuild takes).
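
If you go that route, the loader can toggle the indexes itself. The index and table names below are hypothetical, and ALTER INDEX ... DISABLE/REBUILD needs SQL Server 2005 or later:

```
using System.Data.SqlClient;

string connectionString = "<your connection string>";

void Exec(string sql)
{
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(sql, conn) { CommandTimeout = 0 })
    {
        conn.Open();
        cmd.ExecuteNonQuery();
    }
}

// Disable the nonclustered index before the load; rebuilding it once at the
// end is often cheaper than maintaining it on every inserted row.
Exec("ALTER INDEX IX_Processed_Value ON dbo.ProcessedRecords DISABLE");

// ... run the SqlBulkCopy load here ...

Exec("ALTER INDEX IX_Processed_Value ON dbo.ProcessedRecords REBUILD");
```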

Once each individual stage runs as fast as it can, the remaining lever is parallelism: split the work into independent "chunks" by key range. A "chunk" might cover records 1 through 100,000, the next one the following 100,000, and so on, with each chunk read, processed, and written on its own. The Parallel class in the Task Parallel Library makes fanning the chunks out straightforward.
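
A sketch of that fan-out with Parallel.For; the chunk size and the ProcessRange helper (which would read one key range, transform it, and bulk-write the results on its own connections) are assumptions:

```
using System;
using System.Threading.Tasks;

const long totalRecords = 60000000;   // from the question
const long chunkSize = 100000;        // tune this
long chunkCount = (totalRecords + chunkSize - 1) / chunkSize;

Parallel.For(0L, chunkCount,
    new ParallelOptions { MaxDegreeOfParallelism = Environment.ProcessorCount },
    chunk =>
    {
        long firstId = chunk * chunkSize + 1;
        long lastId = firstId + chunkSize - 1;
        // Hypothetical helper: SELECT ... WHERE Id BETWEEN firstId AND lastId,
        // process the rows, then SqlBulkCopy the results.
        ProcessRange(firstId, lastId);
    });
```

More writers are not automatically faster; concurrent bulk loads can contend for locks on the target table, so measure before scaling up.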

Bottom line: a DataReader for input, your own code in the middle, and SqlBulkCopy for output will move those 60 million records about as fast as anything you could build.

+2
