Is it possible to insert a large amount of data using LINQ to SQL?

I need to insert a large amount of data into SQL Server 2008. My project is based on LINQ to SQL.

I am processing a CSV file with 100,000 lines. Each line is mapped to an Order object; Order also contains a set of Category and Code objects. I need to map each line to an object in order to validate it.

Then I need to insert all these objects into the database.

List<Order> orders = Import("test.csv");
db.Orders.InsertAllOnSubmit(orders);
db.SubmitChanges();

OR

foreach (Order order in orders)
{
    db.Orders.InsertOnSubmit(order);
}
db.SubmitChanges();

Both ways are slow. Is there a workaround? I can use a different approach than LINQ to SQL for this task.

I have read about the SqlBulkCopy class: would it also handle inserting the child objects?

+3

4 answers

You can batch the submits so that each SubmitChanges call handles a smaller set of objects:
foreach (List<Order> orderbatch in orders.Batch(100))
{
    db.Orders.InsertAllOnSubmit(orderbatch);
    db.SubmitChanges();
}


// Splits a sequence into lists of at most batchSize items.
// Requires System.Linq for the Any() call.
public static IEnumerable<List<T>> Batch<T>(this IEnumerable<T> source, int batchSize)
{
    List<T> result = new List<T>();
    foreach (T t in source)
    {
        result.Add(t);
        if (result.Count == batchSize)
        {
            yield return result;
            result = new List<T>();
        }
    }
    // Yield the final, possibly smaller, batch.
    if (result.Any())
    {
        yield return result;
    }
}
+2

@Brian, LINQ to SQL is not built for bulk loading: it issues a separate INSERT statement for every object, which is why both of your approaches are slow at this volume.

For loading data in bulk, use SqlBulkCopy instead, or pass the rows to a stored procedure as a table-valued parameter (supported since SQL Server 2008).
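As a rough sketch of the SqlBulkCopy route (the table name "Orders", the column list, and connectionString are assumptions for illustration, not taken from the question):

```csharp
// Sketch: bulk-load the parent Order rows in one streamed operation.
// Assumes a destination table dbo.Orders with matching column names.
using System.Data;
using System.Data.SqlClient;

var table = new DataTable();
table.Columns.Add("Id", typeof(int));
table.Columns.Add("Code", typeof(string));

foreach (Order order in orders)
{
    table.Rows.Add(order.Id, order.Code);
}

using (var bulk = new SqlBulkCopy(connectionString))
{
    bulk.DestinationTableName = "dbo.Orders";
    bulk.BatchSize = 5000; // rows sent per round trip
    bulk.WriteToServer(table);
}
```

Note that SqlBulkCopy writes to a single table per call, so it does not cascade to child objects: the Category and Code rows would need their own DataTable and WriteToServer pass, with the foreign-key values assigned up front.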

+2

For parsing the CSV file, this fast CSV reader works well: http://www.codeproject.com/KB/database/CsvReader.aspx

For getting a large amount of data into SQL Server quickly, SqlBulkCopy is the tool designed for the job.

LINQ to SQL is an object-relational mapper; bulk loading is simply not what it was built for.

0

In my experience, calling SubmitChanges in batches of about 1000 objects gives the best results.

Performance is a balance between two extremes: excessive memory use from tracking all 100,000 objects in one DataContext on one side, and the time spent creating a session and reconnecting to the database on the other.

By the way, there is no significant difference between session.InsertAllOnSubmit(data) and foreach (var i in data) session.InsertOnSubmit(i).
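A minimal sketch of that batching pattern, assuming a generated LINQ to SQL context called MyDataContext and a connectionString variable (both placeholder names):

```csharp
// Sketch: submit every 1000 orders on a fresh DataContext so the
// change tracker never holds all 100,000 objects at once.
using System.Linq;

const int batchSize = 1000;

for (int i = 0; i < orders.Count; i += batchSize)
{
    using (var db = new MyDataContext(connectionString))
    {
        // Take the next slice of up to 1000 orders.
        var batch = orders.Skip(i).Take(batchSize);
        db.Orders.InsertAllOnSubmit(batch);
        db.SubmitChanges();
    }
}
```

Disposing the context after each batch discards the tracked entities, which keeps memory use flat at the cost of a new connection per batch.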

0

Source: https://habr.com/ru/post/1754618/

