How to optimize importing 100,000 records into SQL Server with 100 concurrent users

I have an ASP.NET MVC web application written in C# in Visual Studio. I need to import an Excel file of over 100,000 records into SQL Server 2008 R2.

I use SqlBulkCopy with an ADO.NET DataTable to copy the data into a table (Table1).
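
A minimal sketch of this step, assuming a hypothetical connection string and column names (the real mapping would come from the 50-column Excel file); the BatchSize and timeout values are only illustrative:

    using System.Data;
    using System.Data.SqlClient;

    public static class ImportStage
    {
        // Bulk-copy a DataTable into the staging table Table1.
        public static void BulkLoad(DataTable data, string connectionString)
        {
            using (var connection = new SqlConnection(connectionString))
            {
                connection.Open();
                using (var bulkCopy = new SqlBulkCopy(connection))
                {
                    bulkCopy.DestinationTableName = "dbo.Table1";
                    bulkCopy.BatchSize = 5000;        // send rows in batches rather than one 100k-row batch
                    bulkCopy.BulkCopyTimeout = 300;   // seconds; the default 30 s is tight for a large file

                    // Explicit column mappings (hypothetical column names for illustration).
                    bulkCopy.ColumnMappings.Add("CustomerCode", "CustomerCode");
                    bulkCopy.ColumnMappings.Add("Amount", "Amount");

                    bulkCopy.WriteToServer(data);
                }
            }
        }
    }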

The data is then verified and processed, and each record is marked as valid or invalid.

Records are then inserted into and updated in three tables (Table2, Table3 and Table4) from Table1 using a stored procedure.
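
For context, this step might be invoked from C# roughly as follows; the procedure name dbo.ProcessImport and the @ImportId parameter are hypothetical, and the extended CommandTimeout matters because the default 30 seconds is shorter than the runtimes reported below:

    using System.Data;
    using System.Data.SqlClient;

    public static class ImportProcessing
    {
        // Run the validation/insert/update procedure against the rows staged in Table1.
        public static void RunProcessing(string connectionString, int importId)
        {
            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand("dbo.ProcessImport", connection))
            {
                command.CommandType = CommandType.StoredProcedure;
                command.CommandTimeout = 600; // seconds; the procedure can run for minutes
                command.Parameters.Add("@ImportId", SqlDbType.Int).Value = importId;

                connection.Open();
                command.ExecuteNonQuery();
            }
        }
    }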

Table4 has two triggers that use a cursor to update several related records in Table4 itself and in another table (Table2).

In testing, importing 10,000 records takes about 1 minute on my local machine in debug mode.

Will this hold up if 10 or 100 users run imports at the same time?

Is there a better process that would improve performance and support many concurrent users?

My computer configuration:

  • Windows 10 64Bit
  • 3.6 GHz AMD FX quad-core processor
  • 8 GB RAM

  • The Excel file contains 50 columns

  • BulkCopy takes 2 - 3 seconds
  • The stored procedure takes from 30 seconds to 2 minutes.

When I test with 5 concurrent users, only 2 or 3 of the processes succeed and the rest fail with this error:

Transaction (Process ID 66) was deadlocked on lock | communication buffer resources with another process and has been chosen as the deadlock victim. Rerun the transaction.
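
Error 1205 means SQL Server chose one of the competing sessions as the deadlock victim and rolled it back. Until the contention itself is removed (the cursor, the triggers, the shared Table1), the usual mitigation is to rerun the victim a few times. A sketch, assuming the whole per-user import step can safely be re-executed:

    using System;
    using System.Data.SqlClient;
    using System.Threading;

    public static class DeadlockRetry
    {
        // Retry a unit of work when it is chosen as a deadlock victim (SQL Server error 1205).
        public static void Execute(Action work, int maxAttempts = 3)
        {
            for (int attempt = 1; ; attempt++)
            {
                try
                {
                    work();
                    return;
                }
                catch (SqlException ex)
                {
                    if (ex.Number != 1205 || attempt >= maxAttempts)
                        throw;

                    // Back off briefly before rerunning the transaction, as the error suggests.
                    Thread.Sleep(TimeSpan.FromSeconds(attempt));
                }
            }
        }
    }

Usage would be something like DeadlockRetry.Execute(() => RunProcessing(connectionString, importId)); the retried step must be idempotent or wrapped in its own transaction for this to be safe.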


Take a look at the ZZZ Projects libraries; they provide bulk insert/update extensions for Entity Framework and Dapper.

+2
  • How does the stored procedure apply the changes to the target tables: separate INSERT and UPDATE statements, or a single MERGE?

  • Replace the cursor with set-based logic (for example a CTE, a WHILE loop, or a MERGE statement); if a cursor is unavoidable, declare it FAST_FORWARD.

  • Consider doing the import with an SSIS package.

  • Bulk-load into staging tables (table2_temp, table3_temp, table4_temp) and then use MERGE to apply the changes to table2, table3 and table4 in a single set-based pass; see the sketch after this list.
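
A sketch of the staging-and-MERGE idea from the last bullet, reduced to a single target table; the table and column names (table2_temp, table2, Id, Value) are assumptions about the schema, and the MERGE is sent from C# right after the bulk copy:

    using System.Data;
    using System.Data.SqlClient;

    public static class StageAndMerge
    {
        // Bulk-copy into a staging table, then apply one set-based MERGE
        // instead of updating rows one by one from a cursor.
        public static void Run(DataTable rows, string connectionString)
        {
            using (var connection = new SqlConnection(connectionString))
            {
                connection.Open();

                using (var bulkCopy = new SqlBulkCopy(connection))
                {
                    bulkCopy.DestinationTableName = "dbo.table2_temp";
                    bulkCopy.WriteToServer(rows);
                }

                const string mergeSql = @"
                    MERGE dbo.table2 AS target
                    USING dbo.table2_temp AS source
                        ON target.Id = source.Id
                    WHEN MATCHED THEN
                        UPDATE SET target.Value = source.Value
                    WHEN NOT MATCHED BY TARGET THEN
                        INSERT (Id, Value) VALUES (source.Id, source.Value);";

                using (var command = new SqlCommand(mergeSql, connection))
                {
                    command.CommandTimeout = 600; // merging 100k rows can take a while
                    command.ExecuteNonQuery();
                }
            }
        }
    }

With concurrent imports, each run would need its own staging table (or an ImportId column in a shared one) so that users do not merge each other's rows.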


Source: https://habr.com/ru/post/1684519/

