How to handle large inserts in SQL Server?

I need to run a series of queries as part of a migration project. The scripts are generated by a tool that analyzes the legacy database and produces a script mapping each of the old objects to the corresponding new record. The scripts work fine for small objects, but some of them contain hundreds of thousands of records and come out as script files of roughly 80 MB.

What is the best way to run these scripts?

Is it SQLCMD from the command line, which seems better suited to larger scripts?

I could also break the scripts into smaller ones, but I don't want to have to kick off hundreds of files to complete the migration.
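For concreteness, this is roughly the kind of invocation I mean; the server, database and file names below are placeholders:

    rem Run one generated script with Windows authentication; stop on the first error.
    rem MYSERVER, NewDb and the file names are placeholders.
    sqlcmd -S MYSERVER -d NewDb -E -b -i migration_part01.sql -o migration_part01.log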

+3
5 answers

If at all possible, use BULK INSERT.

Rather than generating an INSERT statement per record, have the tool dump the data to flat files and load those with BULK INSERT; it will be far faster than running the scripts.
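A minimal sketch of what that could look like, assuming the tool can write a tab-delimited file; the table name, path and options below are illustrative, not from the answer:

    -- Load a tab-delimited export directly into the target table.
    -- dbo.NewCustomer and the file path are illustrative names.
    BULK INSERT dbo.NewCustomer
    FROM 'C:\migration\NewCustomer.txt'
    WITH (
        FIELDTERMINATOR = '\t',  -- tab between columns
        ROWTERMINATOR   = '\n',  -- newline between rows
        BATCHSIZE       = 10000, -- commit every 10 000 rows to keep the log in check
        TABLOCK                  -- table lock can allow a minimally logged load
    );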

+2

BULK INSERT or BCP can both load from flat files, with a format file (including an XML format file) describing the layout. Do any of the target tables need SET IDENTITY_INSERT ON so the old key values can be kept? If the mapping from old objects to new records is more involved, an SSIS package can do the transformation and the load in one pass. You can also combine plain SQL with SSIS or BCP: bulk-load the raw data into staging tables (with SSIS or bcp), then do the mapping in set-based SQL, using OUTPUT INTO to capture the keys of the rows you migrate.
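As a rough sketch of that staged approach, assuming the raw data has already been bulk-loaded into a staging table; every table and column name here is invented for illustration:

    -- Keep the legacy key values on the identity column of the new table
    -- and record which keys were migrated. All names are illustrative,
    -- and CustomerId is assumed to be the identity column of dbo.NewCustomer.
    CREATE TABLE #MigratedIds (CustomerId int NOT NULL);

    SET IDENTITY_INSERT dbo.NewCustomer ON;

    INSERT INTO dbo.NewCustomer (CustomerId, Name, Email)
    OUTPUT inserted.CustomerId INTO #MigratedIds (CustomerId)
    SELECT OldId, OldName, OldEmail
    FROM staging.OldCustomer;

    SET IDENTITY_INSERT dbo.NewCustomer OFF;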

+1

If you are stuck with the generated INSERT scripts, at least keep the batches small: wrap groups of statements in a transaction and put a GO every few thousand rows, so you are never parsing and logging one enormous batch.

That said, a proper ETL route (DTS, SSIS, BCP, or BULK INSERT FROM a file) will be much faster than any INSERT script.

If you do end up splitting the big script into smaller files, PowerShell can automate both the splitting and running each piece, so you are not launching hundreds of scripts by hand.
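If the generated INSERT scripts are kept, the batching described above might look something like this; the table, columns and values are made up:

    -- One chunk of the generated file: a bounded transaction, then a batch separator.
    BEGIN TRANSACTION;
    INSERT INTO dbo.NewCustomer (CustomerId, Name) VALUES (1, N'Alice');
    INSERT INTO dbo.NewCustomer (CustomerId, Name) VALUES (2, N'Bob');
    -- ...more rows of this chunk...
    COMMIT TRANSACTION;
    -- GO ends the batch so each chunk is parsed and executed on its own.
    GO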

0

We went through the same thing with a generated script. Ours inserted/updated roughly 100 000 records and took around 30 minutes to run.

It worked, but it was painfully slow.

A tab-delimited dump (the equivalent of mysqldump --tab / SELECT ... INTO OUTFILE on the MySQL side) bulk-loads in a fraction of that time, so export to files if you can.

0

"BULK INSERT" , . CSV? , , , , / , .

0

Source: https://habr.com/ru/post/1698038/

