C# massive SQL update request

Here is the script:

I have a text file of 1.8 million rows that I need to load into an SQL table. The code I have works, but it is slow (about 250 thousand rows per day), and I have roughly four text files of this size to load, so I need a way to speed up the process. Any help would be greatly appreciated. If some of the code doesn't look quite right, it's because I had to omit a few things for confidentiality. I know I could cut the File.AppendAllText call, but I use it for tracking, and I also increment a counter (start++) so I can pick up the next day without restarting the load.

DirectoryInfo dinfo = new DirectoryInfo(ocrdirectory);
FileInfo[] files = dinfo.GetFiles("*.txt");
int start = 0;
foreach (FileInfo filex in files)
{
    string[] primaryfix = File.ReadAllLines(filex.FullName);
    string filename = filex.ToString();
    string[] splitfilename = filename.Split('.');
    foreach (string primary in primaryfix)
    {
        string sqltable = "dbo.amu_Textloadingarea";
        string sql = "update " + sqltable +
                     " set [Text] = [Text] + '" + primary + "|" +
                     "' where unique = '" + splitfilename[0] + "'";
        File.AppendAllText(@"C:\convert\sqltest.txt", sql + "\n");
        SqlConnection con = new SqlConnection("Data Source=Cote;Initial Catalog=eCASE;Integrated Security=SSPI");
        con.Open();
        SqlCommand cmd = new SqlCommand(sql, con);
        SqlDataReader reader = cmd.ExecuteReader();
        con.Close();
        Console.WriteLine(start);
        start++;
    }
}

Use SSIS or bcp for this; row-by-row updates over the network will never be fast.

Broadly, the idea is:

  • Bulk-load each text file into a staging table in the db.
  • Do the update as a single set-based statement inside the db (joining against the staging table).
  • ...

Either way, the point is to stop making a round trip to the server for every row.
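Staying in C#, the same bulk-load idea is available without SSIS or the bcp command line via the `SqlBulkCopy` class. The sketch below is hypothetical: the staging table `dbo.amu_Staging`, its `FileKey`/`Line` columns, and the file path are placeholders I introduced, not names from the question; only the connection string is copied from it.

```csharp
// Sketch of the bulk-load approach from C#, using SqlBulkCopy instead of bcp.
// Staging table and column names are hypothetical placeholders.
using System;
using System.Data;
using System.Data.SqlClient;
using System.IO;

class BulkLoadSketch
{
    static void Main()
    {
        // Build an in-memory table matching the staging table's shape.
        var table = new DataTable();
        table.Columns.Add("FileKey", typeof(string));
        table.Columns.Add("Line", typeof(string));

        foreach (string line in File.ReadLines(@"C:\convert\file1.txt"))
            table.Rows.Add("file1", line);

        using (var con = new SqlConnection("Data Source=Cote;Initial Catalog=eCASE;Integrated Security=SSPI"))
        {
            con.Open();
            using (var bulk = new SqlBulkCopy(con))
            {
                bulk.DestinationTableName = "dbo.amu_Staging"; // hypothetical staging table
                bulk.BatchSize = 10000;                        // send rows in batches
                bulk.WriteToServer(table);                     // one bulk operation, not 1.8M UPDATEs
            }
        }
    }
}
```

After the load, a single set-based UPDATE joining the target table to the staging table does the concatenation on the server side.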


Look at SSIS (SQL Server Integration Services), which ships with SQL Server. SSIS is built for exactly this kind of bulk load, and it is fast.


A few points on the loop itself (aside from the bigger design issues). For an UPDATE, call cmd.ExecuteNonQuery() instead of ExecuteReader(); there is no result set to read. Also, create the SqlConnection and the SqlCommand once, outside the loop, and just assign the new SQL to .CommandText on each iteration, rather than opening a connection and building a command per row.
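A minimal sketch of that advice, under the assumption that the connection string and table from the question are in play (the `@key` value `"file1"` and the file path are hypothetical). Using parameters instead of string concatenation also removes the SQL-injection and quoting problems of the original:

```csharp
// Sketch: open one connection, prepare one parameterized command,
// and reuse it for every row with ExecuteNonQuery().
using System;
using System.Data;
using System.Data.SqlClient;
using System.IO;

class ReuseCommandSketch
{
    static void Main()
    {
        string[] lines = File.ReadAllLines(@"C:\convert\file1.txt"); // hypothetical path
        using (var con = new SqlConnection("Data Source=Cote;Initial Catalog=eCASE;Integrated Security=SSPI"))
        {
            con.Open();
            // "unique" is a reserved word in T-SQL, so it must be bracketed.
            using (var cmd = new SqlCommand(
                "update dbo.amu_Textloadingarea set [Text] = [Text] + @chunk where [unique] = @key", con))
            {
                cmd.Parameters.Add("@chunk", SqlDbType.NVarChar);
                cmd.Parameters.Add("@key", SqlDbType.NVarChar);
                cmd.Parameters["@key"].Value = "file1"; // hypothetical file key

                foreach (string line in lines)
                {
                    cmd.Parameters["@chunk"].Value = line + "|";
                    cmd.ExecuteNonQuery(); // no reader needed for an UPDATE
                }
            }
        }
    }
}
```

This alone saves the per-row connection open/close, though a set-based approach (see the bulk-load answers) will still be far faster.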


Source: https://habr.com/ru/post/1769156/
