Looping 100K times, where each stored procedure call makes at least one cross-process and/or cross-network round trip, will be slow.
If you are using SQL Server, another option is table-valued parameters (TVPs), which avoid calling the insert in a loop from your C#. A TVP lets you pass an entire data table to a stored procedure in a single call.
From the link above, the documentation recommends about 1,000 rows at a time (but always measure and experiment for your application):
Using table-valued parameters is comparable to other ways to use set-based variables; however, using table-valued parameters frequently can be faster for large data sets. Compared to bulk operations that have a greater startup cost than table-valued parameters, table-valued parameters perform well for inserting fewer than 1000 rows.
So maybe try looping 100 times, passing 1,000 rows per call (instead of crossing the boundary 100K times).
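A minimal sketch of that batched TVP approach in C# with ADO.NET. All names here (`dbo.ItemTableType`, `dbo.InsertItems`, the `Items` table, and the column layout) are assumptions for illustration; match them to your own schema:

```csharp
// Run once on the server first (assumed names, for illustration):
//
//   CREATE TYPE dbo.ItemTableType AS TABLE (Id INT, Name NVARCHAR(100));
//
//   CREATE PROCEDURE dbo.InsertItems @Items dbo.ItemTableType READONLY
//   AS
//   BEGIN
//       INSERT INTO dbo.Items (Id, Name)
//       SELECT Id, Name FROM @Items;
//   END
using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

static class BatchInserter
{
    public static void InsertInBatches(IReadOnlyList<(int Id, string Name)> items,
                                       string connectionString,
                                       int batchSize = 1000)
    {
        using var connection = new SqlConnection(connectionString);
        connection.Open();

        for (int offset = 0; offset < items.Count; offset += batchSize)
        {
            // Build a DataTable whose columns match dbo.ItemTableType.
            var table = new DataTable();
            table.Columns.Add("Id", typeof(int));
            table.Columns.Add("Name", typeof(string));
            int end = Math.Min(offset + batchSize, items.Count);
            for (int i = offset; i < end; i++)
                table.Rows.Add(items[i].Id, items[i].Name);

            // One round trip per 1,000-row batch instead of one per row.
            using var command = new SqlCommand("dbo.InsertItems", connection)
            {
                CommandType = CommandType.StoredProcedure
            };
            var parameter = command.Parameters.AddWithValue("@Items", table);
            parameter.SqlDbType = SqlDbType.Structured;
            parameter.TypeName = "dbo.ItemTableType";
            command.ExecuteNonQuery();
        }
    }
}
```

With 100K items this makes roughly 100 calls instead of 100K; `SqlDbType.Structured` plus `TypeName` is what tells ADO.NET to send the `DataTable` as a TVP.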
You might also reconsider why your ASP.NET application has 100K items at once. Is it all transferred to the server and held in memory at once, with potential memory problems? Could it be broken up into smaller chunks? Are you doing data processing where ASP.NET reads and processes 100K rows but a SQL Server Agent job might be more suitable? If you provide more detail about your application's data flow and what it does, people can offer more options.