We have a small C# tool that we put together to parse a data file, build some objects, and insert them into the database.

The logic is essentially:
string[] lines = File.ReadAllLines("C:\\Temp\\Data.dat");

foreach (string line in lines)
{
    MyDataObject obj = ParseObject(line);
    myDataContext.MyDataObjects.InsertOnSubmit(obj);
}

myDataContext.SubmitChanges();
This worked fine at first, since the data file was only ~1,000 lines per day, but recently the file has grown to ~30,000 lines and the process has become very slow.
Everything up to the call to SubmitChanges() runs fine, but once it starts pushing the 30,000 inserts to the database, it slows to a crawl. As a test, I generated the 30,000 INSERT statements and ran them directly from QA; that took about 8 minutes. After 8 minutes, the C#/LINQ version had completed only about 25% of the inserts.
Does anyone have any suggestions on how I can optimize this?