Overview
I am facing a performance slowdown while iterating MANY times through a calculator class. There are about 3 million iterations in total. Each iteration takes about 3 mn at the beginning, and they take more and more time as the iteration count grows (30+ mn per iteration). I have to stop the program and restart execution where I left off in order to get back to normal conditions (3 mn per iteration).
WHAT I AM DOING
I have a scientific application that evaluates sets of parameters over a process. I have N scenarios (i.e., parameter combinations), each tested against a set of experiments: a calculator class takes a parameter set as input, processes it under each of T possible experimental (XP) conditions, and stores the output in ORM objects that are submitted to the database after each iteration. In other words, each of the N parameter combinations is run T times through the calculator.
Parameter combinations: Params Set 1, Params Set 2, ..., Params Set N
Experimental sets: XP Set 1, XP Set 2, ..., XP Set T
So I have N x T combinations, with N and T each around 256, which gives 65,000+ iterations.
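To make the structure concrete, here is a minimal C# sketch of the N x T loop described above; all names (ParamsSet, XpSet, Calculator) are illustrative placeholders, not the actual application code.

```csharp
using System;
using System.Linq;

// Minimal sketch of the N x T iteration structure; names are illustrative only.
class ParamsSet { public int Id; }
class XpSet     { public int Id; }

class Calculator
{
    // Stand-in for the real computation (~3 mn per call in the actual application).
    public double Run(ParamsSet p, XpSet xp) => p.Id * 1000.0 + xp.Id;
}

class Program
{
    static void Main()
    {
        var paramsSets = Enumerable.Range(1, 256).Select(i => new ParamsSet { Id = i }).ToList(); // N
        var xpSets     = Enumerable.Range(1, 256).Select(i => new XpSet     { Id = i }).ToList(); // T
        var calculator = new Calculator();

        foreach (var p in paramsSets)        // N parameter combinations
        foreach (var xp in xpSets)           // T XP conditions => N x T = 65,536 iterations
        {
            var result = calculator.Run(p, xp);
            // In the real application the result is pushed into ORM objects
            // and submitted to SQL Server after every single iteration.
        }
    }
}
```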
HOW I DO IT
I have a GUI for setting up the parameter sets and starting the background workers (one per parameter combination). Each background worker loads the first of the T XP sets, runs it against the current parameter set, moves on to the next XP set, and so on. The calculator produces a report after each single iteration (i.e., after each parameter set / XP set pair) and fires an event that populates the .NET LINQ to SQL ORM objects (AgileFX) and stores them in the SQL Server database.
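Here is a rough sketch of the per-worker flow just described. I do not know the AgileFX API, so the calculator, the event, and the database call below are hypothetical stand-ins for the real code.

```csharp
using System;
using System.ComponentModel;

// Rough sketch of one background worker (one per parameter combination).
// Calculator, ResultReady and the ORM call are placeholders, not the real AgileFX code.
class IterationResult { public int XpId; public double Value; }

class Calculator
{
    public event Action<IterationResult> ResultReady;   // fired after each iteration

    public void Run(int paramsSetId, int xpSetId)
    {
        // ... heavy computation (~3 mn in the real application) ...
        ResultReady?.Invoke(new IterationResult { XpId = xpSetId, Value = 42.0 });
    }
}

class Program
{
    static void Main()
    {
        var worker = new BackgroundWorker();
        worker.DoWork += (s, e) =>
        {
            int paramsSetId = (int)e.Argument;          // one parameter combination per worker
            var calculator = new Calculator();
            calculator.ResultReady += result =>
            {
                // Populate the ORM objects and submit them to SQL Server after every iteration
                // (placeholder for the AgileFX / LINQ to SQL submit call).
            };

            for (int xp = 1; xp <= 256; xp++)           // iterate over the T XP sets
                calculator.Run(paramsSetId, xp);
        };

        worker.RunWorkerAsync(1);                        // start with the first parameter set
        Console.ReadLine();                              // keep the demo process alive
    }
}
```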
PROBLEM
The process works fine for the first 30 mn, then slowly starts to drift: each iteration takes longer and longer (sounds like a memory leak or something of the sort ...).
HINT
Oddly enough, the experimenter noticed quite accurately that the processing time increases linearly: each run takes 3 mn more than the previous one, which boils down to an arithmetic progression (T(n+1) = T(n) + 3 mn). I am running this on a 12-core Intel machine with 24 GB of RAM.
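For what it's worth, if that observation holds exactly, the iteration times are T(n) = T(1) + 3(n-1) mn, so the cumulative time for n iterations is n*T(1) + 3*n*(n-1)/2 mn: total runtime grows quadratically with the iteration count, which is why the run becomes unusable long before the 65,000+ iterations are done.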