I have a long-running console application that goes through millions of iterations, and I want to see whether memory usage increases linearly as the number of iterations increases.
What would be the best way to do this?
Am I right that what I really need to worry about is peak memory usage, rather than usage during startup? I basically need to find out the maximum number of iterations I can run on this hardware, given the memory available on the server.
I am planning to set up a batch of runs, record the results for different iteration counts, and then plot the results to determine the memory-usage trend, which I can then extrapolate to any given machine.
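To make the plan concrete, here is a rough sketch of the kind of instrumentation I have in mind for each run: sample memory every N iterations and dump the samples to a CSV I can graph afterwards. `DoWork`, the iteration counts and the sample interval are just placeholders for my real workload.

```csharp
using System;
using System.Diagnostics;
using System.IO;

class MemoryTrendLogger
{
    static void Main()
    {
        const long totalIterations = 1000000; // placeholder batch size
        const long sampleEvery = 10000;       // how often to take a memory sample

        using (var log = new StreamWriter("memory-trend.csv"))
        {
            log.WriteLine("iteration,managedBytes,privateBytes");

            Process proc = Process.GetCurrentProcess();

            for (long i = 1; i <= totalIterations; i++)
            {
                DoWork(i); // placeholder for the real per-iteration work

                if (i % sampleEvery == 0)
                {
                    proc.Refresh();                          // refresh cached process counters
                    long managed = GC.GetTotalMemory(false); // managed heap size, no forced GC
                    long priv = proc.PrivateMemorySize64;    // private bytes for the whole process
                    log.WriteLine("{0},{1},{2}", i, managed, priv);
                }
            }
        }
    }

    static void DoWork(long iteration)
    {
        // stand-in for the application's real work
    }
}
```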
I'm looking for tips on the best way to implement this: which .NET methods or classes should I use, or should I use external tools? This article http://www.itwriting.com/dotnetmem.php suggests profiling my own application from code, so as to account for the memory the .NET runtime shares with other applications on the box.
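Following that article's point about distinguishing managed memory from total process memory, I was thinking of also reading the Windows performance counters from code, roughly like this (I'm assuming the standard ".NET CLR Memory" and "Process" counter categories here, and that the instance name matches the process name):

```csharp
using System;
using System.Diagnostics;

class MemoryCounters
{
    static void Main()
    {
        // Counter instance name is normally the process name without the .exe extension.
        string instance = Process.GetCurrentProcess().ProcessName;

        // Managed heap bytes tracked by the CLR (excludes unmanaged allocations).
        using (var clrHeap = new PerformanceCounter(
            ".NET CLR Memory", "# Bytes in all Heaps", instance))
        // Private bytes: memory not shared with other processes on the box.
        using (var privateBytes = new PerformanceCounter(
            "Process", "Private Bytes", instance))
        {
            Console.WriteLine("CLR heaps:     {0:N0} bytes", clrHeap.NextValue());
            Console.WriteLine("Private bytes: {0:N0} bytes", privateBytes.NextValue());
        }
    }
}
```

Would comparing those two numbers over the run be a reasonable way to tell managed-heap growth apart from overall process growth, or is there a better approach?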
thanks