Our application contains many .NET assemblies that are not yet NGen'd by our deployment scripts, so they are always JIT-compiled at runtime.
Since our application is typically deployed on a terminal server, getting Windows to share the code pages between instances would probably be a win over the current approach, so I'm looking into setting base addresses and NGen'ning the assemblies.
So far, I have run the program without NGen at all and used [listdlls from SysInternals][1] to find the size of each assembly, which I then rounded up to the next boundary (i.e. xxxx → 10000). I then laid out a memory map for all of our assemblies and adjusted their base addresses accordingly.
So far so good; with listdlls I can now verify that none of our assemblies are rebased at runtime.
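For reference, something like the following could also cross-check the load addresses from code rather than listdlls (a minimal sketch, assuming .NET Framework on Windows and a process of the same bitness; "MyApp" is just a placeholder for our executable name):

    // Dump the base address and size of every module in a running instance,
    // with the size rounded up to the next 64 KB boundary, as a cross-check
    // against the listdlls output.
    using System;
    using System.Diagnostics;

    class ModuleMap
    {
        static void Main()
        {
            foreach (var process in Process.GetProcessesByName("MyApp"))
            {
                foreach (ProcessModule module in process.Modules)
                {
                    long size = module.ModuleMemorySize;
                    long rounded = (size + 0xFFFF) & ~0xFFFFL;   // round up to 64 KB
                    Console.WriteLine("{0,-40} base 0x{1:X8}  size 0x{2:X}  (rounded 0x{3:X})",
                        module.ModuleName, module.BaseAddress.ToInt64(), size, rounded);
                }
            }
        }
    }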
But how do I measure how much memory is actually shared between two instances? Basically, say I run two instances of the program without NGen'ing the builds, take measurements, then NGen the builds and measure again.
Which numbers should I look at, and with which tool, to see the actual effect, if any?
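For concreteness, the kind of measurement I have in mind would be something like the sketch below (again assuming .NET Framework on Windows, with "MyApp" as a placeholder for our process name): per instance, total working set minus private working set from the Process performance counters, taken once before NGen and once after. I'm not sure this is the right metric, which is part of what I'm asking.

    // Approximate the shared portion of each instance's working set as
    // "Working Set" minus "Working Set - Private" (the private counter is
    // available on Vista/Server 2008 and later). Instance names will be
    // "MyApp", "MyApp#1", ... when several copies are running.
    using System;
    using System.Diagnostics;

    class SharedWorkingSet
    {
        static void Main()
        {
            var category = new PerformanceCounterCategory("Process");
            foreach (string instance in category.GetInstanceNames())
            {
                if (!instance.StartsWith("MyApp", StringComparison.OrdinalIgnoreCase))
                    continue;

                using (var total = new PerformanceCounter("Process", "Working Set", instance, true))
                using (var priv = new PerformanceCounter("Process", "Working Set - Private", instance, true))
                {
                    float totalWs = total.NextValue();   // bytes
                    float privWs = priv.NextValue();     // bytes
                    Console.WriteLine("{0}: working set {1:N0} KB, private {2:N0} KB, shared ~{3:N0} KB",
                        instance, totalWs / 1024, privWs / 1024, (totalWs - privWs) / 1024);
                }
            }
        }
    }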
For instance, I know that rebasing our own assemblies could cause the third-party assemblies we use (e.g. the DevExpress components) to suddenly be rebased at runtime instead, which would wash out any gain.
So where do I look, and at which numbers? For example, Task Manager's working set? Private bytes? Commit size? Total free memory before and after?
Any tips?