Reaching the memory limit slows down a .NET application

We have a 64-bit C# / .NET 3.0 application that runs on a 64-bit Windows server. From time to time the application uses a large amount of the available memory. In some cases it stops allocating additional memory and slows down dramatically (around 500 times slower). When I check memory in Task Manager, the amount used is barely changing. The application keeps running, very slowly, and never throws an out-of-memory exception. Any ideas? Let me know if more data is needed.

+4
source share
5 answers

You can try enabling server mode for the garbage collector. By default, .NET applications run in workstation mode, where the GC tries to run its collections alongside the application. If you enable server mode, the GC temporarily pauses the application so that it can free memory (much) faster, and it also uses a separate heap for each processor/core.

Most server applications will see a performance improvement with server GC mode, especially if they allocate a lot of memory. The downside is that your application essentially stops whenever a collection is in progress (until the GC finishes).

To enable this mode, put the following in app.config or web.config:

 <configuration>
   <runtime>
     <gcServer enabled="true"/>
   </runtime>
 </configuration>
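As a complement to the config above, a minimal sketch (my addition, not part of the original answer) to confirm at runtime which GC mode actually took effect; `GCSettings` lives in the `System.Runtime` namespace and has been available since .NET Framework 2.0:

```csharp
using System;
using System.Runtime;

class GcModeCheck
{
    static void Main()
    {
        // True only if the <gcServer enabled="true"/> setting was picked up.
        Console.WriteLine("Server GC:    " + GCSettings.IsServerGC);
        // Interactive (workstation concurrent) vs. Batch (server/non-concurrent).
        Console.WriteLine("Latency mode: " + GCSettings.LatencyMode);
    }
}
```

Note that the setting is read once at process start; changing the config file requires restarting the application.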
+5
source

When you hit the physical memory limit, the OS starts paging (i.e. writing memory out to disk). That will certainly cause the kind of slowdown you are observing.

The solution?

  • Add more memory - this will help until you hit the new limit
  • Rewrite your application to use less memory
  • Find out whether there is a memory leak and fix it.

If memory is not the problem, could your application be hammering the CPU? Do you see the CPU at 100%? If so, check whether large collections are being iterated over and over again.
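To tell a leak apart from normal growth, one simple approach (a sketch of my own, not from this answer) is to periodically log the size of the live managed heap; a number that keeps climbing across full collections suggests objects are being kept alive unintentionally:

```csharp
using System;
using System.Threading;

class MemoryTrendLogger
{
    static void Main()
    {
        for (int i = 0; i < 5; i++)
        {
            // Passing true forces a full collection first,
            // so the figure reflects live objects only.
            long liveBytes = GC.GetTotalMemory(true);
            Console.WriteLine("Live managed bytes: " + liveBytes);
            Thread.Sleep(1000);
        }
    }
}
```

In a real application you would run this sampling on a timer (or from a perf counter) rather than in a loop, and correlate the trend with what the application was doing at the time.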

+5
source

The other answers contain a lot of good stuff. However, I'm going to chip in my two pence (or cents, depending on where you come from). Anyway...

Assuming that this is indeed a 64-bit process, as you stated, there are several areas of investigation ...

Which memory usage figure are you checking? Mem Usage or VM Size? The VM Size is the one that really matters, as it covers both paged and non-paged memory. If the two numbers are wildly different, then memory usage really is the likely cause of the slowdown.
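The same two figures can be read programmatically instead of from Task Manager; a small sketch (my illustration, assuming the standard `System.Diagnostics.Process` API), where `VirtualMemorySize64` corresponds roughly to VM Size and `WorkingSet64` to Mem Usage:

```csharp
using System;
using System.Diagnostics;

class MemFigures
{
    static void Main()
    {
        Process p = Process.GetCurrentProcess();
        // Total virtual address space in use (paged + non-paged).
        Console.WriteLine("VM size:       " + p.VirtualMemorySize64);
        // Physical memory currently resident for this process.
        Console.WriteLine("Working set:   " + p.WorkingSet64);
        // Memory committed to this process alone (not shared).
        Console.WriteLine("Private bytes: " + p.PrivateMemorySize64);
    }
}
```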

What is the actual memory usage of the whole server when things start to slow down? Does the slowdown affect other applications as well? If so, you may have a kernel memory problem, which can be caused by huge amounts of disk access and low-level resource usage (for example, creating 20,000 mutexes, or loading a few thousand bitmaps via code that uses Win32 HBitmaps). You can get some pointers on this from Task Manager (the Windows 2003 version is more directly informative on this than the 2008 one).

When you say the application becomes much slower, how do you know? Are you using huge dictionaries or lists? Could it be that the internal data structures become so large that they slow down whatever algorithms operate on them? At very large sizes, some algorithms can become slower by an order of magnitude.

How does the CPU load when the application is running at full speed compare to when it has slowed down? If CPU usage drops as memory usage rises, then whatever the application is doing is forcing the OS to work harder, which means it is probably paging. If there is no difference in CPU load, then my guess is that the internal data structures have become so large that they are slowing down your algorithms.

I would probably take a look at the application with Perfmon, starting with some .NET and native memory counters, page faults, cache hits and misses, and disk queue length. Run it over the lifetime of the application, from launch until it starts running like an asthmatic tortoise, and you may well get a clue.
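The same counters Perfmon shows can also be sampled from code; as one example (a sketch of mine, assuming the standard ".NET CLR Memory" counter category is present, which it is on machines with the .NET Framework installed), here is how to read "% Time in GC" for the current process:

```csharp
using System;
using System.Diagnostics;
using System.Threading;

class GcTimeCounter
{
    static void Main()
    {
        string instance = Process.GetCurrentProcess().ProcessName;
        using (var counter = new PerformanceCounter(
            ".NET CLR Memory", "% Time in GC", instance))
        {
            counter.NextValue();      // the first read of a rate counter is always 0
            Thread.Sleep(1000);       // let the counter accumulate a sample
            Console.WriteLine("% Time in GC: " + counter.NextValue());
        }
    }
}
```

A value that stays high (commonly quoted as over ~10%) while the application is slow points at GC pressure rather than paging.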

+2
source

Looking through the other answers, I'd say there are plenty of good ideas. Here's one I haven't seen yet:

Get a memory profiler, for example SciTech's .NET Memory Profiler. It will tell you what is being kept alive, and why, and will show you the biggest memory consumers.

It also has video tutorials in case you don't know how to use it. In my case, I found that I had IDisposable instances that I was not disposing (...)

+1
source

Source: https://habr.com/ru/post/1308804/

