Garbage collection modes: if there are two applications on the server, does "server mode" mean robbing Peter to pay Paul?

This is a two-part question (if necessary, I can split it into separate questions so each part can get its own answer):

We are in a situation where server GC mode is probably appropriate: we have two enterprise-level applications running on the same farm. If I enable server mode in both applications' configuration, do I run the risk of stalling one application every time the other one garbage-collects?
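For concreteness, "setting both applications to server mode" here means adding the gcServer element to each application's configuration file; a minimal sketch (the file itself is a placeholder, but the gcServer element is the documented .NET Framework setting):

    <!-- app.config / web.config for each application -->
    <configuration>
      <runtime>
        <!-- switch the process from workstation GC to server GC -->
        <gcServer enabled="true"/>
      </runtime>
    </configuration>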

Will network load balancing shift traffic away from a machine that is in the middle of a GC (again, in server mode)?

EDIT: I have broken the load-balancing part out into its own question: Is the web farm load balancer affected by GC?

+6
3 answers

From MSDN (emphasis mine):

Workstation is the default GC mode and the only one available on single-processor computers. Workstation GC is hosted in console and Windows Forms applications. It performs full (generation 2) collections concurrently with the running program, thereby minimizing latency. This mode is useful for client applications, where perceived performance is usually more important than raw throughput.

Server GC is available only on multiprocessor computers. It creates a separate managed heap and thread for each processor and performs collections in parallel. During collection, all managed threads are paused (threads running native code are paused only when the native call returns). In this way, server GC mode maximizes throughput (the number of requests per second) and improves performance as the number of processors increases. Computers with four or more processors see the greatest benefit. All managed applications that use the Lync Server application API must use server GC.

In server mode, the GC periodically runs on all processors in parallel, which can starve other applications of CPU time during a collection.

So server mode can improve your application's performance, but it can also degrade the performance of other applications running on the same server. This is all fairly speculative, though - I think you will need to do some benchmarking to get an accurate picture of the speed and throughput of your applications.
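Before benchmarking, it is worth confirming which GC mode each process actually ended up in. A minimal C# check, assuming .NET Framework 4.5 or later (where GCSettings.IsServerGC is available):

    using System;
    using System.Runtime;

    static class GcModeCheck
    {
        static void Main()
        {
            // True when the process is running under server GC
            // (i.e. <gcServer enabled="true"/> took effect); false for workstation GC.
            Console.WriteLine("Server GC:    {0}", GCSettings.IsServerGC);

            // Current latency mode, which also hints at the active configuration
            // (e.g. Batch for non-concurrent server GC, Interactive for concurrent workstation GC).
            Console.WriteLine("Latency mode: {0}", GCSettings.LatencyMode);
        }
    }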

+1

If you are concerned about two enterprise applications affecting each other's performance, you should move them onto two separate virtual machines.

The GC is optimized and runs on its own thread(s). It is designed to be as invisible as possible to the running application. So on a multiprocessor enterprise server, a separate process should hardly be hurt at all.

On the other hand, the server still takes on some extra workload from the GC. If you think the GC is somehow slowing your applications down, do some memory and CPU profiling to see where the problem actually is; you may find a way to optimize the code and use fewer resources.
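As a lightweight first pass before reaching for a full profiler, you can watch the counters the runtime already exposes in-process. A rough sketch using only built-in GC APIs (the 5-second sampling interval and console output are arbitrary choices):

    using System;
    using System.Threading;

    static class GcMonitor
    {
        static void Main()
        {
            while (true)
            {
                // Cumulative collection counts per generation since process start;
                // a fast-growing gen 2 count is usually the expensive one to investigate.
                Console.WriteLine(
                    "gen0={0} gen1={1} gen2={2} heap={3:N0} bytes",
                    GC.CollectionCount(0),
                    GC.CollectionCount(1),
                    GC.CollectionCount(2),
                    GC.GetTotalMemory(forceFullCollection: false));

                Thread.Sleep(5000); // sample every 5 seconds
            }
        }
    }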

From Jeffrey Richter, "CLR via C#", 3rd edition, p. 585:

This mode fine-tunes the garbage collector for server-side applications. The garbage collector assumes that no other applications (client or server) are running on the machine, and it assumes that all the CPUs on the machine are available to assist in garbage collection. This GC mode splits the managed heap into several sections, one per CPU. When a garbage collection is initiated, the garbage collector has one thread per CPU; each thread collects its own section in parallel with the other threads. Parallel collections work well for server applications in which the worker threads tend to exhibit uniform behavior. This feature requires the application to be running on a computer with multiple CPUs so that the threads can truly execute simultaneously to achieve a performance improvement.
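If the worry is precisely that server GC "assumes all processors are available", note that later runtimes let you cap how many heaps/threads it creates. A hedged config sketch: to my knowledge the GCHeapCount element requires .NET Framework 4.6.2 or later (newer .NET has an equivalent runtimeconfig setting), so treat it as an assumption to verify against your runtime version:

    <!-- app.config sketch: run server GC, but limit it to 4 heaps/GC threads
         instead of one per logical processor, leaving headroom for the other
         application on the box. gcServer is standard; GCHeapCount is assumed
         to need .NET Framework 4.6.2+. -->
    <configuration>
      <runtime>
        <gcServer enabled="true"/>
        <GCHeapCount enabled="4"/>
      </runtime>
    </configuration>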

+3

I have seen memory allocation run away due to lazy GC and apparently affect unrelated processes before. However, server mode does not seem to do that (hog system memory by reducing the collection rate), so you should be fine.

0

Source: https://habr.com/ru/post/894209/
