Our ASP.NET 2.0 web application ran successfully on Windows Server 2003. We began to run into environmental limitations, such as memory spikes and high CPU utilization, and as we prepared to scale we decided it was time for a larger server with higher availability.
We decided to upgrade to Windows Server 2008 to take advantage of IIS 7's configuration improvements. In our development and integration environment we reproduced the OS and application on 2008 / IIS 7, and everything looked fine. In truth, though, we don't yet have a good way to model our traffic load, and we can't accurately reproduce our production environment (we're a small shop with limited resources). So as soon as we went into production, we were surprised to find that performance was much worse on 2008 than it had been on 2003.
In the process we also switched from a 32-bit environment to a 64-bit one, and we added an ASP.NET 3.5 DLL to the project.
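In case the bitness change matters for diagnosing this: in IIS 7 the 32/64-bit behavior is a per-pool flag. As a rough sketch (the pool name "MyAppPool" is a placeholder for ours), this is how we check it with the Microsoft.Web.Administration API:

    using System;
    using Microsoft.Web.Administration; // IIS 7 management API (Microsoft.Web.Administration.dll)

    class CheckPoolBitness
    {
        static void Main()
        {
            using (ServerManager manager = new ServerManager())
            {
                // "MyAppPool" is a placeholder for our real pool name.
                ApplicationPool pool = manager.ApplicationPools["MyAppPool"];

                // false = the pool runs as a native 64-bit w3wp.exe,
                // true  = the pool runs 32-bit under WOW64.
                Console.WriteLine(pool.Enable32BitAppOnWin64);
            }
        }
    }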
Memory usage is through the roof, but I'm not too worried about that. We believe it is partly just the memory overhead of Server 2008, so we need to throw more RAM at it. The real pain is that we're seeing CPU spikes of up to 99% utilization, which we never saw in the 2003 / IIS 6 environment.
Has anyone run into these problems before, and are there any suggestions for a fix or places to look? Right now we are doing the following:
1) Buying time by adding memory.
2) Buying time by setting application pool limits: kill w3wp.exe when CPU hits 99%. Since there is no built-in way to restart the pool automatically afterwards, I have a scheduled task that restarts any stopped application pools (roughly as sketched after this list).
3) Experimenting with the application pool pipeline mode (Classic vs. Integrated), also shown in the sketch below.
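For reference, here is roughly how the stop-gaps in (2) and (3) are wired up. This is only a sketch using the Microsoft.Web.Administration API; the pool name "MyAppPool", the 5-minute reset interval, and the exact limit values are placeholders for our real settings, and in practice the configuration part is done once while only the restart loop runs from the scheduled task:

    using System;
    using Microsoft.Web.Administration; // IIS 7 management API (Microsoft.Web.Administration.dll)

    class AppPoolWatchdog
    {
        static void Main()
        {
            using (ServerManager manager = new ServerManager())
            {
                // (2) Kill w3wp.exe when the pool exceeds ~99% CPU over a
                // 5-minute interval. Cpu.Limit is in 1/1000ths of a percent.
                ApplicationPool pool = manager.ApplicationPools["MyAppPool"]; // placeholder name
                pool.Cpu.Limit = 99000;
                pool.Cpu.Action = ProcessorAction.KillW3wp;
                pool.Cpu.ResetInterval = TimeSpan.FromMinutes(5);

                // (3) Pipeline mode is flipped per pool while we experiment.
                pool.ManagedPipelineMode = ManagedPipelineMode.Classic; // or Integrated

                manager.CommitChanges();

                // Run from the scheduled task: restart anything that got stopped.
                foreach (ApplicationPool p in manager.ApplicationPools)
                {
                    if (p.State == ObjectState.Stopped)
                    {
                        p.Start();
                    }
                }
            }
        }
    }

The same settings can of course be applied through IIS Manager or appcmd; the code is just to show exactly what we are setting.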
Larry Dallas