Profiling SQL Server and/or ASP.NET

How would you go about profiling the queries launched from an ASP.NET application? Some software I work on is very slow, and I suspect the database is the cause. The tables have indexes, but queries are still sluggish because they work with a lot of data. How can I profile it to see where a few small improvements might yield large gains in speed?

Edit: I would like to add that the web server tends to time out during these long requests.

+4
5 answers

SQL Server has great tools to help you in this situation. They are built into Management Studio (formerly Enterprise Manager + Query Analyzer).

Use SQL Profiler to see the actual queries coming from the web application.

Copy each problematic query (the ones consuming a lot of CPU time or IO). Run those queries with "Show Actual Execution Plan" turned on. Hopefully you will see an obvious index that is missing.
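If you would rather not run a trace, a similar list can be pulled from the plan cache. A minimal sketch, assuming SQL Server 2005 or later (where these dynamic management views exist):

-- Ten cached statements with the highest average CPU time (microseconds).
SELECT TOP 10
    qs.total_worker_time / qs.execution_count AS avg_cpu_time,
    qs.total_logical_reads / qs.execution_count AS avg_logical_reads,
    qs.execution_count,
    SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
        ((CASE qs.statement_end_offset
              WHEN -1 THEN DATALENGTH(st.text)
              ELSE qs.statement_end_offset
          END - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY avg_cpu_time DESC;

Order by avg_logical_reads instead to surface the IO-heavy queries.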

You can also launch the Database Engine Tuning Advisor (its button is next to "Actual Execution Plan"). It will run the query and make suggestions.

Usually, if you already have indexes and the queries are still slow, you will have to rewrite the queries differently.

Storing all your queries in stored procedures makes this easy.
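For illustration only (the table and procedure names here are invented), moving an ad hoc query into a procedure looks like this:

CREATE PROCEDURE dbo.GetOrdersForCustomer
    @CustomerId INT
AS
BEGIN
    SET NOCOUNT ON;
    -- The query text now lives in one place on the server,
    -- where it can be profiled and tuned independently of the app.
    SELECT OrderId, OrderDate, TotalAmount
    FROM dbo.Orders
    WHERE CustomerId = @CustomerId;
END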

+5

To profile SQL Server, use SQL Profiler.

And you can use the ANTS Profiler from Red Gate to profile your code.

+4

Another .NET profiler that plays well with ASP.NET is dotTrace. I have used it personally and found many bottlenecks in my code.

+3

I believe you already have the answer you need for profiling the queries. However, that is the easy part of performance tuning. Once you know it is the queries, not the network or the application, how do you find and fix the problem?

Performance tuning is a tricky thing. But there are places to look first. You say you are returning a lot of data: are you returning more data than you need? Are you really returning only the columns and records you need? Returning 100 columns with SELECT * can be much slower than returning the 5 columns you actually use.
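A hypothetical before/after (dbo.Orders and its columns are invented for the example):

-- Slow: drags every column across the wire, including ones never shown.
SELECT * FROM dbo.Orders WHERE CustomerId = 42;

-- Faster: only the columns the page actually uses, which may also let
-- the optimizer answer the query from a narrow covering index.
SELECT OrderId, OrderDate, Status, TotalAmount
FROM dbo.Orders
WHERE CustomerId = 42;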

Are your indexes and statistics up to date? Look up how to update statistics and reindex in BOL (Books Online) if you are not already familiar with it. Do you have indexes on all the join fields? What about the fields in the WHERE clause?
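A sketch, again with a hypothetical dbo.Orders table (check the exact commands for your version in BOL):

-- Refresh the optimizer's statistics for one table.
UPDATE STATISTICS dbo.Orders;

-- Rebuild its indexes (SQL Server 2005+ syntax; older versions use DBCC DBREINDEX).
ALTER INDEX ALL ON dbo.Orders REBUILD;

-- Cover a column used in joins or WHERE clauses.
CREATE INDEX IX_Orders_CustomerId ON dbo.Orders (CustomerId);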

Are you using cursors? Are you using subqueries? How about unions: if you use them, can they be changed to UNION ALL?
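On the union point: UNION must sort the combined result and remove duplicates, while UNION ALL skips that work. If duplicates cannot occur, or do not matter, the rewrite is mechanical (table names here are invented):

-- Cheaper than plain UNION when the two sets cannot overlap.
SELECT OrderId FROM dbo.OrdersArchive
UNION ALL
SELECT OrderId FROM dbo.Orders;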

Are your queries sargable? (Google the term if it is not familiar.)
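A classic example of the difference, using an invented Orders table: wrapping the column in a function hides it from any index.

-- Non-sargable: the function on OrderDate prevents an index seek.
SELECT OrderId FROM dbo.Orders WHERE YEAR(OrderDate) = 2008;

-- Sargable equivalent: the bare column can use an index on OrderDate.
SELECT OrderId FROM dbo.Orders
WHERE OrderDate >= '20080101' AND OrderDate < '20090101';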

Do you use DISTINCT when you could use GROUP BY?
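The two forms below return the same rows; on some plans the GROUP BY version comes out cheaper, and it states the intent (one row per customer) explicitly:

SELECT DISTINCT CustomerId FROM dbo.Orders;

SELECT CustomerId FROM dbo.Orders GROUP BY CustomerId;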

Are you getting blocking from locks?
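Two quick ways to check (the DMV form assumes SQL Server 2005 or later):

-- Look at the BlkBy column for sessions blocked by others.
EXEC sp_who2;

-- Or ask the engine directly which requests are being blocked.
SELECT session_id, blocking_session_id, wait_type, wait_time
FROM sys.dm_exec_requests
WHERE blocking_session_id <> 0;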

There are many other things to look at; these are just the starting points.

+2

If there is a specific query or stored procedure I want to tune, I have found that turning on statistics before running it is very helpful:

SET STATISTICS TIME ON
SET STATISTICS IO ON

When statistics are turned on, Query Analyzer displays them in the Messages tab of the Results pane.

The IO statistics have been especially useful for me because they tell me whether I might need an index. If I see a high read count in the IO statistics, I try adding indexes to the affected tables. Each time I try an index, I run the query again to see whether the read count has dropped. After a few iterations I can usually find the best indexes for the tables involved.
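The loop looks roughly like this (table, column, and index names are invented for the example):

SET STATISTICS IO ON;

-- Run the slow query and note the "logical reads" figure
-- printed on the Messages tab.
SELECT OrderId, OrderDate FROM dbo.Orders WHERE CustomerId = 42;

-- Try an index on the filtered column...
CREATE INDEX IX_Orders_CustomerId ON dbo.Orders (CustomerId);

-- ...and run the query again: if the index helps, the read count drops.
SELECT OrderId, OrderDate FROM dbo.Orders WHERE CustomerId = 42;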

Below are the MSDN links for these statistics commands:

SET STATISTICS TIME

SET STATISTICS IO

+1

Source: https://habr.com/ru/post/1277129/

