SQL query is slow in a .NET application, but instant in SQL Server Management Studio

Here is the SQL:

 SELECT tal.TrustAccountValue
 FROM TrustAccountLog AS tal
 INNER JOIN TrustAccount ta ON ta.TrustAccountID = tal.TrustAccountID
 INNER JOIN Users usr ON usr.UserID = ta.UserID
 WHERE usr.UserID = 70402
   AND ta.TrustAccountID = 117249
   AND tal.TrustAccountLogID =
       (
         SELECT MAX(tal.TrustAccountLogID)
         FROM TrustAccountLog AS tal
         INNER JOIN TrustAccount ta ON ta.TrustAccountID = tal.TrustAccountID
         INNER JOIN Users usr ON usr.UserID = ta.UserID
         WHERE usr.UserID = 70402
           AND ta.TrustAccountID = 117249
           AND tal.TrustAccountLogDate < '3/1/2010 12:00:00 AM'
       )

Basically, the Users table is related to the TrustAccount table, which in turn is related to the TrustAccountLog table.
Users: Contains users and their data.
TrustAccount: A user can have multiple TrustAccounts.
TrustAccountLog: contains an audit trail of all TrustAccount "movements". A TrustAccount is associated with multiple TrustAccountLog entries.

Now, this query executes in milliseconds inside SQL Server Management Studio, but for some strange reason it always times out in my C# application (even with a 120-second timeout).

Here is the code in a nutshell. It is called several times in a loop, and the statement gets prepared.

 cmd.CommandTimeout = Configuration.DBTimeout;
 cmd.CommandText =
     "SELECT tal.TrustAccountValue FROM TrustAccountLog AS tal " +
     "INNER JOIN TrustAccount ta ON ta.TrustAccountID = tal.TrustAccountID " +
     "INNER JOIN Users usr ON usr.UserID = ta.UserID " +
     "WHERE usr.UserID = @UserID1 AND ta.TrustAccountID = @TrustAccountID1 " +
     "AND tal.TrustAccountLogID = " +
     "(SELECT MAX(tal.TrustAccountLogID) FROM TrustAccountLog AS tal " +
     "INNER JOIN TrustAccount ta ON ta.TrustAccountID = tal.TrustAccountID " +
     "INNER JOIN Users usr ON usr.UserID = ta.UserID " +
     "WHERE usr.UserID = @UserID2 AND ta.TrustAccountID = @TrustAccountID2 " +
     "AND tal.TrustAccountLogDate < @TrustAccountLogDate2)";
 cmd.Parameters.Add("@TrustAccountID1", SqlDbType.Int).Value = trustAccountId;
 cmd.Parameters.Add("@UserID1", SqlDbType.Int).Value = userId;
 cmd.Parameters.Add("@TrustAccountID2", SqlDbType.Int).Value = trustAccountId;
 cmd.Parameters.Add("@UserID2", SqlDbType.Int).Value = userId;
 cmd.Parameters.Add("@TrustAccountLogDate2", SqlDbType.DateTime).Value = TrustAccountLogDate;

 // And then...
 reader = cmd.ExecuteReader();
 if (reader.Read())
 {
     double value = (double)reader.GetValue(0);
     if (System.Double.IsNaN(value))
         return 0;
     else
         return value;
 }
 else
 {
     return 0;
 }
+44
performance c# sql-server
Apr 29 '10 at 10:45
13 answers

If this is parameter sniffing, try adding OPTION (RECOMPILE) at the end of your query. I would recommend creating a stored procedure to encapsulate the logic in a more manageable way. Also agreed - why do you pass five parameters when, judging by the example, you need only three? Could you use this query instead?

 select TrustAccountValue from
 (
     SELECT MAX(tal.trustaccountlogid) as MaxLogID, tal.TrustAccountValue
     FROM TrustAccountLog AS tal
     INNER JOIN TrustAccount ta ON ta.TrustAccountID = tal.TrustAccountID
     INNER JOIN Users usr ON usr.UserID = ta.UserID
     WHERE usr.UserID = 70402
       AND ta.TrustAccountID = 117249
       AND tal.TrustAccountLogDate < '3/1/2010 12:00:00 AM'
     group by tal.TrustAccountValue
 ) q

And for what it's worth, you are using an ambiguous date format that depends on the language settings of the user executing the query. For me, for example, this is January 3rd, not March 1st. Check this out:

 set language us_english
 go
 select @@language --us_english
 select convert(datetime, '3/1/2010 12:00:00 AM')
 go
 set language british
 go
 select @@language --british
 select convert(datetime, '3/1/2010 12:00:00 AM')

The recommended approach is to use the "ISO" format yyyymmdd hh:mm:ss

 select convert(datetime, '20100301 00:00:00') --midnight 00, noon 12 
+26
Apr 29 '10 at 21:07

In my experience, the usual reason why a query runs fast in SSMS but slow from .NET is differences in the connection's SET options. When a connection is opened by either SSMS or SqlConnection, a bunch of SET commands is automatically issued to configure the session. Unfortunately, SSMS and SqlConnection have different SET defaults.

One common difference is SET ARITHABORT . Try executing SET ARITHABORT ON as the first command from your .NET code.

SQL Profiler can be used to monitor which SET commands are issued by both SSMS and .NET, so you may find other differences.
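Alternatively, if you just want to compare the options in effect on each side, DBCC USEROPTIONS lists them for the current session; run it once from an SSMS query window and once through the application's connection and diff the output:

```sql
-- Lists the SET options currently in effect for this session.
-- Run from SSMS and from the application's connection; any difference
-- (ARITHABORT is the usual suspect) can explain divergent query plans.
DBCC USEROPTIONS;
```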

The following code demonstrates how to issue the SET command, but note that this code has not been tested.

 using (SqlConnection conn = new SqlConnection("<CONNECTION_STRING>"))
 {
     conn.Open();

     using (SqlCommand comm = new SqlCommand("SET ARITHABORT ON", conn))
     {
         comm.ExecuteNonQuery();
     }

     // Do your own stuff here, but you must use the same connection object.
     // The SET command applies to the connection. Other connections are not
     // affected, nor are any connections opened later. If you want this applied
     // to every connection, you must issue it every time one is opened.
 }
+51
Apr 29 '10 at 10:54

Had the same problem in a test environment, although the production system (on the same SQL Server) ran fine. Adding OPTION (RECOMPILE) and even OPTION (OPTIMIZE FOR (@p1 UNKNOWN)) did not help.

I used SQL Profiler to find the exact query the .NET client was sending, and found that it was wrapped in exec sp_executesql N'select ...' and that the parameters were declared as nvarchar, while the columns being compared are plain varchar.

Pasting the captured query text into SSMS confirmed that it runs just as slowly as it does from the .NET client.

I found that changing the parameter type to AnsiString fixed the problem:

 p = cm.CreateParameter()
 p.ParameterName = "@company"
 p.Value = company
 p.DbType = DbType.AnsiString
 cm.Parameters.Add(p)
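To reproduce the effect in isolation, you can compare the two parameter declarations in sp_executesql directly; the table and column here are hypothetical stand-ins for the real schema:

```sql
-- Hypothetical schema: Companies(Company varchar(10)) with an index on Company.
-- With an nvarchar parameter, the varchar column is implicitly converted
-- to nvarchar, which can prevent an index seek:
EXEC sp_executesql
    N'SELECT * FROM Companies WHERE Company = @company',
    N'@company nvarchar(10)',
    @company = N'ACME';

-- With a varchar (AnsiString) parameter, the types match and a seek is possible:
EXEC sp_executesql
    N'SELECT * FROM Companies WHERE Company = @company',
    N'@company varchar(10)',
    @company = 'ACME';
```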

I could never explain why the test and production environments differed so noticeably in performance.

+8
Sep 24 '14 at 15:25

Most likely, the problem lies in this criterion:

 tal.TrustAccountLogDate < @TrustAccountLogDate2 

The optimal execution plan will depend heavily on the value of the parameter: passing 1910-01-01 (which returns no rows) will certainly produce a different plan than 2100-12-31 (which returns all rows).

When the value is specified as a literal in the query, SQL Server knows which value to use when generating the plan. When a parameter is used, SQL Server generates the plan only once and then reuses it; if the value in a subsequent execution differs too much from the original one, the plan will not be optimal.

To remedy the situation, you can specify OPTION (RECOMPILE) in the query. Moving the query into a stored procedure will not help with this particular problem, unless you create the procedure WITH RECOMPILE.
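As a sketch, appending the hint to the query from the question would look like this (subquery abbreviated):

```sql
SELECT tal.TrustAccountValue
FROM TrustAccountLog AS tal
INNER JOIN TrustAccount ta ON ta.TrustAccountID = tal.TrustAccountID
INNER JOIN Users usr ON usr.UserID = ta.UserID
WHERE usr.UserID = @UserID1
  AND ta.TrustAccountID = @TrustAccountID1
  AND tal.TrustAccountLogDate < @TrustAccountLogDate2
OPTION (RECOMPILE)  -- compile a fresh plan for the current parameter values
```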

Others have already mentioned this ("parameter sniffing"), but I thought a plain explanation of the concept wouldn't hurt.

+5
Apr 29 '10

This could be a type-conversion problem. Are the IDs really SqlDbType.Int at the data layer?

Also, why are there four parameters where two would do?

 cmd.Parameters.Add("@TrustAccountID1", SqlDbType.Int).Value = trustAccountId;
 cmd.Parameters.Add("@UserID1", SqlDbType.Int).Value = userId;
 cmd.Parameters.Add("@TrustAccountID2", SqlDbType.Int).Value = trustAccountId;
 cmd.Parameters.Add("@UserID2", SqlDbType.Int).Value = userId;

could be

 cmd.Parameters.Add("@TrustAccountID", SqlDbType.Int).Value = trustAccountId;
 cmd.Parameters.Add("@UserID", SqlDbType.Int).Value = userId;

since they are assigned the same variables anyway.

(This could cause the server to build a different plan, because it expects four different variables, as opposed to four constants; making it two variables could make a difference for the server's optimization.)

+4
Apr 29 '10 at 14:23

I hope your specific problem has been resolved by now, as this is an old post.

The following SET options can affect plan reuse (full list at the end):

 SET QUOTED_IDENTIFIER ON
 GO
 SET ANSI_NULLS ON
 GO
 SET ARITHABORT ON
 GO

The following two statements are from MSDN - SET ARITHABORT:

Setting ARITHABORT to OFF can adversely affect query optimization, resulting in performance issues.

The default ARITHABORT setting for SQL Server Management Studio is ON. Client applications that set ARITHABORT to OFF can receive different query plans, making it difficult to troubleshoot poorly performing queries. That is, the same query can execute fast in Management Studio but slow in the application.

Another interesting topic to understand is parameter sniffing, as covered in Slow in the Application, Fast in SSMS? Understanding Performance Mysteries - Erland Sommarskog.

Another possibility is the (internal) conversion of VARCHAR columns to NVARCHAR caused by a Unicode input parameter, as described in Troubleshooting SQL index performance on varchar columns - Jimmy Bogard.

OPTIMIZE FOR UNKNOWN

In SQL Server 2008 and later, consider OPTIMIZE FOR UNKNOWN. UNKNOWN indicates that the query optimizer should use statistical data instead of the initial value to determine the value for a local variable during query optimization.
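A minimal sketch of the hint on a simplified form of the question's query (the hint can name a single parameter, or cover all of them):

```sql
SELECT tal.TrustAccountValue
FROM TrustAccountLog AS tal
WHERE tal.TrustAccountID = @TrustAccountID
  AND tal.TrustAccountLogDate < @TrustAccountLogDate
OPTION (OPTIMIZE FOR (@TrustAccountLogDate UNKNOWN));
-- or, to apply it to every parameter:
-- OPTION (OPTIMIZE FOR UNKNOWN);
```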

OPTION (RECOMPILE)

Use "OPTION (RECOMPILE)" instead of "WITH RECOMPILE" if recompiling is the only solution; it takes advantage of the parameter-embedding optimization. Read Parameter Sniffing, Embedding, and the RECOMPILE Options - Paul White.

SET options

The following SET options can affect plan reuse, based on MSDN - Plan Caching in SQL Server 2008:

  1. ANSI_NULL_DFLT_OFF
  2. ANSI_NULL_DFLT_ON
  3. ANSI_NULLS
  4. ANSI_PADDING
  5. ANSI_WARNINGS
  6. ARITHABORT
  7. CONCAT_NULL_YIELDS_NULL
  8. DATEFIRST
  9. DATEFORMAT
  10. FORCEPLAN
  11. LANGUAGE
  12. NO_BROWSETABLE
  13. NUMERIC_ROUNDABORT
+3
Jan 21 '16 at 14:45

Since you only ever seem to return a value from a single column of a single row, you can use ExecuteScalar() on the command object instead, which should be more efficient:

 object value = cmd.ExecuteScalar();

 if (value == null)
     return 0;
 else
     return (double)value;
+2
Apr 29 '10 at 11:00

In my case, the problem was that Entity Framework was generating queries that use exec sp_executesql.

When the parameters do not exactly match the column types, the execution plan does not use the indexes, because it decides to put the conversion into the query itself. As you can imagine, this results in much worse performance.

In my case, the column was defined as CHAR(3), and Entity Framework was passing N'str' in the query, which forced a conversion from nchar to char. So for a query that looks like this:

ctx.Events.Where(e => e.Status == "Snt")

it generated SQL that looks something like this:

FROM [ExtEvents] AS [Extent1] ... WHERE (N''Snt'' = [Extent1].[Status]) ...

The easiest solution in my case was to change the column type; the alternative is to wrestle with your code to make it pass the correct type in the first place.

+2
Dec 17 '15 at 10:55

Sounds like it could be related to parameter sniffing? Have you tried capturing exactly what the client code sends to SQL Server (use Profiler to catch the exact statement), then running that in Management Studio?

Parameter sniffing: SQL poor stored procedure execution plan performance - parameter sniffing

I have not seen this in code before, only in procedures, but it's worth a look.

+1
Apr 29 '10

You do not seem to be closing your data reader - this could start to add up over several iterations...
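A sketch of the usual fix, reusing the reader code from the question; the using block guarantees the reader is closed even if an exception is thrown mid-read:

```csharp
using (SqlDataReader reader = cmd.ExecuteReader())
{
    if (reader.Read())
    {
        double value = (double)reader.GetValue(0);
        return System.Double.IsNaN(value) ? 0 : value;
    }
    return 0;
}
// reader.Dispose() runs here, so nothing is left open for the next iteration
```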

0
Apr 29 '10 at 10:52

I had a problem with a different root cause that exactly matched the title of this question.

In my case, the problem was that the result set was held open by the .NET application while it looped over each returned record and made three more database queries! Over several thousand rows, this deceptively made the original query look slow, based on timing information from SQL Server.

The fix was therefore to rework the .NET code making the calls so that it does not keep the result set open while processing each row.

0
Dec 20 '10 at 16:32

I realize the OP doesn't mention the use of stored procedures, but there is an alternative, less elegant solution to parameter-sniffing problems when using stored procedures; it worked for me when OPTION (RECOMPILE) seemed to do nothing.

Simply copy your parameters into variables declared inside the procedure and use those instead.

Example:

 ALTER PROCEDURE [ExampleProcedure]
     @StartDate DATETIME,
     @EndDate DATETIME
 AS
 BEGIN
     -- Reassign to local variables to avoid parameter sniffing issues
     DECLARE @MyStartDate datetime,
             @MyEndDate datetime

     SELECT @MyStartDate = @StartDate,
            @MyEndDate = @EndDate

     -- Rest of procedure goes here, but refer to @MyStartDate and @MyEndDate
 END
0
Nov 09 '15 at 10:49

I suggest you try creating a stored procedure, which can be compiled and cached by SQL Server and thus improve performance.

-1
Apr 29 '10 at 10:50


