I have a console application (C#) that opens a connection to a SQL Server database, executes a stored procedure, and then exits. The stored procedure times itself (using GETDATE and DATEDIFF) and returns the timings to the console application. The stored procedure consistently reports about 100 milliseconds to complete.
Running the console application several times gives a consistent set of timings (including about 300 ms for the ExecuteReader call).
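For context, the timing in the console application is captured around the ExecuteReader call roughly like this (a minimal sketch only; the connection string and procedure name are placeholders, not the actual ones):

```csharp
using System;
using System.Data;
using System.Data.SqlClient;
using System.Diagnostics;

class Program
{
    static void Main()
    {
        // Placeholder connection string and procedure name (not the real ones)
        const string connectionString = "Server=.;Database=MyDb;Integrated Security=true";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("dbo.MyTimedProcedure", connection))
        {
            command.CommandType = CommandType.StoredProcedure;
            connection.Open();

            // Time only the ExecuteReader call itself
            var stopwatch = Stopwatch.StartNew();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    // The procedure returns its own GETDATE/DATEDIFF timings
                    Console.WriteLine(reader[0]);
                }
            }
            stopwatch.Stop();

            Console.WriteLine($"ExecuteReader: {stopwatch.ElapsedMilliseconds} ms");
        }
    }
}
```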
However, what I accidentally discovered and can reliably reproduce is the following effect: if I open SSMS, connect to the same database, and then run the console application twice, ExecuteReader in the console application is much faster the second time.
Please note that you do not need to run, or even open, the stored procedure in SSMS; you just need to connect to the database.
It is specifically the second run of the console application after opening SSMS and connecting to the same database that shows the dramatic improvement. For example:
ExecuteReader with SSMS not open: 300 ms
ExecuteReader with SSMS not open: 300 ms
ExecuteReader with SSMS not open: 300 ms
(Open SSMS and connect to the database)
First ExecuteReader with SSMS open and connected to the same database: 300 ms
Second ExecuteReader with SSMS open and connected: 10 ms (!!!)
Third ExecuteReader with SSMS open and connected: 10 ms
Fourth ExecuteReader with SSMS open and connected: 10 ms
(Close SSMS)
ExecuteReader is back to reporting 300 ms
In other words, the time reported for ExecuteReader becomes less than the time the stored procedure itself takes to run.
Note that the stored procedure itself always takes about the same amount of time.
It seems as though SSMS opens up some kind of cache that the console application is then able to take advantage of.
Can anyone shed some light on this? sys.dm_exec_connections does not show any differences between the various connections.
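For reference, the connection comparison was done with a query along these lines, run from both the console application and SSMS (a sketch; the column selection here is illustrative, not the exact query used, and the connection string is a placeholder):

```csharp
using System;
using System.Data.SqlClient;

class ConnectionInfo
{
    static void Main()
    {
        // Placeholder connection string
        const string connectionString = "Server=.;Database=MyDb;Integrated Security=true";

        // Inspect what sys.dm_exec_connections reports for the current session
        const string query = @"
            SELECT net_transport, protocol_type, encrypt_option, auth_scheme
            FROM sys.dm_exec_connections
            WHERE session_id = @@SPID;";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(query, connection))
        {
            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    for (int i = 0; i < reader.FieldCount; i++)
                        Console.WriteLine($"{reader.GetName(i)} = {reader[i]}");
                }
            }
        }
    }
}
```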
SSMS is v17.3.