Measure application performance by aggregating SQL audit records

Suppose there is a simple audit table with two columns (there are more columns in production):

ID | Date 
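For reference, a minimal sketch of what such a table might look like (the table name dbo.RequestAudit is an assumption for illustration only; the production table has more columns):

    Create Table dbo.RequestAudit (
        ID   int identity(1,1) primary key,  -- assumed surrogate key
        Date datetime not null               -- time the audit entry was written
    );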

When a request is processed, we add an entry to this table. Requests are processed in batches; a batch can contain any number of elements, and we add one entry per item. Between batches there is a delay of at least 2 seconds (the number is configurable).

Performance is measured by how many requests we can process per unit of time, for example per second. Consider the sample data (2 batches; the number of elements is for demonstration purposes only):

    --2016-01-29 10:27:25.603
    --2016-01-29 10:27:25.620
    --2016-01-29 10:27:25.637
    --2016-01-29 10:27:25.653
    --2016-01-29 10:27:25.723
    --Avg time between requests = 24ms
    --2016-01-29 10:27:34.647
    --2016-01-29 10:27:34.667
    --2016-01-29 10:27:34.680
    --2016-01-29 10:27:34.690
    --2016-01-29 10:27:34.707
    --Avg time = 12ms

We can say that in the worst case 41.67 requests per second can be processed, and in the best case 83.33. It would also be nice to know the average performance per batch.
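As a quick sanity check on those figures, the conversion is just 1000 ms divided by the average gap quoted above (plain arithmetic, not part of the solution below):

    -- requests per second = 1000 ms / average gap between requests in ms
    Select WorstCaseReqPerSec = 1000.0 / 24   -- 41.67
          ,BestCaseReqPerSec  = 1000.0 / 12   -- 83.33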

Question. Is it possible to get these metrics using only T-SQL, and if so, how?

EDIT. To make the results statistically significant, it may be useful to drop batches with fewer than 10 elements (the threshold is configurable).
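In the answer below this threshold corresponds to the @MinObservations variable; setting it to 10 would express that filter, for example:

    Declare @MinObservations int = 10   -- drop batches with fewer than this many entries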

1 answer

I may have simplified your request, but consider the following:

    Declare @YourTable table (ID int, Date datetime)

    Insert Into @YourTable values
     ( 1,'2016-01-29 10:27:25.603'),
     ( 2,'2016-01-29 10:27:25.620'),
     ( 3,'2016-01-29 10:27:25.637'),
     ( 4,'2016-01-29 10:27:25.653'),
     ( 5,'2016-01-29 10:27:25.723'),
     ( 6,'2016-01-29 10:27:34.647'),
     ( 7,'2016-01-29 10:27:34.667'),
     ( 8,'2016-01-29 10:27:34.680'),
     ( 9,'2016-01-29 10:27:34.690'),
     (10,'2016-01-29 10:27:34.707')

    Declare @BatchSecondsGap int = 2   -- Seconds between batches
    Declare @MinObservations int = 5   -- Batch must have n or more rows

    ;with cte as (
        Select *, Cnt = sum(1) over (Partition By Batch)
        From (
            Select *, Batch = sum(Flg) over (Order By Date)
            From (
                Select ID, Date
                      ,Flg = case when DateDiff(SECOND, Lag(Date,1,null) over (Order By Date), Date) >= @BatchSecondsGap
                                  then 1 else 0 end
                      ,MS  = case when DateDiff(SECOND, Lag(Date,1,Date) over (Order By Date), Date) >= @BatchSecondsGap
                                  then 0
                                  else DateDiff(MILLISECOND, Lag(Date,1,Date) over (Order By Date), Date) end
                From @YourTable
            ) A
        ) B
    )
    Select Title    = 'Total'
          ,DateR1   = min(Date)
          ,DateR2   = max(Date)
          ,BatchCnt = count(Distinct Batch)
          ,TransCnt = count(*)
          ,MS_Ttl   = sum(MS)
          ,MS_Avg   = avg(MS*1.0)
          ,MS_Std   = stdev(MS)
    From cte
    Where Cnt >= @MinObservations
    Union All
    Select Title    = concat('Batch ', Batch)
          ,DateR1   = min(Date)
          ,DateR2   = max(Date)
          ,BatchCnt = count(Distinct Batch)
          ,TransCnt = count(*)
          ,MS_Ttl   = sum(MS)
          ,MS_Avg   = avg(MS*1.0)
          ,MS_Std   = stdev(MS)
    From cte
    Where Cnt >= @MinObservations
    Group By Batch

Returns

[screenshot: query results, a Total row plus one row per batch]


The following figure shows that you are not penalized for the time between batches, so producing the final results becomes a simple aggregation.

[screenshot: intermediate CTE rows, MS is 0 on the first row of each batch so inter-batch time is not counted]
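If the final figures are needed as requests per second rather than as an average gap in milliseconds, one possible last step is a small aggregation over the same data. The sketch below assumes the cte, MS, Cnt and @MinObservations names from the query above and would replace (or be added to) its final SELECT:

    Select Title     = concat('Batch ', Batch)
          ,ReqPerSec = 1000.0 / nullif(avg(MS*1.0), 0)   -- 41.67 and 83.33 for the two sample batches
    From cte
    Where Cnt >= @MinObservations
    Group By Batch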


Source: https://habr.com/ru/post/1261407/

