Reviewing Performance Metrics On Stored Procedures In SQL Server
It’s often a good idea to review the performance of a stored procedure before pushing it out to your production environment. Whether you’re deploying a new procedure or publishing an update to an existing one, you should always review its performance to ensure it will run in a timely manner. As a best practice, record your metrics in a change log at the top of the procedure, and add the new metrics with each subsequent code change prior to publication. This lets you track the metrics over time and helps identify areas of concern should the procedure take significantly longer to execute after a code change.
Personally, I use the script below to return the CPU time, logical reads, and total elapsed execution time. The script executes the procedure three times. Each time a procedure is altered, SQL Server compiles a new execution plan on the next execution, so the first execution or two will generally take a bit longer to return results. Once the procedure has been executed, the plan and data are cached, allowing subsequent executions to complete more quickly. For procedures that drive reporting, running the procedure several times gives you the most realistic picture of how it will perform in your environment.
SET NOCOUNT ON;

DECLARE @cpu_ INT, @lreads_ INT, @eMsec_ INT, @i INT;

SET @i = 1;

WHILE @i <= 3
BEGIN
    -- Capture baseline metrics for this session before the test execution
    SELECT @cpu_    = cpu_time,
           @lreads_ = logical_reads,
           @eMsec_  = total_elapsed_time
    FROM sys.dm_exec_requests
    WHERE session_id = @@SPID;

    ------------------------------------------------------------------
    BEGIN TRAN;
    --INSERT SAMPLE EXECUTION LINE HERE
    ROLLBACK;
    ------------------------------------------------------------------

    -- Report the deltas, i.e. the cost of the execution above
    SELECT 'CpuMs-'   + CAST(cpu_time - @cpu_ AS VARCHAR(10))             AS CpuMs,
           'LogRds-'  + CAST(logical_reads - @lreads_ AS VARCHAR(10))     AS LogRds,
           'Elapsed-' + CAST(total_elapsed_time - @eMsec_ AS VARCHAR(10)) AS Elapsed
    FROM sys.dm_exec_requests
    WHERE session_id = @@SPID;

    SET @i = @i + 1;
END;
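To use the script, replace the placeholder line between BEGIN TRAN and ROLLBACK with a call to the procedure under test. A hypothetical example (the procedure name and parameter are illustrative only):

```sql
------------------------------------------------------------------
BEGIN TRAN;
-- Hypothetical example: substitute your own procedure and parameters
EXEC dbo.usp_GetDailySales @ReportDate = '2019-01-01';
ROLLBACK;
------------------------------------------------------------------
```

Wrapping the call in a transaction that is rolled back also lets you measure procedures that modify data without leaving those changes behind.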
Below is an example of how the output will look. I’ve hidden the data output to better illustrate how the script performed on each of the three executions. You will see that the final execution took significantly less time than the first; once the procedure has been executed, the data is cached and an execution plan has been created. The metrics from the second and third executions give a better picture of how the procedure will perform on a day-to-day basis.
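Based on the final SELECT in the script, each of the three iterations returns a single row of the following shape, where <n> stands for the measured value:

```
CpuMs        LogRds         Elapsed
-----------  -------------  --------------
CpuMs-<n>    LogRds-<n>     Elapsed-<n>
```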
An additional step you should take is to copy and paste the metrics of the newly published procedure into a change log within the procedure itself. Change logs are important for tracking the changes made to an object, giving you a clear record of what has been updated over time.
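The change log can live in a comment block at the top of the procedure itself. A minimal sketch of the idea; the procedure name, authors, dates, and metric values are all illustrative placeholders:

```sql
CREATE OR ALTER PROCEDURE dbo.usp_GetDailySales
    @ReportDate DATE
AS
/*******************************************************************
Change Log
Date        Author     Change               Metrics (3rd run)
----------  ---------  -------------------  ----------------------------------
YYYY-MM-DD  <author>   Initial publication  CpuMs-<n> LogRds-<n> Elapsed-<n>
YYYY-MM-DD  <author>   Added date filter    CpuMs-<n> LogRds-<n> Elapsed-<n>
*******************************************************************/
BEGIN
    SET NOCOUNT ON;
    -- procedure body here
    SELECT 1 AS Placeholder;
END;
```

With each publication you paste the latest metrics into a new row, so a regression after a code change is immediately visible next to the change that caused it.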