Assessing the performance of a given script on SQL Server can be surprisingly difficult. Because of execution-plan and buffer caching, a script might take 20 seconds on the first run, 10 on the next, and then settle in at only 2 seconds. That makes it hard to compare apples with apples.
I have found that running the following improves the consistency of the reported timings. According to Books Online:
“Use DBCC DROPCLEANBUFFERS to test queries with a cold buffer cache without shutting down and restarting the server.”
CHECKPOINT
GO
DBCC DROPCLEANBUFFERS
GO
This runs each script against a cold buffer cache, giving you a consistent starting point for comparing queries and a more realistic time frame for your SQL execution. (Note that the CHECKPOINT first writes dirty pages to disk, so DBCC DROPCLEANBUFFERS can then empty the cache completely.)
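As a sketch of the comparison pattern, you can clear the cache before each candidate query so both start cold. The table and queries here are hypothetical stand-ins; substitute your own:

```sql
-- Start candidate 1 from a cold buffer cache
CHECKPOINT
GO
DBCC DROPCLEANBUFFERS
GO
-- Candidate 1 (dbo.Orders is a hypothetical table)
SELECT COUNT(*) FROM dbo.Orders WHERE OrderDate >= '2020-01-01'
GO
-- Reset to a cold cache again before candidate 2
CHECKPOINT
GO
DBCC DROPCLEANBUFFERS
GO
-- Candidate 2: same result, different predicate shape
SELECT COUNT(*) FROM dbo.Orders WHERE YEAR(OrderDate) >= 2020
GO
```

Because both candidates start from the same cold cache, the timing difference reflects the queries themselves rather than whichever one happened to run second.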
Also helpful when timing query execution:
set statistics time ON
And to further analyse poorly performing queries:
set statistics IO ON
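The two settings work well together: TIME reports CPU and elapsed milliseconds, while IO reports logical and physical reads per table in the Messages tab. A minimal sketch, using a hypothetical dbo.Orders table:

```sql
-- Enable per-query timing and IO reporting for this session
SET STATISTICS TIME ON
SET STATISTICS IO ON
GO
-- The query under test (hypothetical table and columns)
SELECT TOP (10) * FROM dbo.Orders ORDER BY OrderDate DESC
GO
-- Turn the reporting off again when finished
SET STATISTICS TIME OFF
SET STATISTICS IO OFF
GO
```

When comparing two versions of a query, the logical-read counts in the IO output are often the most reliable signal, since they are not affected by what happens to be cached.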
With IO statistics on, you can see the results of physical disk activity. For example, I once used this to spot that changing one small part of a query drastically altered the execution plan: disk IO against a very large table went from a few thousand reads to a couple of million, which blew out the performance of the query.
Hope this is useful.