
Welcome to our SQL Server blog

This blog was created in order to pass on some of our knowledge and advice in the hope that the community might find it useful.

Obviously Microsoft SQL Server is our area of expertise and therefore this is our most thorough blog, with everything from the basics through to the incredibly advanced and, as such, we should have something for everyone within our pages.

Optimize For Ad-Hoc Workloads
Published: Oct 16, 2015
Since finding out about this option, it's been an sp_configure setting that I've generally recommended everyone turn on within their SQL Server instances.

It’s not essential that you do so, just a recommendation that I give, alongside a full explanation of course.
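If you want to see whether the option is already enabled on your instance before changing anything, you can check sys.configurations (a quick sketch; 0 means off, 1 means on):

```sql
-- value is the configured value; value_in_use is what's currently active
select name, value, value_in_use
from sys.configurations
where name = 'optimize for ad hoc workloads'
```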

So what does this setting do?

Whenever we run ad-hoc statements on our SQL Servers we generate a plan which gets put into cache, taking up space. However, what many people don't realize is that within SQL Server these plans are very rarely re-usable (see my previous post).

Therefore this setting can help to reduce the wasted space associated with these single-use plans.

With this setting turned on, SQL Server will not save a full execution plan in the plan cache, but instead it simply saves a “plan stub”. This is a much smaller entry which will only become a full plan in cache if you were to run the exact same textual query for a second time.
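You can see the stubs for yourself by querying the plan cache: with the setting on, a first-time ad-hoc statement appears with a cacheobjtype of 'Compiled Plan Stub' and a size of only a few hundred bytes. A sketch of such a query (column choice is mine, the DMVs are standard):

```sql
-- List ad-hoc entries in the plan cache, showing whether each is
-- a full compiled plan or just a stub, along with its size
select cp.cacheobjtype,   -- 'Compiled Plan' or 'Compiled Plan Stub'
       cp.usecounts,
       cp.size_in_bytes,
       st.text
from sys.dm_exec_cached_plans cp
cross apply sys.dm_exec_sql_text(cp.plan_handle) st
where cp.objtype = 'Adhoc'
order by cp.size_in_bytes desc
```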

This means that in a system which relies on non-repeated ad-hoc statements (perhaps application-generated T-SQL code), this can drastically reduce the memory required for the plan cache and therefore free that memory for your buffer pool instead.
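Of course, where you control the code, a better option still is to parameterise those statements so they all share a single cached plan. A minimal sketch using sp_executesql (the order ID here is just an example value from AdventureWorks):

```sql
-- Same style of lookup, but parameterised: every execution
-- re-uses one cached plan instead of compiling its own
declare @sql nvarchar(max) = N'
select h.salesOrderID
from sales.salesOrderHeader h
join sales.salesPerson p
  on h.salesPersonID = p.businessEntityID
where h.salesOrderID = @orderID'

exec sp_executesql @sql, N'@orderID int', @orderID = 43659
```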

This is demonstrated very well indeed on an MSDN blog which can be found here so that’s where I’ll take my examples from (no point re-inventing the wheel).

So… we’ll start with an empty plan cache (do NOT run on a production system please):

dbcc freeproccache

We’ll also ensure that the optimize for ad-hoc workload setting is off:

sp_configure 'optimize for ad hoc workloads', 0
reconfigure

And then we’ll check the size of our plan cache to make sure that it is indeed empty based on the criteria we’ll be using (the sales.salesOrderHeader table in AdventureWorks2012):

select plansizeinmb = sum(cp.size_in_bytes) / 1024.0 / 1024.0
from sys.dm_exec_cached_plans cp
cross apply sys.dm_exec_sql_text(cp.plan_handle) st
where st.text like '%salesorderheader%'
and st.text not like '%dm_exec_cached_plans%'

So now we’re going to invent a pointless workload just to mimic the creation of numerous ad-hoc statements being run against the instance:

declare @salesOrderID table
(
    id int identity(1, 1),
    orderID int
)

insert into @salesOrderID
select salesOrderID
from sales.SalesOrderHeader

declare @counter int = 1, @sql varchar(max)

while @counter <= (select max(id) from @salesOrderID)
begin
    -- build a textually unique statement for each order ID
    select @sql = 'select h.salesOrderID
    from sales.salesOrderHeader h
    join sales.salesPerson p
        on h.salesPersonID = p.businessEntityID
    where h.salesOrderID = ' + convert(varchar(6), orderID)
    from @salesOrderID
    where id = @counter

    exec (@sql)

    select @counter += 1
end

This code takes a while to complete (ideally run it with "Discard results after execution" enabled in Management Studio's query options so you don't overwhelm it), but when it does finish you can see that the size of the plan cache has grown to a massive amount:

So let’s change the sp_configure setting:

sp_configure 'optimize for ad hoc workloads', 1
reconfigure

Now reset the plan cache and run the exact same code again.

This time when it finishes we have the following result for the size of the plan cache:

So you can see just how powerful a feature this is to have turned on in a system which is hit hard by ad-hoc queries. Personally I’d much rather have 1GB+ larger buffer pool than wasted plan cache space if I can help it.
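If you want to quantify the difference on your own system, something like the following (a sketch) breaks the plan cache memory down by cache object type, so you can see how much is held in full plans versus stubs:

```sql
-- Total plan cache memory by cache object type
select cacheobjtype,
       count(*) as entries,
       sum(cast(size_in_bytes as bigint)) / 1024 / 1024.0 as sizeInMB
from sys.dm_exec_cached_plans
group by cacheobjtype
order by sizeInMB desc
```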


SQL World CEO
Kevin Urquhart


I am a SQL Server DBA, Architect, Developer, Trainer, and CEO of SQL World. This is my blog in which I’m simply trying to share my SQL knowledge and experiences with the world.


© Copyright 2020 SQLTraining Ltd.