With processors as fast as they are these days, it's relatively rare (although not unheard of by any stretch) to hear people say they don't have the CPU resources to run compression on their databases… what I do regularly hear is people asking what space savings it will actually give them.
Well, luckily, there is a stored procedure built into SQL Server, sp_estimate_data_compression_savings, which you can use to answer exactly that question…
With that in mind, let’s try it out using one of the larger tables in my AdventureWorks database:
exec sp_estimate_data_compression_savings
    @schema_name = 'sales',
    @object_name = 'salesOrderDetailEnlarged',
    @index_id = 1,
    @partition_number = null,
    @data_compression = 'row'
go

exec sp_estimate_data_compression_savings
    @schema_name = 'sales',
    @object_name = 'salesOrderDetailEnlarged',
    @index_id = 1,
    @partition_number = null,
    @data_compression = 'page'
go
As per the above, you can clearly see the projected savings for each type of compression… and, as you'd expect, row saves less than page, since page compression is the heavier setting (it applies row compression plus prefix and dictionary compression within each page).
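If you want to sanity-check the "current size" figure in that output against what the table genuinely occupies on disk, sp_spaceused gives you a quick comparison. A minimal example against the same table as above:

-- actual reserved / data / index sizes for the table
exec sp_spaceused 'sales.salesOrderDetailEnlarged'
go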
Note – I have to admit that I've slightly cut off the results from that query… there are two "sample size" columns to the right which basically tell you that SQL Server took a sample of the table, ran compression against that sample, and then projected the result up to the full size of the table. So you do have to be a little careful: you need to know your data in order to judge whether the sample is likely to be representative of the whole table. On the whole, though, the prediction is pretty good, and you can see that the savings in my case are very good indeed… I could halve the size of the table in question. Not bad if you happen to be in a situation where you have CPU to spare but disk and RAM are limited.
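And if the numbers convince you, actually applying the compression is just a rebuild away. A minimal sketch, assuming the same table as above (index_id 1 is the clustered index, so rebuilding the table compresses the base data):

-- apply page compression to the table (its clustered index)
alter table sales.salesOrderDetailEnlarged
rebuild with (data_compression = page)
go

Bear in mind the rebuild itself costs CPU, I/O and log space while it runs, so schedule it for a quiet window.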