Welcome to our SQL Server blog

This blog was created in order to pass on some of our knowledge and advice in the hope that the community might find it useful.

Obviously Microsoft SQL Server is our area of expertise and therefore this is our most thorough blog, with everything from the basics through to the incredibly advanced and, as such, we should have something for everyone within our pages.


Slowly Changing Dimension Implementations - Synchronous Triggers
Published: Dec 02, 2020
In the previous post I went through what a Slowly Changing Dimension is and why they can be such a problem when designing database solutions. Here I’ll start to work through a few implementations. Note that there is no real “this is the way to go” solution; each has its advantages and pitfalls and needs to be considered carefully by each architect for the design at hand. Hopefully, knowing a few solutions will mean that you can try a few and find the one you’re most comfortable with.

Let's get started...

Synchronous Triggers

This is one that causes a lot of arguments among developers and architects alike because there is a long-standing train of thought that “Triggers are bad and therefore shouldn’t be used”. Personally I don’t think there’s anything wrong with Triggers, but that does come with the caveat that they’re used properly, implemented well, don’t hinder performance, and serve a specific purpose not best suited to any other object.

In the case of Slowly Changing Dimensions there are few options other than a trigger in order to track your changes. For this we’ll use the DimCompany table from my previous posts as our example, and I’m going to use the “dbo and history schema” version of the Dimension… you can have fun working out triggers for the other types of implementation on your own:

-- the history schema must exist before we can create a table in it
if schema_id(N'history') is null exec (N'create schema history')
go

drop table if exists history.DimCompany, dbo.DimCompany
go

create table dbo.DimCompany
(
    companyID tinyint identity not null,
    fromDate datetime not null,
    toDate datetime not null,
    companyName varchar(100) not null
)
go

create table history.DimCompany
(
    companyID tinyint not null,
    fromDate datetime not null,
    toDate datetime not null,
    companyName varchar(100) not null
)
go


So with the above, we would look to implement a trigger in the following way:

create trigger tr_dimCompany on dbo.DimCompany for update, delete
-- note not for Insert as new records won't need to be in History
as
set nocount on

-- Log deleted record for both Delete and Update
insert into history.DimCompany (companyID, companyName, fromDate, toDate)
select companyID, companyName, fromDate, toDate
from deleted

-- check if it's an update
if exists (select * from inserted)
begin
    -- Change the dates if not passed in
    -- (most often these would be changed to defaults - I've hard coded for ease of demo)
    update c
    set c.companyName = i.companyName,
        c.fromDate = isnull(i.fromDate, current_timestamp),
        c.toDate = isnull(i.toDate, '9999-12-31')
    from dbo.DimCompany c
    join inserted i
        on c.companyID = i.companyID
end
go


We can now run a simple insert and see what happens (record in the dbo table, nothing in history):

insert into DimCompany (fromDate, toDate, companyName)
select '2000-01-01', '2018-03-31', 'Pear Technology Ltd'
go

select * from DimCompany
select * from history.DimCompany
go



Now let’s update our record and see what happens (note that normally we would let the system deal with all dates in an audit but I’m hard coding them just to make them match what we saw in my previous demo scripts between dbo and history):

update DimCompany
set fromDate = '2018-04-01', toDate = '9999-12-31', companyName = 'Pear Computers'
where companyID = 1
go

select * from DimCompany
select * from history.DimCompany
go
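
With the hard-coded dates above (and assuming the identity seed started at 1), the two selects should return something like the following (the original post's screenshots are omitted here):

-- dbo.DimCompany:
-- companyID  fromDate    toDate      companyName
-- 1          2018-04-01  9999-12-31  Pear Computers

-- history.DimCompany:
-- companyID  fromDate    toDate      companyName
-- 1          2000-01-01  2018-03-31  Pear Technology Ltd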



That’s exactly what we wanted.
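
The demo stops at the update, but the trigger also covers deletes. If you want to exercise that path as well (reusing the row above), something like the following should log the final version of the record to history, with the update branch skipped because “inserted” is empty:

delete from DimCompany
where companyID = 1
go

select * from DimCompany
select * from history.DimCompany
go

After this, dbo.DimCompany should be empty and history.DimCompany should hold both prior versions of the row.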

This is the way in which I see nearly all slowly changing dimensions being coded and I therefore think it’s fine to use Triggers… just make sure that you’re not doing anything silly such as bulk inserting 100,000 rows as individual transactions, otherwise you’ll definitely notice a performance problem quite quickly. Also, if you have a lot of processes updating records simultaneously, then maybe also consider Read Committed Snapshot Isolation so that your History tables don’t become a bottleneck.
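
If you do go down the Read Committed Snapshot Isolation route, it’s a single database-level switch. The database name below is just a placeholder, and the statement needs exclusive access to the database, so run it when nothing else is holding connections open:

-- MyWarehouse is a hypothetical database name
alter database MyWarehouse set read_committed_snapshot on
go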


SQL World CEO
Kevin Urquhart


I am a SQL Server DBA, Architect, Developer, Trainer, and CEO of SQL World. This is my blog in which I’m simply trying to share my SQL knowledge and experiences with the world.

© Copyright 2020 SQLTraining Ltd.