in this posting i address a question about the maintainability and complexity of data vault models. if you have comments, thoughts, or other experiences, i encourage you to add a comment at the end of this posting.
how maintainable is this approach from a complexity standpoint?
very maintainable. in fact, it is one of the most flexible data modelling architectures on the planet! it changes when you need it to change, and it doesn't complain: there is near-zero re-engineering of etl, queries, and downstream processes, because changes are additive rather than destructive. this is shown in the slides for the tdwi technical presentation and discussed in my coaching section. i'd say that if you build a data vault properly, you will most likely never have to "stop the whole project" and re-engineer it from scratch again.
but… don't just take my word for it – ask my customers. go to the forums on linkedin.com ("data vault discussions"), register (it's free), and ask people what they think… they'll tell you. why else would so many companies and government agencies around the world have invested so much time and money in the data vault? they wouldn't do it if it didn't work, and if it wasn't flexible.
let's put it in perspective: making a change to a typical "federated" data warehouse (using star schemas and conformed dimensions, etc.) usually takes 3 to 9 months and requires 3 to 8 people – working on everything from data modeling to database partitioning, re-indexing, parallel queries, index hints, report writing, and etl…
making a change to a "data vault" model is no comparison… it will usually take you less than 24 hours (if your data vault model has been built properly) – and i'm talking about the whole chain: modeling, loading, and reporting (getting the data to the users).
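why are data vault changes so cheap? the pattern separates business keys (hubs) from descriptive attributes (satellites), so a new source or new attribute usually means adding a new satellite rather than altering anything that already exists. here is a minimal, hypothetical sketch of that idea in python – the table names, keys, and loader functions are invented for illustration, not part of any real product:

```python
from datetime import datetime, timezone

# hypothetical in-memory stand-ins for data vault tables:
# a hub holds only business keys; satellites hold descriptive
# attributes keyed by the hub's business key plus a load timestamp.
hub_customer = {}          # business_key -> hub row
sat_customer_name = {}     # (business_key, load_ts) -> attributes

def load_hub(business_key):
    # insert-only: existing hub rows are never updated or deleted
    hub_customer.setdefault(business_key, {"business_key": business_key})

def load_satellite(sat, business_key, attributes):
    # each load is a new timestamped row; history is preserved
    load_ts = datetime.now(timezone.utc)
    sat[(business_key, load_ts)] = dict(attributes)

# original load process for the first source system
load_hub("CUST-001")
load_satellite(sat_customer_name, "CUST-001", {"name": "acme corp"})

# later, a new source arrives with address data. the change is purely
# additive: one new satellite and one new load call – the hub, the
# existing satellite, and their loaders are untouched.
sat_customer_address = {}
load_satellite(sat_customer_address, "CUST-001", {"city": "denver"})
```

the point of the sketch is the last three lines: the "change" never rewrites existing structures or loaders, which is why there is so little re-engineering compared with reworking a conformed dimension.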