Q&A: Maintainability and Complexity

In this post I address a question about the maintainability and complexity of Data Vault models. If you have comments, thoughts, or other experiences, I encourage you to add your comment at the end of this post.

How maintainable is this approach from a complexity standpoint?

Very maintainable. In fact, it is one of the most flexible data modeling architectures on the planet! It changes when you need it to change, and there is near-zero re-engineering of ETL, queries, and downstream processes. This is shown in the slides for my TDWI technical presentation and discussed in my coaching section. I’d say that if you build a Data Vault, you will most likely never have to “stop the whole project” and re-engineer it from scratch.

But don’t just take my word for it; ask my customers. Go to the “Data Vault Discussions” forum on LinkedIn.com, register (it’s free), and ask people what they think. They’ll tell you. Why else would so many companies and government agencies around the world have invested so much time and money in the Data Vault? They wouldn’t do it if it didn’t work and if it weren’t flexible.

Let’s put it in perspective: making a change to a typical “federated” data warehouse (using stars, conformed dimensions, etc.) can usually take 3 to 9 months and require 3 to 8 people, working on everything from data modeling to database partitioning, re-indexing, parallel queries, index hints, report writing, and ETL.

Making a change to a “Data Vault” model is simply no comparison: it will usually take you less than 24 hours (if your Data Vault model has been built properly), and I’m talking about the whole cycle of loading, modeling, and reporting (getting the data to the users).
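Here is a minimal sketch of why a change can be that cheap, assuming a simple hub-and-satellite structure. All of the table names, column names, the sqlite3 storage, and the “billing” example are illustrative assumptions of mine, not something taken from this post: when a new source delivers new descriptive attributes, you add a new satellite on the existing hub and a new load routine for it, and nothing that already runs has to be modified.

```python
# Illustrative sketch (Python + sqlite3): absorbing a change in a Data Vault
# by ADDING a new satellite, leaving existing structures and loads untouched.
# Names and the billing example are hypothetical.
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect(":memory:")

# Existing structures, already in production; the change never alters them.
conn.executescript("""
CREATE TABLE hub_customer (
    customer_hkey TEXT PRIMARY KEY,   -- surrogate/hash key for the business key
    customer_bk   TEXT NOT NULL,      -- business key from the source
    load_dts      TEXT NOT NULL,
    record_src    TEXT NOT NULL
);
CREATE TABLE sat_customer_crm (
    customer_hkey TEXT NOT NULL,
    load_dts      TEXT NOT NULL,
    name          TEXT,
    email         TEXT,
    record_src    TEXT NOT NULL,
    PRIMARY KEY (customer_hkey, load_dts)
);
""")

# The change: a new source (say, a billing system) delivers new attributes.
# Instead of re-engineering sat_customer_crm or its ETL, we add one new
# satellite and one new load routine.
conn.execute("""
CREATE TABLE sat_customer_billing (
    customer_hkey TEXT NOT NULL,
    load_dts      TEXT NOT NULL,
    credit_limit  REAL,
    payment_terms TEXT,
    record_src    TEXT NOT NULL,
    PRIMARY KEY (customer_hkey, load_dts)
)
""")

def load_sat_customer_billing(rows):
    """New, purely additive load job; existing loads keep running unchanged."""
    now = datetime.now(timezone.utc).isoformat()
    conn.executemany(
        "INSERT INTO sat_customer_billing VALUES (?, ?, ?, ?, ?)",
        [(r["customer_hkey"], now, r["credit_limit"], r["payment_terms"], "BILLING")
         for r in rows],
    )
    conn.commit()

# Example: load one delta from the new source.
load_sat_customer_billing([
    {"customer_hkey": "abc123", "credit_limit": 5000.0, "payment_terms": "NET30"},
])
print(conn.execute("SELECT * FROM sat_customer_billing").fetchall())
```

Because the existing hub, satellites, ETL, and downstream queries continue to run exactly as before, the change is purely additive, and that is where the sub-24-hour turnaround comes from.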
