Dynamic Data Modeling, Auto-Adaptation

I’ve long held the belief that the Data Vault modeling style ushered in a new era: one where we have the power to dynamically accept and automatically absorb new data elements.  Of course, as always, there are some rules and regulations (patterns) around when and how this can happen, and when the model still requires manual modeling intervention.  However, imagine a world for a minute where this can happen.  What would it do for you?  How helpful would it be?  Is it something you are interested in?

It is a very important step toward making a data warehouse even easier to deal with, and it’s one that is long overdue.  I’m of the belief that Dynamic Data Models are the future.  What I mean by that term is the ability not only to change the ETL/ELT on the fly to accommodate the new elements, but also to automatically assign the new elements to the target model (be it staging, Data Vault, or even a raw star schema).
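
To make the idea concrete, here is a minimal sketch in Python of the detection half: comparing the columns arriving in a flat-file feed against the columns the staging table already knows about.  The table name, file layout, and metadata set below are all hypothetical; a real implementation would read them from the database catalog and the actual feed.

```python
import csv

# Hypothetical metadata: the columns the staging table currently carries.
# In practice this would come from the database catalog (e.g. INFORMATION_SCHEMA).
STAGING_COLUMNS = {"customer_id", "customer_name", "region_code", "load_date"}

def detect_new_elements(feed_path):
    """Compare the header row of an incoming flat file against the known
    staging columns and return any elements we have never seen before."""
    with open(feed_path, newline="") as f:
        header = next(csv.reader(f))
    incoming = {col.strip().lower() for col in header}
    return incoming - STAGING_COLUMNS

# Example: if tonight's feed suddenly carries "loyalty_tier",
# detect_new_elements("customer_feed.csv") would return {"loyalty_tier"}.
```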

I have a working prototype of this technology in my labs right now, and it’s proving to be quite elegant.  When it sees a new element, it tries to determine where to attach it to the target model (through a custom process), and then re-generates the specific ETL/ELT load code.  Finally, it sends me an email letting me know there’s a new element available.
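
I haven’t published the prototype’s internals, so the sketch below is only a guess at its overall shape, not the actual code: a placeholder heuristic stands in for the custom attachment process, a loader statement is regenerated, and a notification goes out.  Every function name, table name, and address here is an assumption.

```python
import smtplib
from email.message import EmailMessage

def choose_attachment_point(element_name):
    # Placeholder for the "custom process": a trivial heuristic that attaches
    # everything to one default satellite. The real decision logic would
    # consult the model's metadata.
    return "SAT_CUSTOMER_DETAILS"

def regenerate_loader(target_table, element_name):
    # Regenerate the ELT statement for the affected target table.
    # A real generator would work from templates and full column lists.
    return (
        f"INSERT INTO {target_table} (hub_key, load_date, {element_name})\n"
        f"SELECT hub_key, load_date, {element_name} FROM stage_customer;"
    )

def notify(element_name, target_table, smtp_host="localhost"):
    # Send the "new element available" email mentioned above.
    msg = EmailMessage()
    msg["Subject"] = f"New element detected: {element_name}"
    msg["From"] = "edw-automation@example.com"
    msg["To"] = "modeler@example.com"
    msg.set_content(f"{element_name} was attached to {target_table}; loader regenerated.")
    with smtplib.SMTP(smtp_host) as smtp:
        smtp.send_message(msg)

def handle_new_element(element_name):
    target = choose_attachment_point(element_name)
    loader_sql = regenerate_loader(target, element_name)
    notify(element_name, target)
    return loader_sql
```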

Granted, if you are dealing with XML and a new XSD shows up one night with the feed, the job is easier: the new element, its parent, and its relationship to the parent records have already been defined.  But if you simply receive a new element on a flat file, well, that’s a different story.
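
The reason the XSD case is easier is that the schema itself carries the parent and the relationship.  A rough illustration, using only the Python standard library and a made-up schema layout: diffing two versions of an XSD surfaces each new element together with the complexType it sits under, which is exactly the information a flat-file header cannot give you.

```python
import xml.etree.ElementTree as ET

XS = "{http://www.w3.org/2001/XMLSchema}"

def elements_with_parents(xsd_path):
    """Map each declared element name to the complexType it sits under."""
    tree = ET.parse(xsd_path)
    mapping = {}
    for ctype in tree.getroot().iter(XS + "complexType"):
        parent = ctype.get("name", "(anonymous)")
        for elem in ctype.iter(XS + "element"):
            if elem.get("name"):
                mapping[elem.get("name")] = parent
    return mapping

def new_elements(old_xsd, new_xsd):
    old = elements_with_parents(old_xsd)
    new = elements_with_parents(new_xsd)
    # Each new element arrives with its parent type already known.
    return {name: parent for name, parent in new.items() if name not in old}
```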

It’s a fairly easy task to modify both the staging and the Data Vault models once the target structure is figured out.  Generating the new ETL/ELT loader code is also mostly easy.  The harder task is actually getting those ETL/ELT processes deployed to the correct place for execution.  Of course, this sort of activity in production has long been frowned upon, but I say this: for the Data Vault and for the staging area, capturing the information is of the utmost importance.
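
As a rough illustration of the “easy” half, here is what regenerating the structures might look like once the attachment point is decided: an ALTER for the staging table, an ALTER for the satellite, and a rebuilt loader statement.  The naming conventions, column types, and key columns are assumptions for the example, not a prescribed standard.

```python
def alter_statements(element_name, data_type="VARCHAR(255)"):
    """Emit the DDL needed to absorb a new element into staging and the satellite."""
    return [
        f"ALTER TABLE stage_customer ADD COLUMN {element_name} {data_type};",
        f"ALTER TABLE sat_customer_details ADD COLUMN {element_name} {data_type};",
    ]

def satellite_loader(columns):
    """Rebuild the satellite loader with the full (new) column list."""
    col_list = ", ".join(columns)
    select_list = ", ".join("s." + c for c in columns)
    return (
        f"INSERT INTO sat_customer_details "
        f"(hub_customer_key, load_date, record_source, {col_list})\n"
        f"SELECT h.hub_customer_key, s.load_date, s.record_source, {select_list}\n"
        f"FROM stage_customer s\n"
        f"JOIN hub_customer h ON h.customer_id = s.customer_id;"
    )

# After the new element is absorbed:
#   alter_statements("loyalty_tier")
#   satellite_loader(["customer_name", "region_code", "loyalty_tier"])
```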

The fact that it arrived becomes a part of the audit trail.  I also say this: business users who are used to changing business rules through dynamic interfaces ALREADY do this sort of thing in production.  So if they do it in OLTP, why shouldn’t they do it for data warehousing?

I believe Dynamic Data Warehousing is on its way, and I will do everything I can to get it there and to prove the concept works, and works well.  Who knows, it might never be accepted by the mainstream, but then again, it might be forcibly pushed by the business user community demanding more rapid response to changes from the EDW team.

Do you have a Dynamic Data Warehousing need?  Ask your questions…  I’d love to hear your thoughts, and the pros and cons you see on the subject.

Thanks,
DanL@DanLinstedt.com
