Do You Believe in Standards?

Even after my last post about standards, rules, and procedures, there are still a number of people in the industry who believe in “modifying the structure” of the Data Vault, breaking the rules and standards, and thinking that it’s OK to do this and still call it a Data Vault.  In this post, I will cover why the fundamental standards of the Data Vault are important, and IF you believe that you have a modification that will make the Data Vault better, I will explain the process that I expect you to go through (publicly) in order to demonstrate to me that it is worth a look.

Where did it all begin?

It all began in 1990 when I noticed there were no documented best practices, no rules, and no standards for building, architecting, and implementing enterprise data warehouses.  I knew from experience that bending and changing existing rules for data modeling simply wouldn’t fit the bill.

You see, I was (back then) doing to 3rd normal form modeling what some people want to do now to the Data Vault.  I was BREAKING THE RULES defined by Codd & Date.  I was modifying the architecture to force-fit it to the needs of the Data Warehouse.  I was breaking more rules by applying common SDLC project management to data warehousing.  I was changing SDLC to meet the needs of the project.

3nf was designed for OLTP.  All of its rule sets were made for OLTP, and none of those rules addressed the needs of data warehousing.  I found this out the hard way!  After breaking the rules and changing the architecture and the design, I ended up with a data warehouse (3 months later) that required changes.  Along the way, I began to realize that the more rules and standards I broke (to make it fit data warehousing), the harder it became to maintain the data model AND the project.

I then tried Star Schema – much to my chagrin, I got the SAME RESULT… well, if you don’t change where you’re going, you will end up where you are headed!

So what happened next?

I decided that there were some really good parts of the architecture from both 3nf and star schema that worked within the bounds of a data warehouse.  There were also some really good parts of SDLC that worked.  So I took the best-of-breed approaches and put them in a pot to stir up a new batch of architecture and methodology.

In reality, I realized that in order to “come up” with the right rules and standards, I HAD to follow the SCIENTIFIC APPROACH.  In other words, I needed to develop a rule, then test it against a set of criteria that all data warehouses face!  Without some form of scientific test and control experiment I would not get any plausible results, nor be able to test for failures.

IF YOU WISH TO CHANGE, ALTER, OR MODIFY THE DATA VAULT RULES AND STANDARDS – AND STILL CALL IT A DATA VAULT – THEN YOU NEED TO FOLLOW THE SAME PROCESS!  Only after you submit your results to me can I possibly consider a change to the architecture and/or the rule sets.

So what are the criteria you tested with?

The following are the criteria that I tested with, and that you must test with in order to submit for architectural changes or rule changes to the Data Vault model and/or methodology:

  • Scalability –
    • MODEL: please arrange 3 test cases to ensure your solution will scale; today’s scalability test should take the architecture/model to 300 terabytes without breaking.  A theory should be formed about “what happens when you reach 500 terabytes, 1 petabyte, 2 petabytes, and 3 petabytes.”
    • METHODOLOGY: please arrange scale-up and scale-down of your team resources or your project approach; MONITOR and MEASURE ESTIMATED HOURS versus ACTUAL HOURS, and also watch the COMPLEXITY rating (use a scoring mechanism for this – if you don’t have one, look at FUNCTION POINT ANALYSIS).
  • Flexibility –
    • MODEL: Please arrange 3 test cases for change to DIFFERENT PARTS of the data model.  The MOST IMPORTANT OUTCOME is to have ZERO re-engineering of upstream and downstream processes (including other parts of the data model, querying processes, loading processes, real-time loads, re-indexing, database activities, etc.)  THERE IS A LOT TO CONSIDER HERE!  BE THOROUGH AND COMPLETE.
    • METHODOLOGY: Please ensure that the changes to the data model can be accomplished within 2 hours, and that any new loading processes and any new queries can be written and unit-tested within the 8 hours of a standard working day.
    • ANY DEVIATION from the numbers above, and your proposed change FAILS the test!
  • Repeatability / Redundancy
    • MODEL: please arrange 3 test cases to apply the change to the ENTIRE model.  ANY change to the fundamental structure must be applied to the entire model.  What’s done to ONE LINK MUST BE DONE TO ALL LINKS – AND STILL WORK!  The same can be said for Hubs and Satellites.  The test case must prove that it works FOR ALL DATA VAULT MODELS FOR ALL TIME; otherwise it is not a change worth discussing.  If there is a “condition” set up, like a TRANSACTION LINK, then the condition must be weighed against the other standards.  (REMEMBER, EVEN WITH A TRANSACTION LINK THERE ARE NO CORE MODIFICATIONS TO THE FUNDAMENTAL LINK STRUCTURE – especially the primary key, the business keys, and the driving keys – see the sketch just after this list.)  TOO many conditions or exceptions will raise the complexity rating, and it will fail the flexibility test!!
    • METHODOLOGY: please ensure that the changes to the rules CAN BE monitored, measured, and optimized.  Again, if you put too many “exceptions” around the methodology for implementation (for instance: only do this when…), then the method becomes unruly and impossible to measure.  It goes without saying: if you can’t measure it, you can’t monitor it.  If you can’t monitor it, you can’t optimize it.  If you can’t optimize it, you CAN’T ACCOMPLISH THE TASK REPEATABLY – you can’t generate or automate it!!
  • Business Value
    • MODEL: If you cannot see, clearly explain, or measure the BUSINESS VALUE of the new attribute or change to the model, then it fails the test – and HAS NO PLACE in the Data Vault at all.  You must clearly document the BUSINESS VALUE – in general, no matter what the business – EVERYONE using the Data Vault MUST BENEFIT in order for the change to pass the test.
    • METHODOLOGY: The business value is all in measurement, optimization, and transparency.  If your new process is secretive, then it cannot be open to review by the business.  WE in IT are BUSINESS PEOPLE, and it’s HIGH TIME we started acting like it.  We need to run IT like a business (because that’s what it is), and in doing so our methodology must be transparent to our customers.  If you cannot justify the BUSINESS VALUE of the process you are adding to the methodology, then IT HAS NO VALUE and does not belong in the Data Vault Methodology or the project plan.
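
To make the repeatability point concrete, here is a minimal sketch of why a rule that holds for ALL Links can be generated and automated.  It is only an illustration – the column names, data types, and the sequence-style keys are examples, not an official template:

    # Minimal sketch: because every standard Link shares the same fundamental
    # structure, one small function can generate the DDL for ANY Link from
    # nothing but its name and the Hubs it relates.  Names/types are illustrative.
    def link_ddl(link_name, hub_names):
        """Emit CREATE TABLE DDL for a standard Link relating the given Hubs."""
        columns = [f"{link_name}_SQN BIGINT NOT NULL"]                  # surrogate key
        columns += [f"{hub}_SQN BIGINT NOT NULL" for hub in hub_names]  # one key per Hub
        columns += ["LOAD_DTS TIMESTAMP NOT NULL",                      # load date/time stamp
                    "REC_SRC VARCHAR(50) NOT NULL"]                     # record source
        keys = [f"PRIMARY KEY ({link_name}_SQN)",
                "UNIQUE (" + ", ".join(h + "_SQN" for h in hub_names) + ")"]  # key combination
        return ("CREATE TABLE " + link_name + " (\n    "
                + ",\n    ".join(columns + keys) + "\n);")

    # The same rule, applied to any Link, with no exceptions to special-case:
    print(link_ddl("LNK_CUSTOMER_ORDER", ["HUB_CUSTOMER", "HUB_ORDER"]))

The moment a change forces a condition into that rule (“only do this when…”), the generation breaks down, the complexity rating climbs, and the change fails the tests above.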

Ultimately, all of these tests need to be applied ACROSS THE BOARD.  They need to work in Batch loads, Real-time loads, Real-time queries, Data Mining, Alerts, and Mixed Workload (Real-time & batch at the same time, WHILE querying).  They need to pass physical PERFORMANCE TESTS in order to be allowed into the standards.  They also need to be technology independent.  It shouldn’t matter (to the new standard) if the model is built on SSD, or in RAM, or on a columnar DB, a relational DB, or even a NoSQL DB.

Conclusions…

I welcome thoughts, comments, and ideas about how and when to change the Data Vault model – one of the current items under heavy review and scrutiny is, in fact, END-DATES in LINKS.  However, I would politely request, and encourage you, to go through the scientific process – as I did.  That’s what I spent 1990 to 2000 doing: research and design on both the methodology and the data model standards.

IT’S BECAUSE I SPENT THE TIME TO TEST IT, THAT IT IS CURRENTLY STABLE, STRONG, AND PROVING ITSELF AS A VIABLE SUCCESS IN BUSINESSES AROUND THE WORLD TODAY.

This is also why I say to you: if the data model or methodology does not follow the standards, then it cannot be called a Data Vault.

I hope this clears the air, and I welcome discussion (both here and on the forums) about all these topics.

Thanks,
Dan Linstedt


One Response to “Do You Believe in Standards?”

  1. Raphael Klebanov 2010/12/24 at 10:40 am

    Dan,

    Your posting definitely clears the air. I have had a number of discussions with my customers where I almost “force-fed” them the Data Vault standards, going over and over the advantages of the Data Vault approach and the absolute importance of following the rules. Luckily, you define those rules clearly and consistently in several sources, so my job of interpretation was next to nothing.

    However, I would like to mention one Data Vault object (forgive me if this is a false statement) that I learned from Kent Graziano, called the Slink. A Slink is essentially a combination of a LINK and a SATELLITE.
    Before implementing Slinks, I researched this object but did not find anything definitive about their use. (The closest is the item from the Data Vault Discussion area: … Link with optional relation; What is a SLINK? and does it taste good? … http://www.datavaultinstitute.com/forums/archive/index.php/f-45.html)

    I implemented Slinks in a couple of business areas for customers’ EDWs, and they seem to work fine. Even the use of a third-party tool, WhereScape, for developing the procedural code did not cause any issues.

    My question is: would you approve adding Slinks to the list of acceptable objects to be used within the Data Vault framework, or would you consider Slinks a non-standard Data Vault component, with all the consequences that follow? I do believe in standards and would like clarification on the subject of Slinks.

    Side Note:
    1. A “Standard Link” associates more than one table (Hubs, and possibly other Links). It carries more than one foreign key.
    2. A “Slink” can be seen as an effectivity Link with its Satellite (all in one table). It is used to capture the history of relationships when there are no other descriptive attributes on the link. The Slink carries the End_Date and does not have a Satellite. It also has the Load_Date as part of the primary key, to allow the relationship to “come and go”. If other attributes are discovered later, a Satellite can be added without losing the historical data. (A rough sketch of the structure follows this list.)
    3. I can provide additional information on the implementation of Slinks – email me at rklebanov@wherescape.com.
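
    For illustration only – these column names are mine, from one of the implementations, not an official standard – the shape of a Slink can be summarized like this:

        # Rough sketch of a Slink: the standard Link columns plus an End_Date,
        # with Load_Date as part of the primary key so the same relationship
        # can "come and go" over time.  Column names are only an example.
        SLINK_CUSTOMER_ORDER = {
            "columns": [
                "HUB_CUSTOMER_SQN",   # key of the first Hub
                "HUB_ORDER_SQN",      # key of the second Hub
                "LOAD_DTS",           # when this version of the relationship arrived
                "END_DTS",            # when it was ended (open while still active)
                "REC_SRC",            # record source
            ],
            "primary_key": ["HUB_CUSTOMER_SQN", "HUB_ORDER_SQN", "LOAD_DTS"],
        }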

    Greatly Appreciated! And Happy Holidays!
    –Raphael
