Trials Of DV Code Generation

This entry is a brief walk-through of issues around generating loading code for the Data Vault.  Some of these you may have thought of; others, maybe not.  In any case, I’ve been working hard on RapidACE v2 and on getting the code generation up as SaaS, but it’s been very difficult to get the templates right and to accommodate most (if not all) of the special cases that have to be handled.  If you are writing or have written a code generator, you might find this interesting – or maybe you’ve already solved these issues, and if so, congrats…  It’s a tough challenge.

What are the issues?

The issues stem from metadata: all the different possible combinations of cross-reference metadata that can be used to “guide” the code-generation process.  I’ve run across hundreds of issues (so far); a few of the particularly thorny ones are listed below (a minimal sketch of the template approach follows the list).  They are especially tricky given that the code generator I’ve written uses cross-references and templates exclusively to generate the code.

  • Composite Hub Keys
  • Multiple hierarchical lookups for dependent children (Link tables needing TWO parent Hub lookups on the same field)
  • Ok – composite hub keys coupled with multiple hierarchical lookups for dependent children.
  • Multi-layered Link tables (shouldn’t have this, but can!)
  • Detecting and identifying Sequence Fields (by naming convention, data type, etc.)
  • Utilizing customized Satellite keys (how do you generate a working load pattern for a customized Sat?)
  • Degenerate Hubs (not physically implemented, except for a key in the Link table)
  • Field operations, filters, joiners, expressions, conditions
  • Field name changes (from input fields to output fields)

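To make a couple of these concrete – composite Hub keys in particular – here is a minimal sketch of template-driven Hub load generation.  This is plain Python with a hypothetical metadata dict and template, not RapidACE itself.  The point is that every business-key column must be repeated in the SELECT list, the existence join, and the NULL filters, which is exactly where composite keys start to strain a naive template:

```python
# A minimal sketch (hypothetical metadata and template, not RapidACE):
# render a Hub load statement from cross-reference metadata. A composite
# Hub key forces every key column to appear in the SELECT list, the
# existence join, and the NULL checks.

HUB_TEMPLATE = """\
INSERT INTO {hub} ({bk_cols}, LOAD_DTS, REC_SRC)
SELECT DISTINCT {src_cols}, CURRENT_TIMESTAMP, '{rec_src}'
FROM {stage} stg
WHERE NOT EXISTS (
    SELECT 1 FROM {hub} h
    WHERE {join_cond}
)
AND {not_null_cond};
"""

def generate_hub_load(meta: dict) -> str:
    """Render a Hub load statement from cross-reference metadata."""
    keys = meta["business_keys"]  # list of (stage_col, hub_col) pairs
    return HUB_TEMPLATE.format(
        hub=meta["hub"],
        stage=meta["stage"],
        rec_src=meta["record_source"],
        bk_cols=", ".join(h for _, h in keys),
        src_cols=", ".join(f"stg.{s}" for s, _ in keys),
        join_cond="\n      AND ".join(f"h.{h} = stg.{s}" for s, h in keys),
        not_null_cond="\n  AND ".join(f"stg.{s} IS NOT NULL" for s, _ in keys),
    )

# Example: a composite two-part business key.
print(generate_hub_load({
    "hub": "HUB_ORDER_LINE",
    "stage": "STG_ORDERS",
    "record_source": "SAP",
    "business_keys": [("ORDER_NO", "ORDER_NO"), ("LINE_NO", "LINE_NO")],
}))
```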
The system I’ve constructed for RapidACE v2 handles all of these requirements.  Now, couple this with the ability to “generate generic metadata” that ANY ETL template can “use” to output mappings, and that’s where further complications come in.  For instance, ETL engines handle filters differently, just as they handle expressions differently (pretty much everything is handled differently).
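To illustrate why one set of metadata rarely maps cleanly onto every engine, here is a small hypothetical sketch: a single engine-neutral filter definition rendered two different ways.  The emitter functions and operator mappings are placeholders, not any real engine’s API:

```python
# Hypothetical engine-neutral filter metadata rendered per target engine.
from dataclasses import dataclass

@dataclass
class Filter:
    column: str
    operator: str  # neutral operator name: "eq", "ne", ...
    value: str

def emit_sql(flt: Filter) -> str:
    # Render the neutral filter as a SQL WHERE clause.
    ops = {"eq": "=", "ne": "<>"}
    return f"WHERE {flt.column} {ops[flt.operator]} '{flt.value}'"

def emit_engine_x(flt: Filter) -> str:
    # A made-up second target: same metadata, different expression syntax.
    ops = {"eq": "==", "ne": "!="}
    return f'FILTER({flt.column} {ops[flt.operator]} "{flt.value}")'

f = Filter("REC_SRC", "eq", "SAP")
print(emit_sql(f))       # WHERE REC_SRC = 'SAP'
print(emit_engine_x(f))  # FILTER(REC_SRC == "SAP")
```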

At the #dvseminar in the Netherlands recently, I saw a generic approach that produces a single Hub load, a single Link load, and a single Satellite load.  This is very intriguing, given that these ETL engines are so heavily structure-driven.  I’m beginning to wonder if we should be converting ALL our data sets to XML and calling it good.
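For what that could look like in practice, here is a minimal sketch of the idea, assuming a hypothetical hub_metadata table: instead of generating one mapping per Hub, a single routine reads the key mappings at run time and can load any Hub.  All table and column names are made up for illustration:

```python
# A minimal sketch of a single, metadata-driven Hub load: one routine
# loads any Hub from key mappings kept in a (hypothetical) metadata table.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE hub_metadata (hub TEXT, stage_table TEXT, stage_col TEXT, hub_col TEXT);
INSERT INTO hub_metadata VALUES ('HUB_ORDER', 'STG_ORDERS', 'ORDER_NO', 'ORDER_NO');
CREATE TABLE STG_ORDERS (ORDER_NO TEXT);
INSERT INTO STG_ORDERS VALUES ('A1'), ('A2');
CREATE TABLE HUB_ORDER (ORDER_NO TEXT, LOAD_DTS TEXT, REC_SRC TEXT);
""")

def load_any_hub(hub_name: str) -> None:
    """Load any Hub using key mappings stored in the metadata table."""
    rows = conn.execute(
        "SELECT stage_table, stage_col, hub_col FROM hub_metadata WHERE hub = ?",
        (hub_name,),
    ).fetchall()
    stage = rows[0][0]
    src = ", ".join(r[1] for r in rows)
    tgt = ", ".join(r[2] for r in rows)
    cond = " AND ".join(f"h.{r[2]} = s.{r[1]}" for r in rows)
    conn.execute(
        f"INSERT INTO {hub_name} ({tgt}, LOAD_DTS, REC_SRC) "
        f"SELECT DISTINCT {src}, CURRENT_TIMESTAMP, 'GEN' FROM {stage} s "
        f"WHERE NOT EXISTS (SELECT 1 FROM {hub_name} h WHERE {cond})"
    )

load_any_hub("HUB_ORDER")
print(conn.execute("SELECT * FROM HUB_ORDER").fetchall())
```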

What have you run into?

What types of issues have you had to solve in your code generators?   I’d be curious to hear about them.

Thanks!
Dan Linstedt


One Response to “Trials Of DV Code Generation”

  1. Eric Steele 2011/06/03 at 10:04 am

    I can only imagine the complexity of trying to handle this many nuances for multiple ETL engines.

    I wrote a set of Python scripts that completely regenerate our loaders. They pull from one system, Consensus, and populate one ETL tool, Pentaho, and I had plenty of hurdles to figure out just doing that.

    The Pentaho transformations and jobs are XML files, so I have a Python module that keeps an XML template for each kind of component I need to use. I then populate the templates with the data needed for each specific loader and stack the filled templates together to generate the completed files.
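    A minimal sketch of the kind of template stacking described here, using Python’s string.Template. The step snippets below are simplified placeholders, not real Pentaho (.ktr) markup:

    ```python
    # Hypothetical, simplified stand-ins for per-component XML templates;
    # real Pentaho step XML is far more detailed than this.
    from string import Template

    TABLE_INPUT = Template(
        "  <step><name>$name</name><type>TableInput</type><sql>$sql</sql></step>"
    )
    TABLE_OUTPUT = Template(
        "  <step><name>$name</name><type>TableOutput</type><table>$table</table></step>"
    )

    def build_transformation(steps):
        # Stack the filled step templates inside one transformation document.
        return "<transformation>\n" + "\n".join(steps) + "\n</transformation>"

    print(build_transformation([
        TABLE_INPUT.substitute(name="read stage", sql="SELECT * FROM STG_ORDERS"),
        TABLE_OUTPUT.substitute(name="write hub", table="HUB_ORDER"),
    ]))
    ```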
