I’ve decided to post my background here for your perusal – so you can see where I’ve been, and what my qualifications are to work with you.
My name is Dan Linstedt, I’ve been working in IT for 25+ years, and in Data Warehousing/BI since 1990. You can find out more information about me at: http://www.linkedin.com/in/dlinstedt
I’ve completely redesigned my site in hopes of making it more active and more interactive. I will be posting YouTube videos, samples, downloads, information, and documentation about the Data Vault Modeling structures and the Data Vault Methodology here. This site can become your primary source of information for all things Data Vault.
I hope that you will use the comments to share your questions, your thoughts, and information about your successes on current projects using the Data Vault. I will begin by describing the history of the Data Vault in my first set of posts.
Welcome, everyone, to my blog – from the founder, author, and creator of the Data Vault.
If you’d like help on-site, I also perform consulting engagements – you can find out more at my consulting company: http://www.EmpoweredHoldings.com
1982 – Began working in assembler code on a DEC VT180 (Z80/8080 motherboard). I wanted it to read MS-DOS 1.0 disks, so I rewrote the BIOS to change the disk skew factors. The compiler tried to compile the whole BIOS in 64 KB of RAM and didn’t work properly, so I rewrote the compiler to swap its libraries to disk. The linker then refused to link all the libraries (again, it tried to put the whole symbol table and all the jump tables in 64 KB of RAM), so I rewrote the linker to use disk linking/swapping as well. I was finally able to read MS-DOS 1.0 disks. I continued to work in assembly language for the next 4 years, bypassing BASIC until I moved on to programming the TRS-80.
1984-1987 – Worked on the Apple II series and IBM XT series, creating simple games and sprite-based graphics editors. I was a little upset that my VT180 didn’t have a music-processing chip, so I wired up my electronics circuit board to build a small amplifier. I fed the variable voltages from the RS-232C transmit port into the amplifier, then programmed a simple one-note player on the VT180 to transmit bytes of information out the RS-232C port to the amplifier. Voila – I could pre-program Bach one note at a time… although it took me a while.
I also attended a summer camp for computer kids at Stanford University to improve my programming skills. They had a really cool stand-up video game in the student center called Cloak & Dagger that I liked to play. During high school I designed something called a DRAM cartridge drive, as EPROMs were the hot new thing. The DRAM cartridge drive was based on flash PROMs on a cartridge that would be inserted through a slot in the computer. I took it to HP, where it made it to the assistant VP general legal counsel. They told me, “It’s too early – bring it back in a couple of years.” So I refined it, and a few years later brought it back – they then handed me a brochure for the HP Palmtop LX with its PCMCIA cartridge slot… At least I tried.
I went on to learn BASIC and Pascal, then moved on to Turbo Pascal. From there I learned object-oriented programming and built an original DOS windowing environment for my internship job in Quality Assurance at Borland Software. During my time at Borland I learned Turbo Assembler and Windows 3.0 programming, working with test scripts, regression testing, compiler testing, and IDE components. After leaving Borland, I demoed the windowing libraries to Symantec for Macintosh IDE development. I then wrote my own resource editing/development toolkit called QuickRez, which allowed developers to build, edit, compile, compress, and re-attach Turbo Vision resource files.
After my internship was over I returned to college at Chico State, where I worked for one of my favorite professors building a dBASE II/dBASE III OLTP application. The app collected operational data from multiple sites around California for the HUD energy-savings programs; then at night it used an auto-dialer (a headset placed into a data terminal) to synchronize ALL of the dBASE databases, so that in the morning they would all have updated numbers. Alas, HUD cancelled the energy-savings program at the last minute. So I went off to work in the real world.
I began working in a 4GL (the Omnis programming language/GUI from Blyth Software – a cross-platform OLTP tool with an internal database), using PowerBuilder and Omnis for OLTP front ends. I worked in technical support for a couple of years trying to help people solve their problems, spending the majority of my time reading the source code for the engineers and assisting them in discovering multiple timing bugs (concurrency locks that didn’t work, and caused corruption in their proprietary database files).
1990 – I also began dabbling in historical databases, data warehouses, and statistics (BI analytics these days). I began thinking about consolidating data, how to scale it, and how to support terabytes in data stores. I thought that consolidation of different data stores might be a partial answer to the data integration problems my Dad had discussed with me; he was always telling me how the US Military needed systems that would integrate disparate data – at the time, especially COTS packages and their data. This was when I started playing with data models, and I realized very quickly that 3NF wouldn’t really cut it, for a variety of reasons.
1993 – It was right around this time that I was introduced to Informatica (Gaurav and Diaz and their development staff). I moved on to working for a variety of consulting organizations, doing everything from ETL to data warehouse build-outs. Shortly thereafter I began working in the government space, continuing to solve some of the government’s integration problems and disparate data sets. I stayed focused on data warehousing and data integration problems, and continued to work on the Data Vault Modeling components as they began to take shape.
1997 – By this time I had a few data warehouses under my belt, and the Data Vault Model was really holding its own. I was working in a situation that required SEI/CMMI Level 5 IT business processes, even for data warehouses. The government had issues with Y2K coming, and accountability and auditability of the data warehouses had become stable through the Data Vault efforts. So we were focusing heavily on process repeatability, measurement, lean initiatives, and cycle-time reduction (known in the ’80s as Business Process Re-engineering). I began to adapt the SDLC, RAD techniques, and PMP to the Data Vault Model. The idea I wanted to realize was the ability to build multiple data warehouses the same way, over and over again, regardless of the requirements. I also wanted a business process (for IT) that would allow us to measure and optimize the data warehousing and BI delivery mechanisms. I was tired of IT consistently being over budget and blowing timelines.
1999 – The crowning Data Vault project had me working with Informatica versions 2.7 and 3.0, replacing SQL scripts, HTML on a DEC Alpha web server, and Perl scripts for ETL transfer. In 3 weeks we successfully re-wrote, with more reliability and continuity, project scripts that had originally taken us 3 months to create by hand.
2000 – I spent the year discussing these things with Claudia Imhoff, Bill Inmon, Clive Finkelstein, Kent Graziano, and a few others in the data warehousing industry. I tried several times to get Dr. Kimball’s opinion, even sending him drafts of my yet-to-be-published articles – but alas, for whatever reason, I never received a response, even to this day. So, in 2001, I published the articles. I was convinced that if the Data Vault were to proceed successfully, it had to do so on its own merits, as a grassroots effort.
2001 through 2006 – I spent a lot of time working with Informatica, helping them land large deals and close a number of multi-million-dollar sales. I used to travel out to HQ once a year for a pow-wow session with the engineers, to discuss how the product was being used at their largest clients. I also spent time as the chairperson of the Informatica Developer Network; in fact, I was instrumental in helping Allan Prattis build the IDN in the first place. I worked with Ivan Chong on PowerAnalyzer, and later on PowerCenter core engineering, and worked with Informatica to help them better address market needs. Along the way, I assisted FirstLogic, X-Aware, IBM (with DB2), and Teradata with their product lines.
2006 to 2009 – I learned Pentaho, Jasper, and Talend. I also picked up PHP, Adobe Flash, HTML, XML, and Java, and can work in many different environments. I got to meet IT folks from CERN, the Department of Defense, the US Navy, and the US Army. I also got to help solve multi-hundred-terabyte-sized problems and work with MPP systems, SMP systems, clusters, and so on. I even got to work with a petabyte system – what a rush that was. Can you say I love data? Not really; I just like solving problems.
Here we are today, in 2010 – ten years later – and I’m very proud to say that we have a great number of Data Vault successes around the world, with more being built every day.
Let me bring my knowledge to you; let me help you with your contracts; let me assist your IT team to deliver on time and on budget. I know it can work.