In discussing the applicability of Extract, Load and Transform with a client, I realized that this is a great top-down approach for quickly getting data into the data warehouse so that it can be leveraged by the end-user. So take this example …
Step 1 – ELt – meaning load the data into the data warehouse with only minor transformation (date formats and so forth), but largely as a copy of the source system data – call it a working data store within the EDW.
Step 2 – Create a view into the data for the end-user to build their BI on. This should take only 1-2 weeks. I think you have a happy end-user.
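As a sketch of what steps 1 and 2 might look like, here is the small "t" and the end-user view expressed in SQL (run through Python's sqlite3 for illustration; every table and view name here – orders_src, wds_orders, v_orders – is hypothetical, not something from a real EDW):

```python
import sqlite3

con = sqlite3.connect(":memory:")

# Stand-in for the source system extract (hypothetical schema).
con.execute("CREATE TABLE orders_src (id INTEGER, cust TEXT, order_dt TEXT, amt REAL)")
con.executemany(
    "INSERT INTO orders_src VALUES (?, ?, ?, ?)",
    [(1, "Acme", "03/15/2024", 100.0), (2, "Zenith", "03/16/2024", 250.0)],
)

# Step 1 - the small "t": land a near-copy in the working data store,
# applying only trivial transformations (here, MM/DD/YYYY -> ISO dates).
con.execute("""
    CREATE TABLE wds_orders AS
    SELECT id,
           cust,
           substr(order_dt, 7, 4) || '-' || substr(order_dt, 1, 2)
             || '-' || substr(order_dt, 4, 2) AS order_dt,
           amt
    FROM orders_src
""")

# Step 2 - a view over the working store for the end-user's BI tool.
con.execute("CREATE VIEW v_orders AS SELECT id, cust, order_dt, amt FROM wds_orders")

for row in con.execute("SELECT * FROM v_orders ORDER BY id"):
    print(row)
```

The point of the view is indirection: the BI layer binds to v_orders, not to the physical table, which is what makes step 4 painless later.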
Step 3 – Work with the end-user as they progress through the evolution of their data requirements, capturing their true requirements. At the same time, make sure you are looking at how other data has been normalized into the EDW so you properly understand the situation.
Step 4 – Now, using the proper top-down modelling tools, create your T for the ELT to move the data out of the working data store and into the EDW proper. Be careful to keep supporting the view so you don't put any bumps in the end-user's road.
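Step 4 could be sketched like this: the big "T" normalizes the working-store data into the EDW proper, and then the view is swapped to read from the normalized tables with the same column shape, so the BI layer never notices. (Again a hypothetical sketch via sqlite3; edw_customer, edw_order, and the other names are made up for illustration.)

```python
import sqlite3

con = sqlite3.connect(":memory:")

# Starting state: the working data store and its view from steps 1-2.
con.execute("CREATE TABLE wds_orders (id INTEGER, cust TEXT, order_dt TEXT, amt REAL)")
con.executemany(
    "INSERT INTO wds_orders VALUES (?, ?, ?, ?)",
    [(1, "Acme", "2024-03-15", 100.0), (2, "Acme", "2024-03-16", 250.0)],
)
con.execute("CREATE VIEW v_orders AS SELECT id, cust, order_dt, amt FROM wds_orders")

# Step 4 - the real "T": normalize into the EDW proper (customer pulled
# out into its own table, orders keyed to it).
con.execute("CREATE TABLE edw_customer (cust_id INTEGER PRIMARY KEY, name TEXT UNIQUE)")
con.execute("INSERT INTO edw_customer (name) SELECT DISTINCT cust FROM wds_orders")
con.execute("""
    CREATE TABLE edw_order AS
    SELECT o.id, c.cust_id, o.order_dt, o.amt
    FROM wds_orders o JOIN edw_customer c ON c.name = o.cust
""")

# Swap the view to read from the EDW, keeping the same columns so the
# end-user's BI hits no bumps in the road.
con.execute("DROP VIEW v_orders")
con.execute("""
    CREATE VIEW v_orders AS
    SELECT o.id, c.name AS cust, o.order_dt, o.amt
    FROM edw_order o JOIN edw_customer c ON c.cust_id = o.cust_id
""")

for row in con.execute("SELECT * FROM v_orders ORDER BY id"):
    print(row)
```

Because v_orders exposes the same columns before and after the swap, the end-user's reports keep working while the data underneath moves into third normal form.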
Does this sound like Enterprise Information Integration? Sort of – this is a short-term virtual model that gets implemented into a third-normal-form EDW for all future use. This approach can save up to 50% of normal ETL development: the small "t" at the beginning is very simple and can be standardized, and the process is iterative and client-inclusive.
So what if we aren't ready, or we already have an investment in ETL tools? Take the same approach above using ETL. You won't see the same savings, due to tool and server costs, but you should still be able to reduce cost and be more responsive. But wait, that's not all … only kidding. This process does, however, need strong guiding principles, with a focus on re-usability and standardization.