Welcome to the 17th Episode of the CoffeeCAST with Stephen Hayward of Project X.
Ask and you shall receive – I asked and Yves agreed. This week we are podcasting from two continents: I am in North America, and joining me is Yves de Montcheuil, Director of Product Marketing at Sunopsis, who happens to be in France for our chat. Sunopsis is changing the way we do Data Integration with their ELT tool and Integration platform (to learn more about ETL vs ELT, see these posts: EIEIO and ETL vs ELT).
In our conversation today we talk about:
- Complexity of using the tool
- Type of user
- How it works (high level)
- What you want to do – leverage the Process Model Tool
- How you want to do it – Knowledge Management Tool
- Benchmarks of ELT vs ETL with some examples
- UK Health Insurance Company
- Italian Example – 63 hours to 4 minutes – an extreme case, but interesting in that the old ETL process was rebuilt in ELT in two weeks.
- Yves was careful not to mention Ascential and Informatica, which are ETL platforms.
- The reason I wanted to discuss the benchmarks was to look at the increase in productivity in developing ETL and the ultimate cost reduction.
- Data Movement
- Data Conductor – ELT in batch
- Active Integration Platform – Batch, asynchronous and synchronous (service oriented architecture)
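To make the ETL vs ELT distinction above concrete, here is a minimal sketch (my own illustration, not Sunopsis code – table and column names are hypothetical, and SQLite stands in for the target database). ETL pulls the rows out and transforms them in the tool's engine; ELT loads first and pushes the transformation down into the database as generated SQL:

```python
import sqlite3

# Hypothetical staging table in the target database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO raw_sales VALUES (?, ?)",
                 [("EU", 100.0), ("EU", 50.0), ("US", 75.0)])

# ETL style: extract rows into the tool, transform there, load results back.
rows = conn.execute("SELECT region, amount FROM raw_sales").fetchall()
totals = {}
for region, amount in rows:  # transformation runs in the tool's engine
    totals[region] = totals.get(region, 0.0) + amount
conn.execute("CREATE TABLE etl_totals (region TEXT, total REAL)")
conn.executemany("INSERT INTO etl_totals VALUES (?, ?)", totals.items())

# ELT style: data is already loaded, so the transformation is just SQL
# executed by the target database itself -- no rows leave the server.
conn.execute("""CREATE TABLE elt_totals AS
                SELECT region, SUM(amount) AS total
                FROM raw_sales
                GROUP BY region""")
```

Both tables end up with the same totals; the difference is where the work happens, which is why pushing transformations into the database can produce the kinds of speedups discussed in the benchmarks.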
I really enjoyed the conversation and look forward to understanding the product in better detail as we continue our evaluation. I hope you enjoy it too. If you have any thoughts, join in by posting a comment or dropping me an email.
The Podcast is available here or on iTunes for download or subscription. Have a great day.
Podcast: Play in new window | Download
Jim – thanks for your comments. The 900x performance increase from 63 hours to 4 minutes was a special one (but it’s so good I like to mention it!), most of our clients “only” get 5x to 100x performance gains.
Regarding business analysts, maybe some Sunopsis clients will comment, but I’d like to emphasize the fact that once best practices have been implemented in Data Conductor by the IT folks, users just need to define their business rules – “what” they want to do – and Data Conductor generates all the complex data flow and optimized code for them.
I really enjoyed that discussion. I wonder if these tools are as good as Yves claimed. That increase from 63 hours to four minutes was incredible.
I really like your metaphor comparing data transfer to delivering items to a store and getting them on the shelf.
I would love to hear from some users about how easy the tool is to use by Business Analysts.