Over the weekend I revisited Tableau, enjoyed some success with MonetDB, tried to turn MySQL into a hundred-million-row data warehouse, was underwhelmed by Firebird, installed Greenplum and spent many frustrating hours with Talend Open Studio, Pentaho Kettle and Jitterbit.
Of course, I could just buy QlikView, but what can be done for less money? Unfortunately, data warehouses and BI front-ends are not sexy problems in the open-source community. Graphs and charts get a little more attention, but you’ll need to write your own code to glue them to your application.
In summary, what can I say about our options?
First, write your own ETL. Why do open-source ETL tools like Talend and Kettle work so hard to rebuild Informatica? It reminds me of Linux in the 1990s, when the community wanted to beat Windows, kept working to look like Windows, and wondered when victory would arrive. Informatica, like OLAP and mainframes, is from an era when memory was scarce and languages were low-level, slow to compile and run, abstracted little, and were hardly portable. On top of that, ODBC drivers were tightly controlled and costly.
But now we can pick from many great scripting languages. Today’s languages abstract away the hard parts, are easy to read, need no compile cycle, and talk to any system, database, web service or application. I think the next direction for ETL will be a simple (but extensible) transformation language built on an ORM wrapper… Rails on ETL. Until that arrives, you can achieve everything you need with PHP, Perl, Ruby and others.
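To show how little code a hand-rolled ETL pass takes in a scripting language, here is a minimal Ruby sketch. The file names and column layout are invented for the example: read rows from a raw CSV extract, clean up a couple of fields, and write a load-ready file.

```ruby
require 'csv'

# Extract: read a raw CSV export (hypothetical file and columns).
# Transform: trim whitespace, upcase the region code, round the amount.
# Load: write a clean file ready for a bulk loader (LOAD DATA INFILE, COPY).
def etl(input_path, output_path)
  CSV.open(output_path, 'w') do |out|
    out << %w[order_id region amount]
    CSV.foreach(input_path, headers: true) do |row|
      out << [
        row['order_id'].strip,
        row['region'].strip.upcase,
        Float(row['amount']).round(2)
      ]
    end
  end
end
```

Swap the CSV reader for a database cursor and the writer for bulk inserts and the shape stays exactly the same; that extract-transform-load loop is the whole pattern most warehouse loads need.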
Best option for low-cost data warehouse?
Check out the totally free MonetDB. Unless Vertica or InfoBright reconsiders and releases a low/no-cost option, MonetDB will likely mature into a first-choice column-store database. It’s an academic project that has earned a sizeable development community and user base. The product is functional today at tens of millions of rows (maybe more). So far I have personally worked with a few million rows in MonetDB, and I’d like to use it again. With a little focus on usability and packaging, it could be a contender.
Greenplum, freely available for development, won’t help. Its architecture is designed around massively parallel processing; as a single, standalone installation, it’s basically just PostgreSQL. You won’t see extra performance without a farm of servers.
To my surprise, MySQL itself is not too bad. MyISAM tables are speedy, and Alex Tomic wrote a post about running multiple queries against the Archive storage engine and how to steal an index with it. With basic MyISAM on a fast server, I’m running 10GB table scans in under a minute, but moderate aggregations take a few minutes. Architecturally, MySQL is limited: one query = one thread = one core. Running two simultaneous queries is an option, but MySQL still won’t do the kind of transparent, optimized caching a warehouse needs, so throughput is limited by disk I/O. InfoBright has built a column-store storage engine for MySQL, but it’s targeted at the enterprise only.
What about the front end?
For the money, quality and ease of integration, it’s hard to beat Tableau. $1,800 isn’t cheap, but for a small business that truly needs to analyze patterns it will do the job, and it makes very pretty charts. The most recent version has integrated support for mapping by ZIP code, area code, state, country and more. The maps also incorporate Census and USGS data and are pulled live from an online source. They look great! Tableau has always had a smooth, easy-to-understand layout and a crisp look that makes each chart very attractive in a presentation. It also automatically guesses what chart you want based on the number and type of aggregates and dimensions you select.
The drawback is that Tableau doesn’t have its own high-speed database or ETL tool, so it can’t shine until a low/no-cost read-optimized database is available. Until then, it does support the most common databases and data warehouses, both commercial and open-source. Oddly, it can’t handle generic ODBC, and I don’t know why.
Then there’s JasperSoft = Crystal Reports + OLAP + Informatica + web dashboards. Each component comes from a different open-source project, so they don’t share a platform or interface, and they can’t all read the same data sources. The democratization of BI is NOT going to come from enterprise tools made cheap; it will come from simple, disruptive tools that add new ideas and polish with each release. Sorry, Jasper.
What would I use to build a reporting system for a smaller business?
Well, assuming we’re doing it to make more money, not to keep up appearances, the best choice is still to pay for QlikView. It reads ODBC, OLE DB, text files and Excel: everything a business needs. The ETL language is easy to understand for any businessperson who has put together an Access database or enjoys Excel formulas (blech!). The GUI front-end designer is powerful and straightforward. And the in-memory database behind QlikView is so incredibly fast that I routinely analyze 10 million rows in a split second. It’s a one-stop shop.
Tableau is a good option, but you lose the database and ETL. If you don’t have a large volume of data, or it’s all in one view in the database, Tableau could work for you.
At a lower cost? Well, it comes down to tradeoffs between coder skill, money, development time and ease of use. Whereas in QlikView anyone can write the basic code to read a couple of tables, every other solution demands heavy lifting somewhere.
If I were doing it for free?
I’d start with PHP, and possibly Ruby: read from a database, calculate, generate Google Charts, and maybe use one of the low/no-cost Flash-based charting libraries for interactive splash. In a future post I’d like to cover ORMs and the Google Chart API and how they can help get these projects off and running quickly.
Got any ideas? I’m always on the lookout for a faster, cheaper, better way to build these solutions.