In case you missed it, Forbes recently ran an interesting piece about Hadoop titled “Can Hadoop Survive Its Weird Beginning?” It is worth a read for anyone in the Big Data/Hadoop space.
In it, author Dan Woods makes a couple of interesting observations:
- “Hadoop is neither a community-based open source project like Linux or Drupal nor a commercial open core company like Alfresco or JBoss. Instead it is a strange hybrid that has some significant disadvantages.”
- “Hadoop is rising to prominence in an era of ubiquitous cloud computing and proven models for Software-as-a-Service, which dilute the power of some of the traditional models for commercializing open source.”
Two excellent points. Woods questions the viability not just of the development model, but also of classical open source business models for monetizing Hadoop. And he’s right; both are challenges that companies and vendors face in bringing the awesome potential of Hadoop to bear.
Woods describes the current Hadoop ecosystem as a “Three-Headed Open Core,” based on the three major companies, including MapR, that are bringing Hadoop to the market. It’s an apt description, as each of the “Big Three” vendors has taken a slightly different approach to the same goal.
But since we’re all building solutions on Apache Hadoop, the only decision customers have to make is what their highest priorities are in approaching a big data project. MapR has chosen to bring what we believe is the most enterprise-ready version of Hadoop to market. This has taken significant work and investment in adding value to Hadoop where it matters most. As Woods implies, simply adding a management layer and calling it “enterprise grade” may not be enough in the long run.
To answer Woods’ question (as he does), yes, Hadoop will survive, because its value to many far exceeds its birth and growing pains.