Hadoop technology ‘needs standardisation’
More work needs to be done by Hadoop providers to help define a clear set of standards for the technology if it is to reach its full potential.
This is according to Stefan Groschupf, co-founder and chief executive of Datameer, who wrote in a recent article for TechCrunch that a more structured process for innovation will be essential to the technology's future success.
The future currently looks bright for Hadoop, with the platform increasingly proving to be the first choice for many big data analytics deployments. Figures from Allied Market Research suggest that the market will increase from its current value of around $3 billion to $50 billion by the end of the decade, while Mike Gualtieri, principal analyst at Forrester Research, told attendees of the recent Hadoop Summit that "100 per cent of large companies" will adopt the tools in the next few years.
However, Mr Groschupf said that this potential will only be reached if the industry can improve the standardisation of Hadoop, so that businesses can rest assured that any solution they buy or develop now will be able to continue working for years to come.
He noted that, at present, market pressures have led many vendors to pursue very different approaches in order to differentiate themselves, but this has resulted in a situation where many tools do not work effectively with each other.
"This incompatibility stifles innovation in the application layer above the platform, and further splinters the infrastructure landscape, which slows innovation there as well," Mr Groschupf continued. "We only need to look to Java's success and UNIX's demise to realise the importance of standardisation."
A coordinated industry approach to the development of Hadoop should not prevent individual vendors from adding their own "bells and whistles" to the technology or optimising it in areas such as performance, he stated.
However, the use of a standard API on which to build should mean compatibility will no longer be an issue, while consistency and ease of use for developers and vendors up the stack will encourage more widespread adoption of Hadoop-based technologies.