Hadoop ‘getting close’ to widespread adoption

One of the most hyped technologies in the increasingly important data analytics sector is Hadoop, as companies look for solutions to handle the growing volume, variety and velocity of their information.

However, it is not yet widely seen as a go-to industry solution, with most organizations continuing to take a cautious approach to the tool. It was reported by Information Management, however, that Hadoop is getting closer all the time to the threshold of widespread adoption, as it begins to fulfill more of the criteria needed to make the jump from hype to the mainstream.

The publication identified four characteristics that technologies need to have before they become widely accepted throughout an enterprise. These are to offer proven benefits, be based on a familiar design paradigm, be a 'solution enabler' that allows users to focus on results, and to provide good value.

Over the past few years, it noted, there has been a continual evolution of Hadoop that has brought it to wider attention among enterprises.

For instance, conventional data analytics techniques have become unable to keep up with the volume of information organizations hold. This is usually due to bottlenecks in hard disk input/output and CPUs, it was stated.

"Hadoop addresses this problem, demonstrating its potential in the process," Information Management said, adding: "The technology is used by such ubiquitous sites as Google, Facebook and Yahoo, giving Hadoop MapReduce tremendous credibility as an enterprise-level solution."
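The MapReduce model behind Hadoop works by mapping input records to key-value pairs, shuffling those pairs by key, and then reducing each group to a result, which is what lets the work spread across many disks and CPUs. A minimal word-count sketch of that model in plain Python (illustrative only; this is not the Hadoop API, where jobs are typically written in Java or via Hadoop Streaming) looks like this:

```python
# Illustrative sketch of the MapReduce programming model, not Hadoop itself.
from collections import defaultdict

def map_phase(documents):
    """Map step: emit a (word, 1) pair for every word in every document."""
    for doc in documents:
        for word in doc.split():
            yield word.lower(), 1

def shuffle_phase(pairs):
    """Shuffle step: group all values by key, as Hadoop does between map and reduce."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    """Reduce step: sum the counts collected for each word."""
    return {word: sum(counts) for word, counts in grouped.items()}

docs = ["big data needs big tools", "hadoop handles big data"]
counts = reduce_phase(shuffle_phase(map_phase(docs)))
print(counts["big"])   # 3
print(counts["data"])  # 2
```

In a real cluster, the map and reduce steps run in parallel on many machines and the shuffle moves data over the network, which is where the unfamiliar skill set the article mentions comes in.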

While the proven benefits condition has largely been met, the magazine observed there is still some work to be done on the others. Hadoop does not use traditional approaches for moving large amounts of data, for example, which means professionals often have to learn a new skill set to make the most of the technology. While it is often cheaper than other big data analytics solutions, it also continues to be time-consuming to implement and use.

Therefore, better tools for reducing complexity and allowing developers to focus on business problems are still required. Information Management said: "There have been great strides in this direction, but more is needed to encourage widespread adoption."

But despite the challenges, the continuing evolution of the platform is addressing the remaining issues, so companies coming to the technology for the first time should have greater confidence that they will see a real return on their investment.

The upcoming general availability release of Hadoop 2.0, which was described by Gartner analyst Merv Adrian as "an important step", is set to make analyzing large data sets faster and easier, as well as enabling users to work more easily with traditional database tools such as SQL.

"Hadoop is a core component of big data analytics and is here to stay," Information Management stated.

With developers working hard to ensure Hadoop has greater platform stability, better reference templates and best practice guidelines, and more diversity in compatible operating systems and programming languages, the publication said: "It's only a matter of time before Hadoop MapReduce goes mainstream."