How you can avoid failure with your Hadoop projects
One of the biggest areas of interest for businesses of all sizes right now is big data, and when it comes to managing it, one of the most established and widely used technologies is Hadoop.
Guy Harrison, executive director of research and development for Dell, stated in a piece for VentureBeat that the majority of enterprises are now at least experimenting with Hadoop, including most – if not all – Fortune 500 companies.
However, many of these are still in the initial stages, setting up the workflows needed to capture business data. Even these first steps carry significant risk if they are not handled carefully, and once the initial information gathering is complete, the challenges are only beginning.
Mr Harrison said it will not be enough simply to collect data, and even determining what that data means is not the end of the process. To succeed with their Hadoop deployments, organizations must have the right mechanisms in place to convert this knowledge into real-world results. This often requires complex adaptive algorithms that firms will have to manage closely.
He said: "This is not business intelligence as we have known it in the past: the primary aim is not to facilitate executive decision-making through charts and reports, but to entwine data-driven algorithms directly into the business processes that drive customer experience."
However, if firms do not take a great deal of care when implementing the technology, they may end up with what Mr Harrison described as a "Hadoop hangover" if projects fail to meet expectations.
To avoid this, it is critical for businesses to have the right expertise in place. Mr Harrison explained that the world's most successful big data companies, such as Google and Amazon, have done well because they attracted and retained the very best talent - people who brought not just an understanding of programming, but knowledge of complex analysis techniques, business insight and problem-solving skills.
These data scientists will be the key to success for many larger firms, but they are in short supply. He explained that academic institutions are only now recognizing the importance of these skills and putting the right courses in place, so it is likely to be years before there are enough qualified personnel to meet demand.
As well as having the right staff, it is important to understand the strengths and limitations of Hadoop. The technology can bring many benefits to a firm, but it should not be viewed as a magic bullet that will instantly solve all of its data challenges.
Mr Harrison noted that some of the challenges brought by Hadoop include difficulty in creating backups, poor integration with enterprise monitoring systems and primitive resource management. None of these issues are insurmountable, but companies that do not acknowledge them may end up setting unrealistically high expectations that cannot be fulfilled.
"Big data is a complex and potentially disruptive challenge to many organizations," he said. However, he added: "For many businesses, the opportunities presented by the big data revolution are as significant and fundamental as those presented by e-commerce 15 years ago. Companies should be bold and determined in reacting to these challenges."