One of the biggest questions in the minds of many IT professionals at the moment is how they can ensure their big data deployments are a success. While the potential of the technology is widely recognised, putting it into practice is not a simple task, and the consequences of a poor execution can be severe.
Therefore, many tech pros attended the recent Gartner Catalyst conference in San Diego to hear Gartner analyst Svetlana Sicular highlight some of the key reasons why big data analytics projects fail, and what businesses need to bear in mind to avoid following suit, TechTarget reports.
She highlighted seven common reasons why such projects can fail, each of which falls into one of three general areas – strategy failures, skills issues and problems at the analysis stage.
When it comes to developing a big data strategy, organisational inertia is a key challenge that must be overcome if businesses are to succeed. Ms Sicular said IT pros need to be persistent and enthusiastic to fire up managers and get them on board with big data.
Other issues arise as a result of employing big data for the wrong use cases. As an example, Ms Sicular highlighted the case of an insurance company that attempted to investigate the relationship between people's lifestyle habits and their propensity to buy health coverage. This ended up failing quickly because it was too general and complex a goal – even when the company tried to narrow it down to just comparing smokers to non-smokers.
Too many grey areas and a lack of understanding of their customers' health issues ultimately doomed the project. To avoid this, Ms Sicular advised firms to carefully prioritise use cases and gradually increase the complexity of the problems they're trying to solve, rather than jump in all at once.
On the skills side, failure to address unexpected difficulties is a common problem. These may not necessarily relate directly to analytics solutions, but can crop up in the implementation process or when determining how results are shared, and many companies have no idea how to respond to these issues.
"I can't overestimate how many times a big data project failed because of the network, or because of security, or because of the facility," Ms Sicular said.
Not having people on board with the requisite level of big data skills also holds many initiatives back.
However, even if companies have the right skills and strategy in place, they can still run into difficulties if they do not use the analytics tools available to them appropriately. For instance, asking the wrong questions can be a costly mistake, as it may not become apparent that a business is focusing on the wrong areas until it is too late.
Similarly, applying the wrong models to data can lead to inaccurate results. Ms Sicular observed that understanding which is the most appropriate approach is one of the biggest challenges big data pros will face – but is also the key to success.
Finally, Ms Sicular reminded businesses not to put too much faith in their data – they should analyse their results closely for any potential signs of bias or other influencing factors.
As an example, she highlighted Google's efforts to forecast flu trends in the US. In 2008, it proved highly successful, predicting an epidemic two weeks ahead of the US Centers for Disease Control and Prevention. However, when the experiment was repeated a couple of years later, it ended up overestimating doctors' visits by 50 per cent.
This turned out to be an unintended consequence of its previous success. Ms Sicular explained: "[The] media was talking so much about Google's success, people started looking for Google Flu Trend success instead of googling 'flu'. That skewed the data."