Key myths over big data success highlighted
With recognition of the importance of big data to businesses growing all the time, many companies may be looking to overhaul their systems in order to take advantage of this.
However, while the introduction of this technology does have the potential to change the way an organization uses information to inform its decision-making, firms often misunderstand exactly what changes they will need to make in order to see results, and are put off big data by the belief that it will require a wholesale change to the way they operate.
But Tom Davenport, research fellow at the MIT Center for Digital Business and co-founder of the International Institute for Analytics, has noted that this is not necessarily the case. In a piece for the Harvard Business Review, he highlighted that the key to success often lies in merging new big data tools with existing processes and solutions.
For instance, Mr Davenport said a common misconception is that big data is all about the technologies that are specific to this area, such as Hadoop and Python. He observed: "It is certainly true that those tools are important and useful to big data projects. But unless your company is a start-up, you probably have some legacy technologies and skills that can come in handy as well."
The expert stated that companies with existing data warehousing tools tend to create value from their big data analytics faster than those without these resources, so the ability to integrate existing solutions such as SAS, R and SQL with new data-focused additions will be highly useful.
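As a minimal sketch of the kind of integration Mr Davenport describes, the example below reuses familiar SQL for aggregation and a scripting language for follow-on analysis. SQLite stands in for an existing corporate data warehouse, and the table and column names are illustrative assumptions, not anything from the article:

```python
import sqlite3
from statistics import mean

# SQLite is a stand-in here for an existing SQL-accessible data warehouse;
# the table "sales" and its columns are made up for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("north", 120.0), ("north", 80.0), ("south", 200.0)])

# Existing SQL skills handle the filtering and aggregation...
rows = conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
).fetchall()

# ...while a scripting language picks up the results for further analysis.
totals = {region: total for region, total in rows}
print(totals)                  # per-region totals
print(mean(totals.values()))  # average total across regions
```

The point is not the specific tools but the pattern: the SQL and statistics skills a company already has remain the workhorse, with newer scripting layers added on top rather than replacing them.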
Mr Davenport said the same applies to the personnel businesses have working on big data, with many companies under the impression they will need to hire new, highly qualified data scientists in order to see a return.
"The large companies I interviewed about big data projects said they were not hiring PhD level data scientists on a large scale," he stated. "Instead they were forming teams of people with quantitative, computational, or business expertise backgrounds."
Educating existing staff about big data technologies such as Hadoop and scripting languages is necessary, but if done successfully, it can ensure organizations have a good platform on which to build new solutions to derive insight from their data.
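One reason existing scripting skills transfer so readily is Hadoop's Streaming interface, in which the mapper and reducer are ordinary programs that read standard input and write standard output. The word-count sketch below imitates that style in plain Python; the function names are illustrative assumptions, and the shuffle/sort stage that Hadoop performs between phases is simulated locally with `sorted`:

```python
from itertools import groupby

def map_words(lines):
    """Mapper: emit one tab-separated '(word, 1)' pair per word, as
    a Hadoop Streaming mapper would write to stdout."""
    for line in lines:
        for word in line.split():
            yield f"{word.lower()}\t1"

def reduce_counts(pairs):
    """Reducer: sum the counts for each word. Assumes input sorted by
    key, which Hadoop guarantees between the map and reduce phases."""
    keyed = (pair.split("\t") for pair in pairs)
    for word, group in groupby(keyed, key=lambda kv: kv[0]):
        yield word, sum(int(count) for _, count in group)

sample = ["big data big ideas", "data tools"]
shuffled = sorted(map_words(sample))  # stands in for Hadoop's shuffle/sort
counts = dict(reduce_counts(shuffled))
print(counts)
```

Because each phase is just a program on a text stream, a team that already writes Python (or any scripting language) can be productive on Hadoop without first mastering the Java MapReduce API.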