Converged approaches to data ‘among key big data trends’ for 2016

A move away from centralised data storage, the rise of converged analytics platforms and a greater focus on value and quality will be among the key trends facing the big data industry in 2016.

This is according to MapR co-founder and chief executive John Schroeder, who wrote in an article for IT Pro Portal that as big data analytics has moved beyond a buzzword to become an essential part of many organisations' strategies, it is transforming the enterprise computing environment.

However, this is an area that's constantly evolving. "With many new innovative technologies on the horizon, not to mention a particularly noisy marketplace, differentiating between what is hype and what is just around the corner can be challenging," Mr Schroeder noted.

Therefore, he highlighted several key trends that all businesses looking to improve their big data analytics capabilities will have to consider in 2016.

One of the key areas of focus will be an effort to develop more converged analytics environments. Mr Schroeder said that in the past, it has been accepted best practice to keep operational and analytic systems in separate business applications, in order to prevent analytic workloads from disrupting operational processing.

But this attitude is changing as new tools emerge that can use in-memory data solutions to perform both online transaction processing (OLTP) and online analytical processing (OLAP) without the requirement for data duplication.
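The idea can be illustrated with a minimal sketch: a single in-memory database serving both transactional writes and analytical reads, with no duplication of data into a separate warehouse. SQLite's in-memory mode stands in here for the in-memory platforms the article alludes to, and the `orders` table and its columns are purely illustrative assumptions, not anything from a specific vendor's product.

```python
import sqlite3

# One in-memory store serves both workloads: no copy of the data
# is shipped off to a separate analytics system.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, region TEXT, amount REAL)"
)

# OLTP-style work: small transactional inserts as orders arrive.
with conn:
    conn.executemany(
        "INSERT INTO orders (region, amount) VALUES (?, ?)",
        [("EMEA", 120.0), ("APAC", 80.0), ("EMEA", 45.5)],
    )

# OLAP-style work: an aggregate query over the same live rows,
# with no ETL step duplicating them into a warehouse first.
totals = dict(
    conn.execute("SELECT region, SUM(amount) FROM orders GROUP BY region")
)
```

Because both workloads touch the same rows, the aggregate reflects every committed transaction immediately, which is the 'data to action' latency reduction the article describes.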

"In 2016, converged approaches will become more mainstream as leading organisations reap the benefits of combining production workloads with analytics in response to changing customer preferences, competitive pressures, and business conditions," the MapR chief executive stated. This convergence will also speed up the 'data to action' cycle and remove much of the latency between analytical processes and their impact on business performance.

Mr Schroeder also forecast that 2016 will see a shift away from centralised workload and processing models to more distributed solutions. One reason for this will be to better deal with the challenges of managing multiple devices, data centres and global use cases across multiple locations.

Changes to overseas data security and protection rules brought about by the nullification of the EU-US Safe Harbor agreement will also dictate how companies store, share and process large quantities of data. With Safe Harbor 2.0 on the horizon and set to bring in new restrictions, global companies will need to re-evaluate their approach to cross-border data storage, which will in turn affect their analytics activities.

Elsewhere, it was predicted that 2016 will see the market focusing far less on the "bells and whistles" of the latest products, and more on established solutions that have proven business value.

"This year, organisations will recognise the attraction of a product that results in a tangible business impact, rather than on raw big data technologies – which, while promising an exciting new way of working, really just cloud the issues at hand," Mr Schroeder said.

Ultimately, vendors that are able to demonstrate quality will win out in 2016 as businesses demand proven, stable solutions to meet their requirements for better operational efficiency. 

"Now more than ever, an organisation's competitive stance relies on its ability to leverage data to drive business results. That's easier said than done when it’s pouring in from every origin imaginable," Mr Schroeder said.
