Chief data officers ‘essential’ to big data success

13 Dec 2016

Organisations that invest in skilled executives to manage their big data analytics projects are better-placed to see success in this area than those that do not, a new report has indicated.

A study of US federal agencies conducted by MeriTalk and ViON Corporation revealed that almost all these bodies (92 per cent) use big data to some degree. However, the majority (58 per cent) graded the effectiveness of their data management strategy as C or worse.

Therefore, having the right personnel on hand to control the direction of such projects will be invaluable. The study found that 88 per cent of organisations with a chief data officer (CDO) leading these efforts report these executives have had a positive impact on their performance.

Meanwhile, 93 per cent of agencies that currently lack a CDO agreed that employing one would have a positive effect on their big data strategies.

Two-thirds (67 per cent) of organisations that do not have a CDO stated their agency lacks leadership when it comes to big data analytics efforts. Organisations with a CDO are also more likely to successfully incorporate big data analytics into their decision making than those without (61 per cent compared with 28 per cent).

Rodney Hite, director of big data and analytics solutions at ViON, said that as organisations are being inundated with huge amounts of data every day, how they manage this information and turn it into insight will be critical.

"Implementing a CDO ensures your agency is focusing the right amount on mission-critical data management goals – while storing and protecting data throughout the process," he continued. "Regardless of whether an agency has one or not, the majority – 57 per cent – believe the CDO will be the hero of big data and analytics."

More than three-quarters (76 per cent) of organisations with a CDO say this individual has taken ownership of data management and governance issues. The primary responsibilities of these personnel include centralising an organisation's data (55 per cent), protecting this information (51 per cent) and improving the quality of data (49 per cent).

Other areas where CDOs have influence include coping with open government data efforts, bridging the gap between IT and operations and "leveraging data to help set and achieve realistic goals".

However, although the benefits of having a CDO are clear, many agencies are not giving these personnel the support they need. The research found just one in four organisations (25 per cent) have a deputy CDO, while the same number have a chief data scientist and only 29 per cent have a chief analytics officer.

This is a situation that is unlikely to change in the near future, as less than a quarter of survey respondents expect to be hiring for any of these roles in the next two years.

However, the good news is that 92 per cent of agencies report their CDO has a strong working relationship with the chief information officer, which ensures the organisation is able to keep pace with the technological realities of big data and analytics. 

Don’t delete big data, companies urged

06 Dec 2016

Companies performing ad-hoc big data analytics operations have been reminded of the importance of keeping the data used in the process after the analysis is complete.

Speaking at an IT Leaders Forum organised by Computing.com, Alex Chen, IBM's director of file, object storage and big data flash, explained that businesses may need to refer back to this information at a later date. This may be to meet regulatory requirements, or simply because people want to investigate what happened and why a particular decision was taken.

At the moment, many organisations are still in the early adoption stage when it comes to big data, which means they may be performing a large number of experimental and ad-hoc analyses as they learn how to bring this technology into their everyday operations.

Mr Chen said: "It's likely that someone in a line-of-business [in many organisations] has spun up a Hadoop cluster and called it their big data analytics engine. They find a bunch of x86 servers with storage, and run HDFS."

Many people tend to throw away this data after it has been processed in order to keep their system running efficiently. Mr Chen noted that even in these ad-hoc deployments, it is not terabytes, but petabytes of data that are being ingested, and the more data that has to be analysed, the longer it will take.

But while deleting this data may keep analytics processes running as fast as possible, it could mean businesses have no answers when they need to demonstrate what led them to their final decision.

"Performing analytics generates a lot more meta-data, too, and due to regulations or business requirements people may just want to see what happened and why they made certain decisions. So you will need to re-run the analytics that were run before," Mr Chen continued. "So you can't just throw away the data any more."

Financial services firms to embrace real-time analytics

30 Nov 2016

A growing number of companies in the financial services sector are set to upgrade their big data analytics initiatives to include real-time solutions, a new report has claimed.

A study by TABB Group noted there is an increasing understanding in the sector that the value of a given piece of data can be lost almost immediately as it becomes outdated. Therefore, capital markets firms are turning to real-time analytics for activities including risk management, compliance, consumer metrics and turning insight into revenue.

The report's author, Monica Summerville, noted that simply having data is no longer useful, and that traditional ways of thinking about analytics, such as data warehousing and batch-led approaches, no longer apply.

In today's environment, firms must be able to find and act on patterns in incredibly large data sets in real time, while also being able to reference older, stored data as part of a streaming analytics operation without reverting to batch processing.

"The market for real time big data analytics is potentially as large as business intelligence, real-time streaming and big data analytics combined," Ms Summerville said. "The most successful approaches understand the importance of data acquisition to this process and successfully combine the latest open source technologies with market leading commercial solutions."

Implementing effective solutions for this will be challenging and requires companies to invest in software, hardware and data, as well as personnel with expertise in the sector.

Therefore, in order to ensure businesses can see a quick return on investment, TABB stated they will have to take big data analytics 'upstream' by layering streaming and static big data sets to support real-time analysis of combined data sets.
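As a rough illustration of this layering of streaming and static data, the sketch below assumes Spark Structured Streaming: a live trade stream is enriched against older, stored reference data within the same job, rather than dropping back to a batch run. The Kafka topic, HDFS path, column names and parsing are hypothetical stand-ins, not details from the report.

```python
# Minimal sketch, assuming Spark Structured Streaming: join a live stream
# against static, stored data in real time. Topic, path and schema are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("streaming-plus-static").getOrCreate()

# Static layer: historical positions already at rest on HDFS.
historical = spark.read.parquet("hdfs:///warehouse/positions/")

# Streaming layer: live trades arriving on a Kafka topic.
trades = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "trades")
          .load()
          .selectExpr("CAST(value AS STRING) AS instrument",   # toy parsing
                      "CAST(timestamp AS TIMESTAMP) AS event_time"))

# Stream-to-static join: each incoming trade is analysed against stored
# history as it arrives, instead of waiting for an end-of-day batch job.
enriched = trades.join(historical, on="instrument", how="left")

query = (enriched.writeStream
         .outputMode("append")
         .format("console")          # stand-in sink for the sketch
         .start())
query.awaitTermination()
```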

Such capabilities will be a key requirement if financial services firms are to progress to technologies such as machine learning and other artificial intelligence-based analytics.

Ms Summerville said: "We believe the upstream analytics approach will increasingly be adopted throughout the industry in response to industry drivers, an unending desire for new sources of alpha and the rising complexity of investment approaches."
