Big data ‘still seen as IT-dominated projects’

27 Oct 2016 | Posted by: admin | Image credit: iStockphoto/BernardaSv | Categories: #AnalyticsNews

The majority of chief information officers (CIOs) still view big data analytics as primarily IT-based projects, although there is a growing recognition of the impact it can have on business departments.

This is among the findings of a new survey conducted by recruitment firm Robert Half in Australia, which revealed that 54 per cent of CIOs surveyed say the technology's main impact comes in the IT department.

However, nearly one in five professionals (18 per cent) believe that the technology has more impact on their operations departments, while 16 per cent see key benefits being felt by their finance teams. 

This indicates there is growing awareness of big data analytics' potential to transform operations throughout all parts of a business, even though there is still work to be done to improve understanding of what the technology is capable of.

For instance, almost half (49 per cent) of CIOs felt that non-IT senior management do not have enough knowledge about big data to use it effectively. David Jones, senior managing director at Robert Half Asia Pacific, said this suggests many firms are still in the early stages of incorporating analytics into their processes.

"Big data has changed everything about the way business is done, but its value is still being optimised and harnessing its fullest potential is still considered a challenge for many businesses," he continued.

Mr Jones said: "Businesses have to take on an enterprise-wide approach to leverage the full potential of what big data has to offer and senior management plays a key role. A company's board and leaders need to be fully engaged about the impact data can have on its business operations and overall success."

He noted that the initial requirements for implementing big data, such as setting up new software and hardware systems, can demand a significant financial investment from organisations, which may make executives wary. However, once fully operational, advantages such as cost reductions can have a major impact on a business' performance.

The cost of capturing the necessary information for big data analytics was named by respondents as one of the biggest challenges of big data, with 46 per cent of CIOs citing this as an issue. This was followed by data protection and security issues (43 per cent) and the technical considerations of implementing big data processes (also 43 per cent).

Mr Jones therefore said that in order to get the most out of big data, companies are increasingly looking for technology professionals who not only have proven skills in data analytics, but also strong business and financial acumen. This will be essential if IT teams are to clearly explain to senior management the advantages and insights they can gain from the technology.

"In our increasingly data-driven world, using data to make informed, strategic decisions that benefit operations in all departments and impact a company's bottom line is crucial for any company," he added.

Effective use of big data algorithms ‘vital’ for retailers

27 Oct 2016 | Posted by: admin | Image credit: iStockphoto/monsitj | Categories: #AnalyticsNews

It is essential that retailers are able to create effective algorithms to make the most of the large quantities of data they possess if they are to secure a competitive advantage.

This is according to Gartner, which said that this can help them cut costs and improve their top-line revenue in a digital economy where there is a huge volume of information available for analysis.

Speaking at the Gartner Symposium/ITxpo in Australia, Kelsie Marian, principal research analyst at the firm, said there are several examples where retailers that have acted aggressively to implement such solutions have seen strong results.

She noted this sector is particularly well-placed to take advantage of the technology, as it is traditionally a major hoarder of data, with many companies having years' worth of store-level data available to them. But while this has been used for activities such as demand planning since the 1980s, today's solutions need to be drastically different.

"Data is ubiquitous in the new retail environment, and retailers will survive only if quality data is embedded into every decision, minute by minute, across the retail organisation," Ms Marian said. "But retailers can't humanly scale to keep pace with growth of data, so a fundamentally different approach is necessary." 

Gartner therefore described the future for the sector as being 'algorithmic retailing', which it defined as the application of advanced big data analytics across a complex retail structure, in order to "deliver an efficient and flexible, yet unified, customer experience".

By 2020, the organisation forecast that leading firms in the retail sector will have embraced algorithmic approaches to planning their operations, which will lead the top ten companies to cut up to a third of their headquarters' merchandising staff.

Gartner highlighted several key areas where these advanced analytics are set to have an impact on retail operations.

One of the major operations will be to assist in determining the cost of goods sold. This is dependent on a number of factors, and is driven by the selection, assortment, pricing, promotion and inventory levels of items. Therefore, there is a great deal of potential for algorithms to improve performance, reducing overall costs and increasing top-line revenue.

Elsewhere, this technology can also help optimise how labour is deployed, improve customer service and streamline the handling of back-office administrative tasks, from HR to distribution.

In order to take full advantage of the potential of big data and advanced algorithms, Gartner noted there are several steps that retailers must take.

Firstly, it recommended that chief information officers (CIOs) in the sector need to formulate a plan for identifying and classifying all sources of data within their business, as well as spot any gaps that need to be filled.

They also need to be prepared for the explosion in data generated by products, customers and stores that will be caused by the introduction of Internet of Things devices to their ecosystem.

Developing a framework to identify current and future opportunities where algorithms and automations can improve performance is also a must, as is reviewing how other retailers are using the technology.

"Retail CIOs and their teams play a pivotal role in helping business leaders understand the benefits and limitations of algorithms, and how algorithms can support their business goals," said Ms Marian.

Big data ‘set to lower healthcare costs’

24 Oct 2016 | Posted by: admin | Image credit: iStockphoto/kentoh | Categories: #AnalyticsNews

The implementation of advanced big data analytics solutions in the healthcare sector could help significantly lower costs in the marketplace by changing the way treatments are developed, prepared and delivered.

This is according to a new study by Lux Research, which noted that soaring costs are a problem that is continuing to plague the industry, as previous efforts to address this have had little impact.

In the US, for example, it stated the introduction of the Affordable Care Act has had limited success in tackling the problem, while in the UK, it has been reported that the government is unlikely to direct additional funding towards the NHS in the upcoming Autumn Statement, despite political pressures to create a full seven-day service.

However, the emergence of new, advanced big data analytics solutions can help healthcare providers reduce their costs without resorting to cutting services, Lux stated.

Mark Bünger, Lux Research vice-president and lead author of the report, titled 'Industrial Big Data and Analytics in Digital Health', said: "Whereas solving many past healthcare problems seemed to be a matter of scientific discovery, health policy, or adequate funding, today's most pressing problems are due to a lack of information – or lack of understanding of what to do with it."

He added that big data solutions that meet these challenges are already delivering measurable benefits in terms of both cost and patient outcomes, while partnerships between large technology providers, pharmaceutical firms and academics are bearing fruit.

Among the findings highlighted in the report, Lux noted that big data can help providers offer more personalised therapies, which have the potential to greatly enhance the fight against some of the most severe diseases.

It stated that by studying molecular biomarkers and genetic profiles, cloud-based analytics enable decisions to be made faster, resulting in better outcomes and reduced costs.

Coupling big data with artificial intelligence (AI) also holds a great deal of promise for the healthcare sector, as it offers a more efficient way of analysing very large data sets.

Applications for this in the healthcare sector may include radiology, where AI can help doctors review patient images and CT scans and spot anomalies that may be missed by a human eye.

AI also has a key role to play in the development of therapeutic and caregiving robots, as well as other aids that help monitor cognitive function and diagnostics, Lux stated.

Elsewhere, big data analytics can also help hospitals cut costs by, for example, helping optimise resource allocation, both when it comes to direct patient care and other activities.

Lux added: "Cost gains come from semi-automated diagnostic tools and decision-support algorithms that help focus expensive interventions, medical equipment, and caregivers' time on the patients who need them most."

Finance and manufacturing lead the way for big data investments

14 Oct 2016 | Posted by: admin | Image credit: iStockphoto/tonefotografia | Categories: #AnalyticsNews

Businesses in the finance and manufacturing sectors will be among the biggest users of big data analytics solutions in the coming years as they strive to make the most of the huge amounts of information their activities generate.

This is according to new research by International Data Corporation (IDC), which found that banking, professional services, discrete manufacturing, process manufacturing and central government will account for almost half of global big data investments in 2016. They will remain the top five sectors for the technology until at least 2020.

Of these, banking will be the largest sector, leading the way both in overall spending on the technology and in the pace of spending growth. In 2016, this sector is expected to invest around $17 billion (£13.82 billion) in the technology.

Total spending on big data technologies is expected to grow by 11.3 per cent in 2016, reaching $130.1 billion. The market will then continue to see strong performance until 2020, by which time it is forecast to be worth more than $203 billion.
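As a quick sanity check on those figures (our own back-of-the-envelope arithmetic, not a figure from IDC's report), growth from $130.1 billion in 2016 to $203 billion in 2020 implies a compound annual rate of roughly:

\[ \left(\frac{203}{130.1}\right)^{1/4} - 1 \approx 0.118, \]

or about 11.8 per cent a year over the four-year period.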

Over the next few years other parts of the economy that are expected to contribute to this strong growth in big data investments include telecommunications, insurance, utilities and transportation. However, they will be far from alone, as 16 of the 18 sectors examined by IDC are forecast to experience double-digit compound annual growth rates for big data projects between 2015 and 2020.

Jessica Goepfert, program director, Customer Insights and Analysis, at IDC, said: "In our end-user research, respondents from organisations in these industries are placing a high priority on big data analytics initiatives over other technology investments. Within banking, many of these efforts are focused on risk management, fraud prevention and compliance-related activities."

She added that for sectors such as banking and telecoms, improving customer experience will be at the heart of their big data investments. For example, she noted that technologies are increasingly being deployed in call centres to give agents the information they need to deliver the best possible service.

The primary drivers of big data analytics technology will be large companies – those with more than 500 employees. These organisations will generate revenues of more than $154 billion for the sector by 2020. However, small and medium-sized businesses should not be overlooked, as they will remain a significant contributor to the market. Overall, more than a quarter of big data revenue will come from companies with fewer than 500 employees.

Dan Vesset, group vice-president, Analytics and Information Management, at IDC, said: "The availability of data, a new generation of technology, and a cultural shift toward data-driven decision making continue to drive demand for big data and analytics technology and services."

SQL on Hadoop benchmarking

12 Oct 2016 | Posted by: admin | Image: Kognitio benchmark tests

At the recent Strata conference in New York we received a lot of interest in the informal benchmarking we have been carrying out that compares Kognitio on Hadoop to some other SQL on Hadoop technologies. We have decided to formalise the benchmarking process by producing a paper detailing our testing and results. In the meantime, we will be releasing intermediate results in this blog. Preliminary results show Kognitio comes out top on SQL support, and its single-query performance is significantly faster than Impala's. Read on for more details.

It is clear from recent conversations that many organisations have issues using the tools in the standard Hadoop distributions to support enterprise-level SQL on data in Hadoop. This is caused by a number of issues, including:

  • SQL maturity – some products cannot handle all the SQL generated by developers and/or third-party tools. They either do not support the SQL, or produce very poor query plans
  • Query performance – queries that are supported perform poorly, even under a single-user workload
  • Concurrency – products cannot handle concurrent mixed workloads well in terms of performance, and give errors when under load

Bearing in mind the types of workload we have been discussing (primarily BI and complex analytics) we decided to initially concentrate on the TPC-DS benchmark. This is a well-respected, widely used query set that is representative of the type of query that seems to be most problematic. The TPC framework is also designed for benchmarking concurrent workloads.

Currently we are testing against Hive, Impala and SparkSQL as delivered in Cloudera 5.7.1, using a 12-node cluster. We will shortly be upgrading our test cluster to the most recent release of Cloudera before running the main benchmarks for the paper. We have also done some initial testing of SparkSQL 2.0 on a small HortonWorks cluster and plan to include the Cloudera beta of SparkSQL 2.0 in the performance tests.

SQL Maturity

A common theme we’ve heard is that one of the major pain points in Hadoop adoption is the need to migrate existing SQL workloads to work on data in Hadoop. With this in mind, we initially looked at the breadth of SQL that each product will execute before moving on to performance. We have categorised each of the 99 TPC-DS queries as follows:

  • Runs “out of the box” (no changes needed)
  • Minor syntax changes – such as removing reserved words or “grammatical” changes
  • Long running – SQL compiles but query doesn’t come back within 1 hour
  • Syntax not currently supported

If a query requires major changes to run, it is considered not supported (see the TPC-DS documentation).

Technology            Out of the box   Minor changes   Long running   Not supported
Kognitio on Hadoop          76               23               0               0
Hive [1]                    30                8               6              55
Impala                      55               18               2              24
Spark 1.6                   39               12               3              43
Spark 2.0 [2]               72               25               1               1

The above table shows that many products have a long way to go, and the step change in SQL support between Spark 1.6 and Spark 2.0 shows the developers have recognised this. Kognitio and other technologies making the move from the analytical DWH space are at a distinct advantage here, as they already possess the mature SQL capability required for enterprise-level support.
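For illustration only, here is a minimal sketch (in Python, not the actual harness we use) of how such a categorisation pass could be scripted. The run_query() callable and its timeout behaviour are assumptions – in practice it would wrap whatever client the engine under test provides (impyla, PyHive, an ODBC driver, and so on).

    import time
    from pathlib import Path

    ONE_HOUR = 3600  # the "long running" cut-off used in our tests

    def categorise(query_dir, run_query, minor_fixes=None):
        """Run each TPC-DS query file once and record which category it falls into."""
        minor_fixes = minor_fixes or {}           # e.g. {"query47.sql": rewritten_sql}
        results = {"out_of_the_box": [], "minor_changes": [],
                   "long_running": [], "not_supported": []}

        for path in sorted(Path(query_dir).glob("query*.sql")):
            sql = minor_fixes.get(path.name, path.read_text())
            category = "minor_changes" if path.name in minor_fixes else "out_of_the_box"
            start = time.time()
            try:
                run_query(sql, timeout=ONE_HOUR)  # assumed: raises TimeoutError after 1 hour
            except TimeoutError:
                category = "long_running"
            except Exception:                     # unsupported syntax or a failed query plan
                category = "not_supported"
            results[category].append((path.name, round(time.time() - start, 1)))
        return results

A tester would point this at the directory of 99 TPC-DS query files, supply the engine-specific run_query() and any minor rewrites, and read the bucket counts straight out of the returned dictionary.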

Query Performance

The results shown below are for a single stream executing over 1TB of data, but our goal is to look at the concurrent mixed workloads typically found in enterprise applications.

As well as supporting all 99 queries (23 with small syntax changes), initial results for a single query stream show Kognitio is very performant compared to Impala. Kognitio runs 89 of the 99 queries in under a minute, whereas only 58 queries run in under a minute on Impala. However, we recognise the real test comes in increasing the number of streams, so watch this space as we increase concurrency and add Hive and Spark timings too.

Image: SQL on Hadoop benchmark test results

A bit about how we run the tests

We’ve developed a benchmarking toolkit based around the TPC framework which can be used to easily test concurrent query sets across technologies on Hadoop platforms. We designed this modular toolkit to allow testers to develop their own benchmark tests, and we plan to make the toolkit available on GitHub in the coming weeks once we have finished some “How to Use” documentation.
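As a taste of what the concurrency tests will look like, below is a minimal sketch (again Python, and again not the toolkit itself) of how several TPC-DS query streams could be driven in parallel, each running the 99 queries in its own repeatable order in the spirit of the TPC-DS throughput test. run_query() is the same assumed, engine-specific callable as before.

    import random
    import time
    from concurrent.futures import ThreadPoolExecutor

    def run_stream(stream_id, queries, run_query):
        """Run every query in this stream, in a repeatable per-stream order."""
        order = list(queries.items())             # queries = {"query1.sql": sql, ...}
        random.Random(stream_id).shuffle(order)   # deterministic shuffle per stream
        timings = []
        for name, sql in order:
            start = time.time()
            run_query(sql)                        # assumed, engine-specific callable
            timings.append((name, round(time.time() - start, 1)))
        return stream_id, timings

    def run_concurrent(queries, run_query, n_streams=4):
        """Drive n_streams query streams in parallel and collect per-query timings."""
        with ThreadPoolExecutor(max_workers=n_streams) as pool:
            futures = [pool.submit(run_stream, s, queries, run_query)
                       for s in range(n_streams)]
            return {stream: timings for stream, timings in (f.result() for f in futures)}

Stepping up n_streams is what turns the single-stream numbers above into the concurrent mixed-workload results we plan to publish.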

In progress and to come

As I write this, there are a couple of caveats to the interim results presented above:

1. We still need to complete the syntax changes for Hive, so these figures may change in the final paper.

2. The single query listed as not supported by Spark 2.0 did execute, but it used a Cartesian join, leading to incorrect results.

We are planning to move on to full concurrent workloads in the next week and will publish these and the toolkit soon.

Pushing past pilot stage ‘key challenge’ for big data projects

11 Oct 2016 | Posted by: admin | Image credit: iStockphoto/ipopba | Categories: #AnalyticsNews

Many companies that are embarking on big data analytics projects will find themselves struggling to move beyond the pilot stage and roll out their initiatives on a wider scale, a new survey has found.

Research by Gartner identified this as one of the key challenges facing businesses when they undertake big data projects. It revealed that although nearly three-quarters of organisations are planning to invest in this technology or have already done so, just 15 per cent say they have deployed their big data projects at full production scale.

This is almost unchanged from when the same question was asked last year, when 14 per cent of firms stated they had achieved this.

Nick Heudecker, research director at Gartner, suggested one reason for this may be that big data initiatives are having to compete with other IT investments and are often treated as a lower priority.

Indeed, just 11 per cent of IT professionals at organisations that have invested in big data believed these initiatives were as important as, or more important than, other projects, while nearly half (46 per cent) said they were less important.

"This could be due to the fact that many big data projects don't have a tangible return on investment that can be determined upfront," Mr Heudecker said. He added: "Another reason could be that the big data initiative is a part of a larger funded initiative. This will become more common as the term 'big data' fades away, and dealing with larger datasets and multiple data types continues to be the norm."

Another issue that can make it difficult for companies to move beyond the pilot stage is that they lack effective business leadership and involvement in these projects. What's more, many trial initiatives are developed using ad-hoc technologies and infrastructure that do not have the scalability or reliability needed for production-level deployment.

Overall, Gartner's survey revealed investments in big data continue to rise, with 48 per cent of organisations investing in the technology in 2016. This marked an increase of three percentage points from a year earlier.

However, the number of companies planning new investments dropped from 31 per cent in 2015 to 26 per cent in this year's survey.

Mr Heudecker said these signs of slowing growth may be an indicator that companies are rethinking how they look at big data analytics and integrate it into their operations.

"The big issue is not so much big data itself, but rather how it is used," he said. "While organisations have understood that big data is not just about a specific technology, they need to avoid thinking about big data as a separate effort."

One trend is that businesses are no longer thinking about the technology in vague terms, but are looking at specific problems and opportunities that it can address. As such, the success of such initiatives will depend on a "holistic strategy around business outcomes, skilled personnel, data and infrastructure", Mr Heudecker continued.

Logistics sector sees high demand for big data

30 Sep 2016 | Posted by: admin | Image credit: iStockphoto/Maxiphoto | Categories: #AnalyticsNews

Demand for big data analytics solutions in the logistics and supply chain sector is growing rapidly, with almost all firms now recognising the need for these solutions.

This is according to a new study conducted by Capgemini Consulting, Penn State University and Penske Logistics, which revealed 98 per cent of third-party logistics companies agreed that data-driven decision-making will be essential to the future success of supply chain processes. This view was shared by 93 per cent of shippers.

Both of these groups also stated that being able to use big data analytics effectively will become a core competency for their supply chain organisations in the coming years. Some 81 per cent of shipping firms and 86 per cent of third-party logistics and outsourcing companies agreed with this.

Tom McKenna, senior vice-president of engineering and technology at Penske Logistics, said: "Data-driven decision-making is certainly an increasing trend in the supply chain."

However, he added: "Among the biggest challenges that come with increased visibility and more data is determining how to best use that information to drive improvements that benefit the customer."

Shipping firms that can successfully turn this data into useful insight stand to gain a significant competitive advantage over their less well-equipped peers, the study said.

Six out of ten shipping companies (60 per cent) said improving integration in their supply chain was a key area where big data is expected to boost performance. Meanwhile, 55 per cent said the technology would help them improve the quality of their data, and 53 per cent added it could improve the performance and quality of their processes.

For third-party logistics firms, the benefits were slightly different. More than seven out of ten of these firms (71 per cent) said the greatest value data provides comes from improving process quality and performance, while 70 per cent cited improving logistics optimisation as among its most important benefits, and 53 per cent named improving integration across the supply chain.

Big data is also expected to be highly useful in tackling the challenges created by issues such as a tightening of trucking capacity. Nearly three-quarters of shippers (71 per cent) said big data analytics from third-party firms helps them to better understand alternative shipping possibilities, while 61 per cent said they valued data about trade routes and costs that their partners could provide.

"Fluctuating capacity, increased shipper demands and disruptions within the industry are creating a volatile decision-making environment for shippers and logistics providers trying to optimise the supply chain," the study noted. "Both parties are increasingly using information and analytics to drive their decisions."

However, the report did highlight a difference in opinion between shipping companies and third-party providers when it comes to the benefit of big data. While 79 per cent of shippers said their supply chain organisation sees significant value in the use of big data, this compares with 65 per cent of outsourcers who reported that their customers see such value.

IoT projects moving beyond pilot stage, study finds

28 Sep 2016 | Posted by: admin | Image credit: iStockphoto/kentoh | Categories: #AnalyticsNews

More businesses are moving beyond small-scale pilot schemes when it comes to the Internet of Things (IoT), towards full-scale deployments that incorporate big data analytics, cloud computing and security capabilities.

This is according to new research from International Data Corporation (IDC), which revealed that almost a third of firms (31.4 per cent) have already launched solutions that take advantage of this technology, while a further 43 per cent expect to deploy these tools in the next 12 months.

More than half of respondents (55 per cent) also agreed that the technology will be an essential strategic solution that helps their business to compete more effectively. Key benefits of IoT solutions include better productivity, lower costs and the automation of internal processes.

Carrie MacGillivray, vice-president for Mobility and Internet of Things at IDC, noted that vendors that can offer an integrated cloud and data analytics solution will be seen as vital partners when organisations are investing in IoT.

Given the huge volume and variety of data that IoT deployments are expected to create in the coming years, being able to analyse this effectively and derive insight in a timely manner is essential. Strong analytics tools are therefore a core part of any good IoT project.

However, this requires people with the right skills and knowledge to make the most of the data – something many businesses are currently lacking.

IDC's research found that a lack of internal skills in this area is a challenge that is hindering many initiatives. This was named as one of the top worries facing decision makers along with privacy/security issues and the costs of implementing IoT solutions.

The company also found that as the benefits of IoT become clearer, the technology is more likely to be embraced by both IT departments and business units.

Vernon Turner, senior vice-president of Enterprise Systems and IDC Fellow for the Internet of Things, commented: "Setting strategies, finding budgets, and supporting IoT solutions have contributed to an ongoing tussle between line of business executives (LOBs) and CIOs. However, that race may be over, because in many cases LOBs are now both leading the discussions and either paying in full or sharing the costs of IoT initiatives with the CIOs."

Customers ‘unaware’ of how firms use their data

28 Sep 2016 | Posted by: admin | Image credit: iStockphoto/BernardaSv | Categories: #AnalyticsNews

The vast majority of customers have no idea how businesses are using the personal data they possess, while many also do not trust companies to be responsible in their use of this information.

This is according to new research from the Chartered Institute of Marketing (CIM), which revealed nine out of ten customers are in the dark about how their personal data is used. Meanwhile, 57 per cent did not trust companies to take good care of their data, while 51 per cent complained they had been contacted by organisations that misuse their data.

For many businesses, gathering more detailed and personal information on their customers is a vital part of their big data analytics strategy, as it allows them to tailor their offerings and marketing messages more precisely. This provides a better experience for customers and helps boost businesses' revenues.

Overall, people are happy for certain data to be used by companies, provided they understand what this will involve. More than two-thirds of consumers (67 per cent) stated they would share more personal information if organisations were more transparent about their plan for it.

Chief executive of the CIM Chris Daly commented: "The solution is clear, marketers need to brush up on the rules, demonstrate clearly the value-add personal data offers in delivering a more personalised experience and ultimately reduce the fear by being open throughout the process."

However, businesses will have to think carefully about how they achieve this transparency. Many may believe that simply updating terms and conditions to explain what they do with data will be adequate, but the CIM's research suggests this may not be good enough to satisfy many consumers.

Only 16 per cent of respondents stated they always read terms and conditions, with many put off by the lengthy and often confusing documents. Therefore, businesses will need to find simpler, clearer ways of communicating to their customers about how they use data.

The research also found there is a mismatch between consumers and businesses when it comes to what data they view as acceptable to share. For instance, more than seven out of ten consumers (71 per cent) stated they were not comfortable with businesses tracking their location via their smartphones. However, one in five businesses are already using geolocation tools to do this.

More than two-thirds (68 per cent) of consumers also expressed reservations about providing information from their social media platforms – something 44 per cent of firms use in their marketing analytics.

One of the biggest concerns that users have is that their information may fall into the wrong hands – either as a result of being sold on to third parties or compromised in a data breach.

"People are nervous about sharing personal data – fears of data breaches and misuse has them on high alert," Mr Daly said.

There have been a series of high-profile incidents in recent months and years that have compromised the personal details of consumers, the most recent of which came when Yahoo! admitted the information of up to 500 million users was stolen in 2014.

Therefore, businesses will have to address these fears if they are to get the necessary buy-in from customers to make personalised, big data-based marketing an effective solution.

FCA approves use of big data for insurers

23 Sep 2016 | Posted by: admin | Image credit: iStockphoto/tonefotografia | Categories: #AnalyticsNews

The UK's Financial Conduct Authority (FCA) has signalled its approval of the use of big data in the insurance industry, after revealing it is dropping plans to launch a full inquiry into the technology's use in the sector.

The regulator announced a review into the sector last year, stating it was aiming to better understand how the use of big data analytics in areas such as calculating premiums impacted consumers, before deciding on the next steps.

It has now stated that in light of the "broadly positive consumer outcomes" that big data can deliver to the sector, it will not be proceeding with a full market study at the present time, effectively giving insurers the green light to incorporate big data into their decision-making.

However, the regulator did add a note of caution, observing that there are some areas where big data has the potential to harm consumers. Specifically, it noted that as big data changes the extent of risk segmentation, this may lead to categories of customers finding it harder to obtain insurance. 

The FCA also raised concerns about the potential that big data might make it easier for firms to identify opportunities to charge certain customers more.

Christopher Woolard, director of strategy and competition at the FCA, noted that as the general insurance sector is a vital part of the economy, affecting millions of consumers, it is essential that it works well.

"There is potential for big data to transform practices across general insurance markets, and some consumers are already seeing benefits but there are also some risks to consumer outcomes," he said.

"While we have decided not to launch a full market study, we are undertaking further work in this area and with the Information Commissioner's Office (ICO) to ensure our rules and policies keep pace with developments in the market, but also do not prevent positive innovations."

The FCA's Call for Input found that although big data is able to improve general consumer outcomes, it can also affect how companies determine their pricing. It suggested that as insurers gather increasing amounts of data from a wider range of sources, and apply sophisticated analytical tools to it, pricing based on factors other than risk and cost may become more common throughout the industry.

While it recognised the potential that consumers who are deemed to be at higher risk may be denied coverage, the review has not shown any signs that this is occurring. "However, the FCA will remain alert to the potential exclusion of higher risk customers and will engage with government if concerns begin to develop because of how firms are using big data," the regulator stated.

The FCA also reminded insurers of their responsibilities to consumers when it comes to ensuring their use of data is in line with security regulations and legislation such as the Data Protection Act. The FCA will be co-hosting a roundtable with the ICO later this year on how data should be used in the general insurance sector.
