Machine learning a key focus for big data initiatives

12 Jul 2016
Image credit: iStockphoto/Pixtum

A large number of companies will look to introduce machine learning capabilities as part of their efforts to exploit big data in the coming years, a new survey has found.

Research by Evans Data found more than a third of big data developers (36 per cent) now use some elements of machine learning in their projects. While the market for this is still largely fragmented, the financial and manufacturing sectors are showing particular interest in the technology, as are businesses looking to take advantage of Internet of Things opportunities.

Janel Garvin, chief executive of Evans Data, explained that machine learning encompasses a range of techniques that are rapidly being adopted by big data developers, who are in an excellent position to lead the way and show what the technology is capable of.

“We are seeing more and more interest from developers in all forms of cognitive computing, including pattern recognition, natural language recognition and neural networks, and we fully expect that the programs of tomorrow are going to be based on these nascent technologies of today,” she said.

The most used analytical model linked closely with artificial intelligence and machine learning development was found to be decision trees, followed by linear regression and logistic regression as the next most cited models.
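
For readers unfamiliar with the three models the survey highlights, here is a minimal sketch of each using Python's scikit-learn library. The toy data, feature names and parameters are invented purely for illustration and have nothing to do with the survey itself.

```python
# Minimal sketch of the three models cited in the survey: decision trees,
# logistic regression and linear regression (scikit-learn, toy data only).
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LinearRegression, LogisticRegression

# Toy data: two features per customer, with a binary label.
X = [[25, 1], [40, 3], [33, 2], [58, 5], [21, 0], [47, 4]]
y = [0, 1, 0, 1, 0, 1]

# Decision tree: the most cited model in the survey.
tree = DecisionTreeClassifier(max_depth=3).fit(X, y)

# Logistic regression: classification via a linear decision boundary.
logit = LogisticRegression().fit(X, y)

# Linear regression: predicting a continuous value (e.g. spend) instead.
spend = [120.0, 340.0, 210.0, 560.0, 90.0, 410.0]
linear = LinearRegression().fit(X, spend)

print(tree.predict([[30, 2]]), logit.predict([[30, 2]]), linear.predict([[30, 2]]))
```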

Logistics, distribution and operations were the company departments found to be most likely to be using advanced data analytics or big data solutions.

Among the survey’s other findings, it was revealed that two-thirds of big data developers are spending at least some of their time instrumenting processes. Meanwhile, 42 per cent are embracing real-time data analytics, while 38 per cent are building capabilities to analyse unstructured data.

The top improvement to data and analytics that developers would like to see is the improved security of off-site data stores.

Most firms set to boost investment in real-time analytics

04 Jul 2016

The vast majority of companies in the retail, technology, banking, healthcare and life sciences sectors will be investing in real-time analytics tools for studying human and machine-generated data.

According to research conducted by OpsClarity, 92 per cent of these organisations expect to increase their focus on streaming data applications within the next 12 months.

To do this, almost four-fifths of respondents (79 per cent) will be reducing investment in batch processing tools, or even eliminating these entirely, as they shift their resources to real-time analytics.

Dhruv Jain, chief executive and co-founder of OpsClarity, stated that the ability to study data in real-time can give businesses a significant competitive advantage in today's digital economy, allowing them to become more agile and innovative.

"With new fast data technologies, companies can make real-time decisions about customer intentions and provide instant and highly personalised offers, rather than sending an offline offer in an email a week later," he said. "It also allows companies to almost instantaneously detect fraud and intrusions, rather than waiting to collect all the data and processing it after it is too late."

One of the key use cases for this technology will be to enhance customer-facing applications. OpsClarity noted that businesses are now able to leverage insights gleaned from multiple streams of real-time data in order to enable timely decisions and responses to queries. 

This type of real-time analysis is now being built directly into customer-facing, business-critical applications. A third of survey respondents (32 per cent) said their real-time solutions would be used primarily to power core customer-facing applications, whereas 29 per cent will be focusing on improving internal processes.

Almost four out of ten professionals (39 per cent) said they would be deploying real-time data analytics for both purposes.

Jay Kreps, chief executive and co-founder of Confluent, added that real-time data and streaming processes are becoming a central part of how modern businesses harness the information available to them.

"For modern companies, data is no longer just powering stale daily reports – it's being baked into an increasingly sophisticated set of applications, from detecting fraud and powering real-time analytics to guiding smarter customer interactions," he continued.

One of the most popular solutions for handling real-time data is Apache Kafka, with 86 per cent of software developers, architects and DevOps professionals using the open-source message broker.

Mr Kreps noted that this provides a real-time platform for thousands of firms, including major companies such as Uber, Netflix and Goldman Sachs.
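
As a rough illustration of how such a message broker is used, the sketch below publishes and reads JSON events with the kafka-python client. The broker address, topic name and event fields are assumptions for the example, not details from the survey.

```python
# Minimal Kafka producer/consumer sketch (kafka-python client).
# Broker address, topic name and event schema are illustrative assumptions.
import json
from kafka import KafkaProducer, KafkaConsumer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Publish a clickstream-style event for downstream real-time analytics.
producer.send("clickstream", {"user_id": 42, "action": "view", "item": "offer-17"})
producer.flush()

# A consumer elsewhere reads the same topic as an ordered event stream.
consumer = KafkaConsumer(
    "clickstream",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)
for message in consumer:
    print(message.value)  # e.g. {'user_id': 42, 'action': 'view', ...}
    break
```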

Meanwhile, Apache Spark is the data processing technology of choice for 70 per cent of businesses, while 54 per cent use HDFS as a data sink.
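
A common way these pieces fit together is a Spark job that reads the Kafka stream and writes results into HDFS. The following PySpark sketch assumes Spark's Structured Streaming API; the broker, topic and HDFS paths are placeholders.

```python
# Sketch of a Kafka -> Spark -> HDFS pipeline using PySpark Structured
# Streaming. Broker, topic and HDFS paths are illustrative placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("realtime-analytics").getOrCreate()

# Read the Kafka topic as an unbounded streaming DataFrame.
events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "localhost:9092")
          .option("subscribe", "clickstream")
          .load())

# Kafka values arrive as bytes; cast to string for downstream parsing.
parsed = events.select(col("value").cast("string").alias("event_json"))

# Continuously append results to HDFS, which acts as the data sink.
query = (parsed.writeStream
         .format("parquet")
         .option("path", "hdfs://namenode:8020/analytics/clickstream")
         .option("checkpointLocation", "hdfs://namenode:8020/checkpoints/clickstream")
         .start())

query.awaitTermination()
```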

Although a wide range of data frameworks are being deployed, and there are strong indications that many of these will be here to stay for the foreseeable future, the survey revealed a strong preference for open source technologies.

Nearly half (47 per cent) of software developers, architects and DevOps professionals say they exclusively use open source, and another 44 per cent use both commercial and open source.

How big data supports 2016’s summer of sport

30 Jun 2016

There are now countless examples of how big data can help companies across all industries improve decision-making, boost customer service and give employees a better insight into the wider industry. But the technology is far more wide-ranging than many people realise.

In fact, big data will have a key role to play in several of this summer's biggest sporting events. 2016 is a big year for international sports, with Euro 2016 and the Copa America Centenario taking place on either side of the Atlantic, before the Rio Olympics gets underway in August.

But many of this summer's events will be heavily reliant on big data, both to help teams and competitors improve their performance and to keep fans up to date on what's going on.

For example, one event that's set to greatly increase its use of big data this year is the Tour de France. With almost 200 riders traversing 3,535km of French countryside, strong TV, radio and online coverage is essential for the fans following along.

This year, they will have a lot more information and insight into the event thanks to big data. Tech Week Europe reports that Dimension Data – which is not only delivering information to race organisers the Amaury Sport Organisation (ASO), but sponsoring its own team – will be providing a huge range of information.

Last year, the firm analysed up to six billion bits of data for every stage, turning it into information to help contextualise the race, and in 2016 it's set to review even more.

Adam Foster, the company's head of sports, said: "This year, we're working with a much broader palette, which means access to more meaningful race data, race routes, riders and current weather conditions. What's exciting this year is the ability to deliver all this information to ASO through a unified digital platform."

Having real-time access to multiple video feeds, social media posts and live race information in a single intuitive interface will "greatly enhance" the coverage of the event, he continued.

However, it is not just the Tour de France where big data will have an expanded role to play this year. 

Forbes noted that Wimbledon, which got underway on Monday (June 28th), will be turning to IBM's Watson analytics and machine learning platform to analyse the hundreds of thousands of social media mentions generated by the event.

Alexandra Willis, head of communications, content and digital at the All England Lawn Tennis and Croquet Club, explained: "This allows us to not just look at and respond to trends, but to actually pre-empt them. We're hoping this will help in our quest, not necessarily to always be first but certainly to be early into the conversation when critical things are happening."

In theory, she said, this should enable the club to monitor interest in a particular court or player and pre-empt any emerging trends before they become apparent on services like Twitter. This will help it curate content for its media output based on what its audience is most likely to be interested in, rather than reacting to trends after the fact, as has been the case in previous years.
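
The underlying idea of pre-empting a trend can be made concrete with a simple spike detector over per-minute mention counts. This is only a sketch of the general technique; it is not how Watson actually works, and the window size and thresholds are arbitrary choices for the example.

```python
# Illustrative trend pre-emption: flag a topic whose mention rate jumps
# well above its recent average, before it becomes an obvious trend.
# Window size and thresholds are arbitrary choices for the example.
from collections import deque

def spike_detector(window_size=30, factor=3.0, min_mentions=20):
    history = deque(maxlen=window_size)  # recent per-minute counts
    def check(count):
        baseline = sum(history) / len(history) if history else 0.0
        history.append(count)
        # Spike: count far above the recent baseline and not just noise.
        return count >= min_mentions and baseline > 0 and count > factor * baseline
    return check

check = spike_detector()
for minute, mentions in enumerate([5, 6, 4, 7, 5, 6, 48]):
    if check(mentions):
        print(f"minute {minute}: possible emerging trend ({mentions} mentions)")
```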

Younger workers most optimistic about big data

28 Jun 2016
Image credit: iStockphoto/cifotar

Companies remain highly optimistic that big data analytics solutions will have a transformative effect on the way they do business, but younger employees are far more confident than their older counterparts, a new survey has found.

IDG's 2016 Data and Analytics Survey found that more than half of businesses (53 per cent) plan to implement big data initiatives within the next 12 months, or are already undergoing such a process.

Overall, 78 per cent of employees agree or strongly agree that the collection and analysis of big data has the potential to fundamentally change the way their company does business in the next one to three years. Meanwhile, 71 per cent agree or strongly agree that big data will create new revenue opportunities and/or lines of business for their company in the same timeframe. 

However, a generational gap is emerging between younger workers who are enthusiastic about the technology and older employees who take a more cautious view. Those aged between 18 and 34 are far more likely than older workers to have a positive view of big data and its potential to transform a business.

"These age-linked differences may be attributable to younger employees being more comfortable with the latest technologies and more inured to the inevitability of technology-driven disruption," IDG stated.

However, it also suggested that as older workers will have seen many hyped developments come and go over their careers, they are less willing to predict that any particular trend will be a source of fundamental change, even one as far-reaching as big data.

The survey also examined the sources of data for use in analytics operations. It found that the average business gathers more than half of its data (54 per cent) from internal sources, while 25 per cent comes from external sources, with 21 per cent being a combination of the two.

The top sources of data for all companies, regardless of size, are sales and financial transactions (56 per cent), leads and sales contacts from customer databases (51 per cent), and email and productivity applications (both 39 per cent).

IDG's survey did note that the types of data firms focus on differ depending on the size of the companies. Larger enterprises are more likely to collect transactional data, machine-generated/sensor data, government and public domain data, and data from security monitoring. However, smaller businesses concentrate their efforts on email, data from third-party databases, social media, and statistics from news media. 

One of the biggest issues for companies of all sizes will be handling unstructured data, such as emails, Word documents and presentations. With no pre-defined data model to organise them, deriving insight from these sources will prove difficult for many firms.

This may be why just 17 per cent of firms view unstructured data as a primary focus for their big data analytics initiatives, while nearly half (45 per cent) rate it as one of their main challenges.

IoT and cloud ‘the future of Hadoop’

24 Jun 2016

The creator of Hadoop, Doug Cutting, has said that cloud computing and Internet of Things (IoT) applications will be the basis for the next phase of growth for the platform.

So far, most deployments of the big data analytics tool have been in large organisations in sectors such as finance, telecommunications and the internet, but this is changing as more use cases emerge for the technology.

Much of this is down to the growing use of digitally-connected sensors in almost all industries, which are generating huge amounts of data that businesses will need to quickly interpret if they are to make the most of the information available to them.

Mr Cutting highlighted several major companies that have already adopted Hadoop to help them handle this huge influx of sensor data.

"Caterpillar collects data from all of its machines," he said. "Tesla is able to gather more information than anyone else in the self-driving business, they're collecting information on actual road conditions, because they have cars sending all the data back. And Airbus is loading all their sensor data from planes into Hadoop, to understand and optimise their processes."

One sector that is on the verge of a revolution in how it manages information is the automotive industry, as a growing number of cars are being equipped with IoT sensors and networking capabilities.

Mr Cutting noted that almost every new car now sold has a cellular modem installed, while almost half of new cellular devices are not phones, but other connected items.

Until now, Hadoop has often been deployed as a key component of a 'data lake', where businesses pool all their incoming data into a single, centralised resource they can dip into in order to perform analytics. However, use cases for IoT typically have a need for data to be exchanged rapidly between end-devices and the central repository.

Therefore, there has been a focus recently on the development of new tools to facilitate this faster exchange of information, such as Flume and Kafka.

Mr Cutting particularly highlighted Apache Kudu as having a key role to play in this. He said: "What Kudu lets you do is update things in real-time. It's possible to do these things using HDFS but it's much more convenient to use Kudu if you're trying to model the current state of the world."
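
For a sense of what "update things in real time" means in practice, here is a minimal sketch with the kudu-python client, upserting the latest state of a vehicle's sensor readings. The master address, table name and column names are assumptions for illustration, not anything described in the article.

```python
# Minimal Kudu upsert sketch (kudu-python client). Master address, table
# name and column names are illustrative assumptions.
import kudu

# Connect to the Kudu master and open an existing table.
client = kudu.connect(host="kudu-master.example.com", port=7051)
table = client.table("vehicle_state")
session = client.new_session()

# Upsert: insert the row if it is new, otherwise update it in place.
# This is what keeps the table modelling the *current* state of each
# vehicle, rather than appending an ever-growing log as HDFS would.
op = table.new_upsert({
    "vehicle_id": "car-0042",
    "speed_kph": 87.5,
    "road_condition": "wet",
})
session.apply(op)
session.flush()
```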

He also noted that while the majority of Hadoop applications are currently on-premises, cloud deployments are growing twice as fast, so it will be vital that providers can deliver ways to embrace this technology in their offerings.

"We are spending a lot of time on making our offerings work well in the cloud," Mr Cutting continued. "We're trying to provide really powerful high-level tools to make the lives of those delivering this tech a lot easier."

How big data helps the hospitality sector

22 Jun 2016

The hospitality sector is a highly competitive part of the economy, with hotels in particular always under pressure to deliver the highest-quality experiences at the lowest cost possible. 

A key challenge for this industry is that in the age of constant connectivity, customers have higher expectations than ever before and will demand a personalised experience. If they do not get this, they will often not have to go far to find a competitor who will meet their needs.

Fortunately, there are steps hotels can take to deliver this service. Anil Kaul, chief executive of Absolutdata Analytics, wrote in a recent piece for Dataquest that big data analytics is a natural partner for the travel and hotel sector, due to the large amount of information that travellers generate.

"Hotel companies can use this data to personalise every experience they offer their guests, from suggesting local restaurants to finding an irresistible price point. They can also use this flood of data to fine-tune their own operations," he stated.

In the past, the hotel sector has not taken full advantage of this vast data source, as many companies did not know how to make the most of it. But as new developments such as mobile technology, powerful analytics solutions and more user-friendly dashboards become available, companies will be able to hugely expand their capabilities.

For example, Mr Kaul stated that on a person-to-person level, smartphone-enabled staff members can pull up instant information about their guests to alert them to needs or requests and help them respond accordingly.

On a wider level, big data can help hotels save money by cutting back on utilities when the location is not at full capacity. Local factors such as the weather or expected events can also be factored in, so room rates can be dynamically adjusted if a major conference is nearby, for example.

It can also help hotels determine which customers will offer the best lifetime value. For instance, Mr Kaul noted that while a guest on a special, once-in-a-lifetime holiday may spend a large amount in their visit, they are unlikely to offer repeat business. On the other hand, a frugal business traveller may seem like a less valuable customer, but if the hotel can make them happy, they could return on a regular basis for years to come.
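
Mr Kaul's comparison is easy to make concrete with a back-of-the-envelope lifetime value calculation; the figures below are invented purely to illustrate the point.

```python
# Toy customer lifetime value (CLV) comparison; all figures are invented.
def lifetime_value(spend_per_visit, visits_per_year, years):
    return spend_per_visit * visits_per_year * years

# A lavish once-in-a-lifetime guest: one big visit, no repeat business.
one_off = lifetime_value(spend_per_visit=2500, visits_per_year=1, years=1)

# A frugal business traveller who returns monthly for five years.
regular = lifetime_value(spend_per_visit=180, visits_per_year=12, years=5)

print(one_off)   # 2500
print(regular)   # 10800 -- the "less valuable" guest is worth far more
```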

By using big data analytics to study trends and identify what customers expect, hotels can better understand what they have to do to deliver a personal service and turn a one-time visitor into a repeat customer.

As well as improving the hotel's performance, a successful big data implementation will result in happier customers and an enhanced reputation for the hotel.

"Big data might still be in the adoption phase for the hotel industry, but it has a lot of benefits to offer," Mr Kaul said. "The data is there; it just needs to be put to work. Hotels that fully leverage it will gain a significant competitive edge."

Salaries on the rise for big data professionals

22 Jun 2016
Big data skills are in demand. Image credit: iStockphoto/cifotart

IT professionals specialising in big data are benefiting from growth in pay as employers show more demand for their skills, research has revealed.

In its latest Tech Cities Job Watch report, IT resourcing firm Experis revealed that average salaries for people with big data expertise have risen by almost eight per cent in a year.

That's almost five percentage points above the Bank of England's projected three per cent pay increase for the whole of Britain.

Experis' research is based on over 60,500 jobs advertised across five key tech disciplines: big data, cloud, IT security, mobile and web development.

The latest figures showed 5,148 big data jobs available in the first quarter of 2016, 87 per cent of which were based in London.

One of the key factors in the recent growth in this sector is the rising importance of personal data for businesses that want to improve their customer understanding and predict forthcoming trends.

Many companies are also bringing big data and compliance skills in-house to ensure they stay in line with new EU data protection regulations.

Geoff Smith, managing director at Experis, said big data will continue to be a "major driver" of UK economic growth as the digital revolution gathers pace.

"Yet, many companies have been slow to react and there's a limited talent pool to choose from," he added.

"Employers are willing to pay highly competitive salaries to attract these experts, so they can help with compliance, uncover valuable customer insights that can transform their business and innovate for the future."

Big data and the Internet of Things are set to add £322 billion to the UK economy within the next four years, according to a recent report from the Centre for Economics and Business Research and software provider SAS.

Manufacturers ‘failing to invest in IoT’

17 Jun 2016
The Internet of Things is set to grow in significance for businesses. Image credit: iStock/Hin255

The Internet of Things (IoT) is one field where many UK manufacturers could be investing more, research has suggested.

Business software provider SAP conducted a survey of 100 senior executives in the manufacturing sector, around a fifth (19 per cent) of whom expected no investment in IoT technologies this year.

As a proportion of overall IT spend, average anticipated investment in these innovations was only eight per cent.

This low level of financial backing for IoT tech and platforms is not the result of a lack of confidence in this burgeoning field, with just over half (51 per cent) of the respondents to the SAP survey saying it could help them cut costs.

Nearly four out of ten manufacturing executives (38 per cent) thought IoT could help their business with product development.

Despite the relatively low level of projected investment, Nayaki Nayyar, general manager and head of IoT Go to Market for SAP, said it was encouraging that businesses recognise how technology can deliver improved insights, proactive measures and better outcomes.

She said IoT is set to have a "major impact" on the manufacturing sector in particular.

"Industry 4.0 and the Internet of Things can enable end-to-end transformation for manufacturing companies and connect the shop floor to the top floor, optimising supply chains and manufacturing operations and ultimately helping them stay competitive," added Ms Nayyar.

Nearly a third (30 per cent) of the manufacturing firms surveyed by SAP identified procurement and supply chain management as areas that could benefit from growth in IoT.

This point was also made in the UK Logistics Confidence Index for the first half of 2016 from Barclays, Moore Stephens and Analytiqa, which was based on the opinions and insights of over 100 chief executives, managing directors and finance directors from the logistics sector.

Nearly six out of ten respondents (58 per cent) said they would be implementing innovative supply chain solutions over the next 12 months. Within this group, 18 per cent of decision makers were set to focus their efforts on IoT.

However, at the top of the list of technologies set to improve supply chain management over the coming year was big data and analytics, cited by 27 per cent of respondents, followed by automation (21 per cent) and cloud computing (21 per cent).

Publishing their findings, the report authors noted that many companies still have a long way to go before they can properly grasp and realise the potential of the latest technologies.

"Some industry observers suggest companies may still be at the beginning of a journey to fully understand the potential solutions that can be developed from the real-time data that is starting to be generated by the IoT," they said.

Last November, Gartner released a report predicting that the number of connected 'things' in use across the globe will reach 6.4 billion this year, up 30 per cent from 2015.

Executive involvement ‘boosts big data profitability’

09 Jun 2016

Companies that ensure business units play a key role in the development of big data analytics solutions are more than twice as likely to be profitable as those managed solely by the IT department.

This is according to new research by Capgemini and Informatica, which found that currently, less than a third of big data initiatives (27 per cent) are profitable. A further 45 per cent are breaking even, while 12 per cent are said to be losing money.

The study noted that the majority of organisations therefore still have significant work to do in order to see a return on investment, but those that have strong support from the C-suite are in a much better position.

Almost half of organisations (49 per cent) that had high levels of executive buy-in reported that their initiatives were profitable, compared with just six per cent of companies that had no executive support.

John Brahim, head of Capgemini's Insights and Data Global Practice team, commented: "The study provides insights into those organisations that are realising positive business impact from their big data investments. The companies that are reaping benefits are embracing business ownership of big data which drives a step-change in performance."

The study also found a significant split between the US and Europe when it comes to taking ownership of big data analytics projects, with almost two-thirds of European firms (64 per cent) having their projects controlled by the CIO, compared with just 39 per cent in the US.

Capgemini noted that projects that are led by the chief operating officer are the most likely to be progressing effectively, while organisations that are turning a profit from their big data also tend to be those that are most effective at managing data governance and quality.

Three-quarters of profitable respondents stated they had made excellent or very good progress in improving data quality and data governance, compared to 50 per cent overall.

"The survey findings show a direct correlation between the use of data quality and governance practices and profitable outcomes from big data projects," stated Amit Walia, executive vice-president and chief product officer at Informatica. 

He added: "Achieving business value repeatedly and sustainably requires focusing investments around the three key pillars of data management: big data integration, big data quality and governance, and big data security."

Capgemini offered several recommendations for businesses that are looking to make the most of their big data initiatives. 

For instance, it stated it will be vital to get buy-in from the very top in order for projects to be successful. Anything below boardroom level will not be enough to effect lasting change.

It also advised businesses to modernise their data warehousing systems and create a "robust, collaborative data governance framework" that enables organisations to react quickly, while also ensuring data security and data quality.

Security issues hindering big data adoption, survey warns

06 Jun 2016

The lack of a clear strategy for securing businesses' most sensitive information could be one of the major barriers that prevent companies from fully exploiting big data analytics, a new survey has warned.

Research conducted by Dataguise revealed that almost three-quarters of enterprises (73 per cent) report that their big data initiatives have been delayed or even abandoned altogether as a result of security concerns.

Even when companies have put in place multiple layers of defences, there is still a lack of confidence that data will be safe. Less than half of respondents (47 per cent) had faith in their security solutions.

One common issue is that too many people have access to data. While four out of five companies (80 per cent) indicated their IT teams were able to access the business' most sensitive information, 40 per cent said test and development teams also had access. More worryingly, nearly a third of firms (29 per cent) indicated that end-users throughout the enterprise maintained the ability to view sensitive information.

When it comes to testing the security measures of their big data initiatives, although 62 per cent stated their solutions had passed audits, 11 per cent reported a failure, while 20 per cent did not know whether or not their systems had passed.

JT Sison, vice-president of marketing and business development for Dataguise, said: "As we have experienced, many companies are throwing everything they have at IT security challenges. The problem is that even multiple point solutions still leave gaps that put these organisations at risk."

This is likely to become an even more widespread problem as companies continue their transition to big data frameworks. The study found that 28 per cent of respondents reported more than a year's experience with these platforms, while a further 38 per cent are in various stages of adoption.

One key step that will be essential if security challenges are to be overcome is to make it clear who has responsibility for this. The research found that 88 per cent of companies stated their IT security team – including the CISO and CIO – would face scrutiny if they encountered a breach, while 47 per cent added that the CEO and board of directors would also shoulder some responsibility.

Dataguise noted that this illustrates how IT teams are at greatest risk if there is a security incident, and so must take the lead in strengthening big data infrastructure to reduce the risk of unauthorised access.

However, it also makes clear that the C-suite needs to focus on this area. Big data analytics initiatives are unlikely to be successful without support from board level, and this needs to extend to ensuring the security of frameworks.
