How Tesco is diving into the data lake


Posted By : admin Comments are off
tesco data lake, big data, forecasting
Categories :#AnalyticsNews

An effective big data analytics solution is now an essential requirement for any large business that wishes to be successful in today's competitive environment, regardless of the sector it operates in.

However, one part of the economy that particularly stands to benefit from this technology is retail. These firms have a longstanding tradition of gathering and utilising customer data, so the ability to gain greater insight from the information they already have will play a key role in their decision-making.

One company that has always been at the forefront of this is UK supermarket Tesco. Forbes noted that the company was one of the first brands to track customer activity through its loyalty cards, allowing it to deliver personalised offers to shoppers.

Now, however, it is turning to technologies such as real-time analytics and the Internet of Things in order to keep up with newer competitors such as Amazon, which is moving into the grocery business.

Vidya Laxman, head of global warehouse and analytics at the supermarket, told the publication: "We are focused on data now and realise that to get where we want to be in five years' time, we have to find out what we will need now and create the right infrastructure."

She added that Tesco is focusing on technologies such as Hadoop, which is central to the 'data lake' model that the company is working towards. This will be a centralised, cloud-based repository for all of the company's data, designed to be accessible and usable by any part of the organisation whenever it is needed.

Ms Laxman explained that one challenge for the company has been ensuring that the right data gets to where it needs to go, as different departments often need different information. For example, finance teams need details on sales and forecasts, while the customer side of the business needs data that can be used to inform marketing campaigns.
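As a toy illustration (not Tesco's actual architecture – all dataset names and fields below are hypothetical), the data lake idea is a single shared store from which each department reads only the slice it needs:

```python
# Toy sketch of the data-lake pattern: one central repository,
# with each department pulling only the datasets it needs.
# Names and fields are invented for illustration.

data_lake = {
    "sales": [
        {"store": "S1", "week": 1, "revenue": 120_000},
        {"store": "S2", "week": 1, "revenue": 95_000},
    ],
    "forecasts": [
        {"store": "S1", "week": 2, "predicted_revenue": 125_000},
    ],
    "customers": [
        {"id": "C1", "segment": "families", "favourite_category": "produce"},
    ],
}

def finance_view(lake):
    """Finance needs sales actuals and forecasts."""
    return {name: lake[name] for name in ("sales", "forecasts")}

def marketing_view(lake):
    """The customer side needs data to inform campaigns."""
    return {name: lake[name] for name in ("customers",)}

# Each team queries the same store, but sees only its own slice.
total_revenue = sum(row["revenue"] for row in finance_view(data_lake)["sales"])
print(total_revenue)  # 215000
```

The point of the pattern is that both views draw on one repository rather than separate departmental silos, which is what distinguishes a data lake from traditional per-team data warehousing.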

"We have data scientists in all of our organisations who need access to the data," she said. "That's where Hadoop comes into the picture. We've just started on this journey – we've had data warehousing for some time so there are some legacy systems present and we want to leverage what’s good and see where we can convert to using new strategies."

A key priority for Tesco's activities will be to increase the speed of data processing in order to better support activities such as real-time modelling and forecasting.

Under a traditional way of working, it may take nine or ten months just to ingest the relevant data. Therefore, improving these processes will be essential to the success of big data initiatives.
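One generic way to avoid bulk re-ingestion – a minimal sketch of a common technique, not a description of Tesco's system – is to update a forecast incrementally as each new observation arrives, for example with simple exponential smoothing:

```python
# Minimal sketch of real-time forecasting via simple exponential
# smoothing: each new observation updates the running forecast in
# O(1), so there is no need to re-ingest historical data in bulk.
# This is a generic technique, not Tesco's actual method.

def update_forecast(forecast, observation, alpha=0.3):
    """Blend the latest observation into the running forecast.

    alpha controls how strongly recent data outweighs history.
    """
    return alpha * observation + (1 - alpha) * forecast

forecast = 100.0  # initial estimate of weekly demand (hypothetical units)
for demand in [110, 105, 120, 98]:
    forecast = update_forecast(forecast, demand)

print(round(forecast, 2))
```

Because the state carried between updates is a single number, this style of model can run against a live event stream rather than waiting months for a batch ingestion cycle.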

Another factor helping Tesco is an increasing reliance on open source solutions. Mike Moss, head of forecasting and analytics at Tesco, told Forbes that when he began developing his first forecasting system for the company eight years ago, any use of open source required a lengthy approval process to get it signed off.

"There wasn't the trust there in the software," he said. "It now feels like we're in a very different place than previously … Now we have freedom and all the engineers can use what they need to use, as long as it's reasonable and it makes sense."

What happened to the ‘data gravity’ concept?



A few years ago, one of the emerging thoughts in the data storage sector was the idea of 'data gravity' – the concept that the information a business generates has mass that affects the services and applications around it. The more data firms create, the more 'pull' it has on surrounding parts of the organisation.

The term was coined back in 2010 by Dave McCrory. In his original post, he spelled out how as data volumes grow, the effect they have on other parts of the IT environment becomes more pronounced – in much the same way that a larger planet or star exerts a greater gravitational pull than a smaller one.

Back then, when big data was still in its infancy for many companies, there was a great deal of uncertainty about the impact that growing volumes of data would have on a business, and Mr McCrory's concept helped get IT professionals used to the idea of data as having a tangible, real-world impact on how a firm operates.

These days, it's not a term you hear very often. But why is this? It's not that the concept hasn't worked out; rather, as big data technology has evolved, it has been overtaken as the accumulation of vast quantities of data becomes the new normal for many firms – the influence has shifted from the local gravity of a single 'planet' to market-scale gravity.

When Mr McCrory first described the concept, tools like Hadoop were still a long way away, and the impact that the platform has since had on the big data market has been huge. As a result, the notion that data exerts a 'pull' on individual parts of the IT department has grown into an enterprise-level influence.

Many strategies are now guided more by ideas such as the 'data lake' – where all of a business' generated information is pooled into a central resource that any part of the organisation can dip into whenever needed. Perhaps this is the ultimate evolution of the gravity concept – a data black hole, though hopefully one from which information can still escape.

The idea of data having 'mass' that can affect other parts of the business hasn't gone away – it's just become the accepted truth, the norm, as more companies put data, and the information derived from it, at the heart of their activities.

Telcos aiming to boost customer care through big data



Using big data analytics to improve customer care solutions will be the top priority for telecommunications companies in the coming years, a new survey has found.

Research conducted by Guavus revealed that 87 per cent of network providers have either already implemented a big data strategy or are in the process of doing so. The primary drivers for the adoption of such services include maximising revenue, named by 66 per cent of respondents, boosting customer experience and loyalty (61 per cent), and cutting operational expenditures (also 61 per cent).

However, over the next two years, it will be improving customer care that is the focus of these activities. The study found that 57 per cent of respondents named this as the top issue they are looking to address over the period, ahead of revenue assurance (48 per cent), improving targeted offerings (47 per cent) and better service assurance (44 per cent).

Anukool Lakhina, founder and chief executive of Guavus, said it is no surprise to see that proactive customer care will be the top area of investment for 2016 and beyond, as in today's competitive environment, "providing a seamless customer experience holds the key to safeguarding operator revenue streams".

He added that being able to gain a complete, end-to-end picture of subscribers' experiences enables telcos to intervene quickly as soon as potential issues are detected. This means they can remedy any service degradations, prevent churn and raise customer satisfaction for increased loyalty – ultimately leading to improved revenue.

Mr Lakhina stated that as companies become more familiar with big data and their strategies mature, the focus is shifting away from simply collecting and analysing very large data sets towards being able to derive actionable intelligence from their information.

"Operators have realised that the ability to fuse data streams and bridge the gap between business and operational data is essential to achieving this goal," he said. "However, it's also vital to strip out only the most valuable nuggets of data for analysis, as trying to store everything will increase costs, delay time to insight and devalue the quality of the analytics provided."

An inability to integrate data from disparate systems is currently one of the biggest barriers to success for many telcos, with 28 per cent of respondents listing this as a problem. This was followed by poor data quality and management (25 per cent) and finding personnel with the right skills to handle such projects.

The study also found that a large number of telcos remain dubious over the value of data lakes, despite the fact these solutions have been hyped as one of the keys to big data success. Only 22 per cent of respondents said data lakes are a critical part of how they bring disparate data together, while some 68 per cent of network operators stated they remain unsure about these tools, or are waiting to see whether they will emerge as more than just hype.