Forrester once predicted that enterprise adoption of Hadoop would become mandatory. While some companies are still struggling with their Hadoop projects, others are using the big data framework to revolutionize their data storage and analytics.
The advantages of Hadoop — flexibility and lower costs — appeal to enterprises, so Hadoop has fundamentally changed how businesses process and store very large, fast-moving data sets. With additional software like Kognitio, organizations can also achieve high-speed BI and analytics on their Hadoop-based data.
But have you ever wondered which household-named brands and businesses have made a true success of Hadoop for big data analytics, and how?
Here are five businesses successfully using Hadoop:
In 2015, Marks and Spencer adopted Cloudera Enterprise to analyze its data from multiple sources. The goal for the British retail business was to better understand its customers’ behavior.
Marks and Spencer uses Hadoop to plug gaps in campaign management, to manage customer loyalty data, and to draw on data from its digital assets to create more personalized, targeted communications.
Thanks to its decision to use Hadoop, the company can now successfully predict stock demand and uses business analytics to keep its shelves full during peak times.
British postal service company Royal Mail used Hadoop to pave the way for its big data strategy, and to gain more value from its internal data.
The business used Hortonworks’ Hadoop analytics tools to transform the way it managed data across the organization. Royal Mail can now identify customers in particular industries who are most at risk of churn, allowing the sales and marketing teams to take proactive preventative steps. It also enables the company to find new ways of integrating the tech with its more conventional tools.
As a driver of enhanced customer experiences, Royal Bank of Scotland (RBS) decided to use Hadoop (Cloudera Enterprise) to gain intelligence from its online customer chat conversations.
RBS processes around 250,000 chat logs and associated metadata per month, storing this unstructured data in Hadoop. By using a big data management and analytics hub built on Hadoop, the business uses machine learning as well as data wrangling to map and understand its customers’ journeys.
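To make the idea concrete, here is a minimal, pure-Python sketch of the kind of MapReduce-style aggregation a Hadoop job might run over chat-log metadata. All field names, topics, and figures are illustrative assumptions, not RBS's actual schema or workload:

```python
from collections import defaultdict

# Hypothetical sample of chat-log metadata records, as they might look
# after ingestion into Hadoop (field names are illustrative only).
chat_logs = [
    {"customer_id": "c1", "topic": "mortgage", "duration_sec": 320},
    {"customer_id": "c2", "topic": "overdraft", "duration_sec": 140},
    {"customer_id": "c1", "topic": "mortgage", "duration_sec": 95},
    {"customer_id": "c3", "topic": "overdraft", "duration_sec": 610},
]

def map_phase(record):
    # Map step: emit one (topic, duration) pair per chat session.
    yield record["topic"], record["duration_sec"]

def reduce_phase(pairs):
    # Reduce step: sum chat durations per topic.
    totals = defaultdict(int)
    for topic, duration in pairs:
        totals[topic] += duration
    return dict(totals)

pairs = [kv for record in chat_logs for kv in map_phase(record)]
totals = reduce_phase(pairs)
print(totals)  # {'mortgage': 415, 'overdraft': 750}
```

In a real deployment this map/reduce pair would be distributed across a Hadoop cluster over millions of records; the single-process version above only shows the shape of the computation.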
The high street bank is also using big data analytics to delve into transactional data to analyze and identify where customers are paying twice for financial products, and deliver enhanced customer experiences.
British Airways deployed Hadoop in April 2015 as a data archive for legal cases. Previously, these were stored in an enterprise data warehouse, which was costly for the airline.
Since deploying Hortonworks HDP 2.2, British Airways has achieved ROI within a year and has freed up 75% more space for new projects, translating directly into cost reductions for the airline.
“In business intelligence, if you don’t adopt this technology to do at least part of your job role, you will not exist in a few years’ time. You can only go so far with traditional technology. It still has a place within your architecture, but quite frankly, this is where you need to be.”
Alan Spanos, Data Exploitation Manager, British Airways
Expedia runs Hadoop clusters on Amazon Elastic MapReduce (Amazon EMR) to analyze high volumes of data coming from Expedia’s global network of websites, including clickstream, user interaction, and supply data. This data is highly valuable for allocating marketing spend: it is merged with web bookings and marketing spend logs to analyze whether the outlay has translated into increased bookings.
The firm has seen costs drop and can process and analyze higher volumes of data.
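The core of that merge is a join of spend logs against booking counts per channel. A minimal sketch of the idea, with entirely hypothetical channel names and figures (this is not Expedia's actual data model):

```python
# Hypothetical daily aggregates: marketing spend and resulting bookings
# per channel. Channel names and figures are illustrative assumptions.
spend_log = {"search": 12000.0, "display": 8000.0, "email": 1500.0}
bookings = {"search": 480, "display": 160, "email": 90}

def cost_per_booking(spend, booked):
    # Join the two feeds on channel and compute spend per booking —
    # the kind of question the merged data sets can answer.
    return {
        channel: spend[channel] / booked[channel]
        for channel in spend.keys() & booked.keys()
        if booked[channel] > 0
    }

result = cost_per_booking(spend_log, bookings)
print(result)  # e.g. search costs 25.0 per booking, display 50.0
```

At Expedia's scale, the same join would run as a distributed job on EMR rather than as an in-memory dictionary comprehension, but the logic is the same.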
Many high-profile businesses are using Hadoop for lower-cost big data BI and analytics, delivering enhanced customer insights, better user experiences, and greater business returns.
While Hadoop struggles with high-speed interactive analytics, especially with high volumes of concurrent users, software like Kognitio on Hadoop ensures organizations can still run hundreds of high-speed concurrent analytical queries over their big data sets.