Financial services firms to embrace real-time analytics

30 Nov 2016

A growing number of companies in the financial services sector are set to upgrade their big data analytics initiatives to include real-time solutions, a new report has claimed.

A study by TABB Group noted there is an increasing understanding in the sector that the value of a given piece of data can be lost almost immediately as it becomes outdated. Therefore, capital markets firms are turning to real-time analytics for activities including risk management, compliance, consumer metrics and turning insight into revenue.

The report's author, Monica Summerville, noted that simply having data is no longer enough, and that traditional ways of thinking about analytics, such as data warehousing and batch-led approaches, no longer apply.

In today's environment, firms must be able to find and act on patterns in incredibly large data sets in real time, while also being able to reference older, stored data as part of a streaming analytics operation without reverting to batch processing.
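
To illustrate the pattern the report describes, the sketch below (all data and field names are hypothetical) enriches a live event stream with pre-loaded historical reference figures, so each event can be checked in real time without falling back to a batch job:

```python
from collections import deque

# Hypothetical historical averages, pre-loaded from a store or cache.
historical_avg_volume = {"AAPL": 1_200_000, "VOD.L": 850_000}

def score_events(event_stream, window_size=100):
    """Flag trades whose volume is unusual versus stored history."""
    recent = deque(maxlen=window_size)  # rolling window of live volumes
    for event in event_stream:          # event: {"symbol": ..., "volume": ...}
        recent.append(event["volume"])
        rolling_avg = sum(recent) / len(recent)
        # Combine static history with streaming state in a single pass,
        # with no reversion to batch processing.
        baseline = historical_avg_volume.get(event["symbol"], rolling_avg)
        if event["volume"] > 3 * baseline:
            yield {**event, "alert": "unusual_volume"}

# Usage: any iterator of events stands in for a live market feed.
feed = iter([{"symbol": "AAPL", "volume": 5_000_000},
             {"symbol": "AAPL", "volume": 900_000}])
for alert in score_events(feed):
    print(alert)
```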

"The market for real time big data analytics is potentially as large as business intelligence, real-time streaming and big data analytics combined," Ms Summerville said. "The most successful approaches understand the importance of data acquisition to this process and successfully combine the latest open source technologies with market leading commercial solutions."

Implementing effective solutions for this will be challenging and requires companies to invest in software, hardware and data, as well as personnel with expertise in the sector.

Therefore, in order to ensure businesses can see a quick return on investment, TABB stated they will have to take big data analytics 'upstream' by layering streaming and static big data sets to support real time analysis of combined data sets. 

Such capabilities will be a key requirement if financial services firms are to progress to technologies like machine learning and other artificial intelligence based analytics.

Ms Summerville said: "We believe the upstream analytics approach will increasingly be adopted throughout the industry in response to industry drivers, an unending desire for new sources of alpha and the rising complexity of investment approaches."

IoT and cloud ‘the future of Hadoop’

24 Jun 2016

The creator of Hadoop, Doug Cutting, has said that cloud computing and Internet of Things (IoT) applications will be the basis for the next phase of growth for the platform.

So far, most deployments of the big data analytics tool have been in large organisations in sectors such as finance, telecommunications and the internet, but this is changing as more use cases emerge for the technology.

Much of this is down to the growing use of digitally-connected sensors in almost all industries, which are generating huge amounts of data that businesses will need to quickly interpret if they are to make the most of the information available to them.

Mr Cutting highlighted several major companies that have already adopted Hadoop to help them handle this huge influx of sensor data.

"Caterpillar collects data from all of its machines," he said. "Tesla is able to gather more information than anyone else in the self-driving business, they're collecting information on actual road conditions, because they have cars sending all the data back. And Airbus is loading all their sensor data from planes into Hadoop, to understand and optimise their processes."

One sector that is on the verge of a revolution in how it manages information is the automotive industry, as a growing number of cars are being equipped with IoT sensors and networking capabilities.

Mr Cutting noted that almost every new car now sold has a cellular modem installed, while almost half of new cellular devices are not phones, but other connected items.

Until now, Hadoop has often been deployed as a key component of a 'data lake', where businesses pool all their incoming data into a single, centralised resource they can dip into in order to perform analytics. However, use cases for IoT typically have a need for data to be exchanged rapidly between end-devices and the central repository.

Therefore, there has been a focus recently on the development of new tools to facilitate this faster exchange of information, such as Flume and Kafka.
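
As a rough illustration of how such a pipeline might be wired up, here is a minimal Kafka producer/consumer pair in Python (the topic name and broker address are hypothetical, and the third-party kafka-python package is assumed):

```python
import json
from kafka import KafkaProducer, KafkaConsumer  # pip install kafka-python

# Device side: publish a sensor reading to a topic as it is generated.
producer = KafkaProducer(
    bootstrap_servers="broker:9092",  # hypothetical broker address
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("vehicle-telemetry", {"vehicle_id": "v42", "speed_kmh": 87.5})
producer.flush()

# Central side: consume readings as they arrive, rather than in batches.
consumer = KafkaConsumer(
    "vehicle-telemetry",
    bootstrap_servers="broker:9092",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)
for message in consumer:
    print(message.value)  # e.g. route into the data lake or a live dashboard
```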

Mr Cutting particularly highlighted Apache Kudu as having a key role to play in this. He said: "What Kudu lets you do is update things in real-time. It's possible to do these things using HDFS but it's much more convenient to use Kudu if you're trying to model the current state of the world."

He also noted that while the majority of Hadoop applications are currently on-premises, cloud deployments are growing twice as fast, so it will be vital that providers can deliver ways to embrace this technology in their offerings.

"We are spending a lot of time on making our offerings work well in the cloud," Mr Cutting continued. "We're trying to provide really powerful high-level tools to make the lives of those delivering this tech a lot easier."

Insurers ‘missing out’ on big data advantages

16 May 2016

Many companies in the life and property/casualty (P&C) insurance sectors are failing to take full advantage of the potential of advanced big data analytics solutions, according to a new report.

Research conducted by Bain & Company revealed that despite some early successes, much of the industry has yet to scratch the surface of what big data technology is capable of. It found that one in three life insurance providers and one in five P&C insurers do not apply big data tools to any business function.

These businesses will therefore lack critical customer insight that can be used to gain a competitive advantage.

Many insurance firms are aware of this issue and have plans in place to increase their spending on big data over the next three to five years. On average life insurance providers expect this to rise by 24 per cent, while P&C providers foresee a 27 per cent increase.

But even among those businesses that are looking to boost their performance, many initiatives will be narrowly focused on two key functions: sales and marketing, and fraud detection. However, these activities are just a small part of what big data analytics can bring to a company.

"In our work with insurers around the world, discussions tend to centre on data management issues and technology investment decisions," said Henrik Naujoks, head of Bain's Financial Services Practice for Europe, the Middle East and Africa and co-author of the brief. 

He added: "Very few are focused on the more important question of how to derive real, valuable insights from the data in order to inform better, more strategic decisions about their business, their processes and, most importantly, their customers."

Bain highlighted three key areas where effective use of big data can help inform decision-making: customer experience, innovation and underwriting.

When it comes to customer experience, for instance, one life insurance provider used big data to develop an algorithm that could identify which prospects could be approved for coverage without an expensive blood test that was previously a standard requirement. As a result, around 30 per cent of applicants no longer needed the test.
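
The report does not describe the insurer's actual model, but a simplified version of this kind of triage algorithm might look like the following scikit-learn sketch, assuming historical application records labelled with the eventual blood-test outcome:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical training data: one row per past applicant
# (age, BMI, smoker flag), labelled 1 if the blood test came back clear.
X = np.array([[34, 22.5, 0], [58, 31.0, 1], [45, 27.0, 0], [29, 24.0, 0]])
y = np.array([1, 0, 1, 1])

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Waive the test only when the model is highly confident the applicant
# would pass anyway; everyone else still gets the standard requirement.
applicant = np.array([[31, 23.0, 0]])
p_clear = model.predict_proba(applicant)[0, 1]
if p_clear > 0.95:
    print("Approve without blood test")
else:
    print("Refer for standard blood test")
```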

Elsewhere, one P&C firm found that underwriting due diligence activities could take up to nine months. But by using big data to analyse its own client database and compare it with US federal data on safety violations when screening potential clients, it has been able to greatly cut down on the number of site inspections by its engineers, which are both expensive and time-consuming.
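
Again purely as a hedged sketch, that screening step could amount to little more than a join between the client list and the violations feed, assuming both are available as extracts with a shared company identifier (all file and column names below are illustrative):

```python
import pandas as pd

# Hypothetical extracts with a shared company identifier.
clients = pd.read_csv("prospective_clients.csv")           # company_id, name, ...
violations = pd.read_csv("federal_safety_violations.csv")  # company_id, severity

# Count serious violations per company and join onto the client list.
serious = (violations[violations["severity"] >= 3]
           .groupby("company_id").size().reset_index(name="violation_count"))
screened = clients.merge(serious, on="company_id", how="left")
screened["violation_count"] = screened["violation_count"].fillna(0)

# Only companies above the threshold get a physical site inspection.
needs_inspection = screened[screened["violation_count"] > 5]
print(needs_inspection[["company_id", "violation_count"]])
```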

This use of big data can also help insurers avoid taking on a high-risk client that could lead to a big payout further down the line.

However, insurers, much like any other business, must recognise that simply investing in analytics solutions alone will not be enough to guarantee success. Instead, analytics must break out of the IT department and be viewed as a key part of the wider business.

Lori Sherer, co-author of the brief and leader of Bain's Advanced Analytics Practice, said: "The most successful insurers break out of the silo and involve business stakeholders across the organisation to inform the analytic development process. The result is insights that are more likely to be adopted by the front line, thereby giving them a competitive leg up in the industry."

Retail banks turn to big data to regain customer trust

13 Apr 2016

For many retail banks, the task of regaining consumer trust in the wake of the financial crisis of 2008-09 will be a difficult and ongoing challenge. With the sector still viewed with suspicion by many people, presenting a more personal face and improving customer service levels will be a high priority.

It was noted by FusionExperience chief executive Steve Edkins in an article for ITProPortal that this has become even more important in today's connected era, where the internet and social media mean dissatisfied customers are able to quickly voice any complaints to a wide audience.

In order to improve their customer service and avoid such issues, many retail banks are therefore turning to big data to offer services tailored to individual customers.

According to a study from the Centre for Economics and Business Research (Cebr), more than four-fifths of retail banks (81 per cent) will have adopted big data analytics by 2020. As well as helping track key industry trends and allowing banks to proactively adapt their strategy, this will also have a key role to play in building profiles of individual customers.

This can be useful at every stage of the customer journey. Mr Edkins noted that initially, big data analytics can be used to more effectively evaluate risk and creditworthiness. Then, when it comes to retaining customers, offering specific deals and tailoring their services accordingly will go a long way towards making consumers feel valued.

However, financial institutions will face two key challenges when it comes to adding big data to their customer service activities. The first will be how they extract relevant information from the huge amount of data they collect – separating the signal from the noise in order to make informed decisions.

The second will be how they collate this data and turn it into a usable format in time to make a difference. Today's fast-paced world demands the ability to extract, analyse and act on insights gained from data quickly if a company wants to maintain a competitive advantage.

"It is no small feat for retail banks to ingratiate big data into their processes as it often requires a daunting technological overhaul," Mr Edkins said, adding that one of the biggest challenges for these firms is getting complex legacy systems in line with today's big data capabilities. These often result in key data being placed in silos, and make it difficult for businesses to get the information they need quickly.

"To rectify this, banks will need to make better use of growing data sets such as correspondence, loan facility letters, contracts and the diversity of customer interactions if they want to offer bespoke consumer products that will allow them to fend off their more agile competitors," he stated.

However, if retail banks can get this right and build a strong customer service culture centred around big data, the rewards on offer are significant. Cebr's data forecasts that effective analysis of data could add £240 billion to the UK's economy through improved efficiency and better understanding of the market and customer demands.

Banks ‘not making the most’ of big data

08 Apr 2016

Many banks should be doing more to turn the wealth of information they have available on their customers into actionable insights, it has been stated.

Speaking to Network World, Deanne Yamato-Tucker, head of banking and financial services at IT consultancy Xavient Information Systems, noted that these institutions now have access to a wide variety of data from consumer-facing products such as apps. However, few of these institutions are effectively analysing this information.

As a result, they are failing to take advantage of new opportunities to re-invent their offerings, deliver higher levels of customer service and develop innovative new products.

By careful use of their customers' data, banks should be able to offer more specific, tailored services to consumers, with rates that are "based on a consumer's banking patterns, levels of deposits, spending patterns, web browsing history, social media information [and] geolocation data", Ms Yamato-Tucker stated.

She added that offerings such as biometric identification, loyalty programmes, savings schemes and interactive money management programmes can all be part of a personalised user experience.

Crucially, much of the data needed to make these innovations a reality is already being collected anyway, so banks would not even have to put in place extensive new information gathering processes in order to learn more about their customers. The key to success will be how they can harness this existing data.

In particular, financial services firms need to improve how they handle metadata in order to make the organisation and analysis of information easier.

"With the growing variety and increasing velocity of data, banks need to develop comprehensive metadata management and data governance processes," Ms Yamato-Tucker said. "One cannot share and understand data effectively, and in a meaningful way, without managing the metadata."

Almost every bank has now set up services such as online and mobile portals that allow users to make payments, transfer funds and check their statements wherever they are. This was described by Ms Yamato-Tucker as the "first round" of banking innovation.

The second, she continued, will be "a ubiquitous customer experience, where the customer, and their devices, as a representation of the customer, is the centre of the mobile ecosystem."

Security to be key big data use for 2016

30 Dec 2015

A growing number of organisations in sectors such as banking and insurance are set to turn to big data analytics in 2016 in order to keep their critical information safe from hackers and other unauthorised users.

This is according to a new forecast for the year ahead from Oracle. It noted that 2016 will see big data become more integral to the day-to-day workings of many businesses.

Neil Mendelson, vice-president of big data and product management at Oracle, said: "2016 will be the year when big data becomes more mainstream and is adopted across various sectors to drive innovation and capture digitisation opportunities."

However, it will be the technology's ability to identify unusual and potentially fraudulent activity that will be of particular interest to the financial services sector.

The company stated: "2016 will witness an increase in the proliferation of experiments [around] default risk, policy underwriting, and fraud detection as firms try to identify hotspots for algorithmic advantage faster than the competition."
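
One common shape for such an experiment is unsupervised anomaly detection. A minimal sketch using scikit-learn's IsolationForest on hypothetical transaction features:

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Hypothetical features per transaction: amount and hour of day.
rng = np.random.default_rng(0)
normal = np.column_stack([rng.normal(50, 15, 500), rng.integers(8, 20, 500)])
suspect = np.array([[4800, 3]])  # a large amount at 3am
transactions = np.vstack([normal, suspect])

# Fit on the bulk of traffic; outliers are labelled -1 by predict().
detector = IsolationForest(contamination=0.01, random_state=0).fit(transactions)
labels = detector.predict(transactions)
print(transactions[labels == -1])  # candidates for a fraud analyst to review
```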

Another key driver for big data security solutions will be increasing public awareness of the numerous ways their personally identifiable information can be collected, shared, stored and stolen. This will in turn lead to more calls for greater regulation to ensure this data is protected.

"The continuous threat of ever more sophisticated hackers will prompt companies to both tighten security, as well as audit access and use of data," Oracle continued.

Among its other predictions, the company forecast increased demand for data scientists from established enterprises, while the emergence of new management tools will allow more businesses to implement technologies such as machine learning, natural language processing and property graphs.

Simpler data discovery tools will also let business analysts identify the most useful datasets within enterprise Hadoop clusters, reshape them into new combinations and analyse them with exploratory machine learning techniques. This will both improve self-service access to big data and provide richer hypotheses and experiments to drive the next level of innovation.

Predictive analytics ‘the future’ for the investment industry

22 Dec 2015

Professionals in the investment sector will come to rely much more heavily on predictive analytics solutions in the coming years when they are looking to analyse their clients' behaviour and measure themselves against competitors.

This was the conclusion of a group of broker-dealer professionals who discussed the future of the industry with InvestmentNews.com at a recent technology conference.

They agreed that the primary goal for the use of big data analytics in the investment sector will be to offer advisers more insight into the decisions being made and identify trends or client risks before they occur.

Aaron Spradlin, chief information officer at United Planners Financial Services, told the publication that to achieve this, professionals will have to turn to outside software in order to effectively add predictive elements to these activities.

"When you look at big data, it's very sophisticated, and there's some really cool tools out there, but it's not easy to do," he said.

There was agreement among the experts that while all the data necessary to identify potential risks and opportunities already exists, it is only recently that the tools to unlock this have become widely available.

For instance, Mr Spradlin stated that until now, the majority of his firm's activities have been focused around data collection, drawing together large amounts of information about advisers and clients.

It can then use this to offer advisers compliance guidance, looking at trends and behaviours and using those to write better alerts, he said.

James Clabby, chief information officer at AIG Advisor Group, added that advisers can already use this detailed client information as a tool to make better decisions. For example, by pulling together a list of all clients with a certain amount of assets and a certain percentage of that in cash, an adviser can engage them on where that money might best be invested.

This is one example of how professionals are using analytics today, with Mr Clabby describing this as "little big data".
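
A query of this kind is straightforward once client records sit in an analysable store. A minimal pandas sketch, with hypothetical column names and thresholds:

```python
import pandas as pd

# Hypothetical client book; figures are illustrative only.
clients = pd.DataFrame({
    "client":       ["Ames", "Bryce", "Chen"],
    "total_assets": [750_000, 2_400_000, 1_100_000],
    "cash":         [90_000, 400_000, 30_000],
})

# Clients with at least $1m in assets and more than 10% of it in cash,
# candidates for a conversation about putting that money to work.
mask = (clients["total_assets"] >= 1_000_000) & \
       (clients["cash"] / clients["total_assets"] > 0.10)
print(clients[mask])
```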

However, moving from this to the next stage of evolution, where predictive analytics can be used to dictate all of a professional's decision-making, will typically require a major investment. 

Mr Spradlin said: "I don't have money to invest significantly in this area, and I think that's where we're hoping for innovation from the industry, from … bigger firms that might step in and say, 'Hey, pass us the data, we'll automate it and pass it back'."

The experts also stated that increased regulation is likely to be one of the big challenges for the investment sector as analytics and the use of potentially sensitive data becomes the norm.

It will also be important to ensure that this information is being handled sensitively so as not to create privacy concerns. Gary Gagnon, vice president of technology for Cambridge Investment Research, warned: "There's the danger that we face that we could become so effective at using big data about individual clients and their decisions and their plans that it might actually alienate some people."

Hadoop adopters ‘must get to grips’ with its complexity

02 Dec 2015

The number of businesses turning to Hadoop to assist with their big data analytics operations is continuing to grow. However, even the most sophisticated users are still struggling to get to grips with the operational complexity of the technology.

This is according to Wikibon, which noted that Hadoop has seen an "unprecedented" rate of innovation recently – primarily because it is an ecosystem rather than a single product, which enables providers to come up with their own solutions. This pace of development has encouraged a large number of businesses to investigate the capabilities of the platform.

According to the 2015 edition of Wikibon's Big Data Survey, 41 per cent of respondents reported they had at least one production deployment of Hadoop – a ten per cent increase from 18 months earlier.

However, analyst at the company George Gilbert observed that prospective users of the technology – as well as those already in the pilot or development stage – need to be aware that there are "no easy solutions" to getting the most out of Hadoop.

He noted that the biggest challenge facing Hadoop users is making the technology manageable. A Hadoop solution involves many moving parts, making it a highly complex, difficult-to-control platform. If businesses do not appreciate this, they are likely to see their total cost of ownership (TCO) spiral as administrative overheads grow and users invest in solutions they later abandon.

One way to tackle these issues is to embrace cloud computing technology and turn to third-party experts to assist with the deployment. Mr Gilbert observed that Hadoop-as-a-Service solutions can simplify some of the management issues associated with the technology – although not all.

Turning to native cloud services like AWS, Azure and Google Cloud Platform can dramatically simplify management, as much of the burden of setting up and administrating a Hadoop system is placed on the service provider.

However, there is a trade-off for this. Mr Gilbert stated that in many cases, businesses that desire this level of simplicity will have to sacrifice choice and openness.

"All the cloud providers will provide ever more powerful DevOps tools to simplify development and operations of applications on their platforms," he said. "But as soon as customers want the ability to plug in specialised third-party functionality, that tooling will likely break down."

Therefore, enterprises may need to decide early whether they are prepared to accept a more limited level of functionality in exchange for reducing some of the complexity of Hadoop.

"Right now the customers with the skills to run Hadoop on-premises are internet-centric companies and traditional leading-edge enterprise IT customers such as banks, telcos, and large retailers," Mr Gilbert said. "Solving the TCO and manageability problem won’t be easy." 

FCA to investigate UK insurance market’s use of big data

26 Nov 2015

The UK's Financial Conduct Authority (FCA) has announced a review into the way general insurance firms in the country use big data analytics to improve their operations.

It has issued a call for input on the subject, with the regulator inviting both consumers and the industry to give their thoughts on how big data impacts on organisations' decision-making and the effects the technology has on customers.

The study will focus on three key questions. It will ask whether big data affects consumer outcomes, whether it fosters or constrains competition in the sector, and if the FCA's regulatory framework affects developments in the retail insurance big data market.

Christopher Woolard, director of strategy and competition at the FCA, said: "Big data is having an ever-growing social and commercial impact, and has the potential to transform practices and products across financial services. We are starting our work on big data by seeking to better understand how insurance firms are using data, and how this may evolve in the future."

He added that the review will influence the FCA's thinking when it comes to determining what steps need to be taken in future to regulate the sector.

It was noted by the Financial Times that big data has become particularly important to the insurance industry, which is able to use detailed information about its customers' behaviour to calculate more personalised premiums.

Insurers can also take advantage of platforms such as social media to verify whether claims are fraudulent. They can, for instance, check whether two people in a seemingly random road accident are connected.
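
That connection check is essentially a graph query. A hedged sketch using the networkx library on a hypothetical relationship graph built from social media connections:

```python
import networkx as nx

# Hypothetical relationship graph: edges are social media connections.
g = nx.Graph()
g.add_edges_from([("driver_a", "user_x"), ("user_x", "user_y"),
                  ("user_y", "driver_b")])

# Two claimants in a "random" accident who sit only a few hops apart
# in the social graph may warrant a closer look.
if nx.has_path(g, "driver_a", "driver_b"):
    distance = nx.shortest_path_length(g, "driver_a", "driver_b")
    if distance <= 3:
        print(f"Claimants connected ({distance} hops): flag for review")
```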

However, a 2013 study by the Association of British Insurers (ABI) suggested 71 per cent of consumers would be unhappy with insurers pricing their products based on information gathered from social media, so companies will have to tread carefully when utilising this type of data.

The ABI stated this week that insurers take care to treat personal data sensitively, adding: "Big data can make insurance work better for customers by improving the claims experience and creating personalised and innovative products."

Big data ‘set to revolutionise supply chain management’

25 Nov 2015

While many early adopters of big data analytics have focused their attention on areas such as marketing, customer service and the financial sector, the technology is attracting interest from a wide range of sectors and a large number of use cases as it matures.

One area where the technology has particular potential is in supply chain management. It was noted by CFO.com that this part of a business is typically rich with information from multiple sources, while at the same time being a major cost centre for many organisations. As such, there are huge opportunities to leverage this information and make processes more efficient.

Regenia Saunders and Jason Meil from management consulting firm SSA & Co wrote that many current operations are not taking full advantage of this potential.

"They are optimising, but not strategically," they stated. "When applying data to [the] supply chain, it's critical to step back and look at what truly drives business value."

A common problem businesses encounter when they try to apply big data analytics to their supply chain is they end up focusing on the wrong areas. To counter this, businesses need to devote more time to the planning stage. This is often the most difficult part of the process to get right, but mistakes here can have the biggest impact on overall costs.

"We've found with our clients, again and again, that big data can have a measurable impact on driving greater accuracy in planning, ensuring that companies make the right amount of the right product," Ms Saunders and Mr Meil said.

By deploying advanced algorithms and machine learning, businesses can see increased forecast accuracy across their SKUs, which can lead directly to less waste, less inventory, and fewer stock-out issues.
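
As a simplified illustration of such a forecast, the sketch below predicts next week's demand for a single SKU from its three most recent weeks of (hypothetical) sales, using ordinary least squares on lag features rather than the more sophisticated models the authors have in mind:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical weekly unit sales for one SKU.
sales = np.array([120, 135, 128, 150, 160, 155, 170, 180, 175, 190])

# Build lag features: predict this week's demand from the previous three.
n_lags = 3
X = np.array([sales[i:i + n_lags] for i in range(len(sales) - n_lags)])
y = sales[n_lags:]

model = LinearRegression().fit(X, y)
next_week = model.predict(sales[-n_lags:].reshape(1, -1))[0]
print(f"Forecast for next week: {next_week:.0f} units")
# Better forecasts mean less safety stock and fewer stock-outs per SKU.
```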

This will be particularly important in today's retail environment, which is seeing increased volatility in consumer buying patterns. At the same time, the fast pace of growth in emerging markets can often make it tricky to predict where organisations should be focusing their efforts.

Therefore, strong use of analytics will be essential in meeting demand and ensuring companies are aligned to the needs of the market. For example, if the data suggests that customers prioritise convenience above all else, this can indicate that the organisation should explore how it can optimise its supply chain to get products to consumers as quickly as possible.

On the other hand, if quality is found to be a key driver, investments in R&D, product lifecycle management, supplier relationship management, and manufacturing should be prioritised.
    
"These supply chain decisions directly impact financial allocations," Ms Saunders and Mr Meil said. "Making the best decision for the organisation requires identifying and measuring key performance indicators (KPIs) directly related to key supply chain areas."

To be effective, supply chains must be lean, agile and externally-focused. Effective big data analytics solutions can help a business meet these goals and position itself for success.
