Insurers ‘missing out’ on big data advantages

16 May 2016 | Categories: #AnalyticsNews

Many companies in the life and property/casualty (P&C) insurance sectors are failing to take full advantage of the potential of advanced big data analytics solutions, according to a new report.

Research conducted by Bain & Company revealed that despite some early successes, much of the industry has yet to scratch the surface of what big data technology is capable of. It found that one in three life insurance providers and one in five P&C insurers do not apply big data tools to any business functions.

These businesses will therefore lack critical customer insight that can be used to gain a competitive advantage.

Many insurance firms are aware of this issue and have plans in place to increase their spending on big data over the next three to five years. On average, life insurance providers expect this to rise by 24 per cent, while P&C providers foresee a 27 per cent increase.

But even among those businesses that are looking to boost their performance, many initiatives will be narrowly focused on two key functions: sales and marketing, and fraud detection. However, these activities are just a small part of what big data analytics can bring to a company.

"In our work with insurers around the world, discussions tend to centre on data management issues and technology investment decisions," said Henrik Naujoks, head of Bain's Financial Services Practice for Europe, the Middle East and Africa and co-author of the brief. 

He added: "Very few are focused on the more important question of how to derive real, valuable insights from the data in order to inform better, more strategic decisions about their business, their processes and, most importantly, their customers."

Bain highlighted three key areas where effective use of big data can help inform decision-making: customer experience, innovation and underwriting.

When it comes to customer experience, for instance, one life insurance provider used big data to develop an algorithm that could identify which prospects could be approved for coverage without the expensive blood test that was previously a standard requirement. As a result, around 30 per cent of applicants no longer needed the test.
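
The article doesn't describe the insurer's model, but this kind of triage is typically a supervised classifier with a conservative decision threshold. Below is a minimal, purely illustrative sketch in Python using scikit-learn; the features (age, BMI, self-reported conditions), the toy data and the threshold are all invented assumptions, not details from the Bain brief.

```python
# Hypothetical sketch: triaging life insurance applicants so that low-risk
# prospects can skip the blood test. All features and data are illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy historical data: [age, BMI, number of self-reported conditions]
X_train = np.array([
    [25, 22.0, 0], [34, 24.5, 0], [52, 31.0, 2],
    [61, 29.5, 3], [29, 23.0, 0], [47, 27.5, 1],
])
# 1 = the blood test later revealed an issue, 0 = it did not
y_train = np.array([0, 0, 1, 1, 0, 1])

model = LogisticRegression().fit(X_train, y_train)

def needs_blood_test(applicant, threshold=0.2):
    """Send only applicants above the risk threshold for testing."""
    risk = model.predict_proba([applicant])[0][1]
    return risk >= threshold

print(needs_blood_test([30, 23.5, 0]))  # likely False: skip the test
print(needs_blood_test([58, 30.0, 2]))  # likely True: order the test
```

The conservative threshold matters here: the cost of wrongly skipping a test is far higher than the cost of ordering an unnecessary one.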

Elsewhere, one P&C firm found that underwriting due diligence activities could take up to nine months. But by using big data to analyse its own client database and compare it with US federal data on safety violations when screening potential clients, it has been able to greatly cut down on the number of site inspections by its engineers, which are both expensive and time-consuming.
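
Neither the datasets nor the matching logic are named in the article, so the following is only a rough sketch of what that screening step could look like: a simple join between a client list and a table of safety violations, with invented data.

```python
# Illustrative sketch: flag prospective clients with federal safety violations
# so engineers only inspect the higher-risk sites. All data is made up.
import pandas as pd

clients = pd.DataFrame({
    "company": ["Acme Logistics", "Beta Chemicals", "Gamma Foods"],
    "state": ["TX", "NJ", "CA"],
})

violations = pd.DataFrame({
    "company": ["Beta Chemicals", "Delta Mining"],
    "violation_count": [4, 7],
})

screened = clients.merge(violations, on="company", how="left")
screened["violation_count"] = screened["violation_count"].fillna(0)

# Only clients with a violation history get a site inspection
needs_inspection = screened[screened["violation_count"] > 0]
print(needs_inspection)
```

Clients with no violation history can then skip the costly engineer visit altogether.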

This use of big data can also help insurers avoid taking on a high-risk client that could lead to a big payout further down the line.

However, insurers, much like any other business, must recognise that simply investing in analytics solutions alone will not be enough to guarantee success. Instead, analytics must break out of the IT department and be viewed as a key part of the wider business.

Lori Sherer, co-author of the brief and leader of Bain's Advanced Analytics Practice, said: "The most successful insurers break out of the silo and involve business stakeholders across the organisation to inform the analytic development process. The result is insights that are more likely to be adopted by the front line, thereby giving them a competitive leg up in the industry."

7 Social Media Platforms To Build Brand Image For Businesses

12 May 2016 | Categories: #AnalyticsNews

A few years ago, small businesses were in a dilemma, doubting whether social media was worth investing in. Now the question is no longer whether you need to be using social media, but how to use it effectively and efficiently to drive your business forward. iDigic, a provider of social media services for business, recently published a 'Factosocial 101' infographic with statistics that clearly describe how the various social media sites influence a firm's brand value and business. Small businesses can sometimes feel that a social media presence is a luxury rather than a necessity, but social media can be a huge boon for small business, and the time they invest will definitely pay off.

Impact of Social Media Worldwide

Did you know there are more than 523,000,000 users on social media platforms worldwide?

[Infographic: Social Media 101 For Business - Factosocial]

Article and Infographic supplied by iDigic

The amount of engagement and the number of followers that social media sites like Facebook, Twitter, Instagram and Pinterest bring is enormous. For businesses, social media is a tool for anticipating customer requirements and making sure their offering reaches customers just the way they want it.

Let's look at how effective each social media platform is at achieving business goals:

Facebook

Regarded as one of the most relevant social media sites, Facebook steamrolls the competition, and its 1.62 billion users are clear proof of that, with the count increasing every day. Like most famous things, Facebook has immense potential for your business and its marketing needs. Statistics show that about 42% of businesses accept that Facebook is important to them. Start-ups can post details of their firm, their stories, and the various products and services they sell, add friends, and work to earn likes for each post they publish. About 51% of the people who like your page may convert into customers, and on average individuals spend about one hour every day on Facebook.

Twitter

Similar to Facebook, this 140-character platform gives businesses great leverage through social interaction. With more than 320 million users from all around the world, it is perfect for businesses looking to increase their engagement rate. Many experts consider it one of the most powerful tools for building a target audience, and it makes it easy to develop trust and engage with followers. Did you know that tweets with two hashtags get 21% more engagement? Twitter is also a strong hub for reaching women: about 55% of Twitter users are women, which is why Twitter is filled with posts related to fashion and cuisine.

Instagram

It is surprising to learn that about 93% of the world's famous brands are using Instagram. This is one of the best social media platforms for sharing vivid images with your friends via a clean, uncluttered interface, and it has undoubtedly become one of the most popular social networks for building brands.

Just like other social media sites, Instagram gives businesses the chance to interact with potential clients and nudge them towards a purchase. Various studies show that about 68% of Instagram users engage with popular brands every day, and social media marketers like to use Instagram to publicise their customers' products and services.

Pinterest

If you are on the lookout for a social media site built on visual content, Pinterest is a perfect choice, second only to Instagram. Like Twitter, it is ideal for businesses that deal in products related to fashion and cuisine, so it is not surprising that 79% of Pinterest users are female.

Compared to Facebook, Pinterest generates 27% more revenue from clicks, and about 32% of individuals would rather surf and pin pictures on Pinterest than watch TV. Various online brands use Pinterest to build a solid base for easy interactions with their audience.

Linkedin

When it comes to professional social networks, LinkedIn is beyond any doubt the biggest among working professionals. About 41% of its users use it for marketing their products and services. Think of it as a place where individuals from various industries discuss their corporate lifestyles and goals and create long-lasting relationships.

Various firms have a solid strategy that involves using their LinkedIn profile to enhance their online credibility. Furthermore, 97% of staffing departments use LinkedIn for recruitment and hiring activities. On average, an individual spends about two hours on LinkedIn.

Google+

Another famous social media tool is Google+, where 41% of users interact with their favourite brands. Even though it was backed by the giants of the internet themselves, it initially had a very shaky start; later the folks at the big G pulled their act together, and the network is now growing by leaps and bounds. More than 70% of the top brands are using Google+ to market their products and services, and 51% of digital marketing professionals use it because it is interactive as well as easy to use. Various statistics show that Google+ has about 540 million active users and the biggest number of visits each month compared to other platforms.

YouTube

For a long time YouTube has been associated with free video streaming, and since its takeover by Google it has reached new heights. About 43% of new users will purchase a product after seeing its ad on YouTube, and firms experience a 31% increase in traffic when using YouTube ads.

In Summary

Social media is one of the major forces in business today. For many businesses it has become critical to get their brand online, and they see social media as the first step. However, simply picking the right social media platforms cannot guarantee success unless you plan out the steps and strategies for using them.

From Kognitio's perspective, all of this social interaction and engagement generates data, lots and lots of data. The faster you can interact with, query and analyse this data, the greater the likelihood of a high-value interaction or outcome event – ultimately what your business wants! So if your analytics can't keep up, or are not fast enough to deliver the rapid turn-around needed for an immediate outcome – come talk to us.

White House warns on big data risks

11 May 2016 | Categories: #AnalyticsNews

A new report on the big data analytics sector from the White House has warned businesses that they must consider the ethical implications of their deployments and ensure that they are not discriminating against any individuals through their use of data.

The study, titled "Big Data: A Report on Algorithmic Systems, Opportunity, and Civil Rights", noted that if used correctly, big data can be an invaluable tool in overcoming longstanding biases and revisiting traditional assumptions.

For instance, by stripping out information such as race, national origin, religion, sexual and gender orientation, and disability, big data solutions have the potential to prevent discriminatory harm when it comes to activities such as offering employment, access to finance or admission to universities. However, the report warned that if care is not taken with the implementation of these technologies, they could exacerbate any problems.

One of the big challenges is that despite what many people assume, big data is not necessarily impartial. It can be subject to a range of issues such as imperfect inputs, poor logic and the inherent biases of the programmer.

"Predictors of success can become barriers to entry; careful marketing can be rooted in stereotype. Without deliberate care, these innovations can easily hardwire discrimination, reinforce bias, and mask opportunity," the study stated.

For instance, poorly selected data, incomplete or outdated details and unintentional historical biases could all result in the wrong data being fed into big data systems. Meanwhile, poorly designed algorithms can also cause problems if they assume correlation equals causation, or if personalised recommendations use too narrow a set of criteria to infer a user's true preferences.
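
The report doesn't mandate a particular test, but one simple, widely used check for this kind of harm is the disparate-impact ratio (the 'four-fifths rule'): compare the rate of positive outcomes across groups and investigate when the ratio falls below 0.8. A minimal sketch in Python, with invented groups and decisions:

```python
# Illustrative disparate-impact check: compare approval rates between groups.
# A ratio below 0.8 (the 'four-fifths rule') is a common warning sign.
import pandas as pd

decisions = pd.DataFrame({
    "group":    ["A", "A", "A", "A", "B", "B", "B", "B"],
    "approved": [1,   1,   1,   0,   1,   0,   0,   0],
})

rates = decisions.groupby("group")["approved"].mean()
ratio = rates.min() / rates.max()
print(rates.to_dict())               # {'A': 0.75, 'B': 0.25}
print(f"impact ratio: {ratio:.2f}")  # 0.33 -> investigate the model
```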

The report highlighted several case studies that illustrate how big data can be used to improve outcomes – as well as some of the pitfalls that need to be avoided.

For example, it noted that many people in the US have difficulty gaining access to finance because they have limited or non-existent credit files. This is an issue that particularly affects African-American and Latino individuals, who are nearly twice as likely to be 'credit invisible' as whites.

Big data presents a great opportunity to improve access to credit, as it can draw on many more sources of information in order to build a picture of an applicant. This may range from phone bills, previous addresses and tax records to less conventional sources, such as location data derived from use of cellphones, social media data and even how quickly an individual scrolls through a personal finance website.

However, it warned: "While such a tool might expand access to credit for those underserved by the traditional market, it could also function to reinforce disparities that already exist among those whose social networks are, like them, largely disconnected from everyday lending.

"If poorly implemented, algorithmic systems that utilise new scoring products to connect targeted marketing of credit opportunities with individual credit determinations could produce discriminatory harms." 

The report also included a number of recommendations for improving big data outcomes, such as increasing investments in research, improving training programmes, and developing clear standards for both the public and private sector.

"Big data is here to stay; the question is how it will be used: to advance civil rights and opportunity, or to undermine them," it added.

Enterprises set to boost investment in big data infrastructure

29 Apr 2016 | Categories: #AnalyticsNews

Businesses around the world are set to hugely increase their investments in hardware and networking solutions to support big data analytics operations in the coming years, amid a growing recognition that more robust solutions will be required to cope with the huge influx of data.

Research by Technavio predicts that between 2015 and 2020, global spending on servers, storage and networking equipment to support big data activities is set to increase at a compound annual growth rate (CAGR) of 33 per cent, reaching $26.95 billion (£18.48 billion) by the end of the forecast period.

The biggest sub-sector within this market will be for storage tools, which is set to grow from $3.99 billion in 2015 to $17.6 billion by 2020. The study noted that big data storage will require a huge amount of data handling capacity. In particular, solutions that offer a high input/output per second rate will be essential in ensuring that big data analytics activities can be conducted quickly and effectively.
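
As a quick sanity check on those projections: compound annual growth is just (end / start)^(1/years) - 1. A short sketch using the figures quoted above:

```python
# Compound annual growth rate: (end / start) ** (1 / years) - 1
def cagr(start, end, years):
    return (end / start) ** (1 / years) - 1

# Storage: $3.99bn in 2015 to $17.6bn in 2020
print(f"{cagr(3.99, 17.6, 5):.1%}")  # ~34.6% a year

# The overall market reaching $26.95bn in 2020 at a 33% CAGR implies a
# 2015 base of roughly $26.95bn / 1.33**5
print(f"{26.95 / 1.33 ** 5:.2f}")  # ~6.48, i.e. about $6.5bn
```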

Technavio noted the major types of storage used in big data are network-attached storage and clustered network-attached storage. Both of these are particularly useful when it comes to storing large quantities of unstructured data.

Amrita Choudhury, a lead analyst at Technavio for automatic identification systems, added: "Introduction of object storage is a new trend, which is expected to drive the global storage market over the next four years. It is an architecture that considers data as objects, instead of file hierarchy and block storage."

The need for improved big data analytics solutions is also a key driver for the server and networking markets, as many enterprises are focusing on increasing their performance in this area in order to keep pace with their competitors.

There is also a growing realisation that data is now a key factor in determining a company's market value, which is driving the adoption of analytics tools and supporting systems in many countries around the world.

Servers that feature direct attached storage are the most common solutions for companies that are thinking specifically about how their infrastructure can better support data processing activities.

"The servers possess features such as high compute intensity, better virtualisation capabilities, modular design, scaling capacity, highly efficient memory, and processor power utilisation, which make it an inevitable part of big data structure," Ms Choudhury stated. 

Global spending on server solutions for big data is expected to reach $6.6 billion by 2020, while networking tools, including cabling, routers and switches, will grow at a 36.49 per cent CAGR to reach $2.75 billion.

What happened to the ‘data gravity’ concept?

25 Apr 2016 | Categories: #AnalyticsNews

A few years ago, one of the emerging thoughts in the data storage sector was the idea of 'data gravity' – the concept that the information a business generates has mass that affects the services and applications around it. The more data firms create, the more 'pull' it has on surrounding parts of the organisation.

The term was coined back in 2010 by Dave McCrory. In his original post, he spelled out how as data volumes grow, the effect they have on other parts of the IT environment becomes more pronounced – in much the same way that a larger planet or star exerts a greater gravitational pull than a smaller one.

Back then, when big data was still in its infancy for many companies, there was a great deal of uncertainty about the impact that growing volumes of data would have on a business, and Mr McCrory's concept helped get IT professionals used to the idea of data as having a tangible, real-world impact on how a firm operates.

These days, it's not a term you hear very often. But why is this? It's not that the concept hasn't worked out; rather, as big data technology has evolved, it has been overtaken as the accumulation of vast quantities of data becomes the new normal for many firms – the influence has moved from local 'planet' gravity to cosmos-scale 'market' gravity.

When Mr McCrory first described the concept, tools like Hadoop were still far from mainstream adoption, and the impact the platform has since had on the big data market has been huge. As a result, the notion that data has a 'pull' on just parts of the IT department has progressed to an enterprise-level influence.

Many strategies are now guided by ideas such as the 'data lake', where all of a business's generated information is pooled into a central resource that the business can dip into whenever it needs. Is this the ultimate evolution of the gravity concept – a data black hole? Hopefully one from which information can still escape!

The idea of data having 'mass' that can affect other parts of the business hasn't gone away – it's just become the accepted truth, the norm, as more companies put data, and the information derived from it, at the heart of their activities.

Royal Mail embraces big data to boost performance

20 Apr 2016 | Categories: #AnalyticsNews

As the UK's largest mail carrier, dealing with billions of items every year, Royal Mail is a company well-used to managing huge volumes of information. But when it comes to improving how it handles its own digital data, the business is still in the rollout stage.

Speaking at the recent Hadoop Summit in Dublin, Thomas Lee-Warren, director of the firm's Technology Data Group, explained that the company has turned to Hadoop as the basis of a drive to gain more value from its internal data.

He told ComputerworldUK that as every item Royal Mail delivers is tracked, it has a huge amount of data at its disposal. 

"We are about to go up to running in the region of a hundred terabytes, across nine nodes," he said. 

One of the key challenges was to reduce the time spent moving information around the business. Previously, Mr Lee-Warren estimated, the company's data insights team could spend up to 90 per cent of its time simply moving data backwards and forwards between its data warehousing solution and its analytical solution.

However, the organisation's Hadoop platform, which uses a Hortonworks deployment of the open-source software, eliminates much of this and helps Royal Mail get closer to its goal of data analysts spending 90 per cent of their time exploiting data and making it available to the rest of the business.
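
The article doesn't say which tools sit on top of the Hortonworks cluster, but the 'stop shuttling data around' idea is easy to picture with something like Spark SQL, where the aggregation runs where the data already lives. This is a hypothetical sketch; the table path, column names and status values are all invented:

```python
# Hypothetical sketch: querying tracking data in place on a Hadoop cluster
# with Spark SQL, instead of exporting it to a separate analytics system.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("tracking-analysis").getOrCreate()

# Read the (invented) tracking events table straight from HDFS
events = spark.read.parquet("hdfs:///data/tracking_events")

# Aggregate without moving the data: late deliveries per depot
late_by_depot = (
    events.filter(events.status == "DELIVERED_LATE")
          .groupBy("depot_id")
          .count()
          .orderBy("count", ascending=False)
)
late_by_depot.show(10)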

"We're accelerating that whole process, we're not having to spin up projects just to get data," Mr Lee-Warren said. "We are able to accomplish a huge amount of work with single individuals."

The company is still building out its big data analytics solution, and is taking a measured approach to the technology. As Royal Mail has relatively few resources it can devote to the area, it has to keep a tight focus on projects that can deliver a specific return on investment.

For example, one solution the data insights team is working on is churn modelling in order to help reduce customer attrition. By studying the data, Royal Mail can help its business units identify customers in particular industries who are most at risk of churn, so the sales and marketing teams can take proactive steps to avoid this.
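
Churn modelling of this kind is usually a supervised classifier that ranks customers by attrition risk. The sketch below is illustrative only (the features and data are invented, not Royal Mail's), but it shows the shape of such a model:

```python
# Illustrative churn model: rank customers by estimated attrition risk so
# sales teams can intervene first where risk is highest. Toy data only.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

# Features: [monthly volume, pct change vs last quarter, support tickets]
X = np.array([
    [1200, -0.30, 4], [900, 0.05, 0], [300, -0.60, 6],
    [2500, 0.10, 1], [700, -0.20, 3], [1500, 0.00, 0],
])
y = np.array([1, 0, 1, 0, 1, 0])  # 1 = churned within the next quarter

model = GradientBoostingClassifier().fit(X, y)

# Score the current client base and work the riskiest accounts first
current = np.array([[1100, -0.25, 2], [2000, 0.15, 0]])
risk = model.predict_proba(current)[:, 1]
for client, score in sorted(zip(["client-17", "client-42"], risk),
                            key=lambda p: -p[1]):
    print(client, round(float(score), 2))
```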

A key advantage of deploying Hadoop for such tasks is the speed the software can provide. This enables the company to experiment more and find new ways of integrating the technology with its more conventional tools.

Mr Lee-Warren also noted that Royal Mail has not so far experienced difficulty in attracting talented big data professionals to the company, even though a lack of skills in the industry was one of the top topics for discussion at the Hadoop Summit.

He said: "It may be because we have a very attractive brand, but we're not finding it difficult to attract strong talent. A lot of the time I think data scientists get locked into a way of working that they find difficult and they like new challenges all the time, and we can provide that." 

Address your big data challenges, the Kognitio Analytical Platform explained

15 Apr 2016 | Categories: #AnalyticsNews, Blog

Watch how the Kognitio Analytical Platform provides highly scalable, in-memory analytical software that delivers ultra-fast, high-concurrency SQL access to large and varied data using low-cost commodity hardware or Hadoop. When your growing user community wants ever-faster query responses for complex workloads, they want unequivocal raw compute power: lots of CPUs harnessed efficiently, doing lots of concurrent work and never waiting on slow disk. Enjoy the video, we had fun putting it together, and leave us comments telling us what you think of it.
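
Because access is through standard SQL, plugging an analytics script into such a platform can be as simple as an ODBC connection. A hypothetical sketch: the DSN, credentials, table and columns below are placeholders, not real Kognitio configuration:

```python
# Hypothetical sketch: running SQL against an analytical platform over ODBC.
# 'KognitioDSN', the credentials and the table are placeholders.
import pyodbc

conn = pyodbc.connect("DSN=KognitioDSN;UID=analyst;PWD=secret")
cursor = conn.cursor()

cursor.execute("""
    SELECT region, COUNT(*) AS orders, SUM(value) AS revenue
    FROM sales
    GROUP BY region
    ORDER BY revenue DESC
""")
for row in cursor.fetchall():
    print(row.region, row.orders, row.revenue)

conn.close()
```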

ADDITIONAL RESOURCES

Learn more by visiting the Kognitio Analytical Platform page


Retail banks turn to big data to regain customer trust

13 Apr 2016 | Categories: #AnalyticsNews

For many retail banks, the task of regaining consumer trust in the wake of the financial crisis of 2008-09 will be a difficult and ongoing challenge. With the sector still viewed with suspicion by many people, presenting a more personal face and improving customer service levels will be a high priority.

As FusionExperience chief executive Steve Edkins noted in an article for ITProPortal, this has become even more important in today's connected era, where the internet and social media mean dissatisfied customers can quickly voice any complaints to a wide audience.

In order to improve their customer service and avoid such issues, many retail banks are therefore turning to big data to offer services tailored to individual customers.

According to a study from the Centre for Economics and Business Research (Cebr), more than four-fifths of retail banks (81 per cent) will have adopted big data analytics by 2020. As well as helping track key industry trends and allowing banks to proactively adapt their strategy, this will also have a key role to play in building profiles of individual customers.

This can be useful at every stage of the customer journey. Mr Edkins noted that initially, big data analytics can be used to more effectively evaluate risk and creditworthiness. Then, when it comes to retaining customers, offering specific deals and tailoring their services accordingly will go a long way towards making consumers feel valued.

However, financial institutions will face two key challenges when it comes to adding big data to their customer service activities. The first will be how they extract relevant information from the huge amount of data they collect – separating the signal from the noise in order to make informed decisions.

The second will be how they collate this data and turn it into a useable format in time to make a difference. Today's fast-paced world demands the ability to extract, analyse and act on insights gained from data quickly if a company wants to maintain a competitive advantage.

"It is no small feat for retail banks to ingratiate big data into their processes as it often requires a daunting technological overhaul," Mr Edkins said, adding that one of the biggest challenges for these firms is getting complex legacy systems in line with today's big data capabilities. These often result in key data being placed in silos, and make it difficult for businesses to get the information they need quickly.

"To rectify this, banks will need to make better use of growing data sets such as correspondence, loan facility letters, contracts and the diversity of customer interactions if they want to offer bespoke consumer products that will allow them to fend off their more agile competitors," he stated.

However, if retail banks can get this right and build a strong customer service culture centred around big data, the rewards on offer are significant. Cebr's data forecasts that effective analysis of data could add £240 billion to the UK's economy through improved efficiency and better understanding of the market and customer demands.

Banks ‘not making the most’ of big data

08 Apr 2016 | Categories: #AnalyticsNews

Many banks should be doing more to turn the wealth of information they have available on their customers into actionable insights, it has been stated.

Speaking to Network World, Deanne Yamato-Tucker, head of banking and financial services at IT consultancy Xavient Information Systems, noted that these institutions now have access to a wide variety of data from consumer-facing products such as apps. However, few are effectively analysing this information.

As a result, they are failing to take advantage of new opportunities to re-invent their offerings, deliver higher levels of customer service and develop innovative new products.

By careful use of their customers' data, banks should be able to offer more specific, tailored services to consumers, with rates that are "based on a consumer's banking patterns, levels of deposits, spending patterns, web browsing history, social media information [and] geolocation data", Ms Yamato-Tucker stated.

She added that offerings such as biometric identification, loyalty programmes, savings schemes and interactive money management programmes can all be part of a personalised user experience.

Crucially, much of the data needed to make these innovations a reality is already being collected anyway, so banks would not even have to put in place extensive new information gathering processes in order to learn more about their customers. The key to success will be how they can harness this existing data.

In particular, financial services firms need to improve how they handle metadata in order to make the organisation and analysis of information easier.

"With the growing variety and increasing velocity of data, banks need to develop comprehensive metadata management and data governance processes," Ms Yamato-Tucker said. "One cannot share and understand data effectively, and in a meaningful way, without managing the metadata."

Almost every bank has now set up services such as online and mobile portals that allow users to make payments, transfer funds and check their statements wherever they are. This was described by Ms Yamato-Tucker as the "first round" of banking innovation.

The second, she continued, will be "a ubiquitous customer experience, where the customer, and their devices, as a representation of the customer, is the centre of the mobile ecosystem."

‘Cognitive storage’ aims to cut the cost of big data

06 Apr 2016 | Categories: #AnalyticsNews

One of the key challenges for any organisation embarking on a big data project will be ensuring that costs are kept under control – something that is not always easy to do when firms are collecting and storing huge amounts of information.

Therefore, in order to tackle this issue, IBM has revealed it is working on a new method for automatically classifying information in order to ensure the most relevant data is always on hand.

Known as 'cognitive storage', the solution involves assigning value to incoming data: determining which data should reside on which type of media, what levels of data protection should apply and what policies should be set for the retention and lifecycle of different classes of data, Computer Weekly reports.

IBM researcher Giovanni Cherubini explained the most obvious answer to the challenge of handling large amounts of data while keeping costs low is to have tiers of storage – such as flash and tape solutions – with the most important data held on the fastest media.

The machine learning tool aims to assess the value of data and direct it to the most appropriate solution, by studying metadata and analysing access patterns, as well as learning from the changing context of data use to help it assign value. 
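
IBM hasn't published the implementation, but the idea (learn a value class from metadata and access patterns, then map each class to a storage tier) can be sketched roughly as follows. The features, value classes and tier mapping are all assumptions for illustration:

```python
# Illustrative sketch of cognitive storage: predict a value class for each
# file from its metadata and access pattern, then route it to a tier.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Features: [accesses in last 30 days, days since last access, size in MB]
X_train = np.array([
    [120, 1, 2], [80, 2, 10], [0, 400, 500],
    [3, 90, 50], [200, 0, 1], [1, 250, 300],
])
# Admin-supplied labels: 2 = high value, 1 = medium, 0 = low
y_train = np.array([2, 2, 0, 1, 2, 0])

model = RandomForestClassifier(n_estimators=50, random_state=0)
model.fit(X_train, y_train)

TIER = {2: "flash", 1: "disk", 0: "tape"}

def place(file_features):
    """Return the storage tier for a newly observed file."""
    value_class = int(model.predict([file_features])[0])
    return TIER[value_class]

print(place([90, 3, 5]))     # likely 'flash'
print(place([0, 365, 800]))  # likely 'tape'
```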

IBM researcher Vinodh Venkatesan added: "Administrators would help train the learning system by providing sample files and labelling types of data as having different value."

For business users, the challenge of this is that they will have a large variety of data – from business-critical transactional data to emails, machine sensor data and more – so it will be essential that any cognitive storage system is able to categorise this correctly.

Mr Venkatesan said: "For an enterprise, there are ‘must keep’ classes of data and these could be set to be of permanently high value. But that is a small proportion in an enterprise. 

"The rest, the majority, which cannot necessarily be manually set, can be handled by cognitive storage – such as big data-type information and sensor information that might have value if analysed."
