Security to be key big data use case for 2016


Posted By : admin Comments are off
Image credit: iStockphoto/weerapatkiatdumrong
Categories :#AnalyticsNews

A growing number of organisations in sectors such as banking and insurance are set to turn to big data analytics in 2016 in order to keep their critical information safe from hackers and other unauthorised users.

This is according to a new forecast for the year ahead from Oracle. It noted that 2016 will see big data become more integral to the day-to-day workings of many businesses.

Neil Mendelson, vice-president of big data and product management at Oracle, said: "2016 will be the year when big data becomes more mainstream and is adopted across various sectors to drive innovation and capture digitisation opportunities."

However, it will be the technology's ability to identify unusual and potentially fraudulent activity that will be of particular interest to the financial services sector.

The company stated: "2016 will witness an increase in the proliferation of experiments [around] default risk, policy underwriting, and fraud detection as firms try to identify hotspots for algorithmic advantage faster than the competition."

Another key driver for big data security solutions will be increasing public awareness of the numerous ways their personally identifiable information can be collected, shared, stored and stolen. This will in turn lead to more calls for greater regulation to ensure this data is protected.

"The continuous threat of ever more sophisticated hackers will prompt companies to both tighten security, as well as audit access and use of data," Oracle continued.

Among its other predictions, the company forecast increased demand for data scientists from established enterprises, while the emergence of new management tools will allow more businesses to implement technologies such as machine learning, natural language processing and property graphs.

Simpler data discovery tools will also let business analysts identify the most useful datasets within enterprise Hadoop clusters, reshape them into new combinations and analyse them with exploratory machine learning techniques. This will both improve self-service access to big data and provide richer hypotheses and experiments that drive the next level of innovation.

Big data helps fight Christmas fraud



Retailers around the world are currently experiencing their busiest time of the year in the run-up to the festive season, with post-Christmas sales also right around the corner. However, it is not just consumers hunting for bargains or engaging in last-minute panic buying that will be turning to online stores this season.

It is also a prime time of year for fraudsters, who will hope their transactions go unnoticed amid the Christmas rush. Retailers that fall victim to such crimes can find themselves seriously out of pocket – so many companies are turning to big data analytics to help clamp down on this problem.

One such organisation is eBay Enterprise, which provides omnichannel fulfilment services for hundreds of brand-name merchants – and as such, has to absorb all fraud-related losses incurred by its clients. Therefore, it is vital that the company is able to spot fraudulent transactions and block them before they are finalised.

It was noted by Datanami that in 2014, the company stopped $55 million worth of these purchases, and that number is expected to rise this year. The key to this success is how the firm uses big data to analyse buying activity and identify patterns.

Tony Ippolito, strategic risk and technology manager for eBay Enterprise, told the publication that the more data the company has available to it, the better.

Identifying a fraudulent transaction typically involves cross checking information provided by the customer against a wide range of fields – including names, email addresses, billing addresses and shipping addresses – to find inconsistencies. In eBay Enterprise's case, this involves running every order through an Oracle database containing around 1.3 billion entries.
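As an illustration, this kind of field cross-check can be sketched in a few lines. The field names and matching rules below are invented for illustration – the real system runs every order against a database of around 1.3 billion entries rather than an in-memory dictionary:

```python
# Hypothetical sketch of cross-checking a new order against prior
# records for the same email address and flagging inconsistencies.
def find_inconsistencies(order, history):
    flags = []
    prior = history.get(order["email"])
    if prior:
        # Compare each customer-supplied field against what was seen before.
        for field in ("name", "billing_address", "shipping_address"):
            if order[field] != prior.get(field):
                flags.append(f"{field} differs from prior orders")
    # A mismatch between billing and shipping address is a classic signal.
    if order["billing_address"] != order["shipping_address"]:
        flags.append("billing and shipping addresses differ")
    return flags
```

The principle is the same at any scale: compare every field the customer supplies against what has been seen before, and surface the mismatches for scoring.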

"We also collect as much information as we can about the product, the kind of item, and the amount of the order," Mr Ippolito said. "We do device fingerprinting, we collect IP address, and then we do geolocation lookups."

Common red flags include long-distance orders for high value goods such as electronic gadgets, video games and designer clothing, which can be easily resold on the black market. Expedited shipping requests are another key signifier of fraudsters, as they are keen to get items in their hands before their crimes are spotted.

To counter this, eBay Enterprise runs each transaction through a big data analytics system that is equipped with around 600 rules, as well as a machine learning algorithm that uses more than 20 models to match incoming transactions against known fraud patterns. 
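The company's actual 600 rules and 20-plus models are proprietary, but the general shape of a rules-based scorer – weighted rules whose combined score decides whether a transaction is queued for manual review – can be sketched as follows. The rule names, weights and threshold here are all invented:

```python
# Illustrative rules-based risk scorer; every rule, weight and the
# review threshold below is invented, not eBay Enterprise's real logic.
RULES = [
    ("high_value_electronics",
     lambda o: o["category"] == "electronics" and o["amount"] > 500, 40),
    ("expedited_shipping",
     lambda o: o["shipping"] == "overnight", 25),
    ("ip_far_from_billing",
     lambda o: o["ip_country"] != o["billing_country"], 35),
]

REVIEW_THRESHOLD = 50  # scores above this are queued for manual review

def score_order(order):
    """Sum the weights of all rules the order triggers and decide
    whether it should be held for manual review."""
    score = sum(weight for _, test, weight in RULES if test(order))
    return score, score > REVIEW_THRESHOLD
```

In a real deployment the rule outputs would typically be combined with machine learning model scores rather than used alone, and the weights retrained as fraud patterns shift.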

Mr Ippolito said: "It's a lot of data collection and aggregation, seeing trends and applying that across the board to make sure we're not missing anything."

This is only possible with an effective big data system that is able to process millions of incoming transactions and accurately assess their likelihood of being fraudulent in real time, as any delays or mistakes will either inconvenience genuine customers or allow a fraudulent transaction to be completed.

Such systems also have to be regularly tweaked and updated to keep up as fraudsters change their tactics in response to these tools. For instance, if the company implements a rule that requires it to 'queue', or manually review, every transaction over $50, then the fraudsters will move their target to orders under $40.

"When you close off a certain area, you have to be aware of what the next logical step for them is," Mr Ippolito said. "If you shut down overnight shipping, then they'll move into third-day shipment. It's a lot more nuanced than that, but that's the general idea."

Predictive analytics ‘the future’ for the investment industry



Professionals in the investment sector will come to rely much more heavily on predictive analytics solutions in the coming years when they are looking to analyse their clients' behaviour and measure themselves against competitors.

This was the conclusion of a group of broker-dealer professionals who discussed the future of the industry at a recent technology conference.

They agreed that the primary goal for the use of big data analytics in the investment sector will be to offer advisers more insight into the decisions being made and identify trends or client risks before they occur.

Aaron Spradlin, chief information officer at United Planners Financial Services, told the publication that to achieve this, professionals will have to turn to outside software in order to effectively add predictive elements to these activities.

"When you look at big data, it's very sophisticated, and there's some really cool tools out there, but it's not easy to do," he said.

There was agreement among the experts that while all the data necessary to identify potential risks and opportunities already exists, it is only recently that the tools to unlock this have become widely available.

For instance, Mr Spradlin stated that until now, the majority of his firm's activities have been focused around data collection, drawing together large amounts of information about advisers and clients.

It can then use this to offer advisers compliance guidance, looking at trends and behaviours and using those to write better alerts, he said.

James Clabby, chief information officer at AIG Advisor Group, added that advisers can already use this detailed client information as a tool to make better decisions. For example, by pulling together a list of all clients with a certain amount of assets and a certain percentage of that in cash, an adviser can engage them on where that money might best be invested.
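Mr Clabby's example amounts to a simple filter over client records. A minimal sketch, with invented field names, of screening for clients above an asset threshold holding a given share of those assets in cash:

```python
# Sketch of the client screen described above: find clients with at
# least min_assets under management and at least min_cash_pct of
# those assets sitting in cash. Field names are invented.
def clients_to_engage(clients, min_assets, min_cash_pct):
    return [
        c for c in clients
        if c["assets"] >= min_assets
        and c["cash"] / c["assets"] >= min_cash_pct
    ]
```

An adviser could then contact the resulting list about where that idle cash might best be invested – the "little big data" use Mr Clabby describes.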

This is one example of how professionals are using analytics today, with Mr Clabby describing this as "little big data".

However, moving from this to the next stage of evolution, where predictive analytics can be used to dictate all of a professional's decision-making, will typically require a major investment. 

Mr Spradlin said: "I don't have money to invest significantly in this area, and I think that's where we're hoping for innovation from the industry, from … bigger firms that might step in and say, 'Hey, pass us the data, we'll automate it and pass it back'."

The experts also stated that increased regulation is likely to be one of the big challenges for the investment sector as analytics and the use of potentially sensitive data becomes the norm.

It will also be important to ensure that this information is being handled sensitively so as not to create privacy concerns. Gary Gagnon, vice president of technology for Cambridge Investment Research, warned: "There's the danger that we face that we could become so effective at using big data about individual clients and their decisions and their plans that it might actually alienate some people."

‘Declaration of Data Rights’ called for to ensure privacy



A think tank has called on the United Nations to draw up a universal 'Declaration of Data Rights' in order to ensure that enterprises, countries and citizens are able to enjoy the benefits of big data, while still balancing this with individuals' rights to privacy.

The suggestion was one of a number of recommendations made in a report by the Institute of Development Studies (IDS), which examined what will be needed to ensure that the developing world does not miss out on the opportunities afforded by the technology.

It noted that now is the time for action in this area, as many major decisions that will influence the future direction of big data are currently being taken. With around 90 per cent of digital data having been created in the last two years – and the quantity of information doubling every two years – the area is attracting much more attention at an international level.

For instance, the report observed that the recent European Court of Justice ruling that invalidated the Safe Harbor agreement between the EU and the US will open up "fundamental questions" about how personal data is used. At the same time, issues of data sharing and privacy are part of the Transatlantic Trade and Investment Partnership (TTIP) negotiations between the EU and US, and the 24-nation Trade in Services Agreements (TISA) discussions.

"The outcome of these negotiations will shape big data impacts for years to come," the report said. It also noted that the fact deals such as TTIP and TISA are being negotiated in secret makes it very difficult for citizens to engage, as does the fact that the long-term implications of big data are still not well understood.

IDS' study observed there are four key areas where big data will have an impact on developing nations as they increasingly embrace the technology. These are its economic impact, its effect on human development through advancing health or education, its implications for human rights and how it can reduce the strain on environmental resources.

Dr Stephen Spratt, research fellow at the IDS, noted that developing countries face particular challenges when it comes to the implementation of big data, as in many cases, the track record on protecting civil liberties has not been encouraging.

"A worst case scenario is one where a government can see citizen data but information on government activities remains closed, and where corporations offering internet access to people in developing countries do so on the condition of targeted advertising and right to use data in exchange," he said.

Dr Spratt stated that "much more needs to be done" to minimise the risks facing these nations and ensure that the benefits of big data are shared equally, rather than just among large corporations, the richest individuals and developed countries.

The report called for the UN to establish a panel of social science, ethics, legal and technical experts to draft new guidelines that will "enshrine citizens' rights to access data on their government's activities in the process, and a citizen's right to see and control the information held about them by governments and corporations." 

Other recommendations in the report included improving funding for public research into the implications of the increasing use of automated decision-making and learning algorithms, and requiring large enterprises based in developed countries to employ the same approach to data privacy in all countries in which they operate.

Cloud and appliances to be biggest drivers of big data in 2016



The coming year will see a new wave of adoption for big data technologies, with cloud computing and the need to gather data from a growing number of appliances among the key drivers of this.

Predictions for the industry in 2016 by Ovum note that there will be a "rising tide of IT spending" that will boost investment in big data analytics.

Tony Baer, principal analyst at Ovum and author of the report, said: "The next wave of big data investment will target more of the enterprise mainstream that will have more modest IT and data science skills compared with the early adopters."

Despite the increasing interest in tools such as Spark – which Ovum noted will be the fastest-growing set of workloads in 2016 – SQL will remain a key first step for organisations looking to make the most of big data.

"Don't count SQL out," Mr Baer said. "SQL-on-Hadoop remains a potent draw for Hadoop vendors who are aiming to reach the large base of enterprise SQL developers out there."

He added that Spark will be complementary to SQL, providing businesses with additional paths to insight, such as streaming or graph analysis. This can then be queried using language that enterprise database developers are very familiar with.

Another key prediction for 2016 will be the emergence of data lakes as a key priority for mature Hadoop users. Enterprises that have already successfully put analytics into production across multiple lines of business and stakeholder groups will drive increased demand for tools to govern the data lake and make it more transparent.

As a result, Ovum forecast significant growth in tools that build on emerging data lineage capabilities to catalogue, protect, govern access, tier storage, and manage the lifecycle of data stored in data lakes.

"Governance of data lakes will not be built in a day. While some of the tooling exists today, capabilities such as managing the lifecycle of multi-tiered storage will have to be extended to cover the growing heterogeneity of Hadoop clusters," Mr Baer said.

Lack of talent ‘hindering analytics adoption’



A lack of workers with the skills to analyse big data and weed out poor quality information will be one of the biggest hurdles many businesses face when looking to adopt the technology, a new study has suggested.

Research conducted by AT Kearney found that two-thirds of companies – including those with the most advanced analytics capabilities – have difficulty hiring enough people with the ability to generate insights from their data.

This is a problem that is only expected to get worse in the near future, as the study found companies will need 33 per cent more big data talent over the next five years.

In some sectors, the demand is even higher. Communications, media and technology businesses were found to be the most focused on this area, estimating their need for big data talent will increase by 43 per cent over the next five years. This was followed by financial services (36 per cent) and automotive firms (35 per cent).

More than four out of ten respondents (43 per cent) added that at least ten per cent of their company’s digital analytics positions are currently unfilled – with one in ten companies having a vacancy rate of between 20 and 30 per cent, and four per cent having more than 30 per cent of positions empty.

Partner at AT Kearney and co-author of the report Khalid Khan said that companies need to be looking for ‘trilinguals’ – individuals with a firm grasp of quantitative analytics, digital technology and business strategy.

Such people are currently rare in the corporate sector, so competition for the best professionals is fierce. As a result, many businesses are turning to university graduates who are schooled in statistical modelling.

However, Mr Khan noted that while these individuals often possess the right technical analytics skills, they may lack the ability to derive business insights from the data. Nearly 60 per cent of respondents agreed that people currently emerging from universities are underprepared for this.

Therefore, companies need a clear strategy in order to ensure they are targeting and hiring the right people. Mr Khan said: “A good strategy will determine early where there are already good pockets of expertise, where more talent is needed, and where talent could be better used.”

Telcos aiming to boost customer care through big data



Using big data analytics to improve customer care solutions will be the top priority for telecommunications companies in the coming years, a new survey has found.

Research conducted by Guavus revealed that 87 per cent of network providers have either already implemented a big data strategy or are in the process of doing so. The primary drivers for the adoption of such services include maximising revenue, named by 66 per cent of respondents, boosting customer experience and loyalty (61 per cent), and cutting operational expenditures (also 61 per cent).

However, in the next two years, it will be improving customer care that will be the focus of these activities. The study found that 57 per cent of respondents named this as the top issue they are looking to address over the period, ahead of revenue assurance (48 per cent), improving targeted offerings (47 per cent) and better service assurance (44 per cent).

Anukool Lakhina, founder and chief executive of Guavus, said it is no surprise to see that proactive customer care will be the top area of investment for 2016 and beyond, as in today's competitive environment, "providing a seamless customer experience holds the key to safeguarding operator revenue streams".

He added that being able to gain a complete, end-to-end picture of subscribers' experiences enables telcos to intervene quickly as soon as potential issues are detected. This means they can remedy any service degradations, prevent churn and raise customer satisfaction for increased loyalty – ultimately leading to improved revenue.

Mr Lakhina stated that as companies become more familiar with big data and their strategies mature, the focus is shifting away from simply collecting and analysing very large data sets towards being able to derive actionable intelligence from their information.

"Operators have realised that the ability to fuse data streams and bridge the gap between business and operational data is essential to achieving this goal," he said. "However, it's also vital to strip out only the most valuable nuggets of data for analysis, as trying to store everything will increase costs, delay time to insight and devalue the quality of the analytics provided."

An inability to integrate data from disparate systems is currently one of the biggest barriers to success for many telcos, with 28 per cent of respondents listing this as a problem. This was followed by poor data quality and management (25 per cent) and finding personnel with the right skills to handle such projects.

The study also found that a large number of telcos remain dubious over the value of data lakes, despite the fact these solutions have been hyped as one of the keys to big data success. Only 22 per cent of respondents said data lakes are a critical part of how they bring disparate data together, while some 68 per cent of network operators stated they remain unsure about these tools, or are waiting to see whether they will emerge as more than just hype.

What will 2016 hold for the big data industry?



The last 12 months have been a busy time for the big data industry, as more businesses have begun to recognise the value of the technology, while many initiatives are starting to move out of proof-of-concept tests and into full production. 

But as 2015 comes to a close, many professionals will naturally be turning their attentions to the future and wondering what it will hold for the technology. And with the maturity of big data growing rapidly and more best practices emerging, there are major changes ahead.

For instance, Oracle forecast that one key trend for 2016 will be that big data is no longer limited to the experts. In a piece for the Predictive Analytics Times, Neil Mendelson, vice-president of big data and advanced analytics at the company, and Jeff Pollock, its vice-president of big data integration and governance, observed that "data civilians" will find themselves operating much more like data scientists.

"While complex statistics may still be limited to data scientists, data-driven decision-making shouldn't be," the pair stated. "In the coming year, simpler big data discovery tools will let business analysts shop for datasets in enterprise Hadoop clusters, reshape them into new mashup combinations, and even analyse them with exploratory machine learning techniques."

Providing a wider audience with the tools to explore their data will help businesses improve self-service capabilities and enable more users to develop hypotheses and experiments based on the insights provided by big data.

As a result of this, experimental data labs are also set to take off in 2016. Mr Mendelson and Mr Pollock said, for example, that the insurance sector will trial a wide range of tests and pilot schemes related to default risk, policy underwriting, and fraud detection as firms try to identify the best way of utilising their data before their competition.

Meanwhile, the emergence of more mature technologies and best practices will mean many businesses no longer have to go it alone and create DIY solutions for their big data analytics.

"In 2016, we'll see technologies mature and become more mainstream thanks to cloud services and appliances with pre-configured automation and standardisation," the experts stated. This will mean that lead-in and development times for big data tools will fall considerably, as well as making deployment significantly cheaper.

They also forecast that in 2016, the emergence of more Internet of Things (IoT) enabled sensors will join forces with powerful cloud computing tools to become the 'killer app' for big data analytics.  Expanding cloud services will not only be able to gather sensor data, but also feed it into big data analytics and algorithms to turn it into actionable results.

Highly secure IoT cloud services will also help manufacturers create new products that are able to act upon the analysed data without the need for human intervention.

However, in order to ensure that these innovations can be used safely and effectively, issues surrounding data governance and security will also come to the fore in the next 12 months.

"The continuous threat of ever more sophisticated hackers will prompt companies to both tighten security, as well as audit access and use of data," Mr Pollock and Mr Mendelson said. They added it will also be increasingly important to know where data originates – not just in terms of what sensor or system generates it, but which country's jurisdiction it will fall under for data protection and privacy purposes.

Hadoop adopters ‘must get to grips’ with its complexity



The number of businesses turning to Hadoop to assist with their big data analytics operations is continuing to grow. However, even the most sophisticated users are still struggling to get to grips with the operational complexity of the technology.

This is according to Wikibon, which noted that Hadoop has seen an "unprecedented" rate of innovation recently – primarily because it is an ecosystem rather than a single product, which enables providers to come up with their own solutions. This pace of development has encouraged a large number of businesses to investigate the capabilities of the platform.

According to the 2015 edition of Wikibon's Big Data Survey, 41 per cent of respondents reported they had at least one production deployment of Hadoop – a ten per cent increase from 18 months earlier.

However, George Gilbert, an analyst at the company, observed that prospective users of the technology – as well as those already in the pilot or development stage – need to be aware that there are "no easy solutions" to getting the most out of Hadoop.

He noted the biggest challenge facing Hadoop users is making the technology manageable. With so many moving parts involved in such a solution, it is a highly complex platform that is difficult to control. If businesses do not appreciate this, they are likely to see their total cost of ownership (TCO) spiral as administrative overheads grow and users invest in solutions they later abandon.

One way to tackle these issues is to embrace cloud computing technology and turn to third-party experts to assist with the deployment. Mr Gilbert observed that Hadoop-as-a-Service solutions can simplify some of the management issues associated with the technology – although not all.

Turning to native cloud services like AWS, Azure and Google Cloud Platform can dramatically simplify management, as much of the burden of setting up and administrating a Hadoop system is placed on the service provider.

However, there is a trade-off for this. Mr Gilbert stated that in many cases, businesses that desire this level of simplicity will have to sacrifice choice and openness.

"All the cloud providers will provide ever more powerful DevOps tools to simplify development and operations of applications on their platforms," he said. "But as soon as customers want the ability to plug in specialised third-party functionality, that tooling will likely break down."

Therefore, enterprises may need to decide early whether they are prepared to accept a more limited level of functionality in exchange for reducing some of the complexity of Hadoop.

"Right now the customers with the skills to run Hadoop on-premises are internet-centric companies and traditional leading-edge enterprise IT customers such as banks, telcos, and large retailers," Mr Gilbert said. "Solving the TCO and manageability problem won’t be easy." 

Big data ‘becoming a top business priority’



The fast pace of growth in big data analytics over the last four years has seen it become much more of a priority for senior personnel on the business side of an organisation, rather than being seen purely as an IT project.

This is among the findings of a new report by the Economist Intelligence Unit (EIU), which noted that the tone of conversations around big data has shifted from initial excitement over its potential to an expectation that it will deliver long-term gains.

Some 44 per cent of respondents agreed that data has become an important tool for driving strategic decision-making, up from 39 per cent in 2011. A further 14 per cent of this year's respondents stated that data has completely changed the way they do business. Just three per cent of companies felt that the rise of big data has had no impact on how they operate.

"Over the past four years, executives have not only become better educated about the technology behind big data, but have fully embraced the relevance of data to their corporate strategy and competitive success," the study stated.

It noted that since 2011, many companies have come to view big data as a strategic asset. As a result of this, the number of businesses that have well-defined data management strategies that focus on identifying and analysing the most valuable data has seen impressive growth.

The benefits of such an approach are also becoming clearer. The EIU noted there is a clear correlation between effective management of data and positive financial results.

"Companies with a well-defined data strategy are much more likely to report that they financially outperform their competitors," the report stated. "In addition, they are more likely to be successful in executing their data initiatives and effectively applying their data and analytics to resolve real and relevant business problems."

Enterprises across all industries have been successful in attracting the attention of the board when it comes to their big data initiatives. Senior executives across all functions and business units are increasingly taking the lead on projects instead of relying on IT.

However, there is still much work to be done if enterprises are to make the most of their potential. The EIU characterised the position of many businesses as being in the "data adolescence" phase, where they have overcome the initial obstacles and are learning and growing quickly, but have not yet reached full maturity.

Indeed, less than a third of respondents (30 per cent) said they put all their data to good use. The majority of companies (54 per cent) felt they were using only around half of their valuable information, while 16 per cent said they leveraged "very little" of these resources.

Going forward, the EIU noted that companies will need to shift their focus away from volume when considering big data and instead think more about how they will extract value from the information.

"Data and analytics will be increasingly applied to predict future outcomes and automate decisions and actions," it said. "Most importantly, many companies will have to continue to evolve their structure and culture to scale up successful data pilots across the entire organisation."