FTC warns against big data 'discrimination'

11 Jan 2016
Categories: #AnalyticsNews

The US Federal Trade Commission (FTC) has released a new report advising businesses what considerations they need to take into account when working with big data, with the regulator particularly urging firms to avoid strategies that produce exclusionary or discriminatory outcomes.

In the document released last week (January 6th), the body looked specifically at the end of the big data lifecycle – how information is used after it has been collected and analysed. It noted that while the technology has a wide range of benefits, it can also lead to reduced opportunities for certain groups and the targeting of vulnerable consumers for fraud and higher prices.

FTC chairwoman Edith Ramirez said: "Big data's role is growing in nearly every area of business, affecting millions of consumers in concrete ways. The potential benefits to consumers are significant, but businesses must ensure that their big data use does not lead to harmful exclusion or discrimination."

The report observed that the positive outcomes of big data are not limited to improved results at enterprises. There are also a number of wider societal benefits, such as improved healthcare, education and equality.

For example, Google is among a number of businesses that are deploying big data solutions as part of their hiring processes, in order to create a more diverse workforce. "Through analytics, Google identified issues with its hiring process, which included an emphasis on academic grade point averages and 'brainteaser' questions during interviews," the report said. 

"Google then modified its interview practices and began asking more structured behavioural questions (e.g., how would you handle the following situation?). This new approach helped ensure that potential interviewer biases had less effect on hiring decisions."

However, while big data can be used to eliminate many personal biases from businesses' decision-making, the technology could also introduce new types of discrimination that affect opportunities for citizens if used incorrectly.

One example highlighted by the FTC's report involves credit card providers that lowered certain customers' credit limits based not on those customers' own payment history, but on analysis of other consumers with poor repayment histories.

In one case, a provider settled with the FTC after it was alleged to have failed to disclose that it identified some customers as having a higher credit risk if they used their cards to pay for marriage counselling, therapy or tyre repair services, based on its experiences with other consumers and their repayment histories.

"Using this type of a statistical model might reduce the cost of credit for some individuals, but may also result in some creditworthy consumers being denied or charged more for credit than they might otherwise have been charged," the FTC said.

The report also warned of big data's ability to expose sensitive information. It highlighted one study that combined Facebook 'Likes' with limited survey information and was able to predict a user's ethnic origin with 95 per cent accuracy and male users' sexual orientation 88 per cent of the time.

The report offered several recommendations to ensure that the use of big data does not lead to discrimination. These include reviewing data sets and algorithms to ensure that hidden biases are not having an unintended impact on certain populations.
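The kind of review the FTC recommends can be approximated with a simple disparate-impact check that compares outcome rates across groups. The sketch below is purely illustrative – the group labels, data and the 0.8 threshold are common rule-of-thumb values, not figures from the report:

```python
from collections import defaultdict

def approval_rates(decisions):
    """Compute per-group approval rates from (group, approved) pairs."""
    totals, approved = defaultdict(int), defaultdict(int)
    for group, ok in decisions:
        totals[group] += 1
        if ok:
            approved[group] += 1
    return {g: approved[g] / totals[g] for g in totals}

def disparate_impact(rates, protected, reference):
    """Ratio of the protected group's approval rate to the reference group's.
    Ratios below ~0.8 are a common (illustrative) flag for further review."""
    return rates[protected] / rates[reference]

decisions = [("A", True)] * 80 + [("A", False)] * 20 \
          + [("B", True)] * 50 + [("B", False)] * 50
rates = approval_rates(decisions)
ratio = disparate_impact(rates, protected="B", reference="A")
print(round(ratio, 2))  # 0.62 -- below 0.8, so worth reviewing
```

A check like this does not prove discrimination – it only surfaces disparities that merit a closer look at the underlying data and algorithm.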

"Remember that just because big data found a correlation, it does not necessarily mean that the correlation is meaningful," it concluded. "As such, you should balance the risks of using those results, especially where your policies could negatively affect certain populations."

European businesses 'lose £20m a year' through poor data management

31 Dec 2015
Categories: #AnalyticsNews

Businesses in Europe could be missing out on sales worth up to £20 million a year because they do not have the ability to take full advantage of the data they collect.

This is according to research conducted by flash storage provider Pure Storage, which claimed more than half of companies (51 per cent) have missed out on an opportunity because they were not aware of it until it was too late.

Almost a third of respondents (31 per cent) stated they had experienced this at least once a year, while 19 per cent see such an occurrence multiple times a week. 

In many cases, businesses had access to the right data that would have given them an insight into the opportunity, but were unable to process it in time to take advantage.

James Petter, vice-president for Europe, the Middle East and Africa at Pure Storage, commented: "The reason we're seeing these trends emerge is because it is now cheaper for businesses to retain the data they are collecting than to destroy it, so the volume of data a business holds is growing rapidly."

At the same time, he added it is becoming increasingly complicated and costly to access usable information fast enough to make a difference.

Nearly three-quarters of businesses (72 per cent) said they collect information that is never used, with almost half of these firms (48 per cent) putting this down to the fact that data processing is too time consuming. One in five also highlighted the expense of these activities as a challenge.

However, as the amount of information available grows, companies that are better able to process this and turn it into usable information that can influence decision-making will be in a much better position than those without these capabilities. This will be particularly true as, in many cases, competitors will all have access to the same raw data from publicly available sources.

Mr Petter said: "As companies gather more and more granular data on what they do, the potential to gain understanding and plan accordingly is not just a profitable undertaking, it is a necessity. Transformation is being forced on organisations at an ever-increasing pace. They must adapt to new ways of doing business, new markets and new practice, or die."

Among the concerns raised by businesses when it comes to making the most of their data is the regulatory burden placed upon them. Overall, one in ten respondents said their data processing efforts are held back by data protection worries.

Organisations in the UK were found to be particularly affected by such issues. Some 39 per cent of respondents in the country agreed that well-meaning regulations often have unforeseen negative consequences for their business.

However, this was not a view reflected elsewhere in Europe. In France, for example, 42 per cent of respondents stated data protection rules have actually helped their ability to do business, while German firms were also positive. 

Some 40 per cent of businesses in the country stated that there were no new regulations in their industry that affected their performance, and over a quarter (26 per cent) said that regulations aimed at a different industry had a positive impact on their activities.

What do CIOs want from big data in 2016?

31 Dec 2015
Categories: #AnalyticsNews

With the new year almost upon us, many experts have been offering their thoughts on what the big data market will hold in 2016. And while top tips for the technology include greater use of developments such as the Internet of Things and machine learning, will these be the innovations that the people actually using the tools want to see?

CBR noted that for many chief information officers (CIOs), the main thing they will be hoping to get out of the next 12 months will be clarity. This includes better messaging about the technology itself, and clearer information on issues such as privacy and data sharing.

Top of the wish list for Hortonworks founder Arun Murthy will be an end to the hype and overuse of buzzwords that has defined much of the big data sector in previous years. He told the publication: "I just wish the term went away and [we can] just call it data and be done with it."

CBR noted that this is indicative of a growing feeling that simply referring to 'big data' is too broad, particularly now that the initial hype stage is coming to an end and businesses are starting to ask more about what they can actually do with the technology in order to drive real-world value from their applications. 

Improved standardisation and clearer guidelines from regulators about how large volumes of sensitive data need to be protected will also be highly important to CIOs in 2016, as more production deployments of such solutions go live.

CBR stated that the EU, for instance, should be creating a set of laws that make it clear exactly how data can legally be shared across the continent. There have been steps taken towards this recently, with a draft text of the upcoming European General Data Protection Regulation agreed upon by MEPs earlier this month.

However, the publication observed that one of the biggest problems with previous pan-European data protection laws is that they have been constantly updated and revised, while individual member states have often interpreted EU directives differently. This has caused a great deal of stress for CIOs as they struggle to keep up. 

A more consistent set of rules that do not change from country to country will give CIOs much greater peace of mind as they go about building systems for the collection, storage and processing of large volumes of data in 2016, CBR continued.

Efforts to develop universal processes for data collection, and to ensure that all applications can integrate with each other more easily, will also assist with this.

CBR said: "Platform providers have taken steps in standardising and making it easy to connect different data sources, but plenty more can be done."

Among the other CIO big data wishes that the publication highlighted for 2016 was the implementation of more focused data collection policies that offer more detail about the type of information organisations need to collect. The ability to make migrating data between different parts of a network easier was also named as a priority for the year ahead.

Big data investments 'to double' by 2021

31 Dec 2015
Categories: #AnalyticsNews

Global investment in big data analytics technology is set to double between 2015 and 2021, with innovations such as the Internet of Things driving business interest in the technology.

This is according to a report from Strategy Analytics, which estimated that global revenues from big data analytics solutions will reach $73.77 billion by 2021, up from $36.2 billion in 2015. By 2022, the market will have reached $80 billion, for a compound annual growth rate of 12 per cent over the next seven years.
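Those growth figures are internally consistent – $36.2 billion compounding at 12 per cent a year over the seven years to 2022 gives roughly $80 billion, as a quick check shows:

```python
# Sanity check of the Strategy Analytics figures: 12% CAGR from 2015 to 2022.
start, rate, years = 36.2, 0.12, 7  # $bn in 2015, annual growth, years to 2022
value = start * (1 + rate) ** years
print(round(value, 1))  # ~80.0 ($bn)
```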

Key sectors for the technology include the healthcare, financial services, industrial and manufacturing markets. Companies in these markets will particularly benefit from solutions that can interpret the vast amounts of data they generate, allowing them to respond to evolving market dynamics in ways that benefit them and their customers.

Andrew Brown, executive director of the Enterprise and IoT Research team at Strategy Analytics, said that big data analytics will make the difference between reactive and proactive businesses.

He added: "Predictive analytics software can help businesses respond in a proactive way by dealing with issues before they occur. Prescriptive analytics on data sources can suggest decision options that take advantage of the predictive elements and provide real differentiation and competitive advantage for companies leveraging these technologies."

The report also noted that many investments in big data analytics technology will go towards open-source tools. As well as being less expensive than proprietary alternatives, these solutions will also have the advantage of being able to run on commodity hardware, which OEM vendors are hoping will help broaden the technology's appeal to small and medium-sized enterprises.

The healthcare vertical segment will remain the largest market for big data analytics solutions, with this set to grow from $7.96 billion in software revenue in 2015 to $17.03 billion in 2022. This will be followed by the financial services market, which Strategy Analytics forecast is set to double in size by 2022.

Security to be key big data use for 2016

30 Dec 2015
Categories: #AnalyticsNews
Image credit: iStockphoto/weerapatkiatdumrong

A growing number of organisations in sectors such as banking and insurance are set to turn to big data analytics in 2016 in order to keep their critical information safe from hackers and other unauthorised users.

This is according to a new forecast for the year ahead from Oracle. It noted that 2016 will see big data become more integral to the day-to-day workings of many businesses.

Neil Mendelson, vice-president of big data and product management at Oracle, said: "2016 will be the year when big data becomes more mainstream and is adopted across various sectors to drive innovation and capture digitisation opportunities."

However, it will be the technology's ability to identify unusual and potentially fraudulent activity that will be of particular interest to the financial services sector.

The company stated: "2016 will witness an increase in the proliferation of experiments [around] default risk, policy underwriting, and fraud detection as firms try to identify hotspots for algorithmic advantage faster than the competition."

Another key driver for big data security solutions will be increasing public awareness of the numerous ways their personally identifiable information can be collected, shared, stored and stolen. This will in turn lead to more calls for greater regulation to ensure this data is protected.

"The continuous threat of ever more sophisticated hackers will prompt companies to both tighten security, as well as audit access and use of data," Oracle continued.

Among its other predictions, the company forecast increased demand for data scientists from established enterprises, while the emergence of new management tools will allow more businesses to implement technologies such as machine learning, natural language processing and property graphs.

Simpler data discovery tools will also let business analysts identify the most useful datasets within enterprise Hadoop clusters, reshape them into new combinations and analyse them with exploratory machine learning techniques. This will both improve self-service access to big data and provide richer hypotheses and experiments that drive the next level of innovation.

Big data helps fight Christmas fraud

23 Dec 2015
Categories: #AnalyticsNews

Retailers around the world are currently experiencing their busiest time of the year in the run-up to the festive season, with post-Christmas sales also right around the corner. However, it is not just consumers hunting for bargains or engaging in last-minute panic buying that will be turning to online stores this season.

It is also a prime time of year for fraudsters, who hope their transactions will go unnoticed amid the Christmas rush. If they are not careful, retailers can find themselves seriously out of pocket when they fall victim to such crimes – so many companies are turning to big data analytics to help clamp down on the problem.

One such organisation is eBay Enterprise, which provides omnichannel fulfillment services for hundreds of brand-name merchants – and as such, has to absorb all fraud-related losses incurred by its clients. Therefore, it is vital that the company is able to spot fraudulent transactions and block them before they are finalised.

It was noted by Datanami that in 2014, the company stopped $55 million worth of these purchases, and that number is expected to rise this year. The key to this success is how the firm uses big data to analyse buying activity and identify patterns.

Tony Ippolito, strategic risk and technology manager for eBay Enterprise, told the publication that the more data the company has available to it, the better.

Identifying a fraudulent transaction typically involves cross-checking information provided by the customer against a wide range of fields – including names, email addresses, billing addresses and shipping addresses – to find inconsistencies. In eBay Enterprise's case, this involves running every order through an Oracle database containing around 1.3 billion entries.

"We also collect as much information as we can about the product, the kind of item, and the amount of the order," Mr Ippolito said. "We do device fingerprinting, we collect IP address, and then we do geolocation lookups."
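At its simplest, the cross-checking described above amounts to looking up an order's identifying fields against records of previously confirmed fraud. The sketch below is a toy version of that idea – the field names and data are invented for illustration and bear no relation to eBay Enterprise's actual systems:

```python
# Known-bad identifiers gathered from previously confirmed fraud (invented data).
KNOWN_BAD = {
    "email": {"scammer@example.com"},
    "shipping_addr": {"99 Docks Rd"},
    "device_id": {"dev-4242"},
}

def matched_fields(order):
    """Return which of the order's fields match a known-fraud record."""
    return [f for f, bad in KNOWN_BAD.items() if order.get(f) in bad]

order = {"email": "new.buyer@example.com",
         "shipping_addr": "99 Docks Rd",
         "device_id": "dev-9001"}
print(matched_fields(order))  # ['shipping_addr']
```

In production this lookup happens against a database of over a billion entries rather than an in-memory dictionary, but the logic – match each field, flag any hit – is the same.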

Common red flags include long-distance orders for high value goods such as electronic gadgets, video games and designer clothing, which can be easily resold on the black market. Expedited shipping requests are another key signifier of fraudsters, as they are keen to get items in their hands before their crimes are spotted.

To counter this, eBay Enterprise runs each transaction through a big data analytics system that is equipped with around 600 rules, as well as a machine learning algorithm that uses more than 20 models to match incoming transactions against known fraud patterns. 
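A rules engine of this kind can be sketched in miniature: each rule fires on a risk signal from the article – high-value electronics, expedited shipping, a billing/shipping mismatch – and the order is queued for manual review once the combined score crosses a threshold. The specific rules, weights and threshold below are illustrative assumptions, not eBay Enterprise's actual values:

```python
# Each rule returns True when its risk signal fires; weights are illustrative.
RULES = [
    ("high_value_electronics",
     lambda t: t["category"] == "electronics" and t["amount"] > 500, 3),
    ("expedited_shipping",
     lambda t: t["shipping"] == "overnight", 2),
    ("address_mismatch",
     lambda t: t["billing_addr"] != t["shipping_addr"], 2),
]
REVIEW_THRESHOLD = 4  # queue for manual review at or above this score

def score(txn):
    """Sum the weights of all rules that fire for this transaction."""
    return sum(weight for _, rule, weight in RULES if rule(txn))

def needs_review(txn):
    return score(txn) >= REVIEW_THRESHOLD

txn = {"category": "electronics", "amount": 899.0,
       "shipping": "overnight",
       "billing_addr": "12 High St", "shipping_addr": "99 Docks Rd"}
print(score(txn), needs_review(txn))  # 7 True
```

The real system layers machine-learned models on top of hundreds of such rules, but the additive score-and-threshold structure is a common starting point for fraud screening.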

Mr Ippolito said: "It's a lot of data collection and aggregation, seeing trends and applying that across the board to make sure we're not missing anything."

This is only possible with an effective big data system that can process millions of incoming transactions and accurately assess their likelihood of being fraudulent in real time, as any delay or mistake will either inconvenience genuine customers or allow a fraudulent transaction to be completed.

Such systems also have to be regularly tweaked and updated to keep up as fraudsters change their tactics in response to these tools. For instance, if the company implements a rule that requires it to 'queue', or manually review, every transaction over $50, then the fraudsters will move their target to orders under $40.

"When you close off a certain area, you have to be aware of what the next logical step for them is," Mr Ippolito said. "If you shut down overnight shipping, then they'll move into third-day shipment. It's a lot more nuanced than that, but that's the general idea."

Predictive analytics 'the future' for the investment industry

22 Dec 2015
Categories: #AnalyticsNews

Professionals in the investment sector will come to rely much more heavily on predictive analytics solutions in the coming years when they are looking to analyse their clients' behaviour and measure themselves against competitors.

This was the conclusion of a group of broker-dealer professionals who discussed the future of the industry with InvestmentNews.com at a recent technology conference.

They agreed that the primary goal for the use of big data analytics in the investment sector will be to offer advisers more insight into the decisions being made and identify trends or client risks before they occur.

Aaron Spradlin, chief information officer at United Planners Financial Services, told the publication that to achieve this, professionals will have to turn to outside software in order to effectively add predictive elements to these activities.

"When you look at big data, it's very sophisticated, and there's some really cool tools out there, but it's not easy to do," he said.

There was agreement among the experts that while all the data necessary to identify potential risks and opportunities already exists, it is only recently that the tools to unlock this have become widely available.

For instance, Mr Spradlin stated that until now, the majority of his firm's activities have been focused around data collection, drawing together large amounts of information about advisers and clients.

It can then use this to offer advisers compliance guidance, looking at trends and behaviours and using those to write better alerts, he said.

James Clabby, chief information officer at AIG Advisor Group, added that advisers can already use this detailed client information as a tool to make better decisions. For example, by pulling together a list of all clients with a certain amount of assets and a certain percentage of that in cash, an adviser can engage them on where that money might best be invested.
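The "little big data" query Mr Clabby describes is essentially a filter over client records. A minimal sketch is below – the asset and cash thresholds, field names and client data are chosen purely for illustration:

```python
clients = [
    {"name": "Client A", "assets": 750_000, "cash_pct": 0.30},
    {"name": "Client B", "assets": 250_000, "cash_pct": 0.40},
    {"name": "Client C", "assets": 900_000, "cash_pct": 0.05},
]

def cash_heavy(clients, min_assets=500_000, min_cash_pct=0.20):
    """Clients with at least min_assets under management and a large share
    held in cash -- candidates for a conversation about where that money
    might best be invested."""
    return [c["name"] for c in clients
            if c["assets"] >= min_assets and c["cash_pct"] >= min_cash_pct]

print(cash_heavy(clients))  # ['Client A']
```

The predictive step the panel anticipates would go further, ranking such clients by the likelihood that an adviser's outreach changes their behaviour rather than simply listing those who match a static filter.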

This is one example of how professionals are using analytics today, with Mr Clabby describing this as "little big data".

However, moving from this to the next stage of evolution, where predictive analytics can be used to dictate all of a professional's decision-making, will typically require a major investment. 

Mr Spradlin said: "I don't have money to invest significantly in this area, and I think that's where we're hoping for innovation from the industry, from … bigger firms that might step in and say, 'Hey, pass us the data, we'll automate it and pass it back'."

The experts also stated that increased regulation is likely to be one of the big challenges for the investment sector as analytics and the use of potentially sensitive data becomes the norm.

It will also be important to ensure that this information is being handled sensitively so as not to create privacy concerns. Gary Gagnon, vice president of technology for Cambridge Investment Research, warned: "There's the danger that we face that we could become so effective at using big data about individual clients and their decisions and their plans that it might actually alienate some people."

'Declaration of Data Rights' called for to ensure privacy

17 Dec 2015
Categories: #AnalyticsNews

A think tank has called on the United Nations to draw up a universal 'Declaration of Data Rights' in order to ensure that enterprises, countries and citizens are able to enjoy the benefits of big data, while still balancing this with individuals' rights to privacy.

The suggestion was one of a number of recommendations made in a report by the Institute of Development Studies (IDS), which examined what will be needed to ensure that the developing world does not miss out on the opportunities afforded by the technology.

It noted that now is the time for action in this area, as many major decisions that will influence the future direction of big data are currently being taken. With an estimated 90 per cent of digital data having been created in the last two years – and the quantity of information doubling every two years – the area is attracting much more attention at an international level.

For instance, the report observed that the recent European Court of Justice ruling that invalidated the Safe Harbor agreement between the EU and the US will open up "fundamental questions" about how personal data is used. At the same time, issues of data sharing and privacy are part of the Transatlantic Trade and Investment Partnership (TTIP) negotiations between the EU and US, and the 24-nation Trade in Services Agreement (TISA) discussions.

"The outcome of these negotiations will shape big data impacts for years to come," the report said. It also noted that the fact deals such as TTIP and TISA are being negotiated in secret makes it very difficult for citizens to engage, as does the fact that the long-term implications of big data are still not well understood.

IDS' study observed there are four key areas where big data will have an impact on developing nations as they increasingly embrace the technology. These are its economic impact, its effect on human development through advancing health or education, its implications for human rights and how it can reduce the strain on environmental resources.

Dr Stephen Spratt, research fellow at the IDS, noted that developing countries face particular challenges when it comes to the implementation of big data, as in many cases their track record on protecting civil liberties has not been encouraging.

"A worst case scenario is one where a government can see citizen data but information on government activities remains closed, and where corporations offering internet access to people in developing countries do so on the condition of targeted advertising and right to use data in exchange," he said.

Dr Spratt stated that "much more needs to be done" to minimise the risks facing these nations and ensure that the benefits of big data are shared equally, rather than just among large corporations, the richest individuals and developed countries.

The report called for the UN to establish a panel of social science, ethics, legal and technical experts to draft new guidelines that will "enshrine citizens' rights to access data on their government's activities in the process, and a citizen's right to see and control the information held about them by governments and corporations." 

Other recommendations in the report included improving funding for public research into the implications of the increasing use of automated decision-making and learning algorithms, and requiring large enterprises based in developed countries to employ the same approach to data privacy in all the countries in which they operate.

Cloud and appliances to be biggest drivers of big data in 2016

15 Dec 2015
Categories: #AnalyticsNews

The coming year will see a new wave of adoption for big data technologies, with cloud computing and the need to gather data from a growing number of appliances among the key drivers of this.

Ovum's predictions for the industry in 2016 note that there will be a "rising tide of IT spending" that will boost investment in big data analytics.

Tony Baer, principal analyst at Ovum and author of the report, said: "The next wave of big data investment will target more of the enterprise mainstream that will have more modest IT and data science skills compared with the early adopters."

Despite the increasing interest in tools such as Spark – which Ovum noted will be the fastest-growing set of workloads in 2016 – SQL will remain a key first step for organisations looking to make the most of big data.

"Don't count SQL out," Mr Baer said. "SQL-on-Hadoop remains a potent draw for Hadoop vendors who are aiming to reach the large base of enterprise SQL developers out there."

He added that Spark will be complementary to SQL, providing businesses with additional paths to insights, such as streaming and graph analysis. This can then be queried using language that enterprise database developers are very familiar with.

Another key prediction for 2016 will be the emergence of data lakes as a key priority for mature Hadoop users. Enterprises that have already successfully put analytics into production across multiple lines of business and stakeholder groups will drive increased demand for tools to govern the data lake and make it more transparent.

As a result, Ovum forecast significant growth in tools that build on emerging data lineage capabilities to catalogue, protect, govern access, tier storage, and manage the lifecycle of data stored in data lakes.

"Governance of data lakes will not be built in a day. While some of the tooling exists today, capabilities such as managing the lifecycle of multi-tiered storage will have to be extended to cover the growing heterogeneity of Hadoop clusters," Mr Baer said.

Lack of talent 'hindering analytics adoption'

14 Dec 2015
Categories: #AnalyticsNews

A lack of workers with the skills to analyse big data and weed out poor quality information will be one of the biggest hurdles many businesses face when looking to adopt the technology, a new study has suggested.

Research conducted by AT Kearney found that two-thirds of companies – including those with the most advanced analytics capabilities – have difficulty hiring enough people with the ability to generate insights from their data.

This is a problem that is only expected to get worse in the near future, as the study found companies will need 33 per cent more big data talent over the next five years.

In some sectors, the demand is even higher. Communications, media and technology businesses were found to be the most focused on this area, estimating their need for big data talent will increase by 43 per cent over the next five years. This was followed by financial services (36 per cent) and automotive firms (35 per cent).

More than four out of ten respondents (43 per cent) added that at least ten per cent of their company’s digital analytics positions are currently unfilled – with one in ten companies having a vacancy rate of between 20 and 30 per cent, and four per cent having more than 30 per cent of positions empty.

Partner at AT Kearney and co-author of the report Khalid Khan said that companies need to be looking for ‘trilinguals’ – individuals with a firm grasp of quantitative analytics, digital technology and business strategy.

Such people are currently rare in the corporate sector, so competition for the best professionals is fierce. As a result, many businesses are turning to university graduates who are schooled in statistical modelling.

However, Mr Khan noted that while these individuals often possess the right technical analytics skills, they may lack the ability to derive business insights from the data. Nearly 60 per cent of respondents agreed that people currently emerging from universities are underprepared for this.

Therefore, companies need a clear strategy in order to ensure they are targeting and hiring the right people. Mr Khan said: “A good strategy will determine early where there are already good pockets of expertise, where more talent is needed, and where talent could be better used.”
