A new White House report on the big data analytics sector has warned businesses that they must consider the ethical implications of their deployments and ensure that they are not discriminating against individuals through their use of data.

The study, titled "Big Data: A Report on Algorithmic Systems, Opportunity, and Civil Rights", noted that, if used correctly, big data can be an invaluable tool in overcoming longstanding biases and revisiting traditional assumptions.

For instance, by stripping out information such as race, national origin, religion, sexual orientation, gender identity and disability status, big data solutions have the potential to prevent discriminatory harm in activities such as offering employment, access to finance or admission to universities. However, the report warned that if care is not taken with the implementation of these technologies, they could exacerbate existing problems.
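As a rough illustration of that preprocessing step, and not something drawn from the report itself, the Python sketch below drops a set of hypothetical protected-attribute columns from an applicant dataset before any scoring takes place. The column names and the pandas-based approach are assumptions made purely for demonstration.

```python
# A minimal sketch, assuming a pandas DataFrame of applicant records with
# hypothetical column names; not drawn from the report itself.
import pandas as pd

# Protected attributes to exclude before any scoring model sees the data
PROTECTED_COLUMNS = [
    "race", "national_origin", "religion",
    "sexual_orientation", "gender_identity", "disability",
]

def strip_protected_attributes(applicants: pd.DataFrame) -> pd.DataFrame:
    """Return a copy of the applicant data with protected attributes removed."""
    present = [col for col in PROTECTED_COLUMNS if col in applicants.columns]
    return applicants.drop(columns=present)

# Example usage with toy data
applicants = pd.DataFrame({
    "years_experience": [3, 7],
    "credit_score": [640, 710],
    "race": ["A", "B"],           # removed before scoring
    "disability": [False, True],  # removed before scoring
})
print(strip_protected_attributes(applicants).columns.tolist())
# ['years_experience', 'credit_score']
```

Removing the columns is only a starting point: values such as a postcode or a surname can still act as proxies for the stripped attributes, which is one reason the report stresses careful implementation.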

One of the big challenges is that, despite what many people assume, big data is not necessarily impartial. It can be subject to a range of issues, such as imperfect inputs, poor logic and the inherent biases of the programmer.

"Predictors of success can become barriers to entry; careful marketing can be rooted in stereotype. Without deliberate care, these innovations can easily hardwire discrimination, reinforce bias, and mask opportunity," the study stated.

For instance, poorly selected data, incomplete or outdated details and unintentional historical biases could all result in the wrong data being fed into big data systems. Meanwhile, poorly designed algorithms can also cause problems if they assume correlation equals causation, or if personalised recommendations rely on too narrow a set of criteria to infer a user's true preferences.
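To make the correlation-versus-causation pitfall concrete, here is a minimal sketch, not taken from the report, in which two variables that never influence each other still correlate strongly because both are driven by a hidden confounder; the variable names are purely illustrative.

```python
# A minimal sketch (not from the report) of why correlation is not causation:
# two variables with no direct link still correlate strongly because both are
# driven by a hidden confounder. Names are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
confounder = rng.normal(size=10_000)                        # e.g. neighbourhood income
feature = confounder + rng.normal(scale=0.3, size=10_000)   # e.g. a ZIP-code-derived signal
outcome = confounder + rng.normal(scale=0.3, size=10_000)   # e.g. loan repayment

# A naive model would treat this strong correlation as a causal predictor.
print(round(np.corrcoef(feature, outcome)[0, 1], 2))  # roughly 0.9
```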

The report highlighted several case studies that illustrate how big data can be used to improve outcomes – as well as some of the pitfalls that need to be avoided.

For example, it noted that many people in the US have difficulty gaining access to finance because they have limited or non-existent credit files. This is an issue that particularly affects African-American and Latino individuals, who are nearly twice as likely to be 'credit invisible' as whites.

Big data presents a great opportunity to improve access to credit, as it can draw on many more sources of information in order to build a picture of an applicant. This may range from phone bills, previous addresses and tax records to less conventional sources, such as location data derived from use of cellphones, social media data and even how quickly an individual scrolls through a personal finance website.
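The sketch below gives a rough, hypothetical sense of how such a scoring product might fold conventional and unconventional signals into a single applicant profile; the field names and weights are invented for illustration and are not taken from the report or any real product.

```python
# A minimal sketch of combining alternative data sources into one applicant
# profile. All fields and weights are hypothetical, chosen only to illustrate
# the idea of scoring thin-file applicants from non-traditional signals.
from dataclasses import dataclass

@dataclass
class ApplicantProfile:
    on_time_phone_payments: int      # phone bill history
    years_at_current_address: float  # address stability
    tax_filings_on_record: int       # tax records
    scroll_speed_percentile: float   # behaviour on a personal finance website

def naive_alternative_score(p: ApplicantProfile) -> float:
    """Combine the signals into a single score with arbitrary illustrative weights."""
    return (
        2.0 * p.on_time_phone_payments
        + 1.5 * p.years_at_current_address
        + 1.0 * p.tax_filings_on_record
        - 0.5 * p.scroll_speed_percentile  # faster scrolling penalised, rightly or not
    )

print(naive_alternative_score(ApplicantProfile(24, 3.5, 4, 80.0)))
```

The wider the net of signals, the more thin-file applicants such a tool can reach, but each added signal also imports whatever biases it carries, which is the trade-off the report goes on to warn about.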

However, it warned: "While such a tool might expand access to credit for those underserved by the traditional market, it could also function to reinforce disparities that already exist among those whose social networks are, like them, largely disconnected from everyday lending.

"If poorly implemented, algorithmic systems that utilise new scoring products to connect targeted marketing of credit opportunities with individual credit determinations could produce discriminatory harms." 

The report also included a number of recommendations for improving big data outcomes, such as increasing investments in research, improving training programmes, and developing clear standards for both the public and private sector.

"Big data is here to stay; the question is how it will be used: to advance civil rights and opportunity, or to undermine them," it added.