
JPMorgan Chase turns to big data for economy monitoring

Back in 2015, Wall Street bank JPMorgan Chase unveiled a new think tank, the JPMorgan Chase Institute, dedicated to delivering data-rich insights for the public good. Its first report used proprietary data to provide deep insights into how the income and spending of individuals in the US fluctuate from year to year and month to month.

The research was based on detailed transaction data for nearly 30 million Chase customers. All uniquely identifying information – names, account numbers, addresses, dates of birth, and Social Security numbers – is removed before the institute receives any data, and only aggregated figures, never individual records, can be published. The insights from these big data analyses can help leaders across business sectors make informed economic policy decisions.
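
To make that process concrete, here is a minimal sketch in Python (using pandas) of the de-identify-then-aggregate pattern described above. The field names, records and grouping are invented for illustration; this is not the institute's actual pipeline.

```python
import pandas as pd

# Hypothetical raw records: the PII fields and values are invented.
raw = pd.DataFrame({
    "name": ["A. Smith", "B. Jones", "C. Lee", "D. Kim"],
    "account_number": ["1001", "1002", "1003", "1004"],
    "region": ["Northeast", "Northeast", "West", "West"],
    "monthly_income": [4200, 3900, 5100, 4800],
    "monthly_spend": [3600, 3500, 4300, 4100],
})

# Step 1: strip uniquely identifying columns before any analysis.
deidentified = raw.drop(columns=["name", "account_number"])

# Step 2: publish only aggregates, never individual records.
aggregated = deidentified.groupby("region").agg(
    customers=("monthly_income", "size"),
    median_income=("monthly_income", "median"),
    median_spend=("monthly_spend", "median"),
)
print(aggregated)
```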

JPMorgan Chase has long invested in big data analytics, and this article describes its use of the Apache Hadoop framework to manage exponentially growing data volumes and to process unstructured data quickly. With Hadoop, the bank can store vast volumes of unstructured data that it could never store before, and derive insights from it using a range of customer-focused data mining and analytics tools.
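
As a rough illustration of that pattern, the sketch below uses PySpark, a common way to run analytics jobs against a Hadoop cluster, to pull structure out of raw text and aggregate it. The HDFS path and the log format are assumptions made for the example, not details taken from the article.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("unstructured-spend").getOrCreate()

# Read raw, unstructured text from HDFS (the path is hypothetical).
lines = spark.read.text("hdfs:///data/raw/transaction_logs/")

# Pull structure out of free-text lines such as
# "2015-05-01 card_purchase GROCERY 42.17" (format assumed for the example).
pattern = r"card_purchase (\w+) ([\d.]+)"
parsed = lines.select(
    F.regexp_extract("value", pattern, 1).alias("category"),
    F.regexp_extract("value", pattern, 2).cast("double").alias("amount"),
).where(F.col("category") != "")

# Aggregate spend by category: the kind of insight mined from the raw store.
parsed.groupBy("category").agg(F.sum("amount").alias("total_spend")).show()

spark.stop()
```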

In addition to the think tank’s reports, JPMorgan Chase has made use of big data analytics in a number of areas including:

  • To detect fraud (a toy sketch follows this list)
  • To drive value for clients (benchmarking client performance against their peers and competitors)
  • To enrich the customer experience
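
Real fraud models at this scale are far more sophisticated, but a toy sketch can show the basic idea behind transaction anomaly detection: flag amounts that deviate sharply from an account's own history. Everything below, from the threshold to the sample data, is invented for illustration.

```python
import statistics

def flag_anomalies(amounts, threshold=2.5):
    """Flag transactions whose amount deviates sharply from the account's
    own history, measured in standard deviations (a toy z-score check)."""
    mean = statistics.mean(amounts)
    stdev = statistics.stdev(amounts)
    return [
        (i, amount) for i, amount in enumerate(amounts)
        if stdev > 0 and abs(amount - mean) / stdev > threshold
    ]

# Invented card history: the 4999.99 purchase stands out from the rest.
history = [42.10, 18.75, 55.00, 23.40, 31.25, 4999.99, 27.80, 38.60, 22.15, 45.90]
print(flag_anomalies(history))  # -> [(5, 4999.99)]
```

A production system would use per-merchant baselines, velocity checks and learned models rather than a single z-score, but the shape of the problem – scoring each transaction against expected behaviour – is the same.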

This article from 2017 describes how JPMorgan Chase uses machine learning to handle the “mind-numbing job of interpreting commercial-loan agreements”. Until the project went live, that task consumed some 360,000 hours of lawyers’ time annually!
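
The article doesn't describe the model itself, but the general shape of such a task is document classification. The sketch below, with invented clause text and labels, shows a common baseline approach: TF-IDF features feeding a linear classifier in scikit-learn. It is an illustration of the technique, not JPMorgan's system.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny invented training set mapping clause text to a clause type.
clauses = [
    "The borrower shall repay the principal in equal monthly installments.",
    "Interest shall accrue at a rate of LIBOR plus 2.5 percent per annum.",
    "The borrower shall maintain a debt service coverage ratio of at least 1.25.",
    "Failure to make any payment when due constitutes an event of default.",
]
labels = ["repayment", "interest", "covenant", "default"]

# TF-IDF features plus a linear classifier: a common baseline for clause tagging.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(clauses, labels)

# Classify a new, unseen clause.
print(model.predict(["Interest on the loan accrues at 4 percent per annum."]))
```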

The firm continues to invest in big data, as well as robotics and cloud infrastructure (it has built its own private cloud platform), using new technologies to maintain its market dominance.

Another leading global financial services player uses Kognitio in its big data infrastructure for ultra-fast SQL on its Hadoop cluster. Kognitio enables Tableau to handle hundreds of concurrent complex SQL queries, returning results in near real time; other SQL engines couldn’t come close to real-time rendering. You can read more about this customer use case here.
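
As an illustration of the access pattern only, not this customer's actual setup, a script or BI tool can query Kognitio over ODBC like any other SQL engine. The DSN, credentials, table and columns below are hypothetical.

```python
import pyodbc

# DSN, credentials and schema are all hypothetical; Kognitio exposes a
# standard ODBC interface, so the query pattern is ordinary SQL.
conn = pyodbc.connect("DSN=kognitio;UID=analyst;PWD=secret")
cursor = conn.cursor()

cursor.execute("""
    SELECT region,
           COUNT(*)    AS txns,
           SUM(amount) AS total_spend
    FROM   transactions
    GROUP  BY region
    ORDER  BY total_spend DESC
""")
for row in cursor.fetchall():
    print(row)

conn.close()
```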

This post was originally published in May 2015 and has been updated for freshness, accuracy and comprehensiveness.

