The banking sector is often described as one of the more proactive when it comes to knowing the value of data and working to extract it.
However, like other sectors, it faces severe challenges in the form of legacy IT systems.
Legacy systems are the number one IT challenge facing the sector, according to 48% of respondents to research by Talend. A further 43% named them as the main barrier to realising the benefits of Big Data analytics.
Legacy being a hindrance isn’t a particularly surprising finding; it is often identified as a thorn in the side of businesses.
A Talend roundtable, attended by CBR, went through the findings and sought to analyse banks’ use of their data.
Nimish Shah, banking sector lead at Talend, went through the main use cases for Big Data in banks.
First, it is used for customer retention and churn analysis: understanding clients in order to offer personalised services.
Secondly, it is used to improve risk management by analysing clients across systems; Hadoop, for example, is being used by the bigger credit card companies and banks to detect fraud.
Patrick Wolfe, Professor of Statistics at the UCL Big Data Institute, spoke of modelling problems and of how deciding on an approach is itself a challenge. He said:
"Should a bank acquire a small data science company and drive it to do Big Data or should banks hire data scientists, how does one engage around this?"
Deciding how to approach Big Data is just one challenge, as are legacy systems. One way in which legacy is being tackled is by using Hadoop as an infrastructure.
Ben Musgrave, enterprise account manager, BIPB, said: "We’ve got clients using Big Data, Hadoop and things like that purely because it’s a cheap infrastructure for them.
"They’re not doing Big Data on it but actually saying we can use commodity hardware to chuck all this stuff in, whereas historically we would have had to buy really high end hardware to do this, so it’s purely an infrastructure cost saving to do it."
One of the key points here is that users say they are doing Big Data because they have Hadoop, when in fact they are not.
This is where questions should be asked about the adoption of Big Data tools. Is just one line of business using some analytics? That doesn’t make you a Big Data company.
On the quality of the data, 45% of banks say that poor-quality data is preventing real-time insights for the business.
Part of the problem is that banks have had very siloed ways of gathering certain types of data. ‘Put rubbish in, get rubbish out’ is the old adage that should be drummed into any business looking to do Big Data.
A possible way of addressing this is to flag problems at the point of data entry. By processing data in real time, it should be possible to flag poor or incorrect records as they arrive.
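As a rough illustration of the idea, the sketch below checks incoming records against a few quality rules and routes failures to a review queue rather than letting them into the system. The field names and validation rules are illustrative assumptions, not a real banking schema.

```python
# Minimal sketch of flagging poor-quality data at the point of entry.
# Field names and rules are illustrative assumptions, not a real schema.

def validate_record(record):
    """Return a list of quality issues found in a single incoming record."""
    issues = []
    if not record.get("customer_id"):
        issues.append("missing customer_id")
    amount = record.get("amount")
    if not isinstance(amount, (int, float)) or amount < 0:
        issues.append("invalid amount")
    if record.get("currency") not in {"GBP", "USD", "EUR"}:
        issues.append("unrecognised currency")
    return issues

def process_stream(records):
    """Split a stream of records into clean ones and ones flagged for review."""
    clean, flagged = [], []
    for record in records:
        issues = validate_record(record)
        if issues:
            flagged.append((record, issues))  # route to review, don't silently drop
        else:
            clean.append(record)
    return clean, flagged

# Example usage
records = [
    {"customer_id": "C001", "amount": 120.50, "currency": "GBP"},
    {"customer_id": "", "amount": -5, "currency": "XYZ"},
]
clean, flagged = process_stream(records)
```

In practice this kind of check would sit inside a streaming pipeline, but the principle is the same: bad data is caught and labelled at the moment it enters, not discovered months later in a report.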
Banks, like other sectors, face a number of challenges in getting to grips with Big Data, but if they want to fight off ‘challenger banks’ they will need to do so quickly.