For the last several years, industry regulators have driven the change agenda in Financial Services. RDR, AML, UCITS, FATCA and CASS have all dominated the agendas of industry conferences as firms faced up to some eye-watering regulatory fines. But whilst FCA fines in 2014 and 2015 soared to almost £2.5bn, the corresponding total for data protection breaches, levied by the ICO, was just over £3m. Is it any wonder that data infrastructure projects struggled to get attention?
Recently, however, the balance has been shifting thanks to two key pieces of regulation.
The EU’s GDPR (General Data Protection Regulation), which comes into force next year, will threaten firms with fines of up to €20m or 4% of global turnover, whichever is greater, for failure to satisfy much more onerous data responsibilities. As if that weren’t strong enough, firms could even be forced to stop taking on new business until appropriate remediation activities have been carried out.
At the other end of the spectrum, the potential business opportunities presented by PSD2 and Open Banking have sparked renewed interest in data amongst Financial Services firms. But, whatever the reason, data is back in fashion.
You can tell it’s fashionable because there are new words to describe it. Data is now BIG. There is so much of it, in fact, that Warehouses are no longer able to contain it so we have to hold it in Lakes. Some of the data must leak out into the ground though because there are Data Miners who chisel it out of a Hadoop rock face. And it seems to be quite complex stuff because, once we’ve extracted it, we have Data Scientists who analyse it and report their findings to Chief Data Officers.
Despite my cynicism, I’m actually very relieved to see data in the mainstream of Financial Services debate. Here at Altus we’ve spent years building our industry models on the solid foundation of data to the point where they now describe over 10,000 information flows. I still can’t think of any better way to understand a Financial Services business or to predict what will happen when you change something than by modelling all the data that business exchanges with the outside world. So it’s lovely to finally be able to talk about these models without needing to apologise for sounding “technical”!
Ironically, I suspect it may well be the industry response to new data standards and regulation which ends up being too technical. I predict we’ll see swarms of analysts poring over database schemas to make sure that every table column is assessed for content and compliance without understanding where the data comes from or goes to.
Without that business context, it will be awfully difficult to realise the full value of the most valuable asset FS firms possess. If they aren’t careful, these companies will find themselves up the data lake without a paddle.