Written by Kevin Okell on Thursday 7 December 2017
There has been a lot of debate around RegTech recently in the industry press. Plenty of opinions have been thrown around about its potential role as the saviour of inefficient change and compliance functions the world over. Unfortunately, most of the grand ideas have so far delivered little more than a slightly quicker AML process for banks.
2008 demonstrated that the risk of systemic problems in the financial system was very real, and that the impact could be colossal. In an effort to prevent a repeat performance, regulators around the world have been creating reams of new rules to safeguard against bad behaviour. However, the regulator's ability to intervene in the case of bad practice is restricted to retrospective reviews of data provided by firms. At best this means the regulator is slow to respond to a problem in the system; at worst, the problem won't be noticed at all.
A recent initiative announced by R3 and RBS promises a new approach and could set the tone for future development. R3’s distributed ledger technology (Corda) will be used to share mortgage transaction data directly with the FCA. Instead of waiting for a periodic extract from market participants, the regulator will have direct line of sight over the transactions occurring in the network.
This should not only increase the speed with which the FCA is able to make decisions and undertake analysis, but it will also reduce data inconsistencies. Ultimately, this will increase the clarity the regulator has of transactions in the network, and could significantly reduce or even stop many instances of fraud, such as the authorised push payment (APP) scams we have seen recently.
While this is a useful advance which should enable operational savings for FS firms, it may also be the first step towards something much more impactful: an active regulator. A few commentators have previously floated this idea, and indeed the concept of regulatory and supervisory nodes was at the heart of R3's original white paper. Instead of taking a retrospective view, the regulator of the future could be monitoring the live network of financial transactions on a day-to-day basis. If this idea takes off, it would open up the potential for regulatory intervention before material harm has been caused and herald a transformation in the role of the regulator.
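To make the supervisory-node idea concrete, here is a minimal sketch in Python. All names (`SupervisoryNode`, `observe`, the threshold rule) are hypothetical illustrations of the pattern, not Corda's actual API: the point is simply that the regulator sees each transaction as it happens, rather than in a quarterly extract.

```python
from dataclasses import dataclass

@dataclass
class Transaction:
    sender: str
    receiver: str
    amount: float

class SupervisoryNode:
    """Toy regulator node that observes every transaction as it occurs,
    instead of reviewing periodic data extracts after the fact."""

    def __init__(self, threshold: float):
        self.threshold = threshold  # hypothetical rule: flag large payments
        self.flagged: list[Transaction] = []

    def observe(self, tx: Transaction) -> bool:
        """Inspect a transaction in real time; return True if it is flagged."""
        if tx.amount > self.threshold:
            self.flagged.append(tx)
            return True  # intervention is possible before harm is done
        return False

# Each firm's node would share transactions with the regulator as well as
# the counterparty -- the "direct line of sight" described above.
regulator = SupervisoryNode(threshold=10_000)
regulator.observe(Transaction("Firm A", "Firm B", 2_500))    # within limits
regulator.observe(Transaction("Firm A", "Firm C", 50_000))   # flagged
```

In a real deployment the rule engine would be far richer than a single threshold, but the structural change is the same: supervision moves from batch reporting to a live observer on the network.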
What if the distributed ledger extended upstream to the advice which precedes a transaction? Not only would the regulator have real-time visibility of consumers' end-to-end experience to provide early warning of any systemic issues, but advisers too could enjoy the benefit of understanding their clients' complete history of financial advice without the need for repeated fact finds. Ultimately this could usher in a new age of granular data visibility which would change the way advisers interact with their clients.
The potential for this technology to eliminate the silos between regulators and regulated entities (well explored by Gillian Tett in The Silo Effect, 2015) is enormous, but is the regulator up to it? A completely different skill set would be required in order to understand the masses of data involved. Data scientists harnessing the power of distributed processing and data analytics is a far cry from poring over PDF and Excel returns on a quarterly basis.
Ultimately, the regulators of the future have the potential to become the vehicle diagnostics systems of the financial services engine, but it may not be a smooth transition and we can expect a few more emissions scandals before we get there.