WHY THIS MATTERS IN BRIEF
Regulating industries and companies is an increasingly complicated task. Some argue it's one that should be automated and handed to the machines, but that opens up its own set of challenges.
The love affair British politicians have enjoyed with fintech since the financial crisis seems to have cooled somewhat over the past year or so, but it is still very much alive and kicking. While four years ago George Osborne could be spotted withdrawing Bitcoin at an ATM or cheering on startups in Canary Wharf, his successor Philip Hammond popped up at a conference this week to talk about the exciting topic of financial regulation.
The message is that cryptocurrencies will be kept on a leash by a special task force, that common industry standards will be set for banks and startups, and that "robo-regulation will help fintech firms police themselves." In short, Hammond's message was "let fintech firms innovate, but in a regulated way." And arguably, why shouldn't an autonomous hedge fund like Aidyia, also known as a Distributed Autonomous Organisation, or DAO for short, be regulated by a robo-regulator, even if for now it sits on Wall Street rather than in the UK?
Fair enough, in theory. It seems unworkable and undesirable to regulate fintech in exactly the same way as banks, so the Chancellor's new approach seems to be: why not let coders have a go? Machines are already helping regulators comb through millions of pages of regulatory filings to analyse reporting behaviours and the tonality of language, so it's a tempting prospect. But they will need to tread carefully, and one apparent problem is what to do with the human in an automated system.
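To make the idea of machine tonality analysis concrete, here is a minimal, hypothetical sketch of the simplest form such scoring can take: counting positive and negative words in a filing's text. The word lists and scoring rule are illustrative assumptions only, not any regulator's actual lexicon or method.

```python
# Hypothetical sketch of naive tonality scoring over a filing's text.
# The word lists below are illustrative, not a real regulatory lexicon.
NEGATIVE = {"loss", "penalty", "breach", "decline"}
POSITIVE = {"growth", "profit", "improvement", "stable"}

def tonality(text: str) -> float:
    """Return a score in [-1, 1]; values below zero suggest negative tone."""
    words = (w.strip(".,;:") for w in text.lower().split())
    counts = {"pos": 0, "neg": 0}
    for w in words:
        if w in POSITIVE:
            counts["pos"] += 1
        elif w in NEGATIVE:
            counts["neg"] += 1
    total = counts["pos"] + counts["neg"]
    return 0.0 if total == 0 else (counts["pos"] - counts["neg"]) / total

print(tonality("Profit growth was stable despite one loss."))  # 0.5
```

Real systems use far richer models, but even this toy version shows the appeal: a score per filing, at machine speed, across millions of pages.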
Studies, for example, have shown that humans react differently, even lazily, when removed from the chain of responsibility, and there’s a danger in letting automation relieve the human burden at both ends, whether it’s the regulator who might have less incentive to make spot checks, or the company whose compliance department is only stirred into action when the algorithm, rightly or wrongly, detects a problem.
The human element is also important in watching for errors that can turn into negative feedback loops if allowed to spread. The financial industry has had its share of flash crashes, fat finger trading errors and computer bugs, so naturally it’s hard to imagine robot regulation not having its own hiccups and mistakes.
We've also already seen false positives in the world of regulation, like the small business owner bumped off a bank's account list after flunking a compliance check. Meanwhile, it's easy to see the motivation bankers and politicians have for giving the robots more oversight: cost and efficiency. Bain and Co., for example, estimates that compliance accounts for as much as 20 percent of the cost of running a bank. But cost cutting shouldn't deprive regulators of resources.
The Financial Conduct Authority (FCA) in the UK, for example, regulates the conduct of more than 56,000 businesses, and is the prudential regulator for more than 18,000 of them, with a staff of about 3,635, or one employee for every 15 firms. FCA staff are also paid a median salary of £65,000, less than they could expect at a big investment bank. If fintech firms hope to one day achieve their ambitious goal of safeguarding the pensions and savings of the general public, we should take care not to stretch that ratio much further, lest we all regret it.
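The staffing ratio above follows directly from the article's own figures; a quick sketch of the arithmetic:

```python
# Arithmetic behind the staffing ratio, using the figures quoted in the article.
regulated_firms = 56_000   # firms whose conduct the FCA regulates
fca_staff = 3_635          # approximate FCA headcount

firms_per_employee = regulated_firms / fca_staff
print(round(firms_per_employee))  # 15, i.e. one employee for every 15 firms
```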
For now, though, the prospect of robo-regulators is an interesting one, so I'll be keeping an eye on the plan to see if, and how, it develops. If, or when, they emerge, it's unlikely they'll be confined to the financial services sector…