The recent update on the Bank of England/FCA joint initiative on data collection provides a window on how hard it is to build effective partnerships between regulators and industry. These are often touted as the way forward but in practice are fraught with jeopardy.
Regulators can easily be accused of becoming too close to firms, or overly influenced by a small group of the well-connected/incumbents. Meanwhile, firms can pour significant senior management time and resource into well-intentioned initiatives that too often lose their way, or quickly leak momentum if regulatory priorities change.
Digitising data collection
Turning to the specifics, the origin of this work on digitising data collection, now almost lost in time, was the 2016 RegTech Call for Input and a two-week TechSprint at the end of 2017.
This culminated in a successful proof of concept that, as the FCA webpage explains, showed that “we are able to take a regulatory requirement … and turn it into a language that machines can understand. Using that language, machines can then execute a regulatory requirement, effectively pulling the required information directly from the firm.”
At the time, this was seen as the potential answer to the twin burning problems of contemporary data collection – exorbitant cost to firms and potentially inaccurate information for regulators.
However, there were big questions about the upfront costs, so the investment case depended heavily on certainty about the regulators’ commitment. But the support base at the FCA was narrow, while the Bank/PRA struggled to coordinate its approach. Both regulators were interested in an industry partnership, but they also wanted to retain control and weren’t able to make a long-term commitment.
Since 2018, therefore, there has been a series of narrowly cast pilots, and the ambition is now described as “[identifying] how data collection should improve to increase the value and reduce the burden to firms”. There’s nothing wrong with this, but it’s not the leap forward initially envisaged, and some of the resulting tensions and contradictions come through in the board minutes from November 2021.
Real-time data
Ironically, while this data story has been playing out, outside events – Covid, the cost-of-living crisis, the failure of Silicon Valley Bank – have shown the importance of regulators being able to collect real-time, reliable data in a cost-effective way. This need is only going to grow.
Stepping back, partnerships with industry must have a role in modern regulation, but there’s no agreed set of principles for how they should work, and too many examples of them falling short.
Finding a model that can reliably deliver solutions to data collection and similar challenges remains a holy grail.
Gavin Stewart is an independent commentator on financial regulation; former regulator; novelist; ex-international rower and sports administrator.
He has 27 years’ experience working for financial services regulators (Bank of England, FSA & FCA), holding a wide variety of roles including Bank of England Supervisor, FSA Head of Strategy, Planning & Performance, and FCA Chief Risk Officer. See LinkedIn profile.