Could the current state of bank regulation be a bonanza for software and consulting firms?
Over the last ten years — even before the financial crisis — one of the constant themes of every financial sector response to any change in regulation was a heartfelt whine de coeur on the subject of the cost and trouble of implementation. It would be wrong to say that every single one of these responses was hypocritical and unjustified (in 2013, the European industry was correct to say that the radical changes the ECB proposed to make to bad debt accounting in its Asset Quality Review could not be implemented in the time proposed, even if you put every accountant in Europe on the job). But it would not be very far wrong, and in any case, the tide has certainly turned; rightly or wrongly, having shelled out hundreds of billions of dollars for the costs of under-regulation, global public opinion is significantly less receptive than it used to be to industry complaints about the costs of over-regulation.
However, one man’s cost is another man’s revenue. The coming storm of reregulation looks like it could be a very significant revenue opportunity for the technology, software and support services sectors.
Here’s a dirty little secret about the banking industry: it has suffered from decades of underinvestment in technology. The fact only ever seeps out in painful disclosures, but so far we’ve seen that:
- Royal Bank of Scotland, long considered an industry leader in technology and scalability, had a set of systems that were on the point of collapse at the time of its rescue in 2008, and which did subsequently collapse.
- Deutsche Bank received a letter from the Federal Reserve effectively putting it on final warning over the inadequacy of its $1bn programme of investment in regulatory reporting systems.
- The US subsidiary of Grupo Santander, a group that has for years been seen as an industry leader in Europe thanks to the “Partenon” system developed with Accenture and deployed in successive acquisitions, failed the Federal Reserve’s CCAR stress test at its first attempt, with the Fed citing “specific deficiencies” in management information systems among the reasons for the “qualitative” decision not to pass it.
These are the ones we’ve heard about, and the ones where it’s possible to point to a specific press release in which it’s admitted in so many words that there has been systems underinvestment. I do not believe for a second that Deutsche is the only bank to have received a systems-and-controls letter from the Fed, and I have strong suspicions about a few other banks which have recently “revised” cost reduction plans, to include a larger element from personnel cuts and a smaller element from IT efficiencies. This is, as far as I can tell, a system-wide problem.
For the twenty or thirty largest banks in the world, it seems very unlikely to me that the cost of catching up with the years of underinvestment will be less than $1 billion per bank. And on top of that, we have to add the cost of meeting a moving set of goalposts.
Starting next year, the Basel Committee will unveil its Fundamental Review of the Trading Book, and start preparing its new rules on Interest Rate Risk In The Banking Book. Both of these will require very large overhauls of systems; big banks ought to have budgeted for them already, but the size of the projects involved is big enough to keep the overall market for financial IT specialists tight on its own.
Added to that, the European banks have the ECB’s AQR and stress tests to deal with — the initial AQR has cost the supervisors themselves the best part of €500m so far this year on external consultants alone, and the ECB intends to carry out a similar exercise every year. Substantial systems upgrades are going to be needed, as the CCAR has shown, to be able to provide the supervisors with the kind of detailed reporting and scenario analysis that they are going to be asking for.
The root of the problem is this: back in the bad old good old days, pre-2008, supervisory data requests were a matter of negotiation. No regulator wanted to be seen as excessively intrusive, and nobody wanted to impose an excessive cost on the industry (if you’re over thirty years old, you’ll remember the culture of “light touch” and “industry friendly” regulation; if you’re younger than that, ask someone with a few grey hairs and a beer gut).
(When I was a bank analyst, I remember once sitting in a meeting with the management of an Irish bank, who were chortling at the foolishness of the Irish Financial Regulator employee who had asked for details of all the mortgage loans they had made above 85% loan-to-value. Apparently they had called his boss and asked for the details of the warehouse that they were meant to send three lorries of paper files to. This sort of intimidation of the regulator — of course they could have sent a couple of CDs, and I bet today they wish they had — really did use to go on.)
But in the old days, if the regulator asked for something, the banks were pretty much allowed to give them whatever data they could scrounge up without too much inconvenience, and then tell them to go and whistle up a rope if they wanted any more. As a result, a culture grew up under which banks tended to assume that if the IT systems could process a transaction and deliver the quarterly results, then they counted as integrated. The failings reported in the Fed letter to Deutsche Bank — Excel spreadsheets updated by hand, databases which systematically got the sign wrong on long and short positions, all of that sort of thing — are the natural consequence of an environment in which innovation in product design was praised, but in which nobody wanted to spend money on an architecture flexible enough to integrate all the new products into the existing risk management systems.
Nowadays, of course, regulatory requests for data are not like that. They come down with the force of a royal command performance, with the threat of (at the very least) not approving the year’s dividend payout. And the regulators want the data in the format they specify, without errors and on time.
We can talk about fraud detection, monitoring of employee chat and email and defence against cyber attacks later — all these things are going to cost big money, but there is reasonable room for debate about the extent to which they are already in IT budgets. My guess (and I suppose I might be wrong about this but I don’t think I am) is that they are not in current budgets at anything like the true numbers, particularly when you take into account that all three of these things are going to be the subject of regulation going forward, and that means they will have their own reporting standards too.
Just taking, though, the cost of bringing creaky systems up to scratch, plus the cost of complete systems integration, new regulation and tougher reporting standards, I think that $50bn for the thirty largest banks in the world, as a group, would be a decidedly conservative estimate. Nor is it obvious to me how long a grace period the regulators are prepared to give the banks, so the spending could end up being decidedly front-loaded.

And that’s a big sum of money compared to the size of the software and support services industry. It’s five times the total 2013FY revenues of CapGemini, for example. It’s rather more than twice the annual revenue of SAP. It’s more or less the equivalent of adding another IBM Global Services’ worth of revenue to the industry. Obviously, not all of the $50bn will make it into the software and services industry revenue pool; banks will try to do as much of it as they can in-house. But the one thing we know about in-house IT departments at major banks is that they’re inadequately resourced for the tasks they’re doing right now; I don’t see them having much capacity to take on massive new projects. And the new revenues coming into the industry will be at unusually high margins; the banking industry are more or less forced buyers of skilled labour which is in short supply.
As far as I can tell, consensus earnings estimates for the software companies aren’t being set at levels which incorporate the revenue windfall that the bank regulators are handing them. Some investors are beginning to realise this — Patrick Lemmens, manager of Robeco’s New World Financials Equities fund, wrote an article for Robeco Insights about his long positions in a variety of outsourcing companies last month. This looks to me like it could be one of the next big earnings stories coming out of the global financial crisis — as the proverb goes, it’s an ill wind that doesn’t blow somebody a bit of good.