Why the banking industry might not be ready for IBM’s Watson
A dear friend of mine recommended an article to me about IBM’s Watson in financial services. It paints a rather confusing picture. If the consensus is that this technology will provide benefits, why is no one adopting it?
This is indeed an interesting question. The technology clearly has the potential to both save money and create completely new products and services.
On the surface, one can dismiss this as a combination of short-termism and a lack of vision in an overall conservative industry. That may be part of the explanation, but I am not sure it tells the full story. I will therefore take a slightly deeper look at the issue.
One player going against the trend
To get a more profound explanation, it is worth starting with a look at one example that goes against the trend. The Australian bank ANZ is arguably the most high-profile adopter of Watson in financial services.
ANZ has achieved a lot in the three years since it announced in 2013 that it would use Watson. In short, a considerable effort has gone into feeding data to the system and training it to respond properly to queries. In all the hoopla surrounding big data and machine learning, it is easily forgotten that setting these things up takes rather significant effort. Furthermore, one often overlooked aspect, in my experience, is that the naming and interpretation of concepts vary within an organisation. There is a lack of a company-wide “glossary”, and without one, mining data becomes very difficult. ANZ seems to have understood both these facts and acted accordingly. This is one of the reasons why I think they might be on the right track.
Changing for the right reasons
The other, and even more compelling, reason is that I believe they are doing it with the right goals in mind. The aim is value creation as opposed to cost cutting. Through the technology they can provide a better service on their existing products to external stakeholders. I would not be surprised if, once the platform stabilises further, completely new products emerge that were not even thought of before. Internally, value is created by automating complex but repetitive tasks, freeing people in the organisation to focus on higher-value work. Such work is certainly more challenging and interesting than the routine tasks that take up a lot of time but provide little in terms of intellectual stimulation. This is surely a great way to help retain talent in an organisation.
In terms of change management they also seem to be doing a lot of things right. Several of the key success factors in change management are evident especially in terms of senior sponsorship and employee participation.
Given this inspiring example, why is there such reluctance in the business as a whole?
Unclear and non-immediate benefits
It is clear that implementing these systems requires a significant investment involving the whole organisation. There are no quick wins. This demands an organisation with a clear vision and a long-term perspective, because the advantages are not immediately apparent. It is easy to project the bottom-line effects of cost savings and layoffs, at least in the short term; it is much harder to predict the impact of this type of investment.
The products that can be created using this technology as a basis are also not apparent. Even if we know that the solution can provide answers, we still have to figure out the right questions to ask. In my view these questions will materialise only through exploration with the system. The investment therefore becomes, to a certain degree, a leap of faith.
All this taken together would understandably make risk averse decision makers reluctant.
The data is there, but might not be consumable
The information and its underlying infrastructure are another factor that affects the potential for implementation. The quality of the data and the platforms on which it runs are both major issues in the industry.
Data quality within financial institutions is an ongoing challenge. KPMG states in a 2014 report on banking data quality that:
“Sometimes it seems that there are as many approaches to the data issue as there are banking practitioners. As a consequence, our view is that the industry is conspicuously failing to develop a coherent understanding of the strategic implications; and is failing to understand the significance of aligning a firms’ data strategy with the overall business strategy. We are nervous that the unintended consequences of this could have long term implications.
[…]
Under-investment in data and systems (or investment in non-optimal systems), and the continual layering up rather than redefinition of new reporting and information requirements have compounded the difficulties. Today, many major banks have at least one significant project to improve data management, IT infrastructure and reporting. However, in KPMG member firms experience, the approach being adopted for these many and often, cross-purpose projects is neither cohesive nor consistent.”
This does not paint a pretty picture of the current state of affairs in terms of data quality. Without proper data there is little point in mining it for answers.
In addition, many banks run their core systems on legacy platforms. One survey of European banks showed that up to 80% of responding banks said outdated systems were causing them to struggle to bring new products to market quickly. Together, poor data quality and outdated platforms are a massive blocking factor for implementing a knowledge-based automation solution. These factors cause issues that pose an immediate risk to the organisations and will therefore be prioritized over developing new tools.
Does senior management have the will to change?
There might be an even more straightforward explanation to why banks are not adopting this technology on a wider scale: resistance to change. Normally when we talk about resistance to change we focus on the executing layer of the organisation, the front line. But what if we can use the same reasoning on the more senior parts of the hierarchy?
According to research by Prosci, there are several groups within an organisation where resistance to change is more likely. These include people invested in the current way of doing work, people who created the current way of doing the work, and people who have been very successful and rewarded under the current way of doing work. Senior management in the banking industry fulfil all these criteria. They are, after all, the ones with decision power over these changes. Could it really boil down to something that simple?
So what is the answer to this conundrum? As usual with complex questions, there is no simple answer. Instead, I think we can find it in the combination of the factors described above.
I have no doubt that solutions and services built around platforms like Watson have huge potential in financial services. We have seen how actors able to harness new technology have disrupted sector after sector, and I see no real argument why banking and financial services should be an exception. Rather, it is a question of time. The financial services sector is unique in its high barriers to entry, both to the industry as a whole and for existing actors offering cross-border services. This is a clear indication that the pace of adoption will likely be slower than in other industries.
It will be very interesting to see how pioneering organisations like ANZ manage to implement these technologies and leverage their potential and the resulting services. If they manage to generate value from the investment, I see it as almost inevitable that the rest of the industry will eventually have to follow suit.
Thank you for reading; I hope that you enjoyed my thoughts on the subject.
This post was originally published on my blog mrundberg.com.