The Role of IT in Insurance — An Interview with Dr. Tobias Rump
I met Dr. Tobias Rump, Head of IT Cross-Functional Systems & Data Analytics at Zurich Group Germany, at the Big Data & Analytics conference hosted by the German Insurance Forum in Leipzig last week. Tobias gave a presentation on the role of IT between driver and enabler in today’s insurance business. We sat down and spoke about his view of what that role should be, the challenges of digital transformation, legacy systems, resourcing problems and more.
The interview has been lightly edited for clarity.
“Ultimately, IT defines the framework of what is actually possible, which is, especially in the context of data-driven value creation, a decisive factor.”
Albert: Your talk today was titled “Data-based value creation: the role(s) of IT between driver and enabler”. Which role should IT ideally take on?
Tobias: It is difficult to reduce the IT department to a single role. I think of IT as a catalyst, or influencer if you will, and ultimately as a driver. This is something we are obligated to be, given that we have an advisory duty and that other business units cannot fulfil our role. Ultimately, IT defines the framework of what is actually possible, which is, especially in the context of data-driven value creation, a decisive factor. There are business units that are experienced in defining technological possibilities and options, for instance in the actuarial field, but most are not. That’s why I like to think in terms of cycles, where IT starts off as an influencer and driver, moves into an enabler role, and then gradually into a controller mode, where we manage complexity and ensure standardisation. The controller mode is somewhat of a thankless job, but it comes with the territory; and it’s necessary, because cost considerations are always looming on the horizon.
Albert: A common organizational problem is the so-called “alignment gap” between IT departments and other business units. How, in your opinion, can this be avoided?
Tobias: Communication is very, very important. HBR recently asked a data scientist “which skill is more important for a data scientist: the ability to use the most sophisticated deep learning models, or the ability to make good PowerPoint slides,” and he answered that it’s the latter. Sure, it was a provocative question, but I believe that it is primarily communication that can close alignment gaps between IT and other business units quite quickly.
Albert: Can you provide us with an example of how you are working cross-functionally on data projects at Zurich?
Tobias: A good example would be our analytics steering committee, which was established by the IT department and oversees our data and analytics initiatives. Within the steering committee, we bring together department heads and experts from different business functions, who jointly develop standards and policies and prioritize initiatives. We define use cases, roles, processes and governance aspects, and also develop training concepts and knowledge-sharing programmes. Our work is defined by a healthy dose of pragmatism, a grassroots approach and “guerrilla tactics,” whereby we take incremental steps that follow specific opportunities. The key to all of this is having people on the committee with real decision-making power.
Albert: On the way to becoming a data-driven insurer, old legacy systems and silos can become quite a stumbling block. How are you managing your legacy systems at Zurich, so that they don’t hinder your data strategy?
Tobias: We have a very well-established and consolidated data warehouse landscape at Zurich, and we’ve been investing for 15 years in connecting all legacy systems to one central data warehouse. To fill our data lake, the basic approach is simply to mirror the data warehouse into the data lake. Now, the data lake is quite undemanding, for instance when it comes to which systems are connected to it, so it can ingest pretty much anything. In that sense, legacy systems do not pose a stumbling block for us. That would be quite different if we still had a number of separate data silos, for instance. We’d also face more of a hurdle if we currently had use cases or requirements demanding real-time capabilities, but at the moment that is not an issue. And if you look at our non-life insurance lines, for instance, the legacy problem has already pretty much resolved itself with the implementation of Guidewire.
“I don’t always understand why the focus is solely on growth, often underlined with very rosy promises, which you are of course free to believe or not.”
Albert: Where do you currently see the biggest potential in process digitisation and automation along the insurance value chain?
Tobias: When you look at big data, a lot of the work is being done on the customer-facing front and on growth in general. Process efficiency, I feel, does not receive the same emphasis. Yet I think it holds enormous potential; the sheer volume of what you’re dealing with alone should lead you to some very good use cases.
A classic example is the management of incoming documents, most of which arrive as unstructured data. In our case you are talking about thousands of documents per day, and once you start thinking about machine learning or self-learning algorithms, there is simply immense potential in process efficiency. There’s also the benefit of being able to measure your success quickly and tangibly. As an insurer we are “KPI’ed” and measured through and through, so making the argument for a specific business case should be much easier as well. I don’t always understand why the focus is solely on growth, often underlined with very rosy promises, which you are free to believe or not. I am of course exaggerating a bit, but I do believe that improvements in process efficiency can be very profitable and therefore warrant more investment and focus. Improved efficiency naturally also improves the customer experience, so it really is a win-win situation.
“You are afforded a wonderful data science playground here, something that is often not communicated well enough to the outside.”
Albert: Which competencies do you need to build up internally in order to accomplish your goals? Any challenges on the HR front?
Tobias: I can certainly confirm that it is difficult for us to find the right talent in the market, because we as insurers are not exactly emblematic of agility. What we are doing more and more now is qualifying and training people internally. For instance, we have established a data science training and qualification programme that is open to employees who already possess an affinity for data and algorithms and want to reach the next level. The participants already know all the internal workings of Zurich and also appreciate the benefits this company has to offer. You are afforded a wonderful data science playground here, something that is often not communicated well enough to the outside.
Albert: Can you tell us a bit more about the internal data science training programme at Zurich?
Tobias: We started this year in May, and at the moment 12 employees are part of this DataCamp-based programme, with 25% of their work time allocated to it. This is of course a considerable investment and not always easy to integrate into day-to-day operations, but we believe that if you set up such a programme, you either go all the way or not at all. The group carries quite a load, because the daily operational pressure does not cease, but the overall feedback is excellent. And since our data scientists work in a decentralised way in their respective departments, it was important to choose a hybrid education model, where we bring people together to learn but then have them apply their learning in their respective fields of expertise.
I hope you have enjoyed this interview. For questions or comments, or if you want to receive updates, get in touch via email at firstname.lastname@example.org or on Twitter @cngthedots.