How Viseca improves customer engagement using Apache Kafka

Our approach to designing an event-based architecture to serve the customer-centric goals of this cashless payment provider.

Vladimiro Borsi
lenses.io
7 min read · Sep 28, 2020


Viseca is Switzerland’s leading credit card issuer. For our company, engaging with customers at the right place and time is a strategic imperative, and that means it needs to happen in real time.

Here I share some learnings from my experience in designing a streaming data platform for Viseca, which would first orchestrate 600k targeted communications, then scale across the business and reduce the time-to-market of streaming apps by 10x.

Traditional institutions were always the bedrock of financial services, but these days we’re under threat from newer digital natives. The evolution of technology has heightened user expectations and increased the importance of customer experience in protecting market share. In this world, the tides change fast, and if you can’t keep up with customers, you become extinct.

Like any modern enterprise, Viseca has many ways of engaging with customers across touchpoints and technologies: the customer journey spans payment services, the call center, the app, email, SMS and the web portal. All of these touchpoints are controlled by fifteen independent applications and solutions.

To increase engagement, better understand customers and offer tailored solutions, we would need to connect these applications in real time so that they could react to customer behavior. For instance, when a customer uses their Viseca payment card on their birthday, they are automatically sent a coupon for 10 Francs by mail!

It was on us to make this level of automation happen. It would go something like this:

(Animation: concept and design by Nerves, animation by Hitch Design)

But we knew that if this project was successful, there would be a wave of new business requirements and related architectural needs. It wouldn’t just be the fifteen application “silos” of the first phase, including Unica (an IBM/HCL marketing ERP) and our loyalty system Surprize, but many more from across the business.

Viseca’s real-time event streaming platform for marketing automation

We had to think big, but deliver quickly. To bring the project to scale, I chose to work with a trio of strategic partners covering three pillars of value. We adopted DataOps practices through Lenses.io, enabling Viseca to work with complex open-source components such as Apache Kafka while designing the platform for business users. We partnered with the consultancy Grid Dynamics to assist in our digital transformation, and we collaborated with Dario Carnelli, an ISACA-certified governance expert.

How could we create an enterprise-wide streaming data platform that could cover Viseca and the payment industry’s stringent security and governance requirements?

Our DataOps journey began with complex security and compliance requirements

As is standard Viseca practice when implementing new technologies, we needed to define and run a proof of concept for our new real-time data processing platform. I would need to convince my business stakeholders that this project could work where other companies had failed.

My previous experience and an international market survey informed our choice to use Apache Kafka as the platform’s central component combined with the Hadoop stack — a framework we had already planned to use.

Kafka was also selected for its well-proven integration with “container” technology (e.g. Kubernetes), and it was already under evaluation by the Enterprise IT Architecture team.

But not only did we need real-time streaming capabilities, we needed granular, robust data governance that wouldn’t slow users down. This wasn’t just a streaming challenge; it was a complex security challenge. As a regulated financial services provider, we have to treat PCI and GDPR as our first and last consideration.

We also understood that Marketing was just the first of several lines of business that would potentially request real-time data, which meant all our developers, data engineers, data analysts, data scientists and even business analysts would need access to this data.

And I knew that these teams wouldn’t have access to innumerable Kafka experts or knowledge of niche commands. If we wanted enterprise-wide adoption, we would need simpler administration, clear and granular governance capabilities, and better observability and debugging tools through a UI.

Winning the hearts and minds of stakeholders to support the DataOps approach

Firstly, each team would not want to adopt a platform unless they were guaranteed governance.

Secondly, we needed to win their hearts and minds. Internal development teams and data engineers were not used to the world of streaming. Most worked with traditional enterprise tools such as Oracle, MS SQL and SAS, not open-source technologies. Providing a black box and expecting teams to learn new commands and languages to build, deploy and manage flows was not a good way to make friends.

As if this wasn’t challenging enough, we were short on time. Although we wanted to build a solution for the entire enterprise, Marketing needed a solution quickly.

We didn’t have the technical skills, time or appetite to build custom tooling to operate Kafka.

To reduce the burden on the platform team, the tenants of the platform would need self-service access: for example, to make and validate schema registry changes and to configure Kafka Connect connectors.

We chose Lenses.io as a secure place to manage real-time data and accompanying regulations.

Staying on the right side of the GDPR

Lenses.io offered us a safe and productive portal over the many different components that make up an enterprise data platform, such as Apache Kafka and Kubernetes. It integrated with our existing systems, from security, CI/CD and alert notification solutions to identity management.

This fulfilled one of our primary goals: to take care of all data and platform governance. Using Lenses, we were able to make the most of self-service access to explore data, build and deploy real-time applications and get full visibility into our infrastructure, whilst staying on the right side of GDPR.

This saved us a lot of time that we would otherwise have spent developing this type of technology in-house.

Less than one year from conception to production

With Lenses layered over our streaming platform, we were able to get almost instant sign-off for our POC and onboard our colleagues from Marketing, fast-tracking to production in under one year. In the Swiss financial sector, that’s the blink of an eye.

But most importantly, we can now allow data engineers and developers to be productive whilst removing co-dependencies on the data platform engineering team.

Take, for example, building stream processing applications. We would typically have relied on Flink for both simple and complex stream processing. However, many of our colleagues aren’t Flink experts, which increases the dependence on just a few people internally.

We saved time and sanity with efficient data streaming interactions and compliance checks & balances.

Our Marketing colleagues can use SQL to build simple stream processing apps in Lenses.io. Because SQL is so universally known, SQL Processors have simplified the skills and effort required to deploy and maintain data pipelines in production, leaving the more complex logic to the Flink developers. This both removes unnecessary workload from a small group of specialists and hands the data over to the people who understand it.
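To make that concrete, here is a minimal sketch of what such a SQL Processor might look like, using the birthday coupon scenario from earlier. The topic and field names (card_transactions, birthday_coupon_candidates, is_birthday and so on) are hypothetical, and a real pipeline would more likely join the transaction stream against customer reference data than rely on a pre-computed flag:

```sql
-- Hypothetical sketch of a continuous SQL Processor; topic and field
-- names are illustrative, not Viseca's actual schema.
SET defaults.topic.autocreate=true;  -- create the output topic if missing

INSERT INTO birthday_coupon_candidates
SELECT STREAM
    customerId,
    amount,
    merchant
FROM card_transactions
WHERE is_birthday = true;
```

A downstream consumer (for example, the marketing automation system) would then pick up records from the output topic and trigger the coupon mailing.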

And if the team needs to troubleshoot an application, they can do it from within Lenses too by exploring data in topics. This drastically reduces the time taken to resolve problems.
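As an illustration (again with hypothetical topic and field names), a troubleshooting session might start with a snapshot query over the output topic to check what a single customer’s records actually look like:

```sql
-- Hypothetical snapshot query for troubleshooting; names are illustrative.
SELECT *
FROM birthday_coupon_candidates
WHERE customerId = '12345'
LIMIT 10;
```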

A view into Kafka was critical to ensure that both our DevOps-native employees and our more traditional employees would support the project. Many were used to working with UI-driven management tools. Had I presented them with Kafka, a command line and doc links to 6 different APIs, they would have told me I’d lost my mind.

Schema administration has made it easy for teams to validate and make schema changes from a secured environment.

We saw not only a huge improvement in the ease of addressing compliance needs, but also a massive increase in my team’s productivity and a much faster time-to-completion on the project than would otherwise have been possible.

Life as Data Engineers is hard enough without having to learn new frameworks & languages. We now address our compliance needs without a command line, providing a view into Kafka & Hadoop whilst building, debugging and deploying new flows in minutes instead of weeks.

Having a safe environment for real-time data means more time spent building value and less time spent seeking out edge cases: this ability to safely decentralize our data is why our new streaming data platform went to production.

Unexpected bonus

As we’ve increased our adoption of Lenses and DataOps, we’ve realized that it’s delivering on a much wider vision than we had initially imagined. It’s allowing us to completely abstract the infrastructure. Our platform is made up of far more than Kafka, Hadoop and Flink. We can change or add components in the future without impacting data operations or the experience users have when building and deploying real-time applications.

As a traditional financial institution with legacy to uphold, it means we’re able to create the best possible digital payment services for our customers — now and in the future.

For more insights, I’d encourage you to read my recent reference architecture paper co-written with Lenses.io CEO Antonios Chalkiopoulos that outlines the model, mentality and approach for our Apache Kafka project.
